If I were founding a university I would begin with a smoking room; next a dormitory; and then a decent reading room and a library. After that, if I still had more money that I couldn’t use, I would hire a professor and get some text books.
Stephen Leacock
My earlier postings on the changes needed in academia dealt with problems within our universities rather than with the forces surrounding them. Perhaps that was the wrong way to start. If so, let me try to make up for the lapse by going back now and examining a few of our basic assumptions about education and its structures at the post-secondary level. Let me describe what I think a university should be.
Given the realities of contemporary America, little of what I want to talk about has any chance of coming to life. The constituencies influencing our educational structures (faculties, administrations, students, parents, governments, and employers) act as a brake on change; an innate protectionism governs most of their attitudes. This protectionism has grown stronger over the years, to the point where there’s hardly an “experimental” college left in the United States. Yet, by taking a look at the whole, perhaps we can identify ways that pieces can be modified, shoring up a system that totters but is not necessarily bad.
As a college education has become more and more a prerequisite for membership in the American middle class, ideas about what it should consist of have necessarily changed, though our colleges really have not–and do not. Within them, for example, one finds entrenched attitudes towards education reflecting a time when a college education wasn’t considered necessary even for many professional careers (once, doctors and lawyers could train for their professions without first earning an undergraduate degree)–a time when college was a place for the creation of upper-class and upper-middle-class “gentlemen” (and, in a few cases, “ladies”). The attitudes retained from that era have very little to do with education as it is constituted today, but the pressure exerted by the past cannot be ignored.
Another influence on education that has little to do with its actual, present-day needs was born, perhaps, alongside Isaac Newton–in other words, it was born a long time ago, though it has become something of a mania today. It is the attitude that “if you can name it and measure it, you can understand it.” That’s fine for collecting beetles or butterflies, but isn’t much use, in and of itself, when looking for the “whys” of beetles and butterflies. Of course, Newton himself understood that measurement is only a part of learning–that understanding is one of the reasons we remember him today and have forgotten most of his contemporaries in the sciences. This attitude stems from the un-Newtonian misperception that the universe and knowledge are “things” and not “processes.” In this view, knowledge exists outside of the individual and, therefore, can be found–or received from another (see Paulo Freire’s “banking concept of education” in his Pedagogy of the Oppressed). Education becomes no more than having facts at one’s fingertips. This is knowledge for the incurious: once something is “known,” that’s the end of it.
People such as David Horowitz, who argue that teachers should stick to “content,” have fallen under the spell of this concept of knowledge as a “thing.” To many like him, this “thing” can furthermore be broken down into “good” components and “bad.” Such an attitude also externalizes knowledge to the point where it is perceived as receivable passively, even without intent. As a result, Horowitz fears professors who extol what he sees as “bad” knowledge, for he sees their students as nothing more than receptors who will simply accept whatever they are told as “truth.”
Of course, neither the universe nor knowledge exists in static form–and students don’t learn passively, but engage in a dialogue with the material they are considering. Much of their education (though those with beliefs like Horowitz’s would deny this) is not the assimilation of information but a process of gaining the ability to negotiate information and to manipulate it towards desired ends. Instead of merely learning about the scientific method, for example, students learn to utilize it–even in arenas far removed from the sciences.
This, of course, is at the heart of why testing can’t be the sole arbiter of educational success. Ultimately, what one learns is much less important than what one learns to do with it. That’s why so many business people, when asked what they want out of colleges, say they look for the ability to think, not “mastery” of a specific body of knowledge.
Learning is the result of active engagement, not passive receptivity–and the ability to engage actively is what many employers seek. This has long been recognized, but the furthest anyone seems to go with it in today’s colleges is to discourage lecture classes in an attempt to get professors to increase student involvement in the classroom learning process. But this is hardly enough (and lectures can be an important part of active learning).
Let’s imagine, though, that these arguments have been settled. That no one expects a college education to provide the basis for a certain class identity. That learning has come to be seen as something more than an accumulation of facts. Even that employers act on a fact they know but generally ignore–that “training” in a field is only a small part of what should be sought in most entry-level job candidates.
What would a college look like in such an environment? Well, let me say this before anything else: nothing in my idealized academic environment is new. Much of it, in fact, is very old. But we’ve moved away from it, and not for good reasons. That’s what is really galling: we know how to give someone a good education. We just don’t do it.
First of all, my college would not look like the result of a detailed construction of category boxes followed by a willy-nilly dumping of unsorted items into them. It would not be based on divisions. Another organizational model would have to be developed, one that would replace departments and majors–and “courses.” Today we rely on artificial delineations of content only because they provide some sense of order, not because they have inherent value. There is no need for the particulars of the divisions we have created; we maintain most of them simply because a re-drawing of lines would, by itself, accomplish nothing (something the proponents of forcing professors to stick to their particular “content” don’t understand). Yet few of us could teach effectively if forced to stay within the boundaries of our course descriptions: Nineteenth-Century American Literature can’t be taught without some consideration of earlier American writing, thought, and history; without mention of English literature; without examination of the political events of the century; or without a fundamental understanding of the scientific and technological advances of the time. Nor can it be taught without taking into account contemporary student mindsets… or the relationships of the older works to contemporary issues.
By the same token, a Biology course shouldn’t cover only the science itself, but might also examine the role of science within society and the perceptions and misperceptions of science that allow for continued adherence, say, to Creationism and its new offspring, “Intelligent Design.” Understanding why something is not science is as important as understanding why something else is. Nothing in our lives exists in isolation, after all. It would be crazy to insist that studies of our world be conducted with a narrowness the world has never exhibited. The university should be set up in a way that reflects this reality.
In my idealized college, then, a “course” would not be a prescription but simply a focus. There might be goals, but there would not be things that “had” to be covered, certainly not to the detriment of something else. At first glance, this might seem to play havoc with the idea of a progression of learning through a series of requisite courses, but I don’t think it really would. Few courses meet their goals as it is, yet professors at the next level manage to make do. In my college, professors would simply know more of what had gone on before their particular course: the schedule of classes would be such that professors could step in and out of each other’s classes, assisting when their particular areas of expertise were called upon.
No longer would the individual classroom be the domain of a particular professor. Responsibility for a class, in other words, would not translate into control of the classroom. Others–administrators, faculty, or students–could drop in, either to listen or to contribute; teachers could come to know their students even before beginning to teach them.
For that to work, there would have to be a concentration on developing collegiality amongst the teaching staff, not competition. Tenure, as now structured, could not be a part of this university. All teaching faculty would have to have what amounts to tenure in regard to their intellectual and teaching pursuits. That is, no position taken and no experiment tried (outside of those actually harmful to other beings) could be used as a basis for termination of employment. To provide a modicum of professional security, five-year contracts (perhaps) could be put in place–but there should be no divisions between those with lifetime sinecures, those aiming for them, and those simply “filling in.”
Because this is an idealized world we’re dealing with, not only do all students concentrate on their studies, working at non-academic jobs only a minimal amount (and that on campus), but faculty live within walking distance of the campus, making use of campus facilities for their own families and becoming much more familiar parts of their students’ lives than they could as mere classroom presences. College teaching should return to being a vocation, not simply a job.
There are many, many ways colleges could be structured effectively that don’t rely on the sorts of categorization we depend on today. Colleges have tried experiments over the years, some of them with success–only to be pushed aside by the monolith of contemporary academia. St. John’s College (in Annapolis and Santa Fe) still keeps to its great-books curriculum and tutorials, and Antioch still maintains its work program, but very few other institutions have been able to maintain a structure even a little outside of the mainstream. Beloit College, my own alma mater, once tried a program of three full semesters a year: all underclassmen completed three semesters their first year (going to school through August), followed by a “middleclass” period of two semesters on campus, two vacation semesters, and a “field term” (in which each student worked for a semester at a job relating to his or her major), taken in an order determined by the student. The upperclass year was another three consecutive semesters on campus. Though still a fine school, Beloit has retreated to a traditional two-semester schedule. Friends World College had a program in which each student kept a journal for four years while studying at the school’s various campuses around the world; the degree was awarded based on examination of that journal. There were others, the most famous having been Black Mountain College. Today, unfortunately, schools with experimental formulae are few and far between, most contemporary attempts at innovation being tepid at best.
There are also many other ways colleges could be transformed into more learning-conducive environments–but this diary has already become too long.
Perhaps, for the next one, I can go into more detail, providing a more vivid picture of my ideal college.