Page 302 - American_Higher_Education_V4

our answer: the curriculum!
We start, then, with the curriculum, invoking a concept
economists use to illustrate how perfectly rational actions
on the part of individuals can, when summed, produce
unintended and devastating consequences. The “tragedy of the commons” tells
the story of what happens when a community-owned pasture (or commons) is at
or near its capacity in terms of the size of the herd that can be fed without
destroying the pasture itself. Even then it remains in the interest of each
farmer to increase the size of his own herd, since he, like each of his
neighbors, has a right to feed all the cattle he acquires on the same pasture
where his, as well as his neighbors’, cows graze. The problem occurs when the
total number of animals exceeds the pasture’s grazing capacity, and the
pasture begins a near-irreversible cycle of decline. For economists, the
moral of the tale is that a perfectly rational act (the individual sending
just one more animal to graze on the commons) can have a
devastating impact on the system as a whole (the withering of
a productive pasture).
In many ways the dilemma now facing higher education
reflects the tragedy of the commons. Three decades of
constantly adding new programs and more choices to
the undergraduate curriculum have yielded colleges and
universities that are economically unsustainable and
educationally dysfunctional. To understand how this came
to pass, we need to revisit a piece of curricular history that
dates back to the 1960s. Sparked by student revolts in Europe
and a wave of student-led political protests in this country,
American colleges and universities responded by granting
students more personal freedom and by adopting curricular
changes that reduced both general education and graduation
requirements. In time, the faculty, who had at first opposed
student demands that they be allowed to “do their own thing,”
discovered that what was good for the goose was even better
for the gander. Few faculty enjoyed grading senior theses
or comprehensive examinations, or teaching the required
course sequences that made up many major and pre-major
programs at most institutions.
What took hold was a laissez-faire environment in which
nearly every possible subject was admitted to the collegiate
curriculum, provided the new course was taught by a fully
qualified member of the faculty. Whole new disciplines and
concentrations were similarly added, often in response to
demands that full recognition be granted to specialties that
previously had been considered outside the accepted canon.
At the same time, except in the sciences, most courses became
stand-alone experiences not requiring prerequisites or, in
fact, much if any coordination among the faculty who taught
similar courses in the same department or discipline.
For those of us on the faculty, the lessening and then
the elimination of most requirements proved a bonanza.
We could teach what we wanted—principally our own
specialties—when we wanted, without having to worry too
much about how or what our colleagues were teaching. Each
course became a truly independent experience, and our
principal responsibility was to absorb our fair share of the
enrollments, thus ensuring our department would not lose
valuable faculty lines.
For students, this commitment to unfettered curricular
choice proved more than appealing—a chance not only to
do their own thing, but to change their minds, not just once
but frequently. The curriculum became a vast smorgasbord
of tempting offerings. Faculty seeking to ensure adequate
enrollment were careful to tailor their requirements and
expectations to meet student tastes. Students could design
their own majors and concentrations. But as has become
increasingly obvious, too many students also got lost, unsure
of what it took to graduate, on the one hand, or, on the other,
unsure as to what was actually being asked of them in terms
of either subject mastery or learning skills.
Institutions faced even greater problems—and here is
the core of the financial side of our argument. The more
open-ended the curriculum became—the more faculty and
students were free to set their own schedules—the more
resources, both financial and human, the institution required
to meet its educational obligations. Adding more courses and
majors to the curriculum forced the institution to spread its
current faculty resources ever thinner, to increase the number
of full-time faculty, or, as has proven most often the case, to
hire more adjunct faculty.
The result was an almost endless series of undergraduate
curricula in which “almost anything goes.” That observation
was supplied by the Association of American Colleges (AAC)
in its 1985 report “Integrity in the College Curriculum: A
Report to the Academic Community.” The report went on to
complain, “We have reached a point at which we are more
confident of the length of a college education than its
content and purpose.”
What was true then is even more so today. Repeated calls for greater
efficiency and the more parsimonious expenditure of public funds and
tuition receipts have been rhetorically responded to and then largely
put aside. In the meantime the fragmentation of the undergraduate
curriculum continues unimpeded.
And as before, worries about the escalating cost of an
undergraduate education, on the one hand, and, on the
other, the large numbers of students who begin but do not
finish a baccalaureate education, have remained separate concerns.