GRADUATE education is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).
That's the hard-hitting opening of an op-ed piece in the New York Times by Mark Taylor, the chairperson of the religion department at Columbia University, entitled "End the University as We Know It". Naturally, reaction to this controversial piece was swift and often brutal, with letters filling an entire edition of the newspaper's Letters to the Editor section a couple of weeks later. Many of Mr. Taylor's colleagues had the knee-jerk reaction of dismissing the piece as "anti-intellectualism" without actually responding to any of the intellectual arguments it made: a sad example of name-calling in place of argument from academics who really ought to know better.
But there is no better example of intellectual honesty than asking tough questions about the way we're doing things instead of just saying "well, we've always done it that way." How relevant is the modern university? I would argue that it's not so much a matter of relevance as a matter of mission. I would certainly not be one to say that studying subjects such as the works of James Joyce or the merits of medieval art has no value. On the contrary, paying attention to such topics enriches the intellectual and cultural life of society as a whole. But we can't escape two important observations. First, there is limited demand for people who work in such fields, and thus a limited number of openings. On the other side of the coin, many people have far more practical interests and want their education to tie in with career or other goals that are not about purely academic study.
So why, then, do we make the university a one-size-fits-all institution that we expect nearly everyone to attend? A careful reading of history shows that this was not always the case. It used to be that only the true intellectuals went to universities, while people who simply wanted career training learned through on-the-job experience or apprenticeships. I would also point out that making university attendance an unwritten societal expectation fills universities with a lot of people who would not necessarily want to be there otherwise. This actually waters down the intellectual climate of universities for the people who really do want to be there for the sake of academic study.
The real need here is for a greater diversity of options. We have a standard model of education that is supposed to apply to everyone: kindergarten, elementary school, junior high or middle school, four years of high school, and four years of college. Only after that do we start giving people some choices, such as whether to continue their studies toward an advanced degree, enter the workforce right away, go off to teach English abroad for a couple of years, or whatever. But in most areas of our lives, we live in an environment that is increasingly personalized and customized, from thousands of different mobile phone apps to several dozen varieties of toothpaste. In an era when technological advances and demographic shifts have enabled such an abundance of choice, why do we stick to a one-size-fits-all model in what is arguably one of the most important aspects of a person's life: their education?
Some progress is being made, such as the increasing ease and popularity of adult education and retraining programs. But I think we need to go further. While traditional universities still have an important role to play, we need other options for people who have other needs or goals. That alone would raise the quality of universities: when people have numerous other options, the ones at universities are the ones who truly want to be there. For those who don't, we should have far more options along the lines of apprenticeships, on-the-job training programs, technical and trade schools, and customizable training programs of all sorts. Currently, alternatives to attending a university are often thought of as inferior to a traditional bachelor's degree. In too many cases, that's because they are. But there is no inherent reason they have to be, if we would only start putting a value on choice in education.
For instance, one approach I've read about that I find intriguing is hands-on training programs centered on solving specific problems the world is facing. There is a world of possibility here. A program focusing on climate change or the energy crisis has room for work in all sorts of academic disciplines, ranging from basic physics to chemistry to engineering to economics to public policy to marketing. We could focus on problems that are either pressing or purely theoretical. As an example of the latter, I once took a correspondence course on SETI and theories about communicating with an extraterrestrial civilization. The course drew on material from mathematics, biology, planetary science, space exploration technology, communications theory, computer and information theory, and much more.
While universities are getting better at interdisciplinary programs and the like, there is still too much emphasis on ever-increasing specialization and an obsessive narrowing of academic disciplines. The problem is that, in the real world, solving problems effectively requires drawing on knowledge from many different specialties. Collaboration, not isolation, is increasingly the rule rather than the exception.
And what about the plethora of opportunities to integrate things like games, simulations, and virtual worlds into mainstream education? Macrowikinomics, by Don Tapscott and Anthony D. Williams, discusses several examples of innovative uses of platforms such as Second Life. The military, which has every reason to place prime importance on quality training, makes extensive use of what-if scenarios, computer simulations, and war games. Another futuristic technology, virtual reality, is already making inroads into military training and could no doubt be put to good use by imaginative teachers and professors in the civilian world as well.
In a world that increasingly requires a highly educated and highly skilled population, we can't afford to keep education mired in tradition at the expense of building a system that effectively serves both its customers (students) and society as a whole. It's time to rethink old assumptions and get creative.