Universities in their original conception were idealised in the Renaissance as places that could produce ‘intellectuals’. An intellectual was a person of inquiring mind who, through rational judgment and prodigious independent study, wrestled with an intellect they could not suppress, reasoning through challenges that as yet had few or no answers, for the betterment of knowledge and humanity. Where we have failed in the last century is in losing touch with this original conception of the intellectual, in pursuit of a more mundane, professionalised ‘expert’, who contests not with new ideas but merely with ideas that have come before, and inches knowledge forward where their teachers refuse to let them run the mile.
Intellectuals have lost their place amidst the bulging sphere of modern ‘experts’: people who hold credentials in a subject, but do not necessarily dare venture beyond formalised learning and formalised education.
In “The Last Intellectuals”, Professor Russell Jacoby chronicles this decline alongside the rise of modern universities. He argues that “before the age of massive universities, ‘last’ generation intellectuals wrote for the educated [public].”
“The newly opened and enlarged colleges [in the 1970s] allowed, if not compelled, intellectuals to desert a precarious existence for stable careers. They exchanged the pressures of deadlines and freelance writing for the security of salaried teaching and pensions – with summers off to write and loaf… The losses seemed trifling: they gave up the pleasures of sleeping late, schmoozing with friends, and dreaming up their own projects.”
Academics grew up on campuses, instead of independently, and the effect this has had on thinking cannot be overstated. There has long been evidence of individuals taking on the characteristics of the groups they belong to. Hence academics, moving to formalised institutions, lost that thread of independence, that verve for dreaming, for passion and creativity – they were taught that they should reference everything and that everything required authority. “Original thinking” is rewarded in academia only in so far as it is heavily referenced and relies on prior thought to substantiate it. In other words: as long as it is not original at all.
The schmoozing life of cozying up to editors, in the days before the university pay cheque, is long gone. That romantic era when Bertrand Russell, the economist Maynard Keynes and the author Virginia Woolf lived in the same cottage and, as Keynes once attested, bounced ideas off each other in a furious bent of creative energy, is long gone. The Beat generation, with its abandonment of institutionalism and its yearning for creativity in a truly independent sense – free, inspired, “On the Road” if you will – is long gone. The old compulsion to change the world, to tackle the big questions, to be creative and actually original, is stifled in a system where all big, gutsy questions will be penalised by peers, tutors and markers. A process of specialisation is the driving force behind this. Students enter university as young, inspired intellectuals and gradually narrow their focus until they become irrelevant specialist experts.
“When students get to college, they hear a couple of speeches telling them to ask the big questions, and when they graduate, they hear a couple more speeches telling them to ask the big questions. And in between, they spend four years taking courses that train them to ask the little questions—specialized courses, taught by specialized professors, aimed at specialized students.” [1]
Instead of informing public debate, university experts argue amongst themselves. The vaunted and celebrated ‘battles of the minds’ we hear about at university are synonymous with this phenomenon. The debate between HLA Hart and Lon Fuller comes to mind, but one might equally cite Chomsky and Foucault, among others.
Academia is enriched by these debates, often locked away in peer-reviewed journals; but the public, with its limited access to them, is not.
“They [the intellectuals] have been supplanted by high-tech intellectuals, consultants and professors – anonymous souls, who may be competent, and more than competent, but who do not enrich public life… the public culture relies on a dwindling band of older intellectuals who command the vernacular that is slipping out of reach of their [younger] successors.”
In turning away from the idea of the intellectual, universities have trended towards producing professionals: people who are competent at doing their jobs but not necessarily at enriching the world. The problem is the effect this has on students themselves. It instils in them a deafening silence by which they cannot articulate what is wrong with their lives or the world.
We might say they are experts in their field, but can they answer why their field should exist? Or why they should exist? Or how employment should function? These basic, groundbreaking questions are left unanswered. Where universities once dealt in the pursuit of knowledge, students are now left to grovel to elite cultural icons who barely relate to them at all. There is something eerily dormant in the ‘by-the-books’ student who goes to class, does the reading, does the work, and never suffers himself to think. On the weekend he goes out drinking, inching ever closer to a PhD in “thinking the way the university wants me to think”.
Independent thought, the type that is done quietly on a Sunday night close to two in the morning when all are asleep (and never assessed), is a rare attribute. It is becoming even scarcer among the young, who are discouraged from voicing wide-ranging opinions on anything other than how they can contribute to the economy. It seems that the public only accepts old and seasoned – one might say ‘vetted’ – intellectuals: Dawkins, Hitchens, Chomsky et al. In a recent global survey of public intellectuals by Prospect Magazine, not one person in the top 50 names listed was under the age of 25.[2] Either the public does not think well of young intellectuals, young intellectuals do not exist, or young intellectuals are being discouraged from engaging in public discussion because they “lack” formal certification or ‘expertise’.
But what is expertise? Is it tangible, verifiable, dare I ask, empirically justified?
The Renaissance scholar Petrarch once observed a strange phenomenon among his circle of friends:
“Sometimes I would smile and ask how Aristotle could have known things that obey no reason and cannot be tested experimentally. They would be amazed and silently angered, and would look at me as a blasphemer for requiring more than that man’s authority as proof of a fact”.[3]
This sounds very similar to modern expert culture, right?
It is a strange world where you have to be old before you can be wise, and an ‘expert’ before you can be trusted. Part of this is the time it takes to receive a PhD. The result is an over-inflation in the age of our ‘thinkers’ rather than a meritocratic inflation of ideas. Meritocracy is abandoned in favour of a strict hierarchical structure of Bachelors, Masters and PhDs, which reinforces a hierarchical ageism against the young. The inherent implication is that the young have nothing to contribute to public debate. The young must spend their time ‘learning’, no matter how clever they may be. And so the old generalisation that “wisdom comes with age” reigns supreme: untested, but heavily referenced.
The worst effects of this are felt by the student who, by the time they are old enough to actually be ‘qualified’ to ask big questions (if that is even possible: what qualification begets an answer to the meaning of life?), will already have a mortgage, kids, academic strictures, a demand for a certain number of articles per year, or other limitations that prevent growth, rigorous independent thought and energy.
I am reminded here of a saying by the nineteenth-century Confucian scholar Shozan Sakuma:
“When I was twenty I realized that I play a part in my local state. After I was thirty I realized that I play a part in the affairs of the nation. After I was forty I realized that I play a part in the affairs of the entire world.”
Why should we not accelerate this process, knowing now what Shozan learnt over his entire life? Why should the young not be the first to question, rather than the last?
Leo Tolstoy once wrote (and I paraphrase) that we have experienced enough by the time we are twenty to write novels for the rest of our lives. It is absurd, then, with this mindset, to only start thinking and creating when one becomes old enough to be considered an expert by society. By then it is many years too late to begin, and many years of solutions, answers and independence have been lost.
——
[1] William Deresiewicz, ‘The Disadvantages of an Elite Education’, The American Scholar (2008).
[2] http://www.prospectmagazine.co.uk/magazine/globalpublicintellectualspoll
[3] Francesco Petrarca, ‘On His Own Ignorance and That of Many Others’ (1368).