By Professor Steven Yates
August 9, 2014
The Higher Education Matrix (II)
By the mid-1980s, higher education was entering a period of decline. The “golden age,” to the extent it existed, was long gone.
Consider first teaching. University teaching was more and more folded into a species of entertainment. Students came to college expecting to be entertained. The quality of government schools had also been dropping, courtesy of fads like Values Clarification and Outcomes-Based Education, all of which followed "progressive education" in ratcheting down academics. There was a great deal being written about the so-called "affective domain" (i.e., emotion rather than cognition). Mastering the theories and classroom techniques of the social engineers behind such movements was a condition of getting the certification necessary for a job in a government school. In the Real Matrix, these were the "experts"; in the Desert of the Real, none of the originators of "theories of how children learn" had spent time in the classroom. They were lost in clouds of theory. None of their theories worked. Students thus entered college unprepared for what had traditionally been college-level work. Often they could not write coherent, grammatical sentences with all words spelled correctly. They could not work simple math problems. Most tragically, they had no grasp of personal finance—and so were fruit ripe for the picking for unscrupulous recruiters ready to lavish them with "student aid" packages.
University faculty—more and more of whom worked on semester-by-semester contracts as we’ll see momentarily—had become slaves of student evaluations courtesy of the Real Matrix everybody’s-equal mindset, and thus of the consumer’s mindset that came with Desert of the Real corporatization. Those newly minted Ph.D.s lucky enough to find academic jobs had a choice: they could keep their consumers entertained while trying to teach—or, again, find new careers.
Ludwig Wittgenstein is considered by some to be the twentieth century’s most influential philosopher. By any measure, he was one of the field’s deepest thinkers. By 1990, he would not have been employable in an American university. He only published two items during his life. In person he was eccentric and socially awkward. His lectures were not “student-centered”; they were often intense exercises articulating difficult ideas, some of them completely original. Sometimes he would pause in deep thought. These pauses could last minutes. This obviously exceeds the attention span of those raised on television. Wittgenstein would not have been seen as a “good colleague” or “effective in the classroom.”
In 2011, a work entitled Academically Adrift: Limited Learning on College Campuses appeared. It showed, with a wealth of statistics, that undergraduates were learning essentially nothing during their first two years on the average campus.
Now consider scholarship. I keep returning to academic philosophy because I know it best. Philosophers produced impressive achievements during the period 1945-1979 in almost every area, during academia’s “golden age.” The 1980s and 1990s, however, began to reveal the fruits of the repressive-tolerance mindset: “scholarship” combining cultural Marxism with collective grievance. These fruits included such revelations as that of radical feminist Catharine MacKinnon who viewed voluntary (heterosexual) sexual intercourse as akin to rape or “black studies” theorist Leonard Jeffries who saw the world as divided into “ice people” and “sun people.” (Roger Kimball’s Tenured Radicals, 1990, lays out numerous examples of the pseudo-scholarship of the academic hard left.) If you saw this as nonsense, it was because you were “viscerally racist” or “sexist” or “homophobic” (a propaganda term implying that criticism of open homosexual activity or policies favoring homosexuals is based on irrational fear, a phobia, rather than principle; irrational fear is to be treated therapeutically, not answered rationally).
It is probable, however, that the "culture war" provoked by such nonsense hid the corporatization of higher education, as we saw above. Universities were undergoing a massive shift in the use of resources from hiring and supporting faculty to creating layers of administration, paying administrators salaries vastly exceeding those of faculty. There was, in other words, a massive "redistribution of wealth" upwards, as with the larger society. The percentage of faculty members with tenure dropped as the older generation retired and was replaced not with new tenure lines but contingent slots—filled by "adjuncts" who were hired on a semester-by-semester basis and paid poverty wages with no benefits. By 1990, over a third of new faculty hires fit this category. By 2000, the number was around 50%. Today, it is closing in on 75%! The latter are told, in accordance with the cost-cutting philosophy of corporatism, "we don't have the money." Those saying such things are either ignorant or lying. The money to pay faculty members living wages exists in university systems. The latter can pay top administrators six figures and football coaches even more. Administrations can spend millions on new buildings, sports facilities, and campus beautification projects. There are cases, e.g., this one, of "adjuncts" being let go after years of service to a university, with no severance packages, and then dying of treatable conditions because they were living in dire poverty!
How much of this was planned? Possibly very little. It is true, grants were made to radical feminists, e.g., courtesy of big foundations like Ford, and for the creation of ridiculous “gender studies” type programs. These helped discredit the humanities in the eyes of rational observers. Overall, however, no “grand conspiracy” was necessary. The systems destroying independent thought in higher education were running on their own. Those with real power could laugh at the radical feminists and critical race theorists and queer theorists as they swung at straight white male windmills. What counted was that the faculty, overall, had been turned into a precariat: those who worked precariously, on the margins, who could be gotten rid of at the drop of a hat (or a bad set of student evaluations) and therefore constituted easily-controlled cheap labor—think of today’s universities as the academic equivalent of the sweatshops we saw two installments back, and you have the right idea.
It is worth realizing that power elites despise intellectuals. They do not trust them. While some will do the elites' bidding, intellectuals tend to be independent-minded and do not bow automatically to authority. They want the freedom to think and write, and they want you to have that freedom. When dictators come to power, intellectuals are usually the first to be rounded up and imprisoned or shot. Corporatism has taken a different route: impoverish intellectuals as a class. Subject them to a chronic fear of starvation. This route has proven more effective than imprisonment or martyrdom! It has neutered two generations so far. Where, after all, are these generations' Wittgensteins? The most significant philosophers today, in the U.S. anyway, are all in their 70s and 80s!
Corporatists want a low-wage, compliant work force, and also to be able to charge higher prices for an increasingly inferior product. When I attended college in the 1970s, the cost of attending a public university was a few hundred dollars per semester. Private ones were a couple of thousand dollars. Today, tuition has climbed to tens of thousands of dollars per semester in private universities. This prices them out of the reach of those who are not rich and do not want to participate in the student loan system. The latter supports bloated administrative overhead and (as we saw in the last installment) reflects the dollar's loss of purchasing power. In accordance with corporatism's actually having little to do with free enterprise, tuition is not set in any "free market." It vastly exceeds what it would be worth in a free market. The universities get paid by the government, so they don't have to worry about market discipline. Nor do those in them have the slightest incentive to care; they have no "skin in the game." Government pays the tuition up front, the price of the loans plus interest is tallied up, and students get stuck with tabs able to cripple their financial lives indefinitely, as we noted earlier.
All these trends would strengthen in the 2000s. The overall intellectual and educational level of the public was dropping, despite increasing technological sophistication. The technology wasn't dropping, obviously. What I mean is that the average newly minted university graduate cannot walk up to a map of the world and find, e.g., Iraq or Ukraine. Such graduates know next to nothing of our Constitutional heritage. I made a habit of asking classes to name the four rights enumerated in the First Amendment. Most could say something about free speech; but perhaps one student in a class of 30 could name them all. I recall the kid who asked me, "Is this going to be on the test?" Sometimes I didn't know whether to laugh or cry.
For few students at the U.S. campus where I taught as an adjunct for seven years had done any thinking about what student loans could do to their futures. They blithely assumed, even after the Meltdown of 2008, that the economy would turn around and that good-paying jobs would materialize when they graduated. The men and women on TV said so. The Millennials—quite unlike the Boomers I came up with—trust authority figures. If it’s on the 6 pm news or if a celebrity says it, it must be true. And the more an image is shown or a statement repeated, the “truer” it becomes. For example, mainstream media could regale TV viewers with a five-year-old photograph of Trayvon Martin to depict him as an innocent child who was set upon and slaughtered by the evil racist George Zimmerman, whose media image was photo-shopped and darkened to make him appear thuggish, and whose bleeding head wounds (from Martin having been on top of him pounding his head into the pavement when Zimmerman shot him in self-defense) were omitted from media accounts.
The Zimmerman-Martin case exemplifies how mainstream media slants news stories, how the public buys the slanted versions, how truthful versions are generally available only via alternative (non-corporate) media, and how because of the enormous influence of corporate media a man’s life can be ruined by reverse racism (the irony here is that Zimmerman is part Hispanic, and therefore himself a minority).
The larger picture is worse. What is not reported by mainstream media is assumed no more real than Zimmerman’s head wounds. Thus both Republican and Democratic presidents can pursue essentially the same agenda, as was noted early in this series. Voters will barely notice—even when they are being hurt by it. Take the North American Free Trade Agreement (NAFTA). It was developed under two Republican administrations (Reagan, Bush I) and implemented by a Democrat (Clinton). Globalism via economic integration was taken further by another Republican (Bush II) via CAFTA and the Security and Prosperity Partnership (SPP), and is being taken further still, behind closed doors, via the Trans-Pacific Partnership (TPP), by another Democrat (Obama). None of this is reported on the 6 pm news; ergo, it isn’t real (or doesn’t matter).
This, of course, is the Real Matrix: a product of corporate media in many areas.
Finally, take the present-day policy of what amounts to open borders. For decades now, anywhere from hundreds of thousands to millions of people have crossed the U.S. Southern border each year illegally, to live and often work illegally in the U.S., at least until the country grants them amnesty, as happened once on Reagan's watch. Democrats want the votes of this burgeoning minority; Republicans (or their big business supporters) want the cheap labor. Both mainstream parties look the other way until a crisis develops that forces their attention.
Real education—involving history, philosophy, theology, literature, geography, foreign languages, and a study of actual cultures instead of politically correct “multiculturalism” that depicts all cultures as equals—might help a person explain why an advanced civilization that hopes to survive cannot allow everyone in who wants in, and why an absence of border security is culturally suicidal.
I'll conclude with the question that "bakes people's noodles." At the very least, I've yet to get an intelligent response to it from anyone who defends a mainstream perspective on these matters.
The attacks of September 11, 2001, gave us the War on Terror—a phrase that entered the lexicon before the end of that week. Leaving aside the fact that the U.S. government had declared war on a tactic, not a real enemy as such, how does a government wage a War on Terror with its Southern border wide open and admitting millions of people illegally?
Have Americans become so unable to think that they are unable to see that these two simply don’t fit together? Do America’s masses really believe the only people crossing the U.S. border are Latin Americans? Is this what their so-called “leaders” want them to think?
To be sure, that one event changed the mood of the country irreversibly and, arguably, the configuration of government from an incipient to an actual military-security police state. My main point here, to be developed in the next installment, is that the collapsed educational system prepared a gullible public for a configuration of government able to destroy what was left of their Constitutionally sanctioned freedoms amidst a climate of fear, and in which those critical of government’s rapid encroachments into every aspect of their lives could be labeled “domestic terrorists” or “extremists.”
In sum: the "culture war" accomplished two important overlapping objectives in the Real Matrix. First, those plugged in would not see the corporatizing of the universities in the Desert of the Real, destroying the capacity of those who might produce critiques of power while also entangling the next generation in a web of indebtedness. Second, they would not see how mantras about "inclusion" and "diversity" were dividing groups against one another and diverting attention from what the super-elite was doing. Those few who talk about real power are marginalized with expressions like "conspiracy theorist" and "tinfoil hat."
Should you attend a university as universities exist now? Higher education is clearly in a bubble. A bubble is an overvalued asset. Should unpaid student loan debt continue to climb, one day the federal government will pull the plug. A lot of students—and the “adjunct” faculty teaching them—will be out in the cold, a reason I cannot recommend anyone pursue a teaching career in the present environment. Would-be students now have better options online. Some are free (Khan Academy) or quite inexpensive compared to attending classes on a campus (Udemy.com and Udacity.com are two examples). These allow you to acquire good information in less time, on your own time, for less money. (These should not be confused with online for-profit entities like the University of Phoenix which are as expensive as standard universities and arguably provide even less. There are now many such entities. Some are probably scams, so be warned!)
I end by suggesting that the only thing sustaining the higher education bubble is support by the corporatist Matrix and the blind faith of employers. A few corporations (Google is an example) no longer have a set requirement that prospective employees must present strings of university credentials. If this catches on, or if the higher education bubble pops, a lot of campuses will be forced to close their doors. That will be their Desert of the Real.
Next: the Official-Narratives Matrix
© 2014 Steven Yates – All Rights Reserved
Steven Yates has a doctorate in philosophy and currently lives in Santiago, Chile. He is the author of Four Cardinal Errors: Reasons for the Decline of the American Republic (Brush Fire Press International, 2011). He also owns an editing business, Final Draft Editing Service.