"It would also seek to restore the honorable lineage of the university as one of the few arenas in modern society (another is the arts) in which prevailing ideologies can be submitted to some rigorous scrutiny."
Then the battle is lost, because universities, at least in the USA, are the home of "free speech areas" and not letting people speak who a group disagrees with. When people are talking about "safe areas" to protect students from words spoken by a speaker, then criticism is dead. Political parties have more diversity of allowed thought. STEM is often criticized as unwelcoming, but humanities has become a place of "agree with me" or be condemned.
If you didn't forge your ideas in the fire of criticism at university, then you were cheated by others or yourself. The best teachers will make you argue both / multiple sides of a scenario and be offended that you parroted their opinion back to them. The worst teachers only see one valid way to think and are doing "missionary work" instead of teaching.
> Then the battle is lost, because universities, at least in the USA, are the home of "free speech areas" and not letting people speak who a group disagrees with.

> STEM is often criticized as unwelcoming, but humanities has become a place of "agree with me" or be condemned.
It is a mistake to assume that this is a localized phenomenon limited to the United States. Lately, and as somewhat of a testament to how disturbingly uniform and inauthentic millennial culture has become everywhere, the same kind of intolerance for discordant views can be seen on campuses from Canada to the UK.
I'll leave you with an excellent write-up on this brand of groupthink and the shutting down of opposing views on campuses, by Brendan O'Neill in The Spectator:
> Have you met the Stepford students? They’re everywhere. On campuses across the land. Sitting stony-eyed in lecture halls or surreptitiously policing beer-fuelled banter in the uni bar. They look like students, dress like students, smell like students. But their student brains have been replaced by brains bereft of critical faculties and programmed to conform. To the untrained eye, they seem like your average book-devouring, ideas-discussing, H&M-adorned youth, but anyone who’s spent more than five minutes in their company will know that these students are far more interested in shutting debate down than opening it up. [1]
[1] Free speech is so last century. Today’s students want the ‘right to be comfortable’
Wait. Book-devouring, ideas-discussing students? Where the hell are these creatures, and where were they when I attended undergrad!? All I've ever seen was job-seeking, exam-studying things.
Also, doesn't the constant talk of debate and free exchange of ideas rather presuppose a form of antirealism about everything under the sun? After all, we don't talk about a free exchange of ideas on the laws of motion. You can be accurate or incorrect about those.
> When people are talking about "safe areas" to protect students from words spoken by a speaker, then criticism is dead.
I understand what you're getting at, but some people really do have PTSD, and some speeches really do contain things like graphic recountings of abuse. Should those people just not attend university? (I'm actually undecided.)
> I understand what you're getting at, but some people really do have PTSD, and some speeches really do contain things like graphic recountings of abuse. Should those people just not attend university?
Perhaps don't attend that speech then? Life at a university doesn't involve dodging "pro rape" speeches blasted over the PA system multiple times a day.
The recent article about the "safe spaces" full of coloring books and toys notes that people were fleeing to those rooms simply because _someone was on campus giving a speech that challenged their beliefs / "triggered" them_. If universities are catering to folks that simply can't handle the thought of someone with a conflicting viewpoint talking on campus then the ideals of the university truly are dead.
My argument is that whether the idea of a "safe space" is warranted, and who should be allowed to make use of such a space, are different questions.
A "safe space" might have valid medical reasons to exist—in fact, it would likely be a concept quite easily unified with that of a counselling office.
But, for it to have the correct incentive structure, it would probably require you to get a note from a real honest-to-god psychiatrist at the beginning of the year before you could make use of it. Otherwise you end up in the stupid rut of the "ignorant-but-unaware-of-it" people avoiding the very properties of an education designed to bash their narrow views open.
> I understand what you're getting at, but some people really do have PTSD, and some speeches really do contain things like graphic recountings of abuse. Should those people just not attend university? (I'm actually undecided.)
Contrary to the collective opinion of certain Internet communities, I think it quite likely that reports of PTSD in the average person have been greatly exaggerated.
It's great to provide support in the rare case that a person really does have an unmanageable reaction to certain memories. But mitigating this edge case by making that support a default condition in every interaction on a campus stymies the free thinking and free expression that are supposed to flourish in a liberal learned environment.
I guess I am in the minority because I think that the concepts are good and useful in a limited scope, but that they're obviously being abused by a cynical group of people to shut up other people they disagree with. Like, you have people arguing with a straight face that Christina Hoff Sommers even being on their campus makes them feel unsafe. The entire thing has turned into a bizarre group-performativity ritual that has nothing to do with protecting PTSD sufferers. Trigger warnings are supposed to protect the most vulnerable people from being blindsided; safe spaces are supposed to be places where the emphasis is on support. They are not general, universally-applicable concepts.
It's because people have realized that if they say they feel unsafe, then the university takes them more seriously than if they said they were merely offended.
Universities exist for the benefit of society at large, not for the benefit of individuals.
If people are afraid or forbidden to speak their minds, then the university won't be able to function as a place to advance our understanding of the world.
Not being offended, and not entering a dissociative fugue state and shaking and screaming and muttering for 30 minutes and quite disrupting the speech in the process, are very different things.
PTSD is not Tumblr-fairy disease; it's the thing you get from being interned in a POW camp or locked in someone's sex dungeon for months/years. People do have a right to avoid reliving that sort of experience.
Which, like I said, might mean that they'd have to avoid going to university altogether. It's probably nearly impossible to have an environment that both allows for a free-flowing exchange of ideas, and yet protects people with this sort of permanent mental disability.
But making an allowance for rape survivors to leave the room when there's going to be a live screening of Salò isn't all that hard, is it?
Yes, they are very different things, which is why you are arguing against a strawman. You know perfectly well that letting someone voice his conservative political opinions is not going to trigger PTSD. Pretending to be offended in order to stop the debate is the left's version of invoking terrorism in ridiculous contexts.
The biggest irony, which you probably can't even see, is that your own comment is an example of the disturbing and vile speech you claim to hate ("sex dungeon", "POW camp", "Salò"). You didn't need to bring those up to make your point, but you did. Somehow disturbing and vile things like burning the flag and condoning drug abuse never seem to upset the left (and I could go on...), but talking about individual rights and responsibilities does. Funny how that works.
I... didn't claim to hate anything? I was making an argument that some people will get sick from some speech. It's a medical issue, and there are several solutions. I'm not sure you read my comment charitably.
Now, I'm not from the US, haven't been to a university for quite some years, and so am not personally experienced with what these "safe zones" are about in practice, the politics surrounding them, etc.—but I was trying to say that, if they had any justification at all as a theoretical construct, it was in giving people with PTSD a way to get away from "broadcast all over campus" things like graduation speeches, if and when a planned speech is found to have content that might trigger specific kinds of traumatic stress attacks.
Not because it might offend someone. Not with voluntary ability to leave for any reason one might like. Just a medical excuse from attendance, like a person with acrophobia would get from a field trip to the roof of a skyscraper.
And again, even then, it might be better for those people to just leave the university campus, or avoid coming in the first place.
If you cannot function in an environment without damaging yourself, then you need to deal with that first. If making accommodations for you is used to censor or affect the free speech of others then you are now the oppressor even if it not intentional on your part. Letting people use your condition to condemn or exile others is beyond wrong.
I once worked inside a social program, and the first lesson is to heal. Painfully, that often means scar tissue. If you have trauma from words, you need to deal with it, not hope the world will.
To clarify, I didn't mean to disagree with you about the severity or suffering of PTSD patients.
> PTSD is not Tumblr-fairy disease; it's the thing you get from being interned in a POW camp or locked in someone's sex dungeon for months/years. People do have a right to avoid reliving that sort of experience.
It doesn't even take something that severe. A friend of mine developed PTSD while working at a convenience store, after going through two robberies in two nights, both times by people he knew. My understanding is that PTSD is triggered by episodes of severe, debilitating stress.
PTSD sufferers deserve the freedom of not reliving those episodes, and there are treatments available to help them. They don't have to avoid going to university altogether; they don't even have to wait until they're better, thanks to the many online education options available these days.
> But making an allowance for rape survivors to leave the room when there's going to be a live screening of Salò isn't all that hard, is it?
Perhaps there was a miscommunication, but I agree with you on that. That's what I meant in my previous comment when I said 'if you don't like what someone is saying, walk away'.
> That's what I meant in my previous comment when I said 'if you don't like what someone is saying, walk away'.
Ah. See, I figured that with something like a graduation speech, 'just walking away' might be a bit fraught:
- First, you won't know what the content is until you're already there and hearing the things that could be PTSD-triggering, and by that point you might already be mentally "gone" and unable to escape; it would be preferable to catch such things before then, which is what all that "Trigger Warning: pictures of torture" stuff is about;
- And second, things like grad speeches are usually put on loudspeakers and carried along across the internal campus PA and video-channel to PA speakers and TVs in various places, etc. It's hard to get away from such a thing, even when warned that it will contain something that might trigger your condition, especially if your life is effectively isolated to the campus (you live in the dorms, etc.)
It was under those two requirements that I saw the idea of a "safe space" as being possibly useful: it gives such people a place to go where they know the content won't follow them.
Under any other conditions, it's a horrible idea. :)
Aren't we reaching a bit here with the graduation speech examples? Every graduation speech I've heard has been little more than encouraging platitudes. Hardly likely to trigger PTSD. When was the last time graphic torture or rape was recounted in a graduation speech?
I guess I am reaching. But like I said, I'm looking for a "steelman argument"[1] in favor of the idea of a "safe zone"—a worst-case one-off emergency scenario where it might be warranted to have one.
If we can agree that it's warranted in that case, then it becomes a matter of agreeing exactly when it becomes unwarranted—which is a much more fact-based debate to have than one where one side just shouts "Freedom of Speech!" and the other side just shouts "Freedom from Persecution!" and neither side believes the other's bullshit one iota.
(Also, I think this is the most I've ever been downvoted for speaking in the classical hacker-vulnerability-probing "but what if a tornado of Hitlers hit the school?" style on HN. I'm not even using any inflammatory language or anything, whereas all the replies to me are acting as if I was some social-justice type spouting off about "my right to not hear some white cis male talk about the time he went to Vegas" or somesuch. It's interesting; is this one of those "if you try to take an orthogonal position then you're supporting neither side, so both sides see you as supporting their opponent and hate you" effects?)
The trouble here is that you're arguing from the enviable position of having a solution without a clear problem, and are trying to find a problem to justify it.
The concept got flipped in the chain here -- what is actually happening on these campuses is the establishment of "free speech zones", where a person can say what they like, and the establishment of rules that, on the rest of the campus, punishment will occur for saying things that could trigger or make someone feel unsafe.
So, what's happening is that an entire campus is made to be this "safe" area.
Is the assertion that there are so many people on any given university campus who might have a breakdown that the entire campus must walk on eggshells? Because that is the side of the debate you're arguing from.
We see what you're trying to do--the problem is that the general "think of the one terrible case" has been tarnished rather heavily in the last decades by people talking about "well, what if it stopped 9/11", or "well, what if it stopped the OKC bombing" and so on and so forth.
The idea of "well, in this one specific extraordinary case it's totally reasonable" is kinda played out among anyone with half a brain, because it's been used to justify so much bullshit.
Also, the entire dichotomy of warranted/unwarranted is suspect, but that's a different rabbit hole altogether.
So you had people on the right who were shocked and triggered, screaming at the speaker and pulling the power on his microphone, and people on the left who demanded that hard truths must be spoken, so suck it up. But I wonder if today he could give this speech without giving trigger warnings for colonialism and war. I wonder if a university would even give him the freedom up front to give a speech without vetting the topic so it couldn't surprise anyone. Is this a price we're willing to pay?
I took a class on child abuse while at college. To say it had graphic recountings of abuse would be an understatement. It was one of the most extreme experiences I had during my time there.
On the first day, the professor told the class that we would be getting into some pretty extreme material, and if at any point someone needed to leave class or even skip it, to just let him know and schedule some one on one time with the professor. That seems to have worked as a good solution. Far better than banning that class or censoring material.
Evolutionary psychology should be careful about what it elevates to scientific "fact", because quite a lot of it as quoted by people seems to be just-so stories with no hard evidence.
The "interpretation of archeological artefacts" kind of science is a lot less durable than the "reproducible double-blind experiment" kind.
This is only mildly related to the article, but does anyone else have difficulty imagining what others' jobs are? For example, I'm just young enough to remember a time when I thought a programmer was someone who hid in a closet to slam at a keyboard for nine hours a day, but that notion was only corrected by entering the workforce. What about my other misconceptions? There's no way to learn about all of them firsthand. I figure plumbers just travel between job sites to hit pipes for nine hours a day, managers just scream into phones and reply to emails for nine hours a day, and literary theorists ... sit ... in their offices ... for nine hours ...
That's the problem; I'm not possessed of a sufficiently creative (or informed) mind to fill in the blanks here. Dangerously, due to that programming knowledge, I also have the incorrect notion that everyone else could be replaced by either a handful of code and some lightly trained workers, or wholesale by mechanization. Clearly this isn't the case (or the market is doing a very poor job of finding exploitable niches), so what gives? What do people _do_? Because I'm at a loss and need to educate myself.
So many people have this question early in their careers, it really seems like something is wrong. Perhaps we need a serious/mandatory/formal job-shadowing program for teens and college students.
This is a stunning book. 15 years ago, when I first came to this country from India, a classmate gave me a copy of this book. I guess he wanted me to understand America or Americans or whatever. It definitely affected me at some deep level. I still have my copy, and I read it every time something upsets me about my job and I get that "maybe I should just switch to something else..." feeling.
It's not "by Studs Terkel" because he didn't really write this book... I mean, it's his book, but he basically ran into all these chaps and let them do all the talking. And they talk and talk and talk. He prompts them from time to time, but mostly it's a book by these people. And their stories and lives are super compelling. I often used to wonder why people do dead-end jobs like parking cars in a building, or store clerk, etc. Here, those people tell you why. There is this hooker who tells you how she got into the business, how she turns tricks, dealing with cops, offering freebies to pimps; the whole thing is simultaneously amazing and sordid and just puts everything into perspective. It taught me that America is a hard country. It's not some first-world paradise. America is, like, seriously fucked up. I mean, all these people were here long before I got here, and they have friends & families & homes & connections, and yet their lives are so hard and messed up. It's not all whining & complaining, but I did get the impression things are very hard out here for the blue-collar guy. Of course, had Terkel gone to Sand Hill Road and bumped into a hundred VCs, that would be a very different book. It's a definite must-read, especially if you are an immigrant.
Like apprenticeships? Man, what an outdated idea. They did that in the Dark Ages, for goodness sake! /s
We have a strong push towards internships, it seems odd that students aren't taking advantage of this hands-on experience in a field that interests them. Of course, missing out on a paid job for the (most likely unpaid) internship is probably a factor.
Well, an apprenticeship is far more of a commitment than what is needed here. Rather, young people are looking for a survey of what specific careers really look like. That might be impossible, but job-shadowing might be the best answer. I shadowed a judge for a morning as a high schooler, and it was tremendously enlightening, but it could have been a lot better if he did it frequently and if I had a chance to shadow other professions for longer periods.
...but... that's what literary theorists literally do, isn't it? Read books and articles and write about them, and attend meetings. Having spent a reasonably long time in the University system (as a student, mind), I have a general idea of what everyone does and why.
On the other hand, I really don't understand what a lot of people in the tech industry do. I have a friend who works for some kind of 'data analysis' company. He makes $80K a year, and he tells me that on a bad day he has to do 2 hours of 'real' work: the rest of the time is spent on reddit or taking classes. It's gotten so bad (or good) for him that he is taking 3 evening classes at the university he graduated from while working full-time. He says he spends most of his work time doing assignments and readings anyway, so he's more prepared for classes than he ever was in college.
I understand why a game company might need 50+ developers working on the same game. I understand why Google might need all those engineers. I really don't understand what 50+ 'data engineers' at what is essentially an SEO company with 100+ employees do... Like you say, I am really confused: is the market REALLY that bad at finding exploitable niches?
And then there's another friend of mine, who says the most difficult part of his Google engineering job by far was getting it. Go figure!
They do exactly what you think they do - they help CRUD - create, read, update and delete information.
Combined with engineering, there's some very interesting ways to do all those things. For example statistics on such a big scale were very hard until the internet became so widespread.
The folks who brag and tell you they make great money and don't do shit all - it's just ego tripping. Really, I've seen it a thousand times - somebody of moderate intelligence and no desire or ability to do anything creative or interesting, will sweat his/her way through university and then gloat over what an awesome position in company X he/she now has.
Really, it can be frustrating if you are in financial difficulty but rest assured, those who do the real work and are savvy enough to not get taken advantage of, will come out on top. The fella bragging about not doing much is going to continue not doing much and best case scenario - he/she cruises through life with a nice salary, having done the bare minimum.
Any large corporation is filled with busy work (~80%), which you can get out of.
It's up to the employee to find their way to that 20% that's real work. If wanted it's definitely possible to find that 20% and be /truly/ worked to the max. But it doesn't just happen.
Maybe also a good question is "what should they do". A lot of jobs are rubbish. That opinion rubs people the wrong way for totally acceptable reasons, but only a hundred years ago, around 90% of people were farmers. Today it's 2%. I think it's fair to say that the cataclysmic effect that our human economy is having on the planet cannot last in its current configuration. Most people should be employed in building and maintaining the planetary biome... in other words, they should be: farmers, only a sort of high-tech, hi-fi, scientific mercenary warrior type of farmer. A plumber, for example - the world will always need those - ought to be primarily concerned with diverting waste as a source feed for soil, in a clean and efficient way: a hero. But most jobs exist under a 100-year-old paradigm which will change rather quickly. You might say they are "soon to be farmers" - that's the best case scenario. Others would have us all be mercenaries and go out in a blaze of glory. Not me.
I think you're ignoring the possibility of technical progress. Who says we will even need food in 100 years? Eating dying plants and dead animal bodies is, on the whole, a very, very inefficient way of getting energy from the sun.
Reminds me of this talk https://youtu.be/21j_OCNLuYg and also happens to be my current mid-range goal. I've worked on a farm, gardened, and raised chickens growing up, and I'm dead set on doing it again. The amount of work that goes into it vs. the amount of food and enjoyment that you get out of it makes it a no-brainer if you can take the leap and buy a plot. Construction begins this summer and I couldn't be happier.
Lawyers are probably a big one. Most people think they spend all day in the courtroom, but many lawyers never set foot in a courtroom. Instead, their days are spent meeting with clients and preparing documents: their job is to translate the plain-English requirements of what the clients want to agree upon into legalese that has specific meaning in court.
For most lawyers out drafting contracts, we're still in the days of mainframes. We write code by hand which will probably never be run on a real judge, but we hope we haven't made any bugs.
Except that very little of the code (e.g. contracts) that gets written ever runs, and the people who debug the code (litigators) aren't the same people who wrote it in the first place. Also, the people who wanted the code written in the first place have never actually read it and are convinced it says something different to what it actually does say...
It would be an interesting project to create an artificial court/judge into which you feed a contract and an argument and get a response out.
My last startup did something like that - we interviewed close to a hundred people to figure out what they really did in their jobs. It was targeted towards undergrads, as a form of career guidance, but it turns out to be very hard to get people to pay attention to career guidance that doesn't directly lead to getting a job. We also tried refocusing toward older young professionals (late-20s), but that had the same problem: it's much easier to sell a service that's task-focused than one that's discovery-focused.
My sense is that there's probably not a viable business there but it may make an interesting hobby blog-series, if you can get people to participate.
What is it with humanities people that makes them believe that only they, of all people, are able to do critical thinking?
Anyway, the author is part of the problem. Just at the beginning of the article he states that humanities are only good for rich students to pass their time. Until the professors themselves stop thinking this way, no government will prioritize them.
(And no, I don't agree that humanities are useless. They have a huge potential. But for them to be of any use, professors will need to seek those applications, and study them. Locking themselves in a room, nostalgically talking with like-minded people without ever doing anything leads nowhere.)
At the risk of being overly confrontational, the things that make humanities people believe that they are capable of critical thinking (in contrast to STEM types) are comments like yours.
> Just at the beginning of the article he states that humanities are only good for rich students to pass their time
Where in the article does the author state or imply this? The section beginning "When I first came to Oxford 30 years earlier..." is obviously written to be tongue-in-cheek.
A close reading (humanities skill!) of this article might highlight the following passage as the central thesis:
Universities, which in Britain have an 800-year history, have traditionally been derided as ivory towers, and there was always some truth in the accusation. Yet the distance they established between themselves and society at large could prove enabling as well as disabling, allowing them to reflect on the values, goals, and interests of a social order too frenetically bound up in its own short-term practical pursuits to be capable of much self-criticism. Across the globe, that critical distance is now being diminished almost to nothing, as the institutions that produced Erasmus and John Milton, Einstein and Monty Python, capitulate to the hard-faced priorities of global capitalism.
> Locking themselves in a room, nostalgically talking with like-minded people without ever doing anything leads nowhere.
Again, where in the article does the author suggest this course of action? If anything, he seems to be suggesting the opposite, that the humanities should be returned to their traditional place in the public square, questioning and criticizing the prevailing ideologies of the day:
It would also seek to restore the honorable lineage of the university as one of the few arenas in modern society (another is the arts) in which prevailing ideologies can be submitted to some rigorous scrutiny. What if the value of the humanities lies not in the way they conform to such dominant notions, but in the fact that they don't?
> Universities, which in Britain have an 800-year history, have traditionally been derided as ivory towers, and there was always some truth in the accusation. (...) Across the globe, that critical distance is now being diminished almost to nothing, as the institutions that produced Erasmus and John Milton, Einstein and Monty Python, capitulate to the hard-faced priorities of global capitalism.
> His views were controversial, notably with John Underhill, Rector of Lincoln College and subsequently bishop of Oxford, and George Abbot, who later became Archbishop of Canterbury. Abbot mocked Bruno for supporting "the opinion of Copernicus that the earth did go round, and the heavens did stand still; whereas in truth it was his own head which rather did run round, and his brains did not stand still",[23] and reports accusations that Bruno plagiarized Ficino's work.
I've recently read a book by the French historian Jacques Le Goff who was saying something along the lines of: "the University as we know it was born around the year 1000 and flourished for the next couple of centuries, only to become more segregated and focused on itself starting with the 14th century, so much so that the Renaissance happened outside of its sphere of influence".
So I will keep my hopes up. This downfall of the University as an institution has happened before but we still managed to create and produce great things outside of it.
What I do know (after all, I'm a subscriber on /r/syriancivilwar) is that when you're a professor at a place like Oxford, you supposedly don't actively fight against science.
> If anything, he seems to be suggesting the opposite, that the humanities should be returned to their traditional place in the public square, questioning and criticizing the prevailing ideologies of the day
Yes, he wants to be nostalgically talking with like-minded people in an open space. That's my fault; I misread it. A bit of self-criticism on his part would certainly lead to the conclusion that this kind of study never leads to any actionable result.
Anyway, there's no problem inherent in that. The only problem he faces is that he wants to do those things on the taxpayer's money, and wants to sell it to students as some incredibly useful skill. People are no longer willing to accept that.
>Again, where in the article does the author suggest this course of action? If anything, he seems to be suggesting the opposite, that the humanities should be returned to their traditional place in the public square, questioning and criticizing the prevailing ideologies of the day:
Ok. Questioning and criticizing on which grounds? Either there's a right thing to do, and anyone informed can point out that the public is ignoring it (in which case, universities are useful for such), or, as the humanities academics tend to say these days, there is no right answer, in which case they really have given up grounds for criticism.
What I'm bloody fed up with is their claiming the sole ground to make normative claims, and then asserting no such claims at all!
"...What's it with humanities people that make them believe that only them, of all people, are able to do critical thinking?..."
The same thing that is with everyone else. If you look around our societies today it seems almost a matter of course that the first argument deployed in any debate is a dismissal of the opposing view holder's ability to think critically. Not to put too fine a point on it... but you can see that argument employed a lot here on HN. It's not really a matter of STEM people being dismissive of others' critical thinking skills - or even Humanities people being dismissive of others' critical thinking skills - as much as it is a matter of ALL people being dismissive of others.
It's, kind of, the nature of discourse in our society these days.
I disagree. People who practice critical thinking, as at least some humanities courses do, become better at it. I think the idea that things are equally good or bad everywhere in large part stems from a lack of practice in critical thinking.
I also think people are confused about what critical thinking means. It's much more about how you discuss something than about what you are discussing or what position you take.
Reasoning in "humanities" needs to be internally consistent, i.e. to sound good. Reasoning in "sciences" needs to be externally consistent, i.e. consistent with the "observable reality" of the "natural world."
Of course, "humanities people" are in the business of programming human beings, which is pretty useful for getting things done.
> What's it with humanities people that make them believe that only them, of all people, are able to do critical thinking?
I suspect it might be ego, after a fashion. Having observed that those educated in other fields have more marketable skills, a need to claim one's own education has produced a uniquely valuable skill arises.
Yeah, this is my impression. People in the humanities spend a lot of time fielding snide comments and "do you want fries with that" jokes. Even though their critical thinking skills aren't any better than other college grads, this is at least something that sounds like a plausible benefit to having spent 4+ years at a university.
Part of the problem is the marketable skill that the humanities used to impart, that of the ability to write well, just isn't in as much demand any more.
They certainly aren't the only ones capable of critical thinking, but they are supposedly the ones who received a 4-year education in critical thinking, while other people were spending time learning about physics or what have you.
Yet they largely have a reputation as incapable of critical thought. When an archeology professor talks about critical thought, they mean an argument capable of convincing people. Technical people may view this as sloppy thinking, since a transistor does not care what you think, but it's a very valuable skill. It's not just useful for convincing others, but also for noticing when you're being manipulated.
PS: With this mindset listening to a news broadcast can become almost painful as you see layers of deception. Study a little marketing and suddenly a lot of 'toys' seem more like 'junk'.
> It's not just useful for convincing others, but also for noticing when you're being manipulated.
In my experience (having started uni in the humanities), this way of thinking (which is how most people think) actually makes you more susceptible to manipulation, not less, because you're basically trained to think that a convincing argument and a valid argument are the same thing.
Which really is how most people perceive reality: from political debates to the workplace, most people remember how well you "handled yourself" in a disagreement, how charismatic, how well you "stood your ground", etc. Not how sound your logic was. And this is how most of the world operates.
Programmers spend all day pointing out flaws in each other's work; we simply do not allow each other to be wrong, and we correct each other at every turn. This takes time to get used to when you're starting out (or come from a non-tech background), and it would drive most people crazy and bruise their fragile egos (if this behavior has ever accidentally bled into your non-tech social life, you know it does drive most people crazy). But we do it because the damn thing has to work at the end of the day.
"Cargo cult programmers" are given a bad name, where in other professions you can make a career off of hopping onto the cargo cult of the day and parroting (or being) a charismatic talking head.
It's a "Sell me this pen" world, vs "Take this pen apart, tell me everything that's wrong with it, and build me a better pen".
It's not that there's no value in the humanities, but without technical training we end up with... well... the world as it is. There's a reason the tech community has been at the forefront of innovation for the past century, moving at light-speed while the rest of the "warm and fuzzy" world (from educational, to medicinal, to political institutions) is trying to catch up, like they're all still in the friggen 19th century communicating via carrier pigeons. And I don't mean only technical innovation: the tech community has been at the forefront of social innovation as well. Ideas like open data, open source, horizontal institutions, fast iteration and experimentation at the institutional level... nobody else does it. All other professions have their little cliques formed, where god forbid anyone question the Big Talking Head at the top or try to shake up things too much, or share too much information with anyone outside the in-group.
Thank you for this comment, it really put into words things that have been passing through my head constantly in the last couple of months, without me being able to put them down in writing as well as you did.
For example just the other day I (a programmer) was trying to explain to a friend of mine the concept of "Cargo cult programmers" and about how an initially good idea (agile programming) had been taken over and perverted by outside consultants and "professional managers". Said friend works in sales and I think she once mentioned to me the "sell me this pen" thingie.
And about people and institutions still living in the 19th century: I'm now reading a book on the 1830s Saint-Simonian movement (https://en.wikipedia.org/wiki/Saint-Simonianism) written by a darling of academia, the French philosopher Jacques Rancière. I don't know quite how to put it, but to me it seems like Rancière would have preferred for the world to have stopped right then and there, i.e. in the 19th century, when the bourgeois had just taken over from the aristocracy and could look down on the peasants and workers who stood below them. He (Rancière) is of course trying to be sympathetic to the workers he writes about in his book, but one feels that he's being sympathetic from outside their world, worse yet, from above it. More than that, there's this feeling of him mocking the workers' belief in a better future and improved social relations, especially when talking about Fourierism (https://en.wikipedia.org/wiki/Fourierism). It reminded me of people who not so long ago were still mocking open source.
And last but not least, and it pains me to say it, but I'm afraid that open data (the idea, the concept) is dead. Somehow the powers that be managed to kill it (I'm thinking of Google post-2007 or so, FB since its inception, Twitter in the last couple of years, just to name the biggest). It's strange how no one wants to bring this subject up for discussion anymore; it's like a foregone conclusion by now.
I agree that the humanities are not worthless, but I think I (and the author) would probably agree on what that utility is: the university is where people think delicate thoughts, and essentially lead the rest of us in how we think about a person, an era, a trend, a concept. The humanities are all about perception, and there is infinite newness to be gleaned from looking at every part of history, which is only very partially (in both senses) recorded. The humanities are, collectively, our imperfect attempt to define a consensus on what happened, what it meant, and why a new perspective might be important (or at least interesting) to modern people. (And since humanities people are happy to trigger off of other humanities people, we wouldn't even need any new events, history, or art to keep the machine going; it would just endlessly rehash its own interpretations.)
I wonder if this death of criticism is misty-eyed nostalgia for a future that either never, or only rarely, existed.
Let's not forget that our so-called great universities:
- For a long time excluded women, Jews and many minorities.
- Were the province of only the technocratic elite.
- Did very little research before the 20th century.
And now that the costs escalate out of control, is it any wonder that they have to go more commercial?
Some back-of-the-envelope math: if every student takes 10 classes a year (5 per semester), and every professor teaches 5 classes per year to 20 students each and gets paid 100K all-in, then the per-student faculty labor cost is 100K × 10 / (20 × 5) = 10K per year. That's not too bad for critical learning. Expand the classes to 40 students and you can cut that cost in half, or give the faculty a big raise.
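That arithmetic can be sketched as a tiny calculation (the salary, class sizes, and course loads are the illustrative assumptions from the comment above, not real data):

```python
# Per-student faculty labor cost under the assumed figures:
# $100K salary, 5 classes/year of 20 students, 10 classes per student.
def per_student_cost(salary=100_000, classes_per_prof=5,
                     students_per_class=20, classes_per_student=10):
    cost_per_seat = salary / (classes_per_prof * students_per_class)
    return cost_per_seat * classes_per_student

print(per_student_cost())                       # 10000.0
print(per_student_cost(students_per_class=40))  # 5000.0, i.e. half
```

Doubling class size halves the per-student cost, which is exactly the trade-off the comment describes.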
> I wonder if this death of criticism is misty-eyed nostalgia for a future that either never, or only rarely, existed.
Not entirely. Take a look at Louis Menand's The Marketplace of Ideas: Reform and Resistance in the American University (http://jakeseliger.com/2010/01/21/problems-in-the-academy-lo...). Universities grew enormously from 1945 – 1975. During that time, too, a disproportionately large number of students majored in English in particular, as well as other humanities disciplines. During that 30-year period, universities expanded and especially grad programs expanded—by 900%.
Most of those grad programs are still pumping out PhDs, even though the market for PhDs is weak. ABDs are a huge economic win for universities. So we have a situation in which supply has far outpaced demand for decades.
Shouldn't the overproduction of PhDs drive down university costs?
I hear you on the bogosity factor. I wonder why it persists. Perhaps because by the time someone realizes it's all BS, their career is too invested in it to expose it?
> Shouldn't the overproduction of PhDs drive down university costs?
Most cost increases appear to come from technology, ancillary services (gyms are popular punching bags in this regard) and administrative staff (everything from diversity deans to grant writers).
> I hear you on the bogosity factor. I wonder why it persists. Perhaps because by the time someone realizes it's all BS, their career is too invested in it to expose it?
I think this is precisely the issue: the system selects for people who become very, very invested in the system.
Another important change involves the elimination of mandatory retirement: http://www.slate.com/articles/life/silver_lining/2011/04/ple.... Until the early 1990s, tenure came with an expiration date around age 65, so someone who got tenure would be on the books for 25 to 35 years. Today tenure doesn't have an expiration date; someone can be on the books for 50 years or longer. That makes tenure much more onerous.
I'd like to see universities shift from a tenure-based system to a long-term-contract-based system. This has downsides but its downsides seem better than the current tenure-based downsides.
In theory technology costs should drive others down, no? But that's more true in the corporate world than non-profit or government which the universities resemble.
I agree on tenure costs. It seems like they're really bloating the system in the name of academic freedom. I get it at the large research universities where the professors bring in lots of research money, but is it really worth it elsewhere? People say it's a bulwark against poor administrators making short term decisions, but that's what the rest of the world has to deal with too.
Perhaps just timeboxing the tenure part would work. "The department has X tenured slots. After 65, you can stay on as an untenured professor emeritus at a lower salary."
You're assuming that teaching ability scales without scaling the workforce. Having an instructor provide for more students means each student gets less instructor-time. More instructor-time increases the learning ability of most students. And so class size is an important metric.
For most universities, their biggest product is their own marketing and promotional material. Now that they are corporatizing, they are painting hindsight with rose-colored glasses, as "Brave New 1984" as most corporations are wont to do.
I'm thinking that 'instructor-time' is a myth. A class of 20, say 3 lectures of an hour a week - that's just 9 minutes per student if they spent ALL the time interacting. But they don't; more like a minute per student tops.
No, instructor-time is moot. Maybe instructor training is important; maybe a quality syllabus helps; a good book is key. But the instructor/lecturer can lecture 10 or 100 equally well.
You're leaving out most of what the instructor does:
-Prepare lessons
-Answer student questions (during and) after class
-Grade homework
-Prepare, proctor, and grade examinations
And instructors rarely spend their entire instructional period interacting with students; usually, they have a lecture they deliver for the majority, and a small time at the end for questions - about the material, class errata, unrelated questions, or even just to bask in the glow of someone with expertise in the field.
I'd estimate that 1/4 to 1/2 of students will stay after for question time, or show up to office hours. Those few minutes are wildly different for a class of 10 vs a class of 100.
If you have a 50-minute class, and you choose to have a 45-minute lecture and 5 minutes of question time, you can spend 1 minute each with the 1/2 of your 10 students. With a class of 100, you can have a crowd of 50 where maybe 10% of them can get your attention.
Additionally, many lecturers have office hours - time specifically allotted to meet with students. If you kept 2 hours a week, you could give an additional 12 minutes to each of your 10 students a week. With a class of 100, you can hope that only 1/4 of your students show up.
Furthermore, in a lecture of 10, it's very easy to ask an incidental question and go onto a short educational tangent. In classes exceeding 25, it's very difficult to retain control of such a large body of people, and so the instructors work very hard to ignore such tangents and keep everyone on track.
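The attention arithmetic in the last few paragraphs can be sketched as follows (the time budgets and the "half the class seeks help" attendance fraction are the assumptions from these comments, not measurements):

```python
# Weekly one-on-one minutes available per help-seeking student:
# 5 minutes of post-lecture question time plus 120 minutes of office
# hours, split among the fraction of the class that actually shows up.
def minutes_per_student(class_size, question_min=5, office_min=120,
                        fraction_seeking_help=0.5):
    seekers = max(1, round(class_size * fraction_seeking_help))
    return (question_min + office_min) / seekers

print(minutes_per_student(10))   # 25.0 minutes each
print(minutes_per_student(100))  # 2.5 minutes each
```

A tenfold increase in class size cuts individual attention tenfold, which is the "wildly different" gap between a class of 10 and a class of 100.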
Not to mention the order of magnitude difference in grading 100 assignments vs grading 10.
I'm less-convinced that instructor-time is a myth. But maybe I'm biased by experience.
It depends on the instructor and the way the university is set up. I had classes as an undergrad which the professor had literally been teaching for decades - these guys hadn't spent time on a lesson plan since before the Beatles broke up. And they had grad students to grade papers and tests, plus teaching assistants the students were expected to see first before bothering the professor.
The test questions were just mixtures of questions from previous years.
So three hours of class, plus two (mandated) office hours. I'm assuming there was some time spent coordinating the grad students, but still... that doesn't seem like much work.
You had one professor who had a system worked out. How did the rest of your professors handle their workload? I too had a few professors that had it all figured out, a lot that were hard-working professionals, and a few that struggled to be on time to class.
> But the instructor/lecturer can lecture 10 or 100 equally well.
Depends on the class. I once had a colleague say that he'd teach the intro class to 5000 students in the basketball arena. For that class, at that university, it would probably work.
I'm teaching a senior-level econometrics class to 10 students right now. Everyone has a laptop in front of them, typing into RStudio throughout the class. I help students when they can't get things to run properly. We have plenty of other interaction. They each do unique homework sets and their own project. There's no way that class would work with 100 students, or for that matter, more than 15 students.
I thought roughly the same thing until this semester, my fourth in university. I took my first class with under 10 people, a combinatorics class, with an outstanding teacher. The class was demanding, and it is the first time I have attended office hours on a weekly basis. I learned more than I ever have in any single class in my academic career, and I can directly attribute that to the amount of interaction I had with my professor. Now, had he been less adept at explaining concepts, perhaps it wouldn't have made a difference. But having easy access to him directly impacted the amount I learned.
Indeed, as the author lamented about how much the deck is stacked against the humanities, I found myself wondering: are there actually fewer students today studying the humanities than at some point in the past?
Is the number of people with degrees in medieval literature actually decreasing, or just not increasing as much as the number of engineers?
Seems more likely that there are just a lot more students who can attend university now, and the sustainable way for this to happen is for most of them to be encouraged to go into particularly employable fields. This would fit with the observation that there's currently a huge oversupply of humanities Ph.D.'s with no academic positions waiting for them.
Flaws (lack of inclusiveness) don't mean it was no good. Unless you believe exclusiveness was core to the value proposition of a University, we can strive to make it inclusive without destroying the positive aspects.
We can still be nostalgic for the loss of things that weren't yet perfect.
Let's also not forget that what was so amazing about post-war education is that it went exactly opposite to the points you stated: it became subsidised, democratized, inclusive to lower socioeconomic classes in general and to minorities and women, and started to do more genuine research.
There's little misty-eyed nostalgia for 18th century universities, there is for the 1970s.
The issue for me isn't that 10k in tuition fees is so bad (once you add non-teaching or teaching-support staff, rent for facilities, administration, equipment, licensed library access, etc., it'll be at least 15k). It's the fact that studying is supposed to be a full-time occupation. We don't ask 12-year-olds to work a 40-hour week, but we do ask it of 20-year-olds, even though they have as many classes plus a heavier study load outside of class.
Now if we consider that studying ought to be a full-time occupation, there's obviously an opportunity cost to studying or lack thereof. It's not just having to pay 15k per year for tuition, it's also forgoing the opportunity to work a job and make money to pay every expense a non-studying adult human normally needs to pay, rent, food, insurance etc. Another $15k per year is on the low-end.
So you end up borrowing 30k a year over 4 years, and at an average interest rate of 5% you can easily end up with 150k of debt.
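The loan arithmetic is easy to check. A minimal sketch, assuming 30k is borrowed at the start of each academic year and interest keeps accruing for a couple of years after graduation (the deferment length is my assumption, not the commenter's):

```python
# Balance after borrowing `annual_draw` each year in school, with
# interest compounding annually during and after the degree.
def loan_balance(annual_draw=30_000, years_in_school=4,
                 rate=0.05, years_after=2):
    balance = 0.0
    for _ in range(years_in_school):
        balance = (balance + annual_draw) * (1 + rate)
    for _ in range(years_after):
        balance *= 1 + rate
    return balance

print(round(loan_balance()))  # 149685, close to the 150k figure
```

The 120k of principal alone grows past 135k by graduation, so reaching roughly 150k within a couple of years is plausible at 5%.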
This is unworkable. So we require students to work extra jobs alongside their studies, knowing that juggling a 70-hour work week for years on end, while living in crappy circumstances, negatively affects their studies and health. We push students toward the most financially rewarding jobs, knowing all too well that pursuing remuneration over everything else does not lead to happiness, and that at the higher end the correlation between salary and benefit to society is weak. We see students accessing higher education with family funds, something reserved for the (upper) middle class, and even for those kids it is normal not to be able to treat studying as a full-time occupation despite being enrolled in a full-time program; even so, average debt in many countries sits around 30-50k.
Meanwhile, a country like Germany is offering free tuition, without having invented a magic money machine.
Yes, the message is a bit ironic, coming from a country promulgating persistent class hierarchy. Just keep giving us a cut of an endowment and 16th century trust, and let the peasants get back to tilling the fields.
The "commoners" of today want to learn engineering and practical things so they can improve their standard of living, which is unfortunate for the overall percentage of educators in social sciences.
There is of course the change over the last few decades by which universities have become a third stage of standard education rather than a voluntary pursuit of possibly esoteric learning. This has been brought about by a number of factors, but has (I think) led to more education, which is a good thing. Of course, to offset the cost of 4 years of school and 4 "lost" years of productivity, students want degrees that will improve their odds of getting a job. That pretty much explains the shift towards professional training.
Bringing the universities to everyone also means broadening the offerings — originally when it was only the erudition-inclined or well-to-do, a university could get away with having a great deal of humanities and other fields that do not generate grants or jobs. It was learning for learning's sake, which few could afford.
I do think we're approaching an inflection point in the future at which some major universities will fight back against this trend. But because this will be expensive to them and their students, I don't think it will happen soon. We need a time of extraordinary prosperity in which money can be lavished on social services and education, and that's not today or the next ten years.
It's sad, but I'm hoping it's a transitional phase, not a final one.
There's something of a shift toward professional training, but there still seem to be a glut of students entering college without a clear idea of what they're getting out of it or what it will cost them. Pressured to attend by the social expectation that everyone should go to college, the now-foolish guidance many in my generation received of "do what you love," and the ready supply of student loans, students are still throwing themselves into the gaping maw of debt and degrees that are not useful to them in the workforce. (I am very lucky that, for me, "do what you love" ended up meaning the field of computer science).
What we need is to break out of this idea that everyone must attend university "or else you'll be a garbageman" or whatever they were telling us in school growing up.
> It is true that only about 5 percent of the British population attended university in my own student days, and there are those who claim that today, when that figure has risen to around 50 percent, such liberality of spirit is no longer affordable. Yet Germany, to name only one example, provides free education to its sizable student population.
... Yes, they have, and from what I've heard it consists substantially of very large seminar classes and an expectation of self-directed, self-motivated study from its students: hardly the paradigm the author has been mourning where faculty might expect that
> the undergraduate would simply drop round to their rooms when the spirit moved him for a glass of sherry and a civilized chat about Jane Austen or the function of the pancreas.
(Also available in Germany, just to note: immigration opportunities for international students, a premise the UK (and the US) have been shying away from.)
I think the crux of the problem lies in the sheer number of students who attend college today; as the author points out:
"It is true that only about 5 percent of the British population attended university in my own student days, ... [today] that figure has risen to around 50 percent..."
The reality is that teaching critical thinking doesn't scale nicely, because it requires an intimate dialogue; the process of rigorously critiquing ideas is a two-way street. In today's institutions, where professors lecture to classes of 200+ students, this simply isn't possible.
In the article, the author claims universities have abandoned their roles as centres of critical thinking due to capitalistic forces. While I think this is true, I also believe that our collective attitude towards university shares the blame. Unfortunately, college is seen as the only legitimate path to success after high school.
If students had more opportunities to explore their interests, instead of being funnelled into university, perhaps universities could re-establish themselves as institutions where critical discussion takes place.
Descriptions of certain ancient centers of learning (I am thinking of Nalanda[1]) seem to convey an atmosphere of open-to-all-and-sundry and free sharing of knowledge, ideas and interpretation amongst the entire community. We can at least look for that online.
Isn't this part of a larger, global trend of turning public institutions into private businesses?
It happened to the American prison system, and it's happening to healthcare, education, and higher education.
Yeah, just replace private with hierarchical. I don't think many techies grew up dreaming of being cogs in an industrial machine, but for whatever reason libertarianism/authoritarianism has taken a strong hold on many of their imaginations, and we're seeing a lot of these controversies framed in ways that limit possibilities.
College is wonderful. Everyone should get to go. Acceptance should be based on age or merit, be freely provided by government funding, and the alumni should remember how they got where they are and pay it forward. And to extend that, significantly more research needs to happen at universities so it’s not tainted by the profit motive. Also we should remember that the adage “those who can’t do, teach” gave us the gift of a system where a single person can change dozens or even hundreds of lives a year, and thousands over a lifetime.
Frankly, the older I get, the more these "let's all just forget the lessons of the past" arguments sound like a bunch of hooey. Yes, there are compelling innovations all over the place that will let people learn at a vastly accelerated pace outside of school. But public education was never just about learning. As much as I hated long periods of it, I can't really imagine a democracy functioning in its absence. And fixing it would be rather straightforward, if we let teachers and professors have more of a say in it and got the blasted politicians and financiers out of it.
Those South Koreans were probably carrying brand new Samsung phones under their jacket. Just as potent as a pair of Kalashnikovs, which their Northern brethren prefer ;)
In certain parts of the English-speaking world, universities are not dying; they're actually flourishing... but only because of a major influx of Chinese and Korean students willing to pay those exorbitant out-of-state fees.
And since most of those valuable foreign customers want to study business, finance, medicine, law, and a handful of STEM fields, universities have no choice but to cater to their demands. Some programs in the West Coast are half Chinese by now. Those kids probably pay 80% of the gross fees, too. On the other hand, when I took English or philosophy, I was often the only Asian in the class.
But China is growing very fast (slower than before, but still fast), and Korea has all but finished catching up with the rest of the developed world. Other countries might then supplant China as the largest supplier of international students, but they'll grow up too. Sooner or later, the international students who are propping up American universities will decide that they'd much rather spend their dollars elsewhere. When that last bubble bursts, even STEM fields will not be immune from a massive shock, and heavily subsidized humanities departments will be in real trouble this time.
The author would have benefitted from a bit of historical knowledge of the university as an institution. The English were making the same sort of complaints about Scottish universities in the 18th Century, and it's pretty clear from our perspective now where more of the insights emerged in that period.
While he may find it distasteful, the first university in Bologna appears to have formed around a core faculty who decided to start charging students for their lectures.
Somebody should offer a prize for the earliest citation of the expression "critical studies" as used here. It may not go back to Monty Python's time, and certainly not Erasmus's. (I imagine the award to be the right to look coolly at Terry Eagleton and say "kill him" in Korean, or the language of one's choice.)
I think that the humanities gave away a good deal of their own prestige by chasing a false notion that they could and should become scientific. Northrop Frye, whom Eagleton mentions, had some big grand ideas about schematizing things, I recall.
The question is: what's causing this? Is it income disparity and a lack of wealth distribution, making our universities tailor themselves more toward making money? Is it the information golden age we're in, making education more of a 4-year summer camp for people to have fun in instead of learning? Is it leftism and our need for increasing political correctness, and thus bureaucracy to enforce it? Is it a side effect of our new trend of sticking everyone into a Skinner box?
Good article by Eagleton but I do wonder very much if it would have made headlines on HN if the title was "the slow death of the university as a center of humane critique" and the HN readers knew of https://en.wikipedia.org/wiki/Terry_Eagleton#Literary_Theory
This was easily one of the wittiest and funniest articles I've ever read from a British author...
But why shouldn't vampires be more lauded than Victorians? Why should Jane Austen, with her painful circumlocutions, be more academically welcome than that woman (forget her name) who wrote 50 Shades of Grey? In many ways, old "classical" works are telling the exact same stories as modern "trash novel" works, except the modern "trash novel" works are doing it in a way that is clear, simple, relevant (to today's audience), and thus free of misunderstandings. From them, through clever literary mental contortions, one can still elucidate all the themes, lessons, and humanities you could from confounding classics, just less obfuscated: "there is no place like home" instead of "lost is my homecoming", "...and then they had sex and fell in love..." instead of "...I profane with my unworthiest hand this holy shrine, the gentle fine is this...", etc.
And who says the arts are dying? The arts are vibrant and alive in today's web-comics, video games, movies, and tv-shows. The medium has changed from a completely closed system of ink and paint to a modular, copyable, and distributable one of .mdl files, computer images, and carrier streams. It's just intentionally confusing junk like cubism, poorly drawn junk like medieval art, and inhumane junk like pyramid building that's gone away.
The pressure to have to constantly monetize, I'll admit, is painful... but that primarily hurts the large institutions who have bottom lines that must be covered. And in my opinion as a flexible small business kind of guy, that's a good thing. Large institutions were necessary for centuries for individual survival at the cost of individual self-actualization, but in today's flexible scale era, it's entirely possible to just be good at something and survive without having to give up your soul to a large corporation. In that case, going small, lean, and individual is the way of the bright future.
In asking the question of why modern things such as "things that are currently fashionable to today's 20 year old's" are less worthy for study than traditional subjects, these phrases were given:
>free of misunderstandings
>less obfuscated
>clever literary mental contortions
>intentionally confusing
I imagine you are seeking to understand the difference, and to understand why some things that may appear obfuscated and confusing to a modern person are thought by many to be better. I hope I can help with that understanding via this comment.
Pretty much all poetry, for example, is full of misunderstandings and obfuscation, with clever literary devices, and it is intentionally so. Consider poetry, then! Think about why many people value poetry over a clear, concise newspaper article. Why do humans like art? Why do people like these confusing things? Do they actually enjoy the confusion, or is it something else that they enjoy?
Your example is Jane Austen vs. Fifty Shades of Grey. Perhaps other comparisons would help. How about Dan Brown vs. Shakespeare? Beethoven vs. Bananarama? Turner vs. Bob Ross? Do these comparisons help in understanding what defines a quality piece of work? Do you think that a university whose literature department spends two years examining Dan Brown will continue doing so when Dan Brown is no longer popular and Game of Thrones is? What does it say about the educational and artistic value of an author that they are forgotten a few years later?
Should the subjects of universities be decided by the consumers, the young students? Or should they be decided by the academic establishment? What difference would that make to education, to critical thinking? This is some of what the article was about.
Couldn't agree more. For universities, it is the dictatorship of the accountants. Every penny spent must be checked by an almost infinite hierarchy of accountants and managers who have no experience of the core tasks of a university (teaching and research).
More precisely, I don't know whether the hierarchy is finite, or whether the checking process even terminates. Sometimes the penny-checking finishes, sometimes not.
Universities have been "dying" for hundreds of years. Please call me when they stop existing! Whenever there is a new technology (printing press, cinema, TV, computers, internet) there will be someone saying that universities are old-fashioned and useless. Of course, this never happens, because there is always a need for institutions of knowledge, in one way or another.
For example, people buying into the internet craze think that everything can be replaced by the Internet. Media companies (music, cinema, book publishing) were supposed to be dead at least a decade ago. In reality, it turns out they are bigger than before. What people forget is that the Internet doesn't create things by itself. Good movies will continue to be produced by specialized companies, good books will continue to be published by specialized publishers, and so on. Only the technology changes; human needs remain the same.
Similarly, universities will be just fine in a century or more. They will adapt to the new technologies and continue to produce knowledge as they have done before.
"the slow death of the university as a center of humane critique."
What? What the hell is humane critique?
"Universities, which in Britain have an 800-year history, have traditionally been derided as ivory towers, and there was always some truth in the accusation. Yet the distance they established between themselves and society at large could prove enabling as well as disabling, allowing them to reflect on the values, goals, and interests of a social order too frenetically bound up in its own short-term practical pursuits to be capable of much self-criticism."
Wow... You know what allows you to reflect? Having enough time while being an active part of society: time to reflect, activity to gain proper perspective. If all you're doing is reading books and talking, all you can reflect on is books and hearsay, combined with some intuition.
The last thing an English major can do is reflect on anything that requires understanding mathematics or statistics or... you know, the stuff that people who do science have to know?
Just because you've read Shakespeare and Dostoevsky doesn't mean you can critique anything that actually matters in a way that people who haven't read Dostoevsky can't.
There's scientific knowledge, and then there's entertainment such as fiction and television shows etc. Guess which category this guy belongs to?
The all-over-the-place, sloppy writing this guy produces is telling. I guess when you're surrounded by a bunch of folks who create nothing but words, you start thinking you know a thing or two beyond entertaining people with words.
Then the battle is lost, because universities, at least in the USA, are the home of "free speech zones" and of not letting people speak when a group disagrees with them. When people are talking about "safe spaces" to protect students from words spoken by a speaker, criticism is dead. Political parties have more diversity of allowed thought. STEM is often criticized as unwelcoming, but the humanities have become a place of "agree with me" or be condemned.
If you didn't forge your ideas in the fire of criticism at university, then you were cheated, by others or by yourself. The best teachers will make you argue both (or multiple) sides of a scenario and will be offended if you parrot their opinion back to them. The worst teachers see only one valid way to think and are doing "missionary work" instead of teaching.
It is often depressing to read https://www.thefire.org