I believe the fact that some people perceive it that way is, perhaps, the root of quite a bit of the consternation on the issue.
But I see it as the same as forcing people to stay in an unpleasant class until they understand type theory. We teach people things they need to know so that they know them, because knowledge is useful. We're certainly not calling the practice of holding education seminars into question, are we?
But the 'knowledge is useful' stance presupposes that these classes have value, that what they're teaching is actually true or useful. That's precisely what many of us don't believe.
Suppose we were discussing a risk management seminar which you had reason to believe was teaching a flawed methodology, one likely to cause serious harm if widely adopted. Would you be similarly sanguine in that case?
With respect, I'll defer to the career sociologists and psychologists on the topic, much as I'd defer to the risk management PhD on the topic of risk management.
I've known too many tech-savvy folk who think humans can be reduced and deconstructed like a computer, declare humanities research bunk because it doesn't fit their mental model, and get profoundly surprised when their mental model doesn't fit actual human behavior. The psychologists and sociologists have a better track record on modeling and predicting human behavior.
> With respect, I'll defer to the career sociologists and psychologists on the topic, much as I'd defer to the risk management PhD on the topic of risk management.
This isn't a good heuristic, without some way of judging the entire field. Would you defer to expert career scientologists about E-meters? Would it help if they published papers on the topic and held conferences and you picked only the best-cited ones from that process? Why not?
I think the relevant question is: If they were completely wrong about how the world actually works, would this have had any negative consequences for them?
This would lead you to some degree of skepticism of risk management professionals, too. If the risks are small and frequent (like estimating how many car crashes per month) then you can trust that people & methods that are badly wrong will have been weeded out. But when the risks are of rare events (like, say, housing market crashes, or nuclear war) then professionals can pass an entire career without being tested against reality. So you should be wary that the most prominent voices are chosen for reasons other than actually predicting the risk.
And career sociologists, of course, aren't heavily rewarded for accurate predictions.
With respect, these careerists subscribe to journals that will literally publish translations of Mein Kampf so long as a few buzzwords are swapped in. On a less sensationalist but more quantifiable note, 70% of the research comprising the field can't be replicated. I think my scepticism is warranted.
That's a funny stance given the context in which the material was introduced: as evidence that powerful people are racist "by their own admission". Doesn't sound like the no-big-deal definition of racism they might have thought they were confessing to.
It also suggests it's up to session-organisers to decide what justifiably humiliates a person.
Most people (luckily) perceive racism to be a bad thing. Being forced to publicly confess to being something you should be ashamed of is the definition of humiliation.