Yeah, he is really quite good at this. (I wonder if it came naturally to him, or if it's just a matter of practicing very regularly for like 20 years?)
I wish he would give Elon Musk some mentoring on how to communicate with a technical audience. Or just a regular audience, I guess.
My hypothesis is that (good) game developers know how to talk to people (or at least how to present things to people), because part of game design is thinking about how to manipulate people's emotions into wanting to (or even being eager to!) face a challenge you've set out for them.
It's similar but not identical to the reasoning for actors and directors to be good entertainers and interviewers. Those folks constantly think about manipulating people's emotions in general, but game design is completely focussed on just manipulating people into excitement or flow states — and that just so happens to be the emotion you (usually) want people to have in reaction to a presentation: the feeling of "I'm going to go out—and buy their thing—and change the world."
I like that idea. It's also true that a field like game development, which is both collaborative and interdisciplinary, pretty much requires excellent communication skills. Being a game designer is less about having amazing "ideas" or "vision" (everyone has those) and more about your ability to align everyone's ideas and vision in the same direction.
Personally, I prefer to describe "thinking about manipulating people's emotions" as _empathy_, but that's just me.
> Personally, I prefer to describe "thinking about manipulating people's emotions" as _empathy_, but that's just me.
Empathy is the core skill, yes, but there's a sort of... I almost want to say an instinctual disgust? that people also have to overcome, when they want to turn empathy around to use it to change someone else's mind, rather than just using it to predict someone else's mind. You have to become at least a little bit of a sociopath, is maybe the problem.
See, other people have different foundational beliefs—different axioms they're working from. To use "empathy aikido" on them—to come up with arguments that will convince them of something, not slowly and laboriously from logical first principles, but by building up quickly from what they already assume to be true—you have to be willing to make arguments that are true under their axioms, but not under yours. That is, you have to be willing to use arguments that you think are false, just because the person you're trying to convince will believe them.
It feels weirdly like lying; like you're a politician swaying the populace with empty rhetoric. But you're not saying things that nobody would believe (if given long enough to think about them); you're instead just getting into the head of—empathizing with—the person who holds those axioms, and then saying things that you—as that person—really do believe.
This is why, I think, there's a big divide between people who like or hate the idea of "salesmanship": some people fundamentally see it as lying, while other people fundamentally see it as empathizing.
Personally, I think it can end up either way—some people "sell" an idea while holding back a bunch of facts that, under their axioms, are total deal-breakers. Others, though, "build a bridge" between their interlocutor's world-model and their own, using their arguments to help the other person build a world-model enough like their own that they can then present the facts that they believe to the listener, and the listener can understand them through the "consensus schema†" they built.
People who are said to have "reality distortion fields", I'm guessing, are just good at making the kinds of points that build a consensus schema, against which they can then state plain "facts" that will seem—within the consensus schema—obvious, rather than having to convince you of each fact through argument. Despite the gnawing feeling that accepting that consensus schema into your brain is sort of an indoctrination into a cult, it's really the less ethically questionable of the two options, in my mind: the speaker doesn't have to say anything they don't actually believe (other than the arguments required to build the consensus schema).
> That is, you have to be willing to use arguments that you think are false, just because the person you're trying to convince will believe them.
I think genuinely arguing over whether something is true or false is rarer than it seems. More often, I think we find ourselves arguing over which things are important, or how we should feel about something. In which case, I think you can put yourself in someone else's shoes and still be perfectly sincere.
Thank you for this comment - that's a really striking way of looking at these things. Makes me wonder just how many human interactions involve being that little bit of a sociopath. E.g., "putting your best foot forward" for a job interview or first date can feel like a kind of dishonesty, although it's expected in those cases. I wonder how you could start drawing a definitive line between "good" empathic manipulation and "bad" sociopathic manipulation, when even just smiling at someone can be manipulation of a sort?
> "That is, you have to be willing to use arguments that you think are false, just because the person you're trying to convince will believe them.
> It feels weirdly like lying; like you're a politician swaying the populace with empty rhetoric. But you're not saying things that nobody would believe (if given long enough to think about them); you're instead just getting into the head of—empathizing with—the person who holds those axioms, and then saying things that you—as that person—really do believe." -derefr
Response to derefr: If you are willing to use an argument that you think is false then, by definition, you are lying [even in situations where the lie can be mistaken for truth by others].
and
> "E.g., 'putting your best foot forward' for a job interview or first date can feel like a kind of dishonesty, although it's expected in those cases. I wonder how you could start drawing a definitive line between "good" empathic manipulation and "bad" sociopathic manipulation, when even just smiling at someone can be manipulation of a sort?" -zazen
Response to zazen: If it feels like dishonesty then it most definitely is. Consider cases where the interviewee simply lacks confidence but puts on a facade to appear otherwise. As for drawing a line between "good" and "bad" manipulation, reference derefr's note:
> "To use "empathy aikido" on them—to come up with arguments that will convince them of something, not slowly and laboriously from logical first principles, but by building up quickly from what they already assume to be true—you have to be willing to make arguments that are true under their axioms, but not under yours. That is, you have to be willing to use arguments that you think are false, just because the person you're trying to convince will believe them." - derefr
The addition of "but not under yours" constitutes manipulation on grounds which are aside from truth.
I think you're misunderstanding what I mean by "axioms" here—a lot of these fundamental beliefs that inform which arguments you have to use with people to convince them, are "orthogonal to truth"—that is, things that aren't part of a causal graph, like normative or theological beliefs.
It takes a different thought process to convince someone that e.g. climate change is happening, if they're a Young Earth Creationist, than it does if they're a paleontologist. It takes different arguments to convince someone to donate money to charity if they're a deontologist than if they're a consequentialist. None of these axiomatic positions affect what (empirically discoverable) facts are true, per se; they just affect what facts are relevant to changing one's mind about what one should do—that is, these axioms influence how the "is" statements† a person hears will affect their confidence in various "ought" statements.
So, this kind of "empathy aikido" is less about modelling a person who believes different facts are true than it is about modelling a person who cares about the truth-value of different facts than you do. It's not that you might have to believe [X thing you believe is true] to be false; it's that you will have to pretend that [X thing you believe being true] is not a compelling argument, and might be so unimportant that you've never even thought about it and never will. Whereas the truth-value of [Y thing you don't care about] might be, to the person you're talking to, the most important thing in the world; the "trick" is figuring this out and then using a (true) argument about Y to convince them, despite thinking personally that only X, and not Y, holds any real sway over the truth-value of your conclusion C.
It can still feel bad, but I hope you can see how that intuition is less grounded here in any real injustice you're doing. Telling a virtue-ethicist that it is "noble" to e.g. donate to the Against Malaria Foundation, when you think that "nobility" is complete poppycock and the only thing that matters is that those donations mean people won't die, isn't an example of a lie. It is a manipulation, but not an illegitimate one—because, from the other person's perspective, it's just the honest truth.
Eh, judging from the Masters of Doom book, I wouldn't take his words too seriously, considering the tension that existed between the two at the time and how Carmack basically burned himself out during Quake's development. Romero wasn't the only one who left id at the time.
It's a matter of preference, I think. I was only able to play UT growing up, thanks to its software renderer, and finally played Q3 (Quake Live) in college.
Q3 is more minimalist, which makes competitive play more interesting. Q3 has also maintained a competitive scene to this day, whereas UT's seems to have fizzled out.
It has great game mechanics, but in all other manners of design it feels like a hodgepodge of unrelated assets thrown together. Quake 1 had a similar problem early in development, but they managed to tie it together. Quake 3 didn't even bother with a true singleplayer campaign, which seems like a winning move in hindsight, but at the time it was a departure from the norm and I can't help but feel like it was a move they made out of necessity.
While surely a marvel of software engineering and level design, it had nothing in common with Quake I and II. It is not my intent to be critical here, but Q3A felt like a circus, especially compared with other multiplayer shooters such as the original Unreal Tournament, which had a wonderfully mysterious atmosphere to it.
The linked video is still a great talk. Carmack comes off as someone who could talk about anything for a literal fortnight, so the 'mmm' may have been a tic where he was forcing himself to shut up and let the interview continue.
From what I gather she is; my only point is that when you're in a dense field, your partner generally isn't too enthusiastic when you get into the minutiae you've spent 1,000+ hours on that year, so when you get the chance to really talk about it, you gush.
Granted I've watched every QuakeCon keynote that was posted and enjoyed them all (along with reading every .plan). His enthusiasm is infectious. Yet even then he is breaking it down to the big things that happened that year, not the day-to-day stuff.
If it were over the dinner table every night, I'd probably lose my mind.
Carmack really should have quit Oculus when Facebook bought it and gone on to work for Musk at SpaceX. Both are workaholics, too, so they'd probably get along quite well. Plus, Carmack is a big fan of space rockets.
Perhaps Musk needs to go to him and tell him, Steve Jobs-style:
"Do you want to work for a company that invades people's privacy for the rest of your life, or do you want to come with me and change the world?"
> I wish he would give Elon Musk some mentoring on how to communicate with a technical audience. Or just a regular audience, I guess.
Musk is an effective speaker, in my opinion. To be effective, you need not be eloquent, engaging, or charming. Musk has his foibles and stumbles around a lot, but that is part of who he is. I appreciate it.
He didn’t strike me as very nervous, but if he was, he had good reasons: he announced that they would halt production of Falcon rockets after building a big enough batch of them and focus completely on building spaceships. In other words, he announced plans to endanger the only big revenue stream SpaceX has in favor of an unproven technology, in the hope of solving all its problems and starting production before the last Falcon rockets cease to be reusable.
> Musk isn't a very good speaker and he seemed very nervous ...
Sort of.
Looking back, he sounded much more confident when he was younger [0], perhaps because he hadn't yet experienced life-altering events like the loss of a child, or coming close to bankruptcy at the end of 2008 after sinking his net worth into SpaceX and Tesla.
I think his seeming lack of confidence when he speaks publicly today is a latent acknowledgement that the stakes are much higher relative to his PayPal days when the Internet was incipient, and that failure is still a possibility, regardless of the amount of resources he may have sunk into his investments.