Associated problem: salespeople seem to be gullible when buying - they get taken in by slick salespeople. Given their skills you would think they could spot someone pulling the wool over their eyes. I haven't yet worked out whether it's just admiration for a good snow job or whether they're falling for some status game.
Sales is a great example of how moralizing about tools rather than their uses creates incoherent beliefs, which in turn lead to dysfunctional policy. Most salespeople I've met genuinely believe they're not bullshitting people. Now, their profession selects for persuasion and for exerting frame control over social reality, and teaches both, but they use this skillset in a more "legitimate" context than the central examples that get the mean names. So the notion that they are using a skill for good purposes quickly morphs into the belief that it's in fact a different toolkit, because that's the only way we can coherently tell them they're legitimate while continuing to attack grifters not just for their actions but for their competencies and methods.
Possible that they are unable to distinguish between bullshit and reality? It would somewhat excuse their own bullshit: they may not know they are bullshitting.
I've often seen this phenomenon at play in religions. They are not intentionally lying even when they put forward outrageous claims; they genuinely believe what they say.
I once went to a conference held by a company selling MLM products (I didn't know they were MLM products, and I was young enough not to know about MLM at all). They were eating their own dogfood there, and lunch was free. It was at an expensive hotel in my country. There were loads of yuppies there. Everyone wore a suit.
Despite that, the conference felt as if I were at a cult. And the CEO knew I had a couple of questions about his products; he gave me the death gaze / cold stare during a speech. I was in my early 20s, scared shitless.
Now, remember I wrote the conference felt as if I were at a cult? I got invited to it via a brother of an aunt (on the in-law side). He used to be in a cult; now he was a hardcore Christian. He got very rich from these MLM products because he was high up in the chain. As they say: the apple doesn't fall far from the tree.
He's not a salesperson; he's a Harvard Med School professor. But he made a fraudulent $700 million sale (of his company), and escaped having it reversed because he managed to convince the court that he really believed his false claims and bogus research.
This is also the claim that SBF's lawyers are (still) making, now in the sentencing phase.
Elizabeth Holmes, and worse, some of Theranos's proponents, still believe to this day that the legal system "robbed" the world of the genius revolutionary change in medical labs that was "within reach" when Theranos cratered.
Well, can we really ascribe belief to the machine? And LLMs don't usually have a bias in their hallucinations beyond what you feed them; salespeople absolutely do have a bias.
Sure, just so long as you don't require beliefs to have qualia.
I'm fine with that myself; my beliefs only have qualia when I actively introspect on them, but I'd still call them "beliefs" even before I examine them for the first time.
> And LLMs don't usually have a bias in their hallucinations beyond what you feed them; salespeople absolutely do have a bias
The motivation may be different, even alien, but I'm not sure it matters. You can also tell an LLM to take on the role of a motivated salesperson, and it will role play that just as "happily" (if you'll excuse the anthropomorphisation) as any other.
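A minimal sketch of what I mean (assuming the official OpenAI Python client, v1.x; the model name and prompts are made-up placeholders, not a recommendation):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model illustrates the point
        messages=[
            # The system prompt injects the "motivation"; the model will
            # role-play a biased persona as readily as a neutral one.
            {"role": "system",
             "content": "You are a commission-driven salesperson. You want "
                        "this deal closed today and will spin every fact "
                        "in the product's favor."},
            {"role": "user",
             "content": "Be honest: is the premium tier worth the upgrade?"},
        ],
    )
    print(resp.choices[0].message.content)

The bias doesn't come from anything like a paycheck; it's just a line of text in the context window. Which is rather the point.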
Does that mean hallucinating LLMs will make great sales people?
Forget I asked. If you aren't careful about the places you inhabit on the internet, most of what you read is generated by an LLM. If we aren't already there, we must be close.
Sounds like cognitive dissonance. It's easy to believe that BS is bad, but hard to recognize BS in others without also recognizing it in yourself. When those beliefs are in conflict the easiest resolution is to stop recognizing BS.
As Upton Sinclair put it: it is difficult to get a man to understand something when his salary depends on his not understanding it.
One other hypothesis would be that salespeople really like it when a sale gets made, and don't differentiate whether they are a buyer or seller.
Although what does "gullible" even mean when buying? They are non-experts in most goods; it is hard to see why they would be sophisticated buyers. Being a sophisticated buyer surely involves an understanding of the market and the quality signals of a specific good.
> They are non-experts in most goods; it is hard to see why they would be sophisticated buyers. Being a sophisticated buyer surely involves an understanding of the market and the quality signals of a specific good.
You might think they are at least experts in (seeing through) sales tricks?
Also, they have some internalised sympathy for a good salesman and are forgiving. I think this tendency was alluded to several times in the series Mad Men, where the lead character Draper, who's an excellent salesman, talked to other salesmen.
That is value add. People want to know that the salesperson is on their wavelength.
The role of a good salesperson is to understand the customer's problems as best they can, then explain how the product will help resolve them. That is good for everyone - the business and the customer. It is helpful if they show the customer that they are trying to understand their emotional frame.
To make that clearer, imagine the opposite: if the salesperson purposely adopted body language that made the customer uncomfortable, the customer would be less likely to buy the product even if it could have been useful and cost-effective for them. Value would be destroyed.
It is value added if you believe you are selling good and useful stuff. Buying the item would add value for the customer, so tricks to convince a hesitant customer to take the right step are added value.
"People who are more gullible are more likely to try a bluff" makes more sense: they aren't aware enough to read the situation and react accordingly. That makes them more gullible as well as more likely to attempt a bluff.
I think it's more likely that people judge whether there is an opportunity to lie by asking whether they themselves would fall for it. So, morals aside, somebody gullible sees opportunities to lie all the time, while somebody very skeptical may think they would only embarrass themselves by trying to lie. Or they may lie unintentionally, giving an evasive reply and incorrectly believing it will be understood that they don't want to answer.
It’s hard to know but I doubt there’s that much reflection happening. Perhaps it’s as simple as “people with low emotional intelligence lie more often because they think they’re getting away with it more than they are. Also they are fairly insecure but don’t know how to manage it in a healthy way”
One of the most common assumptions people make when dealing with other people is that everyone else thinks like they do. So many problems with communication, instruction/teaching/learning could be better resolved if each person understood that different people process information in different ways and made sure to present information in a way that the receiver is capable of following.
This is about 1,000x easier said than done. For starters it's hard to present information even for an audience just like yourself. Then you need to learn additional ways of presenting information, know your audience, and since audiences often have more than one person try to make everybody happy.
Oh, absolutely. I've spent some time in education as a teacher, tutor, and other things. It's incredibly difficult. Especially with an audience of multiple people. But I'd wager just knowing that this might be a source of difficulty for some people can help. If things just aren't connecting, come at it from a different direction and/or method. But I agree, it's not easy.
I've been working with my first client who, I've come to recognize, is _very_ susceptible to believing exaggerated claims. This is someone who would enthusiastically dive into a cult if the right person told him it's nicer there, but instead he's in tech entrepreneurship, so AI is always three weeks away from allowing him to replace all his employees.
In the meantime, everyone in the design team keeps sending back content that he's requested from ChatGPT instead of an actual copywriter. Because everyone here, like everyone on HN, can easily tell the difference, but he can't/won't.
What seems to be very difficult with someone like this is, they participate in hype cycles because not doing so leaves them feeling like _everyone knows something_ and they don't. They don't want to be the idiot left behind when something good happens, so they dive into every enthusiastic movement or belief that comes in range. They don't have the ability or will to suss out that each wave is made up of people exactly like them.
There are enough people like this that, for better or worse, keeping up with these hype trains has become a solid investment strategy. Imagine getting in early enough on BTC, or DOGE, or AMC, or GME, or NVDA, or whatever comes next, instead of cynically scoffing at it (for valid reasons) on its fundamental merits. You'd be singing a different tune, I'm sure, after it explodes and you've cashed out, leaving the zealots on whatever subreddit holding the bag.
"Solid investment strategy" implies one could make money applying this idea consistently. There's a hundred new "next big thing" ideas every year - most of them amount to nothing.
My first impression is that your examples lean heavily on survivorship bias.
However, with deep enough pockets, you can indeed invest in a thousand hyped things, avoid the worst ones, and see if one or two of these sucker magnets turn out to be moonshots.
And suddenly we're describing traditional venture capitalism. To get rich, you just need to start by being rich!
I don't know, if you get out early enough and only invest a portion of your previous winnings into the next cycle, it only has to work long enough for you to build a decent nest egg. But I suppose the average cycle chaser doesn't think like that...
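A toy Monte Carlo sketch of that "only stake a fraction" idea (every number here is invented for illustration: the win odds, payoff multiple, and stake fraction are hypothetical, not market data):

    import random

    def chase_cycles(n_cycles=20, p_win=0.15, win_mult=10.0,
                     stake_frac=0.25, bankroll=10_000.0):
        # Each hype cycle: stake a fixed fraction of the bankroll, which
        # either multiplies win_mult-fold with probability p_win or goes
        # to zero. The unstaked remainder is the nest egg that sits out.
        for _ in range(n_cycles):
            stake = bankroll * stake_frac
            bankroll -= stake
            if random.random() < p_win:
                bankroll += stake * win_mult
        return bankroll

    runs = sorted(chase_cycles() for _ in range(100_000))
    print(f"median outcome: {runs[len(runs) // 2]:,.0f}")
    print(f"runs ending below 10% of start: "
          f"{sum(r < 1_000 for r in runs) / len(runs):.1%}")

Fractional staking means one cycle can't wipe you out, but whether the median run comes out ahead depends entirely on the made-up win odds, which is exactly the part the cycle chaser can't know in advance.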
You can replace greater-fool theory with any sort of investment strategy and your sentence is still perfectly valid. The key is just not to overleverage yourself to the point where one failed strategy burns you. Never invest money you can't afford to lose.
Exactly. And alongside the half dozen successes listed above, there could have been thousands of options available at the time that only look silly in retrospect.
Joining a cult is a lot of fun as long as you can keep your wits about you and leave whenever you choose to. I’ve been in two, always used a fake name and only spent a couple of years in each. You will never be able to match the level of community or feeling of purpose outside of a cult.
Even from within the first few seconds you can tell Feynman is already annoyed with the interviewer. It's possible that there is some history here: there may have already been a long run of questions and he's now become tired of it.
I _think_ I can very easily spot BS. That has its downsides... I get annoyed with BS very easily.
I know a few people who BS almost constantly. It's a puzzle to me what drives them, and why it seems like they don't even know they are BSing. I'm curious whether there's some psychological explanation.
I was thinking about giving examples to state my point, but in fact every somewhat sane person must have had those 'WTF are they saying' moments.
I think part of it is that BS is commonly quite effective. Good BSers are often good at getting information out of people and feeding it back to them as a kind of echo chamber. Most people will not pose adversarial information (that is, say something purposely wrong) to see if the BSer agrees with it and parrots it back. The BSer confirms the other person's worldview while at the same time spouting whatever crap it is they are selling. It's a very common tactic among grifters.
One example I can easily recall is some guy BSing about his arena rank in WoW. I wasn't good, but I asked enough questions to determine he was absolutely BSing, to the point of laughter.
Like the kind of kid that says he has a GF that lives in another city or state or even Canada lol
Yes! This is what I use in technical interviews. I generally ask candidates to describe, in depth and from an architecture perspective, systems they have worked on. This very rapidly separates the BS hand-wavers from those who know how their systems work. As a side effect, it can also serve as a gauge of how mature or junior they may be.
I wrote a program from scratch, tweaked it throughout my PhD, used it to get results for my thesis. I am quite confident in the robustness and accuracy of the program.
I couldn't explain my code to an interviewer (partially because it wasn't production-ready code, but mostly because I don't interview well). I'm glad I didn't get the job. Moved on to work with people who know I'm not a hand waver.
TBH, I've found there are people who can give great-sounding answers that don't correspond all that well to reality.
There are also people who struggle to answer, because reality is more complex than what they can easily convey to someone who doesn't already know the domain.
Hey if you want a mock interview in some as-real-as-possible scenario, find my post on whoishiring and email the careers@ email with your cv, I’ll make sure to get you an interview and get feedback to you.
Feynman did not bullshit the answer; the interviewer did not have the capacity to understand the phenomenon any more deeply, and Feynman gave all the valid reasons why. If instead he had jumped to writing down a Hamiltonian and constructing a partition function from which the magnetization of a magnet could be derived, it would have buried the interviewer in obscurity.
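For reference, the textbook route he skipped looks roughly like this for N non-interacting spins in a field B (generic statistical mechanics, not a claim about what Feynman had in mind):

    H = -\mu B \sum_{i=1}^{N} s_i, \qquad s_i = \pm 1
    Z = \sum_{\{s_i\}} e^{-\beta H} = \left[ 2\cosh(\beta \mu B) \right]^N
    M = \frac{1}{\beta} \frac{\partial \ln Z}{\partial B} = N \mu \tanh(\beta \mu B)

Three lines that are completely opaque unless you already know what a partition function is, which was exactly his point about choosing the level of explanation.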
During Silicon Valley boom cycles, these fast movers tend to do a lot better than people who stop to weigh the pros and cons, the moral and ethical considerations, the environmental impact, etc.
It could be that their agenda is different from yours. Their agenda, as leader, is to keep all the balls in the air, keep everyone energized, everyone feeling trusted, the whole enterprise moving fast. A way to do that is to just keep charging forward with enthusiasm, at any target that's available. The movement and energy and trust is the thing.
The what and the how are not for your client, and not for the CEO (or not for a certain kind of CEO); they're for you and the employees. The CEO's job is to lead, inspire, motivate, and provide resources.
I have yet to find a highly skilled person who gets excited by "moving fast", aka charging toward bullshit targets backed by unrealistic time and quality expectations. And getting highly skilled people bored or demotivated is the best way to drop all the balls that "leader" is trying to juggle.
Seriously, I think this whole Marvel-movie style of leadership will be seen as a huge red flag and an anti-pattern in 10-15 years, just as we now look down on those Henry Ford-style managers of the 20th century.
> They also identify two types of bullshitting: "persuasive" and "evasive". "Persuasive" uses misleading exaggerations and embellishments to impress, persuade, or fit in with others, while "evasive" involves giving irrelevant, evasive responses in situations where frankness might result in hurt feelings or reputational harm.
> the researchers examined the relations between participants’ self-reported engagement in both types of BSing and their ratings of how profound, truthful, or accurate they found pseudo-profound and pseudo-scientific statements and fake news headlines.
By their own definition, it seems like the people most inclined to impress and mislead in situations where frankness might result in reputational harm (in other words, a real BSer) wouldn't admit to engaging in BS behavior at any higher rate than a non-bullshitter would.
From the press release: "Ratings of how profound, truthful, or accurate they found pseudo-profound and pseudo-scientific statements and fake news headlines." Does that mean the people being tested were just shown the headline? Usually, that's not enough information to decide if something is bullshit. So, were they basically asking their subjects to guess?
The actual paper [1] is paywalled. US$12.00. The press release does not link to the paper, nor does it give a full citation, but it does give enough info that the paper can be found.
A useful metric is that if something makes a strong but unusual claim, and the supporting data is hard to access, it's usually bullshit.
Doesn't this contradict the finding that people who believe they're too clever to get scammed are the easiest marks? It seems like a complex topic where multiple factors are at play.
Obligatory: Harry Frankfurt's "On Bullshit" [0] is a must-read for everyone interested in some stronger definitions. The full essay is easily found online.
Long story short: BS has not got a lot to do with depth of knowledge or truth value so much as the attitude of the bullshitter. It's a moral defect, not a knowledge defect.
The bullshitter is neither a liar nor self-aggrandising as we commonly think, but simply has contempt for the very concept of truth and falsehood. It's like epistemological sociopathy. Some bullshitters can be very knowledgeable too: they pass the five-whys questioning but then defect and talk absolute rubbish as it suits them.
I think all LLMs are in essence bullshitters, and I hope something good for understanding not just consciousness but morality too will come of studying why LLMs are not really "thinking" and not holding an interlocutor "in mind".
Not Frankfurt now, but my own observation: if you wire that to the idea that power is the capacity to paint social reality in one's own interests, then unsurprisingly bullshitters are much more common in traditional power roles. LLMs are therefore (rather obviously) dangerous in any role that involves power or decision making.
I feel like it follows, because they first had to overvalue "impressive-sounding information" in order to decide it was worth lying about. Like someone being starstruck by a snake oil salesman.
> the researchers examined the relations between participants’ self-reported engagement in both types of BSing and their ratings of how profound, truthful, or accurate they found pseudo-profound and pseudo-scientific statements and fake news headlines
If I understand correctly, they simply asked the participants themselves whether they were BSing and drew the conclusions solely from that? How would you expect people who BS others to admit to doing it?
People who BS well are unfazed by the implausibilities in what they are saying, and that's probably because a good chunk of them don't think it through that far: they remain cool not just by adopting that attitude as a strategy, but also because they actually don't perceive the implausibilities.
It is cutesy to use the word "bullshit", but it seems too colloquial to glean any real information from. What do they mean by "bullshitting"? Any sort of misleading? Pretending you know something when you don't? I guess I'm just tired of having to reverse-engineer the actual information from a clickbaity headline.
From a cursory glance at the article, they appear to use the word bullshit in the same sense as Frankfurt's essay on bullshit.
Frankfurt's essay goes into a lot of detail about what is and isn't bullshit. If I remember correctly, he differentiates between deception/fraud and bullshit, which he characterizes as a lack of concern for the truth. A liar has an untruth they want you to believe; a bullshitter wants you to believe something and just doesn't care whether it is an untruth or not.
Now of course, one might complain that Frankfurt hijacked a colloquial word for his idea, but he does spend a lot of time trying to understand the everyday use of the word bullshit.
More like: people who are incompetent cannot distinguish BS from non-BS as well as someone who is competent can. People who are incompetent are also more likely to engage in bluffing strategies (i.e., BS) in order to gain status than someone who can simply rely on their competence.