The money shot: "The distinction between manipulation and non-manipulative influence depends on whether the influencer is trying to get someone to make some sort of mistake in what he thinks, feels, doubts or pays attention to."
I have stricter criteria, personally. If I am to persuade someone then according to my moral compass I have to respect their epistemic process, and participate in it in a way they would endorse. This is much harder to do than just being sincere and believing you're right / not trying to cause a mistake.
I like your standard, but it is much harder to meet; I've been finding it difficult even when I am aware that we have diverging epistemic processes... it's hard to tell the difference between thinking someone is misguided/dumb/evil and thinking that they just have a different idea about how to do things in this world.
Pragmatics aside (it's much easier to rhetorically persuade someone if you understand their motives and rationale), there are situations in the world where we have real disagreements about what right action looks like:
I can't morally support all the various epistemic situations I see, even if I can appreciate how respecting them is more effective.
For instance, I can understand arguing that some jackbooted thug shouldn't summarily execute a person and arguing from that thug's position ("don't kill them because you'll have to do paperwork" or some such thing), but intuitively I don't feel like I have an obligation to respect that person's process and thoughts.
In that case, it seems okay to be "manipulative" in the sense of your quote.
But maybe not... Intuitively, I think there has to be a good reason for this goal. I feel like it is something I work towards (and yeesh, it takes a lot of real work on all parts of my identity and awareness), but I also feel like there are boundaries to it.
Is there a larger ethical takeaway or underwriting rationale in your moral compass for why we should "respect others' episteme"?
Depends on what your ethical framing is. You sound pretty utilitarian, so I think your takeaway is that respecting someone else's thought processes is more likely to reliably deliver the results you want.
My advice is to never assume that someone is actively evil. People almost never think of themselves as evil. Instead, they have values to which they are attached and habits of thoughts to which they are accustomed.
You're more likely to achieve genuine understanding of another's mental processes - or empathy, as it's often called - if you start from a place of being willing to respect them. This understanding generally gives you a better appreciation of how to persuade them in a manner that will leave them with a positive impression of you afterwards.
Actually, I think my concerns aren't the utilitarian ones, even if I want to acknowledge the validity of those concerns.
I'm more interested in the deontic side: something along the lines of the idea that we should understand folks in the way that they understand themselves.
I'm with you on the idea that we shouldn't assume folks are evil and that we should always try to understand them in the way they understand themselves.
But I often find myself working with and communicating with narcissistic people who wholly lack the ability to see other people as having thoughts and motivations that are different from their own.
In these cases, I suppose evil is just too strong a word... there are levels of understanding people that I try to filter out, specifically the dialectic between how they understand themselves and what I should take away from that understanding.
So, at some point, I feel like we have to go beyond simply being agnostic about how people see themselves... of course we all see ourselves as doing the best we can. And we need to develop the empathy you're describing and understand the real differences between how we see things.
But beyond that, I'm having a hard time with the idea that we're all just morally neutral actors who all see ourselves as doing the best we can, and I feel like if I can better understand why we might have a real ethical obligation to respect others' epistemic situations I'll have a better understanding of my own ethics... and I guess that was what I was hoping for:
What's the underlying argument that we should attempt this ethical stance of respecting others, beyond the pragmatic, utilitarian reasons that seem far more obvious?
Do you believe a person can only be “actually evil” if they consider themselves evil?
I think when one person describes another person as evil they usually mean “this person has values to which they are attached - and those are evil values.”
I don't subscribe to the idea that my values are absolute. Thus, I don't accept the notion of an objective evil, and the notion of "evil values" is one I have trouble with.
One could extend the grandparent comment with yours:
"When Alice says Bob is evil, she's saying Bob is attached to some values, and those values are evil by Alice's own value system". This definition avoids calling any view objectively evil, but admits some values are evil in a given value system. That should be uncontentious.
Yes. And I think it can still be wise to avoid jumping to “this person is evil” too quickly.
Most of us know we often behave in ways inconsistent with our own values (because of lack of information or just personal imperfection). But when we see other people behaving in ways we think of as wrong, we’re quick to assume their behavior reflects their deepest values, instead of weakness or ignorance or normal hypocrisy.
I've used the definition that persuasion is when you try to change a person's mind, whatever the means - even lying to get the change of mind.
Justification is when you change a person's mind by giving a reason for why they ought to believe what you're telling them. A listener can analyze the reasons you gave and see if they accept the assumptions and conclusion - in other words, is your justification sound?
Manipulators often persuade, but would be happy with a justification too. Ultimately they want to change your mind. Someone who is giving a justification tends to be more open to seeing a fault in their own logic or justification.
I "suffer" the same character "flaw," which has routinely limited my ability to affect change within the companies for which I've worked. I can't count the number of times that my suggestion to use a better technical choice has been overruled by people with a personal agenda that had nothing to do with technology.
Additionally, I've had to work closely with a couple of genuine sociopaths, who have misled powerful people to benefit personally, while putting the (very large) company in an objectively worse place, and at a competitive disadvantage.
Would you be willing to compromise your position with an individual, or limited set of people, to achieve a large, positive organizational change?
This gets complicated. At the risk of putting something out there that could be misquoted, I am generally willing to play the game that people force me to play.
I want to play the "everyone gets along and supports each others' goals by being respectful of each others' dignity and agency" game, but if someone wants to try to mug me for my wallet, then I'm going to defend myself with force.
Similarly, if someone's epistemic process is "whatever I think works best for me, fuck everyone else" then I'll try to cause them to think that what I want happens to be what's best for them through whatever means happen to work. But it's tricky because maybe they have strong internal epistemics, while perpetrating manipulation on everyone else, and therefore I'm actually equivocating and doing a golden rule thing where I'm not respecting their actual epistemic process, but rather the epistemic process they are trying to enforce on others.
In the case of people acting in good faith, there's little or no difference. In the case of a bad actor, there may be a huge difference.
When I posted I was imagining the "good faith" scenario. Maybe the switch to "I respect the epistemics that people try to enforce on others" is fine, but it's not exactly how I put it in the post you're replying to.
In practical terms, just given the tiny summary you provided, my move would be to cause those powerful people to correctly see what was true or false according to their own epistemics, i.e. I'd try to counteract the manipulation. And if I were dealing with a bad actor I'd try to do it in a way that didn't position me as their enemy, but rather made it seem to them like they had simply failed to mislead the powerful person the way they had hoped.
A few others have summarized via quote, but I think this one is the most effective:
"For it is the intention to degrade another person’s decision-making situation that is both the essence and the essential immorality of manipulation."
A cult leader may not be thinking that they're getting you to make a mistake, because they're only thinking of themselves. But they are trying to degrade your decision-making.
Conversely, persuasion could be considered an attempt to empower your decision making: with new information, new perspectives, lights in blindspots, education on fallacies, etc.
Trying to determine which one you are doing is a hard problem, because you're relying on incomplete information. I believe it's best to be internally positive about your own decision making, and let what is not you take over when it does. Sometimes other people know best, sometimes you do. Not everything is as easy to understand as a computer program, math proof, or circuit board - anything logical and mechanical in nature is easy to understand, and easy to identify gaps in understanding, because it's a mechanical object. Those things lack consciousness. Being a rational being means having awareness of the difference (at least to me).
Selectivity in who to listen to depends on trust, which is a hard problem; life can sometimes really beat up on any of us. I don't think it's a good thing to choose to interact with people about whom you constantly have to decide between two binaries - manipulator or guide. That is the business world though, sometimes, unfortunately. Some people can seem like they are just out to get you, but that doesn't make it true.
Tolerance is important, but emotions sometimes seem mechanical in nature. At least that's how I understand my own, internally. It feels like being a computer. Socializing in general always seems to be a hard problem. Even harder when there's a bottom line.
If you are seeking to degrade someone's decision making, you're manipulating.
If you are feeling like your decisions process is being degraded, you're being manipulated.
For the first case, it doesn't mean that you're succeeding. For the second case, that doesn't mean it's their intention.
I'd then also say you can duck-type the opposite (which I wouldn't really call persuasion, but anyway) - if you seek to empower someone's decision making, well, then there's no reason to beat yourself up over which you're doing. (If someone says you're being manipulative tho, take it as a bug report - but again, since your intentions are solid, no reason to beat yourself up over it).
Hmmm, I don't know if I agree. You can feel like someone is trying to degrade your decision making when they're not. In mental illness this is common. If a person refuses to believe they are mentally ill, any attempt to help them seek treatment appears to them as an attempt to gaslight them.
Being labeled mentally ill can feel like gaslighting, especially if one knows oneself to be just as rational and competent as anyone else. Mental illness is often a matter of circumstance. Of course people who are homeless, or who fear the worst happening to them over and over, are going to appear as though their decision making is being degraded by being called mentally ill. It's basically like being called an idiot, which is ironic, because lots of mental healthcare attempts to reduce stigma yet still must discriminate between types of individuals for diagnosis, even though people really, really can't be labeled that way. It's always an observer bias.
I don't know if this helps or hurts generally, as far as mental illness goes; I don't know your specific circumstances or the people you have interacted with who have mental illness. But feeling slapped with a label for the rest of your life feels like a huge burden to carry. Just because you made some big 'mistakes' in front of someone who had the authority to give you that label.
Something I'm personally very stubborn about, and very outspoken about, is mental illness discrimination. You can read through these papers and see if they make sense to you, if you want to. I'm not sure how many papers I've read throughout my life about psychology, neuroscience, etc., but I've read a lot. I'm not a psychologist, therapist, or anyone who has been correctly (via an accredited institution) educated in mental health care, and I only study myself and, to a lesser degree, my family, as they study me. Mental illness should be a label you can have removed - because every variation of behavior mental illness describes as negative can also have a positive side, and trying to remove all the things mental illness labels as negative can really be handicapping, eventually. You need 'black and white' thinking for software development. Application of it to social systems can be useful, but only in chaotic social systems where people identify you as 'the enemy', which is just as much up to them as it is to you.
Trying to identify one big thing as 'the error' in life is just a burdensome mindset to have, no matter what it gets attached to. One can convince oneself that 'that was it, that was the error' at any moment in time. Those things aren't future proof, and it's erroneous to assume they are.
Yeah I totally get that. But there are cases where people are not as rational and competent as anyone else.
I have a good friend with bipolar disorder. 99% of the time they are as rational and competent as anyone I know. But there are times when they get manic. During that phase everything is amazing, they are amazing, and every decision they make is brilliant.
Only you know that in a couple of weeks the depression is going to hit, and that if you don't help they will have done things they regret. Things that lead to them getting fired, spending too much money, etc. And you're going to feel super guilty trying to comfort that person if you didn't do everything you could to stop them.
If in that manic phase they decide you're only gaslighting them, they won't listen. You just have to sit back and hope it doesn't get too bad. Hope that they don't wipe out everything they spent the last 5 years investing in.
I don't really care what label you use. But there are professionals that can help. Even in a non-manic state my friend won't go beyond the GP. They've been on the same medication for decades. There must be more that can be done. More than just going back for a repeat prescription from a GP that has never been told how bad it gets.
Well, hopefully that friend can learn how to see everything they do as amazing internally, when they want to.
Everyone needs that sometimes in life. Comes down to self belief, self expectations, hopes, dreams.
I don't know what your manic friend is like, but I do know that if you really care about them, think of yourself as a guide to find those things out, and help them get there.
As children we are allowed to believe we can do anything. It's vitally important to retain that part of you.
I can't assume that a lot of people understand their cycles. But we all have them. Every person I've met, known, or read from or about. Everyone has insecurities and dreams.
This is a good quote. One similar way to state it is by using the System 1 / System 2 [1] terminology. Manipulation is when someone pushes the audience to System 1 reasoning, diverting them from using System 2.
It's not a perfect distinction (there probably are cases where appealing to System 2 might look a lot like manipulation), but it's succinct, ends-neutral and not too subjective to be useful in practice.
This is only true if you assume that System 2 is always better. But if someone has rationalized great cruelty, it is perfectly appropriate to appeal to their emotions if it will get them to stop (this is essentially equivalent to a point the author of the original piece makes).
Rationalizing is very different from actually making a rational decision, and is part of System 1 thinking.
Post hoc, ergo propter hoc.
A rational decision starts with defining the epistemic system in which it is made. Is it Bayesian, frequentist, higher-order fuzzy logic? How does the weighting, if any, work? What is the data/proposition acquisition algorithm? Which heuristics are to be used to simplify the decision, and why?
Rationalization can mean rationalizing a system 1 decision. It can also just mean justifying a bad decision using explicit reasons, which is the sense in which I mean it.
I have a coworker, who largely does not listen to any argument I make; from my point of view, this is because I am younger, and he perceives me as less experienced because of my age. However, other coworkers are able to deliver the same argument to him, and have him listen to that argument. In that regard, they have influence over him, whereas I do not. Yet, nobody is attempting to manipulate anybody (in the sense of lie to, or attempting to get someone to hold a belief that the manipulator does not themselves hold or knows to be unsound).
From his point of view, you aren't a credible source, so he thinks you are trying to manipulate him (he doesn't like it). But he respects the other coworkers so they can influence him (he likes it). It's not just about the message, there are a lot of factors in play.
*Not saying I agree with the parent's definitions. Just that they could still fit your experience.
"Propaganda" is another descriptor along those lines. Neither tells you anything about the content other than the value judgment belonging to the person making the statement.
"information, especially of a biased or misleading nature, used to promote or publicize a particular political cause or point of view."
Leaving aside the issue of bias, I think we can agree that if a particular piece of information, designed to mislead the public, is widely disseminated, that is propaganda. Widely disseminating facts, or privately disseminating misleading information is not propaganda.
Propaganda is intended to change a belief or a course of action through a substantial amount of disinformation, i.e. lying repeatedly or omitting many facts. Thus the intended audience doesn't have to be big (no wide dissemination) as long as the lies / omissions are numerous and show deliberate intent to deceive.
For example, a man can conduct a propaganda campaign lasting years and make many misleading statements toward only his wife. However, it'd be a stretch to refer to a single isolated lie as propaganda, even if it were intended for an audience of millions.
I suggest the dividing line between persuasion and manipulation is akin to 'assent' vs 'consent'. Persuasion requires agreement: the informed, equal, and voluntary meeting of two minds, not an un/mis-informed compliance or subservience of one mind toward another. The latter could never be called persuasion.
Does the definition of propaganda require that the information is disinformation? For example, I consider the anti-smoking packaging required in New Zealand propaganda. It's all true, but seeing a smoker's lungs dissected on a packet of cigarettes seems like propaganda to me.
It can be very hard to distinguish these two concepts.
Instead, it is more useful to understand the agenda of the person doing persuasion. Do their interests align with yours? What are they trying to achieve? That's a good way to get insight into whether you're being 'manipulated' to do things that are not in your interest.
Because the agenda of an information source is key, that makes it vitally important to understand who owns and manages the outlets you get information from. In today's world, because of changes in news profitability and thus the number of financially-independent news outlets, knowing the agendas of ownership, management, and funders of news sources is absolutely critical.
It's all semantics. "Persuasion" and "manipulation" are just synonyms except for different emotional valences. Like "judgment" vs "discrimination", "patriotism" vs "jingoism", "principled" vs "idealistic". (This is fun. Got more?)
The article says they're different: that manipulation degrades the other's decision making process while persuasion does not.
You seem to disagree, which is fine. But your argument above against the article's "they're different" amounts to "no they're not". More words, of course, but just a "no".
I don't think that's true. Manipulation = influencing the process/behavior directly; persuasion = providing a desirable prospect, or the "carrot", and letting the protagonist close the gap by herself.
Interesting. This page (scroll down) claims that particular ad appeared in Life Magazine in 1951: http://mentalfloss.com/article/73588/14-vintage-ads-featurin.... I'd guess they were misled by an updated copyright mark on a reprint. What made you sure the date was wrong?
I think the difference lies in the purpose of the persuader/manipulator. If it's disinterested, it's persuasion; if it's self-interested, it's manipulation. Just like bullshit, which is not exactly synonymous with lying - the difference lies in the purpose. Telling something (true or false) with the sole purpose of attaining a goal that does not align with the other person's interests is bullshit.
> What makes a statement a lie and what makes it morally wrong are the same thing – that the speaker tries to get someone to adopt what the speaker herself regards as a false belief.
This is one perspective. But it introduces a problem that if someone is manipulated, and by continuation they manipulate another (e.g. MLM schemes), only the founder of the MLM is morally guilty, even if everyone who joined was negligent in repeating his lies.
Interestingly, in defamation law we recognize reckless disregard for the truth, not merely actual belief in the falsity of the statement, as a mental state that (along with actual falsity and resulting harm) produces culpability.
They aren't guilty of manipulation, but that doesn't mean they're innocent: they're guilty of not being critical thinkers. In my opinion, that doesn't take away their guilt at all.
Even when manipulation is used for the benefit of the listener it still isn't great.
In such a case the manipulator is already convinced they are correct and that the target's reasoning is going to be faulty or irrelevant. So in principle I would argue it is always better to present the facts clearly to the listener, simply because the speaker may in fact be in error and can never be infallible.
That doesn't take into account that you're responsible for them understanding your message. You can't assume customers understand things to the highest level possible, and you don't want to turn them off emotionally by accident.
For instance, say you're trying to promote a password manager, but the average customer is scared of losing their data. You have very little on-page real estate that will get attention, so what do you say to the customer?
Do you say the technically detailed truth up front?
- Encrypted by X,Y,Z
- Hashed and salted passwords
- Even if you use a password manager, if someone gets your master password they'll have access to all of your sites, so you'll want to use token-based 2FA as well.
- Link to more details
Or do you just say:
- Picture of a secure looking badge icon
- 10,000 people have downloaded us for secure browsing (even though appealing to who uses it is technically illogical)
- Link to details
I can tell you as a fact that more people will download your password manager if you use the second copy, even though it gives much less logical info to the customer. So using the first one will help fewer people be secure.
And the latter doesn't answer the pressing question the user actually had about backups.
The full design gives maximum information, with key points highlighted and a drill-down structure, plus some helpful information on how to secure it.
People tend to get desensitized by marketing BS and completely close over it, plus it does nothing to stop angry reviews when your application actually causes or does not solve problems.
Much of SV's revenue is built on a foundation of manipulation, tapping into all manner of human weakness, from our all-too-willingness to trade our souls for so-called free stuff to our susceptibility to addiction.
But such criticism will never get traction on Hacker News.
"It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"
tl;dr: "The distinction between manipulation and non-manipulative influence depends on whether the influencer is trying to get someone to make some sort of mistake in what he thinks, feels, doubts or pays attention to."
To add to the summary: From the article, I would say that the essential quality is one of hypocrisy. Manipulative interventions are not authentic, in the sense that the manipulator does not believe what they are trying to get the victim to believe.
I'm surprised more people aren't saying this. Many people are forgetting that the manipulator doesn't sincerely believe what they are trying to get you to believe. They're hypocritically pushing you in a direction they themselves will not go. How many of the ad executives who advertised cigarettes in the 90s smoked? Do you think Philip Morris knew that cigarettes caused lung cancer? If so, the advertising was a form of manipulation.
As another example, if your girlfriend is trying to talk you into something that you know she wouldn't do herself, that's manipulative as well.
As someone who used to coerce people into doing things that were ultimately good for them, I find that this definition is too close to a moral hazard for my liking.
Let me attempt another.
We have within us a large set of goals. Some of our short term goals are directly at odds with our long term goals.
When we persuade we appeal to a person's long term goals in a way that silences complaints from their competing short term goals.
When we manipulate, we exploit or subvert the short term goals to achieve a result. Whether the manipulation is ethical or not depends on whether the goal we chose supports or undermines the target's long term goals. This is subject to re-evaluation at any time, but it was always a manipulation.
There are professions where we pay people to manipulate us. Therapists, personal trainers, body workers, all are capable of manipulating our lizard brain, tricking it into doing something it really doesn't want to do. Why? To get us something we always wanted (or previously had).
Motivational speakers and coaches try to persuade. They get you so excited about your long term goals that you shout down the disruptive voice in your head that tells you you can't, that you're not good enough, that it's too hard.
I noticed similar wordplay in coverage of the 2016 election. When Trump or Trump-supporting PACs targeted voters with ads negative on Clinton, it was labeled by the mainstream media as 'voter suppression', whereas when Clinton or Clinton-supporting PACs targeted voters with ads negative on Trump, it was 'getting the message out'
Interesting, I'm surprised to hear that this happened. In my recollection, most of the mentions of "voter suppression" had to do with things like voter registration shenanigans, not so much political ads. And voter registration shenanigans are definitely voter suppression.
AIUI, Trump supporters used targeted Facebook ads to go after Haitian Americans, reminding them of the Clinton Global Initiative (which has a very bad reputation on the ground in Haiti).
When you are telling someone how to change his behavior, what to think, you are "educating" him or "opening his eyes". You are also a good person spreading the truth.
But when your opponent in ideology does that, he is "brainwashing", "spreading propaganda" or "taking advantage of innocence/ignorance". He is a bad person spreading lies.
It's worth questioning the ideologies that fundamentally don't align with yours, but you never actually know, because you only see your own side.
You can think you see the other side, but that often is incorrect. Those things are built into us from everything we learn as we grow, from experience.
It's easier to just not consider the people you choose to interact with as opponents. That's sometimes hard when the world seems screwed up.
Or how a group of people fighting against cultural genocide and for the right to govern themselves under their own values is selectively given the title "independence fighters" or "fucking terrorists", depending on their political alignment with whoever is labelling them.