I agree on the parts about being more willing to share ideas and healthily accept criticism about them, but in general this article feels like it's presenting a false dichotomy: that you must either state things matter-of-factly or stay silent.
It's possible to share a belief while expressing an appropriate level of uncertainty about it, and this is something I really wish more people did rather than blurting out falsehoods with confidence all the time. There's no harm in adding "I think ... is likely" or "AFAIK ... seems the most probable" etc. Reserve direct assertions for things you are actually sure about. And then if you're wrong about those things, reflect on what led you to be more certain than you should have been and try to prevent it from happening again.
I think everyone should strive to never be 100% certain about things that end up being wrong.
I wish people did this more often. However, it so happens that most of the time when someone says "I think..." or similar, people perceive it as you being quite uncertain and less trustworthy, just because you're being honest about how likely a statement is to be true.
People have a bias for certainty and simplicity (and security), which sadly works against this, and those that are more "confident" and take advantage of this bias are seen more favorably. This is, disappointingly, why we have overconfident but undeserving people like Elizabeth Holmes succeeding (and many more that fly below the radar).
It’s also why cults are so seductive to so many, turning over control and decision making does wonders for peace of mind, until the false basis for that confidence gets exposed.
Sadly so. Can such tendencies perhaps be remedied, or at least meaningfully mitigated, across a large population by learning and practicing more critical thinking?
I think it's simply a matter of habituating and getting better at critical thinking, and of establishing more helpful cultural norms. Both of these are, of course, difficult to do, like anything involving network effects.
Women are primed to not rock the boat and 'be nice', so in these kinds of discussions women are more likely to 'hide' 100% truthful statements behind these softening 'I think...' statements. And the bias you mention will therefore reject their opinions.
I've worked with women in leadership roles that were more assertive than all the men in the room, and I also worked with men who went out of their way to avoid any semblance of confrontation. That hardly justifies bigoted views towards men or women.
I don't see how that makes me prejudiced? Society wants women to be quiet and primes them (i.e., [1]), which is one of the main ways male power dominates. How does that make me bigoted?
> perhaps it's some kind of ESL issue, but here are some similar articles I'm talking about:
I'm not sure you noticed, but your so-called sources are nothing more than baseless opinion pieces by people you cherry-picked only because they convey the same prejudice and bias that you do.
Pointing out other people that share your opinion is hardly a substantial argument.
No, not really. That's not how logic works. This puerile "no, you are" argument is meaningless.
If someone makes a broad baseless accusation that all X are true, and you reject that baseless assertion by stating the fact that, at the very least, there exist some X that are verifiably false, you don't magically annul the rejection by mindlessly claiming it's a subjective interpretation. Either all X are true, or they are not. No discussion.
You can repeat all the puerile arguments you want, but if you want to defend a thesis on how all women are primed for something, at the very least you need to try to substantiate your wild claims, and your personal prejudices and bigoted views won't do.
> which is one of the main ways male power dominates
That in itself is not a true belief, but a complex piece of bigoted propaganda. I think GP understood you well, they're just questioning the assumptions underlying what you said.
> People in leadership roles are in general assertive irrespective of gender.
I'm not sure you got my point, but the point was that these traits are not determined by gender, and thus any claim that goes "<gender foo> is primed for <something>" is patently false and only reflects prejudice and bias.
Yes they are. All of these traits live on a bell curve. The mean and variance are different for men and women. That some women are more assertive than some men in no way invalidates the claim that, on average, men are more assertive.
Where are your sources? It's a radical position to claim that there is no population level difference in psychology between men and women. My position is canonical and easily googled.
When someone says "A is B" you should hear "I think A is B", that makes everything much simpler since there are very few things people are certain about.
When making a stronger statement people say "I'm certain A is B".
Edit: Take your post as an example; it is about twice as long as it has to be, since you add so much padding to mark uncertainty. Humans are uncertain about most things: if you say something, I'll assume that is what you think and that it isn't the whole truth. That is what everyone does. Just learn not to be afraid of being wrong and do the simple thing that is easy to read.
I find that tends not to be true, although this may be a cultural thing. When people say "A is B", what I find they usually mean is "I think that A is B, but I have not questioned this opinion particularly strongly and do not have a confidence level attached to this belief". And very often the default confidence level people attach to their unquestioned opinions is relatively high, which means for them it typically lies relatively close to the "I am certain" mark.
This is why I find it's really helpful to purposefully indicate when I'm saying things that are merely my opinions, when I'm saying things that I'm certain about, and when I'm somewhere in between. Partly, it indicates to the other person the degree to which I hold a given opinion, but it mainly just helps me hold myself accountable for the assumptions I'm making: is this something I have sincerely thought about and formulated a belief about, or is this just an intuition that I've not tested yet?
> When people say "A is B", what I find they usually mean is
The important part is what confidence level people are used to you having when you say "A is B", not what the person who said it is thinking. As you said, you know people are wrong a lot when they say it, so that is what "A is B" means: it means "I think A is B".
The exception is if you are saying it from a place of authority like being a teacher or writing a manual. If you are an authority then you should hedge what you say if you aren't confident, but HN comments aren't an authority, there is no need to hedge what you say here, everyone knows you are just saying what you think. If someone was an authority they would start the comment with it.
Well the important part for me is what confidence levels I have, which is why I wrote the rest of the comment about why I try to distinguish between my opinions and my facts.
But yeah, when I'm reading things, I take them with a grain of salt (or depending on the topic, rather more than a grain). But I always trust people who are capable of expressing their own confidence levels more, because I can see that they've considered the topic more fully. For example, I don't like microservices, but I trust the comment that talks about microservices with qualified positivity far more than the one that just says "microservices are bad because XYZ" without giving any indication of the nuances involved in such a decision.
> But I always trust people who are capable of expressing their own confidence levels more, because I can see that they've considered the topic more fully
You shouldn't, since it makes you vulnerable to social manipulation and bullshitters. People know you think like this, so those who want to manipulate will talk like that, which is why you see ChatGPT use a lot of hedging and fluff language like this: because it makes people trust what it says more. Don't fall for that, it is so easy to fake.
But sure, if you want to manipulate people you should speak like that, since it makes them trust you more. That has nothing to do with being wrong or not, though: if people don't view you as an authority when you don't speak like that, and therefore don't trust you, then by that logic it is the correct way to speak whenever you aren't sure.
You don't get tricked since I don't hedge my language, that is exactly what I want.
i don't know if i understand you right, especially i don't get how hedging is manipulative, but here is my take:
this applies in particular to pseudonymous written discussions like hacker news. i don't know who you are, and i don't know your character, but if i read you hedging your statements instead of speaking with authority then that gives me the feeling that i can have a constructive conversation with you even if i disagree with what you say. it doesn't mean you have to hedge everything, but that you indicate about which things you are more sure about and about which you aren't.
whereas if you write in an authoritative tone then i can either try to find out who you are and verify that you are indeed an authority on this subject, or i can blindly believe you (which i would only do if what you say confirms what i already want to believe), or reject you as someone i am unlikely to be able to reason with and give up responding. none of which is a good choice. at best i can give a hedged response that explains why i believe i am right to disagree, and hope that you are able to explain to me why you should be right after all. this is where not being afraid to be wrong comes in for me.
and once we enter a state where we both have opinions that we are not certain about we can then continue to explore the subject matter together until we can find a consensus that we both can agree with.
and it may just be that one or both of us change our opinion on the subject completely, because in the course of our exploration we both learned something new. but that is only possible if neither of us acts like an authority on the subject, refusing to accept new input.
The necessity of the verbal construct "I think X" logically suggests that all the other times, someone is lying. Which isn't the case; I just find the implication funny.
People have a weird relationship with confidence. Because most people can't assess facts for themselves (reality is horribly complex, and it is a good strategy to let someone else figure it out), they rely on the speaker's confidence to gauge how true something is. I guess there is some sort of follow-up social system to punish people whose expressed confidence is out of line with the expert consensus, treating them as though they were exposed liars.
So on the one hand, I personally prefer your approach: it is easier to just add a mental "I think" in front of everything everyone says, because that is the real situation. On the other hand, don't expect the flowery language to go away, because there is a social game playing out.
Author here. I really hope it doesn't come across as "blurt out falsehoods with confidence all the time": I do mention in the conclusion that this is no substitute for working hard and thinking critically.
One thing I have a problem with in the workplace is what goes unsaid because everyone in the (chat) room already understands the premise and is on the same page, versus things they have not even thought about.
For example, let's say there is an Azure Function that does something but is failing some of the time. I'll say something like: hey, resending the message to the Service Bus fixed this issue for me. Three people will yell at me and say sending this message doesn't do what I think it does. I'm sorry, how do you know what I'm thinking? A week or so goes by while we struggle with this problem. It isn't even a big problem, because we can all clearly see the code and read what it does, but people are so confidently incorrect about their own code. Finally, about a week later, someone else has a revelation: oh, resending the failed message seems to fix the problem. And I'm thinking, how did you not get that conclusion from what I said?
I feel like I'm missing some context, because I feel like the senior developers in the room want to manufacture a crisis so they can rewrite the whole function. But how do I ask this question out loud?
I mean, I fully support that the code needs a rewrite. It is difficult to read and frankly the approach isn't very good, but it feels like we are hesitant to make small incremental improvements today lest management think it is good enough and refuse to pay for a rewrite. Am I overthinking this?
I often say things like "I'm 76% sure that this operation will work well in this mode." It started as a joke, but it's actually extremely useful. I work in spacecraft operations, where differences of opinion often need to be resolved quickly. If the other person says "I'm 98% sure you're wrong," then I'll cede the point, but if we disagree at comparable certainty levels then we'll need to make an effort (spending valuable time) to check, e.g. by running simulations.
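The rule of thumb described here can be sketched as a tiny decision function. To be clear, the 15-point gap below is my own made-up threshold for illustration, not anything from the comment:

```python
def resolve(mine: int, theirs: int, gap: int = 15) -> str:
    """Compare two self-reported certainty percentages for opposing views.
    `gap` is an arbitrary threshold chosen for this sketch."""
    if theirs - mine >= gap:
        return "cede the point"
    if mine - theirs >= gap:
        return "hold the position"
    return "spend the time to check"

print(resolve(76, 98))  # clearly outgunned: cede
print(resolve(76, 80))  # comparable certainty: go run the simulation
```

The point of the numbers isn't precision; it's that they force a comparison that plain adjectives ("pretty sure", "fairly confident") make fuzzy.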
> And then if you're wrong about those things, reflect on what led you to be more certain than you should have been and try to prevent it from happening again.
This is counterproductive for people who have a fear of being wrong, which is what the article is about.
Those people are already spending too much time reflecting on wrongness. Trying to prevent overconfident wrongness from happening is very likely to land them back in the camp of not speaking up, which is where they started.
In fact, I'd be willing to bet there's a statistical advantage to being a flat-out narcissist and quickly convincing oneself that a correction was just an interlocutor's misapprehension of what the narcissist already knew.[1] Something like how Heads-Tails-Tails shows up sooner, on average, than Heads-Tails-Heads in random coin toss sequences. If ending up wrong can double as the first step of being right, it accelerates the learning process.
Meanwhile, students of your teachings would either:
1) fall behind as they speculate about the nature of recurring cognitive bias and overconfidence, and/or
2) introduce communication problems through compulsive hedging (probably compounded by said fear of being wrong)
1: I know a few non-narcissists who, for whatever reason, use this method. It's at most a mild annoyance, and they are vastly easier to communicate with than compulsive hedgers. (And obviously easier to communicate with than people who don't speak up at all for fear of being wrong.)
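For the curious, the coin-toss aside is easy to check with a quick simulation: with a fair coin, HTT first appears after about 8 flips on average, while HTH takes about 10 (in a head-to-head race the two patterns actually tie; the advantage is only in average waiting time):

```python
import random

def flips_until(pattern: str, rng: random.Random) -> int:
    """Flip a fair coin until `pattern` first appears; return the flip count."""
    seq = ""
    while not seq.endswith(pattern):
        seq += rng.choice("HT")
    return len(seq)

rng = random.Random(42)
trials = 100_000
avg_htt = sum(flips_until("HTT", rng) for _ in range(trials)) / trials
avg_hth = sum(flips_until("HTH", rng) for _ in range(trials)) / trials
print(f"HTT ~ {avg_htt:.2f} flips, HTH ~ {avg_hth:.2f} flips")  # roughly 8 vs 10
```

The gap comes from overlap: a failed final flip for HTT is an H, which already starts a fresh attempt, whereas a failed HTH attempt ends in TT and throws all progress away.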
Interestingly enough, I think AI confidence percentages have encouraged more developers I work with to give me confidence percentages. Even for myself, I think it's helpful, as I've been told more than once that someone thought I was confident because of "the way I said it". Even if those percentages are a guess, saying "60% sure" or "90% sure" changes how people act on that information.
Oh man, god bless ya. I said exactly this to an interviewer once when they asserted with complete certainty that copying a value from one register to another MUST be faster than copying from a pointer.
I said "I agree that it's probably true for mostly every piece of hardware you'll ever touch but I think it's theoretically possible to design a digital circuit in which for certain cases they take the same amount of time"
To which the interviewer said "No it is impossible"
Being wrong is a humbling space to be in! Mind you, you can be the smartest chipmunk in the room; but I think being curious _and_ being inquisitive is more important than *trying* to be right and turning out to be wrong.
Imagine you are in a boxing match, and your opponent loses the fight. The person who wins would tell the opponent that they did well regardless; it's a courtesy. A sane person does not go around the ring saying "Ha ha, you lost the fight." Is that right?
Similarly, I disagree that things need to be __that__ explicit.
What the fuck are you talking about? What does that have to do with what we're talking about? These things are not fucking comparable.
We're talking about something and I think it's helpful to have everything explicitly stated instead of just implicitly hoping everybody draws the same interpretation.
It's one thing to be willing to be wrong, but it's an entirely different thing to have the grace to accept being wrong when you're in that situation, which can take a level of emotional regulation and acceptance that isn't automatic. That's something most of us have to work at, and that's why it needs to be explicitly mentioned. I've been willing to be wrong and stated things I was reasonably sure were correct, but felt frustrated and embarrassed, leading to me being unnecessarily combative. Over time, I've worked on that emotional response so I'm able to say "yes, you're entirely right" and not take it personally when I am wrong.
So yes, it needs to be explicitly stated and discussed and cannot be glossed over.
He's talking about the adversarial nature of arguments. Ego is almost always involved and it often becomes heated and gets personal.
People who are wrong may have difficulty even realizing they are wrong. When they do realize it, they face the challenge of admitting it publicly. It's not easy. When I'm wrong I always try to admit it, but I'm not sure I always succeed.
Just as important as admitting it is allowing the other person to save face. Plenty of people will quietly de-escalate arguments and leave if you allow them to do so. It's a good idea to allow it if you spot the opportunity. Don't push it. Don't rub it in. Don't demand it. Don't try to get the last word. Sometimes it's good to let things lie after an argument has run its course. I believe that's the point the person you replied to was trying to make with the boxing analogy.
There seem to be two types of people:

- Those who approach technical conversations expecting every stated fact to be literally true (usually detail-oriented people).
- Those who approach technical conversations expecting stated facts to converge toward the truth (usually highly intuitive people).
These two types tend to frustrate each other, although I would contend that everyone needs the capacity to operate in either mode.
For example, you _must_ be technically correct when programming. You cannot omit a closing parenthesis. However, if you are working on trying to solve the overall problem, usually something that could have multiple implementations, you need to brainstorm, which is best done when you are free to state things imprecisely.
I recognize that. My colleague would say, "I'm fixing that in the docker." Then I ask (somewhat frustrated), "the Dockerfile, the Docker image, or the Docker container?" Then he answers (somewhat frustrated), "that's a detail I don't want to go into now." And I read him as thinking, "damn, I didn't even think about that, I'll just fend him off." My other colleague, who has great empathy, says: "you are very blue, he's very red."
I'm not sure what blue vs red is, in this context, but I can't tell if the frustration is because your colleague thinks you're asking because you don't understand docker, or because you question their understanding of docker.
I know he knows docker well enough, but somehow he finds something a detail, where i find that difference important. That said, i do get what you're saying, indeed this might be the issue.
I'm going to echo what the other poster just said.
If I know something is a solvable problem, I don't worry about it; it's just a detail. For example, if I have to integrate with a 3rd party API, the details don't matter until I get there. I have full confidence I'll be able to do that even if it's super old-school strings over TCP from before XML was a thing. Because I'm fully confident in being able to solve it, I don't care about it; it's just a detail, like having to reverse your vehicle out of your driveway.
It may even be that he's not completely sure of the details yet but he's confident he knows the approach to take and he'll do the discovery while he's working the problem.
That's interesting. Applying that model to my interactions with my wife, I'd add that the same person can be detail-oriented in some situations/subjects/contexts while being intuitive in others. Which only adds to the frustration people feel while seeking common ground.
This is an interesting perspective. I think these might actually be modes of thought, since I’ve experienced these attitudes from the same people just at different times. I’ll keep this comment in mind.
This holds back people of every level. Paradoxically, I think it holds senior people back more, because once someone gets a sufficiently senior label, they think they should have all the answers and stop asking questions.
oh, it's not that i think that i should have all the answers and stop asking questions, but when i feel that this is what others expect from me then the only place where i can safely ask questions is among others who are even more senior than me.
or another way to look at it is stages: thinking that i should stop asking questions is senior level one. once you outgrow that and advance to senior level 2 then you can ask questions again.
There were some studies with firemen that showed the following technique was better than traditional teaching:
1. Explain/demonstrate right things to do.
2. Have the students criticize other people's work to find deviations.
3. Have the students do the work themselves.
That is: watching other people fail takes the ego out of it.
Interesting, because as a former CS teacher I regularly did this. We'd go through some material and then I'd demonstrate it being used in different contexts, sometimes writing code that wouldn't work and sometimes writing poor code (while loops that should be for loops, inconsistent variable names, etc...) and the kids dug pulling it apart. It's much easier to critique than to create when you're beginning coding.
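For anyone curious what that looks like in practice, here is an invented example of the kind of snippet I mean: it runs and gives the right answer, which is exactly why critiquing its style makes a good beginner exercise.

```python
def total(prices):
    """Sum a list of prices. Correct, but deliberately badly written."""
    i = 0
    result = 0
    while i < len(prices):   # a while loop doing a for loop's job
        x = result           # pointless copy with an inconsistent name
        result = x + prices[i]
        i = i + 1
    return result

print(total([3, 4, 5]))  # prints 12: the critique is about style, not output
```

The students' job is to spot that the while loop should be a for loop (or just `sum`), and that `x` adds nothing but confusion.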
On that note, I've been giving a lot of thought lately to how to develop a curriculum that does a better job of teaching coding skills without necessarily needing code. Design thinking, but at a smaller scale, and later translatable to beginner-level problem solving. Where kids get hung up is that they often don't have a toolkit for this type of thinking, and building that up, while still keeping it interesting, is a challenge.
I did as well and believe my classes improved as a result.
I was taught "flow charting" as an attempt to teach coding without having to code. Of course, the teachers missed the point and forced template-driven syntactically correct charts that were more painful to build than the programs.
I have never taught software to complete novices, but I would think that it's difficult to teach something without teaching the thing. Have you tried the "wax on wax off" approach? That is, break things down into stunningly small pieces and build them together? You could spend days grounding on the fundamentals. (It works for training sports, rapid demonstrate/mimic/correct cycles on the basics)
I've tried something like what you're describing. Most years I'd start with a week or two of solving problems that don't involve any code at all.
For example, a day-one teaser (these are 12-year-olds) I liked to give was an 8 by 8 grid on the floor. They would get in groups and devise a way to move from a start to an end location. Then they would enact it, first with me receiving their directions and later other students. I would of course take their directions as literally as possible, with them frequently getting exasperated ("that's not what I meant!").
We'd go from that to generalizing some rules and formulating steps. Once they had working rules, I'd add obstacles or change the start and end locations.
My goal with these was to develop the skill of seeing discrete, reusable steps, and how they mirror each other between problems. Learning loops and conditionals without ever writing code, for example. Seeing the need for variables as storage, etc...
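To make that concrete, here is a sketch of what the students' generalized rules might look like once translated into code. The grid, start, goal, and obstacle positions are all made up for illustration, and the sidestep rule is deliberately naive (it could get stuck on trickier layouts), just like the kids' first attempts:

```python
def walk(start, goal, obstacles):
    """Step across the grid one move at a time, the way the students
    direct a walker: keep going until you arrive, deciding each step."""
    x, y = start
    path = [start]
    while (x, y) != goal:                    # the loop: "repeat until you're there"
        dx = (goal[0] > x) - (goal[0] < x)   # the conditional: "which way is the goal?"
        dy = (goal[1] > y) - (goal[1] < y)
        nxt = (x + dx, y) if dx else (x, y + dy)
        if nxt in obstacles:                 # "if something is in the way, sidestep"
            nxt = (x, y + (dy or 1)) if dx else (x + 1, y)
        x, y = nxt
        path.append(nxt)
    return path

route = walk((0, 0), (7, 7), obstacles={(3, 0)})
print(route[0], "->", route[-1])  # starts at (0, 0), ends at (7, 7)
```

The "repeat until" and "which way" lines are exactly the loop and conditional the floor exercise teaches; `path` is the variable-as-storage.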
As a beginner, they can't plausibly understand anything. Understanding functions and variables is a deep topic that can take years to master, if it is possible to truly understand them at all. Assuming human knowledge has reached the point where we understand functions completely (I feel confident we've got variables, although even then it is hard to be sure), that still hardly covers all of programming! I suspect in a large number of cases "admit when you don't understand something" is useless advice, because the problem is they can't identify discrete chunks of knowledge to practice.
Even today I don't fully understand what a bus really is. I mean I sort of get that it is one or more (copper?) wires that carry information as electricity somehow but at no point do I remember anyone saying this explicitly in any of my education.
I've had to piece this information together in my head from multiple conversations. So now when I hear the word bus, I think a ribbon cable. Each of those cables can only send two states, high and low current and that's how I think of a bus because anything more advanced isn't something I can intuitively process at all.
So I'll just nod and smile when people talk about buses, and I don't understand a thing.
I really like the message, “don’t be afraid to be wrong”. I like to tell people I’ve worked with I have a “First Idiot Principle”. Which is, “Don’t ever be afraid to be the first idiot in the room.”
There are many situations in life where someone puts out a call for ideas. Maybe a person asks for opinions in a meeting, for example. Applying the First Idiot Principle, you should respond to these even if you have an idiot idea. Usually one of three things happens:
1. No one has a better idea and it isn’t an idiot idea after all.
2. Folks silently agree with you but were afraid to share because they thought they might look like the idiot. Now that the idea is public they back you up.
3. It is an idiot idea! And someone else realizes they have a better idea and they feel compelled to share when previously they may have thought they were the one with the idiot idea.
No matter what happens, you get an outcome that’s better than not sharing the idea at all.
“First Idiot Principle” - great terminology and thought process.
It's an expression of creative humility that opens a conversation rather than the "first principles" or "strong ideas weakly held" guys that basically shut down the conversation.
As soon as my managers stop using my having been incorrect as grounds for inhibiting my growth in some way I'll be far more inclined to verbalize things that I'm not nearly 100% certain about.
As opposed to the workplace, where nobody is ever penalized for being wrong?
The reality is that performance is rewarded basically everywhere in life. I'd argue that schools actually have some of the lightest penalties for being wrong compared to what happens in the real world.
Frankly, if an education system doesn’t teach people about the relationship between performance and grades/outcomes, they wouldn’t be prepared for the real world. I’ve seen this in a number of people fresh out of college who absolutely melt down the first time they encounter anything other than glowing feedback. Depending on the school systems you go through, it’s possible to get all the way through college without really experiencing anything other than passing grades and praise and infinite opportunities to try again with extra help. Once those people hit the real world and encounter consequences, it’s a painful learning experience.
In workplaces you can iterate to come to a correct solution with feedback from peers and supervisors. So it’s nothing like school tests where you have to get everything correct the first time you swing.
I'm sure this is a part of why this fear exists. Middle Eastern countries are especially harsh in that sense imo: you're afraid of making mistakes or making your family "look bad"™.
Obviously it depends on the school and the teacher, but I don't think this is generally true. You get penalized for being wrong on a finished assignment (which can also happen at work), but it is not uncommon to have open classroom discussions where nobody is punished for asking questions or saying something wrong. This is the norm in my experience; I can't claim it's the norm everywhere, but it doesn't seem unusual based on other people I know.
It's more something in our DNA. Being confident and/or right many times increases, or at least increased, our chances of passing on more of our DNA. Hence there are more people with such DNA.
(my DNA will punish/praise me hormonally according to the amount of resulting downvotes/upvotes. Since I'm rather confident, i take my chances here. Also I'll try to apply the article's advice if I fail.)
So far most of the objections to the article are about how being wrong has consequences. That is of course the reason why you MUST not fear being wrong:
When a group of people set out to achieve something, succeeding means doing it close enough to right that it serves its purpose. This means they all need to do work that comes together into a correct and harmonious result. They cannot make something harmonious if they have different understandings of what they're doing, and it cannot be correct if they have failed to foresee some important aspect of the problem/solution.
So you talk about the problem and your solution and explain your understandings and that allows the other people to correct you or possibly to see something they hadn't seen.
This is where you risk being wrong and MUST do it so that before you start taking actions you can be corrected. If you hide your misunderstandings then there will be consequences.
We peer review code, and nobody likes being criticised, but in the end, when it saves you from being the one to introduce a bad bug, you tend to be grateful. Similarly with ideas: you put them out for review.
A work environment where you cannot safely put your ideas or code out for review is a toxic one and likely is wasting a lot of its potential.
I believe the author addresses the core issue of why we have a fear of being wrong in the first paragraph. Toxic work environment or the belief that the work environment is toxic. Work is only an edge case here too, we are afraid to be wrong in all environments. (I say "we", but of course some are more afraid than others.)
And I believe that showing a scared person the merits of not being scared, and of experimenting, might motivate them temporarily and most likely get them to agree... but it will not stick. Fear is a very powerful impulse whose roots can go very deep; it won't just stand up and move to slightly more nutritious soil. People have very, very good reasons to feel fear that are rooted in their past, and those reasons are most often irrefutable. What can be shown is that these reasons no longer exist, or that one might now have other skills to deal with them.
Sadly this is an individual issue and the method for fixing it is case by case.
Similarly, this works the same way for confidence. Knowing that the benefits of being confident outweigh the cons will not make you confident.
They can provide motivation though.
Source: only my personal experiences. Fear leaves me only if I feel safe, or if I expend a great deal of effort to suppress it temporarily. The latter is courage and willpower, and neither is infinite.
You can create a less toxic bubble in work environment by effort and showing your own courage in the right settings. Some places are too bad for that to work but not all are.
Hmm... atelophobia is more than just the fear of being wrong; it encompasses a broader fear of imperfection and of making mistakes in various ways. The fear of being wrong is related to it, but atelophobia extends to a general apprehension about not meeting one's own or others' expectations. So it's more like a fear of failure, a reluctance to take risks, or a constant striving for perfection.
The problem is not that work environments are cartoonishly toxic, but that 95% of people will implicitly down-rank those who ask stupid questions or turn out to be wrong.
So, we can encourage people and act accommodating but it is just politeness. Everyone present is silently adjusting their expectations of others.
I immediately down-rank anyone who provides simplistic or “obvious” but utterly wrong solutions to difficult problems instead of trying to understand why a person chose the solution they did.
The problem isn’t being wrong, per se, it’s being wrong without entertaining that possibility.
>The main thing to remember is that being wrong isn't the end of the world
well it depends on how wrong you are. Being wrong, especially being wrong and confident can be a pretty costly combination. There's a reason the principle 'be liberal in what you accept, conservative in what you send' exists in programming. Being afraid of being wrong imposes a cost on you, not being afraid generally tends to impose a cost on others.
The fear of being wrong isn't just some evil invention by a team leader with a clipboard, it's an instinct that genuinely exists for good reason, it makes people scrutinize their claims. In general I think we tend to live more in a culture where people speak without thinking and too often propose solutions based on questionable intuition than the opposite. Especially in business culture where it's pretty rare for people who are responsible for something going wrong to internalize the cost of those decisions, mostly someone else just ends up dealing with it.
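The robustness principle quoted above ("be liberal in what you accept, conservative in what you send") can be illustrated with a small sketch. This is a hypothetical example of my own - the function names and accepted formats are made up, not from any particular library - showing lenient parsing on input paired with one strict format on output:

```python
from datetime import datetime

# Sketch of Postel's robustness principle:
# be liberal in what you accept, conservative in what you send.

# Formats we tolerate from the outside world (hypothetical choices).
ACCEPTED_FORMATS = [
    "%Y-%m-%d",     # 2024-01-31
    "%d/%m/%Y",     # 31/01/2024
    "%B %d, %Y",    # January 31, 2024
]

def parse_date_liberally(text: str) -> datetime:
    """Accept any of several common date formats (liberal input)."""
    cleaned = text.strip()
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.strptime(cleaned, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {text!r}")

def format_date_conservatively(dt: datetime) -> str:
    """Always emit one strict, unambiguous format (conservative output)."""
    return dt.strftime("%Y-%m-%d")

# Messy input in, canonical output out.
iso = format_date_conservatively(parse_date_liberally(" 31/01/2024 "))
```

The design choice is the point: being forgiving about what you consume shields you from others' mistakes, while being strict about what you produce keeps your mistakes from propagating to them - which maps directly onto the comment's cost argument.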
The problem with being wrong in today's internet culture (or unculture), with its upvotes and downvotes, is exactly this.
You write something wrong, you lose reputation; you write something right, you still lose reputation, because the book-smart crowd doesn't agree with it.
You write something that AI thinks is "harassment", even if it's just, "that fat cheese goes directly into your fat cells". Totally harassment... fucking Chinese Tiktok censorship.
You can't say "shit" or "fuck" because that's harassment.
Because creators are sensitive little flowers...
Fuck China and its censorship.
This is flat out bad advice - being wrong is okay sometimes, but if it becomes a pattern, or you make yourself look like you always need help from other devs or team members, you'll quickly paint a target on your back as the next person to be cut or fired.
The first note says that this only applies for a healthy workplace.
If everyone is terrified of saying something wrong, it's a cover-your-ass engineering shop, where eventually some massive mistake will occur because nobody wants to appear wrong.
Although, constantly sounding wrong does paint a picture of yourself being stupid. Guess that's a trade off.
I don't know where you all work, but it all sounds very toxic. Genuine question, I'm European, is this a US culture thing or have I just been very lucky with my jobs?
One interesting aspect of this is pair programming. I have noticed that sometimes even little ol' me can't focus enough while sharing a screen.
Frankly, I am not completely sure why that happens. Sure, keeping a conversation going is not free in terms of attention, but I have caught myself more than once just wanting to end the session so that I can do totally stupid things for a couple of minutes. Now, why would I prefer that, unless I am somewhat shy of showing weakness or something?
Same, I feel pressure to not make mistakes. All my thoughts are so focused on not appearing dumb that I forget how to solve the problem. Then I appear dumb anyway.
That's why I hate live coding interview bullshit, always a terrible experience.
SpaceX blowing up rockets but still being considered a successful launch test.
Many times in the world, you have to be wrong enough times to know that the suspected correct answer truly is correct. To call that directionless and flat is just naive on a very large scale.
There are times that some people can get to the correct answer faster, and it can be extremely frustrating to have the feeling of being held back or slowed down for everyone else to catch up. That's an ego problem that person needs to contend with and not a fault of everyone else.
The problem is the terms “wrong” or “right”. If I tell everyone confidently that https is all you need for a secure, privacy focused site then that is a different kind of wrong to companies that are set up to do experiments.
There is a kernel of truth to this argument: There must be some allowance for being wrong some of the time for a team to really thrive. If one wrong move results in outsize penalties to your reputation, your performance review, or your compensation then everyone is going to avoid taking any risks. It becomes safest to do nothing at all and adhere closely to the status quo, because you can’t be wrong if you avoid making any calls on your own.
However, this article takes this concept a little far, in my opinion. There must be some room for being wrong after making a good faith effort and doing proper research, but we have to be honest that being wrong on important outcomes will never be perfectly free of consequences, even in the most idealistic workplaces. If you’re wrong as often as you’re right, people will notice. People keep track of how often others are right and wrong and will factor that into judgments. It’s only natural.
In a healthy workplace, making the right call most of the time should open up room for being wrong occasionally. Nobody is perfect! If a company is handing out outsize penalties, explicit or otherwise, for single incidences of being wrong then that’s toxic and bad.
However, I’ve worked with people who are so unafraid of being wrong that they can’t be trusted. They’ll take on projects they know they’re not qualified to do because they don’t mind failure. They’ll answer questions they aren’t qualified to answer because they’re not afraid of leading someone astray. They’ll pull the trigger on high stakes issues quickly because they’re more concerned about doing something than doing the right thing. Everyone around them learns this quickly and adapts around it. It becomes an obstacle to promotion, as it should in a company that chooses leaders who can be trusted.
It’s a balance.
This article makes a mistake I see from a lot of casual advice writers, which is to first construct a hypothetically ideal workplace and then explain how to operate within their ideal workplace:
> In my posts, especially when providing advice, I will assume you work in a healthy workplace environment. If you work in a psychologically unsafe environment, none of this even matters - if possible, and safe, search for a different job.
This advice tends to feel good because it always describes a perfectly safe, secure workplace that caters to the employee (the reader) with little or no regard to anything else like their coworkers, the company, or the bottom line. In the real world, it’s never that simple. It’s easy to say “quit your job” to anyone who doesn’t see this ideologically perfect scenario in their own workplace, but the reality is that people need to learn how to operate within the realities of an imperfect, real workplace.
Definitely search for better jobs if you’re at a toxic company! However, don’t expect any company to live up to some of the lofty ideologically pure scenarios that get described in advice columns. In the real world, you’re going to have to weigh the risks and consequences of being wrong and adapt as you learn more about how the company and your peers react. Behave accordingly and operate with a feedback loop, but watch out for irrational paranoia about being wrong.
Seems like not being trusted is the only consequence of being wrong. Otherwise, whatever system is in place is designed to reward people who try things they're confident in approaching, but not necessarily sure of or measured to be good at, and that's a very common theme in management in every bureaucratic structure I've seen. It's not particularly relevant what choices they made, or whether those choices were good or bad or have residual effects, if they were either the only one who stepped up to try or were selected arbitrarily. It's the system that enables balancing effects or risk, not individual components.
No manager I've ever had has been good with people; their job has never been to be good with people, and there are rarely any consequences tied to that, other than perhaps high turnover and lots of complaints. But those only matter when subordinates have other options and a valid place to complain to, which also never exists.
I think my intent is less different from what you stated than how it comes across.
Especially this bit "I’ve worked with people who are so unafraid of being wrong that they can’t be trusted.". Hopefully the article doesn't come across as "be mindless" in the workplace. In fact in the conclusions I state that this is no substitute for hard work and thinking critically.
> "first construct a hypothetically ideal workplace and then explain how to operate"
I don't think this is what I stated, but I know it wasn't the intent. Hopefully, working in a psychologically safe workplace isn't a hypothetical ideal.
I'm definitely learning a few tricks on what I should be clearer by reading the comments though so thank you.