Bugs are easy.... you just point out the bug and move on... design flaws are hard... it's hard to bring up the fact that while yes, your module solves this one immediate problem, it was written so rigidly that two sprints from now the whole thing will need to be re-written from scratch. Or maybe you took a particular approach that seemed easy to implement, but it won't do async stuff properly and so it needs to be re-worked from the ground up. And I get you solved your immediate problem for your immediate task, but whoever works on this next is going to have to completely re-think the approach, and that's not theoretical, that's next week when we add more behavior.
You ever worked on a team where someone would go write code in their own crazy way that wouldn't follow any sort of existing pattern or take advantage of existing tooling? So they spend like a week on a simple task because they re-wrote strings.c rather than just including it? Yeah, they get fired eventually, but they are the worst to write code reviews for, because you go into it just thinking WTH, why are they doing it this way, this entire approach is convoluted and error-prone and rigid and fragile and complicated.
If a super engineer has advice for me on how to deal with this stuff, I'm all ears. I usually just ignore it unless it directly impacts me, and then go back and re-write it when we have to add on to it, make it interop with more code, or release it as a public API.
I had a coworker who had a 2,000 line PR that took me 3 days to review; it was like code-breaking. Custom abstractions upon custom abstractions, abstractions buried in custom imported utilities wrapping a one-liner that was only used once. Basically, instead of writing the code, he wrote code that generated the needed code at runtime. Dear Lord. Like somehow this monstrosity was good because instead of creating the feature, he built tooling to stamp out many identical features whenever we want! Except nothing can be different; if anything is different we need to rewrite the whole thing.
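For anyone lucky enough never to have seen this anti-pattern in the wild, here's a minimal sketch of what I mean (hypothetical, in Python; the names are mine, not from the actual PR), contrasting just writing the feature with generating it at runtime:

    # The straightforward version: just write the feature.
    def greet(name):
        return f"Hello, {name}!"

    # The "generator" version: assemble source code as a string at runtime,
    # exec() it into a scratch namespace, and fish the function back out.
    # Every feature it produces has the identical shape; any real variation
    # means rewriting the template string itself.
    def make_feature(func_name, prefix):
        source = (
            f"def {func_name}(name):\n"
            f"    return '{prefix}, ' + name + '!'\n"
        )
        namespace = {}
        exec(source, namespace)  # runtime code generation
        return namespace[func_name]

    greet2 = make_feature("greet2", "Hello")
    assert greet("world") == greet2("world")  # same behavior, 10x the machinery

The second version does exactly what the first does, except the "feature" now lives inside a string template, so anything that differs even slightly from the template means rewriting the generator.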
Every 4 hours or so I would ask my boss if I really had to keep reviewing it. He told me to just keep going; I think he was building a case for letting the guy go. I still only made like 10 comments, and needless to say they weren't taken well. What a nightmare.
Tangential to this whole thread, but with guys like that, every day he worked cost other devs two days to fix or undo his work. Took my company a few years to catch on; he was a senior dev and also good at talking himself up. I would have paid him the same salary to go sit on a beach somewhere and enjoy himself instead of touching the code, and we'd all have been happier. In fact the world would be a better place, because now he is presumably working somewhere else doing the same things.
That has got to be one of the most fascinating meta-comments I've ever read, given that the article goes on to say:
> The researchers measured individual differences in thinking styles and found that regardless of the political party identification, when high need for cognition individuals were presented with fake news stories that were consistent with their ideology, they were even more likely than everyone else to judge the story as legitimate, and when they were faced with fake news story that were inconsistent with their ideology, they were even less likely to consider the news legitimate than everyone else.
> One possible explanation is that highly logical and rational people are not only more likely to disbelieve politically-inconsistent news stories along tribal lines, but they are also more likely to seek out further disconfirming information, thus exaggerating their disbelief of politically-inconsistent stories. This is consistent with research showing that people who score high in need for cognition tend to build information rich social networks, but of course this can be problematic when your rich social networks are still operating in an echo chamber.
> However, liberals aren't off the hook, as they are statistically more likely to use investment in the righteousness of their political viewpoints to believe politically-consistent news stories, and their higher level of need for cognition to delegitimize politically-inconsistent news stories. The researchers found that liberals who scored higher in a measure of "collective narcissism"-- which measures a tendency to invest in, and perceive superiority of, your political views--showed exaggerated legitimacy judgments for the politically-consistent (e.g., anti-Trump) fake news stories.
> One possible explanation is that highly logical and rational people are not only more likely to disbelieve politically-inconsistent news stories along tribal lines, but they are also more likely to seek out further disconfirming information, thus exaggerating their disbelief of politically-inconsistent stories. This is consistent with research showing that people who score high in need for cognition tend to build information rich social networks, but of course this can be problematic when your rich social networks are still operating in an echo chamber.
I haven't said anything in conflict with this. My point is that there's a difference between susceptibility to believing fake news and the rate at which one actually believes fake news. Surely you wouldn't deny that "highly logical and rational people" are more likely to have correct political opinions than those who are the opposite?
Rather, the point is that both liberals and conservatives develop a set of beliefs that allows them to make snap judgments. So they're equally susceptible to believing an individual piece of fake news that is targeted at them. But as a matter of fact, the things that liberals believe tend to be (but are not always) more true than those believed by conservatives. This is not an accident: the very beliefs that make up the biases liberals hold are often informed by some knowledge of what the facts are.
> Surely you wouldn't deny that "highly logical and rational people" are more likely to have correct political opinions than those who are the opposite?
That's a complicated question, really... I guess I don't know what qualifies as a "correct political opinion". Presumably every person believes they have a "correct political opinion", because if they did not then they would change it. Obviously some number of these people are wrong. So for any given opinion do we just take into account what the majority of "highly logical and rational people" believe? (leaving aside that I don't know how you would do anything other than vaguely define that group). What if there is no majority, because there are seven different opinions on the same topic? That would mean that for any given opinion, the majority believe it is incorrect. Are, then, all political opinions incorrect, or perhaps incorrect to varying degrees? This doesn't seem unreasonable to me, but it is still rather impossible to quantify.
> But as a matter of fact, the things that liberals believe tend to be (but are not always) more true than those believed by conservatives.
Is this true? How could we tell? We obviously cannot axiomatically prove it, so we must infer it based on some sample of both groups, presented with "truths". Because we're talking about political opinions, we can assume those truths are limited to this arena. In that case, we must also provide an objective framework, and potentially a moral one... which gets fairly messy.
I do agree with your use of the words "likely" and "tend", as they convey a notion of probability, which seems like a reasonable way to approach this. It does seem likely to me that if you presented the set of "highly logical and rational people" with a political issue and collected their opinions, and then did the same with the inverse set, and I had to place a bet on which group's decision I would like more, I would bet on the first group. However, if you took that same set and compared it to, say, "all individuals with knowledge of the field", I would likely bet on the second set, due to their superior context.
You allude to context when you mention snap judgments. I don't necessarily agree, however, that liberals have more knowledge (or perhaps "superior context") than conservatives. In some arenas this is definitely true, but in others the reverse is just as obviously true.
Another interesting line of reasoning: if we accept that liberals and conservatives are equally susceptible to fake news, it follows that "highly logical and rational people" are not, in fact, more likely to have (more) correct political opinions. Let's say, for example, that most "highly logical and rational people" are conservatives, and let us further stipulate that significantly more fake news is peddled to conservatives than to liberals. This would indicate that the group predicted to have the more correct political opinions definitely does not. Of course, the inverse is also true, which really means that no correlation between general rationality and correctness of opinions can be drawn. This isn't entirely surprising... Ben Carson is one example of this, IMO: someone who is clearly brilliant and rational, but perhaps not entirely correct in all his political opinions. It seems more reasonable to select for experience and domain knowledge over general rationality, which I think is something you're sort of suggesting when you say that liberals are informed by more knowledge of what the facts are. But to split domain-knowledge groups into liberal vs. conservative is probably one of the most ineffective splits possible, and I would be surprised if more "fact knowledge" could be attributed to one group over the other within a reasonable margin of error.
Sorry this got a bit rambly. Just kind of working through my thoughts.
There's a lot of good stuff here. But there seems to be a thread of "correct political opinions are purely subjective." This is sometimes true, I suspect, but there are two cases where it isn't:
Some "opinions" are actually just awareness of or acceptance of objective fact. To stick with the theme: Obama either gave or did not give $x to y charity in year z. Two people can hold differing subjective opinions, but one is objectively wrong.
The other case is when a political "opinion" seems subjective but is really based on a set of objective facts. This is where debate in the current climate goes to hell. I've found that people I disagree with often hold rational positions based on the "facts" they believe to be true. I just don't think any of their facts are accurate. One of us in that case is objectively correct, but resolving that takes an authority we both believe to be canon. That doesn't exist.
I prototyped something for a startup I was working on, and put in about 80-100 hours a week for three months. That was probably the most I could have managed. I went a bit crazy. My dreams got weird.