> People not able to factor out functions or structure their code in a readable way. Variables are called v1, v2, v3. Unit testing seen as a waste of time. CI seen as a fun toy. They lack the experience to even notice the difference.

Had a colleague who worked under a 'team lead'. The task: take a form with a variable number of rows of input data - max 50 - parse the data, and store it. Took 20-30 lines of code. Next day: "I don't trust loops, these need to be unlooped". Really? This was all in writing and stated out loud in a meeting with witnesses, and everyone agreed. "Loops can be tricky - they don't always work like you think" (something like that). So a 30-line block of code with a loop became 1200+ lines with 'v1, v2, v3.... v50', each wrapped in an 'if' to check whether that row number had actually been submitted.
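
For a sense of the contrast, it was roughly this (a hypothetical Java sketch, not the actual code; all the names are mine):

  import java.util.ArrayList;
  import java.util.List;
  import java.util.Map;

  // hypothetical sketch of the contrast, not the actual code; "form" just
  // stands in for whatever held the submitted fields
  class FormRowsSketch {
      // the ~30-line version: one loop over the variable number of rows
      static List<String> parseWithLoop(Map<String, String> form) {
          List<String> rows = new ArrayList<>();
          for (int i = 1; i <= 50; i++) {
              String value = form.get("row" + i);
              if (value != null) {
                  rows.add(value.trim());
              }
          }
          return rows;
      }

      // the "unlooped" version: v1 through v50, each with its own if
      // (only the first two shown; the real thing repeated this 50 times)
      static List<String> parseUnlooped(Map<String, String> form) {
          List<String> rows = new ArrayList<>();
          String v1 = form.get("row1");
          if (v1 != null) {
              rows.add(v1.trim());
          }
          String v2 = form.get("row2");
          if (v2 != null) {
              rows.add(v2.trim());
          }
          // ... and so on up to v50
          return rows;
      }
  }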

The code to generate the form was, of course, a loop that spat out holders for 50 rows. THAT was OK, because someone else's team wrote that a while back (really??) and ... it was already done and in production. The lead could not put their stamp on it.

Very, very, very weird. Half a dozen other people all nodding their heads, suggesting that a 30-line loop is fraught with danger and the correct answer is copy/paste 50 times. Felt like gaslighting, to my recollection. I worked in the same dept, just not on the same project, but enough of this was heard and picked up across the dept.

And... my colleague and I aren't there any more, and to my knowledge, that team lead is still there.




This is what I call "preloopsarian": a state of coding innocence in which one has discovered assignment and alternation, but not iteration.


Looks like straight out of https://thedailywtf.com/


It's been a while since I'd visited! Always amusing.


>Felt like gaslighting, to my recollection.

Sounds like a real-life example of the Asch experiments.

https://en.wikipedia.org/wiki/Asch_conformity_experiments


That's terrible.

I was in a situation in the early 2000s where the team that would maintain our application after we were gone (to another project or product or company) wasn't skilled enough to follow certain things, and we were asked to make a number of changes to make it easier for them. In that case, the leader of their team was self-aware, honest, and communicative, which is the rare and exotic thing, but we did have to re-architect some things and even change the programming language in one area to suit their capabilities. Sometimes that's a business need, and it matters.


At the very least it's a signal to hurriedly look for a new job, if not resign on the spot.


One time I needed to sort some data arbitrarily — the resulting order did not matter, only that the same data always produced the same order, regardless of what order it arrived in.

My senior engineer advised me against using Java .sort() because “we didn’t write it so we couldn’t be sure it would do the same thing every time.”
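
In other words, the requirement was roughly this (a minimal Java sketch; the names are mine):

  import java.util.ArrayList;
  import java.util.Arrays;
  import java.util.Collections;
  import java.util.List;

  // minimal sketch of the goal: same elements in, same order out,
  // regardless of the order the input arrived in
  class CanonicalOrderSketch {
      static List<String> canonicalize(List<String> input) {
          List<String> copy = new ArrayList<>(input);
          Collections.sort(copy);   // any total order works; natural order is simplest
          return copy;
      }

      public static void main(String[] args) {
          System.out.println(canonicalize(Arrays.asList("b", "c", "a")));
          System.out.println(canonicalize(Arrays.asList("c", "a", "b")));
          // both print [a, b, c]
      }
  }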


To play the fun game of charitability, that engineer could have been talking about sort stability — how ties are handled could technically violate the property you want, if distinct elements compare as equal.

A quick search, however, does say that Java's .sort() is stable (for objects it uses TimSort, a stable merge sort).


Sounds like something out of https://blog.codinghorror.com ! :)


I regularly see code of the form `if(x == false)` because the author distrusts `if(!x)`.

I guess the author just distrusts smaller things, leaving me to distrust the author’s larger things.


This is on a completely different level of "issues" than the other problems in this thread. During code review I'd only mention it as a nit. The longer form is correct, and the only downside is that it's a bit longer. It doesn't mess up code modularity or affect maintainability in a noticeable way.

Well, assuming that the language doesn't have any quirks in this area - e.g. in Java your statements aren't equivalent for a Boolean x.


Nitpick (as in general you are right regarding Java as well): I'm fairly sure they are the same for Java in this instance. Both will convert the Boolean to boolean, throwing an NPE if it is null.
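
A throwaway check, assuming a Boolean x:

  class BooleanNpeSketch {
      public static void main(String[] args) {
          Boolean x = null;
          if (x == false) {    // unboxes x to compare with the primitive: NullPointerException
              System.out.println("unreachable");
          }
          if (!x) {            // would likewise unbox x and throw NullPointerException
              System.out.println("unreachable");
          }
      }
  }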


Ha, that's true, thanks! I guess my Java-fu is weak these days :)


They're not equivalent in many languages (JS, C++/Swift with operator overloads, when x is nullable, etc. etc.).


It's been a while since I absorbed the weird programming norm that "real programmers use the !x form!", but even after 10+ years of !x, I still find ==false more readable.


I agree. To me, it's simpler to understand. Suppose x is a bool: reading the code, I say to myself "if not true..." or "if not false...", and my ape brain gets confused about what happens if it's not true or not false.

Reading "if true == false" or "if false == false", it becomes much clearer what we're testing here and I understand it instantly.


If the statement is "if (!isGreen)", it's much clearer to say "if is not green" than it is to say "if is green is false". Putting == true or == false makes you convert a clear statement "is green" into "true" or "false" instead of just being a natural English statement. It would be like saying in conversation, "I want to go to the store is false" instead of "I don't want to go to the store".


> If the statement is "if (!isGreen)", it's much clearer to say "if is not green" than it is to say "if is green is false"

I agree that when you read it, it's clearer. And yet I still prefer "if(isGreen == false)" for reasons of clarity in another sense.

The "!" being right next to the "(" makes it easier to miss the "!" when scanning quickly through the code, hence reading the logic the wrong way round and seeing "(isGreen" instead of "(!isGreen". And that's enough of a risk to ignore the readability advantage of "(!".

(Edit: To be clear, I don't suggest "== true" for the opposite cases, as the lack of a "!" in those means the risk is gone)


It also helps readability if the ! is before a function name that doesn't follow the right naming convention for it. One of my pet peeves in C is "if (!strcmp(a, b))". "!strcmp" I read as "not string compare", and I would expect it to mean that the strings don't match, when it means the exact opposite. This is true of anything following the "0 means success, anything else is an error condition" error handling scheme. So I use "if (strcmp(a, b) == 0)" instead, because the "==" makes me look specifically at what value it's being compared to, and I make fewer assumptions.


Even if the not operator in the language you're writing in happens to be the actual word 'not'?


My whole career has been C++ and shader languages, so this really hasn't come up for me. I imagine it being a real word would improve readability greatly.


Even more common seems to be

  if (x == true)
which always seems to come with some argument about how it is "more clear".

I have started to ask people straight away to change to

  if ((x == true) == true)
which, following the same argument, should be even more clear.


I've written code like that sometimes. Comparing against false is more specific, depending on the language. There are many falsey things that are not false themselves.


In some languages these do different things, right? (Or if someone did something horrendous with operator overloading)


As much as people push the more succinct if(x)/if(!x) style of expression, I don't know that it is better. Now, if your example were if(is_a_thing), then maybe it reads better. Add to that the possibility of three-valued logic and I could lean more toward the if(x == false) style.


The worst one is "if x=true", which to me says the writer doesn't know what the if statement does...

That said, if x is data read from elsewhere that just happens to be boolean, I can write code like that in Python.


The UW intro CS courses call the `if (x)` form "Boolean Zen", which I've always enjoyed.


I've seen this in Ruby and Elixir, and it drives me a little nuts:

  if !is_nil(foo)


While it may not matter in many circumstances, this is not the same as "if foo", because false is not nil.


It's true but in 10+ years of writing Ruby it hasn't mattered in _any_ circumstances I've come across. I also assert that having a boolean where `nil` is meaningfully different than `false` is a smell and should be avoided.


Some of that might be Python habits, which is terrifying for completely different reasons.


!x is not equivalent to x === false.


It's not usually needed (especially these days), but there are times that it is better to repeat every possible iteration by hand and not have a loop.

This is a technique called loop unrolling. It is done for performance reasons. It's something we used to do at a company working on games for old feature phones (think Nokia Series 30/40/60 stuff). The devices were very limited, there was no direct control over J2ME garbage collection, etc., so loops could very noticeably slow down games.
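
For anyone who hasn't run into it, the transformation looks roughly like this (a made-up Java sketch, not our actual game code):

  class UnrollSketch {
      static int[] positions = new int[4];

      // straightforward version: counter, bound check, and branch on every pass
      static void moveWithLoop() {
          for (int i = 0; i < positions.length; i++) {
              positions[i] += 1;
          }
      }

      // unrolled version: the same work with no loop bookkeeping
      static void moveUnrolled() {
          positions[0] += 1;
          positions[1] += 1;
          positions[2] += 1;
          positions[3] += 1;
      }
  }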

We initially wrote code with loops, then performance-tested and manually unrolled where necessary. Eventually this became very burdensome, so we... wrote code to unroll the loops for us, and that code of course had loops in it, because it was build code that wouldn't ship.

There are other performance situations where this technique applies.

This may not have been the situation there, but I think it's important that, rather than assume stupidity from the outside, we try to ask why.


Loop unrolling seems like something that should be done by a compiler when you turn on aggressive optimization flags, and not something you need to code explicitly.


Now, that's probably true. However, in the days of feature phones, compilers' automatic optimizations were still inferior to a person unrolling a loop in ASM.


That is why I said it's usually not needed, "especially these days". This was not true at the time. Relying on primitive compilers, and especially on the likes of the J2ME garbage collector, would lead to total freezes in a game from something as simple as a loop. As the garbage collector decided that data from previous passes of the loop was no longer needed, it could trigger a garbage collection sweep. On such a slow and limited device, a sweep would literally cause the game to freeze until it completed, which could take on the order of several seconds.

The loop unrolling was just one example of how we would go about preventing an undesirable sweep.

As another example, a large global game-object array would be created when a game started. As objects were "created" and "deleted", we would really just update the pre-created objects in that array.

This allowed us to prevent garbage collection, while simultaneously making sure we didn't run out of memory.

The Nokia 1618 (a Series 40 device) had a heap limit of 1024 KB. Many S20 and S30 devices were even more memory constrained.
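
Concretely, the pre-created-object trick was roughly this (a made-up sketch, not the real game code):

  class GameObject {
      boolean active;
      int x, y;
  }

  class ObjectPool {
      static final GameObject[] POOL = new GameObject[64];

      static {
          for (int i = 0; i < POOL.length; i++) {
              POOL[i] = new GameObject();   // everything is allocated once, up front
          }
      }

      // "creating" an object just claims an inactive slot; nothing new is
      // allocated, so the garbage collector has nothing to sweep
      static GameObject spawn(int x, int y) {
          for (int i = 0; i < POOL.length; i++) {
              if (!POOL[i].active) {
                  POOL[i].active = true;
                  POOL[i].x = x;
                  POOL[i].y = y;
                  return POOL[i];
              }
          }
          return null;   // pool exhausted
      }

      // "deleting" just releases the slot for reuse
      static void despawn(GameObject obj) {
          obj.active = false;
      }
  }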


> my colleague and I aren't there any more

The only right choice in that situation.


At one of my earliest jobs, in a previous century, if statements were introduced to the language (RPG-3).

I was delighted, but the old-timers were quite suspicious of this experimental technology.


That’s some did-I-wake-up-in-another-dimension shit. :)


Makes me think I should consider myself lucky that so much time has passed since the last time something reminded me of https://thedailywtf.com...


This is terrifying.


With people like that roaming around the landscape, maybe there is a point to all that leetcoding nonsense, since they'll certainly spout off some truly insane stuff in interviews...


Which country was this in?


Elbonia perhaps?


US of A.

FWIW this was... 2005-ish.



