The worst program I ever worked on (jacquesmattheij.com)
147 points by bemmu on March 16, 2011 | 130 comments



My instructor in the introductory programming course informed us that yes, the grader did indeed read our source code, and check to make sure we'd commented it meaningfully.

I decided to test this assertion.

The program in question was trivial - it was one of those contrived examples of how OOP works, a "Kennel" which had "Dog" objects which possessed various traits, and would bark() or wag(), stuff like that.

I obfuscated the code as much as I could - indentation all over the place, line breaks in inappropriate places, code hidden inside multi-line comments, and multiple dummy variables named one character off from the useful variables. The useful variables had names like "satrtrek," "maryppopins" and "pizza." The dummies were "startrek" and "marypoppins."

This exercise taught me nothing about writing code, but because my code gave the expected output, I still got an A. Source code comments from then on were made only as needed.

The next semester, I got an email from a friend of mine who just couldn't get his head around what was supposed to be going on. He asked whether I'd kept the source to one particular program, so that he could review it and see if he couldn't make sense of the assignment.

I had indeed kept it, and just sent the .java files to him without opening them for review. By coincidence, the program he was stuck on was the one I'd obfuscated. He switched his major to Math and Spanish for what I believe to be unrelated reasons.


Worst for me was something I was asked to maintain a little while ago. It was essentially Microsoft Reporting Services implemented in 80,000 lines of VB.NET.

The first thing I did was chuck it into VS2010 and run some code metrics on it. The results follow:

10 or so methods had 2000+ lines of code. The maintainability index was 0 (a number between 0 and 100 where 0 is unmaintainable). The worst function had a cyclomatic complexity of 2500 (the worst I had ever seen on a function before was 750 odd). It was full of nested inline dynamic SQL, all of which referred to tables with 100+ columns, which had helpful names like sdf_324. There were about 5000 stored procedures, most of which were 90% similar to other ones with a similar naming scheme. There were no foreign key constraints in the database. Every query, including updates, inserts and deletes, used NOLOCK (so no data integrity). It all lived in a single 80,000 line file, which crashed VS every time you tried to do a simple edit.

I essentially told my boss I would quit over it as there was no way I could support it without other aspects of work suffering. Thankfully it was put in the too hard basket and nobody else had to endure my pain.

I ended up peer reviewing the changes the guy made some time later, and a single column update touched on the order of 500 lines of code.

EDIT - I forgot to mention, there was so much nested if code in the methods that you could hold down page down and it would look like the page was moving the other way, similar to how a wheel on TV looks like it's spinning the other way.


I met the guy who wrote this bit that I had to maintain:

  for(a=0;a<NbrOfAs;a++){
   for(aa=0;aa<NbrOfAAs;aa++){
    for(aaa=0;aaa<NbrOfAAAs;aaa++){
     for(aaaa=0;aaaa<NbrOfAAAAs;aaaa++){
      for(aaaaa=0;aaaaa<NbrOfAAAAAs;aaaaa++){
       if(aaaaa==0){}else{ExamineAAAAA();}
      }
     }
    }
   }
  }
What really pissed me off was that he was such a nice guy.

He still works a lot. Makes a lot of money. And his customers love him. (No, I don't think they review his code.)


At least that code is indented properly..

Much of the code that I need to fix seems to be written by people who learned on day one of programming that "whitespace doesn't matter to a computer" and never looked back.


Yes. I've even heard people justify their haphazard, random formatting by parroting the old "the compiler doesn't care about formatting" line they heard from the instructor at the technical college on the first day of their language course.

If the compiler was the only other entity with which you're collaborating on a project, fine. But in the real world other humans have to read your garbage.


Did you ever ask this person why he wrote this code in this manner? Did he trot out the "job security" crap or was there a legit reason?


I met someone once who called every variable X. Yes really. The compiler was somehow just supposed to figure out that X on this line and X on that line were different things... What is baffling is that he did have code that worked, kinda.


Unless there is a lot of creative abuse of operator overloading or macros lurking in there that code really doesn't look too awful compared to some monstrosities I've had to work with.


I agree. At first glance, I thought that the NbrofA/AA/etc names were just gibberish. The most nonsensical part to me is starting the for-loop iterator at zero, and then only executing the function if the iterator is non-zero.
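
For the record, assuming the loop variables really are independent and ExamineAAAAA() takes no arguments (which the snippet doesn't actually guarantee), the zero check just means the innermost loop is equivalent to:

  for(aaaaa=1; aaaaa<NbrOfAAAAAs; aaaaa++){
   ExamineAAAAA();
  }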


I hadn't even noticed the if - I was thinking it was loops all the way down....


Hm, did you check that each loop worked with its own loop variable? If not, how do you _know_ it does what you think it does?

If you code like this, you will have to be paranoid with each line you read. That is what makes such code bad, even if it does what it appears to do at first sight.


Is this equivalent?

  for(a=1; a < (NbrOfAs*NbrOfAAs*NbrOfAAAs*NbrOfAAAAs*NbrOfAAAAAs); a++){
    ExamineAAAAA()
  }
Or do I need a nap?


In most languages, it is not guaranteed to do the same thing; the multiplication may overflow.
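
A contrived sketch of the hazard, assuming hypothetical bounds and 32-bit ints (not anyone's real code):

  #include <stdio.h>

  int main(void) {
      /* Each bound fits comfortably in an int... */
      int NbrOfAs = 100000, NbrOfAAs = 100000;
      /* ...but the flattened loop bound is their product, and 10^10
         overflows a 32-bit int (signed overflow is undefined behaviour
         in C). The nested loops never form that product at all. */
      long long widened = (long long)NbrOfAs * NbrOfAAs;
      printf("the product really is %lld\n", widened);
      return 0;
  }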


You really don't know bad programming until you have spent some time in a 50,000 line COBOL program. I'd post some crap I work on every day but I don't want to make anyone cry. Never mind. Here's some random code I'm working on.

             MOVE SPACES TO LISTBAT-NAME.
             STRING WORK-FILES "LIST.BAT" DELIMITED BY "  "
             INTO    LISTBAT-NAME.
             OPEN OUTPUT LISTBAT.
             MOVE SPACES TO SCR-S.
             STRING "DIR /B " DATA-PREFIX " > " 
             WORK-FILES "TMPLIST" DELIMITED BY "  " 
             INTO  SCR-S.
             MOVE SCR-S TO LISTBAT-REC.
             WRITE LISTBAT-REC.
             CLOSE LISTBAT.
             CALL "C$system" USING LISTBAT-NAME, 96 
             GIVING STATUS-VAL.
             MOVE SPACES TO TMPLIST-NAME.
             STRING WORK-FILES "TMPLIST" DELIMITED BY "  " 
             INTO  TMPLIST-NAME.
             OPEN INPUT TMPLIST.
             MOVE LOW-VALUE TO LIST-NAME.
             PERFORM UNTIL 1 = 2
             READ TMPLIST
             INTO SCR-S
                AT END EXIT PERFORM
             END-READ
             ADD 1 TO PROGRESS-REC-CT
             INSPECT SCR-S CONVERTING LOWER-CASE-ALPHA TO 
              UPPER-CASE-ALPHA
              PERFORM VARYING SCR-X FROM 50 BY -1 UNTIL 
              SCR-X = 1
              IF SCR-S(SCR-X:1) = "."
              MOVE SPACES TO SCR-S(SCR-X:)
              EXIT PERFORM
              END-IF
             END-PERFORM
             CALL "CC/STRINGER" USING SCR-S, STRING-INFO
             MOVE SCR-S(1:STRING-INFO-LENGTH) TO TMP-RID
             PERFORM LOAD-RPT-FILE THRU END-LOAD-RPT-FILE
             END-PERFORM.
             CLOSE TMPLIST.

The line breaks on some of those lines are off because HN wraps them, but you get the idea.


Note to self: do not learn COBOL.


I'm sure it would make you a better programmer. Using tools that were abandoned by their company 10 years ago teaches one to be humble and to appreciate the little things in life.


That's funny, because I look at that and think it's too bad I don't know anything about COBOL. It's not like I'm planning to learn it or anything, but it's quite different from anything I've ever worked with.

So, questions: I know I'll never program in COBOL. But I'm interested in programming languages in general. How much of a "quick intro" would be worth digesting just for the purpose of contrasting to C, VBA, Lisp, etc.? Is there a good one to look at?


I'm interning with a company that is a COBOL shop. It took me months to make the transition from "normal languages". I don't think it's worth learning honestly unless you get a job in it. Here are some things it has taught me.

It's taught me to create pretty code that is indented correctly. I pretty much live in a debugger. Most people out of a CS program think they know how to debug stuff. They don't have a clue. It's taught me to be very meticulous and review every single thing I do, down to periods (which, BTW, terminate loops and if statements, making life hell). If you can't understand what you just wrote, it needs to be rewritten.

As far as language comparisons go: all variables are declared at the top of the program. All variables are fully global. If you move a variable to a smaller variable it doesn't throw an error; it just gets silently truncated. Loops start at 1 instead of 0. There is very limited error handling.
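
If the silent truncation sounds abstract, here's a loose C-flavoured analogy (my own illustration, not COBOL - the details of MOVE differ, but the "no error, just data loss" feel is the same):

  #include <stdio.h>

  int main(void) {
      int   big   = 70000;
      short small = big;   /* narrowing: no error, the value just loses
                              its high bits (typically 4464 on a 16-bit short) */
      printf("%d became %d\n", big, small);
      return 0;
  }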

I don't know of any resources online. I searched when I started working but didn't find anything really helpful. I did have a 25-year-old, 30-million-line code base to learn from, though.


You know, I once had that rule for myself. And I would state it loudly.

My first consulting gig involved writing some moderately large royalty accounting programs in RPG-III for a System/34. One day, the engineer from the vendor stopped by and plopped COBOL onto the system.

Suddenly, COBOL didn't seem all that bad, and I broke my rule.

Yes, there are some things worse than COBOL, and I hope you never have to deal with them.


I worked on a piece of code that literally drove the guy who wrote it round the bend. He had a mental breakdown, ran away, and eventually ended up in a different state ~3000 miles away.

It wasn't so bad.

Now debugging code written by Chemistry Students... that's some scary stuff right there.

Which is not to go all elitist and put Comp Sci degrees up on a pedestal; one of the better coders I've had the pleasure of working with was a trained vet. He was very methodical.

Actually, now that I think about it, working with something that will bite your hand off if you make it angry is probably good preparation for dealing with a compiler...


You should try working with code written by hardware engineers. Those guys are extremely smart when it comes to digital and analog electronics, but never ever let them write a device driver ...


Those guys are too smart. When you start to write programs that are too big to understand as one lump, you start to learn to decompose your code and organize it in ways that help you understand it a little bit at a time. They never do that; they just keep the whole program in their head at one time. Tell them that a function should do one "simple" thing that makes sense as a unit, and they have no problem putting the whole program in that function.

Actually, I shouldn't say they are too smart, because they will continue to program like that even when it causes them problems....


I had the interesting experience of maintaining a device driver written by a hardware engineer, for a servo motor. There was no reason for it to be in assembly language, but it was. No comments, either. Very bright guy, but he'd just never absorbed much common sense about coding up maintainable software.

I think the thing here is that, for the most part, hardware guys don't want to write software, so they don't bother to learn how to do it right. This usually comes back to bite them in the ass, when they end up needing to write some code anyway.


I did. Same job that drove the guy nuts.

Turns out most of the obscure errors were on the driver end. Go figure.

Arguably, at the exorbitant rate the company was paying these bozos to fix the problems with their own code (approx $200 per hour) there wasn't really much of an incentive for them to get it right the first time...


I worked with a guy (I won't name the company) who wrote Java code in one, huge, static class as much as possible. In fact, everything was largely in one function too.

He decided to name his fields alphabetically.

  static int a
  static int b
  static String c
  static float d
  static int e...

What, I wondered, would happen when he ran out of letters? Scrolling down further I saw this:

  static int aa
  static float ab
  static String ac
  static int ad...


There was one networks class I took where the assignment was to implement a simple network protocol to do file transfers over a serial port. Computers in the lab were paired up and had their serial ports connected to one another. People were assigned to computers and given either the receiver or transmitter to implement.

I was about done implementing the first draft of my side and asked the other side how it was going so we could test some actual communication. The response I got was "it's about done, we just need to split it up into functions". I was initially shocked and then naively impressed that someone could actually reason about the problem without breaking it down.

The end result was of course that I just had to give up and implement both sides of the communication. This was eventually a much better learning experience. I ended up abstracting out the serial port and allowing the two sides to communicate through a unix pipe with random bit errors introduced in packets to test the recovery. I could then run much longer testing without depending on the lab or someone else. I think I eventually tested it enough that I was up against the fundamental problem that the cheap checksum we were using let errors pass way too easily.
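
Not the original harness, obviously, but the bit-flipping part of that setup is only a few lines; something along these lines (hypothetical helper, C assumed):

  #include <stdlib.h>
  #include <stddef.h>

  /* Flip roughly one bit in every `rate` bits of an outgoing packet
     before it goes down the pipe, to simulate a noisy serial line.
     Sketch only -- not the poster's actual test code. */
  static void inject_bit_errors(unsigned char *buf, size_t len, int rate) {
      for (size_t i = 0; i < len * 8; i++) {
          if (rand() % rate == 0)
              buf[i / 8] ^= (unsigned char)(1u << (i % 8));
      }
  }

Run every packet through that on its way into the pipe and you can soak-test the protocol for hours without the lab, which is roughly how the weak checksum got caught in the first place.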


That exact same thing happened to me, except that I was using a smaller error rate, and a Hamming(7,4) code for error correction.
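
For anyone who hasn't met it: Hamming(7,4) adds three parity bits to every four data bits so a single flipped bit can be located and corrected. The encoder side is tiny - a C sketch (my illustration, not the coursework):

  /* Pack 4 data bits (low nibble) into a 7-bit codeword laid out as
     p1 p2 d1 p3 d2 d3 d4. Each parity bit covers the codeword
     positions whose index has the corresponding bit set. */
  unsigned char hamming74_encode(unsigned char nibble) {
      unsigned d1 = (nibble >> 0) & 1, d2 = (nibble >> 1) & 1;
      unsigned d3 = (nibble >> 2) & 1, d4 = (nibble >> 3) & 1;
      unsigned p1 = d1 ^ d2 ^ d4;   /* positions 1,3,5,7 */
      unsigned p2 = d1 ^ d3 ^ d4;   /* positions 2,3,6,7 */
      unsigned p3 = d2 ^ d3 ^ d4;   /* positions 4,5,6,7 */
      return (unsigned char)((p1 << 6) | (p2 << 5) | (d1 << 4) |
                             (p3 << 3) | (d2 << 2) | (d3 << 1) | d4);
  }

The decoder recomputes the three parities over the received word; the resulting 3-bit syndrome is either zero (no error) or the position of the flipped bit.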

After the ordeal was over, I looked at the code that the other guys had been writing. They hadn't started on the error detection and correction -- I'd heard much wailing and gnashing of teeth earlier about how mathematical it was -- and their code (all in one big main function, with no indentation) wouldn't compile. I watched as they spent about an hour randomly permuting it, to no avail.

I'm not sneering at these guys. I'm baffled by them.


To really bork up the Java, you need a pattern fanatic. Once you start working with Handler Adapter Handlers, you know you should have taken the other colour pill.


When you have factories making factories, it's time to hoist the flag, get out the knives and start slitting throats.

-- after H. L. Mencken


Yes, I remember working for a company (Java devs) where at some point we ended up having wrappers around wrappers around wrappers delegating stuff around, factories of factories... It made your head spin.


I mean this dead seriously: People complain about the abstractions like "Monad" in Haskell, but I'm yet to see anything as abstract and difficult to reason about as a decorator around a facade delegating to an implementation of a factory factory of something probably producing a concrete instance of some other pattern monstrosity.


Definitely, but a point should be made that design patterns themselves are not a bad thing, quite the contrary, actually. It's overuse of design patterns that should be a crime. I came across some pretty difficult to understand abstractions, but fortunately nothing like the monstrosity that you described.


Yes. APIDelegatorInvocationHandler and APIAbstractFactoryFactoryProvider. (actual Java class names)


When you said 'actual Java class name' I thought for a moment that those were actually classes in one of Java's libraries. I googled the names, though, and fortunately, they're not...



That stuff has its place. But it's needed less commonly than it's used.


Often smart people coming from mathematics start programming this way. In math it's traditional to give everything what programmers would consider cryptic one-letter names, partially because you tend to write them on paper a lot while thinking, and partially because you have fewer entities so it's less bothersome to remember their meaning.

Of course it's a bad idea while programming, I just find it helpful to remember the reasons that smart people can make seemingly terribly unaesthetic decisions.


Sounds like he got his start programming spreadsheets.


He's not a person, he's a compiler! I know single static assignment when I see it.


I worked with a guy, a long time ago, who would take code already split into functions and refactor it into one big function. I am not joking here; it actually happened.


I've seen that too. I don't think it was intentional -- the programmer simply didn't understand what the existing code structure was for, and "defactored" it.


Just think of all the function-call overhead he saved!


There are actually systems where this is a non-trivial gain. I've heard some of the Exstream Software (since acquired by HP) guys tell stories about having to interoperate with ancient mainframes their customers refused to replace. In order to get throughput on some of the logic up to an acceptable level, they had to do just this. Amdahl's Law, a-gogo!


Happened to me in a college course - fortunately I haven't seen it since. However the CURRENT codebase I'm working on is a whole other set of nightmares.

Bonus points for the fact that they don't let me fix it...


That was actually standard Fortran practice, back in the day when variables were case-insensitive and could only have four characters. You haven't lived until you've ported some of the Fortran code aerospace companies have been running since the '50's.


I think the best way to guarantee job security is to solve hard problems for the company you work for, and make it visible (i.e., communicate about it). This will gain respect from your peers and give you a reputation of being useful and very much worth keeping around.

Writing some kind of obfuscated code is shooting yourself in the foot - after a while not even you can maintain it. Plus, your peers will notice and probably not like it. This notion will percolate up to management.

Another way to guarantee job security might be to be the only person able (or willing) to maintain some old legacy system. This carries the risk that if that system is finally scrapped or replaced, your job gets scrapped too ;-)

edit: horrible grammar


Not a pleasant story, but true...

I used to work at a place where one of the guys wrote a horribly complicated piece of code that about half of the system depended on. He was also quite an unpleasant man who routinely mocked everyone else in the company for not being as clever as he was.

The company ran out of money and needed to halve its workforce - one of my friends overheard him boasting that there was no way that they could get rid of him, as no-one understood what he had written.

I was asked about his code - I said no-one but him could maintain it, but give me three months and I could rewrite it.

I don't like to revel in other's misfortunes but the look on his face as he left the office on the day of the redundancies was a picture.

In the end I kept half of his code (ring-fenced so it was effectively a black box that no-one touched) and rewrote the rest in about a month.

Personally, I work hard, make it visible (as Luyt says), own up to my failures and every now and then go beyond the call of duty. And I try very hard to be nice to people, even when they're acting like idiots.


The only place where the names made any immediate sense was 'main' and any C stdlib calls.

Any job worth doing is worth doing right. Contrariwise, any job worth fucking up is worth fucking up stupendously. He didn't go all the way.

  #define monkeymeat printf
  #define turtlescrotum malloc
  #define wolfnipplechips gets
  #define chipotlaway exit
  ...


    #define HEREWEGO {
    #define ENOUGHNOW }


Well we are diverging from the theme, but the proper form of that is

  #define O_HAI {
  #define KTHXBYE }


You can even go a little further and use trigraphs, which is one of those ancient features that most people have forgotten because it doesn't make much sense anymore. ??< and ??> are equivalent to { and } in C, which could be handy for further obfuscation.

http://en.wikipedia.org/wiki/Digraphs_and_trigraphs
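
A hedged example of what that enables (note that compilers typically need trigraphs explicitly enabled these days - gcc wants -trigraphs or a strict -std mode - and newer language standards have dropped them entirely):

  /* With trigraph translation enabled, ??< and ??> become { and }
     before the compiler proper ever sees the source. */
  #include <stdio.h>

  int main(void)
  ??<
      printf("hello from inside trigraph braces\n");
      return 0;
  ??>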


Not everything worth doing is worth doing well.

— Tom West, quoted by Tracy Kidder in The Soul of a New Machine (Modern Library, 1997). ISBN 0-679-60261-5

Note that Data General is defunct, so I take this quote as a warning.

References:

* http://en.wikipedia.org/wiki/Tom_West

* http://en.wikipedia.org/wiki/The_Soul_of_a_New_Machine


  #define wolfnipplechips gets
If you wanted to fuck it up proper, you should leave out this line. You don't want to dampen the horror people will have when they see you actually using gets.
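
For the uninitiated, the horror is that gets() has no length argument at all, so any sufficiently long input line scribbles past the buffer; a minimal sketch of the problem and the boring fix (C11 actually removed gets from the standard library):

  #include <stdio.h>

  int main(void) {
      char buf[16];
      /* gets(buf);  -- no way to tell it the buffer is only 16 bytes,
                        so a long line overflows it: the classic exploit */
      if (fgets(buf, sizeof buf, stdin) != NULL)   /* bounded read */
          printf("%s", buf);
      return 0;
  }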


I've seen

  #define true false

Think they do it just for fun's sake. Evil.
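
In case the effect isn't obvious: it's a plain textual substitution, so every later use of true quietly means false. A sketch (assuming the pre-C23 <stdbool.h> macros):

  #include <stdio.h>
  #include <stdbool.h>

  #undef  true          /* the prank needs this if <stdbool.h> came first */
  #define true false

  int main(void) {
      if (true)
          printf("this never prints\n");
      else
          printf("surprise\n");
      return 0;
  }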


Back in the day, I was an RPG programmer (on an AS/400), and I had a co-worker who insisted upon using geographical labels for all of his GOTO targets: GOTO BARCELONA, GOTO TOKYO, etc.

Needless to say, maintaining his code took some getting used to.


That sounds kind of funny. Where do you go on error?

One time I had to debug this incredibly obfuscated Word Basic subroutine. It was such a mess of gotos (deliberately so) that the company that wrote it for us thought they had us over a barrel and could charge us like a wounded bull.

That is, until I showed my cow-orkers the superior technology of "paper, scissors and sticky tape".


Where do you go on error

Hell, Arizona or any other village with that name. Real C# code can be like this:

    if(John is evil){ goto Hell};
(http://msdn.microsoft.com/en-us/library/scekt9xw%28v=vs.71%2...)


GOTO HELL showed up with predictable frequency. I don't recall if he used it for errors, though.


Which Hell would that be? The one in California, Michigan, the Cayman Islands or Norway?


ON ERROR GOTO REDMOND


Redmond wasn't even on our mental map in those days; Windows was still at 3.11, and NT hadn't been released. The only thing PCs were good for was to hold our 5250-emulation cards...


The thing about languages that require GOTOs is that there is often no inherent meaning in the label, unlike the control structures we take for granted these days. And back in the day you were often limited by token size (in one assembler I used it was 5 characters), so people sometimes used schemes like "A0001", "A0002", etc., or sometimes "DOG", "PIG", "CAT", etc., though the former was preferable as it wasn't distracting.


A project I worked at during an internship had an unmanaged C++ portion for rendering complex typography, which was written when every letter in a variable cost $10 (or so it seemed). The result was that "pointer to a metadata character" became the variable "pmc", and that was just the beginning of it. I never quite did understand what the variable lpmnoics meant, and debugging the C++ was quite the drag and the task everybody avoided and pushed off to the original developer of that code (who happened to be in a different geographic location).


That sort of thing is a lot easier to understand if you have the secret decoder key: http://www.joelonsoftware.com/articles/Wrong.html

But it's probably a defensible idea, in context. The idea is to supplement the type system of a low-level language with a manually-checked type system that helps you find semantic errors.
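
The canonical example from that article, redone as a hedged C sketch (the names and helpers here are mine, not the parent's codebase): prefix by kind rather than machine type, so a line that mixes kinds reads as wrong even though the compiler can't tell the difference:

  /* us = unsafe (raw user input), s = safe (HTML-encoded).
     Hypothetical helpers, declared only to make the sketch hang together. */
  char *read_request_field(const char *name);   /* returns raw, unsafe text */
  char *html_encode(const char *usText);        /* unsafe in, safe out      */
  void  write_response(const char *sText);      /* must only get safe text  */

  void handle_comment(void) {
      char *usComment = read_request_field("comment");
      char *sComment  = html_encode(usComment);
      write_response(sComment);       /* s goes where s is expected: fine  */
      /* write_response(usComment);      would jump out in review: XSS bug */
  }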


You're right, eventually I did get the hang of the chunk of code and wasn't afraid of it. Lots of others in the team were however. Also, because of eXtreme Programming and pairing, I knew the C# and what it was doing in 2 weeks and was writing code, having never seen C# before. I had some C++ experience, but by the end of my 6 month internship wasn't comfortable with the "views code". Yes, it was high performance code that did black magic, but making it readable is halfway to being able to understand it. It worked and it was fast enough and did good stuff (ever try doing a vertical unicode editing view?).


I went back to the code repository (it happens to be open source now) and found a few (real) examples:

- prgpttp: paragraph pointer to "TextProps"

- prgpvwvc: pointer to the root something something view controller


lpmoics

a long pointer to a meta object in cthulhu studies...

... see also Hungarian notation as implemented by Miskatonic University


It does not need to be that complicated :)

When a company I worked for was bought, the new company tried to lay off some (most) of the employees. At the time our team was working on a product that was to a large degree implemented in Scheme (because management didn't really care about the programming language). I quit the company (for unrelated reasons) soon thereafter, but it seems that my colleague (with whom I'd implemented the system) had quite good job security at the time.


I can't tell if the number of parentheticals in your post is a pun about using Scheme or not.


The legacy system that I'm working on now has about 20-30 distinct classes/data types that it needs to manage and persist to the database. The previous developer chose to implement all of these different data structures in one uber-table named 'object' and, in the application code, with one uber-class named 'Thing'. Relationships between 'Things', regardless of meaning or type, are simply dumped into a 'relationship' table. It wouldn't be so bad except that the meaning and nature of relationships are often inferred from the order or 'direction' of the relationship in the database.

So pretend that I have a 'Thing' representing a table, and another 'Thing' representing a column. If the relationship was from the table to the column then that might indicate that the column simply belongs to the table. However, if the relationship is from the column to the table, then that might indicate that the column is a primary key.

There are also other gems deep within the bowels of the system. Many functions named things like 'doItForRealThisTime' and 'reallyDoIt'.


One of my favorite accessor method names I've run across is 'maybeGetCube()'.

The idea was that if it was cached, then it would return the cube, otherwise null. Cube was actually relatively aptly named; it was a big matrix of data, BTW.

I guess really it wasn't THAT horrible, but the whole idea of 'maybe' doing something in code has always made me chuckle.
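
To be fair, the cache-or-null accessor itself is a perfectly ordinary pattern; a hedged C sketch of the idea (the names mirror the story, the code is mine):

  typedef struct Cube Cube;        /* the "big matrix of data" */

  static Cube *cached_cube = NULL;

  /* Return the cube if it's already cached, otherwise NULL and let the
     caller decide whether it's worth loading. That's the "maybe". */
  Cube *maybe_get_cube(void) {
      return cached_cube;
  }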


I guess you've never tried Haskell then. :)

http://hackage.haskell.org/packages/archive/base/4.1.0.0/doc...


There are a few methods in the .NET library for converting types called TryParse(). It's actually pretty handy, because it's an easy way to get some string data munged into the type you currently want. Not that I would use this on a method that returns a matrix...
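
The same try-pattern exists outside .NET too; a loose C analogue built on strtol (a sketch, not any library's API):

  #include <errno.h>
  #include <limits.h>
  #include <stdbool.h>
  #include <stdlib.h>

  /* Roughly what int.TryParse does: report success/failure instead of
     throwing, and write the value through an out-parameter on success. */
  bool try_parse_int(const char *s, int *out) {
      char *end;
      errno = 0;
      long v = strtol(s, &end, 10);
      if (errno != 0 || end == s || *end != '\0' || v < INT_MIN || v > INT_MAX)
          return false;
      *out = (int)v;
      return true;
  }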


That's the whole premise of Prolog (ok, if you squint at it from the right angle). Execute steps X Y and Z, but if Z fails go back and try Y again a different way, and if that fails back up further and try X again a different way...


One of my former classmates names everything he can get his hands on after various obscure anime references-- database names, variables, servers, you name it. Once in a blue moon, I'd recognize a function name as being a character or item from one of the few animes I'd seen and after a bit of tortured logic based on the background it almost seemed like a reasonable choice.

Needless to say, he does better working on his own.


I wonder if he read the Manga guide to databases:

http://www.amazon.com/Manga-Guide-Databases-Mana-Takahashi/d...


My god... I guess it shouldn't surprise me that an EduManga industry exists.


Call me Captain Obvious, but I feel like the time spent obfuscating code is often a major component of why programmers who do this sort of thing get fired in the first place.

What goes on in someone's mind that says: "Maybe if I spend some time making my source code hard to understand I can better preserve my job." ?

In reality all it does is slow them down, waste their time and lead to the firing of the developer. (At which point we get to hear funny stories from people like jacquesmattheij.)

If that time was spent improving/refactoring their code and working on new things, they'd actually have the job security they were so desperately seeking.


While this is true, there's a lot of space in between a model developer and a purposely-obfuscating twit. There's definitely a line that lazy developers can walk where they contribute just enough to make firing them more of a pain than putting up with them. I suppose the lesson for businesses is: be very careful about who you hire in the first place, because you may end up stuck with the person whether you like it or not.


And there are a lot of subtle little ways in which a developer can drift into this state. For example, not bothering to document how systems work, or opting to make band-aid patches to brittle systems rather than refactoring. After a while of doing the easy/lazy thing, you're eventually lording over a complex black box that only you understand.

Some may find that a comfortable place to be. You become an expert; you have opportunities to be the hero. And it doesn't necessarily even happen via malicious intent. Sometimes you just don't have time to document or refactor anything, and you find yourself in a suboptimal local minimum. (Speaking from experience.)


Yeah, it didn't save this guy from getting fired and it only made life harder on another guy he's never met. The whole "stick it to the man" mentality is childish, and it's shameful that this strategy is so common in software development.


I've watched enough "programmers at play" and read enough code to happily believe that this guy was just letting his sense of humor have its day. I'm not at all convinced that things like this are deliberate obfuscation. Of course, I'm not the one who had to disentangle it all.

We've all read bad code with bad names that nonetheless "works" (except perhaps for the occasional minor glitch). This is perhaps just one that combines poor programming style with a misplaced sense of humor.


True enough. In the 7th grade, I once divided some number by 100,000 by writing out the long division, and writing 0*3=0, 7-0=7, 0*7=0, etc. The teacher came by and was completely baffled about where I had gone wrong in my education, but I was just bored :)


I agree. Tomato and Pizza aren't any less silly than Foo, Baz and Bar, and certainly much less offensive than Fubar...

... It's not just programmers that do this either, also network admins name their computers R2D2 and C3PO, Shodan, Skynet etc.


Oh, yes......

At my previous employer, you could work out the function of a server and its platform from its name; it was embedded within the name. There were index numbers when we had more than one, which wasn't ideal, but...

At my current employer, it helps if you know your nature taxonomies in some detail. We may have more to keep track of and so more namespace clashing, but I can't help but feel it's a suboptimal solution.


Naming computers using a linguistic convention unrelated to their function is a good idea, because there's a good chance you'll switch what your hardware is doing. The worst outcome is when you have computers named things like "webserver1" and "webserver2", and then you get some new hardware so webserver1 is no longer your webserver, it's your test server.


I once took over some work that was previously the domain of our most highly paid ($150-200k a year) "architect"/contractor. Imagine my surprise when I actually examined his code and realized that it looked like it had been written by a student in an intro-level C101 class. At one point, I asked him what his approach to program design was...he replied "brute force and ignorance" and walked away laughing. We fired him shortly thereafter.


This is off topic, but I find it difficult to read the blog articles on jacquesmattheij.com. Zooming doesn't help because the text doesn't wrap correctly. Just a suggestion for the author to change the layout of the page.


The typography is off. The body text is written in Arial, which is not a great choice for body text (it's better for display). When in doubt, use Verdana (sans-serif) or Georgia (serif) for body text. Also, the font size is too small.

Here is an example of his blog with better typography. I used 18px Georgia as the body text instead of 12px Arial. I used Droid Sans (a nice display font) for the blog title.

http://i.imgur.com/jhjJA.png


I was under the impression that serif fonts were best left to headlines that are supposed to catch your attention, and that body text should always be a serif font - and that Arial is one of the better serif fonts if you use set the size to at least 14px and line-height to 150%.


You've got a lot of latitude when choosing a font for headlines. Both serif and sans serif work, and you can get away with fonts that have more "personality" than readability.

Body text, however, should be optimized for readability so your choices are more constrained. I don't think there is a huge difference in readability for serif vs. sans serif fonts for body copy. Ask a couple of designers and you'll get a couple of different opinions. Some will say that serif fonts are better because the serifs "lead the eye along". I'm not sure I buy that, but I do prefer serif fonts for body copy. But if you use a serif font, make sure it is big enough. At small sizes the serifs don't render well and get in the way of readability.

I've heard that Arial works better as a display font instead of a font for extended reading, which I tend to believe since it looks so much like Helvetica. If I choose a sans serif font for body copy, I choose Verdana which was specifically designed for the screen. But really, I don't think Arial is necessarily a bad font for body copy. I don't have any objective reason to say why you shouldn't use Arial.

Also, the header font and the body font should contrast, so if you have a serif header font it's usually good to have a sans serif body font and vice versa.


You are missing the word "sans" from that comment.

The general opinion used to be that serif was better for bodytext, and sans for title, and Arial is sans serif, but not one of the better ones.


Yeah, I meant sans serif for body text :)


I find your sample harder to read. Reading the current line I easily get distracted by the next, either because it's too close or too bold.

Thing is it's mostly subjective, so YMMV.


Are the current typography settings better?

http://jacquesmattheij.com/The+worst+program+I+ever+worked+o...


In Firefox, I enable the option View / Zoom / Zoom Text Only.


You should try readability.com


But the fact that readability is suggested, suggests that the site isn't readable. That's a hint to the author that it needs to be looked at, which is what the grandparent comment says.


The author publicly stated he no longer comes to HN, so he won't see any comment here.


I was thinking the exact same thing


This story reminded me of the poor sod I met who coded for a government department: http://blog.jgc.org/2007/08/why-you-dont-want-to-code-for.ht...


One semester I was TA'ing a certain course all electrical and computer engineering students are required to take.

One student named all his variables in phonetic Chinese (using the UTF-8 character set, of course).

Another student named all his variables after superheroes.


For the Chinese guy, I wonder if the code would have been way easier to maintain for the average Chinese developer. That would be an interesting analysis.


I once worked in a Chinese software company on the mainland and often maintained code written by Chinese developers.

I'm Irish and didn't appreciate their comments in Chinese; they didn't appreciate my comments in Irish.

Once that was understood we swapped quickly over to English for all comments / variables / declarations.


On that note, Python's style guide, PEP 8, contains this:

"Python coders from non-English speaking countries: please write your comments in English, unless you are 120% sure that the code will never be read by people who don't speak your language."

http://www.python.org/dev/peps/pep-0008/


I worked with a guy a couple years ago who used the planets and nearby stars as a version numbering scheme. For example, the first revision of a file might be "Jupiter" and the second revision might have been called "Mars." (Of course, he didn't use the planets in their natural order)

He also refused to use real version control or even any kind of networked backup, and insisted on using a Zip drive to back up all of the code he was working on. Needless to say, we had an intervention that involved hiding the Zip drive, and he quit in disgust shortly after.


I once worked on a game for the Wii using a 7-year-old engine originally developed for the PS2. The company had a high turnover rate and the code was atrocious. The engine had been used for several games and each title introduced a new hack in the engine. You had the typical spaghetti-with-meatballs mess with 2000+ line functions and hacked code with comments like: "This code is for the E3 demo of the game xxx". Once I had to extend a 1000+ line switch statement.

But the worst I saw was something like this:

  struct weapon {
  #ifdef xxx
  #include xxx
  #else
  ...
  #endif
  }


I remember that one of the sample robots that shipped with the game CROBOTS had all of its variables named variations on ll1l11l1, which is actually even worse in a VGA font than it looks here.


I came across a similar thing in the mid 1990s. The program had been written in the late 1980s, and the original author had since died, so he couldn't be consulted. In those days memory was scarce, so the program had been "compacted" by removing all spaces and with all variables represented by single characters. The original non-compacted source code had disappeared along with its originator. For any practical purpose, the whole program was unreadable.


The OP's post and all the examples in the comments are good candidates for The Daily WTF (http://thedailywtf.com).


Code from a 150-line-long method which is part of a 6000-line class. Obviously I can't post the actual code on here (but what it is doing isn't important in this context).

   private XXXXXXXXXXX XXXXXXXXXXX(final XXXXXXXXXXX[] XXXXXXXXXXX, final int XXXXXXXXXXX,
          final Long XXXXXXXXXXX, final Long XXXXXXXXXXX, final int XXXXXXXXXXX, final Long XXXXXXXXXXX,
          final XXXXXXXXXXX XXXXXXXXXXX, final int XXXXXXXXXXX, final int XXXXXXXXXXX, final Long XXXXXXXXXXX,
          final XXXXXXXXXXX XXXXXXXXXXX) {
   
         if (XXXXXXXXXXX || XXXXXXXXXXX) {
            XXXXXXXXXXX
            return null;
         }
   
         final XXXXXXXXXXX
         if (XXXXXXXXXXX) {
            XXXXXXXXXXX
            return null;
         }
   
         XXXXXXXXXXX

      if (XXXXXXXXXXX) {

         XXXXXXXXXXX
         XXXXXXXXXXX

         final XXXXXXXXXXX
         final XXXXXXXXXXX
         if (XXXXXXXXXXX {
            for (XXXXXXXXXXX) {
               final XXXXXXXXXXX
               final XXXXXXXXXXX
               if (XXXXXXXXXXX) {
                  if (XXXXXXXXXXX) {
                     if (XXXXXXXXXXX || XXXXXXXXXXX) {
                        if (XXXXXXXXXXX || XXXXXXXXXXX)
                            || XXXXXXXXXXX)) {
                           final XXXXXXXXXXX
                           if (XXXXXXXXXXX) {
                              if (XXXXXXXXXXX) {
                                 final XXXXXXXXXXX
                                 if (XXXXXXXXXXX) {
                                    int XXXXXXXXXXX
                                    if (XXXXXXXXXXX) {
                                       if (XXXXXXXXXXX) {
                                          XXXXXXXXXXX
                                       }
                                       if (XXXXXXXXXXX) {
                                          XXXXXXXXXXX
                                          XXXXXXXXXXX
                                          XXXXXXXXXXX
                                          XXXXXXXXXXX
                                       }
                                    }
                                 }
                              }
                           }
                        }
                     }
                  }
               }
            }
         }

      } else {

         XXXXXXXXXXX
         XXXXXXXXXXX
         XXXXXXXXXXX
         XXXXXXXXXXX
         XXXXXXXXXXX

         final XXXXXXXXXXX
         final XXXXXXXXXXX
         
         if (XXXXXXXXXXX) {
            for (XXXXXXXXXXX) {
               final XXXXXXXXXXX
               final XXXXXXXXXXX
               if (XXXXXXXXXXX) {
                  if (XXXXXXXXXXX) {
                     if (XXXXXXXXXXX) {
                        if (XXXXXXXXXXX)
                            || XXXXXXXXXXX) {
                           final XXXXXXXXXXX
                           if (XXXXXXXXXXX) {
                              if (XXXXXXXXXXX) {
                                 final XXXXXXXXXXX
                                 if (XXXXXXXXXXX) {
                                    if (XXXXXXXXXXX) {
                                       final XXXXXXXXXXX
                                       if (XXXXXXXXXXX) {
                                          if (XXXXXXXXXXX) {
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                          }
                                       }
                                       final XXXXXXXXXXX
                                       if (XXXXXXXXXXX) {
                                          if (XXXXXXXXXXX) {
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                          }
                                       }
                                    } else {
                                       final XXXXXXXXXXX
                                       if (XXXXXXXXXXX) {
                                          if (XXXXXXXXXXX) {
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                          }
                                       }
                                       final XXXXXXXXXXX
                                       if (XXXXXXXXXXX) {
                                          if (XXXXXXXXXXX) {
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                             XXXXXXXXXXX
                                          }
                                       }
                                    }
                                 }
                              }
                           }
                        }
                     }
                  }
               }
            }
         }

      }

      if (XXXXXXXXXXX) {
         XXXXXXXXXXX
      }

      return XXXXXXXXXXX

   }


For a second I thought you meant that all variable names were a varying number of X's, and I almost crapped myself.


I thought the same - if it weren't for your comment, I would still be laughing. Brilliant!


That code almost looks like it was generated by a wizard (not the pointy hat/white beard kind) or some other nefarious RAD tool.


It wasn't!


I had an assignment for a security class where I did just this. We each had to write code to print the contents of a file with 600 permissions upon entry of the correct password. The source code, password file, and any logs were all readable by everyone; the next project was to obtain the contents of other students' secret files (using John the Ripper, exploiting race conditions, etc.). So I replaced all my identifiers with the most distracting words I could think of - superman, flashbang, lassie, sparkles, etc. If I recall correctly, only one person in the class even tried looking for vulnerabilities in my code.


The worst program I ever worked on was spaghetti code with HTML, JavaScript, PHP and Perl all entangled in one file. This was in 2003.

In 2010, I encountered something very similar at a well known start up. Some things never change...


The article and the comments here remind me of a great article on how to write unmaintainable code:

http://freeworld.thc.org/root/phun/unmaintain.html


foodfuscator.com is available if anyone is interested. :)


Oh, you wanna try doing code golf - after a little while it warps your thinking and this sort of thing starts looking reasonable :)

  using C=System.Console;class S{static int p,v,x,y,t=10;static void D(char c='@'){C.SetCursorPosition(p%t,p/t);C.Write(c);}static void Main(){var s="";for(;v<100;v++)D((s+="#### # ### ## # "[((y=v/t)<5?y:5-y+4)*5+((x=v%t)<5?x:5-x+4)])[p=v]);p=22;for(D();;D()){v=p+new[]{-1,-t,1,t}[(int)C.ReadKey(1>0).Key-37];D(' ');p=s[v]==' '?v:p;}}}



Maybe with proper code review these kinds of ridiculous problems would get caught early and stomped on rather than turning into actual problems.


Honest question - how does code review work among "peers" with no seniority? Hypothetically speaking *snicker* say one peer likes to churn out reams of nonsense code and the other peer is looking forward to a sense of maintainability.

Yes, I already know your first answer: "Switch jobs".


If he's simply green or ignorant then experience is the cure; a few good code reviews (yours and his) should transfer enough experience to make work less painful.

If he's simply lazy or doesn't care... Well, all you can do is go through the pockets and look for loose change. No amount of "process" can make a bad employee into a good one.


Thank you - I appreciate the answer. I'm not sure I will be able to apply it however as my scenario is not quite as clear cut (although it's a little more of #2 than #1).


I'll never forget when I saw code in a production Java app with lots of JSP, SQL, JDBC, HTML and JavaScript in the same file. One of the lines was something like:

<a href=# onclick="document.getElementById('field').innerHTML+=parseInt(this.parentNode.parentNode.parentNode.parentNode.parentNode.parenNode.childNodes[7].childNodes[13].split('|')[3])">...</a>


Reminds me of my first programming job. One of my colleagues used variables named after baseball players.


I'll summarize it by saying that you can have a lot of debugging 'fun' when you have to maintain code that starts including all kinds of other files in nested control structures which, in turn, reside in things like while-loops...


whatever goal this guy had with his trick it failed in a terrible way

Not really. The company had to pull in an expensive consultant, causing it financial damage.


actual variable name I have seen in production code I was trying to maintain:

TheThing



