Programming has taught me nothing but bad habits. A tendency to read HN instead of doing proper work. Constantly forgetting to go to bed on time, staying up all night instead, kept awake by the all-too-bright, hypnotising monitor. A habit of contacting people by SMS, e-mail, or Facebook rather than in real life. The list could go on forever.
One thing that I've noticed over the years is that my brain has been rewired to find problems in things.
Whenever someone shows or tells me something, I instantly start searching for holes in it. I don't consciously think about it; it just happens. My wife complains a lot about it.
I describe myself exactly the same way. For me, I think it's two separate components:
- For an existing system (of any kind - mechanical, legal, social...), I automatically look for things that can be improved.
- When discussing a new idea, I'm always mentally jumping ahead to the implementation phase, where potential problems quickly jump out at me.
I've considered it to be just part of my personality rather than specific to programming, but since I first learned to program at age 7 it's difficult for me to distinguish the two.
I actually see that as the most negative real-life habit I've gotten from programming. At first I valued it highly, but I think it's important to keep in mind that overdoing it pushes you in the opposite direction of 'living in the moment' (or in less encumbered terms: enjoying what you already have).
Being able to see all sides of an issue seems to be a mixed blessing. I watched a biopic of Bobby Fischer in which they explained how strategic thinking changed his personality over time. They said that constantly putting himself in his opponent's shoes caused him to become very paranoid in real life.
I can totally relate to this and I have to constantly remind myself to just agree with people instead of challenging them when there is really nothing to gain. It's not always easy to do :)
I don't program, and I do all of those things! So it might not have been programming-induced. Or perhaps there are multiple paths to this tragic life we share.
Also, being over-eager about automation. "Why can't the computer do this boring thing for me?" And down the rabbit hole of GUI automation you go... (Of course, it can be a good thing as well. XKCD has a few good strips on this.)
> "Everything is vague to a degree you do not realize till you have tried to make it precise, and everything precise is so remote from everything that we normally think, that you cannot for a moment suppose that is what we really mean when we say what we think." — Bertrand Russell
Biggest thing for me is to stop prematurely over-engineering things. Need something done? Just get it done, however possible. If, in future, you need to do it many more times, you can gradually optimize the process as you go. As opposed to letting the perfect kill the good in the first place.
Practicing something repeatedly, preferably every day, is the best way to get better at something. I started programming every day after being inspired by a blog post by John Resig that got posted to HN [1]. I started a game that I've been working on for 245 days straight. I started with almost no knowledge of game programming and now have, I dare say, a playable game. Github really helped with this because you get a nice display of your commit history over the past year that gives an incentive to not break the chain [2].
I've been able to apply this to other skills I've wanted to get better at: relearning how to play the piano, making music, working out, etc. and have seen great results.
This. I'm still a young gun with only a few years under my belt but the combination of writing code and working with clients who need code written has made me realize the importance of communication. I've been able to take the precise communication that I'm forced to use in my work (be it with a system or with a client) and use it in my hobbies and organizations I'm a part of.
I've been programming since I was 8, and I'm now fast approaching 40. This makes it a bit hard for me to isolate the specific influence of programming on my life habits. That said, I do have a couple thoughts on how programming might have been helpful.
For context, the systems I work in are typically nowhere near perfection. This leaves me with a continual list of things to be done that ranges around 5-10 person years of effort to complete. Because of this fact, I have had to learn to be tolerant of the imperfections, organized enough to keep a long-term plan in focus, and tactical enough to fix things in a sensible order. It's also been useful to develop an understanding of when costs are sunk and when ideas aren't worth further development.
I'd say that all of this has been useful in my day-to-day life.
Something that's come to me more slowly is the human aspect of software development. My earliest programs were all one person affairs: I'd be the only author, reader, and user of the software I wrote. The older I've gotten, the more interesting it is to write software that other people can contribute to, understand and use to better their life. Learning how to build systems that do that is teaching me a lot about design, empathy, and how to think outside my own head. This is still a work in progress, but it's been very rewarding.
For any problem I face in real life, I start breaking it down into more manageable pieces. E.g., my wife will say the bank is overcharging us on fees. I'll break that down: which fees are we talking about, how are we using the accounts in a way that triggers them, etc. It drives my wife nuts.
One bad habit that could be attributed to programming is that I don't remember much anymore. I just jump onto SO or Wikipedia etc. to get the information, and I also keep a bunch of notes in my pocket for "all the things I need to remember".
Not to sit around speculating about how something might work or which approach might be faster, when a test case can be built that quickly proves it.
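For instance, a throwaway benchmark settles a "which is faster" debate in minutes. Here's a hypothetical Python sketch using the standard timeit module (the string-building comparison is just an invented example):

```python
import timeit

# Hypothetical example: instead of arguing about which string-building
# approach is faster, just measure both.
def concat_loop(n=1000):
    s = ""
    for _ in range(n):
        s += "x"
    return s

def concat_join(n=1000):
    return "".join("x" for _ in range(n))

loop_time = timeit.timeit(concat_loop, number=200)
join_time = timeit.timeit(concat_join, number=200)
print(f"loop: {loop_time:.4f}s  join: {join_time:.4f}s")
```

Ten lines of throwaway code, and the argument is over.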
I no longer just believe what I'm presented with - I have to debug, dig deeper, look beneath the covers and find out what is causing the phenomenon. My wife often complains, but once I get that debugger fired up and start .. slowly .. stepping things through, it all comes out in the end.
Programming taught me to stop wasting time theorizing and arguing with anecdotal evidence when I should be implementing and testing my hypothesis. It taught me that sometimes extensive planning is not as important as being able to react and respond to problems as they come.
When others say something will take longer than I would expect, I trust them more. They're the ones who are doing it; they know the details that I don't. And it would take too long to make them explain all the details to me. And as they work on it and realize it will take even longer, I have sympathy, because boy, have I been there myself.
I question the top answer as being a good habit at all. It's like saying "there's probably some magic going on that I'll never understand". There's a reason why you're wrong. You should probably find out and absorb the reason instead of pretending like the world operates in some mysterious way.
I don't think that's what he's saying at all. It's closer to following a scientific approach; what you "know" is simply a more or less wrong model of the reality. You may study something "magical" until you understand it completely, but you still must be prepared to revise your knowledge if suddenly you learn something new that goes against it.
Programming systems can be treated scientifically, but they can also be treated deductively (like math). It's possible to have full knowledge of what's going on if you treat it like a deductive system. If your primary way of learning is through experiment, then yes, you will be wrong a lot of the time. But then I would never claim you had a good reason for believing anything.
I can't view the stackoverflow link right now, but the top answer basically claimed "I feel strongly about my reasoning, but nevertheless, I could be wrong". How could that be unless you made a mistake? The system is logical!
If you really want to understand how things work from the very bottom, you can build a toy computer running on FPGA, there are some very cheap boards available now.
To make things even more interesting, you can write your own HDL compiler (into a netlist, of course; anything lower-level is, unfortunately, too proprietary).
From this point you can build up your abstractions. The shortest route to the high level you're already comfortable with is to implement Forth first, since it's very easy to build a simple stack-oriented machine on an FPGA (see http://www.excamera.com/sphinx/fpga-j1.html). Running your own Forth on top of such a CPU is easier than on something like x86.
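To give a feel for why a stack machine is such an easy target, here is a toy software model of a J1-style stack machine in Python. This is only a sketch: the opcode set is invented for illustration and is not the J1's actual instruction encoding.

```python
# Toy model of a stack-oriented machine in the spirit of the J1.
# A software sketch with an invented opcode set, not the real FPGA design.
def run(program):
    stack = []
    for op in program:
        if isinstance(op, int):      # literal: push onto the data stack
            stack.append(op)
        elif op == "+":              # pop two, push the sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "dup":            # duplicate top of stack
            stack.append(stack[-1])
        elif op == "drop":           # discard top of stack
            stack.pop()
        elif op == "swap":           # exchange top two entries
            stack[-1], stack[-2] = stack[-2], stack[-1]
    return stack

# Forth-style: 2 3 + dup +  computes (2 + 3) * 2
print(run([2, 3, "+", "dup", "+"]))  # [10]
```

Notice there is no register allocation and no addressing modes: each Forth word maps almost one-to-one onto an opcode, which is what makes the hardware so small.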
The next step is to implement a Scheme compiler and runtime in Forth. Garbage collection can be somewhat tricky, but if you go for something trivial, like stop-and-copy, it won't take long to implement, especially if you have some assistance from your hardware.
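To show how little stop-and-copy actually involves, here is a sketch of a Cheney-style collector over a toy cons-cell heap, in Python. The heap representation (cells as two-element lists, pointers as tagged tuples) is invented for illustration only.

```python
# Sketch of a stop-and-copy (Cheney-style) collector over a toy cons heap.
# A cell is [car, cdr]; a pointer is the tuple ("ptr", index); anything
# else is an immediate value. All representation choices are invented.

def collect(from_space, roots):
    to_space = []
    forward = {}                          # from-space index -> to-space index

    def copy(value):
        if not (isinstance(value, tuple) and value[0] == "ptr"):
            return value                  # immediates need no copying
        idx = value[1]
        if idx not in forward:            # first visit: evacuate the cell
            forward[idx] = len(to_space)
            to_space.append(list(from_space[idx]))
        return ("ptr", forward[idx])

    new_roots = [copy(r) for r in roots]
    scan = 0                              # Cheney scan pointer
    while scan < len(to_space):           # fix up fields of evacuated cells
        cell = to_space[scan]
        cell[0] = copy(cell[0])
        cell[1] = copy(cell[1])
        scan += 1
    return to_space, new_roots

# Cell 2 is unreachable garbage; only cells 0 and 1 survive the collection.
heap = [[1, ("ptr", 1)], [2, None], [99, None]]
new_heap, new_roots = collect(heap, [("ptr", 0)])
print(len(new_heap), new_roots)
```

The whole collector is one copy routine plus one scan loop, which is why it fits comfortably in a few screens of Forth.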
Three totally manageable steps from the very bottom of the abstraction ladder - and you've got Scheme, an extensible, powerful meta-language, from which you can grow anywhere.
I recently wrote this bit on bootstrapping a Lisp-like language using the most trivial interpreter possible: https://combinatorylogic.wordpress.com/2015/01/14/bootstrapp... - you can find this approach useful, I did a similar thing on top of Forth and it was very easy, much easier than trying to implement a "proper" interpreter.
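In the same spirit, a "most trivial interpreter possible" for a Lisp-like language fits in a couple of dozen lines. Here is a Python sketch; the expression representation (nested lists, strings as symbols) is my own choice for illustration, not the one from the linked post.

```python
# A deliberately minimal Lisp-style evaluator: quote, if, lambda, and
# application. Expressions are nested Python lists; symbols are strings.

def evaluate(expr, env):
    if isinstance(expr, str):                 # symbol: look it up
        return env[expr]
    if not isinstance(expr, list):            # literal: self-evaluating
        return expr
    head = expr[0]
    if head == "quote":
        return expr[1]
    if head == "if":
        _, test, then, alt = expr
        return evaluate(then if evaluate(test, env) else alt, env)
    if head == "lambda":                      # ["lambda", [params], body]
        _, params, body = expr
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    fn = evaluate(head, env)                  # application
    return fn(*(evaluate(a, env) for a in expr[1:]))

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
square = evaluate(["lambda", ["x"], ["*", "x", "x"]], env)
print(square(7))                              # 49
```

Everything else (define, macros, a reader) can then be grown on top of this core, which is the whole point of the bootstrapping approach.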
And once you know how to build the whole thing from the bare silicon, there won't be any place for "magic" in your understanding.
> If you really want to understand how things work from the very bottom, you can build a toy computer running on FPGA, there are some very cheap boards available now.
You don't even have to go that far: you can build something resembling the cheap early-80s/late-70s computers from off-the-shelf chips and 8-bit CPUs that are still sold (Z80, i8051, 6502).
It'll just be a bit hard then to go all the way to the high level - i.e., memory-hungry, garbage-collected, with elaborate, computationally intensive compilation.
For this you'd need a somewhat more modern device with enough memory attached. Something around 16MB would be nice if you want to implement a decent Lisp. With a Z80 you'd likely be forced to stay at around the C or BASIC level of abstraction.
Pretty sure a 10MHz Z80 with 64KB of RAM can handle a small Forth or Scheme dialect just fine (you'll have to program those interpreters in asm or C, though!) :)
It won't be useful/"pragmatic" but I don't think that's what we're after here anyway.
Yes, but this way you'd need some magic - an external C compiler or at least an assembler - instead of implementing everything from scratch, with most of the code developed on the board itself.
I was about to point out the "bug" in your "code", but then I reread the sentence. And now I need to use this as a joke with my wife. I'll probably get punched jokingly for this one, as with all my bad jokes :P
Yay, another opportunity to rehash the tired discussion of SE's policies. I'm sure it won't be filled with the same arguments as all the other threads /s