How to become a programmer, or the art of Googling well (okepi.wordpress.com)
201 points by okepi on Oct 24, 2014 | 152 comments



This is very true, but it's a bit more complicated than that.

"I'd come to understand that the answer was out there, somewhere." The answer is out there, but so is the wrong answer. Recognizing the right answer is the hard part.

I also must point out that the answer is out there if you know English. Knowing English is a strong privilege. Success rate plummets if you Google programming questions in a language other than English.


As someone whose native language is not English (and didn't receive particularly good English classes at school, but had access to the Internet): if you start at a young enough age, you pick it up somewhat naturally. Seriously.

In my case, wanting to "do X thing in VB6" drove me to read heaps of text in English, and even if I couldn't make sense of everything I could kind of guess what they were saying from the context and some few words I recognised. Do this for a few years and you'll have decent comprehension skills in that language. I guess that's a pretty natural way to learn, rather than being drilled some abstract grammar rules.


Your English is excellent. It's sad that a single non-idiomatically phrased sentence can "blow" your cover though, e.g. "some few words", that gave it away instantly.


Hey, thanks. Yeah, being able to recognise non-native speakers (based on accents and non-idiomatic sentences) is apparently a significant evolutionary advantage. I remember reading most people can identify a person speaking in a foreign accent within 80ms. That's pretty impressive!

I'm currently trying to actively get rid of my accent. I've been asked a few times whether I'm actually British, and pretty much nobody can actually guess where I'm from.


I always hated it when I was learning German when people would point out that I'd said something silly but not tell me the "right" way to do it.

In that spirit, as a native speaker: "...from the context and the few words I recognised." would work just fine.

That's English for you. When the peasants are revolting, it's very different from when the food is revolting...


Yep, thanks. Getting a basic handle on English is not too hard, because of the lack of grammatical gender and a few other things, but masquerading as a native speaker is exponentially harder :)


I think "a few words" is probably just slightly nearer the intended meaning than "the few words".


How long have you been living in Britain, for people to think you're British from your accent?

I started pretty much the same way, trying to make sense of C++ and VB manuals when I was young and I think I have pretty good reading and writing skills now, although a native speaker can notice English is not my primary language.

Spoken English is still an obstacle for me. I moved to London a year ago and my pronunciation is pretty good, but I still find it much harder to convey my thoughts compared to writing them. And I'm trying hard to lose my Italian accent; luckily it's not as pronounced as the stereotypical one.

Though I have to admit I've never been a good speaker, not even in my native language.


About a year now. My spoken English was pretty rough back then, but I started copying the sounds I heard when other people spoke, and it seems to have helped.

Yeah, it's a lot harder to convey thoughts by speech in a foreign language, and I think it's because you have to make a conscious effort to find the right words or expressions. It's like getting data from RAM (foreign language) or from the L1 cache (native). It gets better though, with practice.


I would guess you aren't thinking in the language in question. You are translating your native language sentence into the appropriate english sentence on the fly. To use a programming analogy, it is like writing Haskell code as if it was C. Because you aren't thinking functionally, for every imperative step the C code in your head takes you have to translate it to some sort of functional analog.

You can get through it, but it is slower, and looks "funny" to a "native".


I am curious. What exactly is wrong with "some few words"? In German it would make total sense.


It's just not common in an idiomatic way, really. Technically, it is 'incorrect' because 'few' is the object of the sentence, not 'words'.

But there are plenty of examples of phrases in English which are acceptable but which break grammar rules.

In short, you would say "a few words".


In Spanish you would also say "unas pocas palabras" (perhaps jdiez is a Spanish name). The adjective "Google" should perhaps be in lowercase, and "remember" should perhaps be "recall". As I almost never speak English (I don't have the opportunity, or much liking for it), I find it very hard to believe I will ever grok English, but that is my ever-unsatisfied greater aspiration (aside from getting a bigger salary and other pleasures of a young man).


I'm not sure if it is "wrong," per se, but not idiomatic. Here (in the midwestern part of the US, at least) you would pick one or the other, so "some words" or "a few words."


You can't say "some few". Either is fine on its own, but not together.

It makes sense to me, but I'm not a native speaker either so I don't know if there's an actual rule behind this.


Sometimes you use "few" with a countable noun. "Some" is for uncountable nouns. "A few friends" means that you know perfectly well how many friends you have; if you have "some friends", the number of friends you have is subject to great uncertainty, from one to infinity. So "some few" is saying you have few but are uncertain or don't want to say how many, which is not very rational. Why this reasoning? Because it can help you avoid repeating the same error (or unidiomatic expression). The simple fact of analyzing any mistake is a way of becoming vaccinated against it.


It's probably ok. Just not extremely common.

http://www.oxforddictionaries.com/definition/english/some-fe...


As a note, while, as others have mentioned, it's non-idiomatic, and dropped in casual speech would come across as a trifle awkward, it -can- be made tenable in sufficiently florid prose that is clearly intended to be delivered verbally.

That is, while "some few" is ungainly, pausing and delivering it as though 'few' is reinforcing or clarifying 'some' (through a change in tone or similar) would sound a little archaic, perhaps, but completely natural.


Well, before that you have to be able to read the Latin alphabet. I had to.

You may be fortunate enough to have a native language that happens to share the writing system with English, but a lot of people don't. Actually, I think the majority of people don't.


Learning another alphabet usually only takes a week or two with flashcards though. I wouldn't imagine that will hold anyone back. Though encouragement would help.


Learning an alphabet might not be particularly difficult in itself, but being able to consciously identify symbols is not enough for your brain to form higher-level abstractions. It's not unattainable by any means, of course, it's just a significant obstacle. (IMO)


That's very true, and I didn't think about it. I'm very fortunate in that regard.


"if you start at a young enough age"

My understanding is that less than 12 is a good definition of young enough. (Although, ironically in the context of this discussion, my google-fu is too weak this morning to find the reference that convinced me of this. :)

In any case, a distant 2 years of high-school German coupled with 6 months of effort last year (pretty nearly an hour a day) was not enough to give me the skill to even read a children's book in German. For some of us, learning a language is hard.


"learning a language is hard"

I don't agree here. First off, your two years of high school German, like my two years of high school Spanish, were likely a waste of time. I had the highest grade in class and couldn't speak a word of Spanish, although I could write simple sentences in present tense quite successfully.

I moved to a French speaking country when I was 23. Initially, like most Americans, I thought that learning languages was hard so why bother. After one year abroad I basically spoke 0 words of French. Once I decided to stay long-term I took a French class, it was once a week for two hours for about 7 months. I was teaching physics in French a few months later. Granted I wasn't fluent, but it was good enough and my students were incredibly patient. By the time my own French course started up again it was unnecessary, I was learning more than enough during the day so I quit.

The key part was that the French class was only in French from day one. You could not ask a question in any language other than French because the teacher couldn't understand you (in reality she simply feigned incomprehension). Amazingly, a good teacher can make you understand a language you don't speak by speaking to you in that language. For the life of me I don't understand why high schools in the US don't adopt this system.

Babies are great at learning languages and you have a massive advantage over them, you have the power of reason. You also have the massive disadvantage of people refusing to speak to you like a 2-year-old.

My advice, if you really want to learn German, is to spend the next 6 months speaking to someone in German for an hour every day (there are some websites that let you do this remotely). Same time investment as before, but I'm guessing you didn't spend so much time last year actually speaking with someone. Babies don't learn to speak through books, they listen for a year and then start saying extremely simple things. Learning to read in German will do wonders for your vocabulary, but it won't allow you to have a conversation. As someone who is quite anti-social, I find this unfortunate, but the wiring required to speak another language gets put in much faster by speaking said language.

Also, don't bother learning how to speak in the present tense; learn how to speak in the past and future. You want to be able to say what you did yesterday and what you are doing tomorrow. People ask you how your weekend was and what you are doing for vacation; they know what you are doing right now because you're talking to them. Surprisingly, I can't think of a language where it is harder to speak in the past or the future than in the present, since the verb conjugations become largely trivial.

I want to emphasize that I was not an especially gifted student in my French class. We were all total beginners and by the end of the course most people were functionally speaking French.


No doubt immersion is the way to go.

But it still seems to me that countries where students learn foreign languages successfully at a high rate start those students before they're 12. My two years of high school were a waste, as you said yours were. But Germans (and Dutch, and Norwegians, etc.) have much better success in their schools, and they all start before 14, the common starting age here in the US (though that's starting to change).


I'm not completely sure that 'youth' is a required attribute here. From my experience, this can happen later in life as well, though you may be limited to being good at only what you can practice easily.

For example, I'm much better at reading French than listening to it. The opposite is true of some other languages.


As a comparison, my German (as a foreign language) teacher used to tell us in the first year we learned it, "beginners shouldn't use a bilingual dictionary".

In order to ensure the word you found in the dictionary is correct, you need to be good enough in the language. Often there will be homonyms in your native language, and you'll get the wrong word. For example, you want to translate the bank of a river but end up with the place that keeps your money.

It's the same with answers you get from Google or Stack Overflow about programming. If you're not good enough to ensure they're correct, you'll be in trouble.


> In order to ensure the word you found in the dictionary is correct, you need to be good enough in the language. Often there will be homonyms in your native language, and you'll get the wrong word. For example, you want to translate the bank of a river but end up with the place that keeps your money.

That's why you use a good dictionary that supports phrases and even gives context annotations[1]. That makes the advice a bit obsolete.

[1] http://www.dict.cc/?s=bank+of+the+river


You still have to know enough to discern what phrase seems to be correct in order to know that's the phrase you want.


This is something I've always wondered - how do programmers who don't natively speak, read, or write English cope with the sheer amount of literature they have to parse to have a career programming?


In my country (the Netherlands) people speak English quite well. My entire (CS) master's was in English. So that means literature, classes, assignments, exams, everything. I must say it did feel kind of awkward sometimes to sit in a class where every single student and the teacher were native Dutch speakers, and still the course had to be done in English. But I think in the end it is worth it, because you learn all the terminology in English.

Actually it is funny, I only realized while reading this discussion that I always google my programming questions in English. It never occurred to me to google them in Dutch. I will try it today to see the results :)


Hehe, slight understatement. The Dutch learn English from age 8, is it? Being Dutch, I guess you can't really relate to any of the other foreigners in the thread.

The Dutch all speak English so well you can move there and never bother learning Dutch. And unlike the French, who will fake not understanding, they seem positively delighted to get a chance to use English. In fact, I've had other English people complain to me that it's often very hard to get them to speak Dutch to you once they know you're English. The only time I've ever heard anyone complain is one night out in Amsterdam where we met up with some Dutch people and one of them said "Oh no, are we speaking English tonight?"

Although I'm a bit surprised they'd have entire uni courses in a foreign language, but I guess it makes sense in programming precisely because it's such an advantage.


I spent 4 years and some in NL (Den Haag) and I never got past the numbers and greetings in Dutch... Being able to speak English everywhere and with almost everyone made me never try to learn it.


France here - they don't. Relying on translated material is the kiss of death (it's old and will discourage exposure to the bleeding edge, trains will be leaving the station without you for the rest of your career.) I've noticed good forum answers though, mostly sysadmin related questions for Linux systems.


I always google in English and this is a huge plus compared to my coworkers looking for answers in French. Even when they find answers, the quantity and the quality are lesser.

Fortunately, it doesn't take a lot of English to be able to read programming questions on Stack Overflow. If you have difficulty reading English and you want to do programming, focusing on just that will help a lot. You don't really need to be able to speak or to listen, which are harder (at least for me).


In Germany, most professional literature is translated into German. Most developers I know prefer the English version though.


Very easily. But that might be because in Denmark, it's rather the exception than the norm that people don't know English. Of course with varying degrees of proficiency.


I don't natively speak, read, or write English. Speaking for myself, you cope by studying English. Blood, toil, tears, and sweat are involved.


For what it's worth, your English is pretty good.


I don't natively speak, read, or write English.

Damn. I wish I could show your comment to people like John Searle.


Based on the limited sample, you write it better than many "native" speakers.


...uh...seems to have paid off well for you.


OT:

> "it's rather the exception than the norm"

> "Blood, toil, tears, and sweat are involved."

How does one learn to use idioms like a native?


Reading literature, watching films, listening to music.

Basically, delving into culture which contains common phrases, idioms and sayings scattered around. You hear/read them, understand the meaning from context, and commit them to memory so you can use them later on.

This is not really that different from how kids learn idioms, is it?

By the way, a lot of idioms/sayings are pretty similar in different countries. Their wording might be slightly off, but just by translating it in your mind you understand what it means.

Example: "Dark Horse" - One who was previously unknown and is now prominent. In Russian it is pretty much the same thing "Черная Лошадка" or if to translate it to English "Dark Horsey" ("horse" is used with affectionate diminutive suffix).

Source: Russian.


France here - starting as a student and stepping into the university library, it took me 30 seconds to realize that everything interesting was written in English, and that not being able to proficiently ingurgitate vast amounts of it was going to be a problem. I just shunned French as written input for the next 5 years. When the valley called for a job, I was ready. Pasteur said "Chance favors the prepared mind" - he was relating it to the field of observation, but it obviously applies almost anywhere.

Nowadays, if you have access to the Internet, there's no excuse not to be proficient with English. I've noticed during my travels that youngsters with a bit of education (high school or better) in countries where movies/series aren't translated (because the market is just too small) have a very good level of English. France has an official language which means that content is vastly translated and this-is-bad(TM) - thus we are slower at getting better at English.


> Nowadays, if you have access to the Internet, there's no excuse not to be proficient with English.

That is so true. A lot of my compatriots seem to not realize that they are missing an incredible amount of information by disregarding English. And I often find myself angry at them when they try to argue that English is not that important and they don't need to know/learn it. So frustrating.

> France has an official language which means that content is vastly translated and this-is-bad(TM) - thus we are slower at getting better at English.

Again, completely agree. I'm a huge proponent of subtitled media content and believe that that's what should be used when showing movies/tv shows on national networks (cable or sat can do whatever they want).


I wonder about "mind" vs "minds".

Should I say "Chance favors those with a prepared mind" or "Chance only favors people with prepared minds"? Perhaps "minds" is plural in Spanish and singular in English, and that is the reasoning? Can anyone confirm this reasoning, or am I completely confused?

Edited: Googling, I found "people with bigger heads are smarter", so there they use the plural. So why "mind" and not "minds"?

http://gnosticwarrior.com/head-size-matters.html


"Chance favors those with a prepared mind" is grammatically correct and sounds just fine, it's just not as pithy as the original. You could also say "Chance favors prepared minds".


With the change to digital TV in France the original soundtrack is often available too so people can get exposure to English.


Indeed - it makes a few broadcasts palatable. I don't think that option is widely used though and I'd say mostly by people who don't speak French.


Casting aside writing in French for 5 years, did your ability to write in French degrade over that time?


I cast aside written French input. I still had to produce output. And I did learn, under supervision, how to write well during those years, using French as the target language, but that knowledge has proven to be highly reusable. I'm lost writing correct French without a spell checker.


IRC, or whatever people use these days. I've also always googled tech-related questions in English; never thought much about it, it just seemed to make sense.

Movies and TV help in general, but live group communication with native speakers teaches you a lot about idioms and nuances in languages.

Mind you, it'll also teach you a lot of bad grammar; as a non-native speaker I would never have thought of e.g. 'should of' for 'should have'.


By repeatedly talking with natives. It'll infect you by sheer osmosis.


O'Reilly has released basically all of their books in Japanese.


Japan is a bit of a special case since the Japanese can function well without learning English. In most European countries and increasingly throughout the rest of the world, English is the lingua franca.


This. Seriously - I look around for the library or code snippet I need, I find x variations that all seem correct.

Recognizing the answer that will actually work and hold up under stress is a measure of the programmer.

A Go-cart & a Lamborghini are both "4-wheeled vehicles that will get you to the bottom of the hill"


> Recognizing the right answer is the hard part.

Very true, but even further than that: There are very few clearly right or clearly wrong answers. There are actually very few wrong answers - but there is a stupendous amount of right answers that are bad answers. Especially: bad answers for you.

Recognizing a wrong answer is the easy part. Taking a dozen or so "right" answers and selecting and condensing them into the best answer that makes the most sense for your programming context is the hardest part.

It is astonishingly easy to accrue a tremendous amount of technical debt because you just chained up solution after solution into a grotesque mix of "works, but could explode any minute".


Isn't that said using fancy words on ALL of the software licenses, that it "works, but could explode any minute"? I could swear that was the definition of software. ;)


> Success rate plummets if you Google programming questions in a language other than English.

This is very true. Not just for technical/programming questions but pretty much everything. Even simple Wikipedia entries. There is a huge quality gap between the English Wikipedia and - for example - the German Wikipedia.


I have to agree. I was interested in Linux before Ubuntu took off and became the mainstream. Now when I have a problem I sometimes end up on Ubuntu sites, and the answers there are far from the best.


I found one particular phrase, "technically privileged", both hilarious and utterly horrifying. Hilarious because I suppose I never paused to consider the folks in my major who were both curious and motivated enough to be involved in the field outside of class to be particularly "privileged". I suppose I should have given greater thought to how unfair it was that they read API documents and I didn't!

Like I said though, it wasn't just good times and passive-aggressive complaining about first-world problems while I read this. I can't even begin to fathom the miserable, pathetic, and generally small, unexamined existence that would lead one to believe that (s)he is somehow the victim of deep injustice at the hands of those people with their prejudiced technical abilities and natural curiosity! How dare they not level the playing field just because she never bothered to explore the use of code outside the classroom; clearly everyone missed that she's the subjugated one, with a comfortable liberal arts education and regular internet access.

But still, we should all take a moment to recognize the plight of the comfortable, generally satisfactory lives of those among us struggling silently with the burden of "technical un-privilege."


What an uncharitable response to someone who had a legitimate revelation and used it as a catalyst for personal growth! Software development is intimidating to outsiders - of course it's a privilege to be on the inside. (Or at least it appears that way.) This isn't a complaint about an unlevel playing field - it's an awakening on how to get better at doing something you love.


What does it even mean to be on the inside? Nobody in my family had any experience with computers. My first programming book was a gift I asked for when I was ten (it was a learn-C++-in-24-hours book). You can imagine how that turned out with nobody there to help me. I didn't understand what variables were, let alone how to add the compiler to the path on Windows. The only help I had was that my dad bought me the book and I had a family computer in the living room.

Am I on the inside?


Yes; you started at 10, and had a computer growing up to practice on.


So pretty much everyone in the US who owned a computer was on the inside.


This article reminded me of an NPR piece I saw a few days ago about "When Women Stopped Coding"[1]. The theory goes that without having prior knowledge of working with computers, people cannot compete in many introductory computer science classes. Computers had been marketed like toys, to one gender: boys. Because of this, it was rare for women to have prior computer experience, and so they struggled and dropped out of CS.

It made me reconsider what "technical privilege" really means and why I think it's a valid thing. Exposure to computers and technology outside of the classroom is definitely becoming more common, but not the kind of hobbyist interest that this article talks about.

[1] http://www.npr.org/blogs/money/2014/10/21/357629765/when-wom...


I don't think the author meant "technically privileged" as "those unfairly given an advantage for whatever reason". To me, it read more as, "folks who grew up immersed in this stuff" vs "folks who come to it later in life". I started coding in high school, and I think I approach it very differently than people who never wrote a line of anything until sometime in college or later. It strikes me as a legit difference, not one to bemoan as unjust but worth pointing out to people just getting started who might be intimidated.


Having had a bit of empathy during my college years, I spent a lot of time complaining about how my instructors and my program leaned heavily on the fact that a large percentage of students would be hobbyists, and used that as an excuse to not teach.

I'm sure it would have been easier to lean on the fact that I'm the son of two computer programmers, always had a computer (and various manuals) in the house, and started learning C when I was 10 - and to look at the plight of people who were less fortunate than me as a "first world problem", as education and employment are rarely described.

>But still, we should all take a moment to recognize the plight of the comfortable, generally satisfactory lives of those among us struggling silently with the burden of "technical un-privilege."

Because everybody who goes to college must come from comfortable, generally satisfactory lives - they all must have grown up with the internet, owned computers, gone to good schools, and couldn't have come from crushing poverty or from the third world. They must have all been exactly like you, except lazy.

I must be misunderstanding your post, because it seems exceptionally cruel, in a cocky, stupid way.


Did we read the same thing? I got the impression that the author realized at the end that "technically privileged" was just an illusion, and the whole point is that they don't exist.


Yes, to me this piece flies way past the "technically privileged" assertion. And in this way, it is one of the best counter-arguments to the "technically privileged" accusation some people have against our field.


Well, they do in a sense. It's the difference between growing up in a family like the one in Roald Dahl's 'Matilda' and in a family that encourages growth. I also think that having a family member in software development would make the transition into the world of work quite a bit easier. But in the end the right attitude can trump any of these things, which is what the author discovered. I am happy for him.


You're missing the point: it's not about "outside of class", it's about prior experience


This, for me, was a bit surreal and strange, because it took a long while for me to arrive at the same conclusion, but from the complete opposite end.

I came to computers from electronics, and for me, Data Structures and Algorithms was (to an extent, still is) black magic, with Don Knuth the presiding head wizard. For me, compilers, filesystems and OS internals were all sort of deep dark places that I wasn't equipped to handle. On the other hand, banging out a shell script, spewing Perl, or reciting Python was just a matter of figuring out the right set of incantations ...not really hard. I learned HTML, CSS, JS, a lot of Python, and was (/am) happy to just 'get-things-done' with these skills. I looked at people who could talk about Big O, concurrency, monads and the rest as the 'privileged' -- I could never be /that/ good. And yes, I thought all of this was relevant to software engineering (as opposed to science) and that I would never get there. Fortunately, I no longer think this. Thanks to all the resources available online these days, the limits, I know now, are not inherent, nor are they imposed by the ability to understand; they are imposed simply by the time available for learning and practice. So, I am still learning, and yes, in that sense, the art of googling well can make you a better programmer -- irrespective of what you know today.


I'm in a similar boat. I majored in "Information Systems" which was practically a glorified business degree with some IT, networking, and a tiny bit of programming thrown in.

I was able to pick up comp sci fundamentals from various awesome online tutorials and even just good old Wikipedia.


I currently work with a bunch of people who are "business majors" or whatever and are similarly self-taught in the Tech/IT/Programming space.

It's interesting to watch how they work and the kinds of outputs they produce. Quite a lot of the stuff works... mostly...

It's entirely untestable, not repeatable, undocumented and just stuck together with flaky glue code.

Sure, if you're aiming to "make it work" you can do that. If you're aiming to do well... I suggest learning proper technique.


> It's entirely untestable, not repeatable, undocumented and just stuck together with flaky glue code.

Code that's testable, repeatable, well-documented and elegant doesn't guarantee that an application will actually meet anyone's needs in a commercially sufficient manner.

The WordPress code base is frequently criticized, and the Facebook index.php code from 2007[1] will probably make some people cringe, but these are just two of many examples demonstrating that if you want to "do well", building applications that work and meet the needs of their users is initially a lot more important than "proper technique."

[1] https://gist.github.com/nikcub/3833406


If the Facebook code had stayed untestable and undocumented, it would never have worked at their current scale.

Testing and documentation make a difference in the long term. Unless your business model is "build a shiny tower and sell it before it crumbles", at some point, you'll have to fix the mess you made in the early days of the application.


I once worked on a project that was started by two construction engineers. It was a Java web app. It worked well enough, but many customers complained about the number of crashes and errors the app produced. But it had nothing to do with them being construction engineers. It was their mindset: they thought they now knew enough and didn't need to get any better. When the complexity grew, the app was beyond their competency. Even when faced with their failures, they were unwilling to admit that they needed to learn more about professional software engineering.

I have talked with some developers about my observations and almost all of them had a similar anecdote to tell. Some of them included CS grads with a similar "good enough" mindset.

These anecdotes are in no way a representative study, but for me, they put the radical professionalism movement of Uncle Bob in a whole new context.


I think it's a mindset thing; either you have the mind of a developer or you don't, degree or not. I'm self-taught, and early on I had an inferiority complex when it came to those with CS degrees. Didn't take long for that to wear off as reality exposed gaps in their ability/knowledge.


Fair enough.

I have a Software Engineering degree (certified Engineer), so I already have a superiority complex anyway ;)


Just wait until you meet mathematicians.


Or physicists, everyone not a physicist is a lesser mortal, maybe except mathematicians :).


And mathematicians are wankers (according to Feynman).


This is a false dichotomy. I'm a self-taught programmer and the software I develop has full test coverage as well as thorough documentation.

Proper technique and best practices are definitely important. They can, however, be a part of the self-learning process, just like learning how to write the program itself.


Absolutely. I think it's important to learn how to write proper code. I think my biggest hurdles were thinking that I wasn't good enough to even try, and being unable to acknowledge that my school had given me a solid background in CS.


>Some Small Advice: if you’re just beginning to program, I’d suggest starting with JavaScript/HTML/CSS because there is a plethora of help out there.

What terrible advice.


Absolutely. I always suggest Python since it teaches kids to always indent and make code readable. It also has very good documentation and is a simple language. There is plenty of help out there for it as well.

I initially tried JS/HTML with the kids in my school, and although it was fun because they were making visual things, it turned them away from the console. This is not a good thing, since the console is where they have to do a lot of the work, and having an aversion to it is one of the worst things that can happen to you.

I simply state this as my experience. Yours might be different.


Why? For beginners, web technologies are perfect. You can have results like a first static page within the first day and from that point go on to eventually build complex web apps, and all that with the same tools and languages. You will not be a programmer at first, but you might become one.


I think it's a great entry point. HTML/CSS is a very low bar of entry for being introduced to how to talk to a computer. Of course it's not "programming" and involves very little logic, but it can show people one very important thing:

"See look, it's not so hard to talk to a computer. Those words on TV aren't all gibberish."

It's not like it teaches bad practices like a number of other "entry level" languages do. I'm sure the majority of people in my generation (late 20's, early 30's) started with HTML/CSS when they were kids.


Why is it terrible advice?


Because programming != web programming, and sure as hell HTML/CSS is not programming.

About JavaScript (vanilla JS), nobody even wants to touch it anymore. Almost anybody who interacts with JavaScript code does it with a five-foot stick (CoffeeScript, ClojureScript) and/or behind huge frameworks that protect you from its perils.

Something that is more usable and understandable in its barebones state, like Python, makes more sense for beginners.


Vanilla JS is perfectly fine and there are a ton of people out there who are using it every day on a highly professional level.

There is a huge amount of tooling these days and especially the Node.js community and ecosystem are flourishing. Code coverage, static analysis, test frameworks for every use case imaginable, you name it.

CoffeeScript and its brethren are a "solution" for people who are not interested in JavaScript and don't care about performance / maintainability in the long term. The additional compile step and the resulting problems with not being able to know exactly what it ends up generating become a huge burden in the long run. If you're really good in JS, you know what will be JIT'ted and what not, just like a really good C++ developer knows what the compiler will eventually optimize away and what not.

Seriously, building your cool, new library in <Insert Fancy New Compile to JS Language here> is like writing the latest addition to Boost in C89. Nobody wants to depend on it in the long run.

From what I have seen, the people who are mainly using things like CoffeeScript are coming from Ruby or Python and want their Kool-Aid back.

If you don't want to learn a language, don't use it and if that means you won't be able to do stuff on the web, well I'm totally fine with that. I've seen enough code that looked like a J2EE developer just started coding C the other day.


While I do agree with your stated facts, and I accept mine was a hyperbole, my point was more about the hidden pitfalls and uncommon language design of JavaScript.

Learning a language with prototype-based inheritance would serve a beginner less well than it would a professional.


I think you're severely overestimating the number of people who use a compile-to-JS language.


I do agree it was a hyperbole.


>>Because programming != web programming

That's your personal opinion, and frankly, it reeks of elitism.


Um, that bit alone should be uncontroversial. Two sets are only equal if they contain precisely the same members. "There is no programming except for web programming" reeks of more elitism (in that it is more restrictive) than "there is programming that is not web programming". "Web programming is not programming" is another matter, but that's not actually what was said.


>>"Web programming is not programming" is another matter, but that's not actually what was said.

Actually, it was. If you read the rest of the parent's comment, he completes that sentence with: "...and sure as hell HTML/CSS is not programming."

So it's pretty clear from the way the sentence is written that the parent has clearly established some sort of hierarchy in his mind, and what he actually meant was, "web programming is not real programming."


"Actually, it was."

No, it wasn't. Read it more carefully. You may be right about the beliefs of the author, and I don't endorse the entirety of the comment, but "web programming is not programming" was not said anywhere there, and the principle of charity would suggest you should at least make sure that is what the author was trying to convey before lambasting them for it.

"If you read the rest of the parent's comment"

I did. Before I replied to you, even.

'he completes that sentence with: "...and sure as hell HTML/CSS is not programming."'

HTML/CSS is not programming, it's markup. Markdown and LaTeX are also not programming. That doesn't mean there can't be plenty of programming that involves those.

'So it's pretty clear from the way the sentence is written that the parent has clearly established some sort of hierarchy in his mind, and what he actually meant was, "web programming is not real programming."'

It's plausible, but again, principle of charity.


Isn't TeX/LaTeX Turing complete? I'm pretty sure people have done things like implementing an emulator in LaTeX.


Possibly. Postscript certainly is. I don't think that really changes the point that most of what people do using LaTeX is markup, not programming.


HTML and CSS are not programming. This is not controversial.


Googling well is very helpful in filling in gaps in your knowledge that are widely known by others. For most (?) software engineering jobs, that's probably enough.

Writing some code that works, though, doesn't forward the state of the art. I was lucky enough to take AP computer science in high school. I remember arguing with the teacher when she criticized my code in front of the class. I said something along the lines of "Well, it runs and produces the right output!" She then lit into me, explaining that the structure of the code, its readability and maintainability were at least as important as its output. I've carried that lesson with me in the 19 years since, and had it reinforced multiple times over.

The art of "Googling well" won't teach you good abstraction and maintainability, and yet I would argue it is one difference between junior engineers and (good) senior engineers. We talk about learning all these frameworks, but that's not nearly as useful as developing the skills to create the next powerful abstraction to make a class of hard or time-consuming problems easier or faster to solve. You won't learn how to do that on Stack Overflow.


Do you have any thoughts on how a junior engineer can improve their ability to write maintainable code? This is the main problem I struggle with as a self-taught developer. I can write code that works for the most part thanks to googling all the things, but my projects tend to turn to spaghetti the longer I work on them, and are greek to me when I return to them in 6 months.

Part of the issue is that it's much more rewarding to add a new shiny feature than it is to refactor the existing mess, but another part is I'm not really sure what goes where. Most online courses targeted at inexperienced programmers teach x feature of y programming language, instead of practical software engineering... as a result I feel like I've learned how to build a fancy stained glass door for a house that has an unstable foundation.

I've often thought I'd like to hire a senior dev to sit with me and help me work through building an app the 'correct' way... but short of that I'd love to know of any books or web resources that would help.


Read other people's code. There's a world of high quality (and not so high quality!) code out there - pick a reasonably simple open source app and read its code. Think about how it's put together: Why did the programmer choose to cut the problem space into those particular pieces? How do those pieces interact? Is the API obvious, or completely opaque? Is it completely batshit internally? (I've seen some batshit code in my time. I even wrote some of it.) How could it be improved?

Mentoring with a senior dev is a great idea if you can find someone willing to sit down with you, but just doing the work of going through real world code is well worth your time IMO.


> Do you have any thoughts on how a junior engineer can improve their ability to write maintainable code?

Think of your code like an essay. Write comments on the motivation for the decisions you make, not just what the code does. Think of the exercise not as just trying to produce working code, but as trying to make the next person reading the code feel like they were with you the whole way, making decisions together, working out problems on a whiteboard.

While doing this, don't be afraid of honesty, don't be afraid of verbosity, and always have empathy for the next person reading your code.
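
To make that concrete, here's a tiny made-up Python illustration of a "what" comment versus a "why" comment (the retry count and the rate-limit story are invented for the example):

  # "What" comment (restates the code, adds little):
  #   retry up to five times
  #
  # "Why" comment (records the motivation):
  #   The upstream API rate-limits bursts; five retries with a short
  #   backoff were enough to ride out the spikes we saw in testing.
  MAX_RETRIES = 5

The second kind is the one the next reader (often future you) actually needs.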


I'm far from what I would call a senior engineer (or an engineer at all), but the times I've written the most readable and maintainable code were when I first wrote what the program should do in comment form and then completed it with the corresponding code.

This way you force yourself to think outside your programming language, and it also forces you to write the corresponding comments.

I think this has a name as a methodology but I can't remember it.
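
For what it's worth, a minimal sketch of that comments-first approach in Python (the task, filename and function name are made up for illustration):

  from collections import Counter

  def top_words(path, n=10):
      # 1. Read the file and normalise everything to lowercase.
      with open(path) as f:
          text = f.read().lower()
      # 2. Split into words and strip surrounding punctuation.
      words = [w.strip(".,;:!?\"'()") for w in text.split()]
      words = [w for w in words if w]
      # 3. Count occurrences and keep the n most common.
      return Counter(words).most_common(n)

  print(top_words("example.txt"))

The numbered comments were written first as the plan; the code under each one came afterwards.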


I was also self-taught and it took a good 2 years for me to get to the point to be able to write maintainable code. Here are the lessons I learned, in no particular order:

1.) Read your code after you write it. I do this via git diff --cached. It's amazing how many bugs you catch this way. Be critical of variable names, code layout or confusion, lack of documentation. It's basically doing your own code review. Is it documented? Is it tested? Does it make sense logically? Is it easy to read?

2.) Document stuff. Fluff documentation is even fine to start with; you'll eventually learn to separate the important documentation from fluff. Ultimately you need to get to the point of documenting "why" the code does what it does, not "what" it does. "What" it does should be evident from the code itself and the variable names/etc.

3.) Plan ahead. I don't mean pseudo-code per-se...more thinking or writing down how code will fit together. Mind mapping is great for this, particularly with a white board or tile. I have a Galaxy Note that I bought partially for this purpose and use Papyrus for it now. Basically, how do the abstractions and parts fit together? Is it logical? Not everything requires a whiteboard, but a few minutes of thinking ahead can help spaghetti code a lot.

4.) Learn different programming paradigms. OO style can quickly become un-maintainable even to pros. I suggest learning functional programming in a language like Scheme/Clojure, Lisp, or Erlang. I now pick up a new language every year in different paradigms...not so much to learn the language or to brag, but to learn the different ways things can be architected, and the way other languages do things.

5.) Do refactor. Outside of budget constraints, don't be afraid to rewrite code that doesn't seem to flow properly.

You mentioned sitting with a senior dev...that was pretty huge for me too, even in limited times. (I'm not sure I'd pay for it, but it did cut the learning curve down quite a bit) Having someone do code reviews of your code that knows what they're doing and invests time in it is helpful too. I've found though that not every Senior dev is good at reviewing code or teaching...

Sorry, I don't have any actual resources for this other than well structured code like Django (mostly) or sqlite. I now have a B.S. in CS and I don't really think they teach this well in schools either. Patterns will only get you so far in OO too...generally, trying new languages, reading code, and refactoring has made the largest impact on me.


It seems like the skill in question isn't necessarily about becoming a programmer as it is about becoming a hacker. It's a fine line, but I think the skills you learn at Google will be much more programmer-esque, like code review, unit testing, and doodling on whiteboards :). Whereas I see deploying APIs in weekends to get things off Craigslist to be in the domain of hacker-ish things, what someone else already called "hobbyist". I think that's an important and subtle flag for career prospects, so the line isn't that clear, though.

Another point I get from this is both a critique and praise for CS learned in a liberal arts college. You felt unqualified for getting a software engineering job, which is something that really needs to be addressed in small LACs. On the other hand, academic rigor probably prepared you well for cutting through bullshit that surrounds fashionable technologies. For example, you may not have heard of Heroku, but you may have learned about networking protocols and virtualization, which I think is the harder part to learn.

I think you struck a good community by choosing web development (a bit biased), but I'd like to amend that with a word for the road not taken, which is that back-end programming is becoming really, really accessible as well. Tutorials for Flask, Node, and Rails are just as prevalent (and potentially misleading) as HTML tutorials of yore. Now anyone with a mac (and even windows! would you believe that!) can set up a server and push "hello world" to heroku.

Anyway, I'm looking forward to seeing "How to become a Googler, or the art of ????? well". I don't know what ???? is but I imagine it involves a lot of frustrated inability to Google things :)


My 2 cents.

Don't be fooled by the false sense of know-how you get when you just start being able to write some HTML/JS stuff.

Googling a problem is a reflex when you start coding, and yes, good or bad googling abilities could be the difference between you still being in the industry in 5 years or not.

But, still, this is only a first step on your path, and stopping there will also stop you from being a real hacker.

There are things that just can't be resolved with a single search: "How to center a div in CSS" is not like "How to share load across different servers", because a big question must be divided into a lot of smaller ones, and that is where the difference between you and your "code-god compatriot" comes in.

So the real path looks more like this: 1- read good literature (CS, functional programming, networking, etc.), 2- exercise, 3- read good code, 4- exercise, 5- goto 1.

Anyhow, best of luck at Google ;)


I started learning in 2000, when Google wasn't big yet and Stack Overflow didn't exist. I think every new programmer gets a false sense of how easy everything is.

14 years of learning later, I now realize how little I know.


I started coding in high school in 1973 and professionally in 1981 with no benefit of any classes (other than the one in high school). Believe me I prefer to live in the modern age where your knowledge is expanded by the world's collective brains. In the "old days" you only had your own brain and limited library books and periodicals and maybe a good buddy. It was much harder than today to do anything you didn't already know but we didn't know any other way. Today it's much easier to find information, get ideas and opinions. I even use Google to find method names in iOS documentation (Xcode's sucks). But you still need to be able to understand what you read, decide if it is relevant, and how to apply it. Those things you can't just learn from Google fu, it takes experience, trial and error, and a willingness to recognize when you are wrong.


Except for maybe Haskell[0] and Lisp[0], I've begun to feel like there isn't any language that I can't quickly learn, given how well I'm able to google at the moment.

For example, I'm currently helping some fellow students with their F# assignments, even though I don't do F# at all. I just know what to google, because I know the concepts that I need translated to the specific syntax.

[0] Maybe Lisp, but Haskell is way too different from other languages in the way to programme. I'm currently learning it, but it's by far the language that has taken me the longest.


I remember how I was starting to think at the beginning of college, that with a reference sheet I could pick any language and start writing in it ... you know, C++/Java/Python, it all looked the same.

Then I applied for a course in programming paradigms, which boiled down to 4 weeks of Haskell and then 4 weeks of Prolog.

So, definitely add Prolog to your list :-) It was eye-opening :)


I have an opposite sort of experience: I can't learn things fast enough if they aren't mathematical in nature. I used to think programming was disgusting because of Python (I heard Java is even worse) till I came across Clojure. I immediately appreciated how the latter imitates pure math so well :) Never going back to Python. Once in a math mindset, learning anything imperative is painful.


Curious if you've also tried Forth?


Or any other concatenative or stack-based programming language. And Agda and Idris and other dependently typed languages I feel deserve to be on this list. There's also Qi/Shen. Of course various assemblers, too. And logic languages, like Prolog, Datalog, kanren. And then...

In short: every time I thought I know "most" of what there is to know about something I soon discovered that I was wrong, and that "something" was a vast area which I didn't really explore at all except for a tiny little fraction of my immediate surroundings.


Take a look at APL. There will be three.


I think there's only two and Lisp doesn't need to be on that list, but maybe I say that because I've been using Common Lisp for so long.

Lisp ought to be easy since all you basically have to learn is: "(function parameter1 ... parameterN)".

Coming from Lisp I've been trying to learn some Haskell and holy cow syntax[1]! There's a lot to parse and know (compared to Lisp) before being able to write a small program.

[1] I know it is not literally syntax but mostly function calls, it's still a lot of extra special characters one needs to know about.


Prolog, and there'll most likely be four.


Or J, and related languages. Crazy stuff, but apparently super productive once you're used to them.


I'll argue that good programmers need to have a working understanding of algorithms, databases, compilers and all that. Sure, it's not something you will use day-to-day in a software engineering job but it's important to know how all the low level pieces of your stack fit together in order to write good code.

There's nothing worse than a programmer that fits together code snippets from Stackoverflow and similar sites without understanding what it is actually doing.


I think it's fair to say that you can be effective without necessarily being good and vice versa.

I think some of the best developers are those who are smart enough to have gotten a CS degree but skipped a bit of the academics. (A bit self-serving: I graduated high school in 1996, had my first intro IT job by the fall of 1998, and had my first real programming job in January 2000, while my contemporaries were still in college.)


Everyone overvalues education similar to their own and undervalues education dissimilar to their own.


Everyone generalises too much.


Best 4 words I've read today.


The general conclusion is that you haven't read much today.


This was fun to read. What I saw was someone saying that they were panicking because they needed time to understand how all the things that go into learning to be a researcher also go into learning how to be a professional programmer. I'm quite sure his mind will be blown once he realizes that creating things, while incredibly rewarding, is only half the work. The other half is getting someone else to use your widget. Also (bad-at-math joke), there's an extra 50% involved in getting attribution.

My take? I'm biased because I was educated as a researcher. I can name-drop three famous professors at my alma mater (and yes, I did study under all three), but what really pushed me was getting a job, on campus, writing software to support PhD candidates in another discipline. I quickly made the jump from researcher to practicum precisely because I was trained as a researcher, then applied those skills to understanding what my customers needed.

Seems to me this guy found his own "push". And yet, he's still a researcher.


>>but what really pushed me was getting a job, on campus, writing software to support PhD candidates in another discipline. I quickly made the jump from researcher to practicum precisely because I was trained as a researcher, then applied those skills to understanding what my customers needed.

I think what allowed you to make that jump was that you were inherently familiar with the problem domain; even though your customers did research in another discipline, at the end of the day they were still researchers.

I was the lead trainer at my company, and the training materials (hundreds of pages of them) used to be on Word documents. I got fed up with it, taught myself Rails, developed and deployed a learning management system, and migrated the content there. It was very successful because as the lead trainer, I knew exactly what features would be needed and how they should work. It was kind of funny actually because when I gave a demo to the trainers in other teams, they basically said, "wow, this is exactly what we need -- when can we start using it?" Which is music to any programmer's ears.


FYI, the author is not male.


  The first time I met him, he was in the middle of writing a 
  script to scrape Craigslist of all its free furniture 
  listings (which is illegal, btw).
It's a violation of the Craigslist TOS, a contract between two citizens, so it would be covered under civil law, not criminal law. A creative prosecutor could try to spin it as a crime under the CFAA (http://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act), and there have in fact been CFAA prosecutions for scraping Craigslist listings (http://en.wikipedia.org/wiki/Craigslist_v._3Taps), but I doubt there's much of a case here.


This would fall under US v Drew (http://www.dmlp.org/threats/united-states-v-drew), a case where someone violated the MySpace TOS and was then charged under the CFAA. The judge ruled this was not a violation, citing the void-for-vagueness doctrine and not wanting to give websites the power to write federal laws.


Learning to “google well” has become the fundamental skill to replace reading/writing well in life. Scary.


Wow, the guy is exactly like me: he graduated college and in winter he started panicking. Pretty good at Googling too lol. I graduated with a degree in Mechanical Engineering this summer and have been unemployed since. It's winter now, and I'm panicking. I'm learning web development in the hope of getting into something. Strange how similar his path and mine are up to winter. Time will tell where I end up.


* But a liberal arts CS major prepares you for a career in computer science research, which is completely different from, say, software engineering. It’s like comparing a physical therapist with an athlete

Kind of funny that software engineering is compared to an athlete. In the current situation, I'd compare it with a headless chicken.


Over time I have come to the conclusion that one of the best ways to identify a really interesting problem is when your attempts to google a solution come up blank. Google doesn't have all the answers, and if it becomes literally your only mechanism for moving forward you will hit a wall sooner or later.


Hm, this can go the other way around too: you don't realise how hard something is and how far ahead some people are until you start programming[1]. The same goes for other sciences/sports/whatever.

http://xkcd.com/1425/


Google is your remote long term memory. Why wouldn't you use your long term memory? Sooner or later our brains will be modified to make the round-trip of entering request and getting a result much shorter and the difference will disappear.


Or google 'what books should I read to become a programmer' http://matt.might.net/articles/what-cs-majors-should-know/


I feel the opposite. I am a self-taught full stack programmer and yet I wish I had some sort of degree to pursue research as a career.


This just reminds me of the Expert Beginner: http://www.daedtech.com/how-developers-stop-learning-rise-of...


In my mind I am pronouncing the italicized words with an Italian accent.


I thought this was going to be something bashing programmers by the nature of the title. Glad I was wrong. Great story and something that all aspiring programmers should stick to.


So the OP used Google to get a job at, well, Google.


"But I’d never even heard of the words front-end developer or full-stack. What was Heroku? Never heard of Xcode. Wasn’t git just the thing you used to make sure your partner had access to the code you’d written? Frameworks? Didn’t you just need HTML and Javascript? What the hell was an API?!"

If you literally never heard about any of those during your computer science courses, you attended a sad, sad university.


Apart from HTML/Javascript, which is usually taught in a web development course, most of these aren't taught in usual computer science courses because they have nothing to do with computer science. They are software engineering concepts.


I am well aware of that, being a CS student myself. However, those things are expected to be learned alongside. There won't be lectures about them, but hands-on exercises will be impossible without them.


Lol that's not true either. How would hands-on exercises be impossible without Heroku? Haha. You know what I did in my CS classes? Wrote my own server in C. That's real CS, which is very focused on deep understanding -- not domain knowledge in software tools. You should ditch the attitude. Your first comment wasn't even necessary. You were just looking to feel good about yourself.


You wrote your own server without using any kind of API (e.g. TCP sockets)? I don't believe you ;)
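
To illustrate the point: even the most bare-bones hand-rolled server sits on top of the sockets API. A rough sketch in Python of the same idea (the parent's server was in C; the port and response here are made up):

  import socket

  # Even a "from scratch" server leans on the OS sockets API.
  srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
  srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
  srv.bind(("127.0.0.1", 8080))   # arbitrary port for the example
  srv.listen(1)
  conn, addr = srv.accept()       # blocks until a client connects
  conn.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\r\n")
  conn.close()
  srv.close()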


Or you went to university a decade or so ago, when instead of Xcode it was Emacs, instead of git it was CVS, and so on...


Or you went to university a decade or two earlier?



