Wat (2012) [video] (destroyallsoftware.com)
147 points by mikasjp on April 18, 2021 | hide | past | favorite | 28 comments


WAT is classic and very funny, but I recommend that all software engineers watch all of Gary Bernhardt's talks on his website: https://www.destroyallsoftware.com/talks

His talk "Boundaries" taught me one of the most illuminating concepts I've learned in my career - about separation of concerns, functional core & imperative shell, unit testing, etc. - and radically changed the way I write code. The talk uses Ruby as an example, but the ideas apply to any language under the sun.
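To make the functional-core / imperative-shell idea concrete, here's a minimal sketch (in JavaScript rather than the talk's Ruby; the names are my own invention, not Bernhardt's):

```javascript
// Functional core: a pure decision with no IO, trivially unit-testable.
function greeting(name, hour) {
  const salutation = hour < 12 ? "Good morning" : "Hello";
  return `${salutation}, ${name}!`;
}

// Imperative shell: all the IO lives here, kept as thin as possible.
function main() {
  const name = process.argv[2] || "world";
  console.log(greeting(name, new Date().getHours()));
}

main();
```

The core is tested with plain input/output assertions; only the thin shell touches the clock and the console.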

In fact, watch all five of them; they all reach this outstanding level of being educational, interesting, and funny at the same time.


> I recommend that all software engineers watch all of Gary Bernhardt's talks

They are all great talks, but please don't use "The Birth & Death of JavaScript" as an instruction manual!

In addition to the talks on his website, I also recommend watching "The Unix Chainsaw"[1]. It's a great introduction to using the full power of the Unix shell as an interactive programming language.

[1] https://www.youtube.com/watch?v=ZQnyApKysg4


> They are all great talks, but please don't use "The Birth & Death of JavaScript" as an instruction manual!

Heretic. "The Birth & Death of JavaScript" is the Holy Bible of JavaScript. It contains everything from Genesis to Apocalypse. It contains all the good parts of JavaScript which matter.


I'd argue for making this mandatory in your onboarding. It can help so much more than random training sessions and seniors mentioning this in code review. It's a simple shift in thinking that most people should just get.


I also strongly endorse the Boundaries talk and am re-watching it now, https://www.destroyallsoftware.com/talks/boundaries

The core concept of immutable values as the boundaries between components now seems self evident, but I can recall not appreciating this concept earlier in my career. Further, I don't think one can fully appreciate the importance of this concept until they've had to develop, maintain, and operate code both before and after this principle is applied. Here’s my story of being educated in the importance of “Boundaries”.

I started my career at a relatively small company that grew rather quickly. In developing more robust and reliable systems, I experienced the importance of this principle first hand. Commonly, a system that started as a small, low-value prototype would grow into a large and important engineering system. Initial functionality may have been implemented in brittle, mutating code, and that code quickly became a pain point, particularly in testing.

I can recall several cases where one seemingly innocent Java static method that mutated its inputs or static fields was repeatedly the source of production errors. Adding test cases for these situations got increasingly complicated and time consuming, with more and more mocks, stubs, and inspection of mutated state. Further, changing and adding functionality became a tribulation of mentally working through the complexity of the code and its existing tests.

And what do you know, our recent changes reintroduced an old bug despite all of the tests passing. Turns out our existing tests to catch that bug only handled a specific manifestation of the fault, but didn't capture other cases. So of course, let us add some more test cases, each with their own menagerie of mocks, stubs, and inspection of mutated state.

Eventually, an experienced engineer would guide me to refactoring this functionality into isolated components. A single static method could be replaced by several Java classes; some classes to hold immutable state and others to perform state transformation. We may even introduce an interface so that different functionality could be provided through polymorphism. Tests became simpler and more robust such that fewer faults were discovered in production.
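The before/after shape of that refactor can be sketched like this (in JavaScript for brevity rather than the Java of the story; the function and the order shape are invented for illustration, not the actual code):

```javascript
// Before: a seemingly innocent helper that mutates its input,
// so every caller (and every test) has to reason about hidden state.
function applyDiscountMutating(order) {
  order.total = order.total * 0.9; // surprise for anyone still holding `order`
  return order;
}

// After: immutable value in, new immutable value out.
// Tests need no mocks or stubs, just an input and an expected output.
function applyDiscount(order) {
  return Object.freeze({ ...order, total: order.total * 0.9 });
}

const order = Object.freeze({ id: 1, total: 100 });
const discounted = applyDiscount(order);
console.log(order.total, discounted.total); // the original value is untouched
```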

From the outside this may appear to be the classic Java Architecture Astronauts menace with a single static method replaced by a collection of classes and interfaces. We may have even had some XFactoryProvider interfaces. Yet the end result was easier to reason about and test, with the tangible benefit of fewer errors in production.

And I tell this story only so that I can say that I now appreciate this talk even more after living through the application of the “Boundaries” principle.


The WAT talk was rhetorically nice because it managed to joke about weird corners of the language without going into ranty programmer mode, opting instead for a more gentle style of bemused boggling.

It also kind of cemented for me that weird edge cases don't actually matter in practice (in my experience), and that a language can have lots to make fun of while still being lovely to use.
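For anyone who hasn't watched it, the corners being boggled at are things like JavaScript's implicit coercion rules for `+`:

```javascript
// All real JavaScript behavior, straight from the coercion rules for `+`.
console.log([] + []);   // "": both arrays coerce to empty strings
console.log([] + {});   // "[object Object]"
console.log(1 + "1");   // "11": the number coerces to a string
console.log("1" - 1);   // 0: but here the string coerces to a number
// (The talk's famous `{} + []` giving 0 only happens in statement position,
// where `{}` parses as an empty block rather than an object literal.)
```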


>It also kind of cemented for me that weird edge cases don't actually matter in practice

I had the exact opposite impression. Indeed, this stuff is the cause of so much that gets attributed to something else (e.g. JavaScript framework churn being attributed to culture).

The problems caused by these issues compound as you work your way up the stack.


I think I get where you're coming from: weird low-level behaviour could result in the things you build on top being unstable, or harder to build on top of [see my reply to another comment]. (I have been able to avoid employing the big JavaScript frameworks/systems, so have not had to suffer a lot of technology churn.) But OTOH, isn't abstracting away weirdness one of the fundamental functions of engineering? And why would, say, undesirable array semantics result in framework churn on top? Because people are trying to 'fix'/'mitigate' the behaviour but can't settle on a single solution?


Often it's because there is no good solution. That is, you can't really fix the problem without being overly strict and making common use cases impossible or awkward. So you settle for something that usually works for most people. Bad designs have a tendency to creep in.

The thing about designs with bad edge cases is that they can have dire consequences for some people, but as long as they work for most people, they stick around. Forever causing pain.


If you don't keep in mind all the quirks, you create bugs and waste man-hours. If you do keep in mind all the quirks, you reduce your cognitive capacity and waste man-hours. Languages with quirks are categorically worse than languages without them.


Large areas of ridiculous behaviour, if easily characterised, are maybe easier to avoid than the more fine-grained edge-case zones you'd get in a more elegantly top-down designed language, and result in less mental overhead. "Just stay clear of doing anything remotely weird with arrays" might result in better, more readable code, and lower mental overhead, than a language with more easily understandable array behaviour, where it might feel more feasible to keep the edge cases in mind.

I've made a scripting language for video games that some people use, and kept the semantics super informal, growing it quite organically, and it's worked out pretty well. Every time I try to nail down the edge cases I think "ah no, I'd prefer to have this all be undefined/erratic than to have people rule-lawyering their way into technically-correct but inelegant code". The main losses from this are the inability to write a fuzzer, and annoying some people (who use it or, for instance, wanted to write compatible compilers/interpreters). I do rely a lot on test suites for it (which also helps others when they want to do reimplementations), which provide a good amount of stability in spite of the fact that I never tell anyone what exactly counts as a valid variable name and am not exactly sure what that is off the top of my head.

That said, nicely-specified programming languages that one can accurately comprehend + model with a human brain are super nice, allow for great tooling + better portability, and eliminate a large class of bugs that can be very troublesome. I'm a big fan of Go/C#, but they're not everything, and not from my point of view at all categorically better than other options.


> It also kind of cemented for me that weird edge cases don't actually matter in practice

Just look at the Linux kernel. The tremendous amount of complexity and code needed to support all the weird edge cases of hardware that does pretty much the same thing is mind boggling.

If you can ignore the edge cases, then yeah, they don't matter. But as a language implementer these things matter, and for the user they are always present: when they are used, whether by mistake or by choice, they will matter.

So yeah, weird edge cases matter a lot! In fact trying to minimize them is very important to keep the complexity of systems down.


Trying to minimize edge cases can also increase complexity. As a specific example look at the behaviour highlighted in this talk where he shows the result of adding two nonsensical types. In most cases there is probably no sensible result except TypeError, but...

- Throwing a TypeError in those situations would itself require that particular edge cases are specifically defined in the spec to return TypeError, therefore actually making the spec more complex and creating more error situations for code to deal with.

- Some might get a benefit from the "implicit" behaviour even if it usually makes no sense at first glance (only in context with the other rules). Whereas TypeError is never useful in the happy path. An important idea in dynamic languages is to let the caller decide if a particular usage of a type makes sense, rather than the callee.
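JavaScript itself now contains both designs, which makes the tradeoff easy to see: the classic operators coerce silently, while the newer BigInt type is specified to throw a TypeError when mixed with a Number:

```javascript
// The classic operators coerce rather than throw:
console.log([] + 1); // "1": no error, and arguably no sense either

// Newer types took the TypeError route instead:
try {
  const sum = 1n + 1; // mixing BigInt and Number is specified to throw
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```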


I think that doesn't reduce complexity; it just pushes it to the users and the software they create. So now you potentially have orders of magnitude more complexity than if you had dealt with it in the appropriate place.


I had the opposite reaction. I thought the "wat" talk was entertaining of course, but I've found that it has generated a significant amount of hate towards JavaScript among laypeople who don't really understand that every language has weird caveats and edge cases like this.


It's a lot like those order-of-operations quizzes that make the rounds on social media, where people who don't quite understand the order of operations get the wrong answer.

The obvious answer, which every engineer understands: as soon as the order of operations gets complex to work out, you make it explicit by adding parentheses everywhere. Same thing with JS: once there's a chance for some of this weird behavior to show up, you make sure the types make sense.
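In code, the analogue of adding parentheses is converting explicitly instead of letting `+` guess:

```javascript
const a = "2";
const b = 3;

console.log(a + b);                 // "23": + sees a string and concatenates
console.log(Number(a) + Number(b)); // 5: explicit conversion, no guessing
```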


(2012)

Discussed back then: https://news.ycombinator.com/item?id=3515845 (98 comments)


You had me all fired up when I thought a new video was coming from Destroy All Software / Gary Bernhardt.

The title is missing (2012).

I think with COVID way deep in our subconscious, we are releasing all the emotional attachment we have to our old programming tools.


it shows that at the fringes nobody in the audience correctly predicts what will happen...

anyways, here's the abductee-programming-language-conundrum:

consider Language X: there is a nuclear power plant, and you or loved ones live within 10 miles. it is your responsibility to select the language for implementing the emergency shutdown procedures. would you use language X?


I'd select a redundant mechanical device.


Or chemical. My first thought was: scram rods hanging off a "rope" made from an alloy specifically selected to melt at the right temperature. And/or a similar arrangement involving wax, as I think waxes have a longer history of being used as thermal breakers.


For something like that I would be more concerned about the tests than the language. In other words, I'd trust my own code not at all, in any language.


I posit there is a difference between the safety of millions of people and the functionality of my stupid website.


If X is formally verifiable and conducive to writing testable, contractual code then yes.


Sounds like a job for Ada


"A few programming languages quirks" makes it sound less exciting than it actually is.

As per HN Guidelines <https://news.ycombinator.com/newsguidelines.html>, please don't editorialize titles.


This is a classic...


This is so good!



