The lesson here: it was 1998. "The industry" was still a little fledgling, so the legal territory was still largely uncharted. But it grew into an ugly duckling quickly.

Sent the contract to my lawyer. She marked it up, sent it to the client. Then the client marked it up and sent it back to my lawyer. And so on, back and forth for almost a month.

Garbage in, garbage out. During the "ugly duckling" phase, the legal machine is just learning that it can spew garbage. It tests its limits. Just how much garbage can it spit before something happens? When the garbage is between private parties? Apparently, a lot.

I charged my first client $1,400. My second client paid $5,400. The next paid $24,000. I remember the exact amounts; they were the largest checks I'd seen up until that point.

Then I wrote a proposal for $340,000...

The Bust was just growing pains.

It could reasonably be argued that the industry is still in an ugly duckling phase (multi-billion dollar valuations, really?).

But this is part of growing up.

In Code and Other Laws of Cyberspace, Lessig writes:

It is a lack of a certain kind of regulation that produced the Y2K problem, not too much regulation. An overemphasis on the private got us here, not an overly statist federal government. Were the tort system better at holding producers responsible for the harms they create, code writers and their employers would have been more concerned with the harm their code would create. Were contract law not so eager to allow liability in economic transactions to be waived, the licenses that absolved the code writers of any potential liability from bad code would not have induced an even greater laxity in what these code writers were producing. And were the intellectual property system more concerned with capturing and preserving knowledge than with allowing private actors to capture and preserve profit, we might have had a copyright system that required the lodging of source code with the government before the protection of copyright was granted, thus creating an incentive to preserve source code and hence create a resource that does not now exist but that we might have turned to in undoing the consequences of this bad code. If in all these ways government had been different, the problems of Y2K would have been different as well.

[source: http://www.code-is-law.org/conclusion_excerpt.html]

This is dated (1999), but interesting. He was wrong about Y2K, of course, but not about the underlying issues and problems with contract law and IP.



Why would a coder ever be held responsible for something not specified in the requirements doc? If people wanted Y2K-compliant software in the 70s and 80s, they should have specified it.

As for multibillion-dollar valuations, what's wrong with the valuations on MSFT, AAPL, GOOG? If you think the P/E is crazy on LNKD, just short it. As for what VCs are willing to invest in particular companies, those investments are more anecdotes than data.


Because it's not unreasonable to expect the coder to supply you with something which will not fail arbitrarily in 10 years for a reason which is 100% predictable. A parallel example: if an architect designs you a building which develops a leak because of a mistake in a construction detail, you would sue for the cost of repairs and damage to property, either in contract if available or in tort for negligence if no contract exists.


>Because it's not unreasonable to expect the coder to supply you with something which will not fail arbitrarily in 10 years for a reason which is 100% predictable.

Really. A coder is supposed to know that he should spend unnecessary resources (mainly memory) to add a feature that will be needed in ten years for a program only expected to be used for five years? Most of the problems with Y2K (which didn't happen, as it turned out) were because many programs that were expected to be replaced weren't. And before the mid-1990s RAM was seriously expensive; people did not buy more than necessary, "just in case".
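
For anyone who hasn't run into the trade-off first-hand, here's a rough C sketch of it (the struct and field names are made up for illustration): storing the year in two digits saved bytes, and the exact moment the date arithmetic goes wrong was knowable from day one.

    #include <stdio.h>

    /* Sketch of the classic Y2K trade-off: a two-digit year saves space,
       but any interval arithmetic silently breaks once the calendar rolls
       past 1999. Names here are hypothetical. */
    struct record {
        unsigned char yy;   /* two-digit year: 85 means 1985, 00 means 2000 */
    };

    /* Years elapsed since the record was created, assuming "19yy" forever. */
    int years_since(struct record r, int current_yy) {
        return current_yy - r.yy;   /* 0 - 85 = -85: a predictable failure */
    }

    int main(void) {
        struct record opened_1985 = { 85 };
        printf("in 1999: %d\n", years_since(opened_1985, 99));  /* 14, correct */
        printf("in 2000: %d\n", years_since(opened_1985, 0));   /* -85, broken */
        return 0;
    }

The fix costs memory up front, which is precisely the resource the comment above says nobody wanted to spend in 1985.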


Buildings are expected to last, and cutting corners (on multi-decade support, for example) is not exactly a "mistake" like the one that would spring a leak. Aside from that, where do you draw the line?

If the software doesn't work with accented characters and it wasn't in the specs, do you blame the coder? Reading that common-but-not-currently-used-here file type? IPv6 support?

Maybe it's because I'm a coder, but if you want a building to withstand an earthquake, you gotta say so.


Buildings also need to be maintained and retrofitted. Around here in Cambridge, there are countless residential and commercial buildings that were built before the days of electricity and air conditioning, and even plumbing in some cases. The maintenance costs often far outweigh the initial building costs, even given inflation.


Defects that are 100% predictable do not fail arbitrarily, they fail predictably.

I suppose you've never used a fixed-size data type (e.g. int) as a primary key, right?
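
Same class of defect in miniature. A rough C sketch (the function name is made up) of how a fixed-width key carries its failure point with it from the day the schema is written:

    #include <inttypes.h>
    #include <stdio.h>

    /* A 32-bit signed key has a hard ceiling that is knowable on day one,
       just like a two-digit year. */
    int32_t next_id(int32_t last_id) {
        if (last_id == INT32_MAX) {
            /* row 2,147,483,648 can never be inserted; the failure point
               was 100% predictable from the column type alone */
            fprintf(stderr, "primary key exhausted\n");
            return -1;
        }
        return last_id + 1;
    }

    int main(void) {
        printf("ceiling: %" PRId32 " rows\n", INT32_MAX);
        printf("after the ceiling: %" PRId32 "\n", next_id(INT32_MAX));
        return 0;
    }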



