Going “Write-Only” (begriffs.com)
230 points by begriffs on April 20, 2015 | 140 comments



> The discussion in The Republic maintains that would-be citizens of the ideal republic should be exposed to music that cultivates their good qualities, and prohibited from listening to the bad. Much modern music creates agitation and aggression. I’ll listen to serene and balanced songs like Gregorian chant and neoclassical

> The etching, and even the pressing and collating of pages were difficult processes but somehow the artists outdid us, we who can so easily create, modify and distribute images. Goodbye cartoonish web images, let me be immersed in nature and see uninhibited art instead.

The author should be careful not to confuse a desire for focus, and for the lasting rather than the ephemeral, with a fetishisation of the past, of the "authentic", and of the "natural". It is all too tempting to step from using survivorship bias as a tool, as a filter, into a false belief in the superiority of things past.

> the (false?) feeling of connectedness through a glowing rectangle

This particularly gets to me. I know people who would not be here on this earth today if not for the ability to connect with others through the internet. If you feel like disconnecting makes your life more fulfilling, great! But don't project that onto others by saying that what makes their life meaningful is false. (At least he had the self-awareness to throw in a question-mark.)


Modes of communication are just that: different modes, none intrinsically superior or inferior to another, merely different. Literature is just words, but it has great potency in communication. It can transform one's mind; it can lead to new vistas of understanding, perception, reason, and thought. It can serve as a bridge of understanding between individuals, cultures, eras, religions. It can evoke the full gamut of emotion and feeling; it can reveal the numinous, the extraordinary, the sublime, and the awe-inspiring. Many religions put down their roots in literary works precisely because of the incredible potency such works can have. But the written word is hardly unique; many other forms of art can be just as potent and rich, such as music, painting, sculpture, motion pictures, and even teevee. Even blogs, twitter, instagram, youtube, or vine.

When we look at new sources of communication we often judge them based on their average use case, but this ignores the fact that the majority of nearly any form of communication is dominated by the routine and banal. When we judge literature we judge it based on its greatest successes and ignore the mountain of mediocrity underneath the pinnacle, we would be wise to do the same elsewhere. It's easy to denigrate little glowing rectangles, but it's easy to denigrate scribbles on sheets of pressed wood pulp too. But both can serve as the foundation for meaningful connections between individuals, both can provide windows into the transcendent and the numinous, and both can have profoundly transformative impacts on emotions, lives, cognition, values, perspective, everything.


I predict that this will come full circle again in a few years, and we'll be back to the "leave meatspace, enter the cyber world" over-enthusiasm. And then a few years after that it's vinyl/Thoreau again, etc.


Ebb and flow.


Everyone knows that the musical expressions of 14th-16th century white European men cultivate the most appealing qualities and are the least ephemeral.

I'm sure there's an 18th century biological treatise establishing that.


I know you're trying to be edgy, but it's actually an objective fact that 14th-century music, still being enjoyed by millions in the 21st century, is among the least ephemeral music there is.

http://www.merriam-webster.com/dictionary/ephemeral


I think his point is that we have no idea how long people will enjoy any of the music created today, nor can we really say how much people will be listening to classical music 100 years from now. Music from the 14th century is certainly more battle-tested, but there's nothing objectively "least ephemeral" about it that means it can't, or shouldn't, be bested.


> To program any more was pointless. My programs would never live as long as [Kafka’s] The Trial.

As a programmer, I want to believe I'm a writer, a poet, a sculptor.

But the truth is programmers are dancers, mimes, ventriloquists. We're performance artists, and the systems we program really only have meaning as long as we keep programming them (or, if we're really lucky, someone else takes up the dance when we grow tired of it). Once we take our final bows, the programs will fade away and die and all that will be left is a memory of the performance.


Programmers are really boring analogy whalers, out at sea forever hunting their perfect, elusive, bloody boring, analogs.


Call me HTML


Burritos!


I don't think this is true. Consider the popularity of emulators for older systems. People still care about and use old programs and games. You can even still buy games released 35+ years ago from online retailers like gog. Think about how old programs like ls, dd, etc are (even older when you consider the design and not just the implementation).

People still care about "Super Mario Bros" (a computer program) today, over 30 years after its original release. Is it really going to be forgotten in the next 10 years? 20? 30? I think not. Will all emulators be abandoned?

Maybe appreciation for old computer programs is a niche interest, but let's face it, Kafka is not exactly popular among the masses either.

Most software may be forgotten, but most books will be too.


I find it a little sad that most modern computing will be lost to time. Early console games were easily self-contained systems, and lent themselves well to emulation. Today's consoles are towers of complexity that will likely never be emulated. At best, compatibility layers will be built into future systems to interoperate; the likely result is that the classics of today will continue to hobble along on a dwindling number of devices. At worst, a corporation loses interest and all of its consoles immediately turn into expensive paperweights.


I disagree. It's easy today to think of the consoles of yesteryear as simple self-contained systems (a 6502-based CPU? How quaint!). However, consider the unholy trinity that was the Sega Genesis/Master System/CD/32X. Though no longer modern, that absolutely was a tower of complexity, especially when it was brand new.

The emulators of the game consoles of the past exist today because some hacker got curious and interested. Complexity certainly isn't stopping emulator devs. The Wii was more complex than the Gamecube, which was more complex than the N64, which was more complex than the SNES, which was more complex than the NES. And yet there are emulators available for all of those systems.

What is different today that would prevent an enterprising hacker tomorrow from making emulators of today's consoles?


> What is different today that would prevent an enterprising hacker tomorrow from making emulators of today's consoles?

Not a sliver of hope of access to the server-side runtime of cloud-assisted† games?

† not full-scale MMOs but Diablo III, Starcraft II, Destiny, even Guild Wars...


If they follow the example of Ultima Online they certainly will live on forever.

In some cases, though, they will be replaced with full conversions that get even more popular than the original.


Emulators for old systems aren't very popular at all, compared to the general population's use of computers and computer systems. An increasingly small set of folks seeks out and builds / runs old NES or Sega emulators for the couple of games they miss from their childhood, but I'd wager a vast majority of computer users today wouldn't even think of an old gaming machine as a computer, let alone imagine it could be run on their modern machine.

Perhaps the myopia of computer culture is a result of our lack of evangelism for classic systems, but, outside of games, most software deviates from literature or music in that it's a tool. If I write a better word processor, why would I go back to use a two-decades-old word processor? I wouldn't.

Same for graphics software, or IDEs, or web browsers, accounting tools, what have you. For the most part, these things go to obsolescence and there isn't a use case for dusting them off once that happens. With a book, a story is still a story whether the setting or events are modern or not. A song is still a song, whether played on harpsichord or synth tracks on a macbook.

But a piece of software is a hammer or a handsaw or an impact driver. When you have a better one, you put the crappier one away and it doesn't see the light of day again until your kids are cleaning out your garage when they move you into a retirement home.


Kafka also isn't very popular at all, compared to the general population's choice of reading material.

Old software programs can be useful. For example, consider the diehard following of older programs like Deluxe Paint, Impulse Tracker, etc.

I also don't think it's fair to compare the kind of utility programs that you are talking about to the works of Kafka. A novel is something designed to provide entertainment, insight, or some other benefit beyond the basic utility of a tool. I think the game is the best type of computer program to compare to a novel, in that they have the same aims.

A better comparison for the type of program you are talking about would be something like a lost pet flyer, or a personal ad, a "Caution: Wet Paint" sign, the terms and conditions for a website or even a technical manual. These are also works of writing, but probably are not going to provide much value out of their specific situation and context, save for historical interest.

So I guess the take-away is that if you want to create something that will be remembered for a long time, you have to do more than just provide utility for your own specific time and situation. I think software programs are more than capable of doing that; the programmer just has to have the right focus in developing (so maybe a game, or a software art program, or a demo like the demoscene), or give the program some personality.

On the other hand utility programs and utility writing are still useful and necessary in the short term, and there's nothing wrong with that.


I think this would fall into the category of "picking up the dance". For some software, people don't want the performance to end, so they bring the performances to new stages, but, as soon as they stop actively "performing", the dance will end.


> To program any more was pointless. My programs would never live as long as [Kafka’s] The Trial.

I found it a touch ironic considering that Kafka, on his deathbed, asked his friend Max Brod to destroy all of Kafka's unpublished work, including The Trial.


The request was the final grand gesture.

> BORGES: "In the case of Kafka, we know very little. We only know that he was very dissatisfied with his own work. Of course, when he told his friend Max Brod that he wanted his manuscripts to be burned, as Virgil did, I suppose he knew that his friend wouldn't do that. If a man wants to destroy his own work, he throws it into a fire, and there it goes. When he tells a close friend of his, “I want all the manuscripts to be destroyed,” he knows that the friend will never do that, and the friend knows that he knows and that he knows that the other knows that he knows and so on and so forth."


As much as it pains me to disagree with Borges, Kafka had already burnt almost 90% of his unpublished work [1] before his health declined due to starvation from a then untreatable laryngeal tuberculosis.

I think Kafka did truly want to burn his work, but also feel Max Brod was right to save what he could.

[1] http://www.nytimes.com/2010/09/26/magazine/26kafka-t.html?_r...


He had burnt a lot, it's true, but that raises another question: given that he burned a lot of it regularly throughout his life before his fatal illness, why did he not burn the remainder while healthy, and instead later ask Brod (who claims to have immediately told Kafka he would not do it) to do it? Borges's reading here makes more sense; it was simply the final Kafkaesque gesture.


I prefer the term `software gardener' to `software engineer' for this reason: software is slowly created over time and requires constant care, or it will wilt and die.


I never liked this analogy. There is no such thing as software entropy - bits are bits, and the code doesn't decay with time. The only reason we have maintenance problems is because we keep changing components. If you program a box, unplug it from the Internet and leave it running, the code will continue to work until the hardware physically fails.

Programming for the on-line world is more like managing a clothing brand - everything is constantly changing around you, things go out of fashion every other month and you need to keep up or else your brand will die and get quickly forgotten.


> There is no such thing as software entropy - bits are bits, and the code doesn't decay with time.

The article contains a thought that disagrees:

    > To program any more was pointless. My programs would never live as long
    > as [Kafka’s] The Trial. A computer will never live as long as The Trial.
    > … What if Amerika was only written for 32-bit Power PC? Can an unfinished
    > program be reconstructed? Can I write a program and go, “Ah, well, you
    > get the gist of it.” … But no. It wasn’t written for 32-bit Power PC. It
    > was written for eyes.
See, The Trial was written on a platform that's been around quite a while: German. And it was ported to other, fairly stable platforms, like English.

However, if you write some code for the 32-bit PPC architecture, the pool of possible 'readers' for your code is very, very small. And dwindling by the day. Nothing you can do! Other than keep working on the code, porting it to the latest version of the OS, architecture, platform, API version, or whatever.


You can write it in C. Bloody thing is like a particularly disgusting cockroach. I bet people can still read it in way too many years, to the extent that C can be read.


Even in C, it's not insta-portable. In the quoted author's case, the UI framework was entirely deprecated, basically requiring a large rewrite to use the new API just to keep the same functionality.

If you wrote some C code that ran on Mac OS 9, for example, you'd eventually be forced to upgrade, because eventually, there are no computers running OS 9 anymore.


Gary Bernhardt's "The Birth and Death of Javascript"[1] is a rebuttal to this, I think.

Effectively, imagine that a compiler, put in static-compilation mode, would link everything required to run a piece of code (the relevant windowing toolkit, all the system libraries, copies of any OS binaries the code spawns, copies of any OS binaries the code sends messages to over an IPC bus, etc.) into something resembling a Docker container, or a Unikernel VM image. Imagine, additionally, that all this then gets translated into some standard bytecode for a high-level abstract platform target that has things like graphics and networking primitives—say, asm.js. Now everything can live as long as something still executes that bytecode.

[1] https://www.destroyallsoftware.com/talks/the-birth-and-death...


This will be really cool in the future, someday. But it's a sci-fi like talk for a reason.

(and funny enough, when I saw him give this talk, the above quoted situation is exactly what I thought of...)


Maybe it's unrealizable at the moment for x86 code. On the other hand, for random digraphs of arcade-game microcontrollers, JSMESS is doing pretty well at achieving this. :)


Language changes dramatically over time. Read English from 1000 years ago! There is a great example in "The Language Instinct" which shows writing in English from 4 different periods in time. The oldest will make no sense to modern English speakers.


That's why I said 'fairly' stable. Compared to the length of time it takes software to change, languages are effectively immutable ;)


you have a strange definition of immutable. Language is changing rather constantly. Take an 80 year old and have them listen to a conversation between 16 year olds. I bet there are a bunch of words they don't know the meaning of.


In this analogy, you're talking about PPC specific assembly code, and I'm talking about C.

I mean, yes, this is fuzzy. It's hard to even nail down what "English" _is_, for things like what you're saying.

The broader point is that computer languages have a much smaller period of time where their particular language is spoken than human languages. We'll see if that changes in the future, but I doubt it.


> everything is constantly changing around you

Like an ecosystem? ;)


Well, ok. I guess my point is that programming is like gardening only if you deliberately decide to work in an environment that forces you to keep up with changes, like the Internet. Software for infrastructure can easily last unchanged for 20+ years.


I really like the ecosystem analogy.

Ecosystems have a variety of niches which change at varying paces. A shark isn't better than a finch because it is more stable on the evolutionary timescale, it is well-adapted to a stable niche. More stable niches are not better or worse than transient, quickly-changing niches, they're just more stable.

I think it is important to recognize that the stability of a niche is independent of its value or the care that should be taken filling it. A feed processor must be written carefully even though most often they only live a couple of years before the underlying need for it goes away. A compiler is of great value but without constant maintenance it will be worthless within a few years, as compiler techniques evolve, languages are restandardized, and machine architectures shift. Software written as part of scientific inquiry is frequently worthless and uninteresting after the questions being examined have been answered.


So where do security updates factor into your diatribe? You're saying bits don't decay, but to be honest it's not bits you're coding, it's logic.

When the underlying shared libraries and OS get updated because of security fixes, your code ages a bit. In a while it won't behave as you expect, and after long enough it simply won't work.

And let's face it - I would bet less than 1% of the people in this forum write code they would expect to reside in an unconnected box. As soon as you connect it, security updates are required/happen.


TeMPOraL's post was hardly a diatribe.


Exactly. When was the last time you had to update the firmware on your microwave?


That's because it doesn't need to integrate with anything yet. Most software written now does need to integrate with many different things, especially obvious on web and mobile.


We live in such a primitive era of programming that you shouldn't compare it to printed books, but to scrubbed reusable parchment. We don't write good enough code to last. We don't have robust enough machines to keep it in service.

It's too soon to adopt a philosophical position on what it means to write code. We don't do it well enough to claim to be experts at it.


One might rebut that with an analogy to the William Gibson quote: quality programming is already here—it's just not very evenly distributed.

We write code that lasts all the time. We call it "embedded code." Satellites, nuclear power plants, microwaves, pacemakers, cars—they all run code that lasts.

Most code doesn't last, because most code doesn't need to last. It's not part of the profit-function of the business producing the code, so something else gets optimized for: ease and speed of production, say.

We can code well when there's any sort of incentive to do so that outweighs other incentives. It's not even a matter of individual programmer skill, only of systems engineering: having checks and cross-checks, QA reviews and burn-in testing, tolerances, etc.

Those things can happen "in the small" at the level of how individual lines of code get written, too. Maybe "we" don't know how to do that. But that's ignorance, not inability. We've never tried.


That's quality programming by today's standards. And you say it lasts, but there probably isn't a single piece of code that has lasted a full century. That's barely more than a single lifetime.

Comparing microwave code to the code I'm talking about is like comparing medieval accounting records to Hamlet. One is far more impressive than the other.


Would that be a university degree?

In my world, software engineer is a synonym for what is known as a CS degree.

So I can only expect it to be comparable if gardening were a degree as well.


I don't like the theory that a piece of software can't be complete. Not all software can be complete, but some simple tools (which most software should be made of) can and should be able to be finished forever, or at least asymptotically approach completeness to a degree that the distance becomes irrelevant.


A perfect, fully complete piece of software will still need adaptations: even if feature requirements never change, the underlying platform surely will.

Even a complete, self-contained system—one which never sees any changes at any level from software to hardware—will still require periodic hardware maintenance.


Even the simplest, most -seemingly- feature-complete software can and will need fixes. The best example comes from the Big Iron world, where a program which does nothing was needed: http://en.wikipedia.org/wiki/IEFBR14


Construction/architecture-metaphors work a bit too: They can outlive the involvement of the initial designer, many bits can be tacked on or changed, and they have some overriding functional requirements.


They especially work if you've owned and maintained an actual, physical building. You can build for the ages, but if no one patches the holes in the roof every now and again, it will rot from the inside out in a few years.


Unless you build in stone.


Even if you build in stone.

Most long-lasting stone structures are in constant decay. Without being weather-sealed and maintained, they will decay and fail. Even if they are in a moisture- and vegetation-free environment like the pyramids, they are still subject to that most basic human trait - theft.


I've seen plenty of roofless stone buildings. They're decorative but not useful.


The architecture metaphor is relevant because architecture is a mix of art and science. The same happens with software.

Very few houses become or have the pretension to be monuments. Most of them only have a mundane function: they are shelters. The same happens with software.

Very few architects are visionary enough to design a house in a different way and point in a new direction. Most of them just follow existing trends. The same happens with software.

Most houses become monuments because of historical reasons, not because they are the best ones. The same happens with software.


I see myself as a bonsai artist. Bonsai trees need to be cared for, trimmed, styled, wired, etc. They grow over time. You can change things, but you can't realistically change everything without starting again. A bonsai tree is not "done" until it is dead.

I think this is the same as software.


I generally agree, but I have been surprised. We had a bug in a program my company released in the late 90's and that bug only became apparent last year. 5-6 people contacted us for a fix (which we provided). The program was more than 15 years old and it was still in production. That surprised me and made me adjust my views of each of my software releases. They might be deployed in the field for far longer than I had assumed.


The irony is that digital information is theoretically lossless, but in practice software and hardware change so fast that it is one of the most ephemeral things humanity has ever created.

So if you got into programming because you wanted to create something lasting, I think you are destined to be disappointed. But then again, all human endeavor, and indeed everything in our human experience, even the seas and the mountains and the stars, are all ephemeral on some time scale. As the Buddhists say, the temporary nature of things is unavoidable and neutral; it's one's desire for permanence that causes a problem.


Programmers are indeed like writers, but some of us get to write Beowulf, some others get to write college term papers and the rest of us lie somewhere in between.


> As a programmer, I want to believe I'm a writer, a poet, a sculptor.

> But the truth is programmers are dancers, mimes, ventriloquists. We're performance artists, and the systems we program really only have meaning as long as we keep programming them (or, if we're really lucky, someone else takes up the dance when we grow tired of it). Once we take our final bows, the programs will fade away and die and all that will be left is a memory of the performance.

Nah, it doesn't have to be that way, although it often is.

We're like artists or architects, but most of what we build isn't lasting because there's little demand for that, just as many architects will never get paid to work on anything that might last. The Sistine Chapel lasted; the slums of Rome from almost any era haven't. For every artist whose work makes it into a museum, there are thousands hawking cheap prints on the streets of Paris.

Programmers can build software that will last for a long time. I know people whose programs are still running, untouched, after 30 years. We can solve problems and keep them solved. It doesn't have to be just code; papers and new algorithms can advance the state of the field. Since we're a progressive sort who really believe in the ability of our "magic power" to blow away drudgery, we love when we're able to do that: to solve a problem and keep it solved. If this is applied to code, you can get exponential growth in codebase or library or ecosystem value for a long time.

Here's the problem, though: no one is willing to pay for code that lasts. Just as most architects are building clapboard houses in the suburbs, most of us get stuck writing throwaway code because The Business won't pay for good code by giving us the autonomy and timeframes that'd enable it to exist. Of course, bad code and failed software projects are far more costly in the long term, but executives are not a tribe known for thinking long term; most corporate climbers have the political skill to get promoted away from their externalized costs before anything can be linked to them.


You're tying quality with being long-lasting, but is that really true? I know code that's running for over 30 years, and it's terrible. It's running because it's terrible, because nobody dare approach it and risk it breaking, because it's a flimsy column supporting the roof, and you fear it might collapse.


In your opinion what does an organization which does pay for Sistine Chapels of code look like? It's an interesting thought experiment since presumably we're looking for the Holy Roman Empire of a company (Google? Google in 20 years? Google if they last another 20 years? Or 200?)


> In your opinion what does an organization which does pay for Sistine Chapels of code look like?

The OpenBSD Foundation: http://www.openbsdfoundation.org/index.html


> In your opinion what does an organization which does pay for Sistine Chapels of code look like?

That's a hard question. Until recently, people relied on institutions to set visions, take on big projects, and think to the future. Whatever you think of governments, churches, and universities, they had long-term plans and that prevented them from being too greedy or focused on the short term.

These days, individuals have more long-term vision than the corporations that they work for. Programmers can expect a fight if they want 30-40 year careers. It can be done, but you have to "steal time" at most companies to keep abreast of changes, contribute to open source, etc.

Companies are now in next-quarter mode, because it's a way to hide the fact that most companies are being looted by their executives. If you create a sense of short-term desperation and mandatory immediate focus, then people are distracted from the hands in the corporate coffers. You basically don't see people thinking further ahead than three months in most companies.

So now we rely on individuals to set the long-term course, but individuals are (coincidence of words not intended) individually effective but rarely as coherent as a well-run institution. We rely on individuals for long-term orientation because organizations no longer make themselves adequate.


> We rely on individuals for long-term orientation because organizations no longer make themselves adequate.

The flip side of that coin is that organizations are only as good as their leadership. Being able to select good leadership generation after generation has been something that has eluded mankind since the inception of civilization. There hasn't been an organization invented that doesn't eventually fall to corruption.

So, how do we create institutions that will last? It's a problem that appeals to the ego. The solution would have a much larger impact than just on tech.


What a waste. I am amazed that you can take the attitude of "All new things are broken" and be involved in the creation of anything new at all. To me you need to have one of two outlooks on life to hold that viewpoint. Either you really think that you, independently of anyone else, are so awesome that you will create something of value by yourself that is somehow better than what anyone else is doing, or you think that you yourself are so much better than everyone else that things which hold value to others are meaningless to you. Either way you're placing so much value on yourself that whatever comes out will be so isolated as to be useless to everyone else, but why do you care, since you're so much better than everyone else.

I mean, consider the blog post "I am going write-only", which loosely translates to "what I produce is worth your time to read, but what you produce is not worth my time to read." No thank you.


Especially the part about his twitter usage shows his attitude pretty well.


I like this post a lot, because it reminds me a lot of myself. I have spent a long time thinking about how the deluge of easy entertainment and shallow articles on the Internet has affected the way people operate. I've even done a similar experiment in the past, except in my case it was banning the Internet except for a specific set of use cases. It was certainly an interesting experience.

One warning I will throw out for those considering entertainment/distraction deprivation: you will need something to spend your time on. Instead of just taking away, use the same opportunity to try to cultivate a productive habit (hey, you're going to be suffering anyway -- might as well).

Also, you need to think about whether this really addresses the issues that cause you to be "mediocre"/lazy/unproductive. It's easy to blame external factors like the Internet for providing easy entertainment, but it's also important to look inward and see if you have mental roadblocks that are inhibiting you -- perhaps you are mentally exhausted from work, or maybe you just aren't interested. Also, I've started to think that some people are just flat out less productive than others. Or to put it another way, some people are abnormally productive. After all, productivity is really more of a means than an end. If you're not sure where you are going, it doesn't matter how fast you get there. And if you do have a goal, abstract ideas like quality or productivity are quickly redefined in more concrete terms to fit your new intention.


He has delusions of grandeur. He compares himself to Kafka and says he insulates himself from "online mediocrity". Does he realize how incredibly arrogant he sounds? What he needs right now is to acknowledge that he's just another ordinary human, capable of ordinary deeds (basically an "ego adjustment" to realistic levels). He should do what he loves doing how he loves doing it, without any thoughts about how it measures up against peers or great artists from the past. That's how great art comes into being. But hey - if being a judgmental prick is part of his creative process, then so be it.


You'd also think Thoreau a dick for renouncing society for a time, yes?

> What he needs right now is to acknowledge that he's just another ordinary human

OP is on an experiment to put his ordinariness as a human being to the test. I'm grateful for a view from the sidelines.

"A man is called selfish not for pursuing his own good but for neglecting his neighbor's." -- Whately

And a man is called arrogant not for how much he strives to better himself but for how he belittles his neighbor.

Sorry that you feel belittled.


I got the exact same feeling while reading the post, though I don't necessarily agree with the efficacy of your biting response. However, the excessive quoting of "great" thinkers easily came off as delusions of grandeur.

Anyone can take quotations out of context and use them to sound well-read; what about adding more of your thoughts to the mix?


As a counterpoint, how many people can actually read Chaucer's Canterbury Tales in its original form, or Beowulf for that matter?

Human language changes on a longer timescale, no doubt, but in Neo-Sanfransokyo in the year 2599, will English even resemble its present form?


Chaucer isn't necessarily that bad. If you take the time, you can read much of it. For example, I think you or most HN readers could read and understand almost this entire passage if you know the framing story:

'And whan this goode man saugh that it was so, / As he that wys was and obedient / To kepe his foreward by his free assent, / He seyde, "Syn I shal bigynne the game, / What, welcome be the cut, a goddes name! / Now lat us ryde, and herkneth what I seye." / And with that word we ryden forth oure weye, / And he bigan with right a myrie cheere / His tale anon, and seyde as ye may heere...'

http://www.librarius.com/canttran/genpro/genpro824-860.htm


Almost definitely not. 4 quick dialects are AAVE, Indian English, Australian English, and, for a bit of localization, Iowan English. Mutually intelligible to someone who cares to pick up the slang, turns of phrase, and alternate grammars but clearly they are swiftly diverging.


I actually think NZ English is diverging the fastest. Something about being a small remote island nation that also has an indigenous language in regular use. Those guys come up with crazy new words all the time. What on earth is a Jandal?


Try Singlish (http://en.wikipedia.org/wiki/Singlish_vocabulary). Combined with their accent, it's a wonder if you understand anything at all when something is said with a lot of Singlish mixed in.


I live in Iowa. Is our dialect concrete enough to put it in a group with Indian English and Australian English?


I'm sorry, I can't quite understand what you're asking.

Could you repost it in Australian English?


Probably not, but it is distinct enough from the south, the northeast, the PNW, and the west coast. I felt the need to localize a bit more than "American accent."


What makes you think it's not?


I think English will be different, but intelligible with Google Translate's 2599 edition. 2015 English -> 2599 English should be a cakewalk.


It will be intelligible, but will it still be art? English lends itself well to all manner of subtext. When what we call English today becomes a dead language, will it still even have the same meaning? It's something I occasionally wonder about Shakespeare. It's full of subtext and innuendo, but are we comprehending that subtext and innuendo the way they did back then? Probably not. The definitions of the words and phrases used are too different, let alone the subtext, idioms, etc.


I think you've answered your own question; we still appreciate Shakespeare as art, even if we can't get as much from it as a contemporary listener would. But historians, etc. can help fill in the gaps (much like reading Greek plays, etc.).


> My first concrete step will be to eliminate variable information rewards from my computing life.

By posting your article to HN? I appreciated the article and found it interesting, but it seems odd that you're the one posting it given the sentiment behind the article.


Posting to HN is consistent with his plan, as long as he doesn't respond to comments. He has said he will be write-only. He can post tweets, hn articles, etc. However, he cannot read the responses and comments to his output more than once a week.


That's a pretty ridiculous stance to take. By posting here, he clearly thinks his thoughts are worth our time to read, but by going "write-only", he clearly thinks other peoples' thoughts aren't worth his time to read.

The whole exercise seems very conceited.


Remember how deep and cool we thought the guy from The Guy I Almost Was was when he decided to check out of techno-hip cybersociety, become a citizen of the past, and search for true meaning in old, or absent, technology?

Yeah, turns out he was just going full hipster. Which is just as bad in its own way.

Balance. In all things, balance. Not very profound to say. You can't make a manifesto out of it, because duh. But yeah. It's what works, I think.


> A computer will never live as long as The Trial. … What if Amerika was only written for 32-bit Power PC?

This line kind of got me, and made me think about the transience of my own work. I build educational software, and I entered this field because I was inspired by the educational games I played as a kid. Most of those games, developed for a specific operating system, can still be run today on a VM or an emulator of an older Windows machine. But the apps I work on are web applications - we're constantly racing to update and maintain them in a sea of ever-changing devices and standards. The odds of somebody being able to run my work, even just a few years from now, and have it work without issue are unfortunately kind of small (the introduction of iOS 8 already wreaked havoc on some of our layouts). And that makes me kind of sad.


I've been thinking about this problem lately as well. I think one solution might be to build your games for a well specified abstract machine. Then, you can just build an implementation of that machine for the web, and run the games on that.

The inspiration for the idea was SCUMM, and how we're able to play SCUMM games now with SCUMMVM.
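
To make the idea concrete, here is a minimal sketch of such an abstract machine (the opcodes and names are invented for illustration and have nothing to do with the real SCUMM instruction set): the "game" is plain data for a tiny, well-specified interpreter, and only the host bindings need to be rewritten per platform.

    // Toy "abstract machine" for illustration only; a real one (like SCUMM)
    // has a far richer instruction set. The game itself is just data that any
    // host able to interpret these opcodes can run.
    type Opcode = "PUSH" | "ADD" | "PRINT" | "HALT";
    type Instruction = { op: Opcode; arg?: number };

    function run(program: Instruction[], print: (msg: string) => void): void {
      const stack: number[] = [];
      for (const ins of program) {
        switch (ins.op) {
          case "PUSH": stack.push(ins.arg ?? 0); break;
          case "ADD": stack.push((stack.pop() ?? 0) + (stack.pop() ?? 0)); break;
          case "PRINT": print(String(stack.pop())); break;
          case "HALT": return;
        }
      }
    }

    // Only the `print` binding is platform-specific: console today,
    // a canvas or DOM renderer tomorrow, something else in 30 years.
    run(
      [{ op: "PUSH", arg: 2 }, { op: "PUSH", arg: 3 }, { op: "ADD" }, { op: "PRINT" }, { op: "HALT" }],
      (msg) => console.log(msg),
    );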


Oh precious irony of clicking on this in HN in a bored moment!

I think you'll do better if you find ways to make your clarifying restrictions be imposed upon you, rather than attaining them through willpower.

I do find 2-3 day email and internet fasts useful from time to time, but the internet is also a source of much inspiration.


So his plan is: no online news; only write to GitHub and Twitter, not read; only old, time-tested books. I don't think you can be a good coder like this.

A lot of improvements to software development are too new to be time tested and encoded into book form. For example, Node scales a lot better than Rails, and maybe there are books about that now and not just everyone hearing about companies that switched, but beginner Node programmers are going to get stuck in massive ugly callback pyramids unless they learn mitigation techniques like Promises.
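
For what it's worth, here is a minimal sketch of that mitigation (the file names and the two-read task are invented for illustration): the callback version nests, while wrapping the callback API in a Promise once lets the same steps read as a flat chain with a single error path.

    import { readFile } from "fs";

    // Callback style: each step nests inside the previous one (the "pyramid").
    readFile("a.txt", "utf8", (err, a) => {
      if (err) throw err;
      readFile("b.txt", "utf8", (err, b) => {
        if (err) throw err;
        console.log(a.length + b.length);
      });
    });

    // The usual mitigation: wrap the callback API in a Promise once...
    function readFileP(path: string): Promise<string> {
      return new Promise<string>((resolve, reject) =>
        readFile(path, "utf8", (err, data) => (err ? reject(err) : resolve(data)))
      );
    }

    // ...then the same steps become a flat chain with one error handler.
    readFileP("a.txt")
      .then((a) => readFileP("b.txt").then((b) => a.length + b.length))
      .then((total) => console.log(total))
      .catch(console.error);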

There's simply too many things a good coder should know that are picked up by reading other people's code. What if you didn't know about A* path finding and had a bunch of buggy, slow, piles of loops and if statements? Then you are doing shoddy work and you are charging your clients more for reinventing and debugging the wheel when you should have just looked up the algorithm.
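
As a concrete example of the kind of thing worth looking up rather than reinventing, here is a minimal A* sketch on a grid (illustrative only: it uses Manhattan distance and a linear scan of the open list where a real implementation would use a binary heap):

    // Minimal A* on a 4-connected grid (0 = open cell, 1 = wall).
    type Point = { x: number; y: number };

    function astar(grid: number[][], start: Point, goal: Point): Point[] | null {
      const key = (p: Point) => `${p.x},${p.y}`;
      // Manhattan distance heuristic, admissible on a 4-connected grid.
      const h = (p: Point) => Math.abs(p.x - goal.x) + Math.abs(p.y - goal.y);
      const open: Point[] = [start];
      const cameFrom = new Map<string, Point>();
      const g = new Map<string, number>([[key(start), 0]]);

      while (open.length > 0) {
        // Pick the open node with the lowest f = g + h (linear scan for brevity).
        let best = 0;
        for (let i = 1; i < open.length; i++) {
          const fi = (g.get(key(open[i])) ?? Infinity) + h(open[i]);
          const fb = (g.get(key(open[best])) ?? Infinity) + h(open[best]);
          if (fi < fb) best = i;
        }
        const current = open.splice(best, 1)[0];

        if (current.x === goal.x && current.y === goal.y) {
          // Reconstruct the path by walking the cameFrom links backwards.
          const path = [current];
          let k = key(current);
          while (cameFrom.has(k)) {
            path.unshift(cameFrom.get(k)!);
            k = key(path[0]);
          }
          return path;
        }

        for (const [dx, dy] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
          const n = { x: current.x + dx, y: current.y + dy };
          if (grid[n.y]?.[n.x] !== 0) continue; // off-grid or wall
          const tentative = (g.get(key(current)) ?? Infinity) + 1;
          if (tentative < (g.get(key(n)) ?? Infinity)) {
            cameFrom.set(key(n), current);
            g.set(key(n), tentative);
            if (!open.some((p) => p.x === n.x && p.y === n.y)) open.push(n);
          }
        }
      }
      return null; // no path exists
    }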

The other problem I see is that he wants to live in the city, unlike the people he cited who moved to cheaper less busy places like Walden Pond, but he is focusing more on art than work, so it will be tougher for him to make rent. Maybe Loop/Recur is paying him enough he has time for that, though, who knows. Many people in big, expensive cities like NYC have to struggle with multiple jobs.


It depends a lot on what you are doing and your innate abilities and determination. Knuth has captured some thoughts about staying connected via email very eloquently so I will share them here -

"Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don't have time for such study. "

- above excerpt from Knuth is taken from http://www-cs-faculty.stanford.edu/~uno/email.html


People ignoring old works is why we have Node and people trying to polish it with promises, rather than people just using Haskell or Erlang or something else with lightweight threads. Or at least something inspired by them.


One of my favorite insights I've heard is that the present came from just a tiny part of the past.


This post is a beautiful illustration of self-determination. Often on HN I read about the tyranny of employers, government and co-workers, and often I identify with the aggrieved poster. But begriffs' post is a welcome reminder for me that most of us already have everything we need to exercise what Locke (and I think begriffs) would call our "natural rights", except the will to do so. After reading this post I have less regard for my own complaints about 'self' unless accompanied by determination.


begriffs will create new content, but only read the work of others if it is decades/centuries old? That sort of delayed feedback loop is no way to participate in a community, or even in a discussion larger than comments on your own work.

The survivorship bias is a logical fallacy, not a prescription for choosing what to read!

Why disable images in your browser, when many concepts are best expressed with a diagram or photograph?

It's important to consciously limit your content consumption, but this is not the way to do it.


> That sort of delayed feedback loop is no way to participate in a community

Who says he's participating in any particular community?

> The survivorship bias is a logical fallacy, not a prescription for choosing what to read!

So you suggest reading works that didn't survive? How would you go about finding content that is by definition dead?

> Why disable images in your browser, when many concepts are best expressed with a diagram or photograph?

Yeah, you've got me there. I understand his viewpoint on this, but the execution is problematic. I'd wager the vast majority of images on websites have nothing to do with the content you're reading. Even images that are technically a part of the article you're reading frequently add nothing to the text. How do you filter the useful from the superfluous?


This article makes a great argument for not reading it.


There are things that last because they were done once and were done, mostly, right. Then there are things that have lasted because they were done once, mostly wrong, and still provided enough value at the time that they became popular among the group with the power to determine what would be the dominant solution before they could be refined.

The flip-side of permanence is path-dependency. So many programs get written for a certain architecture of chip and instruction set, for instance, that to change it to something more efficient would break too many things.

To make something that lasts because it is good, that lasts on its own virtues, would be a wonderful thing. Nonetheless, the next generation ought to be able to surpass us - and if they are not able to do so, we have done something wrong. Blanket approval of past solutions simply because they have lasted seems likely to lead one wrong.


I'd be surprised if this was the attitude of the people whose work he intends to read. I expect they were mostly very in touch with all the other art of their time.


If you find your problem is reading - I would argue you are reading the wrong things and not balancing your input with your output.

I agree that limiting your input might help improve your output. It could also harm your output, as you'd have fewer ideas coming in to feed inspiration.

Best of luck when you read this on Monday. ;)


This would be too hard for me to maintain, but I'm sure it will bring benefits for as long as it can be sustained. The content we expose ourselves to definitely affects our subconscious thoughts. However, I instead prefer to envelop myself in things I enjoy and attempt to recognize and improve the quality of things I enjoy over time. For example, reddit is a low-quality entertainment source. The more I recognize it the less I enjoy it, and the less time I spend on it. Progress depends on exposing myself to things I don't know how to enjoy and attempting to find ways to enjoy them. My preferences have shifted from consuming to creating over time. The adjustments required for my philosophy are much smaller and easier than for this 'write-only' philosophy.


This resonates in a certain sense, but honestly I could not help but find this post incredibly narcissistic.


Part of the narcissism comes in because he wants to broadcast, not communicate. I don't disagree that email can be a distraction, but only answering your email once a week makes communication impossible. Twitter would be useless (more than I already believe it is) if everyone decided to use it as a broadcast medium and never read anything. What's the point if there are no readers?

You can do broadcasting if you're Linus Torvalds; if not, you might need to check your email a little more frequently.


Yeah (though I don't agree with the Torvalds comment), I basically agree. “Write-only” means “I am the center of the universe, and the plebs should listen to what I have to say and write, but I will listen to nobody: and if you were smart, you would stop listening to me and only broadcast, but if you are not, then you should keep reading my blog.”

It really strikes me as another variation on the Carefully Curated Instagram Feed that contains a million pictures of some hustler douche skydiving or touring Africa, or something. Building engagement with the Personal Brand, etc. I mean, everybody kind of does it, but it's a new level of narcissism to write a blog post about how “I’m going to focus on #1 right now, which is me, and the fact that I am the only one who has anything worthwhile to say.”

If you really want to focus on knowing yourself and being reflective, that's great! But don't “broadcast”. Just be silent for God's sake.


I agree with the sentiment in this that contemporary media is distracting and oversaturated.

But when he says:

>The simple test of time does yield false negatives.

I wonder what makes him so sure it doesn't yield false positives. My own view on aesthetics is that they're highly personal - something "timeless" might not appeal to you at all. Timeless works are the ones which happen, overall, to appeal to the most people - it doesn't mean they necessarily are higher "quality." It also has to do with how accessible they are, how ubiquitous (/ marketed), etc.

(Sidenote, but his arguments against startup aesthetics could also apply to modernism and many other movements. It's about distilling something to its essence, removing superfluous details in favor of clarity. For all the faults "flat" design has, I think it is a noble cause even if it takes some missteps.)

There is a lot of contemporary work (programming, music, art, etc) that I find compelling, and personally I want to embrace it. Understanding the contemporary helps you understand the current moment of time, and where things are going. It lets you see larger trends and where you might want to fit in. I'm not very afraid of embracing something that turns out to be a false start - at least, at the moment, I was doing what I thought was best. Maybe this is a fallacy of youth.

I'm all for embracing the classics, the tried and true. But I think applying lessons learned from them to the unproven ideas is where things get interesting.


I've seen a lot of pieces lately, talking about the career of a programmer, or the state of the tech industry or what have you, written with rather a lack of perspective. Sorry, but even the work of those who are deeply fulfilled by it will not last as long as a Kafka novel. It's simply not possible, or else we would be drowning in the past. Even for writers, even for whatever occupation you think noblest. The blowing away of the past is what gives us a chance to build, and the blowing away of all we have is what will give that chance to future generations. I believe someone may have written a song about this at some point.

Of course, just because something's not going to happen doesn't mean we shouldn't strive for it anyway. But I am unsure that posting less frequently on Github will get the author any closer to that achievement. To me this attitude of needing one's work to survive signals an uneasiness with death that will permeate the rest of one's life, but then that would be quite a leap to make on the basis of a blog post.


>> "To me this attitude of needing one's work to survive signals an uneasiness with death"

I look at it slightly differently. It signals an acceptance of death being the end and that you should make what you do now really count.


It also speaks to a universal human desire to have a purpose in life, to make a lasting positive impact on the world.


A pretty good article; I agree with most of the points made. However, I think there are times when the creativity inside you needs some fuel, and looking around and browsing the internet is a very fast way of acquiring that knowledge. The key is to have a rule like 30/70. So if you write 70% of the time, you should be reading for 30% to charge your batteries.


If we create dependable, practical things that never break, few people will do anything more than depend upon them, and likely would never learn to create things themselves.

So then, if six or seven billion people learn, and then go on to create something that lasts, compounded across multiple generations, each with a delta of several billion, we're left with pollution.


Well, we have created a lot of dependable, practical things. I say something is dependable when you don't have to look for instances where it breaks. The hardware of your computer, the architecture of your house, the transport system, the water, electricity and food supply. Almost everything that matters is so dependable that it is possible not to learn to create anything at all.


I used to argue with the conclusions of the author, as presented to me by other people older and wiser than myself. But I had to learn from experience that I was wrong and the author is right. I thought I was smarter than everyone else who told me to moderate, and that you become what you put in your body, but I wasn't.


> To program any more was pointless. My programs would never live as long as [Kafka’s] The Trial.

If you wrote Cobol it might live just as long.

On a more serious note, your code might live in a billion devices and do useful stuff. Don't dismiss stuff that doesn't run on meatware. People might become obsolete faster than some of your code.


All pretty good. However the author didn't answer the question he posed at the beginning of the article. How does he plan to escape the "ephemerality" of computer programs? Seems from the footnote of his website that he plans to do so by "journeying from web ephemera to the timeless world of data." But is that really so fundamental a change? Or, actually, maybe programming isn't that "ephemeral" after all. It's just that people nowadays have the luxury of updating things at a much higher rate than ancient people, while still preserving(and improving upon) the core spirit. Languages, libraries and classical programs have been around for years. They're not really being overhauled/outdated as suggested. They're just being improved.


An inherent desire to make long lasting things is an aspect of craftsmanship. However, I think some people are driven by a need for external validation (N people use my creation!).

Much of my programming is a (hopefully great) meal that I make and eat myself.


If you bet on the right platform, code can live for a while.

For example: this classic paper on compression http://web.stanford.edu/class/ee398a/handouts/papers/WittenA...

was published with a C, Unix-based implementation (I put it here http://brenocon.com/WittenNealCleary-ArithmeticEncodin-cacm-...)

which compiles and runs fine (just lots of compiler warnings) on modern Linux and OS X, 18 years later.


> I will eliminate all use of the computer that is not directly related to creating things. If I’m not coding, writing, or editing videos then there will be literally nothing to do.

Err, but aren't I reading this on a computer? Did he not read inspiring things online, or watch videos on how to code or cook?

Consumption of others' output is not an intrinsically bad thing. It is balancing it with creation that matters. Very similarly to the academic's closed door, if you create without ever consuming you may be productive, but eventually you will be working on the wrong problem.


Oh he'll only use the command line twitter tool during his month of solitude and reflection, huh?

It just seems artificially constructed as a framework for doing... something, like this guy is waving his arms about, trying to find some deliberate way of being like Thoreau or Emerson.

It just reminds me a bit of when Sarah Silverman talked about hecklers -- how once, a woman simply shouted out, "I exist!", because that's what hecklers are trying to do -- show people they exist.

This guy just seems like he's trying to let other people know he exists.


Code might not live forever. So what? It's not supposed to.

I'd be disturbed if something better didn't eventually replace what we are currently doing. Evolution. The progressive succession of organisms. Unless you are a horseshoe crab, your species is probably just a flash in the pan and a rung on a ladder.

I consider myself less an architect and more a cook. The nutrition is provided and the meal enjoyed, but then people go on their way. The meal is gone, but the effects provided echo in time forever.


I did something similar in 2013, see my blog: http://peterevjan.com/posts/a-week-of-no-media-consumption/

In the end, it forced me to be more creative and social, but didn't produce any lasting change. I need to keep forcing myself to turn off internet etc to be able to get creative stuff done during evenings and weekends. Sad but true.


His plan is missing the inevitable step of, "Go crazy."

Seriously, as an extrovert I've found that "shut down all external stimuli" is a recipe for disaster.


It sounds like a pretty good plan if you're an introvert.

And if he needs human contact, he's not moving out to the middle of nowhere. He can go outside. Turn on the TV or get a paper for news. Go to a restaurant. Hook up with a friend for an evening. He's just turning off all the little social drip-feeds that live in the computer.

I've never done that for a month, but I've done it for a week or two now and then. It changed the way I think to not have Twitter to distract me on a bus ride or walk through the park or whatever.


Yes, strangely there were people with fulfilling lives before the internet came along.


Really? I thought it would be ideal for an extrovert. Instead of spending all your newfound downtime reading, you could instead spend all your newfound downtime interacting with real humans.


> These are repeated activities which occasionally – and unpredictably – give a pleasant surprise.

That's an interesting observation. But I feel it's not only distractions that work that way. Programming is a repetitive activity that occasionally, unpredictably gives you a pleasant surprise that makes you keep coming back to it. Learning too.


Somehow, I suspect that Knuth's "Art of Computer Programming" will survive.

I feel that algorithms, especially as presented in concise research papers, will survive, and continue to show beauty in the centuries to come, just as the first-principles of mathematics and physics do. :-)


Nice article but are you sure you want to end the article with a "Contact me" link?


> To program any more was pointless. My programs would never live as long as [Kafka’s] The Trial.

The Trial, had it been written today, wouldn't have survived long either. Its survival is a function of much less competition rather than quality, IMHO.


Excellent and thought-provoking post, very relevant to our industry and specifically to our times. I would personally add a daily period of meditation to further strengthen awareness and emotional tranquility.


Reminder: we live in a time where many university libraries have digitized classic books and made them available on archive.org and elsewhere. A bounty of pre-1920s history of human civilizations.


He should change his taste in modern music instead of listening to old old old music. Just because it is modern does not mean that it is agitated and aggressive.


His comparison of logos with the "free expression of a skilled painter" is honestly ridiculous. They are optimized for completely different things!


Only place I've ever seen the word "hamartia" (before I Googled it) was in a clue from a very, very hard crossword puzzle.


The hipster is strong in this one.


Why do some people here insist that programming/coding is an art form?

No, despite how much you'd like to be and see it happening, it is not and will not be.


I think people are using the word art when they really should be using creativity. Programming is a highly creative skill but not all creative things are art.


If you want to be an artist be an artist.

Being a programmer is not going to get you there, if that's what you need.

I don't really understand the mindset; I think that is because I am an engineer. Meaning by training, schooling, and occupation I did industrial and civil engineering for 10 years. I tend to take the term software engineering literally, well, more literally than most...

It's kind of the antithesis of being an artist to me. Still beauty in it.


Art and engineering to me are different paths to get to the same place. Art is very free-form, flow-based, and seemingly arbitrary but which contains deeper meaning and structure that often comes out subconsciously (when done well) while working toward a goal. Engineering is a methodical, reliable way to reach a goal. The end results may be the same - program that does something - but how they were reached varies.


> Still beauty in it.

As they say, beauty is in the eye of the beholder, but software development is an engineering discipline and won't be considered an artistic medium, not now, not ever.

I can understand the philosophical dimension of software engineering, but I am not interested in entertaining, even for a second, the concept of the art of writing software, because it's just insane.



