
Not true.

Let's say you take a mega-program and break it into 100 programs.

Each now becomes more stable, due to simplicity. Yes, it really does.

But my point is, when you use those individual things, you use a random allotment of 3 or 4 of them. Not all 100.

Thus, the stringing-together bit is far, far less complex.




> Thus, the stringing-together bit is far, far less complex.

This is the statement I disagree with the most.

The problem is that combining different programs that weren't written to interoperate in a specific way gives you an order-of-magnitude increase in complexity. Sure, running find and grep together is common enough, but how about piping {obscure_command_rarely_used} through {other_command} through {other_other_command}? There are always corner cases that can crop up, and even before we get to corner cases, just understanding how to fit the pieces together correctly is not so simple.
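Even the common find-and-grep pairing has such a corner case: a filename containing a space breaks the naive pipeline, and the fix requires knowing about NUL-delimited records. A minimal sketch (the scratch directory and filename are invented for the demo):

```shell
#!/bin/sh
# Create a file whose name contains a space.
dir=$(mktemp -d)
printf 'needle\n' > "$dir/report 2024.txt"

# Naive: xargs splits on whitespace, so grep is handed two bogus paths.
find "$dir" -name '*.txt' | xargs grep -l needle 2>/dev/null \
  || echo 'naive pipeline failed'

# Robust: NUL-delimited records carry the name through intact.
find "$dir" -name '*.txt' -print0 | xargs -0 grep -l needle

rm -r "$dir"
```

The `-print0`/`-0` handshake only works because both tools happen to agree on that convention; most tool pairs have no such escape hatch.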

One cohesive program designed to work together as a whole sometimes just makes more sense, especially when we're talking about complicated, high-level actions, as opposed to the more common low-level actions of Unix commands. (Or maybe "high level" and "low level" is the wrong way to put it: the lower-abstraction stuff in Word vs. the higher-abstraction stuff that Unix tools deal with.)


> One cohesive program designed to all work together sometimes just makes more sense

Yes, your example of find|grep is perfect for this. Sometimes (not often) I hit a situation where a simple pipe combination doesn't work and I have to fall back to awk, which is the single cohesive (and more complicated) program.
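For instance (toy data, made up here): filtering and projecting fields is natural pipe territory, but as soon as the task needs state across lines, such as a running sum, the pipe combination gives way to one cohesive awk program:

```shell
#!/bin/sh
# Pipe combination: filter lines, then project the second field.
printf 'alice 3\nbob 5\nalice 2\n' | grep '^alice' | cut -d' ' -f2

# A running sum needs state across lines, which grep and cut cannot
# express, so one awk program does both the filtering and the math.
printf 'alice 3\nbob 5\nalice 2\n' \
  | awk '$1 == "alice" { sum += $2 } END { print sum }'
```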


This trivially becomes very fragile. Each of those 100 programs now doesn't just do stuff, but is held to a contract. A contract that quite possibly nobody intended to provide, but which emerged organically anyway.

And so you find that a small change somewhere like deprecating an option, fixing a typo, or adding extra data to the output makes the integration explode horribly in a confusing fashion.

Then there's the fact that plain text is a horrible serialization mechanism. Every program has to parse text that's mostly made for human consumption, and deal with things like imprecise field boundaries and arbitrary limitations per tool and per system.

E.g., /etc/passwd is a nice simple format, until you consider that somebody might want to use a ":" in a field, or to add more fields, or to store multiple lines of data. And all those quirks are specific to that particular file and may be handled subtly differently by other parts of the system.
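To illustrate with a made-up passwd-style line (real /etc/passwd simply forbids ":" in fields, which is exactly the arbitrary limitation in question): if a field did ever contain the delimiter, every positional parse downstream silently shifts:

```shell
#!/bin/sh
# Hypothetical record whose comment (GECOS) field contains a ':'.
line='app:x:1001:1001:Service: backend:/srv/app:/bin/sh'

# Field 7 should be the login shell, but the extra ':' shifts everything.
echo "$line" | cut -d: -f7               # prints /srv/app, not /bin/sh

# Taking the last field happens to recover the shell here, but only
# because the shell path itself contains no ':'.
echo "$line" | awk -F: '{ print $NF }'   # prints /bin/sh
```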

And internationalization is an extra bit of fun, ensuring random bizarre bugs where a tool uses ',' instead of '.' as the decimal separator, or just emits its text in some language other than English.
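A quick sketch of the locale trap. The C-locale output is predictable; the German-locale line is an assumption about the host (de_DE.UTF-8 may not be installed), so its output isn't asserted:

```shell
#!/bin/sh
# Under the C locale the decimal separator is '.', predictably.
LC_ALL=C printf '%.2f\n' 3.5                 # 3.50

# Under a German locale the same call emits ',' as the separator, and
# may even mis-parse the '.' in its own argument. Behavior depends on
# whether de_DE.UTF-8 exists on this system.
LC_ALL=de_DE.UTF-8 printf '%.2f\n' 3.5 2>/dev/null || true
```

Any downstream tool that expects to see "3.50" quietly breaks on the comma variant.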


The object model of PowerShell is, in my opinion, really great for at least mitigating the text-munging issue. The other problems are still there, though.


Indeed, I think PowerShell deserves a lot of credit for taking a new look at this idea and doing it better.


If this were true, non-technical people would be stringing these tools together instead of using products that were created with the end user in mind.


Figuring out which of the 100 I use and how to get them to work together is much more complicated than finding the feature in an omnibus program that already does exactly what I want.



