The Rise of Worse Is Better (1991) (dreamsongs.com)
segfaultbuserr 1616 days ago [-]
This article was originally written with a critical tone toward C & Unix: it said they have a lot of problems, but with 50% of the functionality, a lot of people accepted them as good enough, and they spread like "the ultimate computer viruses". After its publication, many Unix people reinterpreted it as a celebration of Unix's successful approach.

I have always strongly disliked JavaScript for its inconsistencies and ugly syntax. I found the tendency for everything to now come with a JavaScript engine and force development in JavaScript, regardless of the native programming language, unfortunate, and I always dreamed of the alternative timeline in which Scheme became the language of the web.

But I've changed my position in recent years because of this article. Now I think the same principle applies to JavaScript. It was something with 50% of the functionality, and so it became the ultimate virus on the Web, and since then a huge amount of manpower has been spent on improving it. As a result, it still has the same inconsistencies from its original design, but thanks to all that effort, its overall usability is actually higher than that of alternative systems with a "cleaner" / "better" design, just as a Unix-like OS such as FreeBSD or Linux is still one of the most usable systems in existence. So I think I'd just accept JavaScript...

jstimpfle 1615 days ago [-]
> After its publication, many Unix people reinterpreted it as a celebration of Unix's successful approach.

The other comment below links this: https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf

Look at book page 219 (224 in my PDF viewer) and read around the section that starts with "It is far better to have an underfeatured product that is rock solid ..."

It's probably not a "reinterpretation by the Unix people". More like, the author did too good a job of not taking sides explicitly, so everyone just interpreted it the way they liked. If anything, the author argued why "worse is better" really is better. Taking the "worse" approach just means not expending all the unnecessary effort, which results in a product that actually exists.

In other words, it's an essay about how big bang approaches don't work out.

akkartik 1616 days ago [-]
I recently discovered my favorite summary of "Worse is Better". It's by the author, but it isn't anywhere in articles by that name.

“It is far better to have an under-featured product that is rock solid, fast, and small than one that covers what an expert would consider the complete requirements.”

https://www.dreamsongs.com/Files/PatternsOfSoftware.pdf (pg 219)

ChrisSD 1616 days ago [-]
Hm... when I think "worse is better" I'm not thinking of software that is "rock solid, fast, and small". Have I been misunderstanding the essay?

I thought most of these applications start out small, but they're nowhere near rock solid and haven't been optimised for speed. They just get the essential features into the hands of the people who need them and work just well enough to be useful.

jstimpfle 1616 days ago [-]
Well, at least 70% of the HN crowd has been misunderstanding it.

The article has an anecdote in it where the "better" guy has grandiose ideas about wanting to make a perfect system that always does the right thing, even at a huge increase in implementation complexity.

In that specific case, not having to restart system calls after an interrupt. I've always thought the "worse" guy made the right choice by shifting the complexity out of the core, because we should deal with each problem in the place where it's most natural. There's nothing wrong with requiring users to make and use (or only use) a wrapper that deals with that complexity in one specific way that is right in the given situation.
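
To illustrate (a minimal sketch, not from the essay): on Unix this shows up as the classic EINTR retry loop, a wrapper the caller writes once around a call like read() instead of the kernel guaranteeing restartability:

    #include <errno.h>
    #include <unistd.h>

    /* Retry read() if a signal interrupted it (EINTR).
       This is the complexity the Unix approach shifts out of
       the kernel and onto the caller. */
    static ssize_t read_retry(int fd, void *buf, size_t count) {
        ssize_t n;
        do {
            n = read(fd, buf, count);
        } while (n == -1 && errno == EINTR);
        return n;
    }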

akkartik 1616 days ago [-]
I was surprised as well! But this is from the author so one can't argue with it.

As I think about it more, it makes a lot of sense. The only people willing to put up with something flaky are programmers. For most people, it can do little but it has to be reliable.

ChrisSD 1616 days ago [-]
Also by the author (under a pseudonym), Worse is Better is Worse: http://dreamsongs.com/Files/worse-is-worse.pdf

> I’ve always told him that his penchant for trying to argue any side of a point would get him in trouble.

sampo 1616 days ago [-]
> Hm... when I think "worse is better" I'm not thinking of software that is "rock solid, fast, and small". Have I been misunderstanding the essay?

Have you read the four bullet points of the worse-is-better philosophy at the beginning of the essay?

ChrisSD 1616 days ago [-]
Yes. Simplicity of implementation is given priority over all other factors. This does not, to my mind, mean "rock solid" software (even if care is taken for "observable aspects" of correctness) unless we have very different definitions for the term.

It could be fast by virtue of having a simple implementation but that isn't a given. Sometimes getting speed is complicated, especially on modern processors.

MS90 1616 days ago [-]
I'm with you here. "Rock solid, fast, and small" is a better example of "less is more".

"Worse is better" implies lowered quality.

jstimpfle 1616 days ago [-]
I understand "worse" as tongue-in-cheek. We also often see "stupid", "dead", "dumb", and similar things, as desirable qualities.
kragen 1616 days ago [-]
You are correct. The PCLSR facility given as the example of the MIT approach is faster and more solid than the Unix approach.
UI_at_80x24 1616 days ago [-]
Which sounds like the "Unix Philosophy"

>Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".[0]

[0] https://en.wikipedia.org/wiki/Unix_philosophy

mjw1007 1616 days ago [-]
I think the central point here is "completeness must be sacrificed whenever implementation simplicity is jeopardized".

Indeed, sometimes omitting a feature which really ought to be there makes life easier for the end user, not just the implementor, even though the user has a good reason to want the feature.

For example, Subversion allows versioning empty directories while Git (like CVS) doesn't.

On the face of it this is just a deficiency in git, but the fallout from Subversion doing the Right Thing is quite extensive: because Subversion treats directories as first-class objects rather than just part of a file's name, you can get a Subversion repo into all kinds of strange and confusing states.

With git the user is never going to be confused by the result of something like "I deleted the directory then tried to merge a branch which added a file inside it".

gerbilly 1616 days ago [-]
My main criticism of the "worse" development approach (rush out a simple MVP, then iterate) is that it only works for simple problems.

Not all problems can be solved by incremental refinement.

ChainOfFools 1616 days ago [-]
Yes, this article's philosophical summary could more accurately, if less memorably, be written as "strategies that prioritize short-term objectives tend to outcompete those that prioritize long-term objectives".
goto11 1616 days ago [-]
Wasn't Linux built with an incremental approach? That seems to have been pretty successful.
gerbilly 1616 days ago [-]
When they started to build the Linux kernel, they had a few great reference implementations of UNIX to base their design on: BSD and SysV UNIX.

Both were available as source, and both had technical books describing their system architecture.

Linux has even been criticized for being a fairly conservative reimplementation of traditional UNIX.

goto11 1612 days ago [-]
That makes it even more incremental, right?
workthrowaway 1616 days ago [-]
That's interesting. What are some example problems that incremental refinement isn't able to solve?
wry_discontent 1616 days ago [-]
The flaws of incremental refinement should be obvious to anybody who's worked in a codebase more than 5 years old built on that approach. Maybe it could work in theory, but in practice, the iterations on crap produce more crap.

As for a specific example, I remember a discussion with Uncle Bob where he specifically mentioned banking and accounting as systems that shouldn't use that approach, because you'll build the wrong thing.

AnimalMuppet 1616 days ago [-]
And yet the conventional wisdom is that "a complicated system that works is almost always found to have been derived from a simple system that works".

But you're right, evolving code often turns it into a mess. The only way that doesn't is if, at each stage, the people working on it keep the architecture and code clean. That takes discipline, not just by the programmers, but also by management - they have to give the programmers time to do the cleanup that is needed, not just time to shoehorn something in.

einpoklum 1615 days ago [-]
Building a skyscraper.

If you build a one-story house, then gradually do what it takes to build another floor on top of it, you'll either never have a tall building, or you'll end up with something like a pyramid or huge angular support pylons (i.e., an extremely wide base).

gerbilly 1616 days ago [-]
Incremental refinement is like implementing a local search algorithm.

You may end up getting stuck on a local maximum in the software design space.

To optimize and really find the global maximum often requires backtracking, which in the case of software development might mean throwing out large portions of the codebase and starting over.

einpoklum 1615 days ago [-]
Actually, backtracking is still kind of incremental.
scottlocklin 1616 days ago [-]
Programming language or database design.
Ididntdothis 1616 days ago [-]
Building a rocket or an airplane? There is a lot of work that needs to be done before you can start working incrementally.
Isamu 1616 days ago [-]
The title itself lets you know the desire of the author to occupy some philosophical high ground, while admitting some hard truths.

They could have asked more honestly, "why is C/Unix winning hearts and minds while Lisp-based systems are not?", but first they wanted to present it as a given that C/Unix was clearly inferior to Lisp-based systems. That was not up for debate.

So that sets the bounds for the discussion that follows, and frames the discussion as "why are people choosing the clearly inferior over the clearly superior?"

You could write a similar essay from the point of view of what you might call the "original intent" of C/Unix, which is that simplicity is chronically undervalued and that everybody, everywhere, all the time, tries to add "just one more feature" to make it better.

jonjacky 1612 days ago [-]
That essay has already been written: Rob Pike's "UNIX Style, or cat -v Considered Harmful" [1]. It inspired a web site and project devoted to simplifying Unix [2].

[1] http://harmful.cat-v.org/cat-v/

[2] http://cat-v.org/

ChrisSD 1616 days ago [-]
> C is therefore a language for which it is easy to write a decent compiler, and it requires the programmer to write text that is easy for the compiler to interpret.

Ironically this is no longer true. Not so much because C has changed but because the hardware underneath it has.

chrchang523 1615 days ago [-]
I would instead state that the standard for what constitutes a "decent compiler" has risen. For any given performance target, I'd still bet that it takes less work to hit it with a C compiler than with a compiler for most newer languages. (Partly because the success of C has resulted in C-related hardware design constraints...)
wayoutthere 1616 days ago [-]
Worse isn't better, simple is better. Simple is much easier to adapt to complex use cases because it can be easily understood at a high level. Simple avoids bikeshedding, which Lisp developers (the intended audience of this article) are notorious for.
DannyB2 1615 days ago [-]
I remember reading this when it was first published, I think maybe in AI Magazine.

Being a huge Common Lisp fan at the time, I immediately adopted the idea that Correctness is the most important single thing above all else. I don't care what's in the box. The external interfaces on the box should be correct.

This seems like a basic thing we take for granted in tools, libraries, languages and other things we use. Dishwashers. Thermostats.

tboyd47 1616 days ago [-]
There's also the hiding-in-plain-sight explanation that C was just easier than Lisp for English-speaking people to learn and use because, like English, it's SVO instead of VSO. Then object-oriented languages overtook C for the same reason.
chooseaname 1616 days ago [-]
It's like literary fiction vs popular fiction. One might think literary fiction is the right way to do fiction, but popular fiction is where the money is.
AnimalMuppet 1616 days ago [-]
But define "better". Literary fiction is "better" in the sense of "communicating profound ideas better". Popular fiction is "better" in the sense of "being something that people want to read".
ijiiijji1 1616 days ago [-]
Unix and C are the ultimate computer viruses.

Javascript: Hold my beer.
