Note that all of the ways in which the worse, less-designed solution (Unix, C) is better boil down to being better from an adaptation/colonization/survival perspective. In other words, the rise of Unix/C is computer evolution in action.
Linus just recognizes that the same applies to Linux as well.
Absolutely, that was the first thing that came to my mind - this is *Worse is Better*, and taking a further leap, it's a result of *Release Early and Release Often*.
You're reading too much into it. The points just indicate that around 25 people didn't realize the article was from 2002 until they read the comment, so they upmodded it.
I probably didn't deserve 104 points for finding it, but it's like Whose Line Is It Anyway: the points don't matter. What matters is that you guys enjoyed the read.
I like to think that people vote on a comment based on the content, not on the (current) score. It is not particularly unlikely that twenty-four people thought:
"This is a worthwhile comment! It points out something that I did not realize, and it changes the meaning of the article to a degree. I may as well vote it up, so that it moves up the list and other people read it earlier than other comments."
Also remember that +1 isn't specified to be "Insightful" or "Informative" as on Slashdot; nor can you vote a half-good comment up by half a point. Instead, to achieve the same effect, a half-good comment must appeal to half as many people. This comment happened to appeal, on its own merits, to twenty-four people.
(Does this mean that the highest-voted comments might be the ones that appeal to the biggest percentage of the site, rather than the ones of the highest quality? Hmm. Does that mean that for the voting system to work well, a good percentage of users must vote up only comments that are very good?)
It was an oldie, but a goodie. Since I wasn't aware of it (I wonder how much other great info I'm missing), I suspected it was worth sharing. Sun's still kicking, though more from Java and web growth than from Solaris.
That's exactly right. On top of that, Sun had just lost 80% of its market capitalisation, and there was a general sense of doom in the tech industry back then.
But that's like saying that you know you're going to build a car with four wheels and headlights - it's true, but the real bitch is in the details.
Linux as a whole was not designed -- that is easy to accept, and so is the claim that this is a strength.
However, design is not as entirely opposed to evolution as might be inferred. Design seems essentially to have some iterative character too: it always entails some series of ideas, rejections, and further adjusted ideas. That is deliberately controlled, but it still has variation and selection -- so the possibility of a blend between design and evolution is very reasonable.
And given the wheel example, artificial design has something to offer too.
(I think Torvalds' view is consonant with this -- isn't he, overall, offering some combination of evolution and design?)
Before wheels, I'm pretty sure people used tree trunks to drag heavy objects from A to B. And I'm pretty sure the wheel's inventor didn't envision the average consumer car we have today either.
Linus's point is that design happens at the micro level, but at the macroscopic level the direction of a successful project should be set by trial and error with a feedback cycle.
To put things in perspective... Linux is not designed for servers, it is not designed for desktops, it is not designed for real-time systems, it is not designed for mobile phones. When people wanted to use Linux for such things, they took it and modified it to their will, and the parts that slowed them down were eventually removed or refactored.
That's actually very true if you look at the history of many successful projects. And that's evolution more than design. "Release early and often" is exactly about this.
design at the micro-level = hypothesis formulation
direction of successful project at macroscopic level = repeated testing and reformulation of hypothesis
in other words, project success is science well applied
Well, no, it isn't too good. But it does basically fit: it is a small feature of larger systems, as you describe, and it has been done better by artificial design.
Not challenging your point, just curious: do you know of any (non-human) animals that have harnessed fire? I don't mean in an "environmental" way, like looking for food along the edges of a burn, or having IR-sensitive vision; I mean actually collecting it, containing it, using it as a tool. Or is that a uniquely human trait?
To the best of my knowledge that is a uniquely human trait. And we not only harness it - we are dependent on it. Over time our digestive systems have adapted to require cooked food. (Every known culture cooks food. Theoretically you can live on uncooked food. However the few who try it have to eat constantly and tend to be skinny...)
What's wrong with being skinny? I can see the point of not having to eat constantly, but if skinny is one end of the spectrum and overweight is the other, I'd rather be skinny.
Nothing is wrong with being skinny. However if you live in a culture with virtually unlimited food available, are eating constantly, and can't add weight, then that suggests you aren't meeting your nutritional requirements.
The reason, of course, is that our guts are adapted to pre-processed food, and so are shorter. Therefore unless you get some cooked nutrition, it is very hard to get enough food. It is possible, some people do it, but it is hard.
As a skinny fellow, I tend to agree -- or rather, my metabolism agrees for me. Counter-point, though: consider the "obesity paradox". Statistically, the obese survive catastrophic illness (cancer, etc.) at markedly higher rates than folks with a normal-ish BMI, let alone the scrawny ones. It's probably a less "selective" trait (in the evolutionary sense) than it might be, since cancer tends to be an ailment of at least somewhat advanced age, by which time most people who are going to breed already have.
Saying "wheel!" is crude and invites a simple counter. But the point is there is a simple, important thing where artifice has a substantial lead.
And without a well-chosen application of design, large-scale evolution might be practically excluded from certain areas.
(...that was really a side issue. The main point was the procedural similarity of design and evolution -- they are not so much opposed as differently balanced.)
Wheels are quite impractical. Their only advantage is that they require relatively little energy over long distances. However, why not reduce the need for long distances, or not create it in the first place?
"Take TCP for example. The TCP protocol is specified in a series of
documents. If you make a formally correct implementation of the base TCP RFC
you won't even make connections. Much of the flow control behaviour, the
queueing and the detail is learned only by being directly part of the
TCP implementing community. You can read all the scientific papers you
like, it will not make you a good TCP implementor."
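To make the quoted point concrete, here is a minimal sketch in Python (my own illustration, not from the post): even a trivial user-space client ends up toggling behaviour -- Nagle's algorithm (RFC 896), keepalive probes -- that was layered on after the base spec, while the kernel underneath runs congestion-control logic the base RFC never mentions.

    import socket

    # A formally minimal client per the base TCP RFC would just connect
    # and send. Real programs routinely reach past it:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

    # Disable Nagle's algorithm (RFC 896) -- flow behaviour added after
    # the original spec, and enabled by default in every real stack.
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

    # Enable keepalive probes -- another accumulated convention, off by
    # default and tuned differently on every OS.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

    sock.connect(("example.com", 80))
    sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    print(sock.recv(1024))
    sock.close()

And that is only what's visible from user space; the slow start, retransmission and queueing behaviour the quote alludes to lives entirely inside the kernel, invisible to someone who has only read the papers.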
Well, firstly, that post was written eight years ago. Secondly, OS X might surpass it on desktops, but what about servers, computing clusters and so on?
I'm not here to spar. I was just using the generally accepted definition of the term. Defining a specific market segment is, by nature, adding a condition or limitation that provides necessary context for certain statements: Nature Valley Peanut Butter Granola Bars are the most widely deployed snack... in my cubicle.
I think you misunderstand the OP then - it wasn't a limitation, it was an expansion.
Edit: I guess the OP wasn't that specific, but this seems to be the right way to look at things - all deployed installations of Linux, regardless of application.
Even if a commercial version (like OS X) surpasses it in the future, it will contain parts from modern Linux distributions. And its lead wouldn't last too long anyway.
What I like about Linux is that it is getting better and better. And it also is the most versatile OS available.
I first used Linux in 1997, via a Red Hat CD included in a magazine I read. The user experience was just awful, while today's Ubuntu Linux is almost as good as Windows and OS X.
The kernel has also seen major improvements, especially in stability and drivers. My wireless stick, my sound card, my video card... these all work out of the box on Ubuntu.
The communities around Linux may move more slowly, and with fewer surprises, but they do move forward constantly. In contrast, Microsoft may have the resources to pull off a Windows XP, but every once in a while bad things happen... like Windows ME, Vista (after 5 or 6 years of waiting), a frozen Internet Explorer, etc.
One day a distribution like Ubuntu or openSUSE will be as good as Windows or OS X. When that happens, you won't be able to beat its price or the freedoms that come with it, even if people don't like changing their environment. This is exactly what's happening with Firefox and other successful open-source projects out there... once they reach critical mass, they are unstoppable.
I had a nice experience last night. I was one of several presenters at a community council meeting, and my netbook - an Acer Aspire One with Ubuntu 9.04 - was the designated AV device.
I was worried and showed up early, but getting my display projected onto the big screen was as simple as enabling the projector as a second monitor and mirroring it with my display. (Also, two PowerPoint shows with animations and so on worked seamlessly on the OpenOffice PowerPoint clone.)
In contrast, I've seen plenty of Windows XP and Mac OSX laptops totally choke at working with a projector.
Claiming that Windows and Mac laptops have more trouble working with a projector than Linux is bold indeed, and I think very misleading.
Strangely, in all the time I've been to talks, I've never seen a single Windows or Mac laptop have any trouble with a projector. In contrast, I've seen a few Linux machines have trouble to the point where the presenter gave up and used a different computer.
Granted, that was a few years ago, and it's nice to hear that Ubuntu has improved so much recently. I remember that just three or so years ago, getting a second monitor enabled would often require quite a lot of effort on Linux.
> Claiming that windows and mac laptops have more trouble working with a projector than Linux...
But that's not what I claimed. I relayed a happy anecdote about a positive experience with Linux - and I did so a) in the context of a parent comment that argued Linux is getting better, and b) on the assumption that most HN readers understand that the plural of anecdote is not data.
Yes, getting a second monitor working used to require playing with the xrandr command. It's not that hard, but it's definitely not for normal users.
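For reference, this is roughly what that playing looked like -- a minimal sketch, with the caveat that the output names (LVDS1 for the laptop panel, VGA1 for the projector port) are examples that vary by machine and driver:

    # List the outputs X knows about and the modes each one supports.
    xrandr

    # Mirror the laptop panel onto the projector at a mode both support.
    xrandr --output VGA1 --mode 1024x768 --same-as LVDS1

Desktop display dialogs now drive the same XRandR machinery behind a GUI, which is presumably what made the netbook story above so painless.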
The situation is thankfully improving though, and I'm happy for that. It's nowhere near ready for consumers (as much as some people like to claim), but it is getting there, and it has a really good and steady improvement rate.
The parent was saying that Linux is the most widely deployed Unix, and that OSX may overcome Linux as the most widely deployed Unix. This is merely a post of clarification, not support :)
Reminds me of yesterday's front-page submission on poor use of analogies. Not saying this is one, but I can't help seeing the connection... oh god! Maybe this is a bad analogy. Will it never end!!
That which works well beats that which looks good.
Take quantum mechanics, for example: when Dirac was asked "What's the answer to the measurement problem?", his response was that quantum mechanics is a provisional theory.