July 17, 2006

The UNIX quote

"I started keeping a list of these annoyances but it got too long and depressing so I just learned to live with them again. We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy."

Rob Pike

He's right and he's wrong. It seems entirely likely to me that we'll find, further down the road, that software works much like genetic development in nature. Nature never throws out old designs. In fact, most of our basic human design is the same as the basic design of fish and plants and bacteria, and it hasn't changed in billions of years. However, the interest, the competitive edge, moves away from the old designs once they win and onto greater things. So I'm not sure we'll ever have new file systems or new anything, really. I find it entirely likely that inside the massively parallel billion-CPU-core machine of 2050 we'll find a million Linux 2.6 cores with ext3 filesystems...
I think we can already see this as OSes get commoditized and the interest moves from scaling up to scaling out. Scaling out is a developer's way of saying "I'm not going to fix the I/O DNA or the process DNA of computing, I'll just add sophistication on top".
The only real reason this isn't truly plausible on a 200-year scale is energy consumption. It's quite possible that in a truly parallelized world we'd much rather have a far simpler operating system that runs on much less power, but is robust and distributable.

[UPDATE: I should have read the whole thing, and at least known a minimum about Plan 9 - which answers some of the questions. But the failure of Plan 9 to catch on underscores the point, and it's clear from the interview that Pike is aware of this.]

The question that then comes to mind: suppose we wanted to build the multi-concurrent, internet-ready super machine of the future, programmed entirely in a fantastic functional language able to hide complexity and concurrency in an efficient way - what would we keep around?
Some ideas on design points:


  • Software will run on millions of mutually sandboxed cores. Cores are perishable and automatically restartable. Cores are simply glorified processes.
  • Cores maintain a distinction between interior and exterior and police their communication surface (think cells).
  • Cores are hardware independent; all software on a core relocates effortlessly to other cores.
  • There is no "shared storage"; there are only the cores. The communication substrate between the cores is the only shared medium, and it has no state.
  • Any idea of privilege or trust other than sandboxes is just unmaintainable. The idea that we'll be running software that is hundreds of times more complex than what we have today (or running the same software on data scaled hundreds of times, which is really the same thing) and still be able to think consciously about trust is probably not sound.
  • The coordination mechanisms between software can't come from a high enough level of abstraction.
    What that means is that any kind of coordination protocol or mechanism that is "bottom up" is really not useful. An example would be implementing component coordination within a sandbox but not supporting coordination between the sandboxes from above.
    What I'm thinking of here is once again the security and privilege mechanisms, but also something that might just be a pipe dream: the scripted ability to control any resource on any reachable machine - with sandboxing and privacy of course, but still. The point about coming from above, not below, is that I shouldn't have to drop down to a lower substrate to accomplish my connectivity goal; it should just be a standard operating assumption about any layer that it naturally distributes and shares.
  • Unreliability is the norm, not the exception. I mean this in terms of hardware failure, software bugs, and malware alike. As the world becomes more and more complex, there's just no way we will remain in conscious control of the quality of our systems. At best we can do some double computation and fact checking and that kind of thing (see the sketch after this list).
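
To make this a little more concrete, here is a minimal sketch in Haskell (picked only because the question above asks for a functional language). Everything in it - the Msg type, the one-channel-per-core layout, the superviseCore and redundantCompute functions - is my own illustrative assumption, nothing from Pike's interview: each "core" is an isolated worker that owns its own state, talks to the world only through message channels, gets restarted by a supervisor when it dies, and has its answers cross-checked against other cores.

import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)
import Control.Exception (SomeException, try)
import Data.List (sortOn)
import qualified Data.Map.Strict as Map

-- The communication substrate: cores share nothing but message channels.
data Msg = Compute Int (Chan Int)   -- a request plus a reply channel

-- A "core": a glorified process that polices its own surface. It never
-- shares memory; it only reads requests and writes replies.
core :: Chan Msg -> IO ()
core inbox = do
  Compute n reply <- readChan inbox
  writeChan reply (n * n)           -- stand-in for real work
  core inbox

-- Cores are perishable and automatically restartable: if one dies,
-- the supervisor simply starts a fresh one on the same inbox.
superviseCore :: Chan Msg -> IO ()
superviseCore inbox = do
  result <- try (core inbox) :: IO (Either SomeException ())
  case result of
    Left _  -> superviseCore inbox
    Right _ -> return ()

-- "Double computation and fact checking": send the same request to
-- several cores and keep the majority answer.
redundantCompute :: [Chan Msg] -> Int -> IO Int
redundantCompute inboxes n = do
  replies <- mapM ask inboxes
  let votes = Map.toList (Map.fromListWith (+) [(r, 1 :: Int) | r <- replies])
  return (fst (last (sortOn snd votes)))
  where
    ask inbox = do
      reply <- newChan
      writeChan inbox (Compute n reply)
      readChan reply

main :: IO ()
main = do
  inboxes <- mapM (const newChan) [1 .. 3 :: Int]   -- three sandboxed cores
  mapM_ (forkIO . superviseCore) inboxes
  answer <- redundantCompute inboxes 12
  print answer                                      -- 144, by majority vote

In a real system the channels would be a network substrate and the voting would be far more involved, but the shape is the point: no shared storage, perishable cores, and supervision and redundancy imposed from above rather than bolted on below.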

(I think I need to start a blog specifically for spaced out posts)

Posted by Claus at July 17, 2006 2:31 AM