Frankly, coming from a background in 1980s and 1990s OSes, I think modern ones are appalling shite. They're huge, baggy, flabby sacks of crap that drag themselves around leaving a trail of slime and viscera - but like some blasphemous shoggoth, they have organs to spare, and the computers they run on are so powerful and have so much storage that these disgusting shambling zombie Frankenstein's-monster things, stitched together from bits of the dead, dropping eyeballs and fingers as they go, actually manage to work for weeks on end.
On the server, no problem, run hundreds of instances of them, so when they implode, spawn another.
It's crap. It's all terrible, blatantly obvious utter crap, but there's almost nobody left who remembers any other way. I barely do, from old accounts, & I'm near 50.
We have layers of sticking-plaster and bandages over kernels that are hugely polished turds, moulded into elegant shapes. These are braindead but have modules for every conceivable function and so can run on almost anything and do almost anything, so long as you don't mind throwing gigabytes and gigahertz at the problem.
And those shiny turds are written in braindead crap languages, designed for semi-competent poseurs to show off their manliness by juggling chainsaws: pointless Byzantine wank like pointer arithmetic, with no real string type, no array bounds-checking, no operator overloading. Any language that even allows the possibility of a buffer or stack overflow is hopelessly broken and should be instantly discarded. The mere idea of a portable assembly language is a vestige of the days when RAM was rationed and programmers needed to twiddle bits directly; it should have been history before the first machine with more than a megabyte of RAM per user was sold.
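To put that in concrete terms, here's a trivial sketch, in Python purely because it's to hand, of what two of those missing basics look like in a language that does have them: a real string type, and sequences that refuse to be indexed off the end rather than silently trampling whatever sits next in memory.

    # Strings are a first-class type; sequences are bounds-checked.
    name = "Ada" + " " + "Lovelace"      # no char buffers, no strcpy
    items = [10, 20, 30]

    print(name.upper())                  # ADA LOVELACE
    print(items[2])                      # 30

    try:
        print(items[99])                 # way past the end of the list
    except IndexError as err:
        print("caught:", err)            # caught: list index out of range

The overflow becomes a visible, catchable error instead of quietly corrupting the machine. That's the whole point.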
Computers should be bicycles for the mind. They should take our existing mental tools and give them leverage, mechanical advantage, so that we can do more.
We work in patterns, in sets, in rich symbols; it is how we think and how we communicate. That, then, should be the native language our computers aim for: the logic of entities and sets of entities, that is, atoms and lists, not allocated blocks of machine storage. Storage is an implementation detail; it should be out of sight, and if it's visible, your design is faulty. If you routinely need to reach down and touch it, your design is not even wrong.
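If you want that made concrete, here's a toy sketch in Python; the data and the little find helper are mine, invented for the example. The point is that the stuff you care about is symbolic structure you build and rearrange directly, with no sizes, no addresses, no allocation anywhere in sight.

    # A symbolic structure: atoms (strings, numbers) and lists, nothing else.
    invoice = ["invoice",
               ["customer", "Alice"],
               ["lines",
                ["item", "widget", 3],
                ["item", "gadget", 1]]]

    def find(tag, tree):
        """Return the first sub-list whose head is `tag`, searching depth-first."""
        if isinstance(tree, list):
            if tree and tree[0] == tag:
                return tree
            for branch in tree:
                found = find(tag, branch)
                if found is not None:
                    return found
        return None

    print(find("customer", invoice))     # ['customer', 'Alice']
    print(find("lines", invoice)[1:])    # the item lines, as plain lists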
By the late '50s we had a low-level programming language that could handle this. It's unreadable, but it was only meant to be the low level; we just never got the higher-level wrapper to make it readable to mortals. The gods themselves can work in it; to lesser beings, it's all parens.
Now, we have a rich choice of higher-level wrappers to make it all nice and easy and pretty. Really very pretty.
And later, people built machines specifically to run that language, whose processors understood its primitives.
But they lost out. CPUs were expensive, memory was expensive, so instead OSes grew simpler; Unix replaced Multics, and CPUs grew simpler too, to just do what these simple OSes written in simple languages did. The result: these simple, stripped-down machines and OSes were way more cost-effective, and they won. The complex machines died out.
Then the simpler machines - which were still quite big and expensive - were stripped down even more, to make really cheap, rudimentary 4-bit CPUs for calculators, ones that fitted on one chip. They sold like hotcakes, and were developed and refined, from 4-bit to 8-bit, from primitive 8-bit to better 8-bit, with its own de facto standard OS which was a dramatically simpler version of a simple, obsolete OS for 16-bit minicomputers. And that chip begat a clunky segmented 8/16-bit one, and that a clunky segmented 16-bit one, and that a bizarre half-crippled 32-bit one that could emulate lots of the 8/16-bit one in hardware, FFS. And that redefined the computer industry, and it was nearly two decades until we got something slightly better, a somewhat-improved version of the same old same old.
And that's where we are now. The world runs on huge, vastly complex, scaled-up, go-faster versions of a simplified-to-the-maximum-extent-possible calculator chip. These chips grew out of a project to scale down simple, dumb, brain-dead chips built to be cheap-but-quick because the proper ones, which people actually liked, were too expensive 40 years ago. Of course, now, the descendants of those simplified chips are vastly more complex than the big expensive ones their ancestors killed off.
And what do we run on them? Two OSes. One is the descendant of a quick-n-dirty lab skunkworks project to make an old machine useful for games, still written today in portable assembler, with richer portable-assembler languages, themselves implemented in the lower-level one, running on top of it. The other is the descendant of a copy of a copy of a primitive '60s mini OS, extensively rewritten in order to imitate the skunkworks thing.
But these turds have been polished so brightly, moulded into such pretty shapes, that they've utterly dominated the world since my childhood. It's still all made from shit, but it's been refined so much that it looks, smells and tastes quite nice now.
We are still covered in shit and flies - "binaries", "compilers", "linkers", "IDEs", "interpreters", "disk" versus "RAM", "partitions" and "filesystems" - all this technical cruft that better systems banished before the first Mac was made, before the 80286 hit the market.
But as the preface to the Unix-Hater's Handbook says:
"I liken starting one's computing career with UNIX, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. BUT, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act."
- Patrick Sobalvarro
Nobody knows any better any more. And when you try to point out that there was once something better, that there are other ways, that it doesn't need to be like this... people just ridicule you.
And no, in case it's not clear, I am not a Lisp zealot. I find it unreadable and cannot write "hello world" in it. I also don't want 1980s Lisp Machines back - they were designed for Lisp programmers, and I'm not one of them.
I want rich modern programming languages, as easy to read as Python, as expressive as Lisp, with deep, rich integration into the GUI - not some bolt-on extra like a tool to draw forms and link them to bits of code in 1970s languages. There's no inherent reason why the same language shouldn't be usable by a non-specialist programmer writing simple imperative code, and also by a master wielding complex class frameworks like a knight with a lightsabre. It's all code to the computer: you should be able to choose your preferred viewing level, low-level homoiconicity or familiar Algol-like structures. There shouldn't be a difference between interpreted languages and compiled ones - it's all the same to the machine. JIT and so on solved this years ago. There's no need for binaries at all - look at Java, look at Taos and Intent Elate, look at Inferno's Limbo and Dis. Hell, look at Forth over 30 years ago: try out a block of code in the interpreter; once it works, name it and bosh, it's compiled and cached.
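Forth itself won't paste nicely here, so here's a rough analogue of that workflow sketched in Python, just as an illustration: poke at an expression until it behaves, give it a name, and the act of naming it is the act of compiling it. The bytecode is cached on the function object, and there is never a separate binary to manage.

    import dis

    # Step 1: try the expression raw until it does what you want.
    print(sum(x * x for x in range(10)))        # 285 - looks right

    # Step 2: name it. Defining it is compiling it.
    def sum_of_squares(n):
        """Sum of the squares 0 to (n-1), squared term by term."""
        return sum(x * x for x in range(n))

    print(sum_of_squares(10))                   # same answer, now reusable

    # Step 3: the compiled form is right there, attached to the name.
    print(sum_of_squares.__code__)              # <code object sum_of_squares ...>
    dis.dis(sum_of_squares)                     # the cached bytecode, on demand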
Let's assume it's all FOSS. No need for licences mandating source distribution: the end product is all source. You run the source directly, like a BASIC listing for a ZX Spectrum in 1983, but at modern speeds. If you aren't OK with that, if you don't like distributing your code, fine: go use a proprietary OS and we wish you well. Hope it still works on their next version, eh?
It could be better than we have. It should be better than we have. Think the Semantic Web all the way down: your chip knows what a function is, what a variable is, what a string or array is - there's no level transition where suddenly it's all bytes. There doesn't need to be.
And this stuff isn't just for programmers. I'm not a programmer. Your computer should know that a street address is an address, and with a single command you should be able to look up the address of anyone mentioned in any document on your machine - no need to maintain a separate address-book app. It should understand names and dates and amounts of money; there were apps that could do this in the 1980s. That we still need separate "word processors" and "spreadsheets" and "databases" today is a sick joke.
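As a crude illustration of the sort of thing I mean, here's a toy Python sketch; the pattern, the folder and the function name are all invented for the example, and the real data detectors of that era were far cleverer. The point is just that one command can treat every document you already have as the address book.

    import re
    from pathlib import Path

    # Crude, assumption-laden pattern: "123 Some Street" style lines only.
    ADDRESS = re.compile(r"\b\d{1,5}\s+[A-Z][A-Za-z]*(?:\s+[A-Z][A-Za-z]*)*\s+"
                         r"(?:Street|St|Road|Rd|Avenue|Ave|Lane|Ln)\b")

    def addresses_mentioning(name, folder="~/Documents"):
        """Yield (file, address) for every address in any text file that mentions `name`."""
        for path in Path(folder).expanduser().rglob("*.txt"):
            text = path.read_text(errors="ignore")
            if name in text:
                for match in ADDRESS.finditer(text):
                    yield path.name, match.group()

    for doc, addr in addresses_mentioning("Alice"):
        print(f"{doc}: {addr}")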
I have clients who keep all their letters in one huge document, one page or set of pages per correspondent... and there's nothing wrong with that. We shouldn't be forced to use abstractions like files and documents and folders if we don't want to.
I have seen many clients who don't understand what a window is or what a scrollbar does; these abstractions are too complex for them, even for college professors who have been using GUIs for decades. That's why iPads are doing so well. You reach out and you pull with a fingertip.
And that's fine, too. The ancestor of the iPad was the Newton, but the Newton that got launched was a crippled little thing; the original plan was a pocket Lisp Machine, with everything in Dylan all the way down to the kernel.
And the ancestor of the Macintosh was Jef Raskin's "information appliance", with a single global view of one big document. Some bits local, some remote; some computed, some entered; some dynamic, some static; with the underlying tools modular and extensible. No files, no programs, just commands to calculate this bit, reformat that bit, print that bit there and send this chunk to Alice and Charlie but not Bob who gets that other chunk.
Sounds weird and silly, but it was, as he said, humane; people worked for millennia on sheets of paper before we got all this nonsense of icons, files, folders, apps, saving, copying and pasting. The ultimate discreet computer is a piece of smart paper that understands what you're trying to do.
And while we might be able to get there building on bytes and portable assembler, it will be an awful lot harder - tens to hundreds of times as much work - and the result won't be very reliable.