steer August 31 2012, 11:21:51 UTC
What Killed the Linux Desktop

Hmm... cf the usage stats here:

http://www.w3schools.com/browsers/browsers_os.asp

Now that is a huge overestimate of market share (because Linux users likely browse more) -- but note the trend. All the "what killed linux for desktop" articles over the last ten years have tended to ignore the fact that it has a growing percentage of a growing number of users (that is, its market share is increasing within a market that is increasing -- the number of deployed desktops worldwide is going up).

Same story different figures:
http://www.pcworld.com/businesscenter/article/247577/desktop_linux_gains_share_in_recent_months.html

However, at this point market share doesn't really matter much to a free operating system. It's nice for the developer ego, sure. But once you have (say) 1.5% of such a huge market, you can sustain a user-supported product indefinitely, because the userbase is big enough even if only a very small percentage contribute.

It also ignores the fact that Apple has a rather easier job. They have to create an operating system which runs on, what, five, maybe ten pieces of hardware which have many commonalities. I think if Ubuntu's goal was to get something which would run on ten different PC configurations with no tweaking, they could do that perfectly by the end of the week. A PC operating system has to cope with the fact that it could be running on more or less any hardware. (This is one reason why community hackintosh ports to PC hardware are remarkably unstable.)


andrewducker August 31 2012, 12:00:34 UTC
Windows covers pretty much as much hardware - and does so by keeping the driver API solid for long periods of time. The biggest problem with Windows 2000 and Windows Vista when they launched was a lack of drivers due to a new driver model, and that's something they try to avoid. If you're going to cover large amounts of hardware then you want to make it as easy as possible for the manufacturers to write drivers for you - which is something, as the article makes clear, that the kernel developers aren't interested in.


steer August 31 2012, 12:35:30 UTC
Windows covers pretty much as much hardware - and does so by keeping the driver API solid for long periods of time.

How often do you install Windows from scratch? My experience is that if you install Windows onto a new machine it is every bit as difficult to get hardware working properly as it is on Linux. Most users never see this because they don't do a from-fresh install. The last time I did, the Windows install took ten hours longer than the Linux install (which took an hour). I still get the odd crash full-screening Flash Player on Windows.

OK... so most users don't see that because they buy a working box with Windows pre-installed. So think about peripherals. I never did get my girlfriend's printer working with Windows 7 -- it was five or six years old, but they never made Windows 7 drivers for it. She had to buy a new one. I've not yet been able to get her current wireless printer working over wireless with Windows 7, but then I've only spent two hours on that. I can't print on my office printer from Windows... no idea why. To be fair, it took me a while to get my own printer working with Linux, and the scanner remains unsupported (I would have to use it through Wine if I were Windows-only -- I think it also has some kind of internal feature which can sort things out somehow, but I don't scan often so it's not an issue).

I wouldn't claim that hardware is easier to get working on Linux. Sometimes there are huge issues, and more so than with Windows, because a manufacturer *has* to make it reasonably easy to get a device working with Windows for most people.

[Incidentally, I know of several academic projects which are trying to get a sane and clear way to get networked peripheral hardware installed and working more easily for ordinary people.]

I think it's attitudinal. If the printer doesn't work with Windows, the printer is shit. If the printer doesn't work with Linux, then Linux is shit.

something, as the article makes clear, that the kernel developers aren't interested in.

I'm not convinced that's actually true. I mean, I don't think it's really kernel changes he means here. I imagine he's talking about things like the festering sore that is Linux audio -- yes, the migration from OSS to ALSA and then to PulseAudio was painful. That's three architectures between 1992 and 2012, and each one was a pain in the neck to move from. In that time Microsoft has had 11 major DirectX releases. My recollection is that from about 6 onwards they weren't a big deal, but DirectX 1->2 and 2->3 were really painful. At least one of those early changes had me wiping Windows and reinstalling from scratch.

So I'm not convinced that, by comparison, Windows has been any more API-stable. When we look at graphics, there have been similar ructions in the Windows world. OpenGL and DirectDraw are pretty widely adopted now, but remember the time when Voodoo graphics cards ruled the world, everybody and his dog was programming in Glide, and your game wouldn't work unless you had a Voodoo card or something near-compatible? That lasted from the mid to late 90s. Then it fell from fashion, and nowadays you'd have to search a long time to find anything that uses the Glide API.

So, in conclusion, I'd say that Windows has in no way kept its API stable. I've no insight into this from the Mac point of view except this: between OS 9 and OS X they pretty much burned the entire thing to the ground and started again. Compared to that, I think Windows and Linux have been islands of stability.


andrewducker August 31 2012, 12:40:09 UTC
Never had any problems with Windows 7 - the drivers installed instantly, and worked just fine from scratch, grabbing new versions from Windows Update as you go. Versions of Windows from the early days certainly had problems, but nowadays it all works completely smoothly so far as I can tell.

With the two networked printers I have, I just clicked the button to add a networked printer, it scanned the network, found it, and installed the drivers. The only proprietary driver I needed there was to be able to scan remotely.

DirectX is entirely backwards compatible. If you install the latest version then it includes support for older versions, and I tend to find that graphics Just Works nowadays, and has for a few years.

The kernel people, including Linus, have stated in the past that they have no interest in keeping the binary driver interfaces stable - they want open-source drivers that are recompiled whenever the kernel is.
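For reference, the out-of-tree alternative they prefer works like this: an external module is compiled against the headers of whatever kernel is currently running, so it has to be rebuilt on every kernel upgrade. A minimal sketch of the standard kbuild Makefile for such a module (the module name `hello` is a placeholder):

```makefile
# Sketch of an out-of-tree kernel module Makefile ("hello" is a
# placeholder module name, not a real driver). The build delegates to
# the kbuild system under /lib/modules/<version>/build, compiling
# against the *running* kernel's headers -- which is exactly why the
# module must be rebuilt whenever the kernel changes.
obj-m += hello.o

all:
	$(MAKE) -C /lib/modules/$(shell uname -r)/build M=$(CURDIR) modules

clean:
	$(MAKE) -C /lib/modules/$(shell uname -r)/build M=$(CURDIR) clean
```

Drivers merged into the mainline tree sidestep this entirely, because they are rebuilt along with the kernel itself.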


steer August 31 2012, 13:00:55 UTC
DirectX is entirely backwards compatible. If you install the latest version then it includes support for older versions,

Sure -- but it wasn't in the bad old days. I will say that now it is pretty smooth... though I never did get my sound card working reliably with Windows on my current machine. Eventually I fell back to using the less powerful on-motherboard sound.

Never had any problems with Windows 7 - the drivers installed instantly, and worked just fine from scratch, grabbing new versions from Windows Update as you go.

You have been very lucky. I've had horrible, horrible problems with many Windows versions, including Windows 7. I guess I'm the type who fiddles with stuff and reinstalls often, so I'm more prone to coming across these things. I probably install Windows once every other year on average (that's across several machines) and Linux once a year (more often, because it's much quicker to do).

With the two networked printers I have, I just clicked the button to add a networked printer, it scanned the network, found it, and installed the drivers. The only proprietary driver I needed there was to be able to scan remotely.

Sometimes you just get lucky with these things, I guess. Caron installed her printer fine, and it worked for a week; then it stopped working over the network. I tried everything I could think of: uninstalled all the drivers, reinstalled from scratch, and it still doesn't work wirelessly and can't be detected on the network. I gave up after an hour on that one. The previous printer, I'm convinced, is impossible to get working with Windows 7, as all I could find were Google links to "Can anyone get this to work with Windows 7?".

The kernel people, including Linus, have stated in the past that they have no interest in keeping the binary driver interfaces stable - they want open-source drivers that are recompiled whenever the kernel is.

This isn't really the problem, though. The user-visible problems aren't at that layer, and these aren't the problems the article really means. When people read "stable binary interface", I think they only register the word "stable". All of the fuss over OSS->ALSA->Pulse would have been just the same with or without a stable binary interface, because the changes were above that level. Similarly, the change from XFree86 to X.Org would have happened and caused issues even with a stable binary interface.

But look -- this article explains better than I can why a stable kernel interface is a red herring.

http://www.kroah.com/log/linux/stable_api_nonsense.html

The stability issues that bug users and cause irritating incompatibilities are at a different level. Not in the kernel but in kernel modules or the user space.


andrewducker August 31 2012, 13:10:27 UTC
That article starts off by agreeing with me!
"What you want is a stable running driver, and you get that only if your driver is in the main kernel tree."

The problem is that many drivers are _not_ in the main kernel tree, nor are they likely to be, and they _are_ affected by the changes.

Of course, there are incompatibilities at all levels - but there's no reason to not support multiple sound systems. If your drivers work correctly then they can be called by a wide variety of different subsystems without a hitch.


steer August 31 2012, 13:31:20 UTC
That article starts off by agreeing with me!

At a superficial level -- but it then goes on to point out that the best way to get a stable running driver is not a stable kernel binary interface.

The problem is that many drivers are _not_ in the main kernel tree, nor are they likely to be, and they _are_ affected by the changes.

Many? I'm not that convinced. Or rather, OK, you're technically correct that there are many drivers not in the main kernel tree -- but when was the last time you needed one? In the last five years I've needed:
(1) a tweaked driver for my laptop touchpad because it was new to market. It was in the next kernel release. So I needed the non main-tree driver for six months.
(2) a driver for my desktop printer. I needed the non main-tree driver for six months.
(3) ATI specific drivers.

The ATI one is a special case and would remain an issue even with a stable binary kernel interface. The other two were not kernel modules. So on the three occasions I've needed to go outside the main supported code to get hardware working, none of them would have been touched by such an issue.

Of course, there are incompatibilities at all levels - but there's no reason to not support multiple sound systems. If your drivers work correctly then they can be called by a wide variety of different subsystems without a hitch.

But in practice that is not what happens. What happened in practice, when the multiple sound systems were all running together, was that one part did not play nicely with another and the whole thing fell over. Or the bit of software written against OSS suddenly stopped working because OSS was not behaving like it used to. It tends to be this kind of thing that people mean when they say the Linux API is not stable. But the problem was occurring several architectural layers above the kernel; in fact I think the kernel-level interface calls remained stable throughout.

The problem with the transition there was (IMHO) twofold:
1) Any new thing done for the first time unleashes a daemon -- when you roll out a big new piece of API you discover that a percentage of the world has hardware which doesn't quite fit the model you had -- the previous driver was broken but in a subtle way that nobody noticed... weirdly common.
2) Once your sound was broken you never knew which BIT was broken. I remember, in the transition years, struggling for 30 minutes to get sound working only to discover that the volume was muted in the ALSA part but not in the Pulse part -- so everything was actually working; there was just a hidden volume control. The problem was that I couldn't tell whether OSS, Pulse or ALSA was broken.
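For what it's worth, that layered-mute situation can at least be diagnosed from the command line these days; a sketch, assuming the stock alsa-utils and PulseAudio tools (amixer, pactl) are installed and "Master" is the relevant mixer control:

```shell
# Show the ALSA-layer volume and mute state for the default card's
# Master control -- this is the "hidden" control in the story above.
amixer sget Master

# If the ALSA layer turns out to be the culprit, unmute it and set a
# sane level.
amixer sset Master unmute
amixer sset Master 80%

# Then check the PulseAudio layer separately -- both layers have to be
# unmuted before any sound comes out.
pactl list sinks | grep -E 'Name:|Mute:|Volume:'
```

None of this fixes the underlying architectural churn, of course; it just tells you which layer to blame.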


andrewducker August 31 2012, 13:33:36 UTC
So we agree that there are many drivers not in the mainline tree, and that this is an issue for those drivers.

Personally, I've never needed a driver not in the mainline tree, because I don't run Linux (well, except as a core part of Android, where all of that is taken care of for me). The few times I've tried to run Linux long-term I've run into fairly major problems quite quickly, and stopped.


steer August 31 2012, 13:54:16 UTC
So we agree that there are many drivers not in the mainline tree, and that this is an issue for those drivers.

We agree that there are many drivers not in the mainline tree. It could potentially be an issue for those drivers, but I've never knowingly encountered such a situation; the non-mainline drivers I've used have not been hurt by it. While the number of drivers outside the mainline tree will only increase (because nobody's going to reincorporate some 12-year-old bit of hardware), the proportion of users affected will decrease, because things are swiftly put into the mainline tree. The issues I had with out-of-tree drivers were only issues because it takes time for Ubuntu to pick up the latest kernel -- that is, the lag from new hardware, to drivers being written and incorporated in the kernel, to your OS using that kernel.

I have never, to my knowledge, come across a driver problem which I honestly think would be helped by a binary-stable kernel interface. I accept that there are probably some out there, but at that point we're into the realms of the obscure. It's fixing the wrong problem, IMHO.

except as a core part of Android, where all of that is taken care of for me

You might find more examples than you think. :-) I was surprised to find that my TV is running Linux (a lot of LG models are). Your satnav probably is, and perhaps your router. One great thing to come out of the TV discovery was that I managed to hack extra features onto my TV (I unlocked the "play movie from USB" feature that was supposed to be only on the next model up but was disabled in software).


andrewducker August 31 2012, 14:05:18 UTC
Both my routers almost certainly are, my Synology NAS definitely is. I don't have a satnav, lacking a car (or license).

Anyway, getting back to the article, Miguel isn't actually (now I go and re-read the article) saying that the problem _is_ the constant breaking changes in the kernel API - he's saying that this set the tone, and that everyone else then does the same - including changes to things like the sound infrastructure. And, one assumes that he'd know, being the founder of GNOME.


steer August 31 2012, 14:21:27 UTC
Ooops -- *blush* you are correct. I picked up the wrong thing from the article. I think we've been barking up the wrong tree quite loudly with this discussion. :-)

Hmm... it's a very tricky one. The changes in sound architecture were a real pain... a real, gigantic pain... for about two years, I reckon, from OSS to ALSA, and about one more until PulseAudio was good. But that was quite some time ago now.

He's right -- it's a big problem, or has been in the past and probably will be again. At the moment you can pretty much rely, I think, on OpenGL graphics and PulseAudio sound (95% or more of the userbase). As an application developer, that's OK, I think.

Likely Ubuntu will move from X.Org to Wayland -- and there will be much wailing and gnashing of teeth. I'm not sure whether that will affect devs.

The issues he gets at in his article -- "working audio, PDF viewers, working video drivers, codecs for watching movies" -- well, the audio was a nightmare, and it was, as he points out, a library stability issue. I've no idea what he's getting at with PDF viewers. Acrobat is crap, but it's crap on Windows too, and it's not the default on most Linux distros. Video drivers and movie codecs are legal issues, not stability issues. (They work, but you need to click the "I really want to do this" button.)


andrewducker August 31 2012, 14:29:58 UTC
I'm equally guilty. I posted something, only half-remembered it the next day, and rather than going back to first principles decided to pick holes in what you were saying. Not a _great_ argument technique!


steer August 31 2012, 15:01:23 UTC
I did read the whole article, but I sort of speed-read the step from "kernel programmers" to the library/userspace stuff, so I thought he was advocating the idea that a stable kernel API would fix things... his other points are not so bad.


danmilburn September 2 2012, 09:13:32 UTC
The key sentence from that article being: "(remember we are talking about GPL released drivers here, if your code doesn't fall under this category, good luck, you are on your own here, you leech.)"

Well, at least we all know where we stand.

Meanwhile, every time there's even a minor kernel update on my machine at work, I have to remember to reinstall the graphics card drivers. If I forget, then I can't run X. So now I just don't ever update the kernel. Nvidia are, of course, unlikely to GPL their code.


steer September 2 2012, 11:00:38 UTC
Wow. Old school. Which distro still needs that? I'll be sure to avoid it. Ubuntu sorts this out automatically.
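(Ubuntu does this via DKMS, which registers out-of-tree driver sources and rebuilds them against each new kernel as it is installed. A sketch of the manual equivalent -- the kernel package's hooks normally run this for you:)

```shell
# List the out-of-tree modules DKMS is tracking, with their build and
# install state per kernel version (e.g. the proprietary nvidia driver).
dkms status

# Rebuild and install every registered module for the given kernel.
# This is what runs automatically from the kernel package hooks, so a
# kernel update never leaves you without your graphics driver.
sudo dkms autoinstall -k "$(uname -r)"
```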


danmilburn September 3 2012, 06:58:00 UTC
This is CentOS 5, which I have to run at work because it's the only Linux distro our software officially supports. It is of course massively out of date, but graphics drivers are certainly a case where the ones from Nvidia/AMD are unlikely to ever be included in the kernel source tree. And, well, I've never had to reinstall my graphics drivers after running Windows Update.


