The most likely reason GNU/Linux hasn't taken off... yet...
I have no raw figures; I've tried to find them on the Internet. All I have are memories, and the most vivid is of a colleague distributing Ubuntu CDs at work during the mid-2000s. Ubuntu was the next big thing, it was said. And when I tried it, I had to admit, I couldn't blame anyone for saying so.
Ubuntu in the 2000s was a first-class OS. It had the following massive features:
- It ran on pretty much anything powerful enough. The installer was first rate, booted in the most trying conditions, and installed an image with your wireless, sound and accelerated video all ready to use. Well, OK, ATI cards needed that fglrx thing to get acceleration, and I can't remember exactly how you installed it, but I do know it wasn't hard.
- It ran GNOME 2. For those who are wondering why that was a good thing, GNOME 2 was basically an intuitive user interface that was somewhat more obvious and consistent than Windows, and maybe a small step back from Mac OS X. It was customizable but...
- ...it had sane defaults everywhere. The default Ubuntu desktop at that time was easy to understand.
Did you have to drop into the command line to do anything? That depended. You sometimes did in the same way you sometimes have to with Windows or Mac OS X. You have an obscure set of technical conditions, and you need to debug something or configure something equally obscure, and just like Mac OS X and Windows you'd have to use the “nerd” user interface. But an “average” user who just wanted a web browser and office suite would not ever need to do that.
So it wasn't surprising that, anecdotally, Ubuntu started to drive installs of GNU/Linux. (Like I said, it's tough to get any concrete figures: StatCounter claims a 0.65% market share for “Linux” in 2009, but I don't trust them as far as I can throw them, and more importantly they have no pre-2009 figures online, making it hard to show growth during that period. It's also contradicted by other information I'm finding on the web.) People really seemed to like it. I even heard of major figures in the Mac world switching at the time. Ubuntu was the OS everyone wanted; it “just worked”.
So what happened in the 2010s to halt this progress? Did everything change? Yes.
And by everything, I mean Ubuntu.
Ubuntu decided to change its user interface from GNOME 2 to Unity. In part this was driven by the GNOME team themselves who, for whatever reason, decided the GNOME 2 user interface was obsolete and they should do something different.
I'm not necessarily opposed to this thinking, except for the “obsolete” part, but neither party (Canonical, authors of Ubuntu and the Unity user interface, and the GNOME team) went about this with an understanding of the impact on existing users. Namely:
- The user interfaces they proposed were in most cases radically different from GNOME 2. So existing users wanting to upgrade would find they would literally have to learn how to use their computers again.
- The user interfaces proposed only partially used the paradigms that everyone had gotten used to and trained on during the 1990s. GNOME 3 in particular switched to a search model for almost everything. Unity was a little more standard, but launching infrequently used applications in both environments was confusing. These user interfaces were only slightly closer to what had become standard in the 1990s than the new mobile touchscreen UIs that doubtless had influenced their authors.
To understand how massive a problem this was, look at Apple and Microsoft's experience with user interface refreshes.
Apple does it right
Let's start with Apple, because Apple didn't fail:
In the 1990s and early 2000s, Apple switched from their 1980s MacOS operating system to the NEXTSTEP-derived Mac OS X. NEXTSTEP and MacOS were nothing alike from a user interface point of view, making shipping NEXTSTEP with new Macs a non-starter. So Apple took pains to rewrite the entire NEXTSTEP user interface system to make it look and feel as close as possible to contemporary MacOS.
The result was Rhapsody. Rhapsody had some “feel” issues, in the sense that buttons didn't quite respond the same way they did in MacOS, some things were in different places, and running old MacOS applications felt clumsy. But a MacOS user could easily switch to Rhapsody, and while they would be aware they were running a new operating system, they knew how to use it out of the box.
Rhapsody was well received by those who got it (it was released in beta form to developers, and sold for a while as Mac OS X Server 1.0), but from Apple's point of view, they still had time to do better. So they gave the operating system's theme an overhaul, creating Aqua. But the overhaul was far more conservative than people give Apple credit for:
- If something was recognizably a button in Rhapsody/MacOS, it was recognizably a button in Aqua.
- If something was a window in Rhapsody/MacOS, it was recognizably a window in Aqua.
- If you did something by dragging it or clicking it or poking your tongue out at it in Rhapsody/MacOS, you'd do the same thing in Aqua.
- If it was in the top left corner in Rhapsody/MacOS, it was in the top left corner in Aqua. Positions generally stayed the same.
...and so on. The only major new user interface element they added was a dock. Which could even be hidden if the user didn't like it.
So the result, when Apple finally rolled this out, was an entirely new operating system with a modern user interface that looked fantastic and was completely, 100% usable by people used to the old one.
Microsoft “pulls an Ubuntu/GNOME” but understands how to recover
In some ways saying Apple did it right and Microsoft didn't is unfair, because Microsoft has done operating system upgrades correctly more times than you might imagine. And they even once managed a complete GNOME-style UI overhaul that actually succeeded: replacing Windows 3.x's UI with Windows 95's UI. They were successful that time, though, for a variety of reasons:
- Windows 3.x was really hard to use. Nobody liked it.
- The new Windows 95 user interface was a composite UI based upon Mac OS, Amiga OS, GEM, Windows 1.x, OS/2, and so on. It was instantly familiar to most people who had used graphical mouse-driven user interfaces before.
- In 1995, there were still people using DOS. Windows 3.x was gaining acceptance but wasn't universally used.
Since then, from 1995 to 2012, Microsoft managed to avoid making any serious mistakes with the user interface. They migrated NT to the 95 UI with Windows NT 4. They gave it a (in my view ugly) refresh with Windows XP, which was a purely visual clean-up similar to, though not as radical as, the Rhapsody-to-Aqua user interface changes I noted above. But like Rhapsody to Aqua, no serious changes in the user interface paradigm were made.
They did the same thing with Vista/7, creating a clean, composited UI that was really quite beautiful yet, again, kept the same essential paradigms, so a Windows 95 user could easily switch to Windows 7 without having to relearn anything.
Then Microsoft screwed up. Convinced, as many in the industry were at the time, the future was touch user interfaces and tablets, they released Windows 8, which completely revamped the user interface and changed how the user interacted with the computer. They moved elements around, they made things full screen, they made things invisible.
Despite actually being very nice on a tablet, and despite PC manufacturers pushing 2-in-1 devices hard on the back of Windows 8's excellent touchscreen support, users revolted and refused to have anything to do with it.
Windows 8 generated substantial panic at Microsoft, resulting in virtually all the user interface changes being taken out of Windows 10, its major successor. Windows 10 itself was rushed out, with early versions being buggy and unresponsive. But its user interface changes, compared to Windows 7, were far less radical. It retained the Windows 7 task bar and start menu, and buttons were where you'd expect them. A revised preferences system was introduced that... would have been controversial if it weren't for the fact that earlier versions of Windows had a fragmented collection of half-written preferences systems anyway. A notifications bar was introduced, but it wasn't particularly radical.
But windows, buttons, etc, all operated the same way they did in Windows 7 and its predecessors.
What is NOT the reason Ubuntu ceased to be the solution in the 2010s
Amazingly, I've heard the argument Ubuntu failed because the underlying operating system is “too nerdy”. It isn't. It's no more nerdy than Mac OS X, which was based on a similar operating system.
Mac OS X is based on a kernel called XNU, which in turn is based on a heavily modified kernel called Mach, plus a userland that's a combination of (let's call it) user interface code and BSD. There are some other small differences, like the system used to manage daemons (launchd, where old-school BSD would have used its traditional init), but nothing that you'd immediately notice as an end user.
All versions of GNU/Linux, including Ubuntu, are based on a kernel called Linux, and a userland that's a combination of the GNU project and some other projects like X11 (which provides the core windowing system) and GNU projects like GNOME (which does the rest of the UI). There are multiple distribution-specific changes to things like, well, the system used to manage daemons.
So both are built the same way: XNU or Linux at the bottom, BSD or GNU on top of that, and then some other stuff bolted on.
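If you want to see that layering for yourself, here's a minimal sketch of my own (nothing official from Apple or Canonical) using only the Python standard library. It prints the kernel name and whatever is running as PID 1, the daemon manager: on a Mac that typically comes back as Darwin (the XNU kernel) plus launchd, and on a modern Ubuntu as Linux plus systemd.

```python
# Minimal sketch: report the kernel and the daemon manager (PID 1) on the
# machine it runs on. Expected output is roughly "Darwin ... / launchd" on
# Mac OS X and "Linux ... / systemd" on a modern Ubuntu.
import platform
import subprocess

kernel = platform.system()    # "Darwin" on Mac OS X, "Linux" on GNU/Linux
release = platform.release()  # kernel version string

# Ask ps for the name of the process running as PID 1 (the daemon manager).
pid1 = subprocess.run(
    ["ps", "-p", "1", "-o", "comm="],
    capture_output=True, text=True, check=True,
).stdout.strip()

print(f"kernel: {kernel} {release}")
print(f"pid 1 : {pid1}")
```

Everything the end user actually sees, the windowing system, the desktop, the file manager, sits on top of those two lines of output.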
XNU and Linux are OS kernels designed as direct replacements for the Unix kernel. They're both open source, and they exist for slightly different reasons: XNU's Mach underpinnings began as an academic research project, while Linux began as Linus Torvalds's effort to get MINIX and GNU working on his 386 computer.
BSD and GNU are similar projects that ultimately did the same thing as each other, but for very different reasons. They're both rewrites of Unix's userland that started as enhancements and ultimately became replacements. In BSD's case, it was a project to enhance Unix that grew into a replacement out of frustration at AT&T's inability to get Unix out to a wider audience. In GNU's case, the plan was always for it to replace Unix, but it started as an enhancement because it's easier to build a replacement if you don't have to do the whole thing at once.
So... that's all nerd stuff, right? Sure. But dig into both OSes and you'll find they're pretty much built the same way: a nice, friendly user interface bolted onto Unix-like underpinnings that'll never be friendly to non-nerds. So saying Ubuntu failed because it's too nerdy is silly; Mac OS X would have failed for the same reason if that were true. The different origins of the two don't change the fact that they're similar implementations of the same underlying concept.
So what did Ubuntu do wrong and what should it have done?
The entire computer industry at this point seems to be obsessed with changing things for the sake of doing so, to make it appear they're making progress. In reality, changes should be small, and cosmetic changes are better for both users and (for want of a better term) marketing reasons than major paradigm changes. The latter is bad for users, and doesn't necessarily help “marketing” as much as marketing people think it helps them.
Ubuntu failed to make GNU/Linux take off because it clumsily changed its entire user interface in the early 2010s for no good reason. This might have been justifiable if:
- The changes were cosmetic as they were for the user interfaces in Windows 95 vs XP vs Vista/7 vs 10/11, and Rhapsody vs Aqua. They weren't.
- The older user interface it was replacing was considered user-unfriendly (like the replacement of Windows 3.1's UI with Windows 95's). It was, in fact, very popular and easy to use.
- The older user interface prevented progress in some way. If this is the reason, the apparent progress GNOME 3+ and Unity enabled has yet to be identified.
- The older user interface was harder for users migrating from other systems to get used to than its replacements. This is laughably untrue.
Radically changing a user interface is a bad idea. It makes existing users leave unless forced to stay. And unless it's sufficiently closer to the other user interfaces people are using, it won't attract new users. It was a colossal misstep on GNOME and Canonical's part.
GNOME 3/Unity should, to put it bluntly, have had the same fundamental paradigm as GNOME 2. Maybe with an optional dock, but not the dock-and-search focused system they put in instead.
Where both teams should have put their focus is simple modernization of the look, with larger changes reserved for less frequently used parts of the system, or for internals needed to attract developers. I'm not particularly pro-Flatpak (and Snap can die a thousand deaths), but making it easier to install third-party applications (applications not in repositories) would also have addressed some of the few holes in Ubuntu that other operating systems did better. There's a range of ways of doing this that don't involve sandboxing things and forcing developers to ship and maintain all the dependencies of their applications, such as:
- Identifying a core subset of packages that will only ever be replaced by backward-compatible versions for the foreseeable future and will always be installed by default, and encouraging static linking for libraries outside of that set, even making static linking the default. (glibc and the GTK libraries are obvious examples of the former: libraries that should be fully supported going forward with complete backward compatibility. More obscure libraries, and those with alternatives, image file parsers being an example, should be statically linked by default.)
- Supporting signed .DEBs
- Making it easy to add a third-party repository while sandboxing it (to ensure only relevant packages are ever loaded from it) and authenticating the identity of the maintainer at the time it's added; there's a rough sketch of what this could look like after this list. (Canonical's PPA system is a step in the right direction, but it does force the repos to be hosted on Canonical's own infrastructure.)
- Submitting kernel patches that allow for more userland device drivers (giving them a stable ABI)
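On the third-party repository point, here's a rough sketch of what “sandboxing” a repository could look like using only features apt already has: a sources entry scoped to the vendor's signing key (the signed-by option), plus a pin that blocks everything from that repository except the vendor's own packages. All the names here (example-vendor, apt.example.com, example-app) are hypothetical placeholders, and fetching and verifying the vendor's key is left out entirely; a real implementation would automate the whole thing behind a friendly UI.

```python
# Rough sketch (hypothetical names throughout): register a third-party apt
# repository that is (a) only trusted for packages signed with the vendor's
# own key and (b) pinned so it can never replace anything except the
# vendor's own packages. Run as root on a Debian/Ubuntu-style system.
from pathlib import Path

VENDOR = "example-vendor"                      # hypothetical vendor name
REPO_HOST = "apt.example.com"                  # hypothetical repository host
KEYRING = f"/usr/share/keyrings/{VENDOR}.gpg"  # vendor key, fetched and verified out of band

# 1. Sources entry: the [signed-by=...] option ties this repository to the
#    vendor's key, so it can't impersonate the distribution's own archives.
sources_entry = f"deb [signed-by={KEYRING}] https://{REPO_HOST} stable main\n"
Path(f"/etc/apt/sources.list.d/{VENDOR}.list").write_text(sources_entry)

# 2. Pinning: priority -1 hides everything from this origin, then the
#    specific-form pin re-enables just the vendor's own packages, so the
#    repository can't silently replace core distribution packages.
pin_rules = f"""\
Package: *
Pin: origin {REPO_HOST}
Pin-Priority: -1

Package: example-app*
Pin: origin {REPO_HOST}
Pin-Priority: 500
"""
Path(f"/etc/apt/preferences.d/{VENDOR}").write_text(pin_rules)

print("Repository registered; run 'apt update' to fetch its index.")
```

The point isn't that users would ever write these files by hand; it's that the building blocks for a safe “add this vendor's repository” button already exist, and what's missing is the distribution-level policy and tooling to make it a one-click, authenticated operation.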
Wait! This is all “nerd stuff”. But non-nerds don't need to know it; from their perspective, they just need to know that if they download an application from a website, it'll “just work”, and continue to work when they install GreatDistro 2048.1 in 24 years.
What is NOT the solution?
The solution is not an entirely different operating system, because any operating system that gets the same level of support as GNU/Linux will find itself making the same mistakes. Take, for example, off the top of my head, with no particular reason to select this one except that it's a well-regarded open source OS that's not GNU/Linux... ooh, Haiku, the OS inspired by BeOS?
Imagine Haiku becoming popular. Imagine who will be in charge of it. Will these people be any different to those responsible for GNOME and Canonical's mistakes?
No.
Had Haiku been the basis of Ubuntu in the 2000s, it's entirely possible that Haiku would have suffered an unnecessary user interface replacement “inspired” by the sudden touchscreen device craze. Why wouldn't it? It happened to GNOME and Ubuntu. It happened to Windows, for crying out loud. Haiku didn't go there, not because it's inherently superior, but because it was driven by BeOS-loving purists in the time period in question. If Haiku became popular, it wouldn't be driven by BeOS-loving purists any more.
Frankly, I don't want Haiku to become popular, for that very reason; it'd ruin it. I'd love, however, for using fringe platforms to be more practical...