System panic
Microsoft big cheese Satya Nadella is backpedalling on recent changes to Windows, hoping to win back “fans”. I’m going to pass over the category error of ‘fans’ of operating systems because there actually is something interesting going on here. I can see no circumstances in which Microsoft will get it right, though.
Speaking to investors (of course) Nadella said that the company was rolling back recent changes, presumably including intrusive gluing of artificial intelligence (AI) to anything anyone could think of. Such as Notepad.
“With Windows, we recently announced performance improvements for lower memory devices, streamlined the Windows Update experience, and brought back focus to core features and fundamentals that matter most to our customers,” he said.
It’s probably worth thinking about what an operating system is and how it came to be. Computers are, fundamentally, a set of resources that allow for calculations to be made. As users, however, layers of abstraction allow us to perform tasks that, at a human level, have precious little to do with adding up. Consequently, an OS is Janus-faced, with one side oriented toward the silicon and one toward us.
It is this latter aspect that is in decline. File systems are increasingly hidden, while human-centric features, such as spatial file browsing that matches how humans think and a functional desktop-as-metaphor, are long dead. You do get adverts, though, so…
In a sense, operating systems matter less than ever. Chromebooks may not be in vogue anymore, but the direction of travel is clear: the Web browser as the universal interface. Even many supposed desktop applications, and even more mobile ones, are just browsers with reduced functionality. You can tell because they run dog slow.
It is worth remembering that the rise of Microsoft is utterly tied to operating systems. Microsoft, then a micro business selling a programming language, brute-forced its way into nearly total domination of computing around the world with a little thing called Disk Operating System (DOS).
Everyone knows that IBM dropped the ball by allowing Microsoft to freely license DOS, the system that ran the IBM PC launched in 1981. As a result, once IBM’s BIOS was reverse-engineered, PC clones abounded and the rest is history. What this account misses is that DOS was easy to overlook. Fundamentally, all it did was allow users to list and load programs. Unix it was not. Eventually Windows was rewritten on underpinnings philosophically derived from DEC’s VMS, but until Windows 2000 and XP it was little more than a DOS application, and an ugly, difficult-to-use one until the arrival of Windows 3 in 1990.
Over time, DOS, and its passenger Windows, grew into something more recognisably an operating system. Others like the Mac’s system software, AmigaOS, and the various Unixes kicking around large institutions were all OSes from day one, and, to varying degrees, were designed with the user in mind. The Unix user was, of course, a rather different figure from the person who bought a Mac and a copy of Pagemaker, and that is not necessarily a problem.
Today’s OSes feature inconsistent and incoherent design languages and, in a concession to usability that actually makes things worse, hide functionality from users.
Peak user
The absolute high water mark of OSes, at least as far as usability goes, was the late 1990s: Mac OS 9, Windows 2000 and BeOS. Yes, the core, underlying computing functionality has come on leaps and bounds since then, but clarity, logic and consistency have gone out the window. Nothing that can be done today using macOS, Windows or Linux would be impossible to do with one of those ancient OSes, or the likes of BeOS descendant Haiku.
In other words, the absent functionality in these antediluvian OSes is at the application level, not that of the gubbins, and the nightmare of memory management and co-operative multitasking is entirely unrelated to human-computer interaction.
The saddest thing of all, though, is mainstream Linux developers cloning the broken UIs of macOS and Windows. The inherently complex, but more or less comprehensible, Unix file hierarchy is one thing, but actively hiding it does not help, nor does removing basic functionality such as… pull down menus.
Here is my little manifesto: operating systems should run the computer, allowing it to do the adding up, and otherwise just get out of the user’s way. In order to do those things, what they need is to be consistent and to put the user at the centre of the experience. Doing this means starting with a coherent worldview and working from there.
Today, what we actually see are attempts not to simplify complexity but to hide it, which is, of course, catastrophic. In addition, the rise of cloud computing has forced Web design conventions into software in the form of ‘dashboards’ – an amusing term, as it refers back to the thing that stopped mud and animal waste from hitting people in the face.
Apple, once a leader in designing computer interfaces that neither patronised nor attacked users, has been on a two-decade-long streak of destruction, laying waste to a legacy of computers ‘for the rest of us’ and as ‘bicycles for the mind’. Steve Jobs, for some reason, forgot that the desktop was a metaphor and added caricatures of physical objects, then, under the direction of Jony Ive, flipped to the opposite pole, introducing ‘flat’ design where every object looks the same as every other one.
The original Apple Human Interface Guidelines (HIG), published in 1987, was not an aesthetic document. Rather, it established interaction principles: consistency, feedback, direct manipulation, and proper use of metaphor. In plain English, this means using familiar concepts to make abstract things understandable, and so the desktop was a metaphor. It was never meant to be a picture of a desk.
In the 2000s, Apple violated that distinction, adding images of things like stitched leather to applications. Later, Apple committed the opposite error with equal confidence, stripping out visual affordances entirely on aesthetic grounds. The problem isn’t ‘flatness’ per se but that it is an aesthetic decision imposed on interaction, rather than an interaction decision that happened to have aesthetic consequences.
Microsoft is right to reverse ferret its AI out of Notepad, but a deeper think about how people use computers is needed, and not just in Redmond.
What OSes need is a re-commitment to human interface guidelines. Don’t hold your breath. In the meantime, I’m installing Haiku.