It’s a beige box, the height of two or three shoeboxes. It’s got a grey nine-inch screen and two slots for floppy discs. It sits on a bookshelf behind me, peeking out in Zoom calls, leading to knowing smiles from friends of a certain age: my age, middle age. It’s a Macintosh, and it’s utterly iconic, which is appropriate for the machine that introduced the idea of icons to most computer users.
Forty years ago, in early 1984, Steve Jobs unveiled a very strange new computer with an outrageous TV ad. Directed by Ridley Scott, it featured discus athlete Anya Major hurling a sledgehammer through a talking head on a screen, symbolically freeing users from Big Brother and the stultifying tedium of business computing. The tagline promised: “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984’.”
Despite the absurd bluster of the ad—essentially selling a consumer product that promised to combat fascism—the Mac was truly revolutionary. Before it, computers showed blocky, pixelated green or white text on a black background. They were for accountants, engineers and those trapped in the bowels of corporate bureaucracies. Computers weren’t creative, cool or associated with fashion or design. And then, suddenly, they were.
There was so much about the Mac that was obviously right. Why should you have to type cryptic commands to make a machine do what you want? Why couldn’t you just point and click? (Like so many of the innovations normalised by the Mac, the mouse and the graphical user interface had been introduced earlier, but were largely unknown to users outside computing research and development labs.) Why shouldn’t office machines be cute, friendly and welcoming?
In the early 1980s, personal computing was largely dominated by Microsoft’s MS-DOS operating system—with the release of the Mac, however, that dominance looked under threat. In response, Microsoft introduced Windows in 1985, an obvious rip-off of the Macintosh system, and managed to hold on to its pole position for decades to come. But the vision Windows embodied was the one Jobs and his team had introduced with Macintosh.
I have been a Mac user since university, where I used one to write papers and make beer money designing flyers for campus organisations. One of the most powerful features of the Mac was “WYSIWYG”—What You See Is What You Get—a design principle that tried to bridge the gap between the computer screen and the printed page. In programs such as MacPaint and MacDraw—and later page layout programs like Aldus PageMaker—you could design on-screen and be assured that what you printed would look nearly identical.
This was light years ahead of the pre-Mac experience, when laying out printed pages required esoteric layout languages like TeX. (Scholars in technical fields still wrestle with these languages, but now have WYSIWYG editors to help.) Back then, a printed page was described by thousands of fields and parameters, which a program translated into instructions the printer could understand, written in a language called PostScript. Macs ultimately sent PostScript to the printer as well, but the user never had to learn the intricacies—instead, what you manipulated on the screen appeared on the page.
Forty years after the introduction of the Mac, I find myself reflecting on the rise of WYSIWYG and its implications. On the one hand, bridging the gap between the screen and the page made computers vastly more useful for millions of people, including me. I was a lousy programmer, but WYSIWYG tools enabled me to experiment with graphic design (provided I was willing to do some obsessive pixel-pushing), and I learned a great deal more about what computers could do than I otherwise would have. A generation of computer users grew up on “desktop publishing”—publishing zines and posters from our desktops when previously we’d have needed either a print shop or hours with hobby knives and a glue stick—and many of us became the first generation of web designers.
On the other hand, WYSIWYG is a lie, a convenient fiction. Between the screen and the paper are layers of technical translation, countless hidden details that are the “real” document you are creating. Understanding those details is the province of the programmer and, perhaps in the future, the AI co-pilots that will increasingly help programmers with their work.
Technical complexity hidden behind smooth, beautiful surfaces became a signature of Jobs’s leadership. Unlike MS-DOS PCs, which were held together with Phillips screws, the Macintosh was closed with Torx screws buried within its handle. To open your Mac—which Jobs hoped amateurs would never do—you needed to purchase a special, long-handled Torx screwdriver and a “case cracker”, a clamp designed to pull the plastic shell apart. Opting in to Macintosh meant accepting a trade-off: you could have a beautiful, easy-to-use surface if you promised not to delve into the technical innards.
The best contemporary example of the shiny surface hiding technical complexity is the smartphone, another product introduced by Jobs. BlackBerry and Nokia had arguably introduced phones capable of running multiple applications and replacing many computer functions well before the iPhone’s release in 2007. But, as with the Mac, the iPhone popularised the characteristics that came to define the smartphone. In a historical parallel to Windows, Google launched the Android operating system in 2008, replicating many iPhone features and perhaps sacrificing some elegance to ensure the system’s operability on different devices. As with Macs and PCs, Android phones have vastly greater global market share, but iPhones dominate in terms of design influence and prestige.
Shortly after the iPhone came the App Store, for purchasing applications. Installing new software can be complex and potentially harmful to your device, but the App Store allowed Apple to screen out malicious programs. It also gave the company veto power over what users could download: the store blocks pornographic content (and has sometimes blocked apps carrying magazines and newspapers with nudity in their pages). Apple also uses its veto power to protect its revenues. The company takes a 30 per cent cut on many app and in-app purchases, which is why iPhone users cannot buy books from within the Kindle app: Amazon is unwilling to sacrifice that large a share of its sales. Practices like this led the Netherlands Authority for Consumers and Markets to censure the company.
Hiding complexity makes computers accessible to wider audiences, but it makes those users less independent. I have found that very few of my students who use Macs—even computer science students!—know that just under the easy-to-use surface is a powerful variant of Unix, the complex operating system that, along with Unix-like relatives such as Linux, runs the majority of the world’s webservers. They are shocked to discover that, behind a forest of folders, windows and icons is a blinking cursor and black text on a grey background, with the signature Unix “prompt” waiting for a user’s command.
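For the curious, it takes only a few lines to glimpse what is underneath. The sketch below is a minimal illustration rather than anything definitive, and it assumes a Mac with Python 3 installed (recent versions of macOS do not include it out of the box): it simply asks the operating system to identify itself, and on a Mac the answer is “Darwin”, the name of the Unix variant beneath the icons.

```python
# A small peek beneath the Mac's graphical surface: ask the operating
# system to identify itself. On macOS, Python reports "Darwin", the
# Unix variant sitting under the folders, windows and icons.
import platform

system = platform.system()    # "Darwin" on a Mac; "Linux" or "Windows" elsewhere
release = platform.release()  # the kernel version, e.g. "23.1.0"

if system == "Darwin":
    print(f"This Mac is running Darwin {release}: Unix, hiding under the icons.")
else:
    print(f"This machine reports itself as {system} {release}.")
```

Run it from the Terminal application, the very window with the blinking cursor and the waiting prompt described above.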
And that’s just how Steve Jobs would have wanted it.
Computers are central to society in a way that would have been hard to predict 40 years ago. We carry them in our pockets everywhere we go. Our TVs are computers, as are our cars. In this context, it makes sense that the Macintosh vision of computers as sleek, friendly and beautifully designed would win out over a world in which watching a football match would require you to type in a complex text command.
But a command very much like that one is, at heart, what runs when you click your remote. And in surrendering our understanding of the tools we use, we’ve given immense power to the massive multinational companies that keep us entertained and productive, but hide the deeper workings of their systems away from us. Ultimately, the Macintosh may have helped challenge an Orwellian bureaucratic future for computing. But in the process it helped usher in something more analogous to Huxley’s Brave New World.