Jony Ive: The Genius Behind Apple’s Greatest Products by Leander Kahney (Portfolio, £14.95)
Quoting his subject’s words at the head of the chapter on the design and development of Apple’s iPhone, Leander Kahney makes Jony Ive sound oracular: “When we are at these early stages in design… often we’ll talk about the story for the product—we’re talking about perception. We’re talking about how you feel about the product, not in a physical sense, but in a perceptual sense.” Throughout his biography of Apple’s design magus for nigh on the past two decades, Kahney comes at Ive’s notion of the “narrative” of a product time and again, but it’s this formulation that most closely approaches the metaphysical, seemingly suggesting that all those iMacs, PowerBooks, iPods and iPads that Ive has been responsible for mind-birthing should be considered not as mere phenomena, but actual noumena; for, what else can he mean by “perceptual”—as distinct from “physical”—if not some apprehension of how the iPhone is in itself, freed from the capacitive touch of our fingers?
You may find this rather too high-flown for a mobile phone—or a laptop, or a tablet computer for that matter—but when it comes to Apple and its products the sky is no limit: in 2012, the company founded 36 years earlier in the garage of a Californian bungalow by Steve Jobs, Steve Wozniak and Ronald Wayne, reached a market capitalisation of $660bn, surpassing the record set by Microsoft in 1999 and making it the most valuable publicly-traded company ever. It is in the perception (and I use the term here in an ordinary language, non-Ive sense) that Apple piled up this mountain of pelf not simply by flogging clever electronic gizmos, but by somehow altering global consciousness, that the company’s own identity finds its fullest expression. Other tech giants may have their schticks—Microsoft slick and savvy, Google cuddly and approachable, Facebook brash and sophomoric—but only Apple claims to have elevated its marketing strategy to the status of a transcendental aesthetic.
But before we examine this claim in more detail—and it may surprise you to learn that I agree in large part with Apple’s self-estimation—let me relate my own Apple narrative. In 1989 I was running a small company that published low-end corporate literature such as employee magazines and sales brochures. When I was brought in by the company’s owners the business consisted of a couple of crap Amstrad PCs and a couple of equally crap clients. The previous incumbent had done all the work himself: writing the copy, taking the photos, and then marking these up for the typesetters and the scanners; he would then cut and paste together a camera-ready layout. I realise these very terms—“cut” and “paste”—now require further elucidation: I’m talking about the physical acts of cutting with scissors, and pasting with glue. Having blagged my way into the job I set about learning these laborious processes on the hoof, although I was already aware that the wind of change was blowing strongly through the print industries, and that the words it howled were “desktop” and “publishing.”
Within 18 months I’d brought in a designer familiar with the new technology, and the office had a staff of four working on three Mac Classics and a Mac II, all running Adobe PageMaker and QuarkXPress software. I still had to go out to the Safeway branch in St Andrews and interview the store manager about his frozen food gondolas, then photograph him surrounded by the snowy snoods of his staff, but I could give the copy to the designer on a floppy disc, and he could take the photographs, scan them into his machine, crop and size them, lay everything out on screen, then supply the printers with digital files. On occasion I even did the layouts myself, and this collapsing of several formerly widely distributed specialist trades into a single small beige plastic box operated by a rank amateur struck me as the very miniaturisation of epochal change; the unruly mobs manning the Wapping picket lines had been shrunk and shrunk until all that remained of them was this diminuendo: my fingers clicking on the mouse, the blinking arrow dragging the guidelines across the screen, and the text magically realigning.
The Apple Macs were intimately bound up with all of this: the WYSIWYG (“What You See Is What You Get”) software interface, for those of us who had grown up using computers that required keystroke commands, removed the distinction between the onscreen and the in-view: the real and the virtual became fused. It no longer mattered that we didn’t conceptually understand computing—which was a barrier when you felt you had to communicate with the machine—because it was clear that it understood us, smoothly intuiting how we went about things in our funny four-dimensional phenomenal world. The anthropomorphism of the Mac Classic helped as well: its screen a household god’s monocular eye, its front-loading slot a mouth through which it ingurgitated our floppy offerings. A frustrated fiction writer at the time, I do believe that it was working on Macs to produce finished publications that helped me to realise my creative potential: or, if you prefer a more high-flown formulation, the Macintosh was part of my entelechy.
You might’ve thought that once smitten by the Mac computer, I would’ve bitten down on Apple ever since, but not a bit of it. In fact, I’ve only bought two Apple products in the last 23 years: an iPod in 2007 and an iPhone last year. When the Amstrad I inherited from my mother finally gave out in the mid-90s, the friend-of-a-friend I asked for computer support dismissed Apple on the grounds that their machines and software weren’t immediately compatible with Microsoft’s then world-sweeping Windows, nor was their word-processing programme on a par with Word. It’s as a result of this decision that nearly two decades later I’m typing this article on a generically dull PC rather than a transcendental MacBook Air.
I only bother with this history because it seems to fly in the anodised aluminium face of the Apple mythos. To quote from the Evening Standard on 1st November: “Dozens of Apple fans camped on the streets of London for up to 24 hours in order to be the first in Britain to get their hands on the new iPad Air. First in the queue at the Covent Garden store was Constantin Zabrotskiy, 29, from Moscow, who flew in from Russia on Wednesday and was flying back at 1pm today after spending £399 on the device.” Of course, there are plenty of other products besides Apple’s that provoke such consumerist frenzy, but few are accorded the reverence lavished by media and consumers alike. The photograph of Zabrotskiy and his new iPad has a suitably iconic feel: the transcontinental pilgrimage has ended successfully, with the newly-minted relic held aloft to be kissed.
***
I only surveyed a small portion of the burgeoning Apple literature for this piece: Kahney’s book; a modest chapbook entitled What Would Apple Do?, optimistically subtitled “How You Can Learn From Apple and Make Money”; and Dogfight by Fred Vogelstein, which is still more rousingly subtitled “How Apple and Google Went to War and Started a Revolution.” In all of them the same biblical story is glossed: how Apple was born in a manger; how its messiah, Jobs, founded the religion of personal computing; how he was exiled by pharisaic boardroom manoeuvrings to the wilderness of NeXT and Pixar; how he returned with the blindingly simple tablets of the Law (“There shall be but four basic Apple products: two laptops and two desktops”); and how, despite the devastating plague of Windows 95, he and his John the Baptist—Jony Ive—then launched a crusade to convert the entire heathen planet to their faith that in simplicity lies the ultimate sophistication.
The sacerdotal trappings are, I think, not overstated. It helps that Jobs was such a shoo-in for the role of digital prophet, albeit an old testamentary one. With his dabbling in the acid-pastel counterculture, and his professed Buddhism (the typical spiritual pacifier for the most angry of occidentals), Jobs brought to flogging gadgets a peculiarly self-deluding inversion of conventional business practices. Rather than make things that consumers wanted, Apple’s product designers and engineers would articulate their own inchoate desires. Rather than merely chasing a dollar, Apple’s goal would be, in Ive’s words, “absolutely not to make money”; the sales would come, it was fervently believed, once the perceptual feelings of the multitude were aroused by these beautiful objects of their creators’ own desire.
That this has indeed proved to be the case represents, in my view, a serendipitous alignment between technological advance and a messianic phase in late capitalism, rather than any preternatural prescience on the part of Jobs and Ive. (Although, if there were ever a strong argument for nominative determinism it would seem fulfilled in this: two men called Jobs and Ive providing so much employment for the benighted Foxconn wage-serfs of the Pearl River delta.) No doubt about it, Jony Ive is a superb designer, and Jobs was an inspired salesman and entrepreneur, but it’s entirely possible to imagine a steam-punkish alternative timeline in which Apple is not the demiurge of the digital. After all, while it’s widely perceived that the WYSIWYG interface originated with Apple and was subsequently ruthlessly copied by Microsoft, there’s no doubt that someone would’ve hit on it sooner rather than later—just as someone (or possibly Samsung) would have hit upon the touch-screen mobile phone-cum-computer.
There are those pundits who say that Apple’s nemesis lies in its refusal to adopt open-source technology; that iOS represents, in a virtual form, the Kremlin that Apple’s Cupertino HQ has become. It’s true that Apple has a curiously totalitarian set-up for a company founded by an old hippy: the iPhone’s designers had to sign confidentiality agreements, and then legal affidavits attesting that they’d signed; they required no fewer than four electronic passes to get in and out of their slinkily minimalist atelier; Jobs himself was given to delivering spittle-flecked rants worthy of a KGB interrogator—and according to Vogelstein, the Apple engineers present for his celebrated 2007 launch of the iPhone drank themselves into near-oblivion as they looked on, so terrified were they of his reaction if something went wrong. Others aver that Apple’s whole problem lies in its success in the decade since the iPod suckered itself to the ears of millions—that having become the new normal, rather than a style-victim’s favourite accessory, the brand has lost its proverbial soul.
True, in Kahney’s book we learn a factoid that, were it not true, it would be necessary to invent: a major problem with the first iPhone prototype was that the designer stubble of its designers caught in the crack between the glass and the bezel. But no matter how hairy Ive and his team were, they remain very smooth operators when it comes to anticipating consumer demand. And make no bones about it: while Jobs may have wrapped his own acumen up in gauzy veils, this is all the naked truth amounted to. No, Apple’s great success lies in my Classic epiphany of all those years ago: that the technologies they have helped to create shift the locus of creative production away from the specialists and towards the amateurs. Bidirectional digital media make everyone their own photographer, broadcaster, cinematographer, chanteuse, matchmaker and funeral director. From cradle to grave the only interface someone now requires with the wider world is formed by the toughened glass of an iThis or an iThat; surely this is what Jony Ive was struggling towards when he talked about “feeling” his products in “a perceptual sense”: the virtual world is becoming our phenomenal world, while Ive has endowed the physical objects that make this transubstantiation possible with suitably seraphic forms.