Rory

June 3, 2021

The iPad Pro: a review, of sorts

The iPad Pro excites me more than any new piece of technology since the…

Well. That’s a bit tricky to answer. I can’t entirely remember being this excited about a new device.

I’ve been excited about the iPad Pro since it was first announced in 2015. But I haven’t owned an iPad Pro until just this week. I’m pretty slow to acquire new gizmos: my love of technology—of digital possibility, really—goes hand-in-hand with my deep suspicion of consumption, or of the belief that our problems in life can be solved by slightly nicer things. Before this year, I went about half a decade without grabbing a new computer or phone; every time I do upgrade, I try to make the New Thing last longer than the Old Thing did. The dream would be to keep this new batch of things through 2030, especially since certain things, like typing text onto a screen, don’t seem to require much more processing power year-over-year. We shall see.

The point I’m trying to make is, I try to care less about features than about philosophy. I want technology in my life that’s put careful thought into why and how it’s going to fit into my days; if that consideration is there, I’m not too concerned with specifics. And no product of Apple’s feels more like an ideological culmination than the iPad—which is saying something, since Apple is the most ideologically driven computer manufacturer the world has ever seen. The initial iPad Pro excited me, and nearly every year since has seen it make a significant step forward. Now that I have my hands on one, it feels almost surreal: I’ve had moments where I’m overwhelmed, not by what this iPad can do, but by how and why it’s able to do it.


Some disclaimers and warnings.

This is not going to be a product review. I won’t pretend to take a neutral, objective position here. Truth be told, my many criticisms of Apple don’t feel particularly interesting: there’s a lot to say there, and plenty of people are already saying it. Similarly, there are plenty of critiques to be made of the iPad, especially where its operating system is concerned; some of those will be addressed at WWDC in a week, and many certainly won’t be, but that’s unsurprising. The iPad, more than any of Apple’s other products, is an unfinished machine. In a sense, that can’t be helped: when I call it an ideological culmination, I mean in part that the iPad is the endpoint of all of Apple’s ideas about the intersection of the analog and digital worlds, and was designed to be open-ended enough that it could receive all of Apple’s innovations along the way. The simplest, best, and most-beloved Apple products are its specialized gadgets: they have a specific form factor and a specific use case, and they fulfill those purposes extraordinarily well. The iPad is the polar opposite of that in every way: it is Apple’s most generic product, a device without a specific use case. That can make it a frustrating device for people who want a means to a given end. But in this case, the “end” is our relationship to technology, and the means is still in progress.

What I’m interested in is that end, and these means. And to explore that interest, I need to assume that Apple has a philosophy about technology and digital experience, and that Apple’s product map reflects this philosophy on a grand scale, and that this philosophy is interesting enough to be discussed as if it was, well, philosophy. I’ll also need to assume that Steve Jobs was a smart enough and creative enough man that we can treat him like a meaningful artist and not just a charismatic businessman, and that Apple’s other executives generally understand and buy into his vision, enough that they’ve managed not to deviate from it in the decade since his death.

If any of this is a stretch for you, I’ll completely understand: it is reasonable to be cynical of (and disgusted by) any major corporation, and of marketing in general, which is something Apple is uniquely good at. Jobs himself was a charismatic figure, and the obsession with him and his company does famously take on a cultish form. And if you’re particularly weary of straight white men doing shitty things and getting away with it, well, Jobs gives you plenty of ammo, given how he treated his child, his co-founder, his employees, and random journalists who drew his ire. Silicon Valley is a self-obsessed and fatuous culture, convinced that it’s smarter than it is, high on its notion of itself as revolutionary and groundbreaking. And arguably more of that attitude came from Apple than from any other company. I happen to think that Apple earns that attitude more than most of its peers, but I’m still going to be saying a lot of earnest and grandiose things about a company that likes telling you to spend lots of money on things it mainly builds in Chinese sweatshops. This is a compromise I am willing to make, and I think that there are very interesting things afoot here, but if this sort of thing is completely obnoxious to you then I recommend you not hate-read this on the off-chance that it leads to something which is Not That. This, I regret to say, is entirely That.

I will further attempt to stave you off with the following warning: the next two chunks of this essay will consist of neurotic personal ramblings and obsessive amateur Apple historian-ings, before even remotely touching upon anything resembling an explanation of what I like about this machine.


I hate general-purpose computers.

Nothing makes me more uncomfortable than computers, and computer software, without a specific purpose. I have a deep hatred, for instance, of web browsers: I have tried endlessly to remove them from all of my devices, which is tricky because I happen to like the Internet a great deal. For a long time, I used an app called Fluid to generate site-specific browsers on my computer, in the hopes that I could remove Safari from my machine altogether. What really ruined that attempt was my need to occasionally use a search engine, and search engines are more-or-less web browsers within our web browser. Which brings me to a second point: I hate search engines, too. I also hate all content-recommendation algorithms, which to my mind are just forms of non-consensual web browsing.

It would not be a stretch to say that I kind of hate computers, period. Not all forms of digital technology—just the ones that call themselves computers. Desktops and laptops most of all, with phones coming in close second. I never had a problem with my old iPod. I rarely resent my Apple Watch. If there’s another device that got me as excited as this iPad Pro has, it’s my Nintendo Switch, which does a marvelous job of letting me do a single, somewhat-unhealthy thing. And I should be clear that I love many, many things on my computers, and on the Internet too for that matter. I just, hypothetically, would like a knob I could turn to those particular things, and none of the other things, and use my computer as if it was a Swiss army knife with exactly 23 functions on it, and no more, ever.

This is kind of how I use my phone, in fact, which I do remove Safari from. I connect to the web through apps exclusively, I use a single-window browser for the occasional DuckDuckGo search, and I’m pretty restrictive in what apps I allow on my device. The result doesn’t leave me entirely happy, but it comes increasingly close. And I continue to try and find new ways of making my phone do even less; my big bugbear these days is messaging apps, which feel like they give me important connections in deeply irritating ways.

It’s much harder to make, say, my laptop feel pleasant. In part, that’s because laptops are far more physically engaging devices: if I’m sitting at a desk with my hands connected to a keyboard, I feel like I’m committing myself to using that device, somewhat. For a couple of years out of college, after my laptop screen broke, I bought a cheap projector and committed to using my machine only when it was projected onto a big screen; if what I was doing was so important, I reasoned, I ought to dedicate an entire room to it, and make it a truly absorbing experience. Turns out that was too limiting, which is exactly the trick of it: in my line of work, “too enabling” is permitted, but “too limiting” simply is not. And working out that balance is tricky, because permissiveness will always be easier than strictness. My next plan involves moving away from laptops towards desktops, to at least create a rooted destination at which I use my too-absorbing machine… but that makes it tricky to do certain kinds of work when I travel, and travel has always been one of my go-to ways of keeping myself from falling into tech-induced stupors. You can tell that this is coming close to making a meaningful segue into iPad discussion, but we’re going to veer away again, because I feel it’s not quite time.

Because it’s not just the form factor of laptops and desktops that proves so inconvenient. It’s also the software itself. Computer OSes are more-or-less distraction machines: they’re practically designed to create ways to disrupt your focus. Even before notification systems became baked-in system features—back in my day, you had to install those separately and third-party!—the entire window-driven interface concept means you get apps popping up beneath apps, their edges peeking out, until you open enough of them that certain programs, certain intended tasks, are buried altogether. You get a task tray, reminding you of all the things you’ve successfully distracted yourself from, all the things you probably should get back to one of these days. You wind up with a real net of digital functions, each strand distracting you from all the others, pulling and pulling away at you until your attention frays and you find yourself not-quite-focused on any one thing.

I suspect that it tugs at the same part of you that leads you to stare into your refrigerator, door open, wondering what brought you to the fridge in the first place—trying to work out if you came here to get one particular thing or if you’re just staring into your fridge for the sake of staring, hoping that one thing will catch your eye and give your saunter-over meaning. Computers are addictive: we mostly all know this. There are theories of exactly what addicts us the most. Does it have something to do with dopamine? Are we responding specifically to notifications, social media and text messages each yelling at us to engage with them, leaving us primed and waiting for the next opportunity for engagement? I’m not convinced that the buck stops there. It’s the act of using these devices that addicts us, even before we use particularly engaging or addictive apps. It’s the juggle of it all: staring at your screen and knowing you could do a dozen different things with it, flicking between those things trying to see if any of them will reward you for choosing to do them. At least phones offer the possibility of distracting yourself while you’re on the go. More computer-y computers would rather distract you while they keep you stuck in place, rooted to this thing that’s too inconvenient to move with you, yet too convenient to move yourself away from. Digital possibility becomes digital ennui. I’m reminded of the passage in 2666 where Roberto Bolaño describes a character who

…approached the nearest window, where he stood looking out at what was beyond the curtains of frenzied rain, until the lady called to him, peremptorily, and the Swabian turned his back on the window, not knowing why he had gone to it, not knowing what he hoped to see, and just at that moment, when there was no one at the window anymore and only a little lamp of colored glass at the back of the room flickering, it appeared.


Two keynotes.

In a sense, the iPad is a general-purpose computer—exactly the kind that I despise. It is, of course, not Apple’s first general-purpose computer, since Apple got its start introducing the first mass-market personal computer to the world. The original Macintosh introduced the idea of “what a computer is” that we still use to this day: screen, mouse, keyboard. When I complain about windows piling up, I’m complaining about what the Macintosh introduced to the world in 1984.

One of the strange things about the iPad is that it doesn’t really have a market of its own. Apple didn’t stop making personal computers; in fact, as of 2021, its Mac lines use identical chips to its iPads. After its initial success, Apple famously ousted Steve Jobs, then floundered in the 90s; it regained its mojo only after buying out Jobs’ new company, NeXT, and replacing its operating system with the one Jobs had been busy making over there.

But the new Apple that Jobs birthed in the early 00s wasn’t merely a software company. What Jobs revived was Apple’s interest in hardware. And if there’s a name that Apple became synonymous with, it wasn’t Jobs’: it was that of Jonathan Ive, who was credited with every new Apple product’s hardware through Jobs’ death, and who then became more powerful still, taking over Apple’s software department too. From 2013 on, Apple’s software design was under Ive’s thumb—and it was increasingly designed to reflect its hardware, rather than the prior interface philosophy, in which software was an experience unto itself, and hardware was simply the means by which we accessed it.

Apple’s first iconic product, upon Jobs’ return, was the iMac G3—also credited as one of Jony Ive’s first major hardware projects. Like the original 1984 Macintosh, the iMac aggressively emphasized its integrated design: it wasn’t just a computer, it was a product—a consumable good. Apple gets derided for its interest in fashion (it’s often criticized for being only a fashion company), but where its computers are concerned, “fashion” is to some extent a surface expression of Apple’s more humanistic intentions: the how and why of its machines. Where will it sit in your living room? What will it look like? But more: what will your experience of using it be like?

A few years after the iMac came OS X, the operating system that continues to define what computer aesthetics ought to be like: early Google was obsessed with Apple’s design choices, early Facebook sought to match Apple’s exacting standards, and Windows conspicuously moved towards airier and more futuristic-seeming interfaces, after a decade of aggressive beige. The iMac’s accessories, meanwhile, sought to extend Apple’s design philosophy even further: the Puck Mouse was an amateurish and not-always-convenient design compared to what Apple replaced it with, but it indicates that Apple was thinking of its mice as more than simply means to an end: every piece of its hardware served as an analog extension of its more-than-merely-digital experiences.

Even more important than the iMac, though, was what came after it. The iPod, released in 2001, was Apple’s first genuine foray into the production of “consumer goods”. It succeeded, not because of some brilliant chip design, but because of Apple’s understanding that its hardware and software needed to be designed around each other, each united to serve a single purpose. The music-player software was what posed the challenge—how do you make it easy to browse an immense library of music, and how do you make that music easy to control?—but it was in the hardware that Apple found its solution. The clickwheel was uniquely well-designed for the task of browsing file hierarchies; it would’ve been useless for placing phone calls or writing documents, or for selecting icons on a screen, but music players didn’t need any of that. Without the iPod’s hardware, its software would have been impossible to envision.

All of which led to the keynote that revealed the future.

Steve Jobs’ unveiling of the original iPhone is likely the best keynote in tech history. Even if you’re not much for tech demos, it’s magnetic. Jobs was a great writer and a great performer, and on both fronts he’s working at his peak: the misleading hints before the actual announcement, and then the teases before the actual device’s reveal, are the work of a man who knows, not just what he has to offer, but how low his audience’s expectations are. Listen to the sounds his audience makes. You can’t write that off as the enthusiasm of a handful of geeks. These people are sitting in the room where the future of technology is being announced—and in that moment, they realize it.

What was the iPhone? What was the “big reveal” that changed the shape of our society? It wasn’t just that a computer could suddenly fit in your pocket—Palm Pilots already existed. It wasn’t just that a touch screen was possible, or even that multitouch was: those had been around for years before the iPhone. No: what mattered about the iPhone wasn’t the computer, but the philosophy of the computer. It was Apple throwing away the keyboard. It was Apple devoting the bulk of its phone’s size to its battery, rather than its processor. It was the removal of what we’d come to think of as a computer, thanks to Apple. The Macintosh had had a keyboard; so had every cell phone until the iPhone. The Macintosh had had a mouse; every touchscreen device before the iPhone used a stylus, while so-called “smartphones” generally used clickers that were even less sophisticated than the iPod’s clickwheel. Jobs highlights this in his keynote: keyboards are static, but the iPhone changes. Mice and styluses add a layer of abstraction to the ways we interact with our screens; the iPhone lets us use a finger.

How do you scroll your screen? Just touch it with a finger. Gasps, laughter, and applause.

How do you zoom in on a photo? You pinch on it. Gasps. Laughter. Applause.

They weren’t applauding the technical breakthrough. They weren’t applauding the specifics of the user interface: the blue gradient, the skeuomorphic stylings of the various apps. Their applause was for the philosophy, which was as simple as: touch it. But that simplicity is deceptive. The man who’d popularized the mouse and keyboard certainly knew that.

The story of Apple post-iPhone is well-known. The story of the Mac post-iPhone is an interesting one of new developments getting bounced back and forth between iOS devices and Mac devices: something will debut on one, then bounce over to the other. Generally speaking, if Apple debuts a new feature on one of its product lines, its other products will get that feature within a few years. Every year, people have speculated on whether or not Apple will eventually replace its Macs with iOS devices, or at least add touchscreens to them; every year, Apple makes it clear that it has no intentions to do so. But these stories aren’t as interesting as the one about what happened to the iPod, and where its design philosophy eventually led Apple, and what Apple makes in between making computers.

And before we get into that story, we should talk about another Apple keynote, and Steve Jobs’ final hardware reveal.

Which is to say: we should talk about the iPad.

The iPhone introduction I linked above has 30,000,000 views, as of this writing. The most-viewed video of the iPad introduction has fewer than 800,000. There was feverish anticipation before the iPad announcement; the iPhone’s staggering, runaway success meant that all eyes were on Apple. What would its tablet computer look like? Was Apple about to unveil its vision for the future of computing?

It was. And people were underwhelmed. “It’s just a big iPhone!” went the popular refrain. Because, well, yeah. It was.

Skip around the full-length video of the iPad introduction, and you’ll get the gist of it. Steve Jobs spends almost all of it sitting in an armchair. When other executives come on stage, they sit in armchairs too. The apps they’re demoing are fundamentally similar to the apps they’d demoed on the original iPhone; the screen is larger and the apps show more data, but there’s nothing spectacularly groundbreaking on display. The armchair was, by and large, the star of the show.

We had already seen Apple’s vision for what computers ought to do. They’d given us that in 1984. And we’d already seen Apple’s vision for how computers ought to feel. They’d given us that in 2007. All the iPad did was stake a claim to a little plot directly at the intersection of the two. People who were looking for a third big thing were left disgruntled.

As with the iPhone, what was truly groundbreaking about the iPad wasn’t what was there—it was what wasn’t there. People had seen that already in the iPhone. The iPad’s only claim to fame was that, well, it was as big as an actual computer. It had more room to play with. And that didn’t mean a lot in 2010, when all it offered was the potential for that play. That potential, a decade later, is still not fully realized. But it lies, fittingly, where people weren’t looking. Specifically, and strangely, it lies in a product that in 2010 was already on its deathbed: a former Apple hit that had been killed, of all things, by the iPhone. The future of the iPad was hidden in the iPod, in a way that wouldn’t be fully revealed for years to come.


The iPod’s successors.

It’s September 2014. Steve Jobs has been dead for three years. A year ago, Apple unveiled its first major new product since his death: iOS 7, the controversial redesign pioneered by Jony Ive, in his new role as Apple’s primary software designer. Ive is now arguably the most powerful man at Apple; Scott Forstall, one of Jobs’ most beloved lieutenants, was let go to clear the way for Ive’s software ascendancy—the first major act by Tim Cook as CEO. It is perhaps symbolic that iOS 7 murdered many of Jobs’ most idiosyncratic design loves: his fondness for textures, like leather on iOS’s calendar and linen on iOS’s everything-else, is finally dead. Even the iconic slide-to-unlock slider has been removed.

Not only is Ive in control of Apple’s flagship products in a way nobody else has ever been before, but he is overseeing the first launch of a major new hardware line since the iPad—so really, in a sense, since the iPhone. Everybody knows it’s a watch. Everybody’s speculating. Excitingly, Marc Newson, a legendary industrial designer and founder of Swiss watch company Ikepod, has just announced he’ll be joining Apple, suggesting he’s been working quietly for Apple for a little while.

The curse of designing the iPhone is that everybody keeps expecting another iPhone. The iPad was fiercely anticipated, but was by no means as huge a breakthrough. Now, Apple’s announcing its entry into another much-detested consumer market: smartwatches exist, and everybody hates ‘em. The big question—really the only question—is: will Apple have what it takes to make that market viable? And will it be such a big deal that we’ll start using our watches for every purpose imaginable, the way we currently use our phones?

When Steve Jobs unveiled the iPhone, he mentioned that Apple had introduced two historic user interface mechanisms—the mouse and the clickwheel—and was about to introduce a third. Tim Cook tried to piggyback off that in his Apple Watch unveiling, claiming that the Watch’s Digital Crown was an innovation comparable to the other three. It’s easy to say that it wasn’t—but in retrospect, it’s not fair to say that the first three were equals themselves. The mouse and the touch screen were vastly more significant innovations than the clickwheel was; the Digital Crown may in fact be as significant as the clickwheel, historical importance aside. What made the clickwheel a big deal was that it was Apple’s demonstration that unique software challenges required unique hardware, and that it was the intersection of hardware and software that led to radically improved user experiences.

The clickwheel served as a harbinger of the era in which Apple became the undisputed master of digital product design; the Digital Crown arrived when that era was in full swing. But it similarly serves as a demonstration of what makes Apple such a peerless design firm: its awareness of what a product is, and, equally importantly, what it is not. The Apple Watch, much like the iPod before it, is a product defined by its limitations: it’s not a general-use computer and it doesn’t try to be, its full-screen apps are less important than the single-glance complications they offer watch faces, and its most important uses and technologies revolve around its many sensors, which are the most-often-updated features of the watch year-over-year. What the Digital Crown offers is the one thing the Apple Watch’s interface mechanically needed: precision, specifically the precise manipulation of data in an extremely limited space. It doesn’t do anything more than that, because the Apple Watch doesn’t need anything more.

In a sense, the iPhone and iPad are anomalies: they are general-purpose devices from a company whose product line increasingly consists of more focused tools. This is nothing new, but it used to be that the bulk of Apple’s more-specialized products were either necessary accessories to its general-purpose machines, like its keyboards and mice, or software products, like its high-end apps for music and film editing. (It is unusual, and not often commented on, that Apple’s Final Cut Pro competes directly with Adobe Premiere, and that its Logic Pro is neck-and-neck with Avid’s Pro Tools; before it was discontinued, Apple also had Aperture, a professional photo editor that competed with Adobe’s Lightroom. What other hardware companies create high-end software? Microsoft Office is one of the few examples that comes to mind, but Microsoft has always been more of a software company than a hardware company. Nintendo may be the other big example.)

Since the Apple Watch—so, for the better part of a decade now—Apple’s new product lines have all been focused on either specialized hardware or service platforms. On the one hand, you have AirPods, HomePods, and AirTags; on the other, you have Apple Music, TV+, and Fitness+. Straddling the two, you get the Apple Card, a physical product attached to a unique service, and the Apple TV hardware, which, though it runs a variant of iOS, is defined more than anything by its hardware remote (and, until recently, by how much that remote was loathed). In many of these cases, its products on either end are defined by their unique integrations of hardware and software: Apple Card’s anal-retentive titanium design paired with its unusual software interface, or Fitness+’s connection with the Apple Watch, or the Apple-exclusive way that AirPods can connect with various other Apple devices. This is true even of Apple Music and TV+, which are the closest Apple comes to making software-only experiences: even there, TV+ was specifically designed to dovetail with the Apple TV’s UI, and Apple Music is built into every HomePod, and used to be a special selling point of the Apple Watch’s cellular models. (No cloud music service, no wristwatch that can play your music from anywhere—in a sense, Apple Watch and Apple Music form the two halves of a cloud-centric iPod.)

But the two most radical new Apple products since the Apple Watch were not designed to be used standalone. Each exists to augment one product line and one alone; neither is an essential part of that product line, the way a Mac always needed a mouse, but each shapes and lends purpose to that product in a way that it simply couldn’t have otherwise. And that’s by design: looking back, it’s as if the main product was intentionally stripped down, and kept, not featureless, but open: ready to be modified in other ways, by other products, as the need arose.

Steve Jobs bragged about removing the keyboard from the iPhone, to permit it the purest possible digital fluidity. It’s ironic, then, that the most interesting additions to the iPad were hardware additions: not changes to the initial blank canvas, but new pieces of hardware that modify the software, and change the meaning of that empty screen. Since Jony Ive became Apple’s chief software designer in 2013, Apple’s software has increasingly become a receptacle for its hardware, its on-screen UI reflecting the capacities of the device it exists to serve. The descendant of the iPod and its clickwheel wasn’t just the Apple Watch, its Digital Crown, or its sensor-and-complications-driven app design. It’s the iPad, which augments the iPhone’s digital fluidity with a newfound analog fluidity, its screen both a display and a receptacle, the singular reflective surface of an ecosystem which, though digital at its core, is increasingly defined by interchangeable physical devices.

How else do you explain the Magic Keyboard? With the Apple Pencil, Apple finally let the iPad act out the formerly conventional idea of a tablet computer—one manipulated by stylus, primarily for purposes of drawing and writing. The Magic Keyboard is far odder, conceptually speaking: it’s a device that lets the iPad pretend to be a computer.


If you have read this far, congratulations: we will now talk about the iPad.

Apple is a company known for its linguistic marketing quirks: the incorrectness of its “Think Different” slogan, for instance, or the way it drops “the” before describing its products, which leads to sentences like “Here is what to expect when you turn iPhone on.” And one of its oddest and most consistent quirks is this: it refuses to call the iPad a computer. It avoided the word “computer” on the iPad’s original launch, and continues to steer away from that word to this day. A quick search through Apple’s current iPad marketing materials turns that word up only once: the iPad Air is said to do more than a computer, implying that it exists as a separate kind of product altogether.

You can reach two conclusions from this information. First, you can decide that it’s just Apple being pretentious, the same way that Pepsi’s new logo was accompanied by an absolutely insane branding document. That wouldn’t at all be unreasonable! But you can also say that Apple’s up to something intentional and particular: they’re defining the “computer”, essentially, as anything which resembles the original Macintosh. They refer to their MacBooks and iMacs as computers, loudly and often; it isn’t just that they insist on referring to each of their products by brand name alone.

MacBooks and iMacs are variations on the original personal computer; competitors’ products are computers too. But the iPad is a different kind of device. It doesn’t privilege or prioritize the keyboard or the mouse—and plenty of criticisms of the iPad over the years have revolved around its being worse at traditional, keyboard-and-cursor-centric computer applications. You can do these things, with fingertaps and on-screen keyboards, but these are simulations of a computer-like experience, in the same way that GarageBand’s on-screen piano keyboard is a simulation of a keyboard. An iPad is a computer in the same way that an iPad is a phone, or an iPad is a TV. And you can fairly say that this is because our TVs and phones have become computers too, in the same way that an iPhone’s phone capabilities are among the least interesting of its many functions. But many people, especially computer developers, would say that the iPad is limited as a computer, not only because of its interface, but because of the many ways that it hides its backend, puts walls around its applications, and prevents its users from making modifications to it, in a way that most computers don’t.

If I could program on an iPad, I would strongly consider giving my laptop up—that feature, more than nearly anything else, is what keeps me using my Mac. I’m holding out hope that this is the year that changes, but, unlike many developers, I don’t feel particularly surprised that it hasn’t happened yet. The Magic Keyboard would certainly make programming feel easier and more intuitive, to the point that there would no longer seem to be any interface obstacle to the act of programming. But nothing about the iPad privileges the Magic Keyboard over, say, the Apple Pencil—and neither of those is privileged over the primary interface, which requires neither keyboard nor pencil to operate.

The iPad is as much a computer as it is a piece of paper; technically speaking, it’s far closer to one than to the other, but “technically” is not how Apple defines its products. Is your average iPad user likelier to want to draw or program? Or are they likelier to, say, watch TV, or read books, or make video calls? You can define “computer” as the thing which, functionally speaking, makes this plethora of options possible—or you can define “computer” as a specific type of device, and decide that using it to describe everything that’s technically a computer dilutes the word as much as, say, using the word “sandwich” to describe a burrito.

Every so often, when a friend and I are hanging out and decide to watch something, we’ll ask each other: “Do you want to watch it on the TV, or do you want to keep it on the computer?” The TV, technically, is a computer, but the word “TV” refers to its specific form factor. Similarly, “computer” refers to the form factor of the thing we can keep on our laps, or curl up with in bed. Is the iPad a computer? Form-factor-wise, it can be: it looks a lot like one as I’m typing this, for instance. But when I pluck it off its keyboard and rest it vertically in my lap, writing on it by hand, am I thinking of it as a computer? Does it feel like a computer, when I write on it so? I’m not saying that I think of it, as Apple marketing would like me to, as an iPad first and foremost. It’s more that I don’t think of it at all: it’s just a vessel for whatever’s happening through it, which sometimes feels like a computer-y thing and sometimes feels more book-ish and sometimes feels like a video game console.

Moreover, in a world in which devices are increasingly interconnected, the question of where something is happening feels oddly beside the point. If I play a song on my iPad and transfer it over to a HomePod, is the iPad the computer or is the HomePod? Does it matter if I start it on the iPad and stream it to my HomePod, or is that unimportant when I could use my HomePod without my iPad attached? Is the computer the physical device, or is it the field of possibilities across devices?

Perversely, the more fluid digital experiences become, the more they divorce themselves from any one device—and the more each device becomes defined solely by its physical properties. Your phone is the one in your pocket. Your TV is the one on the wall. Your watch is on your wrist. These devices may all do the same things, often so seamlessly that you don’t notice one switching over to the other. How consciously do you think about which device to watch a given TV show on, versus simply thinking of the show and picking the device that most suits the occasion? Computers are no longer inseparably entwined with the digital world they represent: they are merely the interfaces, and those interfaces are defined as much by hardware parameters as by software representations. The paradox is: the more truly digital our digital things become, the more our devices become, in essence, analog.

Within this world, the iPad remains perhaps the most digital of all these analog devices. It is the most elusive, the one which least immediately serves a singular purpose. It’s the Platonic ideal of a screen: large enough to serve as a meaningful focus, small enough to be moved about. So it’s fitting, maybe, that even its analog form is ambiguous, defined by its interplay with other devices. Those devices, too, straddle the analog and digital worlds in a curious manner: the Apple Pencil is less a pencil than the Platonic ideal of a pencil, a symbolic representative of a utensil that you’d hold in your hand. Its purpose is to symbolize: it is in a very literal sense iconic. On the Macintosh, the mouse was the device and the arrow was the icon; on the iPad, the icon is not what appears on the screen but what rests in the palm of your hand.

Similarly, the Magic Keyboard obviously isn’t a computer in and of itself. Instead, it’s the symbolic trappings of a computer—literally, a keyboard and “mouse” to accompany the iPad’s screen. Use them together, and you have something approximating a laptop. But it’s an approximation, and one you can remove the iPad from at any time. That approximation has its own surrealities: in particular, its weight distribution is still a little disconcerting to me. It’s top-heavy, rather than bottom-heavy the way most laptops are, and its weight is concentrated almost entirely at the rear of the device. I have a bad habit of balancing my machines on top of things, particularly in the kitchen; with laptops, that placement feels pretty intuitive, since weight is distributed evenly across the device. With my iPad, “even” usually means the keyboard jutting way out over the edge of something, since the keyboard is relatively weightless. Perched on top of something, the iPad looks like it ought to fall over at any second; the reason it doesn’t has something to do with the fact that its computer “trappings” are not actually what make up the computer.

As the story goes, the Palm development team scoffed at rumors of a touch-screen iPhone because they couldn’t imagine a pocket computer with a battery powerful enough to sustain itself for any meaningful amount of time. When the iPhone was announced and released, they disassembled one, and were nonplussed to realize that Apple’s solution had been to shrink the computer chips way down, leaving as much room for battery as possible. The iPhone was not a computer with a battery attached, to paraphrase them: it was a battery with a computer attached.

The iPad is similarly battery-dense: most of its added space is used to power its larger screen. With the advent of the M1 chips, Apple’s miniaturized computers now dominate its entire computer line-up; already, we’ve seen that result in a vastly shrunk-down iMac, one whose “chin” holds the “real” computer in its entirety. The “computer” part of Apple’s computers is increasingly small. And in terms of form factor, that allows Apple to make devices in any shape and size it wants. At what point, then, do we start using other terms for these things, and stop calling them primarily computers? When is my Apple Watch just a watch, and my iPhone just a phone, and my iPad… well… just another kind of interface, albeit occasionally an interface with a computer attached?


Something approximating a review.

By far the least satisfying way to use an iPad is as a laptop.

That’s not because it’s not a good laptop. On the contrary: it’s a brilliant one. The Magic Keyboard feels fantastic to type on, and the trackpad, though small, is even better: iPadOS’s cursor is a delight to use, and the various multitouch commands only go to show how much more refined an operating system it is than macOS. While macOS’s multitouch is probably a bit more powerful overall, that’s because it needs to be, in order to handle its innate complexity. iPadOS is vastly more elegant, so its multitouch commands feel almost excessive in what they let you do.

But the laptop form factor comes with the same old problems: it divorces you from the device you’re using. Do you want to be using a keyboard, with the screen merely reflecting what you type? Fine, do that. Do you want your primary interface to be a trackpad? I largely can’t imagine why; no use case for that immediately springs to mind. (I will say that gaming on this iPad is weirdly enjoyable, in that the cursor functions almost like a finger, but not quite. Your finger immediately registers as a click, while your cursor pauses, allowing you to hover and target and manipulate with more precision. This is hindered, of course, by the fact that nobody in their right mind designs gaming controls around an iPad finger-plus-trackpad combo. But it’s nice.)

When my iPad is docked, I find myself using it more listlessly, less purposefully. It feels like a detached abstraction. The moment I remove it from the Magic Keyboard, something about it feels vastly more engrossing. It becomes analog. Used as a full-time extension, the laptop add-on holds it back.

That said, I love that the Magic Keyboard comes with its own plug-in charger option. It turns the entire contraption into a sort of MagSafe connector: you can keep the keyboard plugged in, remove the iPad and use it however you’d like, then merely return it to its base to charge it. The keyboard’s portability almost feels like a plus when you use it like that, rather than the obvious default. As a charging base set up on a desk, it works delightfully well.

The Apple Pencil is still the far more intriguing device. Drawing apps aside, what fascinates me most is the way it makes that digital/analog divide clear by playing with it from both sides: you can use it to write in any textbox, and your handwriting is immediately digitized, meaning you can now text your friends as if you were writing them letters with a feather quill. (Side note: do feather attachments for this exist?) Alternatively, you can write freehand in Apple’s Notes app, and your handwriting will be recognized and indexed as text, free for you to search and copy whenever you see fit. There is something extremely enjoyable about highlighting your handwriting the way you’d highlight any typed text, and seeing your handwriting outlined letter-by-letter in highlight yellow. There is also something enjoyable about getting to doodle little tornadoes in the margins when you get bored.

I find myself using the Pencil to send texts and post online an awful lot. There’s just something different about writing with an actual utensil. It feels more intimate and more considered. Given the tradeoff between analog’s richness and digital’s convenience, I go digital every time. But a device that doesn’t make me choose… that’s something special.

Much like the iPad itself, this system isn’t what I’d call “pragmatic”. It has a lot of very neat features—crossing out words to delete them, circling them to select them—but the errors it makes transcribing handwriting are simply too numerous for this to feel like a revolutionary new system. Worse than the errors themselves is how frustrating they are to correct. There aren’t easy ways to modify single characters, or to insert punctuation if the system doesn’t immediately recognize it. At that point, you have to resort to the on-screen keyboard, which presents its own hassles (its keyboard-as-trackpad mechanism is inferior to the pressure-based on-screen trackpads of pre-iPhone X devices), and which further keeps the Pencil from working properly: once you type on it, the system assumes you’re done with handwriting, and your first screen-touch with the Pencil is generally taken as an attempt to scroll or manipulate the page. This holds a lot of promise as a system, but it’s very much a first-year Apple product.

What I think matters, though, is how much this doesn’t matter. All this is true, yet I still find myself using it more often than not. The promise of a digital-compatible analog device—gross phrase, lovely sentiment—is too great for me to resist.

As a product unto itself, the iPad can be defined pretty simply: it’s light and sound and speed, and very little else.

Light: The iPad’s screen was already sophisticated in a way that few screens are, and continues to add new joys with every year. What appeals to me most are the features that have been present for at least a few years now: ProMotion, which means the screen refreshes at 120 hertz, making the littlest movements feel subtle, fluid, and smooth, and TrueTone, which shifts the light of your screen to match the ambient light in whatever room you’re in. Neither is a feature you’d write home about, but both add a tremendous amount of pleasure in use, in their quiet ways. Screens feel far more natural when they reflect the light around them, and the added smoothness helps with the illusion that everything you’re doing is somehow analog, even though you’re just manipulating a series of tiny, preprogrammed dots.

The latest addition, the iPad’s shift to MiniLED, means that its screen is lit in tiny clusters rather than all at once; in practice, this gives you darker blacks, to the point of the screen’s not appearing lit at all, and the potential for much, much brighter brights.

The darker blacks are satisfying, especially given the tendency of other Apple screens to “flicker” during dark scenes: a regular LED backlight lights the whole screen at once, so during dark scenes it hunts for a median brightness, and as that brightness shifts, the entire screen fluctuates with it. The new system prevents that fluctuation, which is a relief. That said, it presents a new, weird phenomenon: in the dark, whatever is lit often has an ambient glow around it, since the MiniLEDs are not precise enough to light the screen on a pixel-by-pixel basis. This sometimes gives you glowy “chunks” here and there, surrounded by pools of dark black. I’ll take it over what existed before, but it’s hard not to notice, and I’m sure it will one day feel quite dated.

Most important to me is this: the iPad is bright enough to be read in broad daylight. As a book reader, as a computer, it can function outdoors at any time of day without inconvenience. In terms of when and how I can practically rely on it as my main machine, that means a lot.

Sound: First of all, this thing is loud. Almost surreally so, given the right circumstances. In vaster rooms, it does get a bit swallowed up, but in more modest spaces these speakers can envelop you weirdly well. In use, I rarely find myself going past half-volume; past about three-quarters or so, I genuinely start worrying about my neighbors.

Apple constantly claims that iPad’s speaker system can approximate surround sound without actual surround speakers; strangely, this is pretty much true. It’s a little eerie how effectively it can project sound to the left or right of you, and even behind you somewhat. I’m not sure what acoustic magic that is, but it’s neat. Nothing you’d confuse for actual surround sound, but enough to pluck the same pleasure centers in your brain.

Speed: Way back when the iPad first came out, conventional wisdom was that it could achieve more with less of a processor. Because it doesn’t allow for the same intensive multitasking as a typical laptop, you get less accumulated lag, even as you run more programs at a time. It was a nice consolation at a time when iPads were considerably weaker than any device with a remotely comparable screen.

Now, however, the iPad Pro runs the same chip as Apple’s cutting-edge Macs—the M1s developed after a decade and a half’s worth of chip-miniaturization practice in prior iOS devices. You can look at the benchmarks if you want to see how freakishly fast these new machines are. In practice, this means, simply, that for the moment there’s no such thing as processor delay. Even running fairly advanced programs—professional photo processing is about as heavy as I get—there never seems to be any hesitation. 

I’m sure that’ll end one day soon, but for the time being, it feels like a dream. It’s like the abstract notion of using a device, the version as dramatized in film, rather than the actual reality of using computers that stutter and frustrate. Everything about it feels idealized, even when it falls short of its ideal.


But why?

What for?

Ultimately, that’s the question, for any general-purpose kind of machine. Sure, it can do anything. But what do you want it to do?

If you want to watch Seinfeld on it, or read spy thrillers with it, you’ll find it’s pretty much unparalleled for the task. But does it make that much of a difference? Not really. I can’t remember the last time I resented the device I watched a sitcom on.

Is it a nice machine for typing on? Goodness, yes. My preferred writing app is Mac-only, and I dearly regret not having access to it on this device.

Is it an astonishing device for artists? Probably. But I’m not enough of an artist to take advantage of it.

Can you browse the web on it? I mean, sure.

Does it play games? Quite well! But it’s no Switch, even if, from a technological and even a user-experience standpoint, it’s superior in every way. It doesn’t have the Switch’s controls, and it doesn’t have the Switch’s games, and nothing else matters nearly that much. For that matter, it doesn’t have the Mac’s games, let alone a Windows machine’s.

Does it offer a variety of well-designed software for specialized purposes? Yes! In addition to the reading apps and writing apps and watching apps and games, I’m a huge fan of Apple Fitness+, another neat case of a product that exists across products, somewhere in between the screen and the watch. I will always cherish GarageBand fiercely. If you have a smart home, this is a good device to program it with. 

But none of these are killer apps. All of them are nice apps, but no single one of them can possibly justify this device above every other device, if that’s what you’re looking for.

If there’s one thing I value about the iPad’s functionality above everything else, it’s the app “tray” it provides, for miniature apps you want to occasionally check up on without letting them dominate. The fact that any app can be placed there, unlike menu apps on the Mac, and that there’s a singular place for apps like that, and that you can shift those apps over to a split-screen interface and pull them back out again when needed, means there’s a consistent logic to the interface that says: some things take up your full attention, and some things linger at the side, for when you need them. That the sidebar mimics the width of an iPhone, meaning that you essentially have an “iPhone overlay” on top of your iPad, is both clever and cute: clever, because it spares developers extra work; cute, because what you’d be checking there is all the stuff you’d otherwise check your phone for.

But that’s not a killer app either. Not really.

The iPad is Apple’s Xanadu—both the one from the poem, and the ambitious Ted Nelson project to make a better hypertext environment than the World Wide Web. Unlike Ted Nelson, Apple has a virtually unlimited budget and a giant, brilliant team with which to execute its vision, but even that hasn’t let it figure out, in ten years, how to create an unparalleled device that renders computers obsolete.

What they choose, instead, is openness. What they choose is striving. The iPad is less practical than a Mac, but it’s also less practical than a Microsoft Surface, which manages to be a full-fledged computer that also doubles as a tablet. At every point, Apple chose to push away from what makes computers computers, until they could find a way to reintroduce the same old things in ways that didn’t crimp that openness. It took them a decade to re-introduce the cursor. It will take longer for the iPad to truly serve as a Mac replacement, if in fact it ever gets there. (And I doubt Apple has plans to replace the Mac any time soon, if ever.)

The reason the iPad succeeds despite not being a “full-fledged computer”, I think, is that, at the end of the day, we really don’t need computers to do all that much. They can do a lot, but—the more we move past the starry-eyed boom days—there’s typically not a lot we need them for.

We use them to keep in touch with each other. The iPad’s good for that. (I didn’t bother mentioning the new wide-angle front-facing camera, which lets you move around the room as you talk without ever disappearing from your video call.) I wouldn’t call its built-in email app anything groundbreaking, but hey—that’s why HEY exists. And iMessage on the iPad Pro is a delight, not least because you can send those lovely little handwriting messages, which write themselves stroke-by-stroke on the recipient’s screen. (It’s very fun to send messages which scribble things, then scribble over those things to reveal something new—little real-time practical jokes that text could never contain.)

We use them to create and modify various kinds of documents. At this point, Apple’s best-in-class where photos are concerned; its office suite is comparable to Microsoft’s in many ways, slimmer but more elegant, but you can also just use Microsoft Office if you’d rather. If you need a machine for office work, it’ll suffice, unless you’re a programmer, in which case you’re still out of luck—but I could see professional designers getting by with this and only this, especially with that devastatingly good screen.

We use them to consume media—and on that front, the iPad’s world class.

Beyond that… it’s up to you, really. In a sense, the iPad—and any computer, but the iPad even more—is a device you have to sculpt. It’s nothing until you choose to make it something. Launch an app, and everything but that app will disappear. It is what you make of it.

What do you need it to be? What do you want it to be? What will you choose to make it?

The temptation is to make it do everything. But is that wise? With possibility comes attachment. Everything you fill your life with leaves less room for other things. Every app, every project, every commitment, takes up space and time. 

How much of that space, how much of that time, do you want to devote to staring at a screen?

I know it’s a dumb and overrepeated question. And I know there are plenty of screen-based things you can find worth in. But the question still stands.

Here, I think, is where it pays to have a device as elegant as the iPad—elegant not only in how it feels, but in how exactingly it lets you plot out your own intended use of it. The iPad encourages you to be conscientious. It encourages you to build a machine for yourself that does what you need it to do, when you need it to do it, and otherwise leaves you the hell alone.

I mentioned noticing a difference between how I use the iPad when it’s in “laptop mode” and how I use it when it’s resting on my lap. I’m suddenly aware of how much easier it is to use it wastefully, ignorantly, distractedly. When I use my laptop, I never notice. I use it like it was designed to be used, and the way it was designed to be used is wasteful. But suddenly I pick up the iPad, and find myself wanting to write friends by hand, and start working out ways to cut down my use of it. I start wondering how I can make it feel less like a computer, and more like one object among many.

At some point, during the writing of this essay, I disabled the built-in web browser: a step I can’t take on my laptop, but insist on taking on my phone. Free-form browsing feels too imprecise. That may seem neurotic—on some level, it probably is—but the way I see it, there are two kinds of freedoms. One is the freedom to do as you please, work as you please, make up your mind in the moment as to what you ought to be doing and how and why. The other is the freedom to know that what you’re doing works a particular way, serves a particular purpose, has a particular duration. On a device like this, it feels like precision demands precision. And the burden it relieves is immense.

Though we don’t often compartmentalize our device usage, there is a spectrum that defines our relationship to a device at any given point. There’s passive use, where the device serves as a facilitator for media (while possibly offering us a chance to fiddle distractedly as that media plays). There’s engaged use, where we’re solely focused on some kind of work, with our device merely facilitating and reflecting that work in progress. Web browsing and social media, predictably, fall somewhere in between, offering us both passive and engaged modes of interaction, all the better to ensnare us with. Then there’s ambient use, where a device facilitates an experience that exists beyond it: music playing, smart home manipulation, and virtual assistants all fall into this territory. I’d argue, too, that there’s a difference between how the “laptop” and the “pencil-and-paper” modes of the iPad work: the latter feels closer to an ambient digital experience, because the engagement in question is so much more physically connected in nature.

Understanding this spectrum, or at least being aware of it, is important if we’re to make sense of the space this kind of technology takes up in our lives. Using a computer is different from watching television—except for when it’s not. A computer used to manage the environment around us is different from a computer that demands our attention. A computer typed on is different from a computer written with.

Fundamentally, the iPad is designed to be worth our focus. Its screen is large enough to dominate your field of vision. The iPhone exists for you to check on the side; the Apple Watch exists to handle tasks that don’t require a screen at all. While the iPad can be used as a remote and a monitor, it principally exists to be that thing you stare at, for however long you need to stare at it, for whatever’s worth staring at.

But it was also designed, from day one, to be something you sat with in an armchair. Something you read, rather than just watched. Something you connected with, in ways you don’t connect with your computer. Something, perhaps, that’s easier to put away once your time with it is done.

The strange thing about my iPad excitement is that it pushes me away from my iPad as much as it draws me towards it. Certainly I use it plenty, especially since it can replace my laptop in myriad ways, but already I find myself pushing away from its most addictive qualities—its keyboard mount, its most distracting uses—and asking myself, soberly, what I should be using it for. I’ll consider it a successful device if and only if I find myself using all of my devices less now that it’s in my life.

On one level, the iPad is a device engineered for a world in which computers are more ubiquitous than ever. On another, it’s engineered for a world in which computing is receding into the background, moving away from the classic keyboard-and-screen combo popularized by the Macintosh. Computing now takes the form of voice assistants, location-aware sensors, and exercise trackers. The next big wave for computing will be some combination of VR and AR: VR being a world projected through a computer, AR being a computer projecting onto the world. And it’s notable that the iPad loudly touts its advanced AR functionality, ahead of Apple announcing a VR headset of their own. (Or is it VR? It will be interesting to see how much of their headset relies on AR rather than VR.) This wave will leave the traditional computer behind from the other direction, but it will leave the computer behind all the same.

So what’s the future of the old-school computer? One option is for it to stay the same, doing what it does for the people who need to do it—the path taken by the Mac, and by Surface-style tablet/computer hybrids. Another is the path that Google’s taking, where the laptop is seen as just another interface to a unified cloud environment that functions essentially identically no matter where you access it from. A third option, the iPad’s option, is that the form factor of the computer will get increasingly ambiguous, increasingly fluid, in a way that allows for the laptop without demanding the laptop.

Different sizes of screen make sense for different purposes—but there’s nothing saying that a screen of a certain size can only serve the one purpose. The iPhone’s ambiguity is more immediately recognizable: it’s more a camera than a phone at this point, and it’s given rise to a generation of mobile-only social media that makes radically different assumptions about what people want out of a social network when they’re on the move. We’re not accustomed to thinking of a lap-sized device being as flexible as a pocket-sized one, but there’s no reason why it couldn’t be, or shouldn’t be for that matter. (Even the iPad’s ability to rotate makes an increasing difference, when the Pencil favors a portrait-mode, paperlike approach and the Magic Keyboard insists upon a landscape-mode orientation.)

That analog-adjacent ambiguity allows for more potential. But that analog nature, along with the iPad’s overall elegance, encourages more considered usage, and the pursuit of less overall. The iPad is, in some ways, more flexible and more open than the Mac. That it’s less flexible and less open as a computer winds up being less important, because the iPad’s purpose is not primarily to be a computer in that sense. It might replace my laptop soon, but if it does, the result will be my doing fewer laptop-y things, and perhaps even using the iPad itself less overall—less, but in more focused ways.

If you don’t have the same aversion to the idea of general-purpose computers that I do—an aversion that I know is partly due to how much I use computers, both professionally and personally—then this might not seem like a big deal to you. You, the hypothetical reader who has stuck with me through this essay so far, may finally be wondering if I’m not splitting hairs just a smidgen. But I do think there’s something important happening here. We’re witnessing the transformation of what we think of as computers—a transformation that Apple kicked off with the iPhone and that has already completely reconfigured the world, and taken us to unusual and even frightening places. We’re coming to realize that our ceaseless indulgence in technology may come back to haunt us, whether it’s due to individual addiction or tech startups causing devastation or the information economy pushing us to places of polarization, ignorance, and hate.

It would be very odd to point to the iPad as the answer to all those challenges. And the iPad definitely is not. If anything, it’s the answer to the laptop, which was not the device that caused all these problems in the first place.

Instead, the iPad is an attempt to answer a subtler question: a question of how un-computer-y you can make the most computer-y of all computers in this modern schema, how analog you can make a device that is essentially one large glass surface. The answer may have more implications than you realize: not because of the iPad as a standalone machine, but because of the slow shifts in how we think of “our devices”, how we relate to technology in general, how we make room for it in our lives, and when we decide that enough is enough. The iPad cannot be a primarily analog device, by its very nature, but Apple’s ongoing attempt to make something that does let us write on it rather than type with it, something that lets us hold it rather than merely prop it up, feels like an attempt to close the divide between computers and the physical world: an attempt to make something that has a place in meatspace, not just by letting us take the digital world with us wherever we go, but by encouraging us to think of the digital world as something that can be manipulated by analog means, with the digital serving as a backend to something more physical, more immediate, more real.

For me, it means—whenever I can get away with shunting the laptop off to the side—creating a space for myself in which that flat, unreal digital realm is kept on the fringes, except for when it’s providing me an experience I actively want to have. It means having something that can work as efficiently as I need it to, but only when I need it to: a keyboard when nothing else would do, but a pencil whenever I can get away with it. It’s also a delight of a machine, a technical marvel any way you look at it, and it doesn’t look half bad as a bit of atmosphere in any room—and sometimes, having something you can prop up as a quick TV and still blast the thing you’re watching is exactly what you need. The fact that it can be those things means that I can have those things, precisely when they’re needed and not at any other time. But what it gives me more than anything is a device potent enough, and well-shaped enough, that I can better ask myself what I need and when.


Something approximating a conclusion.

Over the last year, Apple has mounted an astonishing comeback. More than it has for years—maybe more than it has since Steve Jobs died—it has managed to introduce sweeping, dazzling new changes to its hardware and software both, from its microprocessors to its new device form factors to its expanded headphone line-up to its broad variety of new services. 

The most astonishing thing about its comeback, of course, is that Apple didn’t need a comeback at all. It was already in the lead.

Of its many steps forward, of the many products in its lineup, the iPad is likely destined to be overlooked yet again. It doesn’t stand out among the dazzling new Macs; it lacks the relative affordability of the AirPods lineup and the Apple Watch; and, of course, nothing stands up against the iPhone, still a monster after all these years.

(It’s possible that the only Apple product that will receive less attention is the AirPods Max—and even there I wonder if there’s an argument to be made that the AirPods Max make the most sense as another iPad-specific extension. They’re a movie-lover’s companion, but more than that: they feel perfect for listening to music within the bounds of a localized environment, without a device in your pocket or a device right in front of you. And what better third extension for an iPad than one that lets you use it as a tether, playing you music or even shows and movies that remain rooted to a specific place in a room, with the iPad serving to transform physical space into a digital environment of sorts?)

But the iPad, by and large, was never going to be the star of the show. From the start, it was quiet, muted compared to other Apple standouts, presented as a comfortable and organic device rather than as a disruptive and world-changing one, functioning primarily as a competitor to Apple’s single longest-running, and by far most mature, product line. Its innovations have consisted of add-ons, accentuating and shaping it rather than serving as the “killer apps” that define the platform. 

And that’s okay. The iPad serves the most ambiguous purpose out of all Apple’s products; its evolution has been the most open-ended and the most unfocused. You could argue that the iPad’s purpose is to be unfocused, to diffuse focus, to take a kind of machine that generally hyperfocuses and addicts and distracts people and gradually let it serve more ambient and analog and quiet functions.

Criticism of the iPad generally falls along the lines of: “This isn’t enough like a computer. It needs to be more like a computer.” People who want it to be a computer ought to look somewhere else.

And perhaps, with WWDC around the corner, it’s about to do just that. If not at this one, then certainly at one soon.

But I’m not excited about the iPad because it’s a computer. I’m excited because the iPad could be a computer. Could be, but doesn’t have to be. The iPad excites me because it’s a machine that lets you decide that sometimes, a computer isn’t what you want.
