Tablet Talk: multitouch, the desktop OS, and Apple's unique position

Feb 19, 2010 10:11


Whenever someone brings up the objections I raised in a previous post about putting desktop operating systems on tablets, someone else generally chimes in to say "But Windows 7 is different! It's got all these enhancements for touch operation!" I'd like to unpack that claim a little, because I think doing so illuminates some real obstacles to advancing consumer computing, and highlights an interesting difference between Microsoft's and Apple's business models.

The Windows 7 enhancements can be separated into two distinct sets. The first set includes things like an enlarged toolbar and icons for easier touch operation, a better virtual keyboard, and easily available handwriting input for entering short text. I don't consider these "enhancements"; they're really stopgaps for the fact that the OS was primarily designed to operate with keyboard and mouse input, neither of which a slate-type tablet has. At best, they transform the Windows touch experience from bloody annoying to a tolerable simulation of your desktop experience. The problem is just that: it's a *simulation* of the desktop experience, not a *tablet* experience. For a tablet experience, you need to embrace the fact that touch is your primary input, and provide enhancements that facilitate operating your applications with touch input. Microsoft's second category of Windows 7 touch enhancements attempts to address this issue: a series of touch gestures for multitouch input supported at the OS level.

Herein lies our chicken-and-egg problem. Microsoft's primary computing product is its OS. It also makes software for Windows, but its software offerings are a tiny subset of the available Windows programs. (A very influential subset, to be sure, but numerically it's a drop in the ocean.) This means that Microsoft is dependent on third-party developers to produce Windows 7 applications that incorporate multitouch. But Microsoft doesn't produce the hardware that runs its OS, so it has no control over the user's access to multitouch hardware - that decision lies with the computer OEMs and the consumers. There's no pressure to buy or provide multitouch hardware unless there are compelling apps that need it, but there's no incentive to provide multitouch apps without some expectation that users will have the hardware available. Tablet computers have not been widely adopted thus far, and multitouch-enabled tablets are just emerging onto the market, so those devices don't produce a compelling case for software development - there needs to be a critical mass of desktop and laptop multitouch-capable hardware.

Here we run into another problem. The primary devices driving user awareness of multitouch are smartphones like the iPhone, Droid, and Nexus One, which have capacitive displays capable of multitouch. Microsoft and a lot of equipment manufacturers appear to be thinking, "Hey, users like multitouch displays on their cellphones! Let's bring multitouch displays to other computers!" And so multitouch-capable monitors and laptops have begun appearing on the market. Since I presume that virtually all of you are reading this post on a standard desktop or laptop, consider what that technology would mean on your current computer. A multitouch display breaks the flow of your work: you have to take at least one hand away from your primary input surface (keyboard or mouse/trackpad) to manipulate the screen. That means reaching out to the screen, and unless you've got freakishly long forearms, you're going to be holding your arm out from the shoulder the entire time you're touching it. Touchscreen monitors disrupt your workflow, and they're an ergonomic nightmare. They're impressive at trade shows, but they're never going to become a standard part of the desktop or laptop computing experience.

Now consider Apple's take on multitouch. Apple seems to believe that multitouch is the future of computing input, and wants to support it as widely as possible. So, in addition to making all of its touch-input devices (iPhone, iPod Touch, iPad) multitouch-capable and building multitouch APIs into its desktop OS, Apple builds multitouch trackpads into its laptops and ships the multitouch-enabled Magic Mouse with its desktop systems. This is significant in two ways. First, Apple understands (as Microsoft apparently has not) that multitouch is an *input* technology, not a display technology, and so the relevant place for it in existing form factors is with the other input hardware, rather than on the screen. (To be fair, there's some evidence of this thinking in the Windows world as well: Wacom makes a line of multitouch-enabled USB input tablets for consumers, for example, and I'm fairly certain that some laptop manufacturers are enabling multitouch on their trackpads.) Second, since *only* Apple makes computing hardware to run its OS (the Hackintosh phenomenon aside), software developers can be assured that a significant - and growing - number of Mac users will have the necessary hardware to use multitouch, which creates a real incentive to write multitouch-enabled Mac software. Moreover, since Apple *also* makes software for its OS (iWork, iLife, Aperture, Logic, Final Cut Pro), it's in a position both to set user expectations for multitouch by including it in Apple-produced applications *and* to model innovative multitouch software interfaces for other Mac developers. Under these circumstances, I'm convinced that Apple will eventually release a MacPad, once it's had a little while (I'm guessing about 3 years, though it could be as short as 2 or as long as 5) to build up a critical mass of multitouch-augmented OS X software.
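To make the "multitouch APIs" point concrete, here's a minimal sketch of what OS-level gesture support looks like to a Mac developer: on multitouch hardware, AppKit routes pinch and rotate events to any view that overrides the corresponding NSResponder callbacks. This is written in present-day Swift using the current names of those callbacks, and the view class and its zoom/rotation bookkeeping are my own illustration, not anything Apple ships.

```swift
import AppKit

// Hypothetical view that zooms and rotates its contents in response to
// trackpad gestures. AppKit delivers these events automatically on
// multitouch-capable hardware; no device-level handling is required.
class GestureCanvasView: NSView {
    private var zoom: CGFloat = 1.0
    private var angleDegrees: CGFloat = 0.0

    // Pinch on the trackpad: event.magnification is the fractional change
    // since the previous event (e.g. 0.05 for a 5% spread of the fingers).
    override func magnify(with event: NSEvent) {
        zoom *= 1.0 + event.magnification
        needsDisplay = true
    }

    // Two-finger rotation: event.rotation is the change in degrees.
    override func rotate(with event: NSEvent) {
        angleDegrees += CGFloat(event.rotation)
        needsDisplay = true
    }

    override func draw(_ dirtyRect: NSRect) {
        // Apply the accumulated gesture state to whatever gets drawn here.
        let transform = NSAffineTransform()
        transform.translateX(by: bounds.midX, yBy: bounds.midY)
        transform.rotate(byDegrees: angleDegrees)
        transform.scale(by: zoom)
        transform.concat()
        // ... draw the view's content in the transformed coordinate space ...
    }
}
```

The point isn't the drawing code; it's that the gesture plumbing lives in the OS, so any Mac with a multitouch trackpad or Magic Mouse delivers these events to every application for free.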
