

Well, that's it then. We're officially in the midst of an interface revolution.
The Nintendo Wii
The first real, big-time graduate from the school of keys. For me the killer game was Tennis. The very first time I picked up a Wiimote to play, I noticed I was turning my entire body to hit the ball, as I would in real tennis. Sold. That, a little force feedback and sound cues, and you're in the game. What's even more amazing is how simple the technologies at play are: accelerometers, IR, and Bluetooth.
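The "simple technologies" point is worth making concrete. Here's a toy sketch of how an accelerometer-based controller might recognize a tennis swing: watch for a spike in total acceleration magnitude. The 3g threshold and the sample values are made-up illustration numbers, not anything from Nintendo's actual hardware or firmware.

```python
import math

def swing_detected(samples, threshold_g=3.0):
    """Return True if any (x, y, z) accelerometer sample exceeds threshold_g."""
    return any(math.sqrt(x * x + y * y + z * z) > threshold_g
               for (x, y, z) in samples)

idle = [(0.0, 0.0, 1.0)] * 10     # controller at rest: ~1g from gravity
swing = idle + [(2.5, 1.0, 3.0)]  # sudden jerk as the player swings

print(swing_detected(idle))   # False
print(swing_detected(swing))  # True
```

The point is how little it takes: a three-axis reading, a magnitude, a threshold. Everything else is game design.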

NYTimes article on how the Wiimote works.
Another observation in the field: I was at a party where there was a Wii (where there's a Wii, there's almost certainly a party), and my friend Ellen, who has probably never picked up a console game in her life, grabbed a Wiimote and started playing with a young pimply 19-year-old kid who had obviously played console games his whole life. It was clear from Ellen's excellent swing and use of her whole body (more than once whacking her opponent's arm) that she had played real tennis, while the kid stood stationary, almost motionless except for the flicking of his wrist -- he had pared the interface down to its bare minimum (see Christopher Alexander).
The iPhone
Steve Jobs deserves the Design Medal of Honor for single-handedly (in a manner of speaking) showing the business world what happens when good design values get instilled from the top. You get what most people think we should be carrying around in our pockets. He's right to devote a slide in his Keynote presentation to the current limitations of the hardware interfaces.
Most interfaces are never quite right (some worse than others), but we get used to them. We deal. But the touch-screen multi-touch interface does seem like a nice solution to the "how the hell do we fit 50+ keys onto this palm-sized space" problem. Wash your hands and get ready to use all them digits!

Star Trek: The Next Generation had the right idea, though I don't remember if the interface changed contextually like the iPhone's will. The design problem: most of us carry around two or three gadgets. Wouldn't it be great if we only had to carry around one (and sync our data to our computer at home)? The problem is: a multi-function device has multiple modes which require multiple key-sets or button-sets. For phone usage, you need the numeric keypad. For music usage, you need to pick songs, play them, stop, and adjust the volume -- the wheel interface on the iPod was the huge ease-of-use revolution (though it took three generations to get it to where it is today -- see John Maeda's observations and a small debate). For SMS, email and calendaring, you need a real keyboard -- a true bugger because our alphabet has 26 letters. (Though texting in China and Japan is wildly popular. The Japanese even have a word to describe the generation that thumbs: oyayubizoku, or "thumb-tribes".)
One thing I have noticed teaching my class at Parsons: since everyone has to have a laptop, and since they're all young designers, they naturally have Macs. What surprised me was that a good half of them did not use a mouse. When I inquired about this, thinking that the mouse is the highest-precision instrument for doing detailed design work, they shrugged and said: we don't like carrying around the extra equipment. We just use the trackpad.
Jobs rightly dismisses the stylus as an unnecessary piece of equipment because, hey, we've got 10 styli on our hands, the best pointing devices in the world. But look at the size of your finger. No, really. It's a pretty blunt pointing device, about 1/2 to 3/4 of an inch in diameter. It's huge in comparison to the size of a pixel. On top of that, the resolution of the iPhone screen is 160 pixels per inch, which is incredible for clarity and crispness of images and text. But with touchscreens the problem becomes not the resolution of the display, but the resolution of our pointing device on the screen.

On a device like the iPhone, this is probably fine, but pixel-pushing with that thing (even manipulating 12px-high letters) is a little like manipulating the world wearing foam fingers.
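The mismatch is easy to put in numbers. A quick back-of-the-envelope calculation, using the 160 ppi figure Apple announced and the rough 1/2-to-3/4-inch fingertip estimate above:

```python
# How many pixels does a fingertip cover on a 160 ppi screen?
PPI = 160  # announced iPhone screen density

for finger_width in (0.5, 0.75):  # rough fingertip width, in inches
    pixels = finger_width * PPI
    print(f"A {finger_width}-inch fingertip spans about {pixels:.0f} pixels")

# Meanwhile, a 12px-high letter is only 12/160 = 0.075 inches tall --
# an order of magnitude smaller than the fingertip trying to select it.
print(f"12px letter height: {12 / PPI:.3f} inches")
```

So the fingertip blankets somewhere between 80 and 120 pixels across, while the targets we want to hit can be a tenth of that size.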
So what's the big deal with the iPhone's UI? Multi-touch.

Basically, it's creating a gesture language that the computer can interpret and translate into functionality based on context. This idea is already in play with the trackpads on OS X: you can use two fingers to scroll a page or double-tap to bring up a contextual menu. Jobs says they patented it, but Jeff Han has been doing the coolest research with it for a while.
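To make the "gesture language" idea concrete, here's a minimal, hypothetical two-finger gesture classifier: it compares the distance between two touch points at the start and end of a gesture. The touch-point format and the 20-pixel threshold are my own assumptions for illustration, not Apple's API.

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) touch points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_two_finger_gesture(start, end, threshold=20):
    """Classify a two-finger gesture from start/end touch positions.

    start, end: pairs of (x, y) tuples, one per finger.
    Returns 'pinch-out', 'pinch-in', or 'scroll'.
    A toy classifier, not any real touchscreen API.
    """
    d0 = distance(*start)
    d1 = distance(*end)
    if d1 - d0 > threshold:
        return "pinch-out"  # fingers spread apart: e.g. zoom in
    if d0 - d1 > threshold:
        return "pinch-in"   # fingers came together: e.g. zoom out
    return "scroll"         # spacing unchanged: two-finger scroll

# Two fingers spreading apart reads as a zoom gesture:
print(classify_two_finger_gesture(((100, 100), (120, 100)),
                                  ((60, 100), (180, 100))))  # pinch-out
```

The "context" part is everything the classifier doesn't show: the same pinch means zoom in a photo viewer and something else entirely in a map.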
The Lemur, from Jazzmutant in France, is pretty awesome too. Always expect DJs to be on the forefront of technology, and to be using the hell out of it. (There's even a question on their site about whether sweat affects the touchscreen, something I'm sure the first users of the iPhone will worry about.) And it makes sense, because musicians play with keyboards all the time, and they practice chording, which is what multi-touch really means: using two or more fingers in concert to manipulate the interface. As Steve Jobs said selling the iPhone: "Touch your music."
What does it all mean?
Stay tuned...