Why we need a haptic design language

I know, it’s been a while since I last posted, but here is something you may find useful.
The article focuses on “wearables”–the technology built into accessories worn on the body, such as watches, wristbands, or clothing–and it is well worth reading if we want to think about new ways of connecting the physical world with the digital one, beyond the obvious.

Why We Need A Haptic Design Language For Wearables

Better haptic design could turn smartwatches into silent universal communicators, says Immersion’s VP of UX Chris Ullrich.

There’s a disconnect between what wearables can be and what they currently are, says Chris Ullrich, who heads up user experience at Immersion, a firm known for its haptic feedback innovations. Right now, even the most advanced smartwatch is really just a mirror of the smartphone in your pocket. Your phone receives alerts from the world, and instead of pulling it out of your pocket, you look down at the screen on your wrist.

This is absurd in Ullrich’s eyes, because wearables could really be so much more: silent universal communicators snuggled up against your skin that can let you know everything from how fast your heart is beating to what’s happening on Twitter.

This is why, at last month’s Mobile World Congress in Barcelona, Immersion introduced its TouchSense Core platform, which hopes to fill that void.

The key behind TouchSense Core is to better utilize haptics as a communication tool: to create a design language for the vibrating motors inside smartwatches that would not only allow wearables to communicate with you invisibly but, just as importantly, deliver nuanced information.

At the heart of TouchSense Core are five main categories of notifications, each of which has its own haptic syntax, established in software. The first two categories–Urgent and Later–deal with immediacy. An incoming phone call, for example, is a more urgent notification than someone retweeting you on Twitter.

Two other categories–Foreground and Background–relate to whether the notification you’re receiving is connected to what you’re currently doing: if you’re a runner, for example, your smartwatch would use one category of notifications to tell you that you just ran 5K, and another to remind you it’s time for your daily run.

And because your smartwatch should be able to raise its voice above all other notifications to let you know you’re running low on battery, or to give feedback when you touch the screen, the fifth category is reserved for system notifications. Each of these categories in TouchSense Core speaks its own haptic dialect, with distinct notification styles that a user can learn over time.
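To make the idea concrete, here is a minimal sketch of how such a category-to-pattern mapping might look. The five category names come from the article; everything else–the pulse encodings, durations, intensities, and the `HAPTIC_SYNTAX` table–is an invented illustration, not Immersion’s actual API or haptic syntax:

```python
# Hypothetical sketch: mapping five notification categories to
# distinct vibration patterns. Each pattern is a sequence of
# (duration_ms, intensity) pulses, where intensity 0.0 means a
# silent pause. All values here are illustrative assumptions.
from enum import Enum


class Category(Enum):
    URGENT = "urgent"          # immediate, e.g. an incoming call
    LATER = "later"            # can wait, e.g. a retweet
    FOREGROUND = "foreground"  # tied to your current activity
    BACKGROUND = "background"  # ambient reminders
    SYSTEM = "system"          # low battery, touch feedback


# Invented "haptic dialects": each category gets a recognizably
# different rhythm a user could learn over time.
HAPTIC_SYNTAX = {
    Category.URGENT:     [(200, 1.0), (100, 0.0), (200, 1.0), (100, 0.0), (200, 1.0)],
    Category.LATER:      [(80, 0.4)],
    Category.FOREGROUND: [(150, 0.8), (50, 0.0), (150, 0.8)],
    Category.BACKGROUND: [(300, 0.3)],
    Category.SYSTEM:     [(40, 0.6), (40, 0.0), (40, 0.6)],
}


def pattern_for(category: Category) -> list[tuple[int, float]]:
    """Return the pulse sequence a device would play for a category."""
    return HAPTIC_SYNTAX[category]
```

The point of the sketch is simply that each category has a distinct, learnable signature–a triple pulse for Urgent feels nothing like a single soft tap for Later.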

“When you look at something like Android Wear, it’s really well thought out on almost every level,” says Ullrich. “But Google’s not giving any guidance to devs on how to create nuanced haptic sensations.”

That’s the problem with wearables–and, indeed, most vibrating gadgets–as it stands right now: It’s all binary. You either have an alert, or you don’t. Your smartwatch might buzz at you, but all the buzzes feel mostly the same. “It makes it really hard to tell if your smartwatch is alerting you because your grandma just wished you a happy birthday, or because you just lost millions on the stock market,” Ullrich says.

What TouchSense Core is attempting to offer developers is what Ullrich calls a “standardized iconography” of haptic effects. On the hardware side, this means that Immersion is working with wearable makers to make sure their devices support TouchSense Core, and that they feature vibrating motors dynamic enough to communicate with a range of intensities. And on the app side, TouchSense Core has built out a design framework for devs to implement nuanced haptic notifications that users can actually understand.

But how different will these notifications really feel from one another? Aren’t they all just simple buzzes? Ullrich says no.

“One of the biggest challenges we have is getting the public to understand that the breadth of haptic experience is just so much bigger than they’ve been exposed to,” Ullrich says. For example, on Immersion’s Google Play demo app, you can experience a phone call notification that actually feels like an old rotary phone ringing in your hand. The effect’s so good, it’s eerie, yet chances are, if you have your smartphone on vibrate, it does nothing more than buzz insistently.

To Ullrich, wearables represent an unparalleled opportunity to inform the public about the importance of great haptic design. Wearables are strapped against the skin, not separated from your body by a purse or a pocket like a smartphone. They’re more intimate. Once customers learn to “hear” the silent language of haptics pulsing against their wrist, the hope is that they will demand a richer texture of haptic experience in other devices as well. Whereas most people experience haptics in their smartphones and game controllers today as simple, repetitive notes, wearables could teach us to expect more from our devices: rich haptic melodies that track our physical and digital lives.

Although the haptic technology maker is not ready to announce when the first smartwatches with TouchSense Core integration will ship, or even who will make them, Immersion already works closely with companies like Samsung, LG, and Motorola. So chances are the first smartwatches to feature TouchSense Core could be just around the corner.

Either way, it’s part of a larger trend to redefine the ways in which our gadgets and appliances talk to us. If the recent work of companies like Immersion and Method is anything to go on, the gadgets of the future will be able to talk to us in far richer ways than just blinking their screens at us.