I just heard about something that I think is not only really, really cool, but has a real chance to impact the nature of how we interact with computing devices in the future.
It’s called TeslaTouch. If you’re a user of a touchscreen tablet like the iPad or, to a lesser degree, a touchscreen smartphone, you’re familiar with the problem it addresses. When you use one of the glass screens on these devices you can see what’s happening, but you don’t get any tactile feedback. This can be an especially big issue when you’re typing. If you’re a touch-typist with any proficiency at all, trying to type on a glass screen can be an extremely frustrating experience. Instead of focusing on your thoughts and having the words just appear, you’re often reduced to some form of “hunt-and-peck.” At the very least, you frequently have to stop and make corrections. The reason is that you don’t get the tactile feedback from the glass screen that you get from a physical keyboard.
A physical keyboard typically has small raised bumps on the “f” and “j” keys so your fingers know they’re resting in the proper “home row” position. And from there, your fingers know immediately when they’ve missed a key you’re trying to press, or when you’ve pressed two at once. But with a smooth glass screen everything feels the same, including the spaces between keys. This problem was by far the most critical deciding factor in my choosing an HP EliteBook Tablet PC over an iPad recently. The iPad is much lighter, lasts much longer on a charge, and is just all around much cooler… but it doesn’t have a physical keyboard. There was no way I could use it as my primary computing device without one.
But TeslaTouch fixes the problem by using small electric impulses to change how the screen feels at different places. Touch one place; receive one sensation. Touch another; receive a different one. So, for example, the “home” keys could have a different sensation than the other keys, and the space between keys could offer no sensation at all. Touch-typing problem solved!
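To make the idea concrete, here is a minimal sketch of how a virtual keyboard might map a touch position to a haptic texture. Everything here is hypothetical: the key layout, dimensions, and texture names are illustrative, not part of TeslaTouch itself.

```python
# Hypothetical sketch: map a touch position on a virtual keyboard row to a
# haptic "texture". The layout, sizes, and texture names are illustrative.

KEY_WIDTH, GAP = 40, 6  # pixels; illustrative values

# One row of keys: (label, left-edge x coordinate)
HOME_ROW = [(c, i * (KEY_WIDTH + GAP)) for i, c in enumerate("asdfghjkl")]
HOME_KEYS = {"f", "j"}  # keys that get a distinctive "bump" texture

def texture_at(x):
    """Return the haptic texture for a touch at horizontal position x."""
    for label, left in HOME_ROW:
        if left <= x < left + KEY_WIDTH:
            return "ridged" if label in HOME_KEYS else "smooth-key"
    return "none"  # the space between keys offers no sensation

print(texture_at(148))  # inside "f" -> ridged
print(texture_at(50))   # inside "s" -> smooth-key
print(texture_at(42))   # in the gap after "a" -> none
```

The point is simply that the renderer needs nothing more than a lookup from screen region to sensation; the hardware does the rest.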
Not surprisingly, there are numerous potential uses beyond more effectively emulating a keyboard. For example, objects on the screen could be made to “feel” heavier or lighter relative to one another: a large file could “feel” heavier than a small one, which is very useful to know when you’re trying to download or transfer files over a connection. Or “rubbing” a folder might convey how “heavy” its contents are. Dragging and dropping items could be enhanced too: successfully dropping an item on its target could be felt as a “snap.” If you don’t feel the snap, you know immediately that you missed. Certainly you’re getting the visual feedback as well, but adding the tactile feedback creates a much more holistic feel to the interaction.
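The weight and snap ideas above can be sketched too. This is a hypothetical illustration, not a real TeslaTouch API: the function names, the 0–100 intensity scale, and the pulse parameters are all assumptions.

```python
# Hypothetical sketch: scale haptic intensity with file size so larger
# files "feel" heavier while being dragged, and emit a short "snap" pulse
# when a drop lands on its target. All names and scales are illustrative.
import math

def drag_intensity(size_bytes, min_size=1_000, max_size=1_000_000_000):
    """Map file size to a 0-100 haptic intensity on a log scale."""
    size = min(max(size_bytes, min_size), max_size)
    frac = (math.log10(size) - math.log10(min_size)) / (
        math.log10(max_size) - math.log10(min_size))
    return round(100 * frac)

def drop_feedback(on_target):
    """A sharp 'snap' pulse on a successful drop, nothing on a miss."""
    return {"pattern": "snap", "duration_ms": 30} if on_target else None

print(drag_intensity(1_000))          # 0: a tiny file feels light
print(drag_intensity(1_000_000))     # 50: a mid-sized file
print(drag_intensity(1_000_000_000))  # 100: a huge file feels heavy
```

A log scale is a natural choice here because file sizes span many orders of magnitude; a linear mapping would make everything under a few hundred megabytes feel identical.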
We’re already in the midst of an accelerating transition away from keyboards and mice to a more direct interaction with objects on the screen. TeslaTouch, if successful, could dramatically speed up that transition. If the iPad were equipped with TeslaTouch, I’d probably be on my way to the store right now.