The vibration motors in today's smartphones and tablets (when these larger form-factor devices even include them) are barely capable of more than a gentle pocket-tingling sensation or an annoying desk rattle.
Compared to the haptic feedback from the motors in game controllers like Sony's DualShock, they seem even less capable. But if some brand-new tech from the University of Bristol penetrates the market, you may soon have to think about coding far more sophisticated haptic feedback into your mobile games--and possibly into many other apps too, because the long-bubbling-under tech of haptics may be about to go mainstream.
What the Bristol team have achieved is useful on two fronts: their system lets giant-scale touch devices deliver haptic feedback, and it can produce a surprising touch sensation before the user's fingertip ever makes contact with the touchscreen.
The trick uses a totally different system to generate the tiny impulses that constitute haptic feedback. Whereas phone vibrators spin an asymmetric weight (or occasionally drive a micro-oscillating one), the Bristol team uses acoustic radiation force. Since sound is a vibration of air molecules, it can, at the right frequency and amplitude, exert a force on a nearby object--if you've ever felt the bassline from a speaker at a concert, you can testify to this.
The innovation, called UltraHaptics, uses an array of ultrasonic transducers to generate a complex set of sound waves. The array is phased--like the best new-generation military radar systems--so that the waves emitted by each transducer arrive at a chosen point in space at the same time. When enough ultrasound energy is focused on a point above, say, a tablet's surface, it can generate a definite sensation in a human fingertip, thanks to the high sensitivity of the nervous system.
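The core of that phasing idea is simple geometry: each transducer is delayed just enough that every wavefront covers the same effective path length to the focal point. A minimal sketch of that calculation in Python, assuming an idealized point-source array (the array layout, element count, and focal position below are invented for illustration, not taken from the Bristol system):

```python
import math

SPEED_OF_SOUND = 343.0  # metres/second in air at roughly 20 °C

def focusing_delays(transducer_positions, focal_point):
    """Return per-transducer emission delays (seconds) so that all
    wavefronts arrive at the focal point at the same instant."""
    distances = [math.dist(p, focal_point) for p in transducer_positions]
    farthest = max(distances)
    # Elements closer to the focus wait longer before firing, so every
    # wave travels the same effective distance.
    return [(farthest - d) / SPEED_OF_SOUND for d in distances]

# A hypothetical 4-element linear array, 1 cm element spacing (metres).
array = [(x * 0.01, 0.0, 0.0) for x in range(4)]
# Focus 10 cm above the array's centre line.
focus = (0.015, 0.0, 0.10)

delays = focusing_delays(array, focus)
```

With this symmetric layout the two outermost elements (farthest from the focus) fire immediately, while the inner pair waits a few microseconds; steering the focal point around in space is just a matter of recomputing the delay list.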
Through a series of tests and experiments, the researchers have shown that they can create individual haptic feedback points finer than a user can resolve. In theory, combining a number of these points should let you generate very complex and subtle force-feedback sensations--something akin to running a finger over the slight edge of an onscreen UI button, as if it were a real, raised surface. And because the effect is generated by an array of transducers, the feedback isn't localized to the transducer's immediate vicinity (as with existing phone buzzers) and can be delivered across very large screens.
This type of haptics is evidently a cut above merely buzzing a smartphone's vibrator to say "something happened"--a signal whose sensation alone can't tell you which event triggered it. The team imagines generating "air gestures" that carry touch feedback without the user ever touching a screen, and creating visually restricted displays: a way of communicating data to the user when an on-screen display is insufficient.
Why should you care about this? It's groundbreaking tech that's likely only a short distance from being embedded in real devices, and it indicates that the way you deliver feedback to users of your apps is going to change dramatically.
[Image: Flickr user woodleywonderworks]