MaterialsgateNEWS 2016/04/13

University of Sussex research brings 'smart hands' closer to reality

Using your skin as a touchscreen has been brought a step closer after UK scientists successfully created tactile sensations on the palm using ultrasound sent through the hand.

The University of Sussex-led study - funded by the Nokia Research Centre and the European Research Council - is the first to find a way for users to feel what they are doing when interacting with displays projected on their hand.

This solves one of the biggest challenges for technology companies that see the human body, particularly the hand, as the ideal display extension for the next generation of smartwatches and other smart devices.

Current approaches rely on vibrations or pins, both of which need contact with the palm to work, interrupting the display.

However, this innovation, called SkinHaptics, sends sensations to the palm from the other side of the hand, leaving the palm free to act as the display.

The device uses 'time-reversal' processing to send ultrasound waves through the hand. This technique is effectively like ripples in water but in reverse - the waves become more targeted as they travel through the hand, ending at a precise point on the palm.
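The time-reversal principle can be sketched numerically. The toy model below is an illustration of the general idea, not the SkinHaptics implementation: the array geometry, pulse shape, and tissue sound speed are all assumed values. A pulse from a target point is "recorded" at each array element, then replayed time-reversed; the re-emitted waves only line up, and add constructively, at the point the pulse originally came from.

```python
import numpy as np

# Toy illustration of time-reversal focusing (NOT the SkinHaptics code).
# A pulse from a target point reaches each array element after a delay;
# replaying the recordings time-reversed makes the waves re-converge and
# add up constructively at the original target.

c = 1500.0                     # assumed speed of sound in tissue, m/s
T = 60e-6                      # recording window, s
t = np.linspace(0.0, T, 1200)  # time axis
sigma = 0.5e-6                 # width of the Gaussian test pulse, s

# Four array elements on a line, and a target 3 cm away (all in metres)
elements = np.array([[0.00, 0.0], [0.01, 0.0], [0.02, 0.0], [0.03, 0.0]])
target = np.array([0.015, 0.03])

def delay(p, q):
    """One-way travel time between two points."""
    return np.linalg.norm(p - q) / c

def pulse(tau):
    return np.exp(-(tau / sigma) ** 2)

# Step 1: delays of the pulse "recorded" from the target at each element
d = np.array([delay(e, target) for e in elements])

# Step 2: time-reverse and re-emit; the summed amplitude at a field point
def peak_field(p):
    total = np.zeros_like(t)
    for i, e in enumerate(elements):
        b = delay(e, p)                    # travel time element -> point
        total += pulse(T - t + b - d[i])   # reversed pulse, delayed by b
    return total.max()

# Scan a line through the target: the four pulses arrive simultaneously
# only at the point the original pulse came from.
xs = np.linspace(0.0, 0.03, 61)
peaks = [peak_field(np.array([x, 0.03])) for x in xs]
best_x = xs[int(np.argmax(peaks))]
```

At `best_x` (the original target) the four reversed pulses arrive together and sum to four times the single-pulse amplitude; everywhere else they arrive staggered, so the peak is lower, which is the "ripples in reverse" behaviour described above.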

It draws on a rapidly growing field of technology called haptics, which is the science of applying touch sensation and control to interaction with computers and technology.

Professor Sriram Subramanian, who leads the research team at the University of Sussex, says that technologies will inevitably need to engage other senses, such as touch, as we enter what designers are calling an 'eye-free' age of technology.

He says: "Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important.

"If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user.

"What we offer people is the ability to feel their actions when they are interacting with the hand."

Source: University of Sussex – 11.04.2016.

Investigated and edited by:

Dr.-Ing. Christoph Konetschny, owner and founder of Materialsgate
Office for material and technology consulting
The investigation and editing of this document were performed with the greatest care and attention.
However, we accept no liability for the accuracy, validity, availability, or applicability of the information provided.
Please discuss its suitability for your specific application with the experts of the named company or organization.

Would you like additional material or technology investigations on this subject?

Materialsgate is a leader in materials consulting and materials investigation.
Feel free to use our established consulting services.

More on this topic

What if the touchscreen of your smartphone or tablet could touch you back? What if touch was as integrated into our ubiquitous technology as sight and sound?

Northwestern University and Carnegie Mellon University researchers now report a fascinating discovery that provides insight into how the brain makes sense of data from fingers. In a study of people drawing their fingers over a flat surface that has two "virtual bumps," the research team is the first to find that, under certain circumstances, the subjects feel only one bump when there really are two. Better yet, the researchers can explain why the brain comes to this conclusion. Their new mathematical model and experimental results on "haptic illusions" could one day lead to flat-screen displays featuring active touch-back technology, such as making your touchscreen's...

Technology has changed rapidly over the last few years with touch feedback, known as haptics, being used in entertainment, rehabilitation and even surgical training. New research, using ultrasound, has developed an invisible 3D haptic shape that can be seen and felt.

The research paper, published in the current issue of ACM Transactions on Graphics and which will be presented at this week's SIGGRAPH Asia 2014 conference [3-6 December], demonstrates how a method has been created to produce 3D shapes that can be felt in mid-air. The research, led by Dr Ben Long and colleagues Professor Sriram Subramanian, Sue Ann Seah and Tom Carter from the University of Bristol's Department of Computer Science, could change the way 3D shapes are used. The new technology could enable surgeons to explore a CT scan by enabling them to feel a disease, such as a tumour, using haptic feedback. The method uses ultrasound, which is focussed onto hands above the device...
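The focusing step behind such mid-air haptics can be illustrated with a generic phased-array delay law. This is a standard textbook calculation, not the Bristol team's published code, and the array geometry, frequency, and focal point are assumed values: each emitter is fired early by its extra travel time, so every wavefront arrives at the focal point at the same instant and in phase.

```python
import numpy as np

# Generic phased-array delay law (illustrative; not the Bristol system).
# Elements farther from the focus fire earlier, so all wavefronts arrive
# at the focal point simultaneously and in phase.

c = 343.0      # speed of sound in air, m/s
f = 40e3       # typical airborne-ultrasound frequency, Hz

# Assumed geometry: a 4x4 emitter grid with 1 cm pitch in the z = 0 plane
pitch = 0.01
coords = np.arange(4) * pitch
emitters = np.array([[x, y, 0.0] for x in coords for y in coords])

# Focal point 10 cm above the centre of the array
focus = np.array([0.015, 0.015, 0.10])

dist = np.linalg.norm(emitters - focus, axis=1)    # element -> focus, m
delays = (dist.max() - dist) / c                   # firing delays, s
phases = (2 * np.pi * f * dist / c) % (2 * np.pi)  # equivalent phase offsets
```

The farthest element fires immediately (zero delay) and the nearest waits longest; in steady-state operation the same focus is achieved by driving each element with its phase offset instead of a firing delay.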

Disney Research develops algorithm for rendering 3-D tactile features on touch surfaces

A person sliding a finger across a topographic map displayed on a touch screen can feel the bumps and curves of hills and valleys, despite the screen's smooth surface, with the aid of a novel algorithm created by Disney Research, Pittsburgh for tactile rendering of 3D features and textures. By altering the friction encountered as a person's fingertip glides across a surface, the Disney algorithm can create a perception of a 3D bump on a touch surface without having to physically move the surface. The method can be used to simulate the feel of a wide variety of objects and textures. The algorithm is based on a discovery that when a person slides a finger over a real physical bump...
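The friction idea can be sketched in a few lines. This is a simplified reading of the principle, not Disney Research's published algorithm, and the path, bump profile, gain, and friction limits are all made-up values: to evoke a virtual bump of height h(x), drag is raised on the bump's leading edge (where a real finger would push uphill) and lowered on the trailing edge, in proportion to the surface slope dh/dx.

```python
import numpy as np

# Simplified sketch of friction-based bump rendering (illustrative only,
# not Disney Research's published algorithm). Lateral drag on the finger
# is modulated in proportion to the virtual surface slope dh/dx: extra
# drag on the way "up" the bump, reduced drag on the way "down".

def friction_profile(heights, dx, mu_base=0.5, gain=1.0,
                     mu_min=0.05, mu_max=1.5):
    """Map a 1-D virtual height profile to friction commands.

    heights : virtual bump heights along the slide path (m)
    dx      : sample spacing along the path (m)
    gain    : made-up scaling from slope to friction change
    """
    slope = np.gradient(heights, dx)       # dh/dx along the path
    return np.clip(mu_base + gain * slope, mu_min, mu_max)

# Example: a 4 cm slide path with a 2 mm Gaussian virtual bump at 2 cm
x = np.linspace(0.0, 0.04, 200)
h = 0.002 * np.exp(-((x - 0.02) / 0.005) ** 2)
mu = friction_profile(h, x[1] - x[0])
```

Friction peaks just before the bump's summit and dips just after it, which, per the Disney finding quoted above, the brain interprets as a raised feature on a physically flat screen.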
