2D materials

Magnetosensitive e-skin senses objects without touching

22 Jan 2018 Isabelle Dumé
Interactive magnetosensitive device

A new magnetosensitive electronic skin that can manipulate physical and virtual objects without touching them could find use in robotics, virtual and augmented reality and remote control, and could even be employed in dangerous or difficult-to-access places. The new structure, which is made up of magnetic-field sensors, is flexible, operates at low power and accurately detects objects and their position in space.

Today’s virtual or augmented reality systems usually require the user to wear goggles or gloves. These are not only cumbersome but are also often energy-inefficient.

A team of researchers led by Denys Makarov at Helmholtz-Zentrum Dresden-Rossendorf, together with Oliver Schmidt and colleagues at IFW Dresden and Martin Kaltenbrunner's team in the Soft Electronics Laboratory at JKU Linz, has now developed an alternative with its new e-skin, which works by sensing magnetic fields.

2D magnetic field sensor

A device that can manipulate virtual objects needs two basic functions: a "touch" function and the ability to detect the position of an object in space. Makarov and colleagues' new device is the first to combine the two.

When placed on the palm of a hand, for example, the researchers can monitor the angular position of their sensor with respect to the direction of an applied magnetic field. This information is used to reconstruct the spatial position of the hand, which can then be displayed in a virtual reality image. The virtual hand can then interact with objects in virtual or augmented reality.

Makarov and colleagues tested out their device on volunteers who were able to type on a virtual keypad and even turn a knob on a computer screen without touching the screen.

Monitoring the angular position of the sensor

“To make our sensor, we synthesized very thin foils upon which we patterned two Wheatstone bridges accommodating a set of eight magnetic field sensors, each with a well-defined magnetic anisotropy axis,” explains study lead author Cañón Bermúdez. “By using this configuration, and the intrinsic properties of the sensors themselves, the whole circuit is able to output two signals associated with the x and y in-plane components of the external magnetic field.

“We then process these signals using software to reconstruct the angle of the in-plane magnetic field vector. By tracking the angle in the region where the user moves her or his hand, and by encoding a virtual parameter (for example, the light intensity) to each angle over the range of movement, we can control the state of a virtual object at will.”
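The reconstruction step described above can be sketched in a few lines of code. This is a minimal illustration of the idea, not the team's actual software: the bridge signals are assumed to be proportional to the in-plane field components, and the mapping of angle to light intensity (including the angular range) is a hypothetical encoding.

```python
import math

def field_angle(v_x, v_y):
    """Reconstruct the in-plane field angle (radians) from the two
    Wheatstone-bridge outputs, assumed proportional to the x and y
    components of the external magnetic field."""
    return math.atan2(v_y, v_x)

def angle_to_intensity(angle, angle_min=-math.pi / 2, angle_max=math.pi / 2):
    """Encode a virtual parameter (here, a light intensity from 0 to 1)
    onto the reconstructed angle over an assumed range of hand movement,
    clamping values outside that range."""
    frac = (angle - angle_min) / (angle_max - angle_min)
    return min(1.0, max(0.0, frac))
```

With this toy encoding, a field pointing along +x (angle 0) sits mid-range and yields an intensity of 0.5, while rotating the hand toward the extremes of the range dims or brightens the virtual light.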

2D magnetic field sensors based on spin valves

The sensors used to make the 2D devices are spin valves – magnetic switches that work thanks to the giant magnetoresistive effect. They comprise a [Py/CoFe]/Cu/[CoFe/Py]/IrMn heterostructure, with the IrMn acting as an antiferromagnet that pins the magnetisation of the “reference” [CoFe/Py] bilayer next to it thanks to a phenomenon called exchange bias. The [Py/CoFe] free layer, separated from the reference bilayer by a Cu spacer, acts as the active sensing layer.
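The giant-magnetoresistive response of a spin valve is commonly approximated by a cosine dependence on the angle between the free-layer and pinned-layer magnetisations. The sketch below uses that textbook relation with illustrative resistance and GMR-ratio values, not measured parameters of the Dresden devices.

```python
import math

def spin_valve_resistance(theta, r_parallel=100.0, gmr_ratio=0.08):
    """Textbook GMR model: resistance is lowest when the free and pinned
    layers are parallel (theta = 0) and highest when they are antiparallel
    (theta = pi). r_parallel and gmr_ratio are illustrative values."""
    r_antiparallel = r_parallel * (1.0 + gmr_ratio)
    # Interpolate between the two extremes with a cosine angular dependence.
    return r_parallel + (r_antiparallel - r_parallel) * (1.0 - math.cos(theta)) / 2.0
```

Because the reference layer is pinned by exchange bias, rotating an external field rotates only the free layer, so the measured resistance tracks the field angle – which is what lets the Wheatstone bridges above resolve the in-plane field direction.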

Although flexible and ultrathin, the devices are mechanically robust and operate at applied magnetic fields of 20 to 120 oersteds. They might find use in a host of application areas, says Makarov. These include virtual or augmented reality, fine motion tracking and reconstruction, navigation, remote control and manipulating objects in dangerous or inaccessible environments.

The team, reporting its work in Science Advances DOI: 10.1126/sciadv.aao2623, says it will now work on improving the sensor technology so that it can detect even smaller magnetic fields. “We could also think of implementing a feedback system that would provide a stimulus, be it visual, auditory or tactile, to the user, thus enhancing the interactive experience,” Makarov tells nanotechweb.org.

Copyright © 2024 by IOP Publishing Ltd and individual contributors