3D-printed sensors a step towards ‘intelligent’ furniture and custom controls
Image credit: Courtesy of Stefanie Mueller, Jun Gong, Olivia Seow, Cedric Honnet, et al.
MIT researchers have created 3D-printed objects with sensing incorporated directly into their material, with applications for assistive technologies and smart furniture that responds to users.
The objects were created using a new method for 3D-printing mechanisms that detect how force is being applied to an object. The structures are made from a single piece of material, meaning they can be rapidly prototyped.
The researchers integrated electrodes into structures made from mechanical metamaterials – structures whose behaviour comes from their engineered internal geometry rather than the material they are made of – which are divided into a grid of repeating cells. They also created software that allows users to customise these devices.
“Metamaterials can support different mechanical functionalities, but if we create a metamaterial door handle, can we also know that the door handle is being rotated and, if so, by how many degrees?” said co-lead author Dr Jun Gong, who has since left MIT to work at Apple. “If you have special sensing requirements, our work enables you to customise a mechanism to meet your needs.”
As metamaterials are made from a grid of cells, when the user applies force to a metamaterial object, some of the flexible, interior cells stretch or compress. The researchers took advantage of this by creating “conductive shear cells,” flexible cells with two opposing walls made from conductive filament (which function as electrodes) and two from nonconductive filament. When a user applies force, the conductive shear cells stretch or compress and the distance and overlapping area between the electrodes changes. Using capacitive sensing, the magnitude and direction of the applied forces can be calculated.
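The sensing principle described above can be sketched with the standard parallel-plate capacitor model: compressing a cell shrinks the gap between its electrode walls, while shear reduces their overlapping area, and both change the measured capacitance in opposite directions. The dimensions below are illustrative assumptions, not values from the paper.

```python
# Parallel-plate approximation of a conductive shear cell:
# C = eps0 * A / d, where A is electrode overlap area and d is the gap.
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(overlap_area_m2: float, gap_m: float) -> float:
    """Capacitance of two parallel electrodes (ignoring fringing fields)."""
    return EPSILON_0 * overlap_area_m2 / gap_m

# Undeformed cell: 5 mm x 5 mm electrode walls, 2 mm apart (assumed sizes)
c_rest = capacitance(5e-3 * 5e-3, 2e-3)

# Compression reduces the gap; shear reduces the overlap area
c_compressed = capacitance(5e-3 * 5e-3, 1.5e-3)
c_sheared = capacitance(5e-3 * 4e-3, 2e-3)

print(f"rest:       {c_rest * 1e12:.3f} pF")
print(f"compressed: {c_compressed * 1e12:.3f} pF")  # gap down -> C up
print(f"sheared:    {c_sheared * 1e12:.3f} pF")     # overlap down -> C down
```

Because compression and shear move the capacitance in opposite directions, comparing readings across cells lets the system distinguish the direction as well as the magnitude of the applied force.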
The researchers created a metamaterial joystick with four conductive shear cells embedded in the base of the handle (up, down, left, right) to demonstrate their material. As the user moves the joystick handle, the direction and magnitude of each applied force can be sensed, allowing them to play a Pac-Man game.
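One plausible way to combine the four shear-cell readings into a joystick input, assuming each reading has already been converted from capacitance to a force magnitude, is to treat opposing cells as components of a 2D vector. This is an illustrative sketch, not the authors' implementation.

```python
import math

def joystick_vector(up: float, down: float, left: float, right: float):
    """Combine four opposing force readings into (magnitude, angle in degrees)."""
    x = right - left          # net horizontal force
    y = up - down             # net vertical force
    magnitude = math.hypot(x, y)
    angle = math.degrees(math.atan2(y, x))  # 0 deg = right, 90 deg = up
    return magnitude, angle

# Pushing the handle firmly to the right
mag, ang = joystick_vector(up=0.0, down=0.0, left=0.2, right=1.2)
print(mag, ang)  # -> 1.0 0.0
```

A game loop could then threshold the magnitude and bucket the angle into the four Pac-Man movement directions.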
Understanding how joystick users apply force could enable a designer to quickly prototype unique, flexible input devices perfectly suited to an individual user (including users with limited grip strength), such as a squeezable volume controller or bendable stylus. The engineers also used their approach to create a music controller shaped to fit a user’s hand.
MetaSense, the 3D editor the researchers developed, enables this rapid prototyping. Users can manually integrate sensing into a metamaterial design or let the software automatically place the conductive shear cells in optimal locations.
“The tool will simulate how the object will be deformed when different forces are applied and then use this simulated deformation to calculate which cells have the maximum distance change. The cells that change the most are the optimal candidates to be conductive shear cells,” said Gong.
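The selection step Gong describes can be sketched as a simple ranking: given the simulated change in electrode distance for each candidate cell, keep the cells that deform most. The cell IDs and displacement values below are placeholders standing in for the output of a real deformation simulation.

```python
def pick_sensing_cells(distance_changes: dict, k: int) -> list:
    """Return the k cell IDs with the largest absolute change in
    electrode distance under the simulated forces."""
    ranked = sorted(distance_changes,
                    key=lambda cell: abs(distance_changes[cell]),
                    reverse=True)
    return ranked[:k]

# Hypothetical simulated distance changes (mm) per candidate cell
simulated = {"cell_a": 0.02, "cell_b": -0.45, "cell_c": 0.31, "cell_d": 0.05}
print(pick_sensing_cells(simulated, k=2))  # -> ['cell_b', 'cell_c']
```

The cells returned here would be the ones printed with conductive filament; the rest stay nonconductive.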
The researchers endeavoured to make their software straightforward, but challenges remain. “In a multimaterial 3D printer, one nozzle would be used for nonconductive filament and one nozzle would be used for conductive filament, but it is quite tricky because the two materials may have very different properties,” explained Gong. “It requires a lot of parameter-tuning to settle on the ideal speed, temperature, etc. We believe that, as 3D printing technology continues to get better, this will be much easier for users in the future”.
In the future, the researchers would like to improve the algorithms behind MetaSense to enable more sophisticated simulations. They also hope to create mechanisms with many more conductive shear cells. Embedding hundreds or thousands of conductive shear cells within a very large mechanism could enable high-resolution, real-time visualisations of how a user is interacting with an object, Gong said.
“What I find most exciting about the project is the capability to integrate sensing directly into the material structure of objects. This will enable new intelligent environments in which our objects can sense each interaction with them,” said co-author Professor Stefanie Mueller.
“For instance, a chair or couch made from our smart material could detect the user’s body when the user sits on it and either use it to query particular functions (such as turning on the light or TV) or to collect data for later analysis (such as detecting and correcting body posture).”