'Smart clothes' that can measure your movements

"Smart clothes" that can measure your movements
Credit: Massachusetts Institute of Technology

In recent years there have been exciting breakthroughs in wearable technologies, like smartwatches that can monitor your breathing and blood oxygen levels.

But what about a wearable that can detect how you move as you do a workout or play a sport, and could potentially even offer feedback on how to improve your technique?

And, as a major bonus, what if the wearable were something you'd actually already be wearing, like a shirt or a pair of socks?

That's the idea behind a new set of MIT-designed clothing that uses special fibers to sense a person's movement via touch. Among other things, the researchers showed that their clothes can determine whether someone is sitting, walking, or doing particular poses.

The group from MIT's Computer Science and Artificial Intelligence Lab (CSAIL) says that their clothes could be used for athletic training and rehabilitation. With patients' permission they could even help passively monitor the health of residents in assisted-care facilities and determine if, for example, someone has fallen or is unconscious.

The researchers have developed a range of prototypes, from socks and gloves to a full vest. The team's "tactile electronics" use a mix of more typical textile fibers alongside a small amount of custom-made functional fibers that sense pressure from the person wearing the garment.

Credit: Wan Shou, MIT

According to CSAIL graduate student Yiyue Luo, a key advantage of the team's design is that, unlike many existing wearable electronics, theirs can be incorporated into traditional large-scale clothing production. The machine-knitted tactile textiles are soft, stretchable, breathable, and can take a wide range of forms.

"Traditionally it's been hard to develop a mass-production that provides high-accuracy data across a large number of sensors," says Luo, lead author on a new paper about the project that is appearing in this month's edition of Nature Electronics. "When you manufacture lots of sensor arrays, some of them will not work and some of them will work worse than others, so we developed a self-correcting mechanism that uses a self-supervised machine learning algorithm to recognize and adjust when certain sensors in the design are off base."

The team's clothes have a range of capabilities. Their socks predict motion by looking at how different sequences of tactile footprints correlate to different poses as the user transitions from one pose to another. The full-sized vest can also detect the wearer's pose, activity, and the texture of the contacted surfaces.
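As a rough illustration of how sequences of tactile footprints might be mapped to poses, here is a toy classifier. It is not the team's learned model: it simply summarizes each short window of pressure frames as a feature vector and assigns the nearest class centroid, with made-up data standing in for real sensor readings.

    # Illustrative sketch only -- a toy stand-in for the learned models in the paper.
    import numpy as np

    def window_features(frames):
        """frames: (T, H, W) sequence of pressure maps -> one feature vector."""
        frames = np.asarray(frames, dtype=float)
        mean_map = frames.mean(axis=0)                           # average pressure pattern
        diff_map = np.abs(np.diff(frames, axis=0)).mean(axis=0)  # how the pattern changes
        return np.concatenate([mean_map.ravel(), diff_map.ravel()])

    class NearestCentroidPose:
        def fit(self, windows, labels):
            feats = np.stack([window_features(w) for w in windows])
            self.classes_ = sorted(set(labels))
            self.centroids_ = np.stack(
                [feats[[l == c for l in labels]].mean(axis=0) for c in self.classes_])
            return self

        def predict(self, window):
            f = window_features(window)
            d = np.linalg.norm(self.centroids_ - f, axis=1)
            return self.classes_[int(d.argmin())]

    # Toy usage: synthetic 8-frame windows of 16x16 pressure maps
    rng = np.random.default_rng(0)
    standing = [rng.random((8, 16, 16)) * 0.2 for _ in range(5)]
    walking = [rng.random((8, 16, 16)) for _ in range(5)]
    clf = NearestCentroidPose().fit(standing + walking, ["standing"] * 5 + ["walking"] * 5)
    print(clf.predict(rng.random((8, 16, 16))))  # -> "walking" for this high-pressure toy window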

The authors imagine a coach using the sensors to analyze people's postures and give suggestions on improvement. They could also be used by an experienced athlete to record their own posture so that beginners can learn from it. In the long term, they even imagine that robots could be trained to learn how to do different activities using data from the wearables.

"Imagine robots that are no longer tactilely blind, and that have 'skins' that can provide tactile sensing just like we have as humans," says corresponding author Wan Shou, a postdoc at CSAIL. "Clothing with high-resolution tactile sensing opens up a lot of exciting new application areas for researchers to explore in the years to come."

More information: Yiyue Luo et al. Learning human–environment interactions using conformal tactile textiles, Nature Electronics (2021). DOI: 10.1038/s41928-021-00558-0
