Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have created a tactile sensing carpet that can gauge human poses without the use of cameras. The technology could be a big step toward improved personalized healthcare, smart homes, and gaming gadgets.
Existing technologies that monitor human activity rely on cameras, whether single RGB cameras, omnidirectional wearables, or the good ol’ webcam. These, however, raise privacy concerns about the possibility of being watched 24/7.
Here’s where this carpet differs: cameras were used only to create the dataset the system was trained on.
Now, to have their pose inferred, a person just has to step onto the carpet and strike their favorite yoga pose, and the system can determine whether they have done a sit-up, a stretch, or something else entirely.
The carpet itself is made of commercial pressure-sensitive film and conductive thread, with over 9,000 sensors spanning an area of 36 by two feet.
When a user’s feet, limbs, or torso make contact with the carpet, the sensors convert the pressure into electrical signals. The CSAIL team trained the system on synchronized tactile and visual data, such as a video and a corresponding heat map of a user doing a push-up.
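To make the pressure-to-heat-map idea concrete, here is a minimal sketch of how raw sensor readings from a grid like the carpet’s might be normalized into a tactile “heat map” frame suitable for pairing with video. The grid size, value ranges, and noise floor are illustrative assumptions, not the actual hardware spec or the CSAIL pipeline.

```python
# Hypothetical sketch: normalizing a 2-D grid of raw pressure-sensor
# readings into a [0, 1] heat-map frame. Values below a noise floor
# are zeroed; the rest are scaled relative to the strongest reading.
def to_heatmap(raw, noise_floor=5):
    peak = max((v for row in raw for v in row), default=0)
    if peak <= noise_floor:
        # Nothing on the carpet: return an all-zero frame.
        return [[0.0 for _ in row] for row in raw]
    return [
        [0.0 if v <= noise_floor else (v - noise_floor) / (peak - noise_floor)
         for v in row]
        for row in raw
    ]

# A toy 3x4 frame: one foot pressing near the left edge of the grid.
frame = [
    [0, 120, 130, 2],
    [3, 200, 180, 1],
    [0,  90,  40, 0],
]
heat = to_heatmap(frame)
print(round(heat[1][1], 3))  # the strongest cell normalizes to 1.0
```

In the real system, frames like this would arrive many times per second and be fed, alongside synchronized video, to a learned model rather than inspected by hand.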
“You can imagine leveraging this model to enable a seamless health monitoring system for high-risk individuals, for fall detection, rehab monitoring, mobility, and more,” said Yiyue Luo, lead author of the paper.
Currently, the system can predict a user’s pose with an error margin of less than 10 cm (3.9 inches). In classifying specific actions, it was accurate 97% of the time.
Yunzhu Li, co-author of the paper, said the carpet could also be used for exercise: it will be able to recognize the activity being performed (push-ups, sit-ups, and so on), count the number of reps, and estimate the calories burned.
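As a rough illustration of the rep-counting idea, each push-up shows up in the carpet’s total-pressure signal as a pressure peak under the hands as the body lowers. The heuristic below counts rising threshold crossings in such a signal; the threshold and sample values are made up, and the actual system uses a learned activity-recognition model, not this rule.

```python
# Hypothetical sketch: counting reps as rising crossings of a pressure
# threshold in a 1-D total-pressure time series from the carpet.
def count_reps(total_pressure, threshold):
    reps = 0
    above = False
    for p in total_pressure:
        if not above and p >= threshold:
            # Signal just rose past the threshold: one new rep.
            reps += 1
            above = True
        elif above and p < threshold:
            # Signal fell back below: ready for the next rep.
            above = False
    return reps

# Toy signal: three push-ups (pressure peaks) over a baseline of ~10.
signal = [10, 12, 30, 45, 20, 11, 28, 50, 18, 9, 33, 47, 15, 10]
print(count_reps(signal, threshold=25))  # 3
```

A calorie estimate could then follow from the rep count and a per-rep energy figure, though that step is pure arithmetic on top of the recognition problem.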
While the system can currently identify the poses of a single user, the researchers want to improve its performance with multiple users. They also hope to extract more information from the signals, such as a user’s height or weight.
To see the “magic” carpet in action, take a look at the video below.
[via ZDNet, video and cover image via MIT Computer Science & Artificial Intelligence Lab]