X-ray vision has long seemed like a far-fetched sci-fi fantasy, but over the last decade, a team led by Professor Dina Katabi from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has continually gotten us closer to seeing through walls.
Their latest project, “RF-Pose,” uses artificial intelligence (AI) to teach wireless devices to sense people’s postures and movement, even from the other side of a wall.
The researchers use a neural network to analyze radio signals that bounce off people’s bodies, then generate a dynamic stick figure that walks, stops, sits, and moves its limbs as the person performs those actions.
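To make the idea concrete, here is a minimal sketch of the general approach: a small encoder-decoder network that takes radio-reflection heatmaps as input and predicts a confidence map for each skeleton keypoint. The two-channel input, the 14-keypoint skeleton, and the layer choices are all assumptions for illustration, not the actual RF-Pose architecture.

```python
# Illustrative sketch only: a toy encoder-decoder that maps RF reflection
# heatmaps to per-keypoint confidence maps, in the spirit of estimating a
# stick figure from radio signals. Shapes, channel counts, and layers are
# assumptions, not the published RF-Pose model.
import torch
import torch.nn as nn

NUM_KEYPOINTS = 14  # assumed number of stick-figure joints

class RFPoseSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress two RF heatmaps (e.g., horizontal + vertical planes)
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to one spatial confidence map per keypoint
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, NUM_KEYPOINTS, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, rf_heatmaps):
        # rf_heatmaps: (batch, 2, H, W) radio reflection intensities
        return self.decoder(self.encoder(rf_heatmaps))

if __name__ == "__main__":
    model = RFPoseSketch()
    fake_rf = torch.randn(1, 2, 64, 64)   # one synthetic RF frame
    keypoint_maps = model(fake_rf)         # (1, NUM_KEYPOINTS, 64, 64)
    print(keypoint_maps.shape)
```

In practice, the peak of each keypoint map would give a joint location, and connecting the joints frame by frame yields the moving stick figure described above.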