As part of a wider pose-estimation programme called DensePose, Carnegie Mellon University engineers take advantage of the humble Wi-Fi chips found in, well, almost everything, using them to figure out where in the room we are standing, and even the pose we are striking.
Their system does this not by using any special sensors, but by taking the raw Wi-Fi signal data and applying some clever machine learning. And the researchers have now published their experiments in a paper, which shows the DensePose software accurately drawing a wireframe mesh identifying test subjects – and even correctly predicting the direction they are facing, and the position of their arms and legs.
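To make the idea a little more concrete, here is a minimal, hypothetical sketch in PyTorch of how a neural network might map a window of raw Wi-Fi channel state information (CSI) to body keypoints. Every detail here is an illustrative assumption: the layer sizes, the input shape, the 17-keypoint output and the class name WifiPoseNet are invented for this example and are not the architecture described in the CMU paper.

import torch
import torch.nn as nn

class WifiPoseNet(nn.Module):
    def __init__(self, n_antenna_pairs=9, n_keypoints=17):
        super().__init__()
        self.n_keypoints = n_keypoints
        # Treat a short window of CSI samples as a multi-channel "image":
        # channels = antenna pairs, height = subcarriers, width = time steps.
        # (Shapes are illustrative assumptions, not taken from the paper.)
        self.encoder = nn.Sequential(
            nn.Conv2d(n_antenna_pairs, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 1)),
        )
        # Regress an (x, y) coordinate for each body keypoint.
        self.head = nn.Linear(64, n_keypoints * 2)

    def forward(self, csi):
        feats = self.encoder(csi).flatten(1)
        return self.head(feats).view(-1, self.n_keypoints, 2)

# Example: a batch of 4 CSI windows, each with 9 antenna pairs,
# 30 subcarriers and 100 time samples.
dummy_csi = torch.randn(4, 9, 30, 100)
keypoints = WifiPoseNet()(dummy_csi)   # shape: (4, 17, 2)

The actual system produces dense DensePose-style body-surface maps rather than a handful of keypoints; the sketch above only shows the general signal-in, pose-out structure of such a model.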
Research professor Fernando De La Torre, alongside his colleagues Jiaqi Geng and Dong Huang, believes that such technology could be particularly useful for taking care of elderly people living at home.
“My parents live in Spain, and they are getting older,” says De La Torre.