Researchers at Carnegie Mellon University have created a new technology that uses the camera and accelerometers in an ordinary mobile phone to measure real objects in 3D space, building "3D models of the world" simply by waving the phone around an object or scene.
The sensors used, called inertial measurement units (IMUs), roughly tell the phone's software the position of the phone in space. IMUs are very noisy and are rarely used on their own to assess a phone's actual orientation with accuracy, but coupled with the camera they yield much better results.
The TechCrunch web site quoted Simon Lucey, associate research professor in the CMU Robotics Institute, as saying that the team was able to achieve accuracies with cheap sensors that they had not imagined possible.
Lucey also said that with a face tracker program, they were able to measure the distance between a person's pupils to within half a millimeter. Such measurements would be useful for applications such as virtual shopping for eyeglass frames.
This tool allows for better computer vision and could mean that you could create a 3D model of almost anything with a smartphone. The researchers expect to apply the technology to self-driving cars, bypassing expensive, heavy, and power-hungry radar. The team has already used it to create Smart Fit, an application that finds the perfect glasses frames for your face.
Lucey was also quoted as saying that the trajectory created with these cheap IMUs will 'drift' over time, but that the vision element is very accurate: "So we can use the 3-D model to correct for the errors caused by the IMU, even as we use the IMU to estimate the dimensions of the model."
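The correction Lucey describes is a form of sensor fusion. The sketch below is a toy illustration of that general idea, not the CMU system's actual algorithm; every function name and constant is an assumption chosen for demonstration. It shows how sparse but accurate vision-derived position fixes can rein in an IMU trajectory that drifts steadily:

```python
# Toy sensor-fusion sketch (illustrative only, not CMU's published method):
# IMU dead reckoning accumulates drift over time, while camera-based fixes
# are accurate but arrive infrequently. A simple loop can blend the two.

def fuse(imu_positions, vision_fixes, gain=0.8):
    """Correct a drifting IMU trajectory using sparse, accurate vision fixes.

    imu_positions: positions integrated from IMU readings (accumulates drift).
    vision_fixes:  {step index: accurate camera-derived position}.
    gain:          fraction of the observed error absorbed at each fix.
    """
    fused, drift = [], 0.0
    for i, p in enumerate(imu_positions):
        corrected = p - drift           # subtract current drift estimate
        if i in vision_fixes:           # a vision fix is available this step
            error = corrected - vision_fixes[i]
            drift += gain * error       # update the running drift estimate
            corrected = p - drift
        fused.append(corrected)
    return fused

# Simulated walk: true position advances 1.0 per step; the IMU path
# drifts by an extra +0.1 per step. Vision fixes arrive at steps 5 and 9.
true_path = [float(i) for i in range(10)]
imu_path = [t + 0.1 * i for i, t in enumerate(true_path)]
fused_path = fuse(imu_path, {5: true_path[5], 9: true_path[9]})
```

In this toy run the uncorrected IMU path ends 0.9 units off, while the fused path ends far closer to the truth, mirroring the article's point that vision corrects IMU drift even as the IMU supplies the in-between motion.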