A standard camera takes flat, 2-D pictures. To get 3-D information, such as the distance to a far-away object, scientists can bounce a laser beam off the object and measure how long it takes the light to travel back to a detector.
The technique, called time-of-flight (ToF), is used in navigation systems for autonomous vehicles and other applications, but many current systems have a relatively short range and struggle to image objects that do not reflect laser light well.
Researchers have tackled these limitations and reported their findings in the journal Optics Express.
The new system works by sweeping a low-power infrared laser beam rapidly over an object. It records, pixel-by-pixel, the round-trip flight time of the photons in the beam as they bounce off the object and arrive back at the source.
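In principle, each pixel's depth follows from the simple relation d = c·t/2, since the photons cover the target distance twice. The sketch below illustrates that conversion on made-up timing data; it is not the team's actual processing code, and the array sizes and noise level are assumptions for illustration.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(times_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip photon flight times (seconds) to
    distances (metres). The light travels out and back, so the one-way
    distance is half the round-trip path: d = c * t / 2."""
    return C * times_s / 2.0

# Example: a tiny 3x3 "scan" with round-trip times near 6.67 microseconds,
# i.e. a target roughly 1 km away; 10-picosecond timing scatter maps to
# millimetre-scale depth scatter.
times = np.full((3, 3), 6.67e-6) + np.random.normal(0, 1e-11, (3, 3))
print(depth_from_round_trip(times))
```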
The system can resolve depth on the millimetre scale over long distances using a detector that can "count" individual photons.
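That millimetre-scale resolution relies on timing many individual photon detections per pixel rather than a single echo: the arrival times are histogrammed and the histogram peak pins down the round-trip time far more precisely than any one detection. Here is a hedged sketch of that idea using synthetic photon arrival times and an assumed 10-picosecond bin width, not real detector output.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def estimate_round_trip(arrival_times_s, bin_width_s=10e-12):
    """Histogram single-photon arrival times for one pixel and return a
    count-weighted centroid around the peak bin as the round-trip time."""
    t = np.asarray(arrival_times_s)
    bins = np.arange(t.min(), t.max() + bin_width_s, bin_width_s)
    counts, edges = np.histogram(t, bins=bins)
    peak = int(np.argmax(counts))
    lo, hi = max(0, peak - 2), min(len(counts), peak + 3)
    centers = (edges[:-1] + edges[1:]) / 2
    return np.average(centers[lo:hi], weights=counts[lo:hi])

# Synthetic example: true round trip ~6.67 us, ~100 ps detector jitter,
# plus a sprinkling of uncorrelated background counts.
rng = np.random.default_rng(0)
signal = rng.normal(6.67e-6, 100e-12, size=2000)
background = rng.uniform(6.66e-6, 6.68e-6, size=200)
t_est = estimate_round_trip(np.concatenate([signal, background]))
print(C * t_est / 2)  # depth estimate, within a few mm of ~1000 m
```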
"Our approach gives a low-power route to the depth imaging of ordinary, small targets at very long range," McCarthy said.
"Whilst it is possible that other depth-ranging techniques will match or out-perform some characteristics of these measurements, this single-photon counting approach gives a unique trade-off between depth resolution, range, data-acquisition time, and laser-power levels," he said.
The primary use of the system is likely to be scanning static, human-made targets, such as vehicles. With some modifications to the image-processing software, it could also determine their speed and direction.
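As an illustration of the kind of software modification this would involve (not anything described in the paper), a target's radial speed could be estimated by differencing its depth between two scans taken a known time apart. The frame size and numbers below are hypothetical.

```python
import numpy as np

def radial_speed(depth_a_m, depth_b_m, dt_s):
    """Estimate how fast a target moves toward or away from the scanner
    from the change in its mean depth between two scans dt_s apart.
    Positive values mean the target is receding."""
    return (np.mean(depth_b_m) - np.mean(depth_a_m)) / dt_s

# Hypothetical example: a target about 1 km away drifts 0.5 m farther
# between two scans taken 2 seconds apart -> 0.25 m/s away from the sensor.
scan1 = np.full((64, 64), 1000.00)
scan2 = np.full((64, 64), 1000.50)
print(radial_speed(scan1, scan2, dt_s=2.0))  # 0.25
```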
The laser's long infrared wavelength means the light travels more easily through the atmosphere, is not drowned out by sunlight, and is safe for eyes at low power.
Ultimately, McCarthy said, it could scan and image objects located as far as 10 kilometres away.