MIT develops depth-imaging system that sees through fog

CAMBRIDGE, MASSACHUSETTS — MIT researchers have developed a new imaging system that can estimate the distance of objects obscured by fog.

One of the main challenges facing self-driving navigation systems that rely on visible light is their inability to cope with misty or foggy driving conditions.
The system uses a time-of-flight camera, which emits short laser bursts at objects and then measures how long the light takes to bounce back.
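The round-trip timing principle behind time-of-flight ranging can be sketched in a few lines. This is standard physics, not MIT's implementation; the distance is simply half the path the pulse travels:

```python
# Time-of-flight ranging: estimate distance from a laser pulse's
# round-trip time. Illustrative sketch only -- the formula d = c * t / 2
# is the textbook principle, not the MIT system's code.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(t_seconds: float) -> float:
    """One-way distance to an object from the pulse's round-trip time.

    The pulse travels out to the object and back, so the distance
    is half the total path length: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * t_seconds / 2.0

# A pulse returning after 200 nanoseconds puts the object ~30 m away.
print(round(distance_from_round_trip(200e-9), 2))  # → 29.98
```

In clear air this timing maps directly to depth; the difficulty fog introduces is that scattered photons arrive at many different times, masking the true return.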

Fog normally scatters laser light, which is problematic for autonomous vehicles. But thanks to an algorithm developed by the scientists, the system can find patterns in the scattered light to calculate distance.
Researchers tested the system in fog much denser than cars would encounter in the real world.
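To give a flavour of how a reflection might be recovered from scattered light, here is a deliberately simplified sketch: a histogram of photon arrival times containing a broad fog background plus a narrow reflection peak, where the background is estimated with a moving average and the reflection found as the largest excess above it. The histogram shape, bin numbers, and background-estimation method are all illustrative assumptions, not the team's actual statistical model:

```python
import numpy as np

# Illustrative sketch only: picking a reflection peak out of a broad
# fog background in a photon arrival-time histogram. The MIT system
# models the fog's statistics far more carefully; here we just
# estimate the smooth background and look for the largest excess.

rng = np.random.default_rng(0)

bins = 200
# Hypothetical fog background: broad, slowly decaying count profile.
background = 50.0 * np.exp(-np.arange(bins) / 60.0)
# Hypothetical object reflection: a narrow peak around time bin 120.
signal = np.zeros(bins)
signal[118:123] = [20, 60, 100, 60, 20]
counts = rng.poisson(background + signal).astype(float)

# Estimate the slowly varying background with a wide moving average.
kernel = np.ones(25) / 25.0
estimated_bg = np.convolve(counts, kernel, mode="same")

# The reflection shows up as the largest excess over the background.
excess = counts - estimated_bg
peak_bin = int(np.argmax(excess))
print(peak_bin)  # a bin near 120, i.e. the object's true arrival time
```

The recovered bin index would then feed the round-trip distance formula. In practice the fog background varies with density and must be estimated continuously, which is what makes the problem hard.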

The team from MIT Media Lab's Camera Culture Group found that under these conditions the system actually outperformed human vision.

The goal is to integrate the system into autonomous vehicles so that even in bad weather, self-driving cars can avoid obstacles.