Driving in total darkness at Ford’s Arizona Proving Ground in the US marks the next step on the company’s journey to delivering fully autonomous vehicles to customers around the globe.
It’s an important development, demonstrating that even without cameras, which rely on light, Ford’s LiDAR (Light Detection and Ranging) system, working with the car’s virtual driver software, is robust enough to steer flawlessly on winding roads. While it’s ideal to have all three sensor modes (radar, cameras and LiDAR), LiDAR can function independently on roads without street lighting.
In the US, National Highway Traffic Safety Administration data show that the passenger vehicle occupant fatality rate during hours of darkness is around three times the daytime rate.
“Thanks to LiDAR, the test cars aren’t reliant on the sun shining, nor cameras detecting painted white lines on the asphalt,” says Jim McBride, Ford technical leader for autonomous vehicles. “In fact, LiDAR allows autonomous cars to drive just as well in the dark as they do in daytime.”
To navigate in the dark, Ford self-driving cars use high-resolution 3D maps, complete with information about the road, road markings, geography, topography and landmarks like signs, buildings and trees. The vehicle uses LiDAR pulses to pinpoint itself on the map in real time. Additional data from the radar system is combined with LiDAR to complete the full sensing capability of the autonomous vehicle.
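Ford has not published its localization software, but the core idea of pinpointing the car by matching live LiDAR returns against a prior map can be sketched in a few lines. The Python sketch below is a hypothetical, heavily simplified 2D illustration: it grid-searches candidate poses around a prior estimate and keeps the pose whose transformed scan points sit closest to the stored map points. The function names, parameters and toy map are assumptions for illustration, not Ford's method.

```python
# Minimal sketch (not Ford's software): localizing a vehicle against a prior
# point map using LiDAR returns. All names and values are illustrative.
import numpy as np

def transform(points, x, y, theta):
    """Transform LiDAR points from the vehicle frame into the map frame."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.array([x, y])

def score_pose(scan, map_points, x, y, theta):
    """Score a candidate pose by the summed distance of scan points to their
    nearest map points (lower is better)."""
    world = transform(scan, x, y, theta)
    # Brute-force nearest-neighbour distances; a real system would use a k-d tree.
    d = np.linalg.norm(world[:, None, :] - map_points[None, :, :], axis=2)
    return d.min(axis=1).sum()

def localize(scan, map_points, prior, search=0.5, step=0.1):
    """Grid-search poses around a prior estimate and return the best match."""
    best, best_score = prior, np.inf
    x0, y0, th0 = prior
    for x in np.arange(x0 - search, x0 + search + 1e-9, step):
        for y in np.arange(y0 - search, y0 + search + 1e-9, step):
            for th in np.radians(np.arange(-5, 6, 1)) + th0:
                s = score_pose(scan, map_points, x, y, th)
                if s < best_score:
                    best, best_score = (x, y, th), s
    return best

# Toy example: a map of landmark points and a scan taken from a known pose.
map_points = np.array([[5.0, 0.0], [5.0, 2.0], [8.0, -1.0], [10.0, 3.0]])
true_pose = (1.2, -0.4, np.radians(3))
c, s = np.cos(true_pose[2]), np.sin(true_pose[2])
R = np.array([[c, -s], [s, c]])
scan = (map_points - np.array(true_pose[:2])) @ R  # map points seen from the vehicle
print(localize(scan, map_points, prior=(1.0, 0.0, 0.0)))  # recovers roughly the true pose
```

A production system would refine this with a proper scan-matching or filtering algorithm and the radar data mentioned above; the sketch only shows why a detailed prior map plus live LiDAR returns is enough to fix the car's position without any light.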
For the desert test, Ford engineers, sporting night vision goggles, monitored the Fusion from inside and outside the vehicle. Night vision allowed them to see the LiDAR functioning in the form of a grid of infrared laser beams projected around the vehicle as it drove past. LiDAR sensors shoot out 2.8 million laser pulses a second to precisely scan the surrounding environment.
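For a sense of what those pulses become in software, the hypothetical Python snippet below converts per-pulse range and angle measurements into the 3D point cloud the virtual driver works with. Only the 2.8 million pulses-per-second figure comes from the article; the function name, angles and sample sizes are illustrative assumptions, not details of Ford's sensors.

```python
# Minimal sketch (illustrative, not a sensor driver): turning raw LiDAR returns
# (range, azimuth, elevation per pulse) into 3D points around the vehicle.
import numpy as np

def pulses_to_points(ranges, azimuths, elevations):
    """Convert spherical LiDAR returns to Cartesian (x, y, z) points.

    ranges     -- distance to the reflecting surface, metres
    azimuths   -- horizontal angle of each pulse, radians
    elevations -- vertical angle of each pulse, radians
    """
    x = ranges * np.cos(elevations) * np.cos(azimuths)
    y = ranges * np.cos(elevations) * np.sin(azimuths)
    z = ranges * np.sin(elevations)
    return np.stack([x, y, z], axis=1)

# At 2.8 million pulses per second, a tenth of a second of data is 280,000 points.
n = 280_000
rng = np.random.default_rng(0)
ranges = rng.uniform(1.0, 100.0, n)                          # metres
azimuths = rng.uniform(-np.pi, np.pi, n)                     # full 360-degree sweep
elevations = rng.uniform(np.radians(-15), np.radians(15), n) # assumed vertical field of view
cloud = pulses_to_points(ranges, azimuths, elevations)
print(cloud.shape)  # (280000, 3)
```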