In an effort to remove some of the (cough, cough) fogginess around how we got this data, here’s a brief summation. The data for the Fog Tracker images comes from the National Oceanic and Atmospheric Administration’s GOES-17 satellite and includes visible clouds and fog formations, as well as visibility conditions in San Francisco. Stop us if you’ve heard this one: GOES satellite data managed by the NOAA is available free for use through Amazon Web Services.

The images above come from GOES-17’s Band 2, also known as the red visible band. (Oooooh.) Band 2 has the highest resolution of the 16 bands (score!) and can help identify fog boundaries and cloud coverage. The red band cannot create a “true color” image on its own (aww, shucks), so these images use a blue-to-white color scheme to distinguish the darker land and ocean from the lighter clouds and fog.

San Francisco weather condition data comes from Accuweather and is updated every 20 minutes. You’re on your own during the 19-minute intervals.

Special thanks to Brian Blaylock for his assistance with understanding GOES data. He’s one smart dude.

According to the NHTSA (National Highway Traffic Safety Administration), the most common weather-related issues that lead to accidents are rain and wet road surfaces. Frozen roads actually pose a much smaller problem than wet roads, mainly because the season for potential road freezing is much shorter and the areas affected are smaller.

An autonomous vehicle sees the world through various sensors, typically cameras, laser scanners (so-called LiDAR sensors), and radars. Not surprisingly, rain and fog are the main obstacles for autonomous vehicle sensors to contend with. Weather conditions pose problems particularly for sensors operating in visible light frequencies, i.e. cameras and LiDAR sensors.

Fog and snow obstruct the camera’s view, much like they make it difficult for a human driver to see ahead. The thicker the fog or the more intense the rain, the harder it is to identify objects in the distance. Varying lighting conditions also compromise camera performance. In the dark, cameras mostly have to rely on the vehicle’s lights or streetlights. In daylight, if the sun is close to the horizon or the terrain is covered by snow, cameras may be blinded by the brightness or by the lack of any distinct objects in the view. Various objects can also look quite different under varying lighting conditions, making it all the more difficult for machine vision to operate.

LiDAR sensors transmit laser light and measure its reflections, producing a real-time 3D model of the vehicle’s surroundings. Rain, fog, autumn leaves, or other debris carried by the wind distort the 3D image rendered by the LiDAR sensor. This type of interference can generate random, erroneous observations of the environment, making it more difficult to measure actual objects.

LiDAR sensors, by design, produce massive amounts of data. The devices may, in fact, measure up to a million points per second. The sheer amount of LiDAR data produced is useful, as it provides great opportunities for computational error correction. For example, raindrops or blowing sand do not stay put, unlike traffic signs or buildings. LiDAR sensor weather-proofing is therefore a challenge that can, in many cases, be solved through advanced software. For example, Sensible 4’s positioning algorithm – measuring the shape of the terrain and buildings surrounding the vehicle – works even when over 50% of the measured information is distorted by weather.

The greatest strength of LiDAR sensors is their independence from lighting conditions. LiDAR works equally well at night and during the day, and it does not get blinded by sunlight from the horizon or by shining banks of snow.

The third sensor type, radar, is primarily weather-independent. The radar’s ability to measure through rain and clouds is widely relied on in land vehicles, seafaring, aviation, and space technology. Radars have been in commercial use in vehicles for years, and the technology is now proven. The primary issue with radars is the difficulty of interpreting the measured information. Radars work best when measuring large, moving metallic objects, i.e. determining the position and direction of other vehicles. Therefore, radars are commonly used when implementing Adaptive Cruise Control systems. Detecting other common road users, such as pedestrians and bikers, is considerably more challenging for radars.
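The error-correction idea described here – static structures like buildings persist from scan to scan, while raindrops and blowing sand do not – can be sketched as a simple temporal persistence filter over voxelized point clouds. This is an illustrative sketch only, not Sensible 4’s actual algorithm; the voxel size, frame count, and persistence threshold are all assumptions, and it uses NumPy for brevity.

```python
import numpy as np

def filter_transients(frames, voxel=0.2, min_hits=3):
    """Keep only the points of the latest frame whose voxel cell is
    occupied in at least `min_hits` of the given frames. Transient
    returns (raindrops, wind-blown debris) rarely occupy the same
    cell across consecutive scans, so they are dropped."""
    counts = {}  # voxel cell -> number of frames that saw a point there
    for pts in frames:
        cells = {tuple(c) for c in np.floor(pts / voxel).astype(int)}
        for c in cells:
            counts[c] = counts.get(c, 0) + 1
    # Filter the most recent frame: keep points in persistent cells only.
    last = frames[-1]
    keep = np.array([counts[tuple(c)] >= min_hits
                     for c in np.floor(last / voxel).astype(int)])
    return last[keep]
```

A real pipeline would work on ego-motion-compensated scans (so that static structure actually lands in the same cells while the vehicle moves), but the persistence principle is the same.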
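The GOES data on AWS is organized into public S3 buckets keyed by product, year, day-of-year, and hour, so a small helper can build the prefix needed to browse an hour of imagery. This is a rough sketch of how one might locate the files; the `ABI-L2-CMIPC` product (Cloud and Moisture Imagery, CONUS) is our assumption about which product a fog image would use, not something stated above.

```python
from datetime import datetime, timezone

def goes_prefix(product: str, t: datetime) -> str:
    """Build the S3 key prefix used by the public NOAA GOES buckets:
    <product>/<year>/<day-of-year>/<hour>/  (timestamps are UTC)."""
    return f"{product}/{t.year}/{t.timetuple().tm_yday:03d}/{t.hour:02d}/"

# One hour of CONUS imagery around 2020-06-01 18:00 UTC
prefix = goes_prefix("ABI-L2-CMIPC", datetime(2020, 6, 1, 18, tzinfo=timezone.utc))
# prefix == "ABI-L2-CMIPC/2020/153/18/"
```

Because the bucket is public, you can list that prefix anonymously with boto3, e.g. `boto3.client("s3", config=Config(signature_version=UNSIGNED))` followed by `list_objects_v2(Bucket="noaa-goes17", Prefix=prefix)`.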