A chauffeur for everyone
The self-driving car has been in development since the 1970s. But apart from cruise control and beeping parking sensors, little happened in the field for a long time.
The DARPA Grand Challenge of 2004 was a turning point. DARPA is a research agency of the US Department of Defense that pioneered, among other things, the internet. It offered a one-million-dollar prize for the first vehicle to autonomously complete a 240-kilometer (149.1 mi) course through the Mojave Desert. Of the fifteen teams taking part, none reached the finish line.
One vehicle made it twelve kilometers (7.5 mi) before running off the road in a hairpin bend and catching fire. The challenge was repeated in 2005 and showed how quickly the development of autonomous driving had progressed: of 23 teams, five reached the finish line just one year later, and most exceeded the previous year's best result.
How autonomous driving works
To travel safely in street traffic, the vehicle needs information. This is provided by ultrasonic sensors, radar and LiDAR (Light Detection and Ranging, i.e. distance and speed measurement via laser), tire and inertial sensors, and cameras; external information such as GPS position data and high-resolution maps is also used. All of this first flows into a control system, where a picture of the vehicle’s surroundings is generated and interpreted in real time. Depending on the level of automation, the corresponding vehicle commands (brake, accelerate, signal, turn, etc.) are then derived for the respective actuators (brake and engine system, steering system, etc.).
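The sense-interpret-act loop described above can be sketched in a few lines. This is a deliberately minimal illustration, not a real driving stack: the class, function, and threshold values are all hypothetical, and the "interpretation" step is reduced to a single safe-gap rule.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float     # from radar/LiDAR ranging
    rel_speed_ms: float   # relative speed, negative = closing in

def derive_commands(obstacles, ego_speed_ms, target_speed_ms):
    """Keep a ~2-second gap to the closest obstacle, else hold target speed."""
    commands = {"brake": 0.0, "accelerate": 0.0}
    safe_gap_m = 2.0 * ego_speed_ms  # two seconds of travel at current speed
    closest = min((o.distance_m for o in obstacles), default=float("inf"))
    if closest < safe_gap_m:
        # Brake harder the further we are inside the safe gap.
        commands["brake"] = min(1.0, (safe_gap_m - closest) / safe_gap_m)
    elif ego_speed_ms < target_speed_ms:
        commands["accelerate"] = 0.3  # gentle throttle toward target speed
    return commands

# Obstacle 25 m ahead while driving 20 m/s: the 40 m safe gap is violated.
cmd = derive_commands([Obstacle(25.0, -5.0)], ego_speed_ms=20.0, target_speed_ms=27.0)
```

In a real vehicle, of course, the environment model fuses all the sensor modalities listed above before any such decision is made.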
Always up to date: dynamic landscape mapping with AI
The Mobile Mapping System from Mitsubishi Electric is currently installed on the roofs of numerous mapping vehicles, collecting a wide range of relevant data in real time as the vehicles are driven. This data is the basis for 3D maps that are accurate to the centimeter. The actual maps are created by ‘Dynamic Map Planning’, a consortium under the leadership of Mitsubishi Electric, where the raw data is processed and converted into dynamic maps. Dynamic here means that artificial intelligence filters out only the changed information, such as new traffic signs or street markings, and incorporates it when updating the maps. In the future, a larger number of ordinary vehicles, such as delivery trucks, will be equipped with a (miniaturized) Mobile Mapping System.
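The "only changed information" idea amounts to a diff between the previous map version and a fresh survey. The sketch below illustrates that principle with plain dictionaries; all feature IDs and values are invented, and the real pipeline naturally uses AI-based recognition rather than exact key matching.

```python
# Old map version and a fresh survey pass (all IDs/values hypothetical).
old_map = {"sign_017": "speed_50", "lane_003": "solid", "sign_021": "stop"}
survey  = {"sign_017": "speed_30", "lane_003": "solid", "sign_022": "yield"}

def diff_update(old, new):
    """Return the updated map plus only the delta that actually changed."""
    changed = {k: v for k, v in new.items() if old.get(k) != v}
    removed = [k for k in old if k not in new]
    updated = {k: v for k, v in old.items() if k in new}  # drop removed features
    updated.update(changed)                               # apply only the delta
    return updated, changed, removed

updated, changed, removed = diff_update(old_map, survey)
# Unchanged features like lane_003 are never reprocessed or retransmitted.
```

Transmitting only `changed` and `removed` instead of the full map is what keeps centimeter-precision maps affordable to update.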
By the way: With the Quasi-Zenith Satellite System, Mitsubishi Electric also provides positional data exact to the centimeter. In order to achieve this level of precision, the so-called Centimeter Level Augmentation Service (CLAS) was started in 2018. It augments satellite signals and corrects disruptions of the signal by the ionosphere and troposphere, which has prevented centimeter level positioning so far – another small building block that brings us closer to the driverless car.
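Conceptually, an augmentation service broadcasts an estimate of the current position error, which the receiver then subtracts from its raw fix. The sketch below shows only that final subtraction step with invented numbers; the actual CLAS message format and correction model are far more elaborate.

```python
import math

M_PER_DEG_LAT = 111_320.0  # rough meters per degree of latitude

def apply_correction(fix_deg, error_ne_m):
    """Subtract an estimated north/east error (meters) from a raw fix (degrees)."""
    lat, lon = fix_deg
    north_err, east_err = error_ne_m
    d_lat = north_err / M_PER_DEG_LAT
    d_lon = east_err / (M_PER_DEG_LAT * math.cos(math.radians(lat)))
    return (lat - d_lat, lon - d_lon)

raw_fix = (35.681236, 139.767125)                     # hypothetical raw GNSS fix
corrected = apply_correction(raw_fix, (-0.42, 0.18))  # hypothetical broadcast error
```

The point of the example: atmospheric errors of a few meters shift the fix by mere millionths of a degree, which is exactly the scale a centimeter-level service has to resolve.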
Artificial intelligence instead of pattern recognition
Autonomous driving may yet pose unexpected challenges: animal recognition, for instance, has proven to work excellently with elk but ran into great difficulty in Australia. The various kangaroo species of the fifth continent differ starkly in size, and their silhouettes vary greatly depending on posture. In addition, kangaroos jump up to 12 meters, so the vehicle’s sensors can no longer reliably determine their distance and speed. The consequence: the system does not work very reliably (at least in Australia).
In another situation, a car and a truck were involved in a fatal collision because the system had mistakenly interpreted the side of the oncoming truck as a billboard. The complexity of the real world places further large obstacles on the path to autonomous driving, and pure pattern recognition is not enough to overcome them.
Part of the solution is to expand the database with information from the cloud or the internet. This may be weather and temperature data (fog, black ice, etc.), data from vehicles nearby or from other road users on the same route, but also relevant map material. This information allows autonomous vehicles to see more than their sensors alone can detect.
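"Seeing more than the sensors" can be pictured as merging two lists: what the onboard sensors detect within their physical range, and what cloud services report further down the route. The sketch below assumes a 200 m sensing horizon and invented hazard reports purely for illustration.

```python
SENSOR_RANGE_M = 200.0  # assumed onboard sensing horizon

local_detections = [{"type": "vehicle", "distance_m": 80.0}]
cloud_reports = [
    {"type": "black_ice", "distance_m": 1500.0},   # e.g. from a weather service
    {"type": "congestion", "distance_m": 4200.0},  # e.g. from vehicles ahead
]

def combined_view(local, cloud, sensor_range):
    """Merge local detections with cloud reports beyond the sensor horizon."""
    beyond_horizon = [r for r in cloud if r["distance_m"] > sensor_range]
    return sorted(local + beyond_horizon, key=lambda r: r["distance_m"])

view = combined_view(local_detections, cloud_reports, SENSOR_RANGE_M)
# The vehicle now "knows" about black ice 1.5 km away that no sensor can see.
```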
Benefits of autonomous driving
For autonomous vehicles, safety features are just as important as comfort. A human’s attention fades after about twenty minutes of monotony and routine. Networked systems, in contrast, can maintain distance, lane and, above all, speed indefinitely – excessive speed being the most common cause of fatal traffic accidents. Because hard acceleration and braking are reduced, fuel consumption, noise pollution and exhaust emissions drop as well. Trucks would also be able to line up in columns (platoons) in the slipstream of the truck ahead.
Limits of autonomous driving
The higher the driving speed, the more data must be recorded in the same span of time, and the demands on data processing, analysis and the issuing of control commands grow explosively. While a drive on a level section of highway with clearly visible lane markings and no cross traffic can almost be treated as static by the technology, city driving with pedestrians and bicycles, as well as main roads, often presents situations that require quick, flexible handling. If a ball rolls onto the street, the system must decide within fractions of a second whether to brake, evade or not react at all. In such situations, the general judgment of human drivers is still far superior to current systems, even if the latter have a slightly faster reaction time of 1.2 seconds. Autonomous driving in city traffic is therefore highly complex: artificial intelligence (AI), machine learning and decision algorithms must become much better before autonomous driving is truly safe.
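Why fractions of a second matter is easy to quantify: stopping distance is reaction distance plus braking distance. The worked example below uses the 1.2-second reaction time mentioned in the text and assumes a braking deceleration of 8 m/s² (roughly a dry road); both the function and the deceleration figure are illustrative.

```python
def stopping_distance_m(speed_kmh, reaction_s=1.2, decel_ms2=8.0):
    """Reaction distance plus braking distance (constant deceleration)."""
    v = speed_kmh / 3.6                    # convert km/h to m/s
    reaction_d = v * reaction_s            # distance covered before braking starts
    braking_d = v * v / (2.0 * decel_ms2)  # classic v^2 / (2a)
    return reaction_d + braking_d

d_city = stopping_distance_m(50)      # ~28.7 m at city speed
d_highway = stopping_distance_m(100)  # ~81.6 m at highway speed
```

Doubling the speed almost triples the stopping distance here, which is why higher speeds demand so much more from perception and decision-making.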
Design study and technology driver on the way to autonomous driving: the EMIRAI 4
1. Head-up display with augmented reality.
Augmented reality combines data from position sensors with highly precise 3D mapping and positioning technology and visualizes it on the head-up display. Lane edges and all street markings are overlaid onto the view so that the driver can navigate even in bad weather and poor visibility.
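Overlaying a mapped lane edge onto the display means projecting a 3D point in vehicle coordinates to a 2D pixel position. A simple pinhole-camera model illustrates the geometry; the focal length and image-center values below are placeholders, not EMIRAI 4 specifications.

```python
def project(point_xyz, focal_px=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of a vehicle-frame point onto display pixels."""
    x, y, z = point_xyz  # x right, y up, z forward, all in meters
    if z <= 0:
        return None      # point is behind the viewer, nothing to draw
    u = cx + focal_px * x / z
    v = cy - focal_px * y / z
    return (u, v)

# Lane edge 20 m ahead, 1.8 m to the right, 1.2 m below eye level:
edge_px = project((1.8, -1.2, 20.0))
```

Drawing many such projected points along the mapped lane edge produces the overlaid line the driver sees, regardless of fog or darkness.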
2. Intuitive human-machine-interface.
Using the intuitive control knob, the driver can operate every function without having to take their eyes off the road.
3. Driver status recognition.
A wide-angle camera permanently analyzes the physical condition of the driver and passengers by following their head positions. In the event of a problem, it warns the driver and, in emergency situations, safely transfers control from manual to autonomous driving.
There is therefore still a lot to do, and Mitsubishi Electric is leading the way.