TUM scientist Leah Strand checks the technology on the gantry. Credit: Technical University Munich

In the Providentia++ project, researchers at the Technical University of Munich (TUM) have worked with industry partners to develop a technology that complements the vehicle's perspective, based on onboard sensor input, with a bird's-eye view of traffic conditions. This improves road safety, including for autonomous driving.

The expectations for autonomous driving are clear: "Cars have to travel safely not only at low speeds, but also in fast-moving traffic," says Jörg Schrepfer, the head of Driving Advanced Research Germany at Valeo. For example, when objects fall off a truck, the "egocentric" perspective of a car will often be unable to detect the hazardous debris in time. "In these cases, it will be difficult to execute smooth evasive action," says Schrepfer.

Researchers in the Providentia++ project have developed a system to transmit an additional view of the traffic situation into vehicles. "Using sensors on overhead sign bridges and masts, we have created a reliable, real-time digital twin of the traffic situation on our test route that functions around the clock," says Prof. Alois Knoll, project lead manager at TUM. "With this system, we can now complement the vehicle's view with an external perspective—a bird's-eye view—and incorporate the behavior of other road users into decisions."

Transmitting the digital twin into the car: Minimizing time lags

Transmitting the digital twin into the car is far from trivial: The digital twin needs to know the exact location of the vehicle into which the sensor station information is transmitted. To make this possible, the project partner Valeo used an IMU-GNSS system (inertial measurement unit—global navigation satellite system) consisting of a measurement unit, a satellite navigation system and a real-time kinematic kit.
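
The sketch below illustrates, in simplified Python, how such a setup might combine IMU dead reckoning with RTK-corrected GNSS fixes to maintain a vehicle pose. The function names, blending gain and update rates are illustrative assumptions, not the Valeo implementation.

    # Hypothetical sketch: fusing IMU dead reckoning with RTK-corrected GNSS fixes.
    # Names and numbers are illustrative only, not the Providentia++/Valeo system.
    from dataclasses import dataclass

    @dataclass
    class Pose:
        x: float   # east position in metres (local frame)
        y: float   # north position in metres (local frame)
        vx: float  # east velocity in m/s
        vy: float  # north velocity in m/s

    def predict_with_imu(pose: Pose, ax: float, ay: float, dt: float) -> Pose:
        """Propagate the pose between GNSS fixes using IMU accelerations (dead reckoning)."""
        return Pose(
            x=pose.x + pose.vx * dt + 0.5 * ax * dt**2,
            y=pose.y + pose.vy * dt + 0.5 * ay * dt**2,
            vx=pose.vx + ax * dt,
            vy=pose.vy + ay * dt,
        )

    def correct_with_gnss(pose: Pose, gnss_x: float, gnss_y: float, gain: float = 0.8) -> Pose:
        """Blend the dead-reckoned position with an RTK-corrected GNSS fix.
        A high gain trusts the centimetre-level RTK fix more than the IMU prediction."""
        return Pose(
            x=(1 - gain) * pose.x + gain * gnss_x,
            y=(1 - gain) * pose.y + gain * gnss_y,
            vx=pose.vx,
            vy=pose.vy,
        )

    if __name__ == "__main__":
        pose = Pose(x=0.0, y=0.0, vx=25.0, vy=0.0)            # ~90 km/h along the east axis
        pose = predict_with_imu(pose, ax=0.2, ay=0.0, dt=0.1)  # 10 Hz IMU step
        pose = correct_with_gnss(pose, gnss_x=2.51, gnss_y=0.01)
        print(pose)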

"In this way, we create a coordinate system in real time that is accurate to the nearest centimeter," explains Valeo expert Jörg Schrepfer. To synchronize the information from the vehicles and the measurement stations for the digital twin, the researchers use the UTC standard, which provides a uniform basis for coordinating time. Ideally, the digital mapping would be superimposed like a second layer over the car's perspective.

However, time lags (latencies) in the overall system cannot be entirely avoided. From the physical detection by the sensors and the processing of the data to the radio transmission to the vehicle, time passes. Data are packaged, coded and transmitted and then decoded in the car. Other conditions play a role, too, such as the distance of the vehicle from the transmitter tower on the test route and the traffic volume on the data transmission network. In a recent demonstration run, Valeo worked with the LTE (4G) wireless standard, which caused latency of 100 to 400 milliseconds. "These latencies can never be completely eliminated. However, intelligent algorithms will help," explains Schrepfer. "The results will be even better in the future when we have full coverage with the 5G or 6G telecommunications standards."
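
One common way such a compensation algorithm can work is to extrapolate each received object forward by the measured latency. The Python sketch below shows this under a constant-velocity assumption; the 100 to 400 millisecond figures come from the article, but the method shown is illustrative only and not Valeo's actual implementation.

    # Hypothetical sketch: compensating transmission latency by extrapolating a
    # detected object forward to reception time under a constant-velocity assumption.
    import time

    def compensate_latency(x, y, vx, vy, t_detect_utc, t_now_utc):
        """Predict where a detected object is at reception time, given its state
        at detection time and the measured end-to-end latency."""
        latency = t_now_utc - t_detect_utc  # e.g. 0.1-0.4 s over LTE
        return x + vx * latency, y + vy * latency

    if __name__ == "__main__":
        t_now = time.time()
        t_detect = t_now - 0.3              # a detection that is 300 ms old on arrival
        # Object travelling at 30 m/s (~108 km/h) along the road axis
        print(compensate_latency(x=120.0, y=3.5, vx=30.0, vy=0.0,
                                 t_detect_utc=t_detect, t_now_utc=t_now))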

Prototype available for real-time digital twin

The Providentia++ research project has created the conditions for using these data in the vehicle. The goal was to create a scalable and highly available digital twin of the traffic situation with real-time capability. For this purpose, the team built a 3.5 kilometer test route in Garching, just outside Munich, consisting of seven sensor stations. The prototype was developed to permit series implementation if needed:

  • The researchers are working with decentralized digital twins. This permits the test route to be scaled up or extended to any desired length (see the sketch after this list).
  • To handle data volumes of several gigabytes per second, they created a data processing concept that optimizes the load distribution across multiple CPUs and GPUs.
  • Special programming challenges were posed by the calibration of the sensors and the development of the tracking algorithms—tasks for which no software existed. "We are now using an automatic calibration process based on a high-resolution road map (HD map). It did not previously exist, so we had to develop it," explains technical project leader Venkatnarayanan Lakshminarasimhan from the TUM Chair of Robotics, Artificial Intelligence and Real-time Systems.
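
The following Python sketch illustrates, under simplifying assumptions, how object lists from decentralized per-station twins might be merged into one global picture in a shared map frame; the distance-gate fusion and all names are hypothetical and stand in for the project's calibrated sensing and tracking software.

    # Hypothetical sketch: merging the object lists of decentralized per-station twins
    # into one global picture. Overlap between stations is resolved by a simple
    # distance gate; the real system relies on calibration and tracking not shown here.
    from dataclasses import dataclass

    @dataclass
    class TrackedObject:
        obj_id: str
        x: float  # position in a shared HD-map frame, metres
        y: float

    def merge_station_twins(station_lists, gate_m=2.0):
        """Fuse per-station object lists; objects closer than gate_m are treated as
        the same physical road user and their positions averaged."""
        merged = []
        for objects in station_lists:
            for obj in objects:
                match = next((m for m in merged
                              if (m.x - obj.x) ** 2 + (m.y - obj.y) ** 2 <= gate_m ** 2), None)
                if match is None:
                    merged.append(TrackedObject(obj.obj_id, obj.x, obj.y))
                else:
                    match.x = (match.x + obj.x) / 2.0
                    match.y = (match.y + obj.y) / 2.0
        return merged

    station_a = [TrackedObject("a-17", 105.2, 3.4)]
    station_b = [TrackedObject("b-03", 105.9, 3.6), TrackedObject("b-04", 310.0, 7.1)]
    for o in merge_station_twins([station_a, station_b]):
        print(o)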

Consortium leader Prof. Alois Knoll from TUM says, "The digital twin is ready for the product development stage. The concept is working reliably in 24/7 operations and is suitable not only for highways, but also for secondary roads and around intersections."

Related research was published in the 2022 25th International Conference on Information Fusion (FUSION) and the 2022 IEEE Intelligent Vehicles Symposium (IV).

More information: Leah Strand et al, Systematic Error Source Analysis of a Real-World Multi-Camera Traffic Surveillance System, 2022 25th International Conference on Information Fusion (FUSION) (2022). DOI: 10.23919/FUSION49751.2022.9841305

Christian Creß et al, A9-Dataset: Multi-Sensor Infrastructure-Based Dataset for Mobility Research, 2022 IEEE Intelligent Vehicles Symposium (IV) (2022). DOI: 10.1109/IV51971.2022.9827401