
Creating a real-time ensemble vision

After the Motor Valley Festival in May 2019, the CLASS project set out to provide the municipality and the connected cars with a tool to observe the traffic situation in the Modena Automotive Smart Area (MASA) in real time. Taking advantage of the MASA infrastructure and the data it provides, CLASS could produce a 3D map with real-life road users moving in real time.

During the project, we developed a Deep Neural Network (DNN) able to identify objects in the smart-camera feeds and assign them GPS positions, thanks to fine camera calibration and a suitable homography projection. After that, together with ATOS and the Barcelona Supercomputing Center (BSC), we also studied and implemented a way to better predict the behaviour of the detected objects in the near future, so that each object can be tracked in the video frame after frame.
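As a rough illustration of the projection step, the sketch below maps a detection's bounding-box foot point from pixel coordinates to a geographic position with a homography. The matrix values, the pixel coordinates and the use of OpenCV are assumptions for the example, not the project's actual code.

```python
# Minimal sketch: projecting a detection's pixel coordinates to world/GPS
# coordinates with a homography. All numeric values here are hypothetical.
import numpy as np
import cv2

# H maps image-plane points to ground-plane coordinates (here: lat/lon).
# In practice it would be estimated once per calibrated camera, e.g. with
# cv2.findHomography(image_points, world_points).
H = np.array([
    [2.1e-06, -3.4e-07, 44.6475],   # hypothetical calibration values
    [1.2e-07,  1.9e-06, 10.9254],
    [1.0e-08,  2.0e-08,  1.0],
], dtype=np.float64)

# Bottom-centre pixel of a detected bounding box (the point assumed to touch the road).
pixel_pts = np.array([[[640.0, 540.0]]], dtype=np.float64)

# perspectiveTransform applies H and divides by the homogeneous coordinate.
world_pts = cv2.perspectiveTransform(pixel_pts, H)
print("estimated position:", world_pts[0, 0])  # e.g. (latitude, longitude)
```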

After obtaining the extracted metadata, we send it to an aggregator server that uses a K-Nearest-Neighbors (KNN) algorithm to remove duplicates coming from different cameras covering the same area.
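A minimal sketch of this deduplication idea is shown below; the detection format, the 2-metre merging radius and the use of scikit-learn are illustrative assumptions, not the project's actual implementation.

```python
# Minimal sketch: merging detections of the same physical object reported
# by different, overlapping cameras, using a nearest-neighbour query.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def deduplicate(detections, radius_m=2.0):
    """detections: dicts with 'camera_id' and metric 'x', 'y' positions.
    A detection is treated as a duplicate if another camera reported an
    object closer than radius_m."""
    coords = np.array([[d["x"], d["y"]] for d in detections])
    nn = NearestNeighbors(radius=radius_m).fit(coords)
    neighbours = nn.radius_neighbors(coords, return_distance=False)

    kept, discarded = [], set()
    for i, det in enumerate(detections):
        if i in discarded:
            continue
        kept.append(det)
        for j in neighbours[i]:
            # Only merge detections coming from a different camera.
            if j != i and detections[j]["camera_id"] != det["camera_id"]:
                discarded.add(j)
    return kept

dets = [
    {"camera_id": 1, "x": 10.0, "y": 5.0, "class": "car"},
    {"camera_id": 2, "x": 10.4, "y": 5.3, "class": "car"},        # same car, other camera
    {"camera_id": 2, "x": 40.0, "y": 12.0, "class": "pedestrian"},
]
print(deduplicate(dets))  # the duplicated car is reported only once
```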

Once the aggregation is done, we have a snapshot of the state of the MASA, with cars, buses, pedestrians and motorbikes moving in the area. In the data center, a 3D viewer receives this snapshot and projects it on a monitor. In the meantime, one of our connected vehicles receives the same snapshot via our private 4G connection and visualises it on an on-board screen, like an enhanced version of Google Maps but with real live data on it.
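To give an idea of what distributing such a snapshot could look like, the sketch below serialises a few detected objects as JSON and sends the same payload to a viewer and to a vehicle. The schema, the endpoint names and the use of plain UDP are purely hypothetical and not the project's actual wire format.

```python
# Minimal sketch: the aggregator sends the current MASA snapshot both to the
# 3D viewer in the data center and to a connected vehicle over the 4G link.
import json
import socket
import time

snapshot = {
    "timestamp": time.time(),
    "objects": [
        {"id": 17, "class": "car",        "lat": 44.6478, "lon": 10.9254},
        {"id": 18, "class": "pedestrian", "lat": 44.6481, "lon": 10.9249},
    ],
}

payload = json.dumps(snapshot).encode("utf-8")
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Hypothetical endpoints for the 3D viewer and the connected vehicle.
for host, port in [("viewer.example.local", 9000), ("vehicle-01.example.local", 9000)]:
    sock.sendto(payload, (host, port))
```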

In the video below, you can see a demonstration of vehicles and pedestrians moving in real time in the MASA. The objects are coloured as follows: cars in green, buses in blue and pedestrians in orange.