Meet PERCIV AI

Perciv AI is a Dutch machine perception startup, a spin-out from TU Delft, founded in 2022 by Srimannarayana Baratam, Balazs Szekeres, and Andras Palffy.

We believe that radars are not yet exploited to their full potential: just like cameras and lidars, they can be pushed beyond their traditional limitations with dedicated AI and perform extremely well at a lower price – even in adverse weather conditions.

Building on our academic and engineering backgrounds, we develop AI-driven radar perception solutions that enable automated vehicles and systems – cars, trucks, tractors, robots, or smart cities – to understand their environments even in harsh circumstances, increasing safety for both passengers and the surroundings.

What is your role and involvement in the EVENTS project?

In the EVENTS project, Perciv AI is working together with TU Delft to demonstrate the performance of radar sensors in rainy conditions. This is important because rain and wet pavement cause significantly more accidents and fatalities than snowy, icy, and foggy conditions combined. Rain challenges sensor performance by creating water droplets and spray that obstruct visibility. Additionally, wet and slippery surfaces make vehicle control more difficult. Our goal is to address these challenges from both the perception and the control perspective.

The outcomes of the EVENTS project will be integrated into the Perciv AI Radar Perception SDK, a software development kit designed to enhance the perception capabilities of radar sensors in autonomous vehicles. This integration ensures that vehicles and robots remain safe and reliable even in adverse weather conditions. By advancing radar technology with dedicated AI, we believe radars can perform exceptionally well at a lower cost than other sensors such as cameras and lidars. Our collaboration with TU Delft aims to set a new industry benchmark, improving the safety and performance of autonomous vehicles in challenging scenarios and ultimately reducing accidents and fatalities.

For the layman, the “automated driving” industry seems to focus mainly on lidar and cameras, as evidenced by companies like Tesla, or robotaxi companies such as Waymo and Cruise. Is there a need for radar, and will it be included in future vehicles?

This is a common misconception. Radars have been in vehicles for decades, and there is at least one radar in every new car made for Western markets today. The companies mentioned are no exceptions either: Waymo is making its own radars, Cruise acquired a radar company, and Tesla recently patented its own radar. In our opinion – and the industry seems to agree – radars are not going anywhere. Better perception is critical, driven by both regulatory requirements, such as NCAP, and customer demand. Radar offers several unique advantages.

First, radar is significantly more cost-efficient than other sensors, being 2–3 times cheaper than cameras and 8–15 times cheaper than lidar. This cost advantage makes it a practical choice for widespread adoption.

Second, radar is weather robust: it operates at longer wavelengths than cameras and lidar, allowing it to penetrate airborne particles such as rain, fog, and snow. Unlike cameras and lidar, radar is not affected by bright sunlight either and, being an active sensor, it performs well in darkness.

Third, radar sensors are less susceptible to damage from scratches, dust, and other environmental factors, making them more reliable and long-lasting in various driving conditions.

While higher levels of automation may require a combination of multiple sensors, including cameras and lidar, there will be no partially or fully automated cars or trucks that carry lidar without also incorporating radar.

Radar’s affordability, robustness, and durability make it the prime candidate sensor for adverse weather conditions, and as such it was selected as the main sensor of our experiment with TU Delft in the EVENTS project.

If radars are so great, why is there a need for Perciv AI and the EVENTS project to develop them further?

Despite their benefits, such as cost-effectiveness, durability, and weather robustness, radars also present significant challenges. They are generally hard to work with: their output is a sparse, noisy point cloud that is difficult for humans to interpret, and handling and transferring raw radar data can be complex.

For example, due to its wavelength, radar perceives flat objects like road surfaces, garage doors, or the sides of vans as “mirrors,” resulting in reflections that show accurate distance but potentially incorrect direction. While this can sometimes provide valuable information – such as in our published work on using radar’s multipath propagation properties to “peek under cars” and detect pedestrians not visible to cameras – it also makes working with radar data quite challenging.
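To make this mirror effect more concrete, below is a toy geometric sketch in Python. The scene, coordinates, and helper function are hypothetical and purely illustrative, not taken from our detection pipeline; the sketch only shows how a flat surface produces a “ghost” detection at the mirror image of a real object.

```python
import numpy as np

# Toy illustration of the radar "mirror" effect (hypothetical scene, not our
# pipeline). A flat surface such as the side of a van reflects the radar
# signal, so the echo sensor -> surface -> object -> surface -> sensor has
# the same length as a direct echo from the object's mirror image. The radar
# therefore reports a detection at that mirror image: the range matches a
# real propagation path, but the direction points "through" the surface.

sensor = np.array([0.0, 0.0])   # radar at the origin (metres)
target = np.array([3.0, 2.0])   # real object, e.g. a cyclist
wall_x = 5.0                    # flat reflective surface along x = 5 m

# Mirror image of the object across the vertical surface x = wall_x
ghost = np.array([2.0 * wall_x - target[0], target[1]])

def range_bearing(point):
    """Range (m) and bearing (deg) of a point as seen from the sensor."""
    d = point - sensor
    return np.linalg.norm(d), np.degrees(np.arctan2(d[1], d[0]))

r_real, b_real = range_bearing(target)
r_ghost, b_ghost = range_bearing(ghost)
print(f"real object: range {r_real:.2f} m, bearing {b_real:.1f} deg")
print(f"ghost echo : range {r_ghost:.2f} m, bearing {b_ghost:.1f} deg")
# The ghost echo looks like a plausible detection, but no object exists at
# that location - exactly the kind of "fake" reflection that makes raw radar
# data hard to interpret.
```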

This is why dedicated AI designed for radar is essential, and it is precisely what we focus on at Perciv AI. In the EVENTS project, we develop segmentation algorithms that analyze the radar point cloud, identifying which reflections are likely “fake,” which are moving, and which come from objects of interest, such as cyclists.
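As a simplified illustration of what per-point radar segmentation can look like, here is a minimal sketch in PyTorch. It is a generic point-wise classifier, not the actual EVENTS or Perciv AI architecture; the feature layout (position, Doppler velocity, radar cross-section) and the four classes are assumptions made only for this example.

```python
import torch
import torch.nn as nn

# Minimal per-point segmentation sketch (illustrative only; not the actual
# EVENTS or Perciv AI architecture). Each radar point is assumed to carry
# five features (x, y, z, Doppler velocity, radar cross-section), and the
# network assigns it one of four hypothetical classes:
# 0 = static background, 1 = ghost / "fake" reflection,
# 2 = other moving object, 3 = object of interest (e.g. a cyclist).

class RadarPointSegmenter(nn.Module):
    def __init__(self, in_features: int = 5, num_classes: int = 4):
        super().__init__()
        # A shared MLP applied independently to every point in the cloud.
        self.mlp = nn.Sequential(
            nn.Linear(in_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, num_points, in_features) -> per-point class logits
        return self.mlp(points)

# Toy usage with random numbers standing in for a real radar point cloud.
model = RadarPointSegmenter()
cloud = torch.randn(1, 128, 5)      # one frame with 128 radar reflections
logits = model(cloud)               # shape (1, 128, 4)
labels = logits.argmax(dim=-1)      # predicted class per reflection
print(labels.shape)                 # torch.Size([1, 128])
```

The point-wise design reflects the nature of radar data mentioned above: the point cloud is sparse and unordered, so each reflection is classified from its own features before any further reasoning is applied.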

Anything else you would like to mention or highlight?

We would like to thank the EVENTS consortium and EU Horizon for this opportunity. We are thoroughly enjoying the EVENTS project and gaining valuable insights from other consortium members. We look forward to the second half of the project, having just passed the midterm evaluation. The upcoming experiments will be an epic demonstration of what the future of safe driving in adverse weather looks like. Additionally, we are actively seeking partners who are interested in our technology. Potential partners, customers, and collaborators can reach us at info@perciv.ai. We are excited about the prospect of collaborating on future projects as well.

Dr Andras Palffy,

Co-founder of Perciv AI

Perciv AI is the lead of EVENTS Experiment 8