Abstract: Perception is a core capability of automated vehicles and has been significantly advanced through modern sensor technologies and artificial intelligence. However, perception systems still face challenges in complex real-world scenarios. To improve robustness against various external factors, multi-sensor fusion techniques are essential, combining the strengths of different sensor modalities. With recent developments in Vehicle-to-Everything (V2X) communication, sensor fusion can now extend beyond a single vehicle to a cooperative multi-agent system involving Connected Automated Vehicles (CAVs) and intelligent infrastructure. This paper presents VALISENS, an innovative multi-sensor system distributed across multiple agents. It integrates onboard and roadside LiDARs, radars, thermal cameras, and RGB cameras to enhance situational awareness and support cooperative automated driving. The thermal cameras add critical redundancy for perceiving Vulnerable Road Users (VRUs), while fusion with roadside sensors mitigates visual occlusions and extends the perception range beyond the limits of individual vehicles. We introduce the corresponding perception module built on this sensor system, which includes object detection, tracking, motion forecasting, and high-level data fusion. The proposed system demonstrates the potential of cooperative perception in real-world test environments and lays the groundwork for future Cooperative Intelligent Transport Systems (C-ITS) applications.
Abstract: In the realm of automated driving simulation and sensor modeling, highly accurate sensor models are paramount for ensuring the reliability and safety of advanced driver assistance systems (ADAS). Hence, numerous works focus on the development of high-fidelity models of ADAS sensors, such as cameras, radars, and modern LiDAR systems, to simulate sensor behavior in different driving scenarios, even under varying environmental conditions such as adverse weather effects. However, aging effects of sensors, which lead to suboptimal system performance, are mostly overlooked by current simulation techniques. This paper introduces a cutting-edge Hardware-in-the-Loop (HiL) test bench designed for the automated, accelerated aging and characterization of automotive LiDAR sensors. The primary objective of this research is to address the aging effects of LiDAR sensors over the product life cycle, specifically focusing on aspects such as laser beam profile deterioration, output power reduction, and intrinsic parameter drift, which are mostly neglected in current sensor models. In this way, the ongoing research is intended to pave the way not only towards identifying and modeling the respective degradation effects, but also towards suggesting quantitative model validation metrics.