Abstract: The identification of Line-of-Sight (LoS) conditions is critical for ensuring reliable high-frequency communication links, which are particularly vulnerable to blockages and rapid channel variations. Network Digital Twins (NDTs) and Ray-Tracing (RT) techniques can significantly automate the large-scale collection and labeling of channel data, tailored to specific wireless environments. This paper examines the quality of Artificial Intelligence (AI) models trained on data generated by NDTs. We propose and evaluate training strategies for a general-purpose Deep Learning model, demonstrating superior performance compared to the current state of the art. In terms of classification accuracy, our approach outperforms the state-of-the-art Deep Learning model by 5% in very low SNR conditions and by approximately 10% in medium-to-high SNR scenarios. Additionally, the proposed strategies effectively reduce the input size to the Deep Learning model while preserving its performance. The computational cost, measured in floating-point operations (FLOPs) required during inference, is reduced by 98.55% relative to state-of-the-art solutions, making the model well suited for real-time applications.
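As a rough illustration of the kind of LoS/NLoS classifier such a pipeline trains on NDT-generated channel data, the sketch below builds a small convolutional model over synthetic channel samples. The input shape, layer widths, and label convention are assumptions made for illustration only, not the architecture or training strategies evaluated in the paper.

# Minimal sketch of a LoS/NLoS classifier over NDT-generated channel samples.
# Input shape (angle x subcarrier), layer widths, and the 0=NLoS / 1=LoS
# convention are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn

class LoSClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 2)  # two classes: NLoS / LoS

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(z)

# Toy usage: channel magnitude maps with ray-traced labels.
model = LoSClassifier()
x = torch.randn(4, 1, 32, 128)   # batch of 4 synthetic channel samples
logits = model(x)
pred = logits.argmax(dim=1)      # 0 = NLoS, 1 = LoS (assumed convention)
print(pred.shape)                # torch.Size([4])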
Abstract: Line-of-Sight (LoS) identification is crucial to ensuring reliable high-frequency communication links, which are particularly vulnerable to blockages. Network Digital Twins and Artificial Intelligence are key technologies enabling blockage detection (LoS identification) for high-frequency wireless systems, e.g., above 6 GHz. In this work, we enhance Network Digital Twins by incorporating Age of Information (AoI) metrics, which quantify the freshness of status updates, enabling reliable real-time blockage detection in dynamic wireless environments. By integrating ray-tracing techniques, we automate the large-scale collection and labeling of channel data, tailored to the evolving conditions of the environment. The AoI metric is incorporated into the loss function to prioritize more recent information when fine-tuning deep learning models in the event of performance degradation (model drift). The effectiveness of the proposed solution is demonstrated in realistic urban simulations, highlighting the trade-off between input resolution, computational cost, and model performance. Reducing the resolution by factors of 4 and 8 along the angle and subcarrier dimensions, from an original channel sample size of (32, 1024), yields a 32x computational speedup. The proposed fine-tuning successfully mitigates performance degradation while requiring only 1% of the available data samples, enabling fast, automated mitigation of model drift.
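A minimal sketch of an AoI-weighted loss of the kind described above follows; the exponential decay weighting and the time constant tau are illustrative assumptions, since the abstract only states that more recent information is prioritized. The reported 32x speedup is consistent with the 4x8 resolution reduction: the input shrinks from 32x1024 to 8x128 elements, a factor of 32, assuming cost scales roughly linearly with input size.

# Sketch of an Age-of-Information (AoI)-weighted cross-entropy loss for
# fine-tuning on drifting data. The exponential weighting and `tau` are
# illustrative assumptions, not the exact formulation used in the paper.
import torch
import torch.nn.functional as F

def aoi_weighted_ce(logits, labels, aoi, tau=10.0):
    """Cross-entropy where each sample is down-weighted by its AoI
    (staleness); fresher samples (small AoI) contribute more."""
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    weights = torch.exp(-aoi / tau)   # weight 1 at aoi=0, decays with staleness
    return (weights * per_sample).sum() / weights.sum()

# Toy usage on random data:
logits = torch.randn(8, 2)                 # 8 samples, 2 classes (LoS / NLoS)
labels = torch.randint(0, 2, (8,))
aoi = torch.rand(8) * 50                   # age of each status update
print(aoi_weighted_ce(logits, labels, aoi))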
Abstract: The Digital Twin has emerged as a promising paradigm for accurately representing electromagnetic (EM) wireless environments. The resulting virtual representation of reality facilitates comprehensive insights into the propagation environment, empowering multi-layer decision-making processes at the physical communication level. This paper investigates the digitization of wireless communication propagation, with particular emphasis on the indispensable role of ray-based propagation simulation in real-time Digital Twins. A benchmark for ray-based propagation simulations is presented to evaluate computational time, covering two urban scenarios with different mesh complexity, single- and multi-link configurations, and simulations with and without diffuse scattering. Exhaustive empirical analyses are performed, comparing the behavior of different ray-based solutions. By offering standardized simulations and scenarios, this work provides a technical benchmark for practitioners involved in the implementation of real-time Digital Twins and the optimization of ray-based propagation models.
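The structure of such a computational-time benchmark can be sketched as a simple timing harness; run_ray_tracer, the scenario name, and the repeat count below are hypothetical placeholders for whichever RT engine and configurations are under test, not the tools or scenarios evaluated in the paper.

# Minimal sketch of a timing harness for benchmarking ray-based propagation
# runs. `run_ray_tracer` is a hypothetical stand-in for the RT engine call;
# scenario names and parameters are illustrative only.
import time
import statistics

def run_ray_tracer(scenario, n_links, diffuse_scattering):
    """Placeholder for invoking the RT engine on the given scenario."""
    time.sleep(0.01)  # stand-in for the actual simulation call

def benchmark(scenario, n_links, diffuse_scattering, repeats=5):
    timings = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        run_ray_tracer(scenario, n_links, diffuse_scattering)
        timings.append(time.perf_counter() - t0)
    return statistics.mean(timings), statistics.stdev(timings)

# Compare runs with and without diffuse scattering on one scenario:
for scattering in (False, True):
    mean_t, std_t = benchmark("urban_simple", n_links=1, diffuse_scattering=scattering)
    print(f"diffuse_scattering={scattering}: {mean_t:.3f} s +/- {std_t:.3f} s")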