Abstract: The careful planning and safe deployment of 5G technologies will bring enormous benefits to society and the economy. Higher frequencies, beamforming, and small cells are key technologies that will provide unmatched throughput and seamless connectivity to 5G users. Superficial knowledge of these technologies has raised concerns among the general public about the harmful effects of radiation. Several standardization bodies are actively working to set emission limits based on defined radiation measurement methodologies. However, due to peculiarities of 5G such as the dynamicity of beams, network densification, and Time Division Duplexing (TDD) operation, existing EMF measurement methods may yield inaccurate results. In this context, we discuss our experimental studies aimed at measuring the radiation caused by beam-based transmissions from a 5G base station equipped with an Active Antenna System (AAS). We elaborate on the shortcomings of current measurement methodologies and address several open questions. Next, we demonstrate that with user-specific downlink beamforming, not only is better performance achieved compared to a non-beamformed downlink, but the radiation in the vicinity of the intended user is also significantly reduced. Further, we show that under weak reception conditions, an uplink transmission can cause significantly higher radiation in the vicinity of the user equipment. We believe that our work will help clear up several misconceptions about 5G EMF radiation effects. We conclude by providing guidelines to improve EMF measurement methodology by accounting for the spatiotemporal dynamicity of 5G transmissions.
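To illustrate the kind of spatiotemporal correction such guidelines call for, the short sketch below (our own simplified arithmetic, not the measurement procedure used in the study) scales a peak power-density reading by an assumed TDD downlink duty cycle and an assumed beam dwell fraction.

```python
def time_averaged_power_density(peak_s_wpm2, tdd_dl_duty, beam_dwell_fraction):
    """Scale a peak power-density reading (W/m^2) by the fraction of time
    the base station actually transmits towards the measurement point.

    tdd_dl_duty         -- assumed fraction of time used for downlink slots
    beam_dwell_fraction -- assumed fraction of downlink time the beam serves this spot
    """
    return peak_s_wpm2 * tdd_dl_duty * beam_dwell_fraction

# Example with assumed numbers: a 10 W/m^2 instantaneous peak, a DL-heavy TDD
# pattern (75% downlink), and a beam that points here 20% of the downlink time.
print(time_averaged_power_density(10.0, 0.75, 0.20))   # 1.5 W/m^2
```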
Abstract: The Artificial Intelligence Satellite Telecommunications Testbed (AISTT), part of the ESA project SPAICE, focuses on transforming the satellite payload by applying artificial intelligence (AI) and machine learning (ML) methodologies on commercial off-the-shelf (COTS) AI chips for on-board processing. The objectives include validating AI-driven SATCOM scenarios such as interference detection, spectrum sharing, radio resource management, decoding, and beamforming. The study highlights hardware selection and payload architecture. Preliminary results show that ML models significantly improve signal quality, spectral efficiency, and throughput compared to a conventional payload. Moreover, the testbed aims to evaluate the performance and applicability of AI-capable COTS chips in on-board SATCOM contexts.
Abstract: Accurate asset localization holds paramount importance across various industries, ranging from transportation management to search and rescue operations. In scenarios where traditional positioning equations cannot be adequately solved due to the limited number of measurements obtained by the receiver, Non-Terrestrial Networks (NTN) based on Low Earth Orbit (LEO) satellites can prove pivotal for precise positioning. The decision to employ NTN in lieu of conventional Global Navigation Satellite Systems (GNSS) is rooted in two key factors. Firstly, GNSS systems are susceptible to jamming and spoofing attacks, which compromises their reliability, whereas LEO satellite link budgets benefit from shorter distances and the new mega-constellations can offer more satellites in view than GNSS. Secondly, 5G service providers seek to reduce dependence on third-party services. Presently, NTN operation necessitates a GNSS receiver within the User Equipment (UE), placing the service provider at the mercy of GNSS reliability. Consequently, when GNSS signals are unavailable in certain regions, NTN services are also rendered inaccessible.
Abstract: Spiking neural networks (SNNs) implemented on neuromorphic processors (NPs) can enhance the energy efficiency of artificial intelligence (AI) deployments for specific workloads. As such, NPs represent an interesting opportunity for implementing AI tasks on board power-limited satellite communication spacecraft. In this article, we disseminate the findings of a recently completed study that compared the performance and power consumption of different satellite communication use cases implemented on standard AI accelerators and on NPs. In particular, the article describes three prominent use cases, namely payload resource optimization, onboard interference detection and classification, and dynamic receive beamforming; and compares the performance of conventional convolutional neural networks (CNNs) implemented on Xilinx's VCK5000 Versal development card with that of SNNs on Intel's neuromorphic chip Loihi 2.
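For readers unfamiliar with the contrast between CNNs and SNNs, the toy sketch below simulates a single leaky integrate-and-fire neuron in NumPy; it is purely illustrative and unrelated to the Loihi 2 or VCK5000 implementations in the study, but it shows how SNNs emit sparse, event-driven spikes rather than dense activations.

```python
import numpy as np

def lif_neuron(input_current, v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Simulate a leaky integrate-and-fire neuron over discrete time steps.

    input_current -- 1-D array of input values per time step
    Returns a binary spike train of the same length.
    """
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        v = leak * v + i_t          # leaky integration of the input
        if v >= v_thresh:           # fire only when the threshold is crossed
            spikes[t] = 1.0
            v = v_reset             # reset the membrane potential after a spike
    return spikes

rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.4, size=50)      # weak, sparse drive
print("spike count:", int(lif_neuron(current).sum()))
```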
Abstract: Satellite communications (SatCom) are crucial for global connectivity, especially in the era of emerging technologies like 6G and efforts to narrow the digital divide. Traditional SatCom systems struggle with efficient resource management due to static multibeam configurations, hindering quality of service (QoS) amidst dynamic traffic demands. This paper introduces an innovative solution: real-time adaptive beamforming on multibeam satellites with software-defined payloads in geostationary orbit (GEO). Utilizing a Direct Radiating Array (DRA) with circular polarization in the 17.7-20.2 GHz band, the paper outlines the DRA design and a supervised learning-based algorithm for on-board beamforming. This adaptive approach not only meets precise beam-projection needs but also dynamically adjusts beamwidth, minimizes sidelobe levels (SLL), and optimizes effective isotropic radiated power (EIRP).
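As a point of reference for what a supervised on-board beamformer has to learn, the sketch below (our own example with an assumed 32-element uniform linear array at 19 GHz, not the DRA described in the paper) computes conventional phase-only steering weights and evaluates the resulting array factor, from which the pointing direction and sidelobe level can be read off.

```python
import numpy as np

# Assumed uniform linear array: 32 elements, half-wavelength spacing at 19 GHz.
c, f = 3e8, 19e9
lam = c / f
n_elem, d = 32, 0.5 * lam
steer_deg = 12.0                                    # desired beam-pointing direction

k = 2 * np.pi / lam
positions = np.arange(n_elem) * d
# Phase-only steering weights: conjugate phases of the steering vector.
w = np.exp(-1j * k * positions * np.sin(np.radians(steer_deg))) / np.sqrt(n_elem)

theta = np.radians(np.linspace(-90, 90, 1801))
# Array factor: array response over all observation angles with the chosen weights.
steering = np.exp(1j * k * np.outer(np.sin(theta), positions))
af_db = 20 * np.log10(np.abs(steering @ w) + 1e-12)
af_db -= af_db.max()

peak_deg = np.degrees(theta[np.argmax(af_db)])
sidelobes = af_db[np.abs(np.degrees(theta) - steer_deg) > 4.0]   # exclude main lobe
print(f"main lobe at {peak_deg:.1f} deg, peak SLL {sidelobes.max():.1f} dB")
```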
Abstract: Satellite communications, essential for modern connectivity, extend access to maritime, aeronautical, and remote areas where terrestrial networks are unfeasible. Current GEO systems distribute power and bandwidth uniformly across beams using multi-beam footprints with fractional frequency reuse. However, recent research reveals the limitations of this approach in heterogeneous traffic scenarios, leading to inefficiencies. To address this, this paper presents a machine learning (ML)-based approach to Radio Resource Management (RRM). We treat the RRM task as a regression ML problem, integrating the RRM objectives and constraints into the loss function that the ML algorithm aims to minimize. Moreover, we introduce a context-aware ML metric that not only evaluates the ML model's performance but also considers the impact of its resource allocation decisions on the overall performance of the communication system.
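A minimal sketch of the idea of folding RRM objectives and constraints into a regression loss, assuming a toy per-beam power-allocation task, a small PyTorch model, and a single power-budget constraint; the actual objectives, constraints, and architecture used in the paper are not reproduced here.

```python
import torch

def rrm_loss(pred_power, demand, total_power, weight_constraint=10.0):
    """Regression loss for per-beam power allocation.

    pred_power  -- model output, predicted power per beam (linear units)
    demand      -- target power implied by the per-beam traffic demand
    total_power -- payload power budget shared by all beams
    The first term fits the demand; the second penalizes budget violations.
    """
    fit = torch.mean((pred_power - demand) ** 2)
    violation = torch.relu(pred_power.sum(dim=-1) - total_power)
    return fit + weight_constraint * torch.mean(violation ** 2)

# Toy usage with assumed dimensions: 8 beams, a batch of 4 traffic snapshots.
model = torch.nn.Sequential(torch.nn.Linear(8, 32), torch.nn.ReLU(),
                            torch.nn.Linear(32, 8), torch.nn.Softplus())
demand = torch.rand(4, 8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = rrm_loss(model(demand), demand, total_power=4.0)
    loss.backward()
    opt.step()
print(f"final loss: {loss.item():.4f}")
```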
Abstract: This paper delves into the application of Machine Learning (ML) techniques in the realm of 5G Non-Terrestrial Networks (5G-NTN), particularly focusing on symbol detection and equalization for the Physical Broadcast Channel (PBCH). As 5G-NTN gains prominence within the 3GPP ecosystem, ML offers significant potential to enhance wireless communication performance. To investigate these possibilities, we present ML-based models trained with both synthetic data and real data collected from a 5G over-the-satellite testbed. Our analysis examines the performance of these models under various Signal-to-Noise Ratio (SNR) scenarios and evaluates their effectiveness in symbol enhancement and channel equalization tasks. The results highlight the models' performance in controlled settings and their adaptability to real-world challenges, shedding light on the potential benefits of applying ML in 5G-NTN.
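The sketch below illustrates a learned symbol-equalization task of the kind described above, under our own simplifying assumptions (QPSK symbols, a random single-tap channel, an MLP trained on synthetic data across a range of SNRs); it is not the architecture or dataset used with the over-the-satellite testbed.

```python
import numpy as np
import torch

def make_batch(n, snr_db):
    """Synthetic QPSK symbols through a random single-tap channel plus AWGN."""
    bits = np.random.randint(0, 2, (n, 2))
    tx = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
    h = (np.random.randn(n) + 1j * np.random.randn(n)) / np.sqrt(2)
    noise_std = 10 ** (-snr_db / 20)
    rx = h * tx + noise_std * (np.random.randn(n) + 1j * np.random.randn(n)) / np.sqrt(2)
    # Features: received symbol and channel estimate (real/imag); target: clean symbol.
    x = np.stack([rx.real, rx.imag, h.real, h.imag], axis=1).astype(np.float32)
    y = np.stack([tx.real, tx.imag], axis=1).astype(np.float32)
    return torch.from_numpy(x), torch.from_numpy(y)

model = torch.nn.Sequential(torch.nn.Linear(4, 64), torch.nn.ReLU(),
                            torch.nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(500):
    x, y = make_batch(256, snr_db=np.random.uniform(0, 20))   # train across SNRs
    loss = torch.mean((model(x) - y) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()

x, y = make_batch(2048, snr_db=10)
print(f"test MSE at 10 dB SNR: {torch.mean((model(x) - y) ** 2).item():.4f}")
```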
Abstract: Interest in the integration of Terrestrial Networks (TN) and Non-Terrestrial Networks (NTN), primarily satellites, has been rekindled due to the potential of NTN to provide ubiquitous coverage. Especially with the peculiar and flexible physical-layer properties of 5G-NR, direct access to 5G services through satellites could now become possible. However, the large Round-Trip Delays (RTD) in NTNs require a re-evaluation of the design of RLC and PDCP layer timers (and associated buffers), in particular for regenerative-payload satellites, which have limited computational resources that need to be optimally utilized. Our aim in this work is to initiate a new line of research for emerging NTNs with limited resources from a higher-layer perspective. To this end, we propose a novel and efficient method for optimally designing the RLC and PDCP layers' buffers and timers without the need for intensive computations. This approach is relevant for low-cost satellites, which have limited computational and energy resources. The simulation results show that the proposed methods can significantly improve performance in terms of resource utilization and delays.
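As a first-order illustration of why NTN round-trip delays dominate this dimensioning problem, the sketch below sizes an RLC buffer from the bandwidth-delay product and scales a reassembly timer with the RTD; the formulas and numbers are our own assumptions, not the optimization method proposed in the paper.

```python
def dimension_rlc(rate_mbps, rtd_ms, harq_retx=2, timer_margin=1.5):
    """First-order RLC dimensioning driven by the round-trip delay.

    The buffer must hold all data in flight (bandwidth-delay product), and the
    reassembly timer must cover the retransmission round trips it waits for.
    """
    buffer_bytes = rate_mbps * 1e6 / 8 * (rtd_ms / 1e3)     # bandwidth-delay product
    t_reassembly_ms = timer_margin * harq_retx * rtd_ms     # cover HARQ round trips
    return buffer_bytes, t_reassembly_ms

# Assumed numbers: a 50 Mbps bearer over a LEO link with 30 ms RTD
# versus a GEO link with 540 ms RTD.
for name, rtd in [("LEO", 30.0), ("GEO", 540.0)]:
    buf, timer = dimension_rlc(50.0, rtd)
    print(f"{name}: buffer ~ {buf / 1e6:.2f} MB, t-Reassembly ~ {timer:.0f} ms")
```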
Abstract: Sophisticated antenna technologies are constantly evolving to meet the escalating data demands projected for 6G and future networks. The characterization of these emerging antenna systems poses challenges that necessitate a reevaluation of conventional techniques, which rely solely on measurements conducted in advanced anechoic chambers. In this study, our objective is to introduce a novel approach to antenna pattern characterization (APC) for next-generation multiple-input-multiple-output (MIMO) systems by exploiting the potential of signal processing tools. In contrast to traditional methods that struggle with multi-path scenarios and require specialized measurement equipment, we endeavour to estimate the antenna pattern by exploiting information from both line-of-sight (LoS) and non-LoS contributions. This approach enables antenna pattern characterization in complex environments without the need for anechoic chambers, resulting in substantial cost savings. Furthermore, it grants a much wider research community the ability to independently perform APC for emerging complex 6G antenna systems. Simulation results demonstrate the efficacy of the proposed approach in accurately estimating the true antenna pattern.
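A minimal sketch of how antenna pattern estimation could be posed as a least-squares problem over a discretized gain pattern, assuming power-only measurements from paths with known departure angles and known propagation factors; the estimator actually proposed in the paper may differ substantially.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretize the pattern into angular bins and define an assumed "true" pattern to recover.
angles = np.linspace(-90, 90, 61)                  # bin centers, degrees
true_gain = np.abs(np.sinc(angles / 30.0)) + 0.05  # arbitrary ground truth (linear gain)

# Each measurement: a path (LoS or reflected) leaves the antenna at a known angle
# with a known propagation factor; the receiver observes gain * factor + noise.
n_meas = 2000
bins = rng.integers(0, len(angles), n_meas)
factor = rng.uniform(0.1, 1.0, n_meas)             # known per-path propagation factors
meas = true_gain[bins] * factor + 0.01 * rng.standard_normal(n_meas)

# Stack the measurements into a linear system A g = m and solve for the pattern g.
A = np.zeros((n_meas, len(angles)))
A[np.arange(n_meas), bins] = factor
g_hat, *_ = np.linalg.lstsq(A, meas, rcond=None)

print(f"max absolute pattern error: {np.max(np.abs(g_hat - true_gain)):.3f}")
```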
Abstract: Intelligent reflecting surfaces (IRS) have emerged as a promising technology to enhance the performance of wireless communication systems. By actively manipulating the wireless propagation environment, an IRS enables efficient signal transmission and reception. In recent years, the integration of IRS with full-duplex (FD) communication has garnered significant attention due to its potential to further improve spectral and energy efficiencies. IRS-assisted FD systems combine the benefits of both IRS and FD technologies, providing a powerful solution for the next generation of cellular systems. In this manuscript, we present a novel approach to jointly optimize active and passive beamforming in a multiple-input-multiple-output (MIMO) FD system assisted by an IRS for weighted sum rate (WSR) maximization. Given the inherent difficulty of obtaining perfect channel state information (CSI) in practical scenarios, we consider imperfect CSI and propose a statistically robust beamforming strategy to maximize the ergodic WSR. Additionally, we analyze the achievable WSR of an IRS-assisted MIMO FD system under imperfect CSI by deriving both lower and upper bounds. To tackle the ergodic WSR maximization problem, we employ the concept of expected weighted minimum mean squared error (EWMMSE), which exploits the information in the expected error covariance matrices and ensures convergence to a local optimum. We evaluate the effectiveness of our proposed design through extensive simulations. The results demonstrate that our robust approach yields significant performance improvements compared to a simplistic beamforming approach that disregards CSI errors, while also considerably outperforming the robust half-duplex (HD) system.
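To make the ergodic-WSR objective concrete, the following sketch performs a Monte Carlo average of the weighted sum rate of two MIMO links under an assumed Gaussian CSI-error model; it deliberately ignores the IRS, self-interference, and the EWMMSE precoder updates, and is only meant to illustrate the quantity being maximized.

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_bpshz(h, q, noise_var=1.0):
    """Achievable rate log2 det(I + H Q H^H / sigma^2) for one MIMO link."""
    nr = h.shape[0]
    cov = np.eye(nr) + h @ q @ h.conj().T / noise_var
    return np.real(np.log2(np.linalg.det(cov)))

def ergodic_wsr(h_hat_list, q_list, weights, err_var=0.05, n_mc=500):
    """Monte Carlo estimate of the ergodic weighted sum rate under a
    Gaussian CSI-error model: H = H_hat + E, with E ~ CN(0, err_var)."""
    total = 0.0
    for _ in range(n_mc):
        wsr = 0.0
        for h_hat, q, w in zip(h_hat_list, q_list, weights):
            err = np.sqrt(err_var / 2) * (rng.standard_normal(h_hat.shape)
                                          + 1j * rng.standard_normal(h_hat.shape))
            wsr += w * rate_bpshz(h_hat + err, q)
        total += wsr
    return total / n_mc

# Two assumed 2x4 links with isotropic transmit covariances (equal power per antenna).
h_hats = [(rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))) / np.sqrt(2)
          for _ in range(2)]
q_mats = [np.eye(4) / 4 for _ in range(2)]
print(f"ergodic WSR: {ergodic_wsr(h_hats, q_mats, weights=[1.0, 1.0]):.2f} bit/s/Hz")
```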