Abstract:Universal connectivity has been a goal of past and current generations of wireless systems, but as we approach 6G, social responsibility is being built in as a core component. With the advent of Non-Terrestrial Networks (NTN), these goals are closer to realization than ever before. Despite the benefits of NTN, the integration of NTN and Terrestrial Networks (TN) is still in its infancy; past, current, and future releases of the 3$^{\text{rd}}$ Generation Partnership Project (3GPP) provide guidelines for a successful co-existence and integration of TN and NTN. In this article, we therefore illustrate, through 3GPP guidelines, how NTN and TN can be effectively integrated. Moreover, the role of beamforming and Artificial Intelligence (AI) algorithms in achieving this integration is highlighted. Finally, the usefulness of integrating NTN and TN is validated through experimental analysis.
Abstract:The Artificial Intelligence Satellite Telecommunications Testbed (AISTT), part of the ESA project SPAICE, focuses on transforming the satellite payload by applying artificial intelligence (AI) and machine learning (ML) methodologies on available commercial off-the-shelf (COTS) AI chips for on-board processing. The objectives include validating AI-driven SATCOM scenarios such as interference detection, spectrum sharing, radio resource management, decoding, and beamforming. The study highlights hardware selection and payload architecture. Preliminary results show that ML models significantly improve signal quality, spectral efficiency, and throughput compared to conventional payloads. Moreover, the testbed aims to evaluate the performance and applicability of AI-capable COTS chips in on-board SATCOM contexts.
Abstract:In this paper, we consider a scenario in which one unmanned aerial vehicle (UAV) equipped with a uniform linear array (ULA) sends combined information and sensing signals to communicate with multiple ground base stations (GBSs) while simultaneously sensing potential targets within an area of interest on the ground. We aim to jointly design the transmit beamforming and the GBS association to optimize communication performance while ensuring high sensing accuracy. We propose a predictive beamforming framework based on a dual deep neural network (DNN) solution to solve the formulated nonconvex optimization problem. A first DNN is trained to produce the required beamforming matrix for any point of the UAV flying area in reduced time compared to state-of-the-art beamforming optimizers. A second DNN is trained to learn the optimal mapping from the input features, power, and effective isotropic radiated power (EIRP) constraints to the GBS association decision. Finally, we provide an extensive simulation analysis to corroborate the proposed approach and show its benefits in terms of EIRP, signal-to-interference-plus-noise ratio (SINR) performance, and computational speed.
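The first-DNN idea above, a network that maps a UAV position directly to beamforming weights, can be sketched minimally in numpy. The architecture (layer sizes, random weights) is purely illustrative and not the paper's actual model; only the input/output convention (2-D position in, unit-power complex ULA weights out) reflects the described framework.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def dnn_beamformer(position, weights, num_antennas=8):
    """Forward pass of a small MLP mapping a 2-D UAV position to ULA
    beamforming phases (hypothetical architecture, for illustration)."""
    h = relu(weights["W1"] @ position + weights["b1"])
    phases = weights["W2"] @ h + weights["b2"]        # one phase per antenna
    w = np.exp(1j * phases) / np.sqrt(num_antennas)   # unit-power weight vector
    return w

# Untrained random weights stand in for the trained network.
rng = np.random.default_rng(0)
weights = {
    "W1": rng.normal(size=(32, 2)), "b1": np.zeros(32),
    "W2": rng.normal(size=(8, 32)), "b2": np.zeros(8),
}
w = dnn_beamformer(np.array([100.0, 50.0]), weights)
print(w.shape, round(float(np.linalg.norm(w)), 6))  # (8,) 1.0
```

A single forward pass like this replaces an iterative optimizer at inference time, which is the source of the claimed speed-up.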
Abstract:This paper studies the channel model for integrated satellite-terrestrial networks operating at C-band deployed in dense urban and rural areas. In particular, the interference channel from a low-earth-orbit (LEO) satellite to a dense urban area is analyzed carefully under the impact of the environment's characteristics, i.e., the building density, building height, and elevation angle. The experimental results show strong relationships between these characteristics and the channel gain loss. In particular, channel gain loss functions are obtained through a model-fitting approach, which can serve as a basis for future studies on the integration of satellite and terrestrial networks (ISTNs).
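The model-fitting step described above can be sketched as a simple polynomial regression of channel gain loss against elevation angle. The measurement values below are synthetic placeholders, not the paper's data; only the fitting procedure is illustrated.

```python
import numpy as np

# Synthetic (illustrative) measurements: channel gain loss in dB versus
# elevation angle; real values would come from the measurement campaign.
elevation_deg = np.array([10, 20, 30, 40, 50, 60, 70, 80])
loss_db = np.array([28.0, 22.5, 18.1, 15.0, 12.6, 10.9, 9.8, 9.2])

# Second-order polynomial fit, mirroring a basic model-fitting approach.
coeffs = np.polyfit(elevation_deg, loss_db, deg=2)
model = np.poly1d(coeffs)

# Predicted loss at an unmeasured elevation angle.
print(round(float(model(45.0)), 2))
```

Analogous fits against building density and building height would yield the full set of channel gain loss functions the abstract mentions.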
Abstract:Spiking neural networks (SNNs) implemented on neuromorphic processors (NPs) can enhance the energy efficiency of artificial intelligence (AI) deployments for specific workloads. As such, NPs represent an interesting opportunity for implementing AI tasks on board power-limited satellite communication spacecraft. In this article, we disseminate the findings of a recently completed study comparing the performance and power consumption of different satellite communication use cases implemented on standard AI accelerators and on NPs. In particular, the article describes three prominent use cases, namely payload resource optimization, on-board interference detection and classification, and dynamic receive beamforming, and compares the performance of conventional convolutional neural networks (CNNs) implemented on Xilinx's VCK5000 Versal development card with that of SNNs on Intel's neuromorphic chip Loihi 2.
Abstract:Satellite communications (SatCom) are crucial for global connectivity, especially in the era of emerging technologies like 6G, and for narrowing the digital divide. Traditional SatCom systems struggle with efficient resource management due to static multibeam configurations, hindering quality of service (QoS) amidst dynamic traffic demands. This paper introduces an innovative solution: real-time adaptive beamforming on multibeam satellites with software-defined payloads in geostationary orbit (GEO). Utilizing a Direct Radiating Array (DRA) with circular polarization in the 17.7-20.2 GHz band, the paper outlines the DRA design and a supervised learning-based algorithm for on-board beamforming. This adaptive approach not only meets precise beam projection needs but also dynamically adjusts beamwidth, minimizes sidelobe levels (SLL), and optimizes effective isotropic radiated power (EIRP).
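The adaptive beamforming task above reduces, at its core, to choosing per-element complex weights that steer the array's main lobe. A minimal phase-only steering sketch for a uniform array (a 1-D simplification of a DRA, with assumed half-wavelength spacing) illustrates the mechanism the learned algorithm controls:

```python
import numpy as np

def steering_weights(n, theta_deg, d=0.5):
    """Phase-only weights steering an n-element uniform array
    (element spacing d in wavelengths) towards angle theta
    (broadside = 0 deg). Illustrative 1-D simplification of a DRA."""
    k = np.arange(n)
    phase = 2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg))
    return np.exp(phase) / np.sqrt(n)

def array_factor_db(w, theta_deg, d=0.5):
    """Array gain (dB) of weight vector w evaluated at angle theta."""
    k = np.arange(w.size)
    a = np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))
    return 20 * np.log10(np.abs(w.conj() @ a) + 1e-12)

w = steering_weights(16, 20.0)
print(round(array_factor_db(w, 20.0), 2))  # peak gain at the steered angle: 12.04
```

A supervised model such as the one described would, in effect, learn to output weight vectors like `w` (with amplitude tapering added to shape beamwidth and suppress sidelobes) directly from traffic and geometry inputs.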
Abstract:Satellite communications, essential for modern connectivity, extend access to maritime, aeronautical, and remote areas where terrestrial networks are unfeasible. Current GEO systems distribute power and bandwidth uniformly across beams using multi-beam footprints with fractional frequency reuse. However, recent research reveals the limitations of this approach in heterogeneous traffic scenarios, leading to inefficiencies. To address this, this paper presents a machine learning (ML)-based approach to Radio Resource Management (RRM). We treat the RRM task as a regression ML problem, integrating the RRM objectives and constraints into the loss function that the ML algorithm aims to minimize. Moreover, we introduce a context-aware ML metric that not only evaluates the ML model's performance but also considers the impact of its resource allocation decisions on the overall performance of the communication system.
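The idea of folding RRM objectives and constraints into the regression loss can be sketched as a demand-matching error plus a penalty on violating a system budget. The formulation below (names, penalty weight, total-power constraint) is a hypothetical illustration, not the paper's exact loss:

```python
import numpy as np

def rrm_loss(pred_alloc, demand, total_power, lam=10.0):
    """Regression loss for ML-based RRM: squared error between allocated
    and demanded per-beam resources, plus a quadratic penalty on exceeding
    the total power budget (hypothetical formulation, for illustration)."""
    mse = np.mean((pred_alloc - demand) ** 2)
    overuse = max(0.0, pred_alloc.sum() - total_power)  # constraint violation
    return mse + lam * overuse ** 2

demand = np.array([1.0, 2.0, 3.0])
print(rrm_loss(np.array([1.0, 2.0, 3.0]), demand, total_power=10.0))  # 0.0
print(rrm_loss(np.array([1.0, 2.0, 3.0]), demand, total_power=5.0))   # 10.0
```

Training against such a loss pushes the model toward allocations that track traffic demand while respecting the system constraint, which is also the behavior a context-aware metric would score.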
Abstract:Backscatter communication (BC) technology offers sustainable solutions for next-generation Internet-of-Things (IoT) networks, where devices can transmit data by reflecting and adjusting incident radio frequency signals. In parallel to BC, deep reinforcement learning (DRL) has recently emerged as a promising tool to augment intelligence and optimize low-powered IoT devices. This article commences by elucidating the foundational principles underpinning BC systems, subsequently delving into the diverse array of DRL techniques and their respective practical implementations. It then investigates potential domains and presents recent advancements in the realm of DRL-BC systems. A use case of reconfigurable intelligent surface (RIS)-aided non-orthogonal multiple access BC systems leveraging DRL is meticulously examined to highlight its potential. Lastly, this study identifies and investigates salient challenges and proffers prospective avenues for future research endeavors.
Abstract:This paper studies the potential of reconfigurable intelligent surface (RIS)-integrated non-terrestrial networks (NTNs) to revolutionize next-generation connectivity. First, it discusses the fundamentals of RIS technology. Secondly, it delves into reporting the recent advances in RIS-enabled NTNs. Subsequently, it presents a novel framework based on the current state-of-the-art for low earth orbit (LEO) satellite communications, wherein the signal received at the user terminal traverses both a direct link and an RIS link, and the RIS is mounted on a high-altitude platform (HAP) situated within the stratosphere. Finally, the paper concludes by highlighting open challenges and future research directions to revolutionize the realm of RIS-integrated NTNs.
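The direct-plus-RIS-link framework above can be sketched with a narrowband model in which each RIS element's phase shift is set to co-phase its cascaded satellite-RIS-user path with the direct link. Channel draws below are illustrative Rayleigh samples, not a calibrated LEO/HAP model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative narrowband model: received channel = direct link + RIS link.
n_elements = 64
h_direct = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
h_sat_ris = (rng.normal(size=n_elements) + 1j * rng.normal(size=n_elements)) / np.sqrt(2)
h_ris_user = (rng.normal(size=n_elements) + 1j * rng.normal(size=n_elements)) / np.sqrt(2)

# Cascaded per-element channel through the RIS.
cascade = h_sat_ris * h_ris_user
# Optimal phase-only RIS config: align every reflected path with the direct path.
theta = np.angle(h_direct) - np.angle(cascade)
h_total = h_direct + np.sum(cascade * np.exp(1j * theta))

gain_db = 20 * np.log10(np.abs(h_total) / np.abs(h_direct))
print(round(float(gain_db), 1))  # gain of the RIS-aided link over direct-only
```

With co-phased elements the reflected magnitudes add coherently, so the combined channel is always at least as strong as the direct link, which is the basic argument for mounting an RIS on a stratospheric HAP.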
Abstract:The latest satellite communication (SatCom) missions are characterized by a fully reconfigurable on-board software-defined payload, capable of adapting radio resources to the temporal and spatial variations of the system traffic. As pure optimization-based solutions have been shown to be computationally tedious and to lack flexibility, machine learning (ML)-based methods have emerged as promising alternatives. We investigate the application of energy-efficient brain-inspired ML models for on-board radio resource management. Apart from software simulation, we report extensive experimental results leveraging the recently released Intel Loihi 2 chip. To benchmark the performance of the proposed model, we implement conventional convolutional neural networks (CNNs) on a Xilinx Versal VCK5000, and provide a detailed comparison of accuracy, precision, recall, and energy efficiency for different traffic demands. Most notably, for relevant workloads, spiking neural networks (SNNs) implemented on Loihi 2 yield higher accuracy, while reducing power consumption by more than 100$\times$ compared to the CNN-based reference platform. Our findings point to the significant potential of neuromorphic computing and SNNs in supporting on-board SatCom operations, paving the way for enhanced efficiency and sustainability in future SatCom systems.