Accurate asset localization is of paramount importance across industries ranging from transportation management to search and rescue operations. In scenarios where traditional positioning equations cannot be adequately solved because the receiver obtains too few measurements, Non-Terrestrial Networks (NTN) based on Low Earth Orbit (LEO) satellites can prove pivotal for precise positioning. The decision to employ NTN in lieu of conventional Global Navigation Satellite Systems (GNSS) is rooted in two key factors. Firstly, GNSS systems are susceptible to jamming and spoofing attacks, which compromises their reliability, whereas LEO satellite link budgets benefit from shorter distances and the new mega-constellations could offer more satellites in view than GNSS. Secondly, 5G service providers seek to reduce their dependence on third-party services. Presently, NTN operation necessitates a GNSS receiver within the User Equipment (UE), placing the service provider at the mercy of GNSS reliability. Consequently, when GNSS signals are unavailable in certain regions, NTN services are also rendered inaccessible.
Spiking neural networks (SNNs) implemented on neuromorphic processors (NPs) can enhance the energy efficiency of artificial intelligence (AI) deployments for specific workloads. As such, NPs represent an interesting opportunity for implementing AI tasks on board power-limited satellite communication spacecraft. In this article, we disseminate the findings of a recently completed study that compared the performance and power consumption of different satellite communication use cases implemented on standard AI accelerators and on NPs. In particular, the article describes three prominent use cases, namely payload resource optimization, onboard interference detection and classification, and dynamic receive beamforming, and compares the performance of conventional convolutional neural networks (CNNs) implemented on Xilinx's VCK5000 Versal development card with that of SNNs on Intel's neuromorphic chip Loihi 2.
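As background for the SNN side of this comparison, the event-driven computation underlying spiking networks can be illustrated with a minimal leaky integrate-and-fire (LIF) neuron. This is a generic textbook sketch, not the Loihi 2 implementation used in the study; all parameter values below are illustrative.

```python
import numpy as np

def lif_simulate(input_current, v_th=1.0, leak=0.9, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron.

    input_current: 1-D array of input current per time step.
    Returns a binary spike train of the same length.
    """
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        v = leak * v + i_t          # leaky membrane integration
        if v >= v_th:               # threshold crossing -> emit a spike
            spikes[t] = 1.0
            v = v_reset             # reset membrane potential after spiking
    return spikes

# A weak constant drive saturates below threshold and never spikes,
# while a strong drive produces a regular spike train.
weak = lif_simulate(np.full(100, 0.05))
strong = lif_simulate(np.full(100, 0.5))
```

The energy argument for NPs follows from this sparsity: computation (and hence power draw) is only triggered when spikes occur, unlike the dense multiply-accumulate workload of a CNN.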
Satellite communications (SatCom) are crucial for global connectivity, especially in the era of emerging technologies like 6G and efforts to narrow the digital divide. Traditional SatCom systems struggle with efficient resource management due to static multibeam configurations, hindering quality of service (QoS) amidst dynamic traffic demands. This paper introduces an innovative solution: real-time adaptive beamforming on multibeam satellites with software-defined payloads in geostationary orbit (GEO). Utilizing a Direct Radiating Array (DRA) with circular polarization in the 17.7-20.2 GHz band, the paper outlines the DRA design and a supervised learning-based algorithm for on-board beamforming. This adaptive approach not only meets precise beam projection needs but also dynamically adjusts beamwidth, minimizes sidelobe levels (SLL), and optimizes effective isotropic radiated power (EIRP).
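To make the beamforming forward model concrete, the sketch below steers a uniform linear array by phase-only weights and locates the resulting main lobe. This is a drastic simplification of the paper's circularly polarized DRA (one dimension, isotropic elements, no learning), intended only to show the quantities a supervised beamforming algorithm must control (pointing, beamwidth, SLL).

```python
import numpy as np

def steering_weights(n, d_lambda, theta0_deg):
    """Phase-only weights steering an n-element uniform linear array to theta0."""
    k = 2 * np.pi * d_lambda                      # element spacing in wavelengths
    n_idx = np.arange(n)
    return np.exp(-1j * k * n_idx * np.sin(np.deg2rad(theta0_deg)))

def array_factor_db(weights, d_lambda, thetas_deg):
    """Normalized array factor (dB) over the given scan angles."""
    k = 2 * np.pi * d_lambda
    n_idx = np.arange(len(weights))
    af = np.array([
        np.abs(np.sum(weights * np.exp(1j * k * n_idx * np.sin(np.deg2rad(t)))))
        for t in thetas_deg
    ])
    af /= af.max()                                # normalize to 0 dB peak
    return 20 * np.log10(np.maximum(af, 1e-12))

thetas = np.linspace(-90, 90, 721)
w = steering_weights(16, 0.5, 20.0)               # steer a 16-element array to 20 degrees
af_db = array_factor_db(w, 0.5, thetas)
peak_angle = thetas[np.argmax(af_db)]
```

In the adaptive setting described by the abstract, a learned model would output such weights (amplitude and phase) directly from the required beam specification, rather than from a closed-form steering rule.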
Satellite communications, essential for modern connectivity, extend access to maritime, aeronautical, and remote areas where terrestrial networks are unfeasible. Current GEO systems distribute power and bandwidth uniformly across beams using multi-beam footprints with fractional frequency reuse. However, recent research reveals the limitations of this approach in heterogeneous traffic scenarios, leading to inefficiencies. To address this, this paper presents a machine learning (ML)-based approach to Radio Resource Management (RRM). We treat the RRM task as a regression ML problem, integrating RRM objectives and constraints into the loss function that the ML algorithm aims to minimize. Moreover, we introduce a context-aware ML metric that not only evaluates the ML model's performance but also considers the impact of its resource allocation decisions on the overall performance of the communication system.
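The idea of folding RRM objectives and constraints into a regression loss can be sketched as a penalized training loss. The function below is an illustration under assumed simplifications (per-beam power allocation, a single power-budget constraint handled as a soft penalty); the names and the exact formulation are hypothetical, not the paper's.

```python
import numpy as np

def rrm_loss(pred_alloc, demand, total_power, penalty=10.0):
    """Illustrative RRM training loss: regression objective plus constraint penalty.

    pred_alloc:  per-beam power allocations predicted by the ML model.
    demand:      per-beam target allocations derived from traffic demand.
    total_power: payload power budget; allocations beyond it are penalized.
    """
    # Regression objective: match the demand-derived allocation.
    mse = np.mean((pred_alloc - demand) ** 2)
    # Soft constraint: penalize total allocation exceeding the power budget.
    budget_violation = max(np.sum(pred_alloc) - total_power, 0.0)
    return mse + penalty * budget_violation

demand = np.array([1.0, 2.0, 0.5, 1.5])
feasible = rrm_loss(np.array([1.0, 2.0, 0.5, 1.5]), demand, total_power=6.0)
infeasible = rrm_loss(np.array([3.0, 3.0, 3.0, 3.0]), demand, total_power=6.0)
```

A context-aware metric in the abstract's sense would additionally score the resulting allocation by its system-level effect (e.g., unmet demand), not only by the regression error above.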
This paper delves into the application of Machine Learning (ML) techniques in the realm of 5G Non-Terrestrial Networks (5G-NTN), particularly focusing on symbol detection and equalization for the Physical Broadcast Channel (PBCH). As 5G-NTN gains prominence within the 3GPP ecosystem, ML offers significant potential to enhance wireless communication performance. To investigate these possibilities, we present ML-based models trained with both synthetic data and real data from a 5G over-the-satellite testbed. Our analysis examines the performance of these models under various Signal-to-Noise Ratio (SNR) scenarios and evaluates their effectiveness in symbol enhancement and channel equalization tasks. The results highlight the models' performance in controlled settings and their adaptability to real-world challenges, shedding light on the potential benefits of applying ML in 5G-NTN.
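Equalization can be framed as supervised learning from pilot symbols, which is the general setting the abstract describes. The sketch below uses the simplest possible instance, a one-tap linear equalizer fitted by least squares on synthetic QPSK pilots; it stands in for, and is much simpler than, the neural models evaluated in the paper. The channel and noise parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic QPSK pilot symbols distorted by a fixed complex channel gain plus noise.
n = 1000
symbols = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
channel = 0.7 * np.exp(1j * 0.9)     # unknown gain and phase rotation (assumed)
noise = 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
received = channel * symbols + noise

# Supervised "training": least-squares fit of a one-tap equalizer on the pilots.
w = np.vdot(received, symbols) / np.vdot(received, received)
equalized = w * received

mse_before = np.mean(np.abs(received - symbols) ** 2)
mse_after = np.mean(np.abs(equalized - symbols) ** 2)
```

The ML models in the paper generalize this idea: instead of a single complex tap, a trained network maps distorted received symbols back toward the transmitted constellation under varying SNR.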
Interest in the integration of Terrestrial Networks (TN) and Non-Terrestrial Networks (NTN), primarily satellites, has been rekindled due to the potential of NTN to provide ubiquitous coverage. Especially with the flexible physical layer of 5G-NR, direct access to 5G services through satellites could now become possible. However, the large Round-Trip Delays (RTD) in NTNs require a re-evaluation of the design of RLC and PDCP layer timers (and associated buffers), in particular for regenerative-payload satellites, which have limited computational resources that must therefore be optimally utilized. Our aim in this work is to initiate a new line of research for emerging NTNs with limited resources from a higher-layer perspective. To this end, we propose a novel and efficient method for optimally designing the RLC and PDCP layer buffers and timers without the need for intensive computations. This approach is particularly relevant for low-cost satellites, which have limited computational and energy resources. The simulation results show that the proposed method can significantly improve performance in terms of resource utilization and delays.
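Why the large NTN RTD forces a redesign of buffers can be seen from a back-of-envelope bandwidth-delay product: buffers holding unacknowledged data must at least cover the bytes in flight over one round trip. The sketch below uses typical order-of-magnitude RTD values (assumptions, not figures from the paper).

```python
def min_buffer_bytes(rate_mbps, rtd_ms):
    """Bandwidth-delay product: bytes in flight that retransmission
    buffers must at least cover."""
    return rate_mbps * 1e6 / 8 * (rtd_ms / 1e3)

# Illustrative round-trip delays at a 50 Mbps link:
leo = min_buffer_bytes(50, rtd_ms=30)    # ~30 ms LEO RTD  -> ~187.5 kB
geo = min_buffer_bytes(50, rtd_ms=540)   # ~540 ms GEO RTD -> ~3.4 MB
```

The same scaling applies to the timers guarding those buffers (e.g., reassembly and discard timers), which must be stretched with RTD; on a resource-limited regenerative payload, oversizing either wastes scarce memory, which is what motivates the optimized dimensioning proposed in the paper.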
Sophisticated antenna technologies are constantly evolving to meet the escalating data demands projected for 6G and future networks. The characterization of these emerging antenna systems poses challenges that necessitate a reevaluation of conventional techniques, which rely solely on measurements conducted in advanced anechoic chambers. In this study, our objective is to introduce a novel approach for antenna pattern characterization (APC) in next-generation multiple-input-multiple-output (MIMO) systems by utilizing the potential of signal processing tools. In contrast to traditional methods, which struggle with multi-path scenarios and require specialized measurement equipment, we estimate the antenna pattern by exploiting information from both line-of-sight (LoS) and non-LoS contributions. This approach enables APC in complex environments without the need for anechoic chambers, resulting in substantial cost savings. Furthermore, it grants a much wider research community the ability to independently perform APC for emerging complex 6G antenna systems. Simulation results demonstrate the efficacy of the proposed approach in accurately estimating the true antenna pattern.
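As a toy illustration of measurement-based pattern estimation, the sketch below recovers gain samples on an angular grid from noisy power observations at known angles via per-bin least squares. This is a drastic simplification of the approach in the abstract (it assumes a known LoS attenuation and ignores the non-LoS contributions the paper exploits); the pattern, noise level, and path-loss value are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# True (unknown) gain pattern sampled on an angular grid: a raised-cosine main lobe.
angles = np.linspace(-60, 60, 25)
true_gain = np.cos(np.deg2rad(angles)) ** 2

# Each measurement observes gain * path_loss + noise at a known angle index.
idx = rng.integers(0, len(angles), 400)
path_loss = 0.8                                   # known LoS attenuation (assumed)
meas = true_gain[idx] * path_loss + 0.01 * rng.standard_normal(400)

# Least-squares estimate per angle bin: average the measurements, undo the path loss.
est = np.zeros_like(true_gain)
for k in range(len(angles)):
    est[k] = meas[idx == k].mean() / path_loss

err = np.max(np.abs(est - true_gain))
```

In a realistic multi-path environment, each non-LoS contribution couples the pattern samples at several angles at once, which is what turns this per-bin average into the richer estimation problem the paper addresses.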
Intelligent reflecting surfaces (IRS) have emerged as a promising technology to enhance the performance of wireless communication systems. By actively manipulating the wireless propagation environment, IRS enables efficient signal transmission and reception. In recent years, the integration of IRS with full-duplex (FD) communication has garnered significant attention due to its potential to further improve spectral and energy efficiencies. IRS-assisted FD systems combine the benefits of both IRS and FD technologies, providing a powerful solution for the next generation of cellular systems. In this manuscript, we present a novel approach to jointly optimize active and passive beamforming in a multiple-input-multiple-output (MIMO) FD system assisted by an IRS for weighted sum rate (WSR) maximization. Given the inherent difficulty of obtaining perfect channel state information (CSI) in practical scenarios, we consider imperfect CSI and propose a statistically robust beamforming strategy to maximize the ergodic WSR. Additionally, we analyze the achievable WSR for an IRS-assisted MIMO FD system under imperfect CSI by deriving both the lower and upper bounds. To tackle the problem of ergodic WSR maximization, we employ the concept of expected weighted minimum mean squared error (EWMMSE), which exploits the information of the expected error covariance matrices and ensures convergence to a local optimum. We evaluate the effectiveness of our proposed design through extensive simulations. The results demonstrate that our robust approach yields significant performance improvements over a simplistic beamforming approach that disregards CSI errors, while also considerably outperforming the robust half-duplex (HD) system.
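For context, the WMMSE equivalence that this family of methods builds on can be stated in its standard form (generic notation, not necessarily the paper's): with MMSE receive filtering, the rate of user $k$ is determined by its error covariance matrix, and WSR maximization shares its stationary points with a weighted MSE minimization in which the weight matrices are optimized in closed form.

```latex
% Rate of user k in terms of its MMSE error covariance matrix E_k:
R_k = \log\det\!\big(\mathbf{E}_k^{-1}\big),
\qquad
\mathbf{E}_k = \mathbb{E}\!\left[(\hat{\mathbf{s}}_k - \mathbf{s}_k)(\hat{\mathbf{s}}_k - \mathbf{s}_k)^{H}\right].

% WSR maximization over the beamformers F and its WMMSE reformulation
% share stationary points when the weights W_k are chosen optimally:
\max_{\mathbf{F}} \sum_k w_k R_k
\;\Longleftrightarrow\;
\min_{\mathbf{F},\,\{\mathbf{W}_k\}} \sum_k w_k \Big(\operatorname{Tr}\!\big(\mathbf{W}_k \mathbf{E}_k\big) - \log\det \mathbf{W}_k\Big),
\qquad
\mathbf{W}_k^{\star} = \mathbf{E}_k^{-1}.
```

In the ergodic (EWMMSE) variant described above, $\mathbf{E}_k$ is replaced by its expectation over the CSI error distribution, which is what makes the resulting beamformers statistically robust to imperfect CSI.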
In this paper, a novel robust beamforming design for an intelligent reflecting surface (IRS)-assisted full-duplex (FD) system is presented. Since perfect channel state information (CSI) is often challenging to acquire in practice, we consider the case of imperfect CSI and adopt a statistically robust beamforming approach to maximize the ergodic weighted sum rate (WSR). We also analyze the achievable WSR of an IRS-assisted FD system with imperfect CSI, for which the lower and upper bounds are derived. The ergodic WSR maximization problem is tackled based on the expected Weighted Minimum Mean Squared Error (WMMSE) method, which is guaranteed to converge to a local optimum. The effectiveness of the proposed design is investigated with extensive simulation results. It is shown that our robust design achieves significant performance gains compared to naive beamforming approaches and considerably outperforms the robust Half-Duplex (HD) system.
The paradigm of joint communications and sensing (JCAS) envisions a revolutionary integration of communication and radar functionalities within a unified hardware platform. This novel concept not only opens up unprecedented possibilities, but also presents unique challenges. Its success is highly dependent on efficient full-duplex (FD) operation, which has the potential to enable simultaneous transmission and reception within the same frequency band. While ongoing research explores the potential of JCAS, there are related avenues of investigation that hold tremendous potential to profoundly transform the sixth generation (6G) and beyond cellular networks. This article sheds light on the new opportunities and challenges presented by JCAS by taking into account the key technical challenges of FD systems. Unlike simplified JCAS scenarios, we delve into the most comprehensive configuration, encompassing uplink (UL) and downlink (DL) users, as well as monostatic and bistatic radars, all harmoniously coexisting to jointly push the boundaries of both the communications and sensing performance. The performance improvements introduced by this advancement bring forth numerous new challenges, each meticulously examined and expounded upon.