This paper introduces a joint optimization framework for user-centric beam selection and linear precoding (LP) design in a coordinated multiple-satellite (CoMSat) system employing a discrete-Fourier-transform (DFT)-based beamforming (BF) technique. To serve users at their target SINRs while minimizing the total transmit power, the scheme efficiently determines which satellites each user should associate with, activates the best cluster of beams, and optimizes the LP for every satellite-to-user transmission. These objectives are first framed as a challenging mixed-integer programming (MIP) problem. To tackle it, we reformulate it as a joint cluster-association and LP design problem. Then, by theoretically analyzing the duality between downlink and uplink transmissions, we develop an efficient iterative method to identify the optimal solution. Additionally, a simpler duality-based approach for rapid beam selection and LP design is presented for comparison. Simulation results underscore the effectiveness of the proposed schemes across various settings.
Spiking neural networks (SNNs) implemented on neuromorphic processors (NPs) can enhance the energy efficiency of artificial intelligence (AI) deployments for specific workloads. As such, NPs represent an interesting opportunity for implementing AI tasks on board power-limited satellite communication spacecraft. In this article, we present the findings of a recently completed study that compared the performance and power consumption of different satellite communication use cases implemented on standard AI accelerators and on NPs. In particular, the article describes three prominent use cases, namely payload resource optimization, onboard interference detection and classification, and dynamic receive beamforming, and compares the performance of conventional convolutional neural networks (CNNs) implemented on Xilinx's VCK5000 Versal development card with that of SNNs on Intel's neuromorphic chip Loihi 2.
Satellite swarms have recently gained attention in the space industry due to their ability to provide extremely narrow beamwidths at a lower cost than single-satellite systems. This paper proposes a concept for a satellite swarm using a distributed subarray configuration based on a 2D normal probability distribution. The swarm comprises multiple small satellites acting as subarrays of a large aperture array, limited to a radius of 20,000 wavelengths and operating at a central frequency of 19 GHz. The main advantage of this approach is that the distributed subarrays can provide extremely directive beams and beamforming capabilities that are not possible with a conventional antenna and satellite design. The proposed swarm concept is analyzed, and the simulation results show that the radiation pattern achieves a beamwidth as narrow as 0.0015 degrees with a maximum side-lobe level of 18.8 dB and a grating-lobe level of 14.8 dB. This concept can be used for high-data-rate applications or emergency systems.
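To illustrate why a large aperture of randomly placed subarrays yields such a narrow main beam, the following sketch computes the array factor of isotropic point sources (standing in for subarrays) drawn from a 2D normal distribution and estimates the half-power beamwidth from the -3 dB crossings. The element count and placement spread are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative, not the paper's setup)
n_elements = 256          # isotropic point sources standing in for subarrays
sigma = 3000.0            # std-dev of the 2D normal placement, in wavelengths
radius = 20000.0          # aperture radius limit in wavelengths (as in the abstract)

# Draw 2D-normal positions and keep those inside the aperture radius
pos = rng.normal(0.0, sigma, size=(n_elements, 2))
pos = pos[np.linalg.norm(pos, axis=1) <= radius]

# Array factor along a theta cut in the x-z plane (boresight at theta = 0)
theta = np.linspace(-0.01, 0.01, 2001) * np.pi / 180.0   # +/- 0.01 degrees
af = np.abs(np.exp(1j * 2 * np.pi * np.outer(np.sin(theta), pos[:, 0])).sum(axis=1))
af_db = 20 * np.log10(af / af.max())

# Half-power beamwidth estimated from the -3 dB crossings of the main lobe
above = theta[af_db >= -3.0]
hpbw_deg = (above.max() - above.min()) * 180.0 / np.pi
print(f"approx. half-power beamwidth: {hpbw_deg:.4f} deg")
```

Since the sources are randomly placed rather than on a regular grid, the beamwidth is set by the overall aperture spread (here a few thousand wavelengths), which is what makes sub-hundredth-of-a-degree beams reachable at all.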
The latest satellite communication (SatCom) missions are characterized by a fully reconfigurable on-board software-defined payload, capable of adapting radio resources to the temporal and spatial variations of the system traffic. As pure optimization-based solutions have proven computationally demanding and inflexible, machine learning (ML)-based methods have emerged as promising alternatives. We investigate the application of energy-efficient brain-inspired ML models to on-board radio resource management. Beyond software simulation, we report extensive experimental results leveraging the recently released Intel Loihi 2 chip. To benchmark the performance of the proposed model, we implement conventional convolutional neural networks (CNNs) on a Xilinx Versal VCK5000 and provide a detailed comparison of accuracy, precision, recall, and energy efficiency for different traffic demands. Most notably, for relevant workloads, spiking neural networks (SNNs) implemented on Loihi 2 yield higher accuracy while reducing power consumption by more than 100$\times$ compared to the CNN-based reference platform. Our findings point to the significant potential of neuromorphic computing and SNNs in supporting on-board SatCom operations, paving the way for enhanced efficiency and sustainability in future SatCom systems.
The space communications industry is challenged to develop a technology that can deliver broadband services to user terminals equipped with miniature antennas, such as handheld devices. One potential solution for establishing links with ground users is the deployment of massive antennas on a single spacecraft; however, this is not cost-effective. Aligned with recent \emph{NewSpace} activities directed toward miniaturization, mass production, and a significant reduction in spacecraft launch costs, an alternative is distributed beamforming from multiple satellites. In this context, we propose a distributed beamforming modeling technique for wideband signals that also accounts for the statistical behavior of the relative geometry of the swarm nodes. The paper assesses the proposed technique via computer simulations, providing interesting results on the beamforming gains in terms of power and on the security of the communication against potential eavesdroppers at non-intended pointing angles. This approach paves the way for further exploration of wideband distributed beamforming from satellite swarms in several future communication applications.
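As a toy illustration of why wideband signals need dedicated modeling in a distributed array, the sketch below compares narrowband phase-only weights (matched only at the carrier) with true-time-delay weights across a wide band for a handful of spatially spread nodes. The carrier, bandwidth, node spread, and pointing angle are all hypothetical values chosen for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical swarm geometry and signal parameters (illustrative only)
c = 3e8                                   # speed of light [m/s]
fc = 19e9                                 # carrier frequency [Hz]
bw = 500e6                                # signal bandwidth [Hz]
n_sat = 8                                 # number of swarm nodes
x = rng.uniform(-50.0, 50.0, n_sat)       # node positions along one axis [m]
theta = np.deg2rad(5.0)                   # intended pointing angle

tau = x * np.sin(theta) / c               # per-node delay toward the user
f = np.linspace(fc - bw / 2, fc + bw / 2, 201)

recv = np.exp(2j * np.pi * f[:, None] * tau[None, :])     # node phases across the band
w_phase = np.exp(-2j * np.pi * fc * tau)                  # conjugate at the carrier only
w_ttd = np.exp(-2j * np.pi * f[:, None] * tau[None, :])   # conjugate at every frequency

# Normalized coherent-combining gain over the band for both weighting schemes
g_phase = np.abs((recv * w_phase[None, :]).sum(axis=1)) / n_sat
g_ttd = np.abs((recv * w_ttd).sum(axis=1)) / n_sat

print(f"gain at band center (phase-only): {g_phase[100]:.3f}")
print(f"worst in-band gain, phase-only: {g_phase.min():.3f}, TTD: {g_ttd.min():.3f}")
```

With meter-scale node separations, the inter-node delays span many carrier cycles, so phase-only weights decohere away from the carrier while true-time-delay weights hold the full gain across the band; this is the basic effect a wideband distributed-beamforming model has to capture.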
This paper jointly designs linear precoding (LP) and codebook-based beamforming for a satellite equipped with massive multiple-input multiple-output (mMIMO) antenna technology. The codebook of beamforming weights is built from the columns of the discrete Fourier transform (DFT) matrix, and the resulting joint design maximizes the achievable throughput under limited transmission power. The corresponding optimization problem is first formulated as a mixed-integer non-linear program (MINP). To adequately address this challenging problem, an efficient LP and DFT-based beamforming algorithm is developed using several optimization tools, including the weighted minimum mean square error transformation, the duality method, and the Hungarian algorithm. In addition, a greedy algorithm is proposed for benchmarking. A complexity analysis of these solutions is provided, along with a comprehensive set of Monte Carlo simulations demonstrating the efficiency of the proposed algorithms.
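To make the DFT-codebook idea concrete, here is a minimal sketch for a uniform linear array: the codebook entries are the normalized columns of the DFT matrix, and the entry with the largest gain toward a user's steering vector is selected. The array size and user angle are hypothetical; the paper's joint LP and beam-selection design is far more involved than this single-user selection step.

```python
import numpy as np

n_ant = 64                                             # hypothetical ULA size
# Codebook: normalized columns of the DFT matrix, one column per beam
codebook = np.fft.fft(np.eye(n_ant)) / np.sqrt(n_ant)

def steering_vector(n, theta):
    """Steering vector of a half-wavelength-spaced ULA toward angle theta (rad)."""
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta)) / np.sqrt(n)

theta_user = np.deg2rad(17.0)                          # hypothetical user direction
a = steering_vector(n_ant, theta_user)

# Beamforming gain of each codebook entry toward the user
gains = np.abs(codebook.conj().T @ a) ** 2
best = int(np.argmax(gains))
print(f"selected DFT beam: {best}, gain: {gains[best]:.3f} (1.0 for an on-grid user)")
```

Because the DFT matrix is unitary, the codebook partitions the angular domain into orthogonal beams; a user falling between two beam directions incurs a scalloping loss, which is one reason a joint design with the precoder matters.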