This paper introduces a joint optimization framework for user-centric beam selection and linear precoding (LP) design in a coordinated multiple-satellite (CoMSat) system, employing a Discrete-Fourier-Transform-based (DFT) beamforming (BF) technique. To serve users at their target SINRs while minimizing the total transmit power, the scheme aims to efficiently determine which satellites each user associates with, activate the best cluster of beams, and optimize the LP for every satellite-to-user transmission. These technical objectives are first framed as a complex mixed-integer programming (MIP) challenge. To tackle this, we reformulate it as a joint cluster association and LP design problem. Then, by theoretically analyzing the duality relationship between downlink and uplink transmissions, we develop an efficient iterative method to identify the optimal solution. Additionally, a simpler duality approach for rapid beam selection and LP design is presented for comparison purposes. Simulation results underscore the effectiveness of our proposed schemes across various settings.
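The uplink-downlink duality step can be illustrated on a toy multi-antenna downlink: a fixed-point iteration on dual-uplink powers meets the SINR targets, and the resulting uplink MMSE filters double as downlink precoding directions. This is a generic textbook sketch, not the paper's CoMSat model; the array size, user count, channels, and targets below are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K = 4, 3                      # antennas, users (hypothetical sizes)
H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)
gamma = np.full(K, 1.0)          # target SINRs (linear scale)
sigma2 = 1.0                     # noise power

# Fixed-point iteration on the dual-uplink powers q.
q = np.ones(K)
for _ in range(200):
    for k in range(K):
        # Interference-plus-noise covariance seen by user k on the dual uplink.
        Sigma = sigma2 * np.eye(M, dtype=complex)
        for j in range(K):
            if j != k:
                Sigma += q[j] * np.outer(H[j], H[j].conj())
        sinr_per_unit_power = np.real(H[k].conj() @ np.linalg.solve(Sigma, H[k]))
        q[k] = gamma[k] / sinr_per_unit_power

# Uplink MMSE receive filters double as (unnormalized) downlink precoders.
Sigma_all = sigma2 * np.eye(M, dtype=complex) \
    + sum(q[j] * np.outer(H[j], H[j].conj()) for j in range(K))
W = np.array([np.linalg.solve(Sigma_all, H[k]) for k in range(K)])
print("dual-uplink powers:", np.round(q, 3))
```

At convergence each user attains exactly its target SINR with minimum total dual power, which duality theory then maps to the downlink power-minimization solution.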
Semi-grant-free non-orthogonal multiple access (semi-GF NOMA) has emerged as a promising technology for fifth-generation new radio (5G-NR) networks, supporting the coexistence of a large number of random connections with various quality-of-service requirements. However, implementing a semi-GF NOMA mechanism in 5G-NR networks with heterogeneous services raises several resource management problems relating to the unpredictable interference caused by the GF access strategy. To cope with this challenge, the paper develops a novel hybrid optimization and multi-agent deep reinforcement learning-based (HOMAD) resource allocation design to maximize the energy efficiency (EE) of semi-GF NOMA 5G-NR systems. In this design, a multi-agent deep Q network (MADQN) approach is employed to conduct the subchannel assignment (SA) among users, while optimization-based methods are utilized to optimize the transmission power for every SA setting. In addition, a full MADQN scheme conducting both SA and power allocation is also considered for comparison purposes. Simulation results show that the HOMAD approach significantly outperforms other benchmarks in terms of convergence time and average EE.
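The multi-agent Q-learning idea behind the SA stage can be sketched with a tabular, stateless stand-in for the deep Q network: each user is an agent picking a subchannel, all agents share an EE-like team reward that penalizes collisions, and each agent updates its own Q-values independently. The user/subchannel counts, reward shape, and learning parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
U, C = 3, 3                        # users (agents), subchannels (toy sizes)
episodes, eps_greedy, alpha = 2000, 0.1, 0.2
Q = np.zeros((U, C))               # one Q-row per agent (tabular stand-in for MADQN)

def reward(choice):
    # Shared team reward: +1 for every user alone on its subchannel.
    counts = np.bincount(choice, minlength=C)
    return sum(1.0 if counts[c] == 1 else 0.0 for c in choice)

for _ in range(episodes):
    # Epsilon-greedy action selection per agent.
    choice = np.array([
        rng.integers(C) if rng.random() < eps_greedy else int(np.argmax(Q[u]))
        for u in range(U)
    ])
    r = reward(choice)
    for u in range(U):             # independent Q-update per agent
        Q[u, choice[u]] += alpha * (r - Q[u, choice[u]])
```

In the paper's hybrid design the inner power allocation is solved by optimization for each SA candidate; here the reward is only a collision count, kept deliberately simple.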
The space communications industry is challenged to develop a technology that can deliver broadband services to user terminals equipped with miniature antennas, such as handheld devices. One potential solution to establish links with ground users is the deployment of massive antennas on a single spacecraft. However, this is not cost-effective. Aligning with recent \emph{NewSpace} activities directed toward miniaturization, mass production, and a significant reduction in spacecraft launch costs, an alternative could be distributed beamforming from multiple satellites. In this context, we propose a distributed beamforming modeling technique for wideband signals. We also consider the statistical behavior of the relative geometry of the swarm nodes. The paper assesses the proposed technique via computer simulations, providing insightful results on the beamforming gains in terms of power, and on the security of the communication against potential eavesdroppers at non-intended pointing angles. This approach paves the way for further exploration of wideband distributed beamforming from satellite swarms in several future communication applications.
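The core effect can be illustrated with a narrowband, single-frequency toy model (the paper treats the wideband case): swarm nodes at random positions apply conjugate-phase weights toward an intended angle, producing full coherent gain there and, on average, much lower gain at non-intended angles where an eavesdropper might sit. The carrier frequency, node count, and position spread below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 3e8 / 2e9                  # wavelength for an assumed 2 GHz carrier (m)
N = 16                           # swarm nodes
# Node positions drawn uniformly in a 100 m cube: a crude stand-in for the
# statistical swarm geometry studied in the paper.
pos = rng.uniform(-50, 50, size=(N, 3))

theta0 = 0.0                     # intended pointing angle (boresight)
u0 = np.array([np.sin(theta0), 0.0, np.cos(theta0)])
w = np.exp(-1j * 2 * np.pi / lam * pos @ u0)    # conjugate-phase weights

def gain(theta):
    """Normalized array gain toward angle theta under a planar-wavefront model."""
    u = np.array([np.sin(theta), 0.0, np.cos(theta)])
    steer = np.exp(1j * 2 * np.pi / lam * pos @ u)
    return np.abs(np.sum(w * steer)) ** 2 / N**2

print("gain at intended angle:", gain(theta0))
print("gain 0.5 rad off-axis: ", gain(0.5))
```

The off-axis gain behaves like an incoherent sum over the random geometry, which is exactly the security-against-eavesdroppers effect the simulations quantify.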
This paper proposes a joint optimization framework for energy-efficient precoding and feeder-link-beam matching design in a multi-gateway multi-beam bent-pipe satellite communication system. The proposed scheme jointly optimizes the precoding vectors at the gateways and the amplifying-and-matching mechanism at the satellite to maximize the system weighted energy efficiency under the transmit power budget constraint. The technical designs are formulated into a non-convex sparsity problem consisting of a fractional-form objective function and sparsity-related constraints. To address these challenges, two efficient iterative designs are proposed by utilizing the concepts of Dinkelbach's method and the compressed-sensing approach. The simulation results demonstrate the effectiveness of the proposed scheme compared to a benchmark method.
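Dinkelbach's method, which one of the designs builds on, can be sketched on a toy single-link energy-efficiency problem: maximize rate over consumed power by repeatedly solving the parametric problem rate(p) − λ·power(p) and updating λ. The channel gain, circuit power, and power budget below are hypothetical toy values, not the paper's system model.

```python
import numpy as np

h, Pc, Pmax = 2.0, 0.5, 4.0      # channel gain, circuit power, budget (toy values)

def rate(p):                     # numerator: spectral efficiency
    return np.log2(1 + h * p)

def power(p):                    # denominator: total consumed power
    return p + Pc

lam = 0.0
for _ in range(30):
    # Inner problem: maximize rate(p) - lam * power(p); concave in p, so the
    # stationary point clamped to the budget is optimal.
    p = np.clip(1 / (lam * np.log(2)) - 1 / h if lam > 0 else Pmax, 0.0, Pmax)
    lam = rate(p) / power(p)     # Dinkelbach update of the EE level
print(f"EE ~ {lam:.4f} at p ~ {p:.4f}")
```

At convergence the parametric objective rate(p) − λ·power(p) is zero, which certifies λ as the optimal ratio; the paper applies the same mechanism to the fractional-form weighted-EE objective.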
To allow flexible and cost-efficient network densification and deployment, the integrated access and backhaul (IAB) was recently standardized by the third generation partnership project (3GPP) as part of the fifth-generation new radio (5G-NR) networks. However, the current standardization only defines the IAB for the terrestrial domain, while non-terrestrial networks (NTNs) are yet to be considered for such standardization efforts. In this work, we motivate the use of IAB in NTNs, and we discuss the compatibility issues between the 3GPP specifications on IAB in 5G-NR and the satellite radio regulations. In addition, we identify the required adaptation from the 3GPP and/or satellite operators for realizing an NTN-enabled IAB operation. A case study is provided for a low earth orbit (LEO) satellite-enabled in-band IAB operation with orthogonal and non-orthogonal bandwidth allocation between access and backhauling, and under both time- and frequency-division duplex (TDD/FDD) transmission modes. Numerical results demonstrate the feasibility of IAB through satellites, and illustrate the superiority of FDD over TDD transmission. It is also shown that, in the absence of precoding, non-orthogonal bandwidth allocation between the access and the backhaul can largely degrade the network throughput.
This paper presents a study of an integrated satellite-terrestrial network, where Low-Earth-Orbit (LEO) satellites are used to provide the backhaul link between base stations (BSs) and the core network. The mobility of LEO satellites raises the challenge of determining the optimal association between LEO satellites, BSs, and users (UEs). The goal is to satisfy the UE demand while ensuring load balance and optimizing the capacity of the serving link between the BS and the LEO satellite. To tackle this complex optimization problem, which involves mixed-integer non-convex programming, we propose an iterative algorithm that leverages approximation and relaxation methods. The proposed solution aims to find the optimal two-tier satellite-BS-UE association, sub-channel assignment, power and bandwidth allocation in the shortest possible time, fulfilling the requirements of the integrated satellite-terrestrial network.
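The relax-and-round idea behind such mixed-integer association problems can be sketched on a toy BS-satellite example: relax the binary association indicators to fractional weights, then round the most confident rows first while enforcing a load-balance cap. The capacity model, fractional surrogate, and cap rule below are invented for illustration and are much simpler than the paper's iterative algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
B, S = 5, 2                                  # base stations, LEO satellites (toy)
cap = rng.uniform(5, 10, size=(B, S))        # link capacity of BS b to satellite s

# Relaxation step: fractional association weights proportional to capacity,
# a crude surrogate for solving the relaxed continuous subproblem.
frac = cap / cap.sum(axis=1, keepdims=True)

# Rounding step with a per-satellite load cap for balance.
cap_per_sat = int(np.ceil(B / S))
load = np.zeros(S, dtype=int)
assoc = np.empty(B, dtype=int)
for b in np.argsort(-frac.max(axis=1)):      # round the most confident BSs first
    for s in np.argsort(-frac[b]):           # prefer the heaviest fractional weight
        if load[s] < cap_per_sat:
            assoc[b] = s
            load[s] += 1
            break
```

In the paper this rounding is embedded in an iterative loop that also updates sub-channel, power, and bandwidth variables; here only the association skeleton is shown.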
This paper presents a centralized framework for optimizing the joint design of beam placement, power, and bandwidth allocation in a medium-earth-orbit (MEO) satellite constellation to fulfill the heterogeneous traffic demands of a large number of global users. The problem is formulated as a mixed-integer programming problem, which is computationally complex in large-scale systems. To overcome this challenge, a three-stage solution approach is proposed, including user clustering, cluster-based bandwidth and power estimation, and MEO-cluster matching. A greedy algorithm is also included as a benchmark for comparison. The results demonstrate the superiority of the proposed algorithm over the benchmark in terms of satisfying user demands and reducing power consumption.
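The three-stage pipeline can be sketched end to end on toy data: Lloyd's k-means as a stand-in for the user-clustering stage, aggregated demand as a proxy for the bandwidth/power estimation stage, and a greedy heaviest-to-largest rule for the MEO-cluster matching stage. All positions, demands, and capacities below are assumed, and each stage is a simplification of the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(2)
users = rng.uniform(0, 100, size=(60, 2))   # user positions (hypothetical units)
demand = rng.uniform(1, 5, size=60)         # per-user traffic demand
K = 4                                       # number of user clusters

# Stage 1: Lloyd's k-means as a simple stand-in for user clustering.
centers = users[rng.choice(len(users), K, replace=False)]
for _ in range(20):
    label = np.argmin(((users[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([users[label == k].mean(0) if np.any(label == k)
                        else centers[k] for k in range(K)])

# Stage 2: aggregate demand per cluster, a proxy for bandwidth/power estimation.
cluster_demand = np.array([demand[label == k].sum() for k in range(K)])

# Stage 3: greedy matching: heaviest cluster gets the largest-capacity MEO.
meo_capacity = np.array([40.0, 30.0, 20.0, 10.0])   # hypothetical MEO capacities
match = dict(zip(np.argsort(-cluster_demand), np.argsort(-meo_capacity)))
```

Decomposing the mixed-integer problem this way trades optimality for tractability, which is the point of the three-stage design at large scale.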
This paper aims to develop satellite-user association and resource allocation mechanisms to minimize the total transmit power for integrated terrestrial and non-terrestrial networks wherein a constellation of LEO satellites provides radio access services to both terrestrial base stations (BSs) and satellite-enabled users (SUEs). In this work, besides maintaining the traditional SatCom connection for SUEs, the LEO satellites provide backhaul links to the BSs to upload the data received from their ground customers. Taking into account the individual SUE traffic demands and the aggregated BS demands, we formulate a mixed-integer program consisting of binary variables arising from satellite association selection, together with power-control and bandwidth-allocation variables. To cope with this challenging problem, an iterative optimization-based algorithm is proposed by relaxing the binary components and alternately updating all variables. A greedy mechanism is also presented for comparison purposes. Numerical results are then presented to confirm the effectiveness of our proposed algorithms.
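The greedy comparison mechanism can be sketched on toy numbers: serve high-demand users first, and let each one associate with the satellite that needs the least bandwidth to meet its rate target, within per-satellite budgets. The gain model, demands, and budgets below are assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(3)
U, S = 8, 3                                  # users (SUEs/BSs), LEO satellites
gain = rng.uniform(0.1, 1.0, size=(U, S))    # normalized channel gains (assumed)
demand = rng.uniform(1.0, 3.0, size=U)       # required rates (toy units)
bw_budget = np.full(S, 20.0)                 # per-satellite bandwidth budget

assoc = -np.ones(U, dtype=int)               # -1 marks "not yet served"
for u in np.argsort(-demand):                # serve heavy users first
    need = demand[u] / np.log2(1 + gain[u])  # bandwidth needed on each satellite
    for s in np.argsort(need):               # cheapest satellite first
        if need[s] <= bw_budget[s]:
            assoc[u] = s
            bw_budget[s] -= need[s]
            break
```

The paper's iterative relaxation-based algorithm jointly refines association, power, and bandwidth instead of fixing them greedily; this baseline only shows the association skeleton.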
This paper jointly designs linear precoding (LP) and codebook-based beamforming implemented in a satellite with massive multiple-input multiple-output (mMIMO) antenna technology. The codebook of beamforming weights is built using the columns of the discrete Fourier transform (DFT) matrix, and the resulting joint design maximizes the achievable throughput under limited transmission power. The corresponding optimization problem is first formulated as a mixed-integer non-linear program (MINLP). To adequately address this challenging problem, an efficient LP and DFT-based beamforming algorithm is developed by utilizing several optimization tools, such as the weighted minimum mean square error transformation, duality method, and Hungarian algorithm. In addition, a greedy algorithm is proposed for benchmarking. A complexity analysis of these solutions is provided along with a comprehensive set of Monte Carlo simulations demonstrating the efficiency of our proposed algorithms.
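The DFT-codebook building block can be shown in a few lines: the codebook is the unitary DFT matrix, and for a given channel the best beam is the column with the largest channel projection. The array size and channel are toy values; the paper's joint design additionally optimizes the LP and assigns beams across users (e.g., via the Hungarian algorithm), which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(4)
M = 16                                       # mMIMO array size (hypothetical)
F = np.fft.fft(np.eye(M)) / np.sqrt(M)       # DFT codebook: columns are beams
h = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)

score = np.abs(F.conj().T @ h)               # channel projection onto each beam
best = int(np.argmax(score))                 # codebook index with maximum gain
w = F[:, best]                               # selected beamforming weights
print(f"best DFT beam: {best}, gain: {score[best] ** 2:.3f}")
```

Because the codebook is unitary, beam selection reduces to a projection-and-argmax step, which is what makes the per-beam part of the joint problem cheap relative to the LP design.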
This paper aims to jointly determine linear precoding (LP) vectors, beam hopping (BH), and discrete DVB-S2X transmission rates for GEO satellite communication systems to minimize the payload power consumption and satisfy ground users' demands within a time window. Owing to the constraint on the maximum number of illuminated beams per time slot, the technical requirement is formulated as a sparse optimization problem in which the hardware-related beam illumination energy is modeled as a sparsity function of the LP vectors. To cope with this problem, the compressed-sensing method is employed to transform the sparsity parts into a quadratic form of the precoders. Then, an iterative window-based algorithm is developed to update the LP vectors sequentially to an efficient solution. Additionally, two two-phase frameworks are proposed for comparison purposes. In the first phase, these methods aim to determine the MODCOD transmission schemes for users to meet their demands by using a heuristic approach or a deep neural network (DNN) tool. In the second phase, the LP vectors of each time slot are optimized separately based on the determined MODCOD schemes.
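The compressed-sensing step of turning a sparsity term into a quadratic one can be illustrated with a FOCUSS-style iteratively reweighted least-squares toy: the beam on/off count is replaced by a weighted quadratic term whose weights come from the previous iterate, driving most entries to zero. The problem sizes, sparse vector, and annealing schedule below are invented for illustration and stand in for the paper's precoder-level formulation.

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 8, 16                                 # measurements, candidate beams (toy)
x_true = np.zeros(n)
x_true[3], x_true[10] = 1.5, -2.0            # two "illuminated" beams
A = rng.standard_normal((m, n))
y = A @ x_true

x = np.linalg.lstsq(A, y, rcond=None)[0]     # dense (min-norm) starting point
eps = 1e-3
for _ in range(60):
    wv = np.abs(x) ** 2 + eps                # reweighting from the last iterate
    # Minimize the weighted quadratic sum(x_j^2 / wv_j) s.t. A x = y (closed form).
    x = wv * (A.T @ np.linalg.solve((A * wv) @ A.T, y))
    eps = max(eps * 0.7, 1e-6)               # anneal toward the l0 surrogate
```

Each iteration solves only a quadratic problem, mirroring how the paper's reformulation lets standard precoder optimization handle the beam-illumination sparsity.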