This paper proposes an energy-efficient RIS-assisted downlink NOMA communication framework for LEO satellite networks. The proposed framework jointly optimizes the LEO satellite's transmit power toward its ground terminals and the passive beamforming at the RIS while ensuring the quality of service. Owing to the nature of the considered system and optimization variables, the energy efficiency maximization problem is non-convex, and obtaining its optimal solution in practice is very challenging. Therefore, we adopt an alternating optimization method that handles the joint optimization in two steps. In step 1, for any given phase shift vector, we compute the satellite transmit power toward each ground terminal using the Lagrangian dual method. Then, in step 2, given the transmit power, we design the passive beamforming at the RIS by solving a semi-definite program. We also compare our solution with a benchmark framework having a fixed phase shift design and a conventional NOMA framework without an RIS. Numerical results show that the proposed optimization framework achieves 21.47\% and 54.9\% higher energy efficiency than the benchmark and conventional frameworks, respectively.
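The two-step alternating structure described above can be illustrated with a minimal numerical sketch. Coordinate-wise grid searches stand in for the Lagrangian dual power update and the SDP-based passive beamforming, and the channels, dimensions, and power budget are illustrative placeholders, not the paper's system model:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 2, 8                        # ground terminals, RIS elements (illustrative)
P_max, P_c, sigma2 = 1.0, 0.1, 1e-2

# Placeholder direct and RIS-cascaded channels
h_d = rng.normal(size=K) + 1j * rng.normal(size=K)
G = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(N)

def eff_gain(theta):
    """Effective channel gain per terminal for RIS phase vector theta."""
    v = np.exp(1j * theta)
    return np.abs(h_d + G @ v) ** 2

def energy_eff(p, theta):
    """Sum rate divided by total consumed power (bits/Joule up to scaling)."""
    rate = np.sum(np.log2(1.0 + p * eff_gain(theta) / sigma2))
    return rate / (np.sum(p) + P_c)

p = np.full(K, P_max / K)          # start at equal power split
theta = np.zeros(N)
ee0 = energy_eff(p, theta)
for _ in range(20):
    # Step 1: per-terminal power update (grid search instead of Lagrangian dual)
    grid = np.linspace(1e-4, P_max / K, 50)
    for k in range(K):
        p[k] = max(grid, key=lambda x: energy_eff(np.where(np.arange(K) == k, x, p), theta))
    # Step 2: per-element phase update (stands in for the SDP passive beamforming)
    phis = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
    for n in range(N):
        theta[n] = max(phis, key=lambda ph: energy_eff(p, np.where(np.arange(N) == n, ph, theta)))
```

Because each coordinate update keeps the current value among its candidates, the energy efficiency is non-decreasing across iterations, mirroring the convergence behavior of the alternating scheme.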
Unmanned aerial vehicles (UAVs) have emerged as a practical solution for providing on-demand services to users in areas where the terrestrial network is non-existent or temporarily unavailable, e.g., due to natural disasters or network congestion. However, a UAV's user-serving capacity is constrained by its limited battery life and finite communication resources, which strongly impact its performance. This work considers orthogonal frequency division multiple access (OFDMA) enabled multi-UAV communication systems that provide on-demand services. The main aim of this work is to derive an efficient technique for the allocation of radio resources, the $3$D placement of UAVs, and the user association matrices. To achieve the desired objectives, we decouple the original joint optimization problem into two sub-problems: (i) $3$D placement and user association and (ii) sum-rate maximization for optimal radio resource allocation, which are solved iteratively. Numerical results show that the proposed iterative algorithm converges quickly, in fewer than 10 iterations. The benefits of the proposed design are demonstrated via its superior sum-rate performance compared to existing reference designs. Moreover, the results show that optimal power and sub-carrier allocation help mitigate the inter-cell interference that directly impacts the system's performance.
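The placement-and-association sub-problem can be sketched as a k-means-style alternation between nearest-UAV user association and placement at the centroid of each UAV's associated users. This toy example omits the altitude dimension and the sum-rate resource allocation of the second sub-problem, and all coordinates and counts are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
U, M = 3, 30                                  # UAVs, ground users (illustrative)
users = rng.uniform(0.0, 1000.0, size=(M, 2))  # user positions on a 1 km square
uavs = rng.uniform(0.0, 1000.0, size=(U, 2))   # initial horizontal UAV positions

for _ in range(10):
    # User association: each user joins its nearest UAV
    d = np.linalg.norm(users[:, None, :] - uavs[None, :, :], axis=2)
    assoc = d.argmin(axis=1)
    # Placement: each UAV moves to the centroid of its associated users
    for u in range(U):
        if np.any(assoc == u):
            uavs[u] = users[assoc == u].mean(axis=0)
```

Each alternation weakly reduces the total user-to-UAV distance, which is why the decoupled iteration converges after only a few rounds.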
This work considers orthogonal frequency division multiple access (OFDMA) enabled multi-UAV communication systems that provide on-demand services. The main aim of this work is to derive the optimal allocation of radio resources, 3D placement of UAVs, and user association matrices. To achieve the desired objectives, we decouple the original joint optimization problem into two sub-problems: i) 3D placement and user association and ii) sum-rate maximization for optimal radio resource allocation, which are solved iteratively. Numerical results show that the proposed iterative algorithm converges quickly, in fewer than 10 iterations. The benefits of the proposed design are demonstrated via its superior sum-rate performance compared to existing reference designs. Moreover, the results show that optimal power and sub-carrier allocation help mitigate the inter-cell interference that directly impacts the system's performance.
Low earth orbit (LEO) satellite constellation-enabled communication networks are expected to be an important part of many Internet of Things (IoT) deployments due to their unique advantage of providing seamless global coverage. In this paper, we investigate the random access problem in massive multiple-input multiple-output-based LEO satellite systems, where a multi-satellite cooperative processing mechanism is considered. Specifically, at the edge satellite nodes, we conceive a training-sequence-padded multi-carrier system to overcome imperfect synchronization, where the training sequence is utilized to detect the devices' activity and estimate their channels. Considering the inherent sparsity of terrestrial-satellite links and the sporadic traffic of IoT terminals, we utilize the orthogonal approximate message passing-multiple measurement vector algorithm to estimate the delay coefficients and user terminal activity. To further exploit the structure of the receive array, a two-dimensional estimation of signal parameters via rotational invariance techniques is performed to enhance channel estimation. Finally, at the central server node, we propose a majority voting scheme that enhances activity detection by aggregating backhaul information from multiple satellites. Moreover, multi-satellite cooperative linear data detection and multi-satellite cooperative Bayesian dequantization data detection are proposed to cope with perfect and quantized backhaul, respectively. Simulation results verify the effectiveness of the proposed schemes in terms of channel estimation, activity detection, and data detection for quasi-synchronous random access in satellite systems.
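The majority voting aggregation at the central server can be sketched as follows, assuming each satellite backhauls a binary activity decision per device; the per-satellite detectors themselves are not reproduced here, and the vote matrix is a hypothetical example:

```python
import numpy as np

def majority_vote(decisions):
    """decisions: (S, K) binary matrix; row s holds satellite s's activity decisions.

    A device is declared active if more than half of the S satellites detect it.
    """
    S = decisions.shape[0]
    return (decisions.sum(axis=0) > S / 2).astype(int)

# Three satellites vote on four devices (hypothetical decisions for illustration)
votes = np.array([[1, 0, 1, 0],
                  [1, 0, 0, 0],
                  [1, 1, 1, 0]])
print(majority_vote(votes))   # devices 0 and 2 are declared active
```

Aggregating independent per-satellite decisions this way suppresses isolated detection errors at any single edge node, which is the intuition behind the scheme's improved activity detection.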
This paper proposes an energy-efficient RIS-enabled NOMA communication framework for LEO satellite networks. The proposed framework jointly optimizes the LEO satellite's transmit power toward its ground terminals and the passive beamforming at the RIS while ensuring the quality of service. Due to the nature of the considered system and optimization variables, the energy efficiency maximization problem is non-convex, and obtaining its optimal solution in practice is very challenging. Therefore, we adopt an alternating optimization method that handles the joint optimization in two steps. In step 1, for any given phase shift vector, we compute an efficient satellite transmit power toward each ground terminal using the Lagrangian dual method. Then, in step 2, given the transmit power, we design the passive beamforming at the RIS by solving a semi-definite program. To validate the proposed solution, numerical results are provided to demonstrate the benefits of the proposed optimization framework.
A whole suite of innovative technologies and architectures has emerged in response to the rapid growth of wireless traffic. This paper studies an integrated network design that boosts system capacity through cooperation between wireless access points (APs) and a satellite to enhance the network's spectral efficiency. We first mathematically derive an achievable throughput expression for the uplink (UL) data transmission over spatially correlated Rician channels. Our generic achievable throughput expression applies to arbitrary received-signal detection techniques under realistic imperfect channel estimates. A closed-form expression is then obtained for the ergodic UL data throughput when maximum ratio combining is utilized to detect the desired signals. As for our resource allocation contributions, we formulate the max-min fairness and total transmit power optimization problems, relying on the channel statistics to perform power allocation. The solution of each optimization problem is derived in the form of a low-complexity iterative design, in which each data power variable is updated via a closed-form expression. Our integrated hybrid network concept allows users to be served who might not otherwise be accommodated due to excessive data demands. The proposed algorithms allow us to address the congestion issues that appear when at least one user is served at a rate below the target. The mathematical analysis is also illustrated by our numerical results, which show the added benefit of the space links in terms of improving the ergodic data throughput. Furthermore, the proposed algorithms smoothly circumvent any potential congestion, especially in the face of high rate requirements and weak channel conditions.
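For intuition on the max-min fairness objective, a bisection search sketches how a common user rate can be maximized under a total power budget in an interference-free toy model; the paper's closed-form iterative updates over correlated Rician channels are not reproduced here, and the channel gains, noise power, and budget are illustrative:

```python
import numpy as np

def maxmin_rate(g, sigma2, P_tot, iters=50):
    """Bisection on the common rate t: user k needs p_k = (2^t - 1) * sigma2 / g_k.

    g: per-user channel gains; returns the max-min rate and the power vector.
    """
    lo, hi = 0.0, 30.0                       # bracket on the common rate (bits/s/Hz)
    for _ in range(iters):
        t = 0.5 * (lo + hi)
        p = (2.0 ** t - 1.0) * sigma2 / g    # power required for rate t per user
        if p.sum() <= P_tot:
            lo = t                           # feasible: try a higher common rate
        else:
            hi = t                           # infeasible: lower the target
    return lo, (2.0 ** lo - 1.0) * sigma2 / g

g = np.array([1.0, 0.5, 0.1])                # illustrative gains; weakest user dominates
rate, p = maxmin_rate(g, sigma2=1e-2, P_tot=1.0)
```

At the solution every user achieves the same rate, which is the defining property of the max-min fair allocation: no user's rate can be raised without lowering the minimum.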
This paper first introduces 6G-empowered V2X communications and IRS technology. It then discusses different use-case scenarios of IRS-enabled V2X communications and reports recent advances in the existing literature. Next, we focus on the scenario of vehicular edge computing involving IRS-enabled drone communications, where vehicle computation time is reduced via optimal allocation of computational and communication resources. Finally, this paper highlights current challenges and discusses future perspectives of IRS-enabled V2X communications in order to improve current work and spark new ideas.
As a promising technology for beyond-5G (B5G) and 6G, dual-function radar-communication (DFRC) aims to provide both radar sensing and communication on a single integrated platform with unified signaling schemes. To achieve accurate sensing and reliable communication, large-scale arrays are anticipated to be implemented in such systems, which raises prominent issues of hardware cost and power consumption. To address these issues, hybrid beamforming (HBF), beyond its successful deployment in communication-only systems, is a promising approach for the emerging DFRC systems. In this article, we investigate the development of HBF techniques for DFRC systems in a self-contained manner. Specifically, we first introduce the basics of HBF-based DFRC systems, focusing on the system model and the different receive modes. We then illustrate the corresponding design principles, spanning from performance metrics and optimization formulations to design approaches and our preliminary results. Finally, potential extensions and key research opportunities, such as the combination with reconfigurable intelligent surfaces, are discussed concisely.
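As a generic illustration of the hybrid beamforming factorization underlying such systems, the following sketch alternates a least-squares digital precoder with a phase-only analog projection to approximate a fully digital precoder. This is a standard alternating-minimization heuristic, not the article's DFRC design, and all array dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
Nt, Nrf, Ns = 16, 4, 2                       # tx antennas, RF chains, data streams
F_opt = rng.normal(size=(Nt, Ns)) + 1j * rng.normal(size=(Nt, Ns))
F_opt /= np.linalg.norm(F_opt)               # unit-Frobenius-norm target precoder

# Analog precoder: unit-modulus entries (phase shifters only)
F_rf = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=(Nt, Nrf)))
for _ in range(30):
    F_bb = np.linalg.pinv(F_rf) @ F_opt                   # digital part: least squares
    F_rf = np.exp(1j * np.angle(F_opt @ F_bb.conj().T))   # analog part: keep phases only
F_bb = np.linalg.pinv(F_rf) @ F_opt           # final digital fit for the last F_rf
err = np.linalg.norm(F_opt - F_rf @ F_bb)     # residual of the hybrid factorization
```

The residual measures how closely the phase-constrained analog stage plus the unconstrained digital stage reproduce the fully digital design, which is the trade-off HBF exploits to cut hardware cost.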
This paper studies an integrated network design that boosts system capacity through cooperation between wireless access points (APs) and a satellite. By coherently combining the signals that the central processing unit receives from the users through the space and terrestrial links, we mathematically derive an achievable throughput expression for the uplink (UL) data transmission over spatially correlated Rician channels. A closed-form expression is obtained when maximum ratio combining is employed to detect the desired signals. We formulate the max-min fairness and total transmit power optimization problems, relying on the channel statistics to perform power allocation. The solution of each optimization problem is derived in the form of a low-complexity iterative design, in which each data power variable is updated based on a closed-form expression. The mathematical analysis is validated by numerical results showing the added benefit of a satellite link in terms of improving the ergodic data throughput.