This paper investigates the massive connectivity of low Earth orbit (LEO) satellite-based Internet-of-Things (IoT) for seamless global coverage. We propose to integrate the grant-free non-orthogonal multiple access (GF-NOMA) paradigm with the emerging orthogonal time frequency space (OTFS) modulation to accommodate massive IoT access and to mitigate the long round-trip latency and severe Doppler effects of terrestrial-satellite links (TSLs). On this basis, we put forward a two-stage successive active terminal identification (ATI) and channel estimation (CE) scheme, as well as a low-complexity multi-user signal detection (SD) method. Specifically, at the first stage, the proposed training sequence aided OTFS (TS-OTFS) data frame structure facilitates joint ATI and coarse CE, whereby both the traffic sparsity of the terrestrial IoT terminals and the sparse channel impulse response are leveraged for enhanced performance. Moreover, based on the single Doppler shift of each TSL and the sparsity of the delay-Doppler domain channel, we develop a parametric approach to further refine the CE performance. Finally, a least squares based parallel time domain SD method is developed to detect the OTFS signals with relatively low complexity. Simulation results demonstrate the superiority of the proposed methods over state-of-the-art solutions in terms of ATI, CE, and SD performance in the presence of long round-trip latency and severe Doppler effects.
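As a rough illustration of the final detection step, the sketch below performs least-squares detection with `numpy.linalg.lstsq`. It is a generic toy, with a random channel matrix `H`, BPSK symbols, and assumed dimensions standing in for one parallel time-domain sub-block; it is not the paper's exact OTFS signal model, which additionally exploits the OTFS block structure for low complexity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares detection: an overdetermined linear model y = Hx + n
# stands in for one parallel time-domain sub-block (illustrative only).
M, K = 128, 64
H = rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))
x = (2 * rng.integers(0, 2, K) - 1).astype(complex)            # BPSK symbols
n = 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
y = H @ x + n

# LS estimate x_hat = argmin_x ||y - Hx||^2, followed by a hard decision
x_ls, *_ = np.linalg.lstsq(H, y, rcond=None)
x_hat = np.sign(x_ls.real)

accuracy = np.mean(x_hat == x.real)   # fraction of correctly detected symbols
print(accuracy)
```

At this noise level the overdetermined system is well conditioned, so the hard decisions recover the transmitted symbols.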
Reconfigurable Intelligent Surfaces (RIS) are planar structures connected to electronic circuitry that can steer electromagnetic signals in a controlled manner, substantially improving the signal quality and the effective data rate. While the benefits of RIS-assisted wireless communications have been investigated for various scenarios, some aspects of network design, such as coverage and the optimal placement of the RIS, often require complex optimization and numerical simulations, since the achievable effective rate is difficult to predict. The problem becomes even harder in the presence of phase estimation errors or location uncertainty, which can lead to substantial performance degradation if neglected. Considering randomly distributed receivers within a ring-shaped RIS-assisted wireless network, this paper investigates the effective rate while taking the above-mentioned impairments into account. Furthermore, exact closed-form expressions for the effective rate are derived in terms of Meijer's $G$-function, which (i) reveal that location and phase estimation uncertainty must be carefully accounted for when deploying RIS in wireless networks; and (ii) facilitate future network design and performance prediction.
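Closed-form rate expressions of this kind are typically evaluated numerically through Meijer's $G$-function. As a minimal sketch, the snippet below uses `mpmath.meijerg` with the textbook identity $\ln(1+z)=G^{1,2}_{2,2}\!\left(z \,\middle|\, {1,1 \atop 1,0}\right)$, which commonly appears in ergodic-rate derivations; it is a generic sanity check, not the paper's specific expressions.

```python
from mpmath import mp, meijerg, log, mpf

mp.dps = 30  # working precision (decimal digits)

# Textbook identity often used in ergodic-rate derivations:
#   ln(1+z) = G^{1,2}_{2,2}( z | 1,1 ; 1,0 )
def ln1p_via_meijerg(z):
    # a_s = [[a_1..a_n], [a_{n+1}..a_p]], b_s = [[b_1..b_m], [b_{m+1}..b_q]]
    return meijerg([[1, 1], []], [[1], [0]], z)

z = mpf("2.5")
val = ln1p_via_meijerg(z)
ref = log(1 + z)
print(val, ref)   # the two values agree to working precision
```

The same call pattern evaluates the higher-order $G$-functions that appear in closed-form rate results, with the parameter lists filled in from the derived expression.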
In millimeter-wave (mmWave) dual-function radar-communication (DFRC) systems, hybrid beamforming (HBF) is recognized as a promising technique that requires only a limited number of radio frequency chains. In this work, in the presence of an extended target and clutter, an HBF design based on the subarray connection architecture is proposed for a multiple-input multiple-output (MIMO) DFRC system. The double-phase-shifter (DPS) structure is embedded in this HBF to further increase the design flexibility. We derive the communication spectral efficiency (SE) and the radar signal-to-interference-plus-noise ratio (SINR) with respect to the transmit HBF and the radar receiver, and formulate the HBF design problem as SE maximization subject to radar SINR and power constraints. To solve the formulated nonconvex problem, the joinT Hybrid bEamforming and Radar rEceiver OptimizatioN (THEREON) algorithm is proposed, in which the radar receiver is optimized via the generalized eigenvalue decomposition, and the transmit HBF is updated with low complexity in a parallel manner using the consensus alternating direction method of multipliers (consensus-ADMM). Furthermore, we extend the proposed method to the multi-user multiple-input single-output (MU-MISO) scenario. Numerical simulations demonstrate the efficacy of the proposed algorithm and show that the solution provides a good trade-off between the number of phase shifters and the performance gain of the DPS HBF.
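The radar-receiver step admits a compact illustration: a filter maximizing a generalized Rayleigh quotient $\mathbf{w}^{H}\mathbf{A}\mathbf{w}/\mathbf{w}^{H}\mathbf{B}\mathbf{w}$ is the principal generalized eigenvector of the matrix pair $(\mathbf{A},\mathbf{B})$. The toy covariances below are assumptions for illustration, not the paper's extended-target and clutter model.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
N = 8

# Toy covariances (illustrative): A ~ target signal covariance (rank-1 here),
# B ~ clutter-plus-noise covariance (Hermitian positive definite).
s = rng.standard_normal(N) + 1j * rng.standard_normal(N)
A = np.outer(s, s.conj())
C = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
B = C @ C.conj().T + np.eye(N)

# Receive filter maximizing SINR(w) = (w^H A w) / (w^H B w):
# the principal generalized eigenvector of (A, B).
eigvals, eigvecs = eigh(A, B)          # eigenvalues in ascending order
w = eigvecs[:, -1]                     # eigenvector of the largest eigenvalue

sinr = np.real(w.conj() @ A @ w) / np.real(w.conj() @ B @ w)
print(sinr, eigvals[-1])  # attained SINR equals the largest generalized eigenvalue
```

Within an alternating scheme such as THEREON, this closed-form receiver update is what makes the radar-side subproblem cheap relative to the transmit HBF update.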
Reconfigurable intelligent surfaces (RISs) have recently gained significant interest as an emerging technology for future wireless networks. This paper studies an RIS-assisted propagation environment, where a single-antenna source transmits data to a single-antenna destination in the presence of a weak direct link. We analyze and compare RIS designs based on long-term and short-term channel statistics in terms of coverage probability and ergodic rate. For the considered optimization designs, closed-form expressions for the coverage probability and ergodic rate are derived. Numerical simulations are used to corroborate the analytical results in the finite-sample regime. We also show that the considered optimal phase shift designs outperform several heuristic benchmarks.
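A minimal Monte Carlo sketch of the two metrics is given below, assuming independent Rayleigh fading, a weak direct link, and short-term (instantaneous) phase alignment at the RIS; the number of elements, transmit SNR, and rate threshold are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)
M = 32            # number of RIS elements (assumed)
trials = 20_000
snr_tx = 10.0     # transmit SNR, linear scale (assumed)
rate_th = 1.0     # target rate [bit/s/Hz] for the coverage event (assumed)

def rayleigh(shape):
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

# Channels: weak direct link h_d, source->RIS h, RIS->destination g.
h_d = 0.1 * rayleigh(trials)
h = rayleigh((trials, M))
g = rayleigh((trials, M))

# With per-element phase shifts chosen to co-phase every reflected path with
# the direct link, the composite amplitude is |h_d| + sum_i |h_i||g_i|.
amp = np.abs(h_d) + np.sum(np.abs(h) * np.abs(g), axis=1)
rate = np.log2(1 + snr_tx * amp**2)

ergodic_rate = np.mean(rate)           # ergodic rate estimate
coverage = np.mean(rate >= rate_th)    # coverage probability estimate
print(ergodic_rate, coverage)
```

Closed-form expressions, as derived in the paper, replace such simulations once validated against them.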
Non-orthogonal multiple access (NOMA) is expected to provide high spectral efficiency (SE) and massive connectivity in future wireless networks. On the other hand, backscatter communications (BC) is an emerging technology for battery-free transmission that leverages ambient radio frequency (RF) waves to enable communications among wireless devices. This paper proposes a new optimization framework to maximize the SE of a NOMA-BC network. In particular, we simultaneously optimize the transmit power of the base station and the reflection coefficient of the backscatter device in each cell under the assumption of imperfect successive interference cancellation decoding. The SE optimization problem is coupled across multiple variables, which makes it very difficult to solve. Thus, we apply a decomposition method and the KKT conditions to obtain an efficient solution. Simulation results demonstrate the superiority of the proposed NOMA-BC framework over the benchmark NOMA framework without BC.
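To make the objective concrete, the sketch below evaluates the sum SE of a two-user toy model with standard NOMA SINR expressions, a backscatter reflection coefficient `beta` that boosts the effective channel gains, and a residual-interference term `eps` for imperfect SIC. All gains, power fractions, and the coupling model are illustrative assumptions, not the paper's formulation, and the one-dimensional grid search stands in for the decomposition/KKT solution.

```python
import numpy as np

# Toy single-cell NOMA-BC spectral-efficiency model (illustrative only).
P = 10.0                    # BS transmit power (assumed)
a1, a2 = 0.8, 0.2           # NOMA power fractions: weak user, strong user
g1, g2 = 0.3, 1.0           # direct channel gains (assumed)
f1, f2 = 0.05, 0.1          # backscatter link gains to each user (assumed)
eps = 0.05                  # residual interference after imperfect SIC
sigma2 = 0.1                # noise power

def sum_se(beta):
    h1 = g1 + beta * f1     # effective gains including the reflected path
    h2 = g2 + beta * f2
    sinr1 = a1 * P * h1 / (a2 * P * h1 + sigma2)          # weak user
    sinr2 = a2 * P * h2 / (eps * a1 * P * h2 + sigma2)    # strong user, imperfect SIC
    return np.log2(1 + sinr1) + np.log2(1 + sinr2)

# One-dimensional grid search over the reflection coefficient
betas = np.linspace(0, 1, 1001)
se = np.array([sum_se(b) for b in betas])
best = betas[np.argmax(se)]
print(best, se.max())
```

In this decoupled toy both SINRs grow with `beta`, so the search saturates at full reflection; the interesting trade-offs in the paper arise from the joint coupling with the transmit power.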
Source localization plays a key role in many applications, including radar and wireless and underwater communications. Among the various localization methods, the most popular are those based on Time-Of-Arrival (TOA), Time-Difference-Of-Arrival (TDOA), and Received Signal Strength (RSS). Since the Cram\'{e}r-Rao lower bounds (CRLBs) of these methods depend explicitly on the sensor geometry, sensor placement becomes a crucial issue in source localization applications. In this paper, we consider finding the optimal sensor placements for the TOA, TDOA, and RSS based localization scenarios. We first unify the three localization models through a generalized problem formulation based on a CRLB-related metric. Then, a unified optimization framework for optimal sensor placement (UTMOST) is developed through the combination of the alternating direction method of multipliers (ADMM) and majorization-minimization (MM) techniques. Unlike the majority of state-of-the-art works, the proposed UTMOST neither approximates the design criterion nor considers only uncorrelated measurement noise. It can readily adapt to different design criteria (i.e., A-, D-, and E-optimality) with slight modifications within the framework, and yields the corresponding optimal sensor placements. Extensive numerical experiments are performed to exhibit the efficacy and flexibility of the proposed framework.
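For the TOA case with uncorrelated noise, the CRLB-related metric is easy to evaluate: with unit direction vectors $\mathbf{u}_i$ from the sensors to the source, the Fisher information matrix is proportional to $\sum_i \mathbf{u}_i\mathbf{u}_i^{T}$. The sketch below (an assumed 2-D geometry with normalized constants, only *evaluating* the metric rather than optimizing placements as UTMOST does) compares the A-optimality value $\mathrm{trace}(\mathbf{J}^{-1})$ of a uniform circular placement against a clustered one.

```python
import numpy as np

c = 1.0        # propagation speed (normalized)
sigma2 = 1e-2  # TOA measurement noise variance (assumed)

def toa_crlb_trace(source, sensors):
    """A-optimality metric trace(J^{-1}) for 2-D TOA localization."""
    diffs = source - sensors                                  # (K, 2)
    u = diffs / np.linalg.norm(diffs, axis=1, keepdims=True)  # unit directions
    J = (u.T @ u) / (c**2 * sigma2)                           # Fisher information
    return np.trace(np.linalg.inv(J))

source = np.array([0.0, 0.0])
K = 4

# Sensors uniformly spread on the unit circle vs. clustered in a narrow arc
ang_good = 2 * np.pi * np.arange(K) / K
ang_bad = 0.2 * np.arange(K) / K
good = np.stack([np.cos(ang_good), np.sin(ang_good)], axis=1)
bad = np.stack([np.cos(ang_bad), np.sin(ang_bad)], axis=1)

tr_good = toa_crlb_trace(source, good)
tr_bad = toa_crlb_trace(source, bad)
print(tr_good, tr_bad)   # the spread geometry yields a much smaller bound
```

The uniform geometry gives $\mathbf{J} \propto \tfrac{K}{2}\mathbf{I}$, hence $\mathrm{trace}(\mathbf{J}^{-1}) = 4\sigma^2/K$, while the clustered arc makes $\mathbf{J}$ nearly singular; this geometry dependence is exactly what motivates optimizing the placement.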
High-throughput satellite communications systems are growing in strategic importance thanks to their role in delivering broadband services to mobile platforms and to residences and businesses in rural and remote regions globally. Although precoding has emerged as a prominent technique to meet ever-increasing user demands, there is a lack of studies dealing with congestion control. This paper enhances the performance of multi-beam high-throughput geostationary (GEO) satellite systems under congestion, where the users' quality of service (QoS) demands cannot be fully satisfied with limited resources. In particular, we propose congestion control strategies relying on simple power control schemes. We formulate a multi-objective optimization framework balancing the system sum rate and the number of users whose QoS requirements are satisfied. Next, we propose two novel approaches that effectively handle the formulated multi-objective optimization problem. The first is a model-based approach that relies on the weighted sum method to increase the number of satisfied users by solving a series of sum-rate optimization problems in an iterative manner. The second is a data-driven approach that offers a low-cost solution by utilizing supervised learning and exploiting the optimization structures as continuous mappings. The proposed general framework is evaluated for different linear precoding techniques, for which low-complexity algorithms are designed. Numerical results show that our framework effectively handles the congestion issue and yields greater rate-satisfaction improvements for more users than previous works. Furthermore, the proposed algorithms have low run-times, which makes them suitable for practical systems.
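The weighted sum method can be illustrated on a generic two-objective power-control toy: two users share a power budget, and a sum-rate objective is traded off against a worst-user-rate proxy for QoS satisfaction by sweeping the weight. All channel gains and values here are illustrative assumptions, not the paper's GEO system model or its precoded rate expressions.

```python
import numpy as np

# Two-user power split: user 1 gets p1, user 2 gets P - p1 (toy model).
P = 1.0
g = np.array([4.0, 0.5])       # channel gains (assumed)
p1 = np.linspace(0, P, 2001)

rates = np.stack([np.log2(1 + g[0] * p1), np.log2(1 + g[1] * (P - p1))])
f1 = rates.sum(axis=0)         # objective 1: sum rate
f2 = rates.min(axis=0)         # objective 2: worst-user rate (QoS proxy)

# Weighted-sum scalarization: maximize w*f1 + (1-w)*f2 for a sweep of weights
pareto = []
for w in np.linspace(0, 1, 11):
    idx = np.argmax(w * f1 + (1 - w) * f2)
    pareto.append((p1[idx], f1[idx], f2[idx]))

for p, s, m in pareto:
    print(f"p1={p:.3f}  sum-rate={s:.3f}  min-rate={m:.3f}")
```

As the weight on the sum rate grows, the selected operating points trace a Pareto front from fairness-favoring to throughput-favoring allocations, which is the balance the paper's framework exploits at network scale.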
Optimal allocation of shared resources is key to delivering the promise of jointly operating radar and communications systems. In this paper, unlike prior works that examine synergistic access to resources in colocated joint radar-communications or among identical systems, we investigate this problem for a distributed system comprising heterogeneous radars and multi-tier communications. In particular, we focus on resource allocation in the context of multi-target tracking (MTT) while maintaining stable communication connections. By simultaneously allocating the available power, dwell time, and shared bandwidth, we improve the MTT performance under a Bayesian tracking framework and guarantee the communications throughput. Our alternating allocation of heterogeneous resources (ANCHOR) approach solves the resulting nonconvex problem via alternating optimization, which monotonically improves the Bayesian Cram\'er-Rao bound. Numerical experiments demonstrate that ANCHOR significantly improves the tracking error over two baseline allocations and remains stable under different target scenarios and radar-communications network distributions.
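The monotone-improvement property of alternating optimization, which ANCHOR relies on, can be seen on a toy two-variable quadratic: each block update exactly minimizes the objective over one variable with the other fixed, so the objective value never increases. This is a sketch of the principle only; the paper's objective is the Bayesian Cram\'er-Rao bound over power, dwell time, and bandwidth, not this toy function.

```python
# Biconvex (here jointly convex) toy objective
def f(x, y):
    return (x - 1) ** 2 + (y + 2) ** 2 + x * y

x, y = 5.0, 5.0
vals = [f(x, y)]
for _ in range(20):
    x = (2 - y) / 2          # argmin over x with y fixed: 2(x-1) + y = 0
    y = -(4 + x) / 2         # argmin over y with x fixed: 2(y+2) + x = 0
    vals.append(f(x, y))

print(vals[0], vals[-1])     # the objective decreases monotonically
```

For nonconvex problems the same scheme guarantees monotone improvement (hence convergence of the objective values) rather than global optimality, which is why such methods pair naturally with bound-based metrics like the Bayesian CRB.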
Due to spectrum scarcity, the coexistence of radar and wireless communications has recently gained substantial research interest. Among many scenarios, the heterogeneously distributed joint radar-communication system is promising due to its flexibility and compatibility with existing architectures. In this paper, we focus on a heterogeneous radar and communication network (HRCN), which consists of various generic radars for multiple-target tracking (MTT) and wireless communications for multiple users. We aim to improve the MTT performance and maintain good throughput levels for the communication users through a well-designed resource allocation. The problem is formulated as a Bayesian Cram\'er-Rao bound (CRB) minimization subject to resource budgets and throughput constraints. The formulated nonconvex problem is solved by an alternating descent-ascent approach. Numerical results demonstrate the efficacy of the proposed allocation scheme for this heterogeneous network.