Low earth orbit (LEO) satellite constellation-enabled communication networks are expected to be an important part of many Internet of Things (IoT) deployments due to their unique advantage of providing seamless global coverage. In this paper, we investigate the random access problem in massive multiple-input multiple-output-based LEO satellite systems, where a multi-satellite cooperative processing mechanism is considered. Specifically, at the edge satellite nodes, we conceive a training-sequence-padded multi-carrier system to overcome the issue of imperfect synchronization, where the training sequence is utilized to detect the devices' activity and estimate their channels. Considering the inherent sparsity of terrestrial-satellite links and the sporadic traffic of IoT terminals, we utilize the orthogonal approximate message passing-multiple measurement vector (OAMP-MMV) algorithm to estimate the delay coefficients and user terminal activity. To further exploit the structure of the receive array, a two-dimensional estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithm is performed to enhance channel estimation. Finally, at the central server node, we propose a majority voting scheme that enhances activity detection by aggregating backhaul information from multiple satellites. Moreover, multi-satellite cooperative linear data detection and multi-satellite cooperative Bayesian dequantization data detection are proposed to cope with perfect and quantized backhaul, respectively. Simulation results verify the effectiveness of the proposed schemes in terms of channel estimation, activity detection, and data detection for quasi-synchronous random access in satellite systems.
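The majority-voting fusion step above admits a compact illustration. The following minimal sketch (in Python with NumPy; the function name and the hard-decision input format are our own assumptions, since the actual scheme may fuse soft metrics instead) declares a device active when a strict majority of satellites report it active:

    import numpy as np

    def majority_vote(decisions):
        """Fuse per-satellite binary activity decisions at the central server.

        decisions: (S, K) array, decisions[s, k] = 1 if satellite s declares
        device k active, else 0. Returns length-K fused decisions.
        """
        S = decisions.shape[0]
        votes = decisions.sum(axis=0)        # satellites voting "active" per device
        return (votes > S / 2).astype(int)   # active iff a strict majority agrees

    # Example: 3 satellites, 4 devices
    d = np.array([[1, 0, 1, 0],
                  [1, 0, 0, 0],
                  [1, 1, 1, 0]])
    print(majority_vote(d))  # -> [1 0 1 0]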
This paper proposes a unified semi-blind detection framework for sourced and unsourced random access (RA), which enables next-generation ultra-reliable low-latency communications (URLLC) with massive devices. Specifically, the active devices transmit their uplink access signals in a grant-free manner to realize ultra-low access latency. Meanwhile, the base station aims to achieve ultra-reliable data detection under severe inter-device interference without exploiting explicit channel state information (CSI). We first propose an efficient transmitter design, where a small amount of reference information (RI) is embedded in the access signal to resolve the inherent ambiguities incurred by the unknown CSI. At the receiver, we further develop a successive interference cancellation-based semi-blind detection scheme, where a bilinear generalized approximate message passing algorithm is utilized for joint channel and signal estimation (JCSE), while the embedded RI is exploited for ambiguity elimination. In particular, a rank selection approach and an RI-aided initialization strategy are incorporated to reduce the computational complexity and to enhance the JCSE reliability, respectively. In addition, four enabling techniques are integrated to satisfy the stringent latency and reliability requirements of massive URLLC. Numerical results demonstrate that the proposed semi-blind detection framework offers a better scalability-latency-reliability tradeoff than state-of-the-art detection schemes dedicated to sourced or unsourced RA.
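Since bilinear JCSE can recover the channel and signal only up to an unknown per-device scaling, the embedded RI is what pins the estimate down. A minimal sketch of this idea follows (the function name and the least-squares scalar-ambiguity model are our assumptions; the actual scheme may handle richer ambiguity types):

    import numpy as np

    def resolve_scalar_ambiguity(x_est, ri_est, ri_true):
        """Remove the per-device complex scaling left by bilinear estimation.

        x_est:   estimated data symbols of one device (up to a scalar alpha)
        ri_est:  estimated reference-information (RI) symbols of that device
        ri_true: the known RI symbols embedded at the transmitter
        """
        # Least-squares estimate of the unknown scale: ri_est ~= alpha * ri_true
        alpha = (ri_true.conj() @ ri_est) / (ri_true.conj() @ ri_true)
        return x_est / alpha

    rng = np.random.default_rng(0)
    ri = np.exp(1j * np.pi / 4) * np.ones(8)                    # hypothetical RI block
    x = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], 32) / np.sqrt(2)
    alpha = 0.8 * np.exp(1j * 1.3)                              # unknown ambiguity
    print(np.allclose(resolve_scalar_ambiguity(alpha * x, alpha * ri, ri), x))  # True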
The capacity of commercial massive multiple-input multiple-output (mMIMO) systems is constrained by the limited array aperture at the base station and cannot meet the ever-increasing traffic demands of wireless networks. For a given array aperture, holographic MIMO with infinitesimal antenna spacing can maximize the capacity, but it is physically unrealizable. As a promising alternative, reconfigurable mMIMO is proposed to harness the unexploited power of the electromagnetic (EM) domain for enhanced information transfer. Specifically, the reconfigurable pixel antenna technology provides each antenna with an adjustable EM radiation (EMR) pattern, introducing extra degrees of freedom for information transfer in the EM domain. In this article, we present the concept and benefits of exploiting the EMR domain for mMIMO transmission. Moreover, we propose a viable architecture for reconfigurable mMIMO systems, and discuss the associated system model and downlink precoding. In particular, a three-level precoding scheme is proposed, and simulation results verify its considerable spectral and energy efficiency advantages over traditional mMIMO systems. Finally, we discuss the challenges, insights, and prospects of deploying reconfigurable mMIMO, along with the associated hardware, algorithms, and fundamental theory.
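To make the extra EM-domain degrees of freedom concrete, the toy sketch below cascades EMR pattern selection with conventional digital precoding. It is entirely our own simplification: it selects one EMR codeword for the whole array and then applies zero-forcing, whereas the article's three-level scheme adapts each antenna's pattern individually.

    import numpy as np

    def emr_then_zf(H_patterns):
        """Toy cascade: EMR pattern selection, then zero-forcing precoding.

        H_patterns: (P, K, M) effective K-user x M-antenna downlink channels,
        one per candidate EMR codeword p (hypothetical codebook model).
        """
        gains = np.linalg.norm(H_patterns, axis=(1, 2))
        H = H_patterns[np.argmax(gains)]                 # pick the strongest pattern
        W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # ZF digital precoder
        return W / np.linalg.norm(W)                     # normalize transmit power

    H = (np.random.randn(8, 4, 16) + 1j * np.random.randn(8, 4, 16)) / np.sqrt(2)
    print(emr_then_zf(H).shape)  # (16, 4): one precoding vector per user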
To provide seamless coverage during all flight phases, aeronautical communication systems (ACS) have to integrate space-based, air-based, and ground-based platforms to form aviation-oriented space-air-ground integrated networks (SAGINs). In continental areas, L-band aeronautical broadband communications (ABC) are gaining popularity for supporting air traffic management (ATM) modernization. However, L-band ABC faces the challenges of spectrum congestion and severe interference from legacy systems. To circumvent these challenges, we propose a novel multiple-antenna-aided L-band ABC paradigm to tackle the key issues of reliable and high-rate air-to-ground (A2G) transmission. Specifically, we first introduce the development roadmap of ABC. Furthermore, we discuss the peculiarities of the L-band ABC propagation environment and the distinctive challenges of the associated multiple-antenna techniques. To overcome these challenges, we propose an advanced multiple-antenna-assisted L-band ABC paradigm from the perspectives of channel estimation, reliable transmission, and multiple access. Finally, we shed light on compelling research directions for the aviation component of SAGINs.
With the blooming of the Internet of Things (IoT), we are witnessing an explosion in the number of IoT terminals, triggering an unprecedented demand for ubiquitous wireless access globally. In this context, the emerging low-Earth-orbit satellites (LEO-SATs) have been regarded as a promising enabler to complement terrestrial wireless networks in providing ubiquitous connectivity and bridging the ever-growing digital divide in next-generation wireless communications. Nevertheless, the stringent requirements posed by LEO-SATs have imposed significant challenges on the current multiple access schemes and led to an emerging paradigm shift in system design. In this article, we first provide a comprehensive overview of the state-of-the-art multiple access schemes and investigate their limitations in the context of LEO-SATs. To address these limitations, we propose the amalgamation of the grant-free non-orthogonal multiple access (GF-NOMA) paradigm and the orthogonal time frequency space (OTFS) waveform, which simplifies the connection procedure with reduced access latency and enhanced Doppler robustness. Finally, critical open challenges and future research directions are presented for further technical development.
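The Doppler robustness of OTFS stems from multiplexing data on the delay-Doppler grid and mapping it to the time-frequency grid via the inverse symplectic finite Fourier transform (ISFFT). A minimal, unnormalized sketch of this mapping and its inverse (the grid sizes below are arbitrary choices for illustration):

    import numpy as np

    def isfft(X_dd):
        """Map delay-Doppler symbols to the time-frequency grid (ISFFT).
        X_dd: (N, M) array of N Doppler bins x M delay bins."""
        return np.fft.fft(np.fft.ifft(X_dd, axis=0), axis=1)

    def sfft(X_tf):
        """Inverse mapping (SFFT) back to the delay-Doppler grid."""
        return np.fft.fft(np.fft.ifft(X_tf, axis=1), axis=0)

    # Round trip recovers the QPSK delay-Doppler grid exactly
    rng = np.random.default_rng(1)
    X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=(16, 64)) / np.sqrt(2)
    print(np.allclose(sfft(isfft(X)), X))  # True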
Reconfigurable intelligent surfaces (RISs) can significantly enhance the service coverage of terahertz massive multiple-input multiple-output (MIMO) communication systems. However, obtaining accurate high-dimensional channel state information (CSI) with limited pilot and feedback signaling overhead is challenging, which severely degrades the performance of conventional spatial division multiple access. To improve the robustness against CSI imperfection, this paper proposes a deep learning (DL)-based rate-splitting multiple access (RSMA) scheme for RIS-aided terahertz multi-user MIMO systems. Specifically, we first propose a hybrid data-model-driven DL-based RSMA precoding scheme, including the passive precoding at the RIS as well as the analog active precoding and the RSMA digital active precoding at the base station (BS). To realize the passive precoding at the RIS, we propose a Transformer-based data-driven RIS reflecting network (RRN). For the analog active precoding at the BS, we propose a matched-filter-based analog precoding scheme, considering that the BS and RIS adopt the LoS-MIMO antenna array architecture. For the RSMA digital active precoding at the BS, we propose a low-complexity approximate weighted minimum mean square error (AWMMSE) digital precoding scheme. Furthermore, for better precoding performance and lower computational complexity, a model-driven deep unfolding active precoding network (DFAPN) is designed by combining the proposed AWMMSE scheme with DL. Finally, to acquire accurate CSI at the BS so that the proposed RSMA precoding scheme can achieve higher spectral efficiency, we propose a CSI acquisition network (CAN) with low pilot and feedback signaling overhead, in which the downlink pilot transmission, the CSI feedback at the user equipments (UEs), and the CSI reconstruction at the BS are modeled as an end-to-end neural network based on the Transformer architecture.
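At the heart of RSMA is the superposition of a common stream, decoded by all UEs, with private streams recovered after successive interference cancellation. A minimal sketch of the transmit side (the precoder and symbol names are generic placeholders; the AWMMSE scheme above concerns how the precoders themselves are optimized):

    import numpy as np

    def rsma_transmit(w_c, W_p, s_c, s_p):
        """Superpose the RSMA common and private streams at the BS.

        w_c: (M,) common-stream precoder;   s_c: common symbol (scalar)
        W_p: (M, K) private precoders;      s_p: (K,) private symbols
        """
        return w_c * s_c + W_p @ s_p

    # Each UE k receives y_k = h_k^H x + n_k, first decodes s_c while treating
    # the private streams as noise, subtracts h_k^H w_c * s_c, then decodes s_p[k].
    M, K = 16, 4
    rng = np.random.default_rng(2)
    x = rsma_transmit(rng.standard_normal(M) + 0j,
                      rng.standard_normal((M, K)) + 0j,
                      1 + 0j, np.ones(K) + 0j)
    print(x.shape)  # (16,)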
Massive connectivity for extra-large multiple-input multiple-output (XL-MIMO) systems is a challenging issue due to the prohibitive cost and the near-field non-stationary channels. In this paper, we propose an uplink grant-free massive access scheme for XL-MIMO systems, in which a mixed analog-to-digital converter (ADC) architecture is adopted to strike the right balance between access performance and energy cost. By exploiting the spatial-domain structured sparsity and the piecewise angular-domain cluster sparsity of massive access channels, a compressive sensing (CS)-based two-stage orthogonal approximate message passing algorithm is proposed to efficiently solve the joint activity detection and channel estimation problem. In particular, high-precision quantized measurements are leveraged to perform accurate hyper-parameter estimation, thereby facilitating activity detection. Moreover, we adopt a subarray-wise estimation strategy to overcome the severe angular-domain energy dispersion caused by the spatial non-stationarity of near-field XL-MIMO channels. Simulation results verify the superiority of the proposed algorithm over state-of-the-art CS algorithms for massive access based on XL-MIMO with mixed-ADC architectures.
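The spatial-domain structured sparsity means an inactive device contributes an (effectively) all-zero row to the device-by-antenna channel matrix, so activity can be read off the row energies of the estimate. The sketch below is a deliberately simplified stand-in for the algorithm's final decision step (the threshold rule and margin factor are our own assumptions):

    import numpy as np

    def detect_activity(H_est, sigma2, margin=3.0):
        """Declare device k active if its estimated channel row carries energy.

        H_est:  (K, M) estimated channels (devices x BS antennas); rows of
                inactive devices are (near-)zero under structured sparsity.
        sigma2: estimated per-entry noise variance; margin is hypothetical.
        """
        energy = np.mean(np.abs(H_est) ** 2, axis=1)   # per-device row energy
        return (energy > margin * sigma2).astype(int)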
The fourth industrial revolution, i.e., Industry 4.0, is evolving all around the globe. In this article, we introduce the landscape of Industry 4.0 and beyond, empowered by the seamless collaboration of communication technology (CT), information technology (IT), and operation technology (OT), i.e., CIOT collaboration. Specifically, CIOT collaboration is regarded as a major advance of Industry 4.0 over the previous industrial revolutions. We commence by reviewing the previous three industrial revolutions and argue that the key feature of Industry 4.0 is the CIOT collaboration. More particularly, the CT domain supports ubiquitous connectivity of the industrial elements and further bridges the physical world and the cyber world, which is a pivotal prerequisite. Then, we present the potential impacts of CIOT collaboration on typical industrial use cases, with the objective of creating a more intelligent and human-friendly industry. Furthermore, the technical challenges of paving the way for CIOT collaboration, with an emphasis on the CT domain, are discussed. Finally, we shed light on a roadmap for Industry 4.0 and beyond, highlighting the salient steps to be taken in future CIOT collaboration, which are expected to expedite the paradigm shift towards the next industrial revolution.
6G wireless networks are foreseen to speed up the convergence of the physical and cyber worlds and to enable a paradigm shift in the way we deploy and exploit communication networks. Machine learning, in particular deep learning (DL), is going to be one of the key technological enablers of 6G by offering a new paradigm for the design and optimization of networks with a high level of intelligence. In this article, we introduce an emerging DL architecture, known as the transformer, and discuss its potential impact on 6G network design. We first discuss the differences between the transformer and classical DL architectures, and emphasize the transformer's self-attention mechanism and strong representation capabilities, which make it particularly appealing for tackling various challenges in wireless network design. Specifically, we propose transformer-based solutions for massive multiple-input multiple-output (MIMO) systems and for various semantic communication problems in 6G networks. Finally, we discuss key challenges and open issues in transformer-based solutions, and identify future research directions for their deployment in intelligent 6G networks.
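The self-attention mechanism highlighted above lets every element of a sequence weight every other element when forming its representation. A minimal single-head sketch (the projection shapes and names are generic, not tied to any of the proposed 6G solutions):

    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        """Single-head scaled dot-product self-attention.

        X: (T, d) input sequence; Wq, Wk, Wv: (d, d_k) projection matrices.
        """
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[1])             # pairwise relevance
        A = np.exp(scores - scores.max(axis=1, keepdims=True))
        A /= A.sum(axis=1, keepdims=True)                  # row-wise softmax
        return A @ V                                       # attention-weighted values

    T, d, dk = 5, 8, 4
    rng = np.random.default_rng(3)
    out = self_attention(rng.standard_normal((T, d)),
                         *(rng.standard_normal((d, dk)) for _ in range(3)))
    print(out.shape)  # (5, 4)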