The Metaverse seamlessly blends the physical world and virtual space via ubiquitous communication and computing infrastructure. In transportation systems, the vehicular Metaverse can provide a fully immersive and hyperreal traveling experience (e.g., via augmented reality head-up displays, AR-HUDs) to drivers and users in autonomous vehicles (AVs) through roadside units (RSUs). However, provisioning real-time and immersive services necessitates effective physical-virtual synchronization between physical and virtual entities, i.e., AVs and Metaverse AR recommenders (MARs). In this paper, we propose a generative-AI-empowered physical-virtual synchronization framework for the vehicular Metaverse. In physical-to-virtual synchronization, digital twin (DT) tasks generated by AVs are offloaded for execution at RSUs with future route generation. In virtual-to-physical synchronization, MARs customize diverse and personalized AR recommendations via generative AI models based on user preferences. Furthermore, we propose a multi-task enhanced auction-based mechanism to match and price AVs and MARs so that RSUs can provision real-time and effective services. Finally, property analysis and experimental results demonstrate that the proposed mechanism is strategy-proof and adverse-selection free while increasing social surplus by 50%.
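The strategy-proofness property claimed above can be illustrated with a toy second-price matching auction; the bidder names and valuations below are hypothetical, and the paper's multi-task enhanced mechanism is considerably more elaborate than this sketch.

```python
# Toy second-price auction: an RSU sells one synchronization slot to
# competing AV-MAR pairs. Charging the winner the second-highest bid
# makes truthful bidding a dominant strategy (strategy-proofness).
# All bidder names and valuations are illustrative.

def second_price_auction(bids):
    """bids: dict mapping bidder -> bid value.
    Returns (winner, price) with price equal to the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

bids = {"pair_A": 9.0, "pair_B": 7.5, "pair_C": 4.0}
winner, price = second_price_auction(bids)
# winner == "pair_A", paying the runner-up bid of 7.5
```

Because the winner's payment does not depend on its own bid, overstating or understating a valuation cannot improve a bidder's utility, which is the intuition behind the strategy-proofness analyzed in the paper.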
To provide seamless coverage during all flight phases, aeronautical communications systems (ACS) have to integrate space-based, air-based, and ground-based platforms to form aviation-oriented space-air-ground integrated networks (SAGINs). In continental areas, L-band aeronautical broadband communications (ABC) are gaining popularity for supporting air traffic management (ATM) modernization. However, L-band ABC faces the challenges of spectrum congestion and severe interference from legacy systems. To circumvent these challenges, we propose a novel multiple-antenna-aided L-band ABC paradigm to tackle the key issues of reliable and high-rate air-to-ground (A2G) transmission. Specifically, we first introduce the development roadmap of ABC. Furthermore, we discuss the peculiarities of the L-band ABC propagation environment and the distinctive challenges of the associated multiple-antenna techniques. To overcome these challenges, we propose an advanced multiple-antenna-assisted L-band ABC paradigm from the perspectives of channel estimation, reliable transmission, and multiple access. Finally, we shed light on compelling research directions for the aviation component of SAGINs.
In recent years, the exponential proliferation of smart devices and their intelligent applications has posed severe challenges to conventional cellular networks. Such challenges can potentially be overcome by integrating communication, computing, caching, and control (i4C) technologies. In this survey, we first give a snapshot of different aspects of i4C, comprising the background, motivation, leading technological enablers, potential applications, and use cases. Next, we describe different models of communication, computing, caching, and control (4C) to lay the foundation of the integration approach. We review current state-of-the-art research efforts related to i4C, focusing on recent trends of both conventional and artificial intelligence (AI)-based integration approaches. We also highlight the need for intelligence in resource integration. Then, we discuss integrated sensing and communication (ISAC) and classify the integration approaches into various classes. Finally, we outline open challenges and present future research directions for beyond-5G networks, such as 6G.
Semantic communication is viewed as a revolutionary paradigm that can potentially transform how we design and operate wireless communication systems. However, despite a recent surge of research activities in this area, the research landscape remains limited. In this tutorial, we present the first rigorous vision of a scalable end-to-end semantic communication network that is founded on novel concepts from artificial intelligence (AI), causal reasoning, and communication theory. We first discuss how the design of semantic communication networks requires a move from data-driven networks towards knowledge-driven ones. Subsequently, we highlight the necessity of creating semantic representations of data that satisfy the key properties of minimalism, generalizability, and efficiency so as to do more with less. We then explain how such representations can form the basis of a so-called semantic language. By using semantic representations and languages, we show that the traditional transmitter and receiver now become a teacher and an apprentice. Then, we define the concept of reasoning by investigating the fundamentals of causal representation learning and its role in designing semantic communication networks. We demonstrate that reasoning faculties are chiefly characterized by the ability to capture causal and associational relationships in data streams. For such reasoning-driven networks, we propose novel and essential semantic communication metrics, including new "reasoning capacity" measures that could go beyond Shannon's bound to capture the convergence of computing and communication. Finally, we explain how semantic communications can be scaled to large-scale networks (6G and beyond). In a nutshell, we expect this tutorial to provide a comprehensive reference on how to properly build, analyze, and deploy future semantic communication networks.
In this paper, we propose to deploy multiple unmanned aerial vehicle (UAV)-mounted base stations to serve ground users in outdoor environments with obstacles. In particular, geographic information is employed to capture the blockage effects on air-to-ground (A2G) links caused by buildings, and a realistic blockage-aware A2G channel model is proposed to characterize the continuous variation of the channels at different locations. Based on the proposed channel model, we formulate the joint optimization problem of UAV three-dimensional (3-D) positioning and resource allocation, comprising power allocation, user association, and subcarrier allocation, to maximize the minimum achievable rate among users. To solve this non-convex combinatorial programming problem, we introduce a penalty term to relax it and develop a suboptimal solution via a penalty-based double-loop iterative optimization framework. The inner loop solves the penalized problem by employing the block successive convex approximation (BSCA) technique, where the UAV positioning and resource allocation are alternately optimized in each iteration. The outer loop aims to obtain proper penalty multipliers so that the solution of the penalized problem converges to that of the original problem. Simulation results demonstrate the superiority of the proposed algorithm over benchmark schemes in terms of the minimum achievable rate.
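The double-loop structure described above can be sketched on a toy problem: minimize (x-3)^2 subject to x <= 1. The inner loop minimizes the penalized objective for a fixed multiplier (a simple gradient descent stands in for the BSCA step), and the outer loop increases the multiplier until the constraint violation vanishes. All functions and step sizes here are illustrative, not the paper's formulation.

```python
# Toy penalty-based double-loop optimizer for: min (x-3)^2  s.t.  x <= 1.
# Penalized objective: f(x) = (x-3)^2 + rho * max(0, x-1)^2.
# Illustrative stand-in for the BSCA inner loop of the paper.

def inner_loop(rho, x, iters=2000):
    """Gradient descent on the penalized objective for fixed rho."""
    lr = 0.4 / (1.0 + rho)  # step size shrinks as the penalty stiffens
    for _ in range(iters):
        grad = 2 * (x - 3) + 2 * rho * max(0.0, x - 1)
        x -= lr * grad
    return x

def penalty_double_loop(x=0.0, rho=1.0, tol=1e-4):
    for _ in range(20):               # outer loop: tighten the penalty
        x = inner_loop(rho, x)        # inner loop: solve penalized problem
        if max(0.0, x - 1) < tol:     # constraint (nearly) satisfied: stop
            break
        rho *= 10.0                   # otherwise increase the multiplier
    return x

x_star = penalty_double_loop()
# x_star approaches the constrained optimum x = 1
```

For a fixed rho the penalized minimizer is (3 + rho)/(1 + rho), which tends to the constrained optimum x = 1 as rho grows; this mirrors how the outer loop's multiplier updates drive the penalized solution toward that of the original problem.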
In recent years, unmanned aerial vehicle (UAV)-assisted mobile edge computing (MEC) systems have been exploited by researchers as a promising solution for providing computation services to mobile users outside terrestrial infrastructure coverage. However, it remains challenging for standalone MEC-enabled UAVs to meet the computation requirements of numerous mobile users, due to the limited computation capacity of their onboard servers and their limited battery life. Therefore, we propose a collaborative scheme among UAVs so that busy UAVs can share their workload with idle UAVs. Moreover, current task offloading strategies frequently overlook task topology, which may result in poor performance or even system failure. To address this problem, we consider offloading tasks that consist of a set of sub-tasks, where each sub-task may depend on other sub-tasks, which is practical in the real world. A sub-task with dependencies must wait for the resulting signals from its preceding sub-tasks before being executed, and this mechanism strongly affects the offloading strategy. We then formulate an optimization problem to minimize the average latency experienced by users by jointly controlling the offloading decisions for dependent tasks and allocating the communication resources of UAVs. The formulated problem is NP-hard and cannot be solved in polynomial time. Therefore, we divide it into two sub-problems: the offloading decision problem and the communication resource allocation problem. A meta-heuristic method is proposed to find a sub-optimal solution of the task offloading problem, while the communication resource allocation problem is solved via convex optimization. Finally, we perform substantial simulation experiments, and the results show that the proposed offloading technique effectively minimizes the average latency of users compared with other benchmark schemes.
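The waiting mechanism for dependent sub-tasks can be made concrete with a small earliest-finish-time computation over the sub-task DAG: a sub-task starts only after all of its predecessors have finished. The task graph and execution times below are hypothetical, and the sketch ignores transmission delays and server assignment, which the paper's formulation also optimizes.

```python
# Earliest finish times for a dependent sub-task DAG (topological sweep).
# Each sub-task starts only after all its predecessors finish.
# Graph and execution times are illustrative.
from collections import deque

def earliest_finish(exec_time, preds):
    """exec_time: dict sub-task -> execution time.
    preds: dict sub-task -> list of predecessor sub-tasks.
    Returns dict sub-task -> earliest finish time."""
    succs = {t: [] for t in exec_time}
    indeg = {t: len(preds.get(t, [])) for t in exec_time}
    for t, ps in preds.items():
        for p in ps:
            succs[p].append(t)
    ready = deque(t for t, d in indeg.items() if d == 0)
    finish = {}
    while ready:
        t = ready.popleft()
        # Wait for the slowest predecessor's resulting signal.
        start = max((finish[p] for p in preds.get(t, [])), default=0.0)
        finish[t] = start + exec_time[t]
        for s in succs[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return finish

# Diamond-shaped task: B and C depend on A; D depends on both B and C.
times = {"A": 1.0, "B": 2.0, "C": 3.0, "D": 1.0}
deps = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
fin = earliest_finish(times, deps)
# fin["D"] == 5.0: D must wait for the slower branch A -> C (1 + 3)
```

The overall task latency is governed by the longest dependency chain rather than the sum of sub-task times, which is why ignoring the task topology can badly misestimate the latency an offloading decision actually achieves.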
Facilitated by the rapid technological development of near-space platform stations (NSPS), near-space communication (NS-COM) is envisioned to play a pivotal role in the space-air-ground integrated network for sixth-generation (6G) communications and beyond. In NS-COM, ultra-broadband wireless connectivity between NSPSs and various airborne/spaceborne platforms is required for a plethora of bandwidth-consuming applications, such as NSPS-based ad hoc networking, in-flight Internet, and relaying. However, this requirement conflicts with the scarcity of spectrum resources at conventional microwave frequencies, which motivates the exploitation of the terahertz (THz) band ranging from 0.1 to 10 THz. Thanks to the huge available bandwidth, THz signals can support ultra-high-rate data transmission beyond 100 Gb/s for NS-COM, and they are naturally suited to the near-space environment with its marginal path loss. To this end, this article provides an extensive investigation of THz-band NS-COM (THz-NS-COM) from a physical-layer perspective. Firstly, we summarize the potential applications of THz communications in the near-space environment and analyze the corresponding technical barriers. Then the channel characteristics of THz-NS-COM and the corresponding modeling strategies are discussed. Afterwards, three essential research directions are investigated to surpass the technical barriers of THz-NS-COM, i.e., robust beamforming for ultra-massive antenna arrays, signal processing algorithms against hybrid distortions, and integrated sensing and communications. Several open problems are also provided to unleash the full potential of THz-NS-COM.
With the rapid development of satellite communication technologies, the space-based access network has been envisioned as a promising complementary part of the future 6G network. Aside from terrestrial base stations, satellite nodes, especially low-earth-orbit (LEO) satellites, can also serve as base stations for Internet access and constitute the LEO-satellite-based access network (LEO-SAN). LEO-SAN is expected to provide seamless massive access and extended coverage with high signal quality. However, its practical implementation still faces significant technical challenges, e.g., the high mobility and limited communication payload budget of LEO satellite nodes. This paper aims at revealing the main technical issues that have not been fully addressed by existing LEO-SAN designs, from three aspects: random access, beam management, and Doppler-resistant transmission technologies. More specifically, the critical issues of random access in LEO-SAN are discussed regarding low flexibility, long transmission delay, and inefficient handshakes. Then beam management for LEO-SAN is investigated in complex propagation environments under the constraints of high mobility and limited payload budget. Furthermore, the influence of Doppler shifts on LEO-SAN is explored. Correspondingly, promising technologies to address these challenges are also discussed. Finally, future research directions are envisioned.
Precipitated by the technological innovations of near-space platform stations (NSPS), the near-space communication (NS-COM) network has emerged as an indispensable part of the next-generation space-air-ground integrated network (SAGIN) that facilitates ubiquitous coverage and broadband data transfer. This paper aims to provide a comprehensive overview of NS-COM. Firstly, we investigate the differences between NS-COM and the existing terrestrial cellular networks as well as satellite-based and unmanned-aerial-vehicle (UAV)-based communication networks, followed by a review of NS-COM development. Then, we explore the unique characteristics of NS-COM regarding the platforms and the propagation environment of near space. The main issues of NS-COM are identified, which result from the extremely long transmission distance, the limitations of the communication payloads on NSPSs, and the complex atmospheric constitution of near space. Various application scenarios of NS-COM are then discussed, where the special technical requirements are also revealed, from physical-layer aspects such as transceiver design to upper-layer aspects such as computational offloading and NSPS placement. Furthermore, we investigate the coexistence of NS-COM and ground networks, in which they may treat each other as interferers or as collaborators. Finally, we list several potential technologies for NS-COM from the perspective of spectrum usage and highlight their technical challenges for future research.
Automatic modulation classification is of crucial importance in wireless communication networks. Deep-learning-based automatic modulation classification schemes have attracted extensive attention due to their superior accuracy. However, such data-driven methods rely on a large number of training samples, and their classification accuracy is poor at low signal-to-noise ratio (SNR). To tackle these problems, a novel data-and-knowledge dual-driven automatic modulation classification scheme based on radio frequency machine learning is proposed by exploiting the attribute features of different modulations. A visual model is utilized to extract visual features, an attribute learning model is used to learn attribute semantic representations, and a transformation model is proposed to map the attribute representations into the visual space. Extensive simulation results demonstrate that our proposed automatic modulation classification scheme achieves better classification accuracy than the benchmark schemes, especially at low SNR. Moreover, the confusion among high-order modulations is reduced by our proposed scheme compared with traditional schemes.
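The attribute-based classification idea above can be sketched as follows: a transformation maps a visual feature into the attribute semantic space, and the signal is labeled with the modulation whose attribute vector lies nearest. The attribute table, the feature values, and the identity transformation below are all toy stand-ins; in the actual scheme both the feature extractor and the transformation model are learned networks.

```python
# Minimal sketch of attribute-space nearest-prototype classification.
# The attribute vectors, feature values, and transformation matrix W
# are illustrative toy values, not learned models.
import numpy as np

# Hypothetical attribute semantic vectors per modulation, e.g.
# (amplitude levels, phase levels, has-quadrature-amplitude flag).
attributes = {
    "BPSK":  np.array([1.0, 2.0, 0.0]),
    "QPSK":  np.array([1.0, 4.0, 0.0]),
    "16QAM": np.array([4.0, 4.0, 1.0]),
}

def classify(visual_feat, W):
    """Project a visual feature into attribute space via W and return
    the modulation with the nearest attribute prototype."""
    z = W @ visual_feat
    return min(attributes, key=lambda m: np.linalg.norm(z - attributes[m]))

W = np.eye(3)  # toy transformation; a real W would be learned
pred = classify(np.array([1.1, 3.9, 0.05]), W)
# pred == "QPSK": the projected feature is nearest to (1, 4, 0)
```

Because class decisions are made against semantic attribute prototypes rather than purely learned decision boundaries, the knowledge side of the dual-driven scheme can still separate modulations when noisy, low-SNR features blur the purely data-driven cues.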