Abstract: Low-Earth-orbit (LEO) inter-satellite links (ISLs) must cope with strongly doubly selective channels and aged channel state information (CSI). In this paper, the term ``sensing'' refers to the receiver-side identifiability of a small set of dominant delay--Doppler path parameters, quantified via CRLB-type proxies, rather than a full-fledged target-sensing pipeline. Affine frequency division multiplexing (AFDM) provides a sparse delay--Doppler (DD) representation well suited to such channels, yet most existing AFDM designs assume ideal CSI, operate on grid-based channel coefficients, and optimize only communication performance. This paper proposes a two-stage AFDM-based integrated sensing and communication (ISAC) framework for mobile LEO ISLs that explicitly operates under predicted CSI. In Stage~I, we model the channel by a small number of dominant specular paths and perform sequence prediction directly on their complex gains, delays, and Dopplers, from which we reconstruct the AFDM DD-domain kernel used as the sole instantaneous CSI at the transmitter. In Stage~II, we design a sensing-aware AFDM pre-equalizer by augmenting the classical minimum mean-square error (MMSE) solution with a term derived from Cramér--Rao-type sensitivity measures evaluated under the predicted channel model; the result is a first-order surrogate of a CRLB-regularized pre-equalizer with a single tuning parameter that controls the communication--sensing tradeoff. Simulation results for representative LEO ISL trajectories show that the proposed path-level predictor improves effective-kernel reconstruction over AFDM-unaware baselines. Under predicted CSI, the sensing-aware pre-equalizer substantially improves sensing-oriented metrics over outdated-CSI baselines while keeping symbol error rates close to those of a communication-oriented MMSE design, at only modest additional complexity.
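To make the Stage~II construction concrete, here is a minimal sketch of a first-order surrogate of the kind described above, under assumed names and shapes: `H_pred` stands for the predicted DD-domain kernel, `D_theta` for a stacked sensitivity matrix $\partial \mathbf{H}/\partial \theta$ of the kernel with respect to the path parameters, and `lam` for the single tuning parameter. The Fisher-information proxy and its gradient are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def mmse_pre_equalizer(H, noise_var):
    """Classical MMSE pre-equalizer W = H^H (H H^H + sigma^2 I)^{-1}."""
    N = H.shape[0]
    return H.conj().T @ np.linalg.inv(H @ H.conj().T + noise_var * np.eye(N))

def sensing_aware_pre_equalizer(H_pred, D_theta, noise_var, lam):
    """First-order surrogate of a CRLB-regularized pre-equalizer (sketch):
    start from the MMSE solution under the *predicted* kernel H_pred and
    take one gradient step that increases a Fisher-information proxy
    J(W) = tr(D^H W W^H D), where D = dH/dtheta collects sensitivities of
    the DD-domain kernel to the path parameters.  lam = 0 recovers the
    pure communication-oriented MMSE design."""
    W = mmse_pre_equalizer(H_pred, noise_var)
    grad_J = D_theta @ D_theta.conj().T @ W   # d/dW* of tr(D^H W W^H D)
    return W + lam * grad_J

# Toy usage with a random 8x8 predicted kernel and two path sensitivities:
rng = np.random.default_rng(0)
H_pred = (rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))) / 4
D_theta = (rng.standard_normal((8, 2)) + 1j * rng.standard_normal((8, 2))) / 4
W = sensing_aware_pre_equalizer(H_pred, D_theta, noise_var=0.1, lam=0.05)
```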
Abstract: Ka-band low-Earth-orbit (LEO) downlinks can suffer second-scale reliability collapses during flare-driven ionospheric disturbances, where fixed fade margins and reactive adaptive coding and modulation (ACM) are either overly conservative or too slow. This paper presents a GNSS-free, link-internal predictive controller that senses the same downlink via a geometry-free dual-carrier phase observable at 10~Hz. A high-pass filter and template-based onset detector, followed by a four-state nearly-constant-velocity Kalman filter, estimate $\Delta$VTEC and its rate; a short look-ahead (60~s) then yields an endpoint outage probability used as a risk gate to trigger a one-step discrete MCS down-switch and a pilot-time update with hysteresis. Evaluation uses physics-informed log replay driven by real GOES X-ray flare morphologies under a disjoint-day frozen-calibration protocol, with uncertainty reported via a paired moving-block bootstrap. Across stressed 60~s windows, the controller reduces peak BLER by 25--30\% and increases goodput by 0.10--0.15~bps/Hz versus no-adaptation baselines under a unified link-level abstraction. The loop runs in $\mathcal{O}(1)$ per 0.1~s epoch (about 0.042~ms measured), making on-board implementation feasible; scope and deployment considerations for dispersion-dominated events are also discussed.
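The predict-then-gate loop can be sketched as follows. The paper's filter carries four states, whose exact layout the abstract does not specify, so this illustration uses a two-state nearly-constant-velocity core $[\Delta\text{VTEC}, \text{rate}]$; the noise levels and gate thresholds are assumed values, not the paper's calibration.

```python
import numpy as np
from scipy.stats import norm

DT = 0.1          # 10 Hz epoch
LOOKAHEAD = 60.0  # look-ahead horizon, s

# Two-state NCV core [dVTEC, rate] (simplified from the four-state filter).
F = np.array([[1.0, DT], [0.0, 1.0]])
q = 1e-4  # process-noise spectral density (assumed)
Q = q * np.array([[DT**3 / 3, DT**2 / 2], [DT**2 / 2, DT]])
H = np.array([[1.0, 0.0]])
R = np.array([[1e-3]])  # phase-observable noise variance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle of the NCV Kalman filter."""
    x, P = F @ x, F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

def endpoint_outage_prob(x, P, threshold):
    """Gaussian tail probability that dVTEC exceeds threshold 60 s ahead."""
    F60 = np.array([[1.0, LOOKAHEAD], [0.0, 1.0]])
    m = (F60 @ x)[0]
    v = (F60 @ P @ F60.T)[0, 0]
    return 1.0 - norm.cdf(threshold, loc=m, scale=np.sqrt(v))

def risk_gate(p_out, switched_down, up=0.2, down=0.05):
    """One-step MCS down-switch with hysteresis (thresholds assumed)."""
    if not switched_down and p_out > up:
        return True    # switch down one MCS step
    if switched_down and p_out < down:
        return False   # recover
    return switched_down
```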

Abstract: Internet of Agents (IoA) envisions a unified, agent-centric paradigm in which heterogeneous large language model (LLM) agents interconnect and collaborate at scale. Within this paradigm, federated learning (FL) serves as a key enabler that allows distributed LLM agents to co-train global models without centralizing data. However, FL-enabled IoA systems remain vulnerable to model poisoning attacks, and the prevailing distance- and similarity-based defenses become fragile at billion-parameter scale and under heterogeneous data distributions. This paper proposes a graph representation-based model poisoning (GRMP) attack, which passively exploits observed benign local models to construct a parameter correlation graph and extends an adversarial variational graph autoencoder to capture and reshape higher-order dependencies. The GRMP attack synthesizes malicious local models that preserve benign-like statistics while embedding adversarial objectives, thereby evading detection at the server. Experiments demonstrate a gradual drop in system accuracy under the proposed attack and the ineffectiveness of the prevailing defense mechanism in detecting it, underscoring a severe threat to the ambitious IoA paradigm.
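As a rough illustration of the two ingredients named above, the sketch below builds a parameter correlation graph from observed benign updates and then synthesizes a statistics-preserving malicious update. The adversarial VGAE is replaced here by a simple moment-matching step, so this is a stand-in for the idea (benign-like statistics plus an embedded adversarial direction), not the GRMP generator itself; `tau` and `eps` are hypothetical knobs.

```python
import numpy as np

def correlation_graph(benign, tau=0.5):
    """Adjacency over parameter coordinates from observed benign models.

    benign: (n_clients, d) matrix of flattened local updates (in practice
    one would work on parameter blocks, not raw billion-scale coordinates).
    Edges connect coordinates whose empirical correlation across clients
    exceeds the threshold tau (assumed)."""
    C = np.corrcoef(benign.T)             # (d, d) parameter correlation
    return (np.abs(C) > tau).astype(float)

def synthesize_malicious(benign, adv_direction, eps=0.1):
    """Moment-matching stand-in for the VGAE synthesis step: the crafted
    update keeps the benign mean and per-coordinate spread (so distance-
    and similarity-based defenses see a typical client) while leaning by
    eps toward an adversarial objective direction."""
    mu, sigma = benign.mean(axis=0), benign.std(axis=0) + 1e-12
    z = np.random.randn(benign.shape[1])   # benign-like fluctuation
    crafted = mu + sigma * z               # matches first two moments
    step = adv_direction / np.linalg.norm(adv_direction)
    return crafted + eps * sigma * step    # small, statistics-preserving shift
```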
Abstract: Terahertz inter-satellite links enable unprecedented sensing precision for Low Earth Orbit (LEO) constellations, yet face fundamental bounds from hardware impairments, pointing errors, and network interference. We develop a Network Cram\'er--Rao Lower Bound (N-CRLB) framework incorporating dynamic topology, a hardware quality factor $\Gamma_{\text{eff}}$, phase noise $\sigma^2_\phi$, and cooperative effects through recursive Fisher information analysis. Our analysis reveals three key insights: (i) hardware and phase noise create power-independent performance ceilings ($\sigma_{\text{ceiling}} \propto \sqrt{\Gamma_{\text{eff}}}$) and floors ($\sigma_{\text{floor}} \propto \sqrt{\sigma^2_\phi}/f_c$), with power-only scaling saturating above $\text{SNR}_{\text{crit}}=1/\Gamma_{\text{eff}}$; (ii) interference coefficients $\alpha_{\ell m}$ enable opportunistic sensing, with demonstrated gains of 5.5~dB under specific conditions (65~dB processing gain, 50~dBi antennas); (iii) measurement correlations from shared timing references, when properly modeled, do not degrade performance and can provide common-mode rejection benefits relative to mismodeled independent-noise baselines. Sub-millimeter ranging requires co-optimized hardware ($\Gamma_{\text{eff}}<0.01$), oscillators ($\sigma^2_\phi<10^{-2}$), and appropriate 3D geometry configurations.
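The three scalings in insight (i) can be reproduced with a toy single-link error model; the constants and the way the two terms combine below are illustrative assumptions, not the paper's N-CRLB. An effective SNR that saturates at $1/\Gamma_{\text{eff}}$ yields the $\sqrt{\Gamma_{\text{eff}}}$ ceiling, and a carrier-phase term yields the $\sqrt{\sigma^2_\phi}/f_c$ floor.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def ranging_std(snr, gamma_eff, sigma2_phi, f_c, beta_rms):
    """Toy single-link ranging-error model consistent with the abstract's
    scalings (the combination rule and constants are assumptions):

      snr_eff     = snr / (1 + gamma_eff * snr)      # saturates at 1/gamma_eff,
                                                     # so gains stop above SNR_crit
      sigma_est   ~ c / (2*pi*beta_rms*sqrt(snr_eff))  # -> ceiling ~ sqrt(gamma_eff)
      sigma_floor ~ c * sqrt(sigma2_phi) / (2*pi*f_c)  # phase-noise floor ~ sqrt(s2)/f_c
    """
    snr_eff = snr / (1.0 + gamma_eff * snr)
    sigma_est = C / (2 * np.pi * beta_rms * np.sqrt(snr_eff))
    sigma_floor = C * np.sqrt(sigma2_phi) / (2 * np.pi * f_c)
    return np.hypot(sigma_est, sigma_floor)

# Example: sweeping power shows saturation once gamma_eff * snr >> 1.
for snr_db in (10, 30, 50, 70):
    s = ranging_std(10**(snr_db / 10), gamma_eff=0.01,
                    sigma2_phi=1e-2, f_c=300e9, beta_rms=10e9)
    print(f"SNR {snr_db:2d} dB -> sigma ~ {s*1e3:.3f} mm")
```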