Abstract: Low Earth orbit (LEO) inter-satellite links (ISLs) must achieve joint synchronization and ranging under severe hardware impairments, namely oscillator phase noise, clock drift, and measurement outliers, exacerbated by rapid relative dynamics exceeding 7~km/s. In coherent Doppler processing, the frequency observable depends on the \emph{difference} between consecutive carrier phase states, creating a cross-epoch coupling structure that fundamentally affects estimation-theoretic performance limits. This paper makes three contributions. First, we prove analytically that this cross-epoch Doppler coupling is \emph{necessary} to avoid unbounded carrier phase uncertainty: without it, phase variance grows linearly without bound. Second, we derive a posterior Cramér-Rao bound (PCRB) via the Tichavský recursion that explicitly incorporates the resulting 10$\times$10 block information structure. Third, we propose a hybrid robust filtering framework combining hard gating for impulsive cycle-slip outliers with Huber M-estimation for heavy-tail contamination, using TASD-aware innovation covariance to account for cross-epoch uncertainty in residual normalization. Monte Carlo simulations at Ka-band confirm that the PCRB accurately lower-bounds estimator performance under nominal conditions, while the hybrid method reduces 95th-percentile phase error by 27--93\% compared to standard extended Kalman filtering across different outlier regimes.
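The hard-gating-plus-Huber combination described above can be sketched as a weight on the normalized innovation of a scalar measurement. This is an illustrative reconstruction, not the paper's implementation; the gate and Huber thresholds (`gate`, `huber_k`) are placeholder choices.

```python
import math

def hybrid_robust_weight(residual, innov_var, gate=5.0, huber_k=1.345):
    """Return a measurement weight in [0, 1] for a scalar innovation.

    Hard gating rejects gross outliers (e.g. cycle slips) outright;
    Huber M-estimation down-weights moderate heavy-tail contamination.
    Thresholds `gate` and `huber_k` are illustrative, not values from
    the paper.
    """
    d = abs(residual) / math.sqrt(innov_var)   # normalized innovation
    if d > gate:          # impulsive outlier: reject the measurement
        return 0.0
    if d <= huber_k:      # inlier: full Kalman update
        return 1.0
    return huber_k / d    # heavy tail: Huber down-weighting
```

In a filter, the returned weight would scale the Kalman gain (or equivalently inflate the measurement variance), with the innovation variance supplied by the filter's predicted covariance, which is where the cross-epoch "TASD-aware" term would enter.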
Abstract: We present a policy-aware, cross-layer methodology for edge-side auditing of service tiering and quota-based throttling in Starlink. Using a multi-week plan-hopping campaign (232.8 h) on a UK residential terminal, we align 1 Hz terminal telemetry with host-side probes to obtain portal-labeled traces spanning priority (pre-quota), post-quota throttling, stay-active operation, and residential service. With portal status as the only ground truth (independent of throughput), we show these policy regimes manifest as distinct signatures in goodput, PoP RTT, and an internal-to-user ratio $R=C_{\mathrm{int}}/T_{\mathrm{user}}$. A lightweight rule on windowed medians separates high-speed from low-rate operation without operator visibility.
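A rule on windowed medians of the kind described can be sketched as follows. The thresholds and window contents here are placeholders for illustration, not the paper's calibrated values.

```python
from statistics import median

def classify_window(goodput_mbps, r_ratio, rate_thresh=5.0, r_thresh=1.5):
    """Classify a telemetry window as high-speed or low-rate operation.

    `goodput_mbps` and `r_ratio` hold per-second samples over one window;
    `rate_thresh` and `r_thresh` are illustrative placeholders. Windowed
    medians keep isolated spikes from flipping the label.
    """
    g_med = median(goodput_mbps)
    r_med = median(r_ratio)
    # Low goodput together with an inflated internal-to-user ratio
    # R = C_int / T_user signals throttled (post-quota) operation
    # rather than transient congestion.
    if g_med < rate_thresh and r_med > r_thresh:
        return "low-rate"
    return "high-speed"
```

For example, a window with median goodput near 90 Mbps and $R \approx 1$ would label as high-speed, while one near 1 Mbps with $R \approx 3$ would label as low-rate.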
Abstract: Ka-band low-Earth-orbit (LEO) downlinks can suffer second-scale reliability collapses during flare-driven ionospheric disturbances, where fixed fade margins and reactive adaptive coding and modulation (ACM) are either overly conservative or too slow. This paper presents a GNSS-free, link-internal predictive controller that senses the same downlink via a geometry-free dual-carrier phase observable at 10~Hz. A high-pass filter and template-based onset detector, followed by a four-state nearly-constant-velocity Kalman filter, estimate $\Delta$VTEC and its rate; a short look-ahead (60~s) yields an endpoint outage probability used as a risk gate to trigger a one-step discrete MCS down-switch and pilot-time update with hysteresis. Evaluation uses physics-informed log replay driven by real GOES X-ray flare morphologies under a disjoint-day frozen-calibration protocol, with uncertainty reported via paired moving-block bootstrap. Across stressed 60~s windows, the controller reduces peak BLER by 25--30\% and increases goodput by 0.10--0.15~bps/Hz versus no-adaptation baselines under a unified link-level abstraction. The loop runs in $\mathcal{O}(1)$ per 0.1~s epoch (about 0.042~ms measured), making on-board implementation feasible, and scope and deployment considerations for dispersion-dominated events are discussed.
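The risk-gate step, a look-ahead extrapolation of $\Delta$VTEC turned into an outage probability with switching hysteresis, can be sketched as below. The threshold, prediction uncertainty, and hysteresis levels are assumed values for illustration, not the paper's calibration.

```python
import math

def outage_risk_gate(dvtec, dvtec_rate, sigma_pred,
                     threshold_tecu=2.0, lookahead_s=60.0,
                     p_on=0.2, p_off=0.05, switched=False):
    """One-step risk gate on a 60 s Delta-VTEC look-ahead (sketch).

    `dvtec`/`dvtec_rate` would come from the tracking filter and
    `sigma_pred` is the predicted endpoint standard deviation; all
    thresholds here are placeholder assumptions.
    """
    endpoint = dvtec + dvtec_rate * lookahead_s          # linear extrapolation
    # Gaussian tail probability that the endpoint crosses the outage threshold
    z = (threshold_tecu - endpoint) / sigma_pred
    p_outage = 0.5 * math.erfc(z / math.sqrt(2.0))
    # Hysteresis: down-switch at p_on, recover only once below p_off
    if not switched and p_outage > p_on:
        return True, p_outage     # down-switch MCS, extend pilot time
    if switched and p_outage < p_off:
        return False, p_outage    # recover nominal MCS
    return switched, p_outage
```

The two thresholds (`p_on` > `p_off`) implement the hysteresis mentioned in the abstract, preventing rapid MCS toggling near the decision boundary.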




Abstract: Internet of Agents (IoA) envisions a unified, agent-centric paradigm where heterogeneous large language model (LLM) agents can interconnect and collaborate at scale. Within this paradigm, federated learning (FL) serves as a key enabler that allows distributed LLM agents to co-train global models without centralizing data. However, the FL-enabled IoA system remains vulnerable to model poisoning attacks, and the prevailing distance- and similarity-based defenses become fragile at billion-parameter scale and under heterogeneous data distributions. This paper proposes a graph representation-based model poisoning (GRMP) attack, which passively exploits observed benign local models to construct a parameter correlation graph and extends an adversarial variational graph autoencoder to capture and reshape higher-order dependencies. The GRMP attack synthesizes malicious local models that preserve benign-like statistics while embedding adversarial objectives, remaining elusive to detection at the server. Experiments demonstrate a gradual drop in system accuracy under the proposed attack and show that the prevailing defense mechanism fails to detect it, underscoring a severe threat to the ambitious IoA paradigm.
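The first stage of the attack, building a parameter correlation graph from passively observed benign local models, can be sketched as follows; the adversarial variational graph autoencoder stage is omitted, and the edge threshold `tau` is an illustrative assumption.

```python
import numpy as np

def benign_correlation_graph(models, tau=0.5):
    """Build a parameter correlation graph from benign local models (sketch).

    `models` is an (n_clients, n_params) array of observed benign model
    vectors; nodes are parameter coordinates, and an edge connects two
    coordinates whose cross-client correlation magnitude exceeds `tau`.
    """
    corr = np.corrcoef(models.T)             # (n_params, n_params) correlations
    adj = (np.abs(corr) > tau).astype(float) # threshold into an adjacency matrix
    np.fill_diagonal(adj, 0.0)               # no self-loops
    return adj
```

A generative model trained on such a graph can then synthesize parameter vectors that reproduce these benign pairwise statistics, which is what makes distance- and similarity-based screening at the server unreliable.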
Abstract: Terahertz inter-satellite links enable unprecedented sensing precision for Low Earth Orbit (LEO) constellations, yet face fundamental bounds from hardware impairments, pointing errors, and network interference. We develop a Network Cram\'er-Rao Lower Bound (N-CRLB) framework incorporating dynamic topology, hardware quality factor $\Gamma_{\text{eff}}$, phase noise $\sigma^2_\phi$, and cooperative effects through recursive Fisher Information analysis. Our analysis reveals three key insights: (i) hardware and phase noise create power-independent performance ceilings ($\sigma_{\text{ceiling}} \propto \sqrt{\Gamma_{\text{eff}}}$) and floors ($\sigma_{\text{floor}} \propto \sqrt{\sigma^2_\phi}/f_c$), with power-only scaling saturating above $\text{SNR}_{\text{crit}}=1/\Gamma_{\text{eff}}$; (ii) interference coefficients $\alpha_{\ell m}$ enable opportunistic sensing with demonstrated gains of 5.5~dB under specific conditions (65~dB processing gain, 50~dBi antennas); (iii) measurement correlations from shared timing references, when properly modeled, do not degrade performance and can provide common-mode rejection benefits compared to mismodeled independent-noise baselines. Sub-millimeter ranging requires co-optimized hardware ($\Gamma_{\text{eff}}<0.01$), oscillators ($\sigma^2_\phi<10^{-2}$), and appropriate 3D geometry configurations.
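The hardware-induced ceiling and the critical SNR can be illustrated numerically with a standard distortion-limited SINR model, $\mathrm{SINR}_{\text{eff}} = \mathrm{SNR}/(1 + \Gamma_{\text{eff}}\,\mathrm{SNR})$, under which the error flattens to $\propto \sqrt{\Gamma_{\text{eff}}}$ above $\mathrm{SNR}_{\text{crit}} = 1/\Gamma_{\text{eff}}$. The constant `k` is a placeholder absorbing bandwidth and geometry factors, not a value from the paper.

```python
import math

def ranging_std(snr_linear, gamma_eff, k=1.0):
    """Illustrative ranging error with a hardware-induced ceiling.

    Effective SINR saturates as SNR / (1 + gamma_eff * SNR), so
    sigma = k / sqrt(SINR_eff) flattens to k * sqrt(gamma_eff) above
    SNR_crit = 1 / gamma_eff. `k` is a placeholder constant.
    """
    sinr_eff = snr_linear / (1.0 + gamma_eff * snr_linear)
    return k / math.sqrt(sinr_eff)

gamma = 0.01                      # Gamma_eff; SNR_crit = 100 (20 dB)
for snr_db in (10, 20, 30, 40):   # power-only scaling saturates past 20 dB
    print(snr_db, ranging_std(10 ** (snr_db / 10), gamma))
```

Below $\mathrm{SNR}_{\text{crit}}$ the error still falls with power; beyond it, only reducing $\Gamma_{\text{eff}}$ (better hardware) lowers the ceiling, consistent with the co-optimization requirement stated above.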




Abstract: Molecular communication (MC) provides a foundational framework for information transmission in the Internet of Bio-Nano Things (IoBNT), where efficiency and reliability are crucial. However, the inherent limitations of molecular channels, such as low transmission rates, noise, and inter-symbol interference (ISI), constrain their ability to support complex data transmission. This paper proposes an end-to-end semantic learning framework designed to optimize task-oriented molecular communication, with a focus on biomedical diagnostic tasks under resource-constrained conditions. The proposed framework employs a deep encoder-decoder architecture to efficiently extract, quantize, and decode semantic features, prioritizing task-relevant semantic information to enhance diagnostic classification performance. Additionally, a probabilistic channel network is introduced to approximate molecular propagation dynamics, enabling gradient-based optimization for end-to-end learning. Experimental results demonstrate that the proposed semantic framework improves diagnostic accuracy by at least 25\% compared to conventional JPEG compression with LDPC coding methods under resource-constrained communication scenarios.
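The role of the probabilistic channel network, a differentiable stand-in for molecular propagation, can be sketched with a minimal surrogate: diffusion-induced ISI as a short causal tap response plus Gaussian counting noise. The tap values and noise level are assumptions for illustration, not the paper's fitted channel network.

```python
import numpy as np

def molecular_channel_surrogate(symbols, isi_taps=(1.0, 0.4, 0.15),
                                noise_std=0.1, rng=None):
    """Probabilistic surrogate of a molecular channel (illustrative sketch).

    Diffusion-induced ISI is approximated by causal taps and counting
    noise by additive Gaussian noise, giving a differentiable stand-in
    for end-to-end training. Taps and noise level are placeholders.
    """
    rng = np.random.default_rng() if rng is None else rng
    received = np.convolve(symbols, isi_taps)[: len(symbols)]  # causal ISI
    return received + rng.normal(0.0, noise_std, size=len(symbols))
```

Because every operation here is smooth in the transmitted symbols, gradients can flow from a task loss at the decoder back through the surrogate to the semantic encoder, which is the property the abstract's end-to-end learning relies on.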