Abstract: In continuous aperture arrays (CAPAs), careful treatment of the underlying physics is essential; in particular, electromagnetic (EM) mutual coupling plays a critical role in beamforming performance. Building on a physically consistent mutual coupling model, the beamforming design is formulated as a functional optimization whose optimality condition leads to a Fredholm integral equation. Incorporating the coupling model, however, substantially increases computational complexity, necessitating efficient and accurate integral equation solvers. In this letter, we propose two such solvers: 1) a coordinate-transformation-based kernel approximation that preserves the operator structure while alleviating discretization demands, and 2) a direct lower-upper (LU) factorization-based solver that stably handles the Nyström-discretized system. Numerical results demonstrate improved accuracy and reduced computational overhead compared with conventional methods, with the LU-based solver emerging as an efficient and scalable solution for large-scale CAPA optimization via offline factorization.
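The Nyström-plus-LU pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's solver: the true EM coupling kernel is not given here, so a toy separable kernel `K(x,t) = x*t` with known closed-form solution is assumed, and Gauss-Legendre quadrature is used for the Nyström discretization. The key point the abstract makes, factoring the discretized operator once offline and reusing the factors for each new right-hand side, is reflected in the `lu_factor`/`lu_solve` split.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve


def nystrom_lu_solver(kernel, a, b, n, lam=1.0):
    """Discretize the Fredholm equation of the second kind
        phi(x) - lam * int_a^b K(x,t) phi(t) dt = f(x)
    with an n-point Gauss-Legendre Nystrom rule, then LU-factorize
    the resulting matrix once (the 'offline' step)."""
    # Gauss-Legendre nodes/weights mapped from [-1, 1] to [a, b]
    x, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * x + 0.5 * (b + a)
    w = 0.5 * (b - a) * w

    # Nystrom system: (I - lam * K .* w) phi = f  at the nodes
    A = np.eye(n) - lam * kernel(x[:, None], x[None, :]) * w[None, :]
    lu_piv = lu_factor(A)  # offline factorization, reused per RHS

    def solve(f):
        # 'online' step: one cheap triangular solve per right-hand side
        return lu_solve(lu_piv, f(x))

    return x, solve


# Toy check: K(x,t) = x*t on [0,1], f(x) = x has exact solution 1.5*x
x, solve = nystrom_lu_solver(lambda x, t: x * t, 0.0, 1.0, 8)
phi = solve(lambda x: x)
```

Because the factorization cost is paid once, repeated solves (e.g., across beamforming iterations) cost only O(n^2) each instead of O(n^3), which is the scalability argument the abstract makes for large-scale CAPA optimization.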
Abstract: Federated learning (FL) and federated distillation (FD) are distributed learning paradigms that train user equipment (UE) models with enhanced privacy, each offering a different trade-off between noise robustness and learning speed. To mitigate their respective weaknesses, we propose a hybrid federated learning (HFL) framework in which each UE transmits either gradients or logits, and the base station (BS) selects the per-round weights of the FL and FD updates. We derive the convergence of the HFL framework and introduce two methods to exploit its degrees of freedom (DoF): (i) adaptive UE clustering via Jenks optimization and (ii) adaptive weight selection via a damped Newton method. Numerical results show that HFL achieves superior test accuracy at low signal-to-noise ratio (SNR) when both DoF are exploited.
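The Jenks-based UE clustering mentioned above can be illustrated with a minimal sketch. The paper's actual clustering criterion is not given here; what follows assumes the simplest case, splitting UEs into two groups (e.g., by a per-UE channel quality score) by the Jenks natural-breaks rule: choose the break that minimizes the total within-group squared deviation.

```python
import numpy as np


def jenks_two_class_break(values):
    """Find the Jenks natural break for a two-class split of 1-D data.

    Scans every split point of the sorted values and returns the largest
    value of the lower class, chosen to minimize the sum of squared
    deviations from each class mean (the k=2 Jenks criterion)."""
    v = np.sort(np.asarray(values, dtype=float))
    best_cost, best_break = np.inf, v[0]
    for i in range(1, len(v)):
        lo, hi = v[:i], v[i:]
        cost = ((lo - lo.mean()) ** 2).sum() + ((hi - hi.mean()) ** 2).sum()
        if cost < best_cost:
            best_cost, best_break = cost, v[i - 1]
    return best_break


# Hypothetical per-UE SNR scores with two clear clusters: the break
# separates the low-SNR UEs (candidates for the noise-robust update)
# from the high-SNR UEs.
snrs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
brk = jenks_two_class_break(snrs)
```

The exhaustive scan is O(n^2) in the number of UEs, which is negligible at typical cell sizes; for more than two classes, the same criterion is usually optimized by dynamic programming.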