Long-term beamforming (LTBF) is a widely used, scalable alternative to instantaneous multi-user MIMO processing that leverages slowly varying spatial channel statistics. VLSI implementations require matrix inversions that become computationally challenging for massive MIMO systems with large numbers of antennas. In this work, we show that dominant interferers significantly degrade the numerical conditioning of the LTBF covariance matrix, leading to severe performance loss in finite-precision implementations of polynomial and conjugate gradient (CG) based inversion methods. To address this issue, we propose a subspace nulling approach that operates solely on long-term channel statistics and acts as an implicit preconditioning step for LTBF. By projecting the received signal onto the orthogonal complement of the dominant interference subspace, the proposed method reduces the eigenvalue spread of the covariance matrix and improves numerical stability. Through ray-tracing simulations in a realistic 5G scenario, we demonstrate that the proposed method substantially reduces the number of CG iterations required to achieve near-optimal performance across floating-point and fixed-point implementations, while preserving the low-overhead nature of LTBF.
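The conditioning argument can be sketched numerically. The following is a minimal NumPy illustration, not the paper's actual setup: it builds a synthetic covariance matrix with a single hypothetical dominant interferer, projects onto the orthogonal complement of that interferer's eigenvector, and compares the eigenvalue spread before and after. All dimensions and eigenvalue choices are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 32  # number of antennas (illustrative choice)

# Synthetic long-term covariance: near-isotropic background plus one
# dominant interferer eigenvalue that inflates the eigenvalue spread.
U = np.linalg.qr(rng.standard_normal((M, M)))[0]  # random orthonormal basis
eigs = np.ones(M)
eigs[0] = 1e4                                     # dominant interferer
R = U @ np.diag(eigs) @ U.T

# Estimate the dominant interference subspace from the covariance itself
# (its leading eigenvector).
w, V = np.linalg.eigh(R)      # ascending eigenvalues
V_int = V[:, -1:]             # top eigenvector spans the interferer subspace

# Subspace nulling: orthogonal projection onto the complement.
P = np.eye(M) - V_int @ V_int.T
R_proj = P @ R @ P

# Conditioning restricted to the complement (the nulled mode is ~0 and
# is excluded from the spread).
w_proj = np.sort(np.linalg.eigvalsh(R_proj))[::-1]
kappa_before = eigs.max() / eigs.min()
kappa_after = w_proj[0] / w_proj[M - 2]
print(f"condition number before: {kappa_before:.1e}, after: {kappa_after:.1e}")
```

Since CG converges at a rate governed by the square root of the condition number, collapsing the spread from ~1e4 to ~1 in this toy model is what drives the reduction in required iterations.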