



The frequency offsets-compensated least mean squares (FO-LMS) algorithm is a generic method for estimating a wireless channel under carrier and sampling frequency offsets when the transmitted signal is known to the receiver in advance. The algorithm iteratively and explicitly adjusts its estimates of the channel and the frequency offsets using stochastic-gradient-descent-based update rules, whose step sizes determine the learning rate and stability of the algorithm. Within the stability region, the choice of step sizes reflects a trade-off between the algorithm's ability to track changes in the channel and its ability to minimize the misadjustment caused by noise. This paper provides theoretical expressions to predict and optimize the tracking and misadjustment errors of FO-LMS when estimating channels and frequency offsets with known time-varying characteristics. It also proposes a method that adjusts the FO-LMS step sizes based on the algorithm's measured performance when the time-varying characteristics are not known, which is more often the case in practice. The accuracy of the expressions and the performance of the proposed variable-step-size algorithm are studied through simulations.
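
To make the joint update structure concrete, the following is a minimal sketch of the kind of estimator the abstract describes. The abstract does not give the exact FO-LMS update rules, so this sketch combines a standard complex-LMS channel update with a PLL-style stochastic-gradient frequency loop, and models only a carrier frequency offset (no sampling offset). All names and step-size values (`mu_h`, `mu_w`, `mu_p`, etc.) are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Illustrative sketch of joint channel / carrier-frequency-offset tracking in
# the spirit of FO-LMS; the exact update rules are not given in the abstract.

rng = np.random.default_rng(0)

n_taps, n_sym = 4, 4000
h_true = (rng.standard_normal(n_taps) + 1j * rng.standard_normal(n_taps)) / np.sqrt(2 * n_taps)
omega_true = 0.002  # carrier frequency offset in rad/sample (assumed value)

x = (2.0 * rng.integers(0, 2, n_sym) - 1).astype(complex)  # known BPSK pilot
clean = np.convolve(x, h_true)[:n_sym]
noise = 0.05 * (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))
d = np.exp(1j * omega_true * np.arange(n_sym)) * clean + noise  # received signal

h_hat = np.zeros(n_taps, complex)    # channel estimate
omega_hat, phi_hat = 0.0, 0.0        # frequency and accumulated-phase estimates
mu_h, mu_w, mu_p = 0.05, 1e-4, 0.01  # step sizes: the tracking/misadjustment knob

for k in range(n_taps - 1, n_sym):
    x_k = x[k - n_taps + 1:k + 1][::-1]             # regressor, newest sample first
    y_hat = np.exp(1j * phi_hat) * (h_hat @ x_k)    # predicted received sample
    e = d[k] - y_hat                                # a-priori error
    # Stochastic-gradient descent on |e|^2 with respect to each unknown:
    h_hat += mu_h * e * np.exp(-1j * phi_hat) * np.conj(x_k)
    grad_phi = np.imag(np.conj(e) * y_hat)  # d|e|^2/dphi (up to a factor of 2)
    omega_hat -= mu_w * grad_phi            # frequency-offset update
    phi_hat += omega_hat - mu_p * grad_phi  # phase integrates the offset estimate

print("channel error:", np.linalg.norm(h_hat - h_true))
print("omega_hat:", omega_hat, "(true:", omega_true, ")")
```

The step sizes here embody the trade-off the abstract refers to: larger values of `mu_h` and `mu_w` let the estimates react faster to channel or offset variations but raise the steady-state misadjustment due to noise, while smaller values do the opposite. A variable-step-size scheme of the kind the paper proposes would adapt these values online from the observed error `e` rather than fixing them in advance.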