Linear discriminant analysis (LDA) is a fundamental method in statistical pattern recognition and classification, achieving Bayes optimality under Gaussian assumptions. However, it is well known that classical LDA may struggle in high-dimensional settings due to instability in covariance estimation. In this work, we propose LDA with gradient optimization (LDA-GO), a new approach that directly optimizes the inverse covariance matrix via gradient descent. The algorithm parametrizes the inverse covariance matrix through a Cholesky factorization, incorporates a low-rank extension to reduce computational complexity, and employs a multiple-initialization strategy, including identity initialization and warm-starting from the classical LDA estimate. The effectiveness of LDA-GO is demonstrated through extensive multivariate simulations and real-data experiments.
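To make the idea concrete, the following is a minimal sketch of gradient-based optimization of the inverse covariance through a Cholesky-type parametrization, not the paper's exact algorithm: it assumes a softmax cross-entropy surrogate loss over the standard LDA discriminant scores, uses PyTorch autograd with plain SGD, omits the low-rank extension, and all function names, the regularization constant, and hyperparameters are illustrative choices. The warm start inverts a regularized pooled covariance, mirroring the classical LDA estimate mentioned above.

```python
# Sketch only: parametrize the inverse covariance as Theta = L L^T (L lower
# triangular) and fit L by gradient descent on a cross-entropy surrogate loss.
# Loss, optimizer, and constants are assumptions, not the paper's specification.
import torch


def lda_go_fit(X, y, n_steps=200, lr=1e-2, warm_start=True):
    """Gradient-descent fit of the inverse covariance via a Cholesky factor."""
    X_t = torch.as_tensor(X, dtype=torch.float64)
    y_t = torch.as_tensor(y, dtype=torch.long)
    classes = torch.unique(y_t)
    d = X_t.shape[1]
    labels = torch.searchsorted(classes, y_t)  # map class labels to 0..K-1

    # Class means and priors, as in classical LDA.
    mus = torch.stack([X_t[y_t == c].mean(dim=0) for c in classes])
    priors = torch.stack([(y_t == c).double().mean() for c in classes])

    if warm_start:
        # Warm start: Cholesky factor of a regularized inverse pooled covariance.
        Xc = X_t - mus[labels]
        S = (Xc.T @ Xc) / (X_t.shape[0] - len(classes)) \
            + 1e-3 * torch.eye(d, dtype=torch.float64)
        L0 = torch.linalg.cholesky(torch.linalg.inv(S))
    else:
        L0 = torch.eye(d, dtype=torch.float64)  # identity initialization
    L = L0.clone().requires_grad_(True)

    opt = torch.optim.SGD([L], lr=lr)
    tril_mask = torch.tril(torch.ones(d, d, dtype=torch.float64))

    for _ in range(n_steps):
        # Keep only the lower triangle so Theta stays symmetric PSD.
        L_tri = L * tril_mask
        Theta = L_tri @ L_tri.T
        # LDA discriminant scores: x^T Theta mu_k - 0.5 mu_k^T Theta mu_k + log pi_k
        quad = torch.einsum('kd,de,ke->k', mus, Theta, mus)
        scores = X_t @ Theta @ mus.T - 0.5 * quad + torch.log(priors)
        loss = torch.nn.functional.cross_entropy(scores, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()

    return (L.detach() * tril_mask), mus, priors
```

Under this parametrization the learned inverse covariance is positive semidefinite by construction, and the identity versus warm-start choice corresponds to the two initializations named in the abstract; restricting L to a tall rectangular factor would give one possible low-rank variant.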