Block sparsity is a widely exploited structure in sparse recovery, offering significant gains when signal blocks are known. Yet, practical signals often exhibit unknown block boundaries and isolated non-zero entries, which challenge traditional approaches. A promising method for handling such complex sparsity patterns is the difference-of-logs total variation (DoL-TV) regularized sparse Bayesian learning (SBL). However, due to the complex form of the DoL-TV term, the resulting optimization problem is hard to solve. This paper develops a new optimization framework for the DoL-TV SBL cost function. By introducing an exponential reparameterization of the SBL hyperparameters, we reveal a novel structure that admits a majorization-minimization formulation and naturally extends to the estimation of an unknown noise variance. Sparse recovery results on both synthetic data and extended-source direction-of-arrival estimation demonstrate improved accuracy and runtime compared to benchmark methods.
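To illustrate the reparameterization idea, the following is a minimal numerical sketch. It assumes the DoL-TV penalty takes the common form of summed absolute differences of the logs of adjacent SBL hyperparameters, \(\sum_i |\log\gamma_i - \log\gamma_{i+1}|\); the exact penalty used in this paper may differ. Under the substitution \(\gamma_i = e^{\beta_i}\), this penalty reduces to an ordinary total-variation penalty on \(\beta\), which is the kind of simpler structure a majorization-minimization scheme can exploit.

```python
import numpy as np

def dol_tv(gamma):
    # Assumed DoL-TV penalty on positive SBL hyperparameters gamma:
    # sum of |log(gamma_i) - log(gamma_{i+1})| over adjacent entries.
    return np.sum(np.abs(np.diff(np.log(gamma))))

def tv(beta):
    # Ordinary total-variation penalty on the reparameterized variables beta.
    return np.sum(np.abs(np.diff(beta)))

rng = np.random.default_rng(0)
beta = rng.normal(size=8)        # unconstrained reparameterized hyperparameters
gamma = np.exp(beta)             # exponential reparameterization: gamma = exp(beta)

# The two penalties coincide exactly under the reparameterization.
print(np.allclose(dol_tv(gamma), tv(beta)))  # True
```

The point of the sketch is only the algebraic identity: the log-difference structure that makes the penalty awkward in \(\gamma\) becomes a plain piecewise-constant-promoting TV term in \(\beta\).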