Abstract: Inference efficiency in Large Language Models (LLMs) is fundamentally limited by their serial, autoregressive generation, especially as reasoning becomes a key capability and response sequences grow longer. Speculative decoding (SD) offers a powerful solution, providing significant speedups through its lightweight drafting and parallel verification mechanism. While existing work has nearly saturated improvements in draft effectiveness and efficiency, this paper advances SD from a new yet critical perspective: the verification cost. We propose TriSpec, a novel ternary SD framework that, at its core, introduces a lightweight proxy to significantly reduce computational cost by approving easily verifiable draft sequences and engaging the full target model only when uncertain tokens are encountered. TriSpec can be integrated with state-of-the-art SD methods such as EAGLE-3 to further reduce verification costs, achieving greater acceleration. Extensive experiments on the Qwen3 and DeepSeek-R1-Distill-Qwen/LLaMA families show that TriSpec achieves up to 35\% speedup over standard SD, with up to 50\% fewer target model invocations while maintaining comparable accuracy.
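To make the proxy-gated verification idea concrete, the sketch below shows one way such a ternary check could be wired up: a cheap proxy scores each drafted token, easy tokens are accepted without touching the target model, and the target is invoked only when the proxy is uncertain. The function names, the `proxy_confidence` and `target_verify` callables, and the 0.9 threshold are illustrative assumptions, not TriSpec's actual implementation.

```python
# Minimal sketch (not the paper's code) of a ternary verification step:
# a lightweight proxy approves "easy" drafted tokens, and the full target
# model is consulted only when the proxy is uncertain.
from typing import Callable, List, Tuple


def ternary_verify(
    draft_tokens: List[int],
    proxy_confidence: Callable[[List[int], int], float],  # proxy's confidence in a token given its prefix
    target_verify: Callable[[List[int]], List[int]],      # target model's verified prefix of the draft
    threshold: float = 0.9,                               # assumed proxy-acceptance threshold
) -> Tuple[List[int], bool]:
    """Return (accepted_tokens, target_model_was_called)."""
    accepted: List[int] = []
    for i, tok in enumerate(draft_tokens):
        if proxy_confidence(draft_tokens[:i], tok) >= threshold:
            accepted.append(tok)  # easy token: proxy approves, no target-model call
        else:
            # Uncertain token: fall back to the expensive target model.
            return target_verify(draft_tokens), True
    return accepted, False  # whole draft approved by the proxy alone


# Toy usage with stand-in callables (a real system would wrap actual models).
if __name__ == "__main__":
    draft = [5, 9, 2]
    proxy = lambda prefix, tok: 0.95 if tok != 2 else 0.4  # token "2" is uncertain
    target = lambda tokens: tokens[:2]                     # target accepts the first two tokens
    print(ternary_verify(draft, proxy, target))            # ([5, 9], True)
```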
Abstract: This article proposes a Mix Neural Network (MNN) based on a CNN-FCNN architecture for predicting the magnetic core loss of different materials. In traditional core-loss models, empirical equations must typically be regressed separately for each set of external conditions, and a different core material requires its own case-by-case treatment; as the number of external factors grows, so does the number of models, making the modeling process extremely cumbersome. Moreover, traditional empirical equations suffer from low accuracy, and although various correction equations have since been introduced, the accuracy has remained unsatisfactory. Introducing machine learning and deep learning makes it possible to address both the low accuracy of empirical equations and the complexity of operating conditions at the same time. Trained on the MagNet database, the newly proposed MNN shows that a single model is sufficient to predict the loss of at least four different materials under varying temperatures, frequencies, and waveforms, with accuracy far exceeding that of traditional models. We also trained three other machine learning and deep learning models (Random Forest, XGBoost, MLP-LSTM), all of which achieved much higher accuracy than traditional models. Building on these predictions, a hybrid model combining the MNN and XGBoost through weighted prediction was proposed, and its accuracy improved further. This provides a solution for modeling magnetic core loss across different materials and operating modes.
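As a concrete illustration of the weighted MNN-XGBoost combination mentioned above, the sketch below blends two per-sample core-loss predictions with a single scalar weight chosen on a validation split. The function names, the convex-combination form, and the grid-search weight selection are illustrative assumptions rather than the paper's exact procedure.

```python
# Minimal sketch (under the assumptions stated above) of blending MNN and
# XGBoost core-loss predictions with a single validation-tuned weight.
import numpy as np


def blend(mnn_pred: np.ndarray, xgb_pred: np.ndarray, w: float) -> np.ndarray:
    """Convex combination of the two models' per-sample loss predictions."""
    return w * mnn_pred + (1.0 - w) * xgb_pred


def pick_weight(mnn_val: np.ndarray, xgb_val: np.ndarray, y_val: np.ndarray) -> float:
    """One simple way to choose the weight: grid search on validation relative error."""
    grid = np.linspace(0.0, 1.0, 101)
    errs = [np.mean(np.abs(blend(mnn_val, xgb_val, w) - y_val) / np.abs(y_val)) for w in grid]
    return float(grid[int(np.argmin(errs))])


# Toy usage with synthetic numbers standing in for real predictions.
if __name__ == "__main__":
    y = np.array([1.0, 2.0, 4.0])    # "true" core losses (hypothetical)
    mnn = np.array([1.1, 1.9, 4.2])  # hypothetical MNN predictions
    xgb = np.array([0.9, 2.2, 3.7])  # hypothetical XGBoost predictions
    w = pick_weight(mnn, xgb, y)
    print(w, blend(mnn, xgb, w))
```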