Local constraint ordered statistics decoding (LC-OSD) achieves strong soft-decision performance for short-blocklength linear codes, but its practical cost is dominated by the number of tested error patterns (TEPs). This paper proposes a neural early-stopping (NES) protocol for LC-OSD with explicit cost control through a single trade-off parameter that balances frame-error risk against search effort. The model is trained with frame error rate (FER)-aligned supervision at predefined checkpoints and learns whether additional search is still likely to improve the current best candidate. At inference, stopping is decided by comparing the predicted need for continuation with the cost measured in TEPs. Experimental results across multiple code families show that the proposed protocol significantly reduces the average TEP count with only marginal FER degradation, using a single global model across all operating signal-to-noise ratios (SNRs).
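The checkpoint-based stopping rule described above can be illustrated with a minimal Python sketch. Everything here is an assumption for illustration only: the function names, the simple threshold comparison against the trade-off parameter `lam`, and the exponentially decaying dummy predictor standing in for the trained neural model are not taken from the paper.

```python
import math

def should_stop(p_improve: float, lam: float) -> bool:
    """Stop the TEP search when the predicted chance that further
    search improves the best candidate drops below the trade-off
    threshold lam (larger lam = stop earlier, less search effort)."""
    return p_improve < lam

def decode_with_nes(checkpoints, predictor, lam):
    """Run the search up to each predefined checkpoint (measured in
    TEPs tested so far) and query the predictor there; return the
    total number of TEPs spent when stopping triggers."""
    teps = 0
    for cp in checkpoints:
        teps = cp  # TEPs consumed up to this checkpoint
        if should_stop(predictor(cp), lam):
            break
    return teps

# Dummy predictor: confidence that more search helps decays with
# the number of TEPs already tested (illustrative only).
predictor = lambda cp: math.exp(-cp / 256)

spent = decode_with_nes([64, 128, 256, 512, 1024], predictor, lam=0.3)
# With this decaying predictor, stopping triggers at the 512-TEP checkpoint.
```

A single threshold `lam` is the one knob trading FER risk against effort: raising it stops the search sooner (fewer TEPs, slightly higher FER), lowering it approaches the full-search behavior.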