Uncertainty in biological neural systems appears to be computationally beneficial rather than detrimental. In neuromorphic computing systems, however, device variability typically degrades performance, limiting both accuracy and efficiency. In this work, we propose a spiking Bayesian neural network (SBNN) framework that unifies dynamic models of intrinsic device stochasticity (based on magnetic tunnel junctions) with stochastic-threshold neurons, turning noise into a functional Bayesian resource. Experiments demonstrate that the SBNN achieves high accuracy (99.16% on MNIST, 94.84% on CIFAR-10) at 8-bit precision, while a rate-estimation method provides a roughly 20-fold training speedup. Furthermore, the SBNN exhibits superior robustness, improving accuracy by 67% under synaptic weight noise and by 12% under input noise relative to standard spiking neural networks. Crucially, hardware validation confirms that the physical device implementation incurs negligible accuracy and calibration loss compared to the algorithmic model. Converting device stochasticity into neuronal uncertainty thus offers a route to compact, energy-efficient neuromorphic computing under uncertainty.
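To make the central mechanism concrete, the sketch below illustrates (in Python) how a noisy firing threshold turns spiking into sampling, and how the resulting firing probability admits a closed form that avoids Monte Carlo simulation, which is the intuition behind the rate-estimation speedup. This is a minimal illustration, not the paper's implementation: the Gaussian noise model and the parameters `v_th` and `sigma` are assumptions, and the actual MTJ switching statistics may differ.

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)

def spike_prob_closed_form(v, v_th=1.0, sigma=0.2):
    """Closed-form per-step firing probability of a neuron whose
    threshold is jittered by zero-mean Gaussian noise of std sigma:
    P(spike) = P(v >= v_th + sigma*Z) = Phi((v - v_th) / sigma).
    Evaluating this directly replaces repeated stochastic runs."""
    z = (v - v_th) / sigma
    return 0.5 * (1.0 + erf(z / np.sqrt(2.0)))

def spike_prob_monte_carlo(v, n=20000, v_th=1.0, sigma=0.2):
    """Monte Carlo estimate of the same probability, sampling the
    noisy threshold explicitly as a stochastic device would."""
    noisy_th = v_th + sigma * rng.standard_normal(n)
    return np.mean(v >= noisy_th)

# The analytic rate matches the sampled rate without simulating
# thousands of stochastic threshold draws per neuron.
for v in [0.7, 0.9, 1.0, 1.1, 1.3]:
    print(f"v={v:.1f}  MC={spike_prob_monte_carlo(v):.4f}  "
          f"analytic={spike_prob_closed_form(v):.4f}")
```

Under these assumptions, the stochastic threshold makes each forward pass a draw from a distribution over spike patterns (enabling Bayesian uncertainty estimates), while the closed-form rate lets training proceed deterministically on expected activity, which is the kind of substitution that a ~20-fold speedup would rely on.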