We propose novel smooth approximations to the classical rounding function, suitable for differentiable optimization and machine learning applications. Our constructions are based on two approaches: (1) localized sigmoid window functions centered at each integer, and (2) normalized weighted sums of sigmoid derivatives representing local densities. The first method approximates the step-like behavior of rounding through differences of shifted sigmoids, while the second achieves smooth interpolation between integers via density-based weighting. Both methods converge pointwise to the classical rounding function as the sharpness parameter k tends to infinity, and they allow a controlled trade-off between smoothness and approximation accuracy. We demonstrate that restricting the summation to a small set of nearest integers keeps the computational cost low without sacrificing precision. These constructions provide fully differentiable alternatives to hard rounding, making them valuable in contexts where gradient-based methods are essential.
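To make the first construction concrete, the sketch below shows one plausible realization of soft rounding via differences of shifted sigmoids, truncated to a few nearest integers as described above. The specific formula, the parameter names (k, window), and the truncation width are assumptions made for illustration, not the paper's definitive implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_round_window(x, k=10.0, window=2):
    """Illustrative soft rounding via differences of shifted sigmoids.

    For each integer n near x, the window
        sigmoid(k*(x - n + 0.5)) - sigmoid(k*(x - n - 0.5))
    smoothly approximates the indicator of (n - 1/2, n + 1/2); weighting
    each window by n yields a differentiable approximation that tends to
    round(x) as k grows. This is a sketch based on the abstract, not the
    authors' exact construction.
    """
    x = np.asarray(x, dtype=float)
    base = np.floor(x)
    # Restrict the sum to a small set of nearest integers to keep cost low.
    offsets = np.arange(-window, window + 1)
    candidates = base[..., None] + offsets  # shape (..., 2*window + 1)
    windows = (sigmoid(k * (x[..., None] - candidates + 0.5))
               - sigmoid(k * (x[..., None] - candidates - 0.5)))
    return np.sum(candidates * windows, axis=-1)

# Example: larger k sharpens the approximation toward hard rounding.
print(soft_round_window(2.3, k=10.0))   # close to 2, but smooth in x
print(soft_round_window(2.3, k=100.0))  # nearly exactly 2
```

Because every operation is a composition of smooth functions, gradients with respect to x (and k) are well defined everywhere, which is the property the abstract emphasizes for gradient-based methods.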