Abstract: Recently, there has been growing interest in generative models based on diffusions, driven by the empirical robustness of these methods in generating high-dimensional photorealistic images and by the possibility of drawing on the vast existing toolbox of stochastic differential equations. In this work, we offer a novel perspective on the approach introduced in Song et al. (2021), shifting the focus from a "learning" problem to a "sampling" problem. To this end, we reformulate the equations governing diffusion-based generative models as a Forward-Backward Stochastic Differential Equation (FBSDE), which avoids the well-known issue of having to pre-estimate the gradient of the log target density. The solution of this FBSDE is proved to be unique using non-standard techniques. Additionally, we propose a numerical solution to this problem, leveraging Deep Learning techniques. This reformulation opens new pathways for sampling from multidimensional distributions whose densities are known only up to a normalization constant, a problem frequently encountered in Bayesian statistics.
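For context, the diffusions referred to above are usually written, following Song et al. (2021), as a forward noising SDE paired with a reverse-time SDE whose drift involves the score $\nabla_x \log p_t$; the notation below ($f$, $g$, $p_t$, $W_t$, $\bar{W}_t$) is the standard one from that paper and is given only as background, not as the FBSDE derived in this work:
\begin{align}
  \mathrm{d}X_t &= f(X_t, t)\,\mathrm{d}t + g(t)\,\mathrm{d}W_t, \\
  \mathrm{d}X_t &= \bigl[\, f(X_t, t) - g(t)^2\, \nabla_x \log p_t(X_t) \,\bigr]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{W}_t,
\end{align}
where the second equation runs backward in time and $\bar{W}_t$ denotes a reverse-time Brownian motion. Estimating the score term $\nabla_x \log p_t$ is precisely the step that the FBSDE reformulation is meant to avoid.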
Abstract: We present a nonparametric, model-agnostic framework for building prediction intervals for insurance claims with finite-sample statistical guarantees, extending the technique of split conformal prediction to the domain of two-stage frequency-severity modeling. The effectiveness of the framework is showcased on simulated and real datasets. When the underlying severity model is a random forest, we extend the two-stage split conformal prediction procedure, showing how the out-of-bag mechanism can be leveraged to eliminate the need for a calibration set and to produce prediction intervals with adaptive width.
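To make the split conformal step concrete, below is a minimal sketch in Python on synthetic data: a Poisson regression for frequency and a random forest for severity form the two-stage point predictor, and absolute residuals of the aggregate claim on a held-out calibration set give the interval half-width. The data-generating process, the model choices, and the aggregate-claim conformity score are illustrative assumptions, not the exact procedure of the paper.
\begin{verbatim}
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Illustrative split conformal sketch for a two-stage frequency-severity
# predictor; synthetic data and model choices are assumptions for the example.
rng = np.random.default_rng(0)
n = 4000
X = rng.normal(size=(n, 1))                     # one risk feature per policy
freq = rng.poisson(np.exp(0.3 * X[:, 0]))       # claim counts
sev = np.exp(1.0 + 0.5 * X[:, 0] + rng.normal(scale=0.4, size=n))  # severity
y = freq * sev                                  # aggregate claim per policy

# Split: proper training set for the two-stage model, calibration set for residuals.
X_tr, X_cal, y_tr, y_cal, f_tr, f_cal, s_tr, s_cal = train_test_split(
    X, y, freq, sev, test_size=0.5, random_state=0
)

freq_model = PoissonRegressor().fit(X_tr, f_tr)
sev_model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, s_tr)

def predict(X_):
    # Two-stage point prediction: expected count times expected severity.
    return freq_model.predict(X_) * sev_model.predict(X_)

# Conformity scores on the calibration set and the finite-sample quantile.
alpha = 0.1
scores = np.sort(np.abs(y_cal - predict(X_cal)))
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = scores[min(k, len(scores)) - 1]

# Marginal (1 - alpha) prediction interval for a new policy, clipped at zero.
X_new = np.array([[0.2]])
point = predict(X_new)
lower, upper = np.maximum(point - q, 0.0), point + q
print(lower, upper)
\end{verbatim}
By construction, this interval has the same half-width q for every policy; the out-of-bag variant mentioned in the abstract is what removes the separate calibration split and allows the width to adapt to the individual risk.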