Jeppe Revall Frisvad

Eikonal Fields for Refractive Novel-View Synthesis

Feb 11, 2022
Mojtaba Bemana, Karol Myszkowski, Jeppe Revall Frisvad, Hans-Peter Seidel, Tobias Ritschel

We tackle the problem of generating novel-view images from collections of 2D images showing refractive and reflective objects. Current solutions assume opaque or transparent light transport along straight paths following the emission-absorption model. Instead, we optimize for a field of 3D-varying Index of Refraction (IoR) and trace light through it, bending the rays toward the spatial gradients of the IoR according to the laws of eikonal light transport.

* 8 pages, 6 figures 
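
A minimal numerical sketch of the eikonal ray bending described in the abstract, not the paper's neural-field implementation: position and optical momentum are advanced with simple Euler steps of the eikonal ray equation d/ds(n dx/ds) = grad n, and the `ior` / `grad_ior` callables are hypothetical stand-ins for the optimized IoR field and its spatial gradient.

```python
import numpy as np

def trace_eikonal_ray(x, d, ior, grad_ior, step=1e-2, n_steps=500):
    """March one ray through a spatially varying index-of-refraction field
    by integrating the eikonal ray equation d/ds(n dx/ds) = grad n.

    x, d     -- ray origin and unit direction (3-vectors)
    ior      -- callable: ior(x) -> scalar refractive index n(x)
    grad_ior -- callable: grad_ior(x) -> spatial gradient of n at x
    """
    x = np.asarray(x, dtype=float)
    v = ior(x) * np.asarray(d, dtype=float)   # optical momentum v = n * dx/ds
    for _ in range(n_steps):
        x = x + step * v / ior(x)             # dx/ds = v / n
        v = v + step * grad_ior(x)            # dv/ds = grad n: the ray bends toward higher IoR
    return x, v / np.linalg.norm(v)           # final position and unit direction

# Example: a constant IoR gradient along x curves a ray that starts out along z.
ior = lambda p: 1.0 + 0.2 * p[0]
grad_ior = lambda p: np.array([0.2, 0.0, 0.0])
pos, dirn = trace_eikonal_ray([0.0, 0.0, 0.0], [0.0, 0.0, 1.0], ior, grad_ior)
```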

Superaccurate Camera Calibration via Inverse Rendering

Mar 20, 2020
Morten Hannemose, Jakob Wilm, Jeppe Revall Frisvad

The most prevalent routine for camera calibration is based on the detection of well-defined feature points on a purpose-made calibration artifact. These could be checkerboard saddle points, circles, rings or triangles, often printed on a planar structure. The feature points are first detected and then used in a nonlinear optimization to estimate the internal camera parameters.

We propose a new method for camera calibration using the principle of inverse rendering. Instead of relying solely on detected feature points, we use an estimate of the internal parameters and the pose of the calibration object to implicitly render a non-photorealistic equivalent of the optical features. This enables us to compute pixel-wise differences in the image domain without interpolation artifacts. We can then improve our estimate of the internal parameters by minimizing pixel-wise least-squares differences. In this way, our model optimizes a meaningful metric in the image space, assuming the normally distributed noise characteristic of camera sensors.

We demonstrate using synthetic and real camera images that our method improves the accuracy of estimated camera parameters as compared with current state-of-the-art calibration routines. Our method also estimates these parameters more robustly in the presence of noise and in situations where the number of calibration images is limited.

* 10 pages, 6 figures 
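
A minimal sketch of the pixel-wise least-squares idea from the abstract, under simplifying assumptions: a toy Gaussian-blob renderer stands in for the paper's implicit feature rendering, the feature points are taken as already expressed in camera coordinates, and the function names and the reduced parameter set (fx, fy, cx, cy) are illustrative; the paper additionally estimates the pose of the calibration object and the full set of internal parameters.

```python
import numpy as np
from scipy.optimize import least_squares

def render_blobs(fx, fy, cx, cy, points_cam, shape, sigma=2.0):
    """Toy non-photorealistic renderer: a Gaussian blob at each projected
    calibration feature (stand-in for the paper's implicit feature rendering)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    img = np.zeros(shape)
    for X, Y, Z in points_cam:
        u, v = fx * X / Z + cx, fy * Y / Z + cy                  # pinhole projection
        img += np.exp(-((xs - u) ** 2 + (ys - v) ** 2) / (2.0 * sigma ** 2))
    return img

def refine_intrinsics(observed, points_cam, x0):
    """Refine (fx, fy, cx, cy) by minimizing pixel-wise least-squares
    differences between the observed image and the rendered equivalent."""
    def residuals(params):
        rendered = render_blobs(*params, points_cam, observed.shape)
        return (rendered - observed).ravel()                      # per-pixel residuals
    return least_squares(residuals, x0).x                         # nonlinear least squares
```

The least-squares objective over raw pixel differences is what lets the refinement operate directly in image space rather than on interpolated feature coordinates.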