
Efficient and Expressive Rendering for Real-Time Defocus Blur and Bokeh
Yuna Jeong

Ph.D. Dissertation, Sungkyunkwan University, 2019.
This dissertation presents GPU-based rendering algorithms for real-time defocus blur and bokeh effects, which significantly improve the perceptual realism of synthetic images and can guide the viewer's attention. The defocus blur algorithm combines three distinctive techniques: (1) adaptive discrete geometric level of detail (LOD), made popping-free by blending visibility samples across two adjacent geometric levels; (2) adaptive visibility/shading sampling via sample reuse; and (3) visibility supersampling via height-field ray casting. The three techniques are seamlessly integrated to lower the cost of rendering smooth defocus blur at high visibility sampling rates, while maintaining most of the quality of brute-force accumulation buffering.

The dissertation also presents a novel parametric model for expressive chromatic aberration in defocus blur rendering, together with an effective implementation based on accumulation buffering. The model extends the thin-lens model with axial and lateral chromatic aberrations, which can then be pushed toward nonlinear, artistic appearances beyond physical limits. To keep the dispersion continuous, a unified 3D sampling scheme covers both the lens and the spectrum, and a spectral equalizer emphasizes particular dispersion ranges. As a result, the approach enables more intuitive and explicit control of chromatic aberration than previous physically-based rendering methods.

Finally, the dissertation presents an efficient bokeh rendering technique that splats pre-computed sprites while taking dynamic visibilities and appearances into account at runtime. To achieve an alias-free look without the excessive sampling that strong highlights would otherwise require, visibilities are sampled efficiently using rasterization from the highlight sources. The splatting uses a single precomputed 2D texture, which encodes radial aberrations against object depth. To further integrate dynamic appearances, an effective parameter sampling scheme covers focal distance, radial distortion, optical vignetting, and spectral dispersion. The method renders complex bokeh appearances efficiently, greatly improving the photorealism of defocus blur.
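To illustrate the idea of axial chromatic aberration under a thin-lens model, the sketch below computes a wavelength-dependent circle of confusion (CoC). The function names, the linear dispersion term, and all parameter values are illustrative assumptions for a minimal example; they are not the dissertation's actual parametric model, which supports nonlinear and artistic dispersion curves.

```python
def focal_length(wavelength_nm, f0=50.0, dispersion=0.002):
    """Focal length (mm), shifted linearly with wavelength around 550 nm.

    The linear shift is a stand-in for axial chromatic aberration; a
    parametric model like the dissertation's can replace it with
    nonlinear, artistic curves beyond physical limits.
    """
    return f0 * (1.0 + dispersion * (wavelength_nm - 550.0) / 550.0)

def circle_of_confusion(z, z_focus, wavelength_nm, aperture=8.0):
    """CoC diameter (mm) for an object at depth z (mm) when the lens is
    focused at z_focus, using the standard thin-lens CoC formula:
    c = A * |f / (z_focus - f)| * |1 - z_focus / z|.
    """
    f = focal_length(wavelength_nm)
    return aperture * abs(f / (z_focus - f)) * abs(1.0 - z_focus / z)

# At the focal plane the CoC vanishes for every wavelength; away from it,
# different wavelengths defocus by different amounts (axial aberration).
assert circle_of_confusion(2000.0, 2000.0, 550.0) == 0.0
```

Evaluating the CoC per wavelength sample is what makes a unified lens-and-spectrum sampling scheme natural: each accumulation-buffer pass draws one (lens, wavelength) sample, so the dispersion stays continuous as the sample count grows.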
Paper preprints, slides, GitHub, and Google Scholar
* Copyright Disclaimer: paper preprints on this page are provided for personal academic use only, not for redistribution.
@PHDTHESIS{yuna19:dissertation,
  title  = {{Efficient and Expressive Rendering for Real-Time Defocus Blur and Bokeh}},
  author = {Yuna Jeong},
  school = {Sungkyunkwan University},
  year   = {2019}
}

27336, College of Software, Sungkyunkwan University, Tel. +82 31-299-4917, Seobu-ro 2066, Jangan-gu, Suwon, 16419, South Korea
Campus map (how to reach CGLab)