This repo holds the final project of the ETH Zurich Computer Graphics course. Below you can find the source code for this project as well as an incomplete, tutorial-like report. Useful resources for the implementation are listed at the end. We hope this repo helps you quickly start building your own customized renderer!
Theme of this year:
Many things have their dedicated place to be. For this competition, we want to escape the ordinary and allow everything to be "out of place".
ArtStation - She's Gone - UE4 Fish Tank, Borja "Helix" Ferrandez
ArtStation - Ruins in the water, Xiangzhao Xi
Some people like to have a fish tank in their house. In this design, everything inside the fish tank is scaled out of it, and everything outside is put into it. In the initial concept, a tank floats on the water under the sky. Clouds float in the sky and fish swim in the water. A room is scaled down into the tank, along with its furniture and possibly other light sources and human characters.
Notable technical aspects:
- Environment map of the sky
- Volumetric rendering for the clouds and water; if no suitable models can be found, the clouds will be baked into the environment map
- Disney BSDF for the furniture in the tank and stones in the water
- Image as textures for the room floor and ocean floor
- Homogeneous Participating Media (15 points)
- Disney BSDF (15 points)
- subsurface
- metallic
- specular
- sheen
- clearcoat
- Environment Map Emitter (15 points)
- Image as Textures (5 points)
- Probabilistic Progressive Photon Mapping (5 points)
- Depth of Field (5 points)
To make an identical scene for both renderers, we use Blender as a bridge. Plugins exist for both renderers, so we can build a scene in Blender and then export it to the Nori and Mitsuba formats.
From Blender to Nori:
Phil26AT/BlenderNoriPlugin: Export blender scenes to the Nori educational raytracer. Proposed and used in the Computer Graphics course at ETH Zurich, Fall 2020 (github.com)
Problems:
- Texture .png files are not exported, even with this setting enabled.
- OBJ normals might point inward. Use MeshLab to flip the normals. You can first view the OBJ in the Windows 3D Viewer to see how the normals are oriented.
From Blender to Mitsuba:
mitsuba-renderer/mitsuba-blender: Mitsuba integration add-on for Blender (github.com)
Problems:
- Objects may be classified as two-sided
- A constant emitter may be added automatically
You can download scenes from Blend Swap and export them directly to the Nori and Mitsuba .xml formats, which can save you some time.
Note that sometimes you need to rectify some errors manually, but there is essentially little work to do. The procedure is simple: just replace all BSDFs with diffuse and remove any seemingly extra terms.
Besides, you might need to tune the path integrators in Nori and Mitsuba so that they give almost the same output for a simple scene, compared side by side in tev:
File added
include/nori/medium.h
include/nori/phasefunction.h
src/henyey_greeenstein.cpp
src/homogeneous.cpp
src/transluscent.cpp
src/volpath.cpp
File touched
CMakeLists.txt
include/nori/common.h
include/nori/camera.h
include/nori/emitter.h
include/nori/object.h
include/nori/ray.h
include/nori/shape.h
include/nori/bsdf.h
src/shape.cpp
src/sphere.cpp
src/arealight.cpp
src/dof_camera.cpp
src/mesh.cpp
src/perspective.cpp
src/pointlight.cpp
src/inftyarealight.cpp
src/mirror.cpp
This is the most difficult part of the project, since the entire rendering pipeline changes. Following the PBRT book, a medium class is created and attached to emitters, shapes, cameras and rays. Emitters, cameras and rays are each assigned to one medium, while a shape serves as the boundary between two media. When no boundary is specified, the scene is in a vacuum by default. A special translucent BSDF is implemented to serve as an invisible boundary; it is only valid for the path integrator or photon mapper. A special integrator, volpath.cpp, is implemented: only with this integrator can we render participating media. With other integrators, the media are simply not visible.
When a ray travels through a medium, it can be scattered or absorbed by the medium. (It could also be enhanced by in-scattered radiance from other directions, or the medium could itself be an emitter, but these effects are not modeled in this implementation.) We follow the same strategy used in path_mis
:
A path is sampled, and at each vertex of this path we do emitter sampling and BSDF sampling.
This time a path contains both volumetric and surface vertices. The medium class decides which type of interaction happens, and the phase function class decides which direction the ray scatters to when a medium interaction is sampled. Whichever interaction is sampled, the attenuation of the ray must be taken into account.
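The phase function used here is Henyey-Greenstein (the file list above adds src/henyey_greeenstein.cpp). As a minimal sketch of evaluating and importance-sampling it, with our own helper names rather than Nori's actual API:

```cpp
#include <cmath>
#include <cassert>

// Sketch of the Henyey-Greenstein phase function (illustrative helpers,
// not Nori's actual API). g in (-1, 1) controls anisotropy:
// g > 0 is forward scattering, g < 0 backward, g = 0 isotropic.
static const float kInvFourPi = 0.07957747155f; // 1 / (4 * pi)

// Phase function value for the cosine between incoming and outgoing
// directions; normalized so it integrates to 1 over the sphere.
inline float hgPhase(float cosTheta, float g) {
    float denom = 1.f + g * g + 2.f * g * cosTheta;
    return kInvFourPi * (1.f - g * g) / (denom * std::sqrt(denom));
}

// Invert the HG CDF: sample cos(theta) from a uniform xi in [0, 1].
inline float hgSampleCosTheta(float g, float xi) {
    if (std::fabs(g) < 1e-3f)
        return 1.f - 2.f * xi; // isotropic limit
    float sqr = (1.f - g * g) / (1.f - g + 2.f * g * xi);
    return (1.f + g * g - sqr * sqr) / (2.f * g);
}
```

The sampled direction is then built around the previous ray direction with a uniformly sampled azimuth; since the sample pdf equals the phase function value, the two cancel in the throughput.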
Result:
Smoke confined in a dielectric and translucent surface
Remark:
A new ray is generated for the next iteration. Its maxt is usually set to INFINITY, but remember to reset this value when an intersection is found in the next iteration; otherwise a medium interaction will always be sampled.
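The maxt caveat can be made concrete with a sketch of free-flight sampling in a homogeneous medium (the helper below is illustrative, not Nori's actual API): the sampled distance is compared against the distance to the nearest surface, so a stale maxt of INFINITY makes the medium branch win every time.

```cpp
#include <cmath>
#include <cassert>

// Result of distance sampling along a ray segment inside a medium.
struct MediumSample {
    bool  sampledMedium; // true: scatter in the volume; false: reach the surface
    float t;             // distance of the sampled event
    float weight;        // throughput weight for this event
};

// Sample a free-flight distance with pdf sigmaT * exp(-sigmaT * t).
// tMax must be the distance to the nearest surface hit (INFINITY only
// if there is truly no intersection). With this pdf, the weight is the
// single-scattering albedo sigma_s / sigma_t for a medium event and 1
// for a surface event, since transmittance and pdf cancel.
inline MediumSample sampleHomogeneousDistance(float sigmaT, float albedo,
                                              float tMax, float xi) {
    float t = -std::log(1.f - xi) / sigmaT; // exponential free-flight distance
    if (t < tMax)
        return { true, t, albedo };         // medium interaction
    return { false, tMax, 1.f };            // the ray reaches the surface first
}
```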
File added
src/disney.cpp
src/twosided.cpp
File touched
CMakeLists.txt
src/path_mis.cpp
src/direct_mis.cpp
A strict implementation following UCSD CSE 272 Assignment 1: Disney Principled BSDF. Implemented parameters:
Texture<Color3f> * m_baseColor;
float m_subsurface;
float m_sheen;
float m_sheenTint;
float m_clearcoat;
float m_clearcoatGloss;
float m_specular;
float m_specTint;
float m_specTrans;
float m_roughness;
float m_anisotropic;
float m_metallic;
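Several of the Disney lobes (diffuse retro-reflection, sheen, clearcoat) share Schlick's (1 - cos)^5 Fresnel weight. A small sketch following the UCSD CSE 272 write-up referenced above, with our own helper names:

```cpp
#include <cmath>
#include <cassert>

// Schlick's Fresnel weight, (1 - cos)^5. It goes to 0 at normal
// incidence and to 1 at grazing angles.
inline float schlickWeight(float cosTheta) {
    float m = 1.f - cosTheta;
    if (m < 0.f) m = 0.f;
    if (m > 1.f) m = 1.f;
    return m * m * m * m * m;
}

// Scalar sketch of the sheen lobe: sheen * (1 - |h.wo|)^5, where h is
// the half-vector (the full model also multiplies by a tint color
// lerped toward the base color's hue by sheenTint). It brightens
// grazing angles on cloth-like surfaces.
inline float sheenLobe(float sheen, float cosThetaH) {
    return sheen * schlickWeight(cosThetaH);
}
```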
| | 0.0 | 0.2 | 0.4 | 0.6 | 0.8 | 1.0 |
|---|---|---|---|---|---|---|
| subsurface | ![]() | ![]() | ![]() | ![]() | ![]() | ![]() |
| metallic | ![]() | ![]() | ![]() | ![]() | ![]() | ![]() |
| specular | ![]() | ![]() | ![]() | ![]() | ![]() | ![]() |
| sheen | ![]() | ![]() | ![]() | ![]() | ![]() | ![]() |
| clearcoat | ![]() | ![]() | ![]() | ![]() | ![]() | ![]() |
File added
src/inftyarealight.cpp
File touched
CMakeLists.txt
include/nori/integrator.h
include/nori/emitter.h
src/pppm.cpp
src/photonmapper.cpp
src/path_mis.cpp
src/path_mats.cpp
src/direct_mis.cpp
src/direct_mats.cpp
src/direct_ems.cpp
The PBRT book gives a nice example of the process. Here a simpler implementation using a nearest-neighbor strategy is given:
- Load the .exr file into a bitmap, which is essentially an Eigen array. The loading infrastructure already exists in Nori; find it in hdrToLdr.cpp.
- Precompute the pdf as indicated in the lecture slides. Use Eigen matrices and vectors to store the pdf information; the Eigen library greatly simplifies the process. See Eigen: Getting started.
- Map a direction to spherical coordinates and use nearest-neighbor interpolation to find the pixel. Note that the PBRT book uses bilinear interpolation, which is usually the better strategy, but accounting for the probabilities of that interpolation is tedious, so nearest-neighbor interpolation keeps importance sampling much simpler.
Results:
| nori | mitsuba |
|---|---|
| ![]() | ![]() |
Remarks: There are some slight differences, which might be due to nearest-neighbor versus bilinear interpolation. All integrators are changed to incorporate environment mapping.
File added
include/lodepng.h
src/imagetexture.cpp
src/lodepng.cpp
File touched
CMakeLists.txt
src/direct_mis.cpp
src/path_mis.cpp
.obj and .ply files already store uv coordinates, so we don't need to worry about creating uv coordinates for the mesh.
- Load .png files. There is a powerful open-source repo on GitHub, lvandeve/lodepng: PNG encoder and decoder in C and C++ (github.com), which provides the utility. Look into the examples it provides and you will quickly learn how to use it.
Another important point is that .png files are gamma-corrected; you need to undo the gamma correction to recover linear color values. Example code:

```cpp
float inverseGammaCorrect(float value)
{
    if (value <= 0.04045f)
        return value * 1.f / 12.92f;
    return std::pow((value + 0.055f) * 1.f / 1.055f, 2.4f);
}
```
- Map the uv coordinates to pixel coordinates. First, assume all uv coordinates lie in $[0,1]$. Simply multiply each by the width and height respectively to reach pixel space. After this mapping, two interpolations are implemented:
  - Nearest neighbor
  - Bilinear. Bilinear usually gives better results, but with high-resolution textures the difference is negligible.
- Deal with uv coordinates out of bounds / scaling. If some uv coordinates accidentally fall outside $[0, 1]$, or we want to scale the texture, we can use either a repeat or a clamp strategy. Both are implemented.
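The two wrapping strategies can be sketched as follows (the helper names are illustrative, not the actual implementation): repeat tiles the texture by keeping the fractional part, while clamp pins out-of-range coordinates to the border.

```cpp
#include <cmath>
#include <algorithm>
#include <cassert>

// Repeat wrapping: tile the texture by keeping the fractional part of u.
// Works for negative u too, e.g. -0.25 maps to 0.75.
inline float wrapRepeat(float u) {
    return u - std::floor(u);
}

// Clamp wrapping: pin out-of-range coordinates to the texture border.
inline float wrapClamp(float u) {
    return std::max(0.f, std::min(1.f, u));
}

// After wrapping (repeat shown here), map to a pixel column for a
// nearest-neighbor lookup, guarding the u == 1 edge case.
inline int toPixelNearest(float u, int width) {
    return std::min(width - 1, int(wrapRepeat(u) * width));
}
```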
Results:
nori / mitsuba
| nori | mitsuba |
|---|---|
| ![]() | ![]() |
scaling
| shrink | expand |
|---|---|
| ![]() | ![]() |
Remarks: Check that the integrators correctly set the uv coordinates in BSDFQueryRecord.
File added
src/pppm.cpp
File touched
CMakeLists.txt
include/nori/integrator.h
src/photonmapper.cpp
src/render.cpp
To implement this feature, we need to touch the rendering pipeline, i.e. render.cpp and integrator.h, so that multiple images can be rendered.
The algorithm is simple: we only need to add an outer loop around the main rendering loop. You may need an additional parameter in the .xml files to specify the number of iterations. The rendering output is also changed so that when there are multiple image outputs, Nori creates a directory for them.
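Inside that outer loop, the gather radius must shrink between iterations so the estimate converges. A common per-iteration update from the progressive photon mapping literature is sketched below; whether the implementation uses exactly this form is not stated, so treat it as an illustration (alpha in (0,1), with 2/3 a typical choice):

```cpp
#include <cmath>
#include <cassert>

// Shrink the squared gather radius after (1-based) iteration i:
// r_{i+1}^2 = r_i^2 * (i + alpha) / (i + 1). Each factor is < 1, so
// the radius decreases monotonically but slowly enough that the
// accumulated estimate stays consistent.
inline float nextRadiusSquared(float r2, int i, float alpha = 2.f / 3.f) {
    return r2 * (float(i) + alpha) / (float(i) + 1.f);
}

// Radius after n iterations starting from r0 (for inspection/debugging).
inline float radiusAfter(float r0, int n, float alpha = 2.f / 3.f) {
    float r2 = r0 * r0;
    for (int i = 1; i <= n; ++i)
        r2 = nextRadiusSquared(r2, i, alpha);
    return std::sqrt(r2);
}
```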
Result:
| parameters | results |
|---|---|
| 400 iterations, 250000 photons, 32 spp, 17.6 min | ![]() |
| 1 iteration, 10000000 photons, 512 spp, 1.7 min | ![]() |
| Iteration | Image |
|---|---|
| 1 | ![]() |
| 50 | ![]() |
| 100 | ![]() |
| 400 | ![]() |
Remark: The sampler in the integrator must not be reset; otherwise it will always sample the same points for every image.
File added
src/dof_camera.cpp
File touched
CMakeLists.txt
This feature is fairly simple: following Projective Camera Models (pbr-book.org) or the course slides, it can be implemented easily.
A problem for validation is that the lens parameters for cameras in Nori are given in local coordinates, while in Mitsuba they are in world coordinates. This makes validation quite difficult.
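A thin-lens sketch of the idea, following the PBRT chapter cited above (all names here are illustrative, not Nori's actual camera API): jitter the ray origin on the lens disk and aim at the point where the original pinhole ray crosses the plane of focus, so only that plane stays sharp.

```cpp
#include <cmath>
#include <cassert>

static const float kPi = 3.14159265358979f;

struct Vec3 { float x, y, z; };

// Sample a point on the lens disk from two uniform numbers via polar
// coordinates (PBRT prefers the concentric mapping, which stratifies
// better; polar is used here for brevity).
inline void sampleLens(float lensRadius, float xi1, float xi2,
                       float &lensU, float &lensV) {
    float r = lensRadius * std::sqrt(xi1);
    float phi = 2.f * kPi * xi2;
    lensU = r * std::cos(phi);
    lensV = r * std::sin(phi);
}

// Refocus a camera-space pinhole ray with direction d (d.z > 0): shift
// the origin to the lens sample and aim at the point where the original
// ray intersects the plane of focus at depth focalDistance.
inline void refocusRay(Vec3 d, float focalDistance,
                       float lensU, float lensV,
                       Vec3 &origin, Vec3 &dir) {
    float ft = focalDistance / d.z;
    Vec3 pFocus { ft * d.x, ft * d.y, ft * d.z };
    origin = { lensU, lensV, 0.f };
    Vec3 v { pFocus.x - origin.x, pFocus.y - origin.y, pFocus.z };
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    dir = { v.x / len, v.y / len, v.z / len };
}
```

With lensRadius = 0 the origin never moves and the pinhole camera is recovered, which is a handy sanity check when validating against Mitsuba.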
Results:
| nori | mitsuba |
|---|---|
| ![]() | ![]() |
This class of BSDF serves as an invisible boundary for homogeneous participating media. Only the path integrator or photon mapper will show the invisible effect.
This class of BSDF is designed to be visible on both sides so that you don't need to care about the normals, since Blender's output may not have the normals oriented as you wish.
This class of BSDF models thin dielectrics, mainly bubbles. Since my scene contains bubbles, this class is very useful.
Dielectric materials can now have textures.
The image shown here consists of 72 shapes and about 150,000 primitives. It is rendered at 512 spp to a 1920x1080 HD image.
The image is essentially a dual of a fish tank in a room, i.e. the room is now in a tank floating on the sea. A girl and a corgi are confined in the tank while fish swim around them. You can also see corals and starfish on the seafloor.
Technical details:
- The sky is an environment emitter using a 4k exr image.
- The tank uses the bubble (thin dielectric) BSDF.
- The tank is filled with homogeneous participating media to make the sunset effect more impressive.
- Every living creature has its own texture on a diffuse or Disney BSDF.
- The sea also contains homogeneous participating media. You can see the effect clearly near the tank, and the corals and starfish are almost invisible due to attenuation of the radiance.
Dartmouth rendering competition
ArtStation - She's Gone - UE4 Fish Tank, Borja "Helix" Ferrandez
ArtStation - Ruins in the water, Xiangzhao Xi
Physically Based Shading At Disney (disneyanimation.com)
UCSD CSE 272 Assignment 1: Disney Principled BSDF
Raytracing - UV Mapping and Texturing | 1000 Forms of Bunnies (viclw17.github.io)
lvandeve/lodepng: PNG encoder and decoder in C and C++. (github.com)
Progressive Photon Mapping: A Probabilistic Approach
Infinite Area Lights (pbr-book.org)
Sampling Light Sources (pbr-book.org)
Rendering the Moana Island Scene Part 1: Implementing the Disney BSDF (schuttejoe.github.io)
Volume Scattering (pbr-book.org)
Light Transport II: Volume Rendering (pbr-book.org)
Model credits: Blend Swap | CGTrace's Material Ball, Mitsuba Material Ball
Environment credit to: Limpopo Golf Course HDRI • Poly Haven