A Path Tracer in Haskell
We mostly follow Peter Shirley's architecture, with a couple of differences.
The branches follow the chapters from the online repository.
- Small color gradient from branch 01-ppm
- Red sphere from branch 04-sphere
- Normals from branch 05-surface
- Multiple objects from branch 06-multiple
- Antialiasing from branch 07-antialias
- Diffuse image from branch 08-diffuse
- Metal image from branch 09-metal
- Fuzzy metal image from branch 09-metal
- Dielectric from branch 10-dielectric
- Camera focus from branch 11-camera
- A version of the final scene from branch 12-oneweekend
- Another version of the final scene from branch 12-oneweekend
- Fixed version of the final scene; the fix happens around branch 14-texture
- Final one-weekend scene from branch 14-texture
- Motion blur from branch 14-texture
- Checkered texture from branch 14-texture
- Perlin noise with light from branch 14-texture
- Earth image from branch 14-texture
- Cornell box image from branch 15-instances
- Cornell smoke boxes from branch 16-constant-density-mediums
- Cornell sphere and a box from branch 17-scattering-pdf
- Cornell box from branch 18-spectral
From branch 08-diffuse onwards, as the use of random functions becomes prominent, performance decreases considerably. However, the inverse is also true: if you place your random generators efficiently, you can easily increase performance. I simply concentrated on getting the images right, so do not be surprised if you find that some other arrangement of RNGs results in better performance.
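One arrangement that tends to help in pure code is splitting the generator up front so each pixel gets its own independent RNG, instead of threading a single generator sequentially through every sample. This is a minimal sketch using `System.Random`'s `split`; `samplePixel` and `renderRow` are hypothetical stand-ins, not functions from this repository:

```haskell
import System.Random (StdGen, mkStdGen, split, randomR)

-- Hypothetical per-pixel sampler: consumes its own generator.
samplePixel :: StdGen -> Double
samplePixel g = fst (randomR (0.0, 1.0) g)

-- Give every pixel an independent generator by splitting as we go,
-- so no pixel depends on how many random numbers its neighbours drew.
renderRow :: StdGen -> Int -> [Double]
renderRow g0 n = go g0 n
  where
    go _ 0 = []
    go g k = let (gHere, gRest) = split g
             in samplePixel gHere : go gRest (k - 1)

main :: IO ()
main = print (length (renderRow (mkStdGen 42) 4))
```

Because each pixel's generator is independent, this arrangement also keeps the sampling code embarrassingly parallel.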
Spectral rendering is done through the use of spectral textures. The general idea
is that the material determines the behaviour of the surface distribution function
and the texture determines its color space.
You can see how spectral textures are used in SpectralScene.hs. The rendering function determines that the scene is spectral from the data type of the background: if the background is of type PixSpecSampled, it switches to spectral rendering.
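The dispatch described above boils down to a pattern match on the background's pixel type. This is a toy sketch; `Pixel`, its constructors, and `isSpectralScene` are simplified stand-ins mirroring the names in the text, not the repository's actual definitions:

```haskell
-- Simplified pixel type: either an RGB triple or a sampled spectrum.
data Pixel
  = PixRGB Double Double Double
  | PixSpecSampled [(Double, Double)]  -- (wavelength, power) samples

-- Pick the rendering path from the background's pixel type.
isSpectralScene :: Pixel -> Bool
isSpectralScene (PixSpecSampled _) = True
isSpectralScene _                  = False

main :: IO ()
main = print (isSpectralScene (PixSpecSampled [(380, 0.1)]))
```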
Another point is setting spectral data from the RGB color model. This is done through the convenience function fromRGBModel. You can also try specifying spectrum data directly. A SampledSpectrum is simply a non-empty list of (wavelength, power) tuples along with a spectrum type specifier. The specifier is not strictly necessary, since none of the operations between spectra care about the type of the spectrum, but it becomes convenient when you are doing conversions from a spectrum to trichromatic systems.
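As a sketch of what that description implies, a SampledSpectrum could look like the following; the names follow the README, but the exact fields and the SpectrumType constructors are illustrative assumptions, not the repository's code:

```haskell
import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NE

-- Hypothetical type specifier; useful mainly when converting a
-- spectrum to a trichromatic system.
data SpectrumType = Reflectance | Illuminant deriving (Eq, Show)

-- A non-empty list of (wavelength, power) tuples plus the specifier.
data SampledSpectrum = SampledSpectrum
  { spectrumType :: SpectrumType
  , samples      :: NonEmpty (Double, Double)
  } deriving Show

-- Specifying spectrum data directly: a flat 50% reflectance
-- over three sample wavelengths.
gray :: SampledSpectrum
gray = SampledSpectrum Reflectance ((380, 0.5) :| [(385, 0.5), (390, 0.5)])

main :: IO ()
main = print (NE.length (samples gray))
```

Using `NonEmpty` makes the "non-empty list" invariant hold by construction rather than by convention.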
Lastly, beware that spectral rendering takes much more time than its RGB equivalent. The spectral Cornell box whose image can be found in the 18-spectral branch took 3931.857353 s with 5 samples per pixel and a ray bounce limit of 5, for an image width of 320 and an aspect ratio of 16:9. The sampled wavelength range is [380, 720] and the sampling step size is 5, so we sampled power values for the list of wavelengths [380, 385, ..., 720].
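That wavelength list maps directly onto Haskell's arithmetic-sequence syntax:

```haskell
-- 380, 385, ..., 720 nm in steps of 5: 69 sample wavelengths.
wavelengths :: [Double]
wavelengths = [380, 385 .. 720]

main :: IO ()
main = print (length wavelengths)  -- 69
```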
We use a slightly more flexible approach to rotations than the original books. Basically, rotation is done using rotation matrices constructed from the angle and axis information provided during the setup of the Rotatable type. Though this could be generalized to arbitrary axes, we currently support only the X, Y, and Z axes, which are passed as a RotationAxis type. Also, our rotations are inverted with respect to the book: what is clockwise there is counter-clockwise in our case.
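A sketch of building such a matrix from an angle and an axis follows. The `RotationAxis` name comes from the text, but the constructors, the row-major `Mat3` representation, and the sign convention here are illustrative assumptions rather than the repository's actual code:

```haskell
-- Hypothetical axis type; the repo's RotationAxis may differ.
data RotationAxis = RX | RY | RZ

type Mat3 = [[Double]]  -- row-major 3x3 matrix

-- Standard counter-clockwise rotation matrices (angle in degrees);
-- flipping the sign of the angle would give the book's direction.
rotationMatrix :: RotationAxis -> Double -> Mat3
rotationMatrix axis degrees =
  let t = degrees * pi / 180
      c = cos t
      s = sin t
  in case axis of
       RX -> [[1, 0, 0], [0, c, -s], [0, s, c]]
       RY -> [[c, 0, s], [0, 1, 0], [-s, 0, c]]
       RZ -> [[c, -s, 0], [s, c, 0], [0, 0, 1]]

-- Apply the matrix to a point.
apply :: Mat3 -> [Double] -> [Double]
apply m v = [sum (zipWith (*) row v) | row <- m]

main :: IO ()
main = print (apply (rotationMatrix RY 90) [1, 0, 0])
```

For example, rotating the point (1, 0, 0) by 90 degrees about the Y axis under this convention sends it to approximately (0, 0, -1).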
I hope to keep the tracer as minimal but useful as possible. Here is a list of planned features:
- Loading assets with obj files
- Spectral rendering switch: done
- BVH acceleration structure: done but not tested.
- Multithreaded rendering: this is now as easy as passing -N3 as a runtime option, since most of the code is composed of pure functions.