The Bean Ray Tracer
Bean is a Monte Carlo ray tracer
which uses photon
mapping to simulate full global illumination. It takes planes, spheres
and triangles as geometric primitives, and can render diffuse,
specular, Phong, dielectric and generalized
Lambertian (Oren-Nayar) surfaces. Texture mapping is performed using
bilinear interpolation. The gray levels of an arbitrary image can be
used as the bump map for a surface. Sublinear ray-object intersection is
implemented using Bounding Volume Hierarchies and BSP Trees. Point, area
and directional light sources with a finite angular extent are
available for illuminating the scene. HDRI illumination/radiance maps
can also be used for lighting up scenes.
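As an illustration of the bilinear texture lookup mentioned above, here is a minimal sketch in Python (the helper name and the pure-Python texture representation are mine for illustration; Bean's actual C++ implementation differs):

```python
def bilinear_sample(texture, u, v):
    """Sample a texture (a 2D list of values, rows x cols) at continuous
    coordinates (u, v) in [0, 1] using bilinear interpolation."""
    rows, cols = len(texture), len(texture[0])
    # Map (u, v) to continuous pixel coordinates.
    x = u * (cols - 1)
    y = v * (rows - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, cols - 1), min(y0 + 1, rows - 1)
    fx, fy = x - x0, y - y0
    # Blend the four neighbouring texels: first along x, then along y.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

The same lookup, applied to the gray levels of an image, gives a smoothly varying height field suitable for bump mapping.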
For global illumination it uses
photon mapping to estimate the indirect illumination at various points
in the scene. Since I had no need for caustics in my scene, the current
implementation provides only the standard and volume photon maps. The
caustic photon map will just have to wait its turn. While not shown in
any of these images, the renderer also supports participating media: it
uses up to two lobes of the Henyey-Greenstein phase function
for scattering and renders the media using ray marching.
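A minimal sketch of the Henyey-Greenstein phase function and a two-lobe mixture, assuming the standard normalised form (the function names and the mixture weight `w` are illustrative, not Bean's API):

```python
import math

def henyey_greenstein(cos_theta, g):
    """Single-lobe Henyey-Greenstein phase function; g in (-1, 1)
    controls anisotropy (g > 0 forward scattering, g = 0 isotropic)."""
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)

def two_lobe_hg(cos_theta, g1, g2, w):
    """Weighted mixture of two HG lobes (w in [0, 1]), a common way to
    combine forward and backward scattering in one phase function."""
    return (w * henyey_greenstein(cos_theta, g1)
            + (1.0 - w) * henyey_greenstein(cos_theta, g2))
```

With g = 0 both forms reduce to the isotropic phase function 1/(4π); with a positive g the lobe peaks in the forward direction.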
Rendering the Sponza Atrium
I wish I had the skills to
model, but I don't. Hence I decided to render someone else's model. The
scenes I rendered are based on a model of the Sponza Atrium graciously
made available by Marko
and can be downloaded from his website. The model has
approximately 65,000 triangles and comes with associated texture maps.
The scenes were rendered using
two kinds of light sources. I used a directional light source with a small
angular extent to simulate the incoming illumination from the sun; the
sun was given an angular extent of 0.01 radians. A hemispherical light
source was used to simulate the incoming lighting from the sky. The
hemisphere was stratified into 300 point light sources, which were
sampled at render time.
All images were rendered as high dynamic range images, then composited
and tone mapped using the vism tool.
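One way to stratify a hemisphere into point lights, as described above, is to split it into uniform (theta, phi) bins and place a light at each bin centre. This is a sketch under that assumption; the exact stratification used in Bean may differ:

```python
import math

def stratify_hemisphere(n_theta, n_phi):
    """Partition the upper hemisphere into n_theta x n_phi strata and
    return one unit direction (x, y, z), z > 0, per stratum centre."""
    directions = []
    for i in range(n_theta):
        theta = (i + 0.5) * (math.pi / 2.0) / n_theta   # polar angle
        for j in range(n_phi):
            phi = (j + 0.5) * (2.0 * math.pi) / n_phi   # azimuth
            directions.append((math.sin(theta) * math.cos(phi),
                               math.sin(theta) * math.sin(phi),
                               math.cos(theta)))
    return directions

# e.g. 15 x 20 = 300 strata, matching the 300 point lights in the text
lights = stratify_hemisphere(15, 20)
```

Each direction can then carry the sky radiance integrated over its stratum, so the 300 point lights together approximate the hemispherical source.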
Photon mapping with multiple light sources
A total of 2,000,000 photons
were used for each light source. The sky and the sun were used to render
separate images, which were then composited into a single image. This was
done because the sky and the sun have very different brightnesses (more
than a few orders of magnitude apart); rendering everything with a single
photon map would require shooting photons in proportion to the relative
brightness of each light, leading to an explosion in the photon count
without improving image quality very much.
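The photon-budget argument above can be made concrete with a toy calculation. Only the 2,000,000 photons per light comes from the text; the sun-to-sky brightness ratio of 1000 is an assumed number for illustration:

```python
def single_map_budget(base_photons, brightness_ratio):
    """Photons needed in one shared map when the brighter light must
    receive photons in proportion to its power, while the dimmer light
    still needs `base_photons` for an acceptable density estimate."""
    return base_photons + base_photons * brightness_ratio

# Assumed ratio of 1000 between sun and sky brightness:
shared = single_map_budget(2_000_000, 1000)   # over two billion photons
separate = 2_000_000 + 2_000_000              # two maps of 2M photons each
```

Rendering the two lights into separate images and compositing afterwards keeps each map at a manageable size.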
All the code is written in C++
and compiles cleanly on gcc 3.2.2. Since I am a total Python lover, any programming project
without some Python flavouring is simply not done, so I wrote the
entire ray tracer as an extension module. The entire renderer is thus
Python scriptable, and scene files are Python programs. The C++-to-Python
wrapping was done using Boost.Python. Pyste
rocks! Finally, you can't possibly use make for a Python project when
you have the joyous SCons around.