Starting Scene: I added an OpenGL yellow dot to indicate the location of the point light in the rendering preview. The scene also includes an HDR environment map.
I wanted to add a sphere around the point light so that it would show up in the rendered scene. When my ray tracer hits the light sphere, it just returns the color of the light.
Next I added Gamma correction.
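Gamma correction is just a per-channel power curve applied to the linear output; a minimal sketch, assuming the standard display gamma of 2.2 (the exact value my renderer uses isn't shown here):

```python
def gamma_correct(color, gamma=2.2):
    """Apply gamma correction to a linear RGB color with channels in [0, 1]."""
    return tuple(max(0.0, c) ** (1.0 / gamma) for c in color)

# Linear mid-gray comes out noticeably brighter after correction.
print(gamma_correct((0.5, 0.5, 0.5)))
```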
Then I added Photographic Tonemapping.
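Photographic tonemapping here presumably refers to Reinhard's operator; a sketch of the basic L/(1+L) form, without the key or white-point extensions:

```python
def reinhard_tonemap(luminance):
    """Basic Reinhard photographic tonemapping: maps [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

# HDR values above 1 get compressed below 1 instead of clipped.
for L in (0.5, 1.0, 2.0, 5.0):
    print(L, "->", reinhard_tonemap(L))
```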
Then I increased the light intensity to (2.0, 2.0, 2.0) (HDR).
I wanted to add a bloom filter, so I added a reflective ball in order to add
more HDR to the scene.
The first image is the original. The second image cuts out all the pixels
where r <= 1 || g <= 1 || b <= 1. The third is the "blur" of the second, according to the
slides. The fourth image is the original image plus the third.
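The four steps above (original, bright-pass, blur, additive combine) can be sketched as follows; the 3x3 box blur and the per-channel > 1 bright-pass test are stand-ins for whatever the slides actually specify:

```python
def bright_pass(img, threshold=1.0):
    """Keep only pixels with a channel above the threshold; zero the rest."""
    return [[px if any(c > threshold for c in px) else (0.0, 0.0, 0.0)
             for px in row] for row in img]

def box_blur(img):
    """Naive 3x3 box blur, clamped at the image borders."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            acc, n = [0.0, 0.0, 0.0], 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        for i in range(3):
                            acc[i] += img[yy][xx][i]
                        n += 1
            row.append(tuple(c / n for c in acc))
        out.append(row)
    return out

def add_images(a, b):
    """Pixel-wise sum: the original plus the blurred bright-pass."""
    return [[tuple(pa[i] + pb[i] for i in range(3))
             for pa, pb in zip(ra, rb)] for ra, rb in zip(a, b)]

def bloom(img):
    return add_images(img, box_blur(bright_pass(img)))
```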
Again, but with the light set to (5,5,5):
First trying the environment map way (w/o actually creating an environment map).
Dotting the reflected ray with the surface normal, and trying different thresholds.
First image is the wireframe. Second image is with a < 0.1 threshold. Third image is
with a < 0.15 threshold. The fourth image is with a < 0.2 threshold.
The algorithm works well for the sphere, but for very "blocky" objects, it
appears to perform poorly.
I believe this is because the chance that a ray will hit an edge exactly is quite low. The
sphere has many places where a ray can hit and have its dot product meet the threshold. For the blocks,
however, the ray has to hit almost exactly at the edge. I tried many threshold values, but the test
always catches either too much or too little.
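The threshold test above can be sketched like this. For a unit mirror reflection r = d - 2(d·n)n we get r·n = -(d·n), so dotting either the incoming or the reflected ray with the normal gives the same edge test:

```python
def is_silhouette(ray_dir, normal, threshold=0.15):
    """Flag a hit as a silhouette edge when the (unit) ray direction is
    nearly perpendicular to the (unit) surface normal."""
    dot = sum(d * n for d, n in zip(ray_dir, normal))
    return abs(dot) < threshold

# Grazing hit on a sphere: ray nearly tangent to the surface -> edge.
print(is_silhouette((1.0, 0.0, 0.0), (0.05, 0.9987, 0.0)))
# Head-on hit -> not an edge.
print(is_silhouette((1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)))
```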
Same algorithm for Sponza scene:
Some of my attempts ...
Here's a picture of non-bouncing photons from the light source:
Then here is one showing the photons after one bounce:
Here are 5 rays that show the path from the light to an object and then the bounce to a second object:
Here is just the irradiance rendered directly at each point:
Since the light is still (5,5,5), I put it down to (1,1,1) and render the irradiance:
Now here is the scene where 100 random rays are shot at each point to gather the irradiance:
I see that my edge detection is still on, so I turn it off and put it
down to 10 random rays per pixel
(for irradiance calculation) so that it renders faster:
I figure out that the black spots on my sphere are due to dividing by zero: when taking
the average indirect irradiance, I divided by the number of rays that hit an object, which
could be zero. I fixed that and rendered this (100 random rays per pixel):
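With the division-by-zero fix, the gather loop might look like this; `trace_radiance` and the rejection-sampled hemisphere are placeholders for the renderer's own routines:

```python
import math
import random

def sample_hemisphere(normal):
    """Uniform direction on the hemisphere around `normal` (rejection sampling)."""
    while True:
        d = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
        if sum(c * c for c in d) > 1.0:
            continue
        if sum(c * n for c, n in zip(d, normal)) > 0.0:
            length = math.sqrt(sum(c * c for c in d))
            return tuple(c / length for c in d)

def gather_irradiance(point, normal, trace_radiance, num_rays=100):
    """Average the radiance of the rays that actually hit something.
    The fix: divide by the hit count, and return black when it is zero."""
    total, hits = [0.0, 0.0, 0.0], 0
    for _ in range(num_rays):
        d = sample_hemisphere(normal)
        radiance = trace_radiance(point, d)  # None when the ray misses everything
        if radiance is not None:
            for i in range(3):
                total[i] += radiance[i]
            hits += 1
    if hits == 0:  # previously: division by zero here caused the black spots
        return (0.0, 0.0, 0.0)
    return tuple(t / hits for t in total)
```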
This is indirect illumination only:
Toshiya explained that I was doing some things wrong (e.g. I forgot to re-enable ...),
so here are new images with some fixes implemented:
(For the second one, I added irradiance to places in shadow, and I also added more photons and samples.)
Making a Scene:
I know I can't just present the Cornell Box as my final scene, so I went looking for models on the Internet and found this:
I had to go and edit the model to add color to it. One problem, however, is that
our current object loader
does not support material color loading, so I had to work on adding that. I started by dissecting the model into parts
and applying the material by hand.
Eventually I figured out, though, that this method wasn't optimal, so I changed
the loader. Here's an image of the loader loading the model but not yet reading
the material file:
After getting materials to be properly read and loaded, I ended up with this (I added the color to the model myself):
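Reading diffuse colors from a Wavefront .mtl file mostly amounts to parsing `newmtl` and `Kd` lines; a minimal sketch that ignores everything else the format supports:

```python
def parse_mtl(text):
    """Map material names to diffuse RGB colors from .mtl file contents."""
    materials, current = {}, None
    for line in text.splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue  # skip blank lines and comments
        if parts[0] == "newmtl":
            current = parts[1]
            materials[current] = (1.0, 1.0, 1.0)  # default diffuse until Kd seen
        elif parts[0] == "Kd" and current is not None:
            materials[current] = tuple(float(v) for v in parts[1:4])
    return materials

print(parse_mtl("newmtl red\nKd 0.8 0.1 0.1\n"))
```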
I decided I wanted more polygons, so I used a model editor to run a smoothing algorithm: once for the left image and twice for the right image.
Next I found a mechanical owl that could work well in my final scene:
Next I add a setting for the scene: a bedroom
Now I try to put it all together to create a cohesive scene:
I needed to add walls to enclose the room, but when I added them in the
model, I couldn't get it to work. After trying many things and debugging,
I finally realized that the normals were flipped the wrong way, so I manually flipped them (with a text editor) the right way around.
I enable shadows and the high-polygon model, and then I render with photon mapping:
I realize that the indirect illumination should be multiplied by the object's diffuse color:
With bloom enabled:
Photon mapping with cos distribution fixed:
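The cosine-distribution fix presumably means emitting or bouncing photons with a cosine-weighted hemisphere distribution rather than a uniform one; a standard sketch using Malley's method (sample a unit disk, project up):

```python
import math
import random

def cosine_weighted_direction(tangent, bitangent, normal):
    """Sample a direction with pdf cos(theta)/pi around `normal`.
    The three frame vectors are assumed orthonormal."""
    r = math.sqrt(random.random())        # disk radius, area-uniform
    phi = 2.0 * math.pi * random.random()
    x, y = r * math.cos(phi), r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))  # project up to the hemisphere
    return tuple(x * t + y * b + z * n
                 for t, b, n in zip(tangent, bitangent, normal))
```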
Cel shading on just the owl:
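Cel shading quantizes the diffuse term into a few flat bands; a minimal sketch (the band count is my choice for illustration, not necessarily what I used):

```python
import math

def cel_shade(n_dot_l, bands=3):
    """Quantize the clamped diffuse term into discrete flat bands."""
    d = max(0.0, n_dot_l)
    return math.ceil(d * bands) / bands

# Smooth shading values collapse onto a few levels.
print(cel_shade(0.05), cel_shade(0.4), cel_shade(0.9))
```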
Full rendering with various settings:
A more dense photon rendering:
Rendering with a ball in there:
I added toon shading to the owl to make it glow:
I didn't do the square lights properly in my last assignment, so I wanted to fix
it here. Here is some debugging visualization:
(random points on the square light are sampled from each shading point to determine visibility)
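That visibility test can be sketched as: pick random points on the rectangle, shoot shadow rays, and use the visible fraction. The `occluded` callback stands in for the scene's shadow-ray test:

```python
import random

def rect_light_visibility(point, corner, edge_u, edge_v, occluded, samples=16):
    """Fraction of random points on the rectangle light visible from `point`.
    The rectangle is corner + u * edge_u + v * edge_v for u, v in [0, 1]."""
    visible = 0
    for _ in range(samples):
        u, v = random.random(), random.random()
        light_pos = tuple(c + u * eu + v * ev
                          for c, eu, ev in zip(corner, edge_u, edge_v))
        if not occluded(point, light_pos):  # shadow ray reached the light
            visible += 1
    return visible / samples
```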
I wanted to do something special and make it a rectangle light that only points in one direction. Here is one of my renderings, where I realized it was pointing the wrong way:
(as you can see from the indirect illumination photons, almost none of them end up on the back wall, because they hit that first every time.)
Some debug renderings showing trajectory:
I also wanted the yellow rectangle where the light comes in to be shown in the
image (to make it look like brightness was coming out of there).
I added intersection testing against the light and code to replace those pixels with the bright light color.
I wanted a refractive sphere in the scene for caustics. Here is the first rendering, where the caustic photons hit the sphere and then go inside:
Here I complete it by having the caustic photons go through the sphere:
Fixing the white sphere:
A more complete rendering:
Trying to finish it up by putting it all together:
I added information while rendering that estimates completion time:
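The estimate can be as simple as extrapolating from progress so far; a sketch, assuming row-based progress (any progress unit works the same way):

```python
import time

def eta_seconds(start_time, rows_done, rows_total, now=None):
    """Estimate remaining render time by extrapolating the average row time."""
    if rows_done == 0:
        return float("inf")  # no data yet
    elapsed = (now if now is not None else time.time()) - start_time
    per_row = elapsed / rows_done
    return per_row * (rows_total - rows_done)

# Halfway through after 10 seconds -> about 10 seconds remain.
print(eta_seconds(start_time=0.0, rows_done=50, rows_total=100, now=10.0))
```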