The CPU raytracer was a flawed project from my first year, which didn't produce the results I wanted due to numerous hardware issues. I therefore took it upon myself to redo the project with the knowledge I had gathered since.
In the end, the project is a Whitted-style raytracer that renders images from a loaded model, a skybox, and a material that is either configured by hand or loaded with the model. I also managed to add some basic camera effects, such as anti-aliasing and depth of field.
Group members: 1 graphics programmer
Span: November 2020 - January 2021
Software: Visual Studio 2019
- Designing the class structure.
- Implementing a BVH, adapted from the implementation in PBRT (3rd edition).
- Intersection methods for triangles, spheres, and boxes.
- Implementing the camera models for perspective and oblique rendering.
- Implementing camera effects such as anti-aliasing and depth of field.
- Skybox sampling.
- Multithreading to accelerate the raytracing process.
- Implementing shading techniques:
  - Blinn-Phong (and normal coloring)
  - Beer's law
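As an illustration of the intersection methods mentioned above, here is a minimal sketch of a ray-sphere test (the names and types are illustrative, not the project's actual code). It solves the quadratic that results from substituting the ray equation into the sphere equation:

```cpp
#include <cmath>
#include <optional>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
};

// Returns the distance t along the ray to the nearest hit in front of the
// origin, or nothing if the ray misses the sphere.
std::optional<double> intersectSphere(const Vec3& origin, const Vec3& dir,
                                      const Vec3& center, double radius) {
    // Solve |origin + t*dir - center|^2 = radius^2, a quadratic in t.
    Vec3 oc = origin - center;
    double a = dir.dot(dir);
    double b = 2.0 * oc.dot(dir);
    double c = oc.dot(oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return std::nullopt;        // ray misses the sphere
    double sqrtD = std::sqrt(disc);
    double t = (-b - sqrtD) / (2.0 * a);        // try the nearer root first
    if (t < 0.0) t = (-b + sqrtD) / (2.0 * a);  // origin may be inside the sphere
    if (t < 0.0) return std::nullopt;           // sphere is behind the ray
    return t;
}
```

The triangle and box tests follow the same pattern: each returns the closest positive hit distance, or nothing on a miss.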
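The anti-aliasing listed above was done by supersampling; a common way to do this is jittered sampling, sketched below under that assumption (the names are illustrative, not taken from the project). Instead of shooting one ray through the pixel centre, several rays are shot through random positions inside the pixel and the resulting colours are averaged:

```cpp
#include <random>
#include <vector>

struct Sample { double x, y; };

// Generate n jittered sample positions inside the unit square of pixel
// (px, py); each position is later turned into a camera ray and the
// returned colours are averaged to produce the anti-aliased pixel.
std::vector<Sample> pixelSamples(int px, int py, int n, std::mt19937& rng) {
    std::uniform_real_distribution<double> jitter(0.0, 1.0);
    std::vector<Sample> samples;
    samples.reserve(n);
    for (int i = 0; i < n; ++i) {
        samples.push_back({px + jitter(rng), py + jitter(rng)});
    }
    return samples;
}
```

Depth of field works similarly, except the jitter is applied to the ray origin on a lens aperture rather than to the pixel position.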
Image context: the Stanford bunny with a diffuse green material.
The takeaway from this project: don't reinvent the wheel. Instead, do your research thoroughly, find implementations that have proven successful, and learn to understand and master them. A good example is the BVH: I needed a spatial subdivision structure, since the cost of intersecting a ray with the scene otherwise scales linearly with the number of primitives, so I used PBRT's implementation.
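To see why the naive approach scales linearly, consider a minimal sketch of closest-hit search without any acceleration structure (the primitive type here is a hypothetical stand-in for a real intersection test):

```cpp
#include <limits>
#include <vector>

// Hypothetical primitive: hitDistance stands in for the result of a real
// intersection test; a negative value means the ray missed it.
struct Primitive {
    double hitDistance;
};

// Without a BVH, every primitive must be tested for every ray, so the cost
// per ray grows linearly with scene size. A BVH cuts this to roughly
// logarithmic by culling whole subtrees whose bounding boxes the ray misses.
double closestHit(const std::vector<Primitive>& scene) {
    double best = std::numeric_limits<double>::infinity();
    for (const Primitive& p : scene) {
        if (p.hitDistance >= 0.0 && p.hitDistance < best)
            best = p.hitDistance;
    }
    return best;  // infinity if nothing was hit
}
```

For a model like the 12-million-triangle dragon, this linear scan per ray is what makes a BVH indispensable.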
Apart from that, the raytracer also supports refraction combined with Beer's law, though this doesn't produce the expected results. The reason has remained unclear to me, and due to time constraints I did not manage to resolve the issue.
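For reference, Beer's law attenuates light exponentially with the distance it travels through an absorbing medium, per colour channel. A minimal sketch (the absorption values and names are illustrative, not the project's):

```cpp
#include <cmath>

struct Color {
    double r, g, b;
};

// Beer's law: light traversing distance d through a medium with absorption
// coefficients (per channel) is attenuated by exp(-absorption * d).
Color beerAttenuate(const Color& light, const Color& absorption, double d) {
    return {
        light.r * std::exp(-absorption.r * d),
        light.g * std::exp(-absorption.g * d),
        light.b * std::exp(-absorption.b * d),
    };
}
```

One detail that is easy to get wrong: only the path length travelled *inside* the medium should be used for `d`, not the full ray distance, so that may be worth checking when debugging results like these.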
Looking back, this project was a good learning experience and incredibly motivating to work on. It taught me how to read academic papers and when and where to implement custom solutions. It is a shame, however, that the refraction still doesn't work; with more time I would definitely look into the issue.
Image context: the Stanford "OBJ" dragon (12 million triangles) with a diffuse grey material, rendered with 8 anti-aliasing samples and 8 depth-of-field samples per pixel. The dragon stands on a reflective plane.