Personal Projects

Path tracing and global illumination with Vulkan
(2022)

I used ray tracing and global illumination as a case study to learn Vulkan. Everything is bindless and indirect: resources are accessed through bindless descriptors, and draw calls are issued indirectly. You can switch between ray tracing and rasterization.

BRDFs are specified the PBR/glTF way: metalness, roughness, and ambient occlusion. This project has its own GitHub page, so please refer to that for more details.
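For a rough idea of what ‘the PBR/glTF way’ means, here is a minimal sketch of how the metallic-roughness parameters typically map to the terms a shader uses. The struct, the names, and the 0.04 dielectric reflectance follow the usual glTF convention; none of this is code from the project itself.

```cpp
// Hypothetical sketch of glTF metallic-roughness inputs and the derived
// shading terms, following the usual glTF convention.
struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t)
{
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

struct GltfMaterial {
    Vec3  baseColor;        // albedo for dielectrics, reflectance tint for metals
    float metalness;        // 0 = dielectric, 1 = metal
    float roughness;        // microfacet roughness
    float ambientOcclusion; // scales indirect lighting
};

// Per the usual glTF convention: dielectrics reflect ~4% at normal incidence,
// metals take their reflectance color from baseColor and have no diffuse term.
void deriveShadingTerms(const GltfMaterial& m, Vec3& diffuse, Vec3& f0)
{
    const Vec3 dielectricF0 = { 0.04f, 0.04f, 0.04f };
    f0      = lerp(dielectricF0, m.baseColor, m.metalness);
    diffuse = lerp(m.baseColor, Vec3{ 0, 0, 0 }, m.metalness);
}
```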


Radiosity engine
(2001, Advanced Graphics course, SUNY Stony Brook)

The radiosity engine uses the ‘hemicube’ method for the form factor calculation. Radiosity methods take several iterations to converge, as indicated on the right.
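For context, here is a minimal sketch of the gathering iteration that drives that convergence, assuming the hemicube pass has already filled the form factor matrix; the names and layout are illustrative, not the original engine’s code.

```cpp
#include <vector>

// One radiosity gathering iteration. F[i][j] is the form factor from patch i
// to patch j (the fraction of energy leaving i that arrives at j), as
// produced by the hemicube pass.
struct Patch {
    float emission;     // self-emitted energy
    float reflectance;  // diffuse reflectivity in [0, 1)
    float radiosity;    // current estimate, refined each iteration
};

void gatherIteration(std::vector<Patch>& patches,
                     const std::vector<std::vector<float>>& F)
{
    std::vector<float> next(patches.size());
    for (size_t i = 0; i < patches.size(); ++i) {
        float gathered = 0.0f;
        for (size_t j = 0; j < patches.size(); ++j)
            gathered += F[i][j] * patches[j].radiosity;
        // B_i = E_i + rho_i * sum_j F_ij * B_j
        next[i] = patches[i].emission + patches[i].reflectance * gathered;
    }
    for (size_t i = 0; i < patches.size(); ++i)
        patches[i].radiosity = next[i];
}
```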


Volume Renderer
(2001, Visualization course, SUNY Stony Brook)


The volume renderer supported X-Ray, MIP, Iso-Surface and Compositing algorithms with Phong shading. Its interface was extensible, so it was possible to add custom lighting/ray marching algorithms.
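As an illustration of the compositing mode, here is a minimal front-to-back ray marching sketch; the volume sampler and transfer function are toy stand-ins for the pluggable pieces, not the original implementation.

```cpp
#include <cmath>

struct RGBA { float r, g, b, a; };

// Toy volume: a soft sphere of radius 0.4 centered at the origin.
float sampleVolume(float x, float y, float z)
{
    float d = std::sqrt(x * x + y * y + z * z);
    return d < 0.4f ? 1.0f - d / 0.4f : 0.0f;
}

// Toy transfer function: map density to a warm color and an opacity.
RGBA classify(float density)
{
    return { density, density * 0.5f, density * 0.1f, density * 0.2f };
}

// March from 'origin' along 'dir', compositing front to back and stopping
// early once the ray is nearly opaque.
RGBA marchRay(const float origin[3], const float dir[3], float tMax, float dt)
{
    RGBA acc = { 0, 0, 0, 0 };
    for (float t = 0.0f; t < tMax && acc.a < 0.99f; t += dt) {
        float density = sampleVolume(origin[0] + t * dir[0],
                                     origin[1] + t * dir[1],
                                     origin[2] + t * dir[2]);
        RGBA s = classify(density);
        float w = (1.0f - acc.a) * s.a; // front-to-back 'over' operator
        acc.r += w * s.r;
        acc.g += w * s.g;
        acc.b += w * s.b;
        acc.a += w;
    }
    return acc;
}
```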

This renderer was also used to generate the results in this paper: ‘Splatting with Shadows’ (K. Mueller, M. Nulkar – Volume Graphics ’01).

Of course, today it’s possible to do ray casting and volume rendering directly on the GPU, as we did in the Volume Rendering module of NVSG (now SceniX).

Terrain rendering
(2010)

Here are some videos showing an integration of Ogre3D with the Advantage Terrain Library. Just something I was experimenting with.

I replaced the threading from the Advantage SDK sample (it uses a thin wrapper around Win32-specific threading functions) with Ogre’s Work Queue interface for performing background loading/work. It’s pretty fast, even when the draw distance spans almost the full breadth of the data set. For instance, the Puget data set renders at ~250 fps for about 100 batches, with 8x full-screen anti-aliasing at a resolution of 1920×1080.
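For flavor, here is a generic sketch of the background-loading pattern involved: a worker thread draining a request queue, which is the same idea Ogre’s Work Queue wraps behind its request/response handlers. This is illustrative standard C++, not Ogre’s actual API.

```cpp
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>

// A worker thread drains a queue of load jobs so disk I/O and decompression
// never stall the render thread. Names here are hypothetical.
class TileLoaderQueue {
public:
    TileLoaderQueue() : worker_([this] { run(); }) {}
    ~TileLoaderQueue() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_one();
        worker_.join();
    }

    // Called from the render thread, e.g. when the camera nears a new tile.
    void requestLoad(std::function<void()> loadJob) {
        { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(loadJob)); }
        cv_.notify_one();
    }

private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job(); // heavy loading work happens off the render thread
        }
    }

    std::mutex m_;
    std::condition_variable cv_;
    std::queue<std::function<void()>> jobs_;
    bool done_ = false;
    std::thread worker_;
};
```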

BRDF-based rendering
(2005)

BRDFs are four-dimensional functions that describe exactly how light is reflected off the surface of a material, given the lighting and viewing directions. This allows you to represent materials more accurately. Even with GPU shaders, computing this is slow, and computing it offline and storing it in a 3D texture is not feasible, because the function is four-dimensional.

The solution is to reformulate the four-dimensional function as an operation between two 2-dimensional functions. These 2D functions are stored in two cube maps and combined appropriately in the shader (see the sketch after the list below). As for the original models themselves, we chose the BRDF models most appropriate to the material at hand. These included variations of:
- the Cook-Torrance model,
- Schlick’s approximation for the Fresnel terms,
- microfacet-based BRDF models.
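Here is a toy sketch of the factorization idea, with Schlick’s Fresnel term as one of the ingredients. The factor functions below stand in for the two cube map lookups; the actual factors depended on the BRDF model at hand, so treat everything here as illustrative.

```cpp
#include <algorithm>
#include <cmath>

// Toy sketch of the factored-BRDF idea: approximate the 4D BRDF f(wi, wo)
// by a separable product p(wi) * q(wo). On the GPU, p and q were baked into
// two cube maps indexed by direction; here they are plain functions so the
// example is self-contained.
struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Schlick's approximation for the Fresnel term:
// F(theta) = F0 + (1 - F0) * (1 - cos(theta))^5
float fresnelSchlick(float cosTheta, float F0)
{
    return F0 + (1.0f - F0) * std::pow(1.0f - cosTheta, 5.0f);
}

// Stand-ins for the two cube map lookups (each a 2D function of direction).
float lookupP(const Vec3& wi, const Vec3& n)
{
    return std::max(dot(wi, n), 0.0f);
}
float lookupQ(const Vec3& wo, const Vec3& n)
{
    return fresnelSchlick(std::max(dot(wo, n), 0.0f), 0.04f);
}

// The shader recombines the two 2D lookups with a multiply.
float evalFactoredBRDF(const Vec3& wi, const Vec3& wo, const Vec3& n)
{
    return lookupP(wi, n) * lookupQ(wo, n);
}
```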


Genome
(2005-2010)

The ‘genome engine’ is a collection of classes and design patterns that I found myself using recurrently across my projects. It uses these components:
– Ogre3D rendering engine.
– MyGUI for overlays.
– Bullet for physics.
– wxWidgets for window UI.
– Boost (e.g. signals and slots; see the sketch after this list).
– Loki (for factories).
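As a flavor of the signals-and-slots glue, here is a minimal sketch using Boost.Signals2; the event and handler names are hypothetical, purely for illustration.

```cpp
#include <boost/signals2.hpp>
#include <iostream>
#include <string>

int main()
{
    // A signal decouples the sender from its listeners: the editor UI and
    // the physics layer can both react to the same engine event without
    // knowing about each other.
    boost::signals2::signal<void (const std::string&)> onEntitySpawned;

    onEntitySpawned.connect([](const std::string& name) {
        std::cout << "editor: refresh scene tree for " << name << "\n";
    });
    onEntitySpawned.connect([](const std::string& name) {
        std::cout << "physics: create rigid body for " << name << "\n";
    });

    onEntitySpawned("player"); // fires both slots in connection order
    return 0;
}
```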

Here are the Genome Editor and a silly little FPS demo. Both apps use the same code base:


DX11 Tessellation
(2011)

Bezier Curve:
The input consists of 4 control points.
– Vertex Shader: Passes these control points straight through.
– Hull Shader: Passes its input straight through. One can change the basis here if needed. 
– Tessellator: Configured to tessellate lines.
- Domain Shader: Generates the actual points for the samples generated by the tessellator. The generation is given by the equation for the cubic Bezier curve:

B(t) = (1 - t)^3 P0 + 3 (1 - t)^2 t P1 + 3 (1 - t) t^2 P2 + t^3 P3

The u parameter (t in the equation above) is received from the tessellator. The Domain Shader has all the control points (4 in this case) available, so computing the curve point is as simple as plugging the 4 control points into the equation above. And that’s basically it!
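For concreteness, here is a CPU-side sketch of what the Domain Shader computes per sample (the real shader is HLSL; this C++ stand-in is just illustrative):

```cpp
// Evaluates one tessellator sample t in [0, 1] against the 4 control points,
// exactly as the Domain Shader does. Vec3 and the names are illustrative.
struct Vec3 { float x, y, z; };

Vec3 operator*(float s, const Vec3& v) { return { s * v.x, s * v.y, s * v.z }; }
Vec3 operator+(const Vec3& a, const Vec3& b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3
Vec3 evalCubicBezier(const Vec3 P[4], float t)
{
    float s = 1.0f - t;
    return (s * s * s) * P[0] + (3.0f * s * s * t) * P[1]
         + (3.0f * s * t * t) * P[2] + (t * t * t) * P[3];
}
```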

NURBS Circle:
Almost the same as the Bezier curve, though we have 9 control points in this case. The computation of a NURBS curve and its basis functions is a rather involved process and beyond the scope of these experiments; I took the code from my assignments and ported it to the Domain Shader instead. You can look in the original Piegl and Tiller book for how they arrive at the control points, knots, etc.
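For reference, the basis functions follow the standard Cox-de Boor recursion from Piegl and Tiller; below is a compact, unoptimized sketch of the recursion and of a rational (NURBS) curve point evaluation. It is illustrative, not the ported assignment code.

```cpp
#include <vector>

// Cox-de Boor recursion: the i-th B-spline basis function of degree p over
// the given knot vector, with 0/0 treated as 0.
float basis(int i, int p, float u, const std::vector<float>& knots)
{
    if (p == 0)
        return (knots[i] <= u && u < knots[i + 1]) ? 1.0f : 0.0f;

    float left = 0.0f, right = 0.0f;
    float d1 = knots[i + p] - knots[i];
    if (d1 > 0.0f)
        left = (u - knots[i]) / d1 * basis(i, p - 1, u, knots);
    float d2 = knots[i + p + 1] - knots[i + 1];
    if (d2 > 0.0f)
        right = (knots[i + p + 1] - u) / d2 * basis(i + 1, p - 1, u, knots);
    return left + right;
}

struct Vec3 { float x, y, z; };

// Rational curve point: C(u) = sum(N_i,p(u) w_i P_i) / sum(N_i,p(u) w_i)
Vec3 evalNurbsCurve(float u, int degree,
                    const std::vector<Vec3>&  P,     // control points
                    const std::vector<float>& w,     // weights
                    const std::vector<float>& knots)
{
    Vec3 num = { 0, 0, 0 };
    float den = 0.0f;
    for (size_t i = 0; i < P.size(); ++i) {
        float nw = basis(static_cast<int>(i), degree, u, knots) * w[i];
        num.x += nw * P[i].x;
        num.y += nw * P[i].y;
        num.z += nw * P[i].z;
        den   += nw;
    }
    return { num.x / den, num.y / den, num.z / den };
}
```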

NURBS Cylinder (Surface):
Conceptually, the NURBS cylinder is just a NURBS circle, swept by an amount equal to the height of the cylinder along the axis of the cylinder. Again, the Piegl and Tiller book shows how they arrive at the control points, knots, etc.

The input here is 9 (circle) × 2 = 18 control points. The domain is now a ‘quad’, and the Domain Shader now evaluates the surface/patch points using the u, v samples (from the tessellator) and the basis equations. Also, the tessellator is set up with the appropriate tessellation factor for the patch. Here are some images of the cylinder at various tessellation factors:

My original course assignments included surfaces of revolution as well (e.g. Torus, Vase, etc.), but these require too many control points to render with the DirectX API as a single patch. The maximum, if I understood right, is 32 control points per patch (D3D11_PRIMITIVE_TOPOLOGY_32_CONTROL_POINT_PATCHLIST).
I suppose you could divide the surfaces into multiple patches, but again, that is beyond the scope of these experiments, not to mention I had forgotten a fair bit of NURBS math by the time I did them.