Realtime Interactive Rasterizor - OpenGL

This project implements a real-time interactive rasterizer with 4 BRDFs, adaptive level of detail, implicit/explicit shape rendering, advanced lighting, interactive camera control, FBOs, FXAA, etc. It is associated with the Brown CSCI 2230 Computer Graphics course, and all project handouts can be found here.

Part One: Lights, Camera

The handout for this part can be found here.

BRDFs

Run the program, open the specified .json file, follow the instructions to set the parameters, and save the image with the specified file name using the "Save image" button in the UI. It should automatically suggest the correct directory - again, be sure to follow the instructions in the left column to set the file name. Once you save the images, they will appear in the table below.

If your program can't find certain files or you aren't seeing your output images appear, make sure to:

  1. Set your working directory to the project directory
  2. Clone the scenefiles submodule. If you forgot to do this when initially cloning this repository, run git submodule update --init --recursive in the project directory
File/Method To Produce Output | Expected Output | Your Output
Input: unit_cone.json
Output: unit_cone.png
Parameters: (5, 5, 0.1, 100)
Place unit_cone.png in student_outputs/lights-camera/required folder
Input: unit_cone_cap.json
Output: unit_cone_cap.png
Parameters: (5, 5, 0.1, 100)
Place unit_cone_cap.png in student_outputs/lights-camera/required folder
Input: unit_cube.json
Output: unit_cube.png
Parameters: (5, 5, 0.1, 100)
Place unit_cube.png in student_outputs/lights-camera/required folder
Input: unit_cylinder.json
Output: unit_cylinder.png
Parameters: (5, 5, 0.1, 100)
Place unit_cylinder.png in student_outputs/lights-camera/required folder
Input: unit_sphere.json
Output: unit_sphere.png
Parameters: (5, 5, 0.1, 100)
Place unit_sphere.png in student_outputs/lights-camera/required folder
Input: unit_cone.json
Output: unit_cone_min.png
Parameters: (1, 3, 0.1, 100)
Place unit_cone_min.png in student_outputs/lights-camera/required folder
Input: unit_cone_cap.json
Output: unit_cone_cap_min.png
Parameters: (1, 3, 0.1, 100)
Place unit_cone_cap_min.png in student_outputs/lights-camera/required folder
Input: unit_cube.json
Output: unit_cube_min.png
Parameters: (1, 1, 0.1, 100)
Place unit_cube_min.png in student_outputs/lights-camera/required folder
Input: unit_cylinder.json
Output: unit_cylinder_min.png
Parameters: (1, 3, 0.1, 100)
Place unit_cylinder_min.png in student_outputs/lights-camera/required folder
Input: unit_sphere.json
Output: unit_sphere_min.png
Parameters: (2, 3, 0.1, 100)
Place unit_sphere_min.png in student_outputs/lights-camera/required folder
Input: parse_matrix.json
Output: parse_matrix.png
Parameters: (3, 5, 0.1, 100)
Place parse_matrix.png in student_outputs/lights-camera/required folder
Input: ambient_total.json
Output: ambient_total.png
Parameters: (5, 5, 0.1, 100)
Place ambient_total.png in student_outputs/lights-camera/required folder
Input: diffuse_total.json
Output: diffuse_total.png
Parameters: (5, 5, 0.1, 100)
Place diffuse_total.png in student_outputs/lights-camera/required folder
Input: specular_total.json
Output: specular_total.png
Parameters: (5, 5, 0.1, 100)
Place specular_total.png in student_outputs/lights-camera/required folder
Input: phong_total.json
Output: phong_total.png
Parameters: (5, 5, 0.1, 100)
Place phong_total.png in student_outputs/lights-camera/required folder
Input: directional_light_1.json
Output: directional_light_1.png
Parameters: (5, 5, 0.1, 100)
Place directional_light_1.png in student_outputs/lights-camera/required folder
Input: directional_light_2.json
Output: directional_light_2.png
Parameters: (10, 10, 0.1, 100)
Place directional_light_2.png in student_outputs/lights-camera/required folder
Input: phong_total.json
Output: phong_total_near_far.png
Parameters: (5, 5, 9.5, 12)
Place phong_total_near_far.png in student_outputs/lights-camera/required folder
Input: directional_light_1.json
Output: directional_light_1_near_far.png
Parameters: (25, 25, 8, 10)
Place directional_light_1_near_far.png in student_outputs/lights-camera/required folder

Design Choices

  • Camera data: Calculation of the view matrix and projection matrix is implemented in render/camera.cpp.
  • Shape implementations: Shape implementations are in shapes/. They were validated in Lab 8 for correctness of discretization and normal calculation.
  • Shaders: Shader setup and uniform passing are implemented. One design choice to note: only initialization happens in initializeGL(), while value passing and updates happen in sceneChanged(), so that scenefile changes are applied correctly.
  • Tessellation: Shape discretization changes with parameter toggles in all situations (including under adaptive level of detail).
  • Software engineering, efficiency, & stability:
    • Repeated code is factored into functions.
    • When updating scenes or parameters, only the necessary changes are computed.
    • When changing scenes, clean-up functions are called to clear scene objects.
    • The program can quickly render complex scenes like recursive_sphere_7.json.
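The view-matrix construction mentioned in the camera bullet can be sketched as follows. This is a minimal stand-alone version, not the actual render/camera.cpp code; the row-major layout and names are illustrative.

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Build a camera basis (u, v, w) from the look and up vectors, then compose
// rotation and translation into a view matrix. Row-major 4x4 for illustration.
using Vec3 = std::array<float, 3>;
using Mat4 = std::array<std::array<float, 4>, 4>;

static Vec3 normalize(Vec3 a) {
    float l = std::sqrt(a[0]*a[0] + a[1]*a[1] + a[2]*a[2]);
    return {a[0]/l, a[1]/l, a[2]/l};
}
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}
static float dot(Vec3 a, Vec3 b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

Mat4 viewMatrix(Vec3 pos, Vec3 look, Vec3 up) {
    Vec3 w = normalize({-look[0], -look[1], -look[2]});  // camera "back" axis
    Vec3 u = normalize(cross(up, w));                    // camera "right" axis
    Vec3 v = cross(w, u);                                // camera "up" axis
    return {{{u[0], u[1], u[2], -dot(u, pos)},
             {v[0], v[1], v[2], -dot(v, pos)},
             {w[0], w[1], w[2], -dot(w, pos)},
             {0.f,  0.f,  0.f,  1.f}}};
}
```

A camera at the origin looking down -z with +y up yields the identity matrix, which is a convenient sanity check.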

Extra Credit Features

Adaptive level of detail

  • Number of objects in the scene

When the number of objects in the scene exceeds 10, the discretization level is scaled by the factor 1.0 / (log(0.1 * (renderScene.sceneMetaData.shapes.size() - 10) + 1) + 1), so that the discretization level decreases smoothly as the number of objects changes.

Note that the lower bound 10 can be changed, and the scaling function generalizes to 1.0 / (log(0.1 * (renderScene.sceneMetaData.shapes.size() - lower_bound) + 1) + 1). In addition, a commented-out fixed factor of 0.5 is available if a simplified version is preferred.
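The object-count scaling above can be sketched as a small helper. The function name is illustrative; the formula is the one quoted above.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>

// Returns the factor by which discretization parameters are scaled once the
// scene holds more than lowerBound shapes. Below the bound, no scaling occurs;
// above it, the factor decreases smoothly (logarithmically) toward zero.
double lodCountFactor(std::size_t numShapes, std::size_t lowerBound = 10) {
    if (numShapes <= lowerBound) return 1.0;
    return 1.0 / (std::log(0.1 * (numShapes - lowerBound) + 1.0) + 1.0);
}
```

Because the factor is continuous at the bound and monotonically decreasing, adding or removing objects never causes a visible jump in tessellation level.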

File/Method To Produce Output | Whole Image | Zoom In
Input: recursive_sphere_2.json
Output: recursive_sphere_2.png
Parameters: (25, 25, 0.1, 100)
Input: recursive_sphere_3.json
Output: recursive_sphere_3.png
Parameters: (25, 25, 0.1, 100)
Input: recursive_sphere_4.json
Output: recursive_sphere_4.png
Parameters: (25, 25, 0.1, 100)
Input: recursive_sphere_5.json
Output: recursive_sphere_5.png
Parameters: (25, 25, 0.1, 100)
Input: recursive_sphere_6.json
Output: recursive_sphere_6.png
Parameters: (25, 25, 0.1, 100)
Input: recursive_sphere_7.json
Output: recursive_sphere_7.png
Parameters: (25, 25, 0.1, 100)
Input: recursive_sphere_8.json
Output: recursive_sphere_8.png
Parameters: (25, 25, 0.1, 100)
  • Distance from the object to the camera

The discretization parameters are scaled by calculateDistanceFactors(), which computes the minimum distance minDistance over all shapes in the scene and scales the discretization parameters of every other primitive by 1 / (distance / minDistance). After scaling, the discretization parameters are clamped to a lower bound so that distant primitives still have reasonable shapes.
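A minimal sketch of this distance-based scaling, with illustrative names rather than the actual calculateDistanceFactors() signature:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Given each shape's distance to the camera, return a per-shape scale factor:
// the nearest shape keeps full detail (factor 1.0), farther shapes are scaled
// down proportionally to their distance ratio.
std::vector<double> distanceFactors(const std::vector<double>& distances) {
    std::vector<double> factors;
    if (distances.empty()) return factors;
    double minDistance = *std::min_element(distances.begin(), distances.end());
    for (double d : distances)
        factors.push_back(1.0 / (d / minDistance));
    return factors;
}

// After scaling, clamp the discretization parameter to a lower bound so that
// distant primitives still have a reasonable shape. minParam is illustrative.
int applyFactor(int param, double factor, int minParam = 3) {
    return std::max(minParam, static_cast<int>(param * factor));
}
```

The clamp is what prevents far-away spheres from degenerating into near-flat polygons.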

The table below compares extra_credit2 turned off and on. When turned on, more distant spheres are discretized more coarsely than nearer ones; when turned off, all spheres share the same discretization level.

File/Method To Produce Output | Turned off | Turned on
Input: recursive_sphere_5.json
Output: recursive_sphere_5.png
Parameters: (25, 25, 0.1, 100)

Custom Scene File

This custom scene file resembles a robot car with eye-like sensors.

File/Method To Produce Output | Your Output
Input: custom_scene_file.json
Output: custom_scene_file.png
Parameters: (25, 25, 0.1, 100)

Mesh Rendering

  • Implemented my own .obj file reader from scratch.
  • Handles .obj files that explicitly specify vn (vertex normal) info (e.g., dragon_mesh.obj) as well as files that do not (e.g., bunny_mesh.obj). In the latter case, the .obj reader automatically calculates the vertex normals.
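The fallback normal computation can be sketched as follows. This is a simplified stand-alone version of the usual technique: accumulate each face's (area-weighted) normal into its vertices, then normalize.

```cpp
#include <array>
#include <cassert>
#include <cmath>
#include <vector>

using Vec3 = std::array<float, 3>;

static Vec3 sub(Vec3 a, Vec3 b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}

// faces holds triangles as triples of vertex indices (0-based, unlike the
// 1-based indices in a raw .obj file).
std::vector<Vec3> computeVertexNormals(const std::vector<Vec3>& verts,
                                       const std::vector<std::array<int, 3>>& faces) {
    std::vector<Vec3> normals(verts.size(), {0.f, 0.f, 0.f});
    for (const auto& f : faces) {
        // Unnormalized face normal; its length is twice the triangle area,
        // so summing it gives an area-weighted average per vertex.
        Vec3 n = cross(sub(verts[f[1]], verts[f[0]]), sub(verts[f[2]], verts[f[0]]));
        for (int i : f)
            for (int k = 0; k < 3; ++k) normals[i][k] += n[k];
    }
    for (auto& n : normals) {
        float len = std::sqrt(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
        if (len > 0.f) for (float& c : n) c /= len;
    }
    return normals;
}
```

For a single triangle in the xy-plane, every vertex normal comes out as (0, 0, 1).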
File/Method To Produce Output | Your Output
Input: bunny_mesh.json
Output: bunny_mesh.png
Parameters: (5, 5, 0.1, 100)
Input: dragon_mesh.json
Output: dragon_mesh.png
Parameters: (5, 5, 0.1, 100)

Part Two: Interactive Control and FBO

The handout for this part can be found here.

Advanced Lighting, Interactive Camera Control, and FBOs

Important

Before generating expected outputs, make sure to:

  1. Set your working directory to the project directory
  2. From the project directory, run git submodule update --recursive --remote to update the scenefiles submodule.
  3. Change all instances of "lights-camera" in mainwindow.cpp to "action" (there should be 2 instances, one in MainWindow::onUploadFile and one in MainWindow::onSaveImage).

Run the program, open the specified .json file and follow the instructions to set the parameters.

Note that in some of the saved output images the camera appears slightly moved, but the on-screen results exactly match the expected output. This can be verified in a live demonstration.

Point and Spot Lights

File/Method To Produce Output | Expected Output | Your Output
Input: point_light_1.json
Output: point_light_1.png
Parameters: (5, 5, 0.1, 100)
Place point_light_1.png in student_outputs/action/required folder
Input: point_light_2.json
Output: point_light_2.png
Parameters: (5, 5, 0.1, 100)
Place point_light_2.png in student_outputs/action/required folder
Input: spot_light_1.json
Output: spot_light_1.png
Parameters: (5, 5, 0.1, 100)
Place spot_light_1.png in student_outputs/action/required folder
Input: spot_light_2.json
Output: spot_light_2.png
Parameters: (5, 5, 0.1, 100)
Place spot_light_2.png in student_outputs/action/required folder

Invert

File/Method To Produce Output | Expected Output | Your Output
Input: primitive_salad_1.json
Apply invert filter
Output: primitive_salad_1_invert.png
Parameters: (5, 5, 0.1, 100)
Place primitive_salad_1_invert.png in student_outputs/action/required folder

Grayscale

File/Method To Produce Output | Expected Output | Your Output
Input: primitive_salad_1.json
Apply grayscale filter
Output: primitive_salad_1_grayscale.png
Parameters: (5, 5, 0.1, 100)
Place primitive_salad_1_grayscale.png in student_outputs/action/required folder

Blur

File/Method To Produce Output | Expected Output | Your Output
Input: recursive_sphere_4.json
Apply blur filter
Output: recursive_sphere_4_blur.png
Parameters: (5, 5, 0.1, 100)
Place recursive_sphere_4_blur.png in student_outputs/action/required folder

Camera Translation

Instructions: Load chess.json. For about 1 second each in this order, press:

  • W, A, S, D to move in each direction by itself
  • W+A to move diagonally forward and to the left
  • S+D to move diagonally backward and to the right
  • Space to move up
  • Cmd/Ctrl to move down
Expected Output
Screen.Recording.2023-11-29.at.4.34.35.AM.mov
Your Output
camera_translation.mov

Camera Rotation

Instructions: Load chess.json. Take a look around!

Expected Output
Screen.Recording.2023-11-29.at.4.33.06.AM.mov
Your Output
camera_rotation.mov

Design Choices

Efficiency Optimization

  • Scene objects are regenerated only when Param1 or Param2 changes.
  • No inverse or transpose is computed in the vertex shader; the normal matrix is precomputed on the CPU and passed to the vertex shader as a uniform.
  • Texture images are reloaded only when the texture image filepath changes.
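The normal-matrix precomputation amounts to taking the inverse-transpose of the model matrix's upper-left 3x3 on the CPU, once per object instead of per vertex. A minimal sketch using plain arrays (the project presumably uses a glm equivalent):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat3 = std::array<std::array<float, 3>, 3>;

// Normal matrix = inverse-transpose(M) = cofactor(M) / det(M).
// Assumes M is non-singular, which holds for valid model transforms.
Mat3 normalMatrix(const Mat3& m) {
    Mat3 c;
    c[0][0] = m[1][1]*m[2][2] - m[1][2]*m[2][1];
    c[0][1] = m[1][2]*m[2][0] - m[1][0]*m[2][2];
    c[0][2] = m[1][0]*m[2][1] - m[1][1]*m[2][0];
    c[1][0] = m[0][2]*m[2][1] - m[0][1]*m[2][2];
    c[1][1] = m[0][0]*m[2][2] - m[0][2]*m[2][0];
    c[1][2] = m[0][1]*m[2][0] - m[0][0]*m[2][1];
    c[2][0] = m[0][1]*m[1][2] - m[0][2]*m[1][1];
    c[2][1] = m[0][2]*m[1][0] - m[0][0]*m[1][2];
    c[2][2] = m[0][0]*m[1][1] - m[0][1]*m[1][0];
    float det = m[0][0]*c[0][0] + m[0][1]*c[0][1] + m[0][2]*c[0][2];
    for (auto& row : c) for (float& v : row) v /= det;
    return c;
}
```

For a uniform scale of 2, the normal matrix is 0.5 times the identity, which is exactly why normals must not be transformed by the model matrix directly.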

Software Engineering

  • Error conditions (e.g., nonexistent texture files) are handled gracefully and produce results without crashing.
  • Common code is wrapped into functions to reduce redundancy.
  • Code is annotated with explanations of key steps.

Known Bugs

The main bug is in texture mapping: texels near the edges of a texture image map to slightly inaccurate uv locations, resulting in some high-frequency aliasing. All clamping is properly applied, so the issue likely stems from rounding error in vertex positions.

Extra Credit Features

More filters

A grayscale filter and a Sobel filter are implemented as extra per-pixel and multi-stage filters, respectively.

Per-Pixel Filter - Grayscale | Multi-Stage Filter - Sobel
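The two filter kinds can be sketched as follows: a per-pixel grayscale pass, and one stage of a multi-stage Sobel pass (the horizontal gradient Gx over a luminance image). This is a simplified CPU version; the actual filters run as FBO post-processing shader passes.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Per-pixel: collapse interleaved r,g,b triples to perceptual luminance.
std::vector<float> grayscale(const std::vector<float>& rgb) {
    std::vector<float> out;
    for (std::size_t i = 0; i + 2 < rgb.size(); i += 3)
        out.push_back(0.299f * rgb[i] + 0.587f * rgb[i+1] + 0.114f * rgb[i+2]);
    return out;
}

// Multi-stage: horizontal Sobel gradient over a row-major luminance image of
// size w x h. Border pixels are left at zero for brevity.
std::vector<float> sobelGx(const std::vector<float>& img, int w, int h) {
    static const int kx[3][3] = {{-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1}};
    std::vector<float> out(img.size(), 0.f);
    for (int y = 1; y < h - 1; ++y)
        for (int x = 1; x < w - 1; ++x) {
            float g = 0.f;
            for (int j = -1; j <= 1; ++j)
                for (int i = -1; i <= 1; ++i)
                    g += kx[j+1][i+1] * img[(y+j)*w + (x+i)];
            out[y*w + x] = g;
        }
    return out;
}
```

A full Sobel filter would also compute the vertical gradient Gy and combine the two as sqrt(Gx^2 + Gy^2), typically as a second stage.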

Texture mapping

Texture mapping is implemented by adding uv coordinates to the VBO and fetching the texture image color within the Phong illumination model.
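A sketch of the interleaved vertex layout, assuming 3 position + 3 normal + 2 uv floats per vertex (the project's actual attribute layout may differ):

```cpp
#include <cassert>
#include <vector>

// One interleaved vertex: position, normal, and the uv texture coordinate
// that the fragment shader samples inside the Phong illumination model.
struct Vertex { float px, py, pz, nx, ny, nz, u, v; };

void pushVertex(std::vector<float>& vbo, const Vertex& vtx) {
    const float data[8] = {vtx.px, vtx.py, vtx.pz,
                           vtx.nx, vtx.ny, vtx.nz,
                           vtx.u,  vtx.v};
    vbo.insert(vbo.end(), data, data + 8);
}

// With this layout, glVertexAttribPointer would use a stride of
// 8 * sizeof(float) and a uv attribute offset of 6 * sizeof(float).
constexpr int kStrideFloats  = 8;
constexpr int kUvOffsetFloats = 6;
```

Keeping uv interleaved with position and normal avoids a second VBO and lets one glVertexAttribPointer call per attribute describe the whole layout.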

Primitive Type | Single | Multiple
Sphere
Cube
Cylinder
Cone

Fast Approximate Anti-Aliasing

FXAA is implemented with the following steps:

  1. Luminance Calculation: Compute the luminance of the current pixel and its neighbors.
  2. Edge Detection: Detect edges by comparing luminance differences.
  3. Sub-Pixel Anti-Aliasing: Incorporate finer details by considering sub-pixel variances.
  4. Edge Direction Determination: Calculate the gradient to determine edge direction.
  5. Anti-Aliasing Blend: Blend colors along the detected edges based on the edge direction and strength.
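Steps 1-2 above can be sketched as follows. The luminance weights and thresholds follow common FXAA implementations and are tunable choices, not the project's exact constants.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct RGB { float r, g, b; };

// Step 1: perceptual luminance of a pixel.
float luminance(RGB c) {
    return 0.299f * c.r + 0.587f * c.g + 0.114f * c.b;
}

// Step 2: a pixel is on an edge worth anti-aliasing when the luminance
// contrast among it and its 4 neighbors exceeds a (relative) threshold.
bool isEdge(float lumaCenter, float lumaN, float lumaS, float lumaE, float lumaW,
            float edgeThreshold = 0.125f, float edgeThresholdMin = 0.0312f) {
    float lumaMax = std::max({lumaCenter, lumaN, lumaS, lumaE, lumaW});
    float lumaMin = std::min({lumaCenter, lumaN, lumaS, lumaE, lumaW});
    float range = lumaMax - lumaMin;
    return range >= std::max(edgeThresholdMin, lumaMax * edgeThreshold);
}
```

Pixels that fail this test are passed through unchanged, which is what keeps FXAA cheap: the later direction and blend steps only run on detected edges.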

The difference is obvious when zoomed in, and also in a live demonstration.

File | No FXAA | With FXAA
primitive_salad_1.json
recursive_sphere_4.json
Cylinder
