
Recitation 08: Terrain and Water (Framebuffer)

jijup edited this page Nov 22, 2023 · 2 revisions

Terrain Generation and Water Waves

This recitation is designed to give you hands-on training in virtual terrain generation and simple animation. You can build your Assignment 2 on top of this recitation exercise. We will create a sandy beach with animated water. We will use Perlin noise (generated on the GPU) to create the beach terrain. A quad-based water mesh will be created, textured with a water texture and layered with time-dependent periodic signals to simulate the water waves. Further, framebuffer objects will be used to render the reflection and refraction textures of the water.

You will learn the following aspects.

  • Construction of a planar grid mesh (XY plane at z = 0).
  • Displacing the vertices of a grid mesh using values from a noise texture.
  • Animating water waves with a time-dependent sine function.
  • Using framebuffer objects to render to textures.
  • Generating noise values on the GPU.
  • Blinn-Phong shading in the fragment shader.
  • Keyboard-based camera movement.

[Figure 1: the mesh grid, with and without vertex displacement]

Getting Started

The code for this lab is in the 'Beach' subfolder of the GraphicsLab repository. The folder contains five header files (Camera.h, Water.h, Terrain.h, loadTexture.h, and utility.h) and four shader programs: two for terrain rendering and two for water rendering. Specific TODOs are provided in Terrain.h, terrain_vshader.glsl, terrain_fshader.glsl, Water.h, water_fshader.glsl, and Camera.h. Use the following descriptions to complete all seven tasks and obtain the required rendering and animation.

Terrain Generation (Terrain.h)

Task 1: Grid Mesh Generation (Terrain.h)

Construct a 2D grid mesh by keeping the z values of the vertices at a constant value, i.e., a grid mesh in the z = 0 plane. In this example, the grid is created in the Terrain() constructor provided in Terrain.h. The code generates the grid triangles. Use the following hints to complete the code.

Given the grid resolution, the first loop generates the vertex positions (and, optionally, texture coordinates). Vertices are pushed to points (optionally, the texture coordinates of each vertex are pushed to texCoords). You have to find the x and y coordinates of each vertex. Note that the grid is placed at z = 0, so the z coordinate is always 0.

///--- Vertex positions
for (int j = 0; j < n_width; ++j) {
    for (int i = 0; i < n_height; ++i) {
        // Create vertices in [-size_grid_x/2, size_grid_x/2] instead of [-1, 1].
        // This gives a larger terrain that we can blur at the edges to create
        // the illusion that it is infinite.
        float vertX = -size_grid_x / 2 + j / (float)n_width * size_grid_x;
        float vertY = -size_grid_y / 2 + i / (float)n_height * size_grid_y;
        float vertZ = 0.0f;
        points.push_back(Vec3(vertX, vertY, vertZ));
    }
}

A similar procedure can be used to compute the corresponding texture coordinates. Note that since we are generating Perlin noise on the GPU, we do not need texture coordinates for the terrain mesh vertices in this exercise.

[Figure 2: vertex layout and triangle indices of the grid mesh]

The second loop generates the triangle indices. The indices of the vertices (points) that form the two triangles of each grid cell are pushed to indices in the following loops. You may use the illustration (Figure 2) to find the correct indices of the vertices in each iteration of i and j.

///--- Element indices using GL_TRIANGLE_STRIP
for (int j = 0; j < n_width - 1; ++j) {
    // First two indices of the strip: the bottom vertex of columns j and j+1.
    // A vertex at (j, i) was pushed at index j * n_height + i above.
    indices.push_back(j * n_height);
    indices.push_back((j + 1) * n_height);

    for (int i = 1; i < n_height; ++i) {
        indices.push_back(i + j * n_height);
        indices.push_back(i + (j + 1) * n_height);
    }

    // A new strip will begin when this index is reached
    indices.push_back(indexRestart);
}

Figure 1 shows the mesh grid with and without the vertices displaced.

Task 2: Displacing the Vertices in the Shader (terrain_vshader.glsl)

Recall that vertex positions are passed into the shader in model space. They are then transformed with the Model, View, Projection (MVP) matrix, which takes them into clip space before they are handed to the GPU rasterizer.

In this exercise, we generate Perlin noise in the vertex shader. First, compute the height at each grid vertex by sampling the noise at its horizontal position, i.e., position.xy:

float h = Perlin2D(_______);

You will have to apply the displacement as a vertical translation (along z) to the model-space position of each vertex. For example, add vec3(0, 0, h) to vposition:

fragPos = vposition.xyz + vec3(______); 

Since we are using Perlin noise as the height map, the resulting mesh will have a smooth displacement of vertices, exhibiting terrain-like, mountainous features.

Task 3: Per-Vertex Normals (terrain_vshader.glsl)

[Figure 3: neighboring vertices used to estimate the vertex normal]

The diffuse and specular terms in the Blinn-Phong shading model require the world-space normal vector of your surface at a fragment (i.e., pixel). We will calculate the normals of the grid vertices from the positions of adjacent vertices and then interpolate the vertex normals to get the fragment normals. You can numerically evaluate the x and y slopes at a vertex by considering neighboring vertices at one-unit offsets and taking their differences. Find the coordinates of the neighboring vertices (see Figure 3 for an illustration):

vec3 A = vec3(position.x + 1.0f, position.y       , Perlin2D(position.xy + vec2(1.0, 0.0)));
vec3 B = vec3(______, ______, ______);
vec3 C = vec3(position.x       , position.y + 1.0f, Perlin2D(position.xy + vec2(0.0, 1.0)));
vec3 D = vec3(______, ______, ______);

Similarly, construct the vertices B and D (refer to the illustration in Figure 3). Once you have these neighboring points, you can construct a vector perpendicular to the terrain at the point of interest, which is the normal vector needed for the lighting computation. The normal at the vertex can be approximated by the cross product of the two vectors A - B and C - D. Make sure to normalize (normalize()) the vectors before taking the cross product (cross()).

Task 4: Blinn-Phong Shading (terrain_fshader.glsl)

Blinn-Phong shading is performed in the terrain fragment shader. All the required constants (ka = 0.2f, kd = 0.3f, ks = 0.7f, p) are provided in the fshader. To compute shading at a specific fragment (pixel), we need the light direction (lightDir), the normal (normal), and the view direction (viewDirection) at that fragment. The light direction can be determined from lightPos and fragPos:

vec3 lightDir = normalize(lightPos - fragPos);

The view direction can be calculated similarly. Note that it must also be normalized:

vec3 viewDirection = normalize(viewPos - fragPos);

The interpolated normal (from the per-vertex normals) is already available in the fragment shader.

Next, you can calculate the ambient, diffuse and specular components of the shading model.

Specular:

 vec3 halfVector = normalize(lightDir + viewDirection);
 float specular = ks * max(_____, pow(dot(______, ______), p));
 

Diffuse:

float diffuse = kd * max(_____, dot(_____, ______));

Ambient:

float ambient = ka; // with a white light, the ambient term is just ka

Color due to Blinn-Phong can be calculated as follows.

col = ambient + diffuse*col + specular*col;

Water Waves Generation (Water.h)

Task 5: Water Mesh Generation (Water.h)

Construct a quad mesh by keeping the z values of the vertices at a constant value, i.e., the water height (z = waterHeight). In this example, the quad is created in the Water() constructor provided in Water.h. The code generates the two triangles of the quad. Use the following hints to complete the code.

First, determine the x and y values. The corner vertices are (x, y), (-x, y), (x, -y), and (-x, -y). Since the quad mesh is centered at 0, x and y can be set to size_grid_x/2 and size_grid_y/2, respectively.

Push the vertices to points and texture coordinates to texCoords.

points.push_back(Vec3(_____, _____, waterHeight)); 
texCoords.push_back(Vec2(0, 0));
points.push_back(Vec3(_____,  _____, waterHeight)); 
texCoords.push_back(Vec2(0, 1));
points.push_back(Vec3( _____, _____, waterHeight)); 
texCoords.push_back(Vec2(1, 0));
points.push_back(Vec3( _____,  ______, waterHeight)); 
texCoords.push_back(Vec2(1, 1));

Task 6: Waves Simulation using a Time-Dependent Sine Function (water_fshader.glsl)

To simulate the water waves, we use layering and composition. By adding several displaced versions of the water texture, it appears as if the water is moving. Time-dependent sine functions with different frequencies are used to offset the texture coordinates in the u and v directions. We then take a weighted average of the texture samples at these offset coordinates to obtain the water color for the fragment: a 0.5 contribution from waterUV1, 0.3 from waterUV2, and 0.2 from waterUV3.

vec2 waterUV1 = uv + vec2(0.2 * (1 + sin(time/5)), 0.2 * (1 + sin(time/10)));
vec2 waterUV2 = uv + vec2(0.1 * (1 + sin(time/2)), 0.1 * (1 + sin(time/5)));
vec2 waterUV3 = uv + vec2(0.05 * (1 + sin(time)), 0.05 * (1 + sin(time/2)));

vec4 waterColor = ____ * texture(waterTexture, waterUV1) + ____ * texture(waterTexture, waterUV2) + ____ * texture(waterTexture, waterUV3);

Camera Movements (Camera.h)

Task 7: WASD-Based Camera Movement (Camera.h)

The camera is described by its position (cameraPos), look-at direction (cameraFront), and an up vector (cameraUp). The field of view (fov) is the maximum area that the camera can image. Camera movement is controlled by the mouse and keyboard, which requires parameters such as the movement speed, speedIncrement, and mouseSensitivity. The camera angles are adjusted using yaw and pitch. yaw controls sideways rotation: yaw = 0 means we are looking forward, yaw = 90 means we are looking east, and yaw = 270 means we are looking west. pitch controls up-down rotation: pitch = 0 means we are looking forward, pitch = 90 means we are looking up, and pitch = 270 means we are looking down. The camera angles are updated from the change in mouse position; see the updateCameraAngles() method in Camera.h.

The camera movement is controlled using the WASD keys. If the pressed key is W, the camera should move forward along the normalized cameraFront direction, i.e.,

if (k.key == GLFW_KEY_W) 
{
    cameraPos = cameraPos + speed * cameraFront.normalized();
}

If the pressed key is S, the camera should move backward along the negative normalized cameraFront direction, i.e.,

if (k.key == GLFW_KEY_S) {
    cameraPos = ________________;
}

If the pressed key is A, the camera should move to the left. The left movement is opposite to a vector orthogonal to cameraFront (normalized) and cameraUp, which can be calculated as their cross product, i.e.,

if (k.key == GLFW_KEY_A) {
    cameraPos = cameraPos - speed * cameraFront.normalized().cross(cameraUp);
}

If the pressed key is D, the camera should move to the right, i.e., along the positive cameraFront.normalized().cross(cameraUp) direction.

if (k.key == GLFW_KEY_D) {
    cameraPos = __________________;
}