In this post, I will introduce my implementation of the framebuffer in OpenGL and present two of its uses: directional-light shadow mapping and a mirror effect.
This feature is included in my graphics game engine:

Usage of frame buffer: shadow mapping and mirror effect

What is a Framebuffer?

It is easy to misunderstand what a framebuffer is from its name alone. A framebuffer is an aggregation of one or more buffers; in other words, it contains all the pieces of information we need to render a frame.

When we create an OpenGL context (window) using GLFW, two default color buffers are created and configured automatically. Everything is rendered into the back buffer, and when it is time for the next frame update, we swap the front and back buffers so that the new frame is displayed on the screen. This is called double-buffering.

If we enable depth testing (GL_DEPTH_TEST), an extra depth buffer is created, which stores the depth value z (z ∈ [0, 1]).

There is also a buffer called the stencil buffer in a framebuffer. It is used to discard fragments, based on the stencil test, after the fragment shader runs.

All three types of buffers (color, depth, stencil) make up a framebuffer. These buffers are called attachments in the context of OpenGL. Note that a framebuffer can have more than one color attachment; the maximum number of color attachments depends on the graphics card and can be queried via GL_MAX_COLOR_ATTACHMENTS.
The following image shows how a framebuffer is constructed.

Connectivity between FBO, texture and Renderbuffer


These attachments in a framebuffer are called framebuffer-attachable images, which are 2D arrays of pixels that can be attached to a framebuffer object.


All in all, a framebuffer stores all the pieces of information we need to render a frame, or it can be used to store extra information for a following pass.
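To make this concrete, here is a minimal sketch of how such a framebuffer could be created in OpenGL: a texture as the color attachment (so a later pass can sample it) and a renderbuffer as the combined depth/stencil attachment. This is illustrative only; it assumes a valid OpenGL context and function loader, and the variable names are mine, not my engine's API.

```cpp
// Sketch only: assumes a valid OpenGL 4.x context and loader (e.g. GLAD),
// and that width/height are already defined.
GLuint fbo, colorTex, depthStencilRbo;

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

// Color attachment: a texture, so a later pass can sample the result.
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

// Depth + stencil attachment: a renderbuffer, since we only write to it.
glGenRenderbuffers(1, &depthStencilRbo);
glBindRenderbuffer(GL_RENDERBUFFER, depthStencilRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                          GL_RENDERBUFFER, depthStencilRbo);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle incomplete-framebuffer error
}
glBindFramebuffer(GL_FRAMEBUFFER, 0); // back to the default framebuffer
```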

What is the concept of pass in a rendering loop?

A very simple example can explain the concept of pass: mirror effect.

  • Create a frame buffer.
  • Render the scene without the “mirror” and store the rendered image in the framebuffer.
  • Use a buffer of that framebuffer as the texture of the “mirror”.
  • Render the scene with the “mirror” again.
  • Done.

The implementation of the rendering loop

In this example, we render the scene twice, which means we go through the whole rendering pipeline (constructing the MVP matrices, vertex shader, …, rasterizer, fragment shader, …, display) twice. The first time, however, we do not display the image on the screen; we store it in the framebuffer instead, and then use its data to help render the actual scene.
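The two passes described above can be sketched as a loop body like the following. Again, this is only a hedged sketch: mirrorFBO and colorTex stand for the framebuffer and its color attachment created in step 1, and the Draw* helpers are illustrative names, not my engine's functions.

```cpp
// Sketch only: assumes a valid OpenGL context and a GLFW window.

// Pass 1: render the scene (without the mirror) into our framebuffer.
glBindFramebuffer(GL_FRAMEBUFFER, mirrorFBO);
glViewport(0, 0, fboWidth, fboHeight);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
DrawSceneFromMirrorViewpoint();

// Pass 2: render the full scene to the default framebuffer, sampling
// the first pass's color attachment as the mirror's texture.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, windowWidth, windowHeight);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, colorTex);
DrawSceneWithMirror();

glfwSwapBuffers(window); // swap the default framebuffer's front/back buffers
```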

Among multiple passes, we do not have to use the same pair of shaders.

For example, when I implemented my shadow map, I used another, very simple vertex/fragment shader pair. The actual data I need is the depth value generated by the rasterizer, so I did not even need a fragment shader. But for the sake of displaying the shadow map, I color each fragment by its depth value (gl_FragCoord.z). The following code is my vertex/fragment shader pair for the shadow map pass.

// directional_shadow_map_vert.glsl
#version 420

layout (location = 0) in vec3 pos;

layout(std140, binding = 1) uniform uniformBuffer_drawcall
{
	mat4 modelMatrix;
	mat4 normalMatrix;
};

uniform mat4 directionalLightTransform;

void main()
{
	gl_Position = directionalLightTransform * modelMatrix * vec4(pos, 1.0);
}

// directional_shadow_map_frag.glsl
#version 420

out vec4 color;

void main()
{
	color = gl_FragCoord.z * vec4(1.0, 1.0, 1.0, 1.0);
}

TODO in the future:

Last but not least, it is hard to use the concepts of passes and framebuffers properly. I still need to refine my engine code, which is messy. I should have put all the uniform IDs in my cEffect class rather than in other classes. For example, I put all lighting-related uniform IDs in the light class, but in my shadow map pass I use a different effect that does not have the corresponding uniform variables in its shader, which may cause problems when I try to update those uniforms.