<h1 id="content-title">Bloom</h1>
<h1 id="content-url" style='display:none;'>Advanced-Lighting/Bloom</h1>
<p>
  Bright light sources and brightly lit regions are often difficult to convey to the viewer as the intensity range of a monitor is limited. One way to distinguish bright light sources on a monitor is by making them glow; the light then <em>bleeds</em> around the light source. This effectively gives the viewer the illusion these light sources or bright regions are intensely bright.
</p>

<p>
  This light bleeding, or glow effect, is achieved with a post-processing effect called <def>Bloom</def>. Bloom gives all brightly lit regions of a scene a glow-like effect. An example of a scene with and without glow can be seen below (image courtesy of Epic Games):
</p>

<img src="/img/advanced-lighting/bloom_example.png" alt="Example of Bloom or glow in OpenGL tutorial"/>

<p>
  Bloom gives noticeable visual cues about the brightness of objects. When done in a subtle fashion (which some games drastically fail to do) Bloom significantly boosts the lighting of your scene and allows for a large range of dramatic effects.
</p>

<p>
  Bloom works best in combination with <a href="https://learnopengl.com/Advanced-Lighting/HDR" target="_blank">HDR</a> rendering. A common misconception is that HDR is the same as Bloom, as many people use the terms interchangeably. They are however completely different techniques used for different purposes. It is possible to implement Bloom with default 8-bit precision framebuffers, just as it is possible to use HDR without the Bloom effect. It is simply that HDR makes Bloom more effective to implement (as we'll later see).
</p>

<p>
  To implement Bloom, we render a lit scene as usual and extract both the scene's HDR color buffer and an image of the scene with only its bright regions visible. This extracted brightness image is then blurred and the result added on top of the original HDR scene image.
</p>

<p>
  Let's illustrate this process in a step-by-step fashion. We render a scene filled with 4 bright light sources, visualized as colored cubes. The colored light cubes have brightness values between <code>1.5</code> and <code>15.0</code>. If we were to render this to an HDR color buffer, the scene looks as follows:
</p>

<img src="/img/advanced-lighting/bloom_scene.png" class="clean" alt="Image of a HDR scene where we need to add the bloom or glow effect in OpenGL"/>

<p>
  We take this HDR color buffer texture and extract all the fragments that exceed a certain brightness. This gives us an image that only shows the brightly colored regions, as their fragment intensities exceeded a certain threshold:
</p>

<img src="/img/advanced-lighting/bloom_extracted.png" class="clean" alt="Bright regions extracted of a scene for the bloom or glow post-processing effect in OpenGL"/>

<p>
  We then take this thresholded brightness texture and blur the result. The strength of the bloom effect is largely determined by the range and strength of the blur filter used.
</p>

<img src="/img/advanced-lighting/bloom_blurred.png" class="clean" alt="Bright regions extracted for glow or bloom effect are blurred in OpenGL"/>

<p>
  The resulting blurred texture is what we use to get the glow or light-bleeding effect. This blurred texture is added on top of the original HDR scene texture.
  Because the bright regions are extended in both width and height due to the blur filter, the bright regions of the scene appear to glow or <em>bleed</em> light.
</p>

<img src="/img/advanced-lighting/bloom_small.png" class="clean" alt="Example of the Bloom or Glow post-processing effect in OpenGL with HDR"/>

<p>
  Bloom by itself isn't a complicated technique, but it is difficult to get exactly right. Most of its visual quality is determined by the quality and type of blur filter used for blurring the extracted brightness regions. Simply tweaking the blur filter can drastically change the quality of the Bloom effect.
</p>

<p>
  Following these steps gives us the Bloom post-processing effect. The next image briefly summarizes the required steps for implementing Bloom:
</p>

<img src="/img/advanced-lighting/bloom_steps.png" class="clean" alt="Steps required for implementing the bloom or glow post-processing effect in OpenGL"/>

<p>
  The first step requires us to extract all the bright colors of a scene based on some threshold. Let's first delve into that.
</p>

<h2>Extracting bright color</h2>
<p>
  The first step requires us to extract two images from a rendered scene. We could render the scene twice, both times rendering to a different framebuffer with different shaders, but we can also use a neat little trick called <def>Multiple Render Targets (MRT)</def> that allows us to specify more than one fragment shader output; this gives us the option to extract the first two images in a single render pass. By specifying a layout location specifier before a fragment shader's output we can control which color buffer a fragment shader writes to:
</p>

<pre><code>
layout (location = 0) out vec4 FragColor;
layout (location = 1) out vec4 BrightColor;
</code></pre>
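<p>
  How many outputs (and thus color attachments) we can use this way is hardware dependent; core OpenGL 3.3 guarantees at least 8 of each, far more than the two we need here. If you're curious about your driver's actual limits, a quick query (just a side note, not part of this chapter's demo code) looks like this:
</p>

<pre><code>
GLint maxAttachments = 0, maxDrawBuffers = 0;
glGetIntegerv(GL_MAX_COLOR_ATTACHMENTS, &maxAttachments); // max color attachments per framebuffer
glGetIntegerv(GL_MAX_DRAW_BUFFERS, &maxDrawBuffers);      // max simultaneous fragment shader outputs
</code></pre>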
<p>
  This only works if we actually have multiple buffers to write to. As a requirement for using multiple fragment shader outputs we need multiple color buffers attached to the currently bound framebuffer object. You may remember from the <a href="https://learnopengl.com/Advanced-OpenGL/Framebuffers" target="_blank">framebuffers</a> chapter that we can specify a color attachment number when linking a texture as a framebuffer's color buffer. Up until now we've always used <var>GL_COLOR_ATTACHMENT0</var>, but by also using <var>GL_COLOR_ATTACHMENT1</var> we can have two color buffers attached to a framebuffer object:
</p>

<pre><code>
// set up floating point framebuffer to render scene to
unsigned int hdrFBO;
<function id='76'>glGenFramebuffers</function>(1, &hdrFBO);
<function id='77'>glBindFramebuffer</function>(GL_FRAMEBUFFER, hdrFBO);
unsigned int colorBuffers[2];
<function id='50'>glGenTextures</function>(2, colorBuffers);
for (unsigned int i = 0; i < 2; i++)
{
    <function id='48'>glBindTexture</function>(GL_TEXTURE_2D, colorBuffers[i]);
    <function id='52'>glTexImage2D</function>(
        GL_TEXTURE_2D, 0, GL_RGBA16F, SCR_WIDTH, SCR_HEIGHT, 0, GL_RGBA, GL_FLOAT, NULL
    );
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // attach texture to framebuffer
    <function id='81'>glFramebufferTexture2D</function>(
        GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i, GL_TEXTURE_2D, colorBuffers[i], 0
    );
}
</code></pre>

<p>
  OpenGL, by default, only renders to a framebuffer's first color attachment, ignoring all others, so we do have to explicitly tell OpenGL we're rendering to multiple color buffers via <fun>glDrawBuffers</fun>. We do this by passing an array of color attachment enums that we'd like to render to in subsequent operations:
</p>

<pre><code>
unsigned int attachments[2] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
glDrawBuffers(2, attachments);
</code></pre>

<p>
  When rendering into this framebuffer, whenever a fragment shader uses the layout location specifiers, the fragment is written to the corresponding color buffer. This is great as it saves us an extra render pass for extracting bright regions, since we can now extract them directly from the to-be-rendered fragment:
</p>

<pre><code>
#version 330 core
layout (location = 0) out vec4 FragColor;
layout (location = 1) out vec4 BrightColor;

[...]

void main()
{
    [...] // first do normal lighting calculations and output results
    FragColor = vec4(lighting, 1.0);
    // check whether fragment output is higher than threshold, if so output as brightness color
    float brightness = dot(FragColor.rgb, vec3(0.2126, 0.7152, 0.0722));
    if(brightness > 1.0)
        BrightColor = vec4(FragColor.rgb, 1.0);
    else
        BrightColor = vec4(0.0, 0.0, 0.0, 1.0);
}
</code></pre>

<p>
  Here we first calculate lighting as normal and pass it to the fragment shader's first output variable <var>FragColor</var>. Then we use what is currently stored in <var>FragColor</var> to determine if its brightness exceeds a certain threshold. We calculate the brightness of a fragment by properly transforming it to grayscale first (by taking the dot product of both vectors we effectively multiply each individual component of both vectors and add the results together). If the brightness exceeds a certain threshold, we output the color to the second color buffer. We do the same for the light cubes.
</p>

<p>
  This also shows why Bloom works incredibly well with HDR rendering. Because we render in high dynamic range, color values can exceed <code>1.0</code>, which allows us to specify a brightness threshold outside the default range, giving us much more control over what is considered bright. Without HDR we'd have to set the threshold lower than <code>1.0</code>, which is still possible, but regions are considered bright much sooner. This sometimes leads to the glow effect becoming too dominant (think of white glowing snow, for example).
</p>
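<p>
  One practical detail the framebuffer setup above glosses over: we render actual 3D geometry into <var>hdrFBO</var>, so it also needs a depth buffer for depth testing to work. A minimal sketch of that extra step (the variable name <var>rboDepth</var> is just illustrative), including the usual completeness check, could look like this:
</p>

<pre><code>
// attach a depth renderbuffer so depth testing works when rendering the scene into hdrFBO
unsigned int rboDepth;
glGenRenderbuffers(1, &rboDepth);
glBindRenderbuffer(GL_RENDERBUFFER, rboDepth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, SCR_WIDTH, SCR_HEIGHT);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, rboDepth);
// verify the framebuffer is complete before using it
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    std::cout << "Framebuffer not complete!" << std::endl;
glBindFramebuffer(GL_FRAMEBUFFER, 0);
</code></pre>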
<p>
  With these two color buffers we have an image of the scene as normal, and an image of the extracted bright regions; all generated in a single render pass.
</p>

<img src="/img/advanced-lighting/bloom_attachments.png" alt="Image of two colorbuffers obtained from a single render pass with multiple color attachments for the bloom or glow effect in OpenGL"/>

<p>
  With an image of the extracted bright regions we now need to blur the image. We could do this with a simple box filter as we've done in the post-processing section of the framebuffers chapter, but we'd rather use a more advanced (and better-looking) blur filter called <def>Gaussian blur</def>.
</p>

<h2>Gaussian blur</h2>
<p>
  In the post-processing chapter's blur we took the average of all surrounding pixels of an image. While it does give us an easy blur, it doesn't give the best results. A Gaussian blur is based on the Gaussian curve, which is commonly described as a <em>bell-shaped curve</em>, giving high values close to its center that gradually wear off over distance. The Gaussian curve can be mathematically represented in different forms, but generally has the following shape:
</p>

<img src="/img/advanced-lighting/bloom_gaussian.png" class="clean" alt="Image of a Gaussian Curve used for blurring a bloom or glow image in OpenGL"/>

<p>
  As the Gaussian curve has a larger area close to its center, using its values as weights to blur an image gives more natural results, as nearby samples have a higher precedence. If we for instance sample a 32x32 box around a fragment, we use progressively smaller weights the larger the distance to the fragment; this gives a better and more realistic blur which is known as a <def>Gaussian blur</def>.
</p>

<p>
  To implement a Gaussian blur filter we'd need a two-dimensional box of weights that we can obtain from a two-dimensional Gaussian curve equation. The problem with this approach however is that it quickly becomes extremely heavy on performance. Take a blur kernel of 32 by 32 for example: this would require us to sample a texture a total of 1024 times for each fragment!
</p>

<p>
  Luckily for us, the Gaussian equation has a very neat property that allows us to separate the two-dimensional equation into two smaller one-dimensional equations: one that describes the horizontal weights and the other that describes the vertical weights. We'd then first do a horizontal blur with the horizontal weights on the scene texture, and then on the resulting texture do a vertical blur. Due to this property the results are exactly the same, while saving us an incredible amount of performance as we'd now only have to do 32 + 32 samples compared to 1024! This is known as a <def>two-pass Gaussian blur</def>.
</p>
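<p>
  The reason this separation works is that the two-dimensional Gaussian is simply the product of two one-dimensional Gaussians. With standard deviation \(\sigma\):
</p>

\[ G(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{x^2}{2\sigma^2}}, \qquad G(x, y) = G(x)\, G(y) \]

<p>
  Blurring horizontally with the \(G(x)\) weights and then vertically with the \(G(y)\) weights therefore gives each sample at offset \((x, y)\) exactly the weight a full two-dimensional kernel would have assigned it.
</p>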
<img src="/img/advanced-lighting/bloom_gaussian_two_pass.png" class="clean" alt="Image of two-pass Gaussian blur with the same results as normal gaussian blur, but now saving a lot of performance in OpenGL"/>

<p>
  This does mean we need to blur an image at least two times and this works best with the use of framebuffer objects. Specifically for the two-pass Gaussian blur we're going to implement <em>ping-pong</em> framebuffers: a pair of framebuffers where we repeatedly render the other framebuffer's color buffer into the current framebuffer's color buffer, a given number of times, swapping their roles each iteration while alternating the shader effect. We basically continuously switch the framebuffer to render to and the texture to draw with. This allows us to first blur the scene's texture in the first framebuffer, then blur the first framebuffer's color buffer into the second framebuffer, and then the second framebuffer's color buffer into the first, and so on.
</p>

<p>
  Before we delve into the framebuffers let's first discuss the Gaussian blur's fragment shader:
</p>

<pre><code>
#version 330 core
out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D image;

uniform bool horizontal;
uniform float weight[5] = float[] (0.227027, 0.1945946, 0.1216216, 0.054054, 0.016216);

void main()
{
    vec2 tex_offset = 1.0 / textureSize(image, 0); // gets size of single texel
    vec3 result = texture(image, TexCoords).rgb * weight[0]; // current fragment's contribution
    if(horizontal)
    {
        for(int i = 1; i < 5; ++i)
        {
            result += texture(image, TexCoords + vec2(tex_offset.x * i, 0.0)).rgb * weight[i];
            result += texture(image, TexCoords - vec2(tex_offset.x * i, 0.0)).rgb * weight[i];
        }
    }
    else
    {
        for(int i = 1; i < 5; ++i)
        {
            result += texture(image, TexCoords + vec2(0.0, tex_offset.y * i)).rgb * weight[i];
            result += texture(image, TexCoords - vec2(0.0, tex_offset.y * i)).rgb * weight[i];
        }
    }
    FragColor = vec4(result, 1.0);
}
</code></pre>

<p>
  Here we take a relatively small sample of Gaussian weights that we each use to assign a specific weight to the horizontal or vertical samples around the current fragment. You can see that we split the blur filter into a horizontal and a vertical section based on the value we set the <var>horizontal</var> uniform to. We base the offset distance on the exact size of a texel, obtained by the division of <code>1.0</code> over the size of the texture (a <code>vec2</code> from <fun>textureSize</fun>).
</p>
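<p>
  The five weights above are hard-coded, but it's useful to see where numbers like these come from. The host-side sketch below uses an assumed standard deviation (this particular value won't reproduce the chapter's exact numbers, which is fine; any properly normalized bell-shaped set of weights works). It evaluates the one-dimensional Gaussian at integer texel offsets and normalizes the result so that the center weight plus twice the outer weights sums to <code>1.0</code>, which is exactly how the shader applies them:
</p>

<pre><code>
// compute 5 one-dimensional Gaussian weights (center tap + 4 offsets), like the shader's weight[5]
const float sigma = 1.75f; // assumed standard deviation; larger sigma = wider blur
float weights[5];
float sum = 0.0f;
for (int i = 0; i < 5; ++i)
{
    weights[i] = std::exp(-(i * i) / (2.0f * sigma * sigma)); // std::exp comes from the cmath header
    sum += (i == 0) ? weights[i] : 2.0f * weights[i];         // offsets > 0 are sampled on both sides
}
for (int i = 0; i < 5; ++i)
    weights[i] /= sum; // normalized weights, ready to upload to the shader
</code></pre>

<p>
  You can verify the chapter's weights follow the same rule: 0.227027 + 2 * (0.1945946 + 0.1216216 + 0.054054 + 0.016216) ≈ 1.0, so the blur neither brightens nor darkens the image.
</p>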
<p>
  For blurring an image we create two basic framebuffers, each with only a color buffer texture:
</p>

<pre><code>
unsigned int pingpongFBO[2];
unsigned int pingpongBuffer[2];
<function id='76'>glGenFramebuffers</function>(2, pingpongFBO);
<function id='50'>glGenTextures</function>(2, pingpongBuffer);
for (unsigned int i = 0; i < 2; i++)
{
    <function id='77'>glBindFramebuffer</function>(GL_FRAMEBUFFER, pingpongFBO[i]);
    <function id='48'>glBindTexture</function>(GL_TEXTURE_2D, pingpongBuffer[i]);
    <function id='52'>glTexImage2D</function>(
        GL_TEXTURE_2D, 0, GL_RGBA16F, SCR_WIDTH, SCR_HEIGHT, 0, GL_RGBA, GL_FLOAT, NULL
    );
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    <function id='15'>glTexParameter</function>i(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    <function id='81'>glFramebufferTexture2D</function>(
        GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, pingpongBuffer[i], 0
    );
}
</code></pre>

<p>
  Then, after we've obtained an HDR texture and an extracted brightness texture, we first fill one of the ping-pong framebuffers with the brightness texture and then blur the image 10 times (5 times horizontally and 5 times vertically):
</p>

<pre><code>
bool horizontal = true, first_iteration = true;
int amount = 10;
shaderBlur.use();
for (unsigned int i = 0; i < amount; i++)
{
    <function id='77'>glBindFramebuffer</function>(GL_FRAMEBUFFER, pingpongFBO[horizontal]);
    shaderBlur.setInt("horizontal", horizontal);
    <function id='48'>glBindTexture</function>(
        GL_TEXTURE_2D, first_iteration ? colorBuffers[1] : pingpongBuffer[!horizontal]
    );
    RenderQuad();
    horizontal = !horizontal;
    if (first_iteration)
        first_iteration = false;
}
<function id='77'>glBindFramebuffer</function>(GL_FRAMEBUFFER, 0);
</code></pre>

<p>
  Each iteration we bind one of the two framebuffers based on whether we want to blur horizontally or vertically, and bind the other framebuffer's color buffer as the texture to blur. The first iteration we specifically bind the texture we'd like to blur (<var>colorBuffers[1]</var>, the extracted brightness texture), as both ping-pong color buffers would otherwise still be empty. By repeating this process 10 times, the brightness image ends up with a complete Gaussian blur that was repeated 5 times. This construct allows us to blur any image as often as we'd like; the more Gaussian blur iterations, the stronger the blur.
</p>

<p>
  By blurring the extracted brightness texture 5 times, we get a properly blurred image of all bright regions of a scene.
</p>

<img src="/img/advanced-lighting/bloom_blurred_large.png" class="clean" alt="Blurred image using Gaussian Blur of extracted brightness regions for the glow or bloom effect in OpenGL"/>

<p>
  The last step to complete the Bloom effect is to combine this blurred brightness texture with the original scene's HDR texture.
</p>
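<p>
  Before we do, a quick note on the <fun>RenderQuad()</fun> call used in the blur loop above: it simply draws a quad that fills the screen in normalized device coordinates so the fragment shader runs once for every pixel. Its implementation isn't part of this chapter's listings; a minimal sketch of such a helper could look like this:
</p>

<pre><code>
unsigned int quadVAO = 0;
unsigned int quadVBO;
void RenderQuad()
{
    if (quadVAO == 0) // lazily initialize the quad's buffers on first use
    {
        float quadVertices[] = {
            // positions        // texture coords
            -1.0f,  1.0f, 0.0f, 0.0f, 1.0f,
            -1.0f, -1.0f, 0.0f, 0.0f, 0.0f,
             1.0f,  1.0f, 0.0f, 1.0f, 1.0f,
             1.0f, -1.0f, 0.0f, 1.0f, 0.0f,
        };
        glGenVertexArrays(1, &quadVAO);
        glGenBuffers(1, &quadVBO);
        glBindVertexArray(quadVAO);
        glBindBuffer(GL_ARRAY_BUFFER, quadVBO);
        glBufferData(GL_ARRAY_BUFFER, sizeof(quadVertices), quadVertices, GL_STATIC_DRAW);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
        glEnableVertexAttribArray(1);
        glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(3 * sizeof(float)));
    }
    glBindVertexArray(quadVAO);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4); // 4 vertices as a triangle strip cover the whole screen
    glBindVertexArray(0);
}
</code></pre>

<p>
  This assumes a simple pass-through vertex shader that forwards the position and texture coordinates (attribute locations 0 and 1) to the blur and final fragment shaders.
</p>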
<h2>Blending both textures</h2>
<p>
  With the scene's HDR texture and a blurred brightness texture of the scene, we only need to combine the two to achieve the well-known Bloom or glow effect. In the final fragment shader (largely similar to the one we used in the <a href="https://learnopengl.com/Advanced-Lighting/HDR" target="_blank">HDR</a> chapter) we additively blend both textures:
</p>

<pre><code>
#version 330 core
out vec4 FragColor;

in vec2 TexCoords;

uniform sampler2D scene;
uniform sampler2D bloomBlur;
uniform float exposure;

void main()
{
    const float gamma = 2.2;
    vec3 hdrColor = texture(scene, TexCoords).rgb;
    vec3 bloomColor = texture(bloomBlur, TexCoords).rgb;
    hdrColor += bloomColor; // additive blending
    // tone mapping
    vec3 result = vec3(1.0) - exp(-hdrColor * exposure);
    // also gamma correct while we're at it
    result = pow(result, vec3(1.0 / gamma));
    FragColor = vec4(result, 1.0);
}
</code></pre>

<p>
  Interesting to note here is that we add the Bloom effect before we apply tone mapping. This way, the added brightness of bloom is also softly transformed to LDR range, with better relative lighting as a result.
</p>

<p>
  With both textures added together, all bright areas of our scene now get a proper glow effect:
</p>

<img src="/img/advanced-lighting/bloom.png" class="clean" alt="Example of the Bloom or Glow post-processing effect in OpenGL with HDR"/>

<p>
  The colored cubes now appear much brighter and give a better illusion of being light-emitting objects. This is a relatively simple scene, so the Bloom effect isn't too impressive here, but in well-lit scenes it can make a significant difference when properly configured. You can find the source code of this simple demo <a href="/code_viewer_gh.php?code=src/5.advanced_lighting/7.bloom/bloom.cpp" target="_blank">here</a>.
</p>

<p>
  For this chapter we used a relatively simple Gaussian blur filter where we only take 5 samples in each direction. By taking more samples along a larger radius or repeating the blur filter an extra number of times we can improve the blur effect. As the quality of the blur directly correlates to the quality of the Bloom effect, improving the blur step can make a significant improvement. Some of these improvements combine blur filters with varying sized blur kernels or use multiple Gaussian curves to selectively combine weights. The additional resources from Kalogirou and Epic Games discuss how to significantly improve the Bloom effect by improving the Gaussian blur.
</p>

<h2>Additional resources</h2>
<ul>
  <li><a href="http://rastergrid.com/blog/2010/09/efficient-gaussian-blur-with-linear-sampling/" target="_blank">Efficient Gaussian blur with linear sampling</a>: describes the Gaussian blur very well and how to improve its performance using OpenGL's bilinear texture sampling.</li>
  <li><a href="https://udk-legacy.unrealengine.com/udk/Three/Bloom.html" target="_blank">Bloom Post Process Effect</a>: article from Epic Games about improving the Bloom effect by combining multiple Gaussian curves for its weights.</li>
  <li><a href="http://kalogirou.net/2006/05/20/how-to-do-good-bloom-for-hdr-rendering/" target="_blank">How to do good Bloom for HDR rendering</a>: article from Kalogirou that describes how to improve the Bloom effect using a better Gaussian blur method.</li>
</ul>