LearnOpenGL

Translation in progress of learnopengl.com.


      1     <h1 id="content-title">HDR</h1>
      2 <h1 id="content-url" style='display:none;'>Advanced-Lighting/HDR</h1>
      3 <p>
      4   Brightness and color values, by default, are clamped between <code>0.0</code> and <code>1.0</code> when stored into a framebuffer. This, at first seemingly innocent, statement caused us to always specify light and color values somewhere in this range, trying to make them fit into the scene. This works okay and gives decent results, but what happens if we walk into a really bright area with multiple bright light sources that as a total sum exceed <code>1.0</code>? The answer is that all fragments that have a brightness or color sum over <code>1.0</code> get clamped to <code>1.0</code>, which isn't pretty to look at:
      5 </p>
      6 
      7 <img src="/img/advanced-lighting/hdr_clamped.png" class="clean" alt="Color values clamped in bright areas"/>
      8   
      9 <p>
     10   Because a large number of fragments' color values get clamped to <code>1.0</code>, large regions of bright fragments all end up with the exact same white color, losing a significant amount of detail and giving the scene a fake look.
     11 </p>
     12   
     13 <p>
     14   One solution would be to reduce the strength of the light sources and ensure no area of fragments in your scene ends up brighter than <code>1.0</code>; this is not a good solution, as it forces you to use unrealistic lighting parameters. A better approach is to allow color values to temporarily exceed <code>1.0</code> and transform them back to the original range of <code>0.0</code> to <code>1.0</code> as a final step, but without losing detail.
     15 </p>
     16   
     17 <p>
     18   Monitors (non-HDR) are limited to displaying colors in the range of <code>0.0</code> to <code>1.0</code>, but there is no such limitation in lighting equations. By allowing fragment colors to exceed <code>1.0</code> we have a much higher range of color values available to work with, known as <def>high dynamic range</def> (HDR). With high dynamic range, bright things can be really bright, dark things can be really dark, and details can be seen in both.
     19 </p>
     20   
     21 <p>
     22   High dynamic range was originally only used for photography, where a photographer takes multiple pictures of the same scene with varying exposure levels, capturing a large range of color values. Combining these forms an HDR image in which a large range of detail is visible, depending on the combined exposure levels or the specific exposure it is viewed with. For instance, the following image (credits to Colin Smith) shows a lot of detail in brightly lit regions at a low exposure (look at the window), but these details are gone at a high exposure. However, a high exposure reveals a great amount of detail in darker regions that weren't previously visible.
     23 </p>
     24   
     25   <img src="/img/advanced-lighting/hdr_image.png" alt="HDR image showing multiple exposure levels and their respective details"/>
     26     
     27 <p>
     28   This is also very similar to how the human eye works, and it forms the basis of high dynamic range rendering. When there is little light, the human eye adapts itself so the darker parts become more visible, and similarly for bright areas. It's as if the human eye has an automatic exposure slider based on the scene's brightness.
     29 </p>
     30     
     31 <p>
     32   High dynamic range rendering works a bit like that. We render to a much larger range of color values, collecting a large range of dark and bright details of a scene, and at the end we transform all the HDR values back to the <def>low dynamic range</def> (LDR) of [<code>0.0</code>, <code>1.0</code>]. This process of converting HDR values to LDR values is called <def>tone mapping</def>, and a large collection of tone mapping algorithms exists that aim to preserve most HDR details during the conversion process. These tone mapping algorithms often involve an exposure parameter that selectively favors dark or bright regions.
     33 </p>
     34     
     35 <p>
     36   When it comes to real-time rendering, high dynamic range not only allows us to exceed the LDR range of [<code>0.0</code>, <code>1.0</code>] and preserve more detail, but also gives us the ability to specify a light source's intensity by its <em>real</em> intensity. For instance, the sun has a much higher intensity than something like a flashlight, so why not configure the sun as such (e.g. a diffuse brightness of <code>100.0</code>)? This allows us to more properly configure a scene's lighting with realistic lighting parameters, something that wouldn't be possible with LDR rendering as the values would then directly get clamped to <code>1.0</code>.
     37 </p>
     38 
     39 <p>
     40   As (non-HDR) monitors only display colors in the range between <code>0.0</code> and <code>1.0</code>, we do need to transform the current high dynamic range of color values back to the monitor's range. Simply re-transforming the colors with an average wouldn't do us much good, as brighter areas would then become a lot more dominant. What we can do is use different equations and/or curves to transform the HDR values back to LDR, giving us complete control over the scene's brightness. This is the process earlier denoted as tone mapping and the final step of HDR rendering. 
     41 </p>
     42     
     43 <h2>Floating point framebuffers</h2>
     44 <p>
     45   To implement high dynamic range rendering we need some way to prevent color values getting clamped after each fragment shader run. When framebuffers use a normalized fixed-point color format (like <var>GL_RGB</var>) as their color buffer's internal format, OpenGL automatically clamps the values between <code>0.0</code> and <code>1.0</code> before storing them in the framebuffer. This operation holds for most types of framebuffer formats, except for floating point formats.
     46 </p>
     47     
     48 <p>
     49   When the internal format of a framebuffer's color buffer is specified as <var>GL_RGB16F</var>, <var>GL_RGBA16F</var>, <var>GL_RGB32F</var>, or <var>GL_RGBA32F</var> the framebuffer is known as a <def>floating point framebuffer</def> that can store floating point values outside the default range of <code>0.0</code> and <code>1.0</code>. This is perfect for rendering in high dynamic range!
     50 </p>
     51     
     52 <p>
     53   To create a floating point framebuffer the only thing we need to change is its color buffer's internal format parameter:
     54 </p>
     55     
     56 <pre><code>
     57 <function id='48'>glBindTexture</function>(GL_TEXTURE_2D, colorBuffer);
     58 <function id='52'>glTexImage2D</function>(GL_TEXTURE_2D, 0, GL_RGBA16F, SCR_WIDTH, SCR_HEIGHT, 0, GL_RGBA, GL_FLOAT, NULL);  
     59 </code></pre>
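<p>
  The rest of the framebuffer setup follows the same pattern as in the earlier framebuffer chapters. For reference, a minimal sketch could look like the snippet below; the <var>rboDepth</var> renderbuffer name is an assumption here, while <var>hdrFBO</var> and <var>colorBuffer</var> are the objects used in this chapter's code:
</p>

<pre><code>
unsigned int hdrFBO;
glGenFramebuffers(1, &amp;hdrFBO);
// create a depth renderbuffer as the scene is rendered with depth testing enabled
unsigned int rboDepth;
glGenRenderbuffers(1, &amp;rboDepth);
glBindRenderbuffer(GL_RENDERBUFFER, rboDepth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, SCR_WIDTH, SCR_HEIGHT);
// attach the floating point color buffer and the depth renderbuffer
glBindFramebuffer(GL_FRAMEBUFFER, hdrFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colorBuffer, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, rboDepth);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    std::cout &lt;&lt; "Framebuffer not complete!" &lt;&lt; std::endl;
glBindFramebuffer(GL_FRAMEBUFFER, 0);
</code></pre>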
     60     
     61 <p>
     62   OpenGL's default framebuffer only uses 8 bits per color component. With a floating point framebuffer with 32 bits per color component (when using <var>GL_RGB32F</var> or <var>GL_RGBA32F</var>) we're using 4 times more memory for storing color values; at 1920x1080, for example, an RGBA32F color buffer takes up 1920 * 1080 * 4 * 4 bytes, roughly 33 MB, compared to about 8 MB for a regular 8-bit RGBA buffer. As 32 bits isn't really necessary (unless you need a high level of precision) using <var>GL_RGBA16F</var> will suffice.
     63 </p>
     64     
     65 <p>
     66   With a floating point color buffer attached to a framebuffer we can now render the scene into this framebuffer knowing color values won't get clamped between <code>0.0</code> and <code>1.0</code>. In this chapter's example demo we first render a lit scene into the floating point framebuffer and then display the framebuffer's color buffer on a screen-filled quad; it'll look a bit like this:
     67 </p>
     68     
     69 <pre><code>
     70 <function id='77'>glBindFramebuffer</function>(GL_FRAMEBUFFER, hdrFBO);
     71     <function id='10'>glClear</function>(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  
     72     // [...] render (lit) scene 
     73 <function id='77'>glBindFramebuffer</function>(GL_FRAMEBUFFER, 0);
     74 
     75 // now render hdr color buffer to 2D screen-filling quad with tone mapping shader
     76 hdrShader.use();
     77 <function id='49'>glActiveTexture</function>(GL_TEXTURE0);
     78 <function id='48'>glBindTexture</function>(GL_TEXTURE_2D, hdrColorBufferTexture);
     79 RenderQuad();
     80 </code></pre>
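<p>
  The <var>RenderQuad()</var> function isn't listed in this chapter. A minimal sketch (the exact vertex layout here is an assumption) renders a screen-filling quad in normalized device coordinates as a triangle strip, with position and texture coordinate attributes:
</p>

<pre><code>
unsigned int quadVAO = 0;
unsigned int quadVBO;
void RenderQuad()
{
    if (quadVAO == 0)
    {
        // x, y, z position followed by u, v texture coordinates
        float quadVertices[] = {
            -1.0f,  1.0f, 0.0f, 0.0f, 1.0f,
            -1.0f, -1.0f, 0.0f, 0.0f, 0.0f,
             1.0f,  1.0f, 0.0f, 1.0f, 1.0f,
             1.0f, -1.0f, 0.0f, 1.0f, 0.0f,
        };
        glGenVertexArrays(1, &amp;quadVAO);
        glGenBuffers(1, &amp;quadVBO);
        glBindVertexArray(quadVAO);
        glBindBuffer(GL_ARRAY_BUFFER, quadVBO);
        glBufferData(GL_ARRAY_BUFFER, sizeof(quadVertices), quadVertices, GL_STATIC_DRAW);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
        glEnableVertexAttribArray(1);
        glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)(3 * sizeof(float)));
    }
    glBindVertexArray(quadVAO);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glBindVertexArray(0);
}
</code></pre>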
     81     
     82 <p>
     83   Here a scene's color values are filled into a floating point color buffer, which can contain arbitrary color values, possibly exceeding <code>1.0</code>. For this chapter, a simple demo scene was created with a large stretched cube acting as a tunnel and four point lights, one of them extremely bright and positioned at the tunnel's end:
     84 </p>
     85     
     86 <pre><code>
     87 std::vector&lt;glm::vec3&gt; lightColors;
     88 lightColors.push_back(glm::vec3(200.0f, 200.0f, 200.0f));
     89 lightColors.push_back(glm::vec3(0.1f, 0.0f, 0.0f));
     90 lightColors.push_back(glm::vec3(0.0f, 0.0f, 0.2f));
     91 lightColors.push_back(glm::vec3(0.0f, 0.1f, 0.0f));  
     92 </code></pre>
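<p>
  The corresponding light positions aren't listed in this chapter; as an illustration (the exact coordinates and uniform names are assumptions, with the bright light placed deep into the tunnel) they could be defined and uploaded to the lighting shader like this:
</p>

<pre><code>
std::vector&lt;glm::vec3&gt; lightPositions;
lightPositions.push_back(glm::vec3( 0.0f,  0.0f, 49.5f)); // the bright light at the tunnel's end
lightPositions.push_back(glm::vec3(-1.4f, -1.9f,  9.0f));
lightPositions.push_back(glm::vec3( 0.0f, -1.8f,  4.0f));
lightPositions.push_back(glm::vec3( 0.8f, -1.7f,  6.0f));

// upload positions and colors to the lighting shader each frame
lightingShader.use();
for (unsigned int i = 0; i &lt; lightColors.size(); i++)
{
    lightingShader.setVec3("lights[" + std::to_string(i) + "].Position", lightPositions[i]);
    lightingShader.setVec3("lights[" + std::to_string(i) + "].Color", lightColors[i]);
}
</code></pre>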
     93     
     94 <p>
     95   Rendering into a floating point framebuffer works exactly the same as rendering into a regular framebuffer. What is new is <var>hdrShader</var>'s fragment shader that renders the final 2D quad with the floating point color buffer texture attached. Let's first define a simple pass-through fragment shader:
     96 </p>
     97     
     98 <pre><code>
     99 #version 330 core
    100 out vec4 FragColor;
    101   
    102 in vec2 TexCoords;
    103 
    104 uniform sampler2D hdrBuffer;
    105 
    106 void main()
    107 {             
    108     vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;
    109     FragColor = vec4(hdrColor, 1.0);
    110 }  
    111 </code></pre>
    112     
    113 <p>
    114   Here we directly sample the floating point color buffer and use its color value as the fragment shader's output. However, as the 2D quad's output is directly rendered into the default framebuffer, all the fragment shader's output values will still end up clamped between <code>0.0</code> and <code>1.0</code> even though we have several values in the floating point color texture exceeding <code>1.0</code>. 
    115 </p>
    116     
    117     <img src="/img/advanced-lighting/hdr_direct.png" alt="Direct rendering of floating point color values to the default framebuffer without tone mapping."/>
    118   
    119 <p>
    120   It becomes clear that the intense light values at the end of the tunnel are clamped to <code>1.0</code>, as a large portion of it is completely white, effectively losing all lighting detail in the process. As we directly write HDR values to an LDR output buffer, it is as if we have no HDR enabled in the first place. What we need to do is transform all the floating point color values into the <code>0.0</code> - <code>1.0</code> range without losing any of their detail. We need to apply a process called <def>tone mapping</def>.
    121 </p>
    122       
    123 <h2>Tone mapping</h2>
    124 <p>
    125   Tone mapping is the process of transforming floating point color values to the expected [<code>0.0</code>, <code>1.0</code>] range, known as low dynamic range, without losing too much detail, often accompanied by a specific stylistic color balance.
    126 </p>
    127       
    128 <p>
    129   One of the simpler tone mapping algorithms is <def>Reinhard tone mapping</def>, which spreads out the entire range of HDR color values onto LDR, evenly balancing out all brightness values. We include Reinhard tone mapping in the previous fragment shader and also add a <a href="https://learnopengl.com/Advanced-Lighting/Gamma-Correction" target="_blank">gamma correction</a> filter for good measure (including the use of sRGB textures): 
    130 </p>
    131       
    132 <pre><code>
    133 void main()
    134 {             
    135     const float gamma = 2.2;
    136     vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;
    137   
    138     // reinhard tone mapping
    139     vec3 mapped = hdrColor / (hdrColor + vec3(1.0));
    140     // gamma correction 
    141     mapped = pow(mapped, vec3(1.0 / gamma));
    142   
    143     FragColor = vec4(mapped, 1.0);
    144 }    
    145 </code></pre>
    146       
    147 <p>
    148   With Reinhard tone mapping applied we no longer lose any detail at the bright areas of our scene. It does tend to slightly favor brighter areas, making darker regions seem less detailed and distinct: 
    149 </p>
    150       
    151       <img src="/img/advanced-lighting/hdr_reinhard.png" class="clean" alt="Reinhard tone mapping algorithm applied with HDR rendering in OpenGL"/>
    152         
    153 <p>
    154   Here you can again see details at the end of the tunnel as the wood texture pattern becomes visible again. With this relatively simple tone mapping algorithm we can properly see the entire range of HDR values stored in the floating point framebuffer, giving us precise control over the scene's lighting without losing details.
    155 </p>
    156         
    157 <note>
    158 	Note that we could also directly tone map at the end of our lighting shader, not needing any floating point framebuffer at all! However, as scenes get more complex you'll frequently find the need to store intermediate HDR results as floating point buffers so this is a good exercise.  
    159 </note>
    160         
    161 <p>
    162   Another interesting use of tone mapping is to allow the use of an exposure parameter. You probably remember from the introduction that HDR images contain a lot of details visible at different exposure levels. If we have a scene that features a day and night cycle, it makes sense to use a lower exposure during daylight and a higher exposure at night time, similar to how the human eye adapts. Such an exposure parameter allows us to configure lighting values that work both at day and night under different lighting conditions, as we only have to change the exposure parameter.
    163 </p>
    164         
    165 <p>
    166   A relatively simple exposure tone mapping algorithm looks as follows:
    167 </p>
    168         
    169 <pre><code>
    170 uniform float exposure;
    171 
    172 void main()
    173 {             
    174     const float gamma = 2.2;
    175     vec3 hdrColor = texture(hdrBuffer, TexCoords).rgb;
    176   
    177     // exposure tone mapping
    178     vec3 mapped = vec3(1.0) - exp(-hdrColor * exposure);
    179     // gamma correction 
    180     mapped = pow(mapped, vec3(1.0 / gamma));
    181   
    182     FragColor = vec4(mapped, 1.0);
    183 }  
    184 </code></pre>
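<p>
  On the application side we then upload the exposure value before rendering the quad. A minimal sketch, assuming GLFW for input handling and the tutorials' <var>Shader</var> helper class with its <var>setFloat</var> function (the Q/E key bindings and step size are arbitrary choices):
</p>

<pre><code>
float exposure = 1.0f; // default exposure

void processInput(GLFWwindow* window)
{
    // decrease/increase exposure with Q and E
    if (glfwGetKey(window, GLFW_KEY_Q) == GLFW_PRESS)
        exposure = std::max(0.0f, exposure - 0.01f);
    if (glfwGetKey(window, GLFW_KEY_E) == GLFW_PRESS)
        exposure += 0.01f;
}

// in the render loop, before drawing the screen-filling quad:
hdrShader.use();
hdrShader.setFloat("exposure", exposure);
</code></pre>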
    185         
    186 <p>
    187   Here we defined an <var>exposure</var> uniform that defaults to <code>1.0</code> and allows us to more precisely specify whether we'd like to focus on the dark or bright regions of the HDR color values. For instance, with high exposure values the darker areas of the tunnel show significantly more detail. In contrast, a low exposure largely removes the dark region details, but allows us to see more detail in the bright areas of a scene. Take a look at the image below to see the tunnel at multiple exposure levels:
    188 </p>
    189         
    190         <img src="/img/advanced-lighting/hdr_exposure.png" alt="Multiple exposure levels of HDR tone mapping in OpenGL"/>
    191           
    192 <p>
    193   This image clearly shows the benefit of high dynamic range rendering. By changing the exposure level we get to see a lot of details of our scene that would otherwise have been lost with low dynamic range rendering. Take the end of the tunnel for example. With a normal exposure the wood structure is barely visible, but with a low exposure the detailed wooden patterns are clearly visible. The same holds for the wooden patterns close by, which are more visible with a high exposure. 
    194 </p>
    195           
    196 <p>
    197   You can find the source code of the demo <a href="/code_viewer_gh.php?code=src/5.advanced_lighting/6.hdr/hdr.cpp" target="_blank">here</a>.
    198 </p>
    199           
    200 <h3>More HDR</h3>
    201 <p>
    202   The two tone mapping algorithms shown are only two of a large collection of (more advanced) tone mapping algorithms, each with its own strengths and weaknesses. Some tone mapping algorithms favor certain colors/intensities over others and some algorithms display both the low and high exposure colors at the same time to create more colorful and detailed images. There is also a collection of techniques known as <def>automatic exposure adjustment</def> or <def>eye adaptation</def> techniques that determine the brightness of the scene in the previous frame and (slowly) adapt the exposure parameter such that the scene gets brighter in dark areas or darker in bright areas, mimicking the human eye.
    203 </p>
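<p>
  As an illustration of the idea (not a particular published technique), one simple way to estimate the previous frame's brightness is to generate mipmaps for the HDR color texture and read back its smallest mip level, which holds roughly the average scene color, and then nudge the exposure toward a target each frame. A rough sketch; the target formula and adaptation speed are arbitrary choices:
</p>

<pre><code>
// after rendering the scene into the floating point framebuffer:
glBindTexture(GL_TEXTURE_2D, hdrColorBufferTexture);
glGenerateMipmap(GL_TEXTURE_2D);

// the highest mip level is a single pixel: (roughly) the average scene color
int maxLevel = (int)std::floor(std::log2((float)std::max(SCR_WIDTH, SCR_HEIGHT)));
float avgColor[4];
glGetTexImage(GL_TEXTURE_2D, maxLevel, GL_RGBA, GL_FLOAT, avgColor);
float luminance = 0.2126f * avgColor[0] + 0.7152f * avgColor[1] + 0.0722f * avgColor[2];

// slowly adapt the exposure toward a target based on the measured luminance
float targetExposure = 0.5f / std::max(luminance, 0.0001f);
exposure += (targetExposure - exposure) * 0.05f; // 0.05: adaptation speed per frame
</code></pre>

<p>
  Note that reading the texture back to the CPU with glGetTexImage stalls the pipeline each frame; real implementations usually keep this computation on the GPU.
</p>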
    204                     
    205 <p>
    206   The real benefit of HDR rendering really shows itself in large and complex scenes with heavy lighting algorithms. As it is difficult to create such a complex demo scene for teaching purposes while keeping it accessible, the chapter's demo scene is small and lacks detail. While relatively simple, it does show some of the benefits of HDR rendering: no details are lost in bright or dark regions as they can be restored with tone mapping, the addition of multiple lights doesn't cause clamped regions, and light values can be specified by their real brightness values without being limited by LDR. Furthermore, HDR rendering also makes several other interesting effects more feasible and realistic; one of these effects is <def>bloom</def>, which we'll discuss in the <a href="https://learnopengl.com/Advanced-Lighting/Bloom" target="_blank">next</a> chapter.
    207 </p>
    208           
    209 <h2>Additional resources</h2>
    210 <ul>
    211   <li><a href="http://gamedev.stackexchange.com/questions/62836/does-hdr-rendering-have-any-benefits-if-bloom-wont-be-applied" target="_blank">Does HDR rendering have any benefits if bloom won't be applied?</a>: a stackexchange question that features a great lengthy answer describing some of the benefits of HDR rendering.</li>
    212  <li><a href="http://photo.stackexchange.com/questions/7630/what-is-tone-mapping-how-does-it-relate-to-hdr" target="_blank">What is tone mapping? How does it relate to HDR?</a>: another interesting answer with great reference images to explain tone mapping.</li>
    213 </ul>       
    214 