CS-378: Game Technology

  1. CS-378: Game Technology • Lecture #7: More Mapping • Prof. Okan Arikan • University of Texas, Austin • Thanks to James O’Brien, Steve Chenney, Zoran Popovic, Jessica Hodgins • V2005-08-1.1

  2. Today • More on mapping • Environment mapping • Light mapping • Bump mapping • Buffers

  3. Environment Mapping • Environment mapping produces reflections on shiny objects • Texture is transferred from the environment map onto the object in the direction of the reflected ray • Uses a ray with the same direction as the reflected ray, but starting at the object center • Map contains a view of the world as seen from the center of the object • [Figure: a viewer's reflected ray at the object surface is replaced by a lookup ray from the object center into the surrounding environment map]

  4. Environment Mapping • Example images: www.debevec.org, Need for Speed Underground, Far Cry

  5. Lat/Long Mapping • The original algorithm (Blinn and Newell, 1976) placed the map on a sphere centered on the object • Mapping functions assume the (s,t) texture coordinates equate to longitude and latitude on the sphere (see the sketch below) • What is bad about this method? • Sampling • Map generation • Complex texture coordinate computations
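A minimal sketch of the lat/long computation, assuming a normalized reflection vector and one common axis convention (latitude measured from +Y); the 1976 original may use a different convention:

```cpp
#include <cmath>

// Map a normalized reflection vector to lat/long texture coordinates.
// s spans longitude [0,1), t spans latitude [0,1]; the axis convention
// here is illustrative, not necessarily the original paper's.
void latLongCoords(float rx, float ry, float rz, float& s, float& t)
{
    const float PI = 3.14159265f;
    float longitude = std::atan2(rz, rx);   // [-pi, pi]
    float latitude  = std::acos(ry);        // [0, pi], measured from +Y
    s = (longitude + PI) / (2.0f * PI);
    t = latitude / PI;
}
```

The acos is where the sampling problem shows up: rows of texels near t = 0 and t = 1 (the poles) cover tiny solid angles, so the map is sampled very non-uniformly.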

  6. Cube Mapping • Put the object at the center of a cube • Represent the environment on the cube faces • Assumptions? • Hardware supported (the lookup is sketched below) • [Figure: a view ray reflects off the object, and the reflection ray indexes one of the six cube faces]
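A sketch of the lookup a cube-mapping implementation performs, assuming the usual OpenGL face layout; in practice the hardware does this per pixel:

```cpp
#include <cmath>

// Select the cube face hit by reflection vector r and compute (s,t)
// within that face. Sign conventions follow the usual OpenGL cube-map
// layout (faces 0..5 = +X,-X,+Y,-Y,+Z,-Z).
void cubeMapCoords(float rx, float ry, float rz,
                   int& face, float& s, float& t)
{
    float ax = std::fabs(rx), ay = std::fabs(ry), az = std::fabs(rz);
    float ma, sc, tc;                      // major axis, in-face coords
    if (ax >= ay && ax >= az) {            // +X or -X face
        ma = ax; face = rx > 0 ? 0 : 1;
        sc = rx > 0 ? -rz : rz;  tc = -ry;
    } else if (ay >= az) {                 // +Y or -Y face
        ma = ay; face = ry > 0 ? 2 : 3;
        sc = rx;  tc = ry > 0 ? rz : -rz;
    } else {                               // +Z or -Z face
        ma = az; face = rz > 0 ? 4 : 5;
        sc = rz > 0 ? rx : -rx;  tc = -ry;
    }
    s = 0.5f * (sc / ma + 1.0f);           // remap [-1,1] to [0,1]
    t = 0.5f * (tc / ma + 1.0f);
}
```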

  7. Sphere Mapping • Again the map lives on a sphere, but now the coordinate mapping is simplified • To generate the map: • Take a map point (s,t), cast a ray onto a sphere in the -Z direction, and record what is reflected • Equivalent to photographing a reflective sphere with an orthographic camera (long lens, large distance) • This equivalence makes the method suitable for film special effects, since the map can be captured by photographing a real mirrored ball

  8. A Sphere Map

  9. Indexing Sphere Maps • Given the reflection vector r = (rx, ry, rz): m = 2 sqrt(rx^2 + ry^2 + (rz + 1)^2), s = rx/m + 1/2, t = ry/m + 1/2 (sketched below) • Implemented in hardware • Problems: • Highly non-uniform sampling • Highly non-linear mapping
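The same computation as a short code sketch; this matches the formula fixed-function OpenGL uses for GL_SPHERE_MAP texture-coordinate generation:

```cpp
#include <cmath>

// Sphere-map indexing: offset z by 1, normalize, remap to [0,1].
// Singular at r = (0,0,-1), where m = 0 -- the direction that maps to
// the edge of the disc, and the worst case of the non-uniform sampling.
void sphereMapCoords(float rx, float ry, float rz, float& s, float& t)
{
    float m = 2.0f * std::sqrt(rx * rx + ry * ry + (rz + 1.0f) * (rz + 1.0f));
    s = rx / m + 0.5f;
    t = ry / m + 0.5f;
}
```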

  10. Non-uniform Sampling

  11. Non-linear Mapping • Linear interpolation of per-vertex texture coordinates picks up the wrong texture pixels • Use small polygons! • [Figure: correct result vs. linear interpolation]

  12. Example

  13. Other Env. Map Tricks • Partially reflective objects • First stage applies the color texture • Second stage does environment mapping, alpha-blended with the existing color • Just put the lights in the environment map • What does this simulate? • Recursive reflections • Bad cases for environment maps?

  14. Light Maps • Speed up lighting calculations by pre-computing lighting and storing it in maps • Allows complex illumination models to be used in generating the map (e.g. shadows, radiosity) • Used in complex rendering systems (e.g. Radiance), not just games • Issues: • How is the mapping determined? • How are the maps generated? • How are they applied at run-time?

  15. Example • Example images: www.flipcode.com, Call of Duty

  16. Choosing a Mapping • Problem: In a preprocessing phase, points on polygons must be associated with points in maps • One solution: • Find groups of polygons that are “near” co-planar and do not overlap when projected onto a plane • Result is a mapping from polygons to planes • Combine sections of the chosen planes into larger maps • Store texture coordinates at polygon vertices • Lighting tends to change quite slowly (except when?), so the map resolution can be poor
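A minimal sketch of the projection step just described, assuming a simple dominant-axis rule; real light-map builders also cluster near co-planar polygons and pack the resulting charts into larger maps:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Assign light-map UVs by projecting a polygon's point p onto the axis
// plane most aligned with the polygon's normal.
void planarUV(const Vec3& normal, const Vec3& p, float& u, float& v)
{
    float ax = std::fabs(normal.x), ay = std::fabs(normal.y), az = std::fabs(normal.z);
    if (ax >= ay && ax >= az)      { u = p.y; v = p.z; }  // project along X
    else if (ay >= az)             { u = p.x; v = p.z; }  // project along Y
    else                           { u = p.x; v = p.y; }  // project along Z
    // A later pass rescales (u,v) into this chart's rectangle in the map.
}
```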

  17. Generating the Map • Problem: What value should go in each pixel of the light map? • Solution: • Map texture pixels back into world space (using the inverse of the texture mapping) • Take the illumination of the polygon and put it in the pixel • Advantages of this approach: • Choosing “good” planes means that texture pixels map to roughly square pieces of polygon - good sampling • Not too many maps are required, and not much memory is wasted

  18. Example • [Figure: light map applied with nearest vs. linear interpolation] • What type of lighting (diffuse, specular, reflections) can the map store?

  19. Example • [Figure: the same scene without and with light maps]

  20. Applying Light Maps • Use multi-texturing hardware (see the sketch below) • First stage: apply the color texture map • Second stage: modulate with the light map • Modulation can only make points darker • DirectX also offers modes that brighten with a texture (e.g. modulate 2X) • Pre-lighting textures: • Apply the light map to the texture maps as a pre-process • Why is this less appealing? • Multi-stage rendering: • Same effect as multi-texturing, but modulating in the frame buffer
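A sketch of the two-stage setup using OpenGL 1.3-era fixed-function multi-texturing; colorTex and lightMapTex are assumed, already-loaded texture objects:

```cpp
#include <GL/gl.h>

// Unit 0 applies the color texture; unit 1 modulates it with the light
// map. Since GL_MODULATE multiplies, the light map can only darken.
void setupLightMapping(GLuint colorTex, GLuint lightMapTex)
{
    glActiveTexture(GL_TEXTURE0);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, lightMapTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    // Geometry must supply two sets of texture coordinates, e.g. via
    // glMultiTexCoord2f(GL_TEXTURE0, ...) and glMultiTexCoord2f(GL_TEXTURE1, ...).
}
```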

  21. Dynamic Light Maps • Light maps are a preprocessing step, so they can only capture static lighting • Texture transformations allow some effects • What is required to recompute a light map at run-time? • How might we make this tractable? • Spatial subdivision algorithms allow us to identify nearby objects, which helps with this process • Compute a separate, dynamic light map at run-time using the same mapping as the static light map • Add an additional texture pass to apply the dynamic map

  22. Fog Maps • Dynamic modification of light maps • Put fog objects into the scene • Compute where they intersect geometry and paint the fog density into a dynamic light map • Use the same mapping as the static light map • Apply the fog map as with a light map • Extra texture stage

  23. Fog Map Example

  24. Bump Mapping • Bump mapping modifies the surface normal vector according to information in the map • Light dependent: the appearance of the surface depends on the lighting direction • View dependent: the effect of the bumps may depend on which direction the surface is viewed from • Bump mapping can be implemented with multi-texturing, multi-pass rendering, or pixel shaders

  25. Storing the Bump Map • Several options for what to store in the map • The normal vector to use • An offset to the default normal vector • Data derived from the normal vector • Illumination changes for a fixed view

  26. Embossing • Apply a height field as a modulating texture map • First application: apply it in place • Second application: shift it by an amount that depends on the light direction, and subtract it (sketched below)
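A software sketch of the two applications, assuming a hypothetical height-field sampler heightAt and a light-dependent texture-space shift (du, dv) computed per vertex:

```cpp
float heightAt(float u, float v);   // assumed: samples the height field

// Emboss: sample the height field in place and shifted against the
// light, then subtract; the difference approximates a directional slope.
float embossIntensity(float u, float v, float du, float dv)
{
    float h0 = heightAt(u, v);            // first application: in place
    float h1 = heightAt(u + du, v + dv);  // second: shifted by light direction
    return 0.5f + (h0 - h1);              // bias so flat regions stay mid-grey
}
```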

  27. Dot Product Bump Mapping • Store normal vectors in the bump map • Specify light directions instead of colors at the vertices • Apply the bump map using the dot3 operator, which takes a per-pixel dot product (see the sketch below) • Lots of details: • Light directions must be normalized, which can be done with a cubic environment map • How do you get the color in? • How do you do specular highlights?
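A hedged sketch of the fixed-function setup using the GL_DOT3_RGB combiner; normalMapTex is an assumed, already-loaded texture object:

```cpp
#include <GL/gl.h>

// Dot3 bump mapping: the texture holds normals packed into RGB, and the
// per-vertex primary color carries the packed light direction.
void setupDot3BumpMapping(GLuint normalMapTex)
{
    glBindTexture(GL_TEXTURE_2D, normalMapTex);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);        // normal
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);  // light dir
    // Vectors are range-compressed as 0.5*v + 0.5, so a light direction L
    // is sent as glColor3f(0.5f*L.x + 0.5f, 0.5f*L.y + 0.5f, 0.5f*L.z + 0.5f).
}
```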

  28. Dot Product Results • Example images: www.nvidia.com

  29. Normal Mapping • Example images: DOOM 3, James Hastings-Trew

  30. Environment Bump Mapping • Perturb the environment map lookup directions with the bump map • Example images: Nvidia, Far Cry

  31. Multi-Pass Rendering • The pipeline takes one triangle at a time, so only local information and pre-computed maps are available • Multi-pass techniques render the scene, or parts of the scene, multiple times • They make use of auxiliary buffers to hold information • They make use of tests and logical operations on values in the buffers • Really, a set of functionality that can be used to achieve a wide range of effects • Mirrors, shadows, bump maps, anti-aliasing, compositing, …

  32. Buffers • Buffers allow you to store global information about the rendered scene • Like scratch work space, or extra screen memory • They are only cleared when you say so • This functionality is fundamentally different from that of vertex or pixel shaders • Buffers are defined by: • The type of values they store • The logical operations that they influence • The way they are accessed (written and read)

  33. OpenGL Buffers • Color buffers: Store RGBA color information for each pixel • OpenGL actually defines four or more color buffers: front/back (double buffering), left/right (stereo) and auxiliary color buffers • Depth buffer: Stores depth information for each pixel • Stencil buffer: Stores some number of bits for each pixel • Accumulation buffer: Like a color buffer, but with higher resolution and different operations

  34. Fragment Tests • A fragment is a pixel-sized piece of shaded polygon, with color and depth information • After pixel shaders and/or texturing • The tests and operations performed with the fragment on its way to the color buffer are essential to understanding multi-pass techniques • Most important are, in order: • Alpha test • Stencil test • Depth test • Blending • Tests must be explicitly enabled • As the fragment passes through, some of the buffers may also have values stored into them

  35. Alpha Test • The alpha test either allows a fragment to pass, or stops it, depending on the outcome of a test: if ( fragment op reference ) pass fragment on • Here, fragment is the fragment’s alpha value, and reference is a reference alpha value that you specify • op is one of: <, <=, =, !=, >, >= • There are also the special tests Always and Never, which always or never let the fragment through • What is a sensible default?

  36. Billboards • Billboards are texture-mapped polygons, typically used for things like trees • Image-based rendering method where complex geometry (the tree) is replaced with an image placed in the scene (the textured polygon) • The texture has alpha values associated with it: 1 where the tree is, and 0 where it isn’t • So you can see through the polygon in places where the tree isn’t

  37. Alpha Test and Billboards • You can use texture blending to make the polygon see through, but there is a big problem • What happens if you draw the billboard and then draw something behind it? • Hint: Think about the depth buffer values • This is one reason why transparent objects must be rendered back to front • The best way to draw billboards is with an alpha test: Do not let alpha < 0.5 pass through • Depth buffer is never set for fragments that are see through • Doesn’t work for partially transparent polygons - more later
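A minimal sketch of that alpha-test setup in fixed-function OpenGL; drawBillboards is an assumed draw call for the textured quads:

```cpp
#include <GL/gl.h>

void drawBillboards();   // assumed: draws the textured billboard quads

// Discard see-through texels before they reach the depth buffer, so
// geometry behind the "holes" in the tree still draws correctly.
void renderBillboards()
{
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GEQUAL, 0.5f);   // only fragments with alpha >= 0.5 pass
    drawBillboards();
    glDisable(GL_ALPHA_TEST);
}
```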

  38. Stencil Buffer • The stencil buffer acts like a paint stencil - it lets some fragments through but not others • It stores a multi-bit value per pixel; you have some control over the number of bits • You specify two things: • The test that controls which fragments get through • The operations to perform on the buffer when the test passes or fails • All tests/operations look at the value in the stencil that corresponds to the pixel location of the fragment • Typical usage: one rendering pass sets values in the stencil, which control how various parts of the screen are drawn in the second pass

  39. Stencil Tests • You give an operation, a reference value, and a mask • Operations: • Always let the fragment through • Never let the fragment through • Logical comparisons between the reference value and the buffer value: ( reference & mask ) op ( buffer & mask ), where op is one of <, <=, =, !=, >, >= • The mask is used to select particular bit-planes for the operation

  40. Stencil Operations • Specify three different operations • If the stencil test fails • If the stencil passes but the depth test fails • If the stencil passes and the depth test passes • Operations are: • Keep the current stencil value • Zero the stencil • Replace the stencil with the reference value • Increment the stencil • Decrement the stencil • Invert the stencil (bitwise)
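A sketch of the typical two-pass usage, here a planar mirror; the draw calls are assumed placeholders, while the OpenGL functions and operation names are the real ones:

```cpp
#include <GL/gl.h>

void drawMirrorSurface();    // assumed draw calls
void drawReflectedScene();

// Pass 1 marks the mirror's pixels in the stencil; pass 2 draws the
// reflection only where the mark is set.
void renderMirror()
{
    glEnable(GL_STENCIL_TEST);

    glStencilFunc(GL_ALWAYS, 1, 0xFF);          // pass 1: always pass the test
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);  // write 1 where depth passes
    drawMirrorSurface();

    glStencilFunc(GL_EQUAL, 1, 0xFF);           // pass 2: only where stencil == 1
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);     // leave the stencil unchanged
    drawReflectedScene();

    glDisable(GL_STENCIL_TEST);
}
```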

  41. Depth Test and Operation • The depth test compares the depth of the fragment with the depth in the buffer • Depth increases with greater distance from the viewer • Tests are: Always, Never, <, <=, =, !=, >, >= • The depth operation either writes the fragment's depth to the buffer or leaves the buffer unchanged • Why do the test but leave the buffer unchanged? • Each buffer stores different information about the pixel, so a test on one buffer may be useful in managing another
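A minimal sketch of testing without writing, the usual setting for blended transparent geometry; drawTransparentObjects is an assumed draw call:

```cpp
#include <GL/gl.h>

void drawTransparentObjects();   // assumed: sorted back to front

// Test against the depth buffer but leave it unchanged, so transparent
// fragments are hidden by opaque geometry yet never occlude each other.
void renderTransparency()
{
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);      // pass if at least as close as stored depth
    glDepthMask(GL_FALSE);       // do the test, but don't write new depths
    drawTransparentObjects();
    glDepthMask(GL_TRUE);
}
```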

  42. Copy to Texture • You can copy the framebuffer contents to a texture • Very powerful • Why?
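A minimal sketch using glCopyTexSubImage2D; sceneTex is an assumed texture object already allocated at the framebuffer's size (e.g. with glTexImage2D):

```cpp
#include <GL/gl.h>

// Copy the current framebuffer into an existing texture, which can then
// be mapped onto geometry -- mirrors, feedback effects, post-processing.
void grabFramebuffer(GLuint sceneTex, int width, int height)
{
    glBindTexture(GL_TEXTURE_2D, sceneTex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0,   // target, mip level
                        0, 0,               // destination offset in texture
                        0, 0,               // source origin in framebuffer
                        width, height);
}
```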

  43. Multi-Pass Algorithms • Designing a multi-pass algorithm is a non-trivial task • At least one person I know of has received a PhD for developing such algorithms • References for multi-pass algorithms: • Real-Time Rendering has them indexed by problem • The OpenGL Programming Guide discusses many multi-pass techniques in a reasonably understandable manner • Game Programming Gems has some • The book by Watt and Policarpo has others • Several have been published as academic papers

  44. Multi-Pass Examples • Transparent objects

  45. Reading • Core Techniques & Algorithms in Game Programming • Chapter 18, pages 565-600
