Explore advanced shader techniques in computer graphics for improved lighting effects, normal mapping, and more. Learn how modern applications and games use shaders, with an overview of per-pixel lighting, specular maps, normal mapping, and their implementation in rendering.
Advanced Computer Graphics: Advanced Shaders • CO2409 Computer Graphics • Week 16
Lecture Contents • Introduction • Per-Pixel Lighting • Normal Mapping • Parallax Mapping • Cell Shading • Other Techniques • Further Reference
Introduction • We have looked at the basics of the entire pipeline, using vertex and pixel shaders • We have also created some interesting effects using shader techniques • But these effects were of limited use [wiggle!] • We can use advanced shaders to implement more useful high-quality rendering techniques • Modern applications & games make much use of such shaders • Here we briefly introduce some key techniques • Not in depth, intended to give a flavour only • The overview of techniques is examinable; exact knowledge of the finest detail is not expected
Per-Pixel Lighting • We have considered lighting as a vertex processing task • Vertex lighting colours were interpolated (blended) across polygon pixels • However, there are problems when lighting large polygons using only their vertices • Specular highlights can be lost or vague • Attenuation (fall-off of brightness) is poorly represented • We can move the lighting calculations to the pixel shader • Very precise lighting effects are possible • More expensive – many more lighting calculations
An Aside: Rasterisation • Vertex shaders work on vertices, pixel shaders on pixels • There are many pixels per triangle, so how are they generated? • The rasterisation stage, occurring between the shaders, converts triangles into pixels • Take three vertices forming a triangle, calculate every pixel inside • Call the pixel shader once for every pixel • Data from the vertices is interpolated to give the input to the pixel shader • Clearly pixel shader performance is important
An Aside: Interpolation • The vertex shader outputs data into the pipeline • E.g. colours, texture coordinates, etc. • A geometry shader may update this data, but we’ll ignore that here • This data is linearly interpolated by the rasterisation stage to give inputs to the pixel shader • Each pixel is given a weighted average of the outputs from the three vertices in the triangle • In fact some perspective correction also occurs • Some data doesn’t interpolate ideally • E.g. vectors change length when interpolated (see the sketch below)
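For example, a unit normal interpolated across a triangle generally arrives at the pixel shader slightly shorter than unit length, so it is usually re-normalised before lighting. A minimal HLSL-style sketch (the function name is illustrative, not part of the course framework):

```hlsl
// Interpolation shortens unit vectors: the weighted average of two different
// unit normals has length less than 1, so restore unit length before lighting
float3 FixInterpolatedNormal(float3 interpolatedNormal)
{
    return normalize(interpolatedNormal);
}
```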
Per-Pixel Lighting • Per-pixel lighting: • Note the reflection on the front of the cube • The highlight doesn’t touch any vertex • Vertex processing would lose this highlight entirely • Also the attenuation & specular on the floor would be lost • Method: • The vertex shader passes the vertex position and normal on to the pixel shader • These are interpolated & used by the pixel shader to calculate lighting with the usual equations • I.e. just pass data down the pipeline and calculate lighting later (a sketch follows)
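A minimal HLSL sketch of this approach, assuming a single point light and Blinn-Phong specular; the semantics, matrix and light variable names are assumptions for illustration, not the exact course framework:

```hlsl
// Assumed constants (hypothetical names for illustration)
float4x4 WorldMatrix;
float4x4 ViewProjMatrix;
float3   LightPos;
float3   LightColour;
float3   CameraPos;
float    SpecularPower;

struct VS_OUT
{
    float4 ProjPos     : SV_Position;
    float3 WorldPos    : TEXCOORD0;
    float3 WorldNormal : TEXCOORD1;
};

// Vertex shader: just transform and pass world position & normal down the pipeline
VS_OUT PerPixelLightingVS(float3 pos : POSITION, float3 normal : NORMAL)
{
    VS_OUT o;
    float4 worldPos = mul(float4(pos, 1.0f), WorldMatrix);
    o.WorldPos      = worldPos.xyz;
    o.WorldNormal   = mul(float4(normal, 0.0f), WorldMatrix).xyz;
    o.ProjPos       = mul(worldPos, ViewProjMatrix);
    return o;
}

// Pixel shader: the usual lighting equations, but evaluated per pixel
float4 PerPixelLightingPS(VS_OUT i) : SV_Target
{
    float3 n = normalize(i.WorldNormal);          // fix interpolation shrinkage
    float3 l = normalize(LightPos - i.WorldPos);  // pixel-to-light direction
    float3 v = normalize(CameraPos - i.WorldPos); // pixel-to-camera direction
    float3 h = normalize(l + v);                  // half vector (Blinn-Phong)

    float3 diffuse  = LightColour * max(dot(n, l), 0.0f);
    float3 specular = LightColour * pow(max(dot(n, h), 0.0f), SpecularPower);

    return float4(diffuse + specular, 1.0f);
}
```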
Specular / Gloss Maps • Multiple textures can be blended on a polygon • Trivial in a shader • E.g. a specular map (or gloss map) is a texture used to adjust specular highlights • It defines which areas are affected by specular light • i.e. the shiny areas
Diffuse & Specular Maps • Equation used in the pixel shader to apply a specular map: Output pixel colour = diffuse light * diffuse map + specular light * specular map • The main texture is called the diffuse map • The textures are named after the lighting they affect • Convenient to store the diffuse and specular map within a single bitmap • Diffuse in the R, G & B channels, specular in the alpha channel • So the alpha channel can’t also be used for transparency
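A sketch of how this combination might look in a pixel shader, assuming the per-pixel diffuse and specular lighting colours have already been calculated and the specular map is packed into the texture’s alpha channel (the texture and sampler names are assumptions):

```hlsl
Texture2D    DiffuseSpecularMap; // rgb = diffuse map, a = specular map
SamplerState TexSampler;

// diffuseLight / specularLight: per-pixel lighting results from the usual equations
float4 CombineMaps(float3 diffuseLight, float3 specularLight, float2 uv)
{
    float4 texColour = DiffuseSpecularMap.Sample(TexSampler, uv);

    // Output pixel colour = diffuse light * diffuse map + specular light * specular map
    float3 colour = diffuseLight * texColour.rgb + specularLight * texColour.a;
    return float4(colour, 1.0f);
}
```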
Normal Mapping • When calculating lighting per-pixel, the normals are interpolated: • All polygons will be flat or smoothly curved • Can only represent bumpiness with detailed models • Use yet another texture containing normals – a normal map • Note the use of multiple textures in more advanced effects • Look up the normal for each pixel in this texture • Ignore interpolated vertex normals • Improves quality greatly • But some technical complexity
Normal Mapping • Normal mapping example: • Diffuse texture is very simple • Normal map shown later • Normal mapping suggests bumpiness when combined with pixel lighting
Normal Map Example • Normal maps look unusual • Look at the red, green and blue channels to visualise them: • Red = X direction of the normal • Green = Y direction of the normal • Blue = Z direction of the normal • Combined, these channels give the actual normal map
Normal Map Transformation • Normals in a normal map are defined relative to the polygon texture they are applied to • Assumes texture is facing down the Z-axis • This is tangent space – the world as seen from the texture • But textures in the scene might face in any direction • So transform normals from normal map onto mesh • This is a matrix transform: • Tangent space (relative to texture) • To world space (because lights are in world space)
Tangent Space • We need to calculate a matrix to represent the viewpoint of each polygon - the tangent space of the polygon • X & Y axes match the U & V texture axes • The Z axis is simply the polygon normal • We already have the polygon normal • We store the tangent vector (texture U axis) • But we calculate the bi-tangent (see lab) • Normal maps use RGB to define (X,Y,Z) vectors in tangent space • RGB range is 0-1, vector range is -1 to 1 • So the conversion is: (RGB values * 2) - 1 • E.g. an outward normal of (0,0,1) is stored as RGB(0.5,0.5,1) • This gives the distinctive purple-blue of normal maps (see the sketch below)
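A hedged sketch of the normal decode and tangent-to-world transform (the function layout and variable names are assumptions; the lab may structure this differently):

```hlsl
Texture2D    NormalMap;
SamplerState TexSampler;

// worldNormal / worldTangent: per-pixel values interpolated from the vertex data
float3 GetWorldNormal(float3 worldNormal, float3 worldTangent, float2 uv)
{
    // Rebuild the tangent-space axes: Z = polygon normal, X = tangent (texture U axis),
    // Y = bi-tangent calculated from the other two
    float3 n = normalize(worldNormal);
    float3 t = normalize(worldTangent - dot(worldTangent, n) * n); // re-orthogonalise
    float3 b = cross(n, t);                                        // bi-tangent
    float3x3 tangentToWorld = float3x3(t, b, n);

    // Decode the stored normal: RGB in 0..1 becomes a vector in -1..1
    float3 tangentNormal = NormalMap.Sample(TexSampler, uv).rgb * 2.0f - 1.0f;

    // Transform from tangent space into world space, where the lights are defined
    return normalize(mul(tangentNormal, tangentToWorld));
}
```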
Parallax Mapping • Normal maps give the impression of bumpiness with correct lighting, but it’s only a lighting effect • The actual depth of the bumpiness is not shown • Parallax mapping tries to show depth as well as bumpy lighting • Adjust the UVs used in the pixel shader • Distorts the texture to give an impression of depth • It’s an artificial effect • (Images: parallax mapping vs. normal mapping comparison)
Idea behind Parallax Mapping • With parallax mapping we also need a height map: • A value for each texel indicating its height • Stored in the alpha channel of the normal map • Height map: black = low, white = high
Basics of Parallax Mapping • We have a “bumpy” texture, but we are applying it to a flat surface • This means that the texture coordinates we normally use are not quite the right ones • We attempt to correct this by offsetting the UVs • Basic parallax mapping offsets them using the camera (eye) vector • Uses an approximation shown in the diagrams (and sketched in code below)
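A minimal sketch of the basic parallax offset, assuming the height map is packed into the normal map’s alpha channel and the pixel-to-camera vector has already been transformed into tangent space (the depth constant and names are assumptions):

```hlsl
Texture2D    NormalHeightMap;  // rgb = normal map, a = height map
SamplerState TexSampler;

static const float ParallaxDepth = 0.04f;  // assumed strength of the effect

// toCameraTangent: normalised pixel-to-camera vector in tangent space
float2 ParallaxUV(float2 uv, float3 toCameraTangent)
{
    // Height of this texel, 0 (low) to 1 (high)
    float height = NormalHeightMap.Sample(TexSampler, uv).a;

    // Shift the UVs along the camera's XY direction in tangent space, in
    // proportion to the height - an approximation of where the eye ray
    // would really hit the bumpy surface
    float2 offset = toCameraTangent.xy * (height - 0.5f) * ParallaxDepth;
    return uv + offset;
}
```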
Another Shader Example: Cell-Shading • Cell-shading describes cartoon-like rendering: • Black outlines and only a few colours • The outline can be created inside or outside of a shader • Here we draw a second, slightly larger, inside-out black model • There are other methods too (e.g. could look at polygon normals / camera) • Lighting is calculated in the usual way • But colours are clamped to a fixed set of values • A 1D texture is used for this (see labs, and the sketch below) • An example of non-photorealistic rendering
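A sketch of the lighting-clamp part of cell-shading, using a small 1D texture as a lookup that snaps the diffuse term to a few hard bands (the texture name and point-sampling state are assumptions):

```hlsl
Texture1D    CellGradient;   // small 1D texture holding a few discrete brightness bands
SamplerState PointSample;    // point sampling so the bands stay hard-edged

// Returns the banded diffuse level for a given normal and light direction
float CellDiffuse(float3 worldNormal, float3 toLight)
{
    // Usual diffuse term, 0..1
    float diffuse = max(dot(normalize(worldNormal), normalize(toLight)), 0.0f);

    // Look the value up in the 1D texture, which clamps it to a fixed set of levels
    return CellGradient.Sample(PointSample, diffuse).r;
}
```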
Other Techniques • Many extensions to these techniques, e.g.: • Animated normal maps to create water effects • The parallax mapping method can be taken further to create self-shadowing surfaces with real depth • This is called steep parallax mapping • Newer approach: use tessellation to generate extra polygons to create the shape implied by the normal / height map • Many other techniques too, e.g.: • Different lighting models (e.g. for skin, metal, etc.) • Image processing and high dynamic range (HDR) lighting • Post-processing for blur, feedback, scene distortion, etc. • And many more. Consider for a 3rd year project…