

  1. CG Summary: Lighting and Shading; “From Vertices to Fragments”; Discrete Techniques. Angel, Chapters 5, 6, 7; “Red Book”; slides from AW, red book, etc. CSCI 6360/4360

  2. Overview (essential concepts) • The rest of the essential concepts … for your projects • Shading and illumination • Basic models and OpenGL use (30) • “Vertices to fragments” (42) • Drawing algorithms and hidden surface removal, e.g., z-buffer (55) • Color (80) • Discrete techniques (90) • The pixel pipeline (98) • Texture mapping (103) • Alpha blending and antialiasing • Depth cueing, fog, motion blur, stencil …

  3. Illumination and Shading Overview • Photorealism and complexity • Light interactions with a solid • Rendering equation • “infinite scattering and absorption of light” • Local vs. global illumination • Local techniques • Flat, Gouraud, Phong • An illumination model • “describes the inputs, assumptions, and outputs that we will use to calculate illumination of surface elements” • Light • Reflection characteristics of surfaces • Diffuse reflection, Lambert’s law, specular reflection • Phong approximation – interpolated vector shading

  4. Photorealism and Complexity • Recall, from 1st lecture … examples below exhibit range of “realism” • In general, trade off realism for speed – interactive computer graphics • Wireframe – just the outline • Local illumination models, polygon based • Flat shading – same illumination value for all of each polygon • Smooth shading (Gouraud and Phong) – different values across polygons • Global illumination models • E.g., ray tracing – consider “all” interactions of light with object • [Figures: wireframe, ray tracing, polygons with flat shading, polygons with smooth shading]

  5. Shading • So far, just used OpenGL pipeline for vertices • Polygons have all had constant color • glColor(_) • Not “realistic” – but not computationally complex, either • Of course, OpenGL can (efficiently) provide more realistic images • Light-material interactions cause each point to have a different color or shade • Need to consider: • Light sources • Material properties • Location of viewer • Surface orientation • Terminology • “Lighting” • modeling light sources, surfaces, and their interaction • “Shading” • how lighting is applied across a polygon

  6. Rendering Equation • [Figure: shadow, multiple reflection, translucent surface] • Light travels … • Light strikes A • Some scattered, some absorbed, … • Some of scattered light strikes B • Some scattered, some absorbed, … • Some of this scattered light strikes A • And so on … • Infinite scattering and absorption of light can be described by the rendering equation • Bidirectional reflection distribution function (BRDF) • Cannot be solved in general • Ray tracing is a special case for perfectly reflecting surfaces • Rendering equation is global, includes: • Shadows • Multiple scattering from object to object • … and everything
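For reference, the rendering equation named above is usually written as follows (it is not spelled out on the slide; f_r is the BRDF mentioned in the bullets):

    L_o(p, \omega_o) = L_e(p, \omega_o) + \int_{\Omega} f_r(p, \omega_i, \omega_o)\, L_i(p, \omega_i)\, (n \cdot \omega_i)\, d\omega_i

Here L_o is the light leaving point p in direction ω_o, L_e is emitted light, and the integral gathers incoming light L_i over the hemisphere Ω above the surface. Because L_i arriving at one surface is L_o leaving another, the equation is recursive; that recursion is the “infinite scattering and absorption” and is why it cannot be solved in general.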

  7. Light – Material Interactions for CG (quick look) • Will examine each of these in detail • Diffuse surface • Matte, dull finish • Light “scattered” in all directions • Specular surface • Shiny surface • Light reflected (scattered) in narrow range of directions • Translucent surface • Light penetrates surface and emerges in another location on object

  8. “Surface Elements” for Interactive CG (a big idea) • A computer graphics issue/orientation: • Consider everything, or just “sample a scene”? • Again, global view considers all light coming to viewer: • From each point on each surface in scene – object precision • Points are smallest units of scene • Can think of points having no area or infinitesimal area • i.e., there are an infinite number of visible points • Of course, computationally intractable • Alternatively, consider surface elements • Finite number of differential pieces of surface • E.g., polygon • Figure out how much light comes to viewer from each of these pieces of surface • Often, relatively few (vs. infinitely many) are enough • Reduction of computation through use of surface elements is at the core of tractable/interactive cg

  9. Surface Elements and Illumination, 1 • Tangent Plane Approximation for Objects • Most surfaces are curved: not flat • Surface element is area on that surface • Imagine breaking up into very small pieces • Each of those pieces is still curved, • but if we make the pieces small enough, • then we can make them arbitrarily close to being flat • Can approximate this small area with a tiny flat area • Surface Normals • Each surface element lies in a plane. • To describe plane, need a point and a normal • Area around each of these vertices is a surface element where we calculate “illumination” • Illumination
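As a minimal sketch of the surface-normal idea (not code from the course; the struct and function names are illustrative), the normal of a flat surface element spanned by three vertices can be computed with the cross product of two edge vectors:

    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Unit normal of the flat element (triangle) a-b-c, via the cross
       product of two edge vectors. */
    Vec3 surface_normal(Vec3 a, Vec3 b, Vec3 c)
    {
        Vec3 u = { b.x - a.x, b.y - a.y, b.z - a.z };   /* edge a -> b */
        Vec3 v = { c.x - a.x, c.y - a.y, c.z - a.z };   /* edge a -> c */
        Vec3 n = { u.y * v.z - u.z * v.y,               /* u x v       */
                   u.z * v.x - u.x * v.z,
                   u.x * v.y - u.y * v.x };
        float len = sqrtf(n.x * n.x + n.y * n.y + n.z * n.z);
        if (len > 0.0f) { n.x /= len; n.y /= len; n.z /= len; }
        return n;
    }

Averaging these per-polygon normals at shared vertices gives the approximate vertex normals that Gouraud shading uses later in the deck.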

  10. Surface Elements and Illumination, 2 • Tangent Plane Approximation for Objects • Surface Normals • Illumination • Again, light rays coming from rest of scene strike surface element and head out in different directions • Light that goes in direction of viewer from that surface element • If viewer moves, light will change • This is “illumination” of that surface element • Will see model for cg later

  11. Light-Material Interaction, 1 • Light that strikes an object is partially absorbed and partially scattered (reflected) • Amount reflected determines color and brightness of object • Surface appears red under white light because the red component of light is reflected and the rest is absorbed • Can specify both light and surface colors • [Figure: rough surface – Livingstone, “Vision and Art”]

  12. Light-Material Interaction, 2 • Reflected light is scattered in a manner that depends on the smoothness and orientation of surface to light source • Diffuse surfaces • Rough (flat, matte) surface scatters light in all directions • Appear same from different viewing angles • Specular surfaces • The smoother the surface, the more reflected light is concentrated in the direction a perfect mirror would reflect the light • Light emerges at a single angle • … to varying degrees – Phong shading will model this

  13. Light Sources • General light sources are difficult to work with because must integrate light coming from all points on the source • Use “simple” light sources • Point source • Model with position and color • Distant source = infinite distance away (parallel rays) • Spotlight • Restrict light from ideal point source • Ambient light • A real convenience – recall the intractable “rendering equation” – but very useful • Same amount of light everywhere in scene • Can model (in a good-enough way) contribution of many sources and reflecting surfaces

  14. Overview: Local Rendering Techniques • Will consider • Illumination (light) models focusing on following elements: • Ambient • Diffuse • Attenuation • Specular Reflection • Interpolated shading models: • Flat, Gouraud, Phong, modified/interpolated Phong (Blinn-Phong)

  15. About (Local) Polygon Mesh Shading (it’s all in how many polygons there are) • Angel example of approximation of a sphere… • Recall, any surface can be illuminated/shaded/lighted (in principle) by: 1. calculating the surface normal at each visible point and 2. applying an illumination model … or, recall the surface-element model of cg! • Again, where efficiency is a consideration, e.g., for interactivity (vs. photorealism), approximations are used • Fine, because polygons are themselves an approximation • And just as a circle can be considered as being made of “an infinite number of line segments”, • so, it’s all in how many polygons there are!

  16. About (Local) Polygon Mesh Shading (interpolation) • Interpolation of illumination values is widely used for speed • And can be applied using any illumination model • Will see three methods – each treats a single polygon independently of others (non-global) • Constant (flat) • Gouraud (intensity interpolation) • Interpolated Phong (normal-vector interpolation) • Each uses interpolation differently

  17. Flat/Constant Shading, About • Single illumination value per polygon • Illumination model evaluated just once for each polygon • 1 value for the whole polygon, which is as fast as it gets! • Amounts to “sampling” the value of the illumination equation (at just 1 point) • At right: flat vs. smooth (Gouraud) shading • If polygon mesh is an approximation to a curved surface, • faceted look is a problem • Also, facets exaggerated by Mach band effect • For speed, can (and do) store normal with each surface • Or can, of course, compute from vertices • But, interestingly, approach is valid, if: • Light source is at infinity (is constant on polygon) • Viewer is at infinity (is constant on polygon) • Polygon represents actual surface being modeled (is not an approximation)!

  18. Gouraud Shading, About • Recall, for flat/constant shading, single illumination value per polygon • Gouraud (or smooth, or interpolated intensity) shading overcomes problem of discontinuity at edge exacerbated by Mach banding • “Smooths” where polygons meet • H. Gouraud, "Continuous shading of curved surfaces," IEEE Transactions on Computers, 20(6):623–628, 1971. • Linearly interpolate intensity along scan lines • Eliminates intensity discontinuities at polygon edges • Still have gradient discontinuities, • so Mach banding is improved, but not eliminated • Must differentiate desired creases from tessellation artifacts • (edges of a cube vs. edges on a tessellated sphere)

  19. Gouraud Shading, About • To find illumination intensity, need intensity of illumination and angle of reflection • Flat shading uses 1 angle • Gouraud estimates … interpolates: 1. Use polygon surface normals to calculate an “approximation” to vertex normals • Average of surrounding polygons’ normals • Since neighboring polygons sharing vertices and edges are approximations to smoothly curved surfaces, • they won’t have greatly differing surface normals • Approximation is a reasonable one 2. Interpolate intensity along polygon edges 3. Interpolate along scan lines • i.e., find: • Ia, as interpolated value between I1 and I2 • Ib, as interpolated value between I1 and I3 • Ip, as interpolated value between Ia and Ib • formulaically, below
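The interpolation can be written out as follows (a standard form, in the notation above: vertices 1, 2, 3 with intensities I_1, I_2, I_3; a scan line at height y_s crossing the edges at a and b; a pixel at x_p):

    I_a = I_1 + t_a (I_2 - I_1), \quad t_a = \frac{y_1 - y_s}{y_1 - y_2}
    I_b = I_1 + t_b (I_3 - I_1), \quad t_b = \frac{y_1 - y_s}{y_1 - y_3}
    I_p = I_a + t_p (I_b - I_a), \quad t_p = \frac{x_p - x_a}{x_b - x_a}

Each step is plain linear interpolation, which is why incremental (add-a-delta-per-pixel) implementations are cheap.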

  20. Simple Illumination Model • One of first models of illumination that “looked good” and could be calculated efficiently • simple, non-physical, non-global illumination model • describes some observable reflection characteristics of surfaces • came out of work done at the University of Utah in the early 1970’s • still used today, as it is easy to do in software and can be optimized in hardware • Later, will put all together with normal interpolation • Components of a simple model • Reflection characteristics of surfaces • Diffuse Reflection • Ambient Reflection • Specular Reflection • Model not physically-based, and does not attempt to accurately calculate global illumination • does attempt to simulate some of important observable effects of common light interactions • can be computed quickly and efficiently

  21. Reflection Characteristics of Surfaces, Diffuse Reflection (1/7) • Diffuse Reflection • Diffuse (Lambertian) reflection • typical of dull, matte surfaces • e.g., carpet, chalk, plastic • independent of viewer position • dependent on light source position • (in this case a point source, again a non-physical abstraction) • Vectors L, N used to determine reflection • Value from Lambert’s cosine law … next slide

  22. Reflection Characteristics of Surfaces, Lambert’s Law (2/7) • Lambert’s cosine law: • Specifies how much energy/light reflects toward some point • Computational form used in equation for illumination model • Now, have intensity (I) calculated from: • Intensity from point source • Diffuse reflection coefficient (arbitrary!) • With cos θ calculated using normalized vectors N and L • For computational efficiency • Again, formulaically (below):
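Written out (a standard form of the diffuse term, with I_p the point-source intensity and k_d the diffuse reflection coefficient in [0, 1]):

    I_{diffuse} = I_p \, k_d \, \cos\theta = I_p \, k_d \, (N \cdot L)

Using the dot product of the normalized N and L in place of an explicit cosine evaluation is the computational efficiency referred to above.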

  23. Reflection Characteristics of Surfaces, Energy Density Falloff (3/7) • Less light as things are farther away from light source • Reflection – Energy Density Falloff • Should also model inverse-square-law energy density falloff • That formula often creates harsh effects • However, falloff is what makes surfaces with equal reflectance but different distances differ in appearance – important if two surfaces overlap • Do not often see objects illuminated by point lights • Can instead use the heuristic formula below • Experimentally-defined constants • Heuristic
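The heuristic usually substituted for strict inverse-square falloff (the formula shown on the original slide image) is:

    f_{att} = \min\left( \frac{1}{c_1 + c_2 d_L + c_3 d_L^2},\ 1 \right)

where d_L is the distance from the light to the surface element and c_1, c_2, c_3 are the experimentally defined constants mentioned above; f_att multiplies the point-light (diffuse and specular) terms.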

  24. Reflection Characteristics of Surfaces, Ambient Reflection (4/7) • Ambient Reflection • Diffuse surfaces reflect light • Some light goes to eye, some to scene • Light bounces off of other objects and eventually reaches this surface element • This is expensive to keep track of accurately • Instead, we use another heuristic • Ambient reflection • Independent of object and viewer position • Again, a constant – “experimentally determined” • Exists in most environments • some light hits surface from all directions • Approximates indirect lighting/global illumination • A total convenience • but images without some form of ambient lighting look stark; they have too much contrast • Light Intensity = Ambient + Attenuation*Diffuse

  25. Reflection Characteristics of Surfaces, Color (5/7) • Colored Lights and Surfaces • Write separate equation for each component of color model • Lambda (λ) – wavelength • Represent an object’s diffuse color by one value of Od for each component • e.g., OdR, OdG, OdB in RGB • Components of the incident light are reflected in proportion to these values • e.g., for the red component, IR = IpR kd OdR (N · L) • Wavelength-dependent equation • Evaluating the illumination equation at only 3 points in the spectrum is wrong, but often yields acceptable pictures. • To avoid restricting ourselves to one color sampling space, indicate wavelength dependence with λ (lambda).

  26. Reflection Characteristics of Surfaces, Specular Reflection (6/7) • Specular Reflection • Directed reflection from shiny surfaces • typical of bright, shiny surfaces, e.g., metals • color depends on material and how it scatters light energy • in plastics: color of point source; in metal: color of metal • in others: combine color of light and material color • dependent on light source position and viewer position • Early model by Phong neglected effect of material color on specular highlight • made all surfaces look plastic • for a perfect reflector, viewer sees the reflected light only when the view direction V coincides with the mirror-reflection direction R • for a real reflector, reflected light falls off as the angle φ between V and R increases • Below, “shiny spot” size decreases as the viewing angle moves away from the mirror direction

  27. Reflection Characteristics of Surfaces, Specular Reflection (7a/7) • Phong Approximation • Again, non-physical, but works • Deals with differential “glossiness” in a computationally efficient manner • Below shows increasing n, left to right • “Tightness” of specular highlight • n in formula below (k, etc., next slide)

  28. Reflection Characteristics of Surfaces, Specular Reflection (7b/7) • Yet again, a constant, ks, for the specular component • Vectors R and V express viewing angle and so amount of illumination • n is the exponent to which the (cosine of the) viewing angle is raised • Measure of how “tight”/small the specular highlight is
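Putting these pieces together in the usual Phong specular term (a standard form, with k_s the specular coefficient and φ the angle between R and V):

    I_{specular} = I_p \, k_s \, (R \cdot V)^n = I_p \, k_s \, \cos^n \phi

Larger n tightens the highlight, matching the left-to-right progression shown on the previous slide.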

  29. Putting it all together:A Simple Illumination Model • Non-Physical Lighting Equation • Energy from a single light reflected by a single surface element • For multiple point lights, simply sum contributions • An easy-to-evaluate equation that gives useful results • It is used in most graphics systems, • but it has no basis in theory and does not model reflections correctly!
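For reference, the complete single-light equation the slide summarizes, in the common Foley et al. notation (O_{dλ} is the object’s diffuse color in wavelength band λ; for several point lights, the bracketed term is summed over lights):

    I_\lambda = I_{a\lambda}\, k_a\, O_{d\lambda} + f_{att}\, I_{p\lambda} \left[ k_d\, O_{d\lambda}\, (N \cdot L) + k_s\, (R \cdot V)^n \right]

This is the “easy-to-evaluate equation” referred to above: ambient, attenuated diffuse, and attenuated specular, with no claim of physical correctness.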

  30. OpenGL Shading Functions

  31. OpenGL Shading Functions • Polygonal shading • Flat • Smooth • Gouraud • Steps in OpenGL shading • Enable shading and select model • Specify normals • Specify material properties • Specify lights

  32. Enabling Shading • Shading calculations are enabled by • glEnable(GL_LIGHTING) • Once lighting is enabled, glColor() is ignored • Must enable each light source individually • glEnable(GL_LIGHTi), i = 0, 1, … • Can choose light model parameters: • glLightModeli(parameter, GL_TRUE) • GL_LIGHT_MODEL_LOCAL_VIEWER • do not use simplifying distant-viewer assumption in calculation • GL_LIGHT_MODEL_TWO_SIDED • shades both sides of polygons independently
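A minimal sketch of the calls just listed (fixed-function OpenGL assumed):

    glEnable(GL_LIGHTING);                                /* turn on lighting; glColor is now ignored */
    glEnable(GL_LIGHT0);                                  /* enable light source 0                    */
    glLightModeli(GL_LIGHT_MODEL_LOCAL_VIEWER, GL_TRUE);  /* don't assume a distant viewer            */
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDED, GL_TRUE);     /* shade both sides of polygons             */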

  33. Defining a Point Light Source • For each light source, can set an RGBA (A for alpha channel) • For diffuse, specular, and ambient components, and • For the position • Code below from Angel, other ways to do it (of course) GLfloat diffuse0[]={1.0, 0.0, 0.0, 1.0}; GLfloat ambient0[]={1.0, 0.0, 0.0, 1.0}; GLfloat specular0[]={1.0, 0.0, 0.0, 1.0}; GLfloat light0_pos[]={1.0, 2.0, 3.0, 1.0}; glEnable(GL_LIGHTING); glEnable(GL_LIGHT0); glLightfv(GL_LIGHT0, GL_POSITION, light0_pos); glLightfv(GL_LIGHT0, GL_AMBIENT, ambient0); glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuse0); glLightfv(GL_LIGHT0, GL_SPECULAR, specular0);

  34. Distance and Direction • Position is given in homogeneous coordinates • If w = 1.0, we are specifying a finite location • If w = 0.0, we are specifying a parallel source with the given direction vector • Coefficients in the distance-attenuation terms are by default a = 1.0 (constant term), b = c = 0.0 (linear and quadratic terms) • Change by: • a = 0.80; • glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION, a);

  35. Spotlights • Use glLightfv/glLightf to set • Direction GL_SPOT_DIRECTION • Cutoff GL_SPOT_CUTOFF • Attenuation GL_SPOT_EXPONENT • Intensity proportional to cos^α φ, where φ is the angle from the spot direction, within the cutoff cone −θ to θ
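A small sketch of a spotlight set up with these parameters (the numeric values are illustrative assumptions, not from the slides):

    GLfloat spot_dir[] = {0.0f, -1.0f, 0.0f};            /* aim the spot straight down          */
    glLightfv(GL_LIGHT0, GL_SPOT_DIRECTION, spot_dir);   /* direction                           */
    glLightf(GL_LIGHT0, GL_SPOT_CUTOFF, 30.0f);          /* half-angle of the cone, in degrees  */
    glLightf(GL_LIGHT0, GL_SPOT_EXPONENT, 2.0f);         /* alpha in the cos^alpha falloff      */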

  36. Global Ambient Light • Ambient light depends on color of light sources • A red light in a white room will cause a red ambient term that disappears when the light is turned off • OpenGL also allows a global ambient term that is often helpful for testing • glLightModelfv(GL_LIGHT_MODEL_AMBIENT, global_ambient)

  37. Moving Light Sources • Light sources are geometric objects whose positions or directions are affected by the model-view matrix • Depending on where we place the position (direction) setting function, we can • Move the light source(s) with the object(s) • Fix the object(s) and move the light source(s) • Fix the light source(s) and move the object(s) • Move the light source(s) and object(s) independently
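A hedged sketch of two of these placement choices (hypothetical display-function fragments; eye, angle, and light0_pos are assumed to be defined elsewhere):

    /* Light fixed in the world: set its position right after the viewing
       transformation, before any modeling transformations. */
    glLoadIdentity();
    gluLookAt(eye[0], eye[1], eye[2], 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
    glLightfv(GL_LIGHT0, GL_POSITION, light0_pos);

    /* Light moving with an object: set its position under the same modeling
       transformation that moves the object. */
    glPushMatrix();
    glRotatef(angle, 0.0f, 1.0f, 0.0f);
    glLightfv(GL_LIGHT0, GL_POSITION, light0_pos);
    /* ... draw the object here ... */
    glPopMatrix();

The position is transformed by whatever model-view matrix is current when glLightfv(GL_POSITION) is called, which is why its placement in the code determines the behavior.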

  38. Material Properties • Material properties are also part of the OpenGL state and match the terms in the modified Phong model • Set by glMaterialfv() GLfloat ambient[] = {0.2, 0.2, 0.2, 1.0}; GLfloat diffuse[] = {1.0, 0.8, 0.0, 1.0}; GLfloat specular[] = {1.0, 1.0, 1.0, 1.0}; GLfloat shine = 100.0; glMaterialfv(GL_FRONT, GL_AMBIENT, ambient); glMaterialfv(GL_FRONT, GL_DIFFUSE, diffuse); glMaterialfv(GL_FRONT, GL_SPECULAR, specular); glMaterialf(GL_FRONT, GL_SHININESS, shine);

  39. Emissive Term • We can simulate a light source in OpenGL by giving a material an emissive component • This component is unaffected by any sources or transformations • GLfloat emission[] = {0.0, 0.3, 0.3, 1.0}; • glMaterialfv(GL_FRONT, GL_EMISSION, emission);

  40. Transparency • Material properties are specified as RGBA values • The A value can be used to make the surface translucent • The default is that all surfaces are opaque regardless of A • Later we will enable blending and use this feature

  41. Polygonal Shading • Shading calculations are done for each vertex • Vertex colors become vertex shades • By default, vertex shades are interpolated across the polygon • glShadeModel(GL_SMOOTH); • If we use glShadeModel(GL_FLAT); the color at the first vertex will determine the shade of the whole polygon

  42. “From Vertices to Fragments”

  43. Implementation • “From Vertices to Fragments” • Next steps in viewing pipeline: • Clipping • Eliminating objects that lie outside view volume • and, so, are not visible in image • Rasterization • Produces fragments from remaining objects • Hidden surface removal (visible surface determination) • Determines which object fragments are visible • Show objects (surfaces) not blocked by objects closer to camera • Will consider above in some detail in order to give a feel for the computational cost of these elements • “in some detail” = the algorithms for implementing them • … algorithms that are efficient • Same algorithms for any standard API • Whether implemented by pipeline, raytracing, etc. • Will see different algorithms for same basic tasks

  44. Tasks to Render a Geometric Entity, 1 • Angel introduces terms and ideas more general than just the OpenGL pipeline… • Recall, chapter title “From Vertices to Fragments” … and even pixels • From definition in user program to (possible) display on output device • Modeling, geometry processing, rasterization, fragment processing • Modeling • Performed by application program, e.g., create sphere polygons (vertices) • Angel example of spheres and creating data structure for OpenGL use • Product is vertices (and their connections) • Application might even reduce “load”, e.g., no back-facing polygons

  45. Tasks to Render a Geometric Entity, 2 • Geometry Processing • Works with vertices • Determine which geometric objects appear on display • 1. Perform clipping to view volume • Changes object coordinates to eye coordinates • Transforms vertices to normalized view volume using projection transformation • 2. Primitive assembly • Clipping an object (and its surfaces) can result in new surfaces (e.g., shorter line, polygon of different shape) • Working with these “new” elements to “re-form” (clipped) objects is primitive assembly • Necessary for, e.g., shading • 3. Assignment of color to vertex • Modeling and geometry processing called “front-end processing” • All involve 3-d calculations and require floating-point arithmetic

  46. Tasks to Render a Geometric Entity, 3 • Rasterization • Only x, y values needed for (2-d) frame buffer • Rasterization, scan conversion, determines which fragments are displayed (put in frame buffer) • For polygons, rasterization determines which pixels lie inside the 2-d polygon determined by projected vertices • Colors • Most simply, fragments (and their pixels) are determined by interpolation of vertex shades and put in frame buffer • Output of rasterizer is in units of the display (window coordinates)

  47. Tasks to Render a Geometric Entity, 4 • Fragment Processing • Colors • OpenGL can merge color (and lighting) results of rasterization stage with geometric pipeline • E.g., shaded, texture-mapped polygon (next chapter) • Lighting/shading values of vertex merged with texture map • For translucence, must allow light to pass through fragment • Blending of colors uses combination of fragment colors, using colors already in frame buffer • e.g., multiple translucent objects • Hidden surface removal performed fragment by fragment using depth information (sketch below) • Anti-aliasing is also dealt with here
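A minimal sketch of that per-fragment depth test (the z-buffer idea from the overview; the names and types here are illustrative, not from the slides):

    typedef struct { float r, g, b; } Color;

    /* Write a fragment only if it is closer than what the depth buffer holds.
       depth_buf is assumed initialized to the far value, color_buf to the
       background color; both are width*height arrays. */
    void write_fragment(int x, int y, float z, Color c,
                        float *depth_buf, Color *color_buf, int width)
    {
        int i = y * width + x;
        if (z < depth_buf[i]) {   /* closer than the stored fragment */
            depth_buf[i] = z;     /* keep its depth                  */
            color_buf[i] = c;     /* and its shade                   */
        }
    }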

  48. Efficiency and Algorithms • For cg illumination/shading, saw how efficiency drove the algorithms • Phong shading is “good enough” to be perceived as “close enough” to the real world • Close attention to algorithmic efficiency • Similarly, for frequently executed geometric processing, efficiency is a prime consideration • Will consider efficient algorithms for: • Clipping • Line drawing (sketch below) • Visible surface drawing
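For the line-drawing entry above, a minimal integer-only sketch in the Bresenham/midpoint style, restricted to the first octant (0 <= slope <= 1); set_pixel() is a placeholder for the frame-buffer write:

    void set_pixel(int x, int y);    /* provided by the frame-buffer layer (placeholder) */

    /* Bresenham line scan conversion, first octant only. */
    void draw_line(int x0, int y0, int x1, int y1)
    {
        int dx = x1 - x0, dy = y1 - y0;
        int d = 2 * dy - dx;             /* decision variable           */
        int y = y0;
        for (int x = x0; x <= x1; x++) {
            set_pixel(x, y);
            if (d > 0) {                 /* midpoint is below the line  */
                y++;
                d -= 2 * dx;
            }
            d += 2 * dy;
        }
    }

The point of the algorithm is exactly the efficiency theme of the slide: only integer additions and comparisons per pixel, no floating point.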

  49. Recall, Clipping … • Scene’s objects are clipped against the clip-space bounding box • Eliminates objects (and pieces of objects) not visible in image • Efficient clipping algorithms exist for homogeneous clip space • Perspective division divides all coordinates by the homogeneous coordinate, w • Clip space becomes Normalized Device Coordinate (NDC) space after perspective division
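A one-function sketch of the perspective division step described above (clip-space coordinates to normalized device coordinates; the types are illustrative):

    typedef struct { float x, y, z, w; } Vec4;
    typedef struct { float x, y, z; } Vec3;

    /* After clipping, divide by w to move from clip space to NDC space. */
    Vec3 clip_to_ndc(Vec4 p)
    {
        Vec3 ndc = { p.x / p.w, p.y / p.w, p.z / p.w };
        return ndc;
    }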

  50. Rasterization
