
Presentation Transcript


  1. Non-Photoreal Rendering (and other stuff) CSE 191A: Seminar on Video Game Programming Lecture 10: Non-Photoreal Rendering UCSD, Spring, 2003 Instructor: Steve Rotenberg

  2. Info • Rockstar positions • Next year • Homework

  3. Non-Photoreal Rendering

  4. NPR
  • Medium
    • Substrate simulation (paper, canvas…)
    • Medium simulation (pencil, pen, watercolor, oil paint…)
  • Rendering
    • Strokes
    • Edge handling
    • Shading & filling
  • Stylistic Simulation
    • Impressionism, cartoon, mosaic, technical illustration…
  • Vision & computer interpretation

  5. Medium Simulation
  • For offline NPR, one can do detailed simulations of the actual substrate & drawing medium
    • Paper/substrate surface & volumetric characteristics
    • Pencil/paper interaction & tone transfer
    • Dynamic processes (watercolor diffusion, paint mixing…)
  • Real-time NPR will most likely use higher-level abstractions, but it is still important to think about the physical processes involved in drawing and painting

  6. Strokes
  • Entire pictures can be described in terms of the individual brush strokes used to create them.
  • Brush strokes can carry lots of information, including:
    • Path
    • Tool orientation
    • Pressure
    • Ink/tone transfer rates
    • Other specific information
  • One doesn’t have to render the entire image with brush strokes, but they are a useful concept for a lot of NPR applications, especially edges.
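
  As a rough illustration of the kind of per-stroke data listed above, here is a minimal C++ sketch; the StrokePoint/Stroke names and fields are hypothetical, not from the course code.

    #include <vector>

    // Hypothetical per-sample data for one point along a stroke's path.
    struct StrokePoint {
        float pos[2];    // position along the path (image or surface space)
        float dir[2];    // tool orientation (tangent) at this sample
        float pressure;  // pen/brush pressure, 0..1
        float tone;      // ink/tone transfer rate at this sample
    };

    // A stroke is an ordered list of samples plus brush-level settings.
    struct Stroke {
        std::vector<StrokePoint> points;
        float width;     // base brush width
        int   brushId;   // which brush texture / medium model to use
    };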

  7. Edge Rendering • Boundary edge • Creases & hard edges • Material edge • Silhouette edge

  8. Surface Angle Edge Rendering
  • This technique uses spherical environment mapping to render darkened edges
  • Normals are transformed to view space (camera relative), and then the x and y components of the transformed normal map to a texture coordinate (see the sketch below):
    n' = n · W · C⁻¹
    t.x = 0.5 * (n'.x + 1.0)
    t.y = 0.5 * (n'.y + 1.0)
  • Issues
    • Can be difficult to tune
    • Requires smooth shaded normals, so it can be difficult to apply to faceted objects
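
  A minimal sketch of the texture-coordinate mapping above, assuming the normal has already been transformed into view space; the Vec3 type and function name are placeholders.

    // Assumes nView is the unit normal already in view space (n' = n · W · C⁻¹).
    struct Vec3 { float x, y, z; };

    void sphereMapTexCoord(const Vec3& nView, float& tx, float& ty)
    {
        // Remap the view-space normal's x and y from [-1, 1] to [0, 1].
        tx = 0.5f * (nView.x + 1.0f);
        ty = 0.5f * (nView.y + 1.0f);
        // (tx, ty) then indexes a sphere-map texture whose outer ring is dark,
        // so fragments whose normals face away from the camera darken toward the edge.
    }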

  9. Surface Angle Edge Rendering

  10. Line Edge Rendering
  • With this scheme, edges are explicitly rendered as lines.
  • Boundary edges, material edges, and creases are identified offline and always rendered
  • Soft edges (potential silhouettes) are identified offline and tested at runtime
    • Every soft edge contains pointers to the two triangles it connects
    • If one triangle is facing the viewer and the other is not, then the edge must be a silhouette and gets rendered (see the sketch below)
  • Z-buffer biasing (or projection biasing) is used to clean up z-fighting
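
  A minimal sketch of the runtime soft-edge test described above; the SoftEdge layout and function names are assumptions (the real data structure stores pointers to the two triangles).

    struct Vec3 { float x, y, z; };

    static float dot3(const Vec3& a, const Vec3& b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Stand-in for a soft edge: each adjacent triangle is reduced here to its
    // face normal, plus one point on the shared edge.
    struct SoftEdge {
        Vec3 normalA, normalB;  // face normals of the two adjacent triangles
        Vec3 pointOnEdge;       // any point on the shared edge
    };

    // The edge is a silhouette if exactly one of the two faces points toward the eye.
    bool isSilhouette(const SoftEdge& e, const Vec3& eye)
    {
        Vec3 toEye = { eye.x - e.pointOnEdge.x,
                       eye.y - e.pointOnEdge.y,
                       eye.z - e.pointOnEdge.z };
        bool frontA = dot3(e.normalA, toEye) > 0.0f;
        bool frontB = dot3(e.normalB, toEye) > 0.0f;
        return frontA != frontB;
    }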

  11. Geometric Edge Rendering
  • There are a variety of specific techniques, but they all use geometric manipulation of the model polygons to extend edges
  • Typically, front and back facing polygons are handled separately: the front facing polygons are usually rendered as normal, while the back facing polygons are extended or moved in some fashion (see the sketch below)
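
  One common variant, sketched under assumptions (Vec3 and edgeWidth are placeholders): push the vertices of the back-facing, edge-colored pass out along their normals, then draw the front faces normally on top.

    struct Vec3 { float x, y, z; };

    // Offset a vertex along its normal before the back-facing, edge-colored pass.
    // edgeWidth is in object-space units and controls the line thickness.
    Vec3 extrudeVertex(const Vec3& position, const Vec3& normal, float edgeWidth)
    {
        Vec3 p = position;
        p.x += normal.x * edgeWidth;
        p.y += normal.y * edgeWidth;
        p.z += normal.z * edgeWidth;
        return p;
    }
    // The normally-rendered front faces then cover the extruded back faces
    // everywhere except a thin band around the silhouette, which reads as an edge.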

  12. Geometric Edge Rendering

  13. Geometric Edge Rendering

  14. Image Based Edge Rendering
  • With this technique, the geometry is rendered into the framebuffer and z-buffer. In some variations, the x, y, z components of the normal are written into the framebuffer RGB.
  • After the image is rendered, the z-buffer and/or framebuffer is processed to identify discontinuities
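
  A minimal sketch of the post-process step, assuming the depth buffer has been read back into a float array; the function name and threshold are illustrative.

    #include <cmath>
    #include <vector>

    // Mark pixels where the depth changes abruptly between neighbors.
    // 'threshold' controls how large a jump counts as an edge.
    std::vector<unsigned char> depthEdges(const std::vector<float>& depth,
                                          int width, int height, float threshold)
    {
        std::vector<unsigned char> edge(width * height, 0);
        for (int y = 1; y < height - 1; ++y) {
            for (int x = 1; x < width - 1; ++x) {
                float d  = depth[y * width + x];
                float dx = std::fabs(depth[y * width + (x + 1)] - d);
                float dy = std::fabs(depth[(y + 1) * width + x] - d);
                if (dx > threshold || dy > threshold)
                    edge[y * width + x] = 255;  // edge pixel
            }
        }
        // The same pass can be run over a normal buffer (x,y,z stored in RGB)
        // to catch creases that produce no depth discontinuity.
        return edge;
    }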

  15. Image Based Edge Rendering

  16. Image Based Edge Rendering

  17. Stroke-Based Edges
  • First, edges are found using any suitable algorithm
  • Visible edges are processed and converted into 2D strokes
  • Strokes are rendered using any stylized algorithm desired
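
  A minimal sketch of the “convert into 2D strokes” step, assuming the visible edges have already been projected to screen space; the greedy O(n²) chaining here is only for illustration (a real implementation would use spatial hashing).

    #include <vector>

    struct Vec2    { float x, y; };
    struct Segment { Vec2 a, b; };  // one visible edge segment, already projected to 2D

    static bool nearlyEqual(const Vec2& p, const Vec2& q, float eps)
    {
        float dx = p.x - q.x, dy = p.y - q.y;
        return dx * dx + dy * dy < eps * eps;
    }

    // Chain segments that share endpoints into polyline strokes; each stroke can
    // then be rendered with whatever stylized brush is desired.
    std::vector<std::vector<Vec2>> buildStrokes(std::vector<Segment> segs, float eps)
    {
        std::vector<std::vector<Vec2>> strokes;
        while (!segs.empty()) {
            std::vector<Vec2> stroke = { segs.back().a, segs.back().b };
            segs.pop_back();
            bool grew = true;
            while (grew) {
                grew = false;
                for (size_t i = 0; i < segs.size(); ++i) {
                    if (nearlyEqual(segs[i].a, stroke.back(), eps))
                        stroke.push_back(segs[i].b);
                    else if (nearlyEqual(segs[i].b, stroke.back(), eps))
                        stroke.push_back(segs[i].a);
                    else
                        continue;
                    segs.erase(segs.begin() + i);
                    grew = true;
                    break;
                }
            }
            strokes.push_back(stroke);
        }
        return strokes;
    }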

  18. Stroke-Based Edges

  19. Cartoon Shading
  • Traditionally, cartoon characters are ‘shaded’ with discrete colors, rather than smooth gradations. A simple example is the two-tone light/shadow approach. Observe that the shading boundary will probably not fall on actual vertices and polygon edges.
  • A simple way to do cartoon shading is to use a 1-dimensional texture map that holds the desired ‘color ramp’. Instead of the lighting calculations outputting a color, they output a 1-D texture coordinate (see the sketch below).
  • Another option is the use of ‘normal maps’, but these are mostly useful in more static lighting conditions.
  • Both of these techniques allow for arbitrary color ramps with hard and/or soft boundaries
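
  A minimal sketch of the 1-D ramp idea, assuming a simple diffuse term; the names are placeholders, and the actual color ramp lives in the bound 1-D texture.

    #include <algorithm>

    struct Vec3 { float x, y, z; };

    static float dot3(const Vec3& a, const Vec3& b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Instead of outputting a color, the lighting step outputs a 1-D texture
    // coordinate; a two-texel ramp gives a hard light/shadow split, while a
    // wider ramp gives soft or multi-band shading.
    float toonRampCoord(const Vec3& normal, const Vec3& lightDir)
    {
        float nDotL = dot3(normal, lightDir);          // standard diffuse term
        return std::min(1.0f, std::max(0.0f, nDotL));  // clamp to the ramp's range
    }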

  20. Stylistic Rendering • Edge handling • Medium simulation • Image space patterns • Surface patterns

  21. Image Based Style Patterns

  22. Surface Based Style Patterns • “Real-Time Hatching”, Praun et al., SIGGRAPH 2001

  23. Tonal Art Maps

  24. Style Patterns • Klein et al., SIGGRAPH 2000

  25. Graftals

  26. Mosaics & Patterns • Using off-screen rendering, an image can be first rendered as a normal bitmap, and then re-sampled by an irregular pattern of colored polygons
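
  A minimal sketch of the re-sampling step, assuming the scene has already been rendered off-screen into an RGB array; the Tile layout and helper are hypothetical, and the actual polygon fill is left to the engine’s existing rendering path.

    #include <algorithm>
    #include <vector>

    struct Vec2  { float x, y; };
    struct Color { unsigned char r, g, b; };

    // One mosaic tile: an irregular screen-space polygon plus its flat fill color.
    struct Tile {
        std::vector<Vec2> verts;
        Color color;
    };

    // Color each tile by the off-screen image pixel under its centroid.
    void sampleTiles(std::vector<Tile>& tiles, const std::vector<Color>& image,
                     int width, int height)
    {
        for (Tile& t : tiles) {
            Vec2 c = { 0.0f, 0.0f };
            for (const Vec2& v : t.verts) { c.x += v.x; c.y += v.y; }
            c.x /= float(t.verts.size());
            c.y /= float(t.verts.size());
            int px = std::min(width - 1, std::max(0, int(c.x)));
            int py = std::min(height - 1, std::max(0, int(c.y)));
            t.color = image[py * width + px];
        }
    }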

  27. Image Processing Effects
  • By using off-screen rendering to implement a two-pass scheme, one can do whatever type of image processing is desired to achieve whatever effect is necessary (subject to hardware limitations, of course)
  • Examples
    • Solarization
    • Thresholding
    • Edge detection
    • Blurring
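
  As a tiny example of one of the listed effects, here is a sketch of thresholding over a grayscale off-screen buffer; the other effects are just different per-pixel or neighborhood operations over the same buffer.

    #include <vector>

    // Everything brighter than 'cutoff' becomes white, everything else black.
    void thresholdImage(std::vector<unsigned char>& gray, unsigned char cutoff)
    {
        for (unsigned char& p : gray)
            p = (p > cutoff) ? 255 : 0;
    }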

  28. Non-Photoreal Animation? • Squash & stretch • Free-form deformations • Cartoon physics (& modal dynamics) • Nonlinear camera projections • Silly particles & effects

  29. Software Architecture

  30. C++
  • Advantages
    • Availability on a wide range of platforms (including brand new ones)
    • Standardization (code portability, people portability)
    • Performance
    • Compatibility with DirectX, OpenGL, and middleware
  • Disadvantages
    • Can lead to memory/performance bloat if not used carefully
    • Has been surpassed by ‘cleaner’ languages

  31. Object Oriented Programming
  • Use objects for things that should be objects
  • Use member functions for things that should be member functions
  • Use virtual functions only when there is a clear need for polymorphism
  • Prefer containment over inheritance when appropriate (see the sketch below)
  • Define clear software layers
  • Use coding conventions
  • Code mitosis & refactoring
  • Use a lot of automated testing
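
  A minimal sketch of the “containment over inheritance” guideline; the class names are illustrative, not from the course codebase.

    // Instead of deriving Player from some Renderable base just to reuse drawing
    // code, the player *contains* a Model and forwards to it.
    class Model {
    public:
        void draw() const { /* submit meshes to the renderer */ }
    };

    class Player {
    public:
        void draw() const { model.draw(); }  // delegate instead of inherit
    private:
        Model model;                         // containment: Player has-a Model
    };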

  32. Levelization • Minimize cyclic and complex dependencies • Define explicit levels and boundaries for class interaction • Collect groups of related classes into packages

  33. Game Library Levelization
  • Core libraries (data structures, file IO, math routines…)
  • Device libraries (rendering, audio, input, networking)
  • Development libraries (widgets, testing frameworks…)
  • Graphics (culling, effects, lighting)
  • Physics (collision detection, particles, rigid bodies…)
  • Character animation
  • Game components (weapons, HUD…)
  • AI
  • Game

  34. Software Engineering References
  • “Large-Scale C++ Software Design”, Lakos, 1996
  • “Design Patterns”, Gamma et al., 1995
  • “Extreme Programming Explained”, Beck, 2000
  • “Agile Software Development”, Cockburn, 2002

  35. Car Physics

  36. Core Components • Rigid body motion • Collision detection • Rigid body collision response • Wheel physics (suspension, friction) • Engine/drivetrain

  37. Car Update
  For each car {
      Update engine based on current throttle
      For each wheel {
          Compute suspension & friction forces based on the final configuration
          of the previous frame, but use the new engine torque & current control inputs
          Apply forces to rigid body
      }
      Apply other forces to car (gravity…)
      Move car to new candidate position
  }
  Detect & resolve rigid collisions
  For each car {
      For each wheel {
          Do collision probe to find new wheel orientation
      }
  }
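
  A hedged C++ sketch of the same update order; all types and member functions here are hypothetical stand-ins for the engine’s real classes.

    #include <vector>

    struct Wheel {
        void computeForces(float engineTorque, float steer) { /* suspension + tire friction */ }
        void probeGround() { /* collision probe to find the new wheel orientation */ }
    };

    struct Engine {
        float update(float throttle) { return 0.0f; /* return drive torque */ }
    };

    struct Car {
        Engine engine;
        std::vector<Wheel> wheels;
        float throttle = 0.0f, steer = 0.0f;
        void applyMiscForces() { /* gravity, aerodynamic drag, ... */ }
        void integrate(float dt) { /* move rigid body to new candidate position */ }
    };

    void updateCars(std::vector<Car>& cars, float dt)
    {
        for (Car& car : cars) {
            float torque = car.engine.update(car.throttle);
            for (Wheel& w : car.wheels)
                w.computeForces(torque, car.steer);  // based on last frame's configuration
            car.applyMiscForces();
            car.integrate(dt);                       // move to candidate position
        }
        // detectAndResolveCollisions(cars);         // rigid body collision pass (not shown)
        for (Car& car : cars)
            for (Wheel& w : car.wheels)
                w.probeGround();                     // re-seat wheels after collision response
    }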

  38. Conclusion
