KIPA Game Engine Seminars

Presentation Transcript


  1. KIPA Game Engine Seminars Day 4 Jonathan Blow Seoul, Korea November 29, 2002

  2. High-Level Networking (continued) • Review of yesterday…

  3. Deciding What To Transmit • Limited bandwidth into which to fit all entity updates • Apportion this out into slices somehow? • Can we do this without adding latency? A hard problem! • Need to accept that the client won’t be perfectly updated about everything; it will always be a little bit wrong • Approach network communications as an error minimization problem

  4. Deciding What To Transmit • Need a metric for the amount of error between object on client and server • Position, orientation, state variables • Probably attenuated by distance to deal with viewpoint issues! • What about a rocket with contrail attached? • Record the world state that you sent to the client, and diff it against current state • A lot of memory! • A lot of CPU!
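
A minimal sketch of the kind of per-object error metric this slide describes; the weights, the simple linear distance attenuation, and all names (Vec3, EntityState, update_error) are illustrative assumptions rather than code from the seminar:

```cpp
// A minimal sketch of a client/server error metric, not code from the seminar.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float len(Vec3 v)         { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

struct EntityState {
    Vec3  position;
    Vec3  forward;   // stand-in for a full orientation
    float health;    // example of a gameplay state variable
};

// Error between the state last sent to a client and the current server state,
// attenuated by distance so that nearby objects get priority.
float update_error(const EntityState &sent, const EntityState &now, Vec3 viewpoint)
{
    const float POS_W = 1.0f, ORI_W = 0.5f, STATE_W = 0.1f;   // tuning weights (assumed)

    float err = POS_W   * len(sub(now.position, sent.position))
              + ORI_W   * len(sub(now.forward,  sent.forward))
              + STATE_W * std::fabs(now.health - sent.health);

    float dist = len(sub(now.position, viewpoint));
    return err / (1.0f + dist);   // viewpoint attenuation
}
```

The server would evaluate something like this for every (client, entity) pair against the snapshot it last sent, then spend each bandwidth slice on the entities with the largest error, which is exactly the memory and CPU cost the slide warns about.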

  5. There is a lot of coherence between clients’ error functions • We ought to be able to exploit that • Example of objects that move a lot • They will have high error in all world views • Similarly for objects that move slowly • How do we detect “a lot of motion”? • Should not use distance traveled per frame • example on whiteboard

  6. Detecting “a lot of motion” • Idea: Neighborhood bounding box • Too quantized; how do we decide when to move the center? • Idea: Bounding sphere with moving center • How do we compute this without holding big arrays of data? • Also, too anisotropic • (we are picky because we pay for bandwidth!)

  7. Why we want anisotropy • For distant objects, we care most about motion parallel to the view plane • Motion orthogonal to that plane only produces small changes in perspective size • (graph of 1/z on whiteboard)
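
Filling in the whiteboard graph with one line of algebra (my reconstruction of the argument): under perspective projection a point at lateral offset $x$ and depth $z$ projects to screen position $x/z$, so

$$
x_{\mathrm{screen}} = \frac{x}{z},
\qquad
\frac{\partial x_{\mathrm{screen}}}{\partial x} = \frac{1}{z},
\qquad
\frac{\partial x_{\mathrm{screen}}}{\partial z} = -\frac{x}{z^{2}}.
$$

For a distant object (large $z$), motion in depth perturbs the image by a factor of $1/z^{2}$, while motion parallel to the view plane perturbs it by $1/z$; this is why the error measure should be anisotropic.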

  8. Variance of a vector • Also: “Covariance” of the vector components • “Variance/Covariance Matrix” of components • (demo) • Can be filtered, like scalars, to approximate where something has been over different periods of time
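
A sketch of the filtering idea, assuming simple exponential (recursive) filters; Vec3, Mat3, MotionTracker, and the filter constant k are illustrative names, not the demo app's actual code:

```cpp
// Exponentially filter the mean and raw second moment of a moving position,
// so that a covariance of "where the object has been recently" can be recovered.
struct Vec3 { float x, y, z; };
struct Mat3 { float m[3][3]; };

static Mat3 outer(Vec3 a, Vec3 b) {
    float av[3] = { a.x, a.y, a.z }, bv[3] = { b.x, b.y, b.z };
    Mat3 r;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            r.m[i][j] = av[i] * bv[j];
    return r;
}

struct MotionTracker {
    Vec3 mean   = {};   // filtered E[x]
    Mat3 second = {};   // filtered E[x x^T]

    // k in (0, 1] is the per-update filter weight; larger k = shorter memory.
    void update(Vec3 p, float k) {
        mean.x += k * (p.x - mean.x);
        mean.y += k * (p.y - mean.y);
        mean.z += k * (p.z - mean.z);

        Mat3 pp = outer(p, p);
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                second.m[i][j] += k * (pp.m[i][j] - second.m[i][j]);
    }

    // Covariance = E[x x^T] - E[x] E[x]^T  (the recentering of slide 10).
    Mat3 covariance() const {
        Mat3 mm = outer(mean, mean), c;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                c.m[i][j] = second.m[i][j] - mm.m[i][j];
        return c;
    }
};
```

Running several trackers with different values of k gives an approximation of where the object has been over different periods of time, as the slide mentions.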

  9. Summary of Variance Methods • Characterized by ellipsoid • Find ellipsoid by eigenvalues/eigenvectors of outer product matrix • These variances can be treated intuitively like mass (tensor of inertia, in physics)

  10. Derivation of Variance Recentering • Allows us to filter the variance of the vector, and transform that relative to a filtered position, in order to visualize • (derivation on whiteboard)
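
A compact version of the recentering step (my reconstruction of the whiteboard derivation, writing $\bar{x}$ for the filtered mean):

$$
\operatorname{Var}(x)
= E\!\left[(x-\bar{x})(x-\bar{x})^{T}\right]
= E\!\left[x x^{T}\right] - \bar{x}\,E[x]^{T} - E[x]\,\bar{x}^{T} + \bar{x}\,\bar{x}^{T}
= E\!\left[x x^{T}\right] - \bar{x}\,\bar{x}^{T}.
$$

So it suffices to filter the raw second moment $E[x x^{T}]$ and the position $\bar{x}$ separately, and subtract the outer product $\bar{x}\,\bar{x}^{T}$ whenever the variance about the filtered position is needed; this is the covariance() step in the sketch above.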

  11. Code Inspection • Covariance2, openGL demo app • Covariance3 • discuss finding eigenvectors of 2x2 versus 3x3 matrix
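
For reference, a closed-form eigen-decomposition of a symmetric 2x2 matrix, the case relevant to Covariance2; this is my own sketch, not the demo code, and the 3x3 case (Covariance3) needs a cubic root or an iterative method and is not shown:

```cpp
// Eigenvalues and eigenvectors of the symmetric matrix [ a  b ]
//                                                      [ b  c ]
#include <cmath>

struct Eigen2 {
    float value[2];       // eigenvalues, larger first
    float axis[2][2];     // corresponding unit eigenvectors
};

Eigen2 eigen_symmetric_2x2(float a, float b, float c)
{
    Eigen2 result;

    float mean = 0.5f * (a + c);
    float diff = 0.5f * (a - c);
    float root = std::sqrt(diff * diff + b * b);

    result.value[0] = mean + root;
    result.value[1] = mean - root;

    for (int i = 0; i < 2; i++) {
        // (b, lambda - a) lies in the null space of (M - lambda I) when b != 0.
        float vx = b, vy = result.value[i] - a;
        if (b == 0.0f) {                 // already diagonal: the axes are eigenvectors
            bool first_is_a = (a >= c);
            vx = ((i == 0) == first_is_a) ? 1.0f : 0.0f;
            vy = 1.0f - vx;
        }
        float length = std::sqrt(vx * vx + vy * vy);
        result.axis[i][0] = vx / length;
        result.axis[i][1] = vy / length;
    }
    return result;
}
```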

  12. Do we need the eigenvectors for networking? • Perhaps not! • First, discussion of how we would use the eigenvectors • But instead of back-projecting, can we forward-query? • A simple query is very cheap • (example on whiteboard)
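
One possible form of the cheap forward query (my interpretation of the whiteboard example, not necessarily the query used in the seminar): instead of extracting eigenvectors to find the dominant motion direction, ask the covariance matrix directly how much the object has been moving along a given unit direction d, via d^T C d:

```cpp
// Query the variance of motion along unit direction d, given covariance C.
// A handful of multiply-adds; no eigen-decomposition required.
float variance_along(const float C[3][3], const float d[3])
{
    float result = 0.0f;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            result += d[i] * C[i][j] * d[j];
    return result;
}

// For a distant client, the interesting quantity is motion parallel to the
// view plane: total variance minus variance along the view direction, i.e.
//     trace(C) - variance_along(C, view_dir)
```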

  13. For the global sort, it’s even easier • The product of the eigenvalues is the determinant of the matrix • Volume of ellipsoid! • The sum of eigenvalues is the trace of the matrix • Useful in approximating eccentricity of ellipsoid; ratio of volume to ideal volume of a sphere with radius of (1/3) tr M
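
A sketch of those sort quantities for a symmetric 3x3 matrix stored as a plain array; treating det M as the ellipsoid volume and det M / ((tr M / 3)^3) as the eccentricity measure is my reading of this slide, not verified against the original code:

```cpp
// Determinant = product of eigenvalues (ellipsoid volume, up to a constant).
float det3(const float M[3][3])
{
    return M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
         - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
         + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]);
}

// Trace = sum of eigenvalues.
float trace3(const float M[3][3])
{
    return M[0][0] + M[1][1] + M[2][2];
}

// Ratio of the ellipsoid's volume to that of the "ideal" sphere whose radius
// is the average eigenvalue, (1/3) tr M; near 1 means nearly isotropic motion.
float eccentricity_measure(const float M[3][3])
{
    float r = trace3(M) / 3.0f;
    return det3(M) / (r * r * r);
}
```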

  14. Shaders

  15. Shaders • Marketing hype has confused some people into thinking shaders are new… • They have been around for a long time in software rendering • In hardware, the fixed-function pipeline provided shader functionality.

  16. Interesting idea: Deferred Shading • Only write iterated vertex parameters into the frame buffer • Perform complicated shading operations in a post-pass • If multipass rendering, the vertex shader will run less often • But the technique is of limited use? http://www.delphi3d.net/articles/viewarticle.php?article=deferred.htm
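
A toy, software-only illustration of the data flow (not how you would set this up on 2002-era hardware): the geometry pass stores only interpolated parameters per pixel, and a post-pass runs the expensive shading once per visible pixel. GBufferPixel and shading_pass are hypothetical names.

```cpp
#include <vector>

struct GBufferPixel {          // what the geometry pass stores per pixel
    float nx, ny, nz;          // interpolated normal
    float u, v;                // interpolated texture coordinates
    float depth;
    bool  covered;             // was any geometry rasterized here?
};

// Pass 2: the expensive shading math runs once per covered pixel, no matter
// how many triangles overlapped that pixel during the geometry pass.
void shading_pass(const std::vector<GBufferPixel> &gbuffer,
                  std::vector<float> &out,
                  float lx, float ly, float lz)   // light direction
{
    for (size_t i = 0; i < gbuffer.size(); i++) {
        const GBufferPixel &p = gbuffer[i];
        if (!p.covered) { out[i] = 0.0f; continue; }
        float ndotl = p.nx * lx + p.ny * ly + p.nz * lz;   // simple diffuse term
        out[i] = ndotl > 0.0f ? ndotl : 0.0f;
    }
}
```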

  17. Early Game Shader Language: Quake3 shaders.txt • Goal: to abstract away the number of texture stages in a graphics card’s pipeline • Earlier cards had 1, 2, or 3 stages • Also: enable level designers to create shaders by hand

  18. Normalization Cube Map • Promoted by Mark Kilgard of Nvidia • An interesting idea for early shaders, but outdated now? • With more shader instructions we can actually run a fast normalization function • Does not require texture memory or a texture input slot! • Cube maps are still useful for parameterizing arbitrary functions over the sphere
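
A sketch of how one face of a normalization cube map is filled: each texel stores the normalized direction toward that texel, packed into RGB. Face orientation conventions differ between APIs, so the +Z axis mapping here is illustrative; the other five faces are analogous.

```cpp
#include <cmath>
#include <vector>

// Fill the +Z face of a size x size normalization cube map (RGB, 8 bits/channel).
std::vector<unsigned char> build_positive_z_face(int size)
{
    std::vector<unsigned char> rgb(size * size * 3);
    for (int y = 0; y < size; y++) {
        for (int x = 0; x < size; x++) {
            // Map the texel center to [-1, 1] on the +Z face of the unit cube.
            float fx = 2.0f * (x + 0.5f) / size - 1.0f;
            float fy = 2.0f * (y + 0.5f) / size - 1.0f;
            float fz = 1.0f;

            float inv_len = 1.0f / std::sqrt(fx * fx + fy * fy + fz * fz);
            float nx = fx * inv_len, ny = fy * inv_len, nz = fz * inv_len;

            // Pack [-1, 1] into [0, 255] so a texture lookup returns a biased unit vector.
            unsigned char *texel = &rgb[(y * size + x) * 3];
            texel[0] = (unsigned char)((nx * 0.5f + 0.5f) * 255.0f);
            texel[1] = (unsigned char)((ny * 0.5f + 0.5f) * 255.0f);
            texel[2] = (unsigned char)((nz * 0.5f + 0.5f) * 255.0f);
        }
    }
    return rgb;
}
```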

  19. Xbox / Old PC’s texture pipeline

  20. GameCube’s texture pipeline

  21. Early DirectX vertex / pixel shaders • (Version 1.0, 1.1) • Did not do much you couldn’t already do in fixed function pipeline • But, an important step toward paradigm of programmability

  22. OpenGL vs. DirectX: Extensions vs. Control • OpenGL provides extensions • DirectX is about Microsoft creating a “standard”
