Introduction to WebGL

Presentation Transcript


  1. Introduction to WebGL Mitch Williams 3D-Online

  2. Tutorial Overview
  • Introduction to WebGL
  • JavaScript
  • GLSL / Shader Languages
  • Interactive WebGL demos to illustrate programming concepts
  • Code is provided as a reference, with the expectation that you'll download the slides and examples.

  3. Web 3D Evolution
  • Web 3D introduced with the VRML file format
    • Rendered by a web browser plug-in or a Java-based 3D engine
  • X3D
    • VRML revision using an XML file format, with validation
  • WebGL, X3Dom
    • Rendering on the graphics card using OpenGL ES
    • Solved the performance and quality issues; uses HTML5, JavaScript, and GLSL (the same as Android and iPhone)

  4. Agenda
  • 3D Graphics Pipeline
    • 3D mesh, camera transforms, perspective view
  • Looking 3D
    • Lighting, texture maps, materials and normals
  • Building 3D Environments
    • Portals, Reflection, Interactivity (Picking), Fog

  5. 3D Graphics Pipeline
  • Object Transformation
  • Transform to Camera View
  • Back-face Culling
  • Perspective View, flatten onto a 2D Plane

  6. 2D Triangle WebGL Hello World
      <body onload="webGLStart();">
          <canvas id="myCanvas" style="border: none;" width="640" height="640">
          </canvas>
      </body>

  7. 2D Triangle (cont.)
      function webGLStart() {
          var canvas = document.getElementById("myCanvas");
          initGL(canvas);      // set WebGL width and height
          initShaders();       // discussed later
          initBuffers();
          gl.clearColor(0.0, 0.0, 0.0, 1.0);
          drawScene();
      }
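  The slide calls initGL(canvas) without showing it. A minimal sketch, assuming the common convention of storing the drawing size on the context as gl.viewportWidth / gl.viewportHeight (the properties drawScene() reads):

      var gl;    // WebGL context shared by the other functions

      function initGL(canvas) {
          gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
          if (!gl) {
              alert("Could not initialize WebGL.");
              return;
          }
          // Store the canvas size on the context so drawScene() can set the viewport.
          gl.viewportWidth = canvas.width;
          gl.viewportHeight = canvas.height;
      }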

  8. var vertexBuffer;

      function initBuffers() {
          vertexBuffer = gl.createBuffer();
          gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
          var vertices = [ -0.75,  0.5,
                            0.4,  -0.4,
                           -0.5,  -0.6 ];
          gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
          vertexBuffer.itemSize = 2;    // 2 floats (x, y) per vertex
          vertexBuffer.numItems = 3;    // 3 vertices
      }

      function drawScene() {
          gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
          gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
          gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
          gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, vertexBuffer.itemSize, gl.FLOAT, false, 0, 0);
          gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.numItems);
      }
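  initShaders() is "discussed later" on the slides. A minimal sketch of the usual pattern, assuming the shader <script> tags of slide 11 (ids "shader-vs" and "shader-fs") and the shaderProgram.vertexPositionAttribute property used in drawScene() above:

      var shaderProgram;

      function getShader(gl, id) {
          // Compile the GLSL source found in a <script> tag with the given id.
          var script = document.getElementById(id);
          var shader = gl.createShader(
              script.type === "x-shader/x-vertex" ? gl.VERTEX_SHADER : gl.FRAGMENT_SHADER);
          gl.shaderSource(shader, script.text);
          gl.compileShader(shader);
          if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
              alert(gl.getShaderInfoLog(shader));
          }
          return shader;
      }

      function initShaders() {
          shaderProgram = gl.createProgram();
          gl.attachShader(shaderProgram, getShader(gl, "shader-vs"));
          gl.attachShader(shaderProgram, getShader(gl, "shader-fs"));
          gl.linkProgram(shaderProgram);
          gl.useProgram(shaderProgram);

          // Location of aVertexPosition, used by gl.vertexAttribPointer() in drawScene().
          shaderProgram.vertexPositionAttribute = gl.getAttribLocation(shaderProgram, "aVertexPosition");
          gl.enableVertexAttribArray(shaderProgram.vertexPositionAttribute);
      }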

  9. 3D triangle
      function initBuffers() {
          . . .
          var vertices = [ -0.75,  0.5, -2.0,
                            0.4,  -0.4, -2.0,
                           -0.5,  -0.6, -2.0 ];
          vertexBuffer.itemSize = 3;    // now 3 floats (x, y, z) per vertex
          vertexBuffer.numItems = 3;
      }

  10. Perspective View
  Field of View controller demo
  Initialization:
      var pMatrix = mat4.create();    // create the 4x4 perspective matrix
      shaderProgram.pMatrixUniform = gl.getUniformLocation(shaderProgram, "uPMatrix");
  Rendering:
      mat4.perspective(fieldOfView, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);
      gl.uniformMatrix4fv(shaderProgram.pMatrixUniform, false, pMatrix);

  11. Perspective View (Shader code)
      <script id="shader-fs" type="x-shader/x-fragment">
          void main(void) {
              gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
          }
      </script>
      <script id="shader-vs" type="x-shader/x-vertex">
          attribute vec3 aVertexPosition;
          uniform mat4 uPMatrix;
          void main(void) {
              gl_Position = uPMatrix * vec4(aVertexPosition, 1.0);
          }
      </script>

  12. Backface Culling and Depth
  Backface Culling & Depth demo
  In webGLStart():
      gl.enable(gl.DEPTH_TEST);
      gl.enable(gl.CULL_FACE);
  • Backface culling also helps performance.
  • Disable backface culling to see through transparent objects (see the sketch below).
  • The depth test can hurt performance.
    • If you already know the drawing order, you can turn it off.
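  A possible shape for the transparency case mentioned above (a sketch only; drawTransparentMesh is a hypothetical helper, not from the demo):

      gl.disable(gl.CULL_FACE);     // see through to the back faces
      gl.disable(gl.DEPTH_TEST);    // safe to skip the depth test if the draw order is known
      drawTransparentMesh();        // hypothetical helper that issues the draw calls
      gl.enable(gl.DEPTH_TEST);
      gl.enable(gl.CULL_FACE);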

  13. Transformation
  Rotation, Translation, Scale demo
  JavaScript, using "glMatrix-0.9.5.min.js" to set the 4x4 transformation matrix:
      mat4.identity(modelMatrix);
      mat4.translate(modelMatrix, meshObjectArray[i].translation);
      mat4.scale(modelMatrix, meshObjectArray[i].scale);
      mat4.rotate(modelMatrix, meshObjectArray[i].rotation[0], [1, 0, 0]);
      mat4.rotate(modelMatrix, meshObjectArray[i].rotation[1], [0, 1, 0]);
      mat4.rotate(modelMatrix, meshObjectArray[i].rotation[2], [0, 0, 1]);
  Setup, and passing modelMatrix to the Shader (GPU):
      shaderProgram.modelMatrixUniform = gl.getUniformLocation(shaderProgram, "uMMatrix");
      gl.uniformMatrix4fv(shaderProgram.modelMatrixUniform, false, modelMatrix);

  14. Shader Language
  Vertex Shader
      attribute vec3 aVertexPosition;
      uniform mat4 uPMatrix;
      uniform mat4 uMMatrix;
      void main(void) {
          gl_Position = uPMatrix * uMMatrix * vec4(aVertexPosition, 1.0);
          ....
      }

  15. Camera Matrix
  Eye, Target, Up vectors demo
  JavaScript:
      mat4.lookAt(eye, target, up, cameraMatrix);
  cameraMatrix is saved in uVMatrix for the Vertex Shader:
      gl_Position = uPMatrix * uVMatrix * uMMatrix * vec4(aVertexPosition, 1.0);
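  A minimal sketch of the setup the slide implies, following the same pattern used for uPMatrix and uMMatrix (the vMatrixUniform property name is an assumption):

      // Setup: look up the view-matrix uniform declared as uVMatrix in the vertex shader.
      shaderProgram.vMatrixUniform = gl.getUniformLocation(shaderProgram, "uVMatrix");

      // Each frame: rebuild the camera matrix and send it to the GPU.
      var cameraMatrix = mat4.create();
      mat4.lookAt(eye, target, up, cameraMatrix);    // glMatrix 0.9.5 argument order
      gl.uniformMatrix4fv(shaderProgram.vMatrixUniform, false, cameraMatrix);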

  16. Looking 3D Textures And Lighting

  17. Texture Maps
  .obj file exported by 3DS Max / Maya / Blender:
      v -1.0  1.0  1.0
      v -1.0 -1.0  1.0
      v  1.0  1.0  1.0
      v  1.0 -1.0  1.0
      vn 0.0 0.0 1.0
      vt 0.0 1.0 0.0
      vt 0.0 0.0 0.0
      vt 1.0 1.0 0.0
      vt 1.0 0.0 0.0
      f 1/1/1 2/2/1 3/3/1
      f 4/4/1 3/3/1 2/2/1
  Texture maps should have dimensions that are powers of 2.
  Textured plane with a 30° rotation
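  A minimal sketch of reading those records into flat arrays ready for gl.bufferData(), assuming the faces are already triangulated as in the export above:

      // Parse the v / vt / vn / f records into flat arrays. .obj indices are 1-based.
      function parseOBJ(text) {
          var v = [], vt = [], vn = [];
          var positions = [], texCoords = [], normals = [];
          text.split("\n").forEach(function (line) {
              var p = line.trim().split(/\s+/);
              if (p[0] === "v")       v.push(p.slice(1).map(Number));
              else if (p[0] === "vt") vt.push(p.slice(1).map(Number));
              else if (p[0] === "vn") vn.push(p.slice(1).map(Number));
              else if (p[0] === "f") {
                  for (var i = 1; i <= 3; i++) {           // f  v/vt/vn  v/vt/vn  v/vt/vn
                      var idx = p[i].split("/");
                      positions = positions.concat(v[idx[0] - 1]);
                      texCoords.push(vt[idx[1] - 1][0], vt[idx[1] - 1][1]);
                      normals = normals.concat(vn[idx[2] - 1]);
                  }
              }
          });
          return { positions: positions, texCoords: texCoords, normals: normals };
      }

  The returned positions, texCoords, and normals arrays feed the same Float32Array / gl.bufferData() calls shown on slide 8.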

  18. Programming Texture Coordinates
  Code snippet (to identify the key code)
  Create the shared memory between GPU/CPU:
      shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
      gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);                 // texture coords
      shaderProgram.samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler"); // texture map
  Send the data to the Shader during run-time:
      gl.bindBuffer(gl.ARRAY_BUFFER, myObject.textureCoordBuffer);
      gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, myObject.textureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);
      gl.activeTexture(gl.TEXTURE0);
      gl.bindTexture(gl.TEXTURE_2D, myObject.textureMap);
      gl.uniform1i(shaderProgram.samplerUniform, 0);
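  The snippet binds myObject.textureMap but does not show how the texture is created. A sketch of a typical loader (the filename is a placeholder); it also explains the power-of-2 note on the previous slide, since generateMipmap() in WebGL 1 requires power-of-2 dimensions:

      function loadTexture(url) {
          var texture = gl.createTexture();
          var image = new Image();
          image.onload = function () {
              gl.bindTexture(gl.TEXTURE_2D, texture);
              gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
              gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
              gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
              gl.generateMipmap(gl.TEXTURE_2D);    // requires power-of-2 dimensions in WebGL 1
              gl.bindTexture(gl.TEXTURE_2D, null);
          };
          image.src = url;
          return texture;
      }

      myObject.textureMap = loadTexture("brick.jpg");    // placeholder filename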

  19. Textures inside Shaders
  Vertex Shader
      attribute vec2 aTextureCoord;
      varying vec2 vTextureCoord;
      vTextureCoord = aTextureCoord;    // Pass through the texture coordinates
  Fragment Shader
      varying vec2 vTextureCoord;
      uniform sampler2D uSampler;
      gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));

  20. Texture Transformations
  Tiling, Rotation, Scale, Translation demo
  Initialization of the Rotation, Scaling, and Translation matrices plus the tiling array:
      var textureRotationMatrix = mat3.create();
      var textureScalingMatrix = mat3.create();
      var textureTranslationMatrix = mat3.create();
      var textureTiling = [1.0, 1.0];
  Sending the data to the Shader:
      gl.uniformMatrix3fv(shaderProgram.textureRotationMatrixUniform, false, textureRotationMatrix);
      gl.uniformMatrix3fv(shaderProgram.textureScalingMatrixUniform, false, textureScalingMatrix);
      gl.uniformMatrix3fv(shaderProgram.textureTranslationMatrixUniform, false, textureTranslationMatrix);
      gl.uniform2f(shaderProgram.textureTilingUniform, textureTiling[0], textureTiling[1]);

  21. Texture Transformations (cont.)
  Matrix multiplication for the Texture Transformation:
      New Texture Coord = -Ctr * Scale * Rotation * Ctr * Translation * Texture Coordinate
  Inside the Vertex Shader:
      uniform mat3 uRotationMatrix;
      uniform mat3 uScalingMatrix;
      uniform mat3 uTranslationMatrix;
      uniform vec2 uTextureTiling;
      vec3 textureTransformed = uTextureNegCenter * uScalingMatrix * uRotationMatrix * uTextureCenter * uTranslationMatrix * vec3(aTextureCoord, 1.0);
      // Multiply the texture coordinates by the tiling values
      vTextureCoord = vec2(textureTransformed.s * uTextureTiling.s, textureTransformed.t * uTextureTiling.t);
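  A sketch of how the three mat3 values from slide 20 might be filled each frame. The angle (radians), sx/sy, and tx/ty arguments are hypothetical inputs, and plain 9-element arrays work because gl.uniformMatrix3fv() accepts any array of numbers in column-major order:

      function updateTextureMatrices(angle, sx, sy, tx, ty) {
          var c = Math.cos(angle), s = Math.sin(angle);
          // Column-major 3x3 matrices for 2D homogeneous texture coordinates.
          textureRotationMatrix    = [  c,  s, 0,   -s,  c, 0,    0,  0, 1 ];
          textureScalingMatrix     = [ sx,  0, 0,    0, sy, 0,    0,  0, 1 ];
          textureTranslationMatrix = [  1,  0, 0,    0,  1, 0,   tx, ty, 1 ];
      }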

  22. Lighting and Normals
  • Normal: a unit vector perpendicular to the polygon.
  • The angle between the light direction and the normal determines the amount of light at that polygon or vertex.
  • That amount is equal to the cosine of the angle (see the sketch below).
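  The same cosine term written out in JavaScript (a sketch; both vectors are unit length, and lightDir points from the surface toward the light):

      function diffuseFactor(normal, lightDir) {
          var d = normal[0] * lightDir[0] + normal[1] * lightDir[1] + normal[2] * lightDir[2];
          return Math.max(d, 0.0);    // cos(angle); surfaces facing away get 0
      }
      // diffuseFactor([0, 0, 1], [0, 0, 1])          === 1.0    (light straight on)
      // diffuseFactor([0, 0, 1], [0.707, 0, 0.707])  ≈ 0.707    (light 45° off the normal)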

  23. Lighting
  • 4 basic types of lights
    • Ambient: pervasive light
    • Directional: similar to the Sun
    • Point: a light bulb
    • Spot: direction, location, and a beam width
  • With WebGL, you can create unique lights
    • Fluorescent tube, negative lights, neon sign, etc.

  24. Lighting Calculated in Vertex Shader
  Lighting demo
  Ambient:
      vec3 lightColor = vec3(0.0, 0.0, 0.0);
      if (uAmbientOn)
          lightColor += 0.2;
  Directional:
      vec3 pixelNormal = normalize(vTransformedNormal);
      vec3 lightDirectionNormalized = normalize(uDirLightDirection);
      if (uDirectionalOn)
          lightColor += max(dot(-lightDirectionNormalized, pixelNormal), 0.0);

  25. Point Light
      if (uPointOn) {
          vec3 vectorLightPosToPixel = vec3(vPosition.xyz - uPointLightLocation);
          float distanceLightPosToPixel = length(vectorLightPosToPixel);
          if (distanceLightPosToPixel < uLightRadius) {
              float angleLightToPixelNormal = dot(-normalize(vectorLightPosToPixel), pixelNormal);
              lightColor += max(angleLightToPixelNormal, 0.0) * uPointLightColor;
          }
      }

  26. Spot Light
      if (uSpotOn) {
          vec3 spotLightDirectionNormalized = normalize(uSpotLightDirection);
          vec3 vectorSpotLightToPixel = normalize(vec3(vPosition.xyz - uSpotLightLocation));
          float angleLightToPixel = dot(vectorSpotLightToPixel, spotLightDirectionNormalized);
          float angleLightToPixelNormal = dot(-vectorSpotLightToPixel, pixelNormal);
  [Diagram: the spot light direction and the pixel normal, showing the angles angleLightToPixel and angleLightToPixelNormal]

  27. Spot Light (continued)
          float lightAmt = 0.0;
          // Note: these are the cosines of the angles, not the actual angles
          if (angleLightToPixel >= uSpotLightBeamWidth)
              lightAmt = angleLightToPixelNormal;
          else if ((angleLightToPixel > uSpotLightCutOffAngle) && (uSpotLightBeamWidth > uSpotLightCutOffAngle)) {
              lightAmt = angleLightToPixelNormal * (angleLightToPixel - uSpotLightCutOffAngle) / (uSpotLightBeamWidth - uSpotLightCutOffAngle);
          }
          float lightAtPixel = max(dot(-vectorSpotLightToPixel, pixelNormal), 0.0);
          lightAmt *= lightAtPixel;
          lightColor += vec3(lightAmt, lightAmt, lightAmt);
      }

  28. Normal Map
  Brick Pattern Normal Map
  Color        Color * 2 - 1    Normalized           Direction
  1, .5, 1     (1, 0, 1)        (.707, 0, .707)      +x+z, right
  .5, 0, 1     (0, -1, 1)       (0, -.707, .707)     -y+z, down
  .5, .5, 1    (0, 0, 1)        (0, 0, 1)            +z, forward
  0, .5, 1     (-1, 0, 1)       (-.707, 0, .707)     -x+z, left
  .5, 1, 1     (0, 1, 1)        (0, .707, .707)      +y+z, up
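  The table read as code: a sketch that decodes one normal-map texel (r, g, b in the 0..1 range) the same way.

      function decodeNormal(r, g, b) {
          var x = r * 2 - 1, y = g * 2 - 1, z = b * 2 - 1;    // map 0..1 to -1..1
          var len = Math.sqrt(x * x + y * y + z * z);
          return [x / len, y / len, z / len];                  // unit-length normal
      }
      // decodeNormal(1.0, 0.5, 1.0)  ->  [0.707, 0, 0.707]    (tilted right, matching the first row)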

  29. Normal Map (continued)
  vTransformedNormal (the default) was used as the pixel normal. Now use a Normal Map.
      uniform sampler2D uSampler;             // texture map
      uniform sampler2D uSamplerNormalMap;    // normal map
      uniform mat3 uNMatrix;                  // normals rotated the same as the 3D mesh
      vec4 bumpMapNormal = vec4( (texture2D(uSamplerNormalMap,
          vec2(vTextureCoord.s * uTextureMapTiling.s, vTextureCoord.t * uTextureMapTiling.t)) * 2.0) - 1.0 );
      vec3 pixelNormal = normalize(uNMatrix * normalize(bumpMapNormal.rgb));

  30. Materials
  • Besides texture maps, objects have materials:
    • Diffuse, emissive, transparency, and specular highlights.
  • Specular highlights reflect the light into your eye and, unlike the other materials, change with your viewpoint.
  • The halfway vector is midway between the camera and the light source.
  • The cosine of the angle between the halfway vector and the normal is raised to the power of the shininess value.

  31. Specular Highlight
  Half-way vector:
      vec3 eyeDirection = normalize(vPosition.xyz);
      vec3 halfVector = normalize(vectorLightPosToPixel + eyeDirection);
      float specularValue = dot(-halfVector, pixelNormal);
      float specularLightWeighting = pow(specularValue, u_shinniness);
  GLSL reflect command:
      vec3 eyeDirection = -normalize(vPosition.xyz);
      vec3 reflectionDirection = normalize(reflect(vectorLightPosToPixel, pixelNormal));
      float specularLightWeighting = pow(max(dot(reflectionDirection, eyeDirection), 0.0), u_shinniness);
      // Note that the camera is at (0,0,0), so eyeDirection is from the camera to the pixel
      // Since specularValue is between 0 and 1, the higher the shininess exponent, the smaller the specular highlight

  32. Building 3D Worlds

  33. Portals
  • Portals render images as a texture map.
  • Create a texture map (rtt: "render-to-texture")
      rttFramebuffer1 = gl.createFramebuffer();
      rttTexture1 = gl.createTexture();
      // additional code to set width, height, texture map parameters (see the sketch below)
  • At run time, switch from the default framebuffer (where the scene renders to the screen) to the rtt framebuffer, using the portal camera
      gl.bindFramebuffer(gl.FRAMEBUFFER, rttFramebuffer1);
      mat4.lookAt(portalEye, portalTarget, portalUp, portalCamera);
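  A sketch of that elided setup, the way render-to-texture is typically configured in WebGL 1; the 512x512 size and the depth renderbuffer are assumptions, not taken from the demo:

      var rttSize = 512;
      gl.bindFramebuffer(gl.FRAMEBUFFER, rttFramebuffer1);

      // Allocate the color texture the portal view will render into.
      gl.bindTexture(gl.TEXTURE_2D, rttTexture1);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, rttSize, rttSize, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);

      // Depth buffer so the portal view is depth-tested like the main view.
      var renderbuffer = gl.createRenderbuffer();
      gl.bindRenderbuffer(gl.RENDERBUFFER, renderbuffer);
      gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, rttSize, rttSize);

      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, rttTexture1, 0);
      gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, renderbuffer);

      gl.bindTexture(gl.TEXTURE_2D, null);
      gl.bindRenderbuffer(gl.RENDERBUFFER, null);
      gl.bindFramebuffer(gl.FRAMEBUFFER, null);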

  34. Portals (continued)
  • After rendering the scene as normal, save the rtt framebuffer as a texture.
      gl.bindTexture(gl.TEXTURE_2D, rttTexture1);
      gl.generateMipmap(gl.TEXTURE_2D);
  • Then render the scene as usual, and set the rtt texture map as the 3D mesh's texture map.
      gl.activeTexture(gl.TEXTURE0);
      gl.bindTexture(gl.TEXTURE_2D, rttTexture1);
      gl.uniform1i(shaderProgram.samplerUniform, 0);

  35. Reflection
  • Reflection simulates a mirror. Often a Cube Map is used, enclosing the scene inside a cube.
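  A sketch of building the cube map sampled by textureCube() on the next slide; the six face URLs are placeholders:

      function loadCubeMap(urls) {    // urls: [+x, -x, +y, -y, +z, -z]
          var faces = [gl.TEXTURE_CUBE_MAP_POSITIVE_X, gl.TEXTURE_CUBE_MAP_NEGATIVE_X,
                       gl.TEXTURE_CUBE_MAP_POSITIVE_Y, gl.TEXTURE_CUBE_MAP_NEGATIVE_Y,
                       gl.TEXTURE_CUBE_MAP_POSITIVE_Z, gl.TEXTURE_CUBE_MAP_NEGATIVE_Z];
          var texture = gl.createTexture();
          gl.bindTexture(gl.TEXTURE_CUBE_MAP, texture);
          gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
          gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
          faces.forEach(function (face, i) {
              var image = new Image();
              image.onload = function () {
                  gl.bindTexture(gl.TEXTURE_CUBE_MAP, texture);
                  gl.texImage2D(face, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
              };
              image.src = urls[i];    // e.g. "skybox_posx.jpg" (placeholder)
          });
          return texture;
      }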

  36. Reflection
  • Vertex Shader: use the built-in GLSL reflect() function
      varying vec3 ReflectDir;
      vec3 worldNorm = vec3(uNMatrix * aVertexNormal);
      vec3 worldView = normalize(uCameraEye - vPosition.xyz);
      ReflectDir = reflect(-worldView, worldNorm);
  • Fragment Shader: use the GLSL textureCube() function
      vec4 reflectColor = textureCube(uCubemap, ReflectDir);
      gl_FragColor = vec4(reflectColor.rgb * lighting, 1.0);

  37. Refraction
  • Refraction is the bending of light as it passes from one medium to another (such as air to water).
  Fragment Shader:
      float eta = 1.33;    // eta is the ratio of the indices of refraction, such as air / water
      vec3 RefractDir = refract(-vPosition.xyz, vTransformedNormal, eta);
      vec4 fragmentColor = textureCube(uCubemap, RefractDir);
      gl_FragColor = vec4(fragmentColor.rgb * lightWeighting, 1.0);

  38. Fog (under water, depth cueing)
  Fragment Shader:
      float distanceCameraToPixel = length(vVertexPosition.xyz);
      float fogDepth = 60.0;
      vec4 fragmentColor = . . .
      if (distanceCameraToPixel < fogDepth) {
          fragmentColor = vec4( fragmentColor.rgb * (1.0 - distanceCameraToPixel / fogDepth)
                              + uFogColor.rgb * (distanceCameraToPixel / fogDepth), 1.0 );
      }
      else
          fragmentColor = vec4(uFogColor, 1.0);
      gl_FragColor = vec4(fragmentColor.rgb, 1.0);

  39. Picking
  • Picking is selecting an object in 3D space through an interface device, such as a mouse.
  • Create an invisible bounding box around the 3D mesh.
    • Find the max and min x, y, z coordinates of the 3D mesh.
  • Convert the 2D mouse click to a 3D ray (see the sketch below):
      xW = (xS - widthS / 2) * 2 / widthS
      yW = -(yS - heightS / 2) * 2 / heightS
      zW = -distanceToViewPlane = -1 / tan(fieldOfView / 2)
  • Determine if the ray intersects any bounding boxes.
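  The same conversion as a JavaScript sketch; fieldOfView is assumed to be in degrees, matching the 11.4° value quoted on the next slide:

      function clickToRay(event, canvas, fieldOfView) {
          var rect = canvas.getBoundingClientRect();
          var xs = event.clientX - rect.left;    // mouse position inside the canvas
          var ys = event.clientY - rect.top;
          var xw =  (xs - canvas.width  / 2) * 2 / canvas.width;
          var yw = -(ys - canvas.height / 2) * 2 / canvas.height;
          var zw = -1 / Math.tan(fieldOfView * Math.PI / 360);    // -distanceToViewPlane
          return [xw, yw, zw];    // ray direction from the camera at (0, 0, 0)
      }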

  40. Picking
  Using the ray's coordinates (xW, yW, zW) and the Bounding Box min and max:
      xMin = xBBmin / xW    xMax = xBBmax / xW
      yMin = yBBmin / yW    yMax = yBBmax / yW
      zMin = zBBmin / zW    zMax = zBBmax / zW
  If max(xMin, yMin, zMin) < min(xMax, yMax, zMax), then we have an intersection (see the sketch below).
  Precise ray-polygon intersection is math intensive, but may be helped by WebCL.
  Good question for the WebCL BOF: Can WebCL be used for ray-polygon intersection / ray tracing?
  The demo uses a field of view of 11.4°, so distanceToViewPlane = 1 / tan(5.7°) ≈ 10.
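  The intersection test as a JavaScript sketch, with one guard the slide leaves implicit: when a ray component is negative, its near and far values swap. The bounding-box coordinates are assumed to be in camera space, like the ray:

      function rayHitsBox(ray, bbMin, bbMax) {    // ray starts at the camera, (0, 0, 0)
          var tMin = -Infinity, tMax = Infinity;
          for (var i = 0; i < 3; i++) {
              var t0 = bbMin[i] / ray[i];
              var t1 = bbMax[i] / ray[i];
              tMin = Math.max(tMin, Math.min(t0, t1));    // largest entry value
              tMax = Math.min(tMax, Math.max(t0, t1));    // smallest exit value
          }
          return tMin < tMax && tMax > 0.0;    // intersection in front of the camera
      }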

  41. Mitch.Williams@3D-Online.com
  • Come up and get a business card
  • Slides: Web 3D Conf. wiki, 3D-Online.com
  • "WebGL HotShot"
    • Available at the SIGGRAPH bookstore
    • Ebook discount code:
  • Khronos: Wed., Aug. 13, Marriott Pinnacle
    • WebCL BOF: 2:30 pm – 3 pm
    • WebGL BOF: 4 pm – 5 pm
  • Special Thanks: Dave Blackburn
  • Q & A
