
Windows Presentation Foundation: Extensible BitmapEffects, Pixel Shaders, And WPF Graphics Futures

PC07: Windows Presentation Foundation: Extensible BitmapEffects, Pixel Shaders, And WPF Graphics Futures. David Teitlebaum, Program Manager, Microsoft Corporation.





Presentation Transcript


  1. PC07 Windows Presentation Foundation: Extensible BitmapEffects, Pixel Shaders, And WPF Graphics Futures  David Teitlebaum Program Manager Microsoft Corporation

  2. Or Rather… Windows Presentation Foundation: Extensible ShaderEffects (Pixel Shaders), DX Interop, And WPF Graphics Futures (PC07)

  3. In This Talk: Reading beyond the title • Motivations for the new Effect system • A brief HLSL primer • Exposure of new Effects in WPF • Direct3D Interop (D3DImage) • Coming graphics features in WPF 4.0

  4. Motivations For A New Effect System: Goodbye BitmapEffects, we barely knew ye! • BitmapEffects • Introduced in WPF 3.0 • Allowed visual changes to be applied to a UIElement • Several built-in BitmapEffects • BevelBitmapEffect • BlurBitmapEffect • DropShadowBitmapEffect • EmbossBitmapEffect • OuterGlowBitmapEffect • BitmapEffect extensibility mechanism

  5. Motivations For A New Effect System: Goodbye BitmapEffects, we barely knew ye! • When you applied a BitmapEffect to an element in your scene, we: • Rendered the subgraph into an intermediate render target (IRT) • Software rasterizer • Analogous to RenderTargetBitmap • Limitations on IRTs • No mip mapping in 3D • No anisotropic filtering in 3D • No ClearType • Applied the BitmapEffect on the CPU, not the GPU • Composited the BME with the rest of the scenegraph

  6. BitmapEffect Example • <Grid Background="Black"> • <Image Source="c:\\cat.png"> • <Image.BitmapEffect> • <OuterGlowBitmapEffect GlowSize="100"/> • </Image.BitmapEffect> • </Image> • </Grid>

  7. BitmapEffect Example: Glowing Cat

  8. Motivations For A New Effect System: Goodbye BitmapEffects, we barely knew ye! • Many performance caveats to using BitmapEffects • Software rendered • Affected subgraph is software rendered • No read-back from the GPU • Blur operations are very slow • Reason for the different sampling kernels • Require multiple shader passes internally • BlurBitmapEffect • DropShadowBitmapEffect • GlowBitmapEffect

  9. Something Had To Be Done! What happens to legacy BitmapEffects? • 3.5 SP1 and beyond • Throw an "obsolete" warning in Visual Studio • BlurBitmapEffect and DropShadowBitmapEffect have HW acceleration • HW "fall forward" • 4.0 and beyond • Software BitmapEffects to be removed • They will simply no-op • HW "fall forward" will continue to work

  10. Something Had To Be Done! Enter: Effects • The goal • GPU accelerated… at last!  • HW IRTs instead of SW, HW GPU effects  • HW accelerate the most popular built-in effects • Blur and DropShadow • Allow authoring of custom, hardware-accelerated Effects • Create a fast software fallback pipeline • Faster than legacy BMEs • Create a system that will run on most video cards • PS 2.0

  11. Effect Exposure In WPF: Bridging the gap • We created a new class from which you derive your custom ShaderEffect • Object • DispatcherObject • DependencyObject • Freezable • Animatable • BitmapEffect • Effect • BlurEffect • DropShadowEffect • ShaderEffect • <Your Custom Effect Class>

  12. ShaderEffect Exposure In WPF: Bridging the gap • We did not extend the legacy BitmapEffect classes • We strongly discourage the use of BitmapEffects • Intermixing of BitmapEffects and Effects in BitmapEffectGroups would have been very difficult

  13. Limitations Of Effects: Bridging the gap • No HW acceleration for: • OuterGlowBitmapEffect • EmbossBitmapEffect • BevelBitmapEffect • No Effect analogs either • Software fallback requires SSE2 • Added RenderCapability.IsShaderEffectSoftwareRenderingSupported • No EffectGroup • Group via nesting elements

  14. demo: Extensible ShaderEffects on webcam output, borrowing code derived from Tamir Khason's Webcam Control • David Teitlebaum, Program Manager, WPF

  15. Instead Of An EffectGroup, Simply Nest Elements • <Grid Name="Grid1"> • <Grid.Effect> • <BlurEffect /> • </Grid.Effect> • <Grid Name="Grid2"> • <Grid.Effect> • <DropShadowEffect /> • </Grid.Effect> • Etc…

  16. Nuts And Bolts Of Shader Effects: What is a pixel shader? • Small program designed for GPU parallelism • The program is run once per destination pixel • Run in destination space • Commonly used in 3D rendering systems for texture blending, lighting model computations, etc. • SIMD – the GPU uses massive parallelization to run many copies of the program at the same time • Scalar types: bool, int, uint, half, float, double • Vector types: e.g., float4 • You set one shader program, but it's run thousands of times, once per pixel

  17. Extensible Pixel Shader Effects: What is a pixel shader? • Originally written in an assembly-like language • Now commonly written in two C-like languages • HLSL – used by Direct3D • GLSL – used by OpenGL • HLSL: first compiled to byte code, then consumed by the video driver • GLSL: compiled by the video driver itself

  18. HLSL: High Level Shading Language • Several versions of HLSL • True looping constructs not added until PS 3.0 • Versions reflect the DX hardware generation • WPF currently only supports PS 2.0
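WPF only loads compiled ps_2_0 byte code, not HLSL source; the DirectX SDK's HLSL compiler produces it. A minimal sketch of that compile step (assumes fxc.exe from the DirectX SDK is on the PATH; the entry-point name main and the file names are illustrative):

```shell
fxc /T ps_2_0 /E main /Fo MyEffect.ps MyEffect.fx
```

/T selects the PS 2.0 target that WPF requires, /E names the shader's entry function, and /Fo writes the compiled .ps file that the managed effect's PixelShader later points at.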

  19. HLSL: List of intrinsic functions abs(x) acos(x) all(x) any(x) asfloat(x) asin(x) asint(x) asuint(x) atan(x) atan2(y, x) ceil(x) clamp(x, min, max) clip(x) cos(x) cosh(x) cross(x, y) D3DCOLORtoUBYTE4(x) ddx(x) ddy(x) degrees(x) determinant(m) distance(x, y) dot(x, y) exp(x) exp2(x) faceforward(n, i, ng) floor(x) fmod(x, y) frac(x) frexp(x, exp) fwidth(x) GetRenderTargetSampleCount() GetRenderTargetSamplePosition(x) isfinite(x) isinf(x) isnan(x) ldexp(x, exp) length(v) lerp(x, y, s) lit(n • l, n • h, m) log(x) log10(x) log2(x) max(x, y) min(x, y) modf(x, out ip) mul(x, y) noise(x) normalize(x) pow(x, y) radians(x) reflect(i, n) refract(i, n, R) round(x) rsqrt(x) saturate(x) sign(x) sin(x) sincos(x, out s, out c) sinh(x) smoothstep(min, max, x) sqrt(x) step(a, x) tan(x) tanh(x) tex1D(s, t) tex1Dbias(s, t) tex1Dgrad(s, t, ddx, ddy) tex1Dlod(s, t) tex1Dproj(s, t) tex2D(s, t) tex2Dbias(s, t) tex2Dgrad(s, t, ddx, ddy) tex2Dlod(s, t) tex2Dproj(s, t) tex3D(s, t) tex3Dbias(s, t) tex3Dgrad(s, t, ddx, ddy) tex3Dlod(s, t) tex3Dproj(s, t) texCUBE(s, t) texCUBEbias(s, t) texCUBEgrad(s, t, ddx, ddy) texCUBElod(s, t) texCUBEproj(s, t) transpose(m) trunc(x)

  20. SIMD Math (HLSL): vector types • float4 myColor = { 0.5f, 0.2f, 0.4f, 0.2f }; • float4 colorFilter = { 0.0f, 0.0f, 1.0f, 1.0f }; • float4 result = myColor * colorFilter; • <This is FOUR TIMES as fast as doing these operations separately, since they're done in parallel> • result == { 0.0f, 0.0f, 0.4f, 0.2f } • { result.x, result.y, result.z, result.w } • …equivalent to • { result.r, result.g, result.b, result.a }

  21. SIMD Math (HLSL): matrix types • float2x4 fMatrix; // float matrix with 2 rows, 4 columns • float2x2 fMatrix = { 0.0f, 0.1f, // row 1 • 2.1f, 2.2f }; // row 2

  22. Nuts And Bolts Of Shader Effects: What is a pixel shader? • Shaders can read from one or many source textures • Shaders can take numerical inputs in the form of Shader Constants • Accepted types vary between versions of the shader languages • PS 2.0: float • Both scalar and vector types • Shader Constants are passed to shaders via Shader Constant Registers • Numerically indexed

  23. Shader Constant Registers (HLSL): Passing values to your shader • float width : register(C0); • float height : register(C1); • float2 mousePosition : register(C2); • float totalPixels = width * height; • float mouseXpos = mousePosition.x;

  24. Nuts And Bolts Of Shader Effects: What is a pixel shader? • Source textures take the form of Samplers • Also numerically indexed • WPF allows up to four Samplers per shader • Multiple Samplers correspond to "multi-input" shaders • Even if you don't pass a Sampler to your shader, WPF can create an "Implicit Input" • A Sampler is created for the Element on which the shader is applied • You don't need to use it if, for example, you're writing a generative shader

  25. Shader Constant Registers (HLSL): Passing values to your shader • sampler2D implicitInputSampler : register(S0); • float2 texCoord = { 0.4, 0.6 }; • float4 color = tex2D( implicitInputSampler, texCoord ); • // Texture coordinates always range from 0..1 • // on both x and y axes
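A "multi-input" shader, mentioned above, simply declares additional samplers in higher-numbered registers; on the managed side each extra sampler is registered at the matching index. A minimal sketch (the sampler names and the mask-by-alpha operation are illustrative, not from the talk):

```hlsl
// s0 is the implicit input (the element's own rendering);
// s1 is a second, explicit input supplied by the managed effect.
sampler2D implicitInput : register(s0);
sampler2D maskInput     : register(s1);

float4 main(float2 uv : TEXCOORD) : COLOR
{
    float4 color = tex2D(implicitInput, uv);
    float4 mask  = tex2D(maskInput, uv);
    // Attenuate the element by the mask's alpha channel
    return color * mask.a;
}
```

On the C# side, the second input would be registered with the same RegisterPixelShaderSamplerProperty helper shown in the ShaderEffect code slide, using sampler index 1 instead of 0.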

  26. HLSL Example 1: Removing red and green color channels

  27. HLSL – What Does It Look Like? • sampler2D Input : register(s0); • float4 main(float2 uv:TEXCOORD) : COLOR • { • float4 Color; • Color = tex2D(Input, uv.xy); • Color.rg = 0; • return Color; • }

  28. HLSL – What Does It Look Like? Breaking down the program • sampler2D Input : register(s0); • Defining an input texture to be accessible to your shader program • float4 main(float2 uv:TEXCOORD) : COLOR • Function signature of your shader program. It returns a color • float4 Color; • Defining a float4 – a single variable representing four floats, each separately addressable, though many math operations can act on all four simultaneously (add, multiply, etc.)

  29. HLSL – What Does It Look Like? Breaking down the program • Color = tex2D(Input, uv.xy); • Sampling from your input texture using the texture coordinates (normalized from 0..1) uv.xy (a 2-component quantity) • return Color; • The function returns the sampled color

  30. What Else Do We Need To Get This To Run In WPF? • We need our managed class which will consume this code…

  31. ShaderEffect C# Code • public class MyEffect : ShaderEffect • { • static MyEffect() • { _pixelShader.UriSource = Global.MakePackUri("MyEffect.ps"); } • public MyEffect() • { • this.PixelShader = _pixelShader; • UpdateShaderValue(InputProperty); • } • public Brush Input • { • get { return (Brush)GetValue(InputProperty); } • set { SetValue(InputProperty, value); } • } • public static readonly DependencyProperty InputProperty = • ShaderEffect.RegisterPixelShaderSamplerProperty( • "Input", typeof(MyEffect), 0); • private static PixelShader _pixelShader = new PixelShader(); • }

  32. How Do We Apply This ShaderEffect To A UIElement? • Just like BitmapEffects, we can use XAML, or we can use code…

  33. Applying A ShaderEffect • <Grid> • <Grid.Effect> • <MyEffect /> • </Grid.Effect> • </Grid> • Grid g = new Grid(); • MyEffect me = new MyEffect(); • g.Effect = me;

  34. How To Write A ShaderEffect: In a Nutshell… • Write your pixel shader in HLSL • Derive from the ShaderEffect class • Expose whatever DPs you want accessible from your ShaderEffect • Assign your DPs to Shader Constant Registers • Compile your ShaderEffect using the Visual Studio build task – the HLSL gets compiled at this time • Apply your ShaderEffect via the new UIElement.Effect property
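The "assign your DPs to Shader Constant Registers" step uses ShaderEffect's PixelShaderConstantCallback helper. A minimal sketch, assuming the MyEffect class from the earlier slide and a shader that reads a float from register c0 (the Amount name and default value are illustrative):

```csharp
// Hypothetical scalar parameter bound to constant register c0.
// In the HLSL: float amount : register(C0);
public double Amount
{
    get { return (double)GetValue(AmountProperty); }
    set { SetValue(AmountProperty, value); }
}

public static readonly DependencyProperty AmountProperty =
    DependencyProperty.Register(
        "Amount", typeof(double), typeof(MyEffect),
        new UIPropertyMetadata(1.0,
            // Pushes the new value into register c0
            // whenever the property changes
            PixelShaderConstantCallback(0)));
```

WPF marshals the double into the float constant register each time the dependency property changes, so the value is also animatable like any other DP.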

  35. How To Write A ShaderEffect: Last Minute Details… Premultiplied Alpha! • WPF operates with premultiplied alpha • Remember to un-premultiply and re-premultiply your alpha in your shader if you want it to work right with transparency • Un-premultiplication: • Color = tex2D(Input, uv.xy); • Color.rgb /= Color.a; • Re-premultiplication: • Color.rgb *= Color.a; • return Color;
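Putting both halves together, a transparency-safe version of the earlier channel-removal shader might look like this sketch (the zero-alpha guard is an added assumption, since dividing by alpha = 0 would otherwise produce invalid values):

```hlsl
sampler2D implicitInput : register(s0);

float4 main(float2 uv : TEXCOORD) : COLOR
{
    float4 color = tex2D(implicitInput, uv);

    // Un-premultiply; guard against fully transparent pixels
    if (color.a > 0)
        color.rgb /= color.a;

    // ...operate on the straight-alpha color here...
    color.rg = 0;   // e.g., keep only the blue channel

    // Re-premultiply before handing the result back to WPF
    color.rgb *= color.a;
    return color;
}
```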

  36. How To Write A ShaderEffect: Last Minute Details… Source Texture Partial Derivatives • DdxUvDdyUvRegisterIndex • Specifies a shader register that will contain the partial derivatives of the texture coordinates with respect to screen space • "If you step 1 pixel to the right in screen space (in the x direction), then ddx(u) is the amount that u changes in texture space, and ddx(v) is the amount that v changes in texture space. If the effect is axis-aligned when it is rendered, then ddx(v) is 0. If the effect is rotated when it is rendered, then ddx(v) is non-zero."
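On the shader side this register is just another float4 constant. A sketch of consuming it, assuming the effect's constructor set DdxUvDdyUvRegisterIndex = 0 and assuming the register layout (ddx(u), ddx(v), ddy(u), ddy(v)):

```hlsl
sampler2D implicitInput : register(s0);
// Filled in by WPF when the effect sets DdxUvDdyUvRegisterIndex = 0;
// assumed layout: (ddx(u), ddx(v), ddy(u), ddy(v))
float4 ddxDdy : register(c0);

float4 main(float2 uv : TEXCOORD) : COLOR
{
    // Sample the texel one screen pixel to the right of this one,
    // correct even when the effect is scaled or rotated
    float2 oneScreenPixelRight = uv + ddxDdy.xy;
    return tex2D(implicitInput, oneScreenPixelRight);
}
```

This is what makes screen-space kernels (blurs, edge detection) behave consistently regardless of how the element is transformed.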

  37. How To Write A ShaderEffect: Last Minute Details… Getting proper hit testing when doing spatial transforms • EffectMapping property • Returns a GeneralTransform • Override to get proper post-shader inverse-mapping of mouse hit-testing and coordinate systems
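A sketch of overriding EffectMapping, for a hypothetical effect that renders its input flipped vertically (not one of the talk's effects); the transform maps hit-test points the same way so the mouse "hits" what the user actually sees. Effect-space coordinates are normalized to [0,1]:

```csharp
// Maps (x, y) -> (x, 1 - y) in normalized effect space.
public class FlipYTransform : GeneralTransform
{
    public override GeneralTransform Inverse
    {
        get { return new FlipYTransform(); }  // a flip is its own inverse
    }

    public override bool TryTransform(Point inPoint, out Point result)
    {
        result = new Point(inPoint.X, 1 - inPoint.Y);
        return true;  // this transform cannot fail
    }

    public override Rect TransformBounds(Rect rect)
    {
        return new Rect(rect.X, 1 - rect.Y - rect.Height,
                        rect.Width, rect.Height);
    }

    protected override Freezable CreateInstanceCore()
    {
        return new FlipYTransform();
    }
}

// In the ShaderEffect subclass:
protected override GeneralTransform EffectMapping
{
    get { return new FlipYTransform(); }
}
```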

  38. demo: Writing a ShaderEffect in Visual Studio using Gerhard Schneider and Greg Schechter's Visual Studio build task • David Teitlebaum, Program Manager, WPF

  39. Direct3D Interop: D3DImage

  40. Direct3D Interop: Enter D3DImage • Prior to D3DImage • D3D "interop" meant HWND hosting • With D3DImage • A WPF application consumes D3D content • The D3D content is rendered on a separate DX device, not WPF's • Does not allow for the reverse • D3D cannot directly consume WPF content

  41. Direct3D Interop: Enter D3DImage • Derived from ImageSource • Object • DispatcherObject • DependencyObject • Freezable • Animatable • ImageSource • D3DImage • Exists in the Interop namespace • Not included with the default XMLNS in WinFX

  42. Direct3D Interop: Enter D3DImage • How does D3DImage work? • App creates a D3DImage • App creates an unmanaged D3D device • The D3D device is used to create a surface • The surface is handed to the D3DImage via the SetBackBuffer method on D3DImage • App renders D3D content by • Locking the surface • Rendering to the surface via the D3D device • Adding dirty rects • Unlocking the surface • WPF takes care of redrawing everything necessary when the D3DImage has been dirtied

  43. Direct3D Interop: Enter D3DImage • How does D3DImage work?

  44. D3DImage Pseudo-code • D3DImage myD3DImage = new D3DImage(); • // <Create a D3D device> • myD3DImage.IsFrontBufferAvailableChanged += new EventHandler( myD3DImage_IsFrontBufferAvailableChanged ); • // <Create a surface on the D3D device> • myD3DImage.SetBackBuffer(D3DResourceType.IDirect3DSurface9, surfacePointer); • // When the surface needs updating • if (myD3DImage.IsFrontBufferAvailable) • { • myD3DImage.Lock(); // or TryLock(...) • // <Render into the surface> • myD3DImage.AddDirtyRect(dirtyRect); • myD3DImage.Unlock(); • } • static void myD3DImage_IsFrontBufferAvailableChanged(...) • { • if (myD3DImage.IsFrontBufferAvailable) • { • myD3DImage.Lock(); // or TryLock() • myD3DImage.SetBackBuffer(D3DResourceType.IDirect3DSurface9, surfacePointer); • // <Render into the surface> • myD3DImage.AddDirtyRect(dirtyRect); • myD3DImage.Unlock(); • } • else • { • // <Suspend surface updates> • } • }

  45. demo D3DImage sample from D3DImage Lab written by Mark Smith  David Teitlebaum Program Manager WPF

  46. D3DImage Gotchas! Create the best D3D device possible • There are two kinds of D3D9 devices • IDirect3D9 • Supported by XP's XDDM driver model and Vista's WDDM driver model • No shared surface support • IDirect3D9Ex • Only supported by Vista's WDDM driver model • Shared surface support (better performance) • On Vista, use Direct3DCreate9Ex if it's available • Otherwise, use Direct3DCreate9

  47. D3DImage Gotchas! Choice of pixel format is very important! • The wrong pixel format will force a software surface -> surface copy

  48. D3DImage Gotchas! The D3D device can be lost at any time! • The D3D device can be lost when • The screen is locked • A full-screen exclusive D3D application runs • User switching • Using Remote Desktop • Etc… • The application must handle device loss…

  49. D3DImage Gotchas! D3D Device Loss • Handling D3D device loss • Listen to the IsFrontBufferAvailableChanged event • When not available, the WPF rendering system releases its reference to the back buffer • When the D3D device is available again, the IsFrontBufferAvailableChanged event is raised again • The app restarts rendering by calling the SetBackBuffer method again with a valid Direct3D surface

  50. D3DImage Gotchas! Other Recommendations • Create your D3D device using the D3DCREATE_MULTITHREADED and D3DCREATE_FPU_PRESERVE creation flags • On multi-mon XP systems, you will get better performance if your DX device was created using the same monitor that your D3DImage is using • Detect which monitor your D3DImage is on, and use the back buffer of the corresponding D3D device when you render
