
5th International Workshop on Network and Operating System Support for Digital Audio and Video




Presentation Transcript


  1. A Method and Apparatus for Measuring Media Synchronization. B. K. Schmidt, J. D. Northcutt, M. S. Lam. 5th International Workshop on Network and Operating System Support for Digital Audio and Video. Presenter: Nadir Kiyanclar

  2. Agenda • Problem – Reliable Measurement • Overview of synchronization as defined by authors • Background and system overview • Stimulus Generator • Test System • Measurement System • Implementation and design of media synchronization measurement system • Experimental setup and results • Observations and conclusions

  3. Problem: Reliable Measurement • Many schemes exist for synchronizing within and among media streams • Independent streams with timestamps • Interleaving • Advantages and drawbacks to all • Interleaving • ‘Coupling of failure modes resulting from packet loss during transmission’ • Timestamps • Must account for variations in clocks • Many ways of implementing a given scheme • No uniform way to objectively measure the performance of these schemes
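A toy sketch of the interleaving drawback quoted above (the packet layout and names here are invented for illustration, not taken from the paper): when one packet carries both an audio and a video element, a single loss removes data from both streams at once, coupling their failure modes.

```python
# Hypothetical illustration: each interleaved packet carries one audio
# element and one video element, so losing packet 1 drops A1 and V1 together.
interleaved = [("A0", "V0"), ("A1", "V1"), ("A2", "V2")]
lost_packets = {1}  # index of the packet lost in transit

received = [pkt for i, pkt in enumerate(interleaved) if i not in lost_packets]
audio = [a for a, _ in received]
video = [v for _, v in received]
print(audio)  # ['A0', 'A2'] -- A1 is gone...
print(video)  # ['V0', 'V2'] -- ...and V1 is gone with it
```

With independent timestamped streams, the same single loss would cost only one medium an element, at the price of having to account for clock variation between the streams.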

  4. Background: Definitions • Media Element: Single unit of a media type • Atomic with regard to measurement • E.g. video frame, audio sample • Media Stream: Series of media elements • Media Sequence: Time-ordered sequence of streams • E.g. Songs in an album

  5. Background: Synchronization • Intra-stream • Inter-stream • Intra-sequence • Inter-sequence [Diagram: interleaved timeline of audio (A) and video (V) frames; legend: A = audio frame, V = video frame, crossed-out A = lost frame]

  6. Background: Synchronization • Intra-stream: jitter, drift • Inter-stream: lip sync • Intra-sequence • Inter-sequence [Diagram: the same A/V timeline annotated with inter-frame arrival and inter-stream frame arrival intervals; legend: A = audio frame, V = video frame, crossed-out A = lost frame]

  7. System Overview: Test Environment [Diagram: the Stimulus Production System (Test Generator, Stored Test Media) feeds the System Under Test (capture, transport/manipulation/storage, temporal alignment, display to user, plus optional internal timing and reporting); its output is observed by the Measurement System (Display Subsystem, Measuring Device, Event Capture, Data Processor, Record Generator)]

  8. System Overview: Stimulus Generator • Generate stimulus (input) data for the system to be tested • Stored media can have timestamps added ahead of time • Must also be able to generate live data for testing videoconferencing • Test sequence: • Long-running, to detect drift • Vary timing and event frequency to avoid biasing the test through resonance with the tested system’s clock
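The "vary timing and event frequency" point can be sketched as follows (a minimal illustration; the function name, periods, and jitter bounds are invented here, not from the paper): rather than emitting stimulus events at a fixed period, which could beat against the tested system's clock, each inter-event gap is randomized around a nominal period.

```python
# Hypothetical sketch of a resonance-avoiding stimulus schedule.
import random

def event_schedule(n_events, nominal_period_ms, jitter_ms, seed=0):
    """Return event times with each gap drawn from period +/- jitter."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n_events):
        # Jitter every gap so events never lock onto a fixed phase.
        t += nominal_period_ms + rng.uniform(-jitter_ms, jitter_ms)
        times.append(t)
    return times

times = event_schedule(5, 100.0, 20.0)
# Gaps fall in [80, 120] ms instead of hitting exactly 100 ms every time.
```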

  9. System Overview: Test System • Work as normal: stimulus generation and measurement done on separate hardware • Optionally use a tracing API to record events internally • Allows the comparison of application-perceived and real-world performance • Timing accuracy must exceed that of both the stimulus generator and the tested system

  10. System Overview: Measurement System • Receive input from both stimulus generator and application • Observe application display externally • Recognize and process events, generate statistics • Optionally make use of data provided by application via tracing API

  11. System Overview: Measurement System • Each event e has • Generation time Tg(e) • Ideal display time Ti(e) • Actual display time Ta(e) • Other measurements derived from these: • End-to-end latency: Ta(e) – Tg(e) • Absolute asynchrony: Ta(e) – Ti(e) • Relative asynchrony between two events e & f: Ta(e) – Ta(f)
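The three derived metrics above are simple differences of the per-event times; a minimal sketch (the function names and the example times are invented here, only the formulas come from the slide):

```python
# The three timing metrics from the slide, as plain time differences (ms).

def end_to_end_latency(t_actual, t_generated):
    """Ta(e) - Tg(e): how long the event took from generation to display."""
    return t_actual - t_generated

def absolute_asynchrony(t_actual, t_ideal):
    """Ta(e) - Ti(e): how far from its ideal display time the event landed."""
    return t_actual - t_ideal

def relative_asynchrony(t_actual_e, t_actual_f):
    """Ta(e) - Ta(f): skew between two events meant to appear together."""
    return t_actual_e - t_actual_f

# Illustrative values: an audio click generated at t = 0 ms, ideally
# displayed at 40 ms, actually observed at 55 ms; its paired video
# frame actually observed at 48 ms.
print(end_to_end_latency(55.0, 0.0))    # 55.0
print(absolute_asynchrony(55.0, 40.0))  # 15.0
print(relative_asynchrony(55.0, 48.0))  # 7.0
```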

  12. Implementation • Stimulus Generator • Program which runs in the real-time scheduling class on a dedicated machine • Video events: send commands to a simple NTSC signal-generation device to switch between two frames • Black • Black with a white square • Audio events: generate a clicking noise • Tested System • Simple in-house media player • 8-bit monaural audio; 320×240-pixel video, JPEG-encoded frames • One thread per stream • Master thread is audio, to which the other threads synchronize

  13. Implementation: Measurement Device • CHAOS: Chronological Hardware Activity Observation System • Special-purpose I/O board that generates timestamps for events • Events are signals on one of the board’s ports • Timestamps based on the system clock • Clock error corrected before running the test using a high-precision GPS clock • By determining the computer’s actual clock speed with high accuracy, ticks can be related to real time reliably • Video event detection: photodiode-based transducer • Audio event detection: microphone
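The tick-to-real-time correction can be sketched as a two-step calibration (an illustrative sketch only; the function names and all numbers below are invented, not CHAOS internals): first measure the clock's actual rate against the GPS reference, then use that rate to convert raw tick counts to seconds.

```python
# Hypothetical calibration sketch: relate system-clock ticks to real time
# by measuring the clock's true rate against a high-precision reference.

def calibrate(ticks_elapsed, gps_seconds_elapsed):
    """Estimate the clock's actual frequency in ticks per second."""
    return ticks_elapsed / gps_seconds_elapsed

def ticks_to_seconds(ticks, actual_hz):
    """Convert a raw tick count to real time using the calibrated rate."""
    return ticks / actual_hz

# A nominally 1 MHz clock that actually ran 1,000,050 ticks in one
# GPS-measured second:
hz = calibrate(1_000_050, 1.0)
print(ticks_to_seconds(2_000_100, hz))  # 2.0 seconds of real time
```

Without the calibration step, the same 2,000,100 ticks read off the nominal 1 MHz rate would be misreported as 2.0001 s, and that error would accumulate over a long-running test.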

  14. Experimental Setup • Did not follow the architecture laid out earlier • Differences: • Separate stimulus generator system not used • Source not mentioned, and only relative drift and skew statistics mentioned • The stimulus generator was itself tested under various conditions • Varying scheduling class and load from other processes • Sample media player tested as well, though less thoroughly • Compare internal statistics (measured by the running program) and externally gathered data from CHAOS

  15. Results: Internal

  16. Results: External

  17. Results: Observations • No intra-stream audio asynchrony detected internally by the sample media player • Compare to externally observed values [Charts: internally vs. externally observed asynchrony]

  18. Results: Observations • Media player assumes the audio thread has correct timing and synchronizes everything else off of it • Internal measurement is application-dependent • Underscores the importance of external observation [Charts: internally vs. externally observed asynchrony]

  19. Results: Observations • Internally measured media player A/V synchronization is poor • Far too poor for the needs of stereo synchronization (~20 µs) • Suggests that, at least as implemented here, handling each stream with an independent thread is not a good idea [Charts: internally vs. externally observed asynchrony]

  20. Results: Observations • Notable increase in intra-stream video skew and jitter detected internally by the test generator when rendering to the framebuffer instead of the NTSC device • Because the NTSC device’s write time was hardcoded into the application • Emphasizes the importance of display algorithms that can take execution time into account [Charts: NTSC device vs. framebuffer rendering]

  21. Results: Observations • Externally measured intra-stream video jitter for the test generator is worse than internally measured [Charts: internal vs. external measurement]

  22. Results: Observations • Externally measured values don’t show an increase in jitter when the test generator is moved from the real-time scheduling class to the timesharing class, although internal measurements do • Std. dev. of jitter is 9.6 ms about the mean • The NTSC encoder outputs frames with a period of 17.6 ms, and an arriving toggle command is deferred until the next period • So the NTSC encoder’s clock masks internal jitter values
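The masking effect described above is a quantization to the encoder's frame period, which can be illustrated in a few lines (a sketch only; the function name and the example command times are invented, with only the 17.6 ms period taken from the slide):

```python
# Hypothetical illustration: a toggle command arriving anywhere inside a
# frame period is deferred to the next period boundary, so output times
# are quantized to multiples of the period and sub-period jitter vanishes.
import math

PERIOD_MS = 17.6  # NTSC encoder frame period cited on the slide

def output_time(command_time_ms):
    """Defer the command to the next frame-period boundary."""
    return math.ceil(command_time_ms / PERIOD_MS) * PERIOD_MS

# Two commands jittered 6 ms apart within the same period emerge together:
print(output_time(20.0))  # 35.2
print(output_time(26.0))  # 35.2
```

Any scheduling jitter smaller than the 17.6 ms period is therefore invisible at the encoder's output, which is why the external measurements show no change across scheduling classes.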

  23. Conclusion • Internal application measurement is not enough to gauge synchronization performance effectively • Depends on the implementation, and on what the application actually attempts to measure • The application doesn’t always have information regarding output timing • E.g. writing to a framebuffer vs. when the frame is actually displayed • External measurement is needed • The test system correctly reflected the actual performance of the tested applications

  24. Criticisms • Experimental setup didn’t match the architecture description • Caused some confusion when reading the graphs • Test generator was tested thoroughly, but the media player was not • Std. deviation was given for statistics, but no mean • Questions on the generality of the system • Geared towards live feeds and stored (e.g. disk) media • How to determine arrival time for network-oriented media players?
