
Results from the User Survey






Presentation Transcript


  1. Results from the User Survey
  Tobias Hossfeld
  WG2 TF "Crowdsourcing"
  https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd

  2. Summary
  • Apps of interest (in decreasing order)
   • Adaptive streaming, 2D video, VoIP, images, web browsing
  • Interests and contributions by VIPs
   • High interest: design of test, statistical analysis
   • Very few VIPs: implementation and execution
   • Time concerns by VIPs; limited resources possible for doing tests
  • Focus on existing (lab and crowdsourcing) data sets
   • Discussion in phone conference, see doodle link
  • Crowdsourcing data available / VIPs available for all steps (test design, implementation, execution, analysis)
   • Web browsing: data available (Martin, Lea, Toni, Tobias)
   • VoIP and image: VIPs available for all steps
  • Lab results available / VIPs available
   • Available: images, 2D video
   • VoIP: will be executed
   • Web browsing: only implementation missing

  3. Which application? Your contribution?
  • Application: Which application do you prefer for the JOC?
  • Crowdsourcing: How will you contribute to the crowdsourcing experiment?
  • Laboratory: How will you contribute to the lab experiment?

  4. Detailed View: Contributions
  • Of interest and contributions: images, web browsing, VoIP, adaptive streaming, 2D video
  • Out of scope, too many problems: file storage, radio streaming, other

  5. Research Questions
  • Develop and apply methodology
  • Derive QoE model for selected app
  • Analyze impact of crowdsourcing environment
  • Provide database with crowdsourcing results
  • Do results obtained on crowdsourcing platforms differ from results of a test using a dedicated panel, and in which sense? What does this imply for QoE assessment and the tools we (can) use?
  • Do results using crowdsourcing differ from results from controlled lab experiments (and, in a next step, possibly even more realistic home environments)?
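The comparison question above (do crowdsourcing ratings differ from lab or panel ratings?) is typically answered by a statistical test on the collected opinion scores. As a minimal sketch, the snippet below compares two sets of MOS ratings with Welch's t-test using only the Python standard library; the rating values are purely illustrative and not taken from the survey.

```python
# Hypothetical sketch: compare MOS ratings from a crowdsourcing panel
# and a lab panel with Welch's t-test (unequal variances assumed).
# All rating values below are made up for illustration.
import math
import statistics

def welch_t(sample_a, sample_b):
    """Return Welch's t statistic and degrees of freedom for two samples."""
    ma, mb = statistics.mean(sample_a), statistics.mean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    se2 = va / na + vb / nb                      # squared standard error of the mean difference
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

crowd_mos = [3, 4, 4, 2, 5, 3, 4, 3, 2, 4]   # illustrative ratings on a 1..5 scale
lab_mos   = [4, 4, 5, 3, 5, 4, 4, 5, 3, 4]

t, df = welch_t(crowd_mos, lab_mos)
print(f"t = {t:.2f}, df = {df:.1f}")
```

In practice one would also account for rater screening and repeated measures; this sketch only shows the basic mean comparison that such an analysis starts from.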

  6. Individual Comments
  • Contributions
   • We are currently developing two applications of possible interest: one is a VoIP client within WebRTC, and the other is an inter-media synchronization application similar to HbbTV (broadcast/broadband TV), which we also hope to deploy on a WebRTC platform. Both are still at the development stage, so perhaps I am being a bit optimistic!
   • I can do data analysis for the first two options as well.
   • The chosen app, and the link to ongoing activities, will determine how much I can be involved. Depending on the app, I could also link up to the iMinds panel.
  • Problems
   • Heterogeneous, possibly time-variant user connections
   • I am a complete novice with everything related to the implementation, but I see some methodological challenges related to cross-device use (and how this links up to QoE) of, e.g., personal cloud storage apps and adaptive video streaming.
   • No time

  7. Next Steps
  • Summary via mailing list / wiki
   • Your interests
   • Your contributions
  • Collective decision within TF
   • Collect info from all TF participants
   • Google survey form
   • Online meeting
  • Decision on concrete application, platform, research questions
   • Allocation of work for VIPs
   • Rough time schedule
  • Time plan
   • 15/03/2013: summary
   • 22/03/2013: Google survey sent around
   • 31/03/2013: TF fills in survey
   • Mid-April: online meeting

  8. Summary from Breakout Session

  9. Contributions by Participants
  • Design of user test
   • Source contents for tests (video, images): Marcus Barkowsky
   • Test design: Lucjan Janowski, Katrien de Moor, Miguel Rios-Quintero
  • Implementation of test
   • Lab test for image quality: Judith Redi, Filippo Mazza
   • Lab test for VoIP: Christian Hoene
   • Online test for VoIP: Christian Hoene
   • Crowdsourcing test for images/video: Christian Keimel
   • Crowdsourcing test for HTTP video streaming: Andreas Sackl, Michael Seufert, Tobias Hossfeld
   • Crowdsourcing platform with screen quality measurements: Bruno Gardlo
   • Crowdsourcing micro-task platform: Babak Naderi, Tim Polzehl
  • Execution of test
   • Crowdsourcing: Tobias Hossfeld
   • Online panel: Katrien de Moor
   • Lab test for image quality: Judith Redi, Filippo Mazza
   • Lab test for VoIP: Christian Hoene
   • Crowdsourcing test for images/video: Christian Keimel
   • Crowdsourcing test for HTTP video streaming: Andreas Sackl, Michael Seufert, Tobias Hossfeld
  • Data analysis
   • Identification of key influence factors and modeling: Tobias Hossfeld, Judith Redi
   • Comparison between crowdsourcing and lab: Tobias Hossfeld, Marcus Barkowsky, Katrien de Moor, Martin Varela, Lea Skorin-Kapov
   • Model validation: Marcus Barkowsky

  10. Summary of Interests

  11. Summary of Contributions

  12. Input collected before the Novi Sad meeting

  13. Interest in Joint Qualinet Experiment
  • Filippo Mazza, Patrick le Callet, Marcus Barkowsky: comparison of lab and crowdsourcing experiments considering model validation; directly related to the "Validation TF"
  • Martin Varela, Lea Skorin-Kapov: impact of the crowdsourcing environment on user results and QoE models, e.g. incentives and payments, on the example of Web QoE; directly related to the "Web/Cloud TF"
  • Christian Keimel: impact of the crowdsourcing environment on user results and QoE models, e.g. demographics
  • Andreas Sackl, Michael Seufert: impact of content/consistency questions on QoE ratings, e.g. for HTTP video streaming; directly related to the "Web/Cloud TF"
  • Bruno Gardlo: currently working on an improved crowdsourcing platform with screen quality measurement etc.; interest in incentive design and gamification; the platform may be used for the experiment, e.g. for videos or images
  • Katrien de Moor: contribution to questionnaire development/refinement and/or setting up a comparative lab test
  • Babak Naderi: development of a crowdsourcing micro-task platform which may be used for the joint experiment; incentives, data quality control, effects of platform-dependent and user-dependent factors on motivation and data quality
