
CHI 2009








  1. CHI 2009 Review Process • Changes • area-based submissions and sub-committees

  2. Guiding Principles • Key Problems • scale (~1000+ papers + notes) • difficult to assign papers to true experts • research / evaluation methods differ for different types of work • narrowing of field • referee / decision bias to conventional research and methods

  3. Guiding Principles • Key Problems • scale (~1000+ papers + notes) • difficult to assign papers to true experts • research / evaluation methods differ for different types of work • narrowing of field • Goal: expert reviews • paper reviewed by true experts in the subject matter • Goal: area-appropriate evaluation • ACs know area • set criteria that minimize randomness, bias and errors • Goal: encourage growth of field into new areas

  4. Method • Area-based sub-committees • papers / notes integrated • authors submit to an area • Each area • expert area coordinator • chooses area ACs • area ACs choose reviewers • area-specific criteria • runs as mini-PC process • All areas • same process (similar to existing one)
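Read as a data flow, the method above says: authors pick one area, each area runs its own mini-PC under its own criteria, and all areas share the same overall process. Below is a minimal sketch of that structure; every name in it (Area, Submission, the example committee and reviewer pool) is invented for illustration, not actual CHI tooling.

```python
# Minimal sketch of the area-based submission flow described above.
# All names and data are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Area:
    name: str
    coordinator: str            # expert area coordinator
    acs: list[str]              # area ACs chosen by the coordinator
    criteria: str               # area-specific review criteria

@dataclass
class Submission:
    title: str
    area: str                   # authors submit directly to one area
    reviewers: list[str] = field(default_factory=list)

def assign_reviewers(sub: Submission, area: Area, pool: dict[str, list[str]]) -> None:
    """Area ACs pick reviewers from their own expert pool (the mini-PC step)."""
    sub.reviewers = pool[area.name][:3]   # e.g., three expert reviews per paper

# Example: one hypothetical area, running the same process as every other area.
cscw = Area("CSCW", coordinator="coord-1", acs=["ac-1", "ac-2"],
            criteria="area-appropriate evaluation, not lab-study-only")
paper = Submission("Awareness in distributed teams", area="CSCW")
assign_reviewers(paper, cscw, pool={"CSCW": ["rev-1", "rev-2", "rev-3", "rev-4"]})
print(paper.reviewers)          # ['rev-1', 'rev-2', 'rev-3']
```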

  5. Hurdles • Defining areas • fair coverage for all CHI interests • load balance between committees • Defining area criteria • avoiding biases and restrictions • Papers that don’t fit area • covers > 1 area • caught ‘between the cracks’ • marginalization of niche areas • new unanticipated area • cross-discipline work

  6. Choosing Areas • CHI sub-disciplines • UIST, CSCW, DIS, … • By Method • how we do our research vs. what our research is about • By Contribution Type • Statistical clusters • best coverage of prior submissions / accepted papers
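The "statistical clusters" option would derive areas from the submission corpus itself. One way such clusters could be computed is sketched below, assuming scikit-learn and a handful of invented keyword strings standing in for prior submissions; the actual analysis behind the CHI cluster percentages is not described here.

```python
# Hypothetical illustration: cluster prior submissions by keyword vocabulary.
# Documents, cluster count, and method are invented; CHI's analysis may differ.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

prior_submissions = [
    "pen input latency pointing fitts",          # invented keyword lists
    "awareness groupware distributed teams",
    "mobile context-aware sensing ubicomp",
    "ethnography field study domestic home",
    "toolkit prototyping end-user programming",
    "usability testing quantitative metrics",
]

vectors = TfidfVectorizer().fit_transform(prior_submissions)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)
for doc, label in zip(prior_submissions, labels):
    print(label, doc)   # papers sharing vocabulary land in the same cluster
```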

  7. By Contribution Type • Letters to the community • Normative (refine what we have) – empirical • improved gizmos • improved human processes • Breakthrough (new idea) – ‘aha’ rationale • gizmo • design concept • Understand what we have – field/in the wild… • who we are today (social use) • what we could be tomorrow (probes, etc.) • Process – how we do what we do • design process • research methods (evaluations, sampling, etc.) • Systems – how and why we build things

  8. By CHI Sub-disciplines • Interactions (CHI conventional) • CSCW / Group / ECSCW • UIST / IUI / SOUPS (usable privacy) • Ubicomp / TEI / Mobile / HRI / Pervasive • DIS / DUX • UPA • Human Factors / ASSETS / Universal Usability • Info Vis / New Media (NIME) • Creativity & Cognition • Digital Libraries / SIGDOC / Hypertext / Info Retrieval • … bridges to other fields … • Issue: encourages existing silos?

  9. By Method • Usable techniques – Quantitative • Usable techniques – Qualitative • Usable techniques – Rational / requirements analysis • Understanding users and contexts • Tools and Infrastructure • Creativity and Vision • Usability Science • Theory and analytics • Issue: focuses on ‘how’, not on ‘what’ • Did they do it well vs. what did they do?

  10. By Statistical Clusters • Input techniques 8% • CSCW 11% • Pervasive 10% • Multimodal 12% • Systems 8% • Design 14% • Applications 9% • Methods 18% • Media 6% • Issue: labels as ‘catch-alls’, not well understood…
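A back-of-envelope check of what these clusters would imply for committee load, assuming the ~1000-submission scale mentioned under Key Problems (the counts are illustrative only):

```python
# Expected committee load per cluster at the ~1000-submission scale from the
# "Key Problems" slide. Percentages are from the slide above; totals are
# illustrative, not actual CHI 2009 submission counts.
clusters = {
    "Input techniques": 8, "CSCW": 11, "Pervasive": 10, "Multimodal": 12,
    "Systems": 8, "Design": 14, "Applications": 9, "Methods": 18, "Media": 6,
}
total_submissions = 1000
for name, pct in clusters.items():
    print(f"{name:17s} ~{total_submissions * pct // 100} papers")
print(f"coverage: {sum(clusters.values())}%")  # 96% -- some papers fall outside
```

Note the load-balance tension this exposes: Methods at 18% would carry three times the papers of Media at 6%, which is exactly the "equal division of labour" constraint raised on the next slide.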

  11. Constraints • Authors • clearly understandable criteria • clearly phrased acceptance criteria • Logistics • appropriate for committees • appropriate for expertise selection • equal division of labour (Cdn/UK) • equal division of labor (US) • Coverage / Values • broadens / grows community, areas, across disciplines • does not overly narrow into silos • does not disenfranchise (perception)

  12. By Statistical Clusters • Input techniques 8%

  13. By Statistical Clusters • CSCW 11% • Computer Supported Cooperative Work • Social Computing and Social Networking • Computer-Mediated Communication

  14. By Statistical Clusters • Pervasive 10% • Handheld Devices and Mobile Computing • Ubiquitous Computing / Smart Environments • Tangible UIs • Context-Aware Computing • Robots

  15. By Statistical Clusters • Multimodal 12% • Perceptual & Vision-based UIs • Multimedia UIs • Tangible UIs • Pen-based UIs • Tactile & Haptic UIs • Speech and Auditory I/O • 3D Interaction • Multi-modal interfaces • Augmented Reality and Tangible

  16. By Statistical Clusters • Systems 8% • Security & privacy • Agents and Intelligent Systems • Development Tools / Toolkits • Prototyping • End-user programming • Software architecture and engineering • Virtual Reality • Internationalization / Localization • Animation

  17. By Statistical Clusters • Design 14% • User-Centered design • User Interface design • User Experience design • Design Methods (Design Rationale) • Interaction design • Multidisciplinary design • Concept design • Product design • Service design • Visual design…

  18. By Statistical Clusters • Input techniques 8% • CSCW 11% • Pervasive 10% • Multimodal 12% • Systems 8% • Design 14% • Applications 9% • Methods 18% • Media 6%

  19. By Statistical Clusters • Applications 9% • E-Learning and Education • Home / Domestic • Virtual Community • Health Care • Office and Workplace • Elderly • Creativity Support • Children • E-commerce • Business Strategy

  20. By Statistical Clusters • Methods 18% • User Studies • Usability Research • Usability Testing and Evaluation • Empirical Methods, Quantitative • Empirical Methods, Qualitative • Ethnography • User and Cognitive models • Analysis Methods (e.g. Task) • Performance Metrics

  21. By Statistical Clusters • Media 6% • Visualization • World Wide Web and Hypermedia • Entertainment • Video Content / Communications

  22. Constraints • Authors • clearly understandable criteria • clearly phrased acceptance criteria • Logistics • appropriate for committees • appropriate for expertise selection • equal division of labour (Cdn/UK) • equal division of labor (US) • Coverage / Values • broadens / grows community, areas, across disciplines • does not overly narrow into silos • does not disenfranchise (perception)
