observing remote collaboration: Gestures Matter. Presented by: Marie Silverstrim
20-some years ago… Music. 20-some years ago… Movies. 20-some years ago… Politics. 20-some years ago… Technology.
Honey, I shrunk the world. • The need to be in the same place as your team is diminishing • Teams can begin working remotely • New issues of remote collaboration come into the picture
So. Gestures… Where’d that come from? • Sara Bly • Specializes in qualitative, observational techniques • Managed the Collaborative Systems Group at Xerox PARC, 1986–1994 • Currently owns Sara Bly Consulting in Portland, OR • MA in Education from Stanford Univ • PhD in Comp. Sci. from UC Davis • BA in Math from Univ of Kansas
“A Use of Drawing Surfaces in Different Collaborative Settings”, Bly 1988 http://tiny.cc/d36bmw • The study consisted of three types of two-person drawing sessions • Face to Face • Separated with media link • Separated with telephone link • Observations suggest that the process may be as important as the final product • Pulls designers together • Contributes to verbal communication • Increases focus • Further research needed Hypothesize: How do you think the actions of gesturing, writing, and drawing will vary across the interaction modes: face to face, media link, and telephone?
And further research was done. • John C. Tang • Worked with Bly at PARC while completing his PhD • BS, MS, and PhD from the Design Division of Mech. Eng. at Stanford Univ. • Interested in research for: • Supporting distributed collaboration • Understanding how users integrate across devices to accomplish tasks • At Microsoft Research since 2008 • Previously at IBM Research and Sun Microsystems, Inc.
Who was paying attention? • Saul Greenberg • Full professor of Comp. Sci. at Univ of Calgary, Canada • Education: • BS Microbiology and Immunology • Diploma of Education, 1978 • MS Comp. Sci., Univ of Calgary, 1984 • PhD Comp. Sci., Univ of Calgary, 1989 • Leads GroupLab at U of Calgary • Focuses on HCI, UbiComp, CSCW • Research has a common thread of ‘situated interaction’
So let’s see it… (Dr. Greenberg’s version) http://www.youtube.com/watch?v=yeqAa6nbz1w 0:52 – 3:45
But first… what is Collaboration? • Definition: “To work jointly with others or together especially in an intellectual endeavor” • merriam-webster.com/dictionary/collaborate • Is that a good definition? Is it missing anything? What about Cooperation? Is that the same as Collaboration? (Denning & Yaholkovsky 2008)
Collaboration. What it is. (Denning & Yaholkovsky 2008)
Collaboration. And what it’s not. (Denning & Yaholkovsky 2008)
Back to Tang’s Research: Basics What is today’s “natural” environment for collaboration? • Goal: • Understand collaborative work • Identify resources and hindrances • Guide development of tools accordingly • Setup: • Videotape groups of 3–4 people working on a design project around paper or a whiteboard • Analyze the video for actions and the functions of those actions • Make observations
J. C. Tang’s Research: Key Data • Functions to collaborate: • Store information = note taking • Express ideas = sharing • Mediate interaction = moderating • Actions performed: • Listing = no spatial importance • Drawing = sketching, may include text • Gesturing = context matters
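As a rough illustration of how the action/function coding above could be tallied, here is a minimal sketch with made-up observation data (Python; not Tang's actual records or analysis tooling):

```python
from collections import Counter

# Hypothetical coded observations: (action, function) pairs, as a video coder
# might log them. Categories follow the action/function scheme listed above.
ACTIONS = ("list", "draw", "gesture")
FUNCTIONS = ("store", "express", "mediate")

observations = [
    ("list", "store"), ("draw", "express"), ("gesture", "mediate"),
    ("draw", "store"), ("gesture", "express"), ("draw", "express"),
]

counts = Counter(observations)

def breakdown(keys, axis):
    """Print the percentage split of the other axis within each key."""
    for key in keys:
        pairs = {pair: n for pair, n in counts.items() if pair[axis] == key}
        total = sum(pairs.values())
        if total == 0:
            continue
        shares = {pair[1 - axis]: round(100 * n / total) for pair, n in pairs.items()}
        print(f"  {key}: {shares}")

print("Functions served by each action:")
breakdown(ACTIONS, axis=0)
print("Actions used for each function:")
breakdown(FUNCTIONS, axis=1)
```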
J. C. Tang’s Research: Analysis How is the breakdown of actions and functions? Is this enough? Too much? How do you define purposeful gestures? Is a yawn boredom or tiredness? Is clicking a pen anger, thinking, or nothing at all? How would adding qualitative measurements change the observations? “Useful” versus “non-useful”? How does the background of the group affect the gestures used? - Thanks to Lula
J. C. Tang’s Research: Results • Functions served by each action: • List: 100% store • Draw: 20% store, 60% express, 20% mediate • Gesture: 40% express, 60% mediate • Actions used for each function: • Store: 66% list, 33% draw • Express: 66% draw, 33% gesture • Mediate: 33% draw, 66% gesture
J. C. Tang’s Research: Findings I caught a fish THIS big! Is no implementation of gestures and drawing better than a badly done implementation? • Gestures • Used for mediation and expression of ideas • All gestures need context • But there’s no record; you can’t refer back to a gesture • Can be used for storage via mimicking • Drawing • Mostly for expression of ideas, also storage & mediation • Creating artifacts is problematic (gestures aren’t recorded) • Timing matters: the creation and discussion of graphics • Context still matters – who created the mark?
J. C. Tang’s Research: Findings How many people is too many people for collaboration? Does it change for F2F vs Virtual? • Fluent Intermingling of Actions • For research purposes, actions and functions are separated, but in reality a single function often involves multiple actions, with rapid switching between them • Mediation • Turn taking is cued by proximity and gestures • Parallel working decreases bottlenecks but also decreases shared focus • Context sets public and private space • Spatial Orientation • Sets ownership and audience, public vs private space
J. C. Tang’s Research: Takeaways Was this news then? Is this news now? Do we use these findings well in development today? • Observed needs for a collaborative workspace • Hand gestures, in context, convey information • The process of creating drawings conveys information beyond the final product • Proximity and simultaneous access to the work space are key to mediating the group’s interaction • Fluently intermix actions and functions in the work space • Need a common view AND spatial orientation in the work space
J. C. Tang’s Research: Why do we care? • Laid foundations for the role of human factors in shared visual spaces • Need to include insights into human factors in system development • Provided an early example of robust groupware development • Begin with observations of the actual process • Look beyond the activity and end result
In Short. • “Design based on [human factors] should transcend technology in terms of the collaborative experience it offers the group.” Saul Greenberg
How does this other paper relate to group collaboration? • Research follows principles demonstrated by J. Tang, highlighted by S. Greenberg: • Need to include insights into human factors in system development • Begin with observations of the actual process • Performed iterative research incorporating feedback from the Deaf community • Gestures aren’t just helpful for a deaf person using an interpreter; they are essential
University of North Carolina team • Dorian Miller: 2009 PhD in CS, UNC; dissertation on this topic; now works at IBM RTP • Karl Gyllstrom: PhD in CS, UNC; now a CS postdoc researcher at Katholieke Universiteit Leuven, Belgium • David Stotts: CS Prof, UNC; advisor for Miller’s PhD; interests: CSCW, collaborative UI • James Culp: MBA, UNC
Semi-transparent Video Interfaces to Assist Deaf Persons in Meetings • When a person focuses on one source of information, they miss other sources • Particularly bad for deaf persons watching an interpreter • Investigates overlays of semi-transparent video • Useful for note-taking • Useful for video conferencing • Useful for aiding communication between hearing and deaf persons without an interpreter
How bad could it be? http://www.youtube.com/watch?v=20HjI9UNPKA0 4:25 - 5:25 • Think about: • Listen for detailed information and imagine taking notes • How much would you lose by looking away?
Group Meeting Scenario problem • Multiple areas to focus on: interpreter, presenter, slides, taking notes • Peripheral vision can help, but not if the locations are too far apart • A deaf person is cut off from the conversation when taking notes
Group Meeting Interface Solution Any other good or bad bits that should be highlighted? Features that could highlight or help fix those items? • Projects the interpreter onto the laptop • Can take notes while still following the conversation • Good bits • User can set size and transparency, and can record • Independent: can point the camera wherever needed • Transparent or opaque • Bad bits • Tradeoff between screen real estate and video detail • Recording takes up a large amount of memory • Eye contact between interpreter and deaf person is lost
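At the frame level the overlay idea is straightforward; below is a minimal alpha-blending sketch (OpenCV, hypothetical file names, illustrative only; this is not the authors' implementation):

```python
import cv2

# Illustrative inputs: a screen/slide capture and one frame of the interpreter
# video. Both file names are hypothetical placeholders.
screen = cv2.imread("screen.png")
interpreter = cv2.imread("interpreter_frame.png")

# Resize the video frame to match the screen, then blend. `alpha` mirrors the
# user-adjustable transparency described above (0.0 = invisible, 1.0 = opaque).
interpreter = cv2.resize(interpreter, (screen.shape[1], screen.shape[0]))
alpha = 0.4
blended = cv2.addWeighted(interpreter, alpha, screen, 1.0 - alpha, 0.0)

cv2.imwrite("blended.png", blended)
```

In a live system this blending would run on every frame of the video feed; the static-image version only shows the compositing step.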
Personal Meeting Scenario problem • Switching between an application and the video display is problematic • A video display is great for two deaf persons to communicate through sign language • When they switch to working in an application, all communication is inferred from mouse movement
Personal Meeting Interface Solution Same question: Any other good or bad bits that should be highlighted? Features that could highlight or help fix those items? • FaceTop application developed at UNC • Images of both parties are overlaid on top of the desktop • Good: • Can watch the other person’s gestures while using an application • Bad: • Details still lost in semi-transparency, washed-out look • Gesturing away from the camera is imprecise; handled by showing both video feeds
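A window-level way to approximate the same effect is to make the entire video window semi-transparent and keep it above the working application; a minimal tkinter sketch (the '-alpha' window attribute is platform-dependent, and this is only an illustration, not FaceTop itself):

```python
import tkinter as tk

root = tk.Tk()
root.title("Overlay sketch")
root.attributes("-topmost", True)   # keep the window above the application
root.attributes("-alpha", 0.5)      # 50% window transparency where supported
tk.Label(root, text="video feed would render here").pack(padx=40, pady=40)
root.mainloop()
```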
Feedback from Deaf community • Iteratively tested with various persons and communities • Good: • Useful in situations where a student taking their own notes helps, like math classes • Useful in situations with multiple deaf persons meeting and trying to transcribe minutes • Useful in situations with copy signing • Useful for video conferencing; users could comfortably and successfully complete the task
Feedback from Deaf community • Iteratively tested with various persons and communities • Needs improvement: • Semi-transparent video diffuses the application view and strains the eyes • Locating the cameras is problematic; consider a central camera with streamed output as a room feature • Interpreters and deaf persons give each other feedback through eye contact, which is lost with video • Had to fingerspell more slowly to allow for video frame rates
Why do we care? • Gestures are lost in space for everyone • Better collaborative groupware • Utilizing existing technology in new ways • Is there a point of information overload? - Thanks to several folks • Is this a crutch or a tool? Does changing the audience change the answer? - Thanks to Darya • Extension to online capabilities? - Thanks to Jen and Doug
The End. Thank you!