
An Exploratory Study of How Developers Seek, Relate, and Collect Relevant Information During Software Maintenance Tasks. Written by: Andrew J. Ko, Brad A. Myers, Michael J. Coblenz, and Htet Aung. IEEE Transactions on Software Engineering, December 2006



Presentation Transcript


  1. An Exploratory Study of How Developers Seek, Relate, and Collect Relevant Information During Software Maintenance Tasks Written by: Andrew J. Ko, Brad A. Myers, Michael J. Coblenz, and Htet Aung IEEE Transactions on Software Engineering, December 2006 Presented by: Bennie Lewis and Volodymyr Pryyma School of Electrical Engineering and Computer Science University of Central Florida

  2. Overview • Introduction • Related Work • Methodology • Results • Limitations • Implications for Theory • Implications for Tools • Conclusions

  3. Introduction • Useful software life cycle • Brief period of development • Longer period of maintenance • Development tools • Help to understand software • Help to efficiently modify source code

  4. Introduction • Task context • Parts of artifacts relevant to developer working on maintenance task • Tools based on task context • Task context representation • Manually building task context • Inferring relevant task context

  5. Introduction • Exploratory study • 10 developers • Eclipse 2.1.2 • Small, unfamiliar system • Goals • Discover developers’ strategies • How are development environments related to these strategies?

  6. Related Work • Strategies for understanding programs • Questions regarding • Structure • Intent • Behavior • Other studies • Collaboration • Representation

  7. Methodology • 10 developers in laboratory setting • 5 maintenance tasks • 70-minute time limit • Interruptions • Nature of tasks • 3 debugging tasks • 2 enhancement tasks

  8. Methodology • Participants • 31 were considered initially • Narrowed down to 10 participants • Undergraduate and graduate students • Based on self-reported survey • 7 Experts • 3 Above average

  9. Methodology • Paint application • Java Swing application • Allowed users to draw, erase, clear, and undo • Not as complex as programs used for other studies

  10. Methodology • Tasks • Invented complaints/requests • No task names were given • Brief description of requirements

  11. Methodology • Tools and instrumentation • Eclipse 2.1.2 IDE • Project with 9 source files • Allowed to use other tools • Experimenter to clarify/answer questions

  12. Methodology • Interruptions • Came from a server • Produced audible alert • Required developer’s full attention • Designed to mimic real world interruptions • Came every two and a half to three and a half minutes

  13. Methodology • Procedure • Initial survey • 5 tasks in 70 minutes • $10 per correct task completion • Lose $2 per ignored interruption • Once done, experimenter checked work and paid developers accordingly

  14. Results • Division of labor • Spent more time on difficult tasks • A fifth of their time reading code • A fifth of their time editing code • A quarter of their time on textual searches • A tenth of their time testing

  15. Results • Task structure • Developers’ activities were not independent • Had to find code first • Then determine what to edit • And then edit the code • Developers introduced errors that had to be fixed, interleaving the sequence of activities

  16. Results • Forming perceptions of relevance • Involved several levels of engagement • Based on different types of cues • Common observation • Look at the file name • If relevant, open the file • Look for code identifiers, comments, etc. • If a relevant method is found, examine it more closely

  17. Results • Representing task contexts • Package explorer/file tabs • 2 developers used bookmarks • Windows task bar for running applications • Web browser for documentation • 2 developers used paper notes

  18. Results • Impact of interruptions • Only had an impact if two conditions were met • 1) an important task was not externalized at the time the interruption was acknowledged • 2) developers could not recall the task state after returning from the interruption • Developers always completed edits before acknowledging interruptions

  19. Limitations • Subjective and based on the authors’ interpretations • Program size not representative • Small sample size • Inexperienced subjects

  20. Implications for Theory • There is a need for a new model of program understanding • Authors’ model • Searching, relating, and collecting relevant information • Forming perceptions of relevance

  21. Implications for Tools • Navigating between dependencies was a major concern • Took on average 19 minutes • Many repeated navigations • Could be reduced if more helpful tools were available

  22. Conclusions • Developers must locate and understand relevant portions of code before making a change • This study inspired a new model • Based on searching, relating, and collecting • Reliance on environmental cues • Study identifies a need for more streamlined environments

  23. Paper Critique • Pros • Poses interesting questions • Introduces a model for identifying relevant tasks • Cons • The participants were students • Self-reported survey to gauge expertise • Relatively simple program • Questionable methods • Interruptions every 2.5 to 3.5 minutes • Judging relevance by developer behavior

  24. Questions?
