Distributed Experiment: Planned Design for Collaboration and Analysis

This presentation outlines a framework for a distributed experiment aimed at collecting useful information about inspections across different organizations. The goal is to understand and incorporate local differences and to identify factors influencing effectiveness. Collaboration options at different levels are discussed, ranging from an industry survey to controlled experiments. Known issues and limitations are also highlighted.


Presentation Transcript


  1. ISERN Distributed Experiment: Planned Design
     Forrest Shull, Fraunhofer Center for Experimental Software Engineering - Maryland

  2. Goals for this Meeting
     • Present a framework for the distributed experiment
       • The broad strokes that are not negotiable!
     • Decide whether there is sufficient consensus to go forward with a distributed experiment
       • If yes,
         • Gauge the interest in specific hypotheses that can be explored using this framework.
         • Decide on a specific set of hypotheses for study and sign up participants.

  3. Distributed Design Goals
     • Collect useful information about inspections for researchers and project managers
       • Each local experiment is interesting
       • “Meta-analysis” across a critical mass of experiments yields another level of information (see the sketch after this slide)
     • Understand and incorporate local differences
     • Identify factors influencing effectiveness across
       • Different types of organizations
       • Different types of development environments
     • Allow several options for collaboration
       • Allowing organizations to match required effort with anticipated benefits
       • All of which support the larger analysis
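
The following is a minimal sketch of what such a cross-experiment “meta-analysis” could look like, assuming each participating organization reports an effect estimate (e.g., the change in defect detection rate due to the new technique) and the variance of that estimate. The organization names and numbers are hypothetical placeholders, not results from the planned studies.

    # Minimal sketch: fixed-effect (inverse-variance) pooling of per-site results.
    # The sites, effect estimates, and variances below are hypothetical placeholders.
    sites = {
        # site: (estimated effect, variance of the estimate)
        "org_A": (0.12, 0.004),
        "org_B": (0.08, 0.006),
        "org_C": (0.15, 0.009),
    }

    # Inverse-variance weighting: more precise local results count more.
    weights = {name: 1.0 / var for name, (_, var) in sites.items()}
    pooled_effect = sum(w * sites[name][0] for name, w in weights.items()) / sum(weights.values())
    pooled_variance = 1.0 / sum(weights.values())

    print(f"Pooled effect: {pooled_effect:.3f} (variance {pooled_variance:.4f})")

A random-effects model could be substituted when between-organization variation is expected to be large, which is likely given the design goal of incorporating local differences.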

  4. Collaboration Options: Level 1
     • Industry Survey
       • Descriptive analysis, no benchmark.
     • Estimated effort for organization:
       • One respondent, 4 to 6 staff hours.
     • Output: A report characterizing the types of responding organizations and their inspection processes.
     • Benefits:
       • Characterization of the state of the practice
       • Compilation of best practices
       • Understanding of problem areas that could be improved
       • [Measure distance between “standards” and local processes, local documents.]

  5. Collaboration Options: Level 1
     • Industry Survey: Process
       • Identify/contact a respondent.
       • The respondent answers for his/her development team.
       • Responses are aggregated across all respondents and reflected in the final report, along with a lessons-learned analysis.

  6. Collaboration Options: Level 2
     • Benchmark an inspection technique
       • Pick some inspection variant and study it across different contexts.
     • Estimated effort for organization:
       • Contact person and >10 participants for 1 day, 65 to 85 staff hours total.
     • Benefits:
       • Provides training to participants in a beneficial inspection approach.
       • Some understanding of potential benefit due to process change in the local organization.
       • [Some understanding of expected improvement due to the variant process in different contexts.]

  7. Collaboration Options: Level 2
     • Benchmarking a technique: Process
       • Complete survey steps
       • Produce local documents (version control, seeding, …)
       • Training
       • Inspection using the new technique
       • Analysis: Compare inspection results to the organization’s historical baseline (see the sketch after this slide)
         • Qualitative or quantitative
       • Feedback to participants
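
As a quantitative illustration of the baseline comparison above, the following sketch compares a defect detection rate measured with the new technique against an organization’s historical rate. All of the numbers (historical rate, defect counts, effort) are hypothetical.

    # Minimal sketch of a quantitative comparison against a historical baseline.
    # All numbers below are hypothetical placeholders for one organization.
    historical_detection_rate = 0.45   # fraction of known defects found by the usual process
    defects_found_new = 31             # defects found in the benchmarked inspection
    defects_known = 55                 # defects known (e.g., from version/defect history)
    effort_staff_hours = 72            # total effort spent on the benchmarked inspection

    new_detection_rate = defects_found_new / defects_known
    print(f"Detection rate: new technique {new_detection_rate:.2f} "
          f"vs. historical baseline {historical_detection_rate:.2f}")
    print(f"Defects found per staff hour: {defects_found_new / effort_staff_hours:.2f}")

Where only a qualitative baseline exists, the same comparison can instead be reported as structured reviewer feedback.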

  8. Collaboration Options: Level 3
     • Controlled experiment
       • Get very accurate data about the improvement due to some inspection variant
     • Estimated effort for organization:
       • 1 contact person, 8-10 hours
       • >10 participants for 1.5 days
     • Benefits:
       • Provides training to participants in a beneficial inspection approach.
       • Accurate understanding of potential benefit due to process change in the local organization.
       • [“Meta-analysis” across all organizations.]

  9. Collaboration Options: Level 3
     • Controlled experiment: Process
       • Complete survey steps
       • Produce 2 local documents
       • Inspection of “local” document using the usual technique
       • Training
       • Inspection of “baseline” document using the new technique
       • Inspection of “local” document using the new technique
       • Analysis (see the sketch after this slide):
         • Compare results on local documents for new vs. usual inspection techniques
         • Compare results for new technique on local vs. baseline documents
       • Feedback to participants
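
A minimal sketch of the two comparisons in the analysis step, assuming each inspector’s result is summarized as the fraction of known defects found. The per-inspector values are hypothetical.

    # Minimal sketch of the two Level 3 comparisons.
    # Each list holds one hypothetical detection rate per inspector.
    from statistics import mean

    usual_on_local  = [0.30, 0.35, 0.28, 0.40, 0.33]  # usual technique, local document
    new_on_local    = [0.42, 0.44, 0.37, 0.51, 0.45]  # new technique, local document
    new_on_baseline = [0.48, 0.50, 0.41, 0.55, 0.47]  # new technique, baseline document

    # Comparison 1: new vs. usual technique on the local document.
    print(f"Local document: usual {mean(usual_on_local):.2f} vs. new {mean(new_on_local):.2f}")

    # Comparison 2: new technique on the local vs. baseline document
    # (indicates how much the local context/document affects the technique).
    print(f"New technique: local {mean(new_on_local):.2f} vs. baseline {mean(new_on_baseline):.2f}")

In practice, a paired statistical test and an effect-size estimate would accompany these means, since the same inspectors contribute to both sides of each comparison.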

  10. Collaboration Options: Shadow Experiments
      • For industrial collaborators who want to lower risk before investing in Level 2 or Level 3.
      • Must make a representative, “anonymized” local document available.
      • ISERN will match the industrial partner to a university course that would be willing to perform a first run of the same study.
      • The industrial experiment will only occur if results from the academic environment are promising.

  11. Known Issues
      • Most organizations don’t have a common inspection process.
        • Contact points and analysis must be handled at the level of development teams.
      • Too little emphasis on accurate measures of effectiveness.
        • Some benefit can still be gained from qualitative reflections, even at the survey level.
      • No “benchmark technique” will be equally effective in all environments.
        • Level 2 and the shadow experiments give an opportunity to test the hypothesis with less commitment.
      • Seeding defects in any document is inaccurate.
        • We will emphasize using previously version-controlled documents with a defect history whenever possible, but have to rely on an organization’s own assessment of what the important issues are.

  12. Goals for the technique to be benchmarked
      • Require evidence of its likely benefit
      • Should be widely applicable (maximize the potential pool of participants)
      • Some version should be teachable in a “reasonable” time
      • Should be of genuine interest to the target audience
      • Results should be actionable

  13. Some options for the technique to be benchmarked
      • PBR (requirements) / OORTs (UML)
        • Training materials (including webcasts) with a history of reuse are available to help keep training consistent
        • PBR: reusable ICSE tutorial
      • Inspection meeting approaches
        • Video/teleconference; document circulation; NetMeeting; DOORS-based protocol
        • Alternatives are commonly available and require minimal training
