
Myths in Software Engineering: Debunking Over-simplifications and Unconfirmed Claims

This article explores common myths and misconceptions in software engineering, providing evidence-based insights and debunking over-simplified and unconfirmed claims. It highlights the importance of critical thinking and collecting valid evidence to make informed decisions.



Presentation Transcript


  1. Myths, Over-simplifications and Unconfirmed Claims in Software Engineering (Magne Jørgensen)

  2. The paper clip was invented by a Norwegian

  3. Short men are more aggressive (The Napoleon complex)

  4. Most communication is non-verbal

  5. There was/is a software crisis. From page 13 of the Standish Group's 1994 CHAOS Report: “We then called and mailed a number of confidential surveys to a random sample of top IT executives, asking them to share failure stories.”

  6. It is difficult to remove a myth. Years of critique may, however, have had a small effect (from the 2013 CHAOS Report):

  7. 45% of features of “traditional projects” are never used (source: The Standish Group, XP 2002). No one seems to know (and the Standish Group does not tell) anything about this study! Why do so many believe (and use) this non-interpretable, non-validated claim? They benefit from it (the agile community) + confirmation bias (we all know at least one instance that fits the claim).

  8. 14% of waterfall and 42% of agile projects are successful (source: The Standish Group, The Chaos Manifesto 2012). Successful = “on time, on budget, and with specified functionality”. Can you spot a serious error in this comparison?
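
One hypothetical way such an aggregate comparison can mislead (my own illustration, not necessarily the error the slide has in mind) is confounding: if agile is chosen more often for small projects, and small projects succeed more often regardless of method, the aggregate success rates will differ even when the method has no effect. The sketch below simulates exactly that; all numbers are invented.

```python
# Hypothetical illustration (invented numbers): an aggregate success-rate gap
# can appear even if the development method has no effect at all, e.g. when
# agile is chosen more often for small projects and small projects succeed
# more often regardless of method.
import random

random.seed(1)

def simulate(n_projects=100_000):
    results = {"agile": [0, 0], "waterfall": [0, 0]}  # [successes, total]
    for _ in range(n_projects):
        small = random.random() < 0.5                      # half the projects are small
        method = "agile" if random.random() < (0.8 if small else 0.2) else "waterfall"
        p_success = 0.5 if small else 0.1                  # success depends on size only
        success = random.random() < p_success
        results[method][0] += success
        results[method][1] += 1
    for method, (successes, total) in results.items():
        print(f"{method:9s}: {successes / total:.0%} successful")

simulate()  # agile shows a much higher aggregate success rate despite no method effect
```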

  9. There is an increase in the cost of removing errors in later phases. http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20100036670.pdf

  10. The magic number 7 ± 2

  11. The number one in the stink parade …

  12. The ease of creating myths: Are risk-willing or risk-averse developers better? Study design: research evidence + self-generated argument. Question: “Based on your experience, do you think that risk-willing programmers are better than risk-averse programmers?” Scale: 1 (totally agree) – 5 (no difference) – 10 (totally disagree). Results: Group A: initially 3.3, after debriefing 3.5, two weeks later 3.5. Group B: initially 5.4, after debriefing 5.0, two weeks later 4.9. Neutral group: 5.0.

  13. How to convince software engineers. Context: Assume that a test course provider claims: “The course will lead to a substantial increase in test efficiency and quality for most participants.” How likely do you think this claim is true, given [reduced explanation of options]: A: No other information. B: Supporting claims from reference clients. C: Supporting study conducted by the course provider. D: Convincing explanation (but no empirical evidence). E: Supporting experience from a colleague (it helped him). F: Supporting scientific study completed at a renowned university. G: Own experience (it helped me).

  14. Results: not very promising for evidence-based software engineering. A: No other information. B: Support from reference clients. C: Supporting study conducted by the course provider. D: Convincing explanation (but no empirical evidence). E: Supporting experience from a colleague (it helped him). F: Supporting scientific study completed at a renowned university. G: Own experience (it helped me).

  15. “I see it when I believe it” vs “I believe it when I see it” • Design: • Data sets with randomly set performance data comparing “traditional” and “agile” methods. • Survey of each developer’s belief in agile methods. • Question: How much do you, based on the data set, agree with: “Use of agile methods has caused better performance when looking at the combination of productivity and user satisfaction.” • Result: • Prior belief in agile determined what they saw in the random data.
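
A minimal sketch of how such randomly set data sets could be generated (my own reading of the design; the field names and value ranges are assumptions, not taken from the study):

```python
# Sketch of the study design as described above (assumed details): data sets
# where the "traditional" vs "agile" performance numbers are pure noise, so
# any pattern a reader finds reflects prior belief rather than the data.
import random

random.seed(42)

def random_dataset(n_projects=10):
    """One data set with randomly set productivity and user-satisfaction scores."""
    return [
        {
            "method": random.choice(["traditional", "agile"]),
            "productivity": round(random.uniform(1, 10), 1),
            "user_satisfaction": round(random.uniform(1, 10), 1),
        }
        for _ in range(n_projects)
    ]

for row in random_dataset():
    print(row)
# Since the numbers are random, agreement with "agile caused better performance"
# should not depend on the data set; in the study, prior belief decided the answer.
```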

  16. When making a decision or choice, the world is no longer the same (Dan Gilbert) http://www.ted.com/talks/lang/eng/dan_gilbert_asks_why_are_we_happy.html

  17. Elements of more evidence-based practice (myth busting in software engineering) • Find out what is meant by the claim • Is it possible to falsify/evaluate the claim? • How general is it meant to be? • Prepare for evaluation of the validity of the claim and the argumentation • Be aware of your confirmation bias tendency, especially when you agree • Decide what you would consider valid evidence to support the claim • Check for vested interests • Collect/create and evaluate evidence • Research-based evidence • Practice-based evidence • Locally created, experimental evidence • Synthesize the evidence and conclude (if possible)

  18. Evidence-based software engineering (EBSE) The main steps of EBSE are as follows: • Convert a relevant problem or need for information into an answerable question. • Search the literature and practice-based experience for the best available evidence to answer the question. • Critically appraise the evidence for its validity, impact, and applicability. • Integrate the appraised evidence with practical experience and the client's values and circumstances to make decisions about practice. • Evaluate performance in comparison with previous performance and seek ways to improve it. NB: EBSE is NOT the same as a systematic literature review! Tore Dybå, Barbara Kitchenham and Magne Jørgensen, Evidence-based Software Engineering for Practitioners, IEEE Software, Vol. 22, No. 1, Jan-Feb 2005.

  19. Experience from teaching evidence-based software engineering

  20. Background • University courses with students and with software professionals • 30 hours of lecturing • Supervised project report writing with a self-selected “problem formulation” • Short courses/seminars with software professionals • In total, I’ve taught EBSE to several hundred students and software professionals.

  21. Example of what they learn: a structure for analyzing claims, argumentation and evidence (Toulmin’s model): Claim, Data, Warrant, Backing, Qualifier, Reservation.
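
To make the structure concrete, here is a minimal sketch (my own illustration, not from the slides) of Toulmin's elements as a small record, filled in with a hypothetical software engineering claim:

```python
# A minimal sketch (my illustration, not from the slides) of Toulmin's model
# of argumentation, applied to a hypothetical claim about unit testing.
from dataclasses import dataclass

@dataclass
class ToulminArgument:
    claim: str        # the statement being argued for
    data: str         # the evidence (grounds) offered in support
    warrant: str      # why the data is taken to support the claim
    backing: str      # support for the warrant itself
    qualifier: str    # how strongly the claim is asserted
    reservation: str  # conditions under which the claim would not hold

example = ToulminArgument(
    claim="Unit testing reduces the number of defects reaching production",
    data="On our last three projects, production defect reports dropped after we introduced unit tests",
    warrant="Fewer reported defects after the change suggests the tests caught problems earlier",
    backing="Several published studies report similar reductions",
    qualifier="Probably, for projects similar to ours",
    reservation="Unless other changes made at the same time (e.g. code review) explain the drop",
)

for field, value in vars(example).items():
    print(f"{field:12s}: {value}")
```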

  22. Any positive impact from teaching Evidence-Based Software Engineering? • Positive impact: • They get more critical towards claims made by gurus and in textbooks, especially when there are vested interests • They get better at asking what claims really mean • They discover the strength of collecting and evaluating evidence • Challenges: • Evaluating evidence and argumentation is new to them. Only a few (20%?) develop good evaluation skills, although all seem to improve through the courses. • Unlike evidence-based medicine, EBSE cannot rely on research-based evidence only. They need to be trained in collecting practice-based evidence.

  23. Challenge: Lack of evidence • Empirical research is frequently sparse and not very relevant! • … and the quality is not impressive either. • A need for training in collecting practice-based evidence • Very much the same method as for evaluating research • A need for training in designing and collecting your own evidence • This is newer ground, but the benefit from training in experimental methods, the use of representative “trials”, etc. is very promising. • In the long term, this type of evidence (local, highly relevant, collected in the context where it will be used) is perhaps the one with the most potential to change software practices. • NB: The problems with this kind of learning/improvement in other frameworks (CMMI, PSP, PDCA, GQM, …) do not justify much optimism. An advantage of EBSE may be its focus on posing meaningful problems and on combining different sources of evidence.
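
As one example of what locally created, experimental evidence could look like in practice (my own hedged illustration, not from the talk; all numbers and practice names are invented), a team could compare two practices on its own data and use a simple permutation test to judge whether the observed difference is plausibly just noise:

```python
# A hedged illustration (invented data, not from the talk) of locally created,
# experimental evidence: compare task completion times under two practices
# collected on your own team, then use a permutation test to see whether the
# observed difference could plausibly be noise.
import random

random.seed(0)

hours_practice_a = [4.5, 6.0, 5.2, 7.1, 4.8, 5.5]   # hypothetical: e.g. pair programming
hours_practice_b = [6.2, 7.4, 5.9, 8.0, 6.8, 7.2]   # hypothetical: e.g. solo work + review

def mean(xs):
    return sum(xs) / len(xs)

observed_diff = mean(hours_practice_b) - mean(hours_practice_a)

pooled = hours_practice_a + hours_practice_b
n_a = len(hours_practice_a)
n_permutations = 10_000
more_extreme = 0
for _ in range(n_permutations):
    random.shuffle(pooled)                           # relabel the observations at random
    diff = mean(pooled[n_a:]) - mean(pooled[:n_a])
    if abs(diff) >= abs(observed_diff):
        more_extreme += 1

print(f"Observed difference: {observed_diff:.2f} hours")
print(f"Permutation-test p-value: {more_extreme / n_permutations:.3f}")
```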

  24. What I tried to say … • The software engineering discipline is filled with myths, fashion, over-simplifications and unconfirmed claims. • Being aware of the mechanisms may make us more rational. • The software engineering industry wastes much energy because it does not collect evidence on what works and what does not. • Example: We change from one technology/method to the next without knowing much about whether it will improve anything. • As researchers/university employees we should try to: • Train software professionals in EBSE, including how to become myth busters • Provide (synthesized) evidence that is relevant and of good quality • Write books and guidelines that are evidence-based

  25. Coffee dehydrates your body

  26. Bonus material

  27. Impact of vested interest in software engineering studies (a brief distraction): 35 published comparisons of regression-based and analogy-based effort estimation models

  28. [Chart: effect size per study, where effect size = MMRE_analogy – MMRE_regression; axis labels: “Regression-based cost estimation model better” and “Analogy-based models better”.]

  29. Development of own analogy-based model (vested interests). [Chart: effect size per study, where effect size = MMRE_analogy – MMRE_regression.]
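
For readers unfamiliar with the metric behind these charts, here is a minimal sketch (with made-up numbers) of how MMRE and the effect size above can be computed:

```python
# A minimal sketch (made-up numbers) of MMRE, the accuracy metric behind the
# effect size above: MMRE = mean of |actual - estimate| / actual (lower is
# better), and effect size = MMRE_analogy - MMRE_regression.
def mmre(actuals, estimates):
    """Mean Magnitude of Relative Error."""
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

actual_effort = [120, 340, 80, 510, 260]           # hypothetical actual effort (person-hours)
regression_estimates = [100, 300, 95, 450, 280]    # hypothetical regression-model estimates
analogy_estimates = [140, 280, 70, 600, 230]       # hypothetical analogy-model estimates

mmre_regression = mmre(actual_effort, regression_estimates)
mmre_analogy = mmre(actual_effort, analogy_estimates)

print(f"MMRE regression: {mmre_regression:.2f}")
print(f"MMRE analogy:    {mmre_analogy:.2f}")
# A positive effect size means the analogy model had the larger errors
# (i.e. the regression-based model did better on this data).
print(f"Effect size:     {mmre_analogy - mmre_regression:.2f}")
```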
