Chapter 28



  1. Chapter 28 Methodology Comparisons

  2. Learning Objectives • Comparison issues • Framework for comparing methodologies • General framework – NIMSAD • Specific framework – seven elements • Methodology comparison

  3. Reasons for comparing • Academic reason – to better understand the nature of methodologies (i.e. their features, objectives, philosophies, etc.) in order to perform classifications and to improve future information systems development. • Practical reason – to choose a methodology, part of one, or a number of methodologies for a particular application or group of applications, or for an organization as a whole.

  4. ‘Ideal-type’ criteria for assessing methodologies • Rules • Total coverage • Understanding the information resource • Documentation standards • Separation of logical and physical designs • Validity of design • Early change • Inter-stage communication • Effective problem analysis • Planning and control

  5. Criteria for assessing methodologies (cont.) • Performance evaluation • Increased productivity • Improved quality • Visibility of the product • Teachable • Information systems boundary • Designing for change • Effective communication • Simplicity • Ongoing relevance • Automated development aids

  6. Criteria for assessing methodologies (cont.) • Consideration of user goals and objectives • Participation • Relevance to practitioner • Relevance to application • The integration of the technical and the non-technical systems • Scan for opportunity • Separation of analysis and design
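
A checklist such as the one spread over the last three slides is easiest to apply when it is recorded explicitly. Below is a minimal Python sketch, assuming a simple 0/1/2 rating scale; the scale, the subset of criteria chosen, and the example ratings are illustrative assumptions, not part of the framework itself.

```python
# Illustrative only: one way to record how a candidate methodology rates
# against a subset of the 'ideal-type' criteria listed above. The 0/1/2
# rating scale and the example ratings are assumptions.

CRITERIA = [
    "Rules",
    "Total coverage",
    "Documentation standards",
    "Separation of logical and physical designs",
    "Designing for change",
    "Participation",
    "Automated development aids",
]

def assess(ratings):
    """Return the proportion of the maximum possible score achieved."""
    scored = [ratings.get(criterion, 0) for criterion in CRITERIA]
    return sum(scored) / (2 * len(CRITERIA))

# Hypothetical ratings for a candidate methodology (0 = no, 1 = partial, 2 = yes).
example = {
    "Rules": 2,
    "Total coverage": 1,
    "Documentation standards": 2,
    "Designing for change": 1,
    "Participation": 0,
}
print(f"Score against assessed criteria: {assess(example):.0%}")
```

Criteria left out of the ratings simply score zero, which makes gaps in an evaluation visible rather than silently ignored.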

  7. Bjørn-Andersen (1984) checklist • What research paradigms/perspectives form the foundation for the methodology? • What are the underlying value systems? • In what context is the methodology useful? • To what extent is modification enhanced or even possible? • Do communication and documentation operate in the users’ dialect, whether expert or not? • Does transferability exist? • Is the societal environment dealt with, including possible conflicts? • Is user participation ‘really’ encouraged or supported?

  8. NIMSAD (Jayaratna, 1994) Normative Information Model-based Systems Analysis and Design. Based on the models and epistemology of systems thinking, it has three elements: • The ‘problem situation’ (the methodology context) • The intended problem solver (the methodology user) • The problem-solving process (the methodology)

  9. NIMSAD (Jayaratna, 1994) The ‘problem situation’ – questions to consider: • The clients and their understandings, experiences, and commitments • The problem owners, their concerns, and problems • The situation that the methodology users are facing, and its diagnosis as structured or unstructured • The ways in which the methodology might help the situation • The culture and politics of the situation, including the risks associated with using the methodology in various circumstances • The views of the stakeholders concerning ‘reality’, for example, is there an objective real-world problem out there, and what is the relationship of this to the methodology’s philosophical assumptions about reality? • The dominant perceptions in the problem situation; for example, are they technical, political, social, and so on?

  10. NIMSAD (Jayaratna, 1994) The intended problem solver – questions to consider: • The methodology users’ beliefs, values, and ethical positions • The relationship of the above to those assumed or demanded by the methodology • The way in which mismatches between the two may be handled or reconciled; for example, can the methodology processes be changed, and does the methodology help to achieve this? • The methodology users’ philosophical views, for example, science- or systems-based • The methodology users’ experience, skills, and motives, in relation to those required by the methodology

  11. NIMSAD (Jayaratna, 1994) The problem-solving process (the methodology) – questions to consider: • Understanding the situation of concern and the setting of boundaries • Performing the diagnosis; for example, the models, tools, and techniques, the levels at which they operate, how they interact, what information they capture, what is not captured, what happens when people disagree, and so on • Defining the prognosis outline; for example, the desired states, what constitutes legitimate states, and the handling of conflict • Defining problems • Deriving notional systems; for example, are they derived at all, and if so how, and in what ways are they recorded? • Performing conceptual/logical and physical design, including who is involved and what the implications are; for example, does it lead to systems improvements or systems innovation? • Implementing the design; for example, how does it handle alternatives and how does it ensure success?
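
Taken together, slides 9–11 amount to an evaluation template: for each of NIMSAD’s three elements there is a set of questions an evaluator should be able to answer before committing to a methodology. A minimal Python sketch of that template follows; the abbreviated question texts are paraphrases of the slides, and the helper function is an illustrative assumption.

```python
# Illustrative sketch: NIMSAD's three elements as an evaluation template.
# The question texts are shortened paraphrases of the slides above.
NIMSAD_QUESTIONS = {
    "problem situation": [
        "Who are the clients, and what are their understandings and commitments?",
        "Is the situation diagnosed as structured or unstructured?",
        "What are the culture, politics, and risks of using the methodology here?",
    ],
    "intended problem solver": [
        "What are the methodology users' beliefs, values, and ethical positions?",
        "How are mismatches with the methodology's assumptions reconciled?",
        "Do the users' skills and experience match what the methodology requires?",
    ],
    "problem-solving process": [
        "How are boundaries set and the diagnosis performed?",
        "How are notional systems derived and recorded?",
        "How are design and implementation handled, and how is success ensured?",
    ],
}

def unanswered(answers):
    """Return the template questions that have not yet been answered."""
    return [q for qs in NIMSAD_QUESTIONS.values() for q in qs if q not in answers]

print(len(unanswered({})))  # -> 9 questions still to answer
```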

  12. Contingency (Davis, 1982) • System complexity or ill-structuredness. • The state of flux of the system. • The user component of the system, for example, the number of people affected and their skill level. • The level of skill and experience of the analysts.

  13. Types of situation (Avison and Taylor, 1996) • Well-structured problem situations with a well-defined problem and clear requirements. A traditional SDLC approach might be regarded as appropriate in this class of situation. • As above but with unclear requirements. A data, process modelling, or prototyping approach is suggested as appropriate here. • Unstructured problem situations with unclear objectives. A soft systems approach would be appropriate in this situation. • High user-interaction systems. A people-focused approach, for example, ETHICS, would be appropriate here. • Very unclear situations, where a contingency approach, such as Multiview, is suggested.
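
The classification above is, in effect, a mapping from situation type to a suggested class of approach. A minimal Python sketch of that mapping is shown below; the situation labels and the helper function are ad hoc illustrations, not terminology from Avison and Taylor.

```python
# Illustrative sketch of the Avison and Taylor (1996) classification as a
# lookup from situation type to the class of approach suggested above.
# The dictionary keys are ad hoc labels for the five situations.
SUGGESTED_APPROACH = {
    "well-structured, clear requirements": "traditional SDLC approach",
    "well-structured, unclear requirements": "data/process modelling or prototyping approach",
    "unstructured, unclear objectives": "soft systems approach (e.g. SSM)",
    "high user interaction": "people-focused approach (e.g. ETHICS)",
    "very unclear situation": "contingency approach (e.g. Multiview)",
}

def suggest(situation: str) -> str:
    """Return the class of approach suggested for a given situation type."""
    return SUGGESTED_APPROACH.get(situation, "no suggestion for this situation type")

print(suggest("unstructured, unclear objectives"))  # -> soft systems approach (e.g. SSM)
```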

  14. Comparisons 1. What aspects of the development process does the methodology cover? 2. What overall framework or model does it utilize? For example, is it systems development life cycle based, linear, or spiral? 3. What representations, abstractions, and models are employed? 4. What tools and techniques are used? 5. Is the content of the methodology well defined and described, such that a developer can understand and follow it? This applies not only to the stages and tasks but also to the philosophy and objectives of the methodology. 6. What is the focus of the methodology? Is it, for example, people-, data-, process-, and/or problem-oriented? Does it address organizational and strategic issues? 7. How are the results at each stage expressed?

  15. Comparisons 8. What situations and types of application is it suited to? 9. Does it aim to be scientific, behavioural, systemic, or whatever? 10. Is a computer solution assumed? What other assumptions are made? 11. Who plays what roles? Does it assume professional developers, require a methodology facilitator, involve users and managers, and, if so, how and to what degree? 12. What particular skills are required of the participants? 13. How are conflicting views and findings handled? 14. What control features does it provide and how is success evaluated? 15. What claims does it make as to benefits? How are these claims substantiated? 16. What are the underlying philosophical assumptions of the methodology? What makes it a legitimate approach?

  16. Framework for comparing methodologies – seven elements: Philosophy (paradigm, objectives, domain, target), Model, Techniques and tools, Scope, Outputs, Practice (background, user base, participants), and Product. • Philosophy: the set of principles that underlie a methodology, with four distinguishing factors. • Paradigm: a specific way of thinking about problems – the science paradigm (reductionism, repeatability, and the refutation of hypotheses) vs. the systems paradigm (concern for the whole picture, emergent properties, and the interrelationships between parts of the whole). • Objectives: for example, is the aim to develop a computerized information system, or to discover whether there is a need for a computerized system at all?

  17. Framework for comparing methodologies – seven elements (cont.) Philosophy’s four distinguishing factors (cont.): • Domain: the situations that methodologies address – narrow problems vs. wider organization-level problems; individual problems vs. many interrelated problems viewed as a whole. • Target: the applicability of the methodology – general-purpose vs. application- or organization-specific. Model: the abstraction and representation of the important factors of the information system or the organization – verbal; analytic or mathematical; iconic, pictorial, or schematic; or simulation. Most methodologies are iconic, pictorial, or schematic, and models are used as a means of communication, particularly between users and analysts.

  18. Framework for comparing methodologies – seven elements (cont.) Techniques and tools: Are techniques and tools essential to the methodology? Which techniques/tools are used in the methodology? Examples: rich pictures, root definitions, etc.; entity modelling and normalization; DFDs, decision tables, decision trees, entity life cycles; OO design and UML; various organizational and people techniques. Scope: an indication of the stages of the systems development life cycle (SDLC) that the methodology covers – recall the SDLC stages: feasibility study, system investigation, systems analysis, systems design, implementation, and review and maintenance.

  19. Framework for comparing methodologies – seven elements (cont.) Outputs: what the methodology produces – the deliverables at each stage and the nature of the final deliverable (a decision about whether to computerize a process, an analysis specification, or a working implementation of a system). Product: what purchasers actually get for their money – software, written documentation, an agreed number of hours of training or consultancy, a telephone help service. Practice: the methodology’s background (academic vs. commercial), its user base (numbers and types of users), the participants and skill levels required, an assessment of difficulties and problems encountered, perceptions of success and failure, the degree to which the methodology is altered by users according to the requirements of the situation, and differences between the theory and the practice of the methodology.
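
To make the seven elements concrete, here is a minimal sketch of a record type for capturing a methodology’s profile under this framework. The field types, default values, and the example entry are assumptions made for illustration, filled in loosely from the comparison slides that follow.

```python
# Illustrative sketch: a record type for a methodology's profile under the
# seven-element framework. Field names follow the elements above; the types
# and the example values are assumptions.
from dataclasses import dataclass, field

@dataclass
class Philosophy:
    paradigm: str        # "science" or "systems"
    objectives: str
    domain: str
    target: str

@dataclass
class MethodologyProfile:
    name: str
    philosophy: Philosophy
    model: str = ""                                # verbal, analytic, iconic/schematic, simulation
    techniques_and_tools: list = field(default_factory=list)
    scope: list = field(default_factory=list)      # SDLC stages covered
    outputs: str = ""                              # deliverables / nature of final deliverable
    product: str = ""                              # what purchasers actually get
    practice: str = ""                             # background, user base, participants

# Hypothetical entry, loosely based on the comparison slides that follow:
ssm = MethodologyProfile(
    name="SSM",
    philosophy=Philosophy(
        paradigm="systems",
        objectives="aims at much more than developing an IT system",
        domain="general organizational problem situations",
        target="'messy' human activity situations",
    ),
    techniques_and_tools=["rich pictures", "root definitions"],
    product="some academic papers",
    practice="academic origin; involves both business and technical people",
)
print(ssm.philosophy.paradigm)  # -> systems
```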

  20. Methodology comparison: philosophy • Paradigm: SSM adopts the systems paradigm (it avoids a reductionist approach); STRADIS, YSM, IE, SSADM, Merise, RUP, etc. adopt the science paradigm. • Objectives: STRADIS, YSM, IE, SSADM, Merise, RUP, etc. have the clear objective of developing computerized information systems; SSM aims at much more than developing an IT system. • Domain: IE and SSM address the general planning, organization, and strategy of information and systems in the organization (IE’s first stage is information strategy planning); STRADIS, YSM, SSADM, Merise, and RUP are classified as specific problem-solving methodologies. • Target: RUP is general-purpose but not very useful for small systems; STRADIS is general-purpose, but its DFDs are not well suited to management information systems or web-based systems; SSM is more applicable in ‘messy’ human activity situations; XP is suitable for small and continuously evolving systems, whereas most methodologies (unlike XP) are designed for large systems.

  21. Methodology comparison (cont.) • Model: STRADIS primarily uses DFDs; DFDs are also used in YSM, SSADM, IE, and SSM, but they play a less significant role there than in STRADIS; SSADM, IE, Merise, and RUP integrate both processes and data. • Techniques: STRADIS is largely described in terms of its techniques; SSM does not make heavy use of techniques and tools; YSM, SSADM, and RUP specify techniques and view them as important to the methodology; IE explicitly suggests that the techniques are not a fundamental part of the methodology. • Scope: see figure 27.3 in Information Systems Development by Avison and Fitzgerald. • Product: SSADM comes with a large set of manuals; SSM comes only with some academic papers; RUP has a range of books and online specifications; some methodologies offer certification of competency for developers.

  22. Methodology comparison (cont.) • Outputs: methodologies differ significantly in the kinds of deliverables, the degree of detail in which they are specified, and how deliverables are used to measure progress and move to the next stage. • Practice: STRADIS, YSM, IE, and SSADM have a commercial origin; Merise, SSM, and RUP have an academic origin; STRADIS, YSM, IE, SSADM, Merise, and RUP assume professional technical developers; SSM involves both business and technical people.
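
The points made in slides 20–22 can be collected into a small comparison matrix. A minimal sketch follows, restricted to attributes actually stated in those slides; attributes the slides do not give are omitted rather than guessed.

```python
# Illustrative summary of a few comparison points from the preceding slides,
# keyed by methodology. Only attributes stated in the slides are included.
COMPARISON = {
    "SSM":     {"paradigm": "systems", "origin": "academic",   "participants": "business and technical people"},
    "STRADIS": {"paradigm": "science", "origin": "commercial", "participants": "professional technical developers"},
    "SSADM":   {"paradigm": "science", "origin": "commercial", "participants": "professional technical developers"},
    "IE":      {"paradigm": "science", "origin": "commercial", "participants": "professional technical developers"},
}

for name, attrs in COMPARISON.items():
    print(f"{name:8s} paradigm={attrs['paradigm']:8s} origin={attrs['origin']:11s} {attrs['participants']}")
```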

  23. Framework for analysing the underlying philosophies of methodologies (adapted from Lewis, 1994)

  24. Scope of methodologies

  25. End of Chapter 28 Thank You for Your Attention
