
IMS2805 - Systems Design and Implementation



  1. IMS2805 - Systems Design and Implementation Lecture 8 Finalising Design Specifications; Quality in Systems Development

  2. References • HOFFER, J.A., GEORGE, J.F. and VALACICH, J.S. (2002) Modern Systems Analysis and Design, 3rd ed., Prentice-Hall, New Jersey, Chaps 5, 8, 15, 16 • HOFFER, J.A., GEORGE, J.F. and VALACICH, J.S. (2005) Modern Systems Analysis and Design, 4th ed., Prentice-Hall, New Jersey, Chap 13 • WHITTEN, J.L., BENTLEY, L.D. and DITTMAN, K.C. (2001) Systems Analysis and Design Methods, 5th ed., Irwin/McGraw-Hill, New York, NY, Chaps 11, 17

  3. References • ANDERSON, P.V. (1995) Technical Writing: A Reader-Centred Approach, 3rd ed., Harcourt, Brace & Co., Fort Worth • BROCKMAN, J.R. (1990) Writing Better Computer User Documentation: From Paper to Hypertext, John Wiley and Sons, Inc., New York • ABEL, D.E. and ROUT, T.P. (1993) Defining and Specifying the Quality Attributes of Software Products, The Australian Computer Journal 25:3, pp 105-112 (particularly pp 105-108) • DAHLBOM, B. and MATHIASSEN, L. (1993) Computers in Context: The Philosophy and Practice of Systems Design, NCC Blackwell, Cambridge, Mass. (particularly pp 135-157) • ELLIOT, S. (1993) Management of Quality in Computing Systems Education, Journal of Systems Management, September 1993, pp 6-11, 41-42 (particularly pp 6-8)

  4. References • PERRY, W.E. (1992) Quality Concerns in Software Development: The Challenge is Consistency, Information Systems Journal 9:2, pp 48-52 • Standards Association of Australia (1991) Australian Standard 3563.1-1991: Software Quality Management System, Part 1: Requirements • Standards Association of Australia (1991) Australian Standard 3563.2-1991: Software Quality Management System, Part 2: Implementation Guide

  5. Finalising Design Specifications • Systems need to be developed more quickly today • The lines between analysis and design, and between design and implementation, are blurring • Traditional approaches allowed for a break between design and implementation • Other approaches, such as CASE and prototyping, have caused overlap between the two phases from Hoffer, J.A., et al. (2002). Modern Systems Analysis and Design, (3rd edition) Addison-Wesley, Reading MA, USA, Chap 15.

  6. The Process of Finalizing Design Specifications • Less costly to detect and correct errors during the design phase • Two sets of guidelines • Writing quality specification statements • The quality of the specifications themselves • Quality requirement statements • Correct • Feasible • Necessary • Prioritized • Unambiguous • Verifiable from Hoffer, J.A., et al. (2002). Modern Systems Analysis and Design, (3rd edition) Addison-Wesley, Reading MA, USA, Chap 15.

  7. The Process of Finalizing Design Specifications • Quality requirements • Completely described • Do not conflict with other requirements • Easily changed without adversely affecting other requirements • Traceable back to origin from Hoffer, J.A., et al. (2002). Modern Systems Analysis and Design, (3rd edition) Addison-Wesley, Reading MA, USA, Chap 15.

  8. Finalising design specifications • Key deliverable: A set of physical design specifications • Sample contents: • Overall system description • Interface requirements • System features • Non-functional requirements • Other requirements • Supporting diagrams and models see Hoffer et al (2002) p 502

  9. Finalising design specifications Overall system description: • Functions • Operating environment • Types of users • Constraints • Assumptions and dependencies Interface requirements • User interfaces • Hardware interfaces • Software interfaces • Communication interfaces

  10. Finalising design specifications System features • Feature 1 description…. • Feature n description Non-functional requirements • Performance • Safety • Security • Software quality • Business rules Other requirements Supporting diagrams and models

  11. Finalising design specifications Representing design specifications: • Written document with diagrams and models – traditional approach uses structure charts • Computer-based requirements/CASE tool to input and manage specifications • Working prototype to capture specifications • Combinations of the above

  12. Finalising design specifications Structure Charts • Shows how an information system is organized into a hierarchy of modules • Shows how parts of a system are related to one another • Shows breakdown of a system into programs and internal structures of programs written in third and fourth generation languages

  13. Finalising design specifications Structure Charts • Module • A self-contained component of a system, defined by a function • One single coordinating module at the root of structure chart • Single point of entry and exit • Communicate with each other by passing parameters • Data couple • A diagrammatic representation of the data exchanged between two modules in a structure chart • Flag • A diagrammatic representation of a message passed between two modules
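As an illustrative sketch only (not from the lecture), the module, data-couple and flag concepts above map naturally onto functions: each function is a module, parameters and return values are the data couples, and a boolean status passed back up is a flag. All names here are hypothetical.

```python
# Hypothetical sketch: structure-chart concepts expressed as functions.
# Each function is a "module"; parameters and return values are the
# "data couples"; the boolean status below acts as a "flag".

def get_order(raw: str):
    """Subordinate module: parse input; returns (order, valid_flag)."""
    parts = raw.split(",")
    if len(parts) != 2:
        return None, False          # flag: invalid input
    item, qty = parts[0].strip(), parts[1].strip()
    if not qty.isdigit():
        return None, False          # flag: quantity not numeric
    return {"item": item, "qty": int(qty)}, True

def price_order(order, unit_price: float) -> float:
    """Subordinate module: data couple in (order), data couple out (total)."""
    return order["qty"] * unit_price

def process_order(raw: str, unit_price: float):
    """Coordinating module at the root of the structure chart:
    single point of entry and exit, delegates to subordinates."""
    order, ok = get_order(raw)      # data couple and flag passed back up
    if not ok:
        return None
    return price_order(order, unit_price)

print(process_order("widget, 3", 2.5))   # 7.5
print(process_order("bad input", 2.5))   # None
```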

  14. Finalising design specifications Structure Charts • Pseudocode • Method used for representing the instructions inside a module • Language similar to computer programming code • Two functions • Helps analyst think in a structured way about the task a module is designed to perform • Acts as a communication tool between analyst and programmer
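The two roles of pseudocode above can be illustrated with a small, invented example: the pseudocode (kept as comments) captures the analyst's structured thinking, and the Python below it shows how directly a programmer can translate it.

```python
# Illustrative only: pseudocode for a hypothetical module, followed by
# a direct Python translation, showing how pseudocode works as a
# communication tool between analyst and programmer.
#
# Pseudocode for module COMPUTE-DISCOUNT:
#   IF order total >= 100 THEN
#       discount = 10% of order total
#   ELSE
#       discount = 0
#   END-IF
#   RETURN order total - discount

def compute_discount(order_total: float) -> float:
    """Direct translation of the pseudocode above."""
    if order_total >= 100:
        discount = 0.10 * order_total
    else:
        discount = 0.0
    return order_total - discount

print(compute_discount(200.0))  # 180.0
print(compute_discount(50.0))   # 50.0
```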

  15. Finalising design specifications Prototyping • Construction of the model of a system • Allows developers and users to • Test aspects of the overall design • Check for functionality and usability • Iterative process • Two types • Evolutionary Prototyping • Throwaway Prototyping

  16. Finalising design specifications Evolutionary Prototyping • Begin by modeling parts of the target system • If successful, evolve the rest of the system from the prototype • Prototype becomes the actual production system • Often, difficult parts of the system are prototyped first • Exception handling must be added to the prototype Throwaway Prototyping • Prototype is not preserved • Developed quickly to demonstrate an unclear aspect of the system design • A fast, easy-to-use development environment aids this approach

  17. Finalising design specifications: data design • Physical file and database design requires: • Normalised relations • Attribute definitions • Where and when data is entered, retrieved, updated and deleted • Response time and data integrity needs • Implementation technologies to be used • Design physical tables and file structures according to performance factors and constraints of the particular application
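The move from normalised relations to physical tables can be sketched with a hypothetical example (table, columns and data invented here): the attribute definitions become column types and constraints, and knowledge of where and when data is retrieved drives the choice of index for response time.

```python
# Hypothetical sketch: turning a normalised relation into a physical
# table, with an index chosen to match how the data will be retrieved
# (a performance factor named on the slide).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalised relation CUSTOMER(cust_id, name, postcode); attribute
# definitions become column types and constraints.
cur.execute("""
    CREATE TABLE customer (
        cust_id  INTEGER PRIMARY KEY,
        name     TEXT NOT NULL,
        postcode TEXT NOT NULL
    )
""")

# If the application frequently retrieves customers by postcode, an
# index on that attribute addresses the response-time requirement.
cur.execute("CREATE INDEX idx_customer_postcode ON customer(postcode)")

cur.executemany(
    "INSERT INTO customer (cust_id, name, postcode) VALUES (?, ?, ?)",
    [(1, "Smith", "3000"), (2, "Jones", "3168"), (3, "Brown", "3000")],
)
conn.commit()

rows = cur.execute(
    "SELECT name FROM customer WHERE postcode = ? ORDER BY name", ("3000",)
).fetchall()
print(rows)  # [('Brown',), ('Smith',)]
```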

  18. Quality in Systems Development (must be embedded in the process) [Diagram: within project management, the analyst's role spans initiation, analysis, design, implementation, documentation, review and maintenance, with ethics and quality applying throughout]

  19. Definitions of Quality • Degree of excellence (Oxford) • Fitness for purpose (AS1057) • includes quality of design, the degree of conformance to design, and it may include such factors as economic or perceived values • Ability to satisfy stated/implied needs (ISO8402) • Conformance to requirements (Crosby, Horch)

  20. Determining Quality • when having a meal in a restaurant • when purchasing a car • when buying a computer Quality means different things to different people, and it varies in different situations: the requirements vary immensely, and some of the success measures are very hard to quantify.

  21. Information systems: quality issues The system: • does not meet the client’s business or processing needs • does not support the client’s working methods • is unstable and unreliable • does not improve productivity • is difficult to use or requires excessive training to use • is expensive to maintain

  22. Information systems: quality issues The system: • is incomplete • is expensive to operate • has a short life span • is delivered late • costs more than budget • cannot grow with the organisation • does not produce a return on investment

  23. Error detection in systems • “Effort spent on software maintenance is greater than that spent on software development.” • “An error is typically 100 times more expensive to correct in the maintenance phase on large projects, than in the requirements phase.” Boehm, B. (1981) Software Engineering Economics

  24. Error Detection • The cost of detecting and correcting errors rises greatly during the systems development cycle [chart: cost ($) rising across initiation, analysis, design and implementation] • In addition to this is the cost to the organisation of having an incorrect system

  25. Quality Costs: the tip of the iceberg • Obvious upfront costs to the organisation: review, inspection, re-do • The hidden costs of not having quality systems: user complaints, downtime, loss of sales, re-testing, re-documenting, re-training, overtime, customer complaints, financial losses, employee turnover

  26. Quality dimensions • Correctness - Does it accurately do what is intended? • Reliability - Does it do it right every time? • Efficiency - Does it run as well as it could? • Integrity - Is it precise and unambiguous? • Usability - Is it easy to use?

  27. Quality dimensions • Maintainability - Is it easy to fix? • Testability - Is correctness easy to check and verify? • Flexibility - Is it easy to adapt and extend? • Portability - Can it be easily converted? • Reusability - Does it consist of general purpose modules? • Interoperability - Will it integrate easily with other systems?

  28. The Quality Process • The quality process involves the functions of: • Quality control - monitoring a process and eliminating causes of unsatisfactory performance • Quality assurance - planning and controlling functions required to ensure a quality product or process

  29. Implementing a Quality System • Quality must start at the top - Executive sponsorship is vital. • Everyone must be involved and motivated to realise that they have a responsibility towards the final product, its use, and its quality. • Improve job processes by using standards, and preparing better documentation (using project control methodologies). • Use a Quality Assurance group. • Use technical reviews.

  30. Standards • Levels of standards • Industry / National / International • Organisational • Industry • Capability Maturity Model (Humphrey 1989) See Whitten et al (2001) pp 76-77 • National / International • Standards Australia (AS 3563) • International Standards Organisation (ISO 9000) • Organisational • The organisation may adopt or tailor industry, national or international standards.

  31. Standards • Standards can be of varying levels of enforcement and types which will depend on the organisation and project variables. For example : • mandatory practice: must be adhered to • advisable practice: can be breached with good reason • in the form of a checklist, template, or form.

  32. Standards - Examples • Document template (form, e.g. template for these slides) • Acceptance test sign off form (form) • Screen standards (standard - mandatory practice) • Unit test process (standard - mandatory practice) • COBOL II standards (standard - mandatory practice) • Post implementation review procedure (advisory practice) Note: different organisations and projects will have different views about whether a standard is mandatory or advisable.

  33. Quality reviews • Reviews are used in the quality control and quality assurance functions. There are two main forms of review: • Quality Assurance: • management reviews • Quality Control • technical reviews

  34. Management or Project Review • Management must check the baseline for a deliverable to see that it meets the quality assurance requirements. • This may involve simply noting that a technical review has passed a particular deliverable. The manager can then be assured of quality (given that the manager has actively taken part in the development of the quality system) • The manager can then alter the project plan if necessary to allow for delays or early completion.

  35. Technical Reviews • A technical review is a structured meeting where a piece of work, which has previously been distributed to participants, is checked for errors, omissions, and conformance to standards. • All deliverables need review, otherwise how do you control quality? • The review is part of quality control and must produce a report so that the quality assurance function can be satisfied. • The report may be a checklist which indicates that the deliverable passes/fails the quality requirements for that type of deliverable. • This report is part of the baseline for the deliverable.

  36. Technical Reviews • A technical review: • is a formal meeting of a project team which is guided by an agenda and standards • allows input from many people • produces a report which is made public • requires committed participants to be responsible and accountable for their work • is educational as it clarifies standards, and highlights strengths and weaknesses of the team’s skills and knowledge • expects all participants to be responsible for the resulting quality of the artefact

  37. Technical Review Roles • review member • Know the review process • Be positive and supportive • Interested in improving the quality of the deliverable and the review process • review leader • Secure agreement on objectives and standards • Encourage input from all participants, and politely silence overly talkative participants • Facilitate agreement to ensure action on the deliverable • scribe • Record all issues clearly, accurately, and unambiguously. If not sure of a particular issue, seek clarification

  38. Review Member • Pre-requisites for review member: • Know the review process • Be positive and supportive • Interested in improving the quality of the deliverable and the review process • Preparation for review: • Read and annotate material before review (be prepared)

  39. Review Leader • Pre-requisites for review leader: • Know the review process • Know the objectives & standards for the review • Be objective & aware of the implications the review may have for certain participants • Prepare: • Agenda • Organise venue & any ancillary materials necessary • Notify participants & ensure they have review deliverable, agenda, & standards in advance.

  40. Review Leader • During the review: • Were preparations suitable? If not, you may have to re-schedule • Secure agreement on objectives and standards • Encourage input from all participants, and politely silence overly talkative participants • Facilitate agreement to ensure action on the deliverable • Ensure participants understand the ramifications of these actions • Ensure the scribe has accurately documented the review

  41. Review Leader • After the review: • Was the review successful? • Is the outcome likely to produce a better quality product? • Can the review process be improved? Can the quality of the quality system be improved? This is continuous improvement. • Do the objectives and standards for the type of deliverable need review? • Is the presenter of the deliverable capable of improving the quality of the item? • When will the deliverable be reviewed again? Who is responsible? (Are they aware of the responsibility?)

  42. Scribe • Pre-requisites for the scribe: • Know the review process • Know the objectives and standards for the current review • Be objective and aware of the implications the review may have for certain participants

  43. Scribe • During the review: • Record all issues clearly, accurately, and unambiguously. If not sure of a particular issue, seek clarification • Be sure to use clear reference pointers between the review action list (report) and the review deliverable. • Gather copies of any materials introduced by participants to support an issue. • After the review: • Check the review action list is accurate and promptly distributed to relevant people (who may not have been in the review)
