
Group 19 Lab 8



Presentation Transcript


  1. Group 19 Lab 8 Michael Vernon Szlamowicz Evan Fillman Mark Rawls Robert Pricope

  2. Overview • Report on the inspection • The original inspection schedule • The assignments given to each group member • The checklists that were used • The actual effort expended • The questions about the implementation • Written versions of the answers to the questions • The results of each inspection phase • The result of the rework effort • Management report • Summary • Questions

  3. Inspection Report • Original inspection schedule • Phase 1 • Phase 2 • Phase 3 • Assignments to each group member • Evan: Phase 1 • Mike: Phase 2 • Everyone: Phase 3

  4. Inspection Checklists: Phase 1 • Internal documentation • Internal documentation follows standard practice (clear, concise, relevant, &c.) • Any non-obvious piece of code is explained • Comment header for most methods; specify pre- and post-conditions (as appropriate) • Loop invariants where appropriate • Consistency • Documents requirements as appropriate • Comments should be understandable to an observer with minimal understanding of the system • Use C# XML documentation comments where appropriate (///) • Source-code layout • The source code is well formatted and readable • Tabs and white space are consistent (good indenting practices) • Methods are ordered logically
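As a hedged illustration of the Phase 1 commenting standard, a method header with an XML-doc comment and stated pre- and post-conditions might look like the sketch below. The RobotController class, Move method, and speed range are invented for the example, not taken from the inspected code.

```csharp
using System;

// Hypothetical example of the Phase 1 documentation standard.
public class RobotController
{
    /// <summary>
    /// Builds a movement command for the connected robot.
    /// </summary>
    /// <param name="speed">Wheel speed, expected in the range -100 to 100.</param>
    /// <returns>The command string that would be queued for transmission.</returns>
    /// <remarks>
    /// Precondition: speed is within range (enforced below).
    /// Postcondition: the returned string is a well-formed MOVE command.
    /// </remarks>
    public string Move(int speed)
    {
        // Non-obvious choice explained: validate here rather than at the
        // socket layer so the error surfaces close to the caller.
        if (speed < -100 || speed > 100)
            throw new ArgumentOutOfRangeException(nameof(speed));
        return "MOVE " + speed;
    }
}
```

The /// comments feed directly into IDE tooltips and generated documentation, which is what the checklist item is after.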

  5. Inspection Checklist: Phase 2 • Naming conventions • Naming follows the C# naming convention • Naming is meaningful • Naming is consistent • Check: namespaces, classes, methods, variables • Object-oriented practices • Encapsulation / information hiding (don't leak control where it isn't needed) • Good OO design (not a single massive class, and not 30 smaller ones either) • Code is extensible (e.g., can add the ability to connect to multiple robots without major restructuring) • Code is robust (e.g., the communication protocol can change and the effect is isolated) • Programming style • Methods of reasonable length (say < 2^16 bytes, which is Java's limit) • Uses the C# libraries to full effect (no need to reinvent the wheel) • Follows standard C# idioms (e.g., foreach instead of god-awful enumerators) • Standard practices, such as keeping data private, using constants instead of hard-wired numbers, writing a conditional statement as if (const == var), and so forth • Error handling done correctly (catch exceptions that can be handled, and handle them) • Cohesiveness and coherence are appropriate • Effective looping and control structures • A comment at the end of a long block statement, to the effect of, say, } // End of file processing • Any other deficiencies (I don't claim to have been exhaustive; indeed, style guides can be quite long and cumbersome, to say nothing of arbitrariness).
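Several of the Phase 2 style items can be shown in one short sketch: a named constant instead of a hard-wired number, idiomatic foreach, an end-of-block comment, and catching only the exceptions that can actually be handled. The class and member names here are invented for illustration, not the project's own.

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch of the Phase 2 style checklist items.
public static class StyleExamples
{
    // Named constant instead of a hard-wired number.
    private const int MaxSpeed = 100;

    public static int SumReadings(IEnumerable<int> readings)
    {
        int total = 0;
        // Idiomatic foreach instead of driving an IEnumerator by hand.
        foreach (int r in readings)
        {
            total += r;
        }
        return total;
    } // End of SumReadings

    public static bool TryParseSpeed(string text, out int speed)
    {
        // Catch only the exceptions we can actually handle here.
        try
        {
            speed = int.Parse(text);
            return speed <= MaxSpeed;
        }
        catch (FormatException)
        {
            speed = 0;
            return false;
        }
    }
}
```

The try/catch is deliberately narrow: a FormatException is recoverable at this level, while anything unexpected propagates to a caller that can deal with it.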

  6. Inspection Checklist: Phase 3 • Check requirements • Do we fulfill all of the requirements in the SRS? • Do we fulfill the requirements correctly? • GUI callbacks • All appropriate events registered and handled correctly • Socket communications • Connection established, can disconnect, handle exceptions, &c.
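The socket items on this checklist (connect, disconnect, handle exceptions) can be sketched as follows. This is a minimal illustration assuming a TCP link to the robot; the RobotConnection class and its members are hypothetical, not the project's actual code.

```csharp
using System;
using System.Net.Sockets;

// Minimal sketch of the Phase 3 socket checklist items.
public class RobotConnection
{
    private TcpClient client;

    // Establish the connection, handling the failures we can recover from.
    public bool Connect(string host, int port)
    {
        try
        {
            client = new TcpClient(host, port);
            return true;
        }
        catch (SocketException)
        {
            client = null;  // connection refused, host down, etc.
            return false;
        }
    }

    // Disconnect is safe to call even when no connection was ever made.
    public void Disconnect()
    {
        client?.Close();
        client = null;
    }

    public bool IsConnected
    {
        get { return client != null && client.Connected; }
    }
}
```

Returning false from Connect instead of letting SocketException escape keeps the GUI callbacks simple: they test a boolean rather than wrapping every connect attempt in its own try/catch.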

  7. Questions and Answers • Mike: • 1) Does the scripting system adhere to good design guidelines such as information hiding? • 2) Why do we even need a message generator class? • 3) Should we worry about parsing commands before they are added to the listBox for the script? • Robert: • 1) How is information hiding implemented when every method of the Log class is public? • 2) Where in the code is the actual file path that the StreamWriter uses? • Mark: • 1) Are all events handled? • 2) Does the GUI behave counter-intuitively to user expectations? • 3) Is the program thread-safe? • Evan: • 1) Does the IR sensor display actually help the end user? • 2) Are there any provisions, such as thread priority, to ensure that movement calls go through first? • 3) Are exceptions handled well? • Written answers to these questions were prepared during the inspection.
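Robert's two questions (public methods on the Log class, and where the StreamWriter path lives) both come down to information hiding. A hypothetical reconstruction of a Log class that answers them: public methods are fine as long as they expose behavior, not state, and the path stays a private detail. This is not the team's actual Log class.

```csharp
using System;
using System.IO;

// Hypothetical Log sketch: the file path and writer are private,
// so the public surface exposes behavior, not internal state.
public class Log
{
    private readonly string path;      // hidden: callers never see the path
    private readonly TextWriter writer;

    // The writer is injected, so the sink can be a file stream in production
    // or a StringWriter in tests, without changing any caller.
    public Log(string path, TextWriter writer)
    {
        this.path = path;
        this.writer = writer;
    }

    public void Write(string message)
    {
        // The path remains an internal detail; only output escapes.
        writer.WriteLine(message);
    }
}
```

Under this design, "every method is public" stops being a problem: the public methods are the intended interface, and nothing about the file location leaks through them.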

  8. Results of the Inspection • Phase 1 • Bad commenting • Reorder some methods • Phase 2 • Inconsistent Naming Conventions • Duplicated Functions • Generally Good Design • Phase 3 • Awesomeness

  9. Results of the Rework Effort • We added a plethora of comments in C# style • We aligned the code more logically • We refactored our method names / GUI objects to be standardized

  10. Management Report

  11. Summary • Report on the inspection • The original inspection schedule • The assignments given to each group member • The checklists that were used • The actual effort expended • The questions about the implementation • Written versions of the answers to the questions • The results of each inspection phase • The result of the rework effort • Management report • Summary • Questions

  12. Questions
