
A Designer for Generating Complex Equipment Tests


Presentation Transcript


  1. A Designer for Generating Complex Equipment Tests Instructed by: Ofir Erel Performed by: Adam Levi, Marina Skarbovsky

  2. Project Objectives • Create a GUI designer for generating complex unit tests from existing ones, which will enable: • Loading assemblies that contain unit-test classes and displaying them to the user. • Choosing test cases from the loaded list and using them as building blocks for new tests. • Specifying the execution flow of the constructed test (see the sketch below): • Execution order between inner test cases. • Serial / parallel execution of test cases. • Delays between concurrently executed tests. • Setting test parameters for tests that require them.
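
As a rough illustration of the execution-flow options above, the following C# sketch shows what a composed test with one serial step followed by two concurrently started steps (with a delay between them) could boil down to. The StepA/StepB/StepC classes and the 500 ms delay are hypothetical placeholders, not output of the tool:

    using System.Threading;

    // Hypothetical inner test cases standing in for tests loaded from an existing assembly.
    class StepA { public void Run() { /* existing test logic */ } }
    class StepB { public void Run() { /* existing test logic */ } }
    class StepC { public void Run() { /* existing test logic */ } }

    class ComposedTestSketch
    {
        // One serial step, then two steps started concurrently with a delay between them.
        public void Run()
        {
            new StepA().Run();                                   // executed first (serial)

            Thread second = new Thread(() => new StepB().Run());
            Thread third  = new Thread(() => new StepC().Run());

            second.Start();                                      // start the concurrent block
            Thread.Sleep(500);                                   // configured delay between concurrent tests
            third.Start();

            second.Join();                                       // wait for both concurrent tests to finish
            third.Join();
        }
    }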

  3. Project Objectives (cont.) • Compiling the new test into one (or both) of the following: • A DLL that is a test case – so that the constructed test can be reused as a building block in future tests. • A DLL that is a Test Fixture wrapping the test case – so that the constructed test can be run via NUnit (see the fixture sketch below). • Saving the tests for future usage. • Loading previously saved tests.
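
The following is a minimal sketch of the second output flavor: a Test Fixture DLL whose only job is to wrap the composed test case so that NUnit can run it. The class and method names are illustrative, not the tool's actual generated code:

    using NUnit.Framework;

    // Stands in for the compiled composite test case produced by the designer.
    public class ComposedTest
    {
        public void Run() { /* composed test logic */ }
    }

    // The generated "wrapper" fixture simply delegates to the reusable test-case class.
    [TestFixture]
    public class ComposedTestWrapperFixture
    {
        [Test]
        public void RunComposedTest()
        {
            new ComposedTest().Run();
        }
    }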

  4. Methodology • Microsoft’s .NET WinForms API was used to create the GUI. • We also used several free open-source GUI components to simplify the development of the docking GUI, options forms and more. • Reflection was used to enable (see the discovery sketch below): • Identifying the test classes in a loaded assembly. • Analyzing test parameter types. • Generating new test cases and assemblies from the composed tests. • Object serialization was used to enable: • Saving/loading tests for future use. • Saving an instance of a test case’s parameters to be used when executing a generated test case. • Multithreading was used to enable creating tests with concurrency. • Microsoft’s XSD Object Generator was used to create classes from XML Schemas. It was used mostly at the start of the project and was later abandoned due to its limitations.
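
As an illustration of the reflection-based discovery mentioned above, the sketch below lists the test methods found in a user-selected assembly, treating any method marked with an attribute whose type name ends with "TestAttribute" (for example NUnit's [Test]) as a test. The project's real criteria and code may differ:

    using System;
    using System.Reflection;

    static class TestDiscoverySketch
    {
        public static void ListTests(string assemblyPath)
        {
            Assembly assembly = Assembly.LoadFrom(assemblyPath);

            foreach (Type type in assembly.GetTypes())
            {
                foreach (MethodInfo method in type.GetMethods())
                {
                    foreach (object attribute in method.GetCustomAttributes(false))
                    {
                        if (attribute.GetType().Name.EndsWith("TestAttribute"))
                        {
                            Console.WriteLine("{0}.{1}", type.FullName, method.Name);

                            // Parameter types can be analyzed here to drive the
                            // parameter-editing UI for tests that require them.
                            foreach (ParameterInfo p in method.GetParameters())
                                Console.WriteLine("    parameter: {0} {1}", p.ParameterType, p.Name);
                        }
                    }
                }
            }
        }
    }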

  5. Achievements • UI: • Supports all the required functionality such as loading assemblies, building and compiling new tests, editing test parameters and properties, etc. • Additional windows, including the Grid-Output and the Options form, were added to provide a better experience for the user (although they were not required by KLA-Tencor). • A “complete package” – includes keyboard shortcuts for useful operations, a tool bar, icons and more. • All of the UI was built with Microsoft’s Visual Studio “in mind”, with the help of WeifenLuo’s DockSample, to make usage more intuitive for the “common” user. • Relatively long operations such as compilation and assembly loading are executed in separate threads to avoid UI freezes (see the sketch below).
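
A minimal sketch of the last point, assuming the standard WinForms BackgroundWorker is used to keep the UI thread free while a slow operation runs; CompileTest() and the control names are placeholders, not the project's real members:

    using System;
    using System.ComponentModel;
    using System.Windows.Forms;

    public class DesignerFormSketch : Form
    {
        private readonly BackgroundWorker compileWorker = new BackgroundWorker();
        private readonly Button compileButton = new Button { Text = "Compile", Dock = DockStyle.Top };
        private readonly Label statusLabel = new Label { Dock = DockStyle.Bottom };

        public DesignerFormSketch()
        {
            Controls.Add(compileButton);
            Controls.Add(statusLabel);

            compileButton.Click += (s, e) =>
            {
                statusLabel.Text = "Compiling...";
                compileWorker.RunWorkerAsync();                  // UI stays responsive meanwhile
            };

            compileWorker.DoWork += (s, e) => CompileTest();     // runs on a worker thread
            compileWorker.RunWorkerCompleted += (s, e) =>        // runs back on the UI thread
                statusLabel.Text = e.Error == null ? "Compilation finished" : "Compilation failed";
        }

        private static void CompileTest()
        {
            System.Threading.Thread.Sleep(2000);                 // placeholder for the slow build step
        }
    }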

  6. Achievements (cont.) • The Properties Window • A special window that enables the user to edit the properties of different objects. • Similar to the Properties window in Visual Studio. • The window wraps a PropertyGrid object (see the sketch below). • A component that is part of the .NET framework. • Provides support for displaying and editing all data types used by KLA-Tencor. • Our use of the PropertyGrid makes it easy for KLA-Tencor to extend its support for additional data types. • The PropertyGrid is widely used and well documented.
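
A minimal sketch of wrapping the framework's PropertyGrid in a Properties-style window: assigning any object to SelectedObject makes its public properties editable. The TestParameters class shown here is an illustrative stand-in, not a type from the project:

    using System.Windows.Forms;

    // Illustrative parameter holder, not a project type.
    public class TestParameters
    {
        public string EquipmentName { get; set; }
        public int TimeoutSeconds { get; set; }
        public bool AbortOnFailure { get; set; }
    }

    public class PropertiesWindowSketch : Form
    {
        private readonly PropertyGrid grid = new PropertyGrid { Dock = DockStyle.Fill };

        public PropertiesWindowSketch()
        {
            Controls.Add(grid);
        }

        // Called whenever the user selects a different object in the designer,
        // e.g. ShowObject(new TestParameters()).
        public void ShowObject(object selected)
        {
            grid.SelectedObject = selected;   // PropertyGrid handles display and editing
        }
    }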

  7. Achievements (cont.) • Compilation of new assemblies • Compilation of new assemblies is done with minimal ‘machine-generated’ code (see the sketch below). • It is far easier to debug human-written code. • Adding features to the application will require minimal (if any) intervention in the assembly-creation process. • Generating complex code might be slow. • Most of the generated test is actually an XML file that is embedded inside the generated assembly. • XML is human readable. • Several assemblies are embedded in the generated assembly to reduce dependencies. • This enables us to create a “one package” assembly.
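
A minimal sketch of one way such a build step could look, assuming the .NET CodeDOM compiler API (CSharpCodeProvider) is used. The file names, the reference to nunit.framework.dll, and the idea of attaching the XML definition as an embedded resource are all assumptions, not the project's verified build code:

    using System.CodeDom.Compiler;
    using Microsoft.CSharp;

    static class AssemblyGenerationSketch
    {
        public static CompilerResults Compile(string generatedSourceFile, string testDefinitionXml)
        {
            var provider = new CSharpCodeProvider();
            var options = new CompilerParameters
            {
                OutputAssembly = "ComposedTest.dll",
                GenerateExecutable = false
            };

            options.ReferencedAssemblies.Add("nunit.framework.dll");
            options.EmbeddedResources.Add(testDefinitionXml);   // the XML carries most of the test description

            // Only a small, human-readable source file is actually generated and compiled.
            return provider.CompileAssemblyFromFile(options, generatedSourceFile);
        }
    }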

  8. Achievements (cont.) • Saving/Loading • A test is saved as an XML file (see the sketch below). • The user can easily read and understand the generated file, if needed. • Installer • An installer was made using the free “Bytessence InstallMaker”. • Documentation provided • Users’ guide • Developers’ guide • Auto-generated code documentation • Generated using Microsoft’s “Sandcastle” generator. • Has the familiar MSDN “look and feel”.
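
A minimal sketch of XML-based save/load using the framework's XmlSerializer; ComposedTestDescription is an illustrative stand-in for whatever class actually describes a saved test:

    using System.IO;
    using System.Xml.Serialization;

    // Illustrative description of a saved test, not the project's real type.
    public class ComposedTestDescription
    {
        public string Name { get; set; }
        public string[] InnerTests { get; set; }
    }

    static class TestPersistenceSketch
    {
        public static void Save(ComposedTestDescription test, string path)
        {
            var serializer = new XmlSerializer(typeof(ComposedTestDescription));
            using (var writer = new StreamWriter(path))
                serializer.Serialize(writer, test);             // produces a human-readable XML file
        }

        public static ComposedTestDescription Load(string path)
        {
            var serializer = new XmlSerializer(typeof(ComposedTestDescription));
            using (var reader = new StreamReader(path))
                return (ComposedTestDescription)serializer.Deserialize(reader);
        }
    }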

  9. A word on the compilation process…

  10. Conclusions • When building a large system, good design is very important. • It is well worth the time investment. • Using existing (free) source code can save a lot of time and work. • Integration with the target system is crucial, even after the application has gone through thorough testing. • The course is a good way to expose students to new technologies, gain some hands-on experience with “real” software projects, get familiar with a project’s life cycle and learn how to conduct oneself in a team.
