
Human - Assisted Pattern Classification



Presentation Transcript


  1. Human-Assisted Pattern Classification Kathryn Durfee, Neville Kapoor, Matthew Muccioli, Richard Smart, David Wilkins, Amir Schur

  2. Background on IVS Research • “Interested in enhancing human-computer interaction in applications of pattern recognition where higher accuracy is required than is currently achievable by automated systems, but where there is enough time for a limited amount of human interaction.” • Combining human and machine classification generally results in higher accuracy. • Adding properly identified samples to the reference set improves the system’s ability to identify new samples accurately. Source: G. Nagy, C. Tappert, J. Zou, and S. Cha, "Combining human and machine capabilities for improved accuracy and speed in critical visual recognition tasks," NSF Proposal, 2003.

  3. Overview of Study • The purpose of the project is to conduct an experiment on the Interactive Visual System (IVS) desktop software to evaluate the effectiveness of its human-assisted interaction in the identification of flowers. • To provide recommendations and suggestions as to how to improve upon the functionality, accuracy and usage of the IVS software for the identification of flowers.

  4. Overview of Study • Our final experiment (like the preliminary testing previously done) was broken down into 3 parts… • The manual part identifies a flower by looking it up in a guidebook. • The automatic part identifies the flower using the automated IVS through the computer system alone. • The interactive part identifies the flower by having the user input certain information about the flower image; the computer then tries to recognize the flower based on its own database as well as the information from the user.

  5. Overview of Study • Preliminary testing showed that human-computer interaction yielded higher flower identification accuracy and proved more efficient than solo human identification or solo auto identification. • For the final experiment, the reference data set was expanded to 131 species (from 25), and the test images were increased to 30. • Results were expected to be similar to the preliminary test.

  6. Passiflora Incarnata

  7. Introduction to the IVS Software • Flower identification is a challenging task, especially for untrained individuals. • Some of the reasons for this are: • The large number of different flowers • Many flowers look alike • For example, the two flowers below, a Rose and a Double Impatiens, look alike.

  8. Introduction to the IVS Software • The Interactive Visual System (IVS) is a software system designed to produce accurate identification of flowers by combining the “computational power of a computer” with the pattern recognition abilities of humans. • Designed to be used on a desktop PC or handheld PDA. Smartphone versions are planned for Droid and iOS. • Users can take pictures of flowers they wish to identify and upload these pictures into the IVS system. IVS can then be used to identify the flower, using one or more of a set of feature parameters.

  9. Introduction to the IVS Software • The IVS software is designed to use a subset of flower parts when attempting flower identification: • Petals • Petal Shape • Petal Count • Petal Colors • Stamen • Stamen Color
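The feature list above can be pictured as a small record per reference sample. The sketch below is purely illustrative: the actual IVS (ivs.jar) stores its features in a data.txt file whose schema is not shown in this presentation, and every field name here is a hypothetical stand-in.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

RGB = Tuple[int, int, int]

@dataclass
class FlowerFeatures:
    """One reference sample's feature record (illustrative names only;
    the real ivs.jar data.txt schema is not documented in these slides)."""
    species: str
    petal_count: Optional[int] = None    # None = feature not extracted
    petal_color_1: Optional[RGB] = None
    petal_color_2: Optional[RGB] = None
    stamen_color: Optional[RGB] = None

# A sample with only some features filled in, as the slides allow
sample = FlowerFeatures("Passiflora Incarnata",
                        petal_count=10,
                        petal_color_1=(180, 150, 220))
```

Optional fields matter because, per the interactive workflow, a user may supply anywhere from one to all six Feature Actions.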

  10. Introduction to the IVS Software • Parts of the Flower used by the IVS Software: Petal, Stamen

  11. Introduction to the IVS Software - Feature Parameters • Example of Parameters Accepted by IVS: Petal Color 1, Petal Color 2, Stamen Color, Petal Count

  12. Introduction to the IVS Software - Additional Features • Crop - IVS can be used to crop the picture so that only the flower to be identified is processed by the software, and not the background. • Petal Outline - IVS allows the user to draw an outline around the petal shape to help narrow the range of flower choices.

  13. Coreopsis Tinctoria

  14. Introduction to the IVS Software • Auto - IVS can attempt to automatically extract the features mentioned above. • Interactive - The features, or a subset of them, can be manually input. • IVS then returns the top three closest matches for the user to choose from on the front screen. The user can scroll, however, to reveal all flowers in order of closest match.
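The slides do not specify how IVS ranks candidates; one plausible reading of "closest match" is a nearest-neighbor ranking over whichever features were supplied. The sketch below is an assumption, not the actual ivs.jar algorithm: the color metric, the petal-count weight, and all names are invented for illustration.

```python
import math

def distance(query, reference):
    """Hypothetical metric: compare only features present in both records."""
    total, used = 0.0, 0
    for key, qval in query.items():
        rval = reference.get(key)
        if qval is None or rval is None:
            continue
        if isinstance(qval, tuple):          # a color: Euclidean distance in RGB
            total += math.dist(qval, rval)
        else:                                # petal count: weighted difference
            total += abs(qval - rval) * 25   # the weight 25 is an arbitrary guess
        used += 1
    return total / used if used else float("inf")

def top_matches(query, references, k=3):
    """Rank every reference sample and return the k closest species names,
    mirroring the top-three list IVS shows on its front screen."""
    ranked = sorted(references, key=lambda r: distance(query, r["features"]))
    return [r["species"] for r in ranked[:k]]

references = [
    {"species": "Rose",             "features": {"petal_count": 5,  "petal_color_1": (255, 0, 0)}},
    {"species": "Double Impatiens", "features": {"petal_count": 5,  "petal_color_1": (250, 10, 5)}},
    {"species": "Chicory",          "features": {"petal_count": 16, "petal_color_1": (100, 120, 255)}},
]
query = {"petal_count": 5, "petal_color_1": (255, 0, 0)}
print(top_matches(query, references))  # Rose first, then the look-alike Impatiens
```

Scrolling past the top three corresponds to simply reading further down the sorted list.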

  15. Introduction to the IVS - Auto Identification • You may also click 'AUTO', which will automatically estimate the flower's predominant colors. The 'AUTO' function may be used in combination with user-generated Feature Actions. In addition, if you use the Crop Area function and then click 'AUTO', IVS will attempt to segment the flower and estimate its colors within the cropped region, possibly leading to more accurate results. • Click 'IDENTIFY' to get images and names of the top three closest species matches. Press and hold an individual image match to view it in the main display along with its full species name.

  16. Introduction to the IVS-Interactive Identification • At the lower left-hand side of the display, click on the ‘Action’ menu to view a complete list of Feature Actions. Select one by pressing on it (e.g. Petal Color 1). A Feature Panel will appear at the top of the screen specific to the Feature Action selected. Select the part of the flower having the color you wish to indicate. When you have selected a color, click OK. • You must complete a minimum of one Action in order to identify a flower and may complete up to all six possible actions.

  17. Demonstration of the IVS Software • Start the application by clicking ivs.jar. • Click the 'LOAD' button, select a target flower image file and click 'Open' to load it into the main display.

  18. Demonstration of the IVS Software

  19. Demonstration of the IVS Software • Image Loaded in IVS

  20. Demonstration of the IVS Software • Flower Correctly Identified by IVS

  21. Demonstration of the IVS Software • Possible Actions • Petal Color 1, Petal Color 2

  22. Demonstration of the IVS Software • Possible Actions Continued • Petal Count, Stamen Color

  23. Demonstration of the IVS Software • Possible Actions Continued • Petal Outline, Crop Area

  24. Introduction to the IVS - Storing Data • If the target flower has been identified, the user can click 'STORE' to add the species, image and feature data as a new example to the existing database used for identification. • You may add new references for species currently in the database, or you may add an entirely new species.
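What 'STORE' does internally is not documented in the slides; a minimal sketch of the behavior they describe, appending a confirmed sample under either an existing species or a brand-new one, might look like this (all names are hypothetical):

```python
def store_sample(db, species, features):
    """Append a confirmed identification as a new reference example.

    `db` maps species name -> list of feature dicts. setdefault creates
    the key on first use, which covers both cases from the slide:
    a new reference for an existing species, or an entirely new species.
    """
    db.setdefault(species, []).append(features)
    return db

db = {}
store_sample(db, "Clarkia Pulchella", {"petal_count": 4})
store_sample(db, "Clarkia Pulchella", {"petal_count": 4, "stamen_color": (200, 60, 90)})
store_sample(db, "Aquilegia Coerulea", {"petal_count": 5})
print({sp: len(samples) for sp, samples in db.items()})
# → {'Clarkia Pulchella': 2, 'Aquilegia Coerulea': 1}
```

This growth of the reference set is exactly the mechanism the background slide credits with improving accuracy over time.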

  25. Aquilegia Coerulea

  26. Methodology: Building Flower Inventory • Source: Images gathered through online databases provided by botanical societies, universities and agricultural institutions, seed distributors, etc. • Difficulties: Image quality and variance. We chose to discard images where the species was questionable or where quality made extracting at least 3 features difficult. • Outcomes: The total image set (combined with existing images) covered 131 species and 535 images, an average of about 4 images per species.

  27. Methodology: Experiment Set-up • For the experiment, all subjects would need identical IVS systems… • Each team member… • Loaded his/her images (except test images) into the IVS, extracting data for all 5 features and storing them. • Created a ZIP file containing his/her new species folders. • Saved a data.txt file containing all new data points extracted. • The Coordinator… • Compiled all zip files into a new ‘images’ folder with all species. • Created a new ‘newimages’ folder containing the test images. • Compiled all data.txt files into one complete data.txt file. • These 3 components were saved to flash drives and transferred to subjects’ laptops at the start of the experiment.

  28. Methodology: Final Experiment • Same methodology as preliminary testing, with a few exceptions… • Larger sample size: testing conducted on 30 flower images in each stage. • Same images (added control): test images used for auto/interactive were the same ones used for manual. • Time limits: maximum 5 min for manual testing (recorded as >5 min).

  29. Manual Findings • Accuracy was not as high as expected; flowers proved difficult to identify. • Only 29/90 flowers were correctly identified using the guidebook (32.2% accuracy). • Maximum time allowed was set to 5 minutes. • The average time to locate a flower was 2 minutes and 53 seconds. • Time vs. accuracy: is manual identification efficient?

  30. *Note: Data lines overlap

  31. Auto Testing Findings • The IVS system proved very consistent; the 4 data sets largely overlap. • The IVS system was inaccurate in this phase and placed the correct flower in the top 3 positions only 13.3% of the time. • The system sometimes placed the correct flower in the 100th slot or above. • The system appeared to struggle with flowers that were not distinct. • Results were returned in about 2 seconds.

  32. Interactive Testing Findings - Part A • The IVS system becomes less consistent but more accurate (15.56% accuracy). • Still placing correct flowers in the 100th slot or above on occasion. • Using IVS features, humans locate the correct flower in an average of 43.8 seconds. • Petal count proves helpful, but is not a great feature for distinguishing between flowers. • Efficiency appears to be increasing slightly.

  33. Interactive Testing Findings - Part B • Significant increase in accuracy (51.11%). • Placing correct flowers in the 40th slot or above. • Overall, correct flowers are much more likely to be in the top 20 positions. • Correct flower located by humans in an average of 40.5 seconds. • Petal count and color prove to be an effective combination to distinguish flowers. • Was this the most efficient use of the IVS?

  34. Interactive Testing Findings - Part C • Accuracy decreased (38.89%). • Correct flowers are more likely in the top 19 but not as likely in the top 3 positions. • The system still struggles with certain flowers. • Correct flower located by humans in an average of 44.4 seconds. • The time increase is due to more feature selection. • Do more features mean greater efficiency? Not necessarily…

  35. Interactive Findings (Expanded) • Top 5 and top 9 data show much greater accuracy than top 3. • Data appears to stabilize, with steady accuracy increases in the top 9 and top 19. • Top 19 had impressive accuracy (81.11%). • Expanded data indicates IVS was not as incorrect as it first appeared. • Correct flowers are not in the top 3 but are still close enough for humans to find efficiently (within the top 20).
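The top-3/top-5/top-9/top-19 figures above are instances of top-k accuracy. A minimal sketch of the computation, using made-up rank positions rather than the study's raw data:

```python
def top_k_accuracy(ranks, k):
    """Fraction of trials in which the correct species appeared at rank <= k.

    `ranks` holds the 1-based position of the correct flower in each
    trial's result list; the values below are invented for illustration.
    """
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 2, 7, 15, 40, 3, 19, 2, 55, 9]
print(top_k_accuracy(ranks, 3))    # → 0.4
print(top_k_accuracy(ranks, 19))   # → 0.8
```

As in the study's expanded data, the same result lists look much better once the cutoff k is widened, because near misses start counting as hits.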

  36. Conclusions from Final Testing • Manual identification is very time-consuming and accurate less than 50% of the time. • Auto identification without any human interaction is extremely fast, but the accuracy is not an improvement over manual identification. • Human-assisted identification increases the accuracy while maintaining a short time to complete. The accuracy of the interactive phase was low in the top three positions but increased significantly when the results were expanded.

  37. Additional Conclusions from Final Testing • The decrease in Interactive C may be attributed to the quality of the images (see recommendations). • The accuracy of the system overall could likely be improved by supplying the system with better reference images (see recommendations). • Knowing which features to use in the interactive phase can help significantly increase accuracy. • Example 1: For an image with poor lighting, don't use color. • Example 2: For a distant image, don't use crop data, etc.

  38. Clarkia Pulchella

  39. Recommendations for Future Work • Test the IVS with a larger data set (1000+ images) • Test the IVS with more human testers • Test the IVS with a common set of guidelines for petal count (to avoid human error) • Test the IVS allowing subjects to use their discretion on which features to use in identification • Test the IVS with only high-quality images • Test the Android Mobile IVS • Create and develop a mobile IVS for human face features (face pattern recognition)

  40. Recommendations for IVS • The Graphical User Interface (GUI) should be made full screen, allowing for easier viewing • The returned results should also include a brief description of the flower • More prompts for the user as to which buttons to select and when • Zooming in on the images to help extract features • Better light and color sensing to differentiate between shades, shadows, and petal colors

  41. Recommendations for IVS • Build a better reference set: • Better-quality reference images (lighting, angles) • Consult with a skilled botanist to collect actual samples of each flower type and photograph at least one image per species, preferably on a solid backdrop • Develop a web application that enables the admin to update the reference set, or have the IVS connect to the web for updates

  42. Resources used so far… • Schur, A. (2012, March 2). Telephone interview. • Schur, A. (2012, March). Email interview. • Brandenburg, David M. (2010). National Wildlife Federation Field Guide to Wildflowers of North America. New York, NY: Sterling Publishing Company. • http://www.wildflowerinformation.org/CommonListing.asp • http://aggie-horticulture.tamu.edu/wildseed/wildflowers.html • http://www.ct-botanical-society.org/galleries/galleryindex.html • http://wildflowerfinder.org.uk/
