
CS 175: Project Proposal Guidelines



  1. CS 175: Project Proposal Guidelines Padhraic Smyth Department of Computer Science, UC Irvine CS 175, Fall 2007

  2. Project Milestones and Deliverables • Timeline • Assignment 6: project proposal • due next Friday, Nov 9th, 12 noon • (but feel free to submit earlier) • 2-page progress report • due Monday Nov 19th, noon • In-class presentations: • Thursday Dec 6th • About 4 minutes per student, + questions • Final project reports • due noon Wednesday December 12th (finals week).

  3. Class Grading • Assignments 1 through 5: • best 4 of the first 5 assignments are counted • worth 40% of your grade (10% for each of the best 4) • Assignments 6 and 7 (project proposal and progress report) • 20% of your grade (10% each) • Final Project report and in-class demonstration • worth 40% of your grade

  4. Guidelines for Projects • All projects are individual projects (no group projects) • Discussion of ideas with other students is encouraged • however, no sharing of code with other students • You can use publicly-available software if you wish as part of your project • MATLAB code made available by researchers on the Web • e.g., other classifiers • e.g., feature extraction/image-analysis algorithms • You must clearly indicate which (if any) code in your project was not written by you, and you must reference the source. • However, your project cannot consist solely of such code • i.e., you need to write at least part of the project code yourself

  5. Submission of Proposals • Proposals can be submitted any time next week • must be submitted no later than noon next Friday • Upload to “Project Proposals” in EEE • Once I approve your project I will let you know straight away by email so that you can start working on it • If the project is not approved, I will give you feedback, and you must resubmit. • Detailed instructions will be available on the class Web site • Format of project proposals • should contain all of the required sections • please use the supplied proposal template (in Word, online) • should be clear and professional • will be graded like a regular assignment

  6. Recommended Reading (papers on Web page) • Face Recognition: a Literature Survey • This is a very comprehensive article, but quite long, so I don't expect you to read all of it. Please try to read as much of sections 1, 2, 3, and 5 as you can. • Robust Real-Time Object Detection • A state-of-the-art algorithm for face detection • Face Recognition: Features versus Templates • a well-written article that describes in detail methods and experiments comparing feature-based recognition of faces versus template-based recognition • Read what you can from these papers • Introductory sections are recommended • You may find good project ideas and suggestions in these papers

  7. Optional Reading (on Web page) • Image Analysis for Face Recognition:  • good survey paper on face recognition. • Face Recognition HomePage: • many useful papers here under "Interesting Papers", "New Papers", and "Algorithms". • Neural Network-Based Face Detection: • describes in detail a fairly complex system for detecting faces in images using multilayer neural networks. • The FERET Evaluation Methodology for Face-Recognition Algorithms: • a paper describing a set of government-sponsored tests to evaluate different face recognition algorithms and systems.

  8. Project Proposals • Distinct parts to your project proposal • The specific classification task you will try to solve • The data sets you will use for your experiments. • The specific image representation (features) you will investigate for solving this problem • The classification (or other) algorithms you plan to use. • An “extended task” • A specific plan for how you will test and evaluate your algorithms • Specific milestones and deliverables • These sections are described in detail in the handout for Project Proposals (on the Web page)

  9. Part 1: Classification Tasks • Pose Recognition • Classify faces into one of 4 classes, i.e., up, straight, right, or left. This is a generalization of the problem in Assignment 4. • Sunglasses Recognition • Classify face images into two classes, "sunglasses" or "no sunglasses". • Expression Recognition • Classify faces into 4 classes based on what expression the person has (angry, happy, sad, neutral), e.g., using only the straight-pose images • Individual Recognition • Classify the faces according to the name of the individual, i.e., into one of 20 classes, given any pose or expression for that individual.

  10. Easy versus Hard Tasks • In order of easiest to hardest • Sunglasses recognition • Quite easy – so you should go beyond the basic problem to make this a bit more interesting and challenging • Pose recognition • Relatively easy (e.g., see assignment 4) • Individual recognition • Relatively easy • Can be made more difficult by cropping the images • Expression recognition • Quite difficult

  11. Other Possible Classification Tasks • Combine various options • Recognize individuals both with sunglasses and without • Recognize individuals using different poses • Gender recognition • Classify images of men versus images of women • Likely to be quite difficult with the default images • Divide individuals randomly into 2 groups: • “authorized” • “non-authorized” • train only on a subset of images from “authorized” individuals • test on • (a) other images from authorized individuals • (b) images from non-authorized individuals

  12. Proposal Part 2: Data Sets • Standard face image data set we have been using: • 20 different individuals (as in Assignments 4 and 5) • 4 basic expressions: neutral, happy, angry, sad. • 4 "poses": straight, up, right, and left. • For each of the above combinations, there are 2 images: one with sunglasses and one without. • Total number of images • for each individual, there are 4 x 4 x 2 = 32 different images • Overall there are roughly 20 x 32 = 640 images available.

  13. Image Resolution • 3 Different Image Resolutions • Full resolution images are 120 x 128 pixels in size • half-resolution are 60 x 64 • quarter resolution are 30 x 32. • The smaller images are just averaged versions of the larger images. • The smaller images are easier to download, process, and manipulate. • However, the full images have more detail and are recommended • You are free to use whichever images you wish • you could compare recognition at multiple resolutions if you like • you could use the smaller images for initial development
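As an aside (not part of the course materials), the block averaging that produces the lower resolutions can be approximated in MATLAB with imresize and its 'box' filter; the file name below is a placeholder:

    % img: full-resolution 120 x 128 grayscale image (file name is hypothetical)
    img     = double(imread('face_full.pgm'));
    half    = imresize(img, 0.5,  'box');    % 60 x 64: each pixel averages a 2 x 2 block
    quarter = imresize(img, 0.25, 'box');    % 30 x 32: each pixel averages a 4 x 4 block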

  14. Other Data Sets • You can use either the standard data set (previous slides) or some of these other data sets • Other data sets often have more individuals, higher resolution • Links on the Web page to: • Stirling University face database • Yale face database • CMU face image databases • University of Manchester face images • Other data sets available under “databases” at Face Recognition Homepage • at http://www.face-rec.org/

  15. Aspects of Other Data Sets • Formats: • Can use “imread” to read different image formats into MATLAB (e.g., jpg, pgm, gif, etc.) – may need to write a script to do this. • Color: • Some data sets are in color • Get 3 color intensities (R,G,B) per pixel rather than just 1 grayscale value • More complicated, but better for recognition than grayscale • Resolution • Resolution is often much higher than our 120 x 128 default set, especially with newer image data sets • Computationally more intensive, but can give better results
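As a rough sketch (the file name and the resize step are placeholders, and rgb2gray/imresize come from the Image Processing Toolbox), loading an external image and reducing it to a single grayscale channel might look like:

    rgb = imread('some_face.jpg');              % imread handles jpg, pgm, gif, etc.
    if ndims(rgb) == 3
        gray = rgb2gray(rgb);                   % collapse the R,G,B channels to one intensity
    else
        gray = rgb;                             % image was already grayscale
    end
    gray = imresize(double(gray), [120 128]);   % optional: rescale to the default 120 x 128 size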

  16. Part 3: Image Representation • 2 basic ways to represent images for classification • 1. Pixel Representation • m x n pixel image => vector of mn pixel-valued features • i.e., just use the pixels directly as features • 2. Feature Representation • extract more “abstract” features from an image • e.g., average brightness in a certain region • e.g., relative location of an eye and nose template • in your project you are required to experiment with at least 2 high-level features
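A minimal sketch of the pixel representation (file and variable names are illustrative):

    img = double(imread('face.pgm'));    % m x n grayscale image
    x   = reshape(img, 1, []);           % 1 x (m*n) row vector: every pixel is a feature
    % stacking one such row per image gives an N x (m*n) data matrix for a classifier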

  17. Examples of Image Features • “Low-level” features • Pixel summaries: average, minimum, maximum brightness. • Location and shape information from a thresholded version of the image. • Information from a blurred version of the image: the location or brightness of the brightest "blob" in an 8 x 8 reduced version of the image. • “Higher-level” features • Information from the edge response image, or edge map, e.g., fit an ellipse to the edge information • Template-based methods, which use templates (as described in class) to estimate where the eyes, nose, mouth, ears, etc., are, and then try to measure various features such as their relative positions, size, relative brightness, etc.

  18. Two Simple Examples of Features • Example of Edge-based features • calculate response_image = edge(image) • divide response_image into 16 local subimages • calculate the average pixel value in each subimage • this gives a vector of 16 “local edge features” • Example of Template-based Features • generate response_image = template(image, eye_template) • f1 = xmin_location, f2 = ymin_location, f3 = min(response_image) • produces a vector of 3 features • Feel free to be imaginative and creative in designing features!
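A rough MATLAB sketch of both feature types follows; the 4 x 4 grid and the use of normxcorr2 as the template matcher are my own choices rather than requirements, and with normxcorr2 the best match is a maximum of the response rather than the minimum used by the template() routine above:

    % 16 local edge features: average edge response in each cell of a 4 x 4 grid
    E = double(edge(img, 'sobel'));                  % binary edge map of the grayscale image
    [m, n] = size(E);
    f_edge = zeros(1, 16);
    k = 1;
    for i = 1:4
        for j = 1:4
            rows  = round((i-1)*m/4)+1 : round(i*m/4);
            cols  = round((j-1)*n/4)+1 : round(j*n/4);
            block = E(rows, cols);
            f_edge(k) = mean(block(:));              % fraction of edge pixels in this cell
            k = k + 1;
        end
    end

    % 3 template features: location and strength of the best match to an eye template
    R = normxcorr2(eye_template, img);               % correlation response image
    [fmax, idx]  = max(R(:));
    [ymax, xmax] = ind2sub(size(R), idx);
    f_templ = [xmax, ymax, fmax];                    % analogous to f1, f2, f3 on the slide above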

  19. For example: a block diagram of a Feature.m function that takes an input image and combines edge features, template features, and other features into a single feature vector.

  20. Part 4: Classification Algorithms • Minimum-distance classifier • kNN classifier • perceptron • any others that you wish to implement on your own • e.g., you are allowed to use other implementations of classifiers in MATLAB (e.g., available on the Web), such as “boosting” or “support vector machines” but you must acknowledge this in your project writeup. • You should use at least 2 classifiers in your experiments • kNN and minimum-distance are the easiest to work with • Note: data sets, tasks, and features often have much more influence on classification accuracy than the classifier used
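A minimal sketch of the two simplest classifiers, for a single test vector xtest (1 x d), a training matrix Xtrain (N x d), and numeric labels ytrain (all names are illustrative):

    % minimum-distance classifier: pick the class whose mean feature vector is closest
    classes = unique(ytrain);
    mu = zeros(numel(classes), size(Xtrain, 2));
    for c = 1:numel(classes)
        mu(c, :) = mean(Xtrain(ytrain == classes(c), :), 1);       % class mean
    end
    dmu = sum((mu - repmat(xtest, size(mu, 1), 1)).^2, 2);         % distance to each class mean
    [dmin, cbest] = min(dmu);
    y_mindist = classes(cbest);

    % kNN classifier (k = 3): majority vote among the k nearest training images
    k  = 3;
    d2 = sum((Xtrain - repmat(xtest, size(Xtrain, 1), 1)).^2, 2);  % distance to every training row
    [dsort, order] = sort(d2);
    y_knn = mode(ytrain(order(1:k)));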

  21. Part 5: Extended Tasks • Effect of resolution on classification • e.g., compare results at the 3 different resolutions • Effect of pose on classification • Effect of expression on classification • Effect of sunglasses on classification • Add random noise to the image: • evaluate how accuracy degrades as a function of noise added • Simulate “obscuring objects” • e.g., add a “blank” rectangle at random locations “over” the face • evaluate how this affects accuracy
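A rough sketch of the two degradation ideas above (the noise level and the rectangle size are arbitrary placeholders):

    % add zero-mean Gaussian noise and clip back to the valid intensity range
    sigma = 20;                                    % noise standard deviation, in gray levels
    noisy = double(img) + sigma * randn(size(img));
    noisy = min(max(noisy, 0), 255);

    % "obscuring object": overwrite a random 20 x 20 rectangle with a blank patch
    [m, n] = size(img);
    r = randi(m - 20);
    c = randi(n - 20);
    occluded = double(img);
    occluded(r:r+19, c:c+19) = 0;                  % black square pasted over the face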

  22. Part 6: Evaluation • Describe a specific evaluation methodology • At a minimum, some cross-validation is required • define which images you will use • there may be multiple experiments, e.g., full to 1/4 resolution • compare pixel-based representation and features • e.g., pixel-based vs feature set 1 • e.g., feature set 1 vs feature set 2 • robustness: • test how your classifier works on smaller images • sensitivity • how sensitive is your classifier to changes in the number of training images, to the value of k in kNN, etc.? • Other options: • Training on one data set and testing on another • Comparing algorithm performance to human performance

  23. Evaluation continued.. • Cross-validation experiment 1 • usual cross-validation methodology • e.g., v = 10 random test subsets • Cross-validation experiment 2 • do cross-validation using “leave-one-out” on individuals • e.g., for pose, expression or sunglasses recognition • 19 individuals in the training data • 1 individual in the test data • repeat 20 times and average • a more realistic test of the classifier’s performance
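A minimal sketch of the leave-one-individual-out loop, assuming each row of X is one image, y holds the class labels, person_id gives the individual for each image, and my_classifier stands in for whatever classifier you use (all names are hypothetical):

    persons = unique(person_id);
    acc = zeros(numel(persons), 1);
    for p = 1:numel(persons)
        test  = (person_id == persons(p));         % hold out every image of one individual
        train = ~test;
        ypred = my_classifier(X(train, :), y(train), X(test, :));   % hypothetical function
        acc(p) = mean(ypred == y(test));           % accuracy on the held-out individual
    end
    fprintf('leave-one-individual-out accuracy: %.3f\n', mean(acc));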

  24. Part 7: Milestones • Provide a short time-table • Outline what you plan to accomplish each week • For example: • Week 1: • Download data, write code for features, try some simple kNN classification • Week 2: • Implement some new features • Use cross-validation to select/compare different features. • Week 3: • Implement a new classifier • Perform large-scale cross-validation experiments • Week 4: • Experiments on extended task, test robustness of classifier • Write final report • Keep in mind that a progress report will be due (worth 10% of your class grade) on Monday Nov 19th (2 weeks from Monday)

  25. Project Degree of Difficulty • Project needs to have a minimum degree of difficulty, • e.g., repeating Assignment 4 (classification of right v. up) will not be eligible for a good grade! • Some problems are easier, some harder • sunglasses recognition: fairly straightforward • expression classification: difficult • You will be graded on the quality of your work relative to the difficulty of the task • your grade will not depend on the accuracy of the classifier as long as a good attempt was made (e.g., you could get full points even if your classifier is only 60% accurate) • It is more important to understand why a particular technique works (or does not work) than it is to get high accuracy

  26. Techniques we have not yet discussed….. • It is ok in your project proposal to propose a more advanced technique (e.g., eigenimages) that we have not discussed in class • You should have some basic knowledge about the technique (find an introductory paper and read it) • You should make sure to include a simpler baseline method to compare to • Topics we will cover in future lectures • Image segmentation using clustering and region growing • Eigenimages and SVD techniques • Morphological image analysis • Image resizing

  27. Project Milestones and Deliverables • Timeline • Assignment 6: project proposal • due next Friday, Nov 9th, 12 noon • (but feel free to submit earlier) • 2-page progress report • due Monday Nov 19th, noon • In-class presentations: • Thursday Dec 6th • About 4 minutes per student, + questions • Final project reports • due noon Wednesday December 12th (finals week).

  28. Your “to do” list… • Complete Assignment 5 by Tuesday • Read Assignment 6 (project proposal instructions) carefully by next Tuesday – have questions ready in class. In Tuesday’s lecture we will discuss projects again, answer questions, etc. • Work on your project proposal next week. • No lecture next Thursday; instead we will have office hours from 9:30 to 11 in Bren Hall 4212. Please feel free to come by to discuss your project ideas. • Submit your project proposal to EEE no later than noon Friday Nov 9th (next week). I will send you an “approved/not approved” email as soon as the project proposals are graded.
