
Object Recognition



Presentation Transcript


  1. Object Recognition • By A. Sravya

  2. Object recognition problem • Given some knowledge of how certain objects may appear and an image of a scene possibly containing those objects, report which objects are present in the scene and where.

  3. Applications • Image panoramas • Image watermarking • Global robot localization • Face detection • Optical character recognition • Manufacturing quality control • Content-based image indexing • Object counting and monitoring • Automated vehicle parking systems • Visual positioning and tracking • Video stabilization

  4. Introduction • Pattern or object: an arrangement of descriptors (features) • Pattern class: a family of patterns that share some common properties • Pattern recognition: techniques for assigning patterns to their respective classes • Common pattern arrangements: 1. vectors (for quantitative descriptors) 2. strings 3. trees (for structural descriptors) • Approaches to pattern recognition: • Decision-theoretic → quantitative descriptors • Structural → qualitative descriptors

  5. Patterns and pattern classes • A pattern vector is written x = (x1, x2, …, xn)^T, where xi represents the ith descriptor and n is the number of descriptors associated with the pattern • Example: consider 3 types of iris flowers – setosa, virginica and versicolor. Each flower is described by its petal length and width, so the pattern vector is x = (x1, x2)^T, with x1 the petal length and x2 the petal width
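
To make the vector representation concrete, here is a minimal Python sketch (an addition, not from the original slides); the petal measurements are illustrative placeholders, not data from the deck:

```python
import numpy as np

# Each flower is described by two quantitative descriptors:
# x1 = petal length, x2 = petal width (values are illustrative).
setosa     = np.array([1.4, 0.2])   # pattern vector x = (x1, x2)^T
versicolor = np.array([4.7, 1.4])
virginica  = np.array([6.0, 2.5])

print(setosa.shape)  # (2,) -> n = 2 descriptors per pattern
```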

  6. Patterns and pattern classes – vector example

  7. Patterns and pattern classes – another vector example • Here is another example of pattern vector generation • In this case, we are interested in different types of noisy shapes

  8. Strings and trees • Recognition problems in which class membership is determined not only by quantitative measures of each feature but also by the spatial relationships between the features are solved by the structural approach. Example: fingerprint recognition • Strings: string descriptions generate patterns of objects whose structure is based on relatively simple connectivity of primitives, usually associated with boundary shape

  9. String example • String of symbols w = …abababab…
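
As a small illustration (an addition, not from the slides), the following Python sketch checks whether a boundary string is generated by repeating the primitive pair "ab", as in the string w above:

```python
import re

def matches_ab_pattern(w: str) -> bool:
    """Return True if the symbol string consists of the primitive
    pair 'ab' repeated one or more times, e.g. 'abababab'."""
    return re.fullmatch(r"(ab)+", w) is not None

print(matches_ab_pattern("abababab"))  # True
print(matches_ab_pattern("abba"))      # False
```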

  10. Trees • Tree descriptors are more powerful than strings • Most hierarchical ordering schemes lead to tree structures. Example:

  11. Recognition based on Decision-Theoretic Methods • These methods are based on the use of decision functions d(x) • For W pattern classes we find W decision functions d1(x), d2(x), …, dW(x) with the property that, if a pattern x belongs to class ωi, then di(x) > dj(x) for j = 1, 2, …, W; j ≠ i (1) • The decision boundary separating class ωi from class ωj is given by di(x) − dj(x) = 0 • The objective is to develop various approaches for finding decision functions that satisfy Eq. (1)

  12. Decision-theoretic methods – matching • Here we represent each class by a prototype pattern vector • An unknown pattern is assigned to the class to which it is closest in terms of a predefined measure • The two approaches are: • Minimum distance classifier – calculate the Euclidean distance • Correlation

  13. Minimum distance classifier • The prototype pattern vector of class ωj is the mean of its training patterns: mj = (1/Nj) Σ x, summed over the Nj pattern vectors of class ωj • Calculate the Euclidean distance between the unknown vector and each prototype: Dj(x) = ||x − mj|| • Assign x to the class with the smallest distance; equivalently, evaluate the decision function dj(x) = x^T mj − (1/2) mj^T mj and pick the class whose decision function gives the largest numerical value

  14. Contd.. • The decision boundary between classes ωi and ωj is dij(x) = di(x) − dj(x) = 0 – the perpendicular bisector of the line segment joining mi and mj • If dij(x) > 0, then x belongs to ωi • If dij(x) < 0, then x belongs to ωj
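
A minimal sketch of the minimum distance classifier from slides 13–14, on toy two-dimensional data (the data are illustrative); the decision function dj(x) = x^T mj − (1/2) mj^T mj selects the same class as the nearest prototype:

```python
import numpy as np

def train_prototypes(X, y):
    """Prototype m_j = mean of the N_j training vectors of class omega_j."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def classify(x, prototypes):
    """Assign x to the class whose decision function
    d_j(x) = x^T m_j - 0.5 m_j^T m_j is largest (equivalently,
    whose prototype is nearest in Euclidean distance)."""
    return max(prototypes,
               key=lambda c: x @ prototypes[c] - 0.5 * prototypes[c] @ prototypes[c])

# Toy data: two classes in the (petal length, petal width) plane.
X = np.array([[1.4, 0.2], [1.5, 0.3], [4.7, 1.4], [4.5, 1.5]])
y = np.array([0, 0, 1, 1])
protos = train_prototypes(X, y)
print(classify(np.array([1.6, 0.25]), protos))  # -> 0
```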

  15. Example

  16. Matching by correlation • Correlation is used for finding matches of a subimage w(x, y) of size J × K within an image f(x, y) of size M × N • The correlation between w(x, y) and f(x, y) is given by c(x, y) = Σs Σt w(s, t) f(x + s, y + t), evaluated at each displacement (x, y) for which w lies entirely inside f

  17. Contd.. • The maximum values of c(x, y) indicate the positions where w best matches f
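
A sketch of matching by correlation as defined on slide 16, using a brute-force loop over all valid displacements; the toy image and template are illustrative. Raw (unnormalized) correlation is sensitive to overall brightness, which is why normalized correlation coefficients are often preferred in practice:

```python
import numpy as np

def correlation_map(f, w):
    """c(x, y) = sum_s sum_t w(s, t) f(x+s, y+t), computed for every
    displacement at which w fits entirely inside f."""
    M, N = f.shape
    J, K = w.shape
    c = np.empty((M - J + 1, N - K + 1))
    for x in range(M - J + 1):
        for y in range(N - K + 1):
            c[x, y] = np.sum(w * f[x:x + J, y:y + K])
    return c

# Hide the template inside a larger image and recover its position.
f = np.zeros((8, 8))
w = np.array([[1.0, 2.0], [3.0, 4.0]])
f[3:5, 5:7] = w
c = correlation_map(f, w)
print(np.unravel_index(np.argmax(c), c.shape))  # -> (3, 5)
```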

  18. Optimum statistical classifiers • This is a probabilistic approach to pattern recognition • The average loss incurred in assigning x to class ωj is rj(x) = Σk Lkj p(ωk|x), where Lkj is the loss for assigning a pattern that actually belongs to class ωk to class ωj • The classifier that minimizes the total average loss is called the Bayes classifier

  19. Optimum statistical classifiers • The Bayes classifier assigns an unknown pattern x to class ωi if ri(x) < rj(x) for all j ≠ i • The loss for a correct decision is assigned the value 0 and for an incorrect decision the value 1 (the 0–1 loss function)

  20. Optimum statistical classifiers • Under the 0–1 loss the rule simplifies further: assign x to class ωi if p(x|ωi)P(ωi) > p(x|ωj)P(ωj) for all j ≠ i • Finally, the Bayes decision function (BDF) is dj(x) = p(x|ωj)P(ωj), j = 1, 2, …, W • The BDF depends on the pdfs of the patterns in each class and on the probability of occurrence of each class • Sample patterns are assigned to each class and the necessary parameters are then estimated from them • The most commonly used form for p(x|ωj) is the Gaussian pdf

  21. Bayesian classifier for Gaussian pattern classes • For n = 1 and W = 2, the Bayes decision function for Gaussian pattern classes is dj(x) = p(x|ωj)P(ωj) = (1/(√(2π) σj)) e^(−(x − mj)²/(2σj²)) P(ωj), j = 1, 2, where mj and σj² are the mean and variance of class ωj

  22. Bayesian classifier for Gaussian pattern classes • In the n-dimensional case the Gaussian pdf is p(x|ωj) = (1/((2π)^(n/2) |Cj|^(1/2))) e^(−(1/2)(x − mj)^T Cj⁻¹ (x − mj)), with mean vector mj and covariance matrix Cj • Under the 0–1 loss function, taking logarithms gives the Bayesian decision function for Gaussian pattern classes: dj(x) = ln P(ωj) − (1/2) ln|Cj| − (1/2)(x − mj)^T Cj⁻¹ (x − mj)
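
A minimal sketch of the Gaussian Bayes decision function from slide 22, with made-up class parameters; in practice mj, Cj and P(ωj) would be estimated from the training patterns of each class:

```python
import numpy as np

def gaussian_bdf(x, m, C, prior):
    """Bayes decision function for a Gaussian class under a 0-1 loss:
    d_j(x) = ln P(w_j) - 0.5 ln|C_j| - 0.5 (x - m_j)^T C_j^{-1} (x - m_j)."""
    diff = x - m
    return (np.log(prior)
            - 0.5 * np.log(np.linalg.det(C))
            - 0.5 * diff @ np.linalg.solve(C, diff))

# Two toy Gaussian classes with illustrative parameters.
m = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
C = [np.eye(2), np.eye(2)]
priors = [0.5, 0.5]

x = np.array([0.5, 0.4])
scores = [gaussian_bdf(x, m[j], C[j], priors[j]) for j in range(2)]
print(int(np.argmax(scores)))  # -> 0 (class with the largest d_j(x))
```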

  23. Bayesian classifier for Gaussian pattern classes • The BDF reduces to the minimum distance classifier if: 1. the pattern classes are Gaussian 2. all covariance matrices are equal to the identity matrix 3. all classes are equally likely to occur • Therefore the minimum distance classifier is optimum in the Bayes sense if the above conditions are satisfied

  24. Neural Networks • Neural network: an information processing paradigm inspired by biological nervous systems, such as the brain • Structure: a large number of highly interconnected processing elements (neurons) working together • Neurons are arranged in layers

  25. Neural Networks • Each neuron within the network is usually a simple processing unit which takes one or more inputs and produces an output • At each neuron, every input has an associated weight which modifies the strength of that input • The neuron simply adds together all the weighted inputs and calculates an output

  26. NN contd.. • Neurons: elemental nonlinear computing elements • We use these networks to adaptively develop the coefficients of decision functions via successive presentations of training sets of patterns • Training patterns: sample patterns used to estimate the desired parameters • Training set: a set of such patterns from each class • Learning or training: the process by which a training set is used to obtain decision functions • Perceptron model: the basic model of a neuron • Perceptrons are learning machines

  27. Perceptron for two pattern classes

  28. Another way:

  29. Training algorithms – linearly separable classes • If at the kth training step the (augmented) pattern y(k) belongs to ω1 and w(k)^T y(k) ≤ 0, then w(k + 1) = w(k) + c y(k) • If y(k) belongs to ω2 and w(k)^T y(k) ≥ 0, then w(k + 1) = w(k) − c y(k) • Otherwise w(k + 1) = w(k) • This algorithm makes a change in w only if the pattern being considered at the kth step in the training sequence is misclassified
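
A runnable sketch of the fixed-increment perceptron rule from slide 29, on a toy linearly separable data set (the data and the increment c = 1 are illustrative):

```python
import numpy as np

def train_perceptron(X, labels, c=1.0, epochs=100):
    """Fixed-increment perceptron rule for two linearly separable
    classes. Patterns are augmented with a trailing 1 so the bias is
    learned as the last weight; w changes only on a misclassification."""
    Y = np.hstack([X, np.ones((len(X), 1))])       # augmented patterns
    w = np.zeros(Y.shape[1])
    for _ in range(epochs):
        errors = 0
        for y, label in zip(Y, labels):
            if label == 1 and w @ y <= 0:          # omega_1 misclassified
                w += c * y
                errors += 1
            elif label == -1 and w @ y >= 0:       # omega_2 misclassified
                w -= c * y
                errors += 1
        if errors == 0:                            # converged: all correct
            break
    return w

X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
labels = np.array([1, 1, -1, -1])
w = train_perceptron(X, labels)
print(np.sign(np.hstack([X, np.ones((4, 1))]) @ w))  # -> [ 1.  1. -1. -1.]
```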

  30. Training algorithms – nonseparable classes • This method minimizes the error between the actual and the desired response • From the gradient descent algorithm, the weight update is w(k + 1) = w(k) + α [r(k) − w(k)^T y(k)] y(k), where r(k) is the desired response and α > 0 is the learning increment

  31. Training algorithms – nonseparable classes • Changing the weights by this rule reduces the error: substituting the update into E = (1/2)(r − w^T y)² scales the residual r − w^T y by the factor (1 − α ||y||²), so each step shrinks the error whenever 0 < α ||y||² < 1
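
A sketch of the delta (least-mean-square) rule from slides 30–31 on the same kind of toy data; unlike the perceptron rule, it updates on every presentation and converges toward a least-squares fit even when the classes overlap. The data and the learning increment alpha are illustrative:

```python
import numpy as np

def train_lms(X, r, alpha=0.01, epochs=200):
    """Least-mean-square (delta) rule: each step moves w down the
    gradient of the squared error E = 0.5 (r - w^T y)^2, so the rule
    applies even when the classes are not linearly separable."""
    Y = np.hstack([X, np.ones((len(X), 1))])   # augmented patterns
    w = np.zeros(Y.shape[1])
    for _ in range(epochs):
        for y, target in zip(Y, r):
            error = target - w @ y
            w += alpha * error * y             # w(k+1) = w(k) + a*(r - w^T y)*y
    return w

X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
r = np.array([1.0, 1.0, -1.0, -1.0])           # desired responses
w = train_lms(X, r)
print(np.sign(np.hstack([X, np.ones((4, 1))]) @ w))  # expected: [ 1.  1. -1. -1.]
```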

  32. Multilayer feedforward neural networks • We focus on decision functions of multiclass pattern recognition problems, independent of whether the classes are separable or not

  33. Multilayer feedforward neural networks • The activation element of each node is a sigmoid function • The input to the activation element of node j in layer J is the weighted sum of the outputs of the preceding layer: Ij = Σi wij Oi + θj • The outputs of layer K are obtained in the same way, Ok = h(Ik) • The sigmoid activation function is h(I) = 1/(1 + e^(−I))
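
A minimal sketch of the forward pass described on slide 33; the layer sizes and random weights are illustrative:

```python
import numpy as np

def sigmoid(I):
    """Activation element: O = 1 / (1 + exp(-I))."""
    return 1.0 / (1.0 + np.exp(-I))

def forward(x, weights, biases):
    """Propagate a pattern layer by layer: each node forms the
    weighted sum of the previous layer's outputs plus its bias,
    then passes it through the sigmoid."""
    O = x
    for W, b in zip(weights, biases):
        O = sigmoid(W @ O + b)
    return O

rng = np.random.default_rng(0)
weights = [rng.normal(size=(3, 2)), rng.normal(size=(2, 3))]  # 2-3-2 net
biases  = [rng.normal(size=3), rng.normal(size=2)]
print(forward(np.array([0.5, -1.0]), weights, biases))
```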

  34. Training by back propagation • We begin by concentrating on the output layer • The process starts with an arbitrary set of weights throughout the network • The generalized delta rule has two basic phases: • Phase 1: a training vector is propagated through the layers to compute the output Oj for each node; the outputs Oq of the nodes in the output layer are then compared against their desired responses rq to generate the error terms δq • Phase 2: a backward pass through the network, during which the appropriate error signal is passed to each node and the corresponding weight changes are made
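
A compact sketch of the two-phase generalized delta rule from slide 34, for a single hidden layer; the XOR data, layer size, and learning rate are illustrative, and the final outputs depend on the random initialization:

```python
import numpy as np

def sigmoid(I):
    return 1.0 / (1.0 + np.exp(-I))

def train_backprop(X, R, hidden=3, alpha=0.5, epochs=5000, seed=0):
    """Generalized delta rule for one hidden layer.
    Phase 1: propagate each training vector forward to get the node
    outputs O. Phase 2: form the output error terms
    delta_q = (r_q - O_q) O_q (1 - O_q), pass the error back to the
    hidden layer, and adjust every weight by alpha * delta * O."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], R.shape[1]
    W1 = rng.normal(scale=0.5, size=(hidden, n_in)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(n_out, hidden)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        for x, r in zip(X, R):
            # Phase 1: forward pass
            Oj = sigmoid(W1 @ x + b1)           # hidden-layer outputs
            Oq = sigmoid(W2 @ Oj + b2)          # output-layer outputs
            # Phase 2: backward pass
            dq = (r - Oq) * Oq * (1 - Oq)       # output error terms
            dj = (W2.T @ dq) * Oj * (1 - Oj)    # hidden error terms
            W2 += alpha * np.outer(dq, Oj); b2 += alpha * dq
            W1 += alpha * np.outer(dj, x);  b1 += alpha * dj
    return W1, b1, W2, b2

# XOR: a classic problem a single perceptron cannot solve.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
R = np.array([[0.], [1.], [1.], [0.]])
W1, b1, W2, b2 = train_backprop(X, R)
print(np.round(sigmoid(W2 @ sigmoid(W1 @ X.T + b1[:, None]) + b2[:, None]), 2))
```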

  35. Example

  36. Performance of a neural network as a function of noise level

  37. Improvement in performance by increasing the number of training patterns

  38. Face recognition

  39. Thank you
