## Geometry at Work


**Geometry at Work**
Adapted by Dr. Sarah from a talk by Dr. Catherine A. Gorini

**Computer Learning To Diagnose and Categorize**
We will use these tools:
• higher-dimensional vector spaces
• convex sets
• inner products

**Geometry in Learning**
Kristin P. Bennett, Rensselaer Polytechnic Institute
Erin J. Bredensteiner, University of Evansville

Goal: to classify objects into two classes based on specific measurements made on each object. Examples:
• tumors: benign or malignant
• patients: healthy or with heart disease
• Congressmen: Republicans or Democrats

• Data is collected for a large sample of individuals.
• Individuals are assigned to one of two classes by experts.
• A perceptron is created: a perceptron is a linear model that is used to classify points into two sets.
• New individuals are then classified by a computer using the perceptron.

• Each individual corresponds to a point in Rⁿ, where n is the number of measurements recorded for each individual.
• The perceptron corresponds to a plane that separates Rⁿ into two half-spaces, each half-space containing points of only one type.

A plane with normal vector w ∈ Rⁿ is given by the vector equation x · w = γ, where γ ∈ R. If p is the position vector of a point on the plane, then

(x − p) · w = 0
x · w − p · w = 0
x · w = p · w = γ

so γ gives the location of the plane relative to the origin: when w is a unit vector, this plane lies γ units away from the parallel plane through the origin, x · w = 0.

**Definition:** Let x be a point in Rⁿ to be classified as a member of class A or class B. A perceptron with weights w ∈ Rⁿ and threshold γ ∈ R assigns x to class A or to class B using the following rule:
• If x · w − γ > 0, then x ∈ A.
• If x · w − γ < 0, then x ∈ B.
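The classification rule can be sketched in a few lines of Python. The weights and threshold here are made-up illustration values, not learned from data:

```python
import numpy as np

def perceptron_classify(x, w, gamma):
    """Perceptron rule: x goes to class A if x.w - gamma > 0, else to B
    (points with x.w exactly equal to gamma go to B by convention)."""
    return "A" if x @ w - gamma > 0 else "B"

# Hypothetical weights and threshold, chosen only for illustration.
w = np.array([1.0, 1.0])
gamma = 3.0

print(perceptron_classify(np.array([2.0, 2.0]), w, gamma))  # x.w - gamma = 1  -> "A"
print(perceptron_classify(np.array([1.0, 1.0]), w, gamma))  # x.w - gamma = -1 -> "B"
```

In practice w and γ are solved for from the training data, as described next.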
By convention, if x · w = γ, then x ∈ B.

**Our Goal**
Use the data to solve for the w ∈ Rⁿ and γ ∈ R that give the "best" plane separating the two sets of points, A and B, in Rⁿ. There are two cases, the linearly separable case and the linearly inseparable case; we need some terminology before we can define them.

A set is convex if the segment connecting any two points in the set is also in the set. The convex hull of a set of points is the smallest convex set that contains the set. Let A1, A2, …, Am be the points in set A. Then u1A1 + u2A2 + … + umAm is in the convex hull of A if u1 + u2 + … + um = 1 and each ui ≥ 0.

The two cases are:
• Linearly Separable Case: there are many planes lying between the two sets. This happens when the convex hulls of the sets A and B are disjoint.
• Linearly Inseparable Case: there is no plane lying between the two sets. This happens when the convex hulls of the sets A and B intersect.

**Solution of the Linearly Separable Case**
Find the plane perpendicular to the bisector of the segment connecting the two closest points of the convex hulls of A and B. This is an optimization problem:

minimize ‖u1A1 + u2A2 + … + umAm − (v1B1 + v2B2 + … + vkBk)‖
such that u1 + u2 + … + um = 1,
v1 + v2 + … + vk = 1,
ui ≥ 0, vj ≥ 0.

This standard optimization problem can be solved on a computer.

[Figure: a separating plane between Set A and Set B.]

**Solution of the Linearly Inseparable Case**
If there is no plane that separates the two sets, there are many different planes that could define a perceptron. Criteria that have been used include:
• Robust Linear Programming: minimize the maximum distance from any misclassified point to the separating plane. This can give too much importance to a single hard-to-classify point.
• Multisurface Method of Pattern Recognition: find the plane that misclassifies the fewest points.
• Generalized Optimal Plane.
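For the separable case, the closest-points problem stated above is a small smooth optimization with simplex constraints. A minimal sketch using SciPy's general-purpose `minimize` follows; the six data points are made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Toy linearly separable data in R^2 (hypothetical points).
A = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # class A
B = np.array([[3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])  # class B
m, k = len(A), len(B)

def sq_distance(z):
    # z packs the convex weights: u = z[:m] for A, v = z[m:] for B.
    u, v = z[:m], z[m:]
    d = A.T @ u - B.T @ v          # vector between the two hull points
    return d @ d                   # squared distance (smooth objective)

constraints = [
    {"type": "eq", "fun": lambda z: z[:m].sum() - 1.0},  # u1 + ... + um = 1
    {"type": "eq", "fun": lambda z: z[m:].sum() - 1.0},  # v1 + ... + vk = 1
]
bounds = [(0.0, None)] * (m + k)                         # ui >= 0, vj >= 0
z0 = np.concatenate([np.full(m, 1.0 / m), np.full(k, 1.0 / k)])

res = minimize(sq_distance, z0, bounds=bounds, constraints=constraints)
p = A.T @ res.x[:m]            # closest point of conv(A)
q = B.T @ res.x[m:]            # closest point of conv(B)
w = p - q                      # normal to the separating plane
gamma = w @ (p + q) / 2.0      # plane x.w = gamma bisects the segment pq
```

Every point of A then satisfies x · w − γ > 0 and every point of B satisfies x · w − γ < 0, so (w, γ) defines a perceptron.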
**Generalized Optimal Plane**
Reduce the average distance of misclassified points from the separating plane while also decreasing the maximum classification error.

**Practical Applications and Dimensions of the Spaces**
• Heart Disease
  – 13 attributes, such as age, cholesterol, and resting blood pressure
  – Cleveland Heart Disease Database: 297 patients
• Breast Cancer
  – 9 attributes obtained via needle aspiration of a tumor, such as clump thickness, uniformity of cell size, and uniformity of cell shape
  – Wisconsin Breast Cancer Database: 682 patients
  – 100% correct computer diagnosis of 131 new cases
• Sonar Signals to Distinguish Mines from Rocks
  – 60 attributes
  – 208 mines and rocks
  – a linearly separable example
• Voting Patterns of Congressmen
  – 1984 Voting Records Database of 16 key votes
  – 435 Congressmen
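For the inseparable case, one concrete version of the "average distance of misclassified points" criterion is a linear program with a nonnegative slack variable per point, minimizing the average violation of a margin; this is the flavor of Bennett and Mangasarian's robust linear programming. The exact formulation (margin of 1, class-averaged slacks) and the toy data below are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data (hypothetical); the same LP works whether or not the
# two classes are linearly separable.
A = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # class A
B = np.array([[3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])  # class B
m, k = len(A), len(B)
n = A.shape[1]

# Variables: [w (n), gamma (1), y (m slacks for A), z (k slacks for B)].
# Minimize the average slack: (1/m) sum y_i + (1/k) sum z_j.
c = np.concatenate([np.zeros(n + 1), np.full(m, 1.0 / m), np.full(k, 1.0 / k)])

# Constraints A_i.w - gamma >= 1 - y_i  and  B_j.w - gamma <= -1 + z_j,
# rewritten in the linprog form A_ub @ x <= b_ub.
A_ub = np.zeros((m + k, n + 1 + m + k))
A_ub[:m, :n] = -A                        # -A_i.w ...
A_ub[:m, n] = 1.0                        # ... + gamma ...
A_ub[:m, n + 1:n + 1 + m] = -np.eye(m)   # ... - y_i <= -1
A_ub[m:, :n] = B                         #  B_j.w ...
A_ub[m:, n] = -1.0                       # ... - gamma ...
A_ub[m:, n + 1 + m:] = -np.eye(k)        # ... - z_j <= -1
b_ub = -np.ones(m + k)

bounds = [(None, None)] * (n + 1) + [(0.0, None)] * (m + k)  # slacks >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
w, gamma = res.x[:n], res.x[n]
```

For linearly separable data the optimal slacks are all zero and (w, γ) separates the classes exactly; for overlapping data the LP trades off the violations instead of failing.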
