
Topic 8: Level 1 Identification


Presentation Transcript


  1. Topic 8: Level 1 Identification David L. Hall

  2. Topic Objectives
  • Continue the introduction of Level-1 processing, with a focus on attribute fusion (e.g., for target identification)
  • Introduce common pattern recognition algorithms
  • Understand issues and limitations

  3. Level-1 (Identity Declaration)

  4. JDL Level One Processing: Object Refinement
  [Diagram: the JDL data fusion domain. Sources feed Level 0 (Signal Refinement), Level 1 (Object Refinement), Level 2 (Situation Refinement), and Level 3 (Threat Refinement), supported by Level 4 (Process Refinement), Human-Computer Interaction, and a Database Management System (support database and fusion database).]
  Level One (Object Refinement) processes and their functions:
  • Data Alignment: spatial reference adjustment, temporal reference adjustment, units adjustment
  • Data/Object Correlation: gating, association measures, assignment strategies
  • Object Positional Estimation: system models, optimization criteria, optimization approach, processing approach
  • Object Identity Estimation: physical models, feature-based inference techniques, cognitive-based models

  5. Conceptual Processing Flow for Level 1 Fusion
  [Diagram: Sensor #1 through Sensor N each feed preprocessing and data alignment; the aligned data pass through bulk gating and data association, and then on to position/kinematic/attribute estimation and identity estimation. Supporting data stores include the observation file, track file, and sensor information.]

  6. Methods for Attribute Fusion
  [Diagram: raw data from sensors S1 through SN undergo feature extraction; the resulting features lead to a declaration of identity, i.e., the identity or class of an entity, object, or activity.]
  Three broad families of methods:
  • Model-based methods (high-fidelity physical models)
  • Feature-based classification methods: neural nets, cluster algorithms, parametric templates
  • Decision-based methods: voting, decision trees, logical templates, Bayesian belief nets, the Dempster-Shafer method, fuzzy logic, rule-based systems

  7. Example of Single-Sensor Feature-Based Object Identity Declaration
  [Diagram: energy from the target passes through the propagation medium to the sensors; the sensor reaction produces a signal or image (signal space); feature extraction yields a feature vector (feature space); a classifier (cluster methods, neural networks, templating, etc.) compares the vector against a priori target models (e.g., Target Class A, Target Class B) to produce a declaration of identity (decision space).]

  8. Cluster Analysis
  • Basic use is classification analysis based on multi-parameter similarity
  • Provides estimation of pair-wise/cluster-wise similarities
  • Supports parametric model development
  • Helpful when studying new entities/new parameters
  • Can be computationally demanding

  9. Concept of Cluster Analysis
  [Diagram: a tagged data set (observations associated with specific objects, collected by Sensors A, B, and C) feeds a processing chain: selection and calculation of resemblance coefficients, selection and calculation of a clustering method, clustering threshold selection, and cluster definition. The result is groups of object observations/features (e.g., cluster i and cluster j) in feature space.]

  10. Example: Helicopter Transmission Fault Classification
  A. K. Garga et al., "Fault Classification in Helicopter Signals," Proc. Amer. Helicopter Soc. 53rd Annual Forum, 1997.
  Purpose:
  • Classify faults using data collected from the aft transmission of a Westland CH-46E helicopter
  • 8 accelerometers
  • 7 fault classes plus a no-defect class; the faults include input pinion bearing corrosion, quill shaft crack, spiral bevel input pinion spalling, collector gear crack, helical input pinion chipping, and helical idler gear crack
  • Faulty and seeded-fault components used
  Results:
  • Achieved robust classification using feature reduction
  • Separates faults with similar signatures but with significantly different criticalities and failure progressions

  11. Cluster Algorithmic Approaches
  • Hierarchical agglomerative methods
  • Iterative partitioning methods
  • Hierarchical divisive methods
  • Density search methods
  • Factor analytic methods (based on correlation matrix processing)
  • Clumping methods (allow membership in more than one class)
  • Graph theoretic methods
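To make the first of these approaches concrete, here is a minimal Python sketch (not part of the original slides) of single-linkage hierarchical agglomerative clustering. It uses Euclidean distance as the resemblance coefficient and a distance threshold to stop merging, following the processing chain of slide 9; the feature vectors and the threshold value are illustrative.

import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def agglomerate(points, threshold):
    """Merge the two closest clusters until no pair is within the threshold."""
    clusters = [[i] for i in range(len(points))]        # start: one cluster per observation
    while len(clusters) > 1:
        # closest pair of clusters; single linkage = minimum pairwise point distance
        dist, a, b = min(
            (euclidean(points[i], points[j]), a, b)
            for a in range(len(clusters))
            for b in range(a + 1, len(clusters))
            for i in clusters[a]
            for j in clusters[b]
        )
        if dist > threshold:                             # clustering threshold selection
            break
        clusters[a].extend(clusters[b])                  # merge the two closest clusters
        del clusters[b]
    return clusters

# Illustrative two-feature observations from two notional object classes plus an outlier
obs = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.95, 0.85), (0.5, 0.1)]
print(agglomerate(obs, threshold=0.3))                   # -> [[0, 1], [2, 3], [4]]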

  12. Assessment of Cluster Methods
  The good news:
  • Requires no a priori "knowledge" of data or target physical characteristics
  • Allows exploration of features and classes
  • Simple to use
  • Extensive COTS software available; see the review at http://www.pitt.edu/~csna/software.html
  The bad news:
  • Requires extensive training data
  • Results depend on the association measure, the scaling of the feature components, the order in which the data are processed, the clustering scheme, the specific training data, etc.

  13. Overview of Adaptive Neural Systems
  [Diagram: the adaptive linear combiner. An input vector (x0, x1, ..., xn) is weighted by (w0, w1, ..., wn) and summed to give y; the output is f(y).]

  14. =5.0 1.0 1.0 =1.0 =0.2 f(y) f(y) 0.0 0.0 -10.0 0.0 10.0 -10.0 0.0 10.0 y (a) y (b) fs(y) = (1 - ey)-1 *V.R. Hush and B.G. Horne The Activation Function

  15. Feature Vector Example: Handwriting Recognition
  [Diagram: the feature vector extracted from a handwritten character is mapped to an output vector with one element per class; for a character classified as "a", the output is 1 for "a" and 0 for "b", "c", ..., "z".]

  16. Neural Network Issues and Limitations
  • Choosing the network size: tradeoff between too large and too small; emerging systematic techniques for size selection
  • Complexity of learning: back-propagation (BP) methods are notoriously slow; the weight-search problem is NP-complete
  • Generalization: how much training data is required for general results? (a rule of thumb is 10 x the number of weights wij); generalization error (error on training data vs. the actual problem)
  • Network interconnectivity: optimal brain damage, complexity regularization, weight sharing
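As a rough illustration of the rule of thumb above (interpreted as roughly ten training samples per weight), the Python sketch below counts the weights in a small fully connected network; the layer sizes are illustrative, not from the lecture.

def count_weights(layer_sizes, bias=True):
    """Weights in a fully connected feed-forward net with the given layer sizes."""
    total = 0
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        total += n_in * n_out + (n_out if bias else 0)
    return total

layers = [20, 10, 8]             # e.g., 20 input features, 10 hidden units, 8 classes
w = count_weights(layers)        # (20*10 + 10) + (10*8 + 8) = 298
print(w, "weights -> roughly", 10 * w, "training samples suggested")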

  17. Parametric Templating and Decision Trees
  • If we know a priori the parametric "boundaries" related to decision or identification classes, then we may represent these via parametric templates, decision trees, or rule-based systems, e.g.:
  • Mechanical system fault if engine temperature exceeds Tcritical
  • Possible bearing failure if vibration exceeds X
  • Etc.
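The two example rules above can be written directly as a small rule-based template; the Python sketch below (not part of the original slides) does so, with the threshold values and observation field names as illustrative placeholders.

T_CRITICAL = 120.0   # engine temperature limit (illustrative value and units)
VIB_LIMIT = 3.5      # vibration amplitude limit (illustrative value and units)

def classify(obs):
    """Apply the parametric-template rules to one observation dictionary."""
    declarations = []
    if obs.get("engine_temp", 0.0) > T_CRITICAL:
        declarations.append("mechanical system fault")
    if obs.get("vibration", 0.0) > VIB_LIMIT:
        declarations.append("possible bearing failure")
    return declarations or ["no fault declared"]

print(classify({"engine_temp": 135.0, "vibration": 2.0}))  # ['mechanical system fault']
print(classify({"engine_temp": 95.0, "vibration": 4.2}))   # ['possible bearing failure']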

  18. Combined Syntactic/Contextual Target Modeling
  [Diagram: the target model consists of a target signature plus contextual information. Target signature (stored models): found at specified altitudes, minimum speed equals 150 mph, travels in specified groups. Contextual model: weather (MET), time/season, range, sensor phenomenology. Processing proceeds from pattern recognition to syntactical composition to contextual interpretation.]
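As an illustration (not part of the original slides), the sketch below combines a syntactic target-signature check with a simple contextual gate; the altitude band, group size, field names, and context flags are assumptions, and only the 150 mph minimum speed comes from the slide.

SIGNATURE = {"altitude_band_ft": (15000, 30000),   # "found at specified altitudes" (band assumed)
             "min_speed_mph": 150,                  # "minimum speed equals 150 mph" (from the slide)
             "min_group_size": 2}                   # "travels in specified groups" (size assumed)

def matches_signature(track):
    lo, hi = SIGNATURE["altitude_band_ft"]
    return (lo <= track["altitude_ft"] <= hi
            and track["speed_mph"] >= SIGNATURE["min_speed_mph"]
            and track["group_size"] >= SIGNATURE["min_group_size"])

def contextually_plausible(context):
    # contextual model: weather (MET), range to the sensor, etc.
    return context["weather_ok"] and context["within_sensor_range"]

track = {"altitude_ft": 22000, "speed_mph": 180, "group_size": 3}
context = {"weather_ok": True, "within_sensor_range": True}
print(matches_signature(track) and contextually_plausible(context))   # True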

  19. Alternate Architectures for Multisensor Identity Fusion: A. Decision-Level Fusion
  [Diagram: each sensor (A, B, ..., N) performs its own feature extraction and identity declaration (I/D-A, I/D-B, ..., I/D-N); the individual declarations are associated and fused at the decision level to produce a joint identity declaration.]
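To make decision-level fusion concrete, here is a minimal Python sketch (not part of the original slides) that fuses per-sensor identity declarations by majority voting, one of the decision-based methods listed on slide 6; the declarations and the agreement measure are illustrative.

from collections import Counter

def fuse_decisions(declarations):
    """Combine individual sensor identity declarations into a joint declaration."""
    votes = Counter(declarations)
    identity, count = votes.most_common(1)[0]
    confidence = count / len(declarations)        # crude agreement measure
    return identity, confidence

sensor_declarations = ["target class A", "target class A", "target class B"]
print(fuse_decisions(sensor_declarations))        # ('target class A', 0.666...)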

  20. Alternate Architectures for Multisensor Identity Fusion: B. Feature-Level Fusion
  [Diagram: feature vectors extracted from each sensor (A, B, ..., N) are associated and fused at the feature level; a single identity declaration is then made from the combined feature vector, producing the joint identity declaration.]

  21. Alternate Architectures for Multisensor Identity Fusion: C. Data-Level Fusion
  [Diagram: raw data from sensors A, B, ..., N are associated and fused at the data level; feature extraction and identity declaration are then performed on the fused data to produce the joint identity declaration.]

  22. Fuzzy Mathematics
  • The world of human cognition is not binary
  • Many concepts are not defined with mathematical precision; examples of fuzzy notions: about two, somewhat heavy, ugly, handsome, tall, borderline (interpretation is context dependent)
  • Fuzzy set theory argues that imprecision is an intrinsic property of various notions: not an approximation of truth, not a failure to comprehend, but an admission that some notions may forever be imprecise
  • Do not try to quantify the unquantifiable, but formalize a way to deal with it

  23. Fuzzy Sets: Mathematics
  • Introduce the membership function: μA(x) ∈ [0, 1] for all x ∈ E
  • Fuzzy sets are sets of ordered pairs: (x, μA(x))
  • Contrast with Boolean sets: Boolean membership χ(x) is simply true or false; for fuzzy sets the pair [x, μ(x)] makes partial truth/uncertainty feasible
  • Membership functions are not unique: varying solutions; sensitivity analysis to the choice of membership function

  24. Fuzzy Sets: Elementary Operations
  • INCLUSION: A ⊆ B ⟺ μA(x) ≤ μB(x)
  • EQUALITY: A = B ⟺ μA(x) = μB(x)
  • COMPLEMENTATION: μA'(x) = 1 - μA(x)
  • UNION: μA∪B(x) = MAX[μA(x), μB(x)]
  • INTERSECTION: μA∩B(x) = MIN[μA(x), μB(x)]
  • DIFFERENCE: μA-B(x) = MIN[μA(x), 1 - μB(x)]
  For example, let A = (x1/0.2, x2/0.7, x3/1, x4/0.1) and B = (x1/0.5, x2/0.3, x3/1, x4/0.0). Then A∪B = (x1/0.5, x2/0.7, x3/1, x4/0.1) and A∩B = (x1/0.2, x2/0.3, x3/1, x4/0.0).
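The union and intersection above translate directly into code; the following Python sketch (not part of the original slides) applies the MAX/MIN definitions to the slide's example sets A and B, with membership values stored in dictionaries.

def fuzzy_union(a, b):
    # slide 24: membership of the union is the elementwise MAX
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in sorted(set(a) | set(b))}

def fuzzy_intersection(a, b):
    # slide 24: membership of the intersection is the elementwise MIN
    return {x: min(a.get(x, 0.0), b.get(x, 0.0)) for x in sorted(set(a) | set(b))}

A = {"x1": 0.2, "x2": 0.7, "x3": 1.0, "x4": 0.1}
B = {"x1": 0.5, "x2": 0.3, "x3": 1.0, "x4": 0.0}
print(fuzzy_union(A, B))         # {'x1': 0.5, 'x2': 0.7, 'x3': 1.0, 'x4': 0.1}
print(fuzzy_intersection(A, B))  # {'x1': 0.2, 'x2': 0.3, 'x3': 1.0, 'x4': 0.0}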

  25. Interpreting Some Fuzzy Set Operations: Intersection
  • By definition, an element in a fuzzy set can reside partly in one set and partly in another (including the complementary set)
  • An element cannot be more true in the intersection than it is in either set
  • An element cannot be in the intersection to a degree greater than its degree in either of the subsets; this argues for the MIN operator
  • Intersection creates a middle-level type of set
  • Example: the intersection of the TALL and NOT TALL sets

  26. Data Fusion with Fuzzy Logic
  [Diagram: sensor observations y1 ... yN are fuzzified via fuzzy membership function transforms (which may be derived ad hoc or via neural nets, templates, etc.) into fuzzy values such as A and B; fuzzy rules (e.g., if A and B then C; if A and B then D) are evaluated with a fuzzy calculus to produce quantified inferences (C, D); defuzzification via inverse fuzzy membership transforms yields crisp outputs.]
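As an illustration (not part of the original slides), the Python sketch below walks one pair of observations through a fuzzify, rule-evaluation, and defuzzify chain in the spirit of this slide; the membership functions, rule consequents, crisp output values, and sensor readings are all assumptions.

def mu_hot(temp):            # fuzzification of sensor 1: membership of "hot" (assumed shape)
    return min(max((temp - 20.0) / 20.0, 0.0), 1.0)

def mu_fast(speed):          # fuzzification of sensor N: membership of "fast" (assumed shape)
    return min(max((speed - 100.0) / 100.0, 0.0), 1.0)

def infer(temp, speed):
    a, b = mu_hot(temp), mu_fast(speed)
    # Rule "if A and B then C": AND realized as MIN, per the fuzzy intersection on slide 24
    firing_c = min(a, b)
    # A second rule, "if A and not B then D", using the fuzzy complement 1 - b
    firing_d = min(a, 1.0 - b)
    # Defuzzification: weighted average of representative crisp values for C and D
    crisp_c, crisp_d = 0.9, 0.3              # illustrative consequent values
    total = firing_c + firing_d
    return (firing_c * crisp_c + firing_d * crisp_d) / total if total else 0.0

print(round(infer(temp=35.0, speed=160.0), 3))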

  27. Applicability of Techniques for Level 1 Fusion
  [Table: an applicability matrix mapping techniques (coordinate transforms, sensor models, physical models, association measures, assignment logic, equations of motion, optimization methods, Kalman filters, covariance error, Bayesian inference, Dempster-Shafer, voting, pattern recognition, templating, expert systems, fuzzy sets) to the Level 1 functions: data alignment (spatial, temporal, and units adjustment), data/object correlation (screening, correlation, assignment), object positional/kinematic/attribute estimation (observation prediction, state update, uncertainty management), and object identity estimation (ID measures, comparison, ID declaration, uncertainty management). Each technique is marked as applicable to a subset of these functions.]

  28. Topic 8 Assignments
  • Preview the on-line topic 8 materials
  • Read chapter 5 of Hall and McMullen (2004)
  • Writing assignment 7: Develop a one-page discussion of how level-1 identification and pattern recognition applies to your selected application.
  • Discussion 4: Discuss the concept of identification; how have automated identification processes and sensors (e.g., tags on objects, cell phones, smart cards, etc.) become integrated into common activities? What are issues of failure in automated identification techniques?

  29. Data Fusion Tip of the Week
  Here is an ancient Chinese classification of animals: "Animals are divided into (a) those that belong to the Emperor, (b) embalmed ones, (c) those that are trained, (d) suckling pigs, (e) mermaids, (f) fabulous ones, (g) stray dogs, (h) those that are included in this classification, (i) those that tremble as if they were mad, (j) innumerable ones, (k) those drawn with a very fine camel's hair brush, (l) others, (m) those that have just broken a flower vase, and (n) those that resemble flies from a distance." (from Other Inquisitions: 1937-1952 by Jorge Luis Borges; downloaded from http://www.alaska.net/~royce/Funny/classify.html, July 30, 2008)
  It is easy to forget that identification and classification are inherently a labeling process (attaching labels to physical objects, activities, and events). Such classifications may not actually be observable or possible: the link between observable features and classes may not be feasible with any technique.
