
Soft Computing Methods

J.A. Johnson, Dept. of Math and Computer Science Seminar Series, February 8, 2013.

Presentation Transcript


  1. Soft Computing Methods J.A. Johnson Dept. of Math and Computer Science Seminar Series February 8, 2013

  2. Outline • Fuzzy Sets • Neural Nets • Rough Sets • Bayesian Nets • Genetic Algorithms

  3. Fuzzy sets • Fuzzy set theory is a means of specifying how well an object satisfies a vague description. • A fuzzy set can be defined as a set with fuzzy boundaries. • Fuzzy sets were first introduced by Zadeh (1965).

  4. How do we represent a fuzzy set in a computer? First, the membership function must be determined.

  5. Example • Consider the proposition "Nate is tall." • Is the proposition true if Nate is 5' 10"? • The linguistic term "tall" does not refer to a sharp demarcation of objects into two classes—there are degrees of tallness.

  6. Fuzzy set theory treats Tall as a fuzzy predicate and says that the truth value of Tall(Nate) is a number between 0 and 1, rather than being either true or false.

  7. Let A denote the fuzzy set of all tall employees and x be a member of the universe X of all employees. What would the function μA(x) look like?

  8. μA(x) = 1 if x is definitely tall • μA(x) = 0 if x is definitely not tall • 0 < μA(x) < 1 for borderline cases
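A membership function of this kind can be sketched as a small piecewise-linear Python function. The breakpoints (170 cm and 190 cm) are illustrative assumptions, not values from the slides:

```python
def mu_tall(height_cm):
    """Piecewise-linear membership function for the fuzzy set Tall.

    The breakpoints 170 cm and 190 cm are illustrative choices.
    """
    if height_cm <= 170:
        return 0.0          # definitely not tall
    if height_cm >= 190:
        return 1.0          # definitely tall
    # borderline case: degree rises linearly from 0 to 1
    return (height_cm - 170) / (190 - 170)
```

With these assumed breakpoints, Nate at 5' 10" (about 177.8 cm) would be tall to degree roughly 0.39 rather than flatly "tall" or "not tall".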

  9. Classical Set • Fuzzy Set

  10. Standard fuzzy set operations • Complement: cA(x) = 1 − A(x) • Intersection: (A ∩ B)(x) = min[A(x), B(x)] • Union: (A ∪ B)(x) = max[A(x), B(x)]
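The three standard operations can be sketched element-wise over a small universe. The membership degrees below are made-up illustration values, not data from the slides:

```python
# Membership degrees over a tiny universe of employees (assumed values)
A = {"ann": 0.2, "bob": 0.8, "cal": 1.0}   # "tall"
B = {"ann": 0.6, "bob": 0.3, "cal": 0.9}   # "experienced"

# Standard complement, intersection (min) and union (max)
complement_A = {x: 1 - A[x] for x in A}
intersection = {x: min(A[x], B[x]) for x in A}
union        = {x: max(A[x], B[x]) for x in A}
```

Note that min and max reduce to ordinary set intersection and union when all degrees are 0 or 1.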

  11. Linguistic variables and hedges • The range of possible values of a linguistic variable represents the universe of discourse of that variable. • A linguistic variable carries with it the concept of fuzzy set qualifiers, called hedges. Hedges are terms that modify the shape of fuzzy sets.

  12. For instance, the qualifier "very" performs concentration and creates a new, narrower subset (very, extremely). • The opposite operation, dilation, expands the set (more or less, somewhat).

  13. Representation of hedges

  14. Representation of hedges [Table: Hedge | Mathematical Expression | Graphical representation]

  15. Fuzzy logic is not logic that is fuzzy, but logic that is used to describe fuzziness. • Fuzzy logic deals with degrees of truth.

  16. Building a Fuzzy Expert System • Specify the problem and define linguistic variables. • Determine fuzzy sets. • Elicit and construct fuzzy rules. • Perform fuzzy inference. • Evaluate and tune the system.
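The five steps above can be sketched as a minimal single-input controller. All membership shapes, the rule choices, and the Sugeno-style weighted-average defuzzifier are illustrative assumptions, not details from the slides:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b (assumed shape)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Steps 1-2: linguistic variable "temperature" (deg C) with two fuzzy sets
def cool(t): return tri(t, 0, 15, 25)
def hot(t):  return tri(t, 20, 35, 50)

# Step 3: fuzzy rules, each pairing a firing strength with an output level
#   IF temperature is cool THEN fan speed is slow (20%)
#   IF temperature is hot  THEN fan speed is fast (90%)
def infer(t):
    rules = [(cool(t), 20.0), (hot(t), 90.0)]
    # Step 4: weighted average of rule outputs as a simple defuzzifier
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Step 5 (evaluate and tune) would then mean adjusting the triangle breakpoints and rule outputs until the controller behaves acceptably.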

  17. References [1] Artificial Intelligence: A Guide to Intelligent Systems, 2nd Edition, by Michael Negnevitsky [2] An Introduction to Fuzzy Sets by Witold Pedrycz and Fernando Gomide [3] Fuzzy Sets and Fuzzy Logic: Theory and Applications by George J. Klir and Bo Yuan [4] Elementary Fuzzy Matrix Theory and Fuzzy Models for Social Scientists by W. B. Vasantha Kandasamy [5] Wikipedia: http://en.wikipedia.org/wiki/Fuzzy_logic [6] Wikipedia: http://en.wikipedia.org/wiki/Fuzzy

  18. References • http://www.softcomputing.net/fuzzy_chapter.pdf • http://www.cs.cmu.edu/Groups/AI/html/faqs/ai/fuzzy/part1/faq-doc-18.html • http://www.mv.helsinki.fi/home/niskanen/zimmermann_review.pdf • http://sawaal.ibibo.com/computers-and-technology/what-limits-fuzzy-logic-241157.html • http://my.safaribooksonline.com/book/software-engineering-and-development/9780763776473/fuzzy-logic/limitations_of_fuzzy_systems#X2ludGVybmFsX0ZsYXNoUmVhZGVyP3htbGlkPTk3ODA3NjM3NzY0NzMvMTUy

  19. Thanks to • Ding Xu • Edwige Nounang Ngnadjo For help with researching content and preparation of overheads on Fuzzy Sets

  20. Artificial Neural Networks • Neuron: the basic information-processing unit

  21. Single neural network: basic information-processing units

  22. Single neural network

  23. Activation functions • The Step and Sign activation functions, also named hard-limit functions, are mostly used in decision-making neurons. • The Sigmoid function transforms the input, which can have any value between plus and minus infinity, into a reasonable value in the range between 0 and 1. Neurons with this function are used in back-propagation networks. • The Linear activation function provides an output equal to the neuron's weighted input. Neurons with the linear function are often used for linear approximation.
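The four activation functions described above can be sketched directly (the code itself is an illustration; the mathematical forms are the standard ones):

```python
import math

def step(x):
    """Hard limiter: fires 1 when the weighted input reaches 0."""
    return 1 if x >= 0 else 0

def sign(x):
    """Symmetric hard limiter: outputs +1 or -1."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """Squashes any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def linear(x):
    """Output equals the weighted input unchanged."""
    return x
```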

  24. How the machine learns: Perceptron (Neuron + Weight training)

  25. The Algorithm of a single neural network • Step 1: Initialization Set initial weights w1, w2, . . . , wn and threshold to random numbers in the range [-0.5, 0.5]. • Step 2: Activation • Step 3: Weight training • Step 4: Iteration Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
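The four steps can be sketched as a single-neuron perceptron trained on the logical AND function. The learning rate, fixed threshold, and random seed below are illustrative choices, not values from the slides:

```python
import random

def train_perceptron(data, lr=0.1, threshold=0.2, epochs=100, seed=0):
    """Single-neuron perceptron following Steps 1-4 above.

    data: list of (inputs, desired_output) pairs.
    """
    rng = random.Random(seed)
    n = len(data[0][0])
    # Step 1: initial weights in [-0.5, 0.5]
    w = [rng.uniform(-0.5, 0.5) for _ in range(n)]
    for _ in range(epochs):                          # Step 4: iterate
        converged = True
        for x, desired in data:
            # Step 2: activation via the step function on the weighted sum
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= threshold else 0
            error = desired - y
            if error:
                converged = False
                # Step 3: weight training (delta rule)
                w = [wi + lr * error * xi for wi, xi in zip(w, x)]
        if converged:
            break
    return w

# Learn the logical AND function
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(and_data)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop terminates with weights that classify all four cases correctly.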

  26. How the machine learns Weight training

  27. The design of my program

  28. The design of my program

  29. The result

  30. Problem

  31. Multilayer neural network

  32. References 1. http://pages.cs.wisc.edu/~bolo/shipyard/neural/local.html 2. Stuart J. Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. Prentice Hall, 2009. 3. http://www.roguewave.com/Portals/0/products/imsl-numerical-libraries/c-library/docs/6.0/stat/default.htm?turl=multilayerfeedforwardneuralnetworks.htm 4. Lynne E. Parker. Notes on Multilayer, Feedforward Neural Networks. 5. http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html#Why use neural networks

  33. Thanks to • Hongming (Homer) Zuo • Danni Ren For help with researching content and preparation of overheads on Neural Nets

  34. Rough Sets • Introduced by Zdzislaw Pawlak in the early 1980s. • Formal framework for the automated transformation of data into knowledge. • Simplifies the search for dominant attributes in an inconsistent information table, leading to derivation of shorter if-then rules.

  35. Inconsistent Information Table

  36. Certain rules for examples are: (Temperature, normal) → (Flu, no), (Headache, yes) and (Temperature, high) → (Flu, yes), (Headache, yes) and (Temperature, very_high) → (Flu, yes). Uncertain (or possible) rules are: (Headache, no) → (Flu, no), (Temperature, high) → (Flu, yes), (Temperature, very_high) → (Flu, yes).

  37. Strength of a Rule • Weights • Coverage = (# elements covered by rule) / (# elements in universe) • Support = (# positive elements covered by rule) / (# elements in universe) • Degree of certainty = (support / coverage) × 100
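These three measures can be sketched directly from their definitions. The example table of 8 elements, with a rule covering 4 of them (3 positively), is a made-up illustration:

```python
def rule_strength(universe, covered, positive):
    """Coverage, support and degree of certainty for one rule.

    universe: all examples; covered: examples matching the rule's
    condition; positive: covered examples where the conclusion also holds.
    """
    coverage = len(covered) / len(universe)
    support = len(positive) / len(universe)
    certainty = support / coverage * 100   # percent
    return coverage, support, certainty

# Hypothetical data: 8 examples, rule covers 4, of which 3 are positive
universe = list(range(8))
covered = [0, 1, 2, 3]
positive = [0, 1, 2]
cov, sup, cert = rule_strength(universe, covered, positive)
```

Here the rule covers half the universe, is supported by 3/8 of it, and is therefore 75% certain.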

  38. Attribute Reduction • Which are the dominant attributes? • How do we determine redundant attributes?

  39. Indiscernibility Classes • An indiscernibility class, with respect to a set of attributes X, is defined as a set of examples all of whose values for attributes x ∈ X agree • For example, the indiscernibility classes with respect to attributes X = {Headache, Temperature} are {e1}, {e2}, {e3}, {e4}, {e5, e7} and {e6, e8}
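Computing indiscernibility classes is a simple grouping operation. The attribute values in the table below are assumptions chosen so that the resulting classes match the slide's example ({e5, e7} and {e6, e8} merge); the slides' actual table is not reproduced here:

```python
def indiscernibility_classes(table, attrs):
    """Partition examples into classes that agree on all attrs."""
    classes = {}
    for name, row in table.items():
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return sorted(classes.values(), key=lambda s: sorted(s))

# Assumed flu table in the spirit of the slides
table = {
    "e1": {"Headache": "yes", "Temperature": "normal"},
    "e2": {"Headache": "yes", "Temperature": "high"},
    "e3": {"Headache": "yes", "Temperature": "very_high"},
    "e4": {"Headache": "no",  "Temperature": "normal"},
    "e5": {"Headache": "no",  "Temperature": "high"},
    "e6": {"Headache": "no",  "Temperature": "very_high"},
    "e7": {"Headache": "no",  "Temperature": "high"},
    "e8": {"Headache": "no",  "Temperature": "very_high"},
}
classes = indiscernibility_classes(table, ["Headache", "Temperature"])
```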

  40. A rough set is defined by a lower approximation and an upper approximation • The lower approximation is the union of all indiscernibility classes [x] with [x] ⊆ X • The upper approximation is the union of all indiscernibility classes [x] with [x] ∩ X ≠ ∅
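The two approximations can be sketched directly from these definitions. The classes below are those from the earlier example; the target set X is an assumed illustration:

```python
def approximations(classes, X):
    """Lower/upper approximation of X given indiscernibility classes."""
    lower, upper = set(), set()
    for c in classes:
        if c <= X:      # class fully contained in X: certainly in X
            lower |= c
        if c & X:       # class overlaps X: possibly in X
            upper |= c
    return lower, upper

# Indiscernibility classes from the earlier example; X is assumed
classes = [{"e1"}, {"e2"}, {"e3"}, {"e4"}, {"e5", "e7"}, {"e6", "e8"}]
X = {"e2", "e3", "e5"}
lower, upper = approximations(classes, X)
```

The class {e5, e7} straddles the boundary of X, so e5 and e7 appear in the upper approximation but not the lower one; the gap between the two approximations is exactly the "rough" boundary region.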

  41. [Figure: lower and upper approximations of set X, showing examples e1–e8 distributed across the lower approximation, the boundary region, and the complement]

  42. If the indiscernibility classes with and without attribute A are identical then attribute A is redundant.
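This redundancy test can be sketched by comparing partitions with and without the attribute. The toy table is a made-up illustration in which Color duplicates the information carried by Size:

```python
def partition(table, attrs):
    """Indiscernibility classes as a set of frozensets."""
    classes = {}
    for name, row in table.items():
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(name)
    return {frozenset(c) for c in classes.values()}

def is_redundant(table, attrs, a):
    """Attribute a is redundant if dropping it leaves the partition unchanged."""
    rest = [x for x in attrs if x != a]
    return partition(table, attrs) == partition(table, rest)

# Toy table (assumed data): Color always mirrors Size
table = {
    "e1": {"Size": "small", "Color": "red",  "Shape": "round"},
    "e2": {"Size": "small", "Color": "red",  "Shape": "square"},
    "e3": {"Size": "big",   "Color": "blue", "Shape": "round"},
}
```

Dropping Color does not change the partition, so it is redundant here; dropping Shape merges e1 and e2, so Shape is not.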

  43. Inconsistent Information Table

  44. Inconsistent Information Table

  45. Set X

  46. Example: Identifying Edible Mushrooms with the ILA Algorithm

  47. Mushrooms

  48. Mushroom Dataset • The dataset contains 8124 entries of different mushrooms • Each entry (mushroom) has 22 different attributes

  49. 22 different attributes: Cap-shape, Cap-surface, Cap-color, Bruises, Odor, Gill-attachment, Gill-spacing, Gill-size, Gill-color, Stalk-shape, Stalk-root, Stalk-surface-above-ring, Stalk-surface-below-ring, Stalk-color-above-ring, Stalk-color-below-ring, Veil-type, Veil-color, Ring-number, Ring-type, Spore-print-color, Population, Habitat

  50. Values for Attributes • One of the attributes chosen is odor. All the possible values are: almond, anise, creosote, fishy, foul, musty, none, pungent, spicy
