
Neural Network Language NNL


Uploaded by aysha



Presentation Transcript


  1. Language Project: Neural Network Language (NNL)

  2. Neural Network • Neural networks have mass appeal • They simulate brain characteristics • Weighted connections between nodes

  3. Neural Network cont. • A neural network consists of four main parts: • 1. Processing units. • 2. Weighted interconnections between the various processing units which determine how the activation of one unit leads to input for another unit. • 3. An activation rule which acts on the set of input signals at a unit to produce a new output signal, or activation. • 4. Optionally, a learning rule that specifies how to adjust the weights for a given input/output pair.
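The four parts above can be sketched in plain Python. Everything here (the names, the clamp activation, the simple learning rule) is an illustrative assumption, not the NNL engine:

```python
# Sketch of the four parts of a neural network; names are assumptions.

# 1. Processing units: each unit holds an activation value.
units = {"n1": 0.0, "n2": 0.0}

# 2. Weighted interconnections: (source, target) -> weight.
weights = {("n1", "n2"): 0.5}

# 3. Activation rule: maps a unit's summed input to a new output signal.
def activation(x):
    return max(0.0, min(1.0, x))  # clamp the signal to [0, 1]

# 4. Learning rule: adjust a weight to reduce the error for a given pair.
def learn(w, error, rate=0.1):
    return w - rate * error
```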

  4. Learning and predicting • The learning method is back propagation • Error is calculated at the result nodes • This error is then fed backward through the net • Weight adjustments reduce the error • We compare the result nodes against the output node (the last node); the difference is the error
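This learning loop can be sketched for a single neuron, assuming a simple delta rule; the learning rate and names are illustrative, not taken from the project:

```python
# Single-neuron sketch of the slide's steps: predict, measure the error
# at the result node, feed it backward as weight adjustments.
def predict(weights, inputs):
    return sum(w * x for w, x in zip(weights, inputs))

def train_step(weights, inputs, target, rate=0.1):
    output = predict(weights, inputs)
    error = target - output            # difference at the result node
    # feed the error backward: nudge each weight to reduce it
    return [w + rate * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
for _ in range(50):
    weights = train_step(weights, [1.0, 1.0], 1.0)
```

After repeated steps the output converges toward the target, which is the error-reduction behavior the slide describes.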

  5. Predicting • Predicting is a summation of weights • Each dendrite has an input value • Input times weight gives a value • All values summed give the total activation
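The prediction step can be sketched as follows; the `Dendrite` class is a hypothetical stand-in for NNL's dendrite element:

```python
# Each dendrite carries an input value and a weight; input times weight,
# summed over all dendrites, gives the total activation.
class Dendrite:
    def __init__(self, value, weight):
        self.value = value
        self.weight = weight

def total_activation(dendrites):
    return sum(d.value * d.weight for d in dendrites)

# e.g. two dendrites feeding one soma
acts = total_activation([Dendrite(1.0, 0.5), Dendrite(2.0, 0.25)])
```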

  6. Parameters • Internal parameters • Learning rate • Threshold • Dendrite initial value • Value • Rand • Activation formula • Triangle • Logistic • Linear
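The three activation formulas can be sketched as Python functions. NNL's exact parameterisations are not given here, so the triangle width and the standard logistic shape below are assumptions:

```python
import math

def triangle(x, width=1.0):
    # peaks at 1 when x == 0, falls linearly to 0 at |x| == width
    return max(0.0, 1.0 - abs(x) / width)

def logistic(x):
    # standard logistic (sigmoid) function
    return 1.0 / (1.0 + math.exp(-x))

def linear(x):
    # identity: output equals summed input
    return x
```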

  7. Elements • sense – Gets input from the file • Input is in sequence • dendrite – Connects neurons • Has a weight • Weight is a floating-point value from 0 to 1

  8. Elements cont. • soma – body of a neuron • synapse – used for connections; it determines which dendrite goes to which neuron • result – results are supplied to this node

  9. Parts of Language • The derived NN Engine consists of three parts • Framework initialization • Link necessary files • Topology implementation • NNL specific code • Processor • Process input using topology

  10. Properties • It's imperative • Keywords are not case sensitive • IDs are case sensitive • The last neuron of a layer has a one-to-one relationship with a result • We can implement two neural networks, give them the same sense values, and compare their different results • Readable • Writable • Functions are used without declarations • Functions can take any values • Errors are caught in semantics

  11. Input and Output • Network input • Format: in in in … in # out out … out • Error checking for the input file • Network output • The output file uses the same format • Empty outputs are populated with predictions
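Reading one record of this format might look like the following sketch; `parse_record` is a hypothetical helper, not the project's reader:

```python
# Parse one record of the form "in in ... in # out out ... out".
# An empty output side is allowed, to be populated with predictions.
def parse_record(line):
    left, sep, right = line.partition("#")
    if sep != "#":
        raise ValueError("record is missing the '#' separator")
    inputs = [float(tok) for tok in left.split()]
    outputs = [float(tok) for tok in right.split()]
    return inputs, outputs

ins, outs = parse_record("1 0 1 # 0.5")
```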

  12. Grammar Used
  • A -> O ; A | D ; A | N ; A | Y ; A | S ; A | R ; A | ε
  • O -> soma I F        -- body of a neuron
  • E -> dendrite E
  • I -> id
  • Y -> synapse I I     -- a connection
  • F -> function P
  • P -> ( P' )
  • P' -> Z , P' | Z
  • D -> dendrite I F    -- input to a neuron
  • N -> neuron E        -- a neuron composed of a soma and dendrites
  • S -> sense Z I       -- information is supplied to this node
  • R -> result Z id     -- results are supplied to this node
  • Z -> number
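As an illustration, a recursive-descent handler for the S -> sense Z I production might look like this sketch; the tokenizer regex and the returned dictionary shape are assumptions, not the project's actual parser:

```python
import re

# Crude tokenizer: identifiers, numbers, and semicolons.
TOKEN = re.compile(r"\s*([A-Za-z_]\w*|\d+|;)")

def tokenize(src):
    return TOKEN.findall(src)

def parse_sense(tokens):
    # S -> sense Z I
    if tokens[0].lower() != "sense":      # keywords are not case sensitive
        raise SyntaxError("expected 'sense'")
    number, ident = tokens[1], tokens[2]  # Z -> number, I -> id
    if not number.isdigit():
        raise SyntaxError("expected a number")
    return {"kind": "sense", "slot": int(number), "dendrite": ident}

node = parse_sense(tokenize("sense 1 d1;"))
```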

  13. Sample Code
  // create first neuron (Logistic and Triangle are functions)
  soma s1 Logistic(10, 2, 5);
  dendrite d1 Value(1);
  dendrite d2 Rand(1,2);
  neuron n1 s1 d1 d2;
  // create second neuron
  soma s2 Triangle(3);
  dendrite d3 Value(1);
  dendrite d4 Rand(1,2);
  neuron n2 s2 d3 d4;

  14. // create third neuron
  soma s3 Triangle(3);
  dendrite d5 Value(4);
  dendrite d6 Rand(1,2);
  neuron n3 s3 d5 d6;
  // connect neurons
  synapse n2 d2;
  synapse n3 d1;
  // input
  sense 1 d1;
  sense 2 d3;
  sense 3 d4;
  // out
  result 1 n1;

  15. Engine • Performs analysis on neurons • Detects the layer of each neuron • Orders neurons in a list by layer • Processes neurons • Cascades to get predictions • Performs back propagation • Input data • Streams data from the input file • Checks data for errors • Output data • Writes results
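The "detect layer" and "order by layer" steps can be sketched as a depth computation over the connection graph, assuming the network is acyclic; the representation below (neuron mapped to its upstream neurons) is an assumption:

```python
# A neuron's layer is one more than the deepest neuron feeding it;
# neurons with no upstream feeds sit at layer 0.
def layer_of(neuron, upstream, memo=None):
    memo = {} if memo is None else memo
    if neuron not in memo:
        feeds = upstream.get(neuron, [])
        memo[neuron] = 0 if not feeds else 1 + max(
            layer_of(n, upstream, memo) for n in feeds)
    return memo[neuron]

def order_by_layer(upstream):
    neurons = set(upstream) | {n for feeds in upstream.values() for n in feeds}
    return sorted(neurons, key=lambda n: layer_of(n, upstream))

# n1 is fed by n2 and n3, as in the sample code's synapses
order = order_by_layer({"n1": ["n2", "n3"]})
```

Processing neurons in this order lets predictions cascade forward layer by layer, and the reverse order serves back propagation.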

  16. Traversing neurons

  17. Compiler • Lexical Analyzer • Removes comments • Conditions the code • Tokenizes • Parser • Recursive descent • Does not do static semantic checks

  18. Compiler cont. • Semantics • Checks id declarations • Checks how ids are assembled • Code generation • Transforms NNL code into Java • Adds the engine

  19. System Interactions • Creating network • Using network

  20. Example • Detecting letters • Detect L and T in a 5-by-5 grid • Compare L and T: no false positives • Compare random results (noise rejection) • Topology • One-layer network • One output • Twenty-five inputs
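The twenty-five inputs could be produced by flattening the 5-by-5 grid row by row; the glyph and helper below are illustrative assumptions, not taken from the project:

```python
# An assumed rendering of the letter "L" on a 5-by-5 grid.
L_GRID = [
    "X....",
    "X....",
    "X....",
    "X....",
    "XXXXX",
]

def flatten(grid):
    # one input per cell: 1.0 for a filled cell, 0.0 for an empty one
    return [1.0 if ch == "X" else 0.0 for row in grid for ch in row]

inputs = flatten(L_GRID)  # 25 values, one per sense
```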

  21. Example cont.

  22. Future Enhancements • Persistent data for each NNL program • Easier training methods • Keyword to generate redundant declarations • Visual connection tool • Include parameter choices • High level semantics

  23. Questions
