Theory and Applications of GF(2^p) Cellular Automata


Presentation Transcript


  1. Theory and Applications of GF(2^p) Cellular Automata (LOGIC ON MEMORY) P. Pal Chaudhuri, Department of CST, Bengal Engineering College (DU), Shibpur, Howrah, India

  2. An Application Of LOGIC ON MEMORY

  3. Logic on Memory • Basic Concept • Classical Example • Content Addressable Memory • Content Addressable Processor [Figure: CAM cell with bit line, word line, and compare (Comp) output]

  4. Logic on Memory • Sub-micron era • Search • Storage of large tables and efficient search • Memory + CA • Efficient storage and search of data with a CA-based classifier

  5. Logic-on-Memory • Problem Definition • CA-Based Solution [Figure: logic on memory to implement a specific function: memory plus CA memory elements with XOR/XNOR logic]

  6. GF(2^p) CA as a Classifier • Classification: a universal problem • Given an input, fast search for the attribute of the input element • Uses a special class of CA: the non-group Multiple Attractor CA (MACA)

  7. Classifier • Design of a CA-based classifier • Given an input element Cij, the classifier outputs Ai, i.e., Cij belongs to class Ai • Implicit memory • Fast search • LOGIC ON MEMORY = memory (conventional and CA) + XOR logic (CA)

  8. Special Class of CA: Non-Group Multiple Attractor CA (MACA) [Figure: state-transition diagrams of a MACA and a depth-1 MACA (D1 MACA) over states 0-15; the attractors 0000, 0011, 0101, 0110 are labelled with pseudo-exhaustive bits 00, 01, 10, 11]

  9. Problem Definition • Given sets {P1}, {P2}, …, {Pn}, where each set {Pi} = {Xi1, Xi2, Xi3, …, Xim} • Given a randomly selected value Xkj • To answer the question: which class does Xkj belong to? [Figure: state-transition diagram of a 4-cell CA over states 0-15]

  10. Classifier • An n-bit CA with m attractors is a natural classifier • {0, 3, 5, 6} are the attractors • The inverted trees are the attractor basins [Figure: the attractor basins of the example CA]

  11. Classifier • Suppose we want to identify which class X = 7 lies in • The CA is loaded with X • The CA is run in autonomous mode for k (= 2) cycles, where k is the depth of the CA • The pseudo-exhaustive bits (10) of the attractor give the class of the pattern [Figure: X = 7 converging to its attractor; attractors 0000, 0011, 0101, 0110 with PE bits 00, 01, 10, 11]
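
To make the search procedure concrete, here is a minimal Python sketch of the idea, assuming a linear CA whose next state is T·x over GF(2). The 4 x 4 T matrix below is a hypothetical depth-1 example, not the CA drawn on the slide.

```python
# Minimal sketch of MACA-based classification over GF(2); the T matrix
# below is a hypothetical depth-1 example, not the CA from the slide.
import numpy as np

T = np.array([[1, 0, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 0]], dtype=np.uint8)   # idempotent: T.dot(T) % 2 == T

def run_ca(x, T, depth):
    """Run the CA autonomously for `depth` cycles; return the attractor."""
    state = np.array(x, dtype=np.uint8)
    for _ in range(depth):
        state = T.dot(state) % 2               # linear next-state function
    return state

attractor = run_ca([0, 1, 1, 1], T, depth=1)   # load X = 7 (binary 0111)
print(attractor)                               # -> [0 0 0 0], attractor 0000
class_id = (attractor[0], attractor[2])        # PE bits of this T: positions 0 and 2
```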

  12. Two-Class D1 Classifier • We use a depth-1 CA (D1 CA) • We construct a CA satisfying the following: 1. R1: for all x ∈ P1 and all y ∈ P2, T(x ⊕ y) ≠ 0 2. R2: T² = T, i.e., T(T ⊕ I) = 0 [Figure: state-transition diagram of a depth-1 MACA (D1 MACA)]
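
The two rules can be checked mechanically; here is an illustrative sketch (the T matrix and the class partition below are a hypothetical example, chosen to pass both rules):

```python
# Sketch: verify R1 and R2 for a candidate T over GF(2) (hypothetical example).
import numpy as np

def to_vec(v, n):
    """Integer -> n-bit vector, most significant bit first."""
    return np.array([(v >> i) & 1 for i in reversed(range(n))], dtype=np.uint8)

def satisfies_R2(T):
    """R2: T^2 = T over GF(2), i.e. T(T xor I) = 0 -> a depth-1 CA."""
    return np.array_equal(T.dot(T) % 2, T)

def satisfies_R1(T, P1, P2, n):
    """R1: T(x xor y) != 0 for every x in P1, y in P2, so no cross-class
    pair can land in the same attractor basin."""
    return all((T.dot(to_vec(x ^ y, n)) % 2).any() for x in P1 for y in P2)

# This T passes the LSB through: its kernel is the even numbers, giving
# two attractors (0 and 1) that separate even from odd patterns.
T = np.zeros((4, 4), dtype=np.uint8); T[3, 3] = 1
print(satisfies_R2(T))                                      # True
print(satisfies_R1(T, [0, 2, 12, 14], [3, 1, 13, 15], 4))   # True
```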

  13. Algorithm • Any CA satisfying R1 and R2 is a classifier for P = {{P1}, {P2}} • P1 = {0, 2, 12, 14} and P2 = {3, 1, 13, 15} • Each basin of the CA contains patterns from either P1 or P2 • 2 attractors [Figure: the two attractor basins, attractors 0000 and 0011]

  14. Algorithm • In general there will be 2^(n-r) attractors, where n is the size of the CA and r is the rank of the matrix T ⊕ I (the attractors are the solutions of (T ⊕ I)x = 0) • The attractors are distinguished by (n-r) pseudo-exhaustive (PE) bit positions • The two classes can then be identified by a single bit per attractor, stored in a 2^(n-r) x 1 bit memory or realized as a simple logic circuit
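
A sketch of this count and of the single-bit lookup, using the hypothetical T matrix from the earlier sketch (rank computation over GF(2) is my own helper, not from the deck):

```python
# Sketch: count attractors of a depth-1 CA as 2^(n-r), r = rank of (T xor I)
# over GF(2), then use the PE bits to address a 2^(n-r) x 1 class memory.
import numpy as np

def gf2_rank(M):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    M = M.copy() % 2
    rank = 0
    for c in range(M.shape[1]):
        pivot = next((i for i in range(rank, M.shape[0]) if M[i, c]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]    # bring the pivot row up
        for i in range(M.shape[0]):
            if i != rank and M[i, c]:
                M[i] ^= M[rank]                # eliminate the column
        rank += 1
    return rank

T = np.array([[1, 0, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 0]], dtype=np.uint8)
n = T.shape[0]
r = gf2_rank(T ^ np.eye(n, dtype=np.uint8))
print(2 ** (n - r))           # 4 attractors for this hypothetical T

class_memory = [0, 1, 0, 1]   # hypothetical: one class bit per PE-bit address
```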

  15. Multiclass Classifier • But what about a multiclass classifier? • A general CA-based solution does not exist • However, we can use hierarchical two-class classifiers to build a solution [Figure: the four-attractor example CA]

  16. Multiclass Classifier • Hierarchical two-class classifier • Built by partitioning the pattern set P • P = {P1, P2, P3, …, Pn} is split as {{P1, P2, P3, …, Pk}, {Pk+1, …, Pn}}, and a two-class classifier is found for this split • This is repeated for each subset • A query passes through log2(n) CAs, where n is the number of classes (sketched below)
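
The recursion can be sketched as follows; `design_two_class_ca` (a search for a T satisfying R1 and R2) and the classifier's `predict` method are assumed interfaces here, not part of the deck:

```python
# Sketch of the hierarchical two-class scheme. `design_two_class_ca` and
# the `predict` method are assumed interfaces, not from the slides.
def build_tree(classes, design_two_class_ca):
    """Recursively halve the class list; return a binary tree of CAs."""
    if len(classes) == 1:
        return classes[0]                      # leaf: a single class id
    mid = len(classes) // 2
    left, right = classes[:mid], classes[mid:]
    ca = design_two_class_ca(left, right)      # separates the two halves
    return (ca,
            build_tree(left, design_two_class_ca),
            build_tree(right, design_two_class_ca))

def classify(x, node):
    """Walk from the root to a leaf: log2(n) CA evaluations per query."""
    while isinstance(node, tuple):
        ca, left, right = node
        node = left if ca.predict(x) == 0 else right
    return node
```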

  17. Multiclass Classifier • The classes are: • P1 = {0, 2, 12, 14} • P2 = {3, 1, 13, 15} • P3 = {5, 7, 9, 11} • P4 = {6, 4, 8, 10} [Figure: the four classes mapped onto the attractor basins]

  18. Multiclass Classifier • Initially we build a two-class classifier to separate Temp0 = {P1, P2} from Temp1 = {P3, P4} • Then two more classifiers distinguish P1 from P2 and P3 from P4 [Figure: the four basins grouped into Temp0 and Temp1]

  19. General Multiclass Classifier [Figure: binary tree of two-class classifiers; the root separates Temp0 from Temp1, the next level Temp00, Temp01, Temp10, Temp11, and so on down to the leaf classes P1 … Pn; a query traverses log2(n) CAs]

  20. Multiclass Classifier in GF(2^p) • Handles class elements that are symbol strings rather than bit strings • A T matrix satisfying R1 and R2 is obtained efficiently using BDDs in GF(2) • In GF(2^p) we have introduced certain heuristics to obtain a solution T matrix in reasonable time

  21. Application Areas • Fast encoding in vector quantization of images • Fault diagnosis

  22. Image Compression • Target pictures: portraits and similar images • Image size: 352 x 240 (CCIR size) • Target compression ratio: 97.5 % - 99 % • Target PSNR: 25 - 30 dB • Target application: low bit rate coding for video telephony
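
For reference, the PSNR figures quoted here follow the standard definition for 8-bit images; a minimal helper (standard formula, not specific to these slides):

```python
# Standard PSNR for 8-bit greyscale images (reference helper, not from the deck).
import numpy as np

def psnr(original, coded):
    """Peak signal-to-noise ratio in dB between two uint8 images."""
    mse = np.mean((original.astype(np.float64) - coded.astype(np.float64)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)
```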

  23. Algorithm: Training Images • Used a training set of 12 pictures of a similar nature • The images were partitioned into 8 x 8 blocks • These 8 x 8 blocks are clustered around 8192 pivot points using the standard LBG algorithm (see the sketch below) [Figure: training images partitioned into blocks B1 … Bn]
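
A sketch of this training step, using scikit-learn's k-means as a stand-in for LBG (the two algorithms are closely related); `training_images` is assumed to be a list of greyscale uint8 arrays loaded elsewhere:

```python
# Sketch of codebook training; k-means stands in for the LBG algorithm.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def blocks_8x8(img):
    """Split a greyscale image into flattened, non-overlapping 8x8 blocks."""
    h, w = img.shape
    return np.array([img[r:r + 8, c:c + 8].ravel()
                     for r in range(0, h - 7, 8)
                     for c in range(0, w - 7, 8)])

# training_images: assumed list of greyscale uint8 arrays (the 12 pictures).
training = np.vstack([blocks_8x8(img) for img in training_images])
codebook = MiniBatchKMeans(n_clusters=8192).fit(training).cluster_centers_
```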

  24. Algorithm: Codebook • The elements are GF(2^p) symbol strings of length 64 (one 8 x 8 pixel block) • Therefore we have 8192 clusters • These can be addressed using 13 bits • A multiclass classifier is designed for these 8192 classes • The depth of this classifier is 13 [Figure: codebook of clusters C1 … C8192 with their pivot points]
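
The 13-bit index also explains the compression target: replacing a 512-bit block (8 x 8 pixels at 8 bits each) with a 13-bit class id gives 1 - 13/512 ≈ 97.5 % (my arithmetic, not stated on the slide):

```python
# Back-of-the-envelope check of the 97.5 % target (my own arithmetic).
bits_in = 8 * 8 * 8            # one 8x8 block at 8 bits per pixel
bits_out = 13                  # index into the 8192-entry codebook
print(1 - bits_out / bits_in)  # 0.9746... ~= the 97.5 % target ratio
```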

  25. Algorithm: Image Block Classifier • The target image to be coded is divided into 8 x 8 blocks • Each of these blocks is input to the multiclass classifier • The multiclass classifier outputs the class id of the block • This takes effectively 13 clock cycles plus some memory access time • Encoding time is thus drastically reduced (see the sketch below)
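
The encoding loop implied here, sketched with the `blocks_8x8` helper from the training sketch; the classifier's `classify` method is an assumed interface:

```python
# Sketch of the encode/decode loop; `classifier.classify` is assumed,
# `blocks_8x8` is the helper defined in the training sketch above.
import numpy as np

def encode(img, classifier):
    """Map each 8x8 block to its 13-bit class id."""
    return [classifier.classify(block) for block in blocks_8x8(img)]

def decode(ids, codebook, h, w):
    """Rebuild the image from the pivot blocks in the codebook."""
    out = np.zeros((h, w), dtype=np.uint8)
    it = iter(ids)
    for r in range(0, h - 7, 8):
        for c in range(0, w - 7, 8):
            out[r:r + 8, c:c + 8] = codebook[next(it)].reshape(8, 8).astype(np.uint8)
    return out
```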

  26. Algorithm [Figure: overall flow: training images are split into blocks B1 … Bn, clustered into the codebook of pivot points C1 … C8192, and image blocks are then mapped to clusters by the classifier]

  27. Sample Results

  28. Sample Images • PSNR 27.8 dB • Compression ratio 97.5 %

  29. Sample Images • PSNR 25.1 dB, compression ratio 97.5 % • PSNR 28.5 dB, compression ratio 97.5 %

  30. Schematic of a CA Based Vector Quantizer [Figure: CA configuration memory feeding the CA; the PE bits pass through a controller and a shift register to the output]

  31. Hardware Design for CA Based Vector Quantizer

  32. Improvements Over the Basic Scheme • A hierarchical encoder has been implemented • The image is first encoded using 16 x 16 blocks • If a match cannot be obtained with any of the classes in the training set, a match with 8 x 8 blocks is tried (sketched below) • This pushes the compression ratio up to 99 %
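
One way to realize the fallback, as a sketch; the two classifiers and the distortion threshold are assumptions, not taken from the deck:

```python
# Sketch of the hierarchical 16x16 -> 8x8 fallback; names are illustrative.
def encode_block(img, r, c, classify16, classify8, threshold):
    """Try a 16x16 match first; fall back to four 8x8 class ids."""
    cid, err = classify16(img[r:r + 16, c:c + 16])   # (class id, distortion)
    if err <= threshold:
        return ('16', cid)                           # one index covers 16x16
    return ('8', [classify8(img[i:i + 8, j:j + 8])   # four 8x8 indices
                  for i in (r, r + 8) for j in (c, c + 8)])
```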

  33. Dynamic Classification • Static database: the solution assumes the target pattern is present in the cluster set • If a new pattern outside this range is input, the classifier indicates "no entry in the database" • So a linked queue of these new blocks is maintained • At periodic intervals a new multiclass classifier is built, after incorporating the queued blocks into the appropriate classes (sketched below)
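
The update cycle might look like the following sketch; the classifier interface and the class-assignment step are assumptions:

```python
# Sketch of the dynamic update: queue rejected blocks, retrain periodically.
from collections import deque

pending = deque()                      # linked queue of unseen blocks

def classify_or_queue(block, classifier):
    cid = classifier.classify(block)   # assumed to return None on "no entry"
    if cid is None:
        pending.append(block)          # remember the new pattern
    return cid

def periodic_rebuild(classifier, assign_class):
    """Fold queued blocks into their classes, then redesign the CA."""
    while pending:
        assign_class(pending.popleft())
    classifier.rebuild()               # design a fresh multiclass classifier
```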

  34. Thank You
