
1 Theory and Applications of GF(2^p) Cellular Automata. P. Pal Chaudhuri, Department of CST, Bengal Engineering College (DU), Shibpur, Howrah, India. (LOGIC ON MEMORY)

2 An Application of LOGIC ON MEMORY

3 Logic on Memory: Basic Concept. Classical examples: Content Addressable Memory, Content Addressable Processor. [Figure: CAM cell with comparator (=), bit line, and word line]

4 Logic on Memory in the sub-micron era: storage of large tables and efficient search. Memory + CA: efficient storage and search of data with a CA-based classifier.

5 Logic on Memory: Problem Definition and CA-Based Solution. [Figure: memory element combined with XOR/XNOR logic, i.e. logic on memory implementing a specific function]

6 GF(2^p) CA as a Classifier. Classification is a universal problem: given an input element, quickly search for its attribute. This uses a special class of CA, the non-group Multiple Attractor CA (MACA).

7 Classifier. Design of a CA-based classifier: given an input element C_ij, the classifier outputs A_i, i.e. C_ij belongs to class A_i. Implicit memory and fast search: LOGIC ON MEMORY = memory (conventional and CA) + (CA) XOR logic.

8 Special Class of CA: the non-group Multiple Attractor CA (MACA). [Figure: state-transition diagrams of depth-1 MACAs (D1 MACA); attractors 0000, 0011, 0101, 0110 with pseudo-exhaustive bits 00, 01, 10, 11, and their attractor basins]

9 Problem Definition. Given sets {P1}, {P2}, ..., {Pn}, where each set {Pi} = {X_i1, X_i2, X_i3, ..., X_im}, and a randomly selected value X_kj, answer the question: which class does X_kj belong to? [Figure: attractor basins over the 16 states 0-15]

10 Classifier. An n-bit CA with M attractors is a natural classifier. Here {0, 3, 5, 6} are the attractors, and the inverted trees are the attractor basins. [Figure: four attractor basins over states 0-15]

11 Classifier. Suppose we want to identify which class X = 7 lies in. The CA is loaded with X and run in autonomous mode for k (= 2) cycles, where k is the depth of the CA. The pseudo-exhaustive bits of the reached attractor (here 10) give the class of the pattern. [Figure: 7 falls in the basin of attractor 0101, whose pseudo-exhaustive bits are 10]
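The classification step above (load X, clock the CA for k cycles, read the pseudo-exhaustive bits) can be sketched for any linear CA over GF(2), where one autonomous step is multiplication by the T matrix mod 2. The T matrix below is a toy idempotent projection chosen for illustration, not the CA from the slides.

```python
import numpy as np

def ca_step(T, state):
    """One autonomous step of a linear CA over GF(2): state' = T . state (mod 2)."""
    return T.dot(state) % 2

def classify(T, x_bits, depth, pe_positions):
    """Load the pattern, run `depth` cycles, read the PE bits of the attractor."""
    s = np.array(x_bits, dtype=int)
    for _ in range(depth):
        s = ca_step(T, s)
    return tuple(int(b) for b in s[pe_positions])

# Toy depth-1 MACA: projection onto the first two cells (T^2 = T over GF(2)),
# so the pseudo-exhaustive bits are simply cell positions 0 and 1.
T = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
print(classify(T, [1, 1, 1, 0], depth=1, pe_positions=[0, 1]))  # -> (1, 1)
```

Running the CA for its full depth guarantees the state has settled on an attractor, so the PE bits read out are stable.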

12 Two-Class D1 Classifier. We use a depth-1 CA (D1 CA) and construct a CA satisfying the following: R1: for all x in P1 and all y in P2, T(x XOR y) != 0. R2: T^2 = T, i.e. T(T XOR I) = 0 over GF(2). [Figure: depth-1 MACA (D1 MACA) with attractors 0000 and 0011]
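Conditions R1 and R2 are directly checkable for a candidate T matrix. A minimal sketch, using the same toy projection matrix as above and hypothetical class sets chosen to be separable by it:

```python
import numpy as np
from itertools import product

def to_bits(v, n):
    """Integer pattern -> length-n bit vector, most significant bit first."""
    return np.array([(v >> (n - 1 - i)) & 1 for i in range(n)])

def satisfies_R2(T):
    """R2: T is idempotent over GF(2), i.e. T^2 = T (a depth-1 MACA)."""
    return np.array_equal(T.dot(T) % 2, T % 2)

def satisfies_R1(T, P1, P2, n):
    """R1: for every x in P1 and y in P2, T(x XOR y) != 0, so no two
    patterns from different classes can fall into the same basin."""
    for x, y in product(P1, P2):
        if not (T.dot(to_bits(x ^ y, n)) % 2).any():
            return False
    return True

T = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
P1, P2 = [0, 2, 3], [4, 6, 7]   # hypothetical classes, separable by this T
print(satisfies_R2(T), satisfies_R1(T, P1, P2, 4))  # -> True True
```

R1 works because two patterns x and y land in the same basin exactly when T maps their difference x XOR y to zero; forbidding that for all cross-class pairs keeps the basins class-pure.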

13 Algorithm. Any CA satisfying R1 and R2 is a classifier for P = {{P1}, {P2}}. Example: P1 = {0, 2, 12, 14} and P2 = {1, 3, 13, 15}. Each basin of the CA contains patterns from either P1 or P2, and there are 2 attractors. [Figure: two attractor basins with attractors 0000 and 0011]

14 Algorithm. In general there will be 2^(n-r) attractors (n = size of the CA, r = rank of the T matrix), with pseudo-exhaustive bits at certain (n-r) positions. The two classes can then be identified by a single bit stored in a 2^(n-r) x 1-bit memory, or by a simple logic circuit.
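Both quantities in the statement above are easy to compute for a small example: the rank of T over GF(2) by Gaussian elimination, and the point attractors of a depth-1 MACA by enumerating the fixed points Tx = x. The T matrix is the same toy projection used earlier, not a CA from the slides.

```python
import numpy as np
from itertools import product

def gf2_rank(M):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    M = (M % 2).astype(int)
    rank, (rows, cols) = 0, M.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, c]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]   # swap pivot row into place
        for r in range(rows):
            if r != rank and M[r, c]:
                M[r] ^= M[rank]               # eliminate column c elsewhere
        rank += 1
    return rank

def attractors(T):
    """Enumerate the fixed points (point attractors) of a depth-1 MACA."""
    n = T.shape[0]
    return [bits for bits in product([0, 1], repeat=n)
            if np.array_equal(T.dot(np.array(bits)) % 2, np.array(bits))]

T = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0],
              [0, 0, 0, 0]])
print(gf2_rank(T), len(attractors(T)))   # -> 2 4
```

For this rank-2 projection on a 4-cell CA there are 4 attractors, each with a basin of 4 states.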

15 Multiclass Classifier. But what about a multiclass classifier? A general CA-based solution does not exist; however, we can use a hierarchy of two-class classifiers to build one. [Figure: attractor basins of the two component CAs]

16 Multiclass Classifier. The hierarchical two-class classifier is built by partitioning the pattern set P = {P1, P2, P3, ..., Pn} as {{P1, P2, P3, ..., Pk}, {Pk+1, ..., Pn}} and finding a two-class classifier for this partition. This is repeated for each subset. The number of CAs required is log2 n, where n is the number of classes.
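The recursive partitioning can be sketched as a binary tree in which each internal node stands in for one two-class CA classifier; descending the tree takes one two-class decision per level, i.e. log2 n decisions for n classes. Membership tests here replace the actual CA, so this only models the control structure.

```python
def build_tree(classes):
    """classes: list of (label, pattern_set) pairs. Each internal node stands
    in for one two-class CA; n classes need about log2(n) levels of CAs."""
    if len(classes) == 1:
        return classes[0][0]                      # leaf: a class label
    mid = len(classes) // 2
    left, right = classes[:mid], classes[mid:]
    left_patterns = set().union(*(s for _, s in left))
    return (left_patterns, build_tree(left), build_tree(right))

def classify_pattern(node, x):
    """Descend the tree, one two-class decision per level."""
    while isinstance(node, tuple):
        left_patterns, l, r = node
        node = l if x in left_patterns else r
    return node

# The four classes from the slides' running example.
classes = [("P1", {0, 2, 12, 14}), ("P2", {1, 3, 13, 15}),
           ("P3", {5, 7, 9, 11}),  ("P4", {4, 6, 8, 10})]
tree = build_tree(classes)
print(classify_pattern(tree, 7))   # -> P3
```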

17 Multiclass Classifier. The classes are P1 = {0, 2, 12, 14}, P2 = {1, 3, 13, 15}, P3 = {5, 7, 9, 11}, P4 = {4, 6, 8, 10}. [Figure: attractor basins covering the four classes]

18 Multiclass Classifier. Initially we build a two-class classifier to separate Temp 0 = {P1, P2} from Temp 1 = {P3, P4}. Then two more classifiers identify P1 vs. P2 and P3 vs. P4. [Figure: Temp 0 and Temp 1 halves of the state space]

19 General Multiclass Classifier. [Figure: binary tree of two-class classifiers Temp 0, Temp 1, Temp 00, Temp 11, ..., Temp lm over classes P1 ... Pn, using log2 n CAs]

20 Multiclass Classifier in GF(2^p). This handles class elements that are symbol strings rather than bit strings. In GF(2), a T matrix satisfying R1 and R2 is efficiently obtained using BDDs. In GF(2^p) we have introduced certain heuristics to obtain a solution T matrix reasonably fast.

21 Application Areas Fast encoding in vector quantization of images Fault diagnosis

22 Image Compression. Target pictures: portraits and similar images. Image size: 352 x 240 (CCIR size). Target compression ratio: 97.5%-99%. Target PSNR: 25-30 dB. Target application: low bit-rate coding for video telephony.

23 Algorithm. We used a training set of 12 pictures of a similar nature. The images were partitioned into 8 x 8 blocks, and these blocks were clustered around 8192 pivot points using the standard LBG algorithm. [Figure: training images partitioned into blocks B1 ... Bn]
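The clustering step can be sketched with a plain k-means loop standing in for the LBG algorithm (LBG is a codebook-splitting refinement of the same centroid update; this toy version, with a tiny block count and hypothetical data, only illustrates the assign/update structure):

```python
import numpy as np

def lbg_like_clustering(blocks, k, iters=10, seed=0):
    """Toy k-means in place of LBG codebook training. `blocks` is an
    (N, 64) array of flattened 8x8 blocks; returns k pivot points."""
    rng = np.random.default_rng(seed)
    pivots = blocks[rng.choice(len(blocks), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each block to its nearest pivot (squared Euclidean distance)
        d = ((blocks[:, None, :] - pivots[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # move each non-empty pivot to the mean of its cluster
        for j in range(k):
            if (labels == j).any():
                pivots[j] = blocks[labels == j].mean(axis=0)
    return pivots, labels

# Two well-separated groups of fake blocks: all-black and all-white.
blocks = np.vstack([np.zeros((5, 64)), np.full((5, 64), 255.0)])
pivots, labels = lbg_like_clustering(blocks, k=2)
```

In the slides' setting N is large, k = 8192, and each pivot point becomes one codebook entry.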

24 Algorithm. The elements are length-64 GF(2^p) symbol strings, one per 8 x 8 pixel block. We therefore have 8192 clusters, and these can be addressed using 13 bits. A multiclass classifier is designed for these 8192 classes; the depth of this classifier is 13. [Figure: clusters and pivot points C1 ... C8192 forming the codebook]

25 Algorithm. The target image to be coded is divided into 8 x 8 blocks, and each block is input to the multiclass classifier, which outputs the class id of the block. This is done in effectively 13 clock cycles plus some memory access time, so encoding time is drastically reduced. [Figure: image block, classifier, class id]
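The encoding loop itself is straightforward: tile the frame into 8 x 8 blocks and replace each by its 13-bit class id. Assuming 8-bit pixels, a block shrinks from 512 bits to 13 bits, which matches the 97.5% target ratio. The classifier below is a hypothetical stand-in for the CA, not the real one.

```python
import numpy as np

def encode_image(image, classify_block):
    """Split the image into 8x8 blocks and map each to a 13-bit class id.
    `classify_block` stands in for the CA multiclass classifier."""
    h, w = image.shape
    ids = []
    for r in range(0, h, 8):
        for c in range(0, w, 8):
            block = image[r:r + 8, c:c + 8].reshape(-1)
            ids.append(classify_block(block))
    return ids

# toy stand-in classifier: bucket blocks by mean intensity into 8192 ids
classify_block = lambda b: min(8191, int(b.mean()) * 32)
image = np.zeros((240, 352), dtype=np.uint8)   # one CCIR-size frame
ids = encode_image(image, classify_block)
print(len(ids))   # -> 1320 blocks, i.e. (240/8) * (352/8)
```

Decoding is a table lookup: each id indexes the codebook and the corresponding pivot block is pasted back into place.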

26 Algorithm. [Figure: overall flow; training images are partitioned into blocks B1 ... Bn, clustered into pivot points C1 ... C8192 to form the codebook; each image block is then mapped to a class id by the classifier]

27 Sample Results

28 Sample Images. PSNR 27.8 dB, compression ratio 97.5%.

29 Sample Images. PSNR 25.1 dB, compression ratio 97.5%; PSNR 28.5 dB, compression ratio 97.5%.

30 Schematic of a CA-Based Vector Quantizer. [Figure: CA with memory, CA configuration controller, pseudo-exhaustive bits feeding a shift register, and output]

31 Hardware Design for CA Based Vector Quantizer

32 Improvements over the Basic Scheme. A hierarchical encoder has been implemented: the image is first encoded using 16 x 16 blocks; if a match cannot be obtained with any of the classes in the training set, a match with 8 x 8 blocks is tried. This pushes the compression ratio up to 99%.
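The fallback logic can be sketched as follows; `match16` and `encode8` are hypothetical stand-ins (here a trivial flatness test and a mean-based id) for the real 16 x 16 and 8 x 8 classifiers:

```python
import numpy as np

def hierarchical_encode(image, match16, encode8):
    """Try a 16x16 codebook match first; fall back to 8x8 blocks on failure.
    `match16` returns a class id or None; `encode8` returns four 8x8 ids."""
    out = []
    h, w = image.shape
    for r in range(0, h, 16):
        for c in range(0, w, 16):
            big = image[r:r + 16, c:c + 16]
            cid = match16(big)
            if cid is not None:
                out.append(("16", cid))            # one id covers 256 pixels
            else:
                out.extend(("8", i) for i in encode8(big))
    return out

# toy stand-ins: flat 16x16 tiles match; textured ones fall back to 8x8
match16 = lambda b: 0 if b.std() == 0 else None
encode8 = lambda b: [int(b[r:r + 8, c:c + 8].mean()) for r in (0, 8) for c in (0, 8)]
img = np.zeros((32, 32), dtype=np.uint8)
img[0:16, 0:16] = np.arange(256, dtype=np.uint8).reshape(16, 16)  # one textured tile
out = hierarchical_encode(img, match16, encode8)
print(sum(1 for t, _ in out if t == "16"))   # -> 3 tiles coded at 16x16
```

Whenever a 16 x 16 tile matches, four 8 x 8 ids collapse into one, which is where the extra compression comes from.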

33 Dynamic Classification of a Static Database. The solution assumes the target pattern is present in the cluster set. If a new pattern outside this range is input, the classifier indicates "no entry in the database". A linked queue of these new blocks is maintained, and at periodic intervals a new multiclass classifier is built after incorporating the updated data members into the appropriate classes.

34 Thank You

