
1 Behavioral Fault Model for Neural Networks A. Ahmadi, S. M. Fakhraie, and C. Lucas Silicon Intelligence and VLSI Signal Processing Laboratory, School of Electrical and Computer Engineering, University of Tehran, Tehran, Iran. International Conference on Computer Engineering and Technology 2009 (ICCET 2009) January 23, 2009

2 Outline Introduction to ANNs Faults in ANNs Conventional Fault Models for ANNs and Their Limitations Proposed Fault Model and Simulation Results

3 Outline Introduction to ANNs Faults in ANNs Conventional Fault Models for ANNs and Their Limitations Proposed Fault Model and Simulation Results

4 Introduction to ANNs What are (everyday) computer systems good at... and not so good at? Good at: rule-based systems, i.e. doing what the programmer wants them to do. Not so good at: dealing with noisy data, dealing with unknown-environment data, massive parallelism, fault tolerance, adapting to circumstances.

5 Introduction to ANNs Neural network: information processing paradigm inspired by biological nervous systems, such as our brain Structure: large number of highly interconnected processing elements (neurons) working together Like people, they learn from experience (by example)

6 Introduction to ANNs (Applications) Prediction: learning from past experience –pick the best stocks in the market –predict weather –identify people with cancer risk Classification –Image processing –Predict bankruptcy for credit card companies –Risk assessment

7 Introduction to ANNs (Applications) Recognition –Pattern recognition: SNOOPE (bomb detector in U.S. airports) –Character recognition –Handwriting: processing checks Data association –Not only identify the characters that were scanned but identify when the scanner is not working properly

8 Introduction to ANNs (Applications) Data Conceptualization –infer grouping relationships e.g. extract from a database the names of those most likely to buy a particular product. Data Filtering e.g. take the noise out of a telephone signal, signal smoothing Planning –Unknown environments –Sensor data is noisy –Fairly new approach to planning

9 Introduction to ANNs (Mathematical Representation of an Artificial Neuron) The neuron calculates a weighted sum of its inputs, written out below.
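A compact statement of that computation (the slide's figure is not reproduced in the transcript): the weighted sum WS and the neuron output y, assuming the tanh activation introduced on slide 19 and an optional bias term b, which this slide does not mention explicitly.

```latex
% Weighted sum and output of a single artificial neuron
% x_i: inputs, w_i: weights, b: (assumed) bias term, y: output
\[
  \mathrm{WS} = \sum_{i=1}^{n} w_i\, x_i + b, \qquad
  y = f(\mathrm{WS}) = \tanh(\mathrm{WS})
\]
```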

10 Introduction to ANNs (cont’d) [Figure: network inputs and output] An artificial neural network is composed of many artificial neurons that are linked together according to a specific network architecture. The objective of the neural network is to transform the inputs into meaningful outputs.

11 Outline Introduction to ANNs Faults in ANNs Conventional Fault Models for ANNs and Their Limitations Proposed Fault Model and Simulation Results

12 Faults in ANNs A neuron has three components: its inputs, its weights, and the unit that implements the neuron’s functionality. Defects can occur in any of these three components.

13 Faults in ANNs (cont’d) ANNs are fault-tolerant due to: –the non-linearity of the network –the distributed manner of information storage –the number of neurons in a network –the difference between training and operational error margins

14 Outline Introduction to ANNs Faults in ANNs Conventional Fault Models for ANNs and Their Limitations Proposed Fault Model and Simulation Results

15 ANN’s Fault Model In [3], a model for faults in neural networks is presented; it assumes that defects in the three components of a neural network can be covered by a broken-link defect. In [4], single error bits as well as clusters of error bits are used to model faulty weights, and the error model for adders and multipliers is represented by erroneous computation results.

16 ANN’s Fault Model (cont’d) Conventional model –stuck-at-0 and stuck-at-1 faults on inputs and weights Disadvantage –every fault in an input or weight is reported as a fault of the ANN, even when it does not affect the network output.

17 Outline Introduction to ANNs Faults in ANNs Conventional Fault Models for ANNs and Their Limitations Proposed Fault Model and Simulation Results

18 Proposed Fault Model [Figure: an artificial neuron]

19 Proposed Fault Model (cont’d) Activation function: Y = Tanh(x) = (e^x - e^-x) / (e^x + e^-x) –Transition region: -3 < x < 3 –Saturation region: x > 3 or x < -3

20 Proposed Fault Model (cont’d) In the conventional model every fault raises the error signal; in practice some faults can be masked. Whether a fault is masked depends on the region of the weighted sum (WS): –Saturation region: the fault is masked if the faulty weighted sum (FWS) remains in the saturation region. –Transition region: the fault is masked if abs(WS - FWS) < μ (see the sketch below).
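A minimal C++ sketch of this masking rule, assuming a tanh neuron with the saturation threshold of 3 from slide 19 and a margin mu obtained from the simulations described on the next slides; the function names and the same-sign check in the saturation branch are illustrative assumptions, not the authors' code.

```cpp
#include <cmath>

// Saturation threshold of tanh as used on slide 19: |x| > 3 is treated as saturated.
constexpr double kSaturation = 3.0;

// True if the weighted sum lies in the saturation region.
bool inSaturation(double ws) {
    return ws > kSaturation || ws < -kSaturation;
}

// Decide whether a fault is masked, given the fault-free weighted sum (ws),
// the faulty weighted sum (fws), and the reliable margin mu extracted by simulation.
bool faultIsMasked(double ws, double fws, double mu) {
    if (inSaturation(ws)) {
        // Saturation region: masked if the faulty sum also stays saturated
        // (same sign assumed here; the slide only requires FWS to remain in saturation).
        return inSaturation(fws) && ws * fws > 0.0;
    }
    // Transition region: masked if the weighted-sum deviation stays within mu.
    return std::fabs(ws - fws) < mu;
}
```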

21 Proposed Fault Model (cont’d) Define a reliable margin μ for the weighted sum in each of the two regions, such that f(FWS) = f(WS ± μ) ≈ f(WS). μ is extracted by simulation: –inject single faults into the inputs and weights and calculate the resulting MSE (see the sketch below).
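A rough C++ sketch of the μ-extraction experiment, assuming 8-bit fixed-point weights and single stuck-at faults as in the XOR experiment on slide 26; the fixed-point format, the single-neuron toy data, and the MSE tolerance are illustrative assumptions (input faults, handled the same way, and the per-region split of μ are omitted for brevity).

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// Illustrative 8-bit fixed-point weight format with 5 fractional bits.
double toReal(int8_t q)  { return q / 32.0; }
int8_t toFixed(double w) { return static_cast<int8_t>(std::lround(w * 32.0)); }

// Weighted sum of a single neuron.
double weightedSum(const std::vector<double>& w, const std::vector<double>& x) {
    double ws = 0.0;
    for (std::size_t i = 0; i < w.size(); ++i) ws += w[i] * x[i];
    return ws;
}

int main() {
    // Toy single-neuron data (placeholders, not the XOR training set).
    std::vector<double> w = {0.4, -0.7, 0.2};
    std::vector<std::vector<double>> tests = {{1, 0, 1}, {0, 1, 1}, {1, 1, 1}};

    const double tol = 1e-2;  // assumed MSE tolerance below which a fault counts as harmless
    double mu = 0.0;          // largest weighted-sum deviation among harmless faults

    for (std::size_t wi = 0; wi < w.size(); ++wi) {
        for (int bit = 0; bit < 8; ++bit) {
            for (int stuck = 0; stuck <= 1; ++stuck) {
                // Inject a single stuck-at fault into one bit of one weight.
                uint8_t bits = static_cast<uint8_t>(toFixed(w[wi]));
                if (stuck) bits |= static_cast<uint8_t>(1u << bit);
                else       bits &= static_cast<uint8_t>(~(1u << bit));
                std::vector<double> wf = w;
                wf[wi] = toReal(static_cast<int8_t>(bits));

                // Output MSE and weighted-sum deviation over the test patterns.
                double mse = 0.0, maxDev = 0.0;
                for (const auto& x : tests) {
                    double ws  = weightedSum(w,  x);
                    double fws = weightedSum(wf, x);
                    mse   += std::pow(std::tanh(ws) - std::tanh(fws), 2);
                    maxDev = std::max(maxDev, std::fabs(ws - fws));
                }
                mse /= tests.size();

                // Harmless faults bound the reliable margin mu.
                if (mse < tol) mu = std::max(mu, maxDev);
            }
        }
    }
    std::cout << "extracted mu ~ " << mu << "\n";
}
```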

22 Proposed Fault Model (cont’d) Saturation Region

23 Proposed Fault Model (cont’d) Transition Region

24 Simulation to Extract μ for saturation region

25 Simulation to Extract μ for transition region

26 Simulation Results An ANN is modelled in C++ and faults are injected; with the extracted μ, many faults can be masked. Results for the XOR problem (2-3-1 network). Columns: faults appearing in the output / faults recognized by the conventional model / faults recognized by our fault model / masked faults / injected faults.
Weights: 8 / 144 / 8 / 136 / 144 (injected = (2*3 + 3*1) weights * 8 bits * 2 stuck-at values)
Inputs: 12 / 32 / 12 / 20 / 32
Total: 20 / 176 / 20 / 156 / 176

27 Simulation Results Results for the character-recognition problem (65-15-10 network), same columns as above.
Weights: 2000 / 27000 / 2000 / 25000 / 27000
Inputs: 165 / 520 / 355 / 165 / 520
Total: 2165 / 27520 / 2355 / 25165 / 27520

28 Questions about this Fault Model Why is this model important? –ANNs have an inherent tolerance to faults. How can it be used? –Fault-tolerant methods compare two outputs in their fault-detection phase; with this model the comparison becomes abs(output1 - output2) > μ, so the model can be used in test-pattern generation (TPG) and fault-tolerance (FT) methods (see the sketch below).
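A small C++ sketch of that comparison in a duplication-based detection scheme; the function name and the duplicated-evaluation setup are illustrative assumptions, not part of the paper.

```cpp
#include <cmath>

// Duplication-based fault detection with the behavioral fault model:
// two copies (or two evaluations) of the network produce out1 and out2,
// and a fault is reported only if their difference exceeds the margin mu.
bool faultDetected(double out1, double out2, double mu) {
    return std::fabs(out1 - out2) > mu;  // differences within mu are treated as masked
}
```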

29 References
[1] A. S. Pandya, Pattern Recognition with Neural Networks in C++, IEEE Press, New York.
[2] J. L. Holt and J. N. Hwang, “Finite precision error analysis of neural network hardware implementations,” IEEE Transactions on Computers, vol. 42, no. 3, March 1993, pp. 1380-1389.
[3] K. Takahashi et al., “Comparison with defect compensation methods for feed-forward neural networks,” Pacific Rim International Symposium on Dependable Computing (PRDC’02), 2002.
[4] D. Uwemedimo, “A fault tolerant technique for feed-forward neural networks,” Ph.D. thesis, University of Saskatchewan, Fall 1997.
[5] L. Breveglieri and V. Piuri, “Error detection in digital neural networks: an algorithm-based approach for inner product protection,” Advanced Signal Processing, San Diego, CA, July 1994, pp. 809-820.
[6] M. Stevenson, R. Winter, and B. Widrow, “Sensitivity of feedforward neural networks to weight errors,” IEEE Transactions on Neural Networks, vol. 1, March 1990, pp. 71-80.
[7] C. Lehmann and F. Blayo, “A VLSI implementation of a generic systolic synaptic building block for neural networks,” Workshop on VLSI for Artificial Intelligence and Neural Networks, Oxford, UK, 1990.
[8] D. S. Phatak and I. Koren, “Complete and partial fault tolerance of feedforward neural nets,” IEEE Transactions on Neural Networks, 1995, pp. 446-456.
[9] T. Horita et al., “Learning algorithms which make multilayer neural networks multiple-weight-and-neuron-fault tolerant,” IEICE Transactions on Information and Systems, vol. E91-D, no. 4, April 2008.

30 Thanks for your attention Questions?

