
VLSI Project: Neural-Network-Based Branch Prediction. Alexander Zlotnik, Marcel Apfelbaum. Supervised by Michael Behar, Spring 2005.


1 VLSI Project: Neural-Network-Based Branch Prediction. Alexander Zlotnik, Marcel Apfelbaum. Supervised by Michael Behar, Spring 2005

2 Introduction
 Branch prediction has always been a "hot" topic: 20% of all instructions are branches.
 A correct prediction makes execution faster; a misprediction has a high cost.
 Classic predictors are based on 2-bit saturating-counter state machines with four states: 00 SNT (strongly not taken), 01 WNT (weakly not taken), 10 WT (weakly taken), 11 ST (strongly taken). A taken outcome moves the counter up toward ST, a not-taken outcome moves it down toward SNT.
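The four-state counter described above can be sketched in a few lines of C. This is our own minimal illustration, not code from the project; only the state names SNT/WNT/WT/ST come from the slide.

```c
/* Classic 2-bit saturating branch-prediction counter:
   00 SNT, 01 WNT, 10 WT, 11 ST. */
enum { SNT = 0, WNT = 1, WT = 2, ST = 3 };

/* Predict taken when the counter is in a "taken" state (10 or 11). */
static int predict(int state) { return state >= WT; }

/* Move one step toward the actual outcome, saturating at SNT and ST. */
static int update(int state, int taken) {
    if (taken) return state < ST  ? state + 1 : ST;
    else       return state > SNT ? state - 1 : SNT;
}
```

The saturation is what gives the predictor hysteresis: a single anomalous outcome in a strongly-biased branch flips the counter only to the weak state, so the prediction direction survives it.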

3 Introduction (cont.)
 Modern predictors are two-level: they use 2-bit counters plus branch history (local/global).
 Known problems:
Memory size is exponential in the history length.
Too long a history can cause errors.
 Recent studies explore branch prediction using neural networks.
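The exponential memory cost noted above follows directly from the two-level scheme: its pattern-history table keeps one 2-bit counter per possible history pattern. A minimal sketch of the arithmetic (our illustration, not project code):

```c
/* A two-level predictor's pattern-history table needs one counter per
   possible h-bit history pattern, i.e. 2^h entries, so memory grows
   exponentially with the history length h. */
static unsigned long pht_entries(unsigned history_bits) {
    return 1UL << history_bits;
}
```

Doubling the history length squares the table size, which is why long global histories are impractical for table-based predictors and why the perceptron's linear growth (slide 9) matters.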

4 Project Objective
 Develop a mechanism for branch prediction.
 Explore the practicability and applicability of such a mechanism and measure its success rates.
 Use a known neural-network technique: the perceptron.
 Compare and analyze against "old" predictors.

5 Project Requirements
 Develop for the SimpleScalar platform to simulate out-of-order-execution (OOOE) processors.
 Run the developed predictor on accepted benchmarks.
 C language.
 No hardware-component equivalence needed; software implementation only.

6 Background and Theory: The Perceptron
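The perceptron computes its output as a weighted sum of the history inputs plus a bias term. This formula is not transcribed from the diagram on this slide, but it is the standard form for the perceptron predictor and is the y_out that the training rule on the next slide refers to:

```latex
y_{\text{out}} = w_0 + \sum_{i=1}^{n} w_i x_i,
\qquad x_i \in \{-1, 1\},\; x_0 = 1
```

Each x_i encodes one bit of branch history (taken = 1, not taken = -1); the fixed x_0 = 1 feeds the bias weight w_0.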

7 Background and Theory (cont.): Perceptron Training
Let θ = training threshold, t = 1 if the branch was taken or -1 otherwise, x = history vector.
if (sign(y_out) != t) or |y_out| <= θ then
  for i := 0 to n do
    w_i := w_i + t * x_i
  end for
end if
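The training rule above can be written out in C. This is a sketch under our own naming (with x[0] fixed at 1 as the bias input), not the project's actual implementation:

```c
/* Perceptron training: t is +1 (taken) or -1 (not taken), x is the
   history vector with x[0] = 1 for the bias weight, theta is the
   training threshold. Train on a misprediction, or whenever the
   output magnitude has not yet reached theta. */
static void train(int *w, const int *x, int n,
                  int y_out, int t, int theta) {
    int sign = (y_out >= 0) ? 1 : -1;
    int mag  = (y_out >= 0) ? y_out : -y_out;
    if (sign != t || mag <= theta) {
        for (int i = 0; i <= n; i++)   /* includes the bias weight w[0] */
            w[i] += t * x[i];
    }
}
```

Note the second condition: even a correct prediction triggers training while |y_out| <= θ, which keeps pushing confident weights apart until they clear the threshold.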

8 Development Stages
1. Studying the background
2. Learning the SimpleScalar platform
3. Coding a "dummy" predictor and using it to verify our understanding of how branch prediction is handled in the SimpleScalar platform
4. Coding the perceptron predictor itself
5. Coding a perceptron-behavior revealer
6. Benchmarking (smart environment)
7. A special study of our suggestion regarding perceptron-predictor performance

9 Principles
 Branch prediction needs a learning methodology; a neural network provides one, learning from inputs and outputs (pattern recognition).
 As the history grows, the data structures of our predictor grow only linearly.
 We use a perceptron to learn correlations between particular branch outcomes in the global history and the behavior of the current branch. These correlations are represented by the weights: the larger the weight, the stronger the correlation, and the more that particular branch in the history contributes to the prediction of the current branch. The input to the bias weight is always 1, so instead of learning a correlation with a previous branch outcome, the bias weight learns the bias of the branch, independent of the history.

10 Design and Implementation

11 Hardware Budget
 History length: a longer history means fewer perceptrons fit in the same budget.
 Threshold: the threshold θ is a parameter of the perceptron training algorithm, used to decide whether the predictor needs more training.
 Representation of weights: weights are signed integers; number of bits = 1 + floor(log2(θ)).

12 Algorithm  Fetch Stage
1. The branch address is hashed to produce an index i ∈ 0..n-1 into the table of perceptrons.
2. The i-th perceptron is fetched from the table into a vector register of weights, P.
3. The value of y is computed as the dot product of P and the global history register.
4. The branch is predicted not taken when y is negative, or taken otherwise.
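The four fetch-stage steps can be sketched in C. The table size, history length, and the simple modulo hash are our illustrative choices, not values taken from the project:

```c
#define TABLE_SIZE 512   /* number of perceptrons (illustrative) */
#define HLEN 8           /* global history length (illustrative) */

/* Steps 1-4 from the slide: hash the branch address to an index,
   fetch that perceptron's weights, dot them with the history
   register (history[0] = 1 is the bias input), predict taken
   iff y >= 0. */
static int predict_branch(unsigned long pc,
                          int w[TABLE_SIZE][HLEN + 1],
                          const int history[HLEN + 1],
                          int *y_out) {
    unsigned i = (unsigned)(pc % TABLE_SIZE);   /* step 1: hash */
    int y = 0;
    for (int j = 0; j <= HLEN; j++)             /* steps 2-3: dot product */
        y += w[i][j] * history[j];
    *y_out = y;
    return y >= 0;                              /* step 4: 1 = taken */
}
```

The computed y is kept (via y_out) because the execution stage needs its value, not just its sign, to decide whether to train.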

13 Algorithm (cont.)  Execution Stage
1. Once the actual outcome of the branch becomes known, the training algorithm uses this outcome and the value of y to update the weights in P (training).
2. P is written back to the i-th entry in the table.

14 Simulation Results
On all parameters, the perceptron-based predictor outperformed GSHARE.
Simulation was done over the VPR, Perl, and Parser benchmarks from ss_spec2k.

15 VLSI Project Spring 2005 15 Simulation Results (cont.)

16 VLSI Project Spring 2005 16 Simulation Results (cont.)

17 VLSI Project Spring 2005 17 Simulation Results (cont.)

18 Special Problems
 Software simulation of hardware: solved by utilizing the existing data structures of SimpleScalar.
 Compiling self-written programs for SimpleScalar: after several weeks of hard work we decided to use accepted benchmarks instead.

19 Summary
 We implemented a different branch-prediction mechanism and obtained exciting results.
 Hardware implementation of the mechanism is hard, but possible.
 A longer history helps the perceptron make better predictions.

