1 Analyzing Promoter Sequences with Multilayer Perceptrons (Glenn Walker, ECE 539)

2 Background (DNA) Deoxyribonucleic acid (DNA) is a long molecule made up of combinations of four smaller molecules (bases): adenine (A), cytosine (C), guanine (G), and thymine (T). These bases are combined in an order unique to each living organism, and that order contains the information needed to make all the parts an organism needs to survive. DNA is two-stranded and complementary, for example:
AGTCAATTGAGACCGATTAGAGATT
TCAGTTAACTCTGGCTAATCTCTAA
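
Complementarity means each base pairs with a fixed partner (A with T, C with G). A minimal Python sketch of computing the second strand from the first (the helper name complement_strand is our own, not from the slides):

```python
# Watson-Crick pairing: A<->T, C<->G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement_strand(seq: str) -> str:
    """Return the base-by-base complementary strand of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in seq)

print(complement_strand("AGTCAATTGAGACCGATTAGAGATT"))
# Prints TCAGTTAACTCTGGCTAATCTCTAA, the second strand shown above.
```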

3 Background (DNA) Genes are sections of DNA that can range from a few hundred base pairs to tens of thousands. Genes contain the instructions for making proteins, the molecules necessary for building and maintaining organisms. (Figure: three different genes on a piece of DNA, separated by "junk" DNA.)

4 Background Promoters are sequences of DNA to which RNA polymerase can bind and begin transcription of a gene. Transcription is the process of making a complementary RNA copy of the DNA, which is then translated into a protein. (Figure: a promoter sequence precedes the actual gene information; RNA polymerase binds at the promoter and begins transcription.)

5 Problem Knowing gene locations is desirable for medical reasons. One way to find genes is to look for promoter regions. How do we find promoter regions?

6 One Solution Promoter regions are highly conserved: different promoter regions often contain similar patterns. We can train neural networks to recognize promoter regions. We choose a multilayer perceptron.

7 Neural Network Configuration The multilayer perceptron (MLP) is a very common neural network configuration. We used an MLP with 3 layers: an input layer, a hidden layer, and an output layer. Number of inputs: 115 or 58; hidden nodes: 4, 8, 16, 20, 24, 28, 32; outputs: 1.

8 Neural Network Configuration Two ways of presenting the input were tried: one used 58 inputs and the other 115. Different numbers of hidden nodes were tried to find the optimally structured neural network. Only one output was used, indicating whether the input was a promoter sequence or not (1 or 0, respectively).
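
As a rough illustration of the configurations above (our own sketch using scikit-learn's MLPClassifier, not the original implementation), one network per hidden-node count could be built like this:

```python
# One-hidden-layer MLPs matching the configurations described above.
# scikit-learn is an assumption; the original work used its own MLP code.
from sklearn.neural_network import MLPClassifier

HIDDEN_NODE_COUNTS = [4, 8, 16, 20, 24, 28, 32]

def build_mlp(n_hidden: int, seed: int) -> MLPClassifier:
    """Input layer, one hidden layer of n_hidden nodes, one output node."""
    return MLPClassifier(
        hidden_layer_sizes=(n_hidden,),
        activation="logistic",   # sigmoid units; output near 1 = promoter, near 0 = non-promoter
        random_state=seed,       # weights are initialized to random values
        max_iter=2000,
    )
```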

9 Neural Network Inputs The inputs consisted of 106 sequences of 57 DNA bases each: 53 were promoters and 53 were not. One of the input promoter sequences: TACTAGCAATACGCTTGCGTTCGGTGGTTAAGTATGTATAATGCGCGGGCTTGTCGT The input was presented to the neural network in two ways: a binary encoding (A = 00, C = 01, G = 10, T = 11), giving 114 input neurons, or a scaled encoding (A = 0.2, C = 0.4, G = 0.6, T = 0.8), giving 57 input neurons.
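
A minimal sketch of the two encodings (our own illustration; the function names are hypothetical):

```python
# Binary (2 bits per base) and scaled (one value per base) encodings.
BINARY_CODE = {"A": [0, 0], "C": [0, 1], "G": [1, 0], "T": [1, 1]}
SCALED_CODE = {"A": 0.2, "C": 0.4, "G": 0.6, "T": 0.8}

def encode_binary(seq: str) -> list[int]:
    """57 bases -> 114 binary input values."""
    return [bit for base in seq for bit in BINARY_CODE[base]]

def encode_scaled(seq: str) -> list[float]:
    """57 bases -> 57 scaled input values."""
    return [SCALED_CODE[base] for base in seq]

seq = "TACTAGCAATACGCTTGCGTTCGGTGGTTAAGTATGTATAATGCGCGGGCTTGTCGT"
assert len(encode_binary(seq)) == 114 and len(encode_scaled(seq)) == 57
```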

10 Neural Network Training Each configuration was run 10 times. Within each of the 10 runs, 106 sub-runs were performed: in each one, 105 of the 106 sequences were used for training and the remaining sequence was used for testing (leave-one-out cross-validation). The test sequence changed on each of the 106 sub-runs so that every sequence served as the test sequence exactly once. Ten repetitions were necessary because the MLP weights were initialized to random values, which could lead to different classifications for the same input sequence.
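
A sketch of that training and testing loop, reusing the build_mlp helper from the earlier sketch (an assumption, not the authors' code):

```python
# Leave-one-out evaluation repeated over 10 random weight initializations,
# following the procedure described on this slide.
import numpy as np
from sklearn.model_selection import LeaveOneOut

def classification_rate(X: np.ndarray, y: np.ndarray, n_hidden: int) -> float:
    """Average leave-one-out accuracy over 10 differently seeded runs."""
    rates = []
    for seed in range(10):                                  # 10 runs per configuration
        correct = 0
        for train_idx, test_idx in LeaveOneOut().split(X):  # 106 sub-runs
            clf = build_mlp(n_hidden, seed)
            clf.fit(X[train_idx], y[train_idx])
            correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
        rates.append(correct / len(y))
    return float(np.mean(rates))
```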

11 Hidden Nodes vs. Classification Rate (figure)

12 Scaled Input vs. Classification Rate (figure)

13 Compared to Others
Walker (NN): 78%
O'Neill (NN): 83%
Towell (KBANN): >90%
O'Neill (rule-based): 70%
ID3 (decision tree): 76%

14 Conclusion Not the best, but not the worst. Using a hybrid technique would improve results. The MLP is a very useful tool for the field of bioinformatics.

15 References
Harley, C. B. and Reynolds, R. P. 1987. Analysis of E. coli promoter sequences. Nucleic Acids Research, 15(5):2343-2361.
O'Neill, M. C. 1991. Training back-propagation neural networks to define and detect DNA-binding sites. Nucleic Acids Research, 19(2):313-318.
Quinlan, J. 1986. Induction of decision trees. Machine Learning, 1:81-106.
Towell, G. G., Shavlik, J. W., and Noordewier, M. O. 1990. Refinement of approximate domain theories by knowledge-based neural networks. AAAI-90, 861-866.
