
1 Reverse Engineering of Genetic Networks (Final presentation)
Ji Won Yoon, supervised by Dr. Dirk Husmeier. MSc in Informatics, University of Edinburgh.

2 Reverse Engineering: what is reverse engineering of a gene network?
Inferring the underlying gene network from "up" and "down" microarray expression data (possibly with missing genes). Approaches considered: the Relevance Network, my own method, and MCMC for Bayesian networks.

3 Past work
Comparison of existing approaches to the reverse engineering of genetic networks:
- Mutual information relevance networks
- My own method
- Bayesian networks using Markov chain Monte Carlo (MCMC) methods
Applying all methods to synthetic data generated from a gene-network simulator.
Applying them to biological data:
- Diffuse large B-cell lymphoma gene expression data
- Arabidopsis gene expression data

4 Relevance Network (Butte, 2000)
Uses mutual information:
MI(A, B) = H(A) - H(A|B) = H(B) - H(B|A) = MI(B, A)  (symmetric)
MI(A, B) = H(A) + H(B) - H(A, B)
The mutual information is zero if and only if the two genes are independent. Only pairwise relations are scored.
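As a concrete illustration of this pairwise scoring, here is a minimal Python sketch, assuming the expression levels have already been discretised to integers 0..n_levels-1; the function name and interface are illustrative rather than taken from the thesis.

```python
import numpy as np

def mutual_information(a, b, n_levels=3):
    """MI(A, B) = H(A) + H(B) - H(A, B) for two discretised expression vectors."""
    joint = np.zeros((n_levels, n_levels))
    for x, y in zip(a, b):
        joint[x, y] += 1          # joint count table
    joint /= joint.sum()          # joint distribution
    p_a, p_b = joint.sum(axis=1), joint.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return entropy(p_a) + entropy(p_b) - entropy(joint.ravel())
```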

5 Relevance Network (cont.)
Relevance Network (Butte, 2000):
- Useful only for local relations, because the scoring is pairwise.
- Selecting a proper threshold is important for recovering good relations; bootstrapping (comparing results on the real data with results on randomly permuted data) can guide this choice.
- Difficulty identifying a relation with two or more parents because of this locality: MI(A, [B, C, D]) may exceed the threshold while the individual MI(A, B), MI(A, C) and MI(A, D) all fall below it.
- Cannot detect an XOR relation: if C = XOR(A, B), then MI(A, C) = MI(B, C) = 0.
- No edge directions, because mutual information is symmetric.
- Fast and light computation, so it is useful for a large number of genes.
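Building on the mutual_information helper above, a hedged sketch of the thresholding step might look as follows; the relevance_network name and the use of a single global threshold are assumptions for illustration (in practice the threshold would be calibrated, e.g. via the permutation bootstrap mentioned above).

```python
import numpy as np

def relevance_network(data, threshold):
    """Connect every pair of genes whose mutual information exceeds the threshold.
    data: (samples, genes) array of discretised expression levels."""
    n_genes = data.shape[1]
    edges = []
    for i in range(n_genes):
        for j in range(i + 1, n_genes):
            if mutual_information(data[:, i], data[:, j]) > threshold:
                edges.append((i, j))   # undirected edge, since MI is symmetric
    return edges
```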

6 My method (using mutual information)
Based on scale-free networks: crucial genes (hubs) have more connections than other genes. When a new gene F is inserted, a hub such as A has a greater chance of connecting to it than the other genes. The network is represented as G = (N, E, L), where L is the level information.
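Purely to illustrate the scale-free intuition (not the method itself, which attaches new genes by mutual information), here is a toy preferential-attachment rule; pick_connection and the degree-proportional weighting are hypothetical.

```python
import random

def pick_connection(degrees):
    """Pick an existing gene for a newly inserted gene to connect to, with probability
    proportional to the current degree, so hubs keep accumulating connections.
    degrees: {gene: current number of connections}."""
    genes = list(degrees)
    weights = [degrees[g] + 1 for g in genes]   # +1 keeps isolated genes selectable
    return random.choices(genes, weights=weights, k=1)[0]
```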

7 My method (Insertion step)
Finding better parents and merging clusters. Example with threshold = 0.3: for a new gene a, MI(1, a) = 0.34, MI(4, a) = 0.28, MI(5, a) = 0.35, MI(6, a) = 0.31 and MI(7, a) = 0.4, so a is connected to genes 1, 5, 6 and 7 (but not 4), and the clusters containing those genes (S1-S4 in the figure) are merged. A sketch of this step follows.
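A minimal sketch of this insertion step, reusing the mutual_information helper above; the cluster representation (a list of sets of gene indices) and the merging rule are assumptions made for illustration.

```python
def insert_gene(data, new_gene, clusters, threshold=0.3):
    """Link the new gene to every existing gene whose MI with it exceeds the
    threshold, and merge the clusters that are hit.
    data: (samples, genes) discretised array; clusters: list of sets of gene indices."""
    hit = [c for c in clusters
           if any(mutual_information(data[:, new_gene], data[:, g]) > threshold
                  for g in c)]
    if not hit:
        clusters.append({new_gene})                 # no link found: start a new cluster
        return clusters
    merged = set().union(*hit) | {new_gene}         # merge all clusters that were hit
    return [c for c in clusters if c not in hit] + [merged]
```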

8 My method (Deletion step)
Assumption: the network produced by the insertion step is in a stationary state with respect to the marginal log likelihood, except for the one edge e being investigated. There are three cases for an edge e between X and Y: X -> Y, X <- Y, and no edge. Only the local terms for X and Y change:
P(D | M) = U * P(X | pa(X)) * P(Y | pa(Y)),
where U collects the contributions of all other nodes.
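A hedged sketch of how the three cases could be compared; the Dirichlet-multinomial local score below is an assumed stand-in for the thesis's marginal likelihood, and only the local terms for X and Y are recomputed because the factor U cancels.

```python
import numpy as np
from itertools import product
from scipy.special import gammaln

def local_log_marginal(data, child, parents, arity=3, alpha=1.0):
    """Dirichlet-multinomial log marginal likelihood of one node given its parents.
    data: (samples, genes) integer array with levels 0..arity-1."""
    score = 0.0
    for ps in product(range(arity), repeat=len(parents)):
        mask = np.ones(len(data), dtype=bool)
        for p, v in zip(parents, ps):
            mask &= data[:, p] == v                       # rows matching this parent state
        counts = np.bincount(data[mask, child], minlength=arity)
        score += gammaln(arity * alpha) - gammaln(arity * alpha + counts.sum())
        score += np.sum(gammaln(alpha + counts) - gammaln(alpha))
    return score

def best_edge_case(data, x, y, pa_x, pa_y):
    """Compare X->Y, X<-Y and no edge; only the local terms of X and Y change."""
    cases = {
        "x->y": (pa_x, pa_y + [x]),
        "x<-y": (pa_x + [y], pa_y),
        "none": (pa_x, pa_y),
    }
    scores = {name: local_log_marginal(data, x, px) + local_log_marginal(data, y, py)
              for name, (px, py) in cases.items()}
    return max(scores, key=scores.get), scores
```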

9 My method (Deletion step)
[Figure: example graph G with nodes A to I, highlighting the candidate edge e between X and Y.]

10 My method: mainly two steps, an insertion step and a deletion step.
[Figure: the insertion and deletion steps are applied repeatedly to nodes a-h as the threshold is lowered (0.5, 0.4, 0.3, ...), continuing until t = 0.]

11 My method: advantages
- Based on biological facts (scale-free networks)
- No need for thresholds
- Online approach
- Scalability
- Easy to explore sub-networks
- Fast computation

12 My method: disadvantages
- Dependency on the input order of the genes (in part B of the experiments, about 61% of the edges were largely order-independent)
- Risky when exploring parents in very noisy data: the network can be over-fitted to the training data

13 Bayesian network with MCMC
[Figure: two motivating problems illustrated with nodes D and E; left panel for a large data set, right panel for a small data set.]

14 Bayesian network with MCMC
MCMC (Markov chain Monte Carlo) as inference for Bayesian networks: sample network structures from the posterior distribution P(M | D).
Proposal move: given the current network M_old, propose a new network M_new with probability Q(M_new | M_old).
Acceptance and rejection: accept M_new with the Metropolis-Hastings probability
min{ 1, [P(D | M_new) P(M_new) Q(M_old | M_new)] / [P(D | M_old) P(M_old) Q(M_new | M_old)] },
otherwise keep M_old.

15 Bayesian network with MCMC

16 MCMC in Bayes Net toolbox
Hastings factor: the proposal probability is calculated from the number of neighbours of the model, so with a uniform proposal over neighbours, Q(M_new | M_old) = 1 / |neighbours(M_old)| and the Hastings factor is |neighbours(M_old)| / |neighbours(M_new)|.
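A minimal sketch of one structure-MCMC step including this Hastings correction; log_score and neighbours are assumed helper functions, and the uniform-over-neighbours proposal is the assumption under which the correction takes this form.

```python
import math
import random

def mh_step(model, data, log_score, neighbours):
    """One Metropolis-Hastings step over network structures.
    log_score(M, data) = log P(D|M) + log P(M); neighbours(M) lists the DAGs
    reachable by one edge addition, deletion or reversal."""
    nbrs_old = neighbours(model)
    proposal = random.choice(nbrs_old)          # uniform proposal over neighbours
    log_ratio = (log_score(proposal, data) - log_score(model, data)
                 + math.log(len(nbrs_old)) - math.log(len(neighbours(proposal))))
    if random.random() < math.exp(min(0.0, log_ratio)):
        return proposal   # accept the move
    return model          # reject and keep the current network
```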

17 Improvement of MCMCs Fan-in
Sparse data causes the prior probability to have a non-negligible influence on the posterior P(M | D). Remedy: limit the maximum number of edges converging on a node, the fan-in. If FI(M) > a, then P(M) = 0; otherwise P(M) = 1 (up to normalisation). This also reduces the time complexity considerably. [Figure: acceptable configurations of a child node and its parents A-E under a fan-in of 3.]
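A small sketch of such a fan-in prior (up to normalisation); the dictionary representation of parent sets is an assumption for illustration.

```python
def fan_in_prior(parents_of, max_fan_in=3):
    """Structure prior with a fan-in restriction: P(M) = 0 if any node has more than
    max_fan_in parents, and a constant otherwise.
    parents_of: dict mapping each node to the list of its parents."""
    if any(len(pa) > max_fan_in for pa in parents_of.values()):
        return 0.0
    return 1.0
```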

18 Improvement of MCMCs: DAG to CPDAG
(DAG: directed acyclic graph; CPDAG: completed partially directed acyclic graph.) For two nodes, X -> Y and X <- Y are equivalent because P(X, Y) = P(X) P(Y | X) = P(Y) P(X | Y). A CPDAG represents the set of all equivalent DAGs: reversible edges (e.g. the edge D-E in the example) are left undirected, while the remaining edges are compelled.

19 Improvement of MCMCs: the CPDAG concept brings several advantages:
- The space of equivalence classes is much smaller than the space of DAGs.
- Moving in DAG space makes it easy to get trapped in a local optimum.
- CPDAGs are therefore incorporated into the MCMC scheme.

20 MCMCMC: trapping. [Figure: posterior landscape with a global optimum A and a local optimum B.]
A single chain is easily trapped in the local optimum B; multiple chains running at different temperatures are useful for escaping from it.

21 MCMCMC Trapping

22 MCMCMC: a super chain S is formed from the coupled chains, with acceptance ratios defined for moves within each chain and for swaps between chains.
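A hedged sketch of one swap move between coupled chains, assuming each chain i targets the tempered posterior P(M | D)^(1/T_i); log_post and the adjacent-pair swap scheme are illustrative assumptions.

```python
import math
import random

def swap_step(states, log_post, temps):
    """One swap move of Metropolis-coupled MCMC: a randomly chosen pair of adjacent
    chains proposes to exchange their current networks.
    log_post(M) = unnormalised log P(M|D); temps[i] is the temperature of chain i."""
    i = random.randrange(len(states) - 1)
    j = i + 1
    li, lj = log_post(states[i]), log_post(states[j])
    log_ratio = (li - lj) * (1.0 / temps[j] - 1.0 / temps[i])
    if random.random() < math.exp(min(0.0, log_ratio)):
        states[i], states[j] = states[j], states[i]   # accept the swap
    return states
```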

23 Importance Sampling: partition function and proposal distribution
Acceptance probability: only the prior distribution is used in the acceptance step. Importance sampling is also combined with MCMCMC (MCMCMC with importance sampling), using the likelihood of each configuration of a node n and its parents.

24 Order MCMC
It samples over total node orders rather than over structures. [Figure: the 3! = 6 possible orders of three nodes A, B and C.]

25 Order MCMC
It samples over total node orders, not over structures. Proposal move: flip two nodes of the previous order. To cope with the computational limitations, candidate sets are used: for each node, only the parent sets with the highest likelihood scores are kept, which reduces the computation time. A sketch is given below.
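A minimal sketch of order MCMC with candidate parent sets; candidate_parents and local_score are assumed inputs (e.g. precomputed highest-scoring parent sets and a local log marginal likelihood), and the order score sums over the candidate sets compatible with the order.

```python
import math
import random
import numpy as np

def order_score(order, candidate_parents, local_score):
    """Log score of a total order: for each node, sum (in log space) the scores of its
    candidate parent sets that respect the order (every parent precedes the child)."""
    pos = {node: i for i, node in enumerate(order)}
    total = 0.0
    for node, parent_sets in candidate_parents.items():
        logs = [local_score(node, pa) for pa in parent_sets
                if all(pos[p] < pos[node] for p in pa)]
        total += np.logaddexp.reduce(logs) if logs else local_score(node, ())
    return total

def order_mcmc_step(order, candidate_parents, local_score):
    """Proposal: flip two nodes of the previous order (symmetric, so no Hastings
    correction); accept with the Metropolis probability."""
    new = list(order)
    i, j = random.sample(range(len(new)), 2)
    new[i], new[j] = new[j], new[i]
    log_ratio = (order_score(new, candidate_parents, local_score)
                 - order_score(order, candidate_parents, local_score))
    return new if random.random() < math.exp(min(0.0, log_ratio)) else list(order)
```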

26 Order MCMC

27 Order MCMC: feature selection
We can extract edges by approximating their posterior probabilities, averaging the corresponding edge features over samples drawn from the stationary distribution.
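A short sketch of this averaging, assuming the sampler returns 0/1 adjacency matrices after burn-in.

```python
import numpy as np

def edge_posteriors(sampled_adjacencies):
    """Estimate P(edge i->j | D) by averaging the edge indicator over sampled networks.
    sampled_adjacencies: list of (genes x genes) 0/1 adjacency matrices from the chain.
    Returns a matrix of posterior edge probabilities."""
    return np.mean(np.stack(sampled_adjacencies), axis=0)
```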

28 Synthetic data: genes 41 to 50 are not connected.

29 Synthetic data
- MCMCMC with Importance Sampling has the best performance.
- Order MCMC is the second best.
- Order MCMC is much faster than MCMCMC with Importance Sampling.

30 Synthetic data: I changed one parameter at a time for the MCMC simulations.
1) Standard application (using the standard parameters)
2) Change the noise value (decrease it to 0.1)
3) Change the training data size (decrease it to 50)
4) Change the number of iterations (increase it to 50000)
Standard parameters (MCMC in Bayes Net Toolbox): training data size 200, noise value 0.3, number of iterations 5000 (5000 samples and 5000 burn-ins).

31 Synthetic data

32 Synthetic data
[Figure: convergence of (1) MCMC in BNT, (2) MCMCMC with Importance Sampling (IM), (3) MCMCMC with Importance Sampling (ID) and (4) Order MCMC; training set size 200, noise 0.3, 5000 iterations.]

33 Synthetic data
[Figure, left: MCMCMC with 5000 burn-in and 5000 sampling iterations. Right: acceptance ratios for MCMC in BNT (left), MCMCMC with Importance Sampling (middle) and Order MCMC (right).]

34 Diffuse large B cell lymphoma Data
Data discretisation: I used the K-means algorithm to discretise the expression levels of each gene separately into three levels (up, normal, down), since the baseline expression level can differ from gene to gene. Problem with this discretisation: if the data are very noisy, the noise causes spurious fluctuations between levels, so the approach can fail for some genes (e.g. gene 3). A sketch is given below.
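A minimal sketch of this per-gene discretisation using scikit-learn's KMeans; treating the lowest-mean cluster as "down" and the highest as "up" is an assumption for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

def discretise_gene(expression, n_levels=3, seed=0):
    """Discretise one gene's expression profile into down/normal/up (0/1/2) with K-means,
    so every gene gets its own baseline rather than a single global cutoff."""
    x = np.asarray(expression, dtype=float).reshape(-1, 1)
    km = KMeans(n_clusters=n_levels, n_init=10, random_state=seed).fit(x)
    order = np.argsort(km.cluster_centers_.ravel())      # relabel: 0 = lowest mean level
    relabel = {int(old): new for new, old in enumerate(order)}
    return np.array([relabel[int(c)] for c in km.labels_])
```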

35 Diffuse large B cell lymphoma Data
Comparison of convergence: MCMC in BNT, MCMCMC with Importance Sampling (ID) and Order MCMC. Number of genes: 27; training data size: 105; iterations: 20000.

36 Diffuse large B cell lymphoma Data
Comparison of acceptance ratios. Number of genes: 27; training data size: 105; iterations: 20000.

37 Gene expression inoculated by viruses in susceptible Arabidopsis thaliana plants
Five viruses were used to inoculate the plants:
- Cucumber mosaic cucumovirus
- Oilseed rape tobamovirus
- Turnip vein-clearing tobamovirus
- Potato virus X potexvirus
- Turnip mosaic potyvirus
DAI = days after inoculation; expression was measured at 1, 2, 3, 4, 5 and 7 DAI, by which point symptoms occur. Training data: 127 genes, each with 20 measurements (4 DAIs * 5 viruses).

38 Gene expression inoculated by viruses in susceptible Arabidopsis thaliana plants
Results for 20 genes only (1 DAI and 2 DAI): 10000 samples from MCMCMC with Importance Sampling (ID), and 1000 samples from my method.

39 Gene expression inoculated by viruses in susceptible Arabidopsis thaliana plants
127 genes; the average global connectivity and the genes with higher connectivity are shown in the figure.

40 Gene expression inoculated by viruses in susceptible Arabidopsis thaliana plants
p-value check for the transcription function among the 127 genes:
- f is the number of genes with the j-th function among the 127 genes.
- m is the number of genes with the j-th function among the 14 selected (high-connectivity) genes.
A hedged sketch of such a test is given below.
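A sketch of such an enrichment p-value, assuming a one-sided hypergeometric test (the slide does not state the exact test used); the counts 127 and 14 are taken from the slide.

```python
from scipy.stats import hypergeom

def enrichment_p_value(f, m, n_total=127, n_selected=14):
    """Probability of drawing at least m genes with the j-th function among the
    n_selected high-connectivity genes, when f of the n_total genes carry that
    function (one-sided hypergeometric test)."""
    return hypergeom.sf(m - 1, n_total, f, n_selected)
```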

41 Gene expression inoculated by viruses in susceptible Arabidopsis thaliana plants
Results for the 127 genes from my method (100 samples).

42 Conclusion
We need to select a methodology depending on the characteristics of the training data. To obtain results closest to the real networks, MCMCMC with Importance Sampling and Order MCMC are suitable: MCMCMC with Importance Sampling has the best performance but is slower than the other MCMC methods, while Order MCMC has the second-best performance and is about four times faster than MCMCMC with Importance Sampling. If we need to process large-scale data and do not have enough time to run MCMC, the Relevance Network and my method are appropriate. Finally, the different methods generate different networks, so combining them should give better results.

43 Conclusion Biological meaning
Genes with a transcription function have higher connectivity than other genes (from my method). That is, genes with a transcription function may act as hubs in the network that responds to virus infection in Arabidopsis thaliana.

