Presentation transcript:

Neural network-based pruning and sensitivity analysis of an apoptosis model
Alia Joko, 21st Nordic Process Control Workshop, 19.01.18

Apoptosis – what and why?
Programmed and morphologically distinct form of cell death.
Relevance:
- Development: sculpting of fingers
- Homeostasis: millions of blood cells eliminated daily to balance production
- Immune surveillance: killing cancer and virus-infected cells

Apoptosis – information processing and response
- Large number of signalling proteins
- Complex interactions
- Emergent systems behaviour
Key questions:
- What is the 'point of no return'?
- How robust is this decision point?
- Which signalling motifs are responsible – positive feedback loops, competitive inhibition?
- Why are some cells in a population more sensitive to cell-death stimuli than others?
- What principles underlie this variability?

Mechanistic Apoptosis Model Formalisms
Cell 2011, 144, 926-939. DOI: 10.1016/j.cell.2011.03.002

Data-driven models of signal transduction
- New technologies are permitting large-scale quantitative studies of signal transduction networks
- Data-driven modelling approaches are becoming the standard tools
- Which are the essential features?
Nature Cell Biology 8, 1195–1203 (2006). doi:10.1038/ncb1497

Modelling by Feedforward Networks
Strengths:
- Little or no a priori knowledge of the process required
- Captures nonlinear relations between process variables
- Works in arbitrary dimensions
- Resulting models are fast
Weaknesses:
- Requires many data points
- The default network architecture is not a good choice
- Risk of 'trash in, trash out'
- May overfit the data
Remedies (a minimal fitting sketch follows below):
- Judicious choice of inputs based on "process" knowledge
- Limit the architecture ("don't use too many hidden nodes")
- Prune large networks, or let a small network grow
- Simultaneously optimize weights and connections
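As a concrete illustration of this kind of black-box fit, here is a minimal sketch using scikit-learn's MLPRegressor on synthetic data. The data and all hyperparameters are placeholder assumptions for illustration, not the settings used in this work.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Placeholder synthetic data standing in for simulated process variables.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))           # 5 candidate input variables
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2     # nonlinear target...
y += 0.1 * rng.standard_normal(1000)         # ...plus noise

# A deliberately small hidden layer, per the "limit the architecture" remedy.
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X, y)
print("training R^2:", net.score(X, y))
```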

Saxén-Petterson neural network pruning algorithm
- Multiple networks are evolved by a pruning algorithm
- The resulting models are ranked and the best ones retained
- How often the inputs appear in these models reflects their importance
Pruning algorithm (a code sketch follows below):
1. Randomly generate the lower-layer weights W_0. Set i = 1.
2. Reset in turn each weight j in W_{i-1} and determine the upper-layer weights w_j by linear least squares, giving the objective function value F_ij.
3. Set W_i = W_{i-1} and permanently reset the weight with minimum F_ij; save its index in a book-keeping matrix.
4. Set i = i + 1. If i < nN (the total number of lower-layer weights), go to 2; else end.
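A minimal Python sketch of this loop, assuming a single tanh hidden layer and a mean-squared-error objective; the function and variable names (prune, hidden_activations) are illustrative, not from the original work.

```python
import numpy as np

def hidden_activations(X, W):
    # tanh hidden layer; a bias column is appended for the output layer.
    return np.hstack([np.tanh(X @ W), np.ones((X.shape[0], 1))])

def output_error(X, y, W):
    # Upper-layer weights by linear least squares, and the MSE they achieve.
    H = hidden_activations(X, W)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return float(np.mean((H @ w - y) ** 2))

def prune(X, y, n_hidden, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # step 1: random lower layer
    history = []                                      # book-keeping matrix
    while np.any(W != 0):
        best_idx, best_err = None, np.inf
        for idx in map(tuple, np.argwhere(W != 0)):   # step 2: reset each weight in turn
            saved, W[idx] = W[idx], 0.0
            err = output_error(X, y, W)
            W[idx] = saved
            if err < best_err:
                best_idx, best_err = idx, err
        W[best_idx] = 0.0                             # step 3: permanent reset of the best
        history.append((best_idx, best_err))          # step 4: repeat until all removed
    return history
```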

Bentele-Toivonen Apoptosis Model
- Single-cell model with 41 molecules, 32 reactions and 50 parameters to be estimated
- Reactions modelled by mass action and Michaelis-Menten kinetics
- Stochastic log-normal distributions of protein concentrations and reaction rates to mimic a cell population
- Populations of 1000 cells simulated
- Model output: time to apoptosis
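A hedged sketch of how such log-normal cell-to-cell variability can be sampled; the parameter names and the coefficient of variation below are placeholder assumptions, not values from the model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_cells = 1000
cv = 0.25                                  # assumed coefficient of variation
sigma = np.sqrt(np.log(1.0 + cv ** 2))     # log-normal shape giving that CV

# Hypothetical nominal values; the real model has 50 such parameters.
nominal = {"k_cleave": 1e-3, "k_bind": 1e-2, "CASP8_total": 1.0}

# One log-normal draw per cell and parameter; mean=-sigma^2/2 keeps the
# expected value of each draw equal to the nominal value.
population = {
    name: val * rng.lognormal(mean=-0.5 * sigma ** 2, sigma=sigma, size=n_cells)
    for name, val in nominal.items()
}
```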

Results from the network pruning
Networks with 10 hidden nodes and 59 inputs were pruned 500 times from different starting weight matrices. For each lower-layer complexity, the run with the minimum error is retained (see the sketch below).
(Figure: minimum approximation error as a function of network complexity.)
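A sketch of the multi-restart bookkeeping, reusing the prune() function from the earlier sketch; the data and the problem dimensions are placeholders, scaled down from the 59 inputs, 10 hidden nodes and 500 restarts reported on the slide so that the example runs quickly.

```python
import numpy as np

# Placeholder data; the study pruned networks with 59 inputs.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 8))
y = rng.standard_normal(300)

n_runs = 5   # 500 in the study; reduced here for illustration
best = {}    # complexity (remaining lower-layer weights) -> minimum error
for run in range(n_runs):
    history = prune(X, y, n_hidden=3, seed=run)   # defined in the sketch above
    total = len(history)
    for step, (_, err) in enumerate(history):
        remaining = total - (step + 1)            # weights left after this removal
        if err < best.get(remaining, np.inf):
            best[remaining] = err
```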

Relevant model parameters
The occurrences of the 56 inputs on the Pareto fronts of the 20 best models reveal 11 relevant parameters.

Model – data fit

Local sensitivity analysis
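The slide itself presents results; as a hedged illustration of the underlying idea, a local sensitivity of the model output to each retained parameter can be estimated by central finite differences around the nominal parameter vector. The model function f and the parameter values below are placeholders.

```python
import numpy as np

def local_sensitivity(f, p0, rel_step=1e-3):
    # Central finite differences of a scalar model output f(p) around p0.
    p0 = np.asarray(p0, dtype=float)
    sens = np.empty_like(p0)
    for k in range(p0.size):
        h = rel_step * max(abs(p0[k]), 1e-12)
        up, dn = p0.copy(), p0.copy()
        up[k] += h
        dn[k] -= h
        sens[k] = (f(up) - f(dn)) / (2.0 * h)
    return sens

# Placeholder model: time-to-apoptosis as some nonlinear function of parameters.
f = lambda p: float(np.sum(np.log1p(p ** 2)))
print(local_sensitivity(f, np.ones(11)))  # 11 relevant parameters on the slide
```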

Reduced model: new insights? Experimentally verified.

Summary
- Fast and easy generation of promising reduced model subsets (as collected on Pareto fronts)
- Successfully finds relevant parameters in a complex model of cell death (apoptosis)
- Provides a means of analysing the sensitivity of the output to the identified relevant parameters
- Introduces possibilities for experiment design

Acknowledgements
Henrik Saxén, Prof.; Frank Petterson, Ph.D.
Thank you for your attention.