Computational challenges in assessment of marine resources. Hans Julius Skaug. Talk given at Inst. of Informatics, UiB, Nov. 28, 2002.


Outline
Mathematical ecology
– Management of living resources
Computation and statistics
– Models
– Numerical integration in high dimensions
– Automatic differentiation
Examples of 'assessment problems'
– Harp seals

Institute of Marine Research
Units
– Bergen, Tromsø, Flødevigen, Austevoll, Matre
500 employees
– 135 researchers
Center for marine resources
– Fish (cod, herring, capelin, …)
– Whales and seals

Management of living resources
'Assessment' = estimation of
– Current stock size
– Biological parameters: mortality rates, birth rates, …
Prediction of future stock size
– Given today's status
– Under a given management regime (quota)
Principles
– Maximize harvest
– Minimize risk of extinction
Uncertainty
– Important for determining risk

The complication
Acoustic and trawl surveys do NOT give absolute estimates of stock size:
I = q · U
where I = survey index and U = stock size.

Population dynamics
u_t = stock size at time t
Exponential growth: u_t = u_{t-1} · (1 + θ₁)
Density regulation: u_t = u_{t-1} · [1 + θ₁(1 − u_{t-1}/θ₂)]

22  1 = 0.05  2 = 500

Population dynamics
Exponential growth: u_t = u_{t-1} · (1 + θ₁)
Density regulation: u_t = u_{t-1} · [1 + θ₁(1 − u_{t-1}/θ₂)]
Harvest: u_t = (u_{t-1} − c_{t-1}) · [1 + θ₁(1 − u_{t-1}/θ₂)]
Stochastic growth rate: u_t = (u_{t-1} − c_{t-1}) · [1 + θ₁(1 − u_{t-1}/θ₂) + N(0, θ₃)_t]
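The density-regulated model above is easy to simulate. A minimal sketch (the starting stock of 100 and the 200-year horizon are illustrative choices, not from the talk), using the slide's example values θ₁ = 0.05 and θ₂ = 500:

```python
# Sketch: simulating the density-regulated model with harvest,
#   u_t = (u_{t-1} - c_{t-1}) * [1 + theta1*(1 - u_{t-1}/theta2)]
# Values theta1 = 0.05, theta2 = 500 are taken from the slides;
# the initial stock and horizon are illustrative assumptions.

def simulate(u0, theta1, theta2, catches):
    """Propagate the stock one step per catch value."""
    traj = [u0]
    for c in catches:
        u = traj[-1]
        traj.append((u - c) * (1 + theta1 * (1 - u / theta2)))
    return traj

traj = simulate(u0=100.0, theta1=0.05, theta2=500.0, catches=[0.0] * 200)
# With no harvest the stock grows toward the carrying capacity theta2 = 500.
```

With zero catches the trajectory converges monotonically to θ₂, which is the equilibrium of the density-regulation term.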

Additional complexity
– Age structure
– Spatial structure
– Interactions with other species
– Effect of the environment

Models in statistics
Parameters
– θ: structural parameter, dim(θ) ≈ 20
– u: state parameter, dim(u) ≈ 1000
Data
– y: 'measurements'
Probability distributions
– p(y | u, θ): probability of data given u and θ
– p(u | θ): probability of u given θ
– p(θ): 'prior' on θ

State-space model
State equation, p(u | θ):
u_t = (u_{t-1} − c_{t-1}) · [1 + θ₁(1 − u_{t-1}/θ₂) + N(0, θ₃)_t]
Observation equation, p(y | u, θ):
y_t = θ₄ · u_t + N(0, θ₅)_t
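Simulating from the state-space model makes the two equations concrete. A sketch with assumed parameter values (θ₃, θ₄, θ₅, the constant catch, and the horizon are illustrative, not from the talk):

```python
# Sketch (assumed parameter values): simulating the state-space model.
# State:       u_t = (u_{t-1} - c_{t-1}) * [1 + theta1*(1 - u_{t-1}/theta2) + eps_t],  eps_t ~ N(0, theta3)
# Observation: y_t = theta4 * u_t + eta_t,  eta_t ~ N(0, theta5)
import random

random.seed(1)

theta1, theta2, theta3 = 0.05, 500.0, 0.01   # growth rate, carrying capacity, process sd
theta4, theta5 = 0.8, 5.0                    # survey catchability, observation sd

u, us, ys = 100.0, [], []
for t in range(50):
    c = 2.0  # constant annual catch (illustrative)
    u = (u - c) * (1 + theta1 * (1 - u / theta2) + random.gauss(0.0, theta3))
    us.append(u)
    ys.append(theta4 * u + random.gauss(0.0, theta5))
```

Only the noisy index y_t would be observed in practice; the states u_t are latent, which is exactly why the marginal likelihood below requires integrating them out.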

Models in statistics
Joint probability distribution:
p(y, u, θ) = p(y | u, θ) · p(u | θ) · p(θ)
Marginal distribution:
p(y, θ) = ∫ p(y, u, θ) du

Statistical computation
Major challenge (today): evaluating the marginal
p(y, θ) = ∫ p(y, u, θ) du
Maximum likelihood estimation:
θ̂ = argmax_θ p(y, θ)

Algorithms
– Simple Monte Carlo integration
– Importance sampling
– Markov chain Monte Carlo
– Metropolis–Hastings algorithm
– Gibbs sampler
– EM algorithm
– Kalman filter (nonlinear)
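The first item on the list, simple Monte Carlo integration, can be sketched on a toy model (not the talk's; a 1-D Gaussian pair chosen so the exact answer is known): draw u from p(u) and average p(y | u).

```python
# Sketch (toy 1-D model): estimating the marginal
#   p(y) = ∫ p(y|u) p(u) du
# by simple Monte Carlo, drawing u from p(u).
# Toy choice: u ~ N(0,1), y|u ~ N(u,1), so exactly p(y) = N(y; 0, 2).
import math, random

random.seed(0)

def norm_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

y = 1.3
draws = [random.gauss(0.0, 1.0) for _ in range(200_000)]
mc_estimate = sum(norm_pdf(y, u, 1.0) for u in draws) / len(draws)
exact = norm_pdf(y, 0.0, 2.0)
```

In 1-D this works well; with dim(u) ≈ 1000, as on the earlier slide, naive Monte Carlo breaks down, which motivates importance sampling, MCMC, and the Laplace approximation.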

Laplace approximation
p(y, θ) = ∫ p(y, u, θ) du

Pierre-Simon Laplace (1749–1827)
Bayes' formula:
p(u | y) = p(y | u) · p(u) / p(y), where p(y) = ∫ p(y | u) · p(u) du
Replaces integration with maximization
Exact in the Gaussian case

Maximum likelihood estimation
Estimate θ by maximizing L(θ)
Gradient ∇_θ L by automatic differentiation

AD (Automatic Differentiation)
Given that we have code for L, AD gives you ∇_θ L
– at machine precision
– with 'no' extra programming
– efficiently (reverse-mode AD): cost(∇L) ≤ 4 · cost(L)
AD people: Bischof, Griewank, Steihaug, …
Software: ADIFOR, ADOL-C, ADMB, …
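A tiny sketch of the idea (forward-mode with dual numbers, for illustration only; the talk's tools use reverse mode, which is what gives the cheap-gradient bound): the same code that computes L also yields its derivative, exactly.

```python
# Sketch: forward-mode automatic differentiation via dual numbers.
# Each Dual carries a value and its derivative; arithmetic propagates both.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def L(theta):
    # any code computing the objective; AD differentiates it as written
    return theta * theta * theta + 2 * theta

out = L(Dual(3.0, 1.0))   # seed d(theta)/d(theta) = 1
# out.val = L(3) = 33, out.dot = L'(3) = 3*3^2 + 2 = 29, at machine precision
```

Reverse mode works through the same chain-rule bookkeeping but sweeps backwards, so one pass gives the whole gradient ∇_θ L at a cost bounded by a small multiple of cost(L), regardless of dim(θ).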

Laplace approximation again
p(y, θ) ≈ (2π)^{d/2} · det(H(θ))^{−1/2} · p(y, û(θ), θ)
where
û(θ) = argmax_u p(y, u, θ),  H(θ) = −∂²/∂u² log p(y, u, θ) |_{u = û(θ)},  d = dim(u)
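A 1-D sketch of the approximation on the same toy Gaussian pair as before (chosen because there the Laplace approximation is exact; the grid search and finite differences stand in for the Newton steps and analytical Hessians used in practice):

```python
# Sketch: Laplace approximation of p(y) = ∫ p(y|u) p(u) du in 1-D:
#   p(y) ≈ sqrt(2*pi / H) * g(u_hat),
# where g is the integrand, u_hat maximizes g, and H = -(log g)''(u_hat).
# Toy: u ~ N(0,1), y|u ~ N(u,1); Gaussian case, so the result is exact.
import math

def norm_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

y = 1.3
g = lambda u: norm_pdf(y, u, 1.0) * norm_pdf(u, 0.0, 1.0)  # integrand

# Inner maximization by crude grid search (a Newton iteration in practice)
u_hat = max((i / 1000 for i in range(-5000, 5000)), key=g)

# Numerical second derivative of -log g at u_hat
h = 1e-4
H = -(math.log(g(u_hat + h)) - 2 * math.log(g(u_hat)) + math.log(g(u_hat - h))) / h**2

laplace = math.sqrt(2 * math.pi / H) * g(u_hat)
exact = norm_pdf(y, 0.0, 2.0)
```

For non-Gaussian integrands the approximation carries an error, but it replaces a dim(u) ≈ 1000 integral with one inner optimization and one determinant, which is what makes maximum likelihood estimation feasible here.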

Research questions
– AD versus analytical formulae: which is more efficient?
– Higher-order derivatives (third order): less efficient than the gradient
– Sparseness structure

Constrained optimization formulation
Maximize jointly w.r.t. u and θ, under the constraint that u solves the inner problem:
∇_u log p(y, u, θ) = 0

BeMatA project
NFR-funded
Collaboration between:
– Inst. of Marine Research
– Inst. of Informatics, UiB
– University of Oslo
– Norwegian Computing Center, Oslo
Joins expertise in
– Statistical modeling
– Computer science / optimization

Fisheries assessments
Age determination of fishes, invented around 1900: like counting growth layers in a tree
'Fundamental law': u_{t,a} = (u_{t-1,a-1} − c_{t-1,a-1}) · (1 − m)
– t is an index of year
– a is an index of age
– m is the mortality rate

Recruitment mechanism
Number of spawners: U_t = Σ_a u_{t,a}
Number of fish 'born': u_{t,0} = R(U_t, θ)
Strong relationship: seals
Weak relationship: herring
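The 'fundamental law' plus a recruitment function closes the age-structured dynamics. A sketch (the mortality rate, catches, initial stock, and Beverton–Holt-type recruitment curve are all illustrative assumptions, not the talk's values):

```python
# Sketch: propagating an age-structured stock with the 'fundamental law'
#   u_{t,a} = (u_{t-1,a-1} - c_{t-1,a-1}) * (1 - m)
# and recruitment u_{t,0} = R(U_t) from the spawner total U_t = sum_a u_{t,a}.
# All numerical values below are illustrative assumptions.

m = 0.2        # annual mortality rate (assumed)
ages = 10

def recruit(spawners):
    # hypothetical Beverton-Holt-type recruitment curve R(U_t)
    return 50.0 * spawners / (100.0 + spawners)

stock = [100.0] * ages               # u_{0,a} for a = 0..9 (assumed)
for t in range(30):
    spawners = sum(stock)            # U_t
    catches = [0.5] * ages           # c_{t,a} (illustrative)
    older = [(stock[a - 1] - catches[a - 1]) * (1 - m) for a in range(1, ages)]
    stock = [recruit(spawners)] + older
```

Each year every cohort shifts one age class, losing its catch and a fraction m to mortality, while a new age-0 cohort enters through the recruitment curve; the saturating curve mimics the strong spawner–recruit relationship noted for seals.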

Case study: Harp seals
Barents Sea stock
– Currently 2 million individuals
Data
– Historical catch records
– Only pups can be counted
– Age samples on whelping grounds
Questions
– Pre-exploitation stock size
– Lowest level attained by the stock

Annotated references
BeMatA project
– Title: Latent variable models handled with optimization aided by automatic differentiation; application to marine resource
State-space models in fisheries
– Gudmundsson, G. (1994). Time-series analysis of catch-at-age observations. Applied Statistics, 43.
AD
– Griewank, A. (2000). Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. SIAM, Philadelphia.
AD and statistics
– Skaug, H.J. (2002). Automatic differentiation to facilitate maximum likelihood estimation in nonlinear random effects models. Journal of Computational and Graphical Statistics, 11.
Laplace approximation in statistics
– Tierney, L. and Kadane, J.B. (1986). Accurate approximations for posterior moments and marginal densities. Journal of the American Statistical Association, 81.