An Introduction to AD Model Builder PFRP

Instructors
- Anders Nielsen (Technical University of Denmark, DTU-Aqua)
- Johnoel Ancheta (Pelagic Fisheries Research Program, PFRP)
- Mark Maunder (Inter-American Tropical Tuna Commission, IATTC)

Introduce yourself
- Name
- Organization
- Main research

Questionnaire
- What do you know?
- Remember to ask participants about WinBUGS.

What is AD Model Builder?
- A tool for developing nonlinear models
- Efficient estimation of model parameters
- C++ libraries
- A template for model specification

Simplifying the development of models
- Removes the need to manage the interface between the model parameters and the function minimizer.
- The template makes it easy to read data into and out of the model, set up the parameters to estimate, and define the objective function to optimize (minimize).
- Adding estimable parameters, or converting fixed parameters into estimable ones, is a simple process.
- ADMB is very flexible because the model code is C++.
- Experienced C++ programmers can create their own libraries.
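To make the template idea concrete, a minimal TPL file might look like the sketch below. The section names (DATA_SECTION, PARAMETER_SECTION, PROCEDURE_SECTION) are standard ADMB; the data and parameter names are hypothetical, for a simple linear regression:

```
DATA_SECTION
  init_int n              // number of observations
  init_vector x(1,n)      // predictor
  init_vector y(1,n)      // response
PARAMETER_SECTION
  init_number a           // intercept (estimated)
  init_number b           // slope (estimated)
  objective_function_value f
PROCEDURE_SECTION
  f = norm2(y - (a + b*x));   // least-squares objective to minimize
```

Declaring an additional `init_number` in the PARAMETER_SECTION is all it takes to add another estimable parameter, which is what makes extending a model so quick.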

Efficient and stable function minimizer
- Analytical derivatives (adjoint code, chain rule)
- More efficient and stable than packages that rely on finite-difference approximations
- Stepwise (phased) process to sequentially estimate the parameters
- Bounds on all estimated parameters to restrict the range of possible parameter values
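The advantage of chain-rule derivatives over finite differences can be illustrated with a tiny forward-mode "dual number" class. ADMB itself uses reverse-mode (adjoint) AD, so this is only a sketch of the chain-rule idea, not ADMB's implementation; all names here are illustrative:

```cpp
#include <cassert>
#include <cmath>

// A dual number carries a value and its derivative with respect to one input.
// Each overloaded operator applies the chain rule, so derivatives are exact
// to machine precision rather than step-size dependent.
struct Dual {
    double v;  // value
    double d;  // derivative
};

Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }
Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
Dual dsin(Dual a) { return {std::sin(a.v), std::cos(a.v) * a.d}; }

// f(x) = x^2 + sin(x); exact derivative is f'(x) = 2x + cos(x)
Dual f(Dual x) { return x * x + dsin(x); }

double f_plain(double x) { return x * x + std::sin(x); }

// Central finite-difference approximation for comparison: its accuracy
// depends on choosing a good step size h, which AD avoids entirely.
double fd_derivative(double x, double h) {
    return (f_plain(x + h) - f_plain(x - h)) / (2.0 * h);
}
```

Evaluating `f({1.0, 1.0})` (seed derivative 1) returns both f(1) and f'(1) in a single pass, with no step size to tune.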

MCMC algorithm for Bayesian integration
- Starts at the mode of the posterior, which reduces the burn-in time
- Jumping rules based on the variance-covariance estimates at the mode of the posterior distribution
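A toy random-walk Metropolis sampler shows the idea: start the chain at the posterior mode and scale proposal steps by the posterior standard deviation implied by the curvature (inverse Hessian) there. This is a sketch of the general technique, not ADMB's actual `-mcmc` implementation, and the normal target is hypothetical:

```cpp
#include <cassert>
#include <cmath>
#include <random>
#include <vector>

// Log-posterior, up to an additive constant, for a hypothetical N(mu, sigma^2).
double log_post(double x, double mu, double sigma) {
    double z = (x - mu) / sigma;
    return -0.5 * z * z;
}

// Random-walk Metropolis: start at the mode (no burn-in wasted finding the
// high-probability region) with proposal sd scaled by the curvature there.
std::vector<double> metropolis(double mu, double sigma, int n, unsigned seed) {
    double x = mu;                 // start at the posterior mode
    double prop_sd = 2.4 * sigma;  // classic 1-D random-walk scaling
    std::mt19937 gen(seed);
    std::normal_distribution<double> step(0.0, prop_sd);
    std::uniform_real_distribution<double> unif(0.0, 1.0);
    std::vector<double> chain;
    chain.reserve(n);
    for (int i = 0; i < n; ++i) {
        double y = x + step(gen);
        if (std::log(unif(gen)) < log_post(y, mu, sigma) - log_post(x, mu, sigma))
            x = y;  // accept the proposal
        chain.push_back(x);
    }
    return chain;
}
```

With a long enough chain, the sample mean recovers the posterior mean; starting at the mode means early draws are already in the bulk of the posterior.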

Automated likelihood profiles
- Normal-approximation confidence intervals based on the Hessian matrix, extended to derived quantities via the delta method
- Automatically calculates likelihood profiles for model parameters and derived quantities, producing asymmetric confidence intervals
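The delta-method step can be sketched in a few lines: the variance of a derived quantity g(theta) is approximated by grad(g)^T Cov(theta) grad(g), where Cov(theta) is the variance-covariance matrix from the inverse Hessian at the estimates. The two-parameter example below is hypothetical:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Delta method for a derived quantity of two estimated parameters:
// Var[g(theta)] ~= grad^T * Cov * grad.
// E.g. for g(a, b) = a * exp(b), grad = { exp(b), a * exp(b) },
// evaluated at the parameter estimates.
double delta_var(const std::array<double, 2>& grad,
                 const std::array<std::array<double, 2>, 2>& cov) {
    double v = 0.0;
    for (int i = 0; i < 2; ++i)
        for (int j = 0; j < 2; ++j)
            v += grad[i] * cov[i][j] * grad[j];
    return v;
}
```

The square root of this variance gives the normal-approximation standard error of the derived quantity; profile likelihoods improve on this when the quantity's distribution is asymmetric.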

Random effects parameters
- Implemented using the Laplace approximation (and importance sampling)
- Automatic analytical second derivatives
- Useful for process error or meta-analysis
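The Laplace approximation replaces the integral over a random effect with a Gaussian approximation around its conditional mode: the integral of exp(h(u)) du is approximated by exp(h(û)) · sqrt(2π / −h''(û)), where û maximizes h — which is why analytical second derivatives matter. A one-dimensional toy example (h(u) = −cosh(u), a hypothetical log-integrand with mode at 0):

```cpp
#include <cassert>
#include <cmath>

const double PI = std::acos(-1.0);

// Log-integrand and its second derivative; for h(u) = -cosh(u) they coincide.
double h(double u) { return -std::cosh(u); }
double h2(double u) { return -std::cosh(u); }

// Laplace approximation of the integral of exp(h(u)) du, given the mode u_hat.
double laplace(double u_hat) {
    return std::exp(h(u_hat)) * std::sqrt(2.0 * PI / -h2(u_hat));
}

// Brute-force midpoint-rule integral for comparison.
double numeric_integral() {
    double s = 0.0, du = 1e-3;
    for (double u = -10.0; u < 10.0; u += du)
        s += std::exp(h(u + du / 2.0)) * du;
    return s;
}
```

For this non-Gaussian integrand the approximation is within roughly ten percent of the true integral; for integrands closer to Gaussian (as random-effects contributions often are) it is far more accurate, at a tiny fraction of the cost of numerical integration.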

Matrix algebra
- Matrix algebra with precompiled adjoint code for the derivative calculations
- Can greatly reduce computation time and memory usage compared to explicit loops

Other features
- Nonlinear programming solver
- Numerical integration routines
- Random number generation
- High-dimensional and ragged arrays
- Estimation of the variance-covariance matrix
- Dynamic link libraries for use with other software (e.g., S-PLUS, Excel, Visual Basic)
- Safe-mode compilation with bounds checking
- Ability to build ADMB C++ libraries
- Parallel processing

What it's good for: highly parameterized nonlinear models
- Thousands of parameters
- Combining many data sets or analyses
- General models, e.g., Stock Synthesis (Rick Methot, NMFS)

What it's good for: nonlinear models with large data sets
- Integrating GLMs into nonlinear models

What it's good for: numerous optimizations of the objective function
- Simulation analysis
- Likelihood profiles
- Bootstrap / cross-validation
- Model testing / sensitivity analysis
- Management strategy evaluation
- Numerical integration / simulated likelihood

What it's good for: nonlinear mixed-effects models
- Crossed random effects
- Nonlinear state-space models

The ADMB project
- Make ADMB free
- Make ADMB open source
- Develop ADMB
- Facilitate the use of ADMB
- Promote ADMB

Outline
1. Introduction
2. Installation
3. First example
4. Likelihood-based inference
5. What happens internally
6. Parameter setup
7. Data input and outputting results
8. Simulation
9. Estimating uncertainty
10. Random effects
11. Summary