
Monitoring Principles Stella Swanson, Ph.D.

Principle #1: Know Why We Are Monitoring

Four basic reasons to monitor:
- Compliance Monitoring: to demonstrate compliance with license requirements
- Monitoring in Support of Certification: e.g. ISO; reclamation certificate
- Operational Monitoring for Adaptive Management: e.g. effluent treatment data in support of the continuous improvement goals of the Environmental Management Plan
- Regional Cumulative Effects: e.g. joint industry/government studies of airsheds

Principle #2: Monitoring Is Not Research

Monitoring cannot answer all questions. It is important to know when a question must be answered by research.

Research Versus Monitoring

Research
- Objective: investigate fundamental scientific questions
- Focus: test theory
- Outcomes: scientific papers; further development of theory
- Applications: input to monitoring programs, models, design of mitigation and reclamation; refinements to regulations

Monitoring
- Objective: demonstrate the effectiveness of environmental management and regulations
- Focus: specific questions regarding status, trends, or compliance
- Outcomes: databases, monitoring reports
- Applications: feedback to operations

Research Versus Monitoring: Example

Research
- Objective: study the effects of naphthenic acids plus salinity
- Focus: test the hypothesis that naphthenic acids and salinity act together to cause greater effects than either one separately, using laboratory and field observations
- Applications: predicting multiple-stressor effects

Monitoring
- Objective: test the toxicity of pit lake water using standard test species
- Focus: confirm predictions developed from research
- Applications: feedback to the EMS and decisions regarding requirements for treatment and/or additional dilution of pit lake water

Research Versus Monitoring: Quick Check

- If it is an interesting "what if" question, it is probably research.
- If it is a "let's check to be sure" question, it is probably monitoring.

When Will Research Be Required?

Examples of questions that cannot be answered without research:
- Baseline: What is the year-to-year variation in phytoplankton populations in regional lakes?
- Monitoring: What is the cause/effect relationship between variation in zooplankton community structure and exposure to OSPW?

Principle #3: Know the Questions We Are Asking

Monitoring must address specific questions. There are three main categories:
1. Status: point-in-time
2. Trends: temporal and spatial
3. Effects: project effects; cumulative effects

Status Questions: Examples

- Compliance monitoring: Are monthly means and yearly maxima within license limits?
- Certification monitoring: Do littoral zone performance criteria (e.g. macrophyte biomass) meet design requirements?
- Operational monitoring: Did the adjustment to the flow-through rate produce the expected results?
- Regional monitoring: Did the unusually wet spring affect the length of time that turbidity persisted in the lake?

Trend Questions: Examples

- Compliance monitoring: Are there seasonal trends in parameters that are governed by license limits?
- Certification monitoring: Are macrophyte cover and benthic invertebrate biomass in the lake increasing as predicted?
- Operational monitoring: Has the flow-through rate adjustment, made because of site-wide water management constraints, affected the naphthenic acid degradation rate?
- Regional monitoring: Have year-to-year trends in zooplankton populations in regional lakes been similar to those observed in the pit lake?

Effects Questions: Examples

- Compliance monitoring: Do chronic toxicity test results, using the required suite of tests, stay within license requirements?
- Certification monitoring: Do long-term monitoring quadrats in the littoral zone show the expected gradual build-up of a detrital layer on the sediments?
- Operational monitoring: Does the number of waterfowl interactions with pit lake water and sediments warrant a change in mitigation measures?
- Regional monitoring: Has there been a statistically significant change in fish growth rates or age distribution over the past 5 years in regional lakes compared to the pit lake?

Principle #4: Be Clear About the Purpose of Indicators

- Intrinsic importance: e.g. waterfowl
- Early warning: e.g. acute toxicity tests
  - minimal time lag in response to stress
  - low discrimination; a screening tool that accepts false positives
- Sensitive indicator: e.g. proportion of metal-sensitive invertebrate species
  - high fidelity in showing adverse effects
  - must be relevant to the state of the ecosystem
- Process/functional indicator: e.g. primary production

Principle #5: Use Consistent Criteria for Selecting Indicators

- High signal-to-noise ratio
- Rapid response
- Reliability/specificity of response
- Ease/economy of monitoring
- Ecological relevance
- Effectiveness of feedback to regulation and adaptive management

Application of Criteria for Indicator Selection Varies

- Compliance monitoring: rapid response
- Certification monitoring: reliability
- Operational monitoring: high signal-to-noise ratio; feedback to management
- Regional monitoring: reliability; ecological relevance; feedback to management

Ease/economy is always an important criterion and is correlated with the state of knowledge.

Principle #6: Define Acceptable Change

The definition will depend upon the type of monitoring:
- Status
- Trends
- Effects

Defining Acceptable Change: Status Monitoring

- Compliance monitoring: compare to license limits
- Certification monitoring: compare to certification requirements
- Operational monitoring: compliance with Environmental Management Plan objectives
- Regional monitoring: compare to baseline

Defining Acceptable Change: Trend Monitoring

- Compliance monitoring: e.g. spatial extent of water quality change within defined limits of the mixing zone; temporal maxima within license limits
- Certification monitoring: e.g. a 5-year record of littoral development
- Operational monitoring: e.g. consistent improvement in the ability to predict seasonal lake water quality
- Regional monitoring: e.g. consistent decline in metal concentrations with distance from point sources, as predicted in EIAs

Defining Acceptable Change: Effects Monitoring

Statistical definitions: e.g. a "critical effect size" of two standard deviations from a reference mean
- require reliable data on natural variability from valid reference areas
- require professional judgment, because the links between observations or experimental results and effects on population persistence, community structure, or ecosystem function can be highly uncertain
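The two-standard-deviation rule above can be sketched in a few lines. This is a minimal illustration, not a prescribed method; the function name and the benthic density figures are invented for the example, and a real program would also consider sample sizes, data transformations, and the professional-judgment caveats noted above.

```python
import statistics

def exceeds_critical_effect_size(reference, exposed, n_sd=2.0):
    """Flag an exceedance when the exposed-site mean falls more than
    n_sd reference standard deviations from the reference mean."""
    ref_mean = statistics.mean(reference)
    ref_sd = statistics.stdev(reference)  # natural variability across reference areas
    effect = abs(statistics.mean(exposed) - ref_mean)
    return effect > n_sd * ref_sd

# Illustrative benthic invertebrate densities (organisms per square metre)
reference_sites = [410, 395, 430, 402, 418, 388, 425]
exposed_site = [300, 310, 295, 305]

print(exceeds_critical_effect_size(reference_sites, exposed_site))  # → True
```

The reliability of this test rests entirely on the reference data: too few reference sites, or sites that are not truly comparable, and the standard deviation understates or overstates natural variability.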

Defining Acceptable Change: Effects Monitoring

Probabilistic definitions: e.g. "a 10% chance, or less, that 20% or more of the total population of forage fish would receive an exposure greater than the Ecological Benchmark Value" (Oregon DEQ 1998)
- require estimation of the probability of exposure
- require estimation of local population abundance
- require sufficient data to determine the EBV
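A probabilistic criterion like the one above can be evaluated by Monte Carlo simulation once an exposure distribution has been fitted to monitoring data. The sketch below is illustrative only: the EBV, the lognormal exposure distribution, and the population size are all assumptions standing in for site-specific data.

```python
import random

def prob_population_fraction_exceeds(ebv, exposure_sampler, population=500,
                                     fraction=0.20, trials=1000, seed=42):
    """Monte Carlo estimate of the chance that at least `fraction` of the
    population receives an exposure greater than the benchmark value `ebv`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Draw one simulated exposure per individual in the population
        exceed = sum(1 for _ in range(population) if exposure_sampler(rng) > ebv)
        if exceed >= fraction * population:
            hits += 1
    return hits / trials

# Illustrative lognormal exposure distribution (units arbitrary)
sampler = lambda rng: rng.lognormvariate(0.0, 0.5)

p = prob_population_fraction_exceeds(ebv=1.5, exposure_sampler=sampler)
print(p)  # compare against the 10%-chance acceptability criterion
```

In practice the exposure sampler would come from measured concentrations and habitat-use data, and the population size from abundance estimates, which is why the criterion demands all three data requirements listed above.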

Summary

- Know Why We Are Monitoring
- Monitoring Is Not Research
- Know the Questions We Are Asking (Status; Trends; Effects)
- Be Clear About the Purpose of Indicators
- Use Consistent Criteria for Selecting Indicators
- Define Acceptable Change