KSU Monitoring Designs # 1 WE PROBABLY COULD HAVE MORE FUN TALKING ABOUT THESE TRAFFIC STOPPERS

KSU Monitoring Designs # 2 WHO CLEARLY HAVE THE RIGHT OF WAY! BUT…

KSU Monitoring Designs # 3 DESIGNING MONITORING SURVEYS OVER TIME (PANEL SURVEYS) POWER, VARIANCE and RELATED TOPICS N. Scott Urquhart Senior Research Scientist Department of Statistics Colorado State University Fort Collins, CO

KSU Monitoring Designs # 4 OUTLINE  Anatomy Of Sampling Studies Of Ecological Responses Through Time  Collaborator = Tony Olsen, EPA, WED  Elaboration on Urquhart, N.S. (1981). Anatomy of a study. HortScience 16:  Survey Designs – GRTS – Work of Don Stevens  Temporal Designs  Power to detect trend – joint with Tom Kincaid  Uses components of variance  Current work = estimating variance  Work of Sarah Williams, finishing MS this month

KSU Monitoring Designs # 5 A CONTEXT  “EMAP-TYPE SITUATIONS”: EMAP = US EPA’s Environmental Monitoring and Assessment Program  Estimate status, changes, and trends in selected indicators of our nation’s ecological resources on a regional scale with known confidence.  Estimate status, changes, and trends in the extent and geographic coverage of our nation’s ecological resources on a regional scale with known confidence.  Describe associations between indicators of anthropogenic stress and indicators of condition.

KSU Monitoring Designs # 6 WHO MUST COMMUNICATE  Ecologists & Other Biologists  Statisticians  Geographers  Geographic Information Specialists  Information Managers  Quality Assurance Personnel  Managers, at Various Levels

KSU Monitoring Designs # 7 “SAMPLING”  A WORD OF MANY MEANINGS  A statistician often associates it with survey sampling  An ecologist may associate it with the selection of local sites or material  A laboratory scientist may associate it with the selection of material to be analyzed from the material supplied  Common general meaning, varied specific meanings

KSU Monitoring Designs # 8 THE SPECIAL NEED  Communication Demands a Distinction Between  The local process of evaluating a response, and  The statistical selection of a sampling unit, for example a lake, a point on a stream, or a point in vegetation  The terms  Response design  Sampling design or survey design  Can be used to make this distinction

KSU Monitoring Designs # 9 BASIC ROLES  Survey Design Tells Us Where To Go to Collect Sample Information or Material  Response Design Tells Us What To Do Once We Get There  But These Two Components Exist in a Broader Context

KSU Monitoring Designs # 10 AN IMPORTANT DISTINCTION  Monitoring Strategy  Conceptual  Impacted by objectives  Addressable without regard to the inference strategy  Inference Strategy  Places to evaluate the response  Relation between points evaluated and the population, i.e., the basis for inference

KSU Monitoring Designs # 11 SAMPLING STUDIES OF ECOLOGICAL RESPONSES THROUGH TIME HAVE  Monitoring Strategy  Universe model  Statistical population  Domain design  Response design  Inference Strategy  Survey design  Temporal design  Quality assurance design (The monitoring-strategy components exist regardless of the inference strategy; the inference-strategy components exist for any monitoring strategy.)

KSU Monitoring Designs # 12 The UNIVERSE MODEL  Reality (Universe): Ecological Entity Within a Defined Geographic Area to be Monitored  Model of the Universe:  Development of monitoring approach requires construction of a model for the universe  Elements Of The Universe Model: Set of Entities Composing the Entire Universe of Concern

KSU Monitoring Designs # 13 The UNIVERSE MODEL  Population Description And Its Sampling Require Definition Of the “Units” in the Population  Discrete units: Lakes may be viewed this way Individual trees can be viewed this way, too  Continuous structure in space of some dimension: 2-SPACE: Forests or Agroecosystems 1-SPACE: Streams 3-SPACE: Groundwater

KSU Monitoring Designs # 14 A CONTINUOUS MODEL FOR STREAMS (diagram of a stream network labeled by Strahler order: first-, second-, and third-order segments)

KSU Monitoring Designs # 15 The STATISTICAL POPULATION  The Collection of Units (as modeled) Over Some Region of Definition  Spatial  Temporal  Spatial and Temporal  Population Definition Could Include Features Which Depend on Response Values  EX: acid sensitive streams at upper elevations

KSU Monitoring Designs # 16 The DOMAIN Design  Specifies Subpopulations or “Domains” of Special Interest  May Specify Meaningful Comparisons Between Domains  Similar to “planned comparisons” in experimental design situations  Domain design may depend on response values EX: Warm versus cold water lakes

KSU Monitoring Designs # 17 The RESPONSE DESIGN  The Response Design Specifies  The process of obtaining a response At an individual element (site) Of the resource During a single monitoring period  Response: What Will Be Determined on an Element Needs to be responsive to the objectives of the monitoring activity

KSU Monitoring Designs # 18 The INFERENCE STRATEGY  Is The Basis For Scientific Inference  Provides The Connection Between Objectives and the Monitoring Strategy  Monitoring Strategy Usually Must Rely On Obtaining Information on a Subset Of All Possible Elements in the Universe  Specifies Which Elements of the Universe Will Have Responses Determined on Them  Can Be Based on Either  Judgment selection of units Inferential validity rests on knowledge of the relation between the universe and the units evaluated –Why do a study if you know this much about the population?  Probability selection of units The focus here

KSU Monitoring Designs # 19 The SURVEY Design  Probability-Based Survey Designs are Considered Here May Be Somewhat Limited To Sedentary Resources  Positive Features -- As An Observational Study  Permit clear statistical inference to well-defined populations  Measurements often can be made in natural settings, giving greater realism to results

KSU Monitoring Designs # 20 The SURVEY DESIGN - CONTINUED  Disadvantages  Limited control over predictor variables  Restricts causative inference  Usually will produce inaccessible sampling points Good - for inference Bad - for logistics

KSU Monitoring Designs # 21 The TEMPORAL Design  The TEMPORAL DESIGN specifies the pattern of revisits to sites selected by the Survey Design  Sampled population units are partitioned into one (degenerate case) or more PANELS.  Each population unit in the same panel has the same temporal pattern of revisits.  Panel definition could be probabilistic or systematic  Several temporal designs follow after a brief discussion of the rest of the Anatomy, and a bit on site selection.

KSU Monitoring Designs # 22 QUALITY ASSURANCE DESIGN  Defines Those Activities Intended to Provide Data of Known Quality: Blind duplicates Accepted chemical standards, etc.  Can Provide Valid Estimates of the Variance Of Pure Measurement Error
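As a small illustration of the last point: if the two determinations in a blind duplicate pair differ only by independent measurement errors, the squared within-pair difference has expectation twice the measurement-error variance. A minimal sketch, with made-up duplicate values (the numbers and variable names are illustrative only):

```python
import numpy as np

# Hypothetical blind-duplicate pairs: two independent determinations made
# on the same field sample.  Assuming no systematic bias between the two
# determinations, the within-pair difference d has variance 2*sigma^2_meas,
# so mean(d^2)/2 estimates the pure measurement-error variance.
pairs = np.array([[6.1, 6.3],
                  [5.8, 5.7],
                  [7.2, 7.0],
                  [6.5, 6.6]])
d = pairs[:, 0] - pairs[:, 1]
var_meas = np.mean(d ** 2) / 2.0
print(f"estimated measurement-error variance: {var_meas:.4f}")
```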

KSU Monitoring Designs # 23 ON SITE SELECTION  Systematically Selected Sites  Good for means & totals, but do not support a design-based estimate of variance  Probably OK for large areas like national forests  Systematic designs can systematically miss things that have a natural layout. EX: Triangular grid (deliberately skewed) in early EMAP got fouled up with –Coastline in the Northeast –The canal network in Florida –Lakes east of the Cascade Mountain Range in Oregon  How to select spatially balanced, but random, sites?

KSU Monitoring Designs # 24 GENERALIZED RANDOM TESSELLATION STRATIFIED (GRTS) DESIGN  Due to Don Stevens – see references  Allows  A continuous population model  Variable density sampling by defined areas  Accommodates an “imperfect frame” = reality  Sequential addition of points while maintaining spatial balance  Differing measurements Lots of points for inexpensive measures A subset for more expensive measures A further subset for very expensive measures Implemented in Southern California Bight

KSU Monitoring Designs # 25 GENERALIZED RANDOM TESSELLATION STRATIFIED (GRTS) DESIGN  Two GIS-based implementations  EMAP R code operates on ARC “Shape” files, and returns points there Begin at  STARMAP – Dave Theobald RRQRR operates completely in ArcGIS  Both Allow Variable (spatial) Sampling Rates  Generally much better than stratification  (We can talk about this more if you want)
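Neither the EMAP R code nor RRQRR is reproduced here, but the core idea behind GRTS — hierarchically randomized addressing of the frame followed by a systematic draw along the randomized order — can be sketched briefly. This is a toy illustration under simplifying assumptions (a point frame in the unit square, equal inclusion probabilities, no reverse hierarchical ordering, no imperfect-frame handling), not the Stevens–Olsen algorithm itself; the function and variable names are made up.

```python
import numpy as np

def grts_like_sample(coords, n, levels=8, seed=0):
    """Toy GRTS-style spatially balanced selection of n points.

    coords : (N, 2) array of x, y positions scaled to [0, 1)^2.
    Real GRTS adds unequal inclusion probabilities, imperfect-frame
    handling, and reverse hierarchical ordering of the selected sample;
    this sketch only shows hierarchical randomization plus a systematic
    draw along the randomized address order.
    """
    rng = np.random.default_rng(seed)
    N = coords.shape[0]
    address = np.zeros(N, dtype=np.int64)   # randomized hierarchical address
    cell = np.zeros(N, dtype=np.int64)      # un-randomized cell id, used to group points
    xy = coords.astype(float).copy()
    for _ in range(levels):
        # quadrant (0..3) of each point within its current cell
        quad = (xy[:, 0] >= 0.5).astype(int) + 2 * (xy[:, 1] >= 0.5).astype(int)
        # independent random relabelling of the four quadrants within every cell
        perms = {c: rng.permutation(4) for c in np.unique(cell)}
        digit = np.array([perms[c][q] for c, q in zip(cell, quad)])
        address = address * 4 + digit
        cell = cell * 4 + quad
        # zoom into each point's quadrant for the next level
        xy = np.where(xy >= 0.5, xy - 0.5, xy) * 2.0
    order = np.argsort(address, kind="stable")
    # systematic sample with a random start along the randomized ordering
    start = rng.uniform(0, N / n)
    picks = order[np.floor(start + np.arange(n) * N / n).astype(int)]
    return picks

# usage: 50 spatially balanced points from 5000 candidate sites
pts = np.random.default_rng(1).uniform(size=(5000, 2))
sample_idx = grts_like_sample(pts, n=50)
```

Ordering by the randomized address keeps spatially close points close together in the list, so the systematic draw spreads the sample across the region — the spatial-balance property claimed for GRTS above.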

KSU Monitoring Designs # 26 THE FOLLOWING MATERIAL WAS ADAPTED FROM Urquhart, N.S. and T.M. Kincaid (1999). Designs for detecting trend from repeated surveys of ecological resources. Journal of Agricultural, Biological and Environmental Statistics 4: Initially presented at the invited conference Environmental Monitoring Surveys Over Time, held at the University of Washington, Seattle, in 1998.

KSU Monitoring Designs # 27 MOTIVATING SITUATION  In 1986 the Oregon Department of Fish and Wildlife Sought a “One Time” Probability Sampling Design To Survey Coastal Salmon. They Used It In  It showed earlier estimates of salmon returns to spawn to have been grossly overstated.  Consequence: continue to repeat an available design.  How Good Is The Repeated Use Of Such a Design For Estimating Trend?

KSU Monitoring Designs # 28 CONCLUSIONS  General: Power for Trend Detection  Planned revisits are far superior to obtaining revisits from random “hits”  Year Variance: Power Deteriorates Fast as Year Variance Increases  Site Variance:  No problem with revisit designs.  Without revisits it increases residual variance.  Sampling Rate: Power Increases with Sampling Rate (No surprise!)

KSU Monitoring Designs # 29 EVALUATION CONTEXT  General Perspective  Finite population sampling  But model assisted A generalization of the “error analysis” perspective of samplers But recognizing realities of natural resource sampling  Specific Perspective  Finite population, e.g., of stream segments.  Response exists continuously in time, or at least for recurring blocks of time.  Take independent samples at different points in time (during an “index window”)

KSU Monitoring Designs # 30 EVALUATION CONTEXT (CONTINUED)  Model:  Sites (or stream segments) = a random effect  Years = a random effect, but may contain trend  Residual = a random effect Specific evaluation time Variation introduced by collection protocol Crew effect, if present –(often present for large surveys) “Measurement error” - broadly interpreted

KSU Monitoring Designs # 31 PANEL PLANS = “TEMPORAL DESIGNS”  Sampled Population Units are Partitioned into One (Degenerate Case) or More Panels  Each population unit in the same panel has the same temporal pattern of revisits.  Panel definition could be probabilistic or systematic  Specific Plans (see the code sketch after Temporal Design #6 below)  Always revisit  Never revisit (= repeated surveys)  Random revisits and other plans

KSU Monitoring Designs # 32 TEMPORAL DESIGN #1: ALWAYS REVISIT = ONE PANEL (This is Wayne Fuller’s “PURE PANEL”)

KSU Monitoring Designs # 33 TEMPORAL DESIGN #2: NEVER REVISIT = NEW PANEL EACH YEAR (INDEPENDENT SURVEYS IN A LARGE POPULATION)

KSU Monitoring Designs # 34 TEMPORAL DESIGN #3: ROTATING PANEL like NASS

KSU Monitoring Designs # 35 TEMPORAL DESIGN #3: ROTATING PANEL  A Rotating Panel Design Is The Temporal Design Used By The National Agricultural Statistics Service (US - “NASS”)  This Temporal Design Is “Connected” In The Experimental Design Sense  It is fairly well suited for estimating “status,”  But not particularly powerful for detecting trend over intermediate time spans

KSU Monitoring Designs # 36 TEMPORAL DESIGN #4: SERIALLY ALTERNATING (ORIGINAL EMAP)  This Temporal Design Is “Unconnected” in the Experimental Design Sense.

KSU Monitoring Designs # 37 TEMPORAL DESIGN #5: AUGMENTED SERIALLY ALTERNATING (CURRENTLY USED BY EMAP FOR SURFACE WATERS)  This Temporal Design Is “Connected” in the Experimental Design Sense.

KSU Monitoring Designs # 38 TEMPORAL DESIGN #6: RANDOM PANELS
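The visit patterns behind the temporal designs above are simply panel-by-year indicator matrices (rows = panels, columns = years, 1 = panel visited that year). The sketch below builds several of them for an eight-year horizon; the four-year rotation window and four-year serially alternating cycle are illustrative assumptions, not values read off the slides.

```python
import numpy as np

n_years = 8

# Temporal design #1 -- always revisit (pure panel): one panel, visited every year.
always = np.ones((1, n_years), dtype=int)

# Temporal design #2 -- never revisit: a fresh panel each year.
never = np.eye(n_years, dtype=int)

# Temporal design #3 -- rotating panel (NASS-like): each panel is visited for a
# run of consecutive years, then retired (the 4-year run is illustrative).
rotating = np.zeros((n_years, n_years), dtype=int)
for p in range(n_years):
    rotating[p, p:p + 4] = 1

# Temporal design #4 -- serially alternating (original EMAP), assuming a
# 4-year cycle: panel p is visited in years p, p+4, p+8, ...
serial = np.zeros((4, n_years), dtype=int)
for p in range(4):
    serial[p, p::4] = 1

# Temporal design #5 -- augmented serially alternating: the serially
# alternating panels plus one always-revisit panel.
augmented = np.vstack([np.ones((1, n_years), dtype=int), serial])

for name, m in [("always revisit", always), ("never revisit", never),
                ("rotating", rotating), ("serially alternating", serial),
                ("augmented serially alternating", augmented)]:
    print(name)
    print(m)
```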

KSU Monitoring Designs # 39 STATISTICAL MODEL  Consider A Finite Population Of Sites  {S_1, S_2, …, S_N}  and a Time Series Of Response Values At Each Site:  A finite population of time series  Time is continuous, but suppose Only a sample can be observed in any year, and Only during an index window of, say, 10% of a year

KSU Monitoring Designs # 40 STATISTICAL MODEL -- II
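The equations on this slide and the next two did not survive transcription. A plausible reconstruction, consistent with the components listed on slide 30 (sites, years, and residuals as random effects, with trend entering through the year means), is the mixed model below; the exact parameterization in Urquhart and Kincaid (1999) may differ.

```latex
% Hedged reconstruction of the trend-detection mixed model; notation assumed, not transcribed.
\[
  Y_{ij} \;=\; \mu \;+\; \beta\, t_j \;+\; S_i \;+\; T_j \;+\; \varepsilon_{ij},
\]
\[
  S_i \sim N\!\left(0,\sigma^2_{\text{site}}\right), \qquad
  T_j \sim N\!\left(0,\sigma^2_{\text{year}}\right), \qquad
  \varepsilon_{ij} \sim N\!\left(0,\sigma^2_{\text{resid}}\right),
\]
% Y_{ij}: response at site i in year t_j; beta: linear trend of interest;
% the three random effects are taken to be mutually independent.
```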

KSU Monitoring Designs # 41

KSU Monitoring Designs # 42 STATISTICAL MODEL -- III

KSU Monitoring Designs # 43 STATISTICAL MODEL -- IV  If P Indexes Panels, Then  Sites are nested in panels: p(i), and  Years of visit are indicated by panel, with n_pj > 0 or n_pj = 0 for panels visited or not visited in year j  The vector of cell means (of “visited” cells) has a covariance matrix Σ

KSU Monitoring Designs # 44 STATISTICAL MODEL -- V  Now Let X Denote a Regressor Matrix Containing a Column Of 1’s and a Column of the Numbers of the Time Periods Corresponding to the Filled Cells. The Second Element of the Resulting Generalized Least Squares Estimate Contains an Estimate Of Trend and its Standard Error.
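A minimal numeric sketch of that estimator: given the vector of visited cell means, an assumed-known covariance matrix Σ built from the variance components, and the regressor matrix X described above, generalized least squares gives the trend estimate and its standard error. The helper name gls_trend and the toy numbers are illustrative, not taken from the talk.

```python
import numpy as np

def gls_trend(ybar, years, Sigma):
    """GLS fit of (intercept, trend) to cell means with covariance Sigma.

    Returns the coefficient vector beta = (X' S^-1 X)^-1 X' S^-1 ybar and
    the standard error of the trend (the second coefficient).
    """
    X = np.column_stack([np.ones_like(years, dtype=float), years])
    Sigma_inv = np.linalg.inv(Sigma)
    cov_beta = np.linalg.inv(X.T @ Sigma_inv @ X)
    beta = cov_beta @ X.T @ Sigma_inv @ ybar
    return beta, np.sqrt(cov_beta[1, 1])

# toy usage: a five-year always-revisit series with a made-up covariance
years = np.arange(5, dtype=float)
ybar = np.array([3.1, 3.0, 3.4, 3.3, 3.6])
Sigma = 0.2 * np.eye(5) + 0.05        # illustrative, not estimated from data
beta_hat, se_trend = gls_trend(ybar, years, Sigma)
print(beta_hat, se_trend)
```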

KSU Monitoring Designs # 45 TOWARD POWER  Ability of a Panel Plan to Detect Trend Can Be Expressed As Power.  We Will Evaluate Power in Terms of Ratios of the Variance Components.

KSU Monitoring Designs # 46 A SIMULATION STUDY TO MAKE POWER COMPARISONS  n = 60  N = 60, 240, 600, 1200, 10,000  ==> Sampling rates of 100%, 25%, 10%, 5%, ~ 0%
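A self-contained Monte Carlo sketch of one cell of such a study, for the always-revisit design: simulate the mixed model, regress yearly means on year (with every site revisited, the site effects cancel out of the yearly means), and count rejections. The variance-component values echo the appendix slides but are otherwise hypothetical, and this simplified stand-in is not a reproduction of the paper's simulation.

```python
import numpy as np
from scipy import stats

def power_always_revisit(n_sites=60, n_years=10, trend=0.02,
                         var_site=1.875, var_year=0.075, var_resid=1.0,
                         alpha=0.05, n_reps=2000, seed=0):
    """Monte Carlo power to detect a linear trend under the always-revisit
    (pure panel) design, using the assumed model
    y_ij = mu + trend*j + S_i + Y_j + e_ij."""
    rng = np.random.default_rng(seed)
    years = np.arange(n_years)
    rejections = 0
    for _ in range(n_reps):
        site = rng.normal(0.0, np.sqrt(var_site), n_sites)           # S_i
        year = rng.normal(0.0, np.sqrt(var_year), n_years)           # Y_j
        resid = rng.normal(0.0, np.sqrt(var_resid), (n_sites, n_years))
        y = trend * years + site[:, None] + year[None, :] + resid
        ybar = y.mean(axis=0)                                         # yearly means
        slope, intercept, r, p, se = stats.linregress(years, ybar)
        rejections += (p < alpha)
    return rejections / n_reps

print(power_always_revisit(n_years=10))   # short record
print(power_always_revisit(n_years=20))   # longer record
```

Raising var_year in this sketch drives the power down quickly while var_site has little effect, which is the pattern summarized in the conclusions slides.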

KSU Monitoring Designs # 47 ALWAYS REVISIT, or EMAP-LIKE

KSU Monitoring Designs # 48 ALWAYS REVISIT, or EMAP-LIKE (power curves; year variances 0.075, 0.15, 0.30 shown)

KSU Monitoring Designs # 49 ALWAYS REVISIT, or EMAP-LIKE

KSU Monitoring Designs # 50 NEVER REVISIT vs. ALWAYS REVISIT, or EMAP-LIKE (overlaid power curves)

KSU Monitoring Designs # 51 RANDOM REVISIT vs. ALWAYS REVISIT, or EMAP-LIKE vs. NEVER REVISIT (overlaid power curves)

KSU Monitoring Designs # 52 NEVER REVISIT vs. ALWAYS REVISIT, or EMAP-LIKE vs. RANDOM REVISIT (overlaid power curves)

KSU Monitoring Designs # 53 ROTATING PANEL

KSU Monitoring Designs # 54 CONCLUSIONS  General: Power for Trend Detection  Planned revisits are far superior to obtaining revisits from random “hits”  Year Variance: Power Deteriorates Fast as Year Variance Increases  Site Variance:  No problem with revisit designs.  Without revisits it increases residual variance.  Sampling Rate: Power Increases with Sampling Rate (No surprise!)

KSU Monitoring Designs # 55 CURRENT WORK  Stevens, D.L. Jr. and A.R. Olsen (2003). Variance estimation for spatially balanced samples of environmental resources. Environmetrics 14:  Proposed a local estimator for variance.  I have been using some variance component estimators. How do these two approaches relate? Should one be used rather than the other?  MS Student – Sarah Williams  Use local estimator for things like status measures Because it includes some site variance  Use components of variance for trend studies Revisits to sites remove most of the effect of that component  Currently investigating variance component of trend  And its impact on trend detection

KSU Monitoring Designs # 56 FUNDING ACKNOWLEDGEMENT This research is funded by U.S. EPA – Science To Achieve Results (STAR) Program Cooperative Agreement # CR The work reported here today was developed under the STAR Research Assistance Agreement CR awarded by the U.S. Environmental Protection Agency (EPA) to Colorado State University. This presentation has not been formally reviewed by EPA. The views expressed here are solely those of the presenter and STARMAP, the program he represented. EPA does not endorse any products or commercial services mentioned in this presentation.

KSU Monitoring Designs # 57 DISTRIBUTION OF SIMULATED POWER: 10 YEARS (SITE VARIANCE = 1.875; YEAR VARIANCE: panels for sampling rates of 25%, 10%, and 5%)

KSU Monitoring Designs # 58 DISTRIBUTION OF SIMULATED POWER: 20 YEARS (SITE VARIANCE = 1.875; YEAR VARIANCE: panels for sampling rates of 25%, 10%, and 5%)

KSU Monitoring Designs # 59 DISTRIBUTION OF SIMULATED POWER: 10 YEARS (SITE VARIANCE = 2.50; YEAR VARIANCE: panels for sampling rates of 25%, 10%, and 5%)

KSU Monitoring Designs # 60 DISTRIBUTION OF SIMULATED POWER: 20 YEARS (SITE VARIANCE = 2.50; YEAR VARIANCE: panels for sampling rates of 25%, 10%, and 5%)