Design of Experiments Questions
Network Inference Working Group
October 8, 2008

Questions: Model Classification

How do we classify models?
– Linear
– Dynamic Bayesian networks
– Log transformed
– Differential equations
– Polynomial dynamical systems (see the sketch below)
– Hybrid: discrete-continuous (for example, Glass networks)

What kinds of restrictions do we put on a model space in order to identify models from a given data set?

Distinguish between correlation and causation (direct vs. indirect; with and without strengths).
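As a minimal, hypothetical sketch (not part of the original slides) of one model class listed above, the code below builds a polynomial dynamical system over Z/2 with three nodes: each coordinate update is a polynomial in the current state, and iterating the map produces the network dynamics. The node functions f1, f2, f3 are illustrative choices, not taken from any particular network.

from itertools import product

# Coordinate functions over Z/2: XOR is addition, AND is multiplication.
def f1(x1, x2, x3): return (x2 * x3) % 2          # x1' = x2 * x3
def f2(x1, x2, x3): return (x1 + x3) % 2          # x2' = x1 + x3
def f3(x1, x2, x3): return (x1 * x2 + x2) % 2     # x3' = x1 * x2 + x2

def step(state):
    x1, x2, x3 = state
    return (f1(x1, x2, x3), f2(x1, x2, x3), f3(x1, x2, x3))

# Print the full state transition table, i.e. the dynamics of the model.
for state in product((0, 1), repeat=3):
    print(state, "->", step(state))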

Questions: Model Identification

What kinds of data sets identify a fixed model class?
– Linear, i.e. sums of variables
– Products of variables, perhaps over a fixed field, such as Z/2
– How many random data points (perhaps independent samples) will identify a monomial dynamical system with fixed degree? (see the sketch below)
– How many correlated data points (i.e., time course) will identify…?

What kinds of models are identifiable by a fixed class of data set?
– Orthogonal arrays
– Factorial designs
– Symmetric designs

What kinds of data identify a given nested canalyzing function?
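The following is a small, hypothetical sketch (not from the slides) of the random-data-point question above: enumerate all candidate monomial functions of bounded degree over Z/2, then count how many random input/output samples it takes to rule out every candidate but one. The choices of n = 4 variables, maximum degree 2, and the "true" support (0, 2) are arbitrary illustrations.

import random
from itertools import combinations

n, max_degree = 4, 2

def monomial(support):
    # The Z/2 function x -> product of x_i over i in support.
    return lambda x: 1 if all(x[i] == 1 for i in support) else 0

# All monomials of degree 1..max_degree in n variables.
candidates = {support: monomial(support)
              for d in range(1, max_degree + 1)
              for support in combinations(range(n), d)}

true_f = candidates[(0, 2)]     # the "unknown" system we pretend to identify

random.seed(1)
consistent = dict(candidates)
samples = 0
while len(consistent) > 1:
    x = tuple(random.randint(0, 1) for _ in range(n))
    y = true_f(x)
    consistent = {s: f for s, f in consistent.items() if f(x) == y}
    samples += 1

print("identified support", next(iter(consistent)), "after", samples, "samples")

Averaging the sample count over many runs would give an empirical answer to the "how many random data points" question for this tiny model space.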

Questions: Model Validation

How do we test the effect of a variable/term (sum of terms) on a function?
– Sensitivity analysis (e.g., for computer programs) (see the sketch below)
– Information-theoretic measures

ANOVA (or some other measure of variance) for discrete data?
– Discrete noise?
– Look to coding theory literature
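As a hypothetical illustration of the sensitivity-analysis item above (not part of the original slides), one discrete measure of a variable's effect on a function is its activity: the fraction of states at which flipping that input flips the output. The Boolean function f below is an arbitrary example.

from itertools import product

def f(x):
    x1, x2, x3 = x
    return x1 and (x2 or x3)    # example function on three inputs

n = 3
states = list(product((0, 1), repeat=n))

for i in range(n):
    flips = 0
    for x in states:
        y = list(x)
        y[i] ^= 1               # flip input i
        if f(x) != f(tuple(y)):
            flips += 1
    print(f"activity of x{i + 1}: {flips / len(states):.3f}")

An information-theoretic variant would replace the flip count with, for example, the mutual information between an input and the output under a chosen input distribution.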