
1 Robust Induction of Process Models from Time-Series Data
Pat Langley, Dileep George, Stephen Bay
Computational Learning Laboratory, Center for the Study of Language and Information, Stanford University, Stanford, CA
Kazumi Saito
NTT Communication Science Laboratories, Soraku, Kyoto, Japan
This research was funded in part by NTT Communication Science Laboratories and in part by Grant NCC from NASA Ames Research Center.

2 A Process Model for an Aquatic Ecosystem
model AquaticEcosystem
variables: phyto, zoo, nitro, residue
observables: phyto, nitro
process phyto_exponential_decay
  equations: d[phyto,t,1] = −α phyto
             d[residue,t,1] = α phyto
process zoo_exponential_decay
  equations: d[zoo,t,1] = −β zoo
             d[residue,t,1] = β zoo
process zoo_phyto_predation
  equations: d[zoo,t,1] = ρ γ zoo
             d[residue,t,1] = (1 − ρ) γ zoo
             d[phyto,t,1] = −γ zoo
process nitro_uptake
  conditions: nitro > 0
  equations: d[phyto,t,1] = μ phyto
             d[nitro,t,1] = −δ μ phyto
process nitro_remineralization
  equations: d[nitro,t,1] = ψ residue
             d[residue,t,1] = −ψ residue
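The model above can be simulated by summing the derivative contributions of each process and integrating over time. The sketch below does this with simple Euler steps; the parameter values (alpha, beta, gamma, rho, mu, delta, psi) and initial conditions are illustrative assumptions, not the fitted values from the talk.

```python
# Hypothetical Euler-integration sketch of the AquaticEcosystem model.
# All rate constants and initial values below are assumptions for
# illustration, not the parameters reported in the presentation.

def simulate(steps=100, dt=0.1):
    # State: phyto and nitro are observable; zoo and residue are not.
    phyto, zoo, nitro, residue = 2.0, 1.0, 5.0, 0.0
    alpha, beta = 0.1, 0.1   # exponential-decay rates for phyto and zoo
    gamma, rho = 0.3, 0.5    # predation rate and assimilation fraction
    mu, delta = 0.4, 0.2     # growth and nutrient-consumption rates
    psi = 0.05               # remineralization rate
    trajectory = []
    for _ in range(steps):
        # Sum the contributions of the unconditional processes.
        d_phyto = -alpha * phyto - gamma * zoo
        d_zoo = -beta * zoo + rho * gamma * zoo
        d_residue = (alpha * phyto + beta * zoo
                     + (1 - rho) * gamma * zoo - psi * residue)
        d_nitro = psi * residue
        if nitro > 0:        # conditional process: nitro_uptake is active
            d_phyto += mu * phyto
            d_nitro += -delta * mu * phyto
        phyto += dt * d_phyto
        zoo += dt * d_zoo
        nitro += dt * d_nitro
        residue += dt * d_residue
        nitro = max(nitro, 0.0)   # guard against Euler overshoot below zero
        trajectory.append((phyto, nitro))
    return trajectory

traj = simulate()  # 100 (phyto, nitro) pairs for the two observables
```

Note how the condition on nitro_uptake switches a whole block of equation terms on and off, which is what distinguishes process models from a single fixed system of differential equations.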

3 Predictions from the Ecosystem Model

4 Advantages of Quantitative Process Models
Process models are a good target for discovery systems because:
- they refer to notations and mechanisms familiar to scientists;
- they embed quantitative relations within qualitative structure;
- they provide dynamical predictions of changes over time;
- they offer causal and explanatory accounts of phenomena;
- while retaining the modularity needed to support induction.
Quantitative process models provide an important alternative to the formalisms currently used in machine learning and discovery.

5 Inductive Process Modeling
Training data: observed values for a set of continuous variables as they vary over time or situations.
Background knowledge: generic processes that characterize causal relationships among variables in terms of conditional equations.
Learned model: a specific process model that explains the observed values and predicts future data accurately.
Induction maps the training data and background knowledge to the learned model.

6 Generic Processes as Background Knowledge
Our framework casts background knowledge as generic processes that specify:
- the variables involved in a process and their types;
- the parameters appearing in a process and their ranges;
- the forms of conditions on the process; and
- the forms of associated equations and their parameters.
Generic processes are building blocks from which one can compose a specific quantitative process model.
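The four kinds of information a generic process specifies map naturally onto a small record type. The sketch below is one possible encoding; the field names and symbolic-string representation are illustrative assumptions, not IPM's actual internal format.

```python
# Minimal sketch of a generic process as background knowledge.
# Field names and the string-based equation encoding are assumptions.
from dataclasses import dataclass

@dataclass
class GenericProcess:
    name: str
    variables: dict   # variable slot -> required type, e.g. {"S": "species"}
    parameters: dict  # parameter name -> (low, high) allowed range
    conditions: list  # symbolic activation conditions, e.g. ["N > tau"]
    equations: list   # symbolic derivative equations with free parameters

exponential_decay = GenericProcess(
    name="exponential_decay",
    variables={"S": "species", "D": "detritus"},
    parameters={"alpha": (0.0, 1.0)},
    conditions=[],
    equations=["d[S,t,1] = -alpha * S", "d[D,t,1] = alpha * S"],
)
```

Binding the slots S and D to concrete typed variables (say, phyto and residue) turns this template into the specific phyto_exponential_decay process of the ecosystem model.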

7 Generic Processes for Aquatic Ecosystems
generic process exponential_decay
  variables: S{species}, D{detritus}
  parameters: α [0, 1]
  equations: d[S,t,1] = −α S
             d[D,t,1] = α S
generic process remineralization
  variables: N{nutrient}, D{detritus}
  parameters: ψ [0, 1]
  equations: d[N,t,1] = ψ D
             d[D,t,1] = −ψ D
generic process predation
  variables: S1{species}, S2{species}, D{detritus}
  parameters: ρ [0, 1], γ [0, 1]
  equations: d[S1,t,1] = ρ γ S1
             d[D,t,1] = (1 − ρ) γ S1
             d[S2,t,1] = −γ S1
generic process constant_inflow
  variables: N{nutrient}
  parameters: ν [0, 1]
  equations: d[N,t,1] = ν
generic process nutrient_uptake
  variables: S{species}, N{nutrient}
  parameters: τ [0, ∞], μ [0, 1], δ [0, 1]
  conditions: N > τ
  equations: d[S,t,1] = μ S
             d[N,t,1] = −δ μ S
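Type annotations such as S{species} and D{detritus} constrain which concrete variables may fill each slot when a generic process is instantiated. A minimal sketch of that matching step, with illustrative helper names not taken from IPM itself:

```python
# Sketch of type-constrained instantiation of a generic process's
# variable slots; function and variable names are illustrative.
from itertools import permutations

def instantiate(generic_vars, typed_variables):
    """Yield each assignment of distinct concrete variables to the
    generic process's slots that respects the declared types."""
    slots = list(generic_vars.items())          # [(slot, required_type), ...]
    candidates = list(typed_variables.items())  # [(variable, its_type), ...]
    for combo in permutations(candidates, len(slots)):
        if all(v_type == s_type
               for (_, s_type), (_, v_type) in zip(slots, combo)):
            yield {slot: var for (slot, _), (var, _) in zip(slots, combo)}

typed = {"phyto": "species", "zoo": "species",
         "nitro": "nutrient", "residue": "detritus"}
bindings = list(instantiate({"S": "species", "D": "detritus"}, typed))
# Two bindings: {"S": "phyto", "D": "residue"} and {"S": "zoo", "D": "residue"}
```

With the four ecosystem variables above, exponential_decay instantiates in exactly two ways, giving the phyto and zoo decay processes of slide 2.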

8 Previous Results: The IPM Algorithm
Langley et al. (2002) reported IPM, an algorithm that constructs process models from generic components in four stages:
1. Find all ways to instantiate known generic processes with specific variables, subject to type constraints;
2. Combine instantiated processes into candidate generic models, with limits on the total number of processes;
3. For each generic model, carry out gradient descent search through parameter space to find good parameter values;
4. Select the parameterized model that produces the lowest mean squared error on the training data.
We showed that IPM could induce accurate process models from noisy time series, but it tended to include extra processes.
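Stages 2 through 4 amount to an enumerate-and-score loop over model structures. The skeleton below shows that loop; `fit_parameters` stands in for the gradient-descent fitting of stage 3 and is an assumed interface, not IPM's actual code.

```python
# Schematic of IPM stages 2-4 as an exhaustive search loop.
# fit_parameters(structure, data) -> (params, mse) is a placeholder
# for the gradient-descent parameter-fitting routine.
from itertools import combinations

def ipm_search(instantiated_processes, data, fit_parameters, max_processes=5):
    """Return the candidate structure (with fitted parameters) that
    achieves the lowest mean squared error on the training data."""
    best_model, best_error = None, float("inf")
    # Stage 2: combine instantiated processes into candidate structures,
    # bounded by the allowed total number of processes.
    for k in range(1, max_processes + 1):
        for structure in combinations(instantiated_processes, k):
            # Stage 3: fit this structure's parameters to the data.
            params, mse = fit_parameters(structure, data)
            # Stage 4: keep the lowest-error parameterized model.
            if mse < best_error:
                best_model, best_error = (structure, params), mse
    return best_model, best_error
```

Because a larger structure can only fit the training data better, selecting on raw MSE alone tends to include extra processes, which motivates the MDL criterion of the revised algorithm on the next slide.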

9 The Revised IPM Algorithm
We have revised and extended the IPM algorithm so that it now:
- accepts as input those variables that can appear in the induced model, both observable and unobservable;
- utilizes the parameter-fitting routine to estimate initial values for unobservable variables;
- invokes the parameter-fitting method to induce the thresholds on process conditions; and
- selects the parameterized model with the lowest description length, M_d = (M_v + M_c) · log(n) + n · log(M_e).
We have evaluated the new system on synthetic and natural data.
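The description-length score transcribes directly into code. In the sketch below, the interpretation of the arguments (M_v and M_c as the counts penalizing model complexity, M_e as the model's error, n as the number of training samples) follows the symbols on the slide; the exact definitions are as given in the talk, and the function name is illustrative.

```python
# Direct transcription of the slide's MDL criterion:
#   M_d = (M_v + M_c) * log(n) + n * log(M_e)
import math

def description_length(m_v, m_c, m_e, n):
    """m_v, m_c: complexity counts for the model; m_e: model error
    (e.g. mean squared error); n: number of training samples."""
    return (m_v + m_c) * math.log(n) + n * math.log(m_e)
```

The first term charges for model complexity and grows only logarithmically in n, while the second charges for misfit and grows linearly in n, so with enough data a simpler structure wins unless extra processes genuinely reduce the error.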

10 Evaluation of the IPM Algorithm
To demonstrate IPM's ability to induce process models, we ran it on synthetic data for a known system:
1. We used the aquatic ecosystem model to generate data sets over 100 time steps for the variables nitro and phyto;
2. We replaced each true value x with x · (1 + r · n), where r followed a Gaussian distribution (μ = 0, σ = 1) and n > 0 was the noise level;
3. We ran IPM on these noisy data, giving it type constraints and generic processes as background knowledge.
In two experiments, we let IPM determine the initial values and thresholds given the correct structure; in a third study, we let it search through a space of 256 generic model structures.
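Step 2's multiplicative noise model is a one-liner per value. A minimal sketch, with the function name and seeding chosen here for reproducibility rather than taken from the talk:

```python
# Multiplicative noise from the evaluation: each true value x becomes
# x * (1 + r * n), with r ~ N(0, 1) and noise level n > 0 (e.g. 0.05 for 5%).
import random

def add_noise(values, n, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducible experiments
    return [x * (1 + rng.gauss(0.0, 1.0) * n) for x in values]

noisy = add_noise([1.0, 2.0, 3.0], n=0.05)
```

Because the perturbation scales with x, large and small measurements are corrupted by the same relative amount, which matches how instrument error often behaves in ecological time series.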

11 Experimental Results with IPM
The main results of our studies with IPM on synthetic data were:
1. The system infers accurate estimates for the initial values of unobservable variables like zoo and residue;
2. The system induces estimates of condition thresholds on nitro that are close to the target values; and
3. The MDL criterion selects the correct model structure in all runs with 5% noise, but in only 40% of runs with 10% noise.
These results suggest that the basic approach is sound, but that we should consider other MDL schemes and other responses to overfitting.

12 Results with Unobserved Initial Values

13 Electric Power on the International Space Station

14 Telemetry Data from Space Station Batteries
Predictor variables included the battery's current and temperature.

15 Induced Process Model for Battery Behavior
model Battery
variables: Rs, Vcb, soc, Vt, i, temperature
observable: soc, Vt, i, temperature
process voltage_charge
  conditions: i ≥ 0
  equations: Vt = Vcb + Rs · i
process voltage_discharge
  conditions: i < 0
  equations: Vt = Vcb · 1.0 / (Rs + 1.0)
process charge_transfer
  equations: d[soc,t,1] = i · Vcb / κ
process quadratic_influence_Vcb_soc
  equations: Vcb = θ1 soc² + θ2 soc
process linear_influence_Vcb_temp
  equations: Vcb = θ3 temperature
process linear_influence_Rs_soc
  equations: Rs = θ4 soc

16 Results on Battery Test Data

17 Best Fit to Data on Protozoan Predation

18 Intellectual Influences
Our work on inductive process modeling incorporates ideas from many traditions:
- computational scientific discovery (e.g., Langley et al., 1983);
- knowledge-based learning methods (e.g., ILP, theory revision);
- qualitative physics and simulation (e.g., Forbus, 1984);
- scientific simulation environments (e.g., STELLA, MATLAB).
However, the most similar research comes from Todorovski and Dzeroski (1997) and from Bradley, Easley, and Stolle (2001). Their approaches also use knowledge to guide the induction of differential equation models, though without a process formalism.

19 Directions for Future Research
Despite our progress to date, we need further work to:
- produce additional results on other scientific data sets;
- develop more robust methods for fitting model parameters;
- explore alternative techniques that mitigate overfitting;
- extend the approach to handle data sets with missing values;
- implement heuristic methods for searching the model space; and
- utilize knowledge of subsystems to further constrain search.
Our goal is a robust approach to inductive process modeling that can aid scientists and engineers in model construction.

20 End of Presentation

