
1 Pan-STARRS Photometric Classification of Supernovae using a Hierarchical Bayesian Model
George Miller, Edo Berger, Nathan Sanders
Harvard-Smithsonian Center for Astrophysics

2 PS1 Medium Deep Survey
1.8m f/4.4 telescope with 3.2 degree FOV and 1.6 Gpix camera
PS1 Sky Surveys: May 2010 – March 2014
Medium Deep: 10 fields of 7 sq. deg. with a 3-day staggered cadence and grizy filters
Full reprocessing MD.PV3 underway with full forced photometry
Currently the best precursor and source of training data for LSST classification tools:
– Similar cadence (~3 days) and sensitivity
– Similar grizy band-passes (+u for LSST)
– Wide variety of science drivers and a large distributed collaboration

3 Implementation
– MISSION: To develop a joint probabilistic model for the populations of different classes of transients and subsequent observations of individual objects. Allow this model to be fit using PS1 classified (~20%) and unclassified (~80%) data and then be applied to achieve classification and parameter inference on new objects, e.g. from LSST.

4 Implementation
– MISSION: To develop a joint probabilistic model for the populations of different classes of transients and subsequent observations of individual objects. Allow this model to be fit using PS1 classified (~20%) and unclassified (~80%) data and then be applied to achieve classification and parameter inference on new objects, e.g. from LSST.
– COMPUTATIONAL METHODS: Stan
Stan is a probabilistic programming language, written in C++ with Python and R interfaces.
Uses the "No-U-Turn" (NUTS) Hamiltonian Monte Carlo (HMC) sampler, which adaptively tunes the step size and trajectory length as it traverses the posterior.
Incorporates algorithms for automatic differentiation and adaptive refinement of the tuning parameters.
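The slides themselves contain no code. As a minimal, hedged sketch of the workflow just described, the following fits a toy model with Stan's NUTS sampler through the Python interface; the PyStan 2.x API is assumed, and the model is a deliberately tiny placeholder rather than the PS1 light curve model.

```python
import numpy as np
import pystan  # PyStan 2.x interface to Stan (assumption; CmdStanPy would work similarly)

# Placeholder Stan program: a simple normal model, not the PS1 hierarchy.
model_code = """
data {
  int<lower=0> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  mu ~ normal(0, 10);
  sigma ~ cauchy(0, 5);   // half-Cauchy via the <lower=0> constraint
  y ~ normal(mu, sigma);
}
"""

y = np.random.default_rng(0).normal(loc=1.0, scale=2.0, size=50)

sm = pystan.StanModel(model_code=model_code)
# NUTS (adaptive HMC) is Stan's default sampler; step size and trajectory
# length are tuned automatically during warm-up.
fit = sm.sampling(data={"N": int(y.size), "y": y}, iter=2000, chains=4)
print(fit)
```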

5 PS1/MDS Spectroscopic Sample
Sample of 514 events with spectroscopic follow-up (~150 nights on MMT, Magellan, and Gemini by the Harvard PS1 team):
69% Ia, 22% IIP/n, 4% Ib/c, 4% ULSN, TDE, etc., 1% unclassified

6 [Directed graphical model of the full hierarchy: per-object light-curve parameters (A, t0, t1, β, τ_rise, τ_fall, c, σ, and host-galaxy terms HG), indexed over classes N_CL, filters N_F, and supernovae N_SN; filter-level (hF), object-and-filter-level (hSNF), and class-level (h) hyperparameters for each; per-object class weights λ [N_SN]; and the N individual observations M and Flux.]

7 Adapted from the 'non-Ia' light curve model of Bazin+ 2009. Required a functional form sufficiently generic to fit a wide array of possible SN light curves. Added a second time parameter and a linear decay function to allow for the plateau effect near and after maximum light.
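The modified functional form is not written out in the transcript; the sketch below assumes the Bazin+ 2009 rise/fall profile and bolts on an illustrative second time parameter t1 with a linear decay slope beta, so it should be read as a guess at the kind of model described rather than the authors' actual parameterization.

```python
import numpy as np

def bazin_flux(t, A, t0, tau_rise, tau_fall):
    """Bazin+ 2009 'non-Ia' form: exponential decline gated by a sigmoidal rise."""
    return A * np.exp(-(t - t0) / tau_fall) / (1.0 + np.exp(-(t - t0) / tau_rise))

def plateau_flux(t, A, t0, t1, tau_rise, tau_fall, beta):
    """Illustrative extension with a second time parameter t1 and linear decay slope beta,
    mimicking the plateau behaviour near and after maximum light mentioned on the slide.
    This is an assumed form, not the published PS1 parameterization."""
    linear_decay = np.clip(1.0 - beta * np.clip(t - t1, 0.0, None), 0.0, None)
    return bazin_flux(t, A, t0, tau_rise, tau_fall) * linear_decay

# Example evaluation on a grid of epochs; all parameter values are arbitrary placeholders.
t = np.linspace(-20.0, 120.0, 200)
f = plateau_flux(t, A=100.0, t0=0.0, t1=30.0, tau_rise=5.0, tau_fall=60.0, beta=0.005)
```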

8 [Figures: individual light curve fits for a confirmed Type IIP event and a confirmed Type Ia event.] Individual fits using PyMC Metropolis-Hastings; a sketch of such a per-object fit follows.
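For concreteness, here is a minimal per-object Metropolis-Hastings fit of the Bazin-style profile above. The original slide says only "PyMC MH"; this sketch uses the later PyMC3 API with invented toy data and priors, so it illustrates the idea rather than reproducing the actual fits.

```python
import numpy as np
import pymc3 as pm  # later PyMC3 API; the talk predates this and may have used PyMC 2

# Toy "observations": epochs, fluxes, and uncertainties are placeholders.
t = np.linspace(-20.0, 100.0, 40)
f_err = np.full(t.size, 2.0)
f_obs = 80.0 * np.exp(-t / 60.0) / (1.0 + np.exp(-t / 5.0)) + np.random.default_rng(3).normal(0.0, f_err)

with pm.Model() as lc_model:
    # Illustrative priors; the priors actually used are not given in the transcript.
    A = pm.HalfNormal("A", sigma=200.0)
    t0 = pm.Normal("t0", mu=0.0, sigma=30.0)
    tau_rise = pm.HalfNormal("tau_rise", sigma=20.0)
    tau_fall = pm.HalfNormal("tau_fall", sigma=100.0)

    mu = A * pm.math.exp(-(t - t0) / tau_fall) / (1.0 + pm.math.exp(-(t - t0) / tau_rise))
    pm.Normal("flux", mu=mu, sigma=f_err, observed=f_obs)

    # Metropolis-Hastings steps, as noted on the slide (the hierarchical model later uses Stan/NUTS).
    trace = pm.sample(5000, step=pm.Metropolis(), chains=2, tune=1000)
```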

9 Adapted from the Bazin+ 2009 'non-Ia' light curve model. Required a functional form sufficiently generic to fit a wide array of possible SN light curves. Added a second time parameter and a linear decay function to allow for the plateau effect near and after maximum light. Additional normal error term to encapsulate intrinsic scatter in flux and allow for overdispersion. Additional terms for host-galaxy information.

10 [The full graphical model from slide 6, shown again.]

11 Hierarchical Bayes
[Graphical model: hyperparameters φ → parameters θ_i [N] → data y_i [N]; arrows denote probabilistic connections.]
A model in which parameter dependences can be constructed on multiple structured levels. A common example: the one-way normal model, with θ = (θ_i) and φ = (μ, τ).
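Written out explicitly (the standard textbook form of the one-way normal model, not transcribed from the slide, with σ_i the known measurement scale):

```latex
\begin{aligned}
y_i \mid \theta_i &\sim \mathcal{N}(\theta_i,\, \sigma_i^2), \qquad i = 1, \dots, N\\
\theta_i \mid \mu, \tau &\sim \mathcal{N}(\mu,\, \tau^2)\\
\phi = (\mu, \tau) &\sim p(\mu, \tau)
\end{aligned}
```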

12 Hierarchical Bayes
[Graphical model: hyperparameters φ and standardized auxiliary variables ν [N] feed the data y [N]; the legend distinguishes probabilistic from linear (deterministic) connections.]
A model in which parameter dependences can be constructed on multiple structured levels. A common example: the one-way normal model with φ = (μ, τ). A non-centered parameterization exchanges certain dependences between parameters and hyperparameters for correlations between the hyperparameters and the data.
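For the one-way normal model, the standard non-centered reparameterization (again a textbook form rather than a transcription of the slide) replaces θ_i with a linear function of a standardized variable ν_i:

```latex
\begin{aligned}
\nu_i &\sim \mathcal{N}(0,\, 1)\\
\theta_i &= \mu + \tau\,\nu_i\\
y_i \mid \nu_i, \mu, \tau &\sim \mathcal{N}(\mu + \tau\,\nu_i,\, \sigma_i^2)
\end{aligned}
```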

13 Hierarchical Bayes
[Graphical model for a single parameter: top-level τ_rise^h → filter-level τ_rise^hF [N_F] → per-object τ_rise [N_F, N_SN] → observations M [N]; probabilistic and deterministic connections are distinguished.]
Three-level model, with top-level hyperparameters for each parameter and 4 mid-level hyperparameters for each filter. Non-centered parameterization to remove correlations between hyperparameters.

14 Hierarchical Bayes
[Graphical model as on slide 13, with an additional object-and-filter level τ_rise^hSNF [N_F, N_SN].]
Three-level model, with top-level hyperparameters for each parameter and 5 mid-level hyperparameters for each filter. Non-centered parameterization to remove correlations between hyperparameters. Adopt a normal hyperprior for all location (mean) hyperparameters and a half-Cauchy distribution for all scale (variance) hyperparameters. Point estimates and test quantities may then be fed into other machine-learning classification algorithms.
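To make the non-centered, normal/half-Cauchy structure concrete, here is a small prior-predictive sketch for a single parameter (the rise timescale); the level sizes, hyperprior settings, and numbers are illustrative assumptions, not values from the talk.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
N_F, N_SN = 5, 100  # grizy filters and a hypothetical number of supernovae

# Normal hyperprior on the location, half-Cauchy on the scale (settings are made up).
mu_h = rng.normal(loc=10.0, scale=5.0)
sigma_h = abs(stats.cauchy.rvs(loc=0.0, scale=5.0, random_state=1))

# Non-centered draw at the filter level: standard-normal offsets, then a linear transform.
nu_filter = rng.standard_normal(N_F)
tau_rise_filter = mu_h + sigma_h * nu_filter

# Second non-centered level: per-object offsets around each filter-level value.
sigma_obj = np.abs(stats.cauchy.rvs(loc=0.0, scale=2.0, size=N_F, random_state=2))
nu_obj = rng.standard_normal((N_F, N_SN))
tau_rise = tau_rise_filter[:, None] + sigma_obj[:, None] * nu_obj

print(tau_rise.shape)  # (N_F, N_SN): one rise timescale per filter per supernova
```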

15 Ia vs. non-Ia AdaBoost classification. [Figures: classification performance without and with redshift information.]
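The slide reports results only; as an illustration of the kind of downstream classifier it refers to, the sketch below trains scikit-learn's AdaBoostClassifier on a placeholder feature matrix standing in for per-object light-curve point estimates (optionally with redshift as an extra column). All data here are random placeholders.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Placeholder features: e.g. point estimates of (A, t1, beta, tau_rise, tau_fall) plus redshift.
X = rng.normal(size=(514, 6))
y = rng.integers(0, 2, size=514)  # 1 = Ia, 0 = non-Ia (invented labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = AdaBoostClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print("hold-out accuracy:", clf.score(X_test, y_test))
print("P(Ia) for the first test object:", clf.predict_proba(X_test[:1])[0, 1])
```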

16 [The full graphical model from slide 6, shown again.]

17 Categorical Mixture Model
[Graphical model: the τ_rise hierarchy now carries a class index (τ_rise^h [N_CL] → τ_rise^hF [N_CL, N_F] → τ_rise^hSNF, τ_rise [N_CL, N_F, N_SN]), combined with per-object class weights λ [N_SN]; probabilistic and linear connections are distinguished.]
Top-level multinomial simplex parameter controlling the classes of supernovae. Produces a full posterior probability of classification for each observed event. Will likely need to set more informative priors on the hyperparameters to allow for convergence across the entire simplex space.
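A toy numerical sketch of how such a mixture yields per-event class probabilities: two hypothetical classes with different rise-time populations, invented weights, and a single fitted rise time. None of these numbers come from the talk.

```python
import numpy as np
from scipy import stats
from scipy.special import logsumexp

log_lambda = np.log([0.7, 0.3])          # mixture weights on the simplex (invented)
class_mu = np.array([8.0, 20.0])         # class-level mean rise times in days (invented)
class_sigma = np.array([2.0, 6.0])       # class-level scatter (invented)

tau_rise_obs = 11.0                      # fitted rise time of one event (placeholder)

# Per-class log joint = log weight + log likelihood of the fitted parameter under that class.
log_joint = log_lambda + stats.norm.logpdf(tau_rise_obs, loc=class_mu, scale=class_sigma)

# Posterior class probabilities via log-sum-exp normalization.
class_prob = np.exp(log_joint - logsumexp(log_joint))
print(dict(zip(["Ia", "non-Ia"], np.round(class_prob, 3))))
```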

18 Online predictions and computational scale
[Graphical model as on slide 17; probabilistic and linear connections are distinguished.]
Run the model on the full database and save the marginal posteriors for the hyperparameters. Use these distributions as priors for new analyses of unclassified events (sketched below). A better method would use the classified sample as auxiliary data and construct the hierarchical model from both tagged and untagged data, but this may not be computationally feasible on the timescales needed for follow-up observations. Certain hyperparameters can be fixed to reduce computation time, e.g. for well-understood classes such as Ia.
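A minimal sketch of the simple prior-recycling approach just described, assuming the saved marginal posterior for one hyperparameter is adequately summarized by a normal approximation; the draws and numbers below are invented placeholders.

```python
import numpy as np

# Stand-in for saved posterior draws of one hyperparameter (e.g. a class-level mean rise time)
# from a previous run over the full classified database.
posterior_draws = np.random.default_rng(1).normal(loc=9.5, scale=0.8, size=4000)

# Summarize the marginal posterior with a normal approximation...
prior_loc = float(posterior_draws.mean())
prior_scale = float(posterior_draws.std(ddof=1))

# ...and reuse it as the prior in a fast follow-up analysis of a new, unclassified event.
# In a Stan program this could appear as, e.g.:  tau_rise_h ~ normal(prior_loc, prior_scale);
new_fit_data = {"prior_loc": prior_loc, "prior_scale": prior_scale}
print(new_fit_data)
```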

19 To Do
– Incorporate photo-z information rather than spectroscopic redshifts
– Model K-corrections as a function of photo-z and SN type
– Include host-galaxy information (color, positional offset, etc.)
– Adapt the functional light curve model to be more sensitive to SN rise time and shape
– Better constrain early online predictions
– Explore Riemannian-manifold HMC techniques

20 Summary
– Developed a joint probabilistic Bayesian hierarchical model to be fit with the PS1 MD classified SN sample
– Coded the model in Stan, an HMC-based probabilistic programming language, to effectively explore the high-dimensional parameter space
– Classification may be accomplished using a separate predictive model of the fitted parameters, or by developing a full mixture model with simplex parameters
– Stay tuned for first results!

