Kaggle Winner Presentation Template

Agenda
1. Background
2. Summary
3. Feature selection & engineering
4. Training methods
5. Important findings
6. Simple model

Background [ Your professional/academic background ] [ Prior experience (if any) that helped you succeed in this competition ]

Summary [ Training methods you used, e.g. a convolutional neural network or XGBoost ] [ Most important features ] [ The tools you used ] [ How long did it take to train your model? ]

Feature Selection / Engineering [ Most important features. A variable importance plot is recommended – see next slide ] [ Outline any important feature transformations ]

Feature Selection / Engineering Variable Importance Plot
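A variable importance plot like the one this slide calls for can be built from any tree-based model's learned importances. A minimal sketch using scikit-learn's random forest on synthetic data (the model, dataset, and feature names are illustrative stand-ins for your own):

```python
# Sketch: ranking features by importance for a variable importance plot
# (assumes scikit-learn; swap in your own trained model and data).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=200, n_features=6, n_informative=2, random_state=0)
names = [f"feat_{i}" for i in range(X.shape[1])]

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Sort features from most to least important; these pairs feed the bar chart.
ranked = sorted(zip(names, model.feature_importances_), key=lambda p: -p[1])
for name, imp in ranked:
    print(f"{name}: {imp:.3f}")
```

Plotting `ranked` as a horizontal bar chart gives the slide's figure; XGBoost users can get the same thing from its built-in importance plotting.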

Feature Selection / Engineering [ Relationship between the most important features and the target variable. Partial plots are recommended – see next slide ]

Feature Selection / Engineering Partial Plot of Important Feature #1

Feature Selection / Engineering Partial Plot of Important Feature #2

Training Methods [ Training methods you used ] [ Did you ensemble? How did you weight different models? ]
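If you did ensemble, a common way to weight different models is a weighted average of their predictions, with weights tuned on a holdout set. A minimal sketch (the 0.7/0.3 split below is purely illustrative, not a recommendation):

```python
# Sketch: blending two models' predictions with a fixed weight
# (weights are illustrative; in practice they are tuned on validation data).
import numpy as np

def blend(preds_a, preds_b, w=0.7):
    """Weighted average of two prediction vectors; w is the weight on model A."""
    return w * np.asarray(preds_a, dtype=float) + (1 - w) * np.asarray(preds_b, dtype=float)

blended = blend([0.0, 1.0], [1.0, 0.0], w=0.7)
```

More elaborate schemes (rank averaging, stacking with a meta-learner) follow the same pattern of combining out-of-fold predictions.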

Important and Interesting Findings [ What set you apart from others in the competition? ] [ Interesting relationships in the data that don't fit in the sections above. Showing interesting visualizations is recommended – see next slide. ]

Important and Interesting Findings Interesting visualization found when exploring the data

Simple Model [ Outline a subset of features that would achieve 90-95% of your final performance ] [ If you used an ensemble, was there a single classifier that did most of the work? Which one? ] [ What would the simplified model score? ]
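One way to fill in this slide is to keep only the top-k features from your importance ranking, retrain, and compare cross-validated scores against the full model. A sketch on synthetic data (the model, k=3, and dataset are all illustrative):

```python
# Sketch: estimating how much score a top-k feature subset retains
# (assumes scikit-learn; model choice and k are illustrative).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=300, n_features=10, n_informative=3, random_state=0)

# Rank features with a full model, then keep the 3 strongest.
full = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
top_k = np.argsort(full.feature_importances_)[::-1][:3]

score_full = cross_val_score(
    RandomForestRegressor(n_estimators=50, random_state=0), X, y, cv=3).mean()
score_simple = cross_val_score(
    RandomForestRegressor(n_estimators=50, random_state=0), X[:, top_k], y, cv=3).mean()
```

The ratio `score_simple / score_full` is the "90-95% of final performance" figure the slide asks for; the same comparison works for a single strong classifier pulled out of an ensemble.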