
1 Machine Learning Applying Our Knowledge Gold Team Week 3: Random Forests and SVM Results

2 The Good, The Bad, and The Ugly
The Good: We started the week by implementing FastRandomForest and libSVM in Weka, which wasn't too difficult. Phil has a powerful computer, so we decided to run all the data sets as-is.
The Bad: It still took a couple of days to complete all the runs we needed.
The Ugly: Collating the data, after we received it all, took quite a while.
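The runs above were done in Weka, but the same experiment shape can be sketched in scikit-learn. This is a hedged re-creation, not the team's actual setup: the toy data stands in for their data sets, and the tree counts, C values, and exponents are illustrative.

```python
# Sketch of the week's runs using scikit-learn instead of Weka
# (FastRandomForest/libSVM were the actual tools; the toy data below
# is a stand-in for the team's data sets, not their real files).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Random forest: vary the number of trees, as in the experiments
for n_trees in (10, 50, 100):
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    acc = cross_val_score(rf, X, y, cv=10).mean()
    print(f"random forest, {n_trees} trees: {acc:.2%}")

# SVM with a polynomial kernel, varying C and the exponent (degree)
for C in (0.1, 1.0, 10.0):
    for degree in (1, 2, 3):
        svm = SVC(kernel="poly", C=C, degree=degree)
        acc = cross_val_score(svm, X, y, cv=10).mean()
        print(f"poly SVM, C={C}, degree={degree}: {acc:.2%}")
```

Each `cross_val_score` call performs 10-fold cross-validation, which matches the fold-based accuracy bookkeeping described on the next slide.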

3 Collating the Results We set up the Weka Experimenter to examine all the different test runs and obtain the percentage of correctly classified instances. To find the counts of correctly and incorrectly classified instances, we opened the results as a .csv file in Excel and calculated the number correct over the folds.
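The Excel step above amounts to summing per-fold counts and dividing. A minimal stdlib sketch, assuming hypothetical column names (`num_correct`, `num_incorrect`) rather than Weka's exact CSV headers:

```python
# Sum per-fold correct/incorrect counts and compute overall percent
# correct. The column names and fold counts here are illustrative
# assumptions, not the actual Weka Experimenter output.
import csv, io

results_csv = io.StringIO("""fold,num_correct,num_incorrect
1,28,2
2,27,3
3,29,1
""")

correct = incorrect = 0
for row in csv.DictReader(results_csv):
    correct += int(row["num_correct"])
    incorrect += int(row["num_incorrect"])

percent_correct = 100.0 * correct / (correct + incorrect)
print(f"{percent_correct:.2f}% correct over all folds")  # → 93.33% correct over all folds
```

In practice the `io.StringIO` stand-in would be replaced by opening the exported .csv file.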

4 Results of Week 3: Random Forests

5 Summary of Results We concluded that the Primary data set performed best overall with FastRandomForest at K = 0.5x, and the number of trees didn't seem to matter. Primary also ran best on SVM with the polynomial kernel, where neither C nor the exponent mattered; the percent correct was 83.18%. We also noticed that with the RBF kernel, all results were generally bad.
Something Interesting: FastRandomForest was indeed significantly faster than RandomForest, but libSVM seemed to run at about the same speed as regular SMO.
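One common reason RBF results come out uniformly bad is a poorly matched gamma (kernel width): too large a gamma makes the classifier memorize training points and fall toward the majority-class baseline on held-out folds. That is an assumption about the team's runs, not something the slides confirm, but it is easy to illustrate with a gamma sweep on toy data:

```python
# Illustrative gamma sweep for an RBF-kernel SVM; the data and gamma
# values are assumptions, not the team's actual Weka/libSVM settings.
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

for gamma in (1e-4, 0.01, 1.0):
    acc = cross_val_score(SVC(kernel="rbf", gamma=gamma), X, y, cv=10).mean()
    print(f"RBF SVM, gamma={gamma}: {acc:.2%}")
```

If the Weka runs used libSVM's defaults without tuning gamma or normalizing attributes, that could account for the generally poor RBF numbers.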
