1 Overview of the final test for CSC 2515

2 Overview

PART A: 7 easy questions
– You should answer 5 of them. If you answer more, we will select 5 at random, so cross out any extra answers.
– Worth 8 points each.
– Each question should take 3-5 minutes.

Typical easy question:
– a) Write down the softmax function.
– b) What is the purpose of the denominator?
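For reference on that sample easy question, a minimal NumPy sketch of the softmax (the function name and example values are mine). The denominator is the answer to part b): it normalizes the exponentials so the outputs are positive and sum to 1, i.e. form a probability distribution.

```python
import numpy as np

def softmax(z):
    """Softmax: maps a vector of logits z to a probability distribution.
    The denominator sums the exponentials so the outputs are positive
    and sum to 1."""
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()         # the denominator normalizes to probabilities

p = softmax(np.array([1.0, 2.0, 3.0]))  # larger logits get larger probabilities
```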

3 Overview (cont.)

PART B: 4 hard questions
– You should answer only 3 of them. If you answer more, we will select 3 of your answers at random, so cross out any extra answers.
– Worth 20 points each.
– They should take ~10 minutes each, so you'll have one hour to do the exam.

Typical hard question:
– If the input vectors from two classes are each drawn from a Gaussian distribution, and the classes share a covariance matrix, what is the form of the posterior distribution, and why?
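A sketch of the answer to that sample hard question, using the standard result (with priors written as π₁, π₂): because the covariance is shared, the quadratic terms xᵀΣ⁻¹x cancel in the log-odds, leaving a linear function of x, so the posterior is a logistic sigmoid of a linear function.

```latex
% Class-conditionals N(x | mu_k, Sigma) with shared Sigma and priors pi_1, pi_2.
% The quadratic terms x^T Sigma^{-1} x cancel in the log-odds, so:
\begin{align*}
p(C_1 \mid x) &= \sigma\!\left(w^\top x + b\right), \qquad
  \sigma(a) = \frac{1}{1 + e^{-a}}, \\
w &= \Sigma^{-1}(\mu_1 - \mu_2), \\
b &= -\tfrac{1}{2}\mu_1^\top \Sigma^{-1} \mu_1
     + \tfrac{1}{2}\mu_2^\top \Sigma^{-1} \mu_2
     + \ln\frac{\pi_1}{\pi_2}.
\end{align*}
```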

4 Summary

What to study?
– Material covered in lectures and tutorials.
– Use the textbook as back-up, to help you understand the methods and derivations.

The exam is closed book and closed notes.
– Do not focus on memorizing formulas; focus instead on the main ideas and methods.

5 Topics to Study

Week 2 – Lecture: Linear Regression
– When is minimizing the squared error equivalent to maximum likelihood learning?
– Online vs. batch learning
– Regularized least squares – alternative regularizers
– Bayesian approach – integrating over the posterior over parameters

Week 3 – Lecture: Classification
– Compare methods of performing multi-class discrimination
– Probabilistic interpretation of logistic regression; loss function
– Regularizer: form, rationale
– Naïve Bayes: key assumption, cost function

Week 4 – Lecture: Neural Networks
– Backprop: when is it applicable (constraints on cost, activation functions)?
– Weight constraints: how and why
– What is a trigram, and why is it inadequate as a model of sentences?
– Methods to prevent over-fitting
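Two of the Week 2 items can be made concrete together: minimizing squared error is maximum likelihood under Gaussian noise, and regularized least squares with an L2 penalty (ridge) has a closed-form solution. A minimal NumPy sketch (the function name and toy data are mine):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Regularized least squares (ridge): minimize ||Xw - y||^2 + lam * ||w||^2.
    Closed form: w = (X^T X + lam * I)^{-1} X^T y.
    With lam = 0 this is ordinary least squares, i.e. maximum likelihood
    under Gaussian noise on the targets."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy data generated by y = 2 * x, with a bias column of ones
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 2.0, 4.0, 6.0])
w = ridge_fit(X, y, lam=1e-6)  # tiny lam, so w should be close to [0, 2]
```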

6 Topics to Study (cont.)

Week 5 – Lecture: Deep Belief Networks
– You are only responsible for the basic RBM slides – the main idea
– RBMs: energies and probabilities; learning rule

Week 6 – Lecture: Clustering & EM
– Hard vs. soft k-means
– EM: definition of the steps in the algorithm
– Free energy – relationship to EM, convergence proof
– Mixture models

Week 7 – Lecture: Continuous Latent Variable Models
– PCA: motivation, basic algorithm
– Probabilistic vs. standard PCA, and comparison to a full Gaussian
– PPCA, FA, ICA: generative models, compare/contrast
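The hard vs. soft k-means distinction above can be sketched in a few lines. A minimal NumPy version (function name, stiffness parameter name, and toy data are mine): the E-step assigns soft responsibilities, the M-step re-estimates means, and hard k-means is the limit of large stiffness beta, where responsibilities become one-hot.

```python
import numpy as np

def soft_kmeans(X, mu, beta, iters=20):
    """Soft k-means. E-step: responsibilities r[n, k] proportional to
    exp(-beta * ||x_n - mu_k||^2). M-step: means become responsibility-
    weighted averages. Hard k-means is the beta -> infinity limit."""
    for _ in range(iters):
        # E-step: squared distances from every point to every mean
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
        # subtract each row's min before exponentiating, for stability
        r = np.exp(-beta * (d2 - d2.min(axis=1, keepdims=True)))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted average of the data under the responsibilities
        mu = (r.T @ X) / r.sum(axis=0)[:, None]
    return mu, r

# Two well-separated 1-D clusters around 0.05 and 5.05
X = np.array([[0.0], [0.1], [5.0], [5.1]])
mu, r = soft_kmeans(X, mu=np.array([[1.0], [4.0]]), beta=10.0)
```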

7 Topics to Study (cont.)

Week 8 – Lecture: Combining Models
– When is an ensemble a win?
– Bagging vs. boosting
– Boosting: main steps in the algorithm; 2 different types of weights, rationale
– Decision trees: algorithm; choosing nodes via mutual information; decision boundaries, stumps
– Mixture of experts

Week 9 – Lecture: Support Vector Machines
– Understand the max margin and the kernel trick
– Compare to logistic regression

Week 10 – Lecture: Bayesian Reasoning
– Not responsible for the details of Bayesian regression – just the main idea and the style of approach vs. earlier methods
– You should understand conjugacy

Week 11
– Directed vs. undirected graphical models
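The "2 different types of weights" item above is easiest to see in code: AdaBoost keeps per-example weights w (re-weighting the data each round so mistakes count more) and per-classifier weights alpha (how much each weak learner counts in the final vote). A minimal sketch with decision stumps (function names and toy data are mine):

```python
import numpy as np

def adaboost_stumps(X, y, rounds=3):
    """AdaBoost with threshold stumps; y must be in {-1, +1}.
    Two kinds of weights: per-example weights w, per-classifier weights alpha."""
    n = len(y)
    w = np.full(n, 1.0 / n)           # example weights, start uniform
    ensemble = []                     # (alpha, feature, threshold, sign)
    for _ in range(rounds):
        best = None
        # exhaustive search for the stump with lowest weighted error
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] >= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s, pred)
        err, j, t, s, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)      # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)      # classifier weight
        w *= np.exp(-alpha * y * pred)             # up-weight mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def adaboost_predict(ensemble, X):
    # weighted vote of the stumps, using the classifier weights alpha
    score = sum(a * s * np.where(X[:, j] >= t, 1, -1) for a, j, t, s in ensemble)
    return np.sign(score)

# Toy 1-D data: separable at x >= 2
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
ens = adaboost_stumps(X, y, rounds=3)
```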

