1 Learning Invariances and Hierarchies Pierre Baldi University of California, Irvine

2 Two Questions: 1. “If we solve computer vision, we have pretty much solved AI.” 2. A-NNs vs. B-NNs and Deep Learning.

3 If we solve computer vision…

4 If we solve computer audition…

5 If we solve computer vision… If we solve computer audition… If we solve computer olfaction…

6 If we solve computer vision… If we solve computer audition… If we solve computer olfaction… If we solve computer vision, how can we build computers that can prove Fermat’s last theorem?

7 Invariances Invariances in audition. We can recognize a tune invariantly with respect to: intensity, speed, tonality, harmonization, instrumentation, style, background. Invariances in olfaction. We can recognize an odor invariantly with respect to: concentrations, humidity, pressure, winds, mixtures, background.

8 Non-Invariances Invariances evolution did not care about (although we are still evolving!): – We cannot recognize faces upside down. – We cannot recognize tunes played in reverse. – We cannot recognize stereoisomers as such. Enantiomers smell different.

9 A-NNs vs. B-NNs (artificial vs. biological neural networks)

10 Origin of Invariances Weight sharing and translational invariance. Can we quantify approximate weight sharing? Can we use approximate weight sharing to improve performance? Some of the invariance comes from the architecture. Some may come from the learning rules.
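As a concrete illustration of the weight-sharing point (my own sketch, not part of the slides), the short NumPy example below applies one shared filter at every position of a 1-D signal. The resulting feature map is translation-equivariant, and pooling it over positions gives a translation-invariant response, so part of the invariance indeed comes from the architecture alone. The signal, filter, and circular boundary handling are arbitrary choices made for the example.

```python
import numpy as np

def circular_conv1d(x, w):
    """Apply the same (shared) weights w at every position of x, wrapping at the ends."""
    n, k = len(x), len(w)
    return np.array([np.dot(np.take(x, np.arange(i, i + k), mode='wrap'), w)
                     for i in range(n)])

rng = np.random.default_rng(0)
x = rng.normal(size=16)       # input "signal"
w = rng.normal(size=3)        # one shared filter
s = 5                         # translation amount

y = circular_conv1d(x, w)
y_shifted = circular_conv1d(np.roll(x, s), w)

# Equivariance from weight sharing: shifting the input shifts the feature map.
print(np.allclose(y_shifted, np.roll(y, s)))   # True
# Invariance after pooling over positions: the pooled response ignores the shift.
print(np.isclose(y.max(), y_shifted.max()))    # True
```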

11 Learning Invariances. Hebbian learning with symmetric connections (w_ij = w_ji). [Diagram: acyclic orientations of the hypercube, O(H); Hebbian learning relating H and O(H), together with the isometries I(H) and I(O(H)).]
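To make the symmetry condition on this slide concrete, here is a small illustrative sketch of my own (not from the slides): an outer-product Hebb rule applied to a set of ±1 patterns produces a symmetric weight matrix, i.e. w_ij = w_ji, by construction. The number of patterns and units is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(5, 10))   # five +/-1 patterns over 10 units

# Hebb rule: strengthen w_ij in proportion to the correlation of units i and j.
W = np.zeros((10, 10))
for p in patterns:
    W += np.outer(p, p)
np.fill_diagonal(W, 0)        # no self-connections

# The outer-product rule is symmetric by construction: w_ij = w_ji.
print(np.allclose(W, W.T))    # True
```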

12 Deep Learning ≈ Deep Targets. Training set: (x_i, y_i) for i = 1, …, m.

13 Deep Target Algorithms
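The slides that follow (14–17) are image-only, so the sketch below is only a generic illustration of one possible deep-targets-style scheme, with hypothetical details (the candidate sampling, learning rate, and toy architecture are all my own choices, not the algorithm shown in the talk): each hidden layer is assigned its own target, chosen so that propagating the candidate through the layers above reduces the final error, and every layer is then trained locally on its target.

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

sizes = [4, 6, 6, 1]                                       # toy architecture (hypothetical)
Ws = [rng.normal(scale=0.5, size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]

def forward(a, start=0):
    """Propagate activations upward from layer `start`; return all activations."""
    acts = [a]
    for W in Ws[start:]:
        acts.append(sigmoid(W @ acts[-1]))
    return acts

def final_error(h, layer, y):
    """Squared output error if the input to layer `layer` were h."""
    return np.sum((forward(h, start=layer)[-1] - y) ** 2)

def deep_targets_step(x, y, n_candidates=20, lr=0.5):
    acts = forward(x)
    targets = [None] * len(Ws)
    targets[-1] = y                                        # the top layer's target is the true output
    # Choose a target for each hidden layer: among a few candidates sampled near the
    # current activation, keep the one whose propagation through the upper layers
    # gives the smallest final error (a crude stand-in for an optimal target oracle).
    for l in range(len(Ws) - 1, 0, -1):
        cands = [acts[l]] + [np.clip(acts[l] + rng.normal(scale=0.3, size=acts[l].shape), 0.01, 0.99)
                             for _ in range(n_candidates)]
        targets[l - 1] = min(cands, key=lambda h: final_error(h, l, y))
    # Train every layer locally to map its input to its own target (one gradient step each).
    for l, W in enumerate(Ws):
        a_in, a_out, t = acts[l], acts[l + 1], targets[l]
        delta = (a_out - t) * a_out * (1.0 - a_out)        # local squared-error gradient
        Ws[l] = W - lr * np.outer(delta, a_in)

x, y = rng.random(4), np.array([1.0])
for _ in range(200):
    deep_targets_step(x, y)
print(forward(x)[-1])                                      # the output moves toward the target y
```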

14–17 [Image-only slides; no text in the transcript.]

18 In spite of the vanishing gradient problem (and the Newton problem), nothing seems to beat backpropagation. Is backpropagation biologically plausible?
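To make the vanishing-gradient remark concrete (an illustration of mine, not from the slides), the sketch below backpropagates an error signal through a stack of sigmoid layers. Each backward step multiplies by the sigmoid derivative, which is at most 1/4, so the gradient norm reaching the lowest layers is many orders of magnitude smaller than at the top. The depth, width, and weight scale are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

depth, width = 20, 50
Ws = [rng.normal(scale=1.0 / np.sqrt(width), size=(width, width)) for _ in range(depth)]

# Forward pass through a deep stack of sigmoid layers.
a = rng.normal(size=width)
acts = [a]
for W in Ws:
    a = sigmoid(W @ a)
    acts.append(a)

# Backward pass: start from an arbitrary error signal at the top and propagate it down,
# multiplying by W^T and by the sigmoid derivative a(1 - a) <= 1/4 at each layer.
grad = rng.normal(size=width)
norms = []
for W, a in zip(reversed(Ws), reversed(acts[1:])):
    grad = (W.T @ grad) * a * (1.0 - a)
    norms.append(np.linalg.norm(grad))

# The signal reaching the lowest layers is orders of magnitude smaller than at the top.
print(norms[0], norms[-1])
```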

19 Mathematics of Dropout (Cheap Approximation to Training Full Ensemble)
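As a numerical check of the ensemble-approximation idea in this slide title (my own illustration for a single logistic unit, not the slide's derivation): averaging the unit's output over random dropout sub-networks is closely approximated by a single pass with the weights scaled by the keep probability p, and the normalized geometric mean of the sub-network outputs matches the scaled unit up to Monte Carlo error. The dimension, keep probability, and sample count below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d, p, n_masks = 30, 0.5, 200_000
w = rng.normal(size=d)                         # weights of a single logistic unit
x = rng.normal(size=d)                         # one fixed input

masks = rng.random((n_masks, d)) < p           # random dropout masks (keep probability p)
outs = sigmoid((masks * x) @ w)                # outputs of the ensemble of sub-networks

scaled = sigmoid(p * (w @ x))                  # one pass with weights scaled by p

# Normalized geometric mean of the ensemble outputs: matches the scaled unit up to MC error.
g, g_bar = np.exp(np.log(outs).mean()), np.exp(np.log(1.0 - outs).mean())
nwgm = g / (g + g_bar)

# scaled ~= nwgm; the plain arithmetic mean of the ensemble is close but not identical.
print(scaled, nwgm, outs.mean())
```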

20 Two Questions: 1. “If we solve computer vision, we have pretty much solved AI.” 2. A-NNs vs. B-NNs and Deep Learning.

