
HIERARCHICAL TEMPORAL MEMORY: WHY CAN'T COMPUTERS BE MORE LIKE THE BRAIN?




1 HIERARCHICAL TEMPORAL MEMORY: WHY CAN'T COMPUTERS BE MORE LIKE THE BRAIN?

2 The presentation is divided into ten major sections:
1. What Are HTMs?
2. Why Do We Need HTMs?
3. The Structure of an HTM
4. How Does an HTM Work?
5. Capabilities of HTMs
6. Why Is Hierarchy Important?
7. Implementation Details
8. Applications of HTMs
9. Future Enhancements
10. Conclusion

3 WHAT ARE HTMs? A machine learning model that replicates some of the structural and algorithmic properties of the human brain.

4 WHY DO WE NEED HTMs? Computers have no viable algorithm for cognitive tasks such as visual pattern recognition and understanding spoken language. In humans, these capabilities are largely performed by the neocortex – a thin sheet of cells, folded to form the convolutions that occupy nearly 60 percent of brain volume.

5 STRUCTURE OF HTMs Hierarchy of nodes. Each node implements a common learning and memory function. Information can flow both up and down the hierarchy.

6 HOW DO HTMs WORK? An HTM runs a single common algorithm for all modalities of sensory input. Knowledge is distributed across the nodes up and down the hierarchy: simple structures are represented in low-level nodes, and more complex structures in higher-level nodes.

7 [Image-only slide]

8 Learning Phase All HTMs go through a learning phase in which they are exposed to various objects (known as causes) in the world, and develop internal representations of those causes. [Diagram: causes in the world (people, cars, buildings, words, songs, ideas) generate patterns at the senses; the HTM/cortex reports "beliefs" over its learned causes, e.g. cause1 0.22, cause2 0.07, cause3 0.00, cause4 0.63, cause5 0.00, cause6 0.08.]

9 Establishing and Reporting Beliefs On encountering a novel input, each node creates a table listing the probability of each of its learned causes. This moment-to-moment distribution is called a belief. Each child node reports its beliefs to its parent node, and the parent may pass its predictions back down. In this way, HTMs turn rapidly changing sensory patterns at the bottom of the hierarchy into stable thoughts and concepts at the top.
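The belief computation described on slide 9 can be sketched in a few lines. This is an illustrative toy, not the NuPIC API: the Gaussian likelihood, the `sigma` parameter, and the function name `node_belief` are all assumptions for the sketch.

```python
import numpy as np

def node_belief(novel_input, learned_causes, sigma=1.0):
    """Return a normalized probability ("belief") for each learned cause."""
    causes = np.asarray(learned_causes, dtype=float)
    x = np.asarray(novel_input, dtype=float)
    # Likelihood falls off with distance between the input and each stored cause.
    dists = np.linalg.norm(causes - x, axis=1)
    likelihood = np.exp(-dists**2 / (2 * sigma**2))
    return likelihood / likelihood.sum()  # moment-to-moment belief distribution

# A child node's belief vector becomes the input pattern of its parent,
# which is how fast-changing pixels at the bottom turn into stable
# concepts at the top of the hierarchy.
learned = [[0, 0, 1, 1], [1, 1, 0, 0], [1, 0, 1, 0]]
belief = node_belief([1, 1, 0, 1], learned)
print(belief)  # sums to 1; most mass on the closest learned cause
```

In a full system the parent would combine the belief vectors of several children, and predictions would flow back down as priors.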

10 Simple HTM Vision System [Diagram: a three-level hierarchy in which Level 1 nodes each receive a 4-pixel input and feed upward through Level 2 to Level 3.]
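The fan-in on slide 10 can be made concrete with a small sketch: each Level 1 node watches one 2x2 (4-pixel) patch of the image. The image size and patch layout here are assumptions chosen for illustration.

```python
import numpy as np

def level1_patches(image, patch=2):
    """Split an image into non-overlapping patch x patch receptive fields,
    one per Level 1 node."""
    h, w = image.shape
    return [image[r:r + patch, c:c + patch].ravel()
            for r in range(0, h, patch)
            for c in range(0, w, patch)]

image = np.arange(16).reshape(4, 4)   # toy 4x4 "retina"
patches = level1_patches(image)
print(len(patches))  # four Level 1 nodes, each seeing 4 pixels
```

Level 2 nodes would then pool the outputs of neighboring Level 1 nodes, and a Level 3 node would see the whole image indirectly.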

11 Discovering Spatial and Temporal Patterns [Diagram: over time, common spatial patterns are remembered and uncommon patterns ignored; common sequences of patterns are assigned to causes and uncommon sequences ignored.]
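Slide 11's rule – remember common patterns, assign common sequences to causes, ignore the rest – can be sketched with simple frequency counting. The thresholds and names below are arbitrary illustrative choices, not the actual spatial-quantization or temporal-pooling algorithms.

```python
from collections import Counter

def learn_patterns(stream, min_count=3):
    """Remember spatial patterns that occur often; ignore rare ones."""
    counts = Counter(stream)
    return {p for p, n in counts.items() if n >= min_count}

def learn_sequences(stream, patterns, seq_len=2, min_count=2):
    """Assign each frequently repeated sequence of known patterns to a cause."""
    seqs = Counter()
    for i in range(len(stream) - seq_len + 1):
        window = tuple(stream[i:i + seq_len])
        if all(p in patterns for p in window):
            seqs[window] += 1
    return {s: f"cause{i}" for i, (s, n) in enumerate(seqs.most_common())
            if n >= min_count}

stream = ["A", "B", "A", "B", "A", "B", "C", "A", "B"]
patterns = learn_patterns(stream)            # "C" is too rare to remember
causes = learn_sequences(stream, patterns)   # the sequence A->B becomes a cause
```

A node's output would then be the cause label of the sequence it is currently seeing, which varies more slowly than the raw input.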

12 CAPABILITIES OF HTMs Capable of performing 4 basic functions : 1. Discover causes in the world 2. Infer causes of novel input 3. Make predictions 4. Direct behavior

13 WHY IS HIERARCHY IMPORTANT? Shared representations reduce memory requirements and training time. The hierarchy mirrors the hierarchical structure of the world, in both space and time. Belief-propagation-like techniques ensure the network quickly reaches the best mutually consistent set of beliefs. The hierarchy also affords a simple mechanism for covert attention.

14 Belief Propagation Each node in the system settles on a belief that is mutually consistent with all the other nodes; this resolves ambiguity and makes even a large system settle rapidly. [Diagram: an audio node reports 80% woof / 20% meow and a vision node reports 70% pig image / 30% cat image; combined through CPTs (conditional probability tables), the parent concludes 90% cat.]
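The ambiguity resolution on slide 14 can be illustrated by fusing two children's beliefs over a shared set of causes. Multiplying the distributions elementwise and renormalizing is a deliberate simplification of belief propagation (it omits the CPTs the slide mentions), and the cause labels are assumptions.

```python
import numpy as np

def combine_beliefs(*beliefs):
    """Fuse independent child beliefs over the same causes:
    elementwise product, then renormalize."""
    joint = np.prod(np.asarray(beliefs, dtype=float), axis=0)
    return joint / joint.sum()

# causes: [dog, cat, pig]
audio  = np.array([0.80, 0.20, 0.00])   # sound says: mostly "woof", some "meow"
vision = np.array([0.00, 0.30, 0.70])   # image says: mostly pig, some cat
parent = combine_beliefs(audio, vision)
# only "cat" is consistent with both children, so the parent's
# belief concentrates there
```

This is why the slide says beliefs become mutually consistent: a cause that any modality firmly rules out is eliminated everywhere.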

15 IMPLEMENTATION DETAILS The current version of the software platform is a research release called the Numenta Platform for Intelligent Computing (NuPIC), available free of cost. It is composed of three main components:
1. Run-Time Engine
2. Development Tools
3. Plug-in API and its associated source code
A set of sample HTM networks, along with documentation and training materials, has also been created to help developers get started.

16 [Diagram: NuPIC architecture. The Dev Tools (Configurator, Trainer, Debugger) pass a net list to the Supervisor through the Supervisor API; the run-time engine distributes the network across multiple node processors (up to Node Processor N), connected by a gigabit switch and a fileserver.]

17 Run-Time Engine Features: A set of C++ routines. Highly scalable – from laptops with a single CPU, to PCs with multiple cores, to large computer clusters. Supported on Linux and Mac OS; Windows support is to come.

18 Run-Time Engine Functions: Manages the creation, training, and running of HTM networks on standard computer hardware. Handles message passing back and forth for an HTM network on a cluster of servers.

19 Development Tools Features: Written in the Python scripting language, with source code available for developers to inspect, modify, and enhance. Functions: Used by developers to create, train, and test a network.

20 Plug-in API and Associated Source Code Functions: Enables developers to create new kinds of nodes and plug them into a network.

21 APPLICATIONS OF HTMs
Car manufacturing – visual recognition of various parts.
Modeling networks (computer networks, power networks, etc.) – predicting undesirable future conditions.
Oil exploration – deciding where best to drill a well.
Pharmaceutical firms – discovering and testing new drugs.
Businesses – studying corporate behavior.
Robotics – directing the actions of an android.

22 FUTURE ENHANCEMENTS As HTM is a new technology, many advances in our understanding of it lie ahead:
1. Improve the ability to measure and define the capacity of an HTM.
2. Develop useful heuristics for how best to specify hierarchies to match particular problems.
3. Improve the existing algorithms for spatial quantization and time-based pooling.
4. Extend the Numenta HTM platform implementation to Windows-based systems.

23 CONCLUSION We have recognized a fundamental concept of how the neocortex uses hierarchy and time to create a model of the world and to perceive novel patterns as part of that model. If advancements and refinements continue in the right direction, the true age of intelligent machines may just be getting started!

24 THANK YOU......

