Slide 1: ERA: Architectures for Inference
Dan Hammerstrom, Electrical and Computer Engineering, Maseeh College of Engineering and Computer Science

Slide 2: Intelligent Computing
- In spite of the transistor bounty of Moore's law, there is a large class of problems that computers still do not solve well.
- These problems involve the transformation of data across the boundary between the real world and the digital world.
- They occur wherever a computer samples and acts on real-world data, which includes almost all embedded computing applications.
- Our lack of general solutions to these problems, outside of specialized niches, constitutes a significant barrier to computer usage and to huge potential markets.

Slide 3: Intelligent Computing (continued)
- These are difficult problems that require computers to find complex structures and relationships, through space and time, in massive quantities of low-precision, ambiguous, noisy data.
- AI pursued solutions in this area, but ran into scaling problems, among other things.
- Artificial Neural Networks (ANNs) extended computational intelligence in a number of important ways, primarily by adding the ability to incrementally learn and adapt.
- However, ANNs also had trouble scaling, and they were often difficult to apply to many problems.
- Traditional rule-based knowledge systems are now evolving into probabilistic structures where inference, generally based on Bayes' rule, becomes the key computation.

Slide 4: Bayesian Networks
- We now have Bayesian networks.
  - A major contribution to this effort was the work of Judea Pearl: Pearl, J., Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann, 1988 & 1997.
  - These systems are far less brittle, and they more faithfully model aspects of animal behavior.
  - Animals learn from their surroundings and appear to perform a kind of probabilistic inference over learned knowledge as they interact with their environment.
- Bayesian networks are structured, graphical representations of probabilistic relationships among several random variables.
  - The graph structure is an explicit representation of conditional dependence (encoded by the network edges).
- The fundamental computation has become probabilistic inference.

Slide 5: A Simple Bayesian Network
- Each node has a CPT, or "Conditional Probability Table."
- The table below is the CPT for node D, whose parents are B and C; there are similar tables for A, B, and C. Each row is the distribution of D for one assignment of the parents, so each row sums to 1 (see the sketch below).

  P(d | b, c)     d1     d2
  b1, c1          0.5    0.5
  b2, c1          0.3    0.7
  b1, c2          0.9    0.1
  b2, c2          0.8    0.2
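As a concrete reading of the table, here is a minimal sketch in Python; the dictionary encoding and the lookup are purely illustrative, not a prescribed data structure:

```python
# The slide's CPT for node D, whose parents are B and C, stored as a dict
# keyed by the parent assignment. Values follow the table above.
cpt_d = {
    ("b1", "c1"): {"d1": 0.5, "d2": 0.5},
    ("b2", "c1"): {"d1": 0.3, "d2": 0.7},
    ("b1", "c2"): {"d1": 0.9, "d2": 0.1},
    ("b2", "c2"): {"d1": 0.8, "d2": 0.2},
}

# Look up P(d1 | b2, c1): read the row for the parent assignment (b2, c1).
print(cpt_d[("b2", "c1")]["d1"])   # 0.3

# Sanity check: each row of a CPT is a distribution over D, so it sums to 1.
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in cpt_d.values())
```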

Slide 6: Inference, A Simple Example
- We need to "infer" the most likely original message, given the data we received and our knowledge of the statistics of the channel errors and of the messages being generated.
- The inference problem: choose the most likely y, based on P[y | x'] (worked in the sketch below).
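A minimal worked version of this example, under assumed numbers: a binary symmetric channel with flip probability eps, and a two-message prior. By Bayes' rule, P[y | x'] is proportional to P[x' | y] P[y], so we score each candidate y and keep the best:

```python
# Assumed channel and message statistics, chosen only for illustration.
eps = 0.1                          # probability the channel flips a bit
prior = {"000": 0.7, "111": 0.3}   # P[y]: statistics of generated messages

def likelihood(received, sent, eps):
    """P[received | sent] for a memoryless binary symmetric channel."""
    p = 1.0
    for r, s in zip(received, sent):
        p *= eps if r != s else (1.0 - eps)
    return p

x_prime = "001"   # the (possibly corrupted) data we received

# Score each candidate message: P[y | x'] is proportional to P[x' | y] P[y].
posterior = {y: likelihood(x_prime, y, eps) * prior[y] for y in prior}
print(max(posterior, key=posterior.get))
# "000": one flipped bit from "000" is far more likely than two from "111".
```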

Slide 7: Assumption: Bayesian Inference as a Fundamental Computation in Future Systems
- In mapping inference computations to hardware, there are a number of issues to be considered, including:
  - the type and degree of parallelism (multiple independent threads versus data parallelism),
  - arithmetic precision,
  - inter-thread communication,
  - local storage requirements, etc.
- There are several variations on basic Bayesian techniques across a number of different fields, from communication theory and pattern recognition to computer vision, robotics, and speech recognition.
- However, for this review of inference as a computational model, three general families of algorithms are considered: 1) inference by analysis, 2) inference by random sampling, and 3) inference using distributed representations.

Slide 8: Analytic Inference
- Analytic techniques constitute the most widely used approach to inference over Bayesian networks.
- Most Bayesian networks are evaluated using Bayesian Belief Propagation (BBP), developed by Pearl.
  - Typically, data are input to the network by setting certain variables to known or observed values ("evidence").
  - Bayesian Belief Propagation is then performed to find the probability distributions of the free variables (see the sketch below).
- Analytic techniques require significant precision and dynamic range, generally in the form of floating-point representations.
- This, plus their limited parallelism, makes them good candidates for multi-core architectures, but not necessarily for more advanced nano-scale computation.
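The sketch below illustrates the analytic style on an assumed toy chain of three binary variables; it is not Pearl's full BBP algorithm, just the same idea of propagating evidence exactly. All CPT numbers are made up for illustration. Note that even this tiny example multiplies small probabilities, which is why precision and dynamic range matter:

```python
# Exact inference on an assumed chain X1 -> X2 -> X3 of binary variables.
# Evidence: X3 = 1. We compute P(X1 | X3 = 1) analytically.
import numpy as np

p_x1 = np.array([0.6, 0.4])            # P(X1)
p_x2_given_x1 = np.array([[0.7, 0.3],  # rows index X1, cols index X2
                          [0.2, 0.8]])
p_x3_given_x2 = np.array([[0.9, 0.1],  # rows index X2, cols index X3
                          [0.4, 0.6]])

# Pass a likelihood "message" from the evidence back up the chain,
# summing out the intermediate variable X2.
lam_x2 = p_x3_given_x2[:, 1]           # P(X3=1 | X2)
lam_x1 = p_x2_given_x1 @ lam_x2        # P(X3=1 | X1), with X2 summed out

posterior = p_x1 * lam_x1              # unnormalized P(X1, X3=1)
posterior /= posterior.sum()           # exact P(X1 | X3=1): [0.4286, 0.5714]
print(posterior)
```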

Slide 9: Random Sampling
- Another approach to inference is random sampling.
- There are a number of techniques, most of which fall under the general category of Monte Carlo simulation.
  - Again, evidence is input by setting some nodes to known values.
  - Then random samples of the free variables are generated.
- Two commonly used techniques are Adaptive Importance Sampling and Markov Chain Monte Carlo simulation (a simpler sampling sketch follows below).
- These techniques use adaptive sampling to perform an algorithmic search of the model's vector space.
- For large, complex Bayesian structures, such random sampling can often be the best way to evaluate the network.
- However, random sampling suffers from the fact that as the size of the Bayesian network increases, increasingly large sample sets are required to obtain sufficiently accurate statistics.
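For comparison with the analytic sketch, here is a sampling sketch on the same assumed toy chain. It uses likelihood weighting, a simple non-adaptive importance-sampling variant rather than the adaptive schemes named above: free variables are sampled forward, each sample is weighted by the probability of the evidence, and the weighted counts estimate the posterior. The estimate converges to the analytic answer only as the sample count grows, which is exactly the scaling issue noted on this slide:

```python
# Likelihood weighting on the assumed chain X1 -> X2 -> X3, evidence X3 = 1.
import random

p_x1 = [0.6, 0.4]                        # P(X1=0), P(X1=1)
p_x2_given_x1 = [[0.7, 0.3], [0.2, 0.8]]
p_x3_given_x2 = [[0.9, 0.1], [0.4, 0.6]]

def bernoulli(p_one):
    """Sample 1 with probability p_one, else 0."""
    return 1 if random.random() < p_one else 0

# Sample the free variables forward; weight each sample by the probability
# of the evidence X3 = 1 given the sampled value of X2.
weights = [0.0, 0.0]
for _ in range(100_000):
    x1 = bernoulli(p_x1[1])
    x2 = bernoulli(p_x2_given_x1[x1][1])
    weights[x1] += p_x3_given_x2[x2][1]   # w = P(X3=1 | sampled x2)

total = sum(weights)
print([w / total for w in weights])  # approaches the analytic [0.4286, 0.5714]
```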

Slide 10: Random Sampling (continued)
- An example of hardware/device support for such a system is the work of Prof. Krishna Palem (Rice University) on probabilistic CMOS (pCMOS).
  - Prof. Palem has shown that pCMOS can provide significant performance benefits in implementing Monte Carlo random sampling.
  - pCMOS logic is used to accelerate the generation of random numbers, thereby accelerating the sampling process.
- Monte Carlo techniques are computationally intensive and so still tend to have scaling limitations.
- On the plus side, massively parallel evaluations are possible, and arithmetic precision requirements are less constrained.
  - Such techniques therefore map cleanly to simpler, massively parallel, low-precision computing structures.
- These techniques may also benefit from morphic cores with hardware-accelerated random number generation.

Slide 11: Distributed Data Representation Networks
- Bayesian networks based on distributed data representations (DDR) are a different way to structure Bayesian networks; although analytic and sampling techniques can be used on these structures, they also allow different kinds of massively parallel execution.
- The use of DDR is very promising, but it is also the most limited of the three families in terms of demonstrated real applications.
- Computing with DDRs can be thought of as the computational equivalent of spread-spectrum communication (see the sketch below).
  - In a distributed representation, meaning is not carried by single symbolic units; it results from the interaction of a group of units, typically configured in a network structure, and each unit can often participate in several representations.
  - Representing data in this manner more easily allows incremental, integrative, decentralized adaptation.
  - The computational and communication loads are spread more evenly across the system.
  - Distributed representation also appears to be an important computational principle in neural systems.
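One concrete illustration of a distributed representation (an assumption for this sketch; the slide does not commit to a particular scheme) uses high-dimensional random bipolar vectors in the style of hyperdimensional computing. Each symbol is spread across thousands of units, a "bundle" of symbols occupies the same vector space, and no single unit carries the meaning, much as in spread-spectrum coding:

```python
# Distributed representation with random bipolar hypervectors (illustrative).
import numpy as np

rng = np.random.default_rng(0)
DIM = 10_000   # meaning is spread across all DIM units

# Each symbol is an independent random +/-1 vector.
symbols = {name: rng.choice([-1, 1], size=DIM) for name in ["cat", "dog", "car"]}

# Bundle "cat" and "dog" into one vector of the same size; a random vector
# breaks ties where the two disagree.
bundle = np.sign(symbols["cat"] + symbols["dog"] + rng.choice([-1, 1], size=DIM))

def similarity(a, b):
    """Normalized dot product: ~0 for unrelated vectors, ~1 for identical."""
    return float(a @ b) / DIM

for name, vec in symbols.items():
    print(name, round(similarity(bundle, vec), 2))
# "cat" and "dog" score well above chance (~0.5); "car" stays near 0,
# so membership is recoverable even though the units are shared.
```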

Slide 12: Distributed Data Representation Networks (continued)
- Biological neural circuits perform inference over huge knowledge structures in fractions of a second.
- It is not clear exactly how they manage this incredible trick, especially since Bayesian inference is so computationally intense.
- There is some speculation that DDR plays an important role, but more theory and algorithm development are needed.
  - This is an active area of research, and no doubt much progress will be made by the time many Emerging Research Devices become commercially available.
- One hypothesis is that hierarchical networks lead naturally to distributed representation, and subsequently to significant "factorization" of the inference process. This, coupled with massively parallel hardware, may enable entirely new levels of inference capability.

Slide 13
- One such model was developed by Lee and Mumford, who proposed a hierarchical Bayesian inference model of the primate visual cortex.
- [Figure: the Lee and Mumford visual cortex model]

Slide 14
- Another example is the work of Jeff Hawkins and Dileep George at Numenta, Inc.
- Their model starts with an approximation to a general Bayesian module; these modules can then be combined into a hierarchy to form what they call a Hierarchical Temporal Memory (HTM).
- Issues related to hardware architectures for Bayesian inference, and how they may be implemented with emerging devices, are now being studied.

Slide 15: Architecture
- Mapping Bayesian networks to a multi-core implementation is straightforward: just implement each node as a task. On a simple SMP-based multi-core machine, that would most certainly provide good performance (a sketch follows below).
- However, this approach breaks down as we scale to very large networks.
- Bayesian networks tend to be storage intensive, so implementation issues such as data-structure organization, memory management, and cache utilization also become important.
  - In fact, a potentially serious performance constraint may be access to primary memory, and it is not yet clear how effective caching will be in ameliorating this delay.
- As we scale to the very large networks required to solve complex problems, a variety of optimizations become possible and, in fact, necessary.
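A minimal sketch of the node-per-task mapping, assuming a Python thread pool as a stand-in for SMP cores; the four-node network, the beliefs, and the body of node_update are placeholders rather than a real inference kernel:

```python
# Node-per-task mapping: each network node becomes an independent task that
# recomputes its local belief; the pool plays the role of the SMP cores.
from concurrent.futures import ThreadPoolExecutor

nodes = {"A": [0.5, 0.5], "B": [0.5, 0.5], "C": [0.5, 0.5], "D": [0.5, 0.5]}

def node_update(name):
    # Placeholder for the real per-node work (gather parent messages,
    # combine with the local CPT, emit new messages). Here we only
    # renormalize the node's belief so the task does something concrete.
    belief = nodes[name]
    total = sum(belief)
    nodes[name] = [p / total for p in belief]
    return name

# Each task touches only its own node, so the tasks run independently.
with ThreadPoolExecutor(max_workers=4) as pool:
    for done in pool.map(node_update, nodes):
        print("updated node", done)
```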

Slide 16: Architecture (continued)
- One promising massively parallel approach is associative processing, which has been shown to approximate Bayesian inference and which has the potential for huge levels of parallelism.
- With morphic cores in heterogeneous multi-core structures, such massively parallel implementations of Bayesian networks become relevant.
- Another interesting variation is to eliminate synchrony: inter-module update messages arrive at random times, and computation within a module proceeds at its own pace, updating its internal estimates when it receives update messages and otherwise continuing without them (see the sketch below).
- More study is needed to explore radical new implementation technologies and how they may be used to do inference.
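An illustrative sketch of such an asynchronous module, under assumed details (a queue for inter-module messages and a toy averaging update rule): the module folds in an update message if one has arrived, and otherwise proceeds with its current estimate:

```python
# Asynchronous module: runs at its own pace, never blocks waiting for peers.
import queue
import threading

inbox = queue.Queue()   # update messages from other modules land here

def module(steps=5):
    estimate = 0.5      # the module's current internal estimate
    for _ in range(steps):
        try:
            msg = inbox.get(timeout=0.01)       # a message arrived in time
            estimate = 0.5 * (estimate + msg)   # fold it into the estimate
        except queue.Empty:
            pass                                # no message: continue anyway
        print("estimate:", round(estimate, 3))

t = threading.Thread(target=module)
t.start()
inbox.put(0.9)   # an update message arriving at an arbitrary time
t.join()
```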

Slide 17: Algorithm Family Summary

  Technique                   Parallelism   Inter-Thread    Computational   Storage/   State of
                              (Threads)     Communication   Precision       Node       the Art
  Analytic                    Moderate                      High            Moderate   Mature
  Random Sampling             High          Low             Moderate                   Mature
  Distributed / Hierarchies   High          Low                                        Preliminary

  (Empty cells are unspecified on the slide.)

