
1 Analog recurrent neural network simulation, Θ(log₂ n) unordered search with an optically-inspired model of computation

2 Index Continuous Space Machine Structure Analog Recurrent Neural Network Simulation and Complexity Result Unordered Search Algorithm

3 The Continuous Space Machine (CSM) Definition: a CSM is specified by the grid dimensions; the addresses of sta, a, and b; the addresses of the k input images; the r programming symbols and their addresses; and the addresses of the l output images.

4 Instructions of CSM h and v: h gives the 1-D Fourier transform of its argument image in the horizontal (x) direction, and v gives the 1-D Fourier transform in the vertical (y) direction.
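The slide does not reproduce the optical definitions of h and v; as a rough discrete analogue (an assumption, not the CSM's continuous-image definition), a sampled image can be Fourier-transformed along one axis at a time with NumPy:

```python
import numpy as np

# Sample a complex-valued "image" on an N x N grid (stand-in for a CSM image).
N = 8
image = np.random.rand(N, N) + 1j * np.random.rand(N, N)

# h: 1-D Fourier transform applied row-wise (along the x-direction).
h_image = np.fft.fft(image, axis=1)

# v: 1-D Fourier transform applied column-wise (along the y-direction).
v_image = np.fft.fft(image, axis=0)
```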

5 Instructions of CSM (II) *: * gives the complex conjugate of its argument image, mapping an image f to f*, the complex conjugate of f.

6 Instructions of CSM (III) · and +: · gives the pointwise complex product of its two argument images, and + gives the pointwise complex sum of its two argument images.
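As with h and v, a sampled NumPy sketch of these pointwise operations (an illustrative analogue of *, · and + on discretised images, not the CSM's continuous definition):

```python
import numpy as np

f = np.array([[1 + 2j, 3 - 1j], [0 + 1j, 2 + 0j]])
g = np.array([[2 + 0j, 1 + 1j], [1 - 1j, 0 + 3j]])

conj_f = np.conj(f)   # * : complex conjugate of every point of f
prod = f * g          # . : pointwise complex product of f and g
summ = f + g          # + : pointwise complex sum of f and g
```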

7 Instructions of CSM (IV) ρ: ρ performs amplitude thresholding on its first image argument, using its other two real-valued image arguments as lower and upper amplitude thresholds, respectively.
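One plausible reading of this instruction, sketched in NumPy, is that the amplitude of each point is clamped between the two thresholds; this is an assumption about the exact semantics, not the paper's formal definition:

```python
import numpy as np

def rho(f, lower, upper):
    """Amplitude thresholding: clamp |f| between lower and upper.

    A plausible reading of the slide; the CSM paper's exact semantics
    may differ.
    """
    amp = np.abs(f)
    return np.clip(amp, lower, upper)

image = np.array([[0.1 + 0.0j, 0.9 + 0.0j], [0.0 + 0.5j, 2.0 + 0.0j]])
print(rho(image, lower=0.3, upper=1.0))   # amplitudes clamped to [0.3, 1.0]
```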

8 Instructions of CSM (V) ld and st: ld copies the 'rectangle' of images specified by the ld parameters p1 to p4 to the image at well-known address a. st copies the image at well-known address a to a 'rectangle' of images specified by the st parameters p1 to p4.

9 Instructions of CSM (VI) br and hlt: br performs an unconditional jump to the address indicated by its parameter. hlt terminates the program.

10 Instructions of CSM (Review)

11 The relation between images and data Complex-valued image: a complex-valued image is a function f : [0, 1] × [0, 1] → ℂ, where [0, 1] is the real unit interval. Zero image: an image that has value 0 everywhere represents 0.

12 The relation between images and data (II) Binary symbol image: the symbol ψ ∈ {0, 1} is represented by the binary symbol image f_ψ. Real number image: the real number r ∈ ℝ is represented by the real number image f_r.

13 Two kinds of binary words Stack images: accessed with ld and st instead of push and pop. List images: all images are loaded at once.

14 Matrix image for ARNN simulation R × C matrix image: the R × C matrix A with real-valued components a_ij is represented by the R × C matrix image f_A.
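The formal definition of f_A is not reproduced on the slide; a minimal sketch of the natural encoding (an assumption: each cell of an R × C grid over the unit square carries constant amplitude a_ij) would be:

```python
import numpy as np

def matrix_image(A, samples_per_cell=4):
    """Sample an image over [0,1] x [0,1] in which cell (i, j) of an
    R x C grid has constant amplitude A[i, j].

    Illustrative assumption only; the paper defines f_A on the
    continuous unit square.
    """
    A = np.asarray(A, dtype=float)
    return np.kron(A, np.ones((samples_per_cell, samples_per_cell))).astype(complex)

A = [[0.5, 0.25],
     [1.0, 0.0]]
f_A = matrix_image(A)   # an 8 x 8 sampled image for this 2 x 2 matrix
```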

15 Complexity measures Time: the number of instructions executed in the program. Space: the total space needed to execute the program. Resolution: the maximum resolution of the grid images in the computation sequence. Range: the maximum amplitude precision needed.

16 ARNN ARNNs are finite-size, first-order feedback neural networks with real weights. The state of each neuron x_i at time t + 1 is given by an update equation of the form x_i(t + 1) = σ(Σ_j a_ij x_j(t) + Σ_j b_ij u_j(t) + c_i), where the u_j(t) are the external inputs and σ is the saturated-linear sigmoid. We can take p of the neurons x_i as the output.
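A minimal NumPy sketch of this update rule, evaluated directly rather than via CSM images (the weight values below are placeholders, not taken from the paper):

```python
import numpy as np

def sigma(z):
    """Saturated-linear sigmoid used in the ARNN update."""
    return np.clip(z, 0.0, 1.0)

def arnn_step(x, u, A, B, c):
    """One synchronous update: x(t+1) = sigma(A x(t) + B u(t) + c)."""
    return sigma(A @ x + B @ u + c)

# Tiny example: N = 2 neurons, M = 1 input line (placeholder weights).
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
B = np.array([[1.0],
              [0.0]])
c = np.array([0.0, 0.1])

x = np.zeros(2)          # initial neuron states
u = np.array([1.0])      # input at time t
x = arnn_step(x, u, A, B, c)
```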

17 ARNN (II) The CSM model can simulate the ARNN. The pseudocode is as below.

18 ARNN (III) Complexity: if the ARNN being simulated is defined for time t = 1, 2, 3, …, has M inputs and N neurons, and k is the number of stacked image elements used to encode the active input to the simulator, the four complexity measures are Time = O((N + M + 1)t + 1), Space = O(1), Resolution = max(2^(k+M-1), 2^(2N-2), 2^(N+M-2), 2^(t+N-1)), Range = infinity (a real value needs infinitely many bits).

19 ARNN Conclusion Because the ARNN can be simulated by the CSM, the computational power of the CSM is at least as strong as that of a TM.

20 Unordered Search (needle-in-the-haystack problem) L = {w : w ∈ 0*10*}; let ω ∈ L be written as ω = ω_0 ω_1 … ω_(n-1). Input: ω. Output: the binary representation of i, where ω_i = 1.

21 Solving NIH in other models In the classical model, this may be solved naively in O(n) time, and the naive method appears to be essentially the best possible in that model. On a quantum computer, it may be solved in O(√n) time using Grover's algorithm.

22 NIH in the CSM model Thinking: use a binary list image to represent ω, and a binary stack image to represent n with log₂ n bits. Because ω has only one non-zero point, we can use some convenient CSM instructions to solve this problem in less time, as sketched below.
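The CSM program itself appears as pseudocode on the next slide; as a purely conceptual sketch of the halving idea (an assumption about the approach, written directly in Python rather than in CSM instructions), the single 1 can be located in Θ(log₂ n) halvings:

```python
def needle_index(omega):
    """Locate the single 1 in a 0*10* word by repeated halving.

    Conceptual sketch only: each round keeps the half that contains
    the non-zero point and emits one bit of the answer, so the loop
    runs Theta(log2 n) times for a word of length n.
    """
    lo, hi = 0, len(omega)          # current search window [lo, hi)
    bits = []                       # binary representation of the index
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if any(omega[lo:mid]):      # the 1 lies in the left half
            hi = mid
            bits.append(0)
        else:                       # otherwise it lies in the right half
            lo = mid
            bits.append(1)
    return bits

print(needle_index([0, 0, 0, 0, 0, 1, 0, 0]))   # -> [1, 0, 1], i.e. index 5
```

Classically each `any` scan is linear, so this sketch is still O(n) work overall; the advantage in the CSM presumably comes from performing the corresponding test on whole images with a constant number of instructions per round.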

23 Pseudo code of Θ(log₂ n) unordered search

