
Slide 1: Monte Carlo Simulation (Wednesday, 9/11/2002)
Stochastic simulations consider particle interactions.
- Ensemble sampling
- Markov chains
- Metropolis sampling

Slide 2: Deterministic vs. Stochastic
- Deterministic: Newton's equation of motion, F = ma
- Stochastic: random walk

Slide 3: Brownian Motion (animation)

n = 200;             % number of particles
s = 0.02;            % step size
x = rand(n,1) - 0.5;
y = rand(n,1) - 0.5;
h = plot(x, y, '.');
axis([-1 1 -1 1])    % the limits were lost in transcription; [-1 1 -1 1] is a reasonable choice
axis square
grid off
set(h, 'EraseMode', 'xor', 'MarkerSize', 18)
while 1              % animate until interrupted
    drawnow
    x = x + s*randn(n,1);
    y = y + s*randn(n,1);
    set(h, 'XData', x, 'YData', y)
end

Slide 4: Lennard-Jones Potential
[Figure: plots of the Lennard-Jones potential and the corresponding force]
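The potential and force curves on the slide follow from the standard Lennard-Jones form, U(r) = 4*eps*((sig/r)^12 - (sig/r)^6), with F(r) = -dU/dr. A minimal Python sketch (the well depth `eps` and length scale `sig` default to reduced units; these names are illustrative, not from the slide):

```python
def lj_potential(r, eps=1.0, sig=1.0):
    """Lennard-Jones pair potential U(r) = 4*eps*((sig/r)**12 - (sig/r)**6)."""
    sr6 = (sig / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def lj_force(r, eps=1.0, sig=1.0):
    """Force magnitude F(r) = -dU/dr = 24*eps*(2*(sig/r)**12 - (sig/r)**6) / r."""
    sr6 = (sig / r) ** 6
    return 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r

# Sanity check: the minimum of U lies at r = 2**(1/6)*sig,
# where U = -eps and the force vanishes.
r_min = 2.0 ** (1.0 / 6.0)
```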

Slide 5: Measuring Elastic Constants

Slide 6: Replace the Time Average with an Ensemble Average
An ensemble is a collection of systems; the probability of finding a system in state s is P_s. If you average the velocity of one molecule of the air in your room as it moves from one collision to the next, that time average comes out the same as the average over all molecules in the room at one instant.
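The equality of time and ensemble averages (ergodicity) can be illustrated numerically. As a simplifying assumption, each collision re-draws the molecule's x-velocity independently from the same Gaussian distribution; both averages then estimate the same mean speed, sqrt(2/pi):

```python
import random

random.seed(1)

n = 100_000

# Time average: follow ONE molecule through many collisions,
# assuming each collision re-draws its x-velocity from N(0, 1).
time_avg = sum(abs(random.gauss(0.0, 1.0)) for _ in range(n)) / n

# Ensemble average: freeze time and average over MANY molecules at once.
ensemble_avg = sum(abs(random.gauss(0.0, 1.0)) for _ in range(n)) / n

# For an ergodic system the two averages agree; both estimate
# E|v| = sqrt(2/pi) ~ 0.798 for a standard Gaussian velocity.
```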

Slide 7: Thought Experiment
Let's pretend that our universe really is replicated over and over, and that our world is just one realization among all the others. We're formed in a thousand undramatic day-by-day choices. Parallel universes are an intuitive picture of ensembles.

Slide 8: Canonical Ensemble
Fixed number of atoms, system volume, and temperature; the system energy fluctuates. (A system with fixed number of atoms, volume, and energy is the microcanonical ensemble.) The partition function normalizes the Boltzmann weights of the microstates.
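The partition-function formula on the slide was an image lost in transcription. For discrete states s with energies E_s, Z = sum_s exp(-E_s / k_B T) and P_s = exp(-E_s / k_B T) / Z. A sketch in units where k_B = 1 (the two-level energies are illustrative):

```python
import math

def boltzmann_probs(energies, kT):
    """Return P_s = exp(-E_s/kT) / Z with Z = sum_s exp(-E_s/kT)."""
    weights = [math.exp(-e / kT) for e in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# Illustrative two-level system: the probabilities sum to 1,
# and the lower-energy state is always the more probable one.
probs = boltzmann_probs([0.0, 1.0], kT=1.0)
```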

Slide 9: Importance Sampling
- A system has a finite (but enormous) number of microstates.
- Importance sampling draws configurations according to their statistical weight rather than uniformly.
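Importance sampling concentrates samples where the integrand matters and reweights them to remove the bias. A generic sketch (this example, estimating the Gaussian tail probability P(X > 3), is illustrative and not from the slide): samples are drawn from a shifted proposal N(4, 1) and reweighted by p(x)/q(x) = exp(8 - 4x).

```python
import math
import random

random.seed(0)

def tail_prob_importance(n=200_000):
    """Estimate P(X > 3) for X ~ N(0, 1) by importance sampling.

    Draw from the proposal q = N(4, 1), which puts most samples in the
    tail region, and reweight each hit by p(x)/q(x) = exp(8 - 4x).
    """
    total = 0.0
    for _ in range(n):
        x = random.gauss(4.0, 1.0)            # draw from the proposal q
        if x > 3.0:
            total += math.exp(8.0 - 4.0 * x)  # importance weight p(x)/q(x)
    return total / n

est = tail_prob_importance()
# True value: 1 - Phi(3) ~ 0.00135; a plain uniform-from-p estimator
# would need far more samples to see this rare event at all.
```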

Slide 10: Metropolis Sampling I
1. Start from the current configuration, C(n).
2. Generate a trial configuration C(t) by selecting an atom at random and moving it.
3. Calculate the change in energy for the trial configuration, ΔU.

Slide 11: Metropolis Sampling II
- If ΔU < 0, accept the move: the trial configuration becomes the (n+1)-th configuration, C(n+1) = C(t).
- If ΔU >= 0, generate a random number r between 0 and 1:
  - if r <= exp(-ΔU / k_B T), accept the move, C(n+1) = C(t);
  - if r > exp(-ΔU / k_B T), reject the trial move, C(n+1) = C(n).
A sequence of configurations is generated by repeating these steps. Properties of the system are then obtained by simply averaging the properties over a large number of these configurations.
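The acceptance rule above can be sketched on a toy system: a single particle in a harmonic well U(x) = x^2/2 (this system and the step-size parameter are illustrative, not from the slides). At temperature kT the equilibrium distribution exp(-U/kT) is a Gaussian with variance kT, which the sampled configurations should reproduce:

```python
import math
import random

random.seed(42)

def metropolis_harmonic(n_steps=200_000, kT=1.0, step=1.0):
    """Metropolis sampling of one particle in the potential U(x) = x^2/2."""
    x = 0.0
    samples = []
    for _ in range(n_steps):
        x_trial = x + random.uniform(-step, step)   # random trial move
        dU = 0.5 * x_trial ** 2 - 0.5 * x ** 2      # energy change of the move
        # Accept downhill moves outright; accept uphill moves
        # with probability exp(-dU / kT), otherwise keep the old state.
        if dU < 0.0 or random.random() <= math.exp(-dU / kT):
            x = x_trial
        samples.append(x)
    return samples

s = metropolis_harmonic()
var = sum(v * v for v in s) / len(s)   # should approach kT = 1
```

Averaging a property (here x^2) over the generated configurations is exactly the "simply averaging" step described on the slide.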

Slide 12: Markov Chain
A sequence X_1, X_2, ... of random variables is called Markov if, for any n,
F(x | X_{n-1}, X_{n-2}, ..., X_1) = F(x | X_{n-1}),
i.e., the conditional distribution F of X_n given X_{n-1}, X_{n-2}, ..., X_1 equals the conditional distribution of X_n given only X_{n-1}.

Slide 13: Markov Processes
- Dart hit-or-miss
- Random Walk (RW)
- Self-Avoiding Walk (SAW)
- Growing Self-Avoiding Walk (GSAW)
- Diffusion-Limited Aggregation (DLA)
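Of the processes listed, the plain random walk is the simplest to simulate. A minimal 2D lattice sketch (the walker and step counts are illustrative) checks the diffusive scaling: for an unrestricted walk the mean-squared end-to-end distance grows linearly, <R^2> = N.

```python
import random

random.seed(3)

def mean_squared_displacement(n_steps, n_walkers=2000):
    """Mean-squared end-to-end distance of a 2D square-lattice random walk."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total = 0.0
    for _ in range(n_walkers):
        x = y = 0
        for _ in range(n_steps):
            dx, dy = random.choice(moves)   # each step is independent: Markov
            x += dx
            y += dy
        total += x * x + y * y
    return total / n_walkers

msd = mean_squared_displacement(100)   # should be close to N = 100
```

A self-avoiding walk scales differently (<R^2> grows faster than N) because forbidding revisited sites pushes the walk outward.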
