F.F. Assaad, MPI Stuttgart / Universität Stuttgart, 21.10.2002. Numerical approaches to the correlated electron problem: Quantum Monte Carlo.

Presentation transcript:

F.F. Assaad, MPI Stuttgart / Universität Stuttgart. Numerical approaches to the correlated electron problem: Quantum Monte Carlo.
 The Monte Carlo method. Basics.
 Spin systems. World-lines, loops and stochastic series expansions.
 The auxiliary field method I.
 The auxiliary field method II. Ground state, finite temperature and Hirsch-Fye.
 Special topics (Kondo / metal-insulator transition) and outlook.

Some generalities. Problem: ~10^23 electrons per cm^3. Question: ground state and elementary excitations.
Fermi statistics, no correlations: Fermi sea; elementary excitations are particle-hole pairs; CPU time scales as N^3.
Correlations (Coulomb): due to screening and phase-space arguments, the low-energy elementary excitations may remain particles and holes (Fermi liquid theory). Beyond that:
 1D: Luttinger liquid (spinon, holon).
 2D: fractional quantum Hall effect.
 Magnetism.
 Mott insulators.
 Metal-insulator transition.
 Heavy fermions.
The complexity of the correlated problem scales as e^N.

Lattice Hamiltonian H. Trace over Fock space. Path integral (the representation is not unique).

World-line approach with loop updates; stochastic series expansion. O(N) method.
 Non-frustrated spin systems.
 Bosonic systems.
 1D Hubbard and t-J models.
 Non-interacting electrons in dimensions larger than unity.

Determinantal method. O(N^3) method.
 Any mean-field Hamiltonian.
 Models with particle-hole symmetry: half-filled Hubbard, Kondo lattices.
 Models with attractive interactions: attractive Hubbard model, Holstein model.
 Impurity problems.

Otherwise: sign problem. Approximate strategies: CPQMC, PIRG.

The Monte Carlo method. Basic ideas. Aim: compute $X = \int_V d^d x\, f(x)$. Split the domain $V$ into hyper-cubes of linear size $h$ and use an integration method whose systematic error scales as $h^k$. In terms of the number of function evaluations $N = V/h^d$, the systematic error is then proportional to $N^{-k/d}$. Thus one obtains poor results for large values of $d$, and the Monte Carlo method becomes attractive.

The central limit theorem. Let $\{x_1, \dots, x_N\}$ be a set of statistically independent points distributed according to the probability distribution $P(x)$. Then we can estimate $X = \int dx\, P(x) f(x) \approx \frac{1}{N}\sum_{i=1}^{N} f(x_i)$. What is the error? For large $N$ the distribution of $X$ approaches a Gaussian of width $\sigma/\sqrt{N}$. For practical purposes we estimate $\sigma^2 \approx \frac{1}{N}\sum_i f(x_i)^2 - \left(\frac{1}{N}\sum_i f(x_i)\right)^2$. Thus the error (i.e. the width of the Gaussian distribution) scales as $1/\sqrt{N}$, irrespective of the dimensionality of the integration space. Demonstration of the theorem.
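As an illustration of this estimator and its error (not part of the original slides), here is a minimal Python sketch evaluating $\int_0^1 x^2\,dx$ with uniform $P(x)$ and reporting the CLT error $\sigma/\sqrt{N}$; the integrand and sample size are arbitrary choices for illustration.

```python
# Minimal sketch: Monte Carlo estimate of int_0^1 f(x) dx with f(x) = x^2,
# using the central-limit-theorem error estimate sigma / sqrt(N).
import random
import math

def mc_estimate(f, n_samples=100_000, seed=1):
    rng = random.Random(seed)
    total, total_sq = 0.0, 0.0
    for _ in range(n_samples):
        fx = f(rng.random())            # x drawn uniformly in [0, 1], i.e. P(x) = 1
        total += fx
        total_sq += fx * fx
    mean = total / n_samples
    var = total_sq / n_samples - mean * mean
    error = math.sqrt(var / n_samples)  # width of the Gaussian: sigma / sqrt(N)
    return mean, error

if __name__ == "__main__":
    est, err = mc_estimate(lambda x: x * x)
    print(f"estimate = {est:.5f} +- {err:.5f} (exact: {1/3:.5f})")
```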

An example: calculation of π. Draw N random points (x, y), with x and y drawn from the uniform distribution on the interval [0,1]. In this case $\pi \approx \frac{4}{N}\sum_{i=1}^{N}\theta(1 - x_i^2 - y_i^2)$, i.e. four times the fraction of points falling inside the quarter of the unit circle. Take N = 8000 to obtain an estimate, and repeat this simulation many times to compute the distribution D(X) of the estimates.
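A minimal sketch of this simulation (N = 8000 is taken from the slide; the seed and the number of repetitions are arbitrary choices for illustration):

```python
# Minimal sketch: estimate pi from the fraction of uniform points in [0,1]^2
# that fall inside the quarter of the unit circle, with N = 8000 per run.
import random

def estimate_pi(n_points=8000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_points):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:          # point lies inside the quarter circle
            hits += 1
    return 4.0 * hits / n_points

if __name__ == "__main__":
    # Repeating the simulation many times samples the distribution D(X) of the estimate.
    estimates = [estimate_pi(seed=s) for s in range(20)]
    print(estimates)
```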

Markov chains: generating points according to a distribution P(x). Define a Monte Carlo time-dependent probability distribution $P_t(x)$ which evolves according to a Markov process: the future depends only on the present. The time evolution is given by $P_{t+1}(y) = \sum_x T(y \leftarrow x)\, P_t(x)$ with transition matrix $T$. Requirement: $P_t(x) \to P(x)$ as $t \to \infty$. Conditions on $T$: (1) $T(y \leftarrow x) \ge 0$ and $\sum_y T(y \leftarrow x) = 1$; (2) ergodicity: any configuration can be reached from any other in a finite number of steps; (3) stationarity: $\sum_x T(y \leftarrow x) P(x) = P(y)$. The stationarity condition is fulfilled if the detailed balance condition $T(y \leftarrow x) P(x) = T(x \leftarrow y) P(y)$ is satisfied. But the stationarity condition is what is essential, not detailed balance!
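To make the evolution equation concrete, here is a minimal sketch with a hypothetical three-state transition matrix (not from the slides); iterating $P_{t+1} = T P_t$ drives any initial distribution towards the stationary one:

```python
# Minimal sketch: evolve a distribution P_t under a stochastic matrix T,
# P_{t+1}(y) = sum_x T[y][x] * P_t(x), and watch it converge.
T = [
    [0.5, 0.2, 0.3],   # T[y][x]: probability of moving from state x to state y;
    [0.3, 0.6, 0.3],   # each column sums to 1 (condition (1))
    [0.2, 0.2, 0.4],
]

def step(p):
    return [sum(T[y][x] * p[x] for x in range(3)) for y in range(3)]

p = [1.0, 0.0, 0.0]    # start from a delta distribution
for t in range(30):
    p = step(p)
print(p)               # approaches the stationary distribution of T
```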

Convergence to P(x). Rate of convergence: the eigenvalues $\lambda_i$ of $T$ satisfy $|\lambda_i| \le 1$, and the eigenvalue $\lambda_0 = 1$ corresponds to the stationary distribution. The rate of convergence depends on the second largest eigenvalue $\lambda_1 < 1$: expanding the initial distribution in eigenvectors of $T$, the deviation from the stationary distribution decays as $\lambda_1^t$.

Explicit construction of T. Write $T(y \leftarrow x) = T_0(y \leftarrow x)\, A(y \leftarrow x)$ for $y \ne x$. Here $T_0(y \leftarrow x)$ is the probability of proposing a move from x to y; it has to satisfy the ergodicity condition (2) and condition (1). $A(y \leftarrow x)$ is the probability of accepting the move; a rejected move leaves the configuration unchanged, so that T satisfies (1). To satisfy the stationarity condition (3) we will require detailed balance: $T_0(y \leftarrow x) A(y \leftarrow x) P(x) = T_0(x \leftarrow y) A(x \leftarrow y) P(y)$. For a symmetric proposal probability, two standard choices of the acceptance probability (Ansatz) are
Metropolis: $A(y \leftarrow x) = \min\!\left(1, \frac{P(y)}{P(x)}\right)$;
Heat bath: $A(y \leftarrow x) = \frac{P(y)}{P(y) + P(x)}$.
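A minimal sketch of a Metropolis chain with a symmetric random-walk proposal; the target distribution $P(x) \propto e^{-x^2/2}$ and the step size are assumptions made only for illustration:

```python
# Minimal sketch: Metropolis sampling of P(x) ~ exp(-x^2/2) with a
# symmetric proposal T0, acceptance A = min(1, P(y)/P(x)).
import random
import math

def metropolis_chain(n_steps=10_000, step=1.0, seed=1):
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        y = x + step * (2.0 * rng.random() - 1.0)    # symmetric proposal
        # Metropolis acceptance: P(y)/P(x) = exp(-(y^2 - x^2)/2)
        if rng.random() < math.exp(-(y * y - x * x) / 2.0):
            x = y                                     # accept; otherwise keep x
        samples.append(x)
    return samples

samples = metropolis_chain()
print(sum(samples) / len(samples))   # should fluctuate around 0
```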

Ergodicity. To achieve ergodicity, one will often want to combine different types of moves. Let $T^{(i)}$, $i = 1 \dots N$, each satisfy conditions (1) and (3). We can combine those moves randomly, $T_R = \frac{1}{N}\sum_{i=1}^{N} T^{(i)}$, or sequentially, $T_S = T^{(N)} \cdots T^{(2)} T^{(1)}$, to achieve ergodicity. Note: if each $T^{(i)}$, $i = 1 \dots N$, satisfies the detailed balance condition, then $T_R$ also satisfies detailed balance, but $T_S$ in general satisfies only the stationarity condition.

Autocorrelation time and error analysis: binning analysis. Monte Carlo simulation:
1) Start with configuration $x_0$.
2) Propose a move from $x_0$ to $y$ according to $T_0(y \leftarrow x_0)$ and accept it with probability $A(y \leftarrow x_0)$.
3) If the move is accepted set $x_1 = y$, otherwise set $x_1 = x_0$.
4) Go to 2) with the new configuration.
Autocorrelation time: this generates a sequence $x_0, x_1, \dots, x_N$ which, if N is large enough, will be distributed according to P(x), so that averages of measurements along the chain estimate $\langle f \rangle$. The relevant time scale to forget the memory of the initial configuration is the autocorrelation time $\tau$.

To use the central limit theorem to evaluate the error, we need statistically independent measurements. Binning: group the raw data into bins of size $n\tau$, compute the bin averages $X_b$, and estimate the error with $\Delta = \sqrt{\frac{1}{N_{\mathrm{bin}}(N_{\mathrm{bin}}-1)}\sum_{b}(X_b - \bar{X})^2}$. If n is large enough (n ~ 5-10), the bin averages are effectively independent and the error estimate becomes independent of n.
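A minimal sketch of such a binning analysis; the synthetic correlated data (an AR(1) process) and its parameters are assumptions for illustration, not from the slides:

```python
# Minimal sketch: binning analysis of a correlated time series. The error
# estimate grows with the bin size until the bins exceed the autocorrelation
# time, then saturates at the true statistical error.
import math
import random

def binned_error(data, bin_size):
    n_bins = len(data) // bin_size
    bins = [sum(data[b * bin_size:(b + 1) * bin_size]) / bin_size
            for b in range(n_bins)]
    mean = sum(bins) / n_bins
    var = sum((x - mean) ** 2 for x in bins) / (n_bins - 1)
    return mean, math.sqrt(var / n_bins)   # error of the mean from bin averages

if __name__ == "__main__":
    # Synthetic correlated data: an AR(1) process with autocorrelation time ~ 10.
    rng, x, series = random.Random(1), 0.0, []
    for _ in range(100_000):
        x = 0.9 * x + rng.gauss(0.0, 1.0)
        series.append(x)
    for m in (1, 2, 4, 8, 16, 32, 64, 128):
        print(m, binned_error(series, m))
```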

Example: the one-dimensional Ising model. We want to compute spin-spin correlation functions $g(r) = \langle \sigma_i \sigma_{i+r} \rangle$. Algorithm:
 Choose a site randomly.
 Propose a spin flip.
 Accept with the Metropolis or heat-bath probability.
 Carry out the measurement, e.g. after a sweep.
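A minimal sketch of this algorithm, assuming the ferromagnetic convention $H = -J\sum_i \sigma_i \sigma_{i+1}$, periodic boundary conditions and single-spin-flip Metropolis updates; the lattice size, temperature and sweep counts are illustrative choices:

```python
# Minimal sketch: single-spin-flip Metropolis simulation of the 1D Ising chain,
# measuring the spin-spin correlation g(L/2) once per sweep after equilibration.
import random
import math

def ising_1d(L=24, beta=1.0, J=1.0, sweeps=20_000, seed=1):
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(L)]
    corr_sum, n_meas = 0.0, 0
    for sweep in range(sweeps):
        for _ in range(L):                           # one sweep = L attempted flips
            i = rng.randrange(L)
            # Energy change for flipping spin i (periodic boundaries)
            dE = 2.0 * J * spins[i] * (spins[(i - 1) % L] + spins[(i + 1) % L])
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                spins[i] = -spins[i]                 # Metropolis acceptance
        if sweep > sweeps // 10:                     # measure after equilibration
            corr_sum += sum(spins[i] * spins[(i + L // 2) % L] for i in range(L)) / L
            n_meas += 1
    return corr_sum / n_meas                         # estimate of g(L/2)

if __name__ == "__main__":
    print(ising_1d())
```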

Example of error analysis for the L = 24 1D Ising model. Figure: binning/autocorrelation analysis, (a) with the unit being a single sweep, (b) with the unit being the autocorrelation time as determined from (a). Table columns: J, g(L/2) exact, g(L/2) MC (numerical values not reproduced in the transcript). Results obtained after 2×10^6 sweeps.

Random number generators. Linear congruential generator: $I_{n+1} = (a I_n + c) \bmod m$; period up to $2^{31}$ with 32-bit integers. (Ref: Numerical Recipes, Cambridge University Press.)
 Deterministic (i.e. pseudo-random): for a given initial value (seed) of I, the sequence of random numbers is reproducible.
 Quality checks. (1) Distribution: the histogram of X should reproduce P(x), here the uniform distribution.
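A minimal sketch of such a generator together with quality check (1); the constants a, c, m below are commonly quoted 32-bit values chosen only for illustration, not necessarily those on the slide:

```python
# Minimal sketch: linear congruential generator I_{n+1} = (a*I_n + c) mod m,
# mapped to pseudo-random floats in [0, 1), plus a crude histogram check.
class LCG:
    def __init__(self, seed=1, a=1103515245, c=12345, m=2**31):
        self.state = seed
        self.a, self.c, self.m = a, c, m

    def next_int(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

    def next_float(self):
        return self.next_int() / self.m    # pseudo-random number in [0, 1)

rng = LCG(seed=42)
# Quality check (1): the histogram of many draws should be flat on [0, 1).
draws = [rng.next_float() for _ in range(100_000)]
hist = [0] * 10
for x in draws:
    hist[int(10 * x)] += 1
print(hist)   # roughly equal counts in each of the 10 bins
```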

(2) Correlations: the autocorrelations of the sequence, $\langle x_n x_{n+k} \rangle - \langle x_n \rangle^2$, should vanish for $k \neq 0$. (3) 2-tuples: successive pairs $(x_n, x_{n+1})$ plotted in the unit square should fill it uniformly, without visible lattice structure.

The generation of good pseudo-random numbers is a quite delicate issue which requires some care and extensive quality checks. It is therefore highly recommended not to invent one's own secret recursion rules but to use one of the well-known generators which have been tested by many other workers in the field.