Volume computation László Lovász Microsoft Research

Volume computation. Given: a convex body K ⊆ ℝ^n, presented by a membership oracle (with a guarantee B ⊆ K ⊆ R·B). Want: the volume of K with relative error ε, in time polynomial in n, 1/ε and log R. Not possible deterministically in polynomial time, even with error as large as ε = n^{cn} (Elekes; Bárány-Füredi).
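The quantitative lower bound on the slide was an image; the standard statement (reconstructed from memory, so treat the exact constant as an assumption) is that any deterministic algorithm making polynomially many membership queries must, for some body K, mis-estimate the volume by a factor of at least

\[
\left(\frac{c\,n}{\log n}\right)^{n}.
\]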

Dyer-Frieze-Kannan 1989. But if we allow randomization: there is a polynomial time randomized algorithm that, with high probability, computes the volume of a convex body with arbitrarily small relative error.

Why not just sample uniformly from a box around K and count the hits? [figure: a sample S of random points scattered in a square, few of them inside K] The hit ratio vol(K)/vol(box) can be exponentially small, so S needs exponential size before the estimate is even nonzero!
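A quick illustration of the failure (a sketch of my own, not from the slides), for the case where K is the unit ball inside its bounding cube:

```python
import math

# Ratio vol(unit n-ball) / vol(bounding cube [-1,1]^n): the expected
# fraction of uniform cube samples that land in the ball.
def hit_probability(n: int) -> float:
    ball = math.pi ** (n / 2) / math.gamma(n / 2 + 1)  # volume of the unit n-ball
    cube = 2.0 ** n                                    # volume of [-1,1]^n
    return ball / cube

for n in (2, 10, 20, 50):
    print(n, hit_probability(n))
# The ratio decays super-exponentially: already ~1.5e-28 at n = 50,
# so a naive sampler expects ~10^28 points before a single hit.
```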

We can use multi-phase Monte Carlo! Construct a chain of convex bodies K_0 ⊆ K_1 ⊆ … ⊆ K_m = K whose consecutive volume ratios are bounded, and estimate each ratio by sampling. But... now we have to generate random points from K_{i+1}.
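The formulas on the slide were images; the standard product estimator (a reconstruction, assuming the usual choice K_i = K ∩ 2^{i/n}B with B ⊆ K ⊆ R·B) reads:

\[
\mathrm{vol}(K) \;=\; \mathrm{vol}(K_0)\prod_{i=0}^{m-1}\frac{\mathrm{vol}(K_{i+1})}{\mathrm{vol}(K_i)},
\qquad m = \lceil n\log_2 R\rceil,
\]

where each ratio lies between 1 and 2 (since K_{i+1} ⊆ 2^{1/n}K_i), so it can be estimated from uniform samples of K_{i+1} with small variance.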

- Construct a sufficiently dense lattice.
- Do a sufficiently long random walk on the centers of lattice cubes in K.
- Pick a random point p from the current little cube.
- If p is outside K, abort; else return p.
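A minimal sketch of this cube-walk sampler (my own illustration: the oracle in_K, the spacing delta and the step count are assumptions; the real algorithm fixes them via the mixing-time analysis):

```python
import random

def cube_walk_sample(in_K, x0, delta, steps):
    """Random walk on centers of lattice cubes inside K (sketch)."""
    x = list(x0)                            # x0: a cube center inside K
    n = len(x)
    for _ in range(steps):
        i = random.randrange(n)             # pick a random coordinate ...
        s = random.choice((-delta, delta))  # ... and a random direction
        x[i] += s
        if not in_K(x):                     # proposed neighbor center left K:
            x[i] -= s                       # stay put this step
    # pick a uniform random point of the little cube around the final center
    p = [xi + random.uniform(-delta / 2, delta / 2) for xi in x]
    return p if in_K(p) else None           # abort if p fell outside K
```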

Issues
- How dense should the lattice be?
- Where to start the walk? ("warm start": use the points you already have)
- How long to walk? (infinitely...? really: mixing time → bottleneck → isoperimetric inequality)
- How many trials will be aborted? ("rounding": preprocessing by an affine transformation)
- How close will the returned point be to random? (mixing time + small isolated parts)

Conductance: an isoperimetric quantity measuring the worst bottleneck of the chain.
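The defining formula was an image; the standard definition is

\[
\varphi \;=\; \min_{S:\ 0<\pi(S)\le 1/2}\ \frac{Q(S,V\setminus S)}{\pi(S)},
\qquad
Q(S,V\setminus S)=\sum_{i\in S}\sum_{j\notin S}\pi(i)\,p_{ij},
\]

the worst-case ratio of the ergodic flow leaving a set to the set's stationary measure.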

General mixing time bound (Jerrum-Sinclair), in terms of the conductance φ and a bound M on the starting density: the mixing time is at least of order 1/φ, but at most of order (log M)/φ².

The problem with the boundary: an isolated cube near the boundary is a bottleneck. Possible remedies:
- make the boundary small (sandwiching)
- make the boundary smoother
- re-define conductance by excluding small sets
- walk on all points
- separate global and local conductance
- start far from trouble

Dyer-Frieze-Kannan 1989: multi-phase Monte Carlo (product estimator) + Markov chain sampling + isoperimetric inequalities ⇒ polynomial time! [table: cost of volume computation (number of oracle calls); cost of a sample point; amortized cost of a sample point]

Dyer-Frieze-Kannan 1989, Lovász-Simonovits 1990: isoperimetric inequalities via the Localization Lemma; exceptional small sets; warm start: start from a random point of a distribution already close to uniform → start far from trouble → avoid the start-up penalty; bootstrapping: re-using points from the previous phase as starting points.

Isoperimetric Inequality
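The inequality on the slide was an image; the standard Lovász-Simonovits form (a reconstruction from memory) is: if a convex body K of diameter D is partitioned as K = S₁ ∪ S₂ ∪ S₃ with d(S₁,S₂) ≥ t, then

\[
\mathrm{vol}(S_3)\;\ge\;\frac{2t}{D}\,\min\{\mathrm{vol}(S_1),\mathrm{vol}(S_2)\}.
\]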

Localization Lemma: an n-dimensional integral inequality can be reduced to a one-dimensional inequality over an infinitesimally narrow truncated cone.
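A standard statement (Lovász-Simonovits; reconstructed from memory, so treat the exact hypotheses as an assumption): if g and h are integrable functions on ℝ^n with ∫g > 0 and ∫h > 0, then there are points a, b and a nonnegative linear function ℓ on [0,1] with

\[
\int_0^1 \ell(t)^{\,n-1}\,g\big((1-t)a+tb\big)\,dt \;>\;0
\quad\text{and}\quad
\int_0^1 \ell(t)^{\,n-1}\,h\big((1-t)a+tb\big)\,dt \;>\;0;
\]

the weight ℓ(t)^{n-1} is exactly the cross-sectional volume profile of such a truncated cone.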

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 integration of logconcave functions, isoperimetric inequality for logconcave functions, Metropolis algorithm, better sandwiching

The Metropolis algorithm. Given: a time-reversible Markov chain M on V with stationary distribution π, and a function F > 0 on V. Want: sample from the distribution with density proportional to F. Modified Markov chain M′:
- generate a step i → j of M;
- if F(j) ≥ F(i), make the step;
- if F(j) < F(i), make the step with probability F(j)/F(i), else stay where you are.
M′ is time-reversible, and its stationary density (relative to π) is proportional to F.
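A minimal sketch of one Metropolis step (my own illustration; the helper names and the toy target are assumptions, not from the slides):

```python
import random

def metropolis_step(x, F, propose):
    """One step of the modified chain M' (sketch).

    x       : current state
    F       : positive, unnormalized target density on the states
    propose : draws j from the base chain M at state i (M time-reversible;
              here assumed symmetric, so its stationary distribution is uniform)
    """
    y = propose(x)
    if F(y) >= F(x):                   # uphill moves are always accepted
        return y
    if random.random() < F(y) / F(x):  # downhill moves accepted w.p. F(y)/F(x)
        return y
    return x                           # else stay where you are

# Toy usage: sample from density proportional to F on {0,...,99},
# with a symmetric +-1 random walk (clamped at the ends) as base chain.
F = lambda i: float((i + 1) ** 2)
propose = lambda i: max(0, min(99, i + random.choice((-1, 1))))
x = 50
for _ in range(10_000):
    x = metropolis_step(x, F, propose)
```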

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 ball walk
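The ball walk is only named on the slide; a minimal sketch of one step (my own illustration, assuming a membership oracle in_K and step size delta):

```python
import math, random

def ball_walk_step(x, in_K, delta):
    """One step of the ball walk (sketch): propose a uniform random point
    of the ball of radius delta around x; move if it stays in K, else stay put."""
    n = len(x)
    g = [random.gauss(0.0, 1.0) for _ in range(n)]   # random direction
    norm = math.sqrt(sum(gi * gi for gi in g))
    r = delta * random.random() ** (1.0 / n)         # uniform radius in the ball
    y = [xi + r * gi / norm for xi, gi in zip(x, g)]
    return y if in_K(y) else x
```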

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 independence of errors

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 Lovász-Simonovits 1992,93: integration of smoother functions; randomized preprocessing; generalization of multi-phase Monte Carlo to a simulated annealing scheme.

"Simulated annealing" for integration. Want: the integral of a function F over K. Tool: a random walk on K.

X: a sample from μ_k, the distribution of phase k.
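The estimator was a formula image; in the standard annealing scheme (a reconstruction; the notation μ_k ∝ F^{a_k} is my assumption about the slide) one sets

\[
Z_k=\int_K F(x)^{a_k}\,dx,\qquad
\mathbb{E}\big[F(X)^{a_{k+1}-a_k}\big]=\frac{Z_{k+1}}{Z_k}
\quad\text{for }X\sim\mu_k\propto F^{a_k},
\]

so the target integral is recovered as a telescoping product of ratios, each estimated from samples of one phase.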

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 Lovász-Simonovits 1992,93 Kannan-Lovász-Simonovits 1997: isotropic position; local and global obstructions (speedy walk); bootstrapping preprocessing and sampling.

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 Lovász-Simonovits 1992,93 Kannan-Lovász-Simonovits 1997 Lovász 1999 analysis of the hit-and-run algorithm

Smith 1984 Hit-and-run walk
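The picture of the walk is lost; one step of hit-and-run (the standard description) picks a uniform random line through the current point and moves to a uniform random point of its chord in K. A minimal sketch (my own; the chord helper is an assumption, e.g. implemented by binary search on the membership oracle):

```python
import math, random

def hit_and_run_step(x, chord):
    """One step of the hit-and-run walk (sketch).

    chord(x, d) must return (t_min, t_max) such that x + t*d lies in K
    exactly for t in [t_min, t_max]; for convex K this is an interval.
    """
    n = len(x)
    g = [random.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(gi * gi for gi in g))
    d = [gi / norm for gi in g]            # uniform random direction
    t_min, t_max = chord(x, d)             # the chord of K through x along d
    t = random.uniform(t_min, t_max)       # uniform point on the chord
    return [xi + t * di for xi, di in zip(x, d)]
```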

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 Lovász-Simonovits 1992,93 Kannan-Lovász-Simonovits 1997 Lovász 1999 Kannan-Lovász 1999 average conductance, log-Cheeger inequality

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 Lovász-Simonovits 1992,93 Kannan-Lovász-Simonovits 1997 Lovász 1999 Kannan-Lovász 1999 Lovász-Vempala 2002 sampling from general logconcave distributions, ball walk and hit-and-run walk

Dyer-Frieze-Kannan 1989 Lovász-Simonovits 1990 Applegate-Kannan 1990 Lovász 1991 Dyer-Frieze 1991 Lovász-Simonovits 1992,93 Kannan-Lovász-Simonovits 1997 Lovász 1999 Kannan-Lovász 1999 Lovász-Vempala 2002 A.Kalai-Lovász-Vempala 2003 Simulated annealing

The pencil construction [figure: an (n+1)-dimensional "pencil" over K, sharpened to an apex at 0 and extending to height 2R]
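The defining formula was in the figure; as I recall the Lovász-Vempala construction (so treat this as a reconstruction), the pencil is

\[
K' \;=\; \big\{(t,x)\in\mathbb{R}^{n+1}\;:\;0\le t\le 2R,\ x\in K,\ \|x\|\le t\big\},
\]

the cylinder [0,2R] × K sharpened by a cone: annealing can start at the apex, where the body is a small cone of known volume.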

Two possibilities for further improvement:
- the Slicing Conjecture
- the reflecting walk

The Slicing Conjecture [figure: a convex body bisected by a surface F and by a hyperplane H]: is the area of the smallest bisecting hyperplane H within an absolute constant factor of the area of the smallest bisecting surface F?

Reflecting random walk in K: take a large step length h, e.g. exponentially distributed with expectation ≈ diam(K), reflecting the trajectory off the boundary. The chain is time-reversible and its stationary distribution is uniform. How fast does this mix?