An introduction to Monte Carlo methods¶

The corresponding notebooks are here: github repository and docker



Michel Ferrero
International Summer School on Computational Quantum Materials
Jouvence, June 9, 2022

Outline¶

  • Introduction to Monte Carlo

  • Newton–Cotes quadrature

  • Importance sampling

  • Direct sampling methods

  • Markov chain sampling and balance condition

  • Metropolis-Hastings algorithm

  • The two-dimensional Ising model

  • Error bar analysis

  • References

%matplotlib inline
%config InlineBackend.figure_format = 'retina'
import matplotlib.pyplot as plt
import numpy as np
plt.style.use('style.mpl')

Motivation¶

  • Physical systems are often described by a very large number of degrees of freedom.

  • Obtaining information, such as the partition function, in principle requires computing a sum over all of them.

  • In most cases, this is very difficult! Example: for an Ising model with $N$ spins, the sum has $2^N$ terms.

  • Monte Carlo methods circumvent this difficulty using a stochastic approach.
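The exponential growth of the sum can be made concrete with a small brute-force enumeration. The sketch below (illustrative only; the model choice, a 1D Ising chain with open boundaries, and the helper name `ising_partition_function` are assumptions, not from the lecture) counts every term of the partition function explicitly, which is exactly what becomes impossible for large $N$:

```python
import numpy as np
from itertools import product

def ising_partition_function(n, beta=1.0):
    """Brute-force Z for a 1D Ising chain of n spins with open
    boundaries: sums exp(-beta * E) over all 2^n configurations."""
    z = 0.0
    for spins in product([-1, 1], repeat=n):
        # nearest-neighbour energy E = -sum_i s_i s_{i+1}
        energy = -sum(spins[i] * spins[i + 1] for i in range(n - 1))
        z += np.exp(-beta * energy)
    return z

# the number of terms doubles with every additional spin
for n in (4, 8, 12):
    print(f"N = {n:2d}: {2**n:5d} terms, Z = {ising_partition_function(n):.3f}")
```

Already at $N \sim 50$ the $2^N$ terms are far beyond reach of any computer, which is why a stochastic estimate is needed.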

The birth of computational Monte Carlo¶

  • What are the chances to win a Solitaire game?

  • Rather than trying to solve the complex combinatorial problem, Ulam thought he could play 100 games and get an idea of the result from the statistics.

  • With the appearance of the first computers, this idea gave birth to the first Monte Carlo algorithms.

[Photographs: Stanislaw Ulam and John von Neumann]

Monte Carlo for integration¶

  • Underlying principle in Monte Carlo methods: In order to gain information about a system that depends on many degrees of freedom, it is enough to inspect a subset of cleverly chosen representatives.

  • These representatives are generated by a stochastic process.

  • Monte Carlo methods are now widely used in modern computational physics. They exist in many different flavours.

  • We will be interested in Monte Carlo integration.

Example: Computing $\pi$ with pebbles¶

  • How can we stochastically estimate $\pi$? We can obtain it from \begin{equation*} \pi = 4 \, \frac{A_\mathrm{circle}}{A_\mathrm{square}} \end{equation*}

Uniform sampling over the square¶

  • Throw pebbles uniformly in $[-1,1]^2$. The ratio of pebbles inside the circle to the total number of pebbles is an estimate of $A_\mathrm{circle} / A_\mathrm{square}$. Then $\pi \simeq 4 N_\mathrm{inside} / N_\mathrm{total}$.
n_pebbles = 2**12
pebbles = np.random.uniform(-1, 1, size=(2, n_pebbles))
n_inside = np.sum(np.linalg.norm(pebbles, axis=0) < 1)
print(f"inside: {n_inside}, total: {n_pebbles}, π ≃ {4*n_inside/n_pebbles}")
inside: 3230, total: 4096, π ≃ 3.154296875

What have we done?¶

  • Throwing the $n^\mathrm{th}$ pebble is described by a random variable

    \begin{equation*} x_n = \begin{cases} 4 \quad \text{with probability} \quad \pi/4 \\ 0 \quad \text{with probability} \quad (1-\pi/4) \end{cases} \end{equation*}

  • The average of $x_n$ is

    \begin{equation*} \langle x_n \rangle = \frac{\pi}{4} \cdot 4 + \left(1-\frac{\pi}{4}\right) \cdot 0 = \pi \end{equation*}

  • We have traded an integration for the computation of an average value.
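Since the estimate is now a sample mean of the random variables $x_n$, standard statistics applies: the standard error of the mean quantifies the uncertainty. A minimal sketch along these lines, reusing the pebble experiment (the seeded generator is an assumption added for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed, for reproducibility
n = 2**12
pebbles = rng.uniform(-1, 1, size=(2, n))

# x_n = 4 if the pebble lands inside the unit circle, 0 otherwise
x = 4 * (np.linalg.norm(pebbles, axis=0) < 1)

estimate = x.mean()                  # sample mean estimates <x_n> = pi
error = x.std(ddof=1) / np.sqrt(n)   # standard error of the mean
print(f"π ≃ {estimate:.4f} ± {error:.4f}")
```

The error bar shrinks like $1/\sqrt{n}$, a scaling we will come back to when comparing with quadrature methods.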

Integration of multivariable functions¶

  • Can a similar strategy be used to compute integrals of multivariable functions?

  • How does this approach compare to standard quadrature methods?

  • How should the stochastic sampling be done?
