
Numerical Method: Monte Carlo Integration

With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.
– John von Neumann, mathematician, physicist, computer scientist, and polymath

John von Neumann and Stanislaw Ulam invented the Monte Carlo Method.

The Monte Carlo method is a technique that relies on random sampling to obtain numerical results. Monte Carlo integration applies this method to numerically estimate definite integrals.

The Monte Carlo method can be used to estimate integrals that have closed-form solutions, but it truly shines where no closed-form solution exists. In this article, we will mostly work with definite integrals that have analytical solutions so that we can compare the Monte Carlo estimates with the analytical results. We shall, however, solve one integration problem with no closed-form solution.

The Math Behind Monte Carlo Integration

Monte Carlo integration relies on:

  • The Law of Large Numbers
  • The Central Limit Theorem
  • The transformation of a definite integral into an expectation using a probability density function (PDF)

We will not go into the Law of Large Numbers and the Central Limit Theorem in detail; there is an abundance of resources online on these topics. We shall mostly focus on how to transform a definite integral into an expectation using a PDF.
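
Concretely, these two theorems tell us what the estimator is and how fast it converges. For samples X_1, ..., X_n drawn from the PDF h(x), the standard results are:

{1 \over n}\sum_{i=1}^n g(X_i) \to E[g(X)] as n \to \infty (Law of Large Numbers)
{1 \over n}\sum_{i=1}^n g(X_i) \approx E[g(X)] \pm {\sigma \over \sqrt{n}}, where \sigma^2 = Var[g(X)] (Central Limit Theorem)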

Imagine there is a definite integral, \int_Af(x)dx, that you intend to evaluate. By factoring the integrand as f(x) = g(x)h(x), where h(x) is a probability density function, the integral can be transformed into an expectation as follows:

\int_Af(x)dx = \int_Ag(x)h(x)dx = E[g(X); A]

Where h(x) is the PDF and E[g(X); A] is the expected value of g(X) over A, with X drawn from h(x).
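
In code, the recipe is simply: draw samples from the PDF h, evaluate g at those samples, and average. Below is a minimal sketch; the helper name mc_expectation is ours, not a library function:

import numpy as np

def mc_expectation(g, sampler, n=100_000):
    # Estimate E[g(X)], where `sampler` draws n samples from the PDF h.
    samples = sampler(n)
    values = g(samples)
    # Law of Large Numbers: the sample mean converges to the expectation.
    # The second value returned is the standard error, sigma / sqrt(n).
    return np.mean(values), np.std(values) / np.sqrt(n)

For instance, mc_expectation(np.sin, lambda n: np.random.uniform(0, 1, n)) estimates \int_0^1sin(x)dx, which is exactly Example 1 below.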

We will now look at examples. The closed-form solution will be provided with each example where one exists.

Example 1: \int_0^1sin(x)dx = -cos(1) + 1

Before we proceed, it is worth noting that if our integration region is bounded, we shall sample from a uniform distribution. However, if it is unbounded, we shall sample from a normal distribution. An integral is unbounded if one of its limits of integration goes to infinity.

If we are sampling from a uniform distribution on [a, b], the PDF is {1 \over b-a}, while for a standard normal distribution, the PDF is e^{-x^2\over2} * {1\over \sqrt{2\pi}}.
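
As a quick sanity check of these densities, scipy.stats exposes both (note that uniform(loc=a, scale=b-a) parametrizes U(a, b)); a small sketch:

from scipy.stats import norm, uniform

a, b = 0.0, 2.0
print(uniform.pdf(1.0, loc=a, scale=b - a))  # 0.5, i.e. 1/(b-a)
print(norm.pdf(1.0))  # 0.24197..., i.e. exp(-1/2) / sqrt(2*pi)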

Back to example 1. The PDF is h(x) = {1 \over (b-a)} = {1 \over (1-0)} = 1 .
Therefore, \int_0^1sin(x)dx = \int_0^1sin(x)*1dx = E[sin(X); U(0, 1)]

We shall apply the Monte Carlo Integration to this problem:

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm, uniform

estimates = []
stdvs = []

for i in range(1, 200):
    unif = uniform.rvs(size=i*1000)
    expectation = np.sin(unif)
    estimates.append(np.mean(expectation))  # Law of Large Numbers
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))  # standard error of the mean
    
plt.plot([-np.cos(1) + 1] * 200)  # analytical value
plt.plot(estimates, '.')
plt.plot(-np.cos(1) + 1 - 3 * np.array(stdvs))  # lower 3-sigma bound (Central Limit Theorem)
plt.plot(-np.cos(1) + 1 + 3 * np.array(stdvs))  # upper 3-sigma bound (Central Limit Theorem)
plt.annotate('Integral of sin(x) in [0, 1]', xy=(70, 0.475))
plt.annotate('Approx. -> -np.cos(1) + 1', xy=(70, 0.445))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the integral of sin(x) over [0, 1] against sample size, converging to -cos(1) + 1 within the 3-sigma bounds]

In the code snippet, we sampled random variables X from a uniform distribution and averaged sin(X) to estimate the expectation. We can observe from the graph that our Monte Carlo estimate approaches the analytical solution and stays within the 3\sigma band which, by the Central Limit Theorem, covers the true value with roughly 99.7% confidence.
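
As a quick empirical check of that 99.7% figure (a sketch; the coverage is asymptotic, not exact), we can repeat the experiment many times and count how often the analytical value falls inside the 3\sigma band:

true_val = -np.cos(1) + 1
trials, n = 1000, 10_000
hits = 0
for _ in range(trials):
    x = uniform.rvs(size=n)
    vals = np.sin(x)
    est = np.mean(vals)
    se = np.std(vals) / np.sqrt(n)
    hits += abs(est - true_val) <= 3 * se  # estimate within the 3-sigma band?
print(hits / trials)  # should be close to 0.997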

Example 2: \int_0^2sin(2x)dx = {1 \over 2}(-cos(4) + 1)

h(x) = {1 \over (b-a)} = {1 \over (2-0)} = {1 \over 2}
\int_0^2sin(2x)dx = 2*\int_0^2sin(2x)*{1 \over 2}dx = 2 * E[sin(2X); U(0,2)]

estimates = []
stdvs = []

for i in range(1, 200):
    unif = uniform.rvs(size=i*1000) * 2  # U(0,2)
    expectation = 2 * np.sin(2*unif)
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([1/2 *(-np.cos(4) + 1)] * 200)
plt.plot(estimates, '.')
plt.plot(1/2 *(-np.cos(4) + 1) - 3 * np.array(stdvs))
plt.plot(1/2 *(-np.cos(4) + 1)+ 3 * np.array(stdvs))
plt.annotate('Integral of sin(2x) in [0, 2]', xy=(70, 0.9))
plt.annotate('Approx. -> 1 / 2 * (-np.cos(4) + 1)', xy=(70, 0.775))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the integral of sin(2x) over [0, 2] against sample size, with the analytical value and 3-sigma bounds]

Again, our numerical technique safely approximates the analytical solution for this problem.

Example 3: \int_0^2sin^2xdx = 1 - {sin(4) \over 4}

h(x) = {1 \over (b-a)} = {1 \over (2-0)} = {1 \over 2}
\int_0^2sin^2xdx = 2*\int_0^2sin^2x*{1 \over 2}dx = 2 * E[sin^2X; U(0,2)]

estimates = []
stdvs = []

for i in range(1, 200):
    unif = uniform.rvs(size=i*1000) * 2
    expectation = 2 * np.sin(unif)**2
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([(1-np.sin(4)/4)] * 200)
plt.plot(estimates, '.')
plt.plot((1-np.sin(4)/4) - 3 * np.array(stdvs))
plt.plot((1-np.sin(4)/4) + 3 * np.array(stdvs))
plt.annotate('Integral of sin^2(x) in [0, 2]', xy=(70, 1.23))
plt.annotate('Approx. -> (1-np.sin(4)/4)', xy=(70, 1.15))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the integral of sin^2(x) over [0, 2] against sample size, with the analytical value and 3-sigma bounds]

The Monte Carlo integration technique successfully approximates the solution to this problem. Let's see a few more examples with the transformation steps written out before we begin to skip them.

Example 4: \int_{-\infty}^{\infty}e^{-x^2}dx = \sqrt\pi

\int_Ae^{-x^2}dx = \int_Ae^{{-x^2\over2}+{-x^2\over2}}dx = \int_Ae^{-x^2\over2}e^{-x^2\over2}dx = {\sqrt{2\pi}}*\int_Ae^{-x^2\over2}e^{-x^2\over2} * {1\over \sqrt{2\pi}}dx
h(x) = e^{-x^2\over2} * {1\over \sqrt{2\pi}}; \ g(x) = e^{-x^2\over2}
\int_{-\infty}^{\infty}e^{-x^2}dx = {\sqrt{2\pi}} * E[e^{-x^2\over2}; N(0,1)]

estimates = []
stdvs = []

for i in range(1, 200):
    nrm = norm.rvs(size=i*1000)  # sampling from the standard normal distribution N(0, 1)
    expectation = np.sqrt(2*np.pi) * np.exp(-nrm**2/2)
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([np.sqrt(np.pi)] * 200)
plt.plot(estimates, '.')
plt.plot(np.sqrt(np.pi)- 3 * np.array(stdvs))
plt.plot(np.sqrt(np.pi) + 3 * np.array(stdvs))
plt.annotate('Integral of e^-(x^2) in [-inf, inf]', xy=(70, 1.81))
plt.annotate('Approx. -> np.sqrt(np.pi)', xy=(70, 1.74))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the integral of e^(-x^2) over (-inf, inf) against sample size, converging to sqrt(pi) within the 3-sigma bounds]

For this example, our integral is unbounded, so we sampled from a normal distribution. Let's now see an example with no closed-form solution.

Example 5: \int_0^1e^{-x^3}dx has no closed-form solution. In terms of the lower incomplete gamma function it equals {1\over3}\gamma({1\over3}, 1) \approx 0.80751. (The related unbounded integral \int_0^\infty e^{-x^3}dx = {1\over3}\Gamma({1\over3}) \approx 0.89298.)

g(x) = e^{-x^3}; \ h(x) = 1
\int_0^1e^{-x^3}dx = E[e^{-x^3}; U(0, 1)]
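
Since there is no analytical value to plot against, we can first get a reference value from deterministic quadrature; a quick sketch using scipy.integrate.quad:

from scipy.integrate import quad

reference, abserr = quad(lambda x: np.exp(-x**3), 0, 1)
print(reference)  # ~0.80751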

estimates = []
stdvs = []

for i in range(1, 200):
    unf = uniform.rvs(size=i*1000) 
    expectation = np.exp(-unf**3)
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([np.mean(estimates)] * 200)  # no analytical value; use the mean of the estimates as the reference line
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Integral of e^-(x^3) in [0, 1]', xy=(50, .817))
plt.annotate('Approx. -> 0.80751', xy=(50, .794))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the integral of e^(-x^3) over [0, 1] against sample size, with the reference value and 3-sigma bounds]

Let us now work through some double integral examples.

Example 6: \int_0^1\int_0^yxydxdy = {1 \over 8}

Let us see how the transformation is done here. We shall skip this step subsequently.
\int_0^yxydx = y\int_0^yxy{1 \over (y-0)}dx = yE[Xy], where X is drawn from U(0, y)
We transformed \int_0^yxydx into an expectation using the PDF {1 \over y} of U(0, y), changing x to the random variable X.
Next, rewrite X from U(0, y) as yX with X from U(0, 1), so that yE[Xy] = yE[(yX)y] = E[Xy^3].
For the outer integral, the PDF of U(0, 1) is {1\over1-0} = 1, so \int_0^1E[Xy^3]*1dy = E[E[XY^3]] = E[XY^3] by the law of total expectation, with X and Y independent U(0, 1) random variables.
The big trick: whenever an inner variable is uniform on (0, y), rescale it to y times a U(0, 1) variable; this squares the y inside the inner expectation (Xy becomes Xy^2), and the outer integration variable y becomes the random variable Y.
As a check, E[XY^3] = E[X]E[Y^3] = {1\over2}*{1\over4} = {1\over8}, matching the analytical result.

Going forward, the transformation will be shown in the format below:
\int_0^1y*E[Xy]dy = E[y*E[Xy]] = E[XY^3; U(0,1)]
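
Before running the Monte Carlo estimate, we can cross-check the transformed expectation against the original double integral with deterministic quadrature; a quick sketch using scipy.integrate.dblquad:

from scipy.integrate import dblquad

# dblquad(func, a, b, gfun, hfun) integrates func(inner, outer) with the
# outer variable in [a, b] and the inner variable in [gfun(outer), hfun(outer)].
val, err = dblquad(lambda x, y: x * y, 0, 1, lambda y: 0, lambda y: y)
print(val)  # ~0.125, i.e. 1/8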

estimates = []
stdvs = []

for i in range(1, 200):
    unf1 = uniform.rvs(size=i*1000) # X
    unf2 = uniform.rvs(size=i*1000) # Y
    expectation = unf1*unf2**3
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([1/8] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs)) 
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs)) 
plt.annotate('Double Integral of xy in ([0, y], [0,1])', xy=(70, .135))
plt.annotate('Approx. -> 1/8', xy=(70, .115))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([0, y], [0, 1]) against sample size, converging to 1/8]

Example 7: \int_0^1\int_0^{y^2}xydxdy = {1 \over 12}

\int_0^1y^2*E[Xy]dy = E[y^2*E[Xy]] = E[XY^5; U(0,1)]

estimates = []
stdvs = []

for i in range(1, 200):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000) 
    expectation = unf1*unf2**5
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([1/12] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy in ([0, y^2], [0,1])', xy=(60, .093))
plt.annotate('Approx. -> 1/12', xy=(60, .075))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([0, y^2], [0, 1]) against sample size, converging to 1/12]

Example 8: \int_3^5\int_0^{y}xydxdy = 68

\int_3^5y*E[Xy]dy = E[y*E[Xy]]_3^5 = E[XY^3; U(0,1)]_3^5
Here the outer range is no longer [0, 1]. The subscript notation means: split the outer integral as \int_3^5 = \int_0^5 - \int_0^3, then rescale each piece to U(0, 1) using \int_0^bf(y)dy = b*E[f(bY)], so the expectation is evaluated as E[5X(5Y)^3] - E[3X(3Y)^3].
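
As a check against the analytical answer, using E[XY^3] = E[X]E[Y^3] = {1\over8}:
E[5X(5Y)^3] - E[3X(3Y)^3] = (5^4 - 3^4)*E[XY^3] = {625 - 81 \over 8} = 68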

estimates = []
stdvs = []

for i in range(1,200):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000)
    expectation = (5*unf1*(5*unf2)**3-3*unf1*(3*unf2)**3)
    
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([68] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy in ([0, y], [3,5])', xy=(50, 72.5))
plt.annotate('Approx. -> 68', xy=(50, 63))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([0, y], [3, 5]) against sample size, converging to 68]

Example 9: \int_3^5\int_2^{y}xydxdy = 52

\int_3^5(y-2)*E[Xy]dy = E[y*E[Xy]-2*E[Xy]]_3^5 = E[XY^3-4XY; U(0,1)]_3^5
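
A quick check, using E[XY] = {1\over4} and E[XY^3] = {1\over8}:
(E[5X(5Y)^3] - E[4*5X*5Y]) - (E[3X(3Y)^3] - E[4*3X*3Y]) = {625 - 81 \over 8} - 4*{25 - 9 \over 4} = 68 - 16 = 52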

estimates = []
stdvs = []

for i in range(1, 200):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000)
    expectation = (5*unf1*(5*unf2)**3 - 4*5*unf1*5*unf2) - (3*unf1*(3*unf2)**3 - 4*3*unf1*3*unf2)
    
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([52] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy in ([2, y], [3, 5])', xy=(50, 57))
plt.annotate('Approx. -> 52', xy=(50, 48))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([2, y], [3, 5]) against sample size, converging to 52]

Example 10: \int_3^5\int_0^{y^2}xydxdy = {3724 \over 3}

\int_3^5y^2*E[Xy]dy = E[y^2*E[Xy]]_3^5 = E[XY^5; U(0,1)]_3^5

estimates = []
stdvs = []

for i in range(1, 200):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000) 
    expectation = (5*unf1)*(5*unf2)**5 - (3*unf1)*(3*unf2)**5
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([3724/3] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy in ([0, y^2], [3,5])', xy=(50, 1350))
plt.annotate('Approx. -> 3724/3', xy=(50, 1150))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([0, y^2], [3, 5]) against sample size, converging to 3724/3]

Example 11: \int_2^6\int_0^{y}xydxdy = 160

\int_2^6y*E[Xy]dy = E[y*E[Xy]]_2^6 = E[XY^3; U(0,1)]_2^6

estimates = []
stdvs = []

for i in range(1,200):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000)
    expectation = (6*unf1*(6*unf2)**3-2*unf1*(2*unf2)**3)
    
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([160] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy in ([0, y], [2,6])', xy=(50, 170))
plt.annotate('Approx. -> 160', xy=(50, 150))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([0, y], [2, 6]) against sample size, converging to 160]

Example 12: \int_2^6\int_3^{y}xydxdy = 88

\int_2^6(y-3)*E[Xy]dy = E[(y-3)*E[Xy]]_2^6 = E[XY^3 - 9XY; U(0,1)]_2^6

estimates = []
stdvs = []

for i in range(1, 400):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000)
    expectation = (6*unf1*(6*unf2)**3 - 9*6*unf1*6*unf2) - (2*unf1*(2*unf2)**3 - 9*2*unf1*2*unf2)
    
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([88] * 400)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy in ([3, y], [2,6])', xy=(100, 95))
plt.annotate('Approx. -> 88', xy=(100, 80))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy over ([3, y], [2, 6]) against sample size, converging to 88]

Example 13: \int_3^5\int_2^{y}xy^2dxdy = {3343 \over 15}

\int_3^5(y-2)*E[Xy^2]dy = E[(y-2)*E[Xy^2]]_3^5 = E[XY^4-4XY^2; U(0,1)]_3^5

estimates = []
stdvs = []

for i in range(1, 200):
    unf1 = uniform.rvs(size=i*1000)
    unf2 = uniform.rvs(size=i*1000)
    expectation = (5*unf1*(5*unf2)**4 - 4*5*unf1*(5*unf2)**2) - (3*unf1*(3*unf2)**4 - 4*3*unf1*(3*unf2)**2)
    
    estimates.append(np.mean(expectation))
    stdvs.append(np.std(expectation) / np.sqrt(i * 1000))
    
plt.plot([3343/15] * 200)
plt.plot(estimates, '.')
plt.plot(np.mean(estimates)- 3 * np.array(stdvs))
plt.plot(np.mean(estimates)+ 3 * np.array(stdvs))
plt.annotate('Double Integral of xy^2 in ([2, y], [3,5])', xy=(50, 240))
plt.annotate('Approx. -> 3343/15', xy=(50, 210))
plt.xlabel('Sample size (000)')
plt.ylabel('Monte Carlo Estimate')

[Figure: Monte Carlo estimates of the double integral of xy^2 over ([2, y], [3, 5]) against sample size, converging to 3343/15]

Try this: \int_3^5\int_2^{y^2}xy^2dxdy = {112535 \over 21}
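
If you want to verify your answer numerically, here is a quick cross-check with scipy.integrate.dblquad (outer y in [3, 5], inner x in [2, y^2]):

from scipy.integrate import dblquad

# func(inner, outer): integrand x*y^2, with x in [2, y^2] and y in [3, 5]
val, err = dblquad(lambda x, y: x * y**2, 3, 5, lambda y: 2, lambda y: y**2)
print(val)  # ~5358.81, i.e. 112535/21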

Conclusion

In this article, we discussed how to perform Monte Carlo integration by sampling from random variables. One of the most important steps is transforming the integral into an expectation, and we have seen how this can be done for both single and double integrals.