import numpy as np
import matplotlib.pyplot as plt
plt.rcParams.update({"text.usetex": True})
%config InlineBackend.figure_format = "svg"
from ipywidgets import interactive
In this lecture, we analyze how drawing random states can help us determine the properties of a system. In particular, we will approximate $\pi$ through the darts experiment: you "hit" a random point in a 2D square and check whether the dart lands inside an inscribed circle or outside it. This experiment is, in essence, an approximation of the area enclosed by the circle.
Let's first generate a large number of coordinate pairs $(x, y) \in [-1, 1]^2$. The $x$ and $y$ coordinates are drawn uniformly: each value between $-1$ and $1$ is equally likely. The fraction of points that land inside the unit circle approximates the ratio of the circle's area to the square's area, $\frac{N_t}{N} \approx \frac{\pi}{4}$, where $N_t$ is the number of samples inside the circle and $N$ is the total number of samples. This yields the estimate $\pi \approx 4\frac{N_t}{N}$, as shown below.
N = 100000
v = np.random.uniform(-1, 1, size=(N, 2))
x = v[:, 0]; y = v[:, 1]
# plt.plot(x, y, '.k', markersize=1); plt.xlabel('X'); plt.ylabel('Y')
# ax = plt.gca(); ax.set_aspect(1)
# plt.plot([-1, 1, 1, -1, -1], [-1, -1, 1, 1, -1], '-k')
# theta = np.linspace(0, 2*np.pi, 100); plt.plot(np.cos(theta), np.sin(theta), '-r')
r2 = x**2 + y**2            # squared distance from the origin
nt = np.sum(r2 < 1)         # number of samples inside the unit circle
p = nt / N
pi_est = 4 * p
print("Estimated pi: %f\t error = %f" % (pi_est, np.abs(np.pi - pi_est)))
Estimated pi: 3.127680 error = 0.013913
N_a = np.logspace(1, 7, 80, dtype=int)
for N in N_a:
    v = np.random.uniform(-1, 1, size=(N, 2))
    x = v[:, 0]; y = v[:, 1]
    r2 = x**2 + y**2
    nt = np.sum(r2 < 1)
    p = nt / N
    pi_est = 4 * p
    plt.plot(np.log10(N), np.log10(np.abs(np.pi - pi_est)), 'ok', markersize=6)
# reference line with slope -1/2
s = np.linspace(1, 7)
plt.plot(s, -1/2 * s, '-r')
plt.xlabel(r'$\log N$'); plt.ylabel(r'$\log |\pi - \pi_e|$')
The figure above demonstrates how the absolute error in the estimate of $\pi$ converges as the number of samples drawn increases. We observe that $|\pi - \pi_e| \sim N^{-1/2}$.
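The $N^{-1/2}$ rate can be understood from binomial statistics: the hit count $N_t$ is a Binomial$(N, \pi/4)$ random variable, so the standard deviation of the estimator $4 N_t / N$ is $4\sqrt{p(1-p)/N}$ with $p = \pi/4$. The sketch below (not part of the lecture code; it repeats the darts experiment many times with a seeded generator) compares this theoretical spread with the empirical spread of the estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000        # samples per experiment
trials = 200       # number of repeated experiments

# Repeat the darts experiment `trials` times and collect the estimates.
estimates = np.empty(trials)
for t in range(trials):
    v = rng.uniform(-1, 1, size=(N, 2))
    estimates[t] = 4 * np.mean(v[:, 0]**2 + v[:, 1]**2 < 1)

p = np.pi / 4                                # true hit probability
theoretical = 4 * np.sqrt(p * (1 - p) / N)   # std of 4 * Binomial(N, p) / N
empirical = estimates.std()
print("theoretical std: %.5f   empirical std: %.5f" % (theoretical, empirical))
```

Doubling the precision therefore requires four times as many samples, which is exactly the slope of $-1/2$ seen in the log-log plot.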
Next, we consider Buffon's needle experiment. Parallel lines are spaced a distance $a$ apart, and we toss a needle of length $b$ onto the floor. We are interested in the probability that the needle crosses a line; for $b \le a$ this probability is $P = \frac{2b}{\pi a}$, so the experiment also yields $\pi$ indirectly, and the outcome naturally depends on the aspect ratio $b/a$. The snippet below draws two uniform random numbers per toss: one sets the orientation angle of the needle with respect to the lines, and the other sets the (scaled) distance of the needle's centre from the nearest line. The crossing condition is checked for each toss, and the fraction of samples satisfying it is compared to the total number of trials.
N_a = np.logspace(1, 6, 80, dtype=int)
b = 0.6   # needle length
a = 1.0   # line spacing
for N in N_a:
    v = np.random.uniform(0, 1, size=(N, 2))
    r1 = v[:, 0]   # centre-to-line distance, in units of a/2
    r2 = v[:, 1]   # orientation angle theta = pi * r2
    e = r1 - b/a * np.sin(r2 * np.pi)
    nt = np.sum(e < 0)     # number of tosses that cross a line
    p = nt / N
    pi_est = 2 * b / (p * a)
    plt.plot(np.log10(N), np.log10(np.abs(np.pi - pi_est)), 'ok', markersize=6)
# reference line with slope -1/2
s = np.linspace(1, 6)
plt.plot(s, -1/2 * s, '-r')
plt.xlabel(r'$\log N$'); plt.ylabel(r'$\log |\pi - \pi_e|$')
The experiment above also provides an estimate of $\pi$, albeit through a slightly less intuitive problem, and its error again decays at the same $N^{-1/2}$ rate.
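The crossing probability used in the code can be checked without any random sampling. Averaging the condition $r_1 < \frac{b}{a}\sin(\pi r_2)$ over uniform $r_1$ gives $P = \int_0^1 \min\!\left(1, \frac{b}{a}\sin(\pi r)\right) dr$, which equals $\frac{2b}{\pi a}$ whenever $b \le a$ (the $\min$ never clips). The sketch below, a supplementary check using a simple midpoint-rule quadrature, verifies this numerically for the values of $b$ and $a$ used above.

```python
import numpy as np

b, a = 0.6, 1.0

# Midpoint-rule quadrature of P = integral_0^1 min(1, (b/a) sin(pi r)) dr
n = 100_000
r = (np.arange(n) + 0.5) / n                          # midpoints on [0, 1]
P_quad = np.mean(np.minimum(1.0, b/a * np.sin(np.pi * r)))

P_closed = 2 * b / (np.pi * a)                        # closed form for b <= a
print("quadrature: %.6f   closed form: %.6f" % (P_quad, P_closed))
```

Inverting $P = \frac{2b}{\pi a}$ for $\pi$ is exactly the step `pi_est = 2*b/(p*a)` in the simulation.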