780.20: 1094 Session 11

Handouts: gaussian_random.cpp and random_walk.cpp printouts, Monte Carlo excerpt from Landau/Paez chap. 7.

Today we'll play some games with the GSL random number generators.

Your goals for today are laid out in the sections below.

Please work in pairs (more or less). The instructors will bounce around 1094 and answer questions.


Random Number Generation

The program gaussian_random.cpp calls GSL routines to generate both uniformly distributed and gaussian distributed numbers.

  1. Look at the gaussian_random.cpp code (there is a printout) and identify where the random number generators are allocated, seeded, and called. Make a "gaussian_random" project and compile, link, and run the code to generate pairs of uniformly and gaussian distributed numbers in random_numbers.dat. (A sketch of the relevant GSL calls follows this list.)
  2. Devise and carry out a way to use gnuplot to roughly check that the random numbers are uniformly distributed. [Hint: Your eye is a good judge of nonuniformity.] What did you do?


  3. You can check the distributions more quantitatively by making histograms of the random numbers. Think about how you would do that. Then take a look at gaussian_random_new.cpp, which has added histogramming (as well as automatic seeding). [Feel free to criticize the implementation!] Create a "gaussian_random_new" project and add gaussian_random_new.cpp and random_seed.cpp. Compile and run with about 100,000 points. Look at random_histogram.dat. Use gnuplot to plot appropriate columns (with appropriate ranges of y) to check the uniform and gaussian distributions. Do they look random?


  4. Run gaussian_random_new.plt to plot and fit the gaussian distributions with gnuplot. Try 1,000,000 points and 10,000 points. Do you reproduce the parameters of the gaussian distribution? Attach a plot. (You may need to set b to a reasonable starting value, such as the approximate peak height, to get a useful fit.)


  5. Take a look at (and run) rolling_dice.nb to see how to do random numbers in Mathematica. [For your information only.]
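For reference, here is a minimal, self-contained sketch of the kind of GSL calls involved in allocating, seeding, and calling the generators and in histogramming the results. It is not the handout's gaussian_random_new.cpp; the generator type, fixed seed, number of bins, and output format here are only illustrative.

  // Minimal sketch of GSL random-number and histogram calls
  // (illustrative only; not the handout's gaussian_random_new.cpp).
  // Compile and link with, e.g.:  g++ rng_sketch.cpp -lgsl -lgslcblas
  #include <cstdio>              // for stdout used by gsl_histogram_fprintf
  #include <iostream>
  #include <gsl/gsl_rng.h>       // GSL random number generators
  #include <gsl/gsl_randist.h>   // GSL distributions (gaussian, etc.)
  #include <gsl/gsl_histogram.h> // GSL histograms

  int main ()
  {
    gsl_rng *rng = gsl_rng_alloc (gsl_rng_taus);  // allocate a generator
    gsl_rng_set (rng, 12345);                     // seed it (fixed seed here)

    // histogram with 50 bins covering [-5,5] for the gaussian numbers
    gsl_histogram *hist = gsl_histogram_alloc (50);
    gsl_histogram_set_ranges_uniform (hist, -5., 5.);

    const double sigma = 1.;          // width of the gaussian distribution
    for (int i = 0; i < 100000; i++)
    {
      double uniform = gsl_rng_uniform (rng);        // uniform on [0,1)
      double gauss = gsl_ran_gaussian (rng, sigma);  // gaussian, mean 0
      gsl_histogram_increment (hist, gauss);         // bin the gaussian value
      if (i < 5)                                     // print a few pairs
        std::cout << uniform << " " << gauss << std::endl;
    }

    gsl_histogram_fprintf (stdout, hist, "%g", "%g"); // bin edges and counts
    gsl_histogram_free (hist);
    gsl_rng_free (rng);
    return 0;
  }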

Random Walking

We'll generate random walks in two dimensions using method 2 from the list in Section 11.3 of the Session 11 notes. In particular, we'll start at the origin (x,y) = (0,0) and at each step select Delta_x at random in the range [-sqrt(2), sqrt(2)] and Delta_y in the same range, so positive and negative steps in each direction are equally likely. The code random_walk.cpp implements this plan.
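Before you open the printout, here is a bare-bones sketch of one way to implement these steps with the GSL uniform generator. The handout's random_walk.cpp is organized differently (and uses random_seed.cpp rather than a fixed seed), so treat this only as an illustration of mapping uniform [0,1) numbers onto [-sqrt(2), sqrt(2)).

  // Sketch of a 2-d random walk with each step component uniform in
  // [-sqrt(2), sqrt(2)) (illustrative only; not the handout's random_walk.cpp).
  // Compile and link with, e.g.:  g++ walk_sketch.cpp -lgsl -lgslcblas
  #include <iostream>
  #include <cmath>
  #include <gsl/gsl_rng.h>

  int main ()
  {
    const double sqrt2 = std::sqrt (2.);
    const int num_steps = 6400;

    gsl_rng *rng = gsl_rng_alloc (gsl_rng_taus);
    gsl_rng_set (rng, 314159);      // fixed seed for illustration

    double x = 0., y = 0.;          // start at the origin
    for (int step = 0; step < num_steps; step++)
    {
      // map uniform [0,1) onto [-sqrt(2), sqrt(2)) for each coordinate
      x += sqrt2 * (2. * gsl_rng_uniform (rng) - 1.);
      y += sqrt2 * (2. * gsl_rng_uniform (rng) - 1.);
      std::cout << x << " " << y << std::endl;   // one line per step
    }

    std::cout << "# final R = " << std::sqrt (x*x + y*y) << std::endl;
    gsl_rng_free (rng);
    return 0;
  }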

  1. Why do the ranges for x and y go up to sqrt(2) instead of 1? [Don't spend a long time on this!]


  2. Look at the random_walk.cpp code and identify where the random number generator is allocated, seeded, and called. Create a "random_walk" project and add random_walk.cpp and random_seed.cpp. Generate a random walk of 6400 steps.
  3. Plot the random walk (stored in "random_walk.dat") using gnuplot (use "with lines" to connect the points). Repeat a couple of times to get some intuition for what the walks look like.
  4. Check (using an editor) the endpoints of a few walks. Roughly how does the distance R from the origin scale with N? (Can you reproduce the derivation from the notes of how R scales with N?)


  5. Now we'll study more systematically how the final distance from the origin R = sqrt(x_final^2 + y_final^2) scales with the number of steps N. Note that now we don't need to save anything from a run except the value of R. The value of R will fluctuate from run to run, so for each N we want to average over a number of trials. How many trials? The Landau/Paez book claims about sqrt(N) trials; can you account for this?


    Edit the code to make sqrt(N) runs for each value of N and take the average of R. Make (and attach) an appropriate plot that reveals the dependence of R on N. [The code random_walk_length.cpp and plot file random_walk_length.plt implement this task. Try it yourself before looking at those.] Does it agree with expectations?



Monte Carlo Integration: Uniform Sampling

Your goal is to calculate the 10-dimensional integral of

  (x1 + x2 + x3 + ... + x10)^2
where each of the variables ranges from 0 to 1. The exact answer is 155/6. The basic Monte Carlo method is described in Section 7.12 (see handout). In particular, equations (7.14) and (7.16) show that the integral is given approximately by the range(s) times the average of the function evaluated at N random vectors. (So for a 5-dimensional integral, each vector is a set of 5 random numbers {x1,x2,x3,x4,x5}.)

[If you are running short of time, use mc_integration.cpp and mc_integration.plt below instead of writing your own. You'll need an "mc_integration" project that includes random_seed.cpp as well as mc_integration.cpp.]
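If you do write your own, the core of the uniform-sampling estimate could look something like the sketch below: since the unit hypercube has volume 1, the estimate is simply the average of (x1 + ... + x10)^2 over N random 10-dimensional vectors, as in equation (7.16). The names and fixed seed are only illustrative, and the averaging over 16 trials (step 2 below) is left to you.

  // Sketch of uniform-sampling Monte Carlo for the 10-d integral of
  // (x1 + ... + x10)^2 over the unit hypercube (illustrative only).
  // Compile and link with, e.g.:  g++ mc_sketch.cpp -lgsl -lgslcblas
  #include <iostream>
  #include <gsl/gsl_rng.h>

  int main ()
  {
    const int dim = 10;             // number of integration variables
    const int N = 100000;           // number of random vectors

    gsl_rng *rng = gsl_rng_alloc (gsl_rng_taus);
    gsl_rng_set (rng, 271828);      // fixed seed for illustration

    double sum = 0.;
    for (int i = 0; i < N; i++)
    {
      double x_sum = 0.;
      for (int j = 0; j < dim; j++)
        x_sum += gsl_rng_uniform (rng);   // each x_j uniform on [0,1)
      sum += x_sum * x_sum;               // integrand (x1 + ... + x10)^2
    }

    // the unit hypercube has volume 1, so the estimate is just the average
    double estimate = sum / double(N);
    std::cout << "estimate = " << estimate
              << "   exact = " << 155./6. << std::endl;

    gsl_rng_free (rng);
    return 0;
  }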

  1. Modify gaussian_random.cpp or random_walk.cpp (copy one of them to mc_int.cpp) to make a Monte Carlo estimate of the 10-dimensional integral using N vectors (where N is input as in random_walk.cpp). Run it a few times with different seeds.
  2. Now for each value of N, make 16 trials and take the average as your official estimate of the integral.
  3. Try sample sizes of N = 10, 20, 40, ... (i.e., from 10 up to some suitably large number, increasing by factors of 2).
  4. Plot the absolute value of the error versus N on a log-log plot and try to identify the approximate dependence.



Monte Carlo Integration: GSL Routines

Run a test program to do the same integral as in the last section, but with vegas and miser. (A sketch of the relevant GSL calls follows the list below.)

  1. Take a look at the program gsl_monte_carlo_test.cpp while also looking at the Monte Carlo integration section of the online GSL documentation. Make a project gsl_monte_carlo_test (you'll need random_seed.cpp again).
  2. The integral above is not a great test. After running the program, change the integrand to something more interesting (your choice!). Don't worry about knowing the exact answer; compare the results from the different routines. What do you find?
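For orientation, here is a stripped-down sketch of how the GSL miser and vegas routines are typically called for this integral. gsl_monte_carlo_test.cpp will differ in its details (seeding, output, convergence checks), so use this only as a guide to the API.

  // Sketch of calling the GSL miser and vegas Monte Carlo integrators on
  // the 10-d integral of (x1 + ... + x10)^2 (illustrative only).
  // Compile and link with, e.g.:  g++ gsl_mc_sketch.cpp -lgsl -lgslcblas
  #include <iostream>
  #include <gsl/gsl_rng.h>
  #include <gsl/gsl_monte.h>
  #include <gsl/gsl_monte_miser.h>
  #include <gsl/gsl_monte_vegas.h>

  // integrand in the form GSL expects: f(x, dim, params)
  double integrand (double *x, size_t dim, void *params)
  {
    double sum = 0.;
    for (size_t j = 0; j < dim; j++)
      sum += x[j];
    return sum * sum;
  }

  int main ()
  {
    const size_t dim = 10;
    double lower[dim], upper[dim];          // integration limits
    for (size_t j = 0; j < dim; j++)
    {
      lower[j] = 0.;
      upper[j] = 1.;
    }

    gsl_monte_function F = { &integrand, dim, 0 };  // function, dimension, params
    gsl_rng *rng = gsl_rng_alloc (gsl_rng_taus);
    gsl_rng_set (rng, 161803);              // fixed seed for illustration
    double result = 0., error = 0.;
    const size_t calls = 500000;            // random points per routine

    gsl_monte_miser_state *miser = gsl_monte_miser_alloc (dim);
    gsl_monte_miser_integrate (&F, lower, upper, dim, calls, rng, miser,
                               &result, &error);
    gsl_monte_miser_free (miser);
    std::cout << "miser: " << result << " +/- " << error << std::endl;

    gsl_monte_vegas_state *vegas = gsl_monte_vegas_alloc (dim);
    gsl_monte_vegas_integrate (&F, lower, upper, dim, calls, rng, vegas,
                               &result, &error);
    gsl_monte_vegas_free (vegas);
    std::cout << "vegas: " << result << " +/- " << error << std::endl;

    gsl_rng_free (rng);
    return 0;
  }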




C++ Class for a Random Walk

The random walk code random_walk.cpp is basically written as a C code with C++ input and output. Here we reimplement the code as a C++ class (originally in anticipation of using it in a Qt application).
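To fix ideas before looking at the printout, a class along these lines might look roughly like the sketch below. The names, interface, and internal choices here are invented for illustration; the real interface is the one defined in RandomWalk.h.

  // Hypothetical sketch of a random-walk class (names and interface are
  // invented; the handout's actual interface is in RandomWalk.h).
  // Compile and link with, e.g.:  g++ walk_class_sketch.cpp -lgsl -lgslcblas
  #include <iostream>
  #include <cmath>
  #include <gsl/gsl_rng.h>

  class RandomWalkSketch
  {
    public:
      RandomWalkSketch (double step_max_in, unsigned long int seed)
        : x (0.), y (0.), step_max (step_max_in)
      {
        rng = gsl_rng_alloc (gsl_rng_taus);   // the object owns its generator
        gsl_rng_set (rng, seed);
      }
      ~RandomWalkSketch () { gsl_rng_free (rng); }

      void step ()   // take one step, each component uniform in [-step_max, step_max)
      {
        x += step_max * (2. * gsl_rng_uniform (rng) - 1.);
        y += step_max * (2. * gsl_rng_uniform (rng) - 1.);
      }
      double get_x () const { return x; }
      double get_y () const { return y; }

    private:
      double x, y;          // current position
      double step_max;      // maximum step size in each coordinate
      gsl_rng *rng;         // GSL generator (copying is not handled here)
  };

  int main ()
  {
    RandomWalkSketch walk (std::sqrt (2.), 12345);
    for (int i = 0; i < 6400; i++)
    {
      walk.step ();
      std::cout << walk.get_x () << " " << walk.get_y () << std::endl;
    }
    return 0;
  }

One design point worth raising when you compare with the handout's class in step 2 is whether the generator should be owned by the class (as in this sketch) or passed in from outside.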

  1. In the RandomWalk subdirectory, create a project RandomWalk_test and add RandomWalk_test.cpp, RandomWalk.cpp, RandomWalk.h, and random_seed.cpp. Compile and run it to generate "RandomWalk_test.dat", which you should plot with gnuplot to verify that the output looks the same as from random_walk.cpp.
  2. Compare the old and new code (you have printouts of each). Discuss with your partner the advantages (and any disadvantages) of the definition of RandomWalk as a class. List some.


  3. An advantage of programming with classes is the ease of extending or generalizing a code. List two ways to extend the class definition.


  4. For the experts (or anyone willing to try!), modify the code to do the following:


780.20: 1094 Session 11. Last modified: 10:11 am, May 01, 2008.
furnstahl.1@osu.edu