Monte Carlo methods have been used for centuries, but only in the past several decades has the technique gained the status of a full-fledged numerical method capable of addressing the most complex applications. The Monte Carlo method may be thought of as similar to a political poll, where a carefully selected statistical sample is used to predict the behavior or characteristics of a large group.
Enrico Fermi used Monte Carlo methods in the 1930s to calculate neutron diffusion, and later designed the Fermiac, a mechanical Monte Carlo device used to calculate criticality in nuclear reactors.
In the 1940s, von Neumann developed a formal foundation for the Monte Carlo method, establishing the mathematical basis for probability density functions (PDFs), inverse cumulative distribution functions (CDFs), and pseudorandom number generators. The work was done in collaboration with Stanislaw Ulam, who recognized how important the digital computer would be to the implementation of the approach.
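The inverse-CDF idea is simple to state: if U is uniform on [0, 1), then F⁻¹(U) is distributed according to the PDF whose CDF is F. A minimal sketch, using an exponential distribution as an illustrative example (the function name and parameters here are ours, not from any historical code):

```python
import math
import random

def sample_exponential(rate, n, seed=0):
    """Draw n samples from an exponential distribution by inverse-CDF sampling.

    PDF: f(x) = rate * exp(-rate * x); CDF: F(x) = 1 - exp(-rate * x).
    Inverting F gives x = -ln(1 - u) / rate for u uniform on [0, 1).
    """
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

samples = sample_exponential(rate=2.0, n=100_000)
mean = sum(samples) / len(samples)   # should be near 1/rate = 0.5
```

The same recipe works for any distribution whose CDF can be inverted, which is why it became a standard building block of Monte Carlo transport codes.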
Before digital computers were available to the labs, "computer" was a job title. Parallel computing was done by rows and columns of mathematicians. The applications, which arose mostly from the Manhattan Project, included design of shielding for reactors.
Uses of Monte Carlo methods have been many and varied since that time. In the late 1950s and 1960s, the method was tested in a variety of engineering fields. At that time, even simple problems were compute-bound, and many complex problems remained intractable through the seventies. With the advent of high-speed supercomputers, the field has received renewed attention, particularly for parallel algorithms, which offer much higher execution rates.
"All the pieces of the puzzle" have now come into confluence for large-scale Monte Carlo analysis. First, supercomputers are sufficiently powerful to simulate very large engineering and physics systems, involving thousands of surfaces and billions of particle emissions. Second, a comprehensive formulation for material properties exists in the aggregate of several models. Third, an estimate of the number of trials required to achieve a specified level of accuracy is now obtainable prior to execution, which makes possible a formulation in which the number of emissions evolves dynamically as the simulation proceeds. Finally, a number of investigators have effectively vectorized diverse Monte Carlo transport algorithms, providing a sufficient base from which to establish a synthesized approach. We now even have a quantitative model for assessing the degree of parallelism and the amount of overhead required. Moreover, with the emergence of lagged Fibonacci generators, parallelization at any granularity appears to be straightforward to implement and robust.
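A lagged Fibonacci generator produces its next value from two earlier values in the stream, x_n = (x_{n-j} + x_{n-k}) mod m, keeping only the last k values in a circular buffer. A minimal sketch; the lags (5, 17) and the LCG seeding below are illustrative choices, not the parameters of any particular production generator:

```python
class LaggedFibonacci:
    """Additive lagged Fibonacci generator: x_n = (x_{n-j} + x_{n-k}) mod m.

    The lags (j, k) = (5, 17) and the LCG-based seeding are illustrative
    assumptions; production generators use larger lags and careful seeding.
    """
    def __init__(self, seed=12345, j=5, k=17, m=2**32):
        self.j, self.k, self.m = j, k, m
        self.buf, state = [], seed
        for _ in range(k):                      # fill the lag table with an LCG
            state = (1103515245 * state + 12345) % m
            self.buf.append(state)
        self.i = 0                              # index of the oldest entry, x_{n-k}

    def next_uint(self):
        j, k, m = self.j, self.k, self.m
        x = (self.buf[self.i] + self.buf[(self.i + k - j) % k]) % m
        self.buf[self.i] = x                    # newest value overwrites the oldest
        self.i = (self.i + 1) % k
        return x

    def random(self):
        """Uniform float in [0, 1)."""
        return self.next_uint() / self.m
```

Because each generator's state is just its own lag table, each processor in a parallel run can carry an independently seeded instance with no communication, which is one reason these generators lend themselves to parallelization at any granularity.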
Today's applications of Monte Carlo methods include cancer therapy, traffic flow, Dow Jones forecasting, and oil well exploration, as well as more traditional physics applications like stellar evolution, reactor design, and quantum chromodynamics. Monte Carlo methods are also widely used in the modeling of materials and chemicals, from grain-growth modeling in metallic alloys, to the behavior of nanostructures and polymers, to protein structure prediction.
Glioblastoma multiforme is an especially virulent type of brain cancer that affects about 7,000 Americans each year. Life expectancy is limited, with fewer than three percent surviving beyond five years. Under a U.S. Food and Drug Administration protocol, the U.S. Department of Energy's Brookhaven National Laboratory has been conducting a multi-patient clinical trial of an experimental treatment called boron neutron capture therapy (BNCT).
The Boron Neutron Capture Therapy program developed by INEEL uses MCNP for neutron source engineering calculations (designing the reactor that generates the beam) and a similar Monte Carlo solution scheme in their radiation dosimetry and treatment planning calculations.
It has been demonstrated that a typical BNCT neutron source engineering input deck could complete its calculations in 19.35 minutes using 1,024 processors on the ORNL Intel Paragon XPS-150 massively parallel computer. The identical calculation required over 4.7 days (6,800 minutes) to execute on a DEC Alpha workstation at INEEL, a speed-up of over 350 times. Furthermore, it was demonstrated that the actual neutron transport calculations scaled linearly with the number of processors. These results may prove very significant for broader use of the MCNP code, which had not been used for treatment calculations because it was not fast enough. Being able to do the calculations in under an hour would make it feasible to use MCNP for clinical irradiations.
Peregrine, developed at LLNL, is another 3D Monte Carlo program for radiation therapy in the treatment of cancer.