Computational Challenges and Opportunities of Simulating Cosmic Ray Showers at Global Scale

Olesya Sarajlic, Georgia State University, Department of Physics and Astronomy, P.O. Box 5060, Atlanta, Georgia 30302-5060. osarajlic@gmail.com
Xiaochun He, Georgia State University, Department of Physics and Astronomy, P.O. Box 5060, Atlanta, Georgia 30302-5060. xhe@gsu.edu
Semir Sarajlic, Georgia Institute of Technology, Partnership for an Advanced Computing Environment, 258 Fourth Street NW, Atlanta, Georgia 30332-0700. semir.sarajlic@oit.gatech.edu
Ting-Cun Wei, Northwestern Polytechnical University, School of Computer Science and Engineering, No. 1 Dongxiang Rd., Chang’an District, Xi’an 710129, Shaanxi, PR China. weitc@nwpu.edu.cn
Abstract.

Galactic cosmic rays are high-energy particles that stream into our solar system from distant corners of our Galaxy, along with lower-energy particles from the Sun that are associated with solar flares. The Earth's atmosphere serves as an ideal detector for high-energy cosmic rays, which interact with the nuclei of air molecules and produce extensive air showers. In recent years, there has been growing interest in applications of cosmic ray measurements, ranging from space and terrestrial weather monitoring to homeland security based on cosmic ray muon tomography and the assessment of radiation exposure during air travel. A simulation program (based on the GEANT4 software package developed at CERN) has been developed at Georgia State University for studying cosmic ray showers in the atmosphere. The results of this simulation study will provide unprecedented knowledge of the geo-position-dependent cosmic ray shower profiles and significantly enhance the applicability of cosmic ray measurements. In this paper, we present the computational challenges and opportunities of carrying out cosmic ray shower simulations at the global scale using various computing resources, including XSEDE.

Keywords: XSEDE; GEANT4; ECRS; cosmic rays; geomagnetic field
PEARC '18: Practice and Experience in Advanced Research Computing, July 22–26, 2018, Pittsburgh, PA, USA. ACM copyright 2018. DOI: 10.1145/3219104.3229281. ISBN: 978-1-4503-6446-1/18/07. CCS Concepts: Applied computing (Physics); Computing methodologies (Modeling and simulation).

1. Introduction

Galactic cosmic rays are high-energy particles that stream into our solar system from distant corners of our Galaxy; some low-energy particles come from the Sun and are associated with solar flares. The primary cosmic ray particles are mainly energetic protons (79%) and alpha particles (about 14%), originating from supernova explosions and other astrophysical events [1, 2]. The primary cosmic ray particles interact with the molecules in the atmosphere and produce showers of secondary particles (mainly pions) at about 15 km altitude. These pions decay into muons, which are the dominant cosmic ray particle radiation (about 80%) at the surface of the Earth.

Over the past decades, numerous studies have reported correlations between dynamical changes in Earth's weather patterns and the cosmic ray flux variations measured at the surface [3, 4, 5, 6]. In recent years, other interesting applications of cosmic ray measurements have emerged, including cosmic ray muon tomography for homeland security, volcanic activity monitoring, and nuclear reactor core monitoring [7, 8].

The Nuclear Physics Group at Georgia State University (GSU) [9] is currently developing novel, low-cost, portable cosmic ray detectors to be distributed around the world. One of the main goals of this project is to measure the cosmic ray radiation at the surface of the Earth simultaneously at the global scale in order to study the dynamical changes of the upper troposphere and the lower stratosphere. The success of this global measurement could lead to an unprecedented, accurate weather forecasting system on both short and long time scales. The project presents two computing-related challenges. One is the need to monitor and collect data from the detector nodes in a world-wide cosmic ray detector network. The other is the systematic simulation of cosmic ray shower development in the atmosphere with a variable geomagnetic field and atmospheric air density. To address the second challenge, a GEANT4-based cosmic ray shower simulation (ECRS) [10] has been developed to model cosmic ray showers in the Earth's atmosphere. The results of this simulation study will provide unprecedented knowledge of the geo-position-dependent cosmic ray shower profiles and significantly enhance the applicability of cosmic ray measurements.

The GEANT4 software package [11, 12] is widely applied in the fields of high energy, nuclear, and accelerator physics, as well as in the medical and space sciences. The main goal of the ECRS simulation is to perform an extensive study of solar, geomagnetic field, temperature, and barometric pressure effects on cosmic ray showers in the atmosphere.

In this study, we discuss the computational challenges of tracking all produced particles in each event through the whole depth of the atmosphere and of sampling enough events to obtain statistically meaningful results. We compare benchmarks of our analysis across the computing resources available to us, which include desktop workstations, the campus cluster at GSU, the computing facility at Brookhaven National Laboratory (BNL), and Bridges at the Pittsburgh Supercomputing Center (PSC) [13].

In the following sections, we present an overview of the ECRS simulation and the use of computing resources to achieve statistically meaningful results. The details of the ECRS simulation setup are given in Section 2. In Section 3, we show our results from running the simulation on various computing resources. Section 4 outlines the preliminary results from this study and the scalability of the ECRS simulation across these resources. A brief summary and outlook is given in Section 5.

2. ECRS Simulation Setup

The ECRS simulation includes a realistic implementation of atmospheric air composition and density according to the US Standard Atmosphere model [14] and a time-dependent geomagnetic field that reflects varying solar activity [15, 16]. The Earth's atmosphere in the ECRS model is divided into 100 layers in order to properly parameterize the air density variation as a function of altitude. The atmospheric air consists of 78.09% N2, 20.95% O2, 0.93% Ar, and 0.03% CO2. The Earth itself is represented by an 11-km-thick shell of water, which allows one to study the cosmic ray radiation level at ocean depths.
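As an illustration of this layering, the sketch below builds a 100-layer density profile from a simple exponential barometric model. This is a simplified stand-in for the US Standard Atmosphere tables used in ECRS; the 8.5 km scale height and the 100 km ceiling are assumptions for the sketch, not values taken from the actual implementation.

```python
import math

N_LAYERS = 100          # ECRS divides the atmosphere into 100 layers
TOP_KM = 100.0          # assumed top of the modeled atmosphere [km]
RHO_0 = 1.225           # sea-level air density [kg/m^3]
SCALE_HEIGHT_KM = 8.5   # illustrative density scale height [km]

def layer_density(i):
    """Mean air density of layer i (0 = lowest), evaluated at the layer midpoint."""
    thickness = TOP_KM / N_LAYERS
    mid_altitude = (i + 0.5) * thickness
    return RHO_0 * math.exp(-mid_altitude / SCALE_HEIGHT_KM)

# Each layer would be handed to the geometry builder as a shell of uniform density.
for i in (0, 24, 49, 99):
    print(f"layer {i:3d}: {layer_density(i):.3e} kg/m^3")
```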

The geomagnetic field implemented in ECRS consists of an internal and an external component. The internal field is given by the International Geomagnetic Reference Field (IGRF) model [15], and the external field uses the well-established Tsyganenko models [16]. Figure 1 shows both the internal and external field lines surrounding the Earth. The internal field is fairly symmetric, while the Sun-facing side of the external field is compressed by the solar wind and its tail extends far into space.

Figure 1. Visualization of the magnetic field lines around the Earth as implemented in ECRS.

To emphasize the complex structure of the magnetic field, Figure 2 shows its effect on the paths of low-energy incoming protons. This intricacy of the field drives up the computation time needed to track many of these particles in the simulation.

Figure 2. (color online) The effect of the geomagnetic field around the Earth. The motion of the proton on a magnetic shell is plotted in red, and the Earth is represented in blue. Top panel: top view of the motion of 100 MeV protons on a geomagnetic shell over 60 seconds. Bottom panel: side view of the motion of 10 MeV protons on a geomagnetic shell over 60 seconds.
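The cost of tracking these trajectories can be appreciated from a minimal numerical model. The sketch below steps a proton through a centered-dipole approximation of the internal field using the standard relativistic Boris integrator. ECRS itself transports particles through the full IGRF/Tsyganenko field via GEANT4, so the field model, step size, and launch condition here are illustrative simplifications.

```python
import numpy as np

RE = 6.371e6           # Earth radius [m]
B0 = 3.07e-5           # equatorial surface field strength [T]
Q = 1.602176634e-19    # proton charge [C]
M = 1.67262192e-27     # proton mass [kg]
C = 2.99792458e8       # speed of light [m/s]

def dipole_field(r):
    """Centered-dipole approximation to the internal geomagnetic field."""
    rmag = np.linalg.norm(r)
    rhat = r / rmag
    m_hat = np.array([0.0, 0.0, -1.0])   # dipole axis, pointing geographic south
    return B0 * (RE / rmag) ** 3 * (3.0 * np.dot(m_hat, rhat) * rhat - m_hat)

def boris_step(r, u, dt):
    """One relativistic Boris push; u = gamma * v, no electric field."""
    gamma = np.sqrt(1.0 + np.dot(u, u) / C**2)
    t = (Q * dt / (2.0 * M * gamma)) * dipole_field(r)
    s = 2.0 * t / (1.0 + np.dot(t, t))
    u_new = u + np.cross(u + np.cross(u, t), s)
    gamma_new = np.sqrt(1.0 + np.dot(u_new, u_new) / C**2)
    return r + dt * u_new / gamma_new, u_new

# A 100 MeV proton started at 1.2 Earth radii on the magnetic equator.
gamma0 = 1.0 + (100.0 / 938.272)                  # kinetic energy / rest energy
v0 = C * np.sqrt(1.0 - 1.0 / gamma0**2)
r, u = np.array([1.2 * RE, 0.0, 0.0]), gamma0 * np.array([0.0, v0, 0.0])
for _ in range(100_000):                          # tiny steps: this is why it is slow
    r, u = boris_step(r, u, 1e-4)
print(f"final radius: {np.linalg.norm(r) / RE:.3f} Earth radii")
```

Even this toy setup needs tens of steps per gyration, and a 100 MeV proton completes thousands of gyrations per minute of simulated motion, which hints at why field-on tracking dominates the event time.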

The primary cosmic ray particles that we are interested in studying in this project are protons with energies below 100 GeV, which dominate the primary cosmic ray spectrum, as shown in Fig. 3.

Figure 3. (color online) Flux of primary cosmic ray protons (particles per unit energy) as a function of energy.
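Primary energies in this range can be drawn by inverse-transform sampling of a power-law spectrum, as sketched below. The differential spectral index of 2.7 is the commonly quoted value for primaries below the knee; it is an assumption here, not the exact spectrum sampled by ECRS.

```python
import random

GAMMA = 2.7                 # assumed differential spectral index, dN/dE ~ E^-GAMMA
E_MIN, E_MAX = 1.0, 100.0   # primary proton energy range of interest [GeV]

def sample_primary_energy():
    """Inverse-transform sample from a truncated power-law spectrum."""
    a = 1.0 - GAMMA
    u = random.random()
    return (E_MIN**a + u * (E_MAX**a - E_MIN**a)) ** (1.0 / a)

energies = [sample_primary_energy() for _ in range(100_000)]
print(f"median: {sorted(energies)[50_000]:.2f} GeV")  # most events are low-energy
```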

Figure 4 shows a cosmic ray shower event display from the ECRS simulation produced by a single 50 GeV proton. As shown in Fig. 4, most of the secondary particles are produced at around 15 km altitude, a few km higher than the typical cruise altitude of commercial air travel. For this reason, the study of cosmic ray shower activity is also important for understanding the health hazard to flight crews.

Figure 4. Cosmic ray shower event display from a 50 GeV primary proton launched toward the polar region. The blue, red, and green color trajectories represent positive, negative, and neutral (gamma) particles, respectively. The curved trajectories are due to the magnetic field effect.

A typical event, as shown in Fig. 4, takes on average about 10 minutes to complete in a Linux desktop environment. Carrying out the cosmic ray shower simulation with statistically meaningful results is computationally demanding considering the following factors:

  • Accumulating a large number of cosmic ray shower events at a given geo-position (i.e., geo-magnetic field variation) with variable atmospheric air density profiles;

  • Tracking low-energy cosmic ray shower particles at the Earth-size scale (i.e., computing time consumption);

  • Outputting extensive shower particle information produced in the atmosphere for offline data analysis.

The ECRS simulation is intrinsically parallel at the per-event level: events can run independently of one another on different compute resources. ECRS is a well-optimized code that utilizes 100% of the CPU throughout the duration of an event run, which results in 100% utilization of the resources reserved via a workload scheduler; a minimal sketch of this per-event parallelism is given below. In the following sections, we demonstrate the scalability of the ECRS simulation from a personal computer to a small institutional cluster and on to national resources at XSEDE and BNL.
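The sketch below illustrates this per-event parallelism by fanning independent event batches out to worker processes. The ecrs executable name, its command-line flags, and the one-seed-per-batch convention are hypothetical stand-ins for the actual batch setup.

```python
import subprocess
from multiprocessing import Pool

def run_batch(args):
    """Run one independent ECRS batch; binary name and flags are hypothetical."""
    seed, n_events = args
    return subprocess.run(
        ["./ecrs", "--seed", str(seed), "--events", str(n_events),
         "--output", f"shower_{seed}.root"],
        capture_output=True,
    ).returncode

if __name__ == "__main__":
    # 8 independent batches of 100 events each; seeds must differ per batch.
    jobs = [(seed, 100) for seed in range(1, 9)]
    with Pool(processes=8) as pool:
        print(pool.map(run_batch, jobs))  # 0 on success for each batch
```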

3. Computational Challenges of Running ECRS Simulation

3.1. Desktop Computer

In order to provide a reference for assessing the computing resources in XSEDE, we ran ECRS on a high-end desktop machine (Mac Pro: 3.5 GHz 6-core Intel Xeon E5 with 64 GB RAM) by launching cosmic rays toward 33.75° North, 264.39° East from an altitude of 1.2 Earth radii. The CPU execution time (i.e., event time) per cosmic ray event as a function of the primary particle energy is shown in Fig. 5, with and without the geomagnetic field. As expected, tracking particles in the geomagnetic field takes much longer: for a 60 GeV proton, an event completes in about 9 seconds on average without the geomagnetic field, compared to about 700 seconds with it.

It is also interesting to note that very little CPU time is needed for primary energies below 15 GeV when the geomagnetic field is enabled. In other words, the geomagnetic field deflects low-energy primary cosmic ray particles away before they enter the Earth's atmosphere. Because the geomagnetic field is non-uniform and asymmetric, one needs to run the ECRS simulation at each location in order to properly take the field effect into account. This ultimately creates the computational challenge of carrying out these simulations with reasonably achievable statistical accuracy.

Figure 5. Scatter plot of event time vs. the incident primary particle energy. The cosmic ray particles are launched from 1.2 Earth radii toward the center of the Earth at the 33.75° North, 264.39° East geo-position. Top panel: cosmic ray shower simulation without the geomagnetic field. Bottom panel: cosmic ray shower simulation with the geomagnetic field.
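This low-energy behavior is often summarized by the Stormer vertical cutoff rigidity, which falls off as the fourth power of the cosine of the geomagnetic latitude. The sketch below uses the textbook dipole approximation (about 14.9 GV at the equator). ECRS does not use this shortcut, since it tracks particles through the full field, but the formula shows the strong latitude dependence of the minimum energy a vertically arriving primary needs; oblique arrival directions shift the cutoff further.

```python
import math

M_P = 0.938272   # proton rest energy [GeV]

def vertical_cutoff_rigidity(geomag_lat_deg):
    """Stormer vertical cutoff rigidity [GV] in the dipole approximation."""
    return 14.9 * math.cos(math.radians(geomag_lat_deg)) ** 4

def cutoff_kinetic_energy(geomag_lat_deg):
    """Minimum proton kinetic energy [GeV] to arrive vertically at the atmosphere."""
    pc = vertical_cutoff_rigidity(geomag_lat_deg)   # for Z = 1, pc [GeV] = R [GV]
    return math.sqrt(pc**2 + M_P**2) - M_P

for lat in (0, 30, 60, 90):
    print(f"geomagnetic latitude {lat:2d}: cutoff ~ {cutoff_kinetic_energy(lat):6.2f} GeV")
```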

3.2. GSU Cluster

GSU's Orion [17] is a heterogeneous Linux cluster comprising 360 cores, 4.25 TB of RAM, and 87 TB of NFS storage, with the LSF workload manager for scheduling batch and interactive jobs. This is an ideal system for testing ECRS simulation performance by submitting many batch jobs in parallel to multiple nodes in order to achieve higher statistics.

For this test, we used a total of 182,105 CPU hours between April 2016 and July 2016, as shown in Fig. 6, which accounted for 32.3% of overall cluster utilization during that period (see Open XDMoD [18] for Georgia State University: http://xdmod.rs.gsu.edu). While this computing resource was not sufficient for achieving the required event statistics, we were able to successfully run our ECRS simulation in a shared cluster environment. We then turned to larger national resources to supplement our computing needs.

Figure 6. (color online) CPU hours used by the Nuclear Physics Group user (SP00013725) on Orion between 04/01/2016 and 07/31/2016: 182,105.6 CPU hours in total.

3.3. RHIC Computing Facility

In order to qualitatively explore the magnetic field effect on cosmic ray shower development in the atmosphere, we ran the ECRS simulation on a computing farm (well over 10,000 nodes) at the RHIC Computing Facility at BNL, launching primary cosmic rays from 1.2 Earth radii toward the surface of the Earth at 10-degree increments in latitude and longitude (the sketch below illustrates this launch grid). This simulation exercise was divided into 6,840 batch jobs and took more than two weeks to complete at 1,000 events per batch job. Figure 7 shows the distributions of the ionizing particle radiation (including protons, neutrons, muons, electrons, and gamma rays) that reached sea level.
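In the grid sketched below, 10-degree steps give 19 x 36 = 684 geo-positions, which is consistent with the 6,840 batch jobs if each position was covered by ten 1,000-event jobs. Whether both poles are included in the actual ECRS grid is our assumption here.

```python
import itertools

def launch_grid(dlat=10, dlon=10):
    """Geo-position grid at fixed increments; pole handling is an assumption."""
    lats = range(-90, 91, dlat)   # 19 latitudes, poles included
    lons = range(0, 360, dlon)    # 36 longitudes
    return list(itertools.product(lats, lons))

positions = launch_grid()
print(len(positions))             # 684 positions -> 6,840 jobs at 10 jobs each
```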

Figure 7. Left panel: Latitude versus longitude distributions of particles that reached the surface of the Earth (with the magnetic field implemented). Right panel: 3D display of the global particle distribution at the surface with the magnetic field implemented.

While the geo-position-dependent cosmic ray shower particle distributions at the surface of the Earth are clearly visible in Fig. 7, the statistics are still too limited to quantify this variation for any of the interesting applications mentioned above.

3.4. XSEDE Computing Resources

Through the XSEDE Campus Champion (GEO150002) and startup allocation (PHY160043) grants, we gained access to the Bridges cluster at the Pittsburgh Supercomputing Center (PSC). Figures 8 and 9 show the total SUs charged under both XSEDE grants. We ran our simulation on the Regular Shared Memory (RSM) nodes, HPE Apollo 2000 servers with two Intel Xeon E5-2695 v3 CPUs (14 cores per CPU), 128 GB RAM, and 8 TB of on-node storage.

Figure 8. (color online) Service units charged from XSEDE grants by allocation: 55% from GEO150002 and 45% from PHY160043.

Similar to the ECRS simulation setup described in Section 3.3, we launched the primary cosmic ray particles over 4π steradians with 10-degree increments in both geographic latitude and longitude. Given the extended CPU time needed per event with the geomagnetic field on, each job was limited to 500 events in order to complete within the 48-hour wall-time limit on XSEDE Bridges. From 10/01/2016 to 03/10/2017, as shown in Figure 9, we consumed a total of 1,150,868 SUs (186,675.9 CPU hours) running these simulation jobs, which greatly exceeded the total SU allocation of the combined PHY160043 and GEO150002 projects. One major reason for the large SU consumption was the 48-hour limit per batch job: when the sampled primary particle energy is very high, tracking all the low-energy secondary particles produced in the shower takes a tremendous amount of CPU time, which can prevent the batch job from completing its full event count. When this happened, we had to re-submit the incomplete batch job in order to meet the required statistics at each geo-position. We automated this with a script that scans the job outputs and resubmits incomplete jobs from the point where the initial job stopped; a sketch follows below.
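A minimal version of that resubmission scan is sketched below. The log format (lines of the form "event N done"), the job-script name, and the --first-event flag are hypothetical; the sbatch call assumes the SLURM scheduler used on Bridges.

```python
import re
import subprocess
from pathlib import Path

EVENTS_PER_JOB = 500   # sized to fit the 48-hour wall-time limit on Bridges

def completed_events(log: Path) -> int:
    """Count finished events in a job log; the 'event N done' format is assumed."""
    done = re.findall(r"event (\d+) done", log.read_text())
    return max((int(n) for n in done), default=0)

def resubmit_incomplete(log_dir: str) -> None:
    for log in Path(log_dir).glob("job_*.log"):
        n = completed_events(log)
        if n < EVENTS_PER_JOB:
            # Restart from where the original job stopped (hypothetical flags).
            subprocess.run(["sbatch", "run_ecrs.sh", log.stem,
                            "--first-event", str(n + 1)], check=True)

resubmit_incomplete("logs/")
```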

For meaningful statistical accuracy, one needs to simulate more than 10,000 events at each geo-position. This would require an XSEDE allocation of more than 2 million SUs for running the ECRS simulation to accumulate statistically accurate results at the global scale. We will continue exploring computing opportunities at XSEDE in our future work.
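A back-of-the-envelope version of this allocation estimate is shown below. The average event time is a placeholder (field-on events in Section 3.1 ranged from seconds to about 700 seconds), and SU accounting on Bridges did not map one-to-one onto CPU hours, so the result is an order-of-magnitude check rather than a quote.

```python
positions = 684                 # 10-degree grid from Section 3.3
events_per_position = 10_000    # target statistics per geo-position
avg_sec_per_event = 700         # placeholder: worst-case field-on event time

cpu_hours = positions * events_per_position * avg_sec_per_event / 3600
print(f"~{cpu_hours:,.0f} CPU hours")   # ~1.3 million at these assumptions
```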

Figure 9. (color online) Service units charged by user on XSEDE between 10/01/2016 and 03/10/2017, which is roughly 200,000 CPU hours.

4. Results and Discussion

In this section, we highlight some of the important results of the cosmic ray shower simulations obtained with the computing resources described above. As shown in the bottom panel of Fig. 5, more CPU time is required for greater primary particle energies (approximately a linear relationship above the cutoff energy). As seen in Fig. 3, most cosmic ray events have lower energies and thus require less CPU time to complete. However, a small fraction of events with energies of tens of GeV consumes most of the total CPU time of the simulation batch jobs. We observed this trend across all of the computing resources studied so far.

One of the innovative features of the ECRS simulation is that one can simulate cosmic ray shower activity simultaneously at the global scale, as shown in Fig. 7. It is impossible to complete this task in a single desktop or small cluster environment. However, based on our initial tests with national resources at BNL and XSEDE, it is entirely feasible to carry out the ECRS simulations with large statistics if more resources can be allocated. As an example, one can track all particle species produced in the cosmic ray showers at each geo-position throughout the whole atmosphere, as shown in Fig. 10. This study is important for using cosmic ray muons and neutrons to determine the effective atmospheric temperature at higher altitudes (>6 km).

Figure 10. (color online) Distributions of the secondary cosmic ray particles in the atmosphere as a function of altitude: muons (blue curve), neutrons (magenta curve), electrons (black curve), and gamma ray photons (green curve). The color code matches the color scheme of Fig. 7.
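Offline, a profile like Fig. 10 reduces to binning the tracked particles by species and production altitude. The sketch below assumes a flat CSV export with species and altitude_km columns; the actual ECRS output format is not documented here.

```python
import csv
from collections import Counter

BIN_KM = 2.0   # altitude bin width [km]

def altitude_profile(path):
    """Per-species particle counts in altitude bins; the CSV schema is assumed."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):   # assumed columns: species, altitude_km
            alt_bin = int(float(row["altitude_km"]) / BIN_KM)
            counts[(row["species"], alt_bin)] += 1
    return counts

profile = altitude_profile("shower_particles.csv")
for (species, b), n in sorted(profile.items())[:10]:
    print(f"{species:8s} {b * BIN_KM:5.1f}-{(b + 1) * BIN_KM:5.1f} km: {n:,d}")
```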

The possibility of carrying out the ECRS simulation with large statistics also allows us to study the cosmic ray radiation budget (i.e., mainly the muon and neutron particle fluxes) at the surface of the Earth at the global scale simultaneously. Figure 11 shows the particle energy distributions of muons, neutrons, electrons, and photons for a given geo-position as an example. This information is important because it can be compared with results from cosmic ray detectors installed at different locations on the surface of the Earth (see the World Data Center for Cosmic Rays: http://cidas.isee.nagoya-u.ac.jp/WDCCR/).

Figure 11. (color online) Particle energy distributions of muons (blue curve), neutrons (magenta curve), electrons (black curve), and gamma ray photons (green curve).

Another challenge of the ECRS simulation studies is managing the simulated data and analyzing the output. This is especially true when running the ECRS simulation for different solar cycles and variable atmospheric profiles to explore long-term variations of the cosmic ray flux, which could be important for climate change studies. An example of this type of analysis is the lateral distribution of cosmic ray muons and neutrons in the polar region as a function of the primary cosmic ray proton energy, shown in Fig. 12.

Figure 12. (color online) Muon (top panel) and neutron (bottom panel) lateral distributions (in candle/box plot) in the polar region generated by different ranges of primary energies: 4 GeV - 10 GeV (red curve), 10 GeV - 15 GeV (brown curve), 15 GeV - 30 GeV (orange curve), 30 GeV - 50 GeV (green curve), 50 GeV - 70 GeV (blue curve), and 70 GeV - 100 GeV (magenta curve).

Based on our preliminary study of the ECRS simulation on XSEDE Bridges, we are highly encouraged that many of the important simulation tasks for exploring cosmic ray applications can be accomplished. While continuing to optimize the ECRS simulation software, we would like to acquire additional XSEDE resource allocations to carry out focused simulations specifically associated with global weather forecasting.

5. Summary and Future Work

In this paper, we briefly described the ECRS simulation software that has been developed at GSU for studying cosmic ray shower characteristics in the atmosphere at the global scale. We showed preliminary results of running ECRS in different computing environments, including a personal workstation, the GSU cluster, the RHIC Computing Facility, and XSEDE. This approach closely followed the HPC model at Georgia State, in which we start with local or personal resources, then scale to an institutional cluster and, as needs grow, to national resources [19]. The results of this simulation are of great importance for many practical applications ranging from weather forecasting to muon tomography and cosmic-ray-related health effects.

Based on our initial studies obtaining secondary cosmic ray particle distributions throughout the atmosphere, we see great opportunities for accomplishing important cosmic ray shower studies using XSEDE resources, which can aid many of the studies associated with cosmic ray applications. We also see the challenges of using XSEDE, not only for carrying out the ECRS simulation with high statistics but also for managing and analyzing the simulated data; we developed workflows for the latter during our initial work under XSEDE's startup allocation. We plan to continue using XSEDE for focused simulations in the near future, which we will pursue through a Research Allocation from the XSEDE Resource Allocations Committee (XRAC).

Acknowledgements.
We acknowledge the use of Georgia State’s research computing resources that are supported by Georgia State’s Research Solutions. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. We acknowledge the use of XSEDE resources via Campus Champion Grant - GEO150002 and Startup allocation - PHY160043.

References

  • [1] J. Beringer et al., Cosmic Rays. Phys. Rev. D, 86, 010001, 2012.
  • [2] L. Dorman, Cosmic Ray Variations. State Publishing House, Moscow, Russia, 1957.
  • [3] J. Kirkby, Cosmic Rays and Climate. Surv. Geophys., 28, 333-375, 2007. 10.1007/s10712-008-9030-6.
  • [4] Q. Lu, Correlation between Cosmic Rays and Ozone Depletion. Phys. Rev. Lett., 102(11), 118501, Mar. 2009. http://link.aps.org/doi/10.1103/PhysRevLett.102.118501.
  • [5] A. Ollila, Changes in cosmic ray fluxes improve correlation to global warming. Int. J. Phys. Sci., 7, 822-826, 2012. 10.5897/IJPS11.1484.
  • [6] N. J. Shaviv, On climate response to changes in the cosmic ray flux and radiative budget. J. Geophys. Res., 110, A08105, 2005. 10.1029/2004JA010866.
  • [7] K. Morishima et al., Discovery of a big void in Khufu's Pyramid by observation of cosmic-ray muons. Nature, Dec. 2017.
  • [8] G. Zumerle et al., The Cosmic Muon Tomography (CMT) Project. Istituto Nazionale di Fisica Nucleare (INFN), Legnaro National Laboratories (LNL), 2018. http://mutomweb.pd.infn.it.
  • [9] Nuclear Physics Group at GSU. Georgia State University, 2018. http://phynp6.phy-astr.gsu.edu/.
  • [10] H. Sanjeewa, X. He, and C. Cleven, Air shower development simulation program for the cosmic ray study. Nucl. Instrum. Meth. B, 261, 918-921, 2007. 10.1016/j.nimb.2007.04.281.
  • [11] S. Agostinelli et al., Geant4: A Simulation Toolkit. Nucl. Instrum. Meth. A, 506(3), 250-303, 2003.
  • [12] J. Allison et al., Geant4 developments and applications. IEEE Transactions on Nuclear Science, 53(1), 270-278, 2006.
  • [13] J. Towns, T. Cockerill, M. Dahan, I. Foster, K. Gaither, A. Grimshaw, V. Hazlewood, S. Lathrop, D. Lifka, G. D. Peterson, R. Roskies, J. R. Scott, and N. Wilkins-Diehr, XSEDE: Accelerating Scientific Discovery. Computing in Science and Engineering, 16(5), 62-74, Sept. 2014. 10.1109/MCSE.2014.80.
  • [14] D. R. Lide, CRC Handbook of Chemistry and Physics, 76th Edition. CRC Press, 1995.
  • [15] C. C. Finlay, S. Maus, C. D. Beggan, T. N. Bondar, and A. Chambodut, International Geomagnetic Reference Field: the eleventh generation. Geophys. J. Int., 183, 1216-1230, 2010. 10.1111/j.1365-246X.2010.04804.x.
  • [16] N. A. Tsyganenko and M. I. Sitnov, Magnetospheric configurations from a high-resolution data-based magnetic field model. J. Geophys. Res., 112, A06225, 2007. 10.1029/2007JA012260.
  • [17] S. Sarajlic, N. Edirisinghe, Y. Lukinov, M. Walters, B. Davis, and G. Faroux, Orion: Discovery Environment for HPC Research and Bridging XSEDE Resources. ACM, New York, NY, USA, 2016. http://doi.acm.org/10.1145/2949550.2952770.
  • [18] J. T. Palmer, S. M. Gallo, T. R. Furlani, M. D. Jones, R. L. DeLeon, J. P. White, N. Simakov, A. Patra, J. Sperhac, T. Yearke, R. Rathsam, M. Innus, C. D. Cornelius, J. Browne, W. Barth, and R. Evans, Open XDMoD: A Tool for the Comprehensive Management of High-Performance Computing Resources. Computing in Science & Engineering, 17(4), 2015. https://xdmod.ccr.buffalo.edu/.
  • [19] S. Sarajlic, N. Edirisinghe, Y. Wu, Y. Jiang, and G. Faroux, Training-based Workforce Development in Advanced Computing for Research and Education (ACoRE). ACM, 2017. https://doi.org/10.1145/3093338.3104178.