A New Public Release of the GIZMO Code
We describe a major update to the public GIZMO code. GIZMO has been used in simulations of cosmology; galaxy and star formation and evolution; black hole accretion and feedback; proto-stellar disk dynamics and planet formation; fluid dynamics and plasma physics; dust-gas dynamics; giant impacts and solid-body interactions; collisionless gravitational dynamics; and more. This release of the public code supports: hydrodynamics (using various mesh-free finite-volume Godunov methods or SPH), ideal and non-ideal MHD, anisotropic conduction and viscosity, radiative cooling and chemistry, star and black hole formation and feedback, sink particles, dust-gas (aero)-dynamics (with or without magnetic fields), elastic/plastic dynamics, arbitrary (gas, stellar, degenerate, solid/liquid material) equations of state, passive scalar/turbulent diffusion, large-eddy and shearing boxes, self-gravity with fully-adaptive force softenings, arbitrary cosmological expansion, and on-the-fly group-finding. It is massively parallel, with hybrid MPI+OpenMP scaling verified up to a million threads. The code is extensively documented, with test problems and tutorials provided for these different physics modules.
Keywords: methods: numerical — hydrodynamics — instabilities — turbulence — planets: formation — stars: formation — galaxies: formation — cosmology: theory
This is an announcement and very brief description of the new public release of the GIZMO code, available at:
The simulation code GIZMO (Hopkins, 2015) is a flexible, arbitrary Lagrangian-Eulerian multi-method code, with a wide range of different physics modules described below. The code is descended from P-GADGET and GADGET-2 (Springel, 2005), and is fully-compatible with GADGET analysis codes, snapshots, and initial conditions.
Documentation: GIZMO is extensively documented. Users should read the User Guide, bundled with the code. This contains detailed descriptions of all code modules, compiler flags, run-time parameters, initial conditions files, snapshots, etc., as well as instructions for building or analyzing your own simulations (and links to many public codes which can generate initial conditions or visualize outputs). Dozens of test problems with complete tutorials, setup, and initial conditions are provided in the User Guide.
Compatibility: GIZMO is written in standard C, and uses only widely-available public libraries. The portability of the code has been confirmed on a large number of systems, ranging from NSF’s Comet, Stampede, and Blue Waters, NASA’s Pleiades, and DOE’s Titan and Mira super-clusters, to Mac and Linux laptops.
Scaling and Parallelism: GIZMO is a massively-parallel code which uses a hybrid OpenMP+MPI architecture to efficiently scale to large numbers of CPU cores and/or threads. While code scaling is always highly problem-dependent, actual production problems (e.g. large cosmological hydrodynamic simulations with star formation and magnetic fields) have achieved near-ideal strong or weak scaling up to a million threads.
Development Code: Some modules (for example, radiation-hydrodynamics) are not yet in the public code because they are in active development and not yet debugged or tested at the level required for use “out of the box” (they will be made public as soon as this stage is reached). Other modules (e.g. the FIRE project feedback physics) are not public because they involve proprietary code developed by others with their own collaboration policies, which must be respected (see the User Guide for details).
License: The public version of the code is free software, distributed under the GNU General Public License. Read the User Guide for more details. The authors retain their copyright on the code. The development version of the code is private and can only be accessed or shared with the explicit permission of the authors.
2 Physics in the Public Code
GIZMO is a modular, multi-physics code; some examples of different GIZMO simulations are shown in Fig. 1. This public release adds support for a range of physics including (but not limited to):
Hydrodynamics using any of several fundamentally different methods (e.g. new Lagrangian finite-element Godunov schemes, or various “flavors” of smoothed particle hydrodynamics, or Eulerian fixed-grid schemes). The fluid “mesh” can be assigned any arbitrary motion the user desires (or allowed to move with the fluid).
Ideal and non-ideal magneto-hydrodynamics (MHD), including Ohmic resistivity, ambipolar diffusion, and the Hall effect. Coefficients can be set by-hand or calculated self-consistently from the code chemistry.
Radiative heating and cooling, including pre-built libraries for photo-ionization, photo-electric, dust, Compton, Bremsstrahlung, recombination, fine-structure, molecular, and species-by-species metal-line cooling, as well as hooks for popular external chemistry and cooling libraries.
Star formation & feedback on galactic scales (for galaxy-formation or ISM studies) or individual-star scales (for IMF studies), including conversion of gas into sink particles according to various user-defined criteria (e.g. density, self-gravity, molecular thresholds). This includes mass return and feedback with resolved mass, metal, and energy input into the surrounding ISM, or various popular sub-grid galactic wind models.
Black holes including on-the-fly formation and seeding with user-defined criteria (based on local or group properties), mergers, growth via various sub-grid accretion models (with or without models for an un-resolved accretion disk), or via explicitly resolved gravitational capture, and “feedback” scaled with the accretion.
Elastic & plastic dynamics, with support for arbitrary Tillotson-type equations-of-state for solid, liquid, or vapor media, negative-pressure media, anisotropic deviatoric stresses and plasticity, and various pre-defined material properties.
Arbitrary equations-of-state (EOSs), including support for trivially-specifiable modular EOSs, multi-fluid systems, and pre-programmed EOSs for stellar systems or degenerate objects (Helmholtz-type EOSs), as well as solid/liquid/vapor mixtures (Tillotson-type EOSs).
Sink particles, with dynamical accretion and formation properties, and user-specified formation conditions from local gas or group properties.
Anisotropic conduction and viscosity: “real” Navier-Stokes or fully-anisotropic Spitzer-Braginskii conduction and viscosity, with dynamically calculated or arbitrarily chosen coefficients.
Dust-gas mixtures, e.g. aerodynamically coupled grains or other particles. The treatment is flexible and can handle both sub and super-sonic cases, compressible fluids, and grain-gas back-reaction, with arbitrary dust drag laws (Epstein, Stokes, Coulomb drag) and Lorentz forces on charged grains from magnetic fields.
Self-gravity for arbitrary geometries and boundary conditions, with fully-adaptive Lagrangian gravitational softenings for all fluids and particle types. Arbitrary analytic potentials can be added trivially.
Turbulent eddy diffusion of passive scalars and dynamical quantities (Smagorinsky-type diffusion for sub-grid-scale turbulence).
Shearing boxes (stratified or unstratified), “large-eddy simulations” (driven turbulent boxes), periodic, open, or reflecting boundary conditions are supported.
Cosmological integrations (both large-volume and “zoom-in” cosmological simulations), with support for “non-standard” cosmologies including dynamical dark energy equations-of-state or arbitrarily time-dependent expansion histories or gravitational constants.
Group-finding and/or power-spectrum computation, run on-the-fly, for arbitrary combinations of “target” species (e.g. halo or galaxy or cluster-finding).
Particle splitting/merging according to arbitrary user-defined criteria, to allow for super-Lagrangian “hyper-refinement” simulations.
3 Code Scaling & Performance
GIZMO employs a hybrid MPI+OpenMP parallelization strategy with a flexible domain decomposition and hierarchical adaptive timesteps (together with a large number of optimizations for different problems and physics), which together enable it to scale efficiently on massively-parallel systems with problem sizes up to and beyond billions of resolution elements (Hopkins et al., 2017). Code scalings are always highly problem- and resolution-dependent, but illustrative examples of scalings for real “production” simulations (run on the DOE Titan and Mira clusters) are shown in Fig. 2.
4 Arbitrary (Moving) Meshes
GIZMO is multi-method in that the “mesh” over which the fluid equations are solved can be specified with arbitrary motion or geometry (or lack thereof). GIZMO can be run as a Lagrangian mesh-free finite-volume code (where the mesh moves with the fluid), or as an SPH code, or as a fixed Cartesian grid code (similar to codes like ATHENA and ZEUS). More generally, users can specify any background mesh motion they desire (e.g. shearing, expanding, collapsing, accelerating, or differentially rotating boxes).
5 Thanks & Acknowledgments
We thank Volker Springel both for his personal mentoring, and for writing GADGET, without which GIZMO would not exist. We also thank the large number of GIZMO code developers who have contributed to the content in the public code, especially Daniel Anglés-Alcázar, Xiangcheng Ma, Shea Garrison-Kimmel, Mike Grudić, Alessandro Lupi, Robert Thompson, Qirong Zhu, Hongping Deng, Dušan Kereš, and Paul Torrey.
Support for PFH and GIZMO development was provided by an Alfred P. Sloan Research Fellowship, NSF Collaborative Research Grant #1715847 and CAREER grant #1455342, Caltech compute cluster “Wheeler,” allocations from XSEDE TG-AST130039 and PRAC NSF.1713353 supported by the NSF, and NASA HEC SMD-16-7592.
- Davé et al. (2016) Davé, R., Thompson, R., & Hopkins, P. F. 2016, MNRAS, 462, 3265
- Deng et al. (2017) Deng, H., Reinhardt, C., Benitez, F., Mayer, L., & Stadel, J. 2017, ApJ, in press, arXiv:1711.04589
- Grudić et al. (2016) Grudić, M. Y., Hopkins, P. F., Faucher-Giguère, C.-A., Quataert, E., Murray, N., & Kereš, D. 2016, MNRAS, in press, arXiv:1612.05635
- Hopkins (2015) Hopkins, P. F. 2015, MNRAS, 450, 53
- Hopkins (2017) —. 2017, MNRAS, 466, 3387
- Hopkins & Raives (2016) Hopkins, P. F., & Raives, M. J. 2016, MNRAS, 455, 51
- Hopkins et al. (2016) Hopkins, P. F., Torrey, P., Faucher-Giguère, C.-A., Quataert, E., & Murray, N. 2016, MNRAS, 458, 816
- Hopkins et al. (2017) Hopkins, P. F., et al. 2017, MNRAS, in press, arXiv:1702.06148
- Lee et al. (2017) Lee, H., Hopkins, P. F., & Squire, J. 2017, MNRAS, 469, 3532
- Raives (2016) Raives, M. J. 2016, private communication
- Springel (2005) Springel, V. 2005, MNRAS, 364, 1105
- Su et al. (2017) Su, K.-Y., Hopkins, P. F., Hayward, C. C., Faucher-Giguère, C.-A., Kereš, D., Ma, X., & Robles, V. H. 2017, MNRAS, 471, 144
- Wetzel et al. (2016) Wetzel, A. R., Hopkins, P. F., Kim, J.-h., Faucher-Giguère, C.-A., Kereš, D., & Quataert, E. 2016, ApJL, 827, L23