Light editing and cleanups in the docs
lgarrison committed Jul 28, 2020
1 parent c3468c5 commit 64ff7ac
Showing 9 changed files with 155 additions and 170 deletions.
4 changes: 1 addition & 3 deletions docs/abacus.rst
@@ -1,8 +1,6 @@
Abacus
=======

-Here we briefly describe the Abacus N-body code.
-
What is Abacus?
---------------

@@ -12,7 +10,7 @@ clustered simulations. It is extremely fast: we clock over 30
million particle updates per second on commodity dual-Xeon, dual-GPU
computers and nearly 70 million particle updates per second on each
node of the Summit supercomputer. But it is also extremely accurate:
-typical force accuracy is below 1e-5 and we are using global
+typical force accuracy is below :math:`10^{-5}` and we are using global
timesteps, so the leapfrog timesteps away from the cluster cores
are much smaller than the dynamical time.

14 changes: 8 additions & 6 deletions docs/abacussummit.rst
@@ -9,24 +9,26 @@ was run on the `Summit <https://www.olcf.ornl.gov/summit/>`_ supercomputer at th
Computing Facility under a time allocation from the Department of Energy's ALCC program.

Most of the simulations in AbacusSummit are 6912\ :sup:`3` = 330 billion
-particles in 2 Gpc/h volume, yielding a particle mass of about 2e9 Msun/h.
+particles in 2 Gpc/*h* volume, yielding a particle mass of about :math:`2\times 10^9\ \mathrm{M}_\odot/h`.
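
As a quick sanity check of that particle mass (a sketch; the critical-density constant and the Planck2018-like :math:`\Omega_m \approx 0.315` are assumptions for illustration, not values taken from this page):

.. code-block:: python

   # Check the ~2e9 Msun/h particle mass quoted above.
   RHO_CRIT = 2.775e11  # critical density today, (Msun/h) / (Mpc/h)^3
   omega_m = 0.315      # Planck2018-like total matter density (assumed)

   n_part = 6912**3     # particles in a base box
   box = 2000.0         # box side, Mpc/h

   m_part = omega_m * RHO_CRIT * box**3 / n_part
   print(f"{m_part:.2e} Msun/h")  # ~2.1e9, consistent with the text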

-AbacusSummit consists of over 140 of these simulations, plus other smaller simulations,
-totaling about 50 trillion
+AbacusSummit consists of over 140 of these simulations, plus other larger and smaller simulations,
+totaling about 60 trillion
particles. Detailed specifications of the :doc:`simulations` and :doc:`cosmologies`
are available on other pages.

Key portions of the suite are:

-* A primary Planck2018 LCDM cosmology with 25 base simulations (330 billion particles in 2 Gpc/h).
+* A primary Planck2018 LCDM cosmology with 25 base simulations (330 billion particles in 2 Gpc/*h*).

* Four secondary cosmologies with 6 base simulations, phase matched to the first 6 of the primary boxes.

-* A grid of 79 other cosmologies, each with 1 phase-matched base simulation, to support interpolation in an 8-dimensional parameter space, including w0, wa, Neff, and running of the spectral index.
+* A grid of 79 other cosmologies, each with 1 phase-matched base simulation, to support interpolation in an 8-dimensional parameter space, including :math:`w_0`, :math:`w_a`, :math:`N_\mathrm{eff}`, and running of the spectral index.

* A suite of 1800 small boxes at the base mass resolution to support covariance estimation.

* Other base simulations to match the cosmology of external flagship simulations and to explore the effects of our neutrino approximation.

-* A 6x higher mass resolution simulation of the primary cosmology to allow study of group finding, and a large-volume 27x lower mass resolution simulation of the primary cosmology to provide full-sky light cone to z>2.
+* A 6x higher mass resolution simulation of the primary cosmology to allow study of group finding, and a large-volume 27x lower mass resolution simulation of the primary cosmology to provide a full-sky light cone to *z* > 2.

* Specialty simulations including those with fixed-amplitude white noise and scale-free simulations.

25 changes: 12 additions & 13 deletions docs/compaso.rst
@@ -1,14 +1,14 @@
-The CompaSO Halo Finder
-=======================
+CompaSO Halo Finder
+===================

All group finding in AbacusSummit is done on the fly. We are using
-a hybrid algorithm, summarized as follows.
+a hybrid FoF-SO algorithm, dubbed CompaSO, summarized as follows.

First, we compute a kernel density estimate around all particles.
-This uses a weighting (1-r<sup>2</sup>/b<sup>2</sup>), where b is 0.4 of the interparticle
+This uses a weighting :math:`(1-r^2/b^2)`, where :math:`b` is 0.4 of the interparticle
spacing. We note that the effective volume of this kernel is
-equivalent to a top-hat of 0.737b, so 85 kpc/h comoving, and that
-the mean weighted counts at an overdensity delta is about delta/10
+equivalent to a top-hat of :math:`0.737b`, so 85 kpc/*h* comoving, and that
+the mean weighted counts at an overdensity :math:`\delta` is about :math:`\delta/10`
with a variance of 4/7 of the mean.
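
A minimal sketch of this kernel estimate (the neighbor distances and their normalization here are illustrative assumptions, not the Abacus implementation):

.. code-block:: python

   import numpy as np

   def kernel_weighted_count(r, b):
       """Sum the (1 - r^2/b^2) kernel weights over neighbor distances r < b.

       b is the kernel radius, 0.4 of the mean interparticle spacing; per
       the text, the kernel's effective volume matches a top-hat of 0.737*b.
       """
       r = np.asarray(r)
       w = 1.0 - (r / b) ** 2
       return w[r < b].sum()

   # Toy usage: distances from one particle to candidate neighbors
   rng = np.random.default_rng(0)
   dists = rng.uniform(0.0, 1.0, size=1000)
   print(kernel_weighted_count(dists, b=0.4))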

Second, we segment the particle set into what we call L0 halos.
@@ -20,14 +20,13 @@ the bounds of the L0 halo set be set by the kernel density estimate,
which has lower variance than the nearest neighbor method of FOF
and imposes a physical smoothing scale.

-.. note:: The terms *groups* and *halos* have specific meanings in Abacus.
-   Groups are clusters of particles at any level of group finding
-   (L0/L1/L2). Halos are L1 groups (although sometimes we do use
-   "halos" to refer to another level, in which case we say *L0 halos*
-   or *L2 halos*).
+.. note:: In Abacus, L0 groups are large, "fluffy" sets of particles
+   that typically encompass several L1 groups. L1 groups correspond
+   to classical "halos". L2 groups correspond to "halo cores"
+   or perhaps "subhalos".

We stress that all L1/L2 finding and all halo statistics are based
-solely on the particles in the L0 halo.
+solely on the particles in the L0 halo.

Third, within each L0 halo, we construct L1 halos by a competitive
spherical overdensity algorithm. We begin by selecting the particle
@@ -47,7 +46,7 @@ we start another nucleus.

With each successive nucleus, we again search for the SO(200) radius,
using all L0 particles. Now a particle is assigned to the new group
-if is previously unassigned OR if it is estimated to have an enclosed
+if it is previously unassigned *or* if it is estimated to have an enclosed
density with respect to the new group that is twice that of the
enclosed density with respect to its assigned group. In detail,
these enclosed densities are not computed exactly, but rather scaled
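
Schematically, the competitive assignment rule described above reads as follows (a sketch of the logic only; the data layout and the density estimate are stand-ins, not Abacus internals):

.. code-block:: python

   from dataclasses import dataclass
   from typing import Optional

   @dataclass
   class Particle:
       r_from: dict                 # toy: distance to each group's nucleus
       group: Optional[str] = None  # currently assigned L1 group, if any

   def enclosed_density(p, group):
       # Stand-in: a real code estimates the density enclosed within the
       # particle's radius from the group's nucleus.
       return 1.0 / p.r_from[group] ** 3

   def maybe_assign(p, new_group):
       """Claim the particle if it is unassigned, or if its enclosed density
       w.r.t. the new group is twice that w.r.t. its current group."""
       if p.group is None or (
           enclosed_density(p, new_group) >= 2.0 * enclosed_density(p, p.group)
       ):
           p.group = new_group

   p = Particle(r_from={"A": 1.0, "B": 0.7})
   maybe_assign(p, "A")  # unassigned, so it joins A
   maybe_assign(p, "B")  # 1/0.7**3 ~ 2.9 >= 2*1.0, so it switches to B
   print(p.group)        # B
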
4 changes: 3 additions & 1 deletion docs/conf.py
@@ -56,4 +56,6 @@
html_favicon = 'images/icon_red.png'

def setup(app):
-    app.add_css_file('custom.css')
\ No newline at end of file
+    app.add_css_file('custom.css')
+
+intersphinx_mapping = {'abacusutils': ('https://abacusutils.readthedocs.io/en/latest', None)}
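
The new ``intersphinx_mapping`` lets these docs cross-reference objects documented in abacusutils, for example (a sketch; the role and target must match an entry in the abacusutils inventory):

.. code-block:: rst

   Halo catalogs can be loaded with
   :class:`~abacusnbody.data.compaso_halo_catalog.CompaSOHaloCatalog`.
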
22 changes: 14 additions & 8 deletions docs/cosmologies.rst
@@ -1,6 +1,9 @@
Cosmologies
===========

+Cosmology Specifications
+------------------------
+
This page describes the specification of the Cosmologies and the CLASS
parameters that they define. The CLASS parameter files and resulting
power spectra and transfer functions are available in the `AbacusSummit/Cosmologies <https://github.com/abacusorg/AbacusSummit/tree/master/Cosmologies>`_
@@ -28,7 +31,7 @@ Further details are below the table.

-------

-All cosmologies use tau=0.0544. Most use 60 meV neutrinos, omega_nu = 0.00064420, scaling from z=1.
+All cosmologies use tau=0.0544. Most use 60 meV neutrinos, omega_nu = 0.00064420, scaling from *z* = 1.
We use HyRec, rather than RecFast.

CLASS is run with the pk_ref.pre precision choices, unless the name ends with \_fast, in which case we use the defaults.
@@ -37,18 +40,19 @@ for this.

Remember that Omega_m = (omega_b+omega_cdm+omega_ncdm)/h^2.
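
For example (a sketch; the omega_b, omega_cdm, and h values are Planck2018-like stand-ins, while omega_ncdm is the 60 meV value quoted above):

.. code-block:: python

   # Omega_m = (omega_b + omega_cdm + omega_ncdm) / h^2
   omega_b, omega_cdm, omega_ncdm = 0.02237, 0.1200, 0.00064420
   h = 0.6736
   Omega_m = (omega_b + omega_cdm + omega_ncdm) / h**2
   print(round(Omega_m, 4))  # 0.3152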

-We output five redshifts from CLASS, z=0.0, 1.0, 3.0, 7.0, and 49, which are called z1,z2,z3,z4,z5.
+We output five redshifts from CLASS, *z* = 0.0, 1.0, 3.0, 7.0, and 49, which are called z1, z2, z3, z4, z5.

-We use the CDM+Baryon power spectrum at z=1 (z2_pk_cb) and scale back by D(z_init)/D(1)
-to define our matter-dominated CDM-only simulation IC. The growth function includes the
+We use the CDM+Baryon power spectrum at *z* = 1 (z2_pk_cb) and scale back by the ratio of growth
+factors :math:`D(z_\mathrm{init})/D(1)` to define our matter-dominated CDM-only simulation IC. The growth function includes the
neutrinos as a smooth component.
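
Schematically, that rescaling looks like the following (a sketch; ``D`` is a stand-in for a real growth-factor calculation that treats neutrinos as smooth, and the power spectrum scales as the square of the growth factor):

.. code-block:: python

   import numpy as np

   def scale_ic_power(P_cb_z1, D, z_init):
       """Rescale the z=1 CDM+baryon power spectrum to the IC redshift.

       P(k) is proportional to D^2, so multiply by (D(z_init)/D(1))^2.
       """
       return P_cb_z1 * (D(z_init) / D(1.0)) ** 2

   # Toy usage with an Einstein-de Sitter growth factor, D ~ 1/(1+z)
   def D_eds(z):
       return 1.0 / (1.0 + z)

   k = np.logspace(-2, 1, 5)  # h/Mpc (placeholder grid)
   P_z1 = 1e4 * k ** -1.5     # placeholder z=1 spectrum
   print(scale_ic_power(P_z1, D_eds, z_init=99.0))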

-.. TODO: better way to link this CSV file?
+Cosmologies Table
+-----------------

Download the cosmologies table `here <https://github.com/abacusorg/AbacusSummit/blob/master/Cosmologies/cosmologies.csv>`_.
However, in analysis applications, users are encouraged to use the cosmological parameters stored in the ``header`` field
-of the ASDF data product files (which is loaded into the ``meta`` field of Astropy tables) rather than referencing the
-cosmologies table.
+of the ASDF data product files (which is loaded into the ``meta`` field of Astropy tables, or the ``header`` field of
+``CompaSOHaloCatalog`` objects) rather than referencing the cosmologies table.
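
For instance, a minimal sketch of reading parameters from a data product (the file name and header key are hypothetical, and the ``CompaSOHaloCatalog`` usage follows abacusutils conventions but should be checked against its documentation):

.. code-block:: python

   import asdf
   from abacusnbody.data.compaso_halo_catalog import CompaSOHaloCatalog

   # Directly from one ASDF data product file (file name hypothetical):
   with asdf.open('halo_info_000.asdf') as af:
       header = af['header']     # simulation and cosmology parameters
       print(header['Omega_M'])  # key name hypothetical

   # Or via the halo catalog loader, which exposes the same header:
   cat = CompaSOHaloCatalog('path/to/halo_info/', fields=['N'])
   print(cat.header['Omega_M'])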


.. note:: The following table is wide, you may have to scroll to the right to see all the columns.
@@ -58,7 +62,9 @@ cosmologies table.
:header-rows: 1
:escape: '

-Further details about the cosmology choices:
+
+Additional Details
+------------------

Beyond the Planck2018 LCDM primary cosmology, we chose 4 other secondary cosmologies.
One was WMAP7, to have a large change in omega_m, H0, and sigma8.
Expand Down
2 changes: 1 addition & 1 deletion docs/data-access.rst
@@ -24,7 +24,7 @@ What data are available?
------------------------
The :doc:`data-products` page documents the data products.
In some cases, extra data products may be archived on tape and can be made available upon request.
-Please email lgarrison@flatironinstitute.org for details.
+Please email deisenstein@cfa.harvard.edu, lgarrison@flatironinstitute.org, and nina.maksimova@cfa.harvard.edu for details.

Note that you will almost certainly need to use the utilities at
https://abacusutils.readthedocs.io/en/latest/index.html
