Compile ATS 1.0 on NERSC


These instructions follow the ATS Installation Guide, with a few changes for compiling in the NERSC Cori environment. For background on compiling codes at NERSC, see the tutorial on the NERSC website.


Set architecture

The default architecture is Haswell. If using KNL, swap the module:

module swap craype-haswell craype-mic-knl

Set programming env

The default programming environment on NERSC is PrgEnv-intel; switch to PrgEnv-gnu, which works for most open-source codes.

module swap PrgEnv-intel PrgEnv-gnu

Note: this will also automatically load gcc/8.3.0

Load cmake

The default /usr/bin/cmake did not work well. Load cmake using

module load cmake

Verify that it is loaded by running which cmake; you should see: /global/common/sw/cray/cnl7/haswell/cmake/3.14.4/gcc/8.2.0/2hef55n/bin/cmake

The cmake version is 3.14.4
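If you want to guard a build script against an older cmake, a minimal sketch of a version check is below. The version string is hard-coded from the output above; on Cori you would capture it from `cmake --version` instead.

```shell
# Minimal version gate: require cmake >= 3.14.
# "3.14.4" is the version reported above; replace with a live query, e.g.
#   ver=$(cmake --version | awk 'NR==1 {print $3}')
ver="3.14.4"
major=${ver%%.*}          # text before the first dot -> "3"
rest=${ver#*.}            # text after the first dot  -> "14.4"
minor=${rest%%.*}         # text before the next dot  -> "14"
if [ "$major" -gt 3 ] || { [ "$major" -eq 3 ] && [ "$minor" -ge 14 ]; }; then
  echo "cmake $ver is new enough"
else
  echo "cmake $ver is too old"
fi
```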

Load MPI

For compiling on a Cray system, using cray-mpich is recommended. It is now loaded automatically when you switch to PrgEnv-gnu.

# module load cray-mpich

You can also show the installation path MPICH_DIR using

module show cray-mpich

# return
# CRAY_MPICH_DIR /opt/cray/pe/mpt/7.7.10/gni/mpich-gnu/8.2

IMPORTANT: copy MPICH_DIR or CRAY_MPICH_DIR, you will need this later!
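If you want to capture the path in a variable rather than copying it by hand, a sketch is below. It parses a saved sample of the `module show` output, since the `module` command only exists on NERSC nodes; on Cori you would pipe `module show cray-mpich 2>&1` into the same awk filter.

```shell
# Extract the MPICH installation path from `module show cray-mpich` output.
# On Cori:  module show cray-mpich 2>&1 | awk '/CRAY_MPICH_DIR/ {print $NF}'
# Here we parse a saved sample line so the example is self-contained.
sample="CRAY_MPICH_DIR /opt/cray/pe/mpt/7.7.10/gni/mpich-gnu/8.2"
mpich_dir=$(printf '%s\n' "$sample" | awk '/CRAY_MPICH_DIR/ {print $NF}')
echo "$mpich_dir"
```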

Load Python

The default /usr/bin/python uses 2.7 and is missing many packages needed for the ATS regression tests.

module load python/3.7-anaconda-2019.10

This is what module list shows:

Currently Loaded Modulefiles:
  1) modules/
  2) altd/2.0
  3) darshan/3.1.7
  4) gcc/8.3.0
  5) craype-haswell # or craype-mic-knl
  6) craype-hugepages2M
  7) craype-network-aries
  8) craype/2.6.2
  9) cray-mpich/7.7.10
 10) cray-libsci/19.06.1
 11) udreg/2.3.2-
 12) ugni/
 13) pmi/5.0.14
 14) dmapp/7.1.1-
 15) gni-headers/
 16) xpmem/2.2.20-
 17) job/2.2.4-
 18) dvs/2.12_2.2.156-
 19) alps/6.6.58-
 20) rca/2.2.20-
 21) atp/2.1.3
 22) PrgEnv-gnu/6.0.5
 23) cmake/3.14.4
 24) python/3.7-anaconda-2019.10 # new

Set path

Put the following in your shell configuration file. The OPENMPI_DIR is the same path as the MPICH_DIR from the previous step.

# set the link type, this may change in the future
export CRAYPE_LINK_TYPE=dynamic # as of 10/22/2020

# environment variables for compiling ats
export ATS_BASE=/global/project/projectdirs/m1800/pin/ats-master
export ATS_BUILD_TYPE=Release # OR Debug
export ATS_VERSION=master
export OPENMPI_DIR=/opt/cray/pe/mpt/7.7.10/gni/mpich-gnu/8.2 # from CRAY_MPICH_DIR

export AMANZI_TPLS_DIR=${ATS_BASE}/amanzi_tpls-install-${ATS_VERSION}-${ATS_BUILD_TYPE}

export AMANZI_SRC_DIR=${ATS_BASE}/repos/amanzi
export AMANZI_DIR=${ATS_BASE}/amanzi-install-${ATS_VERSION}-${ATS_BUILD_TYPE}

export ATS_SRC_DIR=${AMANZI_SRC_DIR}/src/physics/ats

export ATS_DIR=${AMANZI_DIR} # assumed: ATS installs into the Amanzi prefix
export PATH=${ATS_DIR}/bin:${PATH}
export PATH=${AMANZI_TPLS_DIR}/bin:${PATH}
export PYTHONPATH=${ATS_SRC_DIR}/tools/utils:${PYTHONPATH}
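Before running bootstrap, it can save an hour of wasted build time to fail fast if any of the variables above is unset. A minimal sketch (the values here are examples for illustration only; on Cori the real export block above would already be sourced):

```shell
# Check that every variable the build relies on is set and non-empty.
ATS_BASE=/tmp/ats-example        # example values for illustration
ATS_BUILD_TYPE=Release
ATS_VERSION=master
OPENMPI_DIR=/tmp/mpich-example
ok=1
for v in ATS_BASE ATS_BUILD_TYPE ATS_VERSION OPENMPI_DIR; do
  eval val=\${$v}                # indirect lookup of the variable named in $v
  if [ -z "$val" ]; then
    echo "missing: $v"
    ok=0
  fi
done
[ "$ok" -eq 1 ] && echo "environment OK"
```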

Download and compile Amanzi

make base dir

mkdir -p ${ATS_BASE}
cd ${ATS_BASE}

clone Amanzi from GitHub

git clone -b master https://github.com/amanzi/amanzi ${AMANZI_SRC_DIR}

# optional
git clone -b ecoon/land_cover https://github.com/amanzi/ats ${ATS_SRC_DIR}

configure Amanzi TPLs, Amanzi, and ATS

  • First, you need to modify the $AMANZI_SRC_DIR/ script.
  • Then, edit the $AMANZI_SRC_DIR/ script, or run it with the flags and options below:

# change the following flags and options
   ${dbg_option} \
   --with-mpi=${OPENMPI_DIR} \
   --enable-shared \
   --disable-clm \ #changed
   --disable-structured  --enable-unstructured \
   --disable-stk_mesh --enable-mstk_mesh \
   --enable-hypre \
   --disable-silo \  # changed
   --disable-petsc \
   --disable-amanzi_physics \
   --enable-ats_physics \
   --disable-ats_dev \
   --disable-geochemistry \
   --amanzi-install-prefix=${AMANZI_DIR} \
   --amanzi-build-dir=${AMANZI_BUILD_DIR} \
   --tpl-install-prefix=${AMANZI_TPLS_DIR} \
   --tpl-build-dir=${AMANZI_TPLS_BUILD_DIR} \
   --tpl-download-dir=${ATS_BASE}/amanzi-tpls/Downloads \
   --tools-download-dir=${ATS_BASE}/amanzi-tpls/Downloads \
   --tools-build-dir=${ATS_BASE}/build \
   --tools-install-prefix=${ATS_BASE}/install \
   --with-cmake=`which cmake` \
   --with-ctest=`which ctest` \
   --branch_ats=${ATS_VERSION} \
   --parallel=8 \
   --arch=NERSC  # added!
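The `${dbg_option}` placeholder in the flag list presumably switches the build type. A minimal sketch of deriving it from `ATS_BUILD_TYPE` is below; the flag names `--debug` and `--opt` are assumptions here, so check the configure script's help output for the real ones.

```shell
# Derive ${dbg_option} from ATS_BUILD_TYPE before invoking the configure
# script. Flag names are assumed; verify against the script's --help.
ATS_BUILD_TYPE=${ATS_BUILD_TYPE:-Release}   # default to Release if unset
if [ "${ATS_BUILD_TYPE}" = "Debug" ]; then
  dbg_option="--debug"
else
  dbg_option="--opt"
fi
echo "dbg_option=${dbg_option}"
```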

run bootstrap


It will take 1 to 2 hours to complete.

Run testing problem

Download the example repo

mkdir testing
cd testing
git clone -b master https://github.com/amanzi/ats-demos

Run a test

  • then run the test problem
cd ats-demos/01_richards_steadystate
ats --xml_file=./richards_steadystate.xml &> out.log

# on NERSC
salloc -N 1 -C haswell -q interactive -t 00:30:00 -L SCRATCH
srun -n 1 ats --xml_file=./richards_steadystate.xml 

It should take less than a second to finish!
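A quick pass/fail scan of the run log can be scripted as below. The success marker is an assumption, not something the source confirms; substitute whatever string your ATS build prints on a clean finish. A stand-in log is created here so the example is self-contained.

```shell
# Scan the run log for a completion marker. "complete" is a placeholder
# pattern; match it to your build's actual end-of-run message.
log=$(mktemp)
printf 'cycle = 10\nsimulation complete\n' > "$log"   # stand-in for out.log
if grep -qi 'complete' "$log"; then
  result=passed
else
  result=failed
fi
echo "regression check: $result"
rm -f "$log"
```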

Running ATS on NERSC

  • (Optional) if using ATS from the IDEAS repo:
export IDEAS_HOME=/project/projectdirs/m2398/ideas
source ${IDEAS_HOME}/tools/init/ideas.bashrc

# do one of the following to load ATS exe.
module load ATS/dev-transpiration/basic/Release/PrgEnv-gnu-6.0.5
  • First, request an interactive node:
salloc -N 1 -C haswell -q interactive -t 00:30:00 -L SCRATCH
  • Make sure meshconvert and ats are in your PATH:
~ $ which meshconvert
~ $ which ats
  • Partition the mesh:
srun -n 128 meshconvert --partition-method=2 ../American_final_mesh-100m-gauss3.exo ./American_final_mesh.par
  • Launch the job:
srun -n 32 ats --xml_file=./spinup-soil.xml