Piz Daint Deployment

Jorge Blanco Alonso edited this page Feb 1, 2024 · 47 revisions

List of latest packages deployed

  • neurodamus-hippocampus/1.8-2.15.0-2.8.0
  • neurodamus-neocortex/1.12-2.15.0-2.8.0
  • neurodamus-mousify/1.6-2.15.0-2.8.0
  • neuron/9.0.a8
  • py-neurodamus/2.15.0
  • py-bluepy/2.5.3
  • py-bluepyopt/1.14.4
  • py-libsonata/0.1.24
  • py-netpyne/1.0.3.1
  • py-matplotlib/3.4.3

Instructions for Using Software Stack

  • Run a simulation with a small circuit target
#!/bin/bash -l
#SBATCH --job-name="test sim"
#SBATCH --time=00:30:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-core=1
#SBATCH --ntasks-per-node=36
#SBATCH --cpus-per-task=1
#SBATCH --partition=debug
#SBATCH --constraint=mc
#SBATCH --output=simout-%j.log
#SBATCH --error=simout-%j.log
#SBATCH [email protected]
#SBATCH --mail-type=ALL
#SBATCH --account=ich002

module purge
module load PrgEnv-intel
module load daint-mc cray-python/3.9.4.1

module use /apps/hbp/ich002/hbp-spack-deployments/softwares/20-10-2023/modules

# load only relevant modules needed for your job
module load neurodamus-hippocampus

export HDF5_USE_FILE_LOCKING=FALSE

# Run the simulation with neurodamus-py
srun special -mpi -python $NEURODAMUS_PYTHON/init.py
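Assuming the script above is saved as run_sim.sbatch (a hypothetical filename), it can be submitted and monitored with standard Slurm commands. The sketch is guarded with command -v so it is a no-op on machines without Slurm:

```shell
#!/bin/sh
# Submit the job script and inspect the resulting log (names are placeholders).
if command -v sbatch >/dev/null 2>&1; then
    sbatch run_sim.sbatch        # prints "Submitted batch job <jobid>"
    squeue -u "$(id -un)"        # check the job's state in the queue
    # the combined log follows the --output/--error pattern: simout-<jobid>.log
fi
```

The 30-minute limit and debug partition in the script are only suitable for small test targets; larger circuits need a production partition and a longer --time.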
  • Using py-bluepy or BluePyOpt
#!/bin/bash -l
#SBATCH --job-name="test sim"
#SBATCH --time=00:30:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-core=1
#SBATCH --ntasks-per-node=36
#SBATCH --cpus-per-task=1
#SBATCH --partition=debug
#SBATCH --constraint=mc
#SBATCH --output=simout-%j.log
#SBATCH --error=simout-%j.log
#SBATCH [email protected]
#SBATCH --mail-type=ALL
#SBATCH --account=ich002

module purge
module load daint-mc cray-python
module load PrgEnv-intel
module unload intel
module load intel/19.1.3.304 cray-mpich

module use /apps/hbp/ich002/hbp-spack-deployments/softwares/20-10-2023/modules

# always load neuron module
module load neuron

# if matplotlib/numpy are needed
module load py-matplotlib

# load only relevant modules needed for your job
module load py-bluepyopt
module load py-bluepy
module load py-netpyne

# Avoid warnings during execution
export PMI_NO_FORK=1
export PMI_NO_PREINITIALIZE=1
export PMI_MMAP_SYNC_WAIT_TIME=300

Verify the stack by importing the packages in an interactive session:

$ python
>>> import bluepy
>>> import pandas
>>> import bluepyopt
>>> import netpyne
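The interactive check above can also be scripted. A minimal sketch using only the standard library (the module list mirrors the session above and can be adjusted):

```python
import importlib.util

def check_imports(names):
    """Return a dict mapping each module name to whether it can be imported."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Packages provided by the modules loaded in the job script above
status = check_imports(["bluepy", "pandas", "bluepyopt", "netpyne"])
for name, ok in status.items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```

Running this inside the job script (after the module load lines) catches a broken environment before the real workload starts.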

Known issues

If your simulation fails with a segmentation fault after output like the following:

[STEP] ================ INSTANTIATING SIMULATION ================
[STEP] Handling Replay
[STEP] Creating connections in the simulator
[INFO] Instantiating synapses... Params: {'replay_mode': <ReplayMode.NONE: 0>}
[INFO]  * Connections among hippocampus_neurons -> hippocampus_neurons, attach src: True
Pop:(0, 0)|srun: error: nid00108: tasks 0-6,8-35: Segmentation fault

It is possible that you need to adapt certain blocks in the BlueConfig. If you update the tau_r_NMDA and tau_d_NMDA variables via SynapseConfigure, note that they are now RANGE variables instead of GLOBAL, so the corresponding SynapseConfigure statements must address them through the synapse instance (%s.) rather than by their global names.

For example, use this syntax:

SynapseConfigure %s.NMDA_ratio = 1.22 %s.tau_r_NMDA = 3.9 %s.tau_d_NMDA = 148.5

instead of:

SynapseConfigure %s.NMDA_ratio = 1.22 tau_r_NMDA_ProbAMPANMDA_EMS = 3.9 tau_d_NMDA_ProbAMPANMDA_EMS = 148.5
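Since the change is mechanical (prefixing the variable with %s. and dropping the mechanism suffix), old SynapseConfigure lines can be migrated with a small script. A sketch, assuming the suffix is always _ProbAMPANMDA_EMS as in the example above:

```python
import re

def migrate_synapse_configure(line):
    """Rewrite GLOBAL-style NMDA time-constant assignments to the RANGE syntax,
    e.g. 'tau_r_NMDA_ProbAMPANMDA_EMS = 3.9' -> '%s.tau_r_NMDA = 3.9'."""
    return re.sub(r"\b(tau_[rd]_NMDA)_ProbAMPANMDA_EMS\b", r"%s.\1", line)

old = ("SynapseConfigure %s.NMDA_ratio = 1.22 "
       "tau_r_NMDA_ProbAMPANMDA_EMS = 3.9 tau_d_NMDA_ProbAMPANMDA_EMS = 148.5")
print(migrate_synapse_configure(old))
# -> SynapseConfigure %s.NMDA_ratio = 1.22 %s.tau_r_NMDA = 3.9 %s.tau_d_NMDA = 148.5
```

Lines that already use the %s. form are left unchanged, so the script can be run over a whole BlueConfig safely.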

Visualisation stack deployment on Piz Daint

Emsim is deployed as an AppImage: /apps/hbp/ich002/hbp-visualisation-deployements/emsim/startEmsim.sh

Brayns is composed of two components, braynsService (the backend) and webbrayns (the web frontend). The launch script is: /apps/hbp/ich002/hbp-visualisation-deployements/brayns/cnr/brayns.sbatch