This repository has been archived by the owner on Jan 20, 2024. It is now read-only.

Releases: maierbn/opendihu

Version 1.4

14 Jan 11:53
224ccfa

This is the last release under Benjamin's (@maierbn) account before the code base moves to its own organization.

It contains the latest developments: extensions to the preCICE coupling, some cleanup, and improvements to the documentation.

What's Changed

New Contributors

Full Changelog: v1.3...v1.4

Version 1.3

20 Apr 22:13

This is the code at the time of submission of Benjamin's Ph.D. thesis. The studies described there can be reproduced with this release.

You'll also need the input files, which can be downloaded here.

Version 1.2

29 Dec 15:19

This is the version at the end of 2020. The core code architecture is now largely stable; future additions are expected to be new solvers or adapters that reuse the existing infrastructure.

  • The examples directory is better organized. There are numerous new examples, including the chemo-electro-mechanical model and the multidomain model, each with and without a fat layer and with and without a contraction model. More CellML subcellular models have been tested; refer to the existing examples for an overview.
  • Adapters for surface coupling with preCICE have been added. They are used in an example coupling a muscle with tendons.
  • The stiffness matrix computation is now also vectorized with Vc. This can be turned off to reduce compilation times (see user-variables.scons.py and the sketch after this list).
  • Python 2 is no longer used; everything (settings, SCons, etc.) is now Python 3.
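
As a rough sketch, user-variables.scons.py is a plain Python file read by the SCons build, and the vectorization toggle is an ordinary variable assignment in it. The variable name below is a hypothetical placeholder, not necessarily the actual option; check the documentation or the comments in the file itself for the real name.

    # Hypothetical excerpt of user-variables.scons.py (the actual option name
    # may differ from this placeholder). Setting the flag to False skips the
    # Vc-vectorized stiffness matrix assembly and shortens compile times.
    USE_VECTORIZED_FE_MATRIX_ASSEMBLY = False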

This release installs on a clean Ubuntu 20.04 system if you follow the instructions in the documentation. It can be installed on any SGS server without additional system packages.

Version 1.1

11 Feb 15:01

This version contains numerous internal improvements that make the code more flexible and more stable. Features include:

  • Improved data mapping between solvers; a diagram of the solver structure and the data connections can be created automatically (solverStructureDiagramFile, see the sketch after this list).
  • The dependencies have been updated to their newest versions; notably, this includes improvements in PETSc's CG solver. A geometric-algebraic multigrid solver has also been applied successfully.
  • Support for CellML models has been improved: the raw XML files can now be processed, and the code generator has been updated to produce explicitly vectorized code (using the Vc library).
  • Dynamic and static hyperelasticity with an analytic Jacobian have been implemented.
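
For illustration, solverStructureDiagramFile is set at the top level of the Python settings dict that an example passes to the opendihu binary. This is a minimal sketch; apart from the solverStructureDiagramFile key itself, the surrounding keys and values are placeholders.

    # settings.py (sketch): top-level config dict of an opendihu example.
    # "solverStructureDiagramFile" makes opendihu write a text diagram of the
    # nested solvers and their data connections once the solvers are initialized.
    config = {
        "scenarioName":               "example",               # placeholder
        "solverStructureDiagramFile": "solver_structure.txt",  # path of the generated diagram
        # ... the actual solver configuration (meshes, solvers, time stepping) follows here
    }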

Version 1.0

15 Apr 13:21

This version is the initial complete release with the following features:

  • Equations:
    • Equations with the Laplace operator (Laplace, Poisson, Diffusion) with Dirichlet and Neumann-type boundary conditions, as well as the generalized Laplace operator (∇•c∇)
    • Monodomain (coupled 0D-1D problems), with the reaction term given by a CellML file; examples include the Hodgkin-Huxley and Shorten models
    • MultipleInstances class, so that multiple Monodomain fibers can be handled at once
    • Multidomain, Bidomain
    • 0D-1D-3D as used in the paper [1]
  • Unit tests cover all combinations of
    • Mesh: StructuredDeformableOfDimension<D>, StructuredRegularFixedOfDimension<D>, UnstructuredDeformableOfDimension<D> for D = 1,2,3
    • BasisFunction: LagrangeOfOrder<1>, LagrangeOfOrder<2>, Hermite
  • Output file formats (a settings sketch follows after this list):
    • Paraview (binary and ASCII), 1 file for each process or combined files for all processes (MPI file I/O)
    • Exnode, Exelem version 1
    • Python (binary (pickle) and ASCII), to be used with the included plotting utility or with your own postprocessing scripts; also available as a callback during the simulation
    • ADIOS2
  • Parallel execution: structured meshes are fully parallelized; UnstructuredDeformableOfDimension<D> is designed for serial execution.
    Input data (geometry, boundary conditions) can be specified either globally or locally; MPI-parallel input is possible for geometry data.
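
As a sketch of how the output formats listed above are selected in practice: a solver's Python settings can carry a list of output writers, one entry per format. The format strings follow the names above; the remaining keys (file name, interval, binary flag) are illustrative and their exact spelling may differ from the current documentation.

    # Sketch of an output writer list inside a solver's Python settings.
    # Each entry requests one output format; file names, intervals and the
    # exact key names are illustrative placeholders.
    output_writer = [
        {"format": "Paraview",   "filename": "out/result", "binary": True, "outputInterval": 10},
        {"format": "PythonFile", "filename": "out/result", "binary": True, "outputInterval": 10},
        {"format": "Exfile",     "filename": "out/result", "outputInterval": 10},
    ]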

This is also the version of opendihu that was used for the parallel scaling results on up to ~27,000 cores in the paper [1]. The example used is electrophysiology/fibers_emg.

[1] "Highly Parallel Multi-Physics Simulation of Muscular Activation and EMG"