
bwForCluster Helix

Powerful high performance computer for the life sciences and computational humanities in Baden-Württemberg

bwForCluster Helix is Heidelberg University's high performance computer and is available to all researchers in Baden-Württemberg.


The computing cluster is designed to be used primarily in the fields of:

  • structural systems biology
  • medical science
  • soft matter
  • computational humanities

as well as for discipline-independent methods development.

Helix is a building block of the bwHPC strategy of the state of Baden-Württemberg, one of whose aims is providing high performance computing clusters for a variety of scientific disciplines at five locations. The goal of bwHPC is to ensure an excellent basic supply of computing and data storage resources for researchers in the state.

Helix is financed by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation), the Ministry of Science, Research and the Arts Baden-Württemberg (MWK) and Heidelberg University. It is the successor system to bwForCluster MLS&WISO.

Target Group

  • University employees
  • Students
  • Doctoral candidates
  • Researchers
  • External members (researchers from throughout Baden-Württemberg)

Use

  • Performing complex computations and analyses on a high performance computing cluster for the fields of structural systems biology, medical science, soft matter and computational humanities
  • Discipline-independent/interdisciplinary methods development for scientific computing

Access and Requirements

Registration is required for access. There is a detailed description of the process for obtaining user access to the bwForCluster in the bwHPC Wiki.

Technical Information

An overview of the specifications of the cluster:

  • 20,000 AMD EPYC Milan processor cores
  • Approx. 100 terabytes of main memory
  • Approx. 200 NVIDIA Ampere Tensor Core GPUs (A100 and A40)
  • Interconnect: a non-blocking NVIDIA Mellanox InfiniBand HDR network with a bandwidth of at least 100 Gbit/s per compute node
  • High performance storage: a parallel IBM Spectrum Scale file system with a total capacity of about 11 petabytes, including about 800 terabytes of flash storage
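For orientation, work on a cluster of this kind is typically submitted through a batch scheduler rather than run interactively. The sketch below assumes Slurm, which is widely used across bwHPC sites; the partition name, resource limits and program name are purely illustrative, so consult the bwHPC Wiki for the actual partition names and per-job limits on Helix.

```shell
#!/bin/bash
# Minimal Slurm batch script sketch -- all values below are
# illustrative assumptions, not documented Helix settings.
#SBATCH --job-name=example
#SBATCH --partition=gpu-single   # hypothetical partition name
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8
#SBATCH --mem=32gb
#SBATCH --gres=gpu:A100:1        # request one A100 GPU
#SBATCH --time=01:00:00

# Report the allocated GPU, then start the actual workload
nvidia-smi
srun ./my_simulation               # hypothetical application
```

Such a script would be submitted with `sbatch job.sh`; the scheduler then queues the job until the requested cores, memory and GPU become available.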