Parallel Computing on Agate

In this tutorial, we will give an overview of the Agate cluster, the newest high performance computing cluster at MSI, and its resources. We will walk through examples that use SLURM job arrays, as well as the two main ways to run parallel programs: thread-parallel jobs (with OpenMP) and MPI jobs.
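
As a preview of the job array topic, here is a minimal sketch of a SLURM job array script. The partition name, time limit, executable, and input file names are placeholders rather than Agate-specific values; MSI's documentation lists the actual partitions and limits available to your group.

    #!/bin/bash
    #SBATCH --job-name=array_example
    #SBATCH --array=1-10              # run 10 independent tasks, numbered 1 through 10
    #SBATCH --ntasks=1                # each array task is a single serial process
    #SBATCH --time=00:10:00           # placeholder time limit
    #SBATCH --partition=example       # placeholder partition name

    # SLURM sets SLURM_ARRAY_TASK_ID to a different value for each task,
    # which is typically used to select an input file or parameter set.
    echo "Processing input ${SLURM_ARRAY_TASK_ID}"
    ./my_program input_${SLURM_ARRAY_TASK_ID}.dat   # placeholder program and input names

Each task in the array is scheduled independently, so a job array is a convenient way to run many similar serial jobs without writing a separate script for each one.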

This tutorial will cover:

  • The architecture of the Agate cluster at the Minnesota Supercomputing Institute
  • How to run thread-parallel jobs within a single node, with OpenMP
  • How to run parallel jobs, generally across multiple nodes, with MPI (sketches of both job types follow this list)
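
To make the last two items concrete, below are minimal sketches of SLURM batch scripts for each job type. Partition names, core and node counts, module names, and executable names are placeholder assumptions, not confirmed Agate settings; the tutorial covers the Agate-specific details.

A thread-parallel (OpenMP) job requests several CPU cores for a single task on one node:

    #!/bin/bash
    #SBATCH --job-name=omp_example
    #SBATCH --nodes=1                 # OpenMP threads must share one node's memory
    #SBATCH --ntasks=1                # one process ...
    #SBATCH --cpus-per-task=8         # ... with 8 cores for its threads
    #SBATCH --time=00:30:00           # placeholder time limit
    #SBATCH --partition=example       # placeholder partition name

    # Match the thread count to the cores SLURM allocated to this task.
    export OMP_NUM_THREADS=$SLURM_CPUS_PER_TASK
    ./my_openmp_program               # placeholder OpenMP-enabled executable

An MPI job instead requests multiple tasks, possibly spread across several nodes, and launches one process per task:

    #!/bin/bash
    #SBATCH --job-name=mpi_example
    #SBATCH --nodes=2                 # MPI ranks can span multiple nodes
    #SBATCH --ntasks-per-node=16      # placeholder rank count per node
    #SBATCH --time=00:30:00           # placeholder time limit
    #SBATCH --partition=example       # placeholder partition name

    module load openmpi               # placeholder; check module avail for the MPI stack on Agate
    srun ./my_mpi_program             # srun starts one MPI rank per allocated task

The key distinction is that OpenMP parallelism lives inside a single process, so --cpus-per-task controls it, while MPI parallelism comes from many communicating processes, so --ntasks and --nodes control it.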

To be successful, you should have:

  • An active MSI account
  • Basic understanding of Linux
  • Familiarity with compiling application codes, if you plan to create your own programs
  • Familiarity with running jobs through the SLURM scheduler or via an MSI-provided interactive interface such as Open OnDemand or Notebooks

Date, time and location:

  • Feb. 1, 2024
  • 1:00 pm to 3:00 pm
  • 575 Walter Library and online

Tutorial Information

Recommended background: 

  • Familiarity with Linux, knowledge of MSI Systems

Training Level: 

  • Intermediate

Tutorial format: 

  • Lecture

Training Materials: 

  • Lecture Slides

Skills: 

  • MSI Systems, Parallel Computing

Previous Recording

Tutorial Materials
