Designing Parallel Programs in OpenMP

  • Partition
    • Divide problem into tasks
  • Communicate
    • Determine amount and pattern of communication
  • Agglomerate
    • Combine tasks
  • Map
    • Assign agglomerated tasks to physical processors

  • Partition
    • In OpenMP, look for independent operations: loop-level or task-level parallelism (first sketch below)
  • Communicate
    • In OpenMP, identify synchronization points and data dependencies (second sketch below)
  • Agglomerate
    • In OpenMP, mark parallel loops and/or parallel sections (first sketch below)
  • Map
    • In OpenMP, mapping is handled by implicit or explicit scheduling (third sketch below)
    • Data mapping (memory placement) falls outside the standard
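
A minimal sketch of the Partition and Agglomerate steps (the scaled-vector-update loop and the two summation sections are illustrative choices, not from the original): independent loop iterations are the tasks, and a single parallel-for directive agglomerates them onto the thread team; unrelated blocks of work can likewise be grouped as sections. Compile with an OpenMP flag such as gcc -fopenmp.

#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void)
{
    static float x[N], y[N];
    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    /* Partition: every iteration is an independent task.
       Agglomerate: "parallel for" chunks the iterations and hands
       each chunk to one thread of the team. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        y[i] = 2.0f * x[i] + y[i];

    /* Task-level partition: two unrelated pieces of work are
       agglomerated as sections and may run on different threads. */
    float sx = 0.0f, sy = 0.0f;
    #pragma omp parallel sections
    {
        #pragma omp section
        { for (int i = 0; i < N; i++) sx += x[i]; }

        #pragma omp section
        { for (int i = 0; i < N; i++) sy += y[i]; }
    }

    printf("sum(x) = %.0f  sum(y) = %.0f\n", sx, sy);
    return 0;
}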
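
A small sketch of the Communicate step (the dot product and the two-phase update are illustrative): the cross-iteration dependency on the accumulator is expressed with a reduction clause, and an explicit barrier marks the synchronization point where phase two must wait for the values phase one wrote.

#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void)
{
    static double a[N], b[N];
    for (int i = 0; i < N; i++) { a[i] = 1.0; b[i] = 0.5; }

    /* Dependency: every iteration updates the same accumulator.
       reduction gives each thread a private copy and combines the
       copies at the end of the loop. */
    double dot = 0.0;
    #pragma omp parallel for reduction(+:dot)
    for (int i = 0; i < N; i++)
        dot += a[i] * b[i];

    #pragma omp parallel
    {
        #pragma omp for nowait
        for (int i = 0; i < N; i++)
            a[i] = 2.0 * a[i];              /* phase one: rescale a */

        /* Synchronization point: all threads must finish phase one
           before any thread starts phase two, which reads a. */
        #pragma omp barrier

        #pragma omp for
        for (int i = 0; i < N; i++)
            b[i] = a[i] + b[i];             /* phase two: uses new a */
    }

    printf("dot = %.1f  b[0] = %.1f\n", dot, b[0]);
    return 0;
}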
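
A sketch of the Map step (the deliberately imbalanced work() kernel is hypothetical, chosen only to make the scheduling choice visible): with no schedule clause the assignment of iterations to threads is left to the implementation (implicit mapping), while schedule(dynamic, 64) makes it explicit; where the data itself ends up in memory (e.g., NUMA first-touch placement) is, as noted above, outside the OpenMP standard. Link with -lm for the math library.

#include <stdio.h>
#include <math.h>
#include <omp.h>

#define N 100000

/* Hypothetical kernel whose cost varies with i, so iterations are
   imbalanced and the scheduling choice matters. */
static double work(int i)
{
    double s = 0.0;
    for (int k = 0; k < i % 1000; k++)
        s += sin((double)k);
    return s;
}

int main(void)
{
    static double r[N];

    /* Implicit mapping: no schedule clause, so the runtime chooses
       how iterations are assigned to threads. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        r[i] = work(i);

    /* Explicit mapping: dynamic scheduling hands out chunks of 64
       iterations on demand, balancing the uneven work. */
    #pragma omp parallel for schedule(dynamic, 64)
    for (int i = 0; i < N; i++)
        r[i] = work(i);

    printf("r[N-1] = %f  (max threads: %d)\n", r[N - 1], omp_get_max_threads());
    return 0;
}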