Hadoop

Software Summary
Last Updated On: 

Wednesday, December 7, 2016

Support Level: 
Minimal Support
Software Access Level: 
Open Access
Software Categories: 
Data Management Systems
Software Description: 

The Hadoop Map/Reduce framework harnesses a cluster of machines and executes user-defined Map/Reduce jobs across the nodes in the cluster. On Itasca, a script exists to create an ephemeral Hadoop cluster on the set of nodes assigned by the scheduler. The setup_cluster script formats an HDFS filesystem on the local scratch disks.

This resource is best suited for application benchmarking and algorithm testing. The HDFS filesystem exists only for the duration of the job: all input data must be copied into HDFS after the cluster is brought up at the start of the job, and any output you wish to keep must be copied back to your home directory before the job completes. Many job scripts follow this pattern:

  1. Set up the cluster
  2. Move input data into HDFS with "hadoop fs -put"
  3. Execute the test program
  4. Move results back to the home directory with "hadoop fs -get"
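As a sketch, a batch job script following the steps above might look like the following. The resource request, job name, input/output paths, and the example jar are all hypothetical placeholders; setup_cluster and the hadoop fs commands are as described above, and the exact scheduler directives and jar name will vary by system and Hadoop version.

```shell
#!/bin/bash
#PBS -l nodes=4:ppn=8,walltime=01:00:00   # hypothetical resource request
#PBS -N hadoop-test                       # hypothetical job name

cd "$PBS_O_WORKDIR"

# 1. Set up the ephemeral Hadoop cluster on the nodes assigned by the scheduler
setup_cluster

# 2. Move input data into HDFS (paths are illustrative)
hadoop fs -mkdir -p /user/$USER/input
hadoop fs -put ~/mydata/input.txt /user/$USER/input/

# 3. Execute the test program (here, a wordcount example jar;
#    the jar name and location depend on the Hadoop version installed)
hadoop jar "$HADOOP_HOME"/hadoop-examples.jar wordcount \
    /user/$USER/input /user/$USER/output

# 4. Move results back to the home directory before the job completes
hadoop fs -get /user/$USER/output ~/results/
```

Because the HDFS filesystem is destroyed when the job ends, step 4 must run inside the job script itself, not afterwards.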

If you need a persistent cluster for your work, please see the information at: https://www.msi.umn.edu/content/hadoop-cluster

Software Documentation
