Class Submitter

java.lang.Object
org.apache.hadoop.conf.Configured
org.apache.hadoop.mapred.pipes.Submitter
All Implemented Interfaces:
Configurable, Tool

@Public @Stable public class Submitter extends Configured implements Tool
The main entry point and job submitter. It can be used either from the command line or through its API to launch Pipes jobs.
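As a sketch of the API-based path, a typical submission might look like the following. The job name, input/output paths, and the executable's HDFS URI are illustrative assumptions, not part of the API:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;
import org.apache.hadoop.mapred.pipes.Submitter;

public class PipesExample {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf();
    conf.setJobName("pipes-wordcount");                                // hypothetical job name
    FileInputFormat.setInputPaths(conf, new Path("/user/alice/in"));   // assumed input path
    FileOutputFormat.setOutputPath(conf, new Path("/user/alice/out")); // assumed output path
    // Point Pipes at the C++ binary; this hdfs: URI is a placeholder.
    Submitter.setExecutable(conf, "hdfs://namenode:9000/bin/wordcount");
    Submitter.setIsJavaRecordReader(conf, true);   // Java reads the input records
    Submitter.setIsJavaRecordWriter(conf, true);   // Java writes the output records
    RunningJob job = Submitter.runJob(conf);       // submit with pipes modifications applied
  }
}
```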
  • Field Details

  • Constructor Details

    • Submitter

      public Submitter()
    • Submitter

      public Submitter(Configuration conf)
  • Method Details

    • getExecutable

      public static String getExecutable(JobConf conf)
      Get the URI of the application's executable.
      Parameters:
conf - the job's configuration
      Returns:
      the URI where the application's executable is located
    • setExecutable

      public static void setExecutable(JobConf conf, String executable)
Set the URI for the application's executable. Normally this is an hdfs: location.
      Parameters:
conf - the configuration to modify
      executable - The URI of the application's executable.
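For instance (the HDFS URI below is a placeholder, and a JobConf named conf is assumed to be in scope):

```java
// Register the C++ binary with the job; placeholder hdfs: URI.
Submitter.setExecutable(conf, "hdfs://namenode:9000/apps/pipes/wordcount");
// The same URI can later be read back from the configuration:
String exe = Submitter.getExecutable(conf);
```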
    • setIsJavaRecordReader

      public static void setIsJavaRecordReader(JobConf conf, boolean value)
      Set whether the job is using a Java RecordReader.
      Parameters:
      conf - the configuration to modify
      value - the new value
    • getIsJavaRecordReader

      public static boolean getIsJavaRecordReader(JobConf conf)
Check whether the job is using a Java RecordReader.
      Parameters:
      conf - the configuration to check
      Returns:
true if the job uses a Java RecordReader
    • setIsJavaMapper

      public static void setIsJavaMapper(JobConf conf, boolean value)
      Set whether the Mapper is written in Java.
      Parameters:
      conf - the configuration to modify
      value - the new value
    • getIsJavaMapper

      public static boolean getIsJavaMapper(JobConf conf)
      Check whether the job is using a Java Mapper.
      Parameters:
      conf - the configuration to check
      Returns:
true if the job uses a Java Mapper
    • setIsJavaReducer

      public static void setIsJavaReducer(JobConf conf, boolean value)
      Set whether the Reducer is written in Java.
      Parameters:
      conf - the configuration to modify
      value - the new value
    • getIsJavaReducer

      public static boolean getIsJavaReducer(JobConf conf)
      Check whether the job is using a Java Reducer.
      Parameters:
      conf - the configuration to check
      Returns:
true if the job uses a Java Reducer
    • setIsJavaRecordWriter

      public static void setIsJavaRecordWriter(JobConf conf, boolean value)
      Set whether the job will use a Java RecordWriter.
      Parameters:
      conf - the configuration to modify
      value - the new value to set
    • getIsJavaRecordWriter

      public static boolean getIsJavaRecordWriter(JobConf conf)
Check whether the job will use a Java RecordWriter.
      Parameters:
      conf - the configuration to check
      Returns:
      true, if the output of the job will be written by Java
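The four flags above are typically set together. A common arrangement for a C++ Pipes job (a sketch, assuming a JobConf named conf is in scope) keeps record I/O in Java while map and reduce run in the C++ executable:

```java
Submitter.setIsJavaRecordReader(conf, true);  // input parsed by the Java InputFormat
Submitter.setIsJavaRecordWriter(conf, true);  // output written by the Java OutputFormat
Submitter.setIsJavaMapper(conf, false);       // map runs in the C++ executable
Submitter.setIsJavaReducer(conf, false);      // reduce runs in the C++ executable
```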
    • getKeepCommandFile

      public static boolean getKeepCommandFile(JobConf conf)
      Does the user want to keep the command file for debugging? If this is true, pipes will write a copy of the command data to a file in the task directory named "downlink.data", which may be used to run the C++ program under the debugger. You probably also want to set JobConf.setKeepFailedTaskFiles(true) to keep the entire directory from being deleted. To run using the data file, set the environment variable "mapreduce.pipes.commandfile" to point to the file.
      Parameters:
      conf - the configuration to check
      Returns:
      will the framework save the command file?
    • setKeepCommandFile

      public static void setKeepCommandFile(JobConf conf, boolean keep)
Set whether to keep the command file for debugging.
      Parameters:
      conf - the configuration to modify
      keep - the new value
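For the debugging workflow described under getKeepCommandFile, both settings are usually enabled together (a sketch, assuming a JobConf named conf is in scope):

```java
Submitter.setKeepCommandFile(conf, true); // write downlink.data into the task directory
conf.setKeepFailedTaskFiles(true);        // keep the task directory after a failure
```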
    • submitJob

      @Deprecated public static RunningJob submitJob(JobConf conf) throws IOException
Deprecated. Use runJob(JobConf) instead.
      Submit a job to the map/reduce cluster. All of the necessary modifications to the job to run under pipes are made to the configuration.
      Parameters:
      conf - the job to submit to the cluster (MODIFIED)
      Throws:
      IOException
    • runJob

      public static RunningJob runJob(JobConf conf) throws IOException
      Submit a job to the map/reduce cluster. All of the necessary modifications to the job to run under pipes are made to the configuration.
      Parameters:
      conf - the job to submit to the cluster (MODIFIED)
      Throws:
      IOException
    • jobSubmit

      public static RunningJob jobSubmit(JobConf conf) throws IOException
      Submit a job to the Map-Reduce framework. This returns a handle to the RunningJob which can be used to track the running-job.
      Parameters:
      conf - the job configuration.
      Returns:
      a handle to the RunningJob which can be used to track the running-job.
      Throws:
      IOException
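The returned RunningJob handle can be polled to track progress. A minimal sketch, with error handling omitted and a JobConf named conf assumed in scope:

```java
RunningJob running = Submitter.jobSubmit(conf);
while (!running.isComplete()) {
  Thread.sleep(5000);            // poll every five seconds
}
if (!running.isSuccessful()) {
  throw new IOException("Pipes job failed: " + running.getID());
}
```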
    • run

      public int run(String[] args) throws Exception
      Description copied from interface: Tool
      Execute the command with the given arguments.
      Specified by:
      run in interface Tool
      Parameters:
      args - command specific arguments.
      Returns:
      exit code.
      Throws:
      Exception - command exception.
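Because Submitter implements Tool, it can be driven through ToolRunner, which handles the generic Hadoop options before delegating to run. A sketch:

```java
// Parse generic options (-conf, -D, etc.), then invoke Submitter.run(args).
int exitCode = ToolRunner.run(new Submitter(), args);
System.exit(exitCode);
```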
    • main

      public static void main(String[] args) throws Exception
      Submit a pipes job based on the command line arguments.
      Parameters:
args - the command line arguments
      Throws:
      Exception