Package org.apache.hadoop.mapred.pipes
Class Submitter
java.lang.Object
org.apache.hadoop.conf.Configured
org.apache.hadoop.mapred.pipes.Submitter
All Implemented Interfaces:
Configurable, Tool
The main entry point and job submitter. It can be used either from the command line or through its API to launch Pipes jobs.
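A minimal API-based launch might look like the following sketch. The executable URI, input/output paths, and job name are assumptions for illustration, not values from this page:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;
import org.apache.hadoop.mapred.pipes.Submitter;

public class PipesLauncher {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf();
    conf.setJobName("pipes-wordcount");  // hypothetical job name
    // Point the job at the C++ binary on HDFS (path is an assumption).
    Submitter.setExecutable(conf, "hdfs:///apps/pipes/wordcount");
    FileInputFormat.setInputPaths(conf, new Path("/in"));
    FileOutputFormat.setOutputPath(conf, new Path("/out"));
    // runJob blocks until the job completes.
    RunningJob job = Submitter.runJob(conf);
    System.out.println("succeeded: " + job.isSuccessful());
  }
}
```

Running this requires a Hadoop cluster (or local runner) on the classpath; it is a sketch of the API flow, not a standalone program.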
Field Summary
Fields:
LOG, PRESERVE_COMMANDFILE, EXECUTABLE, INTERPRETOR, IS_JAVA_MAP, IS_JAVA_RR, IS_JAVA_RW, IS_JAVA_REDUCE, PARTITIONER, INPUT_FORMAT, PORT
Constructor Summary
Constructors:
Submitter()
Method Summary
static String getExecutable(JobConf conf)
Get the URI of the application's executable.
static boolean getIsJavaMapper(JobConf conf)
Check whether the job is using a Java Mapper.
static boolean getIsJavaRecordReader(JobConf conf)
Check whether the job is using a Java RecordReader.
static boolean getIsJavaRecordWriter(JobConf conf)
Will the reduce use a Java RecordWriter?
static boolean getIsJavaReducer(JobConf conf)
Check whether the job is using a Java Reducer.
static boolean getKeepCommandFile(JobConf conf)
Does the user want to keep the command file for debugging?
static RunningJob jobSubmit(JobConf conf)
Submit a job to the Map-Reduce framework.
static void main(String[] args)
Submit a pipes job based on the command line arguments.
int run(String[] args)
Execute the command with the given arguments.
static RunningJob runJob(JobConf conf)
Submit a job to the map/reduce cluster.
static void setExecutable(JobConf conf, String executable)
Set the URI for the application's executable.
static void setIsJavaMapper(JobConf conf, boolean value)
Set whether the Mapper is written in Java.
static void setIsJavaRecordReader(JobConf conf, boolean value)
Set whether the job is using a Java RecordReader.
static void setIsJavaRecordWriter(JobConf conf, boolean value)
Set whether the job will use a Java RecordWriter.
static void setIsJavaReducer(JobConf conf, boolean value)
Set whether the Reducer is written in Java.
static void setKeepCommandFile(JobConf conf, boolean keep)
Set whether to keep the command file for debugging.
static RunningJob submitJob(JobConf conf)
Deprecated.
Methods inherited from class org.apache.hadoop.conf.Configured
getConf, setConf
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.hadoop.conf.Configurable
getConf, setConf
-
Field Details

LOG
protected static final org.slf4j.Logger LOG

PRESERVE_COMMANDFILE
See Also:
Constant Field Values

EXECUTABLE
See Also:
Constant Field Values

INTERPRETOR
See Also:
Constant Field Values

IS_JAVA_MAP
See Also:
Constant Field Values

IS_JAVA_RR
See Also:
Constant Field Values

IS_JAVA_RW
See Also:
Constant Field Values

IS_JAVA_REDUCE
See Also:
Constant Field Values

PARTITIONER
See Also:
Constant Field Values

INPUT_FORMAT
See Also:
Constant Field Values

PORT
See Also:
Constant Field Values
Constructor Details

Submitter
public Submitter()

Submitter
Method Details
-
getExecutable
Get the URI of the application's executable.
Parameters:
conf -
Returns:
the URI where the application's executable is located
-
setExecutable
Set the URI for the application's executable. Normally this is an hdfs: location.
Parameters:
conf -
executable - The URI of the application's executable.
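The setter/getter pair round-trips through the job configuration; a short sketch (the hdfs: URI is an assumed example location):

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.pipes.Submitter;

public class ExecutableConfig {
  public static void main(String[] args) {
    JobConf conf = new JobConf();
    // Store the executable URI in the configuration (assumed path).
    Submitter.setExecutable(conf, "hdfs:///apps/pipes/wordcount");
    // getExecutable reads the same property back from the configuration.
    System.out.println(Submitter.getExecutable(conf));
  }
}
```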
-
setIsJavaRecordReader
Set whether the job is using a Java RecordReader.
Parameters:
conf - the configuration to modify
value - the new value
-
getIsJavaRecordReader
Check whether the job is using a Java RecordReader.
Parameters:
conf - the configuration to check
Returns:
is it a Java RecordReader?
-
setIsJavaMapper
Set whether the Mapper is written in Java.
Parameters:
conf - the configuration to modify
value - the new value
-
getIsJavaMapper
Check whether the job is using a Java Mapper.
Parameters:
conf - the configuration to check
Returns:
is it a Java Mapper?
-
setIsJavaReducer
Set whether the Reducer is written in Java.
Parameters:
conf - the configuration to modify
value - the new value
-
getIsJavaReducer
Check whether the job is using a Java Reducer.
Parameters:
conf - the configuration to check
Returns:
is it a Java Reducer?
-
setIsJavaRecordWriter
Set whether the job will use a Java RecordWriter.
Parameters:
conf - the configuration to modify
value - the new value to set
-
getIsJavaRecordWriter
Will the reduce use a Java RecordWriter?
Parameters:
conf - the configuration to check
Returns:
true, if the output of the job will be written by Java
-
getKeepCommandFile
Does the user want to keep the command file for debugging? If this is true, pipes will write a copy of the command data to a file in the task directory named "downlink.data", which may be used to run the C++ program under the debugger. You probably also want to set JobConf.setKeepFailedTaskFiles(true) to keep the entire directory from being deleted. To run using the data file, set the environment variable "mapreduce.pipes.commandfile" to point to the file.
Parameters:
conf - the configuration to check
Returns:
will the framework save the command file?
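Enabling the debugging setup described above takes two calls on the job configuration; a minimal sketch:

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.pipes.Submitter;

public class DebugConfig {
  public static void main(String[] args) {
    JobConf conf = new JobConf();
    // Ask pipes to write the "downlink.data" command file in the task directory.
    Submitter.setKeepCommandFile(conf, true);
    // Also keep failed task directories so the command file survives cleanup.
    conf.setKeepFailedTaskFiles(true);
  }
}
```

After a failure, point the "mapreduce.pipes.commandfile" environment variable at the saved file to replay the run under a debugger.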
-
setKeepCommandFile
Set whether to keep the command file for debugging.
Parameters:
conf - the configuration to modify
keep - the new value
-
submitJob
Deprecated. Use runJob(JobConf) instead.
Submit a job to the map/reduce cluster. All of the necessary modifications to the job to run under pipes are made to the configuration.
Parameters:
conf - the job to submit to the cluster (MODIFIED)
Throws:
IOException
-
runJob
Submit a job to the map/reduce cluster. All of the necessary modifications to the job to run under pipes are made to the configuration.
Parameters:
conf - the job to submit to the cluster (MODIFIED)
Throws:
IOException
-
jobSubmit
Submit a job to the Map-Reduce framework. This returns a handle to the RunningJob which can be used to track the running-job.
Parameters:
conf - the job configuration.
Returns:
a handle to the RunningJob which can be used to track the running-job.
Throws:
IOException
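Unlike runJob, jobSubmit returns the RunningJob handle without waiting for completion, so the caller can poll progress. A sketch (the executable path is an assumption):

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;
import org.apache.hadoop.mapred.pipes.Submitter;

public class PollingLauncher {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf();
    Submitter.setExecutable(conf, "hdfs:///apps/pipes/wordcount"); // assumed path
    RunningJob handle = Submitter.jobSubmit(conf); // returns without waiting
    while (!handle.isComplete()) {                 // poll the running job
      System.out.printf("map %.0f%% reduce %.0f%%%n",
          handle.mapProgress() * 100, handle.reduceProgress() * 100);
      Thread.sleep(5000);
    }
    System.out.println(handle.isSuccessful() ? "succeeded" : "failed");
  }
}
```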
-
run
Description copied from interface: Tool
Execute the command with the given arguments.
main
Submit a pipes job based on the command line arguments.
Parameters:
args -
Throws:
Exception
-