Uses of Class
org.apache.hadoop.mapred.JobID

Packages that use JobID
org.apache.hadoop.mapred A software framework for easily writing applications which process vast amounts of data (multi-terabyte data-sets) in parallel on large clusters (thousands of nodes) built of commodity hardware in a reliable, fault-tolerant manner. 
org.apache.hadoop.mapred.jobcontrol Utilities for managing dependent jobs. 
org.apache.hadoop.streaming Hadoop Streaming is a utility which allows users to create and run Map-Reduce jobs with any executables (e.g. Unix shell utilities) as the mapper and/or the reducer. 
 

Uses of JobID in org.apache.hadoop.mapred
 

Methods in org.apache.hadoop.mapred that return JobID
static JobID JobID.downgrade(JobID old)
          Downgrade a new JobID to an old one.
static JobID JobID.forName(String str)
          Construct a JobID object from a given string.
 JobID RunningJob.getID()
          Get the job identifier.
 JobID JobStatus.getJobID()
           
 JobID Task.getJobID()
          Get the job ID for this task.
 JobID JobProfile.getJobID()
          Get the job id.
 JobID TaskAttemptID.getJobID()
           
 JobID JobInProgress.getJobID()
           
 JobID TaskID.getJobID()
           
 JobID JobTracker.getNewJobId()
          Allocates a new JobID.
 JobID LocalJobRunner.getNewJobId()
           
static JobID JobID.read(DataInput in)
          Deprecated. 
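The static factory methods above are the usual way to obtain a JobID in the old mapred API: JobID.forName(String) parses the printed form, and RunningJob.getID() reads the ID back from a submitted job. A minimal sketch, assuming the JobConf picks up a reachable cluster configuration; the class name and the job ID string are purely illustrative:

    import java.io.IOException;

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.mapred.RunningJob;

    public class JobIdLookup {
      public static void main(String[] args) throws IOException {
        // Parse a JobID from its string form (the ID used here is illustrative).
        JobID parsed = JobID.forName("job_200904291024_0001");
        System.out.println("jtIdentifier=" + parsed.getJtIdentifier()
            + " id=" + parsed.getId());

        // Look the job up on the cluster and read the ID back from RunningJob.
        JobClient client = new JobClient(new JobConf());
        RunningJob job = client.getJob(parsed);
        if (job != null) {
          System.out.println(job.getID() + " -> " + job.getTrackingURL());
        }
      }
    }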
 

Methods in org.apache.hadoop.mapred with parameters of type JobID
 void JobClient.displayTasks(JobID jobId, String type, String state)
          Display the information about a job's tasks, of a particular type and in a particular state.
 TaskReport[] JobClient.getCleanupTaskReports(JobID jobId)
          Get information about the current state of the cleanup tasks of a job.
 TaskReport[] JobTracker.getCleanupTaskReports(JobID jobid)
           
 TaskReport[] LocalJobRunner.getCleanupTaskReports(JobID id)
           
static String JobHistory.getHistoryFilePath(JobID jobId)
          Given the job id, return the history file path from the cache.
 RunningJob JobClient.getJob(JobID jobid)
          Get a RunningJob object to track an ongoing job.
 JobInProgress JobTracker.getJob(JobID jobid)
           
 Counters JobTracker.getJobCounters(JobID jobid)
           
 Counters LocalJobRunner.getJobCounters(JobID id)
           
static String JobHistory.JobInfo.getJobHistoryFileName(JobConf jobConf, JobID id)
          Recover the job history filename from the history folder.
 JobProfile JobTracker.getJobProfile(JobID jobid)
           
 JobProfile LocalJobRunner.getJobProfile(JobID id)
           
 JobStatus JobTracker.getJobStatus(JobID jobid)
           
 JobStatus LocalJobRunner.getJobStatus(JobID id)
           
static String JobTracker.getLocalJobFilePath(JobID jobId)
          Get the localized job file path on the job tracker's local file system.
static String JobHistory.JobInfo.getLocalJobFilePath(JobID jobId)
          Get the path of the locally stored job file.
 MapTaskCompletionEventsUpdate TaskUmbilicalProtocol.getMapCompletionEvents(JobID jobId, int fromIndex, int maxLocs, TaskAttemptID id, org.apache.hadoop.mapred.JvmContext jvmContext)
          Called by a reduce task to get the map output locations for finished maps.
 MapTaskCompletionEventsUpdate TaskTracker.getMapCompletionEvents(JobID jobId, int fromEventId, int maxLocs, TaskAttemptID id, org.apache.hadoop.mapred.JvmContext jvmContext)
           
 TaskReport[] JobClient.getMapTaskReports(JobID jobId)
          Get information about the current state of the map tasks of a job.
 TaskReport[] JobTracker.getMapTaskReports(JobID jobid)
           
 TaskReport[] LocalJobRunner.getMapTaskReports(JobID id)
           
 TaskReport[] JobClient.getReduceTaskReports(JobID jobId)
          Get information about the current state of the reduce tasks of a job.
 TaskReport[] JobTracker.getReduceTaskReports(JobID jobid)
           
 TaskReport[] LocalJobRunner.getReduceTaskReports(JobID id)
           
 TaskReport[] JobClient.getSetupTaskReports(JobID jobId)
          Get information about the current state of the setup tasks of a job.
 TaskReport[] JobTracker.getSetupTaskReports(JobID jobid)
           
 TaskReport[] LocalJobRunner.getSetupTaskReports(JobID id)
           
 TaskCompletionEvent[] JobTracker.getTaskCompletionEvents(JobID jobid, int fromEventId, int maxEvents)
           
 TaskCompletionEvent[] LocalJobRunner.getTaskCompletionEvents(JobID jobid, int fromEventId, int maxEvents)
           
 void Task.initialize(JobConf job, JobID id, Reporter reporter, boolean useNewApi)
           
 void JobTracker.killJob(JobID jobid)
           
 void LocalJobRunner.killJob(JobID id)
           
static void JobHistory.JobInfo.logFailed(JobID jobid, long timestamp, int finishedMaps, int finishedReduces, String failReason)
          Logs job failed event.
static void JobHistory.JobInfo.logFinished(JobID jobId, long finishTime, int finishedMaps, int finishedReduces, int failedMaps, int failedReduces, Counters mapCounters, Counters reduceCounters, Counters counters)
          Log job finished.
static void JobHistory.JobInfo.logInited(JobID jobId, long startTime, int totalMaps, int totalReduces)
          Logs the launch time of the job.
static void JobHistory.JobInfo.logJobInfo(JobID jobid, long submitTime, long launchTime)
           
static void JobHistory.JobInfo.logJobInfo(JobID jobid, long submitTime, long launchTime, int restartCount)
          Deprecated. Use JobHistory.JobInfo.logJobInfo(JobID, long, long) instead.
static void JobHistory.JobInfo.logJobPriority(JobID jobid, JobPriority priority)
          Log job's priority.
static void JobHistory.JobInfo.logKilled(JobID jobid, long timestamp, int finishedMaps, int finishedReduces)
          Logs job killed event.
static void JobHistory.JobInfo.logStarted(JobID jobId)
          Logs the job as running.
static void JobHistory.JobInfo.logStarted(JobID jobId, long startTime, int totalMaps, int totalReduces)
          Deprecated. Use JobHistory.JobInfo.logInited(JobID, long, int, int) and JobHistory.JobInfo.logStarted(JobID) instead.
static void JobHistory.JobInfo.logSubmitted(JobID jobId, JobConf jobConf, String jobConfPath, long submitTime)
          Deprecated. Use JobHistory.JobInfo.logSubmitted(JobID, JobConf, String, long, boolean) instead.
static void JobHistory.JobInfo.logSubmitted(JobID jobId, JobConf jobConf, String jobConfPath, long submitTime, boolean restarted)
           
 void JobTracker.setJobPriority(JobID jobid, String priority)
           
 void LocalJobRunner.setJobPriority(JobID id, String jp)
           
 JobStatus JobTracker.submitJob(JobID jobId, String jobSubmitDir, Credentials ts)
          JobTracker.submitJob() kicks off a new job.
 JobStatus LocalJobRunner.submitJob(JobID jobid, String jobSubmitDir, Credentials credentials)
           
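Most of the client-side methods above take the JobID of a job that has already been submitted. A minimal sketch, assuming a configured cluster and a job ID passed on the command line; the class name is illustrative, and it exercises the task-report and display calls on JobClient listed above:

    import java.io.IOException;

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.mapred.TaskReport;

    public class TaskReportDump {
      public static void main(String[] args) throws IOException {
        JobClient client = new JobClient(new JobConf());
        JobID jobId = JobID.forName(args[0]);   // e.g. "job_200904291024_0001"

        // Summarise map-side progress for the job.
        for (TaskReport report : client.getMapTaskReports(jobId)) {
          System.out.println(report.getTaskID() + " " + report.getProgress()
              + " " + report.getState());
        }

        // Print completed reduce tasks; the type and state arguments are plain strings.
        client.displayTasks(jobId, "reduce", "completed");
      }
    }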
 

Constructors in org.apache.hadoop.mapred with parameters of type JobID
JobInProgress(JobID jobid, JobConf conf, JobTracker tracker)
          Create an almost empty JobInProgress, which can be used only for tests.
JobStatus(JobID jobid, float setupProgress, float mapProgress, float reduceProgress, float cleanupProgress, int runState, JobPriority jp)
          Create a job status object for a given jobid.
JobStatus(JobID jobid, float mapProgress, float reduceProgress, float cleanupProgress, int runState)
          Create a job status object for a given jobid.
JobStatus(JobID jobid, float mapProgress, float reduceProgress, float cleanupProgress, int runState, JobPriority jp)
          Create a job status object for a given jobid.
JobStatus(JobID jobid, float mapProgress, float reduceProgress, int runState)
          Create a job status object for a given jobid.
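The constructors above pair a JobID with progress figures and a run state. A minimal sketch using the six-argument JobStatus constructor listed above; the jtIdentifier and sequence number are illustrative:

    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.mapred.JobPriority;
    import org.apache.hadoop.mapred.JobStatus;

    public class StatusSnapshot {
      public static void main(String[] args) {
        // jtIdentifier "200904291024", job number 1 -> "job_200904291024_0001"
        JobID jobId = new JobID("200904291024", 1);

        // JobID, mapProgress, reduceProgress, cleanupProgress, runState, priority
        JobStatus status = new JobStatus(jobId, 0.75f, 0.10f, 0.0f,
            JobStatus.RUNNING, JobPriority.NORMAL);

        System.out.println(status.getJobID() + " map=" + status.mapProgress()
            + " reduce=" + status.reduceProgress());
      }
    }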
 

Uses of JobID in org.apache.hadoop.mapred.jobcontrol
 

Methods in org.apache.hadoop.mapred.jobcontrol that return JobID
 JobID Job.getAssignedJobID()
           
 

Methods in org.apache.hadoop.mapred.jobcontrol with parameters of type JobID
 void Job.setAssignedJobID(JobID mapredJobID)
          Set the mapred ID for this job as assigned by the mapred framework.
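Job.getAssignedJobID() links a jobcontrol Job back to the JobID the mapred framework assigned when it actually launched the job; setAssignedJobID(JobID) is called by the framework itself, so user code normally only reads the value back. A minimal sketch, assuming a JobConf that has been fully configured elsewhere (input, output, mapper, and so on); the class and group names are illustrative:

    import java.util.ArrayList;

    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.mapred.jobcontrol.Job;
    import org.apache.hadoop.mapred.jobcontrol.JobControl;

    public class AssignedIdExample {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf();        // assumed to be fully configured elsewhere
        Job controlledJob = new Job(conf, new ArrayList<Job>());

        JobControl control = new JobControl("example-group");
        control.addJob(controlledJob);

        new Thread(control).start();         // submits jobs as their dependencies resolve
        while (!control.allFinished()) {
          Thread.sleep(1000);
        }
        control.stop();

        // The assigned ID stays null until the framework actually launches the job.
        JobID assigned = controlledJob.getAssignedJobID();
        System.out.println(controlledJob.getJobName() + " ran as " + assigned);
      }
    }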
 

Uses of JobID in org.apache.hadoop.streaming
 

Fields in org.apache.hadoop.streaming declared as JobID
protected  JobID StreamJob.jobId_
           
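StreamJob keeps the ID of the submitted streaming job in the protected jobId_ field, so it is only visible inside the class and its subclasses. A minimal, hypothetical subclass that exposes it through a getter:

    import org.apache.hadoop.mapred.JobID;
    import org.apache.hadoop.streaming.StreamJob;

    // Hypothetical subclass: jobId_ is protected, so a subclass may expose it
    // once the streaming job has been submitted through this instance.
    public class TrackedStreamJob extends StreamJob {
      public JobID getSubmittedJobID() {
        return jobId_;
      }
    }

Before the streaming job has been submitted through such an instance, the field is still null; afterwards the getter returns the JobID of the submitted job.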
 



Copyright © 2009 The Apache Software Foundation