Uses of Class
org.apache.hadoop.mapreduce.TaskAttemptID

Packages that use TaskAttemptID
org.apache.hadoop.mapred A software framework for easily writing applications which process vast amounts of data (multi-terabyte data-sets) in parallel on large clusters (thousands of nodes) built of commodity hardware in a reliable, fault-tolerant manner. 
org.apache.hadoop.mapreduce   
 

Uses of TaskAttemptID in org.apache.hadoop.mapred
 

Subclasses of TaskAttemptID in org.apache.hadoop.mapred
 class TaskAttemptID
          TaskAttemptID represents the immutable and unique identifier for a task attempt.
 

Methods in org.apache.hadoop.mapred with parameters of type TaskAttemptID
protected static <INKEY,INVALUE,OUTKEY,OUTVALUE> Reducer.Context Task.createReduceContext(Reducer<INKEY,INVALUE,OUTKEY,OUTVALUE> reducer, Configuration job, TaskAttemptID taskId, RawKeyValueIterator rIter, Counter inputKeyCounter, Counter inputValueCounter, RecordWriter<OUTKEY,OUTVALUE> output, OutputCommitter committer, StatusReporter reporter, RawComparator<INKEY> comparator, Class<INKEY> keyClass, Class<INVALUE> valueClass)

static TaskAttemptID TaskAttemptID.downgrade(TaskAttemptID old)
          Downgrade a new TaskAttemptID to an old one.
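
A minimal sketch of how downgrade can bridge code that still expects the old-API type; the class name and the ID components below are illustrative, not taken from this page:

import org.apache.hadoop.mapreduce.TaskAttemptID;

public class DowngradeSketch {
  public static void main(String[] args) {
    // New-API attempt ID: jobtracker start-time identifier, job number,
    // isMap flag, task number, attempt number (illustrative values).
    TaskAttemptID newId = new TaskAttemptID("200707121733", 3, true, 5, 0);

    // Convert to the old-API type for code that expects
    // org.apache.hadoop.mapred.TaskAttemptID.
    org.apache.hadoop.mapred.TaskAttemptID oldId =
        org.apache.hadoop.mapred.TaskAttemptID.downgrade(newId);

    System.out.println(oldId); // attempt_200707121733_0003_m_000005_0
  }
}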
 

Uses of TaskAttemptID in org.apache.hadoop.mapreduce
 

Methods in org.apache.hadoop.mapreduce that return TaskAttemptID
static TaskAttemptID TaskAttemptID.forName(String str)
          Construct a TaskAttemptID object from a given string.
 TaskAttemptID TaskAttemptContext.getTaskAttemptID()
          Get the unique name for this task attempt.
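
A brief usage sketch, assuming an attempt ID in the standard string form; the literal string, class name, and the accessors shown (which come from TaskAttemptID itself rather than this page) are illustrative:

import org.apache.hadoop.mapreduce.TaskAttemptID;

public class ForNameSketch {
  public static void main(String[] args) {
    // Parse the canonical string form back into a TaskAttemptID.
    TaskAttemptID id =
        TaskAttemptID.forName("attempt_200707121733_0003_m_000005_0");

    System.out.println(id.getJobID()); // job_200707121733_0003
    System.out.println(id.isMap());    // true (an "m" attempt)
    System.out.println(id.getId());    // 0 (the attempt number)
  }
}

Inside a running task, the same identifier is available from the context, e.g. context.getTaskAttemptID() within a Mapper or Reducer.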
 

Methods in org.apache.hadoop.mapreduce with parameters of type TaskAttemptID
 void Job.failTask(TaskAttemptID taskId)
          Fail the indicated task attempt.
 void Job.killTask(TaskAttemptID taskId)
          Kill the indicated task attempt.
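
A hedged sketch of killing (or failing) a specific attempt through the Job handle; it assumes a job that has already been submitted and an attempt that actually exists, and the ID string and class name are illustrative:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.TaskAttemptID;

public class KillAttemptSketch {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "example-job");
    // ... submit the job and obtain a running attempt's ID ...

    TaskAttemptID attempt =
        TaskAttemptID.forName("attempt_200707121733_0003_m_000005_0");

    // killTask stops the attempt without counting it against the task's
    // allowed failures; failTask marks the attempt as failed, which does.
    job.killTask(attempt);
    // job.failTask(attempt);
  }
}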
 

Constructors in org.apache.hadoop.mapreduce with parameters of type TaskAttemptID
MapContext(Configuration conf, TaskAttemptID taskid, RecordReader<KEYIN,VALUEIN> reader, RecordWriter<KEYOUT,VALUEOUT> writer, OutputCommitter committer, StatusReporter reporter, InputSplit split)
           
Mapper.Context(Configuration conf, TaskAttemptID taskid, RecordReader<KEYIN,VALUEIN> reader, RecordWriter<KEYOUT,VALUEOUT> writer, OutputCommitter committer, StatusReporter reporter, InputSplit split)
           
ReduceContext(Configuration conf, TaskAttemptID taskid, RawKeyValueIterator input, Counter inputKeyCounter, Counter inputValueCounter, RecordWriter<KEYOUT,VALUEOUT> output, OutputCommitter committer, StatusReporter reporter, RawComparator<KEYIN> comparator, Class<KEYIN> keyClass, Class<VALUEIN> valueClass)
           
Reducer.Context(Configuration conf, TaskAttemptID taskid, RawKeyValueIterator input, Counter inputKeyCounter, Counter inputValueCounter, RecordWriter<KEYOUT,VALUEOUT> output, OutputCommitter committer, StatusReporter reporter, RawComparator<KEYIN> comparator, Class<KEYIN> keyClass, Class<VALUEIN> valueClass)
           
TaskAttemptContext(Configuration conf, TaskAttemptID taskId)
           
TaskInputOutputContext(Configuration conf, TaskAttemptID taskid, RecordWriter<KEYOUT,VALUEOUT> output, OutputCommitter committer, StatusReporter reporter)
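
As a small usage sketch, the TaskAttemptContext(Configuration, TaskAttemptID) constructor listed above can build a context by hand, for example when driving an OutputFormat or OutputCommitter outside a running task; the ID string and class name are illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.TaskAttemptID;

public class ContextSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    TaskAttemptID taskId =
        TaskAttemptID.forName("attempt_200707121733_0003_r_000001_0");

    // Wrap the configuration and attempt ID in a TaskAttemptContext.
    TaskAttemptContext context = new TaskAttemptContext(conf, taskId);

    System.out.println(context.getTaskAttemptID());
  }
}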
           
 



Copyright © 2009 The Apache Software Foundation