@InterfaceAudience.LimitedPrivate(value="MapReduce")
@InterfaceStability.Unstable
public class MapTask extends Task
| Modifier and Type | Class and Description |
|---|---|
| static class | MapTask.MapOutputBuffer<K,V> |

Nested classes/interfaces inherited from class Task: Task.CombineOutputCollector<K,V>, Task.CombinerRunner<K,V>, Task.CombineValuesIterator<KEY,VALUE>, Task.Counter, Task.NewCombinerRunner<K,V>, Task.OldCombinerRunner<K,V>, Task.TaskReporter

| Modifier and Type | Field and Description |
|---|---|
| static int | MAP_OUTPUT_INDEX_RECORD_LENGTH: The size of each record in the index file for the map-outputs. |

Fields inherited from class Task: committer, conf, DEFAULT_COMBINE_RECORDS_BEFORE_PROGRESS, extraData, failedShuffleCounter, FILESYSTEM_COUNTER_GROUP, gcUpdater, jobCleanup, jobContext, jobRunStateForCleanup, jobSetup, lDirAlloc, mapOutputFile, MERGED_OUTPUT_PREFIX, mergedMapOutputsCounter, outputFormat, shuffleSecret, spilledRecordsCounter, taskCleanup, taskContext, tokenSecret, umbilical

| Constructor and Description |
|---|
| MapTask() |
| MapTask(String jobFile, TaskAttemptID taskId, int partition, JobSplit.TaskSplitIndex splitIndex, int numSlotsRequired) |
| Modifier and Type | Method and Description |
|---|---|
| org.apache.hadoop.util.Progress | getSortPhase() |
| boolean | isMapTask() |
| void | localizeConfiguration(JobConf conf): Localize the given JobConf to be specific for this task. |
| void | readFields(DataInput in) |
| void | run(JobConf job, TaskUmbilicalProtocol umbilical): Run this task as a part of the named job. |
| void | write(DataOutput out) |
Methods inherited from class Task: createReduceContext, done, getConf, getEncryptedSpillKey, getFileSystemCounterNames, getFsStatistics, getJobFile, getJobID, getJobTokenSecret, getMapOutputFile, getNumSlotsRequired, getPartition, getPhase, getProgress, getShuffleSecret, getSkipRanges, getTaskID, initialize, isSkipping, keepTaskFiles, normalizeStatus, reportFatalError, reportNextRecordRange, runJobCleanupTask, runJobSetupTask, runTaskCleanupTask, setConf, setEncryptedSpillKey, setJobFile, setJobTokenSecret, setPhase, setShuffleSecret, setSkipping, setSkipRanges, setWriteSkipRecs, statusUpdate, toString, toWriteSkipRecs

public static final int MAP_OUTPUT_INDEX_RECORD_LENGTH
The size of each record in the index file for the map-outputs.
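MAP_OUTPUT_INDEX_RECORD_LENGTH is the fixed width of one entry in a map-output index file. A minimal sketch of how such a constant can be used to locate a partition's entry, assuming the index file is a flat sequence of one fixed-size record per reduce partition (the partition number below is hypothetical):

```java
import org.apache.hadoop.mapred.MapTask;

public class IndexRecordOffset {
  public static void main(String[] args) {
    int reducePartition = 3; // hypothetical reduce partition number
    // Byte offset of that partition's record, assuming records are laid out back to back.
    long offset = (long) reducePartition * MapTask.MAP_OUTPUT_INDEX_RECORD_LENGTH;
    System.out.println("index record for partition " + reducePartition
        + " starts at byte " + offset);
  }
}
```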
public MapTask()
public MapTask(String jobFile, TaskAttemptID taskId, int partition, JobSplit.TaskSplitIndex splitIndex, int numSlotsRequired)
public void localizeConfiguration(JobConf conf) throws IOException
Localize the given JobConf to be specific for this task.
Overrides: localizeConfiguration in class Task
Throws: IOException
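localizeConfiguration(JobConf) stamps the given JobConf with settings specific to this task attempt. A minimal caller sketch, assuming the task was fully constructed by the framework; the helper class and the configuration key "mapreduce.task.attempt.id" are assumptions for illustration, not part of this API:

```java
import java.io.IOException;

import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapTask;

public final class TaskConfLocalizer {
  private TaskConfLocalizer() {}

  /**
   * Hypothetical helper: localize the JobConf for the given map task and
   * return the task attempt id it recorded. The key name
   * "mapreduce.task.attempt.id" is assumed from the standard MRJobConfig constants.
   */
  public static String localize(MapTask task, JobConf conf) throws IOException {
    task.localizeConfiguration(conf);
    return conf.get("mapreduce.task.attempt.id");
  }
}
```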
public void write(DataOutput out) throws IOException
Specified by: write in interface org.apache.hadoop.io.Writable
Overrides: write in class Task
Throws: IOException

public void readFields(DataInput in) throws IOException
Specified by: readFields in interface org.apache.hadoop.io.Writable
Overrides: readFields in class Task
Throws: IOException
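write(DataOutput) and readFields(DataInput) follow the standard org.apache.hadoop.io.Writable contract, so a task can be copied through a byte stream. A minimal round-trip sketch, assuming a task built with the full constructor carries enough internal state to serialize; the job file path and identifiers are hypothetical, and this is illustrative rather than a supported usage:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

import org.apache.hadoop.mapred.MapTask;
import org.apache.hadoop.mapred.TaskAttemptID;
import org.apache.hadoop.mapreduce.split.JobSplit;

public class MapTaskRoundTrip {
  public static void main(String[] args) throws IOException {
    // Hypothetical task identity; in real use the framework supplies these values.
    MapTask original = new MapTask("file:///tmp/job.xml", new TaskAttemptID(),
        0, new JobSplit.TaskSplitIndex(), 1);

    // Serialize via the Writable write(DataOutput) contract.
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    original.write(new DataOutputStream(buffer));

    // Deserialize into a freshly constructed task via readFields(DataInput).
    MapTask copy = new MapTask();
    copy.readFields(new DataInputStream(
        new ByteArrayInputStream(buffer.toByteArray())));

    System.out.println("restored task id: " + copy.getTaskID());
  }
}
```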
public void run(JobConf job, TaskUmbilicalProtocol umbilical) throws IOException, ClassNotFoundException, InterruptedException
Run this task as a part of the named job.
Specified by: run in class Task
Parameters: umbilical - for progress reports
Throws: IOException, ClassNotFoundException, InterruptedException

public org.apache.hadoop.util.Progress getSortPhase()
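run(JobConf, TaskUmbilicalProtocol) is normally invoked by the framework's child process, with the umbilical serving as the progress-reporting channel back to the parent. getSortPhase() returns the Progress node covering the task's sort phase. A minimal monitoring sketch, assuming the Progress object may not exist until run() has set its phases up (the helper class is hypothetical):

```java
import org.apache.hadoop.mapred.MapTask;
import org.apache.hadoop.util.Progress;

public final class SortPhaseMonitor {
  private SortPhaseMonitor() {}

  /**
   * Hypothetical helper: report the sort-phase progress of a map task as a
   * fraction in [0, 1], or -1 if the sort phase has not been set up yet.
   */
  public static float sortProgress(MapTask task) {
    Progress sortPhase = task.getSortPhase();
    return (sortPhase == null) ? -1.0f : sortPhase.get();
  }
}
```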