org.apache.hadoop.examples
Class MultiFileWordCount.MapClass

java.lang.Object
  extended by org.apache.hadoop.mapred.MapReduceBase
      extended by org.apache.hadoop.examples.MultiFileWordCount.MapClass
All Implemented Interfaces:
Closeable, JobConfigurable, Mapper<MultiFileWordCount.WordOffset,Text,Text,LongWritable>
Enclosing class:
MultiFileWordCount

public static class MultiFileWordCount.MapClass
extends MapReduceBase
implements Mapper<MultiFileWordCount.WordOffset,Text,Text,LongWritable>

This Mapper is similar to the one in WordCount.MapClass.
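
For orientation, the following is a minimal, hypothetical sketch of a comparable word-count mapper written against the same old mapred API. It uses LongWritable byte offsets as keys instead of MultiFileWordCount.WordOffset and tokenizes with StringTokenizer; the actual MultiFileWordCount source may differ.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Illustrative only: keys are plain LongWritable offsets here, not
    // MultiFileWordCount.WordOffset as in the class documented above.
    public class WordCountMapSketch extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, LongWritable> {

      private static final LongWritable ONE = new LongWritable(1);
      private final Text word = new Text();

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, LongWritable> output,
                      Reporter reporter) throws IOException {
        // Split the line into tokens and emit (token, 1) for each one.
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
          word.set(itr.nextToken());
          output.collect(word, ONE);
        }
      }
    }

The (word, 1) pairs collected here would be summed by a LongSumReducer-style reduce step, as in the standard word-count examples.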


Constructor Summary
MultiFileWordCount.MapClass()
           
 
Method Summary
 void map(MultiFileWordCount.WordOffset key, Text value, OutputCollector<Text,LongWritable> output, Reporter reporter)
          Maps a single input key/value pair into an intermediate key/value pair.
 
Methods inherited from class org.apache.hadoop.mapred.MapReduceBase
close, configure
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 
Methods inherited from interface org.apache.hadoop.mapred.JobConfigurable
configure
 
Methods inherited from interface java.io.Closeable
close
 

Constructor Detail

MultiFileWordCount.MapClass

public MultiFileWordCount.MapClass()

Method Detail

map

public void map(MultiFileWordCount.WordOffset key,
                Text value,
                OutputCollector<Text,LongWritable> output,
                Reporter reporter)
         throws IOException
Description copied from interface: Mapper
Maps a single input key/value pair into an intermediate key/value pair.

Output pairs need not be of the same types as input pairs. A given input pair may map to zero or many output pairs. Output pairs are collected with calls to OutputCollector.collect(Object,Object).

Applications can use the Reporter provided to report progress or just to indicate that they are alive. In scenarios where the application takes a significant amount of time to process individual key/value pairs, this is crucial, since otherwise the framework might assume that the task has timed out and kill it. The other way to avoid this is to set mapred.task.timeout to a high-enough value (or even to zero for no time-outs).
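
To illustrate the two options above, here is a hedged sketch (the class name and the per-token work are hypothetical, not part of Hadoop) of a mapper that calls Reporter.progress() during slow per-record processing, together with the mapred.task.timeout alternative:

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class SlowMapperSketch extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, LongWritable> {

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, LongWritable> output,
                      Reporter reporter) throws IOException {
        for (String token : value.toString().split("\\s+")) {
          if (token.isEmpty()) {
            continue;
          }
          // ... imagine expensive per-token work here ...
          output.collect(new Text(token), new LongWritable(1));
          reporter.progress(); // tell the framework the task is still alive
        }
      }

      // Alternative to calling Reporter: raise or disable the task time-out.
      public static void configureTimeout(JobConf conf) {
        conf.setLong("mapred.task.timeout", 0L); // 0 means no time-out
      }
    }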

Specified by:
map in interface Mapper<MultiFileWordCount.WordOffset,Text,Text,LongWritable>
Parameters:
key - the input key.
value - the input value.
output - collects mapped keys and values.
reporter - facility to report progress.
Throws:
IOException


Copyright © 2009 The Apache Software Foundation