Uses of Class
org.apache.hadoop.mapred.JobConf

Packages that use JobConf

  org.apache.hadoop.mapred
  org.apache.hadoop.mapred.jobcontrol
  org.apache.hadoop.mapred.join
  org.apache.hadoop.mapred.lib
  org.apache.hadoop.mapred.lib.aggregate
  org.apache.hadoop.mapred.lib.db
  org.apache.hadoop.mapred.nativetask
  org.apache.hadoop.mapred.pipes
  org.apache.hadoop.mapreduce.security
Uses of JobConf in org.apache.hadoop.mapred
Methods in org.apache.hadoop.mapred that return JobConf

  JobContext.getJobConf()
      Get the job Configuration.
  TaskAttemptContext.getJobConf()

Methods in org.apache.hadoop.mapred with parameters of type JobConf

  static void FileInputFormat.addInputPath(JobConf conf, Path path)
      Add a Path to the list of inputs for the map-reduce job.
  static void FileInputFormat.addInputPaths(JobConf conf, String commaSeparatedPaths)
      Add the given comma separated paths to the list of inputs for the map-reduce job.
  void FileOutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
  void OutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
      Check for validity of the output-specification for the job.
  void SequenceFileAsBinaryOutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
  void …
      Initializes a new instance from a JobConf.
  void …
      Default implementation that does nothing.
  static boolean FileOutputFormat.getCompressOutput(JobConf conf)
      Is the job output compressed?
  static PathFilter FileInputFormat.getInputPathFilter(JobConf conf)
      Get a PathFilter instance of the filter set for the input paths.
  static Path[] FileInputFormat.getInputPaths(JobConf conf)
      Get the list of input Paths for the map-reduce job.
  static SequenceFile.CompressionType SequenceFileOutputFormat.getOutputCompressionType(JobConf conf)
      Get the SequenceFile.CompressionType for the output SequenceFile.
  static Class<? extends CompressionCodec> FileOutputFormat.getOutputCompressorClass(JobConf conf, Class<? extends CompressionCodec> defaultValue)
      Get the CompressionCodec for compressing the job outputs.
  static Path FileOutputFormat.getOutputPath(JobConf conf)
      Get the Path to the output directory for the map-reduce job.
  static Path FileOutputFormat.getPathForCustomFile(JobConf conf, String name)
      Helper function to generate a Path for a file that is unique for the task within the job output directory.
  abstract RecordReader<K,V> FileInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
  FixedLengthInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
  InputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
      Get the RecordReader for the given InputSplit.
  KeyValueTextInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
  abstract RecordReader<K,V> MultiFileInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
  SequenceFileAsBinaryInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
  SequenceFileAsTextInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
  SequenceFileInputFilter.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
      Create a record reader for the given split.
  SequenceFileInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
  TextInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
  abstract RecordWriter<K,V> FileOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  MapFileOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  OutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
      Get the RecordWriter for the given job.
  SequenceFileAsBinaryOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  SequenceFileOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  TextOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  static Class<? extends WritableComparable> SequenceFileAsBinaryOutputFormat.getSequenceFileOutputKeyClass(JobConf conf)
      Get the key class for the SequenceFile.
  SequenceFileAsBinaryOutputFormat.getSequenceFileOutputValueClass(JobConf conf)
      Get the value class for the SequenceFile.
  …
      Splits files returned by FileInputFormat.listStatus(JobConf) when they're too big.
  …
      Logically split the set of input files for the job.
  static JobClient.TaskStatusFilter JobClient.getTaskOutputFilter(JobConf job)
      Get the task output filter out of the JobConf.
  static Path FileOutputFormat.getTaskOutputPath(JobConf conf, String name)
      Helper function to create the task's temporary output directory and return the path to the task's output file.
  static String FileOutputFormat.getUniqueName(JobConf conf, String name)
      Helper function to generate a name that is unique for the task.
  static Path FileOutputFormat.getWorkOutputPath(JobConf conf)
      Get the Path to the task's temporary output directory for the map-reduce job (see Tasks' Side-Effect Files).
  void …
      Connect to the default cluster.
  protected FileStatus[] FileInputFormat.listStatus(JobConf job)
      List input directories.
  protected FileStatus[] SequenceFileInputFormat.listStatus(JobConf job)
  boolean JobClient.monitorAndPrintJob(JobConf conf, RunningJob job)
      Monitor a job and print status in real-time as progress is made and tasks fail.
  static RunningJob …
      Utility that submits a job, then polls for progress until the job is complete.
  static void FileOutputFormat.setCompressOutput(JobConf conf, boolean compress)
      Set whether the output of the job is compressed.
  static void FileInputFormat.setInputPathFilter(JobConf conf, Class<? extends PathFilter> filter)
      Set a PathFilter to be applied to the input paths for the map-reduce job.
  static void FileInputFormat.setInputPaths(JobConf conf, String commaSeparatedPaths)
      Sets the given comma separated paths as the list of inputs for the map-reduce job.
  static void FileInputFormat.setInputPaths(JobConf conf, Path... inputPaths)
      Set the array of Paths as the list of inputs for the map-reduce job.
  static void SequenceFileOutputFormat.setOutputCompressionType(JobConf conf, SequenceFile.CompressionType style)
      Set the SequenceFile.CompressionType for the output SequenceFile.
  static void FileOutputFormat.setOutputCompressorClass(JobConf conf, Class<? extends CompressionCodec> codecClass)
      Set the CompressionCodec to be used to compress job outputs.
  static void FileOutputFormat.setOutputPath(JobConf conf, Path outputDir)
      Set the Path of the output directory for the map-reduce job.
  static void SequenceFileAsBinaryOutputFormat.setSequenceFileOutputKeyClass(JobConf conf, Class<?> theClass)
      Set the key class for the SequenceFile.
  static void SequenceFileAsBinaryOutputFormat.setSequenceFileOutputValueClass(JobConf conf, Class<?> theClass)
      Set the value class for the SequenceFile.
  static void SkipBadRecords.setSkipOutputPath(JobConf conf, Path path)
      Set the directory to which skipped records are written.
  static void JobClient.setTaskOutputFilter(JobConf job, JobClient.TaskStatusFilter newValue)
      Modify the JobConf to set the task output filter.
  static void …
      Set the Path of the task's temporary output directory for the map-reduce job.
  …
      Submit a job to the MR system.
  JobClient.submitJobInternal(JobConf conf)

Constructors in org.apache.hadoop.mapred with parameters of type JobConf
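The FileInputFormat and FileOutputFormat helpers listed above are normally called from a driver class before the job is submitted. The following is a minimal sketch of such a driver; the job name, the paths, and the driver class itself are assumptions for illustration, not taken from this page.

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

public class JobConfSetup {
  public static JobConf configureJob() {
    JobConf conf = new JobConf(JobConfSetup.class);
    conf.setJobName("wordcount"); // hypothetical job name

    // setInputPaths replaces the input list; addInputPath appends to it.
    FileInputFormat.setInputPaths(conf, new Path("/data/in"));
    FileInputFormat.addInputPath(conf, new Path("/data/extra"));

    // Single output directory for the job.
    FileOutputFormat.setOutputPath(conf, new Path("/data/out"));

    // Output compression toggles from the table above.
    FileOutputFormat.setCompressOutput(conf, true);
    FileOutputFormat.setOutputCompressorClass(conf, GzipCodec.class);

    conf.setInputFormat(TextInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);
    return conf;
    // JobClient.runJob(conf) would then submit and poll until completion.
  }
}
```

Submission itself goes through JobClient.runJob (blocking) or JobClient.submitJob (non-blocking), both of which take the configured JobConf.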
Uses of JobConf in org.apache.hadoop.mapred.jobcontrol
Methods in org.apache.hadoop.mapred.jobcontrol that return JobConf

Methods in org.apache.hadoop.mapred.jobcontrol with parameters of type JobConf

  void Job.setJobConf(JobConf jobConf)
      Set the mapred job conf for this job.

Constructors in org.apache.hadoop.mapred.jobcontrol with parameters of type JobConf
Uses of JobConf in org.apache.hadoop.mapred.join
Methods in org.apache.hadoop.mapred.join with parameters of type JobConf

  ComposableInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
  CompositeInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
      Construct a CompositeRecordReader for the children of this InputFormat as defined in the init expression.
  …
      Build a CompositeInputSplit from the child InputFormats by assigning the ith split from each child to the ith composite split.
  void …
      Interpret a given string as a composite expression.

Constructors in org.apache.hadoop.mapred.join with parameters of type JobConf

  JoinRecordReader(int id, JobConf conf, int capacity, Class<? extends WritableComparator> cmpcl)
  MultiFilterRecordReader(int id, JobConf conf, int capacity, Class<? extends WritableComparator> cmpcl)
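The composite expression mentioned above is usually built with CompositeInputFormat.compose and stored in the JobConf, where CompositeInputFormat later interprets it. A hedged sketch follows; the two input paths and the choice of an "inner" join are assumptions.

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.SequenceFileInputFormat;
import org.apache.hadoop.mapred.join.CompositeInputFormat;

public class JoinSetup {
  public static JobConf configureJoin() {
    JobConf conf = new JobConf();
    conf.setInputFormat(CompositeInputFormat.class);
    // compose(...) builds the join expression string that the
    // CompositeInputFormat parses when the job runs. Both inputs must be
    // identically partitioned and sorted for a map-side join to work.
    conf.set("mapred.join.expr",
        CompositeInputFormat.compose("inner", SequenceFileInputFormat.class,
            new Path("/data/left"), new Path("/data/right")));
    return conf;
  }
}
```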
Uses of JobConf in org.apache.hadoop.mapred.lib
Fields in org.apache.hadoop.mapred.lib declared as JobConf

Methods in org.apache.hadoop.mapred.lib that return JobConf

Methods in org.apache.hadoop.mapred.lib with parameters of type JobConf

  static void MultipleInputs.addInputPath(JobConf conf, Path path, Class<? extends InputFormat> inputFormatClass)
      Add a Path with a custom InputFormat to the list of inputs for the map-reduce job.
  static void MultipleInputs.addInputPath(JobConf conf, Path path, Class<? extends InputFormat> inputFormatClass, Class<? extends Mapper> mapperClass)
  static <K1,V1,K2,V2> void ChainMapper.addMapper(JobConf job, Class<? extends Mapper<K1,V1,K2,V2>> klass, Class<? extends K1> inputKeyClass, Class<? extends V1> inputValueClass, Class<? extends K2> outputKeyClass, Class<? extends V2> outputValueClass, boolean byValue, JobConf mapperConf)
      Adds a Mapper class to the chain job's JobConf.
  static <K1,V1,K2,V2> void ChainReducer.addMapper(JobConf job, Class<? extends Mapper<K1,V1,K2,V2>> klass, Class<? extends K1> inputKeyClass, Class<? extends V1> inputValueClass, Class<? extends K2> outputKeyClass, Class<? extends V2> outputValueClass, boolean byValue, JobConf mapperConf)
      Adds a Mapper class to the chain job's JobConf.
  static void MultipleOutputs.addMultiNamedOutput(JobConf conf, String namedOutput, Class<? extends OutputFormat> outputFormatClass, Class<?> keyClass, Class<?> valueClass)
      Adds a multi named output for the job.
  static void MultipleOutputs.addNamedOutput(JobConf conf, String namedOutput, Class<? extends OutputFormat> outputFormatClass, Class<?> keyClass, Class<?> valueClass)
      Adds a named output for the job.
  void FilterOutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
  void LazyOutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
  void NullOutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
  void …
      Configures the ChainMapper and all the Mappers in the chain.
  void …
      Configures the ChainReducer, the Reducer and all the Mappers in the chain.
  protected void CombineFileInputFormat.createPool(JobConf conf, List<PathFilter> filters)
      Deprecated.
  protected void CombineFileInputFormat.createPool(JobConf conf, PathFilter... filters)
      Deprecated.
  protected abstract RecordWriter<K,V> MultipleOutputFormat.getBaseRecordWriter(FileSystem fs, JobConf job, String name, Progressable arg3)
  protected RecordWriter<K,V> MultipleSequenceFileOutputFormat.getBaseRecordWriter(FileSystem fs, JobConf job, String name, Progressable arg3)
  protected RecordWriter<K,V> MultipleTextOutputFormat.getBaseRecordWriter(FileSystem fs, JobConf job, String name, Progressable arg3)
  static boolean MultipleOutputs.getCountersEnabled(JobConf conf)
      Returns whether the counters for the named outputs are enabled.
  protected String MultipleOutputFormat.getInputFileBasedOutputFileName(JobConf job, String name)
      Generate the outfile name based on a given name and the input file name.
  static Class<? extends OutputFormat> MultipleOutputs.getNamedOutputFormatClass(JobConf conf, String namedOutput)
      Returns the named output OutputFormat.
  static Class<?> MultipleOutputs.getNamedOutputKeyClass(JobConf conf, String namedOutput)
      Returns the key class for a named output.
  MultipleOutputs.getNamedOutputsList(JobConf conf)
      Returns the list of channel names.
  static Class<?> MultipleOutputs.getNamedOutputValueClass(JobConf conf, String namedOutput)
      Returns the value class for a named output.
  static String TotalOrderPartitioner.getPartitionFile(JobConf job)
      Deprecated.
  abstract RecordReader<K,V> CombineFileInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
      This is not implemented yet.
  CombineSequenceFileInputFormat.getRecordReader(InputSplit split, JobConf conf, Reporter reporter)
  CombineTextInputFormat.getRecordReader(InputSplit split, JobConf conf, Reporter reporter)
  NLineInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
  FilterOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  LazyOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  MultipleOutputFormat.getRecordWriter(FileSystem fs, JobConf job, String name, Progressable arg3)
      Create a composite record writer that can write key/value data to different output files.
  NullOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
  …
      Logically splits the set of input files for the job, splits N lines of the input as one split.
  static boolean MultipleOutputs.isMultiNamedOutput(JobConf conf, String namedOutput)
      Returns whether a named output is multiple.
  protected FileStatus[] CombineFileInputFormat.listStatus(JobConf job)
      List input directories.
  static void MultipleOutputs.setCountersEnabled(JobConf conf, boolean enabled)
      Enables or disables counters for the named outputs.
  static void LazyOutputFormat.setOutputFormatClass(JobConf job, Class<? extends OutputFormat> theClass)
      Set the underlying output format for LazyOutputFormat.
  static void TotalOrderPartitioner.setPartitionFile(JobConf job, Path p)
      Deprecated.
  static <K1,V1,K2,V2> void ChainReducer.setReducer(JobConf job, Class<? extends Reducer<K1,V1,K2,V2>> klass, Class<? extends K1> inputKeyClass, Class<? extends V1> inputValueClass, Class<? extends K2> outputKeyClass, Class<? extends V2> outputValueClass, boolean byValue, JobConf reducerConf)
      Sets the Reducer class to the chain job's JobConf.
  static <K,V> void InputSampler.writePartitionFile(JobConf job, org.apache.hadoop.mapred.lib.InputSampler.Sampler<K,V> sampler)

Constructors in org.apache.hadoop.mapred.lib with parameters of type JobConf

  CombineFileRecordReader(JobConf job, CombineFileSplit split, Reporter reporter, Class<RecordReader<K,V>> rrClass)
      A generic RecordReader that can hand out different recordReaders for each chunk in the CombineFileSplit.
  CombineFileSplit(JobConf job, Path[] files, long[] lengths)
  CombineFileSplit(JobConf job, Path[] files, long[] start, long[] lengths, String[] locations)
  InputSampler(JobConf conf)
  MultipleOutputs(JobConf job)
      Creates and initializes multiple named outputs support; it should be instantiated in the Mapper/Reducer configure method.
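The MultipleOutputs helpers above are declared on the JobConf at configuration time and consumed inside tasks. A hedged sketch of the configuration side; the channel name "summary" and the key/value types are assumptions for illustration.

```java
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.TextOutputFormat;
import org.apache.hadoop.mapred.lib.MultipleOutputs;

public class NamedOutputSetup {
  public static JobConf configureOutputs() {
    JobConf conf = new JobConf();
    // Declare a named output "summary" written alongside the job's
    // default output, with its own format and key/value classes.
    MultipleOutputs.addNamedOutput(conf, "summary",
        TextOutputFormat.class, Text.class, LongWritable.class);
    // Track per-channel record counts in job counters.
    MultipleOutputs.setCountersEnabled(conf, true);
    return conf;
  }
}
```

Inside a task, a MultipleOutputs instance is created from this JobConf in configure() (as the constructor note above says), records are emitted via getCollector("summary", reporter).collect(key, value), and the instance must be closed in the task's close().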
Uses of JobConf in org.apache.hadoop.mapred.lib.aggregate
Methods in org.apache.hadoop.mapred.lib.aggregate that return JobConf

  static JobConf ValueAggregatorJob.createValueAggregatorJob(String[] args)
      Create an Aggregate based map/reduce job.
  static JobConf ValueAggregatorJob.createValueAggregatorJob(String[] args, Class<?> caller)
      Create an Aggregate based map/reduce job.
  static JobConf ValueAggregatorJob.createValueAggregatorJob(String[] args, Class<? extends ValueAggregatorDescriptor>[] descriptors)
  static JobConf ValueAggregatorJob.createValueAggregatorJob(String[] args, Class<? extends ValueAggregatorDescriptor>[] descriptors, Class<?> caller)

Methods in org.apache.hadoop.mapred.lib.aggregate with parameters of type JobConf

  void …
      Do nothing.
  void …
      Get the input file name.
  void …
      Combiner does not need to configure.
  void …
      Configure the object.
  static void ValueAggregatorJob.setAggregatorDescriptors(JobConf job, Class<? extends ValueAggregatorDescriptor>[] descriptors)

Constructors in org.apache.hadoop.mapred.lib.aggregate with parameters of type JobConf

  UserDefinedValueAggregatorDescriptor(String className, JobConf job)
Uses of JobConf in org.apache.hadoop.mapred.lib.db
Methods in org.apache.hadoop.mapred.lib.db with parameters of type JobConf

  void DBOutputFormat.checkOutputSpecs(FileSystem filesystem, JobConf job)
      Check for validity of the output-specification for the job.
  void …
      Initializes a new instance from a JobConf.
  static void DBConfiguration.configureDB(JobConf job, String driverClass, String dbUrl)
      Sets the DB access related fields in the JobConf.
  static void DBConfiguration.configureDB(JobConf job, String driverClass, String dbUrl, String userName, String passwd)
      Sets the DB access related fields in the JobConf.
  DBInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
      Get the RecordReader for the given InputSplit.
  DBOutputFormat.getRecordWriter(FileSystem filesystem, JobConf job, String name, Progressable progress)
      Get the RecordWriter for the given job.
  …
      Logically split the set of input files for the job.
  static void DBInputFormat.setInput(JobConf job, Class<? extends DBWritable> inputClass, String inputQuery, String inputCountQuery)
      Initializes the map-part of the job with the appropriate input settings.
  static void DBInputFormat.setInput(JobConf job, Class<? extends DBWritable> inputClass, String tableName, String conditions, String orderBy, String... fieldNames)
      Initializes the map-part of the job with the appropriate input settings.
  static void …
      Initializes the reduce-part of the job with the appropriate output settings.
  static void …
      Initializes the reduce-part of the job with the appropriate output settings.
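DBConfiguration.configureDB and DBInputFormat.setInput above are the two calls needed to read map inputs from a JDBC table. A hedged sketch; the driver class, connection URL, credentials, table, and the EmployeeRecord type are all placeholders invented for this example.

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.db.DBConfiguration;
import org.apache.hadoop.mapred.lib.db.DBInputFormat;
import org.apache.hadoop.mapred.lib.db.DBWritable;

public class DbJobSetup {
  // Minimal hypothetical record mapped to an "employees" table.
  public static class EmployeeRecord implements Writable, DBWritable {
    long id;
    String name;

    public void readFields(DataInput in) throws IOException {
      id = in.readLong();
      name = in.readUTF();
    }
    public void write(DataOutput out) throws IOException {
      out.writeLong(id);
      out.writeUTF(name);
    }
    public void readFields(ResultSet rs) throws SQLException {
      id = rs.getLong(1);
      name = rs.getString(2);
    }
    public void write(PreparedStatement ps) throws SQLException {
      ps.setLong(1, id);
      ps.setString(2, name);
    }
  }

  public static JobConf configureDbInput() {
    JobConf job = new JobConf();
    // Driver class, URL, and credentials are placeholders.
    DBConfiguration.configureDB(job, "com.mysql.jdbc.Driver",
        "jdbc:mysql://dbhost/mydb", "user", "secret");
    // Read id and name from "employees", ordered by id, no WHERE conditions.
    DBInputFormat.setInput(job, EmployeeRecord.class,
        "employees", null, "id", "id", "name");
    return job;
  }
}
```

No database connection is opened at configuration time; these calls only record the settings in the JobConf for the tasks to use later.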
Uses of JobConf in org.apache.hadoop.mapred.nativetask
Methods in org.apache.hadoop.mapred.nativetask with parameters of type JobConf

  protected abstract boolean Platform.support(String keyClassName, INativeSerializer<?> serializer, JobConf job)
      Whether a platform supports a specific key. A supported key should satisfy at least two conditions: (1) the key belongs to the platform, and (2) the associated serializer must implement the INativeComparable interface.
Uses of JobConf in org.apache.hadoop.mapred.pipes
Methods in org.apache.hadoop.mapred.pipes with parameters of type JobConf

  static String Submitter.getExecutable(JobConf conf)
      Get the URI of the application's executable.
  static boolean Submitter.getIsJavaMapper(JobConf conf)
      Check whether the job is using a Java Mapper.
  static boolean Submitter.getIsJavaRecordReader(JobConf conf)
      Check whether the job is using a Java RecordReader.
  static boolean Submitter.getIsJavaRecordWriter(JobConf conf)
      Will the reduce use a Java RecordWriter?
  static boolean Submitter.getIsJavaReducer(JobConf conf)
      Check whether the job is using a Java Reducer.
  static boolean Submitter.getKeepCommandFile(JobConf conf)
      Does the user want to keep the command file for debugging?
  static RunningJob …
      Submit a job to the Map-Reduce framework.
  static RunningJob …
      Submit a job to the map/reduce cluster.
  static void Submitter.setExecutable(JobConf conf, String executable)
      Set the URI for the application's executable.
  static void Submitter.setIsJavaMapper(JobConf conf, boolean value)
      Set whether the Mapper is written in Java.
  static void Submitter.setIsJavaRecordReader(JobConf conf, boolean value)
      Set whether the job is using a Java RecordReader.
  static void Submitter.setIsJavaRecordWriter(JobConf conf, boolean value)
      Set whether the job will use a Java RecordWriter.
  static void Submitter.setIsJavaReducer(JobConf conf, boolean value)
      Set whether the Reducer is written in Java.
  static void Submitter.setKeepCommandFile(JobConf conf, boolean keep)
      Set whether to keep the command file for debugging.
  static RunningJob …
      Deprecated.
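The Submitter setters above configure a Pipes job where map and reduce run as a native executable while I/O stays in Java. A hedged sketch; the HDFS path of the executable is a placeholder.

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.pipes.Submitter;

public class PipesSetup {
  public static JobConf configurePipes() {
    JobConf conf = new JobConf();
    // Placeholder URI for a C++ binary already staged in HDFS.
    Submitter.setExecutable(conf, "hdfs:///apps/pipes/wordcount");
    // Keep the Java record reader/writer; only map/reduce run natively.
    Submitter.setIsJavaRecordReader(conf, true);
    Submitter.setIsJavaRecordWriter(conf, true);
    return conf;
    // Submitter.runJob(conf) would then launch the pipes job.
  }
}
```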
Uses of JobConf in org.apache.hadoop.mapreduce.security
Methods in org.apache.hadoop.mapreduce.security with parameters of type JobConf

  static Credentials …
      Deprecated. Use Credentials.readTokenStorageFile(org.apache.hadoop.fs.Path, org.apache.hadoop.conf.Configuration) instead; this method is included for compatibility against Hadoop-1.