Uses of Class
org.apache.hadoop.conf.Configuration

Packages that use Configuration
org.apache.hadoop.conf Configuration of system parameters. 
org.apache.hadoop.fs An abstract file system API. 
org.apache.hadoop.fs.ftp   
org.apache.hadoop.fs.kfs A client for the Kosmos filesystem (KFS). 
org.apache.hadoop.fs.permission   
org.apache.hadoop.fs.s3 A distributed, block-based implementation of FileSystem that uses Amazon S3 as a backing store. 
org.apache.hadoop.fs.s3native A distributed implementation of FileSystem for reading and writing files on Amazon S3. 
org.apache.hadoop.fs.viewfs   
org.apache.hadoop.ha   
org.apache.hadoop.io Generic i/o code for use when reading and writing data to the network, to databases, and to files. 
org.apache.hadoop.io.compress   
org.apache.hadoop.mapred   
org.apache.hadoop.mapred.join   
org.apache.hadoop.mapred.pipes   
org.apache.hadoop.mapreduce   
org.apache.hadoop.mapreduce.lib.aggregate   
org.apache.hadoop.mapreduce.lib.chain   
org.apache.hadoop.mapreduce.lib.db   
org.apache.hadoop.mapreduce.lib.input   
org.apache.hadoop.mapreduce.lib.jobcontrol   
org.apache.hadoop.mapreduce.lib.join   
org.apache.hadoop.mapreduce.lib.output   
org.apache.hadoop.mapreduce.lib.partition   
org.apache.hadoop.mapreduce.security   
org.apache.hadoop.mapreduce.tools   
org.apache.hadoop.mapreduce.v2.hs   
org.apache.hadoop.net Network-related classes. 
org.apache.hadoop.util Common utilities. 
org.apache.hadoop.yarn.applications.distributedshell   
org.apache.hadoop.yarn.client   
 

Uses of Configuration in org.apache.hadoop.conf
 

Methods in org.apache.hadoop.conf that return Configuration
 Configuration Configured.getConf()
           
 Configuration Configurable.getConf()
          Return the configuration used by this object.
 

Methods in org.apache.hadoop.conf with parameters of type Configuration
static void Configuration.dumpConfiguration(Configuration config, Writer out)
          Writes out all the parameters and their properties (final and resource) to the given Writer. The output format is { "properties" : [ {key1,value1,key1.isFinal,key1.resource}, {key2,value2,key2.isFinal,key2.resource}, ... ] }.
 void Configured.setConf(Configuration conf)
           
 void Configurable.setConf(Configuration conf)
          Set the configuration to be used by this object.
 

Constructors in org.apache.hadoop.conf with parameters of type Configuration
Configuration(Configuration other)
          A new configuration with the same settings cloned from another.
Configured(Configuration conf)
          Construct a Configured.
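The copy constructor and the Configurable contract listed above fit together: a Configured object hands back exactly the Configuration it was given, while new Configuration(other) produces an independent copy of the settings. A minimal sketch (the class and property names are illustrative, not part of Hadoop):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;

public class ConfDemo {
    // A trivial Configurable built on the Configured base class (illustrative).
    static class MyTool extends Configured {
        MyTool(Configuration conf) { super(conf); }
    }

    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.set("example.key", "example-value"); // hypothetical property

        // Clone the settings into an independent Configuration.
        Configuration copy = new Configuration(conf);
        System.out.println(copy.get("example.key")); // example-value

        // Configured.getConf() returns the configuration it was constructed with.
        MyTool tool = new MyTool(conf);
        System.out.println(tool.getConf() == conf);  // true
    }
}
```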
 

Uses of Configuration in org.apache.hadoop.fs
 

Methods in org.apache.hadoop.fs that return Configuration
 Configuration FilterFileSystem.getConf()
           
 

Methods in org.apache.hadoop.fs with parameters of type Configuration
static boolean FileUtil.copy(File src, FileSystem dstFS, Path dst, boolean deleteSource, Configuration conf)
          Copy local files to a FileSystem.
static boolean FileUtil.copy(FileSystem srcFS, Path[] srcs, FileSystem dstFS, Path dst, boolean deleteSource, boolean overwrite, Configuration conf)
           
static boolean FileUtil.copy(FileSystem srcFS, Path src, File dst, boolean deleteSource, Configuration conf)
          Copy FileSystem files to local files.
static boolean FileUtil.copy(FileSystem srcFS, Path src, FileSystem dstFS, Path dst, boolean deleteSource, boolean overwrite, Configuration conf)
          Copy files between FileSystems.
static boolean FileUtil.copy(FileSystem srcFS, Path src, FileSystem dstFS, Path dst, boolean deleteSource, Configuration conf)
          Copy files between FileSystems.
static boolean FileUtil.copyMerge(FileSystem srcFS, Path srcDir, FileSystem dstFS, Path dstFile, boolean deleteSource, Configuration conf, String addString)
          Copy all files in a directory to one output file (merge).
static AbstractFileSystem AbstractFileSystem.createFileSystem(URI uri, Configuration conf)
          Create a file system instance for the specified uri using the conf.
static FileSystem FileSystem.get(Configuration conf)
          Returns the configured filesystem implementation.
static AbstractFileSystem AbstractFileSystem.get(URI uri, Configuration conf)
          The main factory method for creating a file system.
static FileSystem FileSystem.get(URI uri, Configuration conf)
          Returns the FileSystem for this URI's scheme and authority.
static FileSystem FileSystem.get(URI uri, Configuration conf, String user)
          Get a filesystem instance based on the URI, the passed configuration, and the user.
static URI FileSystem.getDefaultUri(Configuration conf)
          Get the default filesystem URI from a configuration.
static FileContext FileContext.getFileContext(AbstractFileSystem defFS, Configuration aConf)
          Create a FileContext with specified FS as default using the specified config.
static FileContext FileContext.getFileContext(Configuration aConf)
          Create a FileContext using the passed config.
static FileContext FileContext.getFileContext(URI defaultFsUri, Configuration aConf)
          Create a FileContext for specified default URI using the specified config.
 FileSystem Path.getFileSystem(Configuration conf)
          Return the FileSystem that owns this Path.
static Class<? extends FileSystem> FileSystem.getFileSystemClass(String scheme, Configuration conf)
           
static TrashPolicy TrashPolicy.getInstance(Configuration conf, FileSystem fs, Path home)
          Get an instance of the configured TrashPolicy based on the value of the configuration parameter fs.trash.classname.
static LocalFileSystem FileSystem.getLocal(Configuration conf)
          Get the local file system.
static FileContext FileContext.getLocalFSFileContext(Configuration aConf)
           
static FileSystem FileSystem.getNamed(String name, Configuration conf)
          Deprecated. Call get(URI, Configuration) instead.
abstract  void TrashPolicy.initialize(Configuration conf, FileSystem fs, Path home)
          Used to setup the trash policy.
 void FilterFileSystem.initialize(URI name, Configuration conf)
          Called after a new FileSystem instance is constructed.
 void LocalFileSystem.initialize(URI name, Configuration conf)
           
 void RawLocalFileSystem.initialize(URI uri, Configuration conf)
           
 void FileSystem.initialize(URI name, Configuration conf)
          Called after a new FileSystem instance is constructed.
static boolean Trash.moveToAppropriateTrash(FileSystem fs, Path p, Configuration conf)
          In the case of symlinks or mount points, the file must be moved to the trash bin of the actual volume containing the path p being deleted.
static FileSystem FileSystem.newInstance(Configuration conf)
          Returns a unique configured filesystem implementation.
static FileSystem FileSystem.newInstance(URI uri, Configuration conf)
          Returns the FileSystem for this URI's scheme and authority.
static FileSystem FileSystem.newInstance(URI uri, Configuration conf, String user)
          Returns the FileSystem for this URI's scheme and authority and the passed user.
static LocalFileSystem FileSystem.newInstanceLocal(Configuration conf)
          Get a unique local file system object.
 void ChecksumFileSystem.setConf(Configuration conf)
           
static void FileSystem.setDefaultUri(Configuration conf, String uri)
          Set the default filesystem URI in a configuration.
static void FileSystem.setDefaultUri(Configuration conf, URI uri)
          Set the default filesystem URI in a configuration.
 

Constructors in org.apache.hadoop.fs with parameters of type Configuration
Trash(Configuration conf)
          Construct a trash can accessor.
Trash(FileSystem fs, Configuration conf)
          Construct a trash can accessor for the FileSystem provided.
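Most of the methods above resolve a FileSystem from a Configuration: setDefaultUri/getDefaultUri manage the default filesystem URI, FileSystem.get(conf) resolves it, and Path.getFileSystem(conf) does the same for a specific path. A minimal sketch using the local filesystem (the path is illustrative):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;

public class FsDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Point the default filesystem at the local file system for this sketch.
        FileSystem.setDefaultUri(conf, URI.create("file:///"));
        System.out.println(FileSystem.getDefaultUri(conf)); // file:///

        // FileSystem.get(conf) resolves the default URI's scheme.
        FileSystem fs = FileSystem.get(conf);

        // A Path can also resolve its own FileSystem from a Configuration.
        FileSystem same = new Path("file:///tmp").getFileSystem(conf);

        // getLocal returns the checksummed local filesystem implementation.
        LocalFileSystem local = FileSystem.getLocal(conf);
        System.out.println(local.getUri());
    }
}
```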
 

Uses of Configuration in org.apache.hadoop.fs.ftp
 

Methods in org.apache.hadoop.fs.ftp with parameters of type Configuration
 void FTPFileSystem.initialize(URI uri, Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.fs.kfs
 

Methods in org.apache.hadoop.fs.kfs with parameters of type Configuration
 void KosmosFileSystem.initialize(URI uri, Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.fs.permission
 

Methods in org.apache.hadoop.fs.permission with parameters of type Configuration
static FsPermission FsPermission.getUMask(Configuration conf)
          Get the user file creation mask (umask). The UMASK_LABEL config param holds a umask value that is either symbolic or octal.
static void FsPermission.setUMask(Configuration conf, FsPermission umask)
          Set the user file creation mask (umask).
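The umask pair above stores and retrieves the mask through the Configuration; a minimal sketch (the 022 value is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.permission.FsPermission;

public class UmaskDemo {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Record an octal umask in the configuration...
        FsPermission.setUMask(conf, new FsPermission((short) 022));

        // ...and read it back (falls back to the configured default if unset).
        FsPermission umask = FsPermission.getUMask(conf);
        System.out.println(umask);
    }
}
```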
 

Uses of Configuration in org.apache.hadoop.fs.s3
 

Methods in org.apache.hadoop.fs.s3 with parameters of type Configuration
 void S3FileSystem.initialize(URI uri, Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.fs.s3native
 

Methods in org.apache.hadoop.fs.s3native with parameters of type Configuration
 void NativeS3FileSystem.initialize(URI uri, Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.fs.viewfs
 

Methods in org.apache.hadoop.fs.viewfs with parameters of type Configuration
 void ViewFileSystem.initialize(URI theUri, Configuration conf)
          Called after a new FileSystem instance is constructed.
 

Constructors in org.apache.hadoop.fs.viewfs with parameters of type Configuration
ViewFileSystem(Configuration conf)
          Convenience constructor for apps to call directly.
ViewFs(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.ha
 

Methods in org.apache.hadoop.ha with parameters of type Configuration
 HAServiceProtocol HAServiceTarget.getProxy(Configuration conf, int timeoutMs)
           
 org.apache.hadoop.ha.ZKFCProtocol HAServiceTarget.getZKFCProxy(Configuration conf, int timeoutMs)
           
 

Uses of Configuration in org.apache.hadoop.io
 

Methods in org.apache.hadoop.io that return Configuration
 Configuration AbstractMapWritable.getConf()
           
 Configuration EnumSetWritable.getConf()
           
 Configuration ObjectWritable.getConf()
           
 Configuration GenericWritable.getConf()
           
 

Methods in org.apache.hadoop.io with parameters of type Configuration
static <T extends Writable> T WritableUtils.clone(T orig, Configuration conf)
          Make a copy of a writable object using serialization to a buffer.
static void IOUtils.copyBytes(InputStream in, OutputStream out, Configuration conf)
          Copies from one stream to another.
static void IOUtils.copyBytes(InputStream in, OutputStream out, Configuration conf, boolean close)
          Copies from one stream to another.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(Configuration conf, FSDataOutputStream out, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(Configuration conf, FSDataOutputStream out, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec, org.apache.hadoop.io.SequenceFile.Metadata metadata)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(Configuration conf, org.apache.hadoop.io.SequenceFile.Writer.Option... opts)
          Create a new Writer with the given options.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileContext fc, Configuration conf, Path name, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec, org.apache.hadoop.io.SequenceFile.Metadata metadata, EnumSet<CreateFlag> createFlag, org.apache.hadoop.fs.Options.CreateOpts... opts)
          Construct the preferred type of SequenceFile Writer.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, int bufferSize, short replication, long blockSize, boolean createParent, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec, org.apache.hadoop.io.SequenceFile.Metadata metadata)
          Deprecated. 
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, int bufferSize, short replication, long blockSize, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec, Progressable progress, org.apache.hadoop.io.SequenceFile.Metadata metadata)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec, Progressable progress)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, CompressionCodec codec, Progressable progress, org.apache.hadoop.io.SequenceFile.Metadata metadata)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static org.apache.hadoop.io.SequenceFile.Writer SequenceFile.createWriter(FileSystem fs, Configuration conf, Path name, Class keyClass, Class valClass, org.apache.hadoop.io.SequenceFile.CompressionType compressionType, Progressable progress)
          Deprecated. Use SequenceFile.createWriter(Configuration, Writer.Option...) instead.
static long MapFile.fix(FileSystem fs, Path dir, Class<? extends Writable> keyClass, Class<? extends Writable> valueClass, boolean dryrun, Configuration conf)
          This method attempts to fix a corrupt MapFile by re-creating its index.
static org.apache.hadoop.io.SequenceFile.CompressionType SequenceFile.getDefaultCompressionType(Configuration job)
          Get the compression type for the reduce outputs.
static <K> K DefaultStringifier.load(Configuration conf, String keyName, Class<K> itemClass)
          Restores the object from the configuration.
static <K> K[] DefaultStringifier.loadArray(Configuration conf, String keyName, Class<K> itemClass)
          Restores the array of objects from the configuration.
static Class<?> ObjectWritable.loadClass(Configuration conf, String className)
          Find and load the class with given name className by first finding it in the specified conf.
static Writable WritableFactories.newInstance(Class<? extends Writable> c, Configuration conf)
          Create a new instance of a class with a defined factory.
static Object ObjectWritable.readObject(DataInput in, Configuration conf)
          Read a Writable, String, primitive type, or an array of the preceding.
static Object ObjectWritable.readObject(DataInput in, ObjectWritable objectWritable, Configuration conf)
          Read a Writable, String, primitive type, or an array of the preceding.
 void AbstractMapWritable.setConf(Configuration conf)
           
 void EnumSetWritable.setConf(Configuration conf)
           
 void ObjectWritable.setConf(Configuration conf)
           
 void GenericWritable.setConf(Configuration conf)
           
static void SequenceFile.setDefaultCompressionType(Configuration job, org.apache.hadoop.io.SequenceFile.CompressionType val)
          Set the default compression type for sequence files.
static <K> void DefaultStringifier.store(Configuration conf, K item, String keyName)
          Stores the item in the configuration with the given keyName.
static <K> void DefaultStringifier.storeArray(Configuration conf, K[] items, String keyName)
          Stores the array of items in the configuration with the given keyName.
static void ObjectWritable.writeObject(DataOutput out, Object instance, Class declaredClass, Configuration conf)
          Write a Writable, String, primitive type, or an array of the preceding.
static void ObjectWritable.writeObject(DataOutput out, Object instance, Class declaredClass, Configuration conf, boolean allowCompactArrays)
          Write a Writable, String, primitive type, or an array of the preceding.
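DefaultStringifier.store and load, listed above, serialize an object into a configuration property and restore it elsewhere from the same Configuration. A minimal sketch using a Writable and the default serialization (the key name is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.DefaultStringifier;
import org.apache.hadoop.io.Text;

public class StringifierDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Serialize a Writable into a configuration property...
        DefaultStringifier.store(conf, new Text("payload"), "demo.key"); // hypothetical key

        // ...and restore it from the same Configuration.
        Text restored = DefaultStringifier.load(conf, "demo.key", Text.class);
        System.out.println(restored); // payload
    }
}
```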
 

Constructors in org.apache.hadoop.io with parameters of type Configuration
DefaultStringifier(Configuration conf, Class<T> c)
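Most SequenceFile.createWriter overloads above are deprecated in favor of the Writer.Option variant; a minimal sketch of the preferred form (the output path is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SeqWriteDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path file = new Path("/tmp/demo.seq"); // illustrative path

        // The non-deprecated factory: options replace the long parameter lists.
        SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                SequenceFile.Writer.file(file),
                SequenceFile.Writer.keyClass(IntWritable.class),
                SequenceFile.Writer.valueClass(Text.class));
        try {
            writer.append(new IntWritable(1), new Text("one"));
        } finally {
            writer.close();
        }
    }
}
```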
           
 

Uses of Configuration in org.apache.hadoop.io.compress
 

Methods in org.apache.hadoop.io.compress that return Configuration
 Configuration DefaultCodec.getConf()
           
 

Methods in org.apache.hadoop.io.compress with parameters of type Configuration
static List<Class<? extends CompressionCodec>> CompressionCodecFactory.getCodecClasses(Configuration conf)
          Get the list of codecs discovered via a Java ServiceLoader, or listed in the configuration.
static Compressor CodecPool.getCompressor(CompressionCodec codec, Configuration conf)
          Get a Compressor for the given CompressionCodec from the pool or a new one.
 void Compressor.reinit(Configuration conf)
          Prepare the compressor to be used in a new stream with settings defined in the given Configuration.
static void CompressionCodecFactory.setCodecClasses(Configuration conf, List<Class> classes)
          Sets a list of codec classes in the configuration.
 void DefaultCodec.setConf(Configuration conf)
           
 

Constructors in org.apache.hadoop.io.compress with parameters of type Configuration
CompressionCodecFactory(Configuration conf)
          Find the codecs specified in the config value io.compression.codecs and register them.
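The CompressionCodecFactory constructor and CodecPool.getCompressor listed above are typically used together: the factory resolves a codec, and the pool lends out a Compressor for it. A minimal sketch (the file name is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CodecPool;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;
import org.apache.hadoop.io.compress.Compressor;

public class CodecDemo {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // The factory reads io.compression.codecs (and the ServiceLoader list).
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);

        // Look up a codec by file suffix (illustrative file name).
        CompressionCodec codec = factory.getCodec(new Path("data.gz"));
        if (codec != null) {
            // Borrow a Compressor from the pool and return it when done.
            Compressor compressor = CodecPool.getCompressor(codec, conf);
            try {
                System.out.println(codec.getDefaultExtension());
            } finally {
                CodecPool.returnCompressor(compressor);
            }
        }
    }
}
```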
 

Uses of Configuration in org.apache.hadoop.mapred
 

Subclasses of Configuration in org.apache.hadoop.mapred
 class JobConf
          A map/reduce job configuration.
 

Fields in org.apache.hadoop.mapred declared as Configuration
protected  Configuration SequenceFileRecordReader.conf
           
 

Methods in org.apache.hadoop.mapred that return Configuration
 Configuration RunningJob.getConfiguration()
          Get the underlying job configuration.
 

Methods in org.apache.hadoop.mapred with parameters of type Configuration
static int SkipBadRecords.getAttemptsToStartSkipping(Configuration conf)
          Get the number of Task attempts AFTER which skip mode will be kicked off.
static boolean SkipBadRecords.getAutoIncrMapperProcCount(Configuration conf)
          Get the flag which if set to true, SkipBadRecords.COUNTER_MAP_PROCESSED_RECORDS is incremented by MapRunner after invoking the map function.
static boolean SkipBadRecords.getAutoIncrReducerProcCount(Configuration conf)
          Get the flag which if set to true, SkipBadRecords.COUNTER_REDUCE_PROCESSED_GROUPS is incremented by framework after invoking the reduce function.
static long SkipBadRecords.getMapperMaxSkipRecords(Configuration conf)
          Get the number of acceptable skip records surrounding the bad record PER bad record in mapper.
static org.apache.hadoop.io.SequenceFile.Reader[] SequenceFileOutputFormat.getReaders(Configuration conf, Path dir)
          Open the output generated by this format.
static org.apache.hadoop.io.MapFile.Reader[] MapFileOutputFormat.getReaders(FileSystem ignored, Path dir, Configuration conf)
          Open the output generated by this format.
static long SkipBadRecords.getReducerMaxSkipGroups(Configuration conf)
          Get the number of acceptable skip groups surrounding the bad group PER bad group in reducer.
static Path SkipBadRecords.getSkipOutputPath(Configuration conf)
          Get the directory to which skipped records are written.
static void SkipBadRecords.setAttemptsToStartSkipping(Configuration conf, int attemptsToStartSkipping)
          Set the number of Task attempts AFTER which skip mode will be kicked off.
static void SkipBadRecords.setAutoIncrMapperProcCount(Configuration conf, boolean autoIncr)
          Set the flag which if set to true, SkipBadRecords.COUNTER_MAP_PROCESSED_RECORDS is incremented by MapRunner after invoking the map function.
static void SkipBadRecords.setAutoIncrReducerProcCount(Configuration conf, boolean autoIncr)
          Set the flag which if set to true, SkipBadRecords.COUNTER_REDUCE_PROCESSED_GROUPS is incremented by framework after invoking the reduce function.
static void SequenceFileInputFilter.setFilterClass(Configuration conf, Class filterClass)
          Set the filter class.
static void SkipBadRecords.setMapperMaxSkipRecords(Configuration conf, long maxSkipRecs)
          Set the number of acceptable skip records surrounding the bad record PER bad record in mapper.
static void SkipBadRecords.setReducerMaxSkipGroups(Configuration conf, long maxSkipGrps)
          Set the number of acceptable skip groups surrounding the bad group PER bad group in reducer.
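The SkipBadRecords setters and getters above are plain round-trips through the Configuration; a minimal sketch (the values are illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapred.SkipBadRecords;

public class SkipDemo {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Kick off skip mode after 2 failed attempts, tolerating up to
        // 10 surrounding records per bad record in the mapper.
        SkipBadRecords.setAttemptsToStartSkipping(conf, 2);
        SkipBadRecords.setMapperMaxSkipRecords(conf, 10L);

        System.out.println(SkipBadRecords.getAttemptsToStartSkipping(conf)); // 2
        System.out.println(SkipBadRecords.getMapperMaxSkipRecords(conf));    // 10
    }
}
```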
 

Constructors in org.apache.hadoop.mapred with parameters of type Configuration
JobClient(Configuration conf)
          Build a job client with the given Configuration, and connect to the default cluster.
JobClient(InetSocketAddress jobTrackAddr, Configuration conf)
          Build a job client, connect to the indicated job tracker.
JobConf(Configuration conf)
          Construct a map/reduce job configuration.
JobConf(Configuration conf, Class exampleClass)
          Construct a map/reduce job configuration.
KeyValueLineRecordReader(Configuration job, FileSplit split)
           
SequenceFileAsTextRecordReader(Configuration conf, FileSplit split)
           
SequenceFileRecordReader(Configuration conf, FileSplit split)
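The JobConf(Configuration) constructor above wraps ordinary configuration properties in a map/reduce job configuration; a minimal sketch (the property and job name are illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapred.JobConf;

public class JobConfDemo {
    public static void main(String[] args) {
        // Seed ordinary Configuration properties...
        Configuration conf = new Configuration();
        conf.set("example.key", "example-value"); // hypothetical property

        // ...then wrap them in a JobConf, the mapred job configuration.
        JobConf job = new JobConf(conf);
        System.out.println(job.get("example.key")); // example-value
        job.setJobName("demo-job");
    }
}
```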
           
 

Uses of Configuration in org.apache.hadoop.mapred.join
 

Methods in org.apache.hadoop.mapred.join that return Configuration
 Configuration CompositeRecordReader.getConf()
          Return the configuration used by this object.
 

Methods in org.apache.hadoop.mapred.join with parameters of type Configuration
 void CompositeRecordReader.setConf(Configuration conf)
          Set the configuration to be used by this object.
 

Uses of Configuration in org.apache.hadoop.mapred.pipes
 

Constructors in org.apache.hadoop.mapred.pipes with parameters of type Configuration
Submitter(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.mapreduce
 

Methods in org.apache.hadoop.mapreduce that return Configuration
 Configuration JobContext.getConfiguration()
          Return the configuration for the job.
 

Methods in org.apache.hadoop.mapreduce with parameters of type Configuration
static int Job.getCompletionPollInterval(Configuration conf)
          The interval at which waitForCompletion() should check.
static Job Job.getInstance(Cluster ignored, Configuration conf)
          Deprecated. Use Job.getInstance(Configuration)
static Job Job.getInstance(Cluster cluster, JobStatus status, Configuration conf)
          Creates a new Job with no particular Cluster and given Configuration and JobStatus.
static Job Job.getInstance(Configuration conf)
          Creates a new Job with no particular Cluster and a given Configuration.
static Job Job.getInstance(Configuration conf, String jobName)
          Creates a new Job with no particular Cluster and a given jobName.
static Job Job.getInstance(JobStatus status, Configuration conf)
          Creates a new Job with no particular Cluster and given Configuration and JobStatus.
static int Job.getProgressPollInterval(Configuration conf)
          The interval at which monitorAndPrintJob() prints status.
static org.apache.hadoop.mapreduce.Job.TaskStatusFilter Job.getTaskOutputFilter(Configuration conf)
          Get the task output filter.
static void Job.setTaskOutputFilter(Configuration conf, org.apache.hadoop.mapreduce.Job.TaskStatusFilter newValue)
          Modify the Configuration to set the task output filter.
 

Constructors in org.apache.hadoop.mapreduce with parameters of type Configuration
Cluster(Configuration conf)
           
Cluster(InetSocketAddress jobTrackAddr, Configuration conf)
           
Job(Configuration conf)
          Deprecated. 
Job(Configuration conf, String jobName)
          Deprecated. 
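Since the Job(Configuration) constructors above are deprecated, new code goes through the Job.getInstance factories listed earlier in this section; a minimal sketch (the job name is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class JobDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Preferred over the deprecated Job(Configuration, String) constructor.
        Job job = Job.getInstance(conf, "demo-job");
        System.out.println(job.getJobName()); // demo-job
    }
}
```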
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.aggregate
 

Methods in org.apache.hadoop.mapreduce.lib.aggregate that return Configuration
static Configuration ValueAggregatorJob.setAggregatorDescriptors(Class<? extends ValueAggregatorDescriptor>[] descriptors)
           
 

Methods in org.apache.hadoop.mapreduce.lib.aggregate with parameters of type Configuration
 void ValueAggregatorDescriptor.configure(Configuration conf)
          Configure the object
 void ValueAggregatorBaseDescriptor.configure(Configuration conf)
          Get the input file name.
 void UserDefinedValueAggregatorDescriptor.configure(Configuration conf)
          Do nothing.
static Job ValueAggregatorJob.createValueAggregatorJob(Configuration conf, String[] args)
          Create an Aggregate based map/reduce job.
protected static ArrayList<ValueAggregatorDescriptor> ValueAggregatorJobBase.getAggregatorDescriptors(Configuration conf)
           
protected static ValueAggregatorDescriptor ValueAggregatorJobBase.getValueAggregatorDescriptor(String spec, Configuration conf)
           
static void ValueAggregatorJobBase.setup(Configuration job)
           
 

Constructors in org.apache.hadoop.mapreduce.lib.aggregate with parameters of type Configuration
UserDefinedValueAggregatorDescriptor(String className, Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.chain
 

Methods in org.apache.hadoop.mapreduce.lib.chain with parameters of type Configuration
static void ChainReducer.addMapper(Job job, Class<? extends Mapper> klass, Class<?> inputKeyClass, Class<?> inputValueClass, Class<?> outputKeyClass, Class<?> outputValueClass, Configuration mapperConf)
          Adds a Mapper class to the chain reducer.
static void ChainMapper.addMapper(Job job, Class<? extends Mapper> klass, Class<?> inputKeyClass, Class<?> inputValueClass, Class<?> outputKeyClass, Class<?> outputValueClass, Configuration mapperConf)
          Adds a Mapper class to the chain mapper.
static void ChainReducer.setReducer(Job job, Class<? extends Reducer> klass, Class<?> inputKeyClass, Class<?> inputValueClass, Class<?> outputKeyClass, Class<?> outputValueClass, Configuration reducerConf)
          Sets the Reducer class to the chain job.
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.db
 

Methods in org.apache.hadoop.mapreduce.lib.db that return Configuration
 Configuration DBInputFormat.getConf()
           
 Configuration DBConfiguration.getConf()
           
 

Methods in org.apache.hadoop.mapreduce.lib.db with parameters of type Configuration
static void DBConfiguration.configureDB(Configuration job, String driverClass, String dbUrl)
          Sets the DB access related fields in the JobConf.
static void DBConfiguration.configureDB(Configuration conf, String driverClass, String dbUrl, String userName, String passwd)
          Sets the DB access related fields in the Configuration.
protected  RecordReader<LongWritable,T> DBInputFormat.createDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Configuration conf)
           
protected  RecordReader<LongWritable,T> DataDrivenDBInputFormat.createDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Configuration conf)
           
protected  RecordReader<LongWritable,T> OracleDataDrivenDBInputFormat.createDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Configuration conf)
           
static void DataDrivenDBInputFormat.setBoundingQuery(Configuration conf, String query)
          Set the user-defined bounding query to use with a user-defined query.
 void DBInputFormat.setConf(Configuration conf)
          Set the configuration to be used by this object.
static void OracleDBRecordReader.setSessionTimeZone(Configuration conf, Connection conn)
          Set the session time zone.
 List<InputSplit> DBSplitter.split(Configuration conf, ResultSet results, String colName)
          Given a ResultSet containing one record (and already advanced to that record) with two columns (a low value, and a high value, both of the same type), determine a set of splits that span the given values.
 List<InputSplit> BooleanSplitter.split(Configuration conf, ResultSet results, String colName)
           
 List<InputSplit> BigDecimalSplitter.split(Configuration conf, ResultSet results, String colName)
           
 List<InputSplit> DateSplitter.split(Configuration conf, ResultSet results, String colName)
           
 List<InputSplit> IntegerSplitter.split(Configuration conf, ResultSet results, String colName)
           
 List<InputSplit> TextSplitter.split(Configuration conf, ResultSet results, String colName)
          This method needs to determine the splits between two user-provided strings.
 List<InputSplit> FloatSplitter.split(Configuration conf, ResultSet results, String colName)
           
 

Constructors in org.apache.hadoop.mapreduce.lib.db with parameters of type Configuration
DataDrivenDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Class<T> inputClass, Configuration conf, Connection conn, DBConfiguration dbConfig, String cond, String[] fields, String table, String dbProduct)
           
DBConfiguration(Configuration job)
           
DBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Class<T> inputClass, Configuration conf, Connection conn, DBConfiguration dbConfig, String cond, String[] fields, String table)
           
MySQLDataDrivenDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Class<T> inputClass, Configuration conf, Connection conn, DBConfiguration dbConfig, String cond, String[] fields, String table)
           
MySQLDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Class<T> inputClass, Configuration conf, Connection conn, DBConfiguration dbConfig, String cond, String[] fields, String table)
           
OracleDataDrivenDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Class<T> inputClass, Configuration conf, Connection conn, DBConfiguration dbConfig, String cond, String[] fields, String table)
           
OracleDBRecordReader(org.apache.hadoop.mapreduce.lib.db.DBInputFormat.DBInputSplit split, Class<T> inputClass, Configuration conf, Connection conn, DBConfiguration dbConfig, String cond, String[] fields, String table)
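DBConfiguration.configureDB, listed above, records the JDBC connection settings in the Configuration, and DBConfiguration reads the same properties back. A minimal sketch (the driver, URL, and credentials are illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;

public class DbConfDemo {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Record JDBC driver, URL, and credentials in the Configuration
        // (connection values here are illustrative).
        DBConfiguration.configureDB(conf,
                "com.mysql.jdbc.Driver",
                "jdbc:mysql://localhost/demo",
                "user", "secret");

        // DBConfiguration exposes the same properties to the DB input formats.
        DBConfiguration dbConf = new DBConfiguration(conf);
        System.out.println(dbConf.getConf().get(DBConfiguration.DRIVER_CLASS_PROPERTY));
    }
}
```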
           
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.input
 

Fields in org.apache.hadoop.mapreduce.lib.input declared as Configuration
protected  Configuration SequenceFileRecordReader.conf
           
 

Methods in org.apache.hadoop.mapreduce.lib.input with parameters of type Configuration
static List<FileSplit> NLineInputFormat.getSplitsForFile(FileStatus status, Configuration conf, int numLinesPerSplit)
           
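The static helper above can be used outside a full job to preview how a file would be carved into map tasks. A minimal sketch, assuming a Hadoop runtime on the classpath; the input path and line count are illustrative, not from this page:

```java
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.mapreduce.lib.input.NLineInputFormat;

public class NLineSplitSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // Hypothetical input file.
    FileStatus status = fs.getFileStatus(new Path("/data/input.txt"));
    // One split per 1000 input lines; each split becomes one map task.
    List<FileSplit> splits = NLineInputFormat.getSplitsForFile(status, conf, 1000);
    System.out.println("splits: " + splits.size());
  }
}
```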
 

Constructors in org.apache.hadoop.mapreduce.lib.input with parameters of type Configuration
KeyValueLineRecordReader(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.jobcontrol
 

Constructors in org.apache.hadoop.mapreduce.lib.jobcontrol with parameters of type Configuration
ControlledJob(Configuration conf)
          Construct a job.
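The ControlledJob(Configuration) constructor is typically paired with JobControl to express dependencies between jobs. A minimal sketch, assuming Hadoop on the classpath; the group name is arbitrary:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob;
import org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl;

public class JobControlSketch {
  public static void main(String[] args) throws Exception {
    ControlledJob first = new ControlledJob(new Configuration());
    ControlledJob second = new ControlledJob(new Configuration());
    second.addDependingJob(first); // 'second' waits for 'first' to succeed
    JobControl control = new JobControl("pipeline"); // group name is arbitrary
    control.addJob(first);
    control.addJob(second);
    // control.run() would then drive both jobs to completion in order.
  }
}
```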
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.join
 

Fields in org.apache.hadoop.mapreduce.lib.join declared as Configuration
protected  Configuration CompositeRecordReader.conf
           
 

Methods in org.apache.hadoop.mapreduce.lib.join that return Configuration
 Configuration CompositeRecordReader.getConf()
          Return the configuration used by this object.
 

Methods in org.apache.hadoop.mapreduce.lib.join with parameters of type Configuration
 void CompositeRecordReader.setConf(Configuration conf)
          Set the configuration to be used by this object.
 void CompositeInputFormat.setFormat(Configuration conf)
          Interpret a given string as a composite expression.
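The composite expression that setFormat(Configuration) interprets is usually built with the static CompositeInputFormat.compose helper. A sketch under stated assumptions: the configuration key "mapreduce.join.expr" is an assumption about the new-API property name, and the paths are illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.join.CompositeInputFormat;

public class CompositeJoinSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // compose(...) produces an expression of the form
    //   inner(tbl(<format>,"/data/a"),tbl(<format>,"/data/b"))
    String expr = CompositeInputFormat.compose("inner",
        KeyValueTextInputFormat.class, new Path("/data/a"), new Path("/data/b"));
    conf.set("mapreduce.join.expr", expr);        // assumed property name
    new CompositeInputFormat<>().setFormat(conf); // parses and validates the expression
  }
}
```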
 

Constructors in org.apache.hadoop.mapreduce.lib.join with parameters of type Configuration
JoinRecordReader(int id, Configuration conf, int capacity, Class<? extends WritableComparator> cmpcl)
           
MultiFilterRecordReader(int id, Configuration conf, int capacity, Class<? extends WritableComparator> cmpcl)
           
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.output
 

Methods in org.apache.hadoop.mapreduce.lib.output with parameters of type Configuration
static org.apache.hadoop.io.MapFile.Reader[] MapFileOutputFormat.getReaders(Path dir, Configuration conf)
          Open the output generated by this format.
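getReaders is commonly followed by MapFileOutputFormat.getEntry, which selects the right reader with the job's partitioner and seeks the key. A minimal sketch, assuming Hadoop on the classpath; the output directory and key are illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.MapFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.lib.output.MapFileOutputFormat;
import org.apache.hadoop.mapreduce.lib.partition.HashPartitioner;

public class MapFileLookupSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Hypothetical directory of MapFile output, one file per reducer.
    MapFile.Reader[] readers =
        MapFileOutputFormat.getReaders(new Path("/out/mapfiles"), conf);
    Text key = new Text("some-key");
    Text value = new Text();
    // Routes the lookup to the reader for key's partition, then seeks it.
    MapFileOutputFormat.getEntry(readers, new HashPartitioner<Text, Text>(), key, value);
    System.out.println(value);
  }
}
```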
 

Uses of Configuration in org.apache.hadoop.mapreduce.lib.partition
 

Methods in org.apache.hadoop.mapreduce.lib.partition that return Configuration
 Configuration BinaryPartitioner.getConf()
           
 Configuration KeyFieldBasedComparator.getConf()
           
 Configuration TotalOrderPartitioner.getConf()
           
 Configuration KeyFieldBasedPartitioner.getConf()
           
 

Methods in org.apache.hadoop.mapreduce.lib.partition with parameters of type Configuration
static String TotalOrderPartitioner.getPartitionFile(Configuration conf)
          Get the path to the SequenceFile storing the sorted partition keyset.
 void BinaryPartitioner.setConf(Configuration conf)
           
 void KeyFieldBasedComparator.setConf(Configuration conf)
           
 void TotalOrderPartitioner.setConf(Configuration conf)
          Read in the partition file and build indexing data structures.
 void KeyFieldBasedPartitioner.setConf(Configuration conf)
           
static void BinaryPartitioner.setLeftOffset(Configuration conf, int offset)
          Set the subarray to be used for partitioning to bytes[offset:] in Python syntax.
static void BinaryPartitioner.setOffsets(Configuration conf, int left, int right)
          Set the subarray to be used for partitioning to bytes[left:(right+1)] in Python syntax.
static void TotalOrderPartitioner.setPartitionFile(Configuration conf, Path p)
          Set the path to the SequenceFile storing the sorted partition keyset.
static void BinaryPartitioner.setRightOffset(Configuration conf, int offset)
          Set the subarray to be used for partitioning to bytes[:(offset+1)] in Python syntax.
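Putting the offset setters above together: a minimal sketch of restricting BinaryPartitioner to bytes[4:8] of each key, in the Python-slice sense the descriptions use. The offsets are illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.lib.partition.BinaryPartitioner;

public class BinaryPartitionerSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // bytes[left:(right+1)] => left=4, right=7 selects bytes[4:8].
    BinaryPartitioner.setOffsets(conf, 4, 7);
    BinaryPartitioner<?> p = new BinaryPartitioner<>();
    p.setConf(conf); // reads the configured offsets back out
  }
}
```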
 

Constructors in org.apache.hadoop.mapreduce.lib.partition with parameters of type Configuration
InputSampler(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.mapreduce.security
 

Methods in org.apache.hadoop.mapreduce.security with parameters of type Configuration
static void TokenCache.cleanUpTokenReferral(Configuration conf)
          Remove jobtoken referrals which don't make sense in the context of the task execution.
static void TokenCache.obtainTokensForNamenodes(org.apache.hadoop.security.Credentials credentials, Path[] ps, Configuration conf)
          Convenience method to obtain delegation tokens from namenodes corresponding to the paths passed.
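A minimal sketch of the delegation-token flow on a secure cluster: the job client fills a Credentials object with one token per distinct namenode so tasks can authenticate later. The input path is illustrative:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.security.TokenCache;
import org.apache.hadoop.security.Credentials;

public class TokenCacheSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Credentials creds = new Credentials();
    Path[] inputs = { new Path("/data/in") }; // hypothetical input path
    // Populates creds with a delegation token per distinct namenode.
    TokenCache.obtainTokensForNamenodes(creds, inputs, conf);
  }
}
```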
 

Uses of Configuration in org.apache.hadoop.mapreduce.tools
 

Constructors in org.apache.hadoop.mapreduce.tools with parameters of type Configuration
CLI(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.mapreduce.v2.hs
 

Methods in org.apache.hadoop.mapreduce.v2.hs with parameters of type Configuration
 void HistoryFileManager.init(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.net
 

Methods in org.apache.hadoop.net that return Configuration
 Configuration SocksSocketFactory.getConf()
           
 Configuration ScriptBasedMapping.getConf()
           
 Configuration AbstractDNSToSwitchMapping.getConf()
           
 Configuration TableMapping.getConf()
           
 

Methods in org.apache.hadoop.net with parameters of type Configuration
 void SocksSocketFactory.setConf(Configuration conf)
           
 void ScriptBasedMapping.setConf(Configuration conf)
          Set the configuration to be used by this object.
 void AbstractDNSToSwitchMapping.setConf(Configuration conf)
           
 void TableMapping.setConf(Configuration conf)
           
 

Constructors in org.apache.hadoop.net with parameters of type Configuration
AbstractDNSToSwitchMapping(Configuration conf)
          Create an instance, caching the configuration.
ScriptBasedMapping(Configuration conf)
          Create an instance from the given configuration.
 

Uses of Configuration in org.apache.hadoop.util
 

Methods in org.apache.hadoop.util with parameters of type Configuration
static <T> T ReflectionUtils.copy(Configuration conf, T src, T dst)
          Make a copy of the writable object using serialization to a buffer.
static <T> T ReflectionUtils.newInstance(Class<T> theClass, Configuration conf)
          Create an object for the given class and initialize it from conf.
static int ToolRunner.run(Configuration conf, Tool tool, String[] args)
          Runs the given Tool by Tool.run(String[]), after parsing with the given generic arguments.
static void ReflectionUtils.setConf(Object theObject, Configuration conf)
          Check and set 'configuration' if necessary.
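The two utilities above are the usual entry points for Configuration-aware code: newInstance constructs an object and injects the configuration if the class is Configurable, and ToolRunner.run parses generic options (such as -D key=value) into the configuration before invoking the tool. A minimal sketch; MyTool is a hypothetical Tool implementation:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.ReflectionUtils;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class UtilSketch {
  // Hypothetical Tool: extending Configured supplies getConf()/setConf().
  static class MyTool extends Configured implements Tool {
    public int run(String[] args) {
      System.out.println("fs.defaultFS = " + getConf().get("fs.defaultFS"));
      return 0;
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Calls the no-arg constructor, then setConf because MyTool is Configurable.
    MyTool tool = ReflectionUtils.newInstance(MyTool.class, conf);
    // Parses generic command-line options into conf, then calls tool.run(args).
    System.exit(ToolRunner.run(conf, tool, args));
  }
}
```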
 

Uses of Configuration in org.apache.hadoop.yarn.applications.distributedshell
 

Constructors in org.apache.hadoop.yarn.applications.distributedshell with parameters of type Configuration
Client(Configuration conf)
           
 

Uses of Configuration in org.apache.hadoop.yarn.client
 

Methods in org.apache.hadoop.yarn.client with parameters of type Configuration
 void YarnClientImpl.init(Configuration conf)
           
 



Copyright © 2012 Apache Software Foundation. All Rights Reserved.