Uses of Class
org.apache.hadoop.mapred.JobConf

Packages that use JobConf
org.apache.hadoop.contrib.utils.join   
org.apache.hadoop.examples Hadoop example code. 
org.apache.hadoop.io Generic i/o code for use when reading and writing data to the network, to databases, and to files. 
org.apache.hadoop.mapred A system for scalable, fault-tolerant, distributed computation over large data collections. 
org.apache.hadoop.mapred.jobcontrol Utilities for managing dependent jobs. 
org.apache.hadoop.mapred.lib Library of generally useful mappers, reducers, and partitioners. 
org.apache.hadoop.mapred.lib.aggregate Classes for performing various counting and aggregations. 
org.apache.hadoop.streaming   
org.apache.hadoop.tools   
org.apache.hadoop.util Common utilities. 
 

Uses of JobConf in org.apache.hadoop.contrib.utils.join
 

Fields in org.apache.hadoop.contrib.utils.join declared as JobConf
protected  JobConf DataJoinMapperBase.job
           
protected  JobConf DataJoinReducerBase.job
           
 

Methods in org.apache.hadoop.contrib.utils.join that return JobConf
static JobConf DataJoinJob.createDataJoinJob(String[] args)
           
 

Methods in org.apache.hadoop.contrib.utils.join with parameters of type JobConf
 void DataJoinMapperBase.configure(JobConf job)
           
 void DataJoinReducerBase.configure(JobConf job)
           
 void JobBase.configure(JobConf job)
          Initializes a new instance from a JobConf.
static boolean DataJoinJob.runJob(JobConf job)
          Submit/run a map/reduce job.
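The pair createDataJoinJob/runJob above is the usual entry point for the data-join framework. A minimal driver sketch, assuming the Hadoop jars are on the classpath (the class name `DataJoinDriver` is illustrative, and the exact argument layout expected by createDataJoinJob is not shown here):

```java
import org.apache.hadoop.contrib.utils.join.DataJoinJob;
import org.apache.hadoop.mapred.JobConf;

// Sketch: createDataJoinJob parses the command-line arguments into a
// fully populated JobConf; runJob then submits the job and blocks
// until it completes, returning true on success.
public class DataJoinDriver {
  public static void main(String[] args) throws Exception {
    JobConf job = DataJoinJob.createDataJoinJob(args);
    boolean success = DataJoinJob.runJob(job);
    System.exit(success ? 0 : 1);
  }
}
```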
 

Uses of JobConf in org.apache.hadoop.examples
 

Methods in org.apache.hadoop.examples with parameters of type JobConf
 void PiEstimator.PiMapper.configure(JobConf job)
          Mapper configuration.
 void PiEstimator.PiReducer.configure(JobConf job)
          Reducer configuration.
 

Uses of JobConf in org.apache.hadoop.io
 

Methods in org.apache.hadoop.io with parameters of type JobConf
static Writable WritableUtils.clone(Writable orig, JobConf conf)
          Make a copy of a writable object using serialization to a buffer.
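A short sketch of how clone is typically used; this matters because record readers in this API often reuse their key/value objects, so keeping a value across calls requires a copy:

```java
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableUtils;
import org.apache.hadoop.mapred.JobConf;

// Sketch: deep-copy a Writable by serializing it to a buffer and
// deserializing it back, as WritableUtils.clone does internally.
public class CloneExample {
  public static void main(String[] args) {
    JobConf conf = new JobConf();
    Text original = new Text("hello");
    Writable copy = WritableUtils.clone(original, conf);
    // "copy" is now independent of "original"; mutating one does not
    // affect the other.
  }
}
```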
 

Uses of JobConf in org.apache.hadoop.mapred
 

Methods in org.apache.hadoop.mapred with parameters of type JobConf
 void OutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
          Check whether the output specification for a job is appropriate.
 void OutputFormatBase.checkOutputSpecs(FileSystem ignored, JobConf job)
           
 void MapRunner.configure(JobConf job)
           
 void TextInputFormat.configure(JobConf conf)
           
 void JobConfigurable.configure(JobConf job)
          Initializes a new instance from a JobConf.
 void MapReduceBase.configure(JobConf job)
          Default implementation that does nothing.
static boolean OutputFormatBase.getCompressOutput(JobConf conf)
          Is the reduce output compressed?
static Class OutputFormatBase.getOutputCompressorClass(JobConf conf, Class defaultValue)
          Get the codec used for compressing the reduce outputs.
 RecordReader SequenceFileInputFilter.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
          Create a record reader for the given split.
 RecordReader SequenceFileAsTextInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
           
 RecordReader TextInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
           
abstract  RecordReader FileInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
           
 RecordReader SequenceFileInputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
           
 RecordReader InputFormat.getRecordReader(InputSplit split, JobConf job, Reporter reporter)
          Construct a RecordReader for the given InputSplit.
 RecordReader KeyValueTextInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
           
 RecordWriter OutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
          Construct a RecordWriter with Progressable.
abstract  RecordWriter OutputFormatBase.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
           
 RecordWriter MapFileOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
           
 RecordWriter SequenceFileOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
           
 RecordWriter TextOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
           
 InputSplit[] FileInputFormat.getSplits(JobConf job, int numSplits)
          Splits files returned by FileInputFormat.listPaths(JobConf) when they're too big.
 InputSplit[] InputFormat.getSplits(JobConf job, int numSplits)
          Splits a set of input files.
static JobClient.TaskStatusFilter JobClient.getTaskOutputFilter(JobConf job)
          Get the task output filter from the JobConf.
protected  Path[] FileInputFormat.listPaths(JobConf job)
          List input directories.
protected  Path[] SequenceFileInputFormat.listPaths(JobConf job)
           
static void JobEndNotifier.localRunnerNotification(JobConf conf, JobStatus status)
           
static void JobEndNotifier.registerNotification(JobConf jobConf, JobStatus status)
           
static RunningJob JobClient.runJob(JobConf job)
          Utility that submits a job, then polls for progress until the job is complete.
static void OutputFormatBase.setCompressOutput(JobConf conf, boolean val)
          Set whether the output of the reduce is compressed.
static void OutputFormatBase.setOutputCompressorClass(JobConf conf, Class codecClass)
          Set the given class as the output compression codec.
static void JobClient.setTaskOutputFilter(JobConf job, JobClient.TaskStatusFilter newValue)
          Modify the JobConf to set the task output filter.
 RunningJob JobClient.submitJob(JobConf job)
          Submit a job to the MapReduce system.
 void FileInputFormat.validateInput(JobConf job)
           
 void InputFormat.validateInput(JobConf job)
          Are the input directories valid? This method is used to test the input directories when a job is submitted so that the framework can fail early with a useful error message when the input directory does not exist.
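The methods above come together in the classic old-API submission path. A sketch, assuming the Hadoop jars are available; `MyMapper`, `MyReducer`, and the `"in"`/`"out"` paths are placeholders:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.OutputFormatBase;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

// Sketch: configure a JobConf, then hand it to JobClient.runJob,
// which submits the job and polls for progress until it completes.
public class ExampleDriver {
  public static void main(String[] args) throws Exception {
    JobConf job = new JobConf();
    job.setInputFormat(TextInputFormat.class);
    job.setOutputFormat(TextOutputFormat.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    job.setMapperClass(MyMapper.class);     // placeholder mapper class
    job.setReducerClass(MyReducer.class);   // placeholder reducer class
    job.setInputPath(new Path("in"));       // placeholder input dir
    job.setOutputPath(new Path("out"));     // placeholder output dir
    OutputFormatBase.setCompressOutput(job, true); // compress reduce output
    JobClient.runJob(job);
  }
}
```

Note that before submission, InputFormat.validateInput lets the framework fail early with a useful message when an input directory does not exist.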
 

Constructors in org.apache.hadoop.mapred with parameters of type JobConf
FileSplit(Path file, long start, long length, JobConf conf)
          Constructs a split.
PhasedFileSystem(FileSystem fs, JobConf conf)
          Deprecated. This constructor wraps a FileSystem object in a PhasedFileSystem.
TaskTracker(JobConf conf)
          Start with the local machine name and the default JobTracker.
 

Uses of JobConf in org.apache.hadoop.mapred.jobcontrol
 

Methods in org.apache.hadoop.mapred.jobcontrol that return JobConf
 JobConf Job.getJobConf()
           
 

Methods in org.apache.hadoop.mapred.jobcontrol with parameters of type JobConf
 void Job.setJobConf(JobConf jobConf)
          Set the mapred job conf for this job.
 

Constructors in org.apache.hadoop.mapred.jobcontrol with parameters of type JobConf
Job(JobConf jobConf, ArrayList dependingJobs)
          Construct a job.
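The Job constructor's dependingJobs argument is what encodes inter-job dependencies. A sketch of chaining two jobs with JobControl, assuming `confA` and `confB` are already fully configured JobConfs:

```java
import java.util.ArrayList;
import org.apache.hadoop.mapred.jobcontrol.Job;
import org.apache.hadoop.mapred.jobcontrol.JobControl;

// Sketch: jobB lists jobA as a dependency, so JobControl will not
// submit jobB until jobA has completed successfully.
Job jobA = new Job(confA, new ArrayList());

ArrayList deps = new ArrayList();
deps.add(jobA);
Job jobB = new Job(confB, deps);

JobControl control = new JobControl("chain");
control.addJob(jobA);
control.addJob(jobB);
new Thread(control).start();  // JobControl implements Runnable
```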
 

Uses of JobConf in org.apache.hadoop.mapred.lib
 

Methods in org.apache.hadoop.mapred.lib with parameters of type JobConf
 void NullOutputFormat.checkOutputSpecs(FileSystem ignored, JobConf job)
           
 void HashPartitioner.configure(JobConf job)
           
 void FieldSelectionMapReduce.configure(JobConf job)
           
 void RegexMapper.configure(JobConf job)
           
 void KeyFieldBasedPartitioner.configure(JobConf job)
           
 void MultithreadedMapRunner.configure(JobConf job)
           
 RecordWriter NullOutputFormat.getRecordWriter(FileSystem ignored, JobConf job, String name, Progressable progress)
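All of the configure(JobConf) methods in this package follow the same JobConfigurable pattern: the framework calls configure once with the job's JobConf so the class can read its settings. A sketch using the pre-generics Mapper interface; the property name `"my.filter.prefix"` and the class name are made up for illustration:

```java
import java.io.IOException;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Sketch: pull a per-job parameter out of the JobConf in configure,
// then use it in map. MapReduceBase supplies a no-op configure that
// we override here.
public class PrefixFilterMapper extends MapReduceBase implements Mapper {
  private String prefix;

  public void configure(JobConf job) {
    prefix = job.get("my.filter.prefix", "");  // hypothetical property
  }

  public void map(WritableComparable key, Writable value,
                  OutputCollector output, Reporter reporter)
      throws IOException {
    if (value.toString().startsWith(prefix)) {
      output.collect(key, value);
    }
  }
}
```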
           
 

Uses of JobConf in org.apache.hadoop.mapred.lib.aggregate
 

Methods in org.apache.hadoop.mapred.lib.aggregate that return JobConf
static JobConf ValueAggregatorJob.createValueAggregatorJob(String[] args)
          Create an Abacus-based map/reduce job.
 

Methods in org.apache.hadoop.mapred.lib.aggregate with parameters of type JobConf
 void ValueAggregatorJobBase.configure(JobConf job)
           
 void ValueAggregatorDescriptor.configure(JobConf job)
          Configure the object.
 void UserDefinedValueAggregatorDescriptor.configure(JobConf job)
          Do nothing.
 void ValueAggregatorCombiner.configure(JobConf job)
          Does nothing; the combiner does not need to be configured.
 void ValueAggregatorBaseDescriptor.configure(JobConf job)
          Get the input file name.
static boolean ValueAggregatorJob.runJob(JobConf job)
          Submit/run a map/reduce job.
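The aggregate framework builds the whole JobConf from command-line arguments (inputs, output, aggregator descriptor classes) and then submits it. A minimal driver sketch, with an illustrative class name and the Hadoop jars assumed on the classpath:

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.aggregate.ValueAggregatorJob;

// Sketch: createValueAggregatorJob turns args into a configured
// JobConf; runJob submits it and blocks until completion.
public class AggregateDriver {
  public static void main(String[] args) throws Exception {
    JobConf job = ValueAggregatorJob.createValueAggregatorJob(args);
    boolean success = ValueAggregatorJob.runJob(job);
    System.exit(success ? 0 : 1);
  }
}
```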
 

Constructors in org.apache.hadoop.mapred.lib.aggregate with parameters of type JobConf
UserDefinedValueAggregatorDescriptor(String className, JobConf job)
           
 

Uses of JobConf in org.apache.hadoop.streaming
 

Fields in org.apache.hadoop.streaming declared as JobConf
protected  JobConf StreamJob.jobConf_
           
 

Methods in org.apache.hadoop.streaming with parameters of type JobConf
 void PipeMapRed.configure(JobConf job)
           
 void PipeMapper.configure(JobConf job)
           
static FileSplit StreamUtil.getCurrentSplit(JobConf job)
           
 RecordReader StreamInputFormat.getRecordReader(InputSplit genericSplit, JobConf job, Reporter reporter)
           
static org.apache.hadoop.streaming.StreamUtil.TaskId StreamUtil.getTaskInfo(JobConf job)
           
static boolean StreamUtil.isLocalJobTracker(JobConf job)
           
 void StreamBaseRecordReader.validateInput(JobConf job)
          This implementation performs no validation.
 

Constructors in org.apache.hadoop.streaming with parameters of type JobConf
StreamBaseRecordReader(FSDataInputStream in, FileSplit split, Reporter reporter, JobConf job, FileSystem fs)
           
StreamXmlRecordReader(FSDataInputStream in, FileSplit split, Reporter reporter, JobConf job, FileSystem fs)
           
 

Uses of JobConf in org.apache.hadoop.tools
 

Methods in org.apache.hadoop.tools with parameters of type JobConf
 void Logalyzer.LogRegexMapper.configure(JobConf job)
           
 

Uses of JobConf in org.apache.hadoop.util
 

Methods in org.apache.hadoop.util with parameters of type JobConf
abstract  void CopyFiles.CopyFilesMapper.cleanup(Configuration conf, JobConf jobConf, String srcPath, String destPath)
          Interface to clean up distcp-specific resources.
 void CopyFiles.FSCopyFilesMapper.cleanup(Configuration conf, JobConf jobConf, String srcPath, String destPath)
           
 void CopyFiles.HTTPCopyFilesMapper.cleanup(Configuration conf, JobConf jobConf, String srcPath, String destPath)
           
 void CopyFiles.FSCopyFilesMapper.configure(JobConf job)
          Mapper configuration.
 void CopyFiles.HTTPCopyFilesMapper.configure(JobConf job)
           
abstract  void CopyFiles.CopyFilesMapper.setup(Configuration conf, JobConf jobConf, String[] srcPaths, String destPath, boolean ignoreReadFailures)
          Interface to initialize distcp-specific map tasks.
 void CopyFiles.FSCopyFilesMapper.setup(Configuration conf, JobConf jobConf, String[] srcPaths, String destPath, boolean ignoreReadFailures)
          Initialize DFSCopyFileMapper-specific job configuration.
 void CopyFiles.HTTPCopyFilesMapper.setup(Configuration conf, JobConf jobConf, String[] srcPaths, String destPath, boolean ignoreReadFailures)
          Initialize HTTPCopyFileMapper-specific job configuration.
 



Copyright © 2006 The Apache Software Foundation