Class IdentityTableReducer
java.lang.Object
  org.apache.hadoop.mapreduce.Reducer<KEYIN,VALUEIN,KEYOUT,Mutation>
    org.apache.hadoop.hbase.mapreduce.TableReducer<org.apache.hadoop.io.Writable,Mutation,org.apache.hadoop.io.Writable>
      org.apache.hadoop.hbase.mapreduce.IdentityTableReducer
@Public
public class IdentityTableReducer
extends TableReducer<org.apache.hadoop.io.Writable,Mutation,org.apache.hadoop.io.Writable>
Convenience class that simply writes all values (which must be Put or Delete instances) passed to it out to the configured HBase table. This works in combination with TableOutputFormat, which actually does the writing to HBase.

Keys are passed along but ignored in TableOutputFormat. However, they can be used to control how your values will be divided up amongst the specified number of reducers.

You can also use the TableMapReduceUtil class to set up the two classes in one step:

TableMapReduceUtil.initTableReducerJob("table", IdentityTableReducer.class, job);

This will also set the proper TableOutputFormat, which is given the table parameter. The Put or Delete define the row and columns implicitly.
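For a fuller picture of that one-step setup, below is a minimal driver sketch that copies rows from one table into another via IdentityTableReducer. The table names "source" and "target", the CopyTableJob and CopyMapper class names, and the cell-by-cell copy are illustrative assumptions, not part of this class:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Mutation;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.IdentityTableReducer;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.mapreduce.Job;

public class CopyTableJob {

  // Hypothetical mapper: turns each scanned Result into a Put keyed by its row.
  public static class CopyMapper extends TableMapper<ImmutableBytesWritable, Mutation> {
    @Override
    protected void map(ImmutableBytesWritable row, Result result, Context context)
        throws IOException, InterruptedException {
      Put put = new Put(row.copyBytes());
      for (Cell cell : result.rawCells()) {
        put.add(cell);
      }
      context.write(row, put);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = Job.getInstance(conf, "identity-table-reducer-example");
    job.setJarByClass(CopyTableJob.class);

    // Scan the (assumed) source table and emit (row key, Put) pairs.
    TableMapReduceUtil.initTableMapperJob("source", new Scan(), CopyMapper.class,
        ImmutableBytesWritable.class, Mutation.class, job);

    // IdentityTableReducer forwards each Put unchanged; the TableOutputFormat
    // configured here writes it to the (assumed) target table.
    TableMapReduceUtil.initTableReducerJob("target", IdentityTableReducer.class, job);

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}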
Nested Class Summary
Nested classes/interfaces inherited from class org.apache.hadoop.mapreduce.Reducer
org.apache.hadoop.mapreduce.Reducer.Context
Field Summary

Fields
LOG

Constructor Summary

Constructors
IdentityTableReducer()

Method Summary

Modifier and Type    Method    Description
void    reduce(org.apache.hadoop.io.Writable key, Iterable<Mutation> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.Writable,Mutation,org.apache.hadoop.io.Writable,Mutation>.Context context)
        Writes each given record, consisting of the row key and the given values, to the configured OutputFormat.

Methods inherited from class org.apache.hadoop.mapreduce.Reducer
cleanup, run, setup
Field Details

LOG
Constructor Details

IdentityTableReducer
public IdentityTableReducer()
Method Details

reduce
public void reduce(org.apache.hadoop.io.Writable key,
                   Iterable<Mutation> values,
                   org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.Writable,Mutation,org.apache.hadoop.io.Writable,Mutation>.Context context)
            throws IOException,
                   InterruptedException

Writes each given record, consisting of the row key and the given values, to the configured OutputFormat. It emits the row key and each Put or Delete as a separate pair.

Overrides:
reduce in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.Writable,Mutation,org.apache.hadoop.io.Writable,Mutation>
Parameters:
key - The current row key.
values - The Put or Delete list for the given row.
context - The context of the reduce.
Throws:
IOException - When writing the record fails.
InterruptedException - When the job gets interrupted.
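Based on the description above, the identity reduce amounts to forwarding the row key with each Put or Delete it receives. A minimal sketch of an equivalent reducer (the PassThroughReducer name is illustrative; this is not the verbatim HBase source):

import java.io.IOException;

import org.apache.hadoop.hbase.client.Mutation;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.io.Writable;

public class PassThroughReducer extends TableReducer<Writable, Mutation, Writable> {
  @Override
  public void reduce(Writable key, Iterable<Mutation> values, Context context)
      throws IOException, InterruptedException {
    // Emit the row key and each Put or Delete as a separate (key, value) pair;
    // the configured TableOutputFormat performs the actual write to HBase.
    for (Mutation putOrDelete : values) {
      context.write(key, putOrDelete);
    }
  }
}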