Package org.apache.hadoop.hbase.test
Class IntegrationTestBigLinkedList.Verify.VerifyReducer
java.lang.Object
org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable>
org.apache.hadoop.hbase.test.IntegrationTestBigLinkedList.Verify.VerifyReducer
- Enclosing class:
- IntegrationTestBigLinkedList.Verify
public static class IntegrationTestBigLinkedList.Verify.VerifyReducer
extends org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable>
Per reducer, we output problem rows as byte arrays so they can be used as input for subsequent investigative mapreduce jobs. Each emitted value is prefaced by a flag saying what sort of emission it is; the flag is the Counts enum ordinal written as a short (Bytes.SIZEOF_SHORT bytes).
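For orientation, here is a minimal sketch (not taken from the class itself) of how a follow-up investigative job might decode one emitted value, assuming the flag is the Counts ordinal serialized as a short via org.apache.hadoop.hbase.util.Bytes, as addPrefixFlag(int, byte[]) below describes. EmittedValueDecoder is a hypothetical helper name.

import org.apache.hadoop.hbase.util.Bytes;

public class EmittedValueDecoder {
  // Split one emitted value into its flag ordinal and the problem row bytes.
  public static void decode(byte[] emitted) {
    // First Bytes.SIZEOF_SHORT bytes: which Counts category the row fell into.
    short ordinal = Bytes.toShort(emitted, 0);
    // Remaining bytes: the problem row itself.
    byte[] row = new byte[emitted.length - Bytes.SIZEOF_SHORT];
    System.arraycopy(emitted, Bytes.SIZEOF_SHORT, row, 0, row.length);
    System.out.println("flag ordinal=" + ordinal + " row=" + Bytes.toStringBinary(row));
  }
}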
-
Nested Class Summary
Nested classes/interfaces inherited from class org.apache.hadoop.mapreduce.Reducer
org.apache.hadoop.mapreduce.Reducer.Context -
Field Summary
Fields
- private org.apache.hadoop.hbase.client.Connection connection
- private final org.apache.hadoop.io.BytesWritable LOSTFAM
- private ArrayList<byte[]> refs
- private AtomicInteger rows
- private final org.apache.hadoop.io.BytesWritable UNREF
-
Constructor Summary
Constructors
- VerifyReducer()
-
Method Summary
- static byte[] addPrefixFlag(int ordinal, byte[] r)
  Returns a new byte array that has ordinal as a prefix on the front, taking up Bytes.SIZEOF_SHORT bytes, followed by r.
- protected void cleanup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
- private StringBuilder dumpExtraInfoOnRefs(org.apache.hadoop.io.BytesWritable key, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context, List<byte[]> refs)
  Dump out extra info around references if there are any.
- static byte[] getRowOnly(org.apache.hadoop.io.BytesWritable bw)
  Returns Row bytes minus the type flag.
- void reduce(org.apache.hadoop.io.BytesWritable key, Iterable<org.apache.hadoop.io.BytesWritable> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
- protected void setup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
- static IntegrationTestBigLinkedList.Verify.Counts whichType(byte[] bs)
  Returns type from the Counts enum of this row. Reads prefix added by addPrefixFlag(int, byte[]).
Methods inherited from class org.apache.hadoop.mapreduce.Reducer
run
-
Field Details
-
refs
private ArrayList<byte[]> refs
-
UNREF
private final org.apache.hadoop.io.BytesWritable UNREF
-
LOSTFAM
private final org.apache.hadoop.io.BytesWritable LOSTFAM
-
rows
private AtomicInteger rows
-
connection
private org.apache.hadoop.hbase.client.Connection connection
-
Constructor Details
-
VerifyReducer
public VerifyReducer()
-
-
Method Details
-
setup
protected void setup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
throws IOException, InterruptedException
- Overrides:
setup in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
IOException
InterruptedException
-
cleanup
protected void cleanup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
throws IOException, InterruptedException
- Overrides:
cleanup in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
IOException
InterruptedException
-
addPrefixFlag
public static byte[] addPrefixFlag(int ordinal, byte[] r)
Returns a new byte array that has ordinal as a prefix on the front, taking up Bytes.SIZEOF_SHORT bytes, followed by r.
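A minimal sketch of the behavior described above, assuming the ordinal is serialized with Bytes.toBytes((short) ordinal); prefixWithFlag is a hypothetical stand-in, not the method's actual source.

import org.apache.hadoop.hbase.util.Bytes;

// Prepend the ordinal, written as a short, to the row bytes r.
static byte[] prefixWithFlag(int ordinal, byte[] r) {
  byte[] prefix = Bytes.toBytes((short) ordinal); // Bytes.SIZEOF_SHORT bytes
  byte[] result = new byte[prefix.length + r.length];
  System.arraycopy(prefix, 0, result, 0, prefix.length);
  System.arraycopy(r, 0, result, prefix.length, r.length);
  return result;
}
-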
whichType
public static IntegrationTestBigLinkedList.Verify.Counts whichType(byte[] bs)
Returns type from the Counts enum of this row. Reads the prefix added by addPrefixFlag(int, byte[]). -
getRowOnly
public static byte[] getRowOnly(org.apache.hadoop.io.BytesWritable bw)
Returns Row bytes minus the type flag.
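Minimal sketches of the two decode helpers described above, assuming the same short-sized flag prefix; flagOrdinal and rowOnly are hypothetical stand-ins, and the real whichType maps the ordinal onto the Verify.Counts enum.

import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.BytesWritable;

// Read back the short prefix written ahead of the row bytes.
static int flagOrdinal(byte[] bs) {
  return Bytes.toShort(bs, 0);
}

// Copy everything after the Bytes.SIZEOF_SHORT flag prefix.
static byte[] rowOnly(BytesWritable bw) {
  byte[] row = new byte[bw.getLength() - Bytes.SIZEOF_SHORT];
  System.arraycopy(bw.getBytes(), Bytes.SIZEOF_SHORT, row, 0, row.length);
  return row;
}
-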
reduce
public void reduce(org.apache.hadoop.io.BytesWritable key, Iterable<org.apache.hadoop.io.BytesWritable> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
throws IOException, InterruptedException
- Overrides:
reduce in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
IOException
InterruptedException
-
dumpExtraInfoOnRefs
private StringBuilder dumpExtraInfoOnRefs(org.apache.hadoop.io.BytesWritable key, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context, List<byte[]> refs)
throws IOException
Dump out extra info around references if there are any. Helps debugging.
- Returns:
StringBuilder filled with references if any.
- Throws:
IOException
-