Package org.apache.hadoop.hbase.test
Class IntegrationTestBigLinkedList.Verify.VerifyReducer
java.lang.Object
  org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable>
    org.apache.hadoop.hbase.test.IntegrationTestBigLinkedList.Verify.VerifyReducer
- Enclosing class:
- IntegrationTestBigLinkedList.Verify
public static class IntegrationTestBigLinkedList.Verify.VerifyReducer
extends org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable>   
Per reducer, we output problem rows as byte arrays so they can be used as input for subsequent
investigative mapreduce jobs. Each emitted value is prefaced by a flag saying what sort of
emission it is; the flag is the Counts enum ordinal written as a short (Bytes.SIZEOF_SHORT bytes,
see addPrefixFlag(int, byte[])).
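A standalone sketch of that emitted-value layout follows. It is illustrative only, not the class's own code; the ordinal value and row key are made up:

    import java.nio.ByteBuffer;

    // Illustrative sketch of the emission layout described above:
    // a short flag (an enum ordinal) followed by the problem row's bytes.
    public class EmissionLayoutSketch {

      // Prefix the row with a 2-byte flag, mirroring the layout the class doc describes.
      static byte[] prefixWithFlag(int ordinal, byte[] row) {
        ByteBuffer buf = ByteBuffer.allocate(Short.BYTES + row.length);
        buf.putShort((short) ordinal); // the flag: an enum ordinal stored as a short
        buf.put(row);                  // then the raw row bytes
        return buf.array();
      }

      // Read the flag back out of an emitted value.
      static short readFlag(byte[] emitted) {
        return ByteBuffer.wrap(emitted).getShort();
      }

      public static void main(String[] args) {
        byte[] emitted = prefixWithFlag(3, "row-0001".getBytes()); // 3 is a made-up ordinal
        System.out.println("flag ordinal = " + readFlag(emitted));
      }
    }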
Nested Class Summary

Nested classes/interfaces inherited from class org.apache.hadoop.mapreduce.Reducer:
org.apache.hadoop.mapreduce.Reducer.Context
Field Summary

Fields
- private org.apache.hadoop.hbase.client.Connection connection
- private final org.apache.hadoop.io.BytesWritable LOSTFAM
- private ArrayList<byte[]> refs
- private AtomicInteger rows
- private final org.apache.hadoop.io.BytesWritable UNREF
Constructor Summary

Constructors
- VerifyReducer()
Method Summary

- static byte[] addPrefixFlag(int ordinal, byte[] r)
  Returns new byte array that has ordinal as prefix on front, taking up Bytes.SIZEOF_SHORT bytes, followed by r.
- protected void cleanup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
- private StringBuilder dumpExtraInfoOnRefs(org.apache.hadoop.io.BytesWritable key, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context, List<byte[]> refs)
  Dump out extra info around references if there are any.
- static byte[] getRowOnly(org.apache.hadoop.io.BytesWritable bw)
  Returns Row bytes minus the type flag.
- void reduce(org.apache.hadoop.io.BytesWritable key, Iterable<org.apache.hadoop.io.BytesWritable> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
- protected void setup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
- static Counts whichType(byte[] bs)
  Returns type from the Counts enum of this row.

Methods inherited from class org.apache.hadoop.mapreduce.Reducer:
run
Field Details

- refs
  private ArrayList<byte[]> refs
- UNREF
  private final org.apache.hadoop.io.BytesWritable UNREF
- LOSTFAM
  private final org.apache.hadoop.io.BytesWritable LOSTFAM
- rows
  private AtomicInteger rows
- connection
  private org.apache.hadoop.hbase.client.Connection connection
Constructor Details

- VerifyReducer
  public VerifyReducer()
 
Method Details
- setup
  protected void setup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
                 throws IOException, InterruptedException
- Overrides:
  setup in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
- IOException
- InterruptedException
 
- cleanup
  protected void cleanup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
                 throws IOException, InterruptedException
- Overrides:
  cleanup in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
- IOException
- InterruptedException
 
- addPrefixFlag
  public static byte[] addPrefixFlag(int ordinal, byte[] r)
  Returns a new byte array that has ordinal as a prefix on the front, taking up Bytes.SIZEOF_SHORT bytes, followed by r.
- whichType
  public static Counts whichType(byte[] bs)
  Returns the type from the Counts enum of this row. Reads the prefix added by addPrefixFlag(int, byte[]).
- getRowOnly
  public static byte[] getRowOnly(org.apache.hadoop.io.BytesWritable bw)
  Returns Row bytes minus the type flag.
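Taken together, addPrefixFlag, whichType and getRowOnly round-trip a flagged row. A minimal usage sketch follows; it assumes the Counts enum is reachable as IntegrationTestBigLinkedList.Verify.Counts and defines an UNREFERENCED constant (both names are inferred from this page, e.g. from the UNREF field, and are not verified against the source):

    import org.apache.hadoop.hbase.test.IntegrationTestBigLinkedList.Verify;
    import org.apache.hadoop.hbase.test.IntegrationTestBigLinkedList.Verify.VerifyReducer;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.io.BytesWritable;

    public class FlagRoundTripSketch {
      public static void main(String[] args) {
        byte[] row = Bytes.toBytes("some-problem-row"); // made-up row key
        // Prefix the row with a Counts ordinal; UNREFERENCED is assumed to exist (cf. the UNREF field).
        byte[] flagged = VerifyReducer.addPrefixFlag(Verify.Counts.UNREFERENCED.ordinal(), row);
        // whichType(byte[]) reads the short prefix back as a Counts value ...
        System.out.println("type = " + VerifyReducer.whichType(flagged));
        // ... and getRowOnly(BytesWritable) strips the flag, recovering the original row bytes.
        byte[] recovered = VerifyReducer.getRowOnly(new BytesWritable(flagged));
        System.out.println("row  = " + Bytes.toString(recovered));
      }
    }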
- reduce
  public void reduce(org.apache.hadoop.io.BytesWritable key, Iterable<org.apache.hadoop.io.BytesWritable> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
                throws IOException, InterruptedException
- Overrides:
  reduce in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
- IOException
- InterruptedException
 
- dumpExtraInfoOnRefs
  private StringBuilder dumpExtraInfoOnRefs(org.apache.hadoop.io.BytesWritable key, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context, List<byte[]> refs)
                                     throws IOException
  Dump out extra info around references if there are any. Helps debugging.
- Returns:
  StringBuilder filled with references, if any.
- Throws:
- IOException
 
 