Class TestHFileOutputFormat2
java.lang.Object
org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2TestBase
org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat2
Simple test for HFileOutputFormat2. Sets up and runs a mapreduce job that writes hfile output. Creates a few inner classes to implement splits and an inputformat that emits keys and values.
-
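The overall flow the class exercises looks roughly like the sketch below. This is not code from the test itself: the table name, output path, and the omitted mapper are placeholders, and a running cluster plus a mapper emitting ImmutableBytesWritable/Cell pairs (as in HFileOutputFormat2TestBase.RandomKVGeneratingMapper) are assumed.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HFileOutputSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (Connection conn = ConnectionFactory.createConnection(conf);
         Table table = conn.getTable(TableName.valueOf("testtable"));      // placeholder table name
         RegionLocator locator = conn.getRegionLocator(table.getName())) {
      Job job = Job.getInstance(conf, "hfile-output-sketch");
      job.setJarByClass(HFileOutputSketch.class);
      // A real job also sets a mapper that emits ImmutableBytesWritable / Cell pairs,
      // much like HFileOutputFormat2TestBase.RandomKVGeneratingMapper.
      FileOutputFormat.setOutputPath(job, new Path("/tmp/hfile-output"));  // placeholder output dir
      // Configures total-order partitioning by region boundaries and copies the
      // per-family compression/bloom/block-size/encoding settings into the job conf.
      HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
      job.waitForCompletion(true);
    }
  }
}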
Nested Class Summary
Nested Classes
private static class
Nested classes/interfaces inherited from class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2TestBase
HFileOutputFormat2TestBase.RandomKVGeneratingMapper, HFileOutputFormat2TestBase.RandomPutGeneratingMapper
-
Field Summary
Fields
static final HBaseClassTestRule CLASS_RULE
private static final org.slf4j.Logger LOG
Fields inherited from class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2TestBase
DEFAULT_VALUE_LENGTH, FAMILIES, FAMILY_NAME, ROWSPERSPLIT, TABLE_NAMES, UTIL
-
Constructor Summary
Constructors
TestHFileOutputFormat2()
-
Method Summary
private org.apache.hadoop.mapreduce.TaskAttemptContext createTestTaskAttemptContext(org.apache.hadoop.mapreduce.Job job)
private Map<String,Integer> getMockColumnFamiliesForBlockSize(int numCfs)
private Map<String,org.apache.hadoop.hbase.regionserver.BloomType> getMockColumnFamiliesForBloomType(int numCfs)
private Map<String,org.apache.hadoop.hbase.io.compress.Compression.Algorithm> getMockColumnFamiliesForCompression(int numCfs)
private Map<String,org.apache.hadoop.hbase.io.encoding.DataBlockEncoding> getMockColumnFamiliesForDataBlockEncoding(int numCfs)
private String getStoragePolicyName(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path path)
private String getStoragePolicyNameForOldHDFSVersion(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path path)
static void main(String[] args)
void manualTest(String[] args)
private void quickPoll(...)
private void setupMockColumnFamiliesForBlockSize(org.apache.hadoop.hbase.client.Table table, Map<String,Integer> familyToDataBlockEncoding)
private void setupMockColumnFamiliesForBloomType(org.apache.hadoop.hbase.client.Table table, Map<String,org.apache.hadoop.hbase.regionserver.BloomType> familyToDataBlockEncoding)
private void setupMockColumnFamiliesForCompression(org.apache.hadoop.hbase.client.Table table, Map<String,org.apache.hadoop.hbase.io.compress.Compression.Algorithm> familyToCompression)
private void setupMockColumnFamiliesForDataBlockEncoding(org.apache.hadoop.hbase.client.Table table, Map<String,org.apache.hadoop.hbase.io.encoding.DataBlockEncoding> familyToDataBlockEncoding)
private void setupMockStartKeys(org.apache.hadoop.hbase.client.RegionLocator table)
private void setupMockTableName(org.apache.hadoop.hbase.client.RegionLocator table)
void test_LATEST_TIMESTAMP_isReplaced()
    Test that HFileOutputFormat2 RecordWriter amends timestamps if passed a keyvalue whose timestamp is HConstants.LATEST_TIMESTAMP.
void test_TIMERANGE()
void test_WritingTagData()
    Test that HFileOutputFormat2 RecordWriter writes tags such as ttl into hfile.
void testBlockStoragePolicy()
void testColumnFamilySettings()
    Test that HFileOutputFormat2 RecordWriter uses compression and bloom filter settings from the column family descriptor.
void TestConfigureCompression()
void testExcludeAllFromMinorCompaction()
    This test covers the scenario from HBASE-6901.
void testExcludeMinorCompaction()
void testJobConfiguration()
void testMRIncrementalLoadWithLocalityMultiCluster()
void testSerializeDeserializeFamilyBlockSizeMap()
    Test for HFileOutputFormat2.createFamilyBlockSizeMap(Configuration).
void testSerializeDeserializeFamilyBloomTypeMap()
    Test for HFileOutputFormat2.createFamilyBloomTypeMap(Configuration).
void testSerializeDeserializeFamilyCompressionMap()
    Test for HFileOutputFormat2.createFamilyCompressionMap(Configuration).
void testSerializeDeserializeFamilyDataBlockEncodingMap()
    Test for HFileOutputFormat2.createFamilyDataBlockEncodingMap(Configuration).
void testWritingPEData()
    Run small MR job.
private void writeRandomKeyValues(org.apache.hadoop.mapreduce.RecordWriter<org.apache.hadoop.hbase.io.ImmutableBytesWritable, org.apache.hadoop.hbase.Cell> writer, org.apache.hadoop.mapreduce.TaskAttemptContext context, Set<byte[]> families, int numRows)
    Write random values to the writer assuming a table created using HFileOutputFormat2TestBase.FAMILIES as column family descriptors.
Methods inherited from class org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2TestBase
generateData, generateRandomSplitKeys, generateRandomStartKeys, runIncrementalPELoad, setupRandomGeneratorMapper
-
Field Details
-
CLASS_RULE
public static final HBaseClassTestRule CLASS_RULE
-
LOG
private static final org.slf4j.Logger LOG
-
-
Constructor Details
-
TestHFileOutputFormat2
public TestHFileOutputFormat2()
-
-
Method Details
-
test_LATEST_TIMESTAMP_isReplaced
Test that HFileOutputFormat2 RecordWriter amends timestamps if passed a keyvalue whose timestamp is HConstants.LATEST_TIMESTAMP.
- Throws:
Exception
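As a rough illustration of the behaviour under test (a sketch, not the test's own code; obtaining the RecordWriter from HFileOutputFormat2.getRecordWriter(context) is omitted):

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.mapreduce.RecordWriter;

class LatestTimestampSketch {
  // The writer is assumed to be the one returned by HFileOutputFormat2.getRecordWriter(context).
  static void writeCellWithLatestTimestamp(RecordWriter<ImmutableBytesWritable, Cell> writer)
      throws java.io.IOException, InterruptedException {
    byte[] b = Bytes.toBytes("b");
    // LATEST_TIMESTAMP is a sentinel (Long.MAX_VALUE); the RecordWriter is expected to
    // replace it with the current wall-clock time before the cell reaches the HFile.
    KeyValue kv = new KeyValue(b, b, b, HConstants.LATEST_TIMESTAMP, b);
    writer.write(new ImmutableBytesWritable(b), kv);
  }
}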
-
createTestTaskAttemptContext
private org.apache.hadoop.mapreduce.TaskAttemptContext createTestTaskAttemptContext(org.apache.hadoop.mapreduce.Job job) throws Exception
- Throws:
Exception
-
test_TIMERANGE
- Throws:
Exception
-
testWritingPEData
Run small MR job.
- Throws:
Exception
-
test_WritingTagData
Test that HFileOutputFormat2 RecordWriter writes tags such as ttl into hfile.
- Throws:
Exception
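A sketch of the kind of tagged cell involved (not the test's exact fixtures; the RecordWriter is assumed to come from HFileOutputFormat2.getRecordWriter(context), and the TTL value is a placeholder):

import org.apache.hadoop.hbase.ArrayBackedTag;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.Tag;
import org.apache.hadoop.hbase.TagType;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.mapreduce.RecordWriter;

class TagWritingSketch {
  static void writeCellWithTtlTag(RecordWriter<ImmutableBytesWritable, Cell> writer)
      throws java.io.IOException, InterruptedException {
    byte[] b = Bytes.toBytes("b");
    // Attach a TTL tag to the cell; the HFile written by the format is expected to carry it.
    Tag ttlTag = new ArrayBackedTag(TagType.TTL_TAG_TYPE, Bytes.toBytes(978670L));
    KeyValue kv = new KeyValue(b, b, b, HConstants.LATEST_TIMESTAMP, b, new Tag[] { ttlTag });
    writer.write(new ImmutableBytesWritable(b), kv);
  }
}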
-
testJobConfiguration
- Throws:
Exception
-
testSerializeDeserializeFamilyCompressionMap
Test for HFileOutputFormat2.createFamilyCompressionMap(Configuration). Tests that the family compression map is correctly serialized into and deserialized from the configuration.
- Throws:
IOException
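For context, per-family compression is declared on the column family descriptor; configureIncrementalLoad(...) serializes those settings into the job Configuration, and createFamilyCompressionMap(Configuration) reads them back on the task side. A minimal sketch of the descriptor side, with a placeholder family name:

import org.apache.hadoop.hbase.client.ColumnFamilyDescriptor;
import org.apache.hadoop.hbase.client.ColumnFamilyDescriptorBuilder;
import org.apache.hadoop.hbase.io.compress.Compression;
import org.apache.hadoop.hbase.util.Bytes;

class FamilyCompressionSketch {
  static ColumnFamilyDescriptor gzipFamily() {
    // The compression declared here is what the serialized family-to-compression map
    // should reproduce after the round trip through the Configuration.
    return ColumnFamilyDescriptorBuilder.newBuilder(Bytes.toBytes("info"))  // placeholder family
        .setCompressionType(Compression.Algorithm.GZ)
        .build();
  }
}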
-
setupMockColumnFamiliesForCompression
private void setupMockColumnFamiliesForCompression(org.apache.hadoop.hbase.client.Table table, Map<String, org.apache.hadoop.hbase.io.compress.Compression.Algorithm> familyToCompression) throws IOException
- Throws:
IOException
-
getMockColumnFamiliesForCompression
private Map<String,org.apache.hadoop.hbase.io.compress.Compression.Algorithm> getMockColumnFamiliesForCompression(int numCfs)
- Returns:
- a map from column family names to compression algorithms for testing column family compression. Column family names have special characters
-
testSerializeDeserializeFamilyBloomTypeMap
Test for HFileOutputFormat2.createFamilyBloomTypeMap(Configuration). Tests that the family bloom type map is correctly serialized into and deserialized from the configuration.
- Throws:
IOException
-
setupMockColumnFamiliesForBloomType
private void setupMockColumnFamiliesForBloomType(org.apache.hadoop.hbase.client.Table table, Map<String, org.apache.hadoop.hbase.regionserver.BloomType> familyToDataBlockEncoding) throws IOException
- Throws:
IOException
-
getMockColumnFamiliesForBloomType
private Map<String,org.apache.hadoop.hbase.regionserver.BloomType> getMockColumnFamiliesForBloomType(int numCfs)
- Returns:
- a map from column family names to bloom filter types for testing column family bloom filter settings. Column family names have special characters
-
testSerializeDeserializeFamilyBlockSizeMap
Test for HFileOutputFormat2.createFamilyBlockSizeMap(Configuration). Tests that the family block size map is correctly serialized into and deserialized from the configuration.
- Throws:
IOException
-
setupMockColumnFamiliesForBlockSize
private void setupMockColumnFamiliesForBlockSize(org.apache.hadoop.hbase.client.Table table, Map<String, Integer> familyToDataBlockEncoding) throws IOException
- Throws:
IOException
-
getMockColumnFamiliesForBlockSize
private Map<String,Integer> getMockColumnFamiliesForBlockSize(int numCfs)
- Returns:
- a map from column family names to block sizes for testing column family block size settings. Column family names have special characters
-
testSerializeDeserializeFamilyDataBlockEncodingMap
Test for HFileOutputFormat2.createFamilyDataBlockEncodingMap(Configuration). Tests that the family data block encoding map is correctly serialized into and deserialized from the configuration.
- Throws:
IOException
-
setupMockColumnFamiliesForDataBlockEncoding
private void setupMockColumnFamiliesForDataBlockEncoding(org.apache.hadoop.hbase.client.Table table, Map<String, org.apache.hadoop.hbase.io.encoding.DataBlockEncoding> familyToDataBlockEncoding) throws IOException
- Throws:
IOException
-
getMockColumnFamiliesForDataBlockEncoding
private Map<String,org.apache.hadoop.hbase.io.encoding.DataBlockEncoding> getMockColumnFamiliesForDataBlockEncoding(int numCfs)
- Returns:
- a map from column family names to data block encodings for testing column family data block encoding settings. Column family names have special characters
-
setupMockStartKeys
private void setupMockStartKeys(org.apache.hadoop.hbase.client.RegionLocator table) throws IOException
- Throws:
IOException
-
setupMockTableName
private void setupMockTableName(org.apache.hadoop.hbase.client.RegionLocator table) throws IOException
- Throws:
IOException
-
testColumnFamilySettings
Test that HFileOutputFormat2 RecordWriter uses compression and bloom filter settings from the column family descriptor.
- Throws:
Exception
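A sketch of a column family descriptor carrying the kind of settings the RecordWriter must honour (family name and values are placeholders, not the test's fixtures):

import org.apache.hadoop.hbase.client.ColumnFamilyDescriptor;
import org.apache.hadoop.hbase.client.ColumnFamilyDescriptorBuilder;
import org.apache.hadoop.hbase.io.compress.Compression;
import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
import org.apache.hadoop.hbase.regionserver.BloomType;
import org.apache.hadoop.hbase.util.Bytes;

class FamilySettingsSketch {
  static ColumnFamilyDescriptor tunedFamily() {
    // HFiles produced for this family are expected to use these compression,
    // bloom filter, encoding, and block size settings.
    return ColumnFamilyDescriptorBuilder.newBuilder(Bytes.toBytes("cf"))  // placeholder family
        .setCompressionType(Compression.Algorithm.GZ)
        .setBloomFilterType(BloomType.ROWCOL)
        .setDataBlockEncoding(DataBlockEncoding.FAST_DIFF)
        .setBlocksize(8 * 1024)
        .build();
  }
}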
-
writeRandomKeyValues
private void writeRandomKeyValues(org.apache.hadoop.mapreduce.RecordWriter<org.apache.hadoop.hbase.io.ImmutableBytesWritable, org.apache.hadoop.hbase.Cell> writer, org.apache.hadoop.mapreduce.TaskAttemptContext context, Set<byte[]> families, int numRows) throws IOException, InterruptedException
Write random values to the writer assuming a table created using HFileOutputFormat2TestBase.FAMILIES as column family descriptors.
- Throws:
IOException
InterruptedException
-
testExcludeAllFromMinorCompaction
This test covers the scenario from HBASE-6901: all files are bulk loaded and excluded from minor compaction. Without the fix for HBASE-6901, an ArrayIndexOutOfBoundsException will be thrown.
- Throws:
Exception
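The exclusion behaviour is driven by a boolean job property; assuming the standard key hbase.mapreduce.hfileoutputformat.compaction.exclude, a minimal sketch of turning it on:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

class ExcludeFromMinorCompactionSketch {
  static Configuration excludeBulkLoadedFiles() {
    Configuration conf = HBaseConfiguration.create();
    // With this set, HFileOutputFormat2 marks the HFiles it writes so that bulk-loaded
    // files are skipped when regions select files for minor compaction.
    conf.setBoolean("hbase.mapreduce.hfileoutputformat.compaction.exclude", true);
    return conf;
  }
}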
-
testExcludeMinorCompaction
- Throws:
Exception
-
quickPoll
- Throws:
Exception
-
main
- Throws:
Exception
-
manualTest
- Throws:
Exception
-
testBlockStoragePolicy
- Throws:
Exception
-
getStoragePolicyName
private String getStoragePolicyName(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path path)
-
getStoragePolicyNameForOldHDFSVersion
private String getStoragePolicyNameForOldHDFSVersion(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path path)
-
TestConfigureCompression
- Throws:
Exception
-
testMRIncrementalLoadWithLocalityMultiCluster
- Throws:
Exception
-