org.apache.hadoop.hbase.spark

HBaseRelation

case class HBaseRelation(parameters: Map[String, String], userSpecifiedSchema: Option[StructType])(sqlContext: SQLContext) extends BaseRelation with PrunedFilteredScan with InsertableRelation with Logging with Product with Serializable

Implementation of Spark BaseRelation that will build up our scan logic, do the scan pruning, filter push down, and value conversions

sqlContext

SparkSQL context
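A minimal usage sketch: the relation is normally instantiated through the Spark data source API rather than constructed directly. The catalog JSON, the column layout, and the HBaseTableCatalog.tableCatalog option key follow common hbase-spark usage and should be verified against your connector version.

    import org.apache.spark.sql.SQLContext
    import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

    // Hypothetical catalog mapping an HBase table to a SparkSQL schema;
    // adjust table, column family, and column names to your own layout.
    val catalog =
      s"""{
         |"table":{"namespace":"default", "name":"person"},
         |"rowkey":"key",
         |"columns":{
         |  "id":{"cf":"rowkey", "col":"key", "type":"string"},
         |  "name":{"cf":"cf1", "col":"name", "type":"string"}
         |}
         |}""".stripMargin

    def readPersons(sqlContext: SQLContext) =
      sqlContext.read
        .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
        .format("org.apache.hadoop.hbase.spark")
        .load()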

Annotations
@Private()
Linear Supertypes
Serializable, Serializable, Product, Equals, Logging, InsertableRelation, PrunedFilteredScan, BaseRelation, AnyRef, Any

Instance Constructors

  1. new HBaseRelation(parameters: Map[String, String], userSpecifiedSchema: Option[StructType])(sqlContext: SQLContext)

    sqlContext

    SparkSQL context

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. val batchNum: Int

  8. val blockCacheEnable: Boolean

  9. def buildPushDownPredicatesResource(filters: Array[Filter]): (RowKeyFilter, DynamicLogicExpression, Array[Array[Byte]])

  10. def buildRow(fields: Seq[Field], result: Result): Row

  11. def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row]

    Here we build the functionality to populate the resulting RDD[Row]. This is where we do the following:
    - Filter push down
    - Scan or GetList pruning
    - Executing our scan(s) and/or GetList to generate the result

    requiredColumns

    The columns requested by the query

    filters

    The filters applied by the query

    returns

    RDD with all the results from HBase needed for SparkSQL to execute the query on

    Definition Classes
    HBaseRelation → PrunedFilteredScan
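    As an illustration of the pushdown path, the (hypothetical) query below only asks for "id" and "name", so only those names reach requiredColumns, and the equality predicate on the row-key column arrives in filters, where it can be pruned into a Get or a narrow Scan rather than a full table scan.

      // Assuming a DataFrame `df` obtained from this relation (see the usage
      // sketch above) whose row key is exposed as the string column "id".
      df.registerTempTable("person")   // createOrReplaceTempView on Spark 2.x

      val result = sqlContext.sql(
        "SELECT id, name FROM person WHERE id = 'row-42'")
      result.show()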
  12. val bulkGetSize: Int

  13. val cacheSize: Int

  14. val catalog: HBaseTableCatalog

  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. val configResources: String

  17. def createTable(): Unit

  18. val encoder: BytesEncoder

  19. val encoderClsName: String

  20. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  22. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  23. def getIndexedProjections(requiredColumns: Array[String]): Seq[(Field, Int)]

  24. def hbaseConf: Configuration

  25. val hbaseContext: HBaseContext

  26. def insert(data: DataFrame, overwrite: Boolean): Unit

    data
    overwrite

    Definition Classes
    HBaseRelation → InsertableRelation
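    A hedged sketch of the write path: saving a DataFrame through the data source ends up calling insert(data, overwrite) on this relation. The HBaseTableCatalog.newTable option (number of regions when the table has to be created) is assumed from common hbase-spark usage.

      import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

      // Reuses the hypothetical `catalog` JSON from the read sketch above.
      df.write
        .options(Map(
          HBaseTableCatalog.tableCatalog -> catalog,
          HBaseTableCatalog.newTable -> "5"))
        .format("org.apache.hadoop.hbase.spark")
        .save()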
  27. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  28. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  29. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  30. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  32. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  33. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  34. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  35. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  36. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  37. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  38. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  39. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  40. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  41. val maxTimestamp: Option[Long]

  42. val maxVersions: Option[Int]

  43. val minTimestamp: Option[Long]

  44. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  45. def needConversion: Boolean

    Definition Classes
    BaseRelation
  46. final def notify(): Unit

    Definition Classes
    AnyRef
  47. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  48. val parameters: Map[String, String]

  49. def parseRowKey(row: Array[Byte], keyFields: Seq[Field]): Map[Field, Any]

    Takes an HBase Row object and parses all of the fields from it. This is independent of which fields were requested from the key; because we have all the data, it is less complex to parse everything.

    row

    the retrieved row from HBase.

    keyFields

    all of the fields in the row key, ORDERED by their order in the row key.
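    A conceptual sketch only, not the connector's implementation: the key fields are walked in row-key order and the raw bytes are sliced field by field. The fixed-length assumption below is hypothetical; the real code also handles variable-length fields.

      def parseRowKeySketch(row: Array[Byte],
                            keyFields: Seq[(String, Int)]): Map[String, Array[Byte]] = {
        var offset = 0
        keyFields.map { case (name, length) =>
          // Slice this field's bytes out of the composite row key.
          val slice = java.util.Arrays.copyOfRange(row, offset, offset + length)
          offset += length
          name -> slice
        }.toMap
      }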

  50. val schema: StructType

    Generates a Spark SQL schema object so Spark SQL knows what is being provided by this BaseRelation

    returns

    schema generated from the SCHEMA_COLUMNS_MAPPING_KEY value

    Definition Classes
    HBaseRelation → BaseRelation
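    For illustration, the hypothetical two-column catalog from the usage sketch above would surface roughly the following schema to SparkSQL; exact field order and nullability are determined by the connector.

      import org.apache.spark.sql.types._

      val expected = StructType(Seq(
        StructField("id", StringType),
        StructField("name", StringType)))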
  51. def sizeInBytes: Long

    Definition Classes
    BaseRelation
  52. val sqlContext: SQLContext

    SparkSQL context

    Definition Classes
    HBaseRelation → BaseRelation
  53. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  54. def tableName: String

  55. val timestamp: Option[Long]

  56. def transverseFilterTree(parentRowKeyFilter: RowKeyFilter, valueArray: MutableList[Array[Byte]], filter: Filter): DynamicLogicExpression

    For some codecs, the ordering of a Java primitive type may be inconsistent with the ordering of its byte-array encoding. In that case a predicate on such a type may have to be split into multiple predicates; the encoder takes care of this and returns the concrete ranges.

    For example, with the naive codec some Java primitive types have to be split into multiple predicates whose union is taken so that the original predicate is evaluated correctly. If we have "COLUMN < 2", it is transformed into "0 <= COLUMN < 2 OR Integer.MIN_VALUE <= COLUMN <= -1".
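    A small illustration of why the split is needed with the naive codec: Bytes.toBytes(Int) keeps the two's-complement bit pattern, so negative values compare greater than positive ones under the unsigned, lexicographic comparison HBase applies to raw byte arrays.

      import org.apache.hadoop.hbase.util.Bytes

      val minusOne = Bytes.toBytes(-1)   // 0xFF 0xFF 0xFF 0xFF
      val one      = Bytes.toBytes(1)    // 0x00 0x00 0x00 0x01

      // Unsigned lexicographic comparison: -1 sorts AFTER 1.
      assert(Bytes.compareTo(minusOne, one) > 0)

      // Hence "COLUMN < 2" becomes the union of two byte ranges:
      //   [toBytes(0), toBytes(2))  OR  [toBytes(Int.MinValue), toBytes(-1)]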

  57. def unhandledFilters(filters: Array[Filter]): Array[Filter]

    Definition Classes
    BaseRelation
  58. val useHBaseContext: Boolean

  59. val usePushDownColumnFilter: Boolean

  60. val userSpecifiedSchema: Option[StructType]

  61. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  62. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  63. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  64. val wrappedConf: SerializableConfiguration
