@InterfaceAudience.Public public class RowCounter extends org.apache.hadoop.hbase.util.AbstractHBaseTool
| Constructor | Description |
|---|---|
| `RowCounter()` | |
| Modifier and Type | Method and Description |
|---|---|
| `protected void` | `addOptions()` Override this to add command-line options using `AbstractHBaseTool.addOptWithArg(java.lang.String, java.lang.String)` and similar methods. |
| `org.apache.hadoop.mapreduce.Job` | `createSubmittableJob(org.apache.hadoop.conf.Configuration conf)` Sets up the actual job. |
| `static org.apache.hadoop.mapreduce.Job` | `createSubmittableJob(org.apache.hadoop.conf.Configuration conf, String[] args)` Deprecated. As of release 2.3.0; will be removed in 4.0.0. Use the main method instead. |
| `protected int` | `doWork()` The "main function" of the tool. |
| `static void` | `main(String[] args)` Main entry point. |
| `protected org.apache.hbase.thirdparty.org.apache.commons.cli.CommandLineParser` | `newParser()` Create the parser to use for parsing and validating the command line. |
| `protected void` | `printUsage()` |
| `protected void` | `printUsage(String usageStr, String usageHeader, String usageFooter)` |
| `protected void` | `processOldArgs(List<String> args)` For backward compatibility. |
| `protected void` | `processOptions(org.apache.hbase.thirdparty.org.apache.commons.cli.CommandLine cmd)` This method is called to process the options after they have been parsed. |
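The methods above follow the standard `AbstractHBaseTool` lifecycle: options are registered (`addOptions`), parsed and validated (`newParser`, `processOptions`), and then `doWork` runs the job. A minimal, self-contained sketch of that flow using only the JDK — `RowCounterLikeTool` and all of its members are hypothetical stand-ins for illustration, not the HBase classes:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for an AbstractHBaseTool subclass such as RowCounter.
class RowCounterLikeTool {
    private String tableName;
    private final Map<String, String> options = new HashMap<>();

    // Mirrors processOptions(CommandLine): validate and store parsed options.
    void processOptions(List<String> args) {
        if (args.isEmpty()) {
            throw new IllegalArgumentException("table name required");
        }
        tableName = args.get(0);
        for (String arg : args.subList(1, args.size())) {
            if (arg.startsWith("--")) {
                String[] kv = arg.substring(2).split("=", 2);
                options.put(kv[0], kv.length > 1 ? kv[1] : "");
            }
        }
    }

    // Mirrors doWork(): the "main function" of the tool; returns an exit code.
    int doWork() {
        // A real RowCounter would submit a MapReduce job against tableName here.
        return tableName != null ? 0 : 1;
    }

    // Mirrors run(String[]): parse the command line, then do the work.
    int run(String[] args) {
        processOptions(Arrays.asList(args));
        return doWork();
    }
}
```

For example, `new RowCounterLikeTool().run(new String[]{"mytable", "--endtime=100"})` parses the table name and option, then returns the exit code from `doWork`.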
Methods inherited from class org.apache.hadoop.hbase.util.AbstractHBaseTool:

addOption, addOptNoArg, addOptNoArg, addOptWithArg, addOptWithArg, addRequiredOption, addRequiredOptWithArg, addRequiredOptWithArg, doStaticMain, getConf, getOptionAsDouble, getOptionAsInt, getOptionAsInt, getOptionAsLong, getOptionAsLong, parseArgs, parseInt, parseLong, run, setConf
public RowCounter()
public org.apache.hadoop.mapreduce.Job createSubmittableJob(org.apache.hadoop.conf.Configuration conf) throws IOException

Sets up the actual job.

Parameters:
conf - The current configuration.
Throws:
IOException - When setting up the job fails.

@Deprecated
public static org.apache.hadoop.mapreduce.Job createSubmittableJob(org.apache.hadoop.conf.Configuration conf, String[] args) throws IOException

Deprecated. As of release 2.3.0; will be removed in 4.0.0. Use the main method instead.

Parameters:
conf - The current configuration.
args - The command line parameters.
Throws:
IOException - When setting up the job fails.

protected void printUsage()

Overrides:
printUsage in class org.apache.hadoop.hbase.util.AbstractHBaseTool
protected void printUsage(String usageStr, String usageHeader, String usageFooter)

Overrides:
printUsage in class org.apache.hadoop.hbase.util.AbstractHBaseTool
protected void addOptions()

Description copied from class: org.apache.hadoop.hbase.util.AbstractHBaseTool
Override this to add command-line options using AbstractHBaseTool.addOptWithArg(java.lang.String, java.lang.String) and similar methods.

Overrides:
addOptions in class org.apache.hadoop.hbase.util.AbstractHBaseTool
protected void processOptions(org.apache.hbase.thirdparty.org.apache.commons.cli.CommandLine cmd) throws IllegalArgumentException

Description copied from class: org.apache.hadoop.hbase.util.AbstractHBaseTool
This method is called to process the options after they have been parsed.

Overrides:
processOptions in class org.apache.hadoop.hbase.util.AbstractHBaseTool
Throws:
IllegalArgumentException
protected void processOldArgs(List<String> args)

Description copied from class: org.apache.hadoop.hbase.util.AbstractHBaseTool
For backward compatibility. Some parameters can't be parsed as an Option because they don't pass validation (for example "-copy-to": "-" denotes a short name, which doesn't allow '-' in the name). This function allows tools to keep, for the time being, parameters which can't be parsed using Option.

Overrides should consume all valid legacy arguments. If the param 'args' is not empty on return, it means there were invalid options, in which case we'll exit from the tool. Note that it's called before AbstractHBaseTool.processOptions(CommandLine), which means new options' values will override old ones'.

Overrides:
processOldArgs in class org.apache.hadoop.hbase.util.AbstractHBaseTool
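The contract described above (consume the legacy arguments you recognize, leave everything else in the list, and let the caller exit if anything remains) can be sketched in plain Java. The class, field, and flag names here are illustrative only, not HBase's implementation:

```java
import java.util.Iterator;
import java.util.List;

class LegacyArgs {
    // Value of the hypothetical legacy "-copy-to" flag, if seen.
    static String copyTo;

    // Mirrors the processOldArgs contract: consume recognized legacy
    // arguments from the list; anything left behind is treated as invalid.
    static void processOldArgs(List<String> args) {
        Iterator<String> it = args.iterator();
        while (it.hasNext()) {
            String arg = it.next();
            if ("-copy-to".equals(arg) && it.hasNext()) {
                it.remove();          // consume the flag...
                copyTo = it.next();
                it.remove();          // ...and its value
            }
        }
    }
}
```

After calling `processOldArgs` on `["-copy-to", "dest", "-bogus"]`, the flag and its value are consumed and only `"-bogus"` remains, so a tool following the contract above would print usage and exit.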
protected int doWork() throws Exception

Description copied from class: org.apache.hadoop.hbase.util.AbstractHBaseTool
The "main function" of the tool.

Specified by:
doWork in class org.apache.hadoop.hbase.util.AbstractHBaseTool
Throws:
Exception
public static void main(String[] args) throws Exception

Main entry point.

Parameters:
args - The command line parameters.
Throws:
Exception - When running the job fails.

protected org.apache.hbase.thirdparty.org.apache.commons.cli.CommandLineParser newParser()

Description copied from class: org.apache.hadoop.hbase.util.AbstractHBaseTool
Create the parser to use for parsing and validating the command line.

Overrides:
newParser in class org.apache.hadoop.hbase.util.AbstractHBaseTool
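The main entry point typically converts the tool's integer result into the process exit code (the `doStaticMain` helper inherited from `AbstractHBaseTool` serves this role). A self-contained sketch of that convention; `ExampleTool` and its methods are hypothetical, and the sketch returns the code instead of calling `System.exit` so it stays testable:

```java
// Hypothetical sketch of the exit-code convention behind main(String[]).
class ExampleTool {
    // Mirrors run(String[]): a real tool would parse args and submit a job.
    int run(String[] args) {
        return args.length > 0 ? 0 : 1;
    }

    // main(String[]) would do: System.exit(new ExampleTool().run(args));
    // returning the code here lets callers inspect it directly.
    static int launch(String[] args) {
        return new ExampleTool().run(args);
    }
}
```

With at least one argument `launch` yields 0 (success); with none it yields a non-zero code, which a wrapper script or scheduler would interpret as failure.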
Copyright © 2007–2020 The Apache Software Foundation. All rights reserved.