@InterfaceAudience.Private public class GssSaslClientAuthenticationProvider extends GssSaslAuthenticationProvider implements SaslClientAuthenticationProvider
| Modifier and Type | Field and Description |
|---|---|
| private static org.slf4j.Logger | LOG |

Fields inherited from class GssSaslAuthenticationProvider: SASL_AUTH_METHOD, AUTH_TOKEN_TYPE

| Constructor and Description |
|---|
| GssSaslClientAuthenticationProvider() |
| Modifier and Type | Method and Description |
|---|---|
| boolean | canRetry()<br>Returns true if the implementation is capable of performing some action which may allow a failed authentication to become a successful authentication. |
| SaslClient | createClient(org.apache.hadoop.conf.Configuration conf, InetAddress serverAddr, SecurityInfo securityInfo, org.apache.hadoop.security.token.Token<? extends org.apache.hadoop.security.token.TokenIdentifier> token, boolean fallbackAllowed, Map<String,String> saslProps)<br>Creates the SASL client instance for this authentication method. |
| static String | getHostnameForServerPrincipal(org.apache.hadoop.conf.Configuration conf, InetAddress addr) |
| org.apache.hadoop.security.UserGroupInformation | getRealUser(User user)<br>Returns the "real" user, the user who has the credentials being authenticated by the remote service, in the form of a UserGroupInformation object. |
| (package private) String | getServerPrincipal(org.apache.hadoop.conf.Configuration conf, SecurityInfo securityInfo, InetAddress server) |
| org.apache.hadoop.hbase.shaded.protobuf.generated.RPCProtos.UserInformation | getUserInfo(User user)<br>Constructs a RPCProtos.UserInformation from the given UserGroupInformation. |
| void | relogin()<br>Executes any necessary logic to re-login the client. |
| private static boolean | useCanonicalHostname(org.apache.hadoop.conf.Configuration conf) |
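The createClient method above is built on the Java SASL API (javax.security.sasl). As a rough sketch of what constructing a SASL client looks like with the JDK alone: the GSSAPI mechanism this provider uses requires a live Kerberos login, so the sketch substitutes the JDK's PLAIN mechanism purely to show the API shape; the class name, user name, and host names below are invented for illustration, not HBase's actual values.

```java
import javax.security.auth.callback.Callback;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.callback.NameCallback;
import javax.security.auth.callback.PasswordCallback;
import javax.security.sasl.Sasl;
import javax.security.sasl.SaslClient;

public class SaslClientSketch {

    // Build a SaslClient roughly the way createClient(...) does, but with the
    // JDK's PLAIN mechanism instead of GSSAPI (GSSAPI would need a Kerberos
    // ticket). All names here are illustrative.
    public static SaslClient newClient() throws Exception {
        CallbackHandler handler = callbacks -> {
            for (Callback cb : callbacks) {
                if (cb instanceof NameCallback) {
                    ((NameCallback) cb).setName("hbase-user");
                } else if (cb instanceof PasswordCallback) {
                    ((PasswordCallback) cb).setPassword("secret".toCharArray());
                }
            }
        };
        return Sasl.createSaslClient(
            new String[] {"PLAIN"},     // mechanisms, most preferred first
            null,                       // authorization id (none)
            "hbase",                    // protocol: service part of the server principal
            "regionserver.example.com", // server name: host part of the server principal
            null,                       // extra SASL properties, e.g. QOP
            handler);
    }

    public static void main(String[] args) throws Exception {
        SaslClient client = newClient();
        System.out.println(client.getMechanismName());
        client.dispose();
    }
}
```

The protocol and server-name arguments mirror the two halves of a Kerberos service principal, which is why the provider needs the hostname-resolution helpers listed above.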
Methods inherited from class GssSaslAuthenticationProvider: getSaslAuthMethod, getTokenKind

Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface SaslClientAuthenticationProvider: getSaslAuthMethod, getTokenKind

private static final org.slf4j.Logger LOG
public GssSaslClientAuthenticationProvider()
private static boolean useCanonicalHostname(org.apache.hadoop.conf.Configuration conf)
public static String getHostnameForServerPrincipal(org.apache.hadoop.conf.Configuration conf, InetAddress addr)
String getServerPrincipal(org.apache.hadoop.conf.Configuration conf, SecurityInfo securityInfo, InetAddress server) throws IOException
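getServerPrincipal and getHostnameForServerPrincipal resolve the hostname that goes into the server's Kerberos principal; Hadoop-family configs conventionally write principals as service/_HOST@REALM and substitute the concrete hostname at runtime. A plain-JDK sketch of that substitution, with a boolean flag standing in for the private useCanonicalHostname(conf) check (the helper names here are illustrative, not the class's actual internals):

```java
import java.net.InetAddress;
import java.util.Locale;

public class ServerPrincipalSketch {

    // Pick the hostname for the principal. As with the provider's private
    // useCanonicalHostname(conf) check, a flag decides whether to use the
    // canonical (reverse-DNS) name or the name the address already carries.
    public static String hostnameForServerPrincipal(InetAddress addr, boolean useCanonical) {
        String host = useCanonical ? addr.getCanonicalHostName() : addr.getHostName();
        // Kerberos principals are case-sensitive, so hostnames are conventionally lowercased.
        return host.toLowerCase(Locale.ROOT);
    }

    // Substitute the resolved hostname into a principal template such as
    // "hbase/_HOST@EXAMPLE.COM" (the _HOST placeholder is the Hadoop config convention).
    public static String substituteHost(String principalTemplate, String hostname) {
        return principalTemplate.replace("_HOST", hostname);
    }

    public static void main(String[] args) {
        String host = hostnameForServerPrincipal(InetAddress.getLoopbackAddress(), false);
        System.out.println(substituteHost("hbase/_HOST@EXAMPLE.COM", host));
    }
}
```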
Throws: IOException

public SaslClient createClient(org.apache.hadoop.conf.Configuration conf, InetAddress serverAddr, SecurityInfo securityInfo, org.apache.hadoop.security.token.Token<? extends org.apache.hadoop.security.token.TokenIdentifier> token, boolean fallbackAllowed, Map<String,String> saslProps) throws IOException
Creates the SASL client instance for this authentication method.
Specified by: createClient in interface SaslClientAuthenticationProvider
Throws: IOException

public org.apache.hadoop.hbase.shaded.protobuf.generated.RPCProtos.UserInformation getUserInfo(User user)
Constructs a RPCProtos.UserInformation from the given UserGroupInformation.
Specified by: getUserInfo in interface SaslClientAuthenticationProvider

public boolean canRetry()
Returns true if the implementation is capable of performing some action which may allow a failed authentication to become a successful authentication.
Specified by: canRetry in interface SaslClientAuthenticationProvider

public void relogin() throws IOException
Executes any necessary logic to re-login the client.
Specified by: relogin in interface SaslClientAuthenticationProvider
Throws: IOException

public org.apache.hadoop.security.UserGroupInformation getRealUser(User user)
Returns the "real" user, the user who has the credentials being authenticated by the remote service, in the form of a UserGroupInformation object. It is common in the Hadoop "world" to have distinct notions of a "real" user and a "proxy" user. A "real" user is the user who actually has the credentials (often, a Kerberos ticket), but some code may be running as some other user who has no credentials of their own. This method gives the authentication provider a chance to acknowledge this is happening and ensure that any RPCs are executed with the real user's credentials, because executing them as the proxy user would fail: no credentials exist to authenticate the RPC. Not all implementations will need to implement this method. By default, the provided User's UGI is returned directly.
Specified by: getRealUser in interface SaslClientAuthenticationProvider

Copyright © 2007–2020 The Apache Software Foundation. All rights reserved.
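The real-versus-proxy distinction that getRealUser resolves can be sketched in plain Java. This is only an analogy with invented types, not Hadoop's actual User/UserGroupInformation API: the point is that only the real user carries credentials, so anything that must authenticate unwraps the proxy first.

```java
import java.util.Optional;

public class RealUserSketch {

    interface User {
        String name();
        Optional<String> credentials(); // e.g. a Kerberos ticket; empty for proxy users
    }

    // The credentialed identity, analogous to a Kerberos-logged-in user.
    record RealUser(String name, String ticket) implements User {
        public Optional<String> credentials() { return Optional.of(ticket); }
    }

    // A proxy user runs "as" someone else but carries no credentials of its own.
    record ProxyUser(String name, RealUser realUser) implements User {
        public Optional<String> credentials() { return Optional.empty(); }
    }

    // Mirrors getRealUser(): unwrap a proxy to the user who holds credentials;
    // with no proxying involved, the given user is returned directly.
    static User getRealUser(User user) {
        return (user instanceof ProxyUser p) ? p.realUser() : user;
    }

    public static void main(String[] args) {
        RealUser svc = new RealUser("hbase/host@EXAMPLE.COM", "krb-ticket");
        User proxy = new ProxyUser("alice", svc);
        // The RPC must authenticate as the real user, whose credentials exist.
        System.out.println(getRealUser(proxy).name());
    }
}
```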