Commit a700eae

tabs to spaces; add one more error message, without any explanation right now

steveloughran committed May 19, 2016
1 parent 49a0a4d
Showing 6 changed files with 120 additions and 118 deletions.
132 changes: 68 additions & 64 deletions sections/errors.md
@@ -47,8 +47,8 @@ the stack trace go away.
```
WARN ipc.Client (Client.java:run(676)) - Couldn't setup connection for [email protected] to /172.22.97.127:8020
org.apache.hadoop.ipc.RemoteException(javax.security.sasl.SaslException): GSS initiate failed
  at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:375)
  at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:558)
```

This is widely agreed to be one of the most useless error messages you can see. The only
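A first diagnostic step, standard practice rather than anything this message tells you, is to rerun the client with the JVM's Kerberos debugging switched on:

```
# enable JRE-level Kerberos and SPNEGO debug logging for the next command
export HADOOP_OPTS="-Dsun.security.krb5.debug=true -Dsun.security.spnego.debug=true"
hadoop fs -ls /
```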
@@ -247,19 +247,19 @@ Usually in a stack trace like

```
Caused by: java.net.SocketTimeoutException: Receive timed out
  at java.net.PlainDatagramSocketImpl.receive0(Native Method)
  at java.net.AbstractPlainDatagramSocketImpl.receive(AbstractPlainDatagramSocketImpl.java:146)
  at java.net.DatagramSocket.receive(DatagramSocket.java:816)
  at sun.security.krb5.internal.UDPClient.receive(NetClient.java:207)
  at sun.security.krb5.KdcComm$KdcCommunication.run(KdcComm.java:390)
  at sun.security.krb5.KdcComm$KdcCommunication.run(KdcComm.java:343)
  at java.security.AccessController.doPrivileged(Native Method)
  at sun.security.krb5.KdcComm.send(KdcComm.java:327)
  at sun.security.krb5.KdcComm.send(KdcComm.java:219)
  at sun.security.krb5.KdcComm.send(KdcComm.java:191)
  at sun.security.krb5.KrbAsReqBuilder.send(KrbAsReqBuilder.java:319)
  at sun.security.krb5.KrbAsReqBuilder.action(KrbAsReqBuilder.java:364)
  at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:735)
```

This means the UDP socket awaiting a response from the KDC eventually gave up.
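One common mitigation, assuming the timeouts come from UDP packets being dropped or truncated on the way to the KDC, is to force the Kerberos client library onto TCP in `krb5.conf`:

```
[libdefaults]
  # requests larger than this size (in bytes) are sent over TCP;
  # a value of 1 effectively disables UDP for all KDC traffic
  udp_preference_limit = 1
```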
@@ -458,27 +458,27 @@ Not Kerberos, SASL itself
```
16/01/22 09:44:17 WARN Client: Exception encountered while connecting to the server :
javax.security.sasl.SaslException: DIGEST-MD5: No common protection layer between client and server
  at com.sun.security.sasl.digest.DigestMD5Client.checkQopSupport(DigestMD5Client.java:418)
  at com.sun.security.sasl.digest.DigestMD5Client.evaluateChallenge(DigestMD5Client.java:221)
  at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:413)
  at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:558)
  at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:373)
  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:727)
  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:723)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:422)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:722)
  at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
  at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
  at org.apache.hadoop.ipc.Client.call(Client.java:1397)
  at org.apache.hadoop.ipc.Client.call(Client.java:1358)
  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
  at com.sun.proxy.$Proxy23.renewLease(Unknown Source)
  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.renewLease(ClientNamenodeProtocolTranslatorPB.java:590)
  at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
```
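The usual cause, though the trace does not say so directly, is that client and server disagree on the SASL quality of protection: the `hadoop.rpc.protection` lists on the two sides must share at least one value. A sketch of a `core-site.xml` entry that has to overlap on both ends:

```
<property>
  <name>hadoop.rpc.protection</name>
  <!-- a comma-separated subset of: authentication, integrity, privacy -->
  <value>privacy</value>
</property>
```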

## On Windows: `No authority could be contacted for authentication`
@@ -503,37 +503,37 @@ because it surfaces in security code which assumes that all failures must be Kerberos related

```
2016-04-06 11:00:35,796 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain java.io.IOException: java.lang.RuntimeException: Could not resolve Kerberos principal name: java.net.UnknownHostException: xubunty: xubunty: unknown error
  at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:347)
  at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:114)
  at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:290)
  at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:108)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:781)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1138)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:432)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2423)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2310)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2357)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2538)
  at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2562)
Caused by: java.lang.RuntimeException: Could not resolve Kerberos principal name: java.net.UnknownHostException: xubunty: xubunty: unknown error
  at org.apache.hadoop.security.AuthenticationFilterInitializer.getFilterConfigMap(AuthenticationFilterInitializer.java:90)
  at org.apache.hadoop.http.HttpServer2.getFilterProperties(HttpServer2.java:455)
  at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:445)
  at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:340)
  ... 11 more
Caused by: java.net.UnknownHostException: xubunty: xubunty: unknown error
  at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
  at org.apache.hadoop.security.SecurityUtil.getLocalHostName(SecurityUtil.java:224)
  at org.apache.hadoop.security.SecurityUtil.replacePattern(SecurityUtil.java:192)
  at org.apache.hadoop.security.SecurityUtil.getServerPrincipal(SecurityUtil.java:147)
  at org.apache.hadoop.security.AuthenticationFilterInitializer.getFilterConfigMap(AuthenticationFilterInitializer.java:87)
  ... 14 more
Caused by: java.net.UnknownHostException: xubunty: unknown error
  at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
  at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
  at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
  at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
  ... 18 more
2016-04-06 11:00:35,799 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2016-04-06 11:00:35,806 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
```

@@ -551,3 +551,7 @@ in stack traces belonging to the security classes, wrapped with exception messages
implying a Kerberos problem. Always follow down to the innermost exception in a trace
as the immediate symptom of the problem; the layers above are attempts to interpret it,
attempts which may or may not be correct.
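Here the innermost exception is the real story: the local hostname (`xubunty`) does not resolve, so the principal pattern cannot be expanded. A quick check, and one possible fix for a single-box setup (the address below is an assumption; adjust it for your network):

```
# verify that the local hostname resolves
hostname
getent hosts "$(hostname)"

# if it does not, one option is an /etc/hosts entry
echo "127.0.1.1 xubunty" | sudo tee -a /etc/hosts
```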

## Against Active Directory: `Realm not local to KDC while getting initial credentials`

Nobody knows.
2 changes: 1 addition & 1 deletion sections/ipc.md
@@ -127,7 +127,7 @@ public class MyRpcSecurityInfo extends SecurityInfo { ... }

The resource file `META-INF/services/org.apache.hadoop.security.SecurityInfo` lists all RPC APIs which have a matching SecurityInfo subclass in that JAR.

    org.example.rpc.MyRpcSecurityInfo

The RPC framework will read this file and build up the security information for the APIs (server side? Client side? both?)
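For illustration only, a minimal sketch of what such a subclass might look like; `MyRpcProtocol` and the configuration key are hypothetical, and the shape follows the pattern of Hadoop's own `SecurityInfo` implementations:

```java
import java.lang.annotation.Annotation;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.KerberosInfo;
import org.apache.hadoop.security.SecurityInfo;
import org.apache.hadoop.security.token.TokenInfo;

/** Hypothetical RPC protocol this SecurityInfo describes. */
interface MyRpcProtocol { /* RPC methods elided */ }

public class MyRpcSecurityInfo extends SecurityInfo {

  @Override
  public KerberosInfo getKerberosInfo(Class<?> protocol, Configuration conf) {
    if (!MyRpcProtocol.class.equals(protocol)) {
      return null;  // not our protocol; some other SecurityInfo may claim it
    }
    // KerberosInfo is an annotation, so return an anonymous implementation
    return new KerberosInfo() {
      @Override
      public Class<? extends Annotation> annotationType() {
        return KerberosInfo.class;
      }
      @Override
      public String serverPrincipal() {
        // name of the *configuration key* holding the server's principal
        return "my.rpc.kerberos.principal";
      }
      @Override
      public String clientPrincipal() {
        return null;
      }
    };
  }

  @Override
  public TokenInfo getTokenInfo(Class<?> protocol, Configuration conf) {
    return null;  // this protocol does not use delegation tokens
  }
}
```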

2 changes: 0 additions & 2 deletions sections/jdk_versions.md
@@ -17,8 +17,6 @@
> HP Lovecraft [The Whisperer in Darkness](http://www.hplovecraft.com/writings/texts/fiction/wid.aspx), 1931


# Java and JDK Versions

Kerberos support is built into the Java JRE. It comes in two parts
48 changes: 24 additions & 24 deletions sections/terrors.md
@@ -46,25 +46,25 @@ A stack trace
```
16/01/16 01:42:39 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "os-u14-2-2.novalocal/172.22.73.243"; destination host is: "os-u14-2-3.novalocal":8020;
  at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:773)
  at org.apache.hadoop.ipc.Client.call(Client.java:1431)
  at org.apache.hadoop.ipc.Client.call(Client.java:1358)
  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
  at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:252)
  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
  at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
  at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2116)
  at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1315)
  at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1311)
  at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
  at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1311)
  at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1424)
```

This looks like a normal "not logged in" problem, except for some little facts:
@@ -84,11 +84,11 @@ Default principal: qe@REALM
Valid starting Expires Service principal
01/16/2016 11:07:23 01/16/2016 21:07:23 krbtgt/REALM@REALM
    renew until 01/23/2016 11:07:23
01/16/2016 13:13:11 01/16/2016 21:07:23 HTTP/hdfs-3-5@
    renew until 01/23/2016 11:07:23
01/16/2016 13:13:11 01/16/2016 21:07:23 HTTP/hdfs-3-5@REALM
    renew until 01/23/2016 11:07:23
```

See that? There's a principal which doesn't have a stated realm. Does that matter?
@@ -147,9 +147,9 @@ A `klist` then returns a list of credentials without this realm-less one in it.
```
Valid starting Expires Service principal
01/17/2016 14:49:08 01/18/2016 00:49:08 krbtgt/REALM@REALM
    renew until 01/24/2016 14:49:08
01/17/2016 14:49:16 01/18/2016 00:49:08 HTTP/hdfs-3-5@REALM
    renew until 01/24/2016 14:49:08
```

Because this was a virtual cluster, DNS/RDNS probably wasn't working, presumably Kerberos
6 changes: 3 additions & 3 deletions sections/web_and_rest.md
@@ -11,7 +11,7 @@
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->

# Web, REST and SPNEGO

SPNEGO is the acronym for the protocol by which HTTP clients can authenticate with a web site using Kerberos. It allows the client to identify and authenticate itself to a web site or a web service.
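As a concrete example (the host, port and path here are illustrative, and this assumes a curl build with GSS-API/SPNEGO support plus a current `kinit` ticket):

```
# authenticate to a SPNEGO-protected WebHDFS endpoint with the cached Kerberos TGT
curl --negotiate -u : "http://namenode.example.com:50070/webhdfs/v1/tmp?op=LISTSTATUS"
```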
@@ -137,7 +137,7 @@ TODO:



```java
private static UserGroupInformation getUser(HttpServletRequest req) {
  // the authenticated remote user, as established by the SPNEGO filter
  String remoteUser = req.getRemoteUser();
  UserGroupInformation callerUGI = null;
  if (remoteUser != null) {
    callerUGI = UserGroupInformation.createRemoteUser(remoteUser);
  }
  return callerUGI;
}
```

This can then be used to process the events:
```java
@PUT
@Path("/jobs/{jobid}/tasks/{taskid}/attempts/{attemptid}/state")
@Produces({ MediaType.APPLICATION_JSON, MediaType.APPLICATION_XML })
  // ... (rest of the handler omitted)
```