Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Environment:
spark: 3.1.1
hbase-connector: 1.0.1-SNAPSHOT
Description
I used hbase-spark to read HBase data in a Spark Streaming application, and a token-expired error occurred after 7 days.
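For context, the streaming job is roughly of the following shape. This is a minimal sketch only, with assumed details (table name "XXX", a placeholder socket source, a 10-second batch interval); it is not the actual application code.

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{Get, Result}
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingHBaseRead {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("hbase-spark-streaming-read"), Seconds(10))

    // HBaseContext caches Connection objects in HBaseConnectionCache on the executors.
    val hbaseContext = new HBaseContext(ssc.sparkContext, HBaseConfiguration.create())

    // Placeholder source of row keys; the real job uses a different input stream.
    val keys = ssc.socketTextStream("localhost", 9999)

    // Gets are issued against HBase on every micro-batch, reusing the cached
    // connection (and the token it was authenticated with) for the lifetime
    // of the application.
    val values = hbaseContext.streamBulkGet[String, String](
      TableName.valueOf("XXX"),
      100,
      keys,
      (key: String) => new Get(Bytes.toBytes(key)),
      (result: Result) => Bytes.toString(result.getRow))
    values.print()

    ssc.start()
    ssc.awaitTermination()
  }
}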
Error:
23/02/20 19:29:07 INFO AsyncRequestFutureImpl: id=38643, table=XXX, attempt=11/16, failureCount=1ops, last exception=java.io.IOException: Call to XXX.XXX/XXX.XXX.XXX.XXX:60020 failed on local exception: java.io.IOException: org.apache.hbase.thirdparty.io.netty.handler.codec.DecoderException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): Token has expired on XXX.XXX,60020,1675839912342, tracking started null, retrying after=10009ms, operationsToReplay=1
The HBase RpcClient holds expired connections that have not been cleaned up.
Arthas command:
watch org.apache.hadoop.hbase.ipc.AbstractRpcClient getConnection "target.connections.values.{token.toString}" -x 2 -n 2
Result:
@ArrayList[ @String[Kind: HBASE_AUTH_TOKEN, Service: a88deda4-d74c-483e-af8b-0fb9de7c7594, Ident: ((username=XXX, keyId=575, issueDate=1676858059133, expirationDate=1677462859133, sequenceNumber=772469))], ... @String[Kind: HBASE_AUTH_TOKEN, Service: a88deda4-d74c-483e-af8b-0fb9de7c7594, Ident: ((username=XXX, keyId=573, issueDate=1676656459093, expirationDate=1677261259093, sequenceNumber=587447))], ... @String[Kind: HBASE_AUTH_TOKEN, Service: a88deda4-d74c-483e-af8b-0fb9de7c7594, Ident: ((username=XXX, keyId=571, issueDate=1676454859019, expirationDate=1677059659019, sequenceNumber=430605))], ... ]
The HBase connection seems to be cleaned up only after it has been idle for a period of time, so a connection that is in constant use by a long-running streaming job keeps its original token until the token expires.
Relevant fields:
org.apache.hadoop.hbase.spark.HBaseConnectionCache#connectionMap
org.apache.hadoop.hbase.ipc.AbstractRpcClient#connections
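To illustrate the suspected behavior, here is a simplified sketch of idle-time-based housekeeping; the names and the timeout value are placeholders, not the actual HBaseConnectionCache code.

import java.util.concurrent.ConcurrentHashMap
import scala.collection.JavaConverters._

// Simplified stand-in for a cached connection entry; field names are illustrative.
case class CachedConnection(connection: AnyRef, var lastAccessTime: Long)

object ConnectionCacheSketch {
  val idleTimeoutMs: Long = 10 * 60 * 1000L // assumed idle timeout, not the real default
  val connectionMap = new ConcurrentHashMap[String, CachedConnection]()

  // Entries are removed only once they have been idle longer than the timeout.
  // A streaming job touches its connection every micro-batch, so the entry is
  // never considered idle, the underlying RPC connection is never re-created,
  // and the HBASE_AUTH_TOKEN it was set up with eventually expires.
  def performHousekeeping(now: Long): Unit = {
    connectionMap.asScala.foreach { case (key, cached) =>
      if (now - cached.lastAccessTime > idleTimeoutMs) {
        connectionMap.remove(key) // close and drop idle connections only
      }
    }
  }
}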
Attachments
Issue Links
- links to