Description
Saw the following on Lars's clusters (he's up on 0.16.1 and a very recent 0.1 branch) while trying to run a scan over all his content:
java.io.IOException: java.io.IOException: HStoreScanner failed construction
at org.apache.hadoop.hbase.HStore$StoreFileScanner.<init>(HStore.java:2241)
at org.apache.hadoop.hbase.HStore$HStoreScanner.<init>(HStore.java:2362)
at org.apache.hadoop.hbase.HStore.getScanner(HStore.java:2152)
at org.apache.hadoop.hbase.HRegion$HScanner.<init>(HRegion.java:1640)
at org.apache.hadoop.hbase.HRegion.getScanner(HRegion.java:1214)
at org.apache.hadoop.hbase.HRegionServer.openScanner(HRegionServer.java:1448)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at org.apache.hadoop.hbase.ipc.HbaseRPC$Server.call(HbaseRPC.java:413)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:910)
Caused by: java.io.FileNotFoundException: File hdfs://lv1-xen-pdc-2.worldlingo.com:9000/hbase/pdc-docs/1733592281/contents/mapfiles/3435064940161142159/data does not exist.
at org.apache.hadoop.dfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:341)
at org.apache.hadoop.fs.FileSystem.getLength(FileSystem.java:538)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1387)
at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1382)
at org.apache.hadoop.io.MapFile$Reader.<init>(MapFile.java:254)
at org.apache.hadoop.io.MapFile$Reader.<init>(MapFile.java:242)
at org.apache.hadoop.hbase.HStoreFile$HbaseMapFile$HbaseReader.<init>(HStoreFile.java:600)
at org.apache.hadoop.hbase.HStoreFile$BloomFilterMapFile$Reader.<init>(HStoreFile.java:655)
at org.apache.hadoop.hbase.HStoreFile$HalfMapFileReader.<init>(HStoreFile.java:758)
at org.apache.hadoop.hbase.HStoreFile.getReader(HStoreFile.java:424)
at org.apache.hadoop.hbase.HStore$StoreFileScanner.<init>(HStore.java:2216)
... 11 more
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at org.apache.hadoop.hbase.RemoteExceptionHandler.decodeRemoteException(RemoteExceptionHandler.java:82)
at org.apache.hadoop.hbase.HTable$ClientScanner.nextScanner(HTable.java:874)
at org.apache.hadoop.hbase.HTable$ClientScanner.next(HTable.java:915)
at org.apache.hadoop.hbase.hql.SelectCommand.scanPrint(SelectCommand.java:233)
at org.apache.hadoop.hbase.hql.SelectCommand.execute(SelectCommand.java:100)
at org.apache.hadoop.hbase.hql.HQLClient.executeQuery(HQLClient.java:50)
at org.apache.hadoop.hbase.Shell.main(Shell.java:114)
The scanner breaks when it hits the above exception. The odd thing is that the referenced mapfile comes from a region that was deleted four days ago, after all references to it had purportedly been released:
2008-03-16 15:13:36,744 DEBUG org.apache.hadoop.hbase.HRegion: DELETING region hdfs://lv1-xen-pdc-2.worldlingo.com:9000/hbase/pdc-docs/1733592281
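For quick triage on the HDFS side, something like the following can confirm that the mapfile's data file (and its whole region directory) is really gone rather than merely unreadable. This is a minimal sketch against the stock Hadoop FileSystem API; the class name and printed labels are my own, and the path is copied verbatim from the FileNotFoundException above:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckStoreFile {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Data file named in the FileNotFoundException above.
    Path data = new Path("hdfs://lv1-xen-pdc-2.worldlingo.com:9000/hbase/"
        + "pdc-docs/1733592281/contents/mapfiles/3435064940161142159/data");
    FileSystem fs = data.getFileSystem(conf);
    // If the DELETING above really removed the region directory, both of
    // these print false, and whatever handed the scanner this store file
    // is holding a stale reference.
    System.out.println("data file exists:  " + fs.exists(data));
    System.out.println("region dir exists: " + fs.exists(new Path(
        "hdfs://lv1-xen-pdc-2.worldlingo.com:9000/hbase/pdc-docs/1733592281")));
  }
}

If the region directory is indeed gone, the problem is on the HBase side: something (scanner metadata, a half/reference file, or the region's store file list) still points at the deleted region's mapfile.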