After Upgrade to BDA V4.5 DataNodes Fail to Start with: "org.apache.hadoop.hdfs.server.common.Storage: Storage directory [DISK]file:/u01/hadoop/dfs/ has already been used"
(Doc ID 2170965.1)
Last updated on AUGUST 03, 2021
Applies to:
Big Data Appliance Integrated Software - Version 4.1.0 and later
Linux x86-64
Symptoms
NOTE: In the examples that follow, user details, cluster names, hostnames, directory paths, filenames, etc. represent a fictitious sample (and are used to provide an illustrative example only). Any similarity to actual persons, or entities, living or dead, is purely coincidental and not intended in any manner.
After upgrading to BDA V4.5 (CDH 5.7.0) from BDA V4.1.0 (CDH 5.3.2), a DataNode fails to come up. Instead, errors like the following are raised in the DataNode log:
2016-08-06 08:27:16,683 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage for block pool: BP-1575072498-<PRIVATE_IP_HOST>-1447645056196 : BlockPoolSliceStorage.recoverTransitionRead: attempt to load an used block storage: /u02/hadoop/dfs/current/BP-1575072498-<PRIVATE_IP_HOST>-1447645056196
2016-08-06 08:27:16,683 INFO org.apache.hadoop.hdfs.server.common.Storage: Storage directory [DISK]file:/u01/hadoop/dfs/ has already been used.
2016-08-06 08:27:16,699 INFO org.apache.hadoop.hdfs.server.common.Storage: Analyzing storage directories for bpid BP-1575072498-<PRIVATE_IP_HOST>-1447645056196
2016-08-06 08:27:16,699 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to analyze storage directories for block pool BP-1575072498-<PRIVATE_IP_HOST>-1447645056196
java.io.IOException: BlockPoolSliceStorage.recoverTransitionRead: attempt to load an used block storage: /u01/hadoop/dfs/current/BP-1575072498-<PRIVATE_IP_HOST>-1447645056196
at org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceStorage.loadBpStorageDirectories(BlockPoolSliceStorage.java:212)
at org.apache.hadoop.hdfs.server.datanode.BlockPoolSliceStorage.recoverTransitionRead(BlockPoolSliceStorage.java:244)
at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:395)
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:477)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1394)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1355)
at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:317)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:228)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:829)
at java.lang.Thread.run(Thread.java:745)
2016-08-06 08:27:16,699 WARN org.apache.hadoop.hdfs.server.common.Storage: Failed to add storage for block pool: BP-1575072498-<PRIVATE_IP_HOST>-1447645056196 : BlockPoolSliceStorage.recoverTransitionRead: attempt to load an used block storage: /u01/hadoop/dfs/current/BP-1575072498-<PRIVATE_IP_HOST>-1447645056196
2016-08-06 08:27:16,700 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool (Datanode Uuid unassigned) service to <HOSTNAME1>.<DOMAIN>/<PRIVATE_IP_HOST>:8022. Exiting.
java.io.IOException: All specified directories are failed to load.
...
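As a hypothetical diagnostic sketch (not part of this note; the function name and sample paths are illustrative), the block-pool directories under each DataNode data directory can be listed so their IDs can be compared against the BP-... identifier named in the errors above:

```shell
# find_block_pools: print "<data-dir> <block-pool-id>" for every block-pool
# directory found under the given DataNode data directories (the entries of
# dfs.datanode.data.dir, e.g. /u01/hadoop/dfs, /u02/hadoop/dfs).
find_block_pools() {
  for d in "$@"; do
    # HDFS creates one BP-* subdirectory per block pool under <dir>/current/
    for bp in "$d"/current/BP-*; do
      if [ -d "$bp" ]; then
        printf '%s %s\n' "$d" "$(basename "$bp")"
      fi
    done
  done
}

# Example invocation (paths as in the log excerpt above):
# find_block_pools /u01/hadoop/dfs /u02/hadoop/dfs
```

The underlying cause and fix for this error are covered in the Cause and Solution sections of the full document; this listing only helps confirm which data directories contain the block pool the DataNode reports as "already been used".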
Cause
To view full details, sign in with your My Oracle Support account.