Loading Data Into HDFS Failed With Error: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException) (Doc ID 2430530.1)

Last updated on AUGUST 01, 2018

Applies to:

Big Data Appliance Integrated Software - Version 4.3.0 and later
Information in this document applies to any platform.

Symptoms

Loading data into HDFS failed with the following error:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException): The directory item limit of /user/hive/warehouse/xxx2hive_avro.ccr is exceeded: limit=1048576 items=1048576

In Cloudera Manager, the 'File Count Limit' for the directory /user/hive/warehouse/xxx2hive_avro.ccr was set to "No Limit", and the 'Disk Space Limit' was also set to "No Limit".
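Note that the limit in the error message (limit=1048576) matches the default value of the HDFS NameNode property dfs.namenode.fs-limits.max-directory-items, which caps the number of child entries per directory and is enforced independently of any Cloudera Manager quota. As general HDFS background (not necessarily the resolution documented in this article), the property is set in hdfs-site.xml; the value 2000000 below is only an illustration, and HDFS rejects values above 6400000:

```xml
<!-- hdfs-site.xml on the NameNode. Illustrative sketch only:
     2000000 is an example value; HDFS caps this property at 6400000. -->
<property>
  <name>dfs.namenode.fs-limits.max-directory-items</name>
  <value>2000000</value>
</property>
```

The current number of children in the affected directory can be confirmed with `hdfs dfs -count /user/hive/warehouse/xxx2hive_avro.ccr`, which reports the directory, file, and byte counts for that path.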

Cause


