
Loading Data Into HDFS Failed With Error: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException) (Doc ID 2430530.1)

Last updated on JULY 20, 2024

Applies to:

Big Data Appliance Integrated Software - Version 4.3.0 and later
Information in this document applies to any platform.

Symptoms

NOTE: In the examples that follow, user details, table name, company name, email, hostnames, etc. represent a fictitious sample (and are used to provide an illustrative example only). Any similarity to actual persons, or entities, living or dead, is purely coincidental and not intended in any manner.

Loading data into HDFS failed with the following error:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.FSLimitException$MaxDirectoryItemsExceededException): The directory item limit of /user/hive/<DIRECTORY_PATH>/<FILE>_hive_avro_ccr is exceeded: limit=1048576 items=1048576
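
The limit reported in the error matches the default value of the NameNode property dfs.namenode.fs-limits.max-directory-items, which caps the number of immediate children a single directory may hold. As a minimal sketch (not taken from this document), the immediate item count of the affected directory can be checked with the Hadoop FileSystem API; the class name and path below are illustrative placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal sketch: count the immediate children of an HDFS directory.
    // Substitute the directory named in the error message for the placeholder path.
    public class DirectoryItemCount {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            Path dir = new Path("/user/hive/EXAMPLE_PATH/EXAMPLE_hive_avro_ccr");
            // The per-directory limit is enforced against this immediate-child
            // count, not the recursive file count of the whole subtree.
            int items = fs.listStatus(dir).length;
            System.out.println(dir + ": " + items + " immediate items");
            fs.close();
        }
    }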

In Cloudera Manager, the 'File Count Limit' for the directory /user/hive/<DIRECTORY_PATH>/<FILE>_hive_avro_ccr was set to "No Limit" and the 'Disk Space Limit' was also set to "No Limit".
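
The 'File Count Limit' and 'Disk Space Limit' settings shown in Cloudera Manager appear to correspond to HDFS name and space quotas, which are tracked separately from the NameNode directory-item limit above. As a hedged sketch, the quota values on the directory can be read back through ContentSummary, where a value of -1 indicates that no quota is configured (which Cloudera Manager displays as "No Limit"); again, the class name and path are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.ContentSummary;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal sketch: print the name quota and space quota set on a directory.
    // A returned value of -1 means no quota is configured for that dimension.
    public class QuotaCheck {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            Path dir = new Path("/user/hive/EXAMPLE_PATH/EXAMPLE_hive_avro_ccr");
            ContentSummary cs = fs.getContentSummary(dir);
            System.out.println("name quota:  " + cs.getQuota());
            System.out.println("space quota: " + cs.getSpaceQuota());
            fs.close();
        }
    }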

Cause

To view full details, sign in with your My Oracle Support account.


