
Oracle Loader For Hadoop Job Fails With Java.lang.OutOfMemoryError: GC Overhead Limit Exceeded (Doc ID 1966732.1)

Last updated on NOVEMBER 08, 2022

Applies to:

Oracle Loader for Hadoop - Version 3.2.0 and later
Information in this document applies to any platform.


Symptoms:

While using the Oracle Loader for Hadoop (OLH) connector to create Data Pump files in HDFS for Hive tables, the OLH job fails with the following error:

java.lang.OutOfMemoryError: GC overhead limit exceeded
  at java.util.Arrays.copyOf(
  at java.lang.AbstractStringBuilder.expandCapacity(
  at java.lang.AbstractStringBuilder.ensureCapacityInternal(
  at java.lang.AbstractStringBuilder.append(
  at java.lang.StringBuilder.append(
  at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDDLFromFieldSchema(
  at org.apache.hadoop.hive.metastore.MetaStoreUtils.getSchema(
  at oracle.hadoop.loader.lib.input.HiveInputSplit.(
  at oracle.hadoop.loader.lib.input.HiveInputSplit.(
  at oracle.hadoop.loader.lib.input.HiveInputSplit.(
  at oracle.hadoop.loader.lib.input.HiveToAvroInputFormat.addSplits(
  at oracle.hadoop.loader.lib.input.HiveToAvroInputFormat.getSplits(
  at org.apache.hadoop.mapred.JobClient.writeNewSplits(
  at org.apache.hadoop.mapred.JobClient.writeSplits(
  at org.apache.hadoop.mapred.JobClient.access$600(
  at org.apache.hadoop.mapred.JobClient$
  at org.apache.hadoop.mapred.JobClient$
  at Method)
  at org.apache.hadoop.mapred.JobClient.submitJobInternal(
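Note that the stack trace places the failure in the job client itself (JobClient.writeSplits / HiveToAvroInputFormat.getSplits), i.e. the out-of-memory condition occurs while the client JVM computes input splits from the Hive metastore schema, before any map task runs. A common first step for a client-side "GC overhead limit exceeded" error of this kind is to raise the heap of the client JVM via HADOOP_CLIENT_OPTS. This is a hedged sketch only, not the resolution from this document (which requires sign-in); the 4g value is an illustrative assumption to be sized to the actual table schema and split count:

```shell
# Hedged sketch: raise the Hadoop client JVM heap before resubmitting the
# OLH job. HADOOP_CLIENT_OPTS is read by the hadoop launcher script and
# applied to client-side JVMs such as the one running JobClient.
# -Xmx4g is an example value, not a recommendation from this document.
export HADOOP_CLIENT_OPTS="-Xmx4g ${HADOOP_CLIENT_OPTS}"
echo "$HADOOP_CLIENT_OPTS"
```

The OLH job would then be resubmitted from the same shell so the enlarged heap applies to the client that computes the Hive input splits.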


