How to Disable "value too large for column" and Related Errors in the BDS Hadoop Cluster bigdata-log4j.log File
(Doc ID 2400365.1)
Last updated on JANUARY 27, 2020
Applies to:
Oracle Big Data SQL - Version 3.1.0 and later
Linux x86-64
Goal
This document describes how to disable errors such as "value too large for column" and "value exceeds byte length limit(n) actual(n)" that appear in the BDS bigdata-log4j.log file on the Hadoop cluster.
If the com.oracle.bigdata.overflow={"action":"error"} parameter is explicitly defined when creating a BDS external table, error messages like the following appear in bigdata-log4j.log on the Hadoop cluster (a sketch of such a table definition follows the sample messages):
18/04/11 07:44:56 ERROR database.CharacterColumn: value too large for column "COL1" (actual: 13, maximum: 10)
18/04/11 07:44:57 ERROR database.CharacterColumn: Value exceeds byte length limit(<number>) actual(<number>)
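For illustration, a minimal sketch of an external table definition that explicitly sets the overflow action to "error" and can therefore produce messages like those above. The table name, Hive table name, and directory are hypothetical; COL1 mirrors the column named in the sample message.

CREATE TABLE sales_ext (
  col1 VARCHAR2(10),   -- values longer than 10 bytes overflow this column
  col2 NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_HIVE
  DEFAULT DIRECTORY DEFAULT_DIR
  -- "error": an overflowing value is rejected and an error is written to bigdata-log4j.log
  ACCESS PARAMETERS (
    com.oracle.bigdata.tablename=default.sales
    com.oracle.bigdata.overflow={"action":"error"}
  )
)
REJECT LIMIT UNLIMITED;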
For details about "com.oracle.bigdata.overflow", refer to: Understanding the Big Data SQL Access Parameter "com.oracle.bigdata.overflow" (Doc ID 2336619.1).
These errors indicate that parser errors occurred because of data overflow. Depending on the amount of overflowing data, a large number of error messages can be written to bigdata-log4j.log, which may lead to disk space problems.
Solution
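As a minimal sketch of one way to stop these parser errors at the source (object names are hypothetical), the overflow action can be changed from "error" to "truncate", as documented in Doc ID 2336619.1, so that overflowing values are silently truncated to the column length instead of being rejected and logged:

DROP TABLE sales_ext;

CREATE TABLE sales_ext (
  col1 VARCHAR2(10),
  col2 NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_HIVE
  DEFAULT DIRECTORY DEFAULT_DIR
  -- "truncate": an overflowing value is cut to the column length and no error is logged
  ACCESS PARAMETERS (
    com.oracle.bigdata.tablename=default.sales
    com.oracle.bigdata.overflow={"action":"truncate"}
  )
)
REJECT LIMIT UNLIMITED;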