
ODI Loading Failure ORA-12899: value too large for column "<SCHEMA>"."C$_0<TABLE>"."<COLUMN>" (Doc ID 2503712.1)

Last updated on OCTOBER 17, 2019

Applies to:

Oracle Data Integrator - Version 11.1.1.3.0 and later
Information in this document applies to any platform.

Goal

An Oracle Data Integrator (ODI) job loads from a flat file to a table. It fails with the following error:

ODI-1217: Session <SESSION_NAME> fails with return code 12899.
ODI-1226: Step <STEP> fails after 1 attempt(s).
ODI-1240: Flow <FLOW> fails while performing a Loading operation. This flow loads target table <TABLE>.
ODI-1228: Task <LOADING TASK> fails on the target ORACLE connection <DATA SERVER>.
Caused By: java.sql.BatchUpdateException: ORA-12899: value too large for column "<SCHEMA>"."<C$ TABLE>"."<COLUMN>" (actual: <SIZE>, maximum: 50)
  at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:13349)
  ...

The integration is failing when inserting into the temporary C$ table.

When the column size on the flat file source Datastore is extended from string(50) to string(80), the C$ table is created with the larger size and the Mapping execution is successful.
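
As a minimal sketch of why this happens, assume a file Datastore column declared as string(50): the Loading Knowledge Module generates the C$ work table with a matching VARCHAR2(50) column, so any record longer than 50 characters raises ORA-12899 during the batch insert. The schema, table, and column names below are hypothetical placeholders, not taken from the failing environment.

  -- Work table generated from a Datastore column defined as string(50) (names are placeholders):
  CREATE TABLE ODISTAGE."C$_0CUSTOMER" (
    "CUST_NAME" VARCHAR2(50)
  );

  -- A source record whose value exceeds 50 characters fails the staging insert:
  INSERT INTO ODISTAGE."C$_0CUSTOMER" ("CUST_NAME")
  VALUES ('<a string of more than 50 characters>');
  -- ORA-12899: value too large for column "ODISTAGE"."C$_0CUSTOMER"."CUST_NAME"

  -- Extending the Datastore column to string(80) regenerates the work table with
  -- VARCHAR2(80), which is why the Mapping then completes successfully.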

  1. How can we handle situations where the data for a particular source column is longer than the expected length?

  2. How can we trim data before it is inserted into the C$ table? (See the sketch after this list.)
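
Whether it matches the guidance in the restricted Solution section is not confirmed here, but one general technique for question 2 is to truncate the value with the standard Oracle SUBSTR function so that it fits the declared length. Where such an expression can be evaluated (source, staging, or target) depends on the topology and is not specified here; the column names and the 50-character limit below are hypothetical placeholders.

  -- Hypothetical mapping expression on the affected column, truncating to the declared length:
  SUBSTR(SRC.CUST_NAME, 1, 50)

  -- The same idea as plain SQL when loading onward from the work table:
  INSERT INTO TARGET_SCHEMA.CUSTOMER (CUST_NAME)
  SELECT SUBSTR(CUST_NAME, 1, 50) FROM ODISTAGE."C$_0CUSTOMER";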

Solution

To view full details, sign in with your My Oracle Support account.
