ORA-01401 / ORA-12899 / ORA-01461 While Importing Or Loading Data In An AL32UTF8 / UTF8 (Unicode) Or Other Multibyte NLS_CHARACTERSET Database. (Doc ID 1297961.1)

Last updated on DECEMBER 05, 2016

Applies to:

Oracle Database - Enterprise Edition - Version 8.1.7.4 and later
Oracle Database - Standard Edition - Version 8.1.7.4 and later
Information in this document applies to any platform.

Symptoms



Import (both the old imp tool and Data Pump impdp) gives errors like

IMP-00019: row rejected due to ORACLE error 1401
IMP-00003: ORACLE error 1401 encountered
ORA-01401: inserted value too large for column

or

IMP-00058: ORACLE error 1461 encountered
ORA-01461: can bind a LONG value only for insert into a LONG column

or from 10g onwards:

IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "SCHEMA_NAME"."TABLE_NAME"."COLUMN_NAME" (actual: <xx>, maximum: <yy>)

or

ORA-02374: conversion error loading table "SCHEMA_NAME"."TABLE_NAME"
ORA-12899: value too large for column COLUMN_NAME (actual: <xx>, maximum: <yy>)

or

ORA-02375: conversion error loading table "SCHEMA_NAME.TABLE_NAME" partition "PARTITION_NAME"
ORA-12899: value too large for column COLUMN_NAME (actual: <xx>, maximum: <yy>)
ORA-02372: data for row:

or

ORA-31693: Table data object "SCHEMA_NAME"."TABLE_NAME" failed to load/unload and is being skipped due to error:
ORA-12899: value too large for column "SCHEMA_NAME"."TABLE_NAME"."COLUMN_NAME" (actual: <xx>, maximum: <yy>)

 

Note that when the column uses CHAR length semantics, the ORA-12899 error reports the "actual" amount in characters.
Hence one can see an error that at first sight looks contradictory, like
ORA-12899: value too large for column COMMENT (actual: 2456, maximum: 4000) - here 2456 is expressed in characters, not bytes
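
Whether a column was created with BYTE or CHAR length semantics can be checked in the data dictionary. A minimal sketch, reusing the SCHEMA_NAME / TABLE_NAME placeholders from the errors above (substitute your own owner and table name):

SQL> select column_name, data_type, char_used, char_length, data_length
  2  from all_tab_columns
  3  where owner = 'SCHEMA_NAME' and table_name = 'TABLE_NAME';

CHAR_USED is 'C' for CHAR semantics and 'B' for BYTE semantics; CHAR_LENGTH is the declared length in the chosen semantics, while DATA_LENGTH is always the maximum column size in bytes.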


Or you use a database link from a non-AL32UTF8 database and move the data to an AL32UTF8 database using "create table ... as select ..." or "insert into ... select ... from ...", and you get ORA-01401: inserted value too large for column or ORA-12899: value too large for column "SCHEMA_NAME"."TABLE_NAME"."COLUMN_NAME" (actual: <xx>, maximum: <yy>).
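
A minimal sketch of this scenario; the table, column and database link names (TARGET_TABLE, SOURCE_TABLE, COMMENTS, REMOTE_DB) are hypothetical. With the default BYTE length semantics the target column can turn out to be too small once the data is converted to AL32UTF8; sizing it in characters (within the datatype's byte limit) avoids that in this example:

SQL> -- hypothetical names; remote_db points to the non-AL32UTF8 source database
SQL> create table target_table (comments varchar2(100 char));

SQL> insert into target_table
  2  select comments from source_table@remote_db;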

Or other tools are used to load/insert data and ORA-12899: value too large for column is seen

Please note:

* If you want to change a WHOLE database to UTF8 or AL32UTF8 then please follow <Note 260192.1> Changing the NLS_CHARACTERSET to AL32UTF8/UTF8 (Unicode) in 8i, 9i, 10g and 11g.
This note is mainly intended for people who want to export/import only a subset of a database, a certain user or a collection of tables (a minimal expdp/impdp sketch for that case is shown after these notes).

* When using UTF8 / AL32UTF8 or other multibyte character sets, it might be good to read <Note 788156.1> AL32UTF8 / UTF8 (Unicode) Database Character Set Implications.

* Do NOT use Impdp when going to (AL32)UTF8 or another multibyte character set on ALL 10g versions lower than 10.2.0.4 (including 10.1.0.5). 11.1.0.6 is also affected.
It will provoke data corruption unless <Patch 5874989> is applied on the Impdp side. Expdp is not affected, hence the data in the dump file is correct. The "old" exp/imp tools are also not affected.
This problem is fixed in the 10.2.0.4 and 11.1.0.7 patch sets, and in 11.2.0.1 and up.
For Windows the fix is included in:
10.1.0.5.0 Patch 20 (10.1.0.5.20P) or later, see <Note 276548.1>
10.2.0.3.0 Patch 11 (10.2.0.3.11P) or later, see <Note 342443.1>
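
For the subset case mentioned above, a minimal Data Pump sketch exporting and importing one schema; the schema name SCOTT, the password placeholder and the log/dump file names are hypothetical, and DATA_PUMP_DIR is assumed to exist as a directory object:

expdp system/<password> SCHEMAS=SCOTT DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp LOGFILE=scott_exp.log

impdp system/<password> SCHEMAS=SCOTT DIRECTORY=DATA_PUMP_DIR DUMPFILE=scott.dmp LOGFILE=scott_imp.log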

Changes


Importing or loading data from a US7ASCII or 8-bit (WE8MSWIN1252, WE8ISO8859P1, etc.) source into a UTF8, AL32UTF8 or other multibyte character set database.

The NLS_CHARACTERSET of a database can be found using this select:

SQL> conn / as sysdba
Connected.
SQL> select value from NLS_DATABASE_PARAMETERS where parameter='NLS_CHARACTERSET';

VALUE
----------------------------------------
AL32UTF8
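
In an AL32UTF8 database, a character that occupies one byte in an 8-bit character set such as WE8MSWIN1252 can occupy two or more bytes. Assuming the client NLS_LANG is set correctly so the character is passed through unchanged, this expansion can be illustrated with LENGTH (characters) versus LENGTHB (bytes):

SQL> select length('é') chars, lengthb('é') bytes from dual;

     CHARS      BYTES
---------- ----------
         1          2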

Cause
