How to Migrate Large Amount of Binary XML Data between Databases
(Doc ID 2309649.1)
Last updated on APRIL 22, 2019
Applies to: Oracle Database - Enterprise Edition - Version 184.108.40.206.0 to 220.127.116.11 [Release 11.2 to 12.2]
Oracle Database Exadata Cloud Machine - Version N/A and later
Oracle Cloud Infrastructure - Database Service - Version N/A and later
Oracle Database Cloud Exadata Service - Version N/A and later
Oracle Database Exadata Express Cloud Service - Version N/A and later
Information in this document applies to any platform.
Why write this note?
The purpose of this note is to detail the recommended methods for moving a large amount of binary XML data between databases. Please contact Oracle Support for guidance on this note if additional help is required.
The note is intended for all Oracle users and developers using the Binary XMLType datatype.
The Transportable Tablespace (TTS) feature has limited support for binary XML in source database versions 18.104.22.168, 22.214.171.124, 126.96.36.199 and 188.8.131.52. Starting with version 184.108.40.206 as the source, TTS should work, albeit at times it will go through re-encoding.
Can one use Transportable Tablespace Option to move Binary XML Data?
For source versions 220.127.116.11, 18.104.22.168, 22.214.171.124 and 126.96.36.199 at minimum, before attempting any migration please verify that patch 16178854 is installed in the source database. Please note, Bug 16178854 is fixed in versions 188.8.131.52 and 184.108.40.206 and above. Without this patch, one would see the following errors after import on the destination database:
1) ORA-00600: internal error code, arguments: [qmcxdEvtNextPPP], , , , , , , , , , , 
2) ORA-00600: internal error code, arguments: [qmcxdGetQNameInfo2], , , , , , , , , , , 
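As a quick check for the patch (a sketch; this assumes the standard OPatch inventory layout and must be run as the Oracle software owner on the source host), one can search the Oracle home inventory:

```shell
# Look for patch 16178854 in the Oracle home inventory on the source host
$ORACLE_HOME/OPatch/opatch lsinventory | grep 16178854
```

If the grep returns no match, the patch is not installed in that Oracle home.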
If after installing this patch you get "ORA-39945: Token conflicting with existing tokens" then it might be worth looking into conventional datapump as an option.
Starting with 220.127.116.11 as the source and 18.104.22.168 and higher as the destination, no patches are required. With these versions, if there are conflicting tokens, documents are re-encoded at import. We recommend dropping/disabling indexes on XMLType tables (and tables containing an XMLType column); this makes re-encoding faster. After the import, the indexes can be recreated/re-enabled.
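As a sketch of the index handling around the import (the schema and index names here are hypothetical), the indexes can be marked unusable before the import and rebuilt afterwards:

```sql
-- Before import: mark the index on the XMLType table unusable
-- (MYSCHEMA.XML_DOCS_IDX is a hypothetical index name)
ALTER INDEX myschema.xml_docs_idx UNUSABLE;

-- ... run the TTS import; re-encoding proceeds without index maintenance ...

-- After import: rebuild the index
ALTER INDEX myschema.xml_docs_idx REBUILD;
```

Dropping and recreating the indexes works equally well; marking them unusable simply avoids having to keep the CREATE INDEX DDL at hand.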
My database is too big to move using conventional datapump. What are the possible options to move Binary XML Data?
(i) Use attached TTS migration script.
WARNING: Use this script only if the destination database is a fresh install with no user data.
Per typical maintenance operations, be sure to do a full database backup prior to implementing this method and test this on a development/test environment prior to implementing on production.
Short Description of problem
Binary XML uses tokens to compress/encode XML data. The token IDs and their values (tag names) are stored in a central token table. During TTS import the tokens need to be reused, which means tokens on the exporting side and importing side cannot conflict with each other. In case of a conflict, no XML data can be imported and an error message is raised during TTS import.
Solution provided by these scripts
We prevent a token conflict during TTS import by pruning the existing token table on the importing side and using only the tokens from the exporting side.
The script outputs the operations it is executing (SET SERVEROUTPUT ON) and spools to a .log file. Please check this output for any errors (ORA-...). If you run into any errors, please contact support. The script completes correctly only if no error is raised during execution.
If the endianness is not the same between the source and destination databases, please use RMAN CONVERT to change the format of the database files.
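A minimal RMAN sketch of the conversion step (the datafile path, source platform name and output format are hypothetical; valid platform names can be listed from V$TRANSPORTABLE_PLATFORM):

```sql
RMAN> CONVERT DATAFILE '/u01/stage/xml_data01.dbf'
        FROM PLATFORM 'Solaris[tm] OE (64-bit)'
        FORMAT '/u01/converted/%N_%f.dbf';
```

The conversion can be run on either side; running it on the destination (as above, with FROM PLATFORM naming the source platform) avoids extra load on the source host.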
Access to the Script
You can access this procedure by clicking on the following link:
(ii) Plug/Unplug databases Containing Binary XML Data
This method can be used when you wish to move the entire database between the following versions:
- 11.2.x non-CDB to 12.1.x/12.2.x CDB
- 12.x PDB to 12.x CDB
Please note, in order to plug/unplug this set of conditions must be met for both source and destination databases:
- Same Endianness
- Compatible database character set
- Same set of database Options installed
No additional licensing is required to convert a single instance database to a consolidated database with one PDB.
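As a sketch of the plug-in step on the destination CDB (the PDB name and manifest path are hypothetical; the source must first be unplugged or, for an 11.2 non-CDB, described with DBMS_PDB.DESCRIBE):

```sql
-- Plug the unplugged database into the destination CDB using its
-- XML manifest file; NOCOPY assumes the datafiles are already in place
CREATE PLUGGABLE DATABASE xmlpdb
  USING '/u01/stage/xmlpdb.xml'
  NOCOPY TEMPFILE REUSE;

ALTER PLUGGABLE DATABASE xmlpdb OPEN;
```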
(iii) Export Import Utilities
The Export and Import utilities can be used to move Binary XML storage data between environments. This option works well on smaller datasets; however, since it involves several internal insert commands, it is very resource intensive on larger data migrations. For customers with a limited planned migration window, this is not a feasible option.
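A minimal Data Pump sketch of this option (the directory object, schema name and dump file names are hypothetical):

```shell
# Export the schema containing the binary XMLType data on the source
expdp system DIRECTORY=dp_dir DUMPFILE=xmldata.dmp SCHEMAS=myschema LOGFILE=exp_xmldata.log

# Copy the dump file to the destination, then import it there
impdp system DIRECTORY=dp_dir DUMPFILE=xmldata.dmp SCHEMAS=myschema LOGFILE=imp_xmldata.log
```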
(iv) Procedure to move small Binary XML Datasets
The note listed below provides a means to move small datasets of Binary XML Data:
How to move XMLType tables/columns with Binary XML Storage between schemas/databases (Doc ID 1405457.1)
(v) Procedure to consolidate multiple databases into a single database
Let's say databases D1, D2, D3 and D4 of sizes 10X, 5X, 2X and 2X are consolidated into a single database (standalone or PDB) that is a fresh install. There are two options to proceed with the data migration:
1) Using TTS into a single instance, or a CDB with one PDB, that is a fresh install: pick the database with the largest size (D1 above) and use TTS to do the migration. If there are merge conflicts, one can use TTSScript.zip. Use conventional datapump for the rest of the databases (D2, D3 and D4).
2) Consolidate Database: you can unplug/plug each of the databases D1, D2, D3 and D4 as separate PDBs.
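For option 1, the TTS flow for the largest database can be sketched as follows (the tablespace name, directory object and file paths are hypothetical):

```shell
# 1. On the source, make the tablespace read only (in SQL*Plus):
#      ALTER TABLESPACE xml_ts READ ONLY;

# 2. Export the tablespace metadata with Data Pump:
expdp system DIRECTORY=dp_dir DUMPFILE=xml_ts_meta.dmp \
      TRANSPORT_TABLESPACES=xml_ts LOGFILE=exp_xml_ts.log

# 3. Copy the datafile(s) and dump file to the destination, then import:
impdp system DIRECTORY=dp_dir DUMPFILE=xml_ts_meta.dmp \
      TRANSPORT_DATAFILES='/u01/dest/xml_ts01.dbf' LOGFILE=imp_xml_ts.log
```

After the import, the tablespace can be set back to read write on both sides as needed.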