Tag Archives: datapump

import into UTF8 database

A common error when you import single-byte characters (e.g. iso8859p1 or mswin1252) into a multi-byte database (e.g. utf8) is ORA-12899: value too large for column.

The root cause is that the default length semantics (NLS_LENGTH_SEMANTICS) in the database is BYTE:


SQL> select VALUE, ISDEFAULT 
  from v$parameter 
  where NAME='nls_length_semantics';
VALUE   ISDEFAULT
------- ---------
BYTE    TRUE

It means one character equals one byte. But after the character set conversion, a character can take more than one byte and no longer fits in the column.
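
You can see the expansion for yourself by converting a sample string to the target character set and measuring it in bytes (a quick check with CONVERT and LENGTHB; on a mswin1252 database this should report 1 byte for the native string and 2 bytes for its utf8 representation):

SQL> select lengthb('é') bytes_mswin1252,
  lengthb(convert('é','UTF8')) bytes_utf8
  from dual;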

single-byte


SQL> select VALUE 
  from nls_database_parameters 
  where parameter='NLS_CHARACTERSET';
VALUE
-------------
WE8MSWIN1252
SQL> create table t(x char(1));
Table created.
SQL> insert into t values ('é');
1 row created.
SQL> commit;
Commit complete.
$ expdp scott/tiger dumpfile=t.dmp tables=t
. . exported "SCOTT"."T"            1 rows

multi-byte


SQL> select VALUE 
  from nls_database_parameters 
  where parameter='NLS_CHARACTERSET';
VALUE
-----------
UTF8
$ impdp scott/tiger dumpfile=t.dmp
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type 
  TABLE_EXPORT/TABLE/TABLE_DATA
ORA-02374: conversion error loading table 
  "SCOTT"."T"
ORA-12899: value too large for column X 
  (actual: 2, maximum: 1)
ORA-02372: data for row: X : 0X'E9'
. . imported "SCOTT"."T"            0 out of 1 rows

How do I import my data?

1) import the metadata


$ impdp scott/tiger dumpfile=t.dmp content=metadata_only
Processing object type TABLE_EXPORT/TABLE/TABLE

2) change the char_used of the column(s) from (B)yte to (C)har

SQL> select 
  column_name, char_used, data_length, data_type 
  from user_tab_columns 
  where table_name='T' and char_used='B';
COLUMN_NAME  C DATA_LENGTH DATA_TYPE
------------ - ----------- ---------
X            B           1 CHAR
SQL> alter table t modify x char(1 char);
Table altered.
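
If many columns are affected, you do not have to type the ALTER statements by hand; a small PL/SQL loop over the dictionary can generate them. This is only a sketch, assuming all affected tables belong to the connected schema and only plain CHAR and VARCHAR2 columns need converting:

SQL> begin
  -- switch every remaining BYTE column of the imported tables to CHAR semantics
  for c in (
    select table_name, column_name, data_type, char_length
    from user_tab_columns
    where char_used = 'B'
    and data_type in ('CHAR','VARCHAR2')
    and table_name in (select table_name from user_tables)
  ) loop
    execute immediate 'alter table "' || c.table_name ||
      '" modify "' || c.column_name || '" ' || c.data_type ||
      '(' || c.char_length || ' char)';
  end loop;
end;
/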

3) import the data

$ impdp scott/tiger dumpfile=t.dmp content=data_only
Processing object type 
  TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."T"            1 rows

4) check

SQL> select x, length(x), lengthb(x) from t;
X  LENGTH(X) LENGTHB(X)
- ---------- ----------
é          1          2

My column now has a length of one character and two bytes.
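
You can also confirm in the dictionary that the column now uses character semantics: CHAR_USED should show C and DATA_LENGTH should have grown to the maximum byte size of one character in the new character set.

SQL> select column_name, char_used, char_length, data_length
  from user_tab_columns
  where table_name='T';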

Transport tablespace over db links

You do not want to export the metadata to a dump file on the source database, but rather retrieve it over a database link.

As a prerequisite, you have made a self-contained set of tablespaces read-only and you have copied the datafiles.


SQL>  create tablespace test datafile '/u02/oradata/db01/test01.dbf' size 10m;

Tablespace created.

SQL> create table scott.x(x number) tablespace test;

Table created.

SQL> insert into scott.x values (1);

1 row created.

SQL> commit;

Commit complete.

SQL> alter tablespace test read only;

Tablespace altered.
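
Before copying the datafile you may want to double-check that the tablespace set is really self-contained, using DBMS_TTS (run it from a suitably privileged account, e.g. SYSTEM); no rows in TRANSPORT_SET_VIOLATIONS means you are fine:

SQL> exec dbms_tts.transport_set_check('TEST', true)
SQL> select violations from transport_set_violations;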


$ scp srv01:/u02/oradata/db01/test01.dbf /u02/oradata/db02

create a database link on the target database DB02


SQL> create database link l using 'DB01';

Database link created.
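
A quick way to check that the link resolves and connects before starting the import:

SQL> select * from global_name@l;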

Then import the tablespace via the database link:


impdp scott/tiger network_link=l transport_tablespaces=TEST transport_datafiles=/u02/oradata/db02/test01.dbf logfile=DATA_PUMP_LOG:impdp.log

This avoids the “expdp” step, but it does not avoid copying the datafiles.
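
Note that after a transportable import the tablespace typically stays read-only on the target; once you are done, switch it back (and do the same on the source if appropriate):

SQL> alter tablespace test read write;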

Duplicate table over database link

The old-style approach would be CREATE TABLE T AS SELECT * FROM T@L, where T is the table you want to duplicate and L is the database link pointing to the remote database.

If you want to keep more of the structure of T, such as the constraints and the indexes, you may use Data Pump. Here it is…


SQL> var n number
SQL> exec :n := dbms_datapump.open('IMPORT','TABLE','L')

PL/SQL procedure successfully completed.

SQL> print n
         N
----------
        28

SQL> exec dbms_datapump.metadata_filter(:n,'SCHEMA_LIST','''SCOTT''')

PL/SQL procedure successfully completed.

SQL> exec dbms_datapump.metadata_filter(:n,'NAME_LIST','''T''')

PL/SQL procedure successfully completed.

SQL> exec dbms_datapump.start_job(:n)

PL/SQL procedure successfully completed.
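
start_job returns as soon as the job is launched and the import runs in the background. If you prefer to block until it finishes, you can wait on the handle; :state should end up as COMPLETED:

SQL> var state varchar2(30)
SQL> exec dbms_datapump.wait_for_job(:n, :state)
SQL> print state
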
SQL> desc t
 Name              Null?    Type
 ----------------- -------- ------------
 X                 NOT NULL NUMBER

This is utterly simple. If you mess up with the link name, the global name, the syntax, and so on, you may end up with orphan jobs in DBA_DATAPUMP_JOBS. There is a Metalink note on dropping the underlying master tables (Note 336914.1), but first log off, get a coffee, and they may vanish by themselves after a few minutes.
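
To see whether such orphan jobs are still hanging around, a quick look at the Data Pump job view is enough (DBA privileges assumed):

SQL> select owner_name, job_name, operation, state
  from dba_datapump_jobs;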

Datapump: table like ‘FOO%’ or like ‘BAR%’

Today I tried to put two LIKE conditions in an INCLUDE clause of Data Pump.

I have the following tables:

SQL> select table_name from user_tables order by 1;

TABLE_NAME
------------------------------
AAA
BAR1   ***
BAR2   ***
BLA
FOO    ***
FOO1   ***
GOZ

and I want the tables like BAR% and the tables like FOO% (marked with *** above).

First try:

$ expdp scott/tiger include=table:"like'FOO%'or like'BAR%'"

Export: Release 11.2.0.2.0 - Production on Thu Jul 14 11:47:13 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression

This does not work, because it would translate into WHERE {tablename} like ‘FOO%’ or like ‘BAR%’, which is not a valid expression.

OK, second try, let's put multiple TABLE clauses:

$ expdp scott/tiger include=table:"like'FOO%'",table:"like'BAR%'"

Export: Release 11.2.0.2.0 - Production on Thu Jul 14 11:47:15 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_SCHEMA_01":  scott/******** include=table:"like'FOO%'",table:"like'BAR%'" Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-39168: Object path TABLE was not found.
ORA-31655: no data or metadata objects selected for job
Job "SCOTT"."SYS_EXPORT_SCHEMA_01" completed with 2 error(s) at 11:47:21

No data was found, because it translates into WHERE {tablename} like ‘FOO%’ AND {tablename} like ‘BAR%’, which returns no rows.

OK, multiple INCLUDE conditions are joined by AND, so let's do the math: (BAR% OR FOO%) = (>= 'BAR' AND < 'FOP' AND NOT BETWEEN 'BAS' AND 'FONZZZ'), where FONZZZ stands for a value immediately smaller than FOO.
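
You can sanity-check that combined predicate in plain SQL before running expdp; against the tables above it should return exactly the four tables marked with ***:

SQL> select table_name from user_tables
  where table_name >= 'BAR'
  and table_name not between 'BAS' and 'FONZZZZZZZZZZZZZ'
  and table_name < 'FOP';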

$ expdp scott/tiger include=table:">='BAR'",table:"not between 'BAS' and 'FONZZZZZZZZZZZZZ'",table:"<'FOP'"

Export: Release 11.2.0.2.0 - Production on Thu Jul 14 11:47:21 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_SCHEMA_01":  scott/******** include=table:">='BAR'",table:"not between 'BAS' and 'FONZZZZZZZZZZZZZ'",table:"<'FOP'"
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "SCOTT"."BAR1"                             5.007 KB       1 rows
. . exported "SCOTT"."BAR2"                             5.007 KB       1 rows
. . exported "SCOTT"."FOO"                              5.007 KB       1 rows
. . exported "SCOTT"."FOO1"                             5.007 KB       1 rows
Master table "SCOTT"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_SCHEMA_01 is:
  /u01/app/oracle/admin/DB01/dmp/expdat.dmp
Job "SCOTT"."SYS_EXPORT_SCHEMA_01" successfully completed at 11:47:31

It is that simple ;)