
better than CTAS


SQL> create table t1(x number primary key);
Table created.
SQL> desc t1
Name                    Null?    Type
----------------------- -------- ----------------
X                       NOT NULL NUMBER


SQL> create table t2 as select * from t1;
Table created.
SQL> desc t2
Name                    Null?    Type
----------------------- -------- ----------------
X                       NUMBER

The table T2 has the column X, but not the constraint (primary key / not null).

If you want to do a CREATE TABLE AS SELECT but keep the indexes, constraints and so on, you'd rather use Data Pump:

SQL> set autop on
SQL> var job_state varchar2(30)
SQL> declare
n number;
begin
-- open a table-mode import over the database link DB01
n := DBMS_DATAPUMP.open('IMPORT','TABLE','DB01');
-- restrict the job to the table T1
DBMS_DATAPUMP.metadata_filter(n,'NAME_LIST','''T1''');
-- import T1 under the new name T3
DBMS_DATAPUMP.metadata_remap(n,'REMAP_TABLE','T1','T3');
DBMS_DATAPUMP.start_job(n);
DBMS_DATAPUMP.WAIT_FOR_JOB(n, :job_state);
end;
/
PL/SQL procedure successfully completed.
JOB_STATE
--------------------------------------------------
COMPLETED
SQL> desc t3
Name                    Null?    Type
----------------------- -------- ----------------
X                       NOT NULL NUMBER

The table T3 is a copy of T1. DB01 is my implicit loopback database link (the database name).
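
If the block fails halfway (wrong link name, bad filter syntax), the job can be left behind. A minimal sketch of the same block with cleanup on failure, reusing the job_state bind variable declared above:

declare
  n number;
begin
  n := DBMS_DATAPUMP.open('IMPORT','TABLE','DB01');
  begin
    DBMS_DATAPUMP.metadata_filter(n,'NAME_LIST','''T1''');
    DBMS_DATAPUMP.metadata_remap(n,'REMAP_TABLE','T1','T3');
    DBMS_DATAPUMP.start_job(n);
    DBMS_DATAPUMP.WAIT_FOR_JOB(n, :job_state);
  exception
    when others then
      -- abort and drop the failed job instead of leaving it orphaned
      DBMS_DATAPUMP.stop_job(n, immediate => 1);
      raise;
  end;
end;
/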


Drop table cascade and reimport

Happy new year 🙂

Today I had to import a subset of a database; the challenge was to restore a parent table without restoring its children. It took me a few minutes to write the code, but it would have taken days to restore the whole database.

CREATE TABLE t1(
  c1 NUMBER CONSTRAINT t1_pk PRIMARY KEY);
INSERT INTO t1 (c1) VALUES (1);
CREATE TABLE t2(
  c1 NUMBER CONSTRAINT t2_t1_fk REFERENCES t1,
  c2 NUMBER CONSTRAINT t2_pk PRIMARY KEY);
INSERT INTO t2 (c1, c2) VALUES (1, 2);
CREATE TABLE t3(
  c2 NUMBER CONSTRAINT t3_t2_fk REFERENCES t2,
  c3 NUMBER CONSTRAINT t3_pk PRIMARY KEY);
INSERT INTO t3 (c2, c3) VALUES (2, 3);
CREATE TABLE t4(
  c3 NUMBER CONSTRAINT t4_t3_fk REFERENCES t3,
  c4 NUMBER CONSTRAINT t4_pk PRIMARY KEY);
INSERT INTO t4 (c3, c4) VALUES (3, 4);
COMMIT;

expdp scott/tiger directory=DATA_PUMP_DIR dumpfile=scott.dmp reuse_dumpfiles=y

Now, what happens if I want to restore T2 and T3?

While the tables still exist, I check the dictionary for foreign keys from other tables pointing to T2 and T3:

SELECT table_name, constraint_name
FROM user_constraints
WHERE r_constraint_name IN (
    SELECT constraint_name
    FROM user_constraints
    WHERE table_name IN ('T2', 'T3'))
  AND table_name NOT IN ('T2', 'T3');

TABLE_NAME                     CONSTRAINT_NAME
------------------------------ ------------------------------
T4                             T4_T3_FK

T4 points to T3 and T4 has data.
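
If the dependency chains went deeper, the dictionary could walk them for me. A hedged sketch (the fk factored subquery is mine) listing every table that references T2 or T3, directly or indirectly:

-- pair each foreign key with the parent table it points to,
-- then walk the chains starting from T2 and T3
WITH fk AS (
  SELECT c.table_name AS child_table,
         p.table_name AS parent_table,
         c.constraint_name
  FROM user_constraints c
  JOIN user_constraints p ON p.constraint_name = c.r_constraint_name
  WHERE c.constraint_type = 'R')
SELECT LEVEL, child_table, constraint_name
FROM fk
START WITH parent_table IN ('T2', 'T3')
CONNECT BY NOCYCLE parent_table = PRIOR child_table;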

Now I can drop my tables with the CASCADE CONSTRAINTS option:

drop table t2 cascade constraints;
drop table t3 cascade constraints;

Now I import: first the tables, then the referential constraint that was dropped by the CASCADE clause but does not belong to T2/T3.

impdp scott/tiger tables=T2,T3 directory=DATA_PUMP_DIR dumpfile=scott.dmp

impdp scott/tiger  "include=ref_constraint:\='T4_T3_FK'" directory=DATA_PUMP_DIR dumpfile=scott.dmp

It’s probably possible to do it in one import, but the INCLUDE syntax is horrible.


import into UTF8 database

A common error when you import single-byte character data (e.g. ISO8859P1 or MSWIN1252) into a multi-byte database (e.g. UTF8) is ORA-12899: value too large for column.

The root cause is that the default length semantics in a database is BYTE:


SQL> select VALUE, ISDEFAULT
from v$parameter
where NAME='nls_length_semantics';

VALUE   ISDEFAULT
------- ---------
BYTE    TRUE

It means one character equals one byte. But after conversion to a multi-byte character set, a character may take more than one byte and no longer fits.
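
A quick illustration, assuming a UTF8 database:

-- 'é' is one character, but two bytes once converted to UTF8
select length('é') chars, lengthb('é') bytes from dual;
-- chars = 1, bytes = 2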

single-byte

SQL> select VALUE
from nls_database_parameters
where parameter='NLS_CHARACTERSET';
VALUE
-------------
WE8MSWIN1252
SQL> create table t(x char(1));
Table created.
SQL> insert into t values ('é');
1 row created.
SQL> commit;
Commit complete.
$ expdp scott/tiger dumpfile=t.dmp tables=t
. . exported "SCOTT"."T" 1 rows

multi-byte

SQL> select VALUE
from nls_database_parameters
where parameter='NLS_CHARACTERSET';
VALUE
-----------
UTF8
$ impdp scott/tiger dumpfile=t.dmp
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
ORA-02374: conversion error loading table "SCOTT"."T"
ORA-12899: value too large for column X (actual: 2, maximum: 1)
ORA-02372: data for row: X : 0X'E9'
. . imported "SCOTT"."T" 0 out of 1 rows

How do I import my data?

1) import the metadata

$ impdp scott/tiger dumpfile=t.dmp content=metadata_only
Processing object type TABLE_EXPORT/TABLE/TABLE

2) change the char_used of the column(s) from (B)yte to (C)har

SQL> select
column_name, char_used, data_length, data_type
from user_tab_columns
where table_name='T' and char_used='B';
COLUMN_NAME  C DATA_LENGTH DATA_TYPE
------------ - ----------- ---------
X            B           1 CHAR
SQL> alter table t modify x char(1 char);
Table altered.
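
If many columns are affected, the dictionary can generate the statements for you. A minimal sketch, assuming every BYTE-semantics CHAR/VARCHAR2 column of the schema should switch (review the generated DDL before running it):

-- build one ALTER TABLE ... MODIFY per BYTE-semantics character column
select 'alter table "' || table_name || '" modify "' || column_name
    || '" ' || data_type || '(' || char_length || ' char);' ddl
from user_tab_columns
where char_used = 'B'
and data_type in ('CHAR', 'VARCHAR2');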

3) import the data

$ impdp scott/tiger dumpfile=t.dmp content=data_only
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."T" 1 rows

4) check

SQL> select x, length(x), lengthb(x) from t;
X  LENGTH(X) LENGTHB(X)
- ---------- ----------
é          1          2

My column now has a length of one character and two bytes.


Transport tablespace over db links

Here you do not export the metadata from the source database to a dump file; you use a database link to fetch it instead.

As a prerequisite, you have made a self-contained set of tablespaces read-only and you have copied the datafiles.
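
A short sketch of the self-containment check, run on the source database before switching the tablespace to read-only:

-- TRUE also checks constraint references across the set
exec DBMS_TTS.TRANSPORT_SET_CHECK('TEST', TRUE);

-- any row returned is a violation to fix before transporting
select violations from transport_set_violations;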


SQL> create tablespace test datafile '/u02/oradata/db01/test01.dbf' size 10m;

Tablespace created.

SQL> create table scott.x(x number) tablespace test;

Table created.

SQL> insert into scott.x values (1);

1 row created.

SQL> commit;

Commit complete.

SQL> alter tablespace test read only;

Tablespace altered.


$ scp srv01:/u02/oradata/db01/test01.dbf /u02/oradata/db02

Create a database link on the target database DB02:

SQL> create database link l using 'DB01';

Database link created.

Then import the tablespace via the database link:

impdp scott/tiger network_link=l transport_tablespaces=TEST transport_datafiles=/u02/oradata/db02/test01.dbf logfile=DATA_PUMP_LOG:impdp.log

This avoids the expdp step, but it does not avoid copying the datafiles.


Duplicate table over database link

The old-style approach would be CREATE TABLE T AS SELECT * FROM T@L, where T is the table you want to duplicate and L the database link pointing to the remote database.

If you want to keep more of the structure of T (the constraints, the indexes), you may use Data Pump. Here it is…


SQL> var n number
SQL> exec :n := dbms_datapump.open('IMPORT','TABLE','L')

PL/SQL procedure successfully completed.

SQL> print n
N
----------
28

SQL> exec dbms_datapump.metadata_filter(:n,'SCHEMA_LIST','''SCOTT''')

PL/SQL procedure successfully completed.

SQL> exec dbms_datapump.metadata_filter(:n,'NAME_LIST','''T''')

PL/SQL procedure successfully completed.

SQL> exec dbms_datapump.start_job(:n)

PL/SQL procedure successfully completed.
SQL> desc t
Name              Null?    Type
----------------- -------- ------------
X                 NOT NULL NUMBER

This is utterly simple. If you mess up the link name, the global name, the syntax, and so on, you may end up with orphan jobs in DBA_DATAPUMP_JOBS. There is a Metalink note on dropping the underlying master tables (Note 336914.1), but first log off, get a coffee, and the jobs may vanish by themselves after a few minutes.
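
A hedged sketch of the check for leftovers, as a privileged user:

-- orphaned Data Pump jobs linger here in state NOT RUNNING
select owner_name, job_name, operation, state
from dba_datapump_jobs;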


Datapump : table like ‘FOO%’ or like ‘BAR%’

Today I tried to put two LIKE conditions in an INCLUDE clause of Data Pump.

I have the following tables
SQL> select table_name from user_tables order by 1;

TABLE_NAME
------------------------------
AAA
BAR1 ***
BAR2 ***
BLA
FOO ***
FOO1 ***
GOZ

and I want the tables like 'BAR%' and the tables like 'FOO%' (marked with *** above).

First try:
$ expdp scott/tiger include=table:"like'FOO%'or like'BAR%'"

Export: Release 11.2.0.2.0 - Production on Thu Jul 14 11:47:13 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39071: Value for INCLUDE is badly formed.
ORA-00936: missing expression

This does not work, because it would translate into WHERE {tablename} like 'FOO%' or like 'BAR%', where the expression after OR is missing.

OK, second try: let’s put multiple TABLE clauses.

$ expdp scott/tiger include=table:"like'FOO%'",table:"like'BAR%'"

Export: Release 11.2.0.2.0 - Production on Thu Jul 14 11:47:15 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_SCHEMA_01": scott/******** include=table:"like'FOO%'",table:"like'BAR%'" Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
ORA-39168: Object path TABLE was not found.
ORA-31655: no data or metadata objects selected for job
Job "SCOTT"."SYS_EXPORT_SCHEMA_01" completed with 2 error(s) at 11:47:21

No data was found, because it translated into WHERE {tablename} like 'FOO%' and {tablename} like 'BAR%', which returns no rows.

OK, multiple INCLUDE conditions are joined by AND, so let's do the math: (LIKE 'BAR%' OR LIKE 'FOO%') becomes (>= 'BAR' AND NOT BETWEEN 'BAS' AND 'FONZZZZZZZZZZZZZ' AND < 'FOP'), where 'FONZZZZZZZZZZZZZ' is immediately smaller than 'FOO'.

$ expdp scott/tiger include=table:">='BAR'",table:"not between 'BAS' and 'FONZZZZZZZZZZZZZ'",table:"<'FOP'"

Export: Release 11.2.0.2.0 - Production on Thu Jul 14 11:47:21 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_SCHEMA_01": scott/******** include=table:">='BAR'",table:"not between 'BAS' and 'FONZZZZZZZZZZZZZ'",table:"<'FOP'"
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "SCOTT"."BAR1"    5.007 KB    1 rows
. . exported "SCOTT"."BAR2"    5.007 KB    1 rows
. . exported "SCOTT"."FOO"     5.007 KB    1 rows
. . exported "SCOTT"."FOO1"    5.007 KB    1 rows
Master table "SCOTT"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_SCHEMA_01 is:
  /u01/app/oracle/admin/DB01/dmp/expdat.dmp
Job "SCOTT"."SYS_EXPORT_SCHEMA_01" successfully completed at 11:47:31

It is that simple 😉
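
As a footnote, a parameter file spares you most of the shell quoting. A minimal sketch, where like.par is my own file name:

$ cat like.par
include=table:">='BAR'"
include=table:"not between 'BAS' and 'FONZZZZZZZZZZZZZ'"
include=table:"<'FOP'"

$ expdp scott/tiger parfile=like.par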