
Thursday, February 5, 2015

Creating an (import-) SQL file with DBMS_DATAPUMP

DBMS_DATAPUMP can create SQL files from a schema so that these files can later be run to re-create the schema.

This is described in Oracle Note 1519981.1: How to Generate A SQL File Using The DBMS_DATAPUMP_API?. Unfortunately, the note does not explicitly state that creating such an SQL file consists of two steps: first, the schema has to be exported into an ordinary dump file; then, the dump file has to be turned into the desired SQL file.

Here are the steps to create such an SQL file.

First step: creating the dump file

declare
  datapump_job number;
  job_state    varchar2(20);
begin
  datapump_job := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => null,
    job_name    => 'Export dump file',
    version     => 'LATEST'
  );

  dbms_output.put_line('datapump_job: ' || datapump_job);

  dbms_datapump.add_file(
    handle    => datapump_job,
    filename  => 'export.dmp',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);

  dbms_datapump.start_job(
    handle       => datapump_job,
    skip_current => 0,
    abort_step   => 0);

  dbms_datapump.wait_for_job(datapump_job, job_state);

  dbms_output.put_line('Job state: ' || job_state);

  dbms_datapump.detach(datapump_job);
end;
/

Second step: turning the dump file into an SQL file

declare
  datapump_job number;
  job_state    varchar2(20);
begin
  datapump_job := dbms_datapump.open(
    operation   => 'SQL_FILE',
    job_mode    => 'SCHEMA',
    remote_link => null,
    job_name    => 'Export SQL file',
    version     => 'LATEST'
  );

  dbms_output.put_line('datapump_job: ' || datapump_job);

  dbms_datapump.add_file(
    handle    => datapump_job,
    filename  => 'export.dmp',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);

  dbms_datapump.add_file(
    handle    => datapump_job,
    filename  => 'schema.sql',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_sql_file);

  dbms_datapump.start_job(
    handle       => datapump_job,
    skip_current => 0,
    abort_step   => 0);

  dbms_datapump.wait_for_job(datapump_job, job_state);

  dbms_output.put_line('Job state: ' || job_state);

  dbms_datapump.detach(datapump_job);
end;
/
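Both steps read and write their files through the DATAPUMP_DIR directory object. To find out into which operating-system path the generated schema.sql (and export.dmp) are written, the directory object can be looked up in dba_directories, given sufficient privileges:

```sql
-- Look up the OS path behind the DATAPUMP_DIR directory object,
-- i.e. where export.dmp and schema.sql end up:
select directory_path
  from dba_directories
 where directory_name = 'DATAPUMP_DIR';
```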
Source code on github

My question on dba.stackexchange.com


Wednesday, February 4, 2015

Cloning an Oracle schema with dbms_datapump

I have a schema (FROM_SCHEMA_A) that I need to clone on the same database. So, the cloned schema will go by another name, in my case TO_SCHEMA_A.

In order to make things a bit more complicated, FROM_SCHEMA_A references objects in another schema (named SCHEMA_B). This other schema won't be cloned.

Schema definitions

Before starting with the object definitions in the schemas, I create the needed schemas:
create user from_schema_A identified by p
  quota unlimited on users;

grant create procedure, create session, create table,
      create trigger, create view
   to from_schema_A;

create user schema_B identified by p
  quota unlimited on users;

grant create session, create table to schema_B;

Here's the definition for the SCHEMA_B schema. It consists of one table only:

create table table_b_1 ( a number, b varchar2(10) );

Since FROM_SCHEMA_A references this table, it needs some grants:

grant insert, select on table_b_1 to from_schema_A;

Here's the definition for the FROM_SCHEMA_A schema:

create table table_a_1 (
  c number,
  d varchar2(20)
);

insert into table_a_1 values (10, 'ten'   );
insert into table_a_1 values (11, 'eleven');

create view view_a_1 as
select * from schema_b.table_b_1;

create package package_a_1 as
  function count_a return number;
  function count_b return number;
end package_a_1;
/

create package body package_a_1 as

  function count_a return number is
    ret number;
  begin
    select count(*) into ret from table_a_1;
    return ret;
  end count_a;

  function count_b return number is
    ret number;
  begin
    select count(*) into ret from view_a_1;
    return ret;
  end count_b;

end package_a_1;
/

create trigger trigger_a
before insert on table_a_1
for each row
begin
  insert into schema_b.table_b_1 values (:new.c, :new.d);
end trigger_a;
/

-- There's a trigger on the table, so the
-- following inserts should fill table_b_1
-- (in schema_b):
insert into table_a_1 values (1, 'one');
insert into table_a_1 values (2, 'two');
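Since trigger_a copies every inserted row into schema_b.table_b_1, the setup can be sanity-checked before exporting. The two queries below (run as schema_B and from_schema_A, respectively) should show the rows (1, 'one') and (2, 'two') arriving in table_b_1, both directly and through the view:

```sql
-- As schema_B: the trigger on table_a_1 should have copied
-- the rows (1, 'one') and (2, 'two') into table_b_1:
select * from table_b_1;

-- As from_schema_A: count_b counts the rows visible through
-- view_a_1, i.e. the rows in schema_b.table_b_1:
select package_a_1.count_b from dual;
```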

Export/Import administrator

In order to perform the export and the import, I create a special export/import administrator:
create user exp_imp_admin identified by p;

grant exp_full_database, imp_full_database to exp_imp_admin;

alter user exp_imp_admin quota unlimited on users;
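The scripts in this post assume that a directory object named DATAPUMP_DIR already exists and that exp_imp_admin may use it. If that is not the case, it can be created as a DBA; the operating-system path below is just an assumption and must exist on the database server:

```sql
-- As a DBA: create the directory object. The OS path is an
-- assumption; adapt it to the database server's file system:
create directory datapump_dir as '/u01/app/oracle/datapump';

-- Allow exp_imp_admin to read and write files in it:
grant read, write on directory datapump_dir to exp_imp_admin;
```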

Performing the export and import

With that user, I am able to export the schema:
connect exp_imp_admin/p

declare
  datapump_job number;
  job_state    varchar2(20);
begin
  datapump_job := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => null,
    job_name    => 'Clone schema A, export',
    version     => 'LATEST',
    compression => dbms_datapump.ku$_compress_metadata
  );

  dbms_output.put_line('datapump_job: ' || datapump_job);

  dbms_datapump.metadata_filter(
    handle => datapump_job,
    name   => 'SCHEMA_LIST',
    value  => '''FROM_SCHEMA_A''');

  dbms_datapump.add_file(
    handle    => datapump_job,
    filename  => 'clone_schema_a_export.log',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_log_file);

  dbms_datapump.add_file(
    handle    => datapump_job,
    -- Note: the file will be created with an UPPERCASE name!
    filename  => 'clone_schema_a.dmp',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);

  dbms_datapump.start_job(datapump_job);
  dbms_datapump.wait_for_job(datapump_job, job_state);

  dbms_output.put_line('Job state: ' || job_state);

  dbms_datapump.detach(datapump_job);

exception when others then
  dbms_output.put_line(sqlcode);
  dbms_output.put_line(sqlerrm);
  if datapump_job is not null then
    dbms_datapump.detach(datapump_job);
  end if;
end;
/

And the following import actually clones the schema. Of particular interest is the call to dbms_datapump.metadata_remap:

declare
  datapump_job number;
  job_state    varchar2(20);
begin
  datapump_job := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'SCHEMA',
    remote_link => null,
    job_name    => 'Clone schema A, import',
    version     => 'LATEST',
    compression => dbms_datapump.ku$_compress_metadata
  );

  dbms_output.put_line('datapump_job: ' || datapump_job);

  dbms_datapump.metadata_remap(
    datapump_job,
    'REMAP_SCHEMA',
    'FROM_SCHEMA_A',
    'TO_SCHEMA_A');

  dbms_datapump.add_file(
    handle    => datapump_job,
    filename  => 'clone_schema_a_import.log',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_log_file);

  dbms_datapump.add_file(
    handle    => datapump_job,
    -- Note: the export has created the file with UPPERCASE letters
    filename  => 'clone_schema_a.dmp',
    directory => 'DATAPUMP_DIR',
    filetype  => dbms_datapump.ku$_file_type_dump_file);

  dbms_datapump.start_job(datapump_job);
  dbms_datapump.wait_for_job(datapump_job, job_state);

  dbms_output.put_line('Job state: ' || job_state);

  dbms_datapump.detach(datapump_job);

exception when others then
  dbms_output.put_line(sqlcode);
  dbms_output.put_line(sqlerrm);
  if datapump_job is not null then
    dbms_datapump.detach(datapump_job);
  end if;
end;
/
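After the import has finished, a quick check (run as exp_imp_admin, for example) should show that the clone contains the same data and objects. The expected row count follows from the inserts above, but I'd verify rather than assume, since the object grants that FROM_SCHEMA_A received on schema_b.table_b_1 are not part of its schema export, so dependent objects in TO_SCHEMA_A may end up invalid:

```sql
-- The cloned table should contain the same four rows
-- (10, 11, 1, 2) as from_schema_a.table_a_1:
select count(*) from to_schema_a.table_a_1;

-- The cloned objects and their status; an INVALID status may
-- point at missing grants on schema_b.table_b_1:
select object_name, object_type, status
  from dba_objects
 where owner = 'TO_SCHEMA_A'
 order by object_type, object_name;
```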
Creating an (import-) SQL file with DBMS_DATAPUMP
Source code on github