Importing into the Autonomous Data Warehouse Using Oracle Data Pump

January 1, 2019 | 7 minute read

Introduction

This post details the steps required to import into the Autonomous Data Warehouse (ADW) using the Oracle Data Pump Import utility on Oracle Cloud Infrastructure (OCI).

Versions

OCI Compute       Oracle-Linux-7.6-2018.11.19-0
Instant Client    18.3.0.0.0 for Linux x86-64 (64-bit)
ADW DB            18.0.3.3
SQL Developer     18.3.0.277

Assumptions

An Oracle Cloud Account and an OCI Tenancy
Access to an ADW instance
Administrator Credentials to both of the above
An export file (.dmp) created by the Oracle Data Pump Export utility
The following components exist in the OCI tenancy; see Setting Up the Prerequisites:
-- Customer Compartment under the root compartment of the tenancy
-- Virtual Cloud Network
-- Object Storage Bucket
-- OCI Object Storage Administrator User Name and Authorization Token
-- An OCI Compute instance created with an Oracle Linux x64 image

Validations

Validated December 21st, 2018 with the initial versions above.

Updates

Published on December 21, 2018

Topics

Preparing to Import into ADW
Running the ADW Data Pump Import Utility
Examining the ADW Import Log and the ADW SQL File


Preparing to Import into ADW

Topics

Preparing to Connect to the Autonomous Data Warehouse
Preparing Compute to Connect to the Autonomous Data Warehouse
Uploading a Data Pump Export File to Object Storage
Creating a Storage Credential in ADW
Running the ADW Data Pump Import Utility
Examining the ADW Import Log and the ADW SQL File


Preparing to Connect to the Autonomous Data Warehouse

The steps described in the post Preparing to Connect to the Autonomous Data Warehouse provide the following files. This post uses Windows 7 to receive them.

A client credentials zip file containing the wallet and network configuration files used to make the connections.

Instant Client zip files providing the SQL*Net libraries and the Data Pump Import utility.


Preparing Compute to Connect to the Autonomous Data Warehouse

This post uses the PuTTY and PSCP SSH applications on Windows 7 to connect to the Compute instance. The PuTTY programs may be downloaded from https://www.putty.org/. For details on using these tools, refer to Connecting to a Compute Node Using the PuTTY Program on Windows. If you are not using Windows, substitute scp for PSCP and ssh for PuTTY.

Topics

Obtaining the Compute Private Key
Obtaining the Compute Public IP Address
Preparing the ADW Credential Files in Compute
Preparing the Instant Client files in Compute


Obtaining the Compute Private Key

Obtain the SSH private key file for the Compute instance from the Compute Instance Administrator and convert it to the PuTTY .ppk format. SSH requires a public/private key pair associated with the instance.

To convert the private key:

* Run the PuTTYgen program

* Load the private key file

* Save the private key using the defaults SSH-2 RSA and 2048 for Number of bits. Save as type PuTTY Private Key Files (*.ppk).
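If you are converting the key on Linux or macOS instead, the putty-tools package provides a command-line puttygen; a minimal sketch, with a hypothetical key file name:

# Requires the putty-tools package; the key file name is a placeholder
puttygen compute_key.pem -o compute_key.ppk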


Obtaining the Compute Public IP Address

Obtain the Compute Public IP address from the OCI Compute > Instances > Instance Details page.
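If the OCI CLI is configured, the address can also be fetched from the command line; a sketch assuming a configured CLI and a placeholder instance OCID:

# Assumes a configured OCI CLI; replace the OCID with your instance's
oci compute instance list-vnics --instance-id ocid1.instance.oc1..example --query 'data[0]."public-ip"' --raw-output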


Preparing the ADW Client Credential Files in Compute

Upload the ADW Client Credentials zip file, unzip it, and modify the files and the Compute environment.

Topics

Uploading the ADW Client Credentials to Compute
Updating Compute with the ADW Client Credential Files


Uploading the ADW Client Credentials to Compute

Upload the Client Credentials Zip File to the compute instance.

Wherever the convention < Something > appears, replace the entire placeholder, brackets and spaces included, with your own value for Something.

In a CMD window, change to the PuTTY directory and run the PSCP command to upload the file.

cd < the PuTTY folder >\PuTTY
pscp -v -i < Compute SSH Private Key > < Client Credentials Zip File > opc@< Compute Public IP >:/tmp

Note: If there is a passphrase for the private key, you are prompted for it.
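For illustration, with hypothetical key, file, and address values, the upload might look like this:

cd C:\Program Files (x86)\PuTTY
pscp -v -i C:\keys\compute_key.ppk C:\downloads\wallet_ADWDB.zip opc@129.146.0.10:/tmp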


Updating Compute with the ADW Client Credential Files

Start an SSH session on the Compute instance using PuTTY, the Compute Public IP address, and the SSH private key file (.ppk). Connect as the opc user. For additional details, see Accessing an Oracle Linux Instance Using SSH.

Change to the root user, create a directory, change the ownership and group of the directory, and exit back to the opc user.

sudo su -
mkdir -p -v /u01/data/credentials; chown -R opc:opc /u01
exit

Change to the directory, and unzip the credentials.

cd /u01/data/credentials; unzip /tmp/< Client Credentials Zip File >
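After unzipping, the directory should contain the wallet and network files; a quick check (file names may vary slightly by wallet version):

ls /u01/data/credentials
# Typically: cwallet.sso  ewallet.p12  keystore.jks  ojdbc.properties  sqlnet.ora  tnsnames.ora  truststore.jks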

Set the TNS_ADMIN variable for the current session and future sessions.

# For The Current Session
export TNS_ADMIN=/u01/data/credentials
# Make a backup of the .bashrc file
cp ~/.bashrc ~/.bashrc.orig
# For Future Sessions
echo export TNS_ADMIN=$TNS_ADMIN >>~/.bashrc
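The tnsnames.ora delivered in the wallet defines the <database>_high, _medium, and _low net service names used by impdp later. One way to list them, assuming each entry starts at the beginning of a line as in the delivered file:

# List the net service names defined in the wallet's tnsnames.ora
grep -oiE '^[a-z0-9_]+' $TNS_ADMIN/tnsnames.ora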

Back up and edit the sqlnet.ora file to point the wallet location at the credentials directory.

cp sqlnet.ora sqlnet.ora.orig
vi sqlnet.ora; cat sqlnet.ora

Save and exit; the file should look like this:

...

WALLET_LOCATION = (SOURCE = (METHOD = FILE) (METHOD_DATA = (DIRECTORY = /u01/data/credentials)))
SSL_SERVER_DN_MATCH=yes
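Alternatively, the edit can be scripted. This sed sketch assumes the delivered sqlnet.ora contains the default DIRECTORY="?/network/admin" placeholder:

# Non-interactive edit; assumes the default placeholder shipped in the wallet's sqlnet.ora
sed -i 's|DIRECTORY="?/network/admin"|DIRECTORY=/u01/data/credentials|' /u01/data/credentials/sqlnet.ora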


Preparing the Instant Client files in Compute

The Oracle Instant Client provides the libraries for SQL*Net and the Data Pump utilities. Only the basic and tools zip files (instantclient-basic-linux.x64-* and instantclient-tools-linux.x64-*) are required.

Topics

Uploading the Instant Client Files to Compute
Updating the Instant Client Files in Compute

Uploading the Instant Client Files to Compute

Upload the Instant Client zip files.

In a CMD window, change to the PuTTY directory and run the PSCP command to upload the zip file to the /tmp directory.

Ensure only the Instant Client basic and tools zip files are in the source directory.

cd C:\Program Files (x86)\PuTTY
pscp -v -i < Compute SSH Private Key > < Instant Client Zip Directory >\instant*.zip opc@< Compute Public IP >:/tmp

If there is a passphrase for the private key, you are prompted for it.


Updating the Instant Client Files in Compute

Unzip the Instant Client zip files, create symbolic links, and update environmental variables.

Start an SSH session on the Compute instance using PuTTY, the Compute public IP address, and the Compute instance private key file (.ppk). Connect as the opc user. Refer to New Oracle Client Installation for additional details.

Ensure only the Instant Client basic and tools zip files are in the /tmp directory.

List the uploaded zip files.

ls /tmp/instant*.zip

Choose a parent directory for the Instant Client files and change to it, e.g. /u01/data. Unzip the files. Note: unzipping the files with this command creates a sub-directory for them, e.g. instantclient_18_3. Then change to the new sub-directory.

cd /u01/data
# Unzip the Instant Client Zip files
find /tmp -name 'instant*.zip' -exec unzip '{}' ';'
# Change to the new directory
cd instant*

If these links are not in the directory, create them:

ln -s libclntsh.so.18.1 libclntsh.so
ln -s libocci.so.18.1 libocci.so

Set the LD_LIBRARY_PATH variable for the current session and future sessions.

# Set the LD_LIBRARY_PATH for the session
export LD_LIBRARY_PATH=`pwd`:$LD_LIBRARY_PATH
# Save the command to a temporary file
echo export LD_LIBRARY_PATH=$LD_LIBRARY_PATH >/tmp/tmpLD
# Make a backup of the .bashrc file
cp ~/.bashrc ~/.bashrc.orig2
# Append the command to the bashrc file for future sessions
cat /tmp/tmpLD >>~/.bashrc

Set the PATH variable for the current session and future sessions.

# Set the PATH for the session
export PATH=`pwd`:$PATH
# Save the command to a temporary file
echo export PATH=$PATH >/tmp/tmpLD
# Make a backup of the .bashrc file
cp ~/.bashrc ~/.bashrc.orig3
# Append the command to the bashrc file for future sessions
cat /tmp/tmpLD >>~/.bashrc
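As a quick sanity check that the utilities and their libraries now resolve (a sketch; the grep simply looks for unresolved shared libraries):

# Confirm impdp is on the PATH and its shared libraries resolve
which impdp
ldd $(which impdp) | grep -i 'not found' || echo 'All shared libraries resolved'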


Uploading a Data Pump Export File to Object Storage

Connect to the OCI account and upload the Data Pump export file to the Object Storage bucket. Refer to Upload Data Files to the Object Store for the details.
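If the OCI CLI is configured, the upload can also be done from the command line; the bucket and file names below are placeholders:

# Assumes a configured OCI CLI; bucket and file names are placeholders
oci os object put --bucket-name dcBucket --file POC.DMP --name POC.DMP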


Creating a Storage Credential in ADW

This post uses SQL Developer to access ADW and create the storage credential. Download and install SQL Developer from SQL Developer Downloads. See the Versions section above for the latest version validated.

Connect SQL Developer to the ADW instance. Refer to Connect with Oracle SQL Developer (18.2 or later) for details.

Store the Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL. For more information on creating a credential for Object Storage services, see CREATE_CREDENTIAL Procedure.

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '< Credential Name >',
    username        => '< Object Storage User Name >',
    password        => '< User Authorization Token >');
END;
/
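To confirm the credential works before running the import, one option is to list the bucket's contents with DBMS_CLOUD.LIST_OBJECTS; the names and URL below are placeholders:

-- Sketch: verify the credential by listing the bucket
SELECT object_name, bytes
FROM DBMS_CLOUD.LIST_OBJECTS(
  '< Credential Name >',
  'https://swiftobjectstorage.< Region >.oraclecloud.com/v1/< Tenancy >/< Bucket >/');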


Running the ADW Data Pump Import Utility

Run the Data Pump Import (impdp) utility.

Use the credential created above and a TNS net service name from the tnsnames.ora file in the ADW client credentials directory.

Study the document Import Data Using Oracle Data Pump Version 18.3 or Later carefully, especially the section on Restrictions for SQL Commands in ADW. This example runs the utility with the sqlfile parameter, which writes all of the SQL that would be executed to a file instead of performing the import. That file may be reviewed against the restricted SQL to build the exclude parameter for the real run.

impdp < ADW Schema Username >/< ADW password >@< Net Service Name > directory=data_pump_dir credential=< Credential name created above > sqlfile=adwimport.sql dumpfile=https://swiftobjectstorage.< Region >.oraclecloud.com/v1/< Tenancy >/< Bucket >/< .dmp File >

The actual command is shown in the results below.

Import: Release 18.0.0.0.0 - Production on Fri Dec 21 01:23:41 2018
Version 18.3.0.0.0

Copyright (c) 1982, 2018, Oracle and/or its affiliates. All rights reserved.

Connected to: Oracle Database 18c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
Master table "ADMIN"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
import done in AL32UTF8 character set and AL16UTF16 NCHAR character set
export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
Warning: possible data loss in character set conversions
Starting "ADMIN"."SYS_SQL_FILE_FULL_01": admin/********@dcadw_high directory=data_pump_dir credential=dcCredential sqlfile=adwimport.sql dumpfile=https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/gse00015629/dcBucket/POC.DMP
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "ADMIN"."SYS_SQL_FILE_FULL_01" successfully completed at Fri Dec 21 01:24:07 2018 elapsed 0 00:00:25


Examining the ADW Import Log and the ADW SQL File

Move the .log and .sql files to Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT. This post uses SQL Developer for this. For more information, see Summary of DBMS_CLOUD Subprograms.

The resource (file name) at the end of the object_uri parameter becomes the object name in Object Storage.

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => '< the Credential >',
    object_uri      => '< the storage bucket URL >/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/

and

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => '< the Credential >',
    object_uri      => '< the storage bucket URL >/adwimport.sql',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'adwimport.sql');
END;
/

Search for the files in the Object Storage bucket and download them. See Managing Objects for details.
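If the OCI CLI is configured, the files can also be downloaded from the command line; bucket and object names are placeholders:

# Assumes a configured OCI CLI; bucket and object names are placeholders
oci os object get --bucket-name dcBucket --name import.log --file import.log
oci os object get --bucket-name dcBucket --name adwimport.sql --file adwimport.sql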


Summary

This post detailed the steps required to import into the Autonomous Data Warehouse (ADW) using the Oracle Data Pump Import utility on Oracle Cloud Infrastructure (OCI).

For more OAC, BICS and BI best practices, tips, tricks, and guidance that A-Team members gain from real-world experiences working with customers and partners, visit http://www.ateam-oracle.com/dayne-carley

 

Dayne Carley

