Wednesday, May 13, 2026

Data Pump to/from OCI Storage in Autonomous Database: General Considerations and Setup

This is the first of three blog posts about using Data Pump on an Oracle Autonomous Database.
  1. General considerations and setup
  2. Export
  3. Import
Using Data Pump on an Autonomous Database is a little different from a conventional database.  I have run into several limitations, so I'm sharing what I have learned.  Throughout the examples in these articles, I am using an Oracle 26 Autonomous Database on the Oracle OCI Free Tier.

No Operating System Access

The first thing to bear in mind is that there is no operating-system command-line access on the database server, so you cannot run the Data Pump clients expdp and impdp there.  Instead, Data Pump jobs must be created through the PL/SQL API, DBMS_DATAPUMP.  This will be demonstrated in the other two blog posts in this series.
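To give a flavour of the API before the later posts cover it in detail, here is a minimal sketch of a schema-mode export submitted through DBMS_DATAPUMP.  The job name and dump file name are hypothetical; everything else uses the standard package calls.

```sql
DECLARE
  l_handle NUMBER;
BEGIN
  -- Open a schema-mode export job (job name is hypothetical)
  l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                                 job_mode  => 'SCHEMA',
                                 job_name  => 'DEMO_EXPORT');
  -- Write the dump file to the DATA_PUMP_DIR database directory
  DBMS_DATAPUMP.ADD_FILE(handle    => l_handle,
                         filename  => 'demo.dmp',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.START_JOB(l_handle);
  -- Detach; the job continues to run in the database
  DBMS_DATAPUMP.DETACH(l_handle);
END;
/
```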

Database Directory DATA_PUMP_DIR

On an Autonomous Database, there is no access to the underlying operating system or file system.  However, several database directories are created in the database, although write access is granted on only two of them.  In both cases, write access is held by, and can be granted by, the ADMIN and C##CLOUD$SERVICE users.
  • DATA_PUMP_DIR is provided for the use of Data Pump.  Write access is also granted to roles EXP_FULL_DATABASE, IMP_FULL_DATABASE, DATAPUMP_CLOUD_EXP and DATAPUMP_CLOUD_IMP. 
  • SQL_TCB_DIR is used by the SQL test case builder (see DBMS_SQLDIAG).  
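Since there is no file system access, the contents of these directories can instead be listed from SQL with DBMS_CLOUD.LIST_FILES, which takes the directory name as its argument.

```sql
SELECT object_name, bytes
FROM   table(DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR'));
```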
However, these two directories point to locations on a database file system (DBFS).
select directory_name, directory_path from dba_directories
where directory_name IN('DATA_PUMP_DIR','SQL_TCB_DIR')
order by directory_path
/

DIRECTORY_NAME  DIRECTORY_PATH                                              
--------------- ------------------------------------------------------------
DATA_PUMP_DIR  /u03/dbfs/4A03C84C700735C1E063D760000A3AE1/data/dpdump    
SQL_TCB_DIR    /u03/dbfs/4A03C84C700735C1E063D760000A3AE1/data/tcb_dir
Thus, their contents exist within the database.  They reside in a tablespace called DBFS_DATA, which counts towards the total size of the container.
select tablespace_name, file_id, file_name, bytes, status
from dba_data_files
where tablespace_name like 'DBFS%'
/
TABLESPACE_NAME    FILE_ID FILE_NAME                                               BYTES STATUS   
--------------- ---------- -------------------------------------------------- ---------- ---------
DBFS_DATA           10777 +DATA/FCEE691J/4A03C84C700735C1E063D760000A3AE1/DA 4399824896 AVAILABLE
                           TAFILE/dbfs_data.5025.1224350611
It is very easy to export to and import from these directories, but if you create a large dump file, the data files of the DBFS_DATA tablespace will extend and consume space in the database container.  In an Autonomous Database, no user has sufficient privilege to shrink the tablespace by conventional means.  
In the paid Autonomous Database service, the space can be released by shrinking the database with the OCI CLI (see the shrink command reference).  This is not available in the Free Tier.
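For the paid service, I believe the relevant OCI CLI operation is the Autonomous Database shrink command, along the lines of the sketch below; the OCID is a placeholder, and you should confirm the exact syntax against the command reference for your CLI version.

```shell
# Shrink the Autonomous Database storage (paid tier only; OCID is hypothetical)
oci db autonomous-database shrink \
    --autonomous-database-id ocid1.autonomousdatabase.oc1..aaaa...
```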

Setting Up OCI Command Line Interface

Create a pair of public and private API keys using the OCI Command Line Interface.
oci setup keys

File C:\Users\david\.oci\oci_api_key_public.pem already exists, do you want to overwrite? [y/N]: N
File C:\Users\david\.oci\oci_api_key.pem already exists, do you want to overwrite? [y/N]: N
Public key fingerprint: 72:7a:3d:1b:66:65:cf:a1:ed:2a:3d:42:6f:03:53:0b
Then follow the instructions in How to Upload the Public Key to upload the public key to the Tokens and Keys section of the OCI console.

Take the information in the configuration file preview and paste it back into the OCI configuration file.
This is in the same location as the public and private key files, in my case C:\Users\david\.oci\config.
Specify the fully qualified path and name of the private key file.
[DEFAULT]
user=ocid1.user.oc1..aaaaaaaaohj2suignixq7jjyrbkodaz5r73wonx25pzaspfpmkiyg76sra77
fingerprint=05:d4:fc:e2:81:9e:18:fb:00:d5:2b:9e:12:34:56:78
key_file=C:\Users\david\.oci\oci_api_key.pem
tenancy=ocid1.tenancy.oc1..aaaaaaaaf5e7vjqhg6m3jiv3zdos3b2hkbixqyq3tvqhtuyftyemzejzwosv
region=uk-london-1

Create Credential

Finally, create a credential with access to the OCI storage bucket that will hold the Data Pump files.  For OCI Object Storage, the password is an auth token generated for the user in the OCI console, not the console sign-in password.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL (
      credential_name   => 'OBJECT_STORE_CRED',
      username          => 'OracleIdentityCloudService/dba@go-faster.co.uk',
      password          => ';I)<P+3U:GRi6mprPD1n'
  );
END;
/
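You can check that the credential was created and is enabled by querying the USER_CREDENTIALS data dictionary view.

```sql
SELECT credential_name, username, enabled
FROM   user_credentials;
```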

Default Credential

I have found it convenient to specify a default credential.
ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.OBJECT_STORE_CRED';
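To confirm the property has been set, it can be read back from DATABASE_PROPERTIES.

```sql
SELECT property_value
FROM   database_properties
WHERE  property_name = 'DEFAULT_CREDENTIAL';
```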
For example, I can query the OCI bucket without having to specify a credential every time.
SELECT object_name, bytes/1024/1024 Mb
FROM table(
  DBMS_CLOUD.LIST_OBJECTS
  (location_uri => 'https://objectstorage.uk-london-1.oraclecloud.com/n/lrp********a/b/bucket-gofaster1/o/'));

OBJECT_NAME                                                        MB
------------------------------------------------------------ --------
export_strava_20260506.dmp                                      0.000
export_strava_20260506.dmp_segments/aaaaaa                      2.000
export_strava_20260506.dmp_segments/aaaaab                     24.000
...
export_strava_20260506.dmp_segments/aaaabf                     11.563
export_strava_20260506.log                                      0.004
The next blog describes how to create data pump jobs to export data.
