
Monday, June 01, 2020

Loading a Flat File from OCI Object Storage into an Autonomous Database. Part 1. Upload to Object Storage

This blog is the first in a series of three that looks at transferring a file to Oracle Cloud Infrastructure (OCI) Object Storage, and then reading it into the database with an external table or copying it into a regular table.
Last year I wrote a blog titled Reading the Active Session History Compressed Export File in eDB360/SQLd360 as an External Table.  I set myself the challenge of doing the same thing with an Autonomous Database.  I imagine these techniques are commonly used Oracle Cloud operations, yet I found the documentation spread across a number of places, and it took me a while to get everything working. So, I hope you find this series helpful.

Install OCI

I could upload my data file through the browser directly into an Object Storage bucket, but that would mean first copying it to a Windows desktop, which is not a good option for very large files.  Instead, I am going to install the OCI Command Line Interface (CLI) on the Linux VM where my data file resides (see OCI CLI Quickstart Guide).
I am installing this into the oracle user on a Linux VM where the Oracle database has previously been installed, so I just accepted all the defaults.
bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

Set up Token-Based Authentication for OCI

I couldn't get the instructions for generating a token without a browser to work.  Instead, I installed the OCI CLI on a Windows machine, generated a token there, and transferred it to my Linux VM (see Token-based Authentication for the CLI).
C:\Users\david.kurtz>oci session authenticate
Enter a region (e.g. ap-mumbai-1, ap-seoul-1, ap-sydney-1, ap-tokyo-1, ca-toronto-1, eu-frankfurt-1, eu-zurich-1, sa-saopaulo-1, uk-london-1, us-ashburn-1, us-gov-ashburn-1, us-gov-chicago-1, us-gov-phoenix-1, us-langley-1, us-luke-1, us-phoenix-1): uk-london-1
    Please switch to newly opened browser window to log in!
    Completed browser authentication process!
Config written to: C:\Users\david.kurtz\.oci\config

    Try out your newly created session credentials with the following example command:

    oci iam region list --config-file C:\Users\david.kurtz\.oci\config --profile DEFAULT --auth security_token
If I run the suggested example command, I get this response with the list of OCI regions.
{
  "data": [
…
    {
      "key": "LHR",
      "name": "uk-london-1"
    },
…
  ]
}

Export OCI Profile

Now I can export the profile to a zip file.
C:\Users\david.kurtz>oci session export --profile DEFAULT --output-file DEFAULT
File DEFAULT.zip already exists, do you want to overwrite it? [y/N]: y
Exporting profile: DEFAULT from config file: C:\Users\david.kurtz\.oci\config
Export file written to: C:\Users\david.kurtz\DEFAULT.zip

Import OCI Profile

I can transfer this zip file to my Linux VM and import it.
[oracle@oracle-database .oci]$ oci session import --session-archive ./DEFAULT.zip --force
Config already contains a profile with the same name as the archived profile: DEFAULT. Provide an alternative name for the imported profile: myprofile
Imported profile myprofile written to: /home/oracle/.oci/config

    Try out your newly imported session credentials with the following example command:

    oci iam region list --config-file /home/oracle/.oci/config --profile myprofile --auth security_token
I can test it by again getting the list of OCI regions.
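Repeating the --config-file, --profile and --auth options on every command soon gets tedious.  The OCI CLI also reads these settings from environment variables, so a sketch like the following (assuming a CLI version that supports the OCI_CLI_* variables) lets subsequent commands be run without the extra options:

```shell
# Set once per shell session; each variable replaces the corresponding
# command-line option (the oci CLI reads the OCI_CLI_* variables).
export OCI_CLI_CONFIG_FILE=/home/oracle/.oci/config
export OCI_CLI_PROFILE=myprofile
export OCI_CLI_AUTH=security_token
```

With these set, the test above becomes simply oci iam region list.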

Upload a File

I have created a bucket on OCI.
I could upload the file through the OCI web interface, but I want to use the command line from my Linux VM.
[oracle@oracle-database ~]$ oci os object put --bucket-name bucket-20200505-1552 --file /media/sf_temp/dba_hist_active_sess_history.txt.gz --disable-parallel-uploads --config-file /home/oracle/.oci/config --profile myprofile --auth security_token
Upload ID: 1ad452f7-ab49-a24b-2fe9-f55f565cdf40
Split file into 2 parts for upload.
Uploading object  [####################################]  100%
{
  "etag": "66681c40-4e11-4b73-baf9-cc1e4c3ebd5f",
  "last-modified": "Wed, 06 May 2020 15:17:03 GMT",
  "opc-multipart-md5": "MFdfU7vGZlJ5Mb4nopxtpw==-2"
}
I can see the file in the bucket via the web interface, and I can see that the size and the MD5 checksum are both correct.
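The trailing -2 in opc-multipart-md5 reflects the two upload parts.  To my understanding, the value is the base64-encoded MD5 of the concatenated binary MD5s of the parts, with the part count appended, so it can also be recomputed locally for comparison.  The sketch below assumes that algorithm and the CLI's default part size (128 MiB at the time of writing; pass the size actually used).  multipart_md5 is a hypothetical helper written for this post, not part of the OCI CLI.

```shell
# multipart_md5 FILE PART_SIZE_BYTES
# Recomputes an OCI-style multipart checksum:
# base64( md5( md5(part1) || md5(part2) || ... ) )-N
multipart_md5() {
    local f=$1 part=$2 tmp n=0 p
    tmp=$(mktemp -d)
    split -b "$part" "$f" "$tmp/part."              # cut the file into parts
    for p in "$tmp"/part.*; do
        openssl dgst -md5 -binary "$p" >> "$tmp/md5s"   # binary MD5 of each part
        n=$((n + 1))
    done
    # MD5 the concatenated part digests, base64-encode, append the part count
    printf '%s-%d\n' "$(openssl dgst -md5 -binary "$tmp/md5s" | base64)" "$n"
    rm -rf "$tmp"
}
```

If the same part size was used, multipart_md5 /media/sf_temp/dba_hist_active_sess_history.txt.gz $((128*1024*1024)) should print the opc-multipart-md5 value shown above.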
In the next post, I will explain how to read the file from Object Storage using an External Table.
