1. Create Amazon S3-compatible Customer Secret Keys in OCI
1. Log in to your OCI tenancy and go to the user profile section.
2. Go to the Customer Secret Keys section and create a key pair (an access key and a secret key, analogous to AWS credentials). Note: you must save the secret key at creation time, as it is displayed only once.
2. Set the designated compartment in which new buckets created through the S3 Compatibility API will be placed. In my case I am using my own compartment.
3. Set up Boto3 with the Python SDK
1. Use any Python IDE; I am using PyCharm. Take care with the Boto version: Python 2.x supports only the legacy boto library, while Python 3.x supports boto3.
1. Boto3 reads its configuration from files under the ~/.aws directory or from environment variables. Make sure you are supplying the customer secret key credentials, either through those standard locations or by passing them manually in code.
2. AWS S3 has no compartment concept, but in OCI it is very important to specify a compartment when creating object storage / buckets. You can set it through the SDK or manually from the OCI console.
3. Boto3 talks to an OCI endpoint URL, so it is the user's responsibility to provide the correct namespace and region in the endpoint URL. If you are working with many regions, build one endpoint per region.
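The OCI S3-compatible endpoint follows the documented pattern https://&lt;namespace&gt;.compat.objectstorage.&lt;region&gt;.oraclecloud.com. As a sketch of the last point above, a small helper (the function name and sample region/namespace values are my own, not from the original) can build one endpoint per region:

```python
def oci_s3_endpoint(namespace: str, region: str) -> str:
    """Build the OCI S3-compatible endpoint URL for a tenancy namespace and region."""
    return f"https://{namespace}.compat.objectstorage.{region}.oraclecloud.com"

# When working across several regions, generate one endpoint per region.
# "mynamespace" and the region identifiers below are placeholder examples.
regions = ["us-ashburn-1", "eu-frankfurt-1"]
for region in regions:
    print(oci_s3_endpoint("mynamespace", region))
```

Each URL produced this way can then be passed as the `endpoint_url` argument when creating the boto3 client or resource.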
These are some simple examples of using S3 services against OCI. You can modify and extend them according to your needs.
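The examples assume boto3 can find the customer secret key pair. One option, if you prefer not to pass credentials in code, is the standard AWS credentials file at ~/.aws/credentials (the values below are placeholders to replace with your own keys):

```
[default]
aws_access_key_id = <customer_secret_key_id>
aws_secret_access_key = <customer_secret_key>
```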
```python
import boto3
import pandas
from io import StringIO

# Placeholder values -- substitute your own tenancy namespace, region,
# customer secret key pair, bucket name, and file/object names.
s3 = boto3.resource(
    "s3",
    region_name="<region>",
    endpoint_url="https://<namespace>.compat.objectstorage.<region>.oraclecloud.com",
    aws_access_key_id="<customer_secret_key_id>",
    aws_secret_access_key="<customer_secret_key>",
)

####################### List out available buckets ####################################
for bucket in s3.buckets.all():
    print(bucket.name)

############## Create S3 bucket / object storage ##########################
s3.create_bucket(Bucket="<bucket-name>")
response = s3.meta.client.list_buckets()
print(response)

###################### Upload file #######################################
with open("<file-name>", "r") as f:
    print(f.read())
s3.Bucket("<bucket-name>").upload_file("<file-name>", "<object-key>")

########################### Access file from S3 ##########################
s3_object = s3.Object(bucket_name="<bucket-name>", key="<object-key>")
s3_data = StringIO(s3_object.get()["Body"].read().decode("utf-8"))
data = pandas.read_csv(s3_data)
print(data.head())

######################## Delete bucket / object storage #####################
s3.Bucket("<bucket-name>").delete()
```
Following these best practices eliminates some hard-to-diagnose errors, such as Boto3 configuration and OCI endpoint declaration mistakes, which are not always mentioned in the documentation. The Object Storage service provided by Oracle Cloud Infrastructure and Amazon S3 use similar concepts and terminology, so working with the SDKs/APIs is straightforward.
1. AWS S3 documentation: https://aws.amazon.com/s3/
2. Boto3 documentation: https://boto3.amazonaws.com/
3. OCI S3 Compatibility API: https://docs.cloud.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm
I am a Principal Solution Architect on the A-Team SaaS team. We specialize in OCI SaaS extensions, mainly focusing on OCI infrastructure, PaaS, and SaaS.