Install and Configure S3CMD Tool For Data Sync On S3 Bucket

s3cmd is a command-line tool for uploading data to and downloading data from S3 storage on an EC2 instance. If you want to connect to S3 from an EC2 instance, this tool needs to be installed. First of all, we will have to create a user with programmatic access to the AWS cloud. We can then create buckets and sync data using s3cmd on Linux.

Create a User On IAM:

First, go to IAM user management and create a user. I have created “user-s3” with programmatic access, which means we will authenticate using an access key and a secret key.

I will attach one of the default AWS-managed S3 policies to the user; if you prefer, you can create your own policy and attach that instead.

Now the user is created.
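The same user can also be created from the AWS CLI instead of the console. This is a hedged sketch: the user name matches this post, but the AmazonS3FullAccess policy ARN is an assumption; substitute whichever S3 policy you attached.

```shell
# Sketch: create the "user-s3" IAM user from the AWS CLI instead of the console.
# The policy ARN is an assumption (AmazonS3FullAccess); use your own if different.
USER_NAME="user-s3"
POLICY_ARN="arn:aws:iam::aws:policy/AmazonS3FullAccess"

# Run these on a machine that already has admin AWS credentials:
#   aws iam create-user --user-name "$USER_NAME"
#   aws iam create-access-key --user-name "$USER_NAME"   # prints the access/secret keys
#   aws iam attach-user-policy --user-name "$USER_NAME" --policy-arn "$POLICY_ARN"
echo "would create IAM user: $USER_NAME"
```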

Create a Bucket On S3:

The Next Step is to create a bucket on S3 AWS. Go to the S3 storage and click on “Create bucket“.

The “masters3” bucket has been created and it’s empty.

Install The S3CMD Tool:

Once you log in to the EC2 instance, run the command below to install s3cmd. A single command is enough because the package is available in the EPEL repository, so we don’t need to download it separately.

[root@s3bucket ~]# yum install epel-release s3cmd -y

As you can see, the s3cmd tool is now installed on Linux.

s3cmd.noarch 0:2.0.0-2.el6
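You can confirm the install took effect with a quick version check. This sketch is safe to run anywhere; the version string will vary by distribution.

```shell
# Verify s3cmd is on the PATH; prints e.g. "s3cmd version 2.0.0" when installed.
if command -v s3cmd >/dev/null 2>&1; then
    s3cmd --version
else
    echo "s3cmd not found"
fi
```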

The package is installed, and there is no daemon or service for it because it’s just a command-line tool. Next, we have to configure it so the EC2 instance can connect to our bucket on S3. You can follow the instructions shown below.

[root@s3bucket ~]# s3cmd --configure
Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3. Leave them empty for using the env variables.
Access Key: AKIA3OZ56MC72JTW2IGX       ## User access key "user-s3"              
Secret Key: n2gLjn1HXa2HeReFXbWM3h9zzcXI1SObsNdQqj3C    ## secret key
Default Region [US]: us-east-1

Use "s3.amazonaws.com" for S3 Endpoint and not modify it to the target Amazon S3.
S3 Endpoint [s3.amazonaws.com]:

Use "%(bucket)s.s3.amazonaws.com" to the target Amazon S3. "%(bucket)s" and "%(location)s" vars can be used
if the target S3 system supports dns based buckets.
DNS-style bucket+hostname:port template for accessing a bucket [%(bucket)s.s3.amazonaws.com]:

Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: OZ56MC72J
Path to GPG program [/usr/bin/gpg]:

When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP, and can only be proxied with Python 2.7 or newer
Use HTTPS protocol [Yes]: yes

New settings:
Access Key: AKIA3OZ56MC72JTW2IGX
Secret Key: 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
Default Region: us-east-1
S3 Endpoint: s3.amazonaws.com
DNS-style bucket+hostname:port template for accessing a bucket: %(bucket)s.s3.amazonaws.com
Encryption password: OZ56MC72J
Path to GPG program: /usr/bin/gpg
Use HTTPS protocol: True
HTTP Proxy server name:
HTTP Proxy server port: 0

Test access with supplied credentials? [Y/n] Y
Please wait, attempting to list all buckets...

Success. Your access key and secret key worked fine :-)

Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)

Save settings? [y/N] y
Configuration saved to '/root/.s3cfg'
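Since /root/.s3cfg now holds the access and secret keys in plain text, it is worth restricting the file to its owner. A minimal sketch follows; it uses a stand-in file so it runs anywhere, but the same chmod applies to the real config.

```shell
# Restrict the s3cmd config to the owner; shown on a stand-in file.
CFG="/tmp/s3cfg-demo"     # in practice: /root/.s3cfg
touch "$CFG"
chmod 600 "$CFG"          # owner read/write only, same as for the real file
stat -c '%a %n' "$CFG"    # -> "600 /tmp/s3cfg-demo"
```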

The tool is now configured and we’re able to connect to S3 storage. We can list the buckets available on S3 using the commands below. There’s only one bucket, which I created manually from the console.

1. List the Buckets on S3.

[root@s3bucket ~]# s3cmd ls
2020-04-25 12:49 s3://masters3

2. Upload a file to the Bucket. The syntax is: s3cmd put -r file_path s3://bucket_name

[root@s3bucket ~]# s3cmd -r put testbucket s3://masters3
upload: 'testbucket' -> 's3://masters3/testbucket' [1 of 1]
0 of 0 0% in 0s 0.00 B/s done
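A variation on the same upload, assuming you want the object to be publicly readable: s3cmd’s --acl-public flag uploads it world-readable and prints a public URL. The sketch below only assembles the command so the block runs anywhere; the bucket and file names are the ones from this post.

```shell
# Sketch: the same upload with a public-read ACL. On a configured instance, run
# the command this prints; s3cmd will then show the object's public URL.
BUCKET="s3://masters3"
echo "s3cmd put --acl-public testbucket $BUCKET"
```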

3. Upload a directory to the Bucket.

[root@s3bucket ~]# mkdir data
[root@s3bucket ~]# touch data/abc.txt
[root@s3bucket ~]# s3cmd put -r data s3://masters3
upload: 'data/abc.txt' -> 's3://masters3/data/abc.txt' [1 of 1]
0 of 0 0% in 0s 0.00 B/s done

4. List the contents inside the Bucket.  

[root@s3bucket ~]# s3cmd ls s3://masters3
DIR s3://masters3/data/
2020-04-25 12:57 0 s3://masters3/testbucket
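Plain ls only shows the top level of the bucket; to walk into the data/ prefix as well, s3cmd supports recursive listing. A sketch follows, with the commands shown as strings so the block runs anywhere.

```shell
# Sketch: two ways to list every object, not just the top level.
BUCKET="s3://masters3"
echo "s3cmd ls --recursive $BUCKET"   # recurse into prefixes like data/
echo "s3cmd la"                       # "list all": every object in every bucket
```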

5. Download a particular file from the Bucket.

[root@s3bucket ~]# s3cmd get s3://masters3/testbucket
download: 's3://masters3/testbucket' -> './testbucket' [1 of 1]
0 of 0 0% in 0s 0.00 B/s done
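get also works recursively, which is handy for pulling a whole prefix back down. A sketch, again assembling the command as a string; the local restore/ directory name is a hypothetical example.

```shell
# Sketch: download the whole data/ prefix into a local directory.
BUCKET="s3://masters3"
echo "s3cmd get --recursive $BUCKET/data/ ./restore/"
```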

6. Delete the files from the Bucket

[root@s3bucket ~]# s3cmd del s3://masters3/testbucket
delete: 's3://masters3/testbucket'

7. Try to delete the Bucket and you will see the error below, because the bucket has objects inside it.

[root@s3bucket ~]# s3cmd rb s3://masters3
ERROR: S3 error: 409 (BucketNotEmpty): The bucket you tried to delete is not empty

So, we will have to delete the objects first then we can delete the Bucket.

[root@s3bucket ~]# s3cmd del s3://masters3/data/abc.txt
delete: 's3://masters3/data/abc.txt'
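Deleting objects one by one gets tedious; s3cmd can empty the bucket recursively before the rb. A sketch of the two-step teardown (destructive, so double-check the bucket name; the commands are printed as strings so the block runs anywhere):

```shell
# Sketch: empty the bucket recursively, then remove it.
BUCKET="s3://masters3"
echo "s3cmd del --recursive --force $BUCKET"   # delete every object in the bucket
echo "s3cmd rb $BUCKET"                        # now rb succeeds
```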

8. We can also synchronize the data using the below commands. 

[root@s3bucket ~]# s3cmd sync -r data/ s3://masters3
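Two flags make sync safer and more mirror-like: --dry-run previews what would transfer, and --delete-removed deletes remote objects whose local copies are gone. A sketch, with the commands printed as strings:

```shell
# Sketch: preview first, then make the bucket mirror the local directory.
BUCKET="s3://masters3"
echo "s3cmd sync --dry-run data/ $BUCKET"          # show what would be transferred
echo "s3cmd sync --delete-removed data/ $BUCKET"   # also delete remote-only objects
```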

9. We can also create a Bucket using s3cmd from the EC2 instance. It will be created directly on S3 storage. The syntax is: s3cmd mb s3://bucket_name

[root@s3bucket ~]# s3cmd mb s3://masters
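Since the point of the tool here is data sync, you may want to run it on a schedule. A minimal sketch of a nightly cron entry follows; the 02:00 time, /root/data path, and bucket are assumptions, and it writes the line to a temp file rather than touching your real crontab.

```shell
# Sketch: a nightly 02:00 sync as a cron line (time, path, and bucket are examples).
CRON_LINE='0 2 * * * /usr/bin/s3cmd sync --delete-removed /root/data/ s3://masters3'
echo "$CRON_LINE" > /tmp/s3cmd-cron.txt
cat /tmp/s3cmd-cron.txt
# Install it for root with: crontab /tmp/s3cmd-cron.txt  (this replaces the crontab)
```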

You’re done; s3cmd is now configured on Linux.
