7 May 2017

AWS S3 uploading and downloading from Linux command line

I recently wrote a bash script that automates database backups to zipped files on a Raspberry Pi. I would then periodically SSH in and transfer the backup files manually.

This was a simple, temporary, and manual solution, but I wanted a way to automate sending these files to a remote backup. I use AWS quite often, so my immediate plan was to transfer the files to S3 (Amazon's Simple Storage Service). It turns out Amazon has a very nifty command-line tool for AWS, including S3.

Here are my notes…

Installation

The platform I’m demonstrating with is Raspbian Jessie. This should be much the same for other Debian-based Linux distros, like Ubuntu.

Install Python PIP

$ sudo apt-get update
$ sudo apt-get install python-pip
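
You can check that pip installed correctly by asking it for its version:

$ pip --version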

Install AWS CLI

This will take a little while to complete.
Documentation: http://docs.aws.amazon.com/cli/latest/userguide/installing.html

$ pip install --upgrade --user awscli

Add aws command to the PATH variable

This is for convenience and means we can access the aws command anywhere in the terminal. To make it stick across shell sessions, the same export line also needs to be added to ~/.profile (the echo below does that).
Documentation: http://docs.aws.amazon.com/cli/latest/userguide/awscli-install-linux.html#awscli-install-linux-path

$ export PATH=~/.local/bin:$PATH
$ echo 'export PATH=~/.local/bin:$PATH' >> ~/.profile
$ source ~/.profile
$ which aws
/home/darian/.local/bin/aws

If which finds aws but it won't run, check that the binary is executable:

$ chmod +x ~/.local/bin/aws

Check installation

$ aws --version
aws-cli/1.11.82 Python/2.7.3 Linux/3.18.7+ botocore/1.5.45


Configuration

You will need to create a user on your AWS account and carefully configure its permissions and policies. I won't cover this in detail, but the basic steps are:

  1. Log in to the AWS console web site.
  2. Go to the IAM Management Console > Users > Add user
  3. Type in a user name and select Programmatic access to get an access key ID and secret access key, instead of a password.
  4. Set up the user’s permissions.
  5. Apply the user credentials to AWS CLI on the Linux machine.

In my situation, I'm using this for remote backups, so I restricted the user to a single S3 bucket ('my-bucket' in this example), with only list and upload permissions, but no delete.

Here’s my custom policy JSON:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:ListBucketMultipartUploads"
            ],
            "Resource": "arn:aws:s3:::my-bucket",
            "Condition": {}
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:AbortMultipartUpload",
                "s3:GetObject",
                "s3:GetObjectAcl",
                "s3:GetObjectVersion",
                "s3:GetObjectVersionAcl",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:PutObjectAclVersion"
            ],
            "Resource": "arn:aws:s3:::my-bucket/*",
            "Condition": {}
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*",
            "Condition": {}
        }
    ]
}
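
If you'd rather stay on the command line, the same policy can be attached as an inline policy from a machine that already has admin credentials configured for the CLI. The user name, policy name, and file name here are just placeholders:

$ aws iam put-user-policy --user-name backup-user --policy-name s3-backup-policy --policy-document file://policy.json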

Make sure to keep the provided ID and key safe and secure; we'll use them next. Note: The credentials displayed in my examples are fake! 😉

Back on the Linux machine, we’ll configure aws with our new user credentials:

$ aws configure
AWS Access Key ID [None]: AYL4EOGE3GCG8PA9PDVN
AWS Secret Access Key [None]: Zb5pCL0isKwbPvU6Zb5pLC0isKwbPvU6bZ5pLC0
Default region name [None]:
Default output format [None]:
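
Behind the scenes, aws configure stores these values in a plain-text file under ~/.aws, which is worth knowing if your backup script runs under a different user account:

$ cat ~/.aws/credentials
[default]
aws_access_key_id = AYL4EOGE3GCG8PA9PDVN
aws_secret_access_key = Zb5pCL0isKwbPvU6Zb5pLC0isKwbPvU6bZ5pLC0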

That’s it! You should be ready to go 🙂

Example S3 operations

Here are a few basic examples of how to access S3 from the command line.

List the contents of an S3 bucket

$ aws s3 ls s3://my-bucket
2017-05-04 13:30:36      51969 picture.jpg

List the contents of an S3 bucket directory

$ aws s3 ls s3://my-bucket/some/directory/
2017-05-03 13:39:42   13080027 20170502-1229_backup.zip
2017-05-04 13:48:13   13090301 20170503-1241_backup.zip
2017-05-04 14:56:19        675 profile.txt

Upload a file to S3

$ aws s3 cp local-file.zip s3://my-bucket/folder/remote-file.zip
upload: ./local-file.zip to s3://my-bucket/folder/remote-file.zip
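
Download a file from S3

Downloading (as the title promises) is the same cp command with the source and destination swapped. The file names here are placeholders:

$ aws s3 cp s3://my-bucket/folder/remote-file.zip local-file.zip
download: s3://my-bucket/folder/remote-file.zip to ./local-file.zip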

Delete a file from S3

Note: I received an access denied message because my user is deliberately not allowed to delete files.

$ aws s3 rm s3://my-bucket/some/directory/profile.txt
delete failed: s3://my-bucket/some/directory/profile.txt An error occurred (AccessDenied) when calling the DeleteObject operation: Access Denied
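
Sync a local directory to S3

For the backup job that motivated all this, aws s3 sync is worth knowing: it uploads only files that are new or changed locally, so it can run unattended from cron. The paths here are placeholders:

$ aws s3 sync /home/pi/backups s3://my-bucket/backups/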

About the Author:

Hardware and software engineer with experience in product development and building automation. Director at Cabot Technologies and Product Manager at NEX Data Management Systems.

5 comments

  1. Syndesi

    Hi,

    thank you for the tutorial. Did you try using AWS inside Docker or similar container software?

    Greetings,
    Syndesi

  2. Ralf Schmitz

    Thanks for the tutorial.

    By using it, I built my first-ever backup to an Amazon S3 bucket successfully.

    Your custom JSON seems a bit outdated, as Amazon pointed out an obsolete 'PutObjectAclVersion' command, but it is a pretty good start and easy to adjust in the AWS Management Console to one's needs…

  3. Mike

    Thanks for the tutorial, but you forgot to include how to download as indicated in your title.

  4. Andy

    Helpful overview of all the pieces. Thank you.
    Make sure you know what the --user flag does before you type
    pip install --upgrade --user
