Upload a file to Amazon S3 using Python

To upload a file to Amazon S3 using Python, you can use the AWS SDK for Python (Boto3). Before you begin, make sure you have Boto3 installed and have configured your AWS credentials. You can install Boto3 using pip if it's not already installed:


[root@siddhesh ~]# pip3 install boto3
Collecting boto3
  Downloading boto3-1.28.79-py3-none-any.whl.metadata (6.7 kB)
Collecting botocore<1.32.0,>=1.31.79 (from boto3)
  Downloading botocore-1.31.79-py3-none-any.whl.metadata (6.1 kB)
Collecting jmespath<2.0.0,>=0.7.1 (from boto3)
  Downloading jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.8.0,>=0.7.0 (from boto3)
  Downloading s3transfer-0.7.0-py3-none-any.whl.metadata (1.8 kB)
Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in ./.local/lib/python3.9/site-packages (from botocore<1.32.0,>=1.31.79->boto3) (2.8.2)
Requirement already satisfied: urllib3<1.27,>=1.25.4 in ./.local/lib/python3.9/site-packages (from botocore<1.32.0,>=1.31.79->boto3) (1.26.14)
Requirement already satisfied: six>=1.5 in ./.local/lib/python3.9/site-packages (from python-dateutil<3.0.0,>=2.1->botocore<1.32.0,>=1.31.79->boto3) (1.16.0)
Downloading boto3-1.28.79-py3-none-any.whl (135 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 135.8/135.8 kB 12.7 MB/s eta 0:00:00
Downloading botocore-1.31.79-py3-none-any.whl (11.3 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.3/11.3 MB 64.2 MB/s eta 0:00:00
Downloading s3transfer-0.7.0-py3-none-any.whl (79 kB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 79.8/79.8 kB 12.8 MB/s eta 0:00:00
Installing collected packages: jmespath, botocore, s3transfer, boto3
Successfully installed boto3-1.28.79 botocore-1.31.79 jmespath-1.0.1 s3transfer-0.7.0
[root@siddhesh ~]#
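
A quick way to confirm that the installation worked and that Boto3 can find credentials is the short sketch below: boto3.Session().get_credentials() returns None when nothing is configured in the environment, in ~/.aws/credentials, or via an attached IAM role.

import boto3

# Confirm the import works and show the installed version
print(boto3.__version__)

# The default session resolves credentials from env vars, ~/.aws/credentials, or an IAM role
credentials = boto3.Session().get_credentials()
print('Credentials found' if credentials else 'No credentials configured')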

Below is the Python script to upload a file to an S3 bucket:


import boto3
import argparse

# Parse the command-line arguments: the file to upload is passed with --file / -f
parser = argparse.ArgumentParser(description="Command Line Argument Of upload_object.py")
parser.add_argument("--file", "-f", help="file path to upload")
args = parser.parse_args()
input_file = args.file

# AWS credentials and S3 details (replace the placeholders with your own values)
access_key = 'XXXXXXXXXXXXXXXXXXXX'
secret_key = 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'
bucket_name = 'mybucket-siddhesh-test'
s3_key = 'XXXXXXXXXXXXXXXXXXX'  # destination object key (not used below; the local file name is reused as the key)

# Create an S3 client with the access and secret keys
s3_connection = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)

try:
    # upload_file(local file path, bucket name, object key)
    s3_connection.upload_file(input_file, bucket_name, input_file)
    print(f'File successfully uploaded to S3 Bucket: s3://{bucket_name}/{input_file}')
except Exception as e:
    print(f'File upload failed to S3: {input_file} ({e})')

1. Import necessary modules:

boto3: This is the AWS SDK for Python, which provides a Pythonic interface to Amazon Web Services.

argparse: This module is used to parse command-line arguments.


2. Create an ArgumentParser object:

argparse.ArgumentParser is used to define and manage command-line arguments. It sets up a parser for the script's command-line arguments with a description.


3. Define a command-line argument:

parser.add_argument("--file", "-f", help="file path to upload"): This line defines a command-line argument named --file (short form -f) and provides a help message for it. It allows you to specify the file path that you want to upload.

4. Parse the command-line arguments:

args = parser.parse_args(): This line parses the command-line arguments provided when running the script.

5. Access the command-line argument values:

input_file = args.file: It assigns the value of the --file argument to the input_file variable.
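If you want the script to fail fast when no file is given, argparse can enforce the argument with required=True. A minimal, standalone sketch of the same argument handling (the extra print is only for illustration):

import argparse

parser = argparse.ArgumentParser(description="Upload a local file to S3")
# required=True makes argparse exit with a usage error when --file is omitted
parser.add_argument("--file", "-f", required=True, help="file path to upload")
args = parser.parse_args()

print(f"Will upload: {args.file}")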

6. Define AWS credentials and S3 details:

access_key and secret_key: These variables store AWS access and secret keys, which are required for authentication.

bucket_name: This variable stores the name of the S3 bucket where the file will be uploaded.

s3_key: This variable is meant to hold the destination object key, i.e. the path under which the file is stored inside the bucket. The script as written does not use it; the local file path is passed as the key instead (see step 8).
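Hardcoding keys in the script is fine for a quick test, but it is safer to read them from the standard AWS environment variables so they never end up in source control. A minimal sketch, assuming AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are exported in your shell:

import os
import boto3

# Read the keys from the standard AWS environment variables instead of hardcoding them
access_key = os.environ['AWS_ACCESS_KEY_ID']
secret_key = os.environ['AWS_SECRET_ACCESS_KEY']

s3_connection = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)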


7. Create an S3 connection:

s3_connection = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key): This line creates an S3 client connection using the provided access and secret keys.
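Boto3 can also resolve credentials on its own (environment variables, ~/.aws/credentials, or an IAM role), so passing the keys explicitly is optional. A sketch of both variants; the profile and region names are only examples:

import boto3

# Let the default credential chain find the keys
s3_connection = boto3.client('s3')

# Or pin a specific profile and region from ~/.aws/config (example values)
session = boto3.Session(profile_name='default', region_name='us-east-1')
s3_connection = session.client('s3')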

8. Try to upload the file to the S3 bucket:

s3_connection.upload_file(input_file, bucket_name, input_file): This line attempts to upload the specified file (input_file) to the specified S3 bucket (bucket_name). The third argument is the object key, so here the local file path is reused as the key.
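Because the file path doubles as the object key, uploading /root/mytest_log.txt would create a key containing the full path. If you prefer a different key, pass it explicitly as the third argument; the sketch below uses only the base name and sets a content type via ExtraArgs (the bucket and file names are just examples):

import os
import boto3

s3_connection = boto3.client('s3')  # assumes credentials are already configured
bucket_name = 'mybucket-siddhesh-test'
input_file = '/root/mytest_log.txt'

# Store the object under its base name rather than the full local path
object_key = os.path.basename(input_file)

s3_connection.upload_file(
    input_file,
    bucket_name,
    object_key,
    ExtraArgs={'ContentType': 'text/plain'},  # optional per-object metadata
)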


9. Handle success and failure:

If the upload is successful, it prints a success message with the S3 URL of the uploaded file.

If an exception occurs during the upload, it prints an error message along with the exception details.
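For more precise error reporting you can catch the specific exceptions Boto3 raises instead of a bare Exception. A sketch of the same try/except with narrower handlers (bucket and file names are just examples):

import boto3
from boto3.exceptions import S3UploadFailedError
from botocore.exceptions import ClientError, NoCredentialsError

s3_connection = boto3.client('s3')
bucket_name = 'mybucket-siddhesh-test'
input_file = 'mytest_log.txt'

try:
    s3_connection.upload_file(input_file, bucket_name, input_file)
    print(f'File successfully uploaded to S3 Bucket: s3://{bucket_name}/{input_file}')
except NoCredentialsError:
    print('No AWS credentials were found')
except (S3UploadFailedError, ClientError) as e:
    # S3UploadFailedError wraps transfer errors; ClientError carries the raw AWS error code
    print(f'File upload failed to S3: {input_file} ({e})')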

Let's execute the script and verify the data on S3 using the AWS CLI.

[root@siddhesh ~]# python3 /root/upload_object.py  -f mytest_log.txt
File successfully uploaded to S3 Bucket: s3://mybucket-siddhesh-test/mytest_log.txt
[root@siddhesh ~]# aws s3 ls mybucket-siddhesh-test --recursive
2023-11-05 20:04:35     127387 mytest_log.txt
[root@siddhesh ~]#
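
The same check can also be done from Python instead of the AWS CLI. head_object fetches an object's metadata without downloading it and raises an error if the key does not exist; a minimal sketch, assuming credentials are already configured:

import boto3

s3_connection = boto3.client('s3')

# Fetch metadata for the uploaded object without downloading its body
response = s3_connection.head_object(Bucket='mybucket-siddhesh-test', Key='mytest_log.txt')
print(response['ContentLength'], response['LastModified'])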
