How to Upload and Download Files From AWS S3 Using Python

Learn how to use cloud resources in your Python scripts

Photo by Raj Steven from Pexels

I am writing this post out of sheer frustration.

Every post I've read on this topic assumed that I already had an account in AWS, an S3 bucket, and a mound of stored data. They simply show the code but kindly gloss over the most important part: making the code work through your AWS account.

Well, I could've figured out the code easily, thank you very much. I had to sift through many SO threads and the AWS docs to get rid of every nasty authentication error along the way.

So that you won't have to experience the same and do the hard work, I will share all the technicalities of managing an S3 bucket programmatically, right from account creation to giving your local machine permissions to access your AWS resources.

Step 1: Set up an account

Right, let's start with creating your AWS account if you haven't already. Nothing unusual, just follow the steps from this link:

GIF by the author

Then, we will go to the AWS IAM (Identity and Access Management) console, where we will be doing most of the work.

GIF by the author

You can easily switch between different AWS services, create users, add policies, and allow access to your user account from the console. We will do each one by one.

Step 2: Create a user

For one AWS account, you can create multiple users, and each user can have various levels of access to your account's resources. Let's create a sample user for this tutorial:

GIF by the author

In the IAM console:

  1. Go to the Users tab.
  2. Click on Add users.
  3. Enter a username in the field.
  4. Tick the "Access key - Programmatic access" field (essential).
  5. Click "Next" and "Attach existing policies directly."
  6. Tick the "AdministratorAccess" policy.
  7. Click "Next" until you see the "Create user" button.
  8. Finally, download the given CSV file of your user's credentials.

It should look like this:

By me🥱

Store it somewhere safe because we will be using the credentials later.

Step 3: Create a bucket

Now, let's create an S3 bucket where we can store data.

GIF by the author

In the IAM console:

  1. Click Services in the top left corner.
  2. Scroll down to Storage and select S3 from the right-hand list.
  3. Click "Create bucket" and give it a name.

You can choose any region you want. Leave the rest of the settings as they are and click "Create bucket" once more.

Step 4: Create a policy and add it to your user

In AWS, access is managed through policies. A policy can be a set of settings or a JSON file attached to an AWS object (user, resource, group, role), and it controls which aspects of the object you can use.

Below, we will create a policy that enables us to interact with our bucket programmatically, i.e., through the CLI or in a script.

GIF by the author

In the IAM console:

  1. Go to the Policies tab and click "Create a policy."
  2. Click the "JSON" tab and insert the code below:
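A minimal sketch of such a policy, granting full S3 access to a single bucket, looks like this (the exact JSON is assumed to follow the standard IAM policy shape; your-bucket-name is a placeholder):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::your-bucket-name",
                "arn:aws:s3:::your-bucket-name/*"
            ]
        }
    ]
}
```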

replacing your-bucket-name with your own. If you pay attention, in the Action field of the JSON, we are putting s3:* to allow any interaction with our bucket. This is very broad, so you may want to allow only specific actions. In that case, check out this page of the AWS docs to learn how to limit access.

This policy is only attached to the bucket, and we should connect it to the user as well so that your API credentials work correctly. Here are the instructions:

GIF by the author

In the IAM console:

  1. Go to the Users tab and click on the user we created in the last section.
  2. Click the "Add permissions" button.
  3. Click the "Attach existing policies" tab.
  4. Filter them by the policy we just created.
  5. Tick the policy, review it, and click "Add" one final time.

Step 5: Download the AWS CLI and configure your user

We download the AWS command-line tool because it makes authentication so much easier. Kindly go to this page and download the executable for your platform:

GIF by the author

Run the executable and reopen any active terminal sessions to let the changes take effect. Then, type aws configure:

GIF by the author

Insert your AWS Access Key ID and Secret Access Key, along with the region you created your bucket in (use the CSV file). You can find the region name of your bucket on the S3 page of the console:

By me.

Just press Enter when you reach the Default output format field in the configuration. There won't be any output.
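For reference, the whole exchange looks roughly like this (the key and region values shown are made-up placeholders; use your own from the CSV file):

```
$ aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: xXxXxXxXxXxXxXxXxXxXxXxXxXxXxXxXxXxXxX
Default region name [None]: us-east-1
Default output format [None]:
```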

Step 6: Upload your files

We are almost there.

Now, we upload a sample dataset to our bucket so that we can download it in a script afterward:

GIF by the author

It should be easy once you go to the S3 page and open your bucket.

Step 7: Check if authentication is working

Finally, pip install the Boto3 package and run this snippet:
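Something like the following minimal sketch will do; it just prints the name of every bucket your credentials can see:

```python
import boto3

# Create an S3 client; it automatically picks up the
# credentials saved by `aws configure`
s3 = boto3.client("s3")

# List the buckets the account has access to
response = s3.list_buckets()
for bucket in response["Buckets"]:
    print(bucket["Name"])
```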

If the output contains your bucket name(s), congratulations: you now have full access to many AWS services through boto3, not just S3.

Using Python Boto3 to download files from the S3 bucket

With the Boto3 package, you have programmatic access to many AWS services such as SQS, EC2, SES, and many aspects of the IAM console.

Still, as a regular data scientist, you will mostly need to upload and download data from an S3 bucket, so we will only cover those operations.

Let's start with the download. After importing the package, create an S3 client using the client function:
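A minimal sketch of that setup:

```python
import boto3

# The client reads the credentials stored by `aws configure`
s3 = boto3.client("s3")
```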

To download a file from an S3 bucket and immediately save it, we can use the download_file function:
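For example, continuing with the client created above (the bucket name and both file paths are hypothetical placeholders):

```python
# Download s3://sample-bucket/train.csv and save it locally;
# the bucket name and paths are placeholders
s3.download_file(
    Bucket="sample-bucket",     # name of your bucket
    Key="train.csv",            # path of the file inside the bucket
    Filename="data/train.csv",  # local path to save the file to
)
```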

There won't be any output if the download is successful. You should pass the exact path of the file to be downloaded to the Key parameter. The Filename parameter should contain the path you want to save the file to.

Uploading is also very straightforward:
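A sketch with the same placeholder names as before:

```python
# Upload a local file to the bucket; all names are placeholders
s3.upload_file(
    Filename="data/train.csv",  # local file to upload
    Bucket="sample-bucket",     # name of your bucket
    Key="train.csv",            # where to store the file inside the bucket
)
```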

The function is upload_file, and you only have to change the order of the parameters from the download function.

Conclusion

I advise reading the Boto3 docs for more advanced examples of managing your AWS resources. They cover services other than S3 and contain code recipes for the most common tasks with each one.

Thanks for reading!

You can become a premium Medium member using the link below and get access to all of my stories and thousands of others:

Or just subscribe to my e-mail list:

You can reach out to me on LinkedIn or Twitter for a friendly chat about all things data. Or you can just read another story from me. How about these:

Source: https://towardsdatascience.com/how-to-upload-and-download-files-from-aws-s3-using-python-2022-4c9b787b15f2?source=post_internal_links---------6-------------------------------
