Python and AWS S3
Amazon Web Services (AWS) offers many fantastic services, but perhaps none is as utilitarian as Amazon S3. S3 stands for Simple Storage Service, and you can store just about anything in it. Today I will give a quick example of using the Python package boto to do this.
I will assume that you already have an AWS account. To work with AWS programmatically you interact with its API, and to do that you need to be authorized via a client ID and secret; a good guide on how to get your client ID and secret is covered here.
Once you have your API client ID and secret, it's time to create a bucket in S3.
Once logged into your AWS console, search for S3.
Next, click Create bucket.
Make a note of the bucket name. You may click Next and leave all the default options as they are for the purposes of this demo.
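As an aside, you can also create the bucket from Python rather than the console. Here is a sketch using boto's `create_bucket`; the bucket name is a placeholder, and the name-validation helper is my own addition based on S3's bucket naming rules:

```python
import os
import re


def is_valid_bucket_name(name):
    """Rough check against S3 naming rules: 3-63 characters, lowercase
    letters, digits, hyphens, and dots, starting and ending with a
    letter or digit."""
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name))


def create_bucket(name):
    """Create an S3 bucket using the keys stored in the environment."""
    import boto  # deferred so the validator above works without boto installed

    if not is_valid_bucket_name(name):
        raise ValueError("invalid S3 bucket name: %r" % name)
    conn = boto.connect_s3(
        aws_access_key_id=os.environ["awsAccessKey"],
        aws_secret_access_key=os.environ["awsSecretKey"],
    )
    return conn.create_bucket(name)


# Example usage (placeholder name):
#   create_bucket("my-demo-bucket")
```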
Now that you have a client ID and secret, you can optionally store the keys in environment variables. This lets you reference the keys on your machine without hard-coding the values.
Let's load our file into our bucket by reading the secret and access key we just loaded into our OS environment variables.
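A minimal upload sketch with boto, assuming the environment variables above are set; the bucket name, local file, and key name are all placeholders:

```python
import os


def get_aws_credentials():
    """Read the access and secret keys from the environment variables set earlier."""
    return os.environ["awsAccessKey"], os.environ["awsSecretKey"]


def upload_file(bucket_name, local_path, key_name):
    """Upload a local file to an existing S3 bucket."""
    import boto  # deferred import so the credential helper works without boto
    from boto.s3.key import Key

    access_key, secret_key = get_aws_credentials()
    conn = boto.connect_s3(access_key, secret_key)
    bucket = conn.get_bucket(bucket_name)
    key = Key(bucket)
    key.key = key_name
    key.set_contents_from_filename(local_path)


# Example usage (placeholder names):
#   upload_file("my-demo-bucket", "myfile.csv", "demo/myfile.csv")
```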
Here is an additional example of reading a CSV file stored in S3 into a pandas DataFrame object.
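A sketch of that, again with boto; the CSV-parsing step is split out into its own helper, and the bucket and key names are placeholders:

```python
import io
import os


def csv_bytes_to_dataframe(csv_bytes):
    """Parse raw CSV bytes into a pandas DataFrame."""
    import pandas as pd

    return pd.read_csv(io.BytesIO(csv_bytes))


def read_csv_from_s3(bucket_name, key_name):
    """Download a CSV object from S3 and return it as a DataFrame."""
    import boto  # deferred import, as in the upload example

    conn = boto.connect_s3(
        os.environ["awsAccessKey"], os.environ["awsSecretKey"]
    )
    bucket = conn.get_bucket(bucket_name)
    key = bucket.get_key(key_name)
    return csv_bytes_to_dataframe(key.get_contents_as_string())


# Example usage (placeholder names):
#   df = read_csv_from_s3("my-demo-bucket", "demo/myfile.csv")
```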
Storing the keys as environment variables, as mentioned above, looks like this (replace the placeholder values with your own keys):

```python
import os

os.environ['awsAccessKey'] = 'xxxxxx'  # placeholder access key
os.environ['awsSecretKey'] = 'yyyyyy'  # placeholder secret key
```