Async client for Amazon services using botocore and aiohttp/asyncio.
The main purpose of this library is to support the Amazon S3 API, but other services should work as well (possibly with minor fixes). So far only the upload/download APIs for S3 have been tested; users have reported that SQS and DynamoDB also work (a rough SQS sketch follows the S3 example below). More tests are coming soon.
$ pip install aiobotocore
import asyncio
import aiobotocore

AWS_ACCESS_KEY_ID = "xxx"
AWS_SECRET_ACCESS_KEY = "xxx"


async def go(loop):
    bucket = 'dataintake'
    filename = 'dummy.bin'
    folder = 'aiobotocore'
    key = '{}/{}'.format(folder, filename)

    session = aiobotocore.get_session(loop=loop)
    client = session.create_client('s3', region_name='us-west-2',
                                   aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                                   aws_access_key_id=AWS_ACCESS_KEY_ID)

    # upload object to amazon s3
    data = b'\x01' * 1024
    resp = await client.put_object(Bucket=bucket,
                                   Key=key,
                                   Body=data)
    print(resp)

    # getting s3 object properties of file we just uploaded
    resp = await client.get_object_acl(Bucket=bucket, Key=key)
    print(resp)

    # list s3 objects using paginator
    paginator = client.get_paginator('list_objects')
    async for result in paginator.paginate(Bucket=bucket, Prefix=folder):
        for c in result.get('Contents', []):
            print(c)

    # get object from s3
    response = await client.get_object(Bucket=bucket, Key=key)
    # this will ensure the connection is correctly re-used/closed
    async with response['Body'] as stream:
        body = await stream.read()

    # delete object from s3
    resp = await client.delete_object(Bucket=bucket, Key=key)
    print(resp)


loop = asyncio.get_event_loop()
loop.run_until_complete(go(loop))
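As noted above, S3 is the only service exercised by the test suite; SQS is only reported to work. As a rough, untested sketch of the same pattern against SQS (the queue name 'test-queue' is made up, credentials are taken from the default botocore chain, and the calls mirror the standard botocore SQS client methods):

import asyncio
import aiobotocore


async def go_sqs(loop):
    session = aiobotocore.get_session(loop=loop)
    client = session.create_client('sqs', region_name='us-west-2')

    # create a throwaway queue; 'test-queue' is just an example name
    resp = await client.create_queue(QueueName='test-queue')
    queue_url = resp['QueueUrl']

    # send one message, then receive and delete it
    await client.send_message(QueueUrl=queue_url, MessageBody='hello')
    resp = await client.receive_message(QueueUrl=queue_url)
    for msg in resp.get('Messages', []):
        print(msg['Body'])
        await client.delete_message(QueueUrl=queue_url,
                                    ReceiptHandle=msg['ReceiptHandle'])


loop = asyncio.get_event_loop()
loop.run_until_complete(go_sqs(loop))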
Make sure you have the development requirements installed and your Amazon key and secret accessible via environment variables:
$ cd aiobotocore
$ export AWS_ACCESS_KEY_ID=xxx
$ export AWS_SECRET_ACCESS_KEY=xxx
$ pip install -Ur requirements-dev.txt
Run the test suite:
$ py.test -v tests
Questions and discussion are welcome on the aio-libs Google Group:
https://groups.google.com/forum/#!forum/aio-libs