
Update docs with connect_to_region calls
jamesls committed Mar 7, 2013
1 parent 75c921d commit 8927783
Showing 8 changed files with 51 additions and 67 deletions.
11 changes: 5 additions & 6 deletions docs/source/autoscale_tut.rst
@@ -32,9 +32,6 @@ There are two ways to do this in boto. The first is:
>>> from boto.ec2.autoscale import AutoScaleConnection
>>> conn = AutoScaleConnection('<aws access key>', '<aws secret key>')

-Alternatively, you can use the shortcut:
-
->>> conn = boto.connect_autoscale()

A Note About Regions and Endpoints
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -43,7 +40,7 @@ default the US endpoint is used. To choose a specific region, instantiate the
AutoScaleConnection object with that region's endpoint.

>>> import boto.ec2.autoscale
->>> ec2 = boto.ec2.autoscale.connect_to_region('eu-west-1')
+>>> autoscale = boto.ec2.autoscale.connect_to_region('eu-west-1')

Alternatively, edit your boto.cfg with the default Autoscale endpoint to use::

@@ -163,7 +160,8 @@ will now be a property of our ScalingPolicy objects.
Next we'll create CloudWatch alarms that will define when to run the
Auto Scaling Policies.

->>> cloudwatch = boto.connect_cloudwatch()
+>>> import boto.ec2.cloudwatch
+>>> cloudwatch = boto.ec2.cloudwatch.connect_to_region('us-west-2')
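
For orientation, the alarms created later in this tutorial are ``MetricAlarm``
objects tied to a scaling policy. A minimal sketch, illustrative only: the group
name is a placeholder and ``scale_up_policy`` stands in for the ScalingPolicy
created earlier in the tutorial, which this excerpt does not show:

>>> from boto.ec2.cloudwatch import MetricAlarm
>>> alarm_dimensions = {'AutoScalingGroupName': 'my_group'}  # placeholder group name
>>> scale_up_alarm = MetricAlarm(
...     name='scale_up_on_cpu', namespace='AWS/EC2',
...     metric='CPUUtilization', statistic='Average',
...     comparison='>', threshold='70',
...     period='60', evaluation_periods=2,
...     alarm_actions=[scale_up_policy.policy_arn],  # hypothetical policy from earlier
...     dimensions=alarm_dimensions)
>>> cloudwatch.create_alarm(scale_up_alarm)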

It makes sense to measure the average CPU usage across the whole Auto Scaling
Group, rather than individual instances. We express that as CloudWatch
@@ -199,7 +197,8 @@ beyond the limits of the Scaling Group's 'max_size' and 'min_size' properties.

To retrieve the instances in your autoscale group:

->>> ec2 = boto.connect_ec2()
+>>> import boto.ec2
+>>> ec2 = boto.ec2.connect_to_region('us-west-2')
>>> group = conn.get_all_groups(names=['my_group'])[0]
>>> instance_ids = [i.instance_id for i in group.instances]
>>> reservations = ec2.get_all_instances(instance_ids)
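
If you need the ``Instance`` objects themselves rather than reservations, the
usual final step is to flatten them, as in this small sketch:

>>> instances = [i for r in reservations for i in r.instances]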
5 changes: 4 additions & 1 deletion docs/source/boto_config_tut.rst
@@ -117,7 +117,10 @@ Even if you have your boto config setup, you can also have credentials and
options stored in environment variables, or you can explicitly pass them to
method calls, e.g.::

->>> boto.connect_ec2('<KEY_ID>','<SECRET_KEY>')
+>>> boto.ec2.connect_to_region(
+...     'us-west-2',
+...     aws_access_key_id='foo',
+...     aws_secret_access_key='bar')
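
For comparison (not part of this change), the same call can pick its credentials
up from the standard AWS environment variables when no keyword arguments are
given. A minimal sketch:

>>> import os
>>> os.environ['AWS_ACCESS_KEY_ID'] = 'foo'         # normally exported in your shell
>>> os.environ['AWS_SECRET_ACCESS_KEY'] = 'bar'
>>> import boto.ec2
>>> conn = boto.ec2.connect_to_region('us-west-2')  # credentials resolved from the environment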

In these cases where these options can be found in more than one place, boto
will first use the explicitly supplied arguments; if none are found it will then
6 changes: 3 additions & 3 deletions docs/source/cloudwatch_tut.rst
@@ -12,8 +12,8 @@ EC2Connection object or call the monitor method on the Instance object.
It takes a while for the monitoring data to start accumulating but once
it does, you can do this::

->>> import boto
->>> c = boto.connect_cloudwatch()
+>>> import boto.ec2.cloudwatch
+>>> c = boto.ec2.cloudwatch.connect_to_region('us-west-2')
>>> metrics = c.list_metrics()
>>> metrics
[Metric:NetworkIn,
@@ -113,4 +113,4 @@ about that particular data point.::
u'Timestamp': u'2009-05-21T19:55:00Z',
u'Unit': u'Percent'}

My server obviously isn't very busy right now!
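
For readers wondering how datapoints like the one above are retrieved: the
``Metric`` objects returned by ``list_metrics`` have a ``query`` method. A rough
sketch, with an arbitrary time window and statistic:

>>> import datetime
>>> end = datetime.datetime.utcnow()
>>> start = end - datetime.timedelta(hours=1)
>>> metric = metrics[0]
>>> metric.query(start, end, 'Average')
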
9 changes: 5 additions & 4 deletions docs/source/dynamodb_tut.rst
@@ -16,8 +16,9 @@ Creating a Connection
The first step in accessing DynamoDB is to create a connection to the service.
To do so, the most straightforward way is the following::

->>> import boto
->>> conn = boto.connect_dynamodb(
+>>> import boto.dynamodb
+>>> conn = boto.dynamodb.connect_to_region(
+'us-west-2',
aws_access_key_id='<YOUR_AWS_KEY_ID>',
aws_secret_access_key='<YOUR_AWS_SECRET_KEY>')
>>> conn
@@ -27,7 +28,7 @@ Bear in mind that if you have your credentials in boto config in your home
directory, the two keyword arguments in the call above are not needed. More
details on configuration can be found in :doc:`boto_config_tut`.
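
A quick way to sanity-check the connection is to list your tables (a sketch; the
names returned will be whatever exists in your account):

>>> conn.list_tables()
['messages', 'users']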

-The :py:func:`boto.connect_dynamodb` functions returns a
+The :py:func:`boto.dynamodb.connect_to_region` function returns a
:py:class:`boto.dynamodb.layer2.Layer2` instance, which is a high-level API
for working with DynamoDB. Layer2 is a set of abstractions that sit atop
the lower level :py:class:`boto.dynamodb.layer1.Layer1` API, which closely
@@ -297,7 +298,7 @@ method, or by passing in the
the ``dynamizer`` param::

>>> from boto.dynamodb.types import Dynamizer
->>> conn = boto.connect_dynamodb(dynamizer=Dynamizer)
+>>> conn = boto.dynamodb.connect_to_region('us-west-2', dynamizer=Dynamizer)

This mechanism can also be used if you want to customize the encoding/decoding
process of DynamoDB types.
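
A custom dynamizer is simply a subclass of ``Dynamizer``. A rough, hypothetical
sketch (the logging behaviour is invented purely for illustration):

>>> from boto.dynamodb.types import Dynamizer
>>> class LoggingDynamizer(Dynamizer):
...     def encode(self, value):
...         print('encoding %r' % (value,))  # trace values on their way to DynamoDB
...         return super(LoggingDynamizer, self).encode(value)
...
>>> conn = boto.dynamodb.connect_to_region('us-west-2', dynamizer=LoggingDynamizer)
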
52 changes: 15 additions & 37 deletions docs/source/elb_tut.rst
@@ -43,48 +43,26 @@ Creating a Connection

The first step in accessing ELB is to create a connection to the service.

->>> import boto
->>> conn = boto.connect_elb(
-        aws_access_key_id='YOUR-KEY-ID-HERE',
-        aws_secret_access_key='YOUR-SECRET-HERE'
-    )


A Note About Regions and Endpoints
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Like EC2, the ELB service has a different endpoint for each region. By default
-the US East endpoint is used. To choose a specific region, instantiate the
-ELBConnection object with that region's information.
-
->>> from boto.regioninfo import RegionInfo
->>> reg = RegionInfo(
-        name='eu-west-1',
-        endpoint='elasticloadbalancing.eu-west-1.amazonaws.com'
-    )
->>> conn = boto.connect_elb(
-        aws_access_key_id='YOUR-KEY-ID-HERE',
-        aws_secret_access_key='YOUR-SECRET-HERE',
-        region=reg
-    )
-
-Another way to connect to an alternative region is like this:
+the US East endpoint is used. To choose a specific region, use the
+``connect_to_region`` function::

->>> import boto.ec2.elb
->>> elb = boto.ec2.elb.connect_to_region('eu-west-1')
+>>> import boto.ec2.elb
+>>> elb = boto.ec2.elb.connect_to_region('us-west-2')

Here's yet another way to discover what regions are available and then
-connect to one:
+connect to one::

>>> import boto.ec2.elb
>>> regions = boto.ec2.elb.regions()
>>> regions
[RegionInfo:us-east-1,
 RegionInfo:ap-northeast-1,
 RegionInfo:us-west-1,
 RegionInfo:ap-southeast-1,
 RegionInfo:eu-west-1]
>>> elb = regions[-1].connect()
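
However the connection is obtained, it is used the same way. For example,
listing the load balancers in the region, shown here as a brief sketch:

>>> balancers = elb.get_all_load_balancers()
>>> [lb.name for lb in balancers]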

Alternatively, edit your boto.cfg with the default ELB endpoint to use::

19 changes: 9 additions & 10 deletions docs/source/emr_tut.rst
@@ -27,18 +27,18 @@ and then call the constructor without any arguments, like this:

>>> conn = EmrConnection()

-There is also a shortcut function in the boto package called connect_emr
-that may provide a slightly easier means of creating a connection:
+There is also a shortcut function in boto
+that makes it easy to create EMR connections:

->>> import boto
->>> conn = boto.connect_emr()
+>>> import boto.emr
+>>> conn = boto.emr.connect_to_region('us-west-2')

In either case, conn points to an EmrConnection object which we will use
throughout the remainder of this tutorial.

Creating Streaming JobFlow Steps
--------------------------------
Upon creating a connection to Elastic Mapreduce you will next
want to create one or more jobflow steps. There are two types of steps, streaming
and custom jar, both of which have a class in the boto Elastic Mapreduce implementation.

@@ -76,8 +76,8 @@ Creating JobFlows
-----------------
Once you have created one or more jobflow steps, you will next want to create and run a jobflow. Creating a jobflow that executes either of the steps we created above can be accomplished by:

->>> import boto
->>> conn = boto.connect_emr()
+>>> import boto.emr
+>>> conn = boto.emr.connect_to_region('us-west-2')
>>> jobid = conn.run_jobflow(name='My jobflow',
... log_uri='s3://<my log uri>/jobflow_logs',
... steps=[step])
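
After submission you can poll the jobflow's progress with ``describe_jobflow``;
a short sketch (the state shown is illustrative):

>>> status = conn.describe_jobflow(jobid)
>>> status.state
u'STARTING'
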
@@ -102,7 +102,6 @@ Terminating JobFlows
--------------------
By default, when all the steps of a jobflow have finished or failed, the jobflow terminates. However, if you set the ``keep_alive`` parameter to ``True`` or just want to halt the execution of a jobflow early, you can terminate a jobflow by:

->>> import boto
->>> conn = boto.connect_emr()
+>>> import boto.emr
+>>> conn = boto.emr.connect_to_region('us-west-2')
>>> conn.terminate_jobflow('<jobflow id>')

9 changes: 5 additions & 4 deletions docs/source/ses_tut.rst
@@ -15,8 +15,9 @@ Creating a Connection
The first step in accessing SES is to create a connection to the service.
To do so, the most straightforward way is the following::

->>> import boto
->>> conn = boto.connect_ses(
+>>> import boto.ses
+>>> conn = boto.ses.connect_to_region(
+'us-east-1',
aws_access_key_id='<YOUR_AWS_KEY_ID>',
aws_secret_access_key='<YOUR_AWS_SECRET_KEY>')
>>> conn
@@ -26,7 +27,7 @@ Bear in mind that if you have your credentials in boto config in your home
directory, the two keyword arguments in the call above are not needed. More
details on configuration can be found in :doc:`boto_config_tut`.

-The :py:func:`boto.connect_ses` functions returns a
+The :py:func:`boto.ses.connect_to_region` function returns a
:py:class:`boto.ses.connection.SESConnection` instance, which is the boto API
for working with SES.
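
As a quick taste of that API (not part of this diff; the addresses are
placeholders and must first be verified with SES):

>>> conn.verify_email_address('sender@example.com')
>>> conn.send_email(
...     'sender@example.com',
...     'Hello from boto',
...     'This is a test message sent through SES.',
...     ['recipient@example.com'])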

@@ -168,4 +169,4 @@ where we'll just show a short excerpt here::
]
}
}
}
7 changes: 5 additions & 2 deletions docs/source/simpledb_tut.rst
@@ -13,8 +13,11 @@ Creating a Connection
The first step in accessing SimpleDB is to create a connection to the service.
To do so, the most straightforward way is the following::

->>> import boto
->>> conn = boto.connect_sdb(aws_access_key_id='<YOUR_AWS_KEY_ID>',aws_secret_access_key='<YOUR_AWS_SECRET_KEY>')
+>>> import boto.sdb
+>>> conn = boto.sdb.connect_to_region(
+...     'us-west-2',
+...     aws_access_key_id='<YOUR_AWS_KEY_ID>',
+...     aws_secret_access_key='<YOUR_AWS_SECRET_KEY>')
>>> conn
SDBConnection:sdb.amazonaws.com
>>>
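
From here the typical next steps are creating a domain and storing an item, as
in this hedged sketch with made-up names:

>>> domain = conn.create_domain('my_domain')
>>> item = domain.new_item('item_1')
>>> item['key'] = 'value'
>>> item.save()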
