Update Python READMEs
alfpark committed Jun 14, 2016
1 parent 6683d05 commit 6dbdda4
Showing 2 changed files with 47 additions and 34 deletions.
Python/Batch/README.md (43 additions, 33 deletions)
##Azure Batch Python Samples

###Configuring the samples
In order to run these Python samples, they must be configured with Azure Batch
and Azure Storage credentials. The credentials for each sample are gathered
from the common configuration located [here](./configuration.cfg). Once you
have configured your account credentials, you can run any of the samples and
they will make use of the credentials provided in the common configuration
file.
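As a concrete sketch of how a sample might read the shared credentials, the
snippet below uses Python's standard `configparser`. The section and option
names are illustrative assumptions, not the repository's actual keys; consult
[configuration.cfg](./configuration.cfg) for the real layout.

```python
# Illustrative sketch only: the 'Batch'/'Storage' section names and option
# names below are assumptions -- check configuration.cfg for the real keys.
try:
    import configparser                      # Python 3
except ImportError:
    import ConfigParser as configparser      # Python 2.7

def load_credentials(path):
    """Parse Batch and Storage credentials from a shared config file."""
    config = configparser.ConfigParser()
    config.read(path)
    return {
        'batch_account': config.get('Batch', 'batchaccountname'),
        'batch_key': config.get('Batch', 'batchaccountkey'),
        'storage_account': config.get('Storage', 'storageaccountname'),
    }
```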
Each sample also has a configuration file specific to the individual sample

###Setting up the Python environment
In order to run the samples, you will need a Python interpreter compatible
with version 2.7 or 3.3+. You will also need to install the
[Azure Batch](https://pypi.python.org/pypi/azure-batch) and
[Azure Storage](https://pypi.python.org/pypi/azure-storage) python packages.
Installation can be performed using the [requirements.txt](./requirements.txt)
file via the command `pip install -r requirements.txt`.

You can also optionally use the
[Visual Studio project](./BatchSamples.pyproj) and the
sku ids, selecting a publisher, offer and sku for the Linux VM gallery image.
####[sample3\_encrypted\_resourcefiles.py](./sample3_encrypted_resourcefiles.py)
This sample shows how to generate on-demand encryption keys in conjunction with
[blobxfer](../Storage) to encrypt local files into Azure Storage which will
then be decrypted for the task when it executes, which ensures files are
encrypted not only in transit but also at rest in storage. This sample
showcases a variety of Azure Batch interactions, including: adding
certificates to an account, creating a pool with a certificate reference,
and accessing
certificates on a Linux Batch compute node. This sample is geared towards
Linux with the assumption of a locally available OpenSSL and blobxfer
installation that is accessible from the sample path invocation. This sample
tunnel script to interact with the batch pool after the sample is run. When
using this option, you will also need to disable the delete pool option as
well.

####[sample5\_docker\_batch\_task.py](./sample5_docker_batch_task.py)
This sample shows how to schedule tasks that run a docker container.
Specifically, this sample uses a pool start task to install docker on the
Ubuntu-based Batch VMs. It then uses job preparation tasks to pull the
application image from docker hub and finally submits a set of simple tasks.
Each task will launch a docker container to perform ffmpeg transcoding on a
downloaded mp4 file and upload the result to blob storage. The docker image
(yidingz/ffmpeg:v3) is CentOS based with ffmpeg and blobxfer preinstalled
and is available on docker hub.

##Azure Batch on Linux Best Practices

Although some of the Python samples are not specific to Linux, the Azure Batch
team would like to provide guidance on best practices for hosting your Linux
workloads on Azure Batch.

####Wrap your command(s) in a shell or provide a shell script

Unless you have a single program you wish to execute that is resolvable in the
default `PATH` specified by the distribution (e.g., `/bin`, `/usr/bin`) or
in `$AZ_BATCH_TASK_WORKING_DIR`, it is advisable to wrap your command
in a shell. For instance,

    /bin/bash -c "command1 && command2"

would execute `command1` inside a bash shell, followed by `command2` if
`command1` was successful.

Alternatively, upload a shell script as part of your resource files for
your task that encompasses your program execution workflow.
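This wrapping pattern can be exercised locally with Python's `subprocess`
module. This is a hedged illustration, not part of the Batch SDK or these
samples; the `run_wrapped` helper is hypothetical and simply chains commands
with `&&` as in the example above.

```python
import subprocess

def run_wrapped(commands):
    """Run a sequence of commands in one bash shell, chained with '&&' so
    each command runs only if the previous one succeeded. Returns the
    shell's exit code."""
    return subprocess.call(['/bin/bash', '-c', ' && '.join(commands)])
```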

####Check for exit codes for each command in a shell script

You should check for exit codes within your shell script for each command
invocation in a series if you depend on successful program execution for
modified to:
If `command2` fails, then the entire script will exit with the proper
return code of the failing command.
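As a local Python analogue (not Batch-specific), `subprocess.check_call`
stops at the first failing command and raises an exception, much like
`set -e` does in a shell script. The `run_all_or_fail` helper below is
purely illustrative.

```python
import subprocess

def run_all_or_fail(commands):
    """Run each command in its own bash shell; raise CalledProcessError
    as soon as one exits non-zero, similar to 'set -e' semantics."""
    for cmd in commands:
        subprocess.check_call(['/bin/bash', '-c', cmd])
```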

####Wait for your background commands

If you require executing multiple programs at the same time and cannot split
the invocation across multiple tasks, ensure you wrap your execution flow in
a shell and provide the appropriate wait command for all child processes. For
instance,

    #!/usr/bin/env bash
    set -e
    command1 &
    command2 &
    command3 &
    wait

This would ensure that all child processes exit before the parent exits.
Without the `wait` command, the Azure Batch service will not be able to
properly track when the compute node has completed execution of the
backgrounded tasks.
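The same launch-then-wait pattern can be sketched in Python with
`subprocess`; this is an illustration of the principle, not part of the
samples.

```python
import subprocess

def run_concurrently(commands):
    """Start every command in its own bash shell concurrently, then block
    until all children exit; returns their exit codes in order."""
    procs = [subprocess.Popen(['/bin/bash', '-c', c]) for c in commands]
    return [p.wait() for p in procs]  # the 'wait' step: reap every child
```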

####Set preferred locale and encoding

Linux shell scripts or program invocations via Azure Batch tasks will execute
under the `POSIX` locale. If your programs require a specific locale and
encoding, e.g., to encode Unicode characters, then you will need to set the
locale via an environment variable. For example,

    # set environment variables on job: applies to all tasks in job
    job = azure.batch.models.CloudJob(
        common_environment_settings=[
            azure.batch.models.EnvironmentSetting('LC_ALL', 'en_US.UTF-8')
        ],
        # ... more args
    )
encoding for all tasks added to the job. Alternatively you can set environment
variables for each individual task:

    # set environment variables on single task
    task = azure.batch.models.CloudTask(
        environment_settings=[
            azure.batch.models.EnvironmentSetting('LC_ALL', 'en_US.UTF-8')
        ],
        # ... more args
    )

There are similar environment settings arguments for start task, job
preparation task, and job release task. Although we recommend using the
built-in environment variable control provided by the Azure Batch API, as
these environment variables will be set upon execution of the task, you
can, as always, directly set shell environment variables in the shell
invocation for your task command(s):

A final note: not all locales may be present and installed on the compute node
and may require a start task or job preparation task for installation of the
desired locale.
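To see the effect of per-process environment settings locally, the following
sketch passes `LC_ALL` to a child shell via Python's `subprocess`. This only
mimics on your workstation what the Batch environment settings accomplish on
a compute node; it does not use the Batch API.

```python
import os
import subprocess

# The child process sees LC_ALL in its environment at execution time,
# analogous to a task started with the environment settings shown above.
child_env = dict(os.environ)
child_env['LC_ALL'] = 'en_US.UTF-8'
output = subprocess.check_output(
    ['/bin/bash', '-c', 'echo -n "$LC_ALL"'], env=child_env)
```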

####stdout.txt and stderr.txt encoding

On Linux compute nodes, task `stdout.txt` and `stderr.txt` files are encoded
with UTF-8. If your program generates Unicode characters, ensure that the file
is interpreted with UTF-8 encoding. Please see the related note above
regarding locale and encoding.
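When post-processing these output files from Python, an explicit encoding
avoids misdecoded characters. The helper below is an illustrative sketch
that works on both Python 2.7 and 3.x via `io.open`.

```python
import io

def read_task_output(path):
    """Read a task's stdout.txt/stderr.txt, decoding it as UTF-8 as
    written by Linux compute nodes."""
    with io.open(path, 'r', encoding='utf-8') as f:
        return f.read()
```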

####Do not perform release upgrades on compute nodes

Many distributions offer the ability to perform a release upgrade. By
"release upgrade," we refer to major version upgrades such as from Ubuntu
new pool.
Note that we are evaluating automating OS and security updates as a possible
future enhancement.

####Consider asyncio for blocking Azure Batch calls

With Python [3.4](https://docs.python.org/3.4/library/asyncio.html),
[3.5+ (async/await)](https://docs.python.org/3.5/library/asyncio.html), or
Python/Storage/README.rst (4 additions, 1 deletion)
specify the share name as the second positional argument.

The above example would upload all files in the ``localfiles`` directory to
the share named ``myshare``. Encryption/decryption options are compatible with
Azure Files as the destination or source. Please refer to this
`MSDN article`_ for features not supported by the Azure File Service.

.. _MSDN article: https://msdn.microsoft.com/en-us/library/azure/dn744326.aspx

General Notes
-------------
