`docker machine` is evolving into a useful tool for programmers and systems administrators. As the code base becomes more stable and robust, demand will likely emerge for:

- higher-order functionality, such as coordinating the complex interactions and topologies of an Amazon VPC
- ways to automate `docker machine` actions from the programming language of one's choice without shelling out

I would propose that `machine` be able to run in a server mode which accepts API requests (or that we provide some way of wrapping `machine` which does). It would look something like this:
To start the server:

```
$ machine -d -p 7000
Listening for machine requests on 127.0.0.1:7000....
```
[in a separate window]
```
$ curl --silent \
    -X POST \
    -H 'Content-Type: application/json' \
    http://localhost:7000/create \
    -d '{"Driver": "ec2", "AwsAccessKeyId": "blah", "AwsSecretKey": "foo", "HostName": "my-amazon"}'
{
    "HostName":"my-amazon",
    "Status":"Starting"
}
```
The API responds to the create request quickly and queues the action in the background, freeing the user up to do other things in the meantime, e.g. check status:
```
$ curl --silent \
    -X GET \
    http://localhost:7000/ls
[
    {
        "Name":"default",
        "Active":true,
        "Driver":"",
        "State":"",
        "Url":"unix:///var/run/docker.sock"
    },
    {
        "Name":"my-amazon",
        "Active":false,
        "Driver":"ec2",
        "State":"starting",
        "Url":"tcp://50.116.43.32:2376"
    }
]
```
It also opens up the possibility of long-running concurrent actions that aren't possible with the current (non-backgrounded) implementation, e.g. creating hosts on Digital Ocean, Rackspace, and Azure concurrently. Right now, users have to "bring their own concurrency" using `&` or some other solution.