Question: guidance/best practice on how to connect to other containers from functions #226
Comments
Hi Mark,

There are three ways to do this. Let me know if you have any questions about the approaches listed. You're also welcome to join Slack if you email alex@openfaas.com

Option 1 - host port: Bind a port to the host and use the host's IP / DNS entry in the function.

Option 2 - Swarm service: If you are creating a new container such as MySQL then you can create a swarm service and simply specify the network as an additional parameter. That makes it resolvable via DNS and doesn't require a port to be exposed on the host.

Option 3 - attachable network: Go to your docker-compose YAML file and uncomment the last line, which says "attachable". Now delete and re-deploy OpenFaaS (this must remove func_functions from
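For anyone reading along, Option 2 might look roughly like this. The service name, password, and image tag are illustrative (not from this thread); `func_functions` is the default name of the OpenFaaS network on Swarm:

```sh
# Sketch of Option 2: run MySQL as a swarm service attached to the
# OpenFaaS functions network. Functions can then resolve it by its
# service name ("db") on port 3306 -- no host port needs exposing.
docker service create \
  --name db \
  --network func_functions \
  --env MYSQL_ROOT_PASSWORD=secret \
  mysql:5.7
```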
Thanks @alexellis, I really appreciate you getting back to me so quickly. Your comments made me revisit option 1. I'd been using '0.0.0.0' to try to access MySQL, but of course that won't work. As you say though, using the host's IP does work! Thanks! I'll still join Slack if that's ok--I can see myself using OpenFaaS a lot going forwards.
In terms of best practice, I prefer Option 2 + 3. I like the question though, this is a gap in our online guides / docs. @johnmccabe @rgee0 Full index of guides including workflows / chaining.
Send me an email with a 1-2 line intro and I'll introduce you to our community. |
Thanks @alexellis. Yes, I liked options 2 and 3 as well. However, the majority of the scenarios I'm working on involve using centralised databases. For example, a number of pipelines take data from MySQL and Postgres, map it to a different schema, and then place it into ElasticSearch; I'm working on using OpenFaaS to run these pipelines so that I get all of your Prometheus/scaling/healthcheck goodness 'for free'. At some point I may factor the source and destination parts of the pipeline out so that the functions just do the data processing, but for now it's a nice interim step to be able to run the reads and writes 'inside' the OpenFaaS function.

There are also a number of little things that OpenFaaS is ideal for. For example, we want to have an endpoint that simply tells you the available versions of an image. It's a simple query against the database, and we were about to go through all of the hassle of setting up a REST (or more likely, GraphQL) endpoint. But by using OpenFaaS we get the endpoint we want running in no time, and more importantly get all of your scaling and monitoring stuff.

None of these use cases will be news to you, of course! But they are illustrations of how you might need to access Docker services over which you don't have any control.
On reflection I've actually gone for option 3. I'll explain why in case anyone stumbles across this thread in the future.

Looking at it again, I realised that in a live environment the reference to some centralised DB is easy: it will be some host name that is unlikely to be running in the same swarm as the functions, so visibility is unlikely to be a problem. My issue was that I'm trying to run unit tests that need access to a test database on my own development machine, and certainly the easiest way to set that up is simply to allow the database container to join the OpenFaaS network.
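In case it helps anyone following the same route, a sketch of what Option 3 involves, assuming the stock OpenFaaS docker-compose.yml and an illustrative container name:

```yaml
# In the OpenFaaS docker-compose.yml, mark the functions network as
# attachable so that plain (non-swarm) containers can join it:
networks:
  functions:
    driver: overlay
    attachable: true
```

After removing the stack (so that `func_functions` is deleted and recreated) and redeploying, an existing container can join the network with `docker network connect func_functions my-test-mysql` (container name illustrative).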
Just to let you know, you can configure the database through environment variables using the YAML stack file; here's an example: https://github.com/alexellis/faas-twitter-fanclub/blob/master/stack.yml
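As a rough illustration of that approach (names and values below are placeholders, not taken from the linked repo), a stack file can pass the connection details to the function as environment variables:

```yaml
# stack.yml (sketch): the function reads these values at runtime,
# e.g. via os.environ in Python, depending on the function's language.
functions:
  import-pipeline:
    lang: python
    handler: ./import-pipeline
    image: example/import-pipeline:latest
    environment:
      mysql_host: db        # swarm service name resolvable via DNS
      mysql_port: 3306
      mysql_database: app
```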
Expected Behaviour
A function might query a MySQL database, for example.
Current Behaviour
A naive attempt at passing the IP address of the host to the function is not good enough, unfortunately.
Telling the function to join the same Docker network as the one the MySQL database is on also fails, since port 8080 is already in use (i.e., by using `faas-cli deploy --network host`). And doing it the other way around, i.e., telling the MySQL database to join the swarm network, gives me an error:
I have seen mention in various places that whether the swarm network is 'attachable' can be configured, but looking through the `faas-cli` code I can't really see where that might be set. But anyway, I'm not sure that would be sufficient, since it would require being able to configure how the database is launched, when most of the time we wouldn't have that control (it would already be running).
Possible Solution
I know this is really a Docker networking issue but I don't know enough to know what to Google for!
So any help would be appreciated.
Thanks!
Context
I'd like functions to have access to services that are already running--like ElasticSearch, MySQL, Postgres, etc.
Your Environment
Docker version
`docker version` (e.g. Docker 17.0.05): 17.09.0-ce, build afdb6d4
Are you using Docker Swarm or Kubernetes (FaaS-netes)?
Docker Swarm
Operating System and version (e.g. Linux, Windows, MacOS):
MacOS
Link to your project or a code example to reproduce issue:
Not yet public, I'm afraid.