This repository has been archived by the owner on Jan 15, 2023. It is now read-only.

access external service such as Localstack and Elasticsearch #56

Closed
ghost opened this issue Sep 20, 2017 · 3 comments

Comments

@ghost

ghost commented Sep 20, 2017

I created a Lambda function using Localstack Lambda service and triggered it using docker-lambda.

My Lambda is supposed to save objects into the LocalStack S3 service, which runs in another container, but I always get this error message. I wonder if anyone could help me fix it.

err: UnknownEndpoint: Inaccessible host: `test.localstack'. This service may not be available in the `us-east-1' region.

I triggered the Lambda by running:

docker run -d --link localstack:localstack --network mynetwork -v "/tmp/localstack/zipfile.283766df":/var/task "lambci/lambda:nodejs6.10" "test.handler"

My docker-compose file looks like the following:

elasticsearch:
  image: docker.elastic.co/elasticsearch/elasticsearch:5.2.1
  volumes:
    - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
  ports:
    - "9200:9200"
    - "9300:9300"
  environment:
    ES_JAVA_OPTS: "-Xmx256m -Xms256m"
  networks:
    - mynetwork

lambci:
  image: lambci/lambda:nodejs6.10
  networks:
    - mynetwork

localstack:
  image: localstack/localstack
  ports:
    - "4567-4582:4567-4582"
    - "8080:8080"
  environment:
    - DEFAULT_REGION=us-west-2
    - SERVICES=${SERVICES-lambda, kinesis, s3}
    - DEBUG=1
    - DATA_DIR=${DATA_DIR- }
    - LAMBDA_EXECUTOR=docker
    - KINESIS_ERROR_PROBABILITY=${KINESIS_ERROR_PROBABILITY- }
    - DOCKER_HOST=unix:///var/run/docker.sock
  volumes:
    - "/tmp/localstack:/tmp/localstack"
    - "/var/run/docker.sock:/var/run/docker.sock"
  networks:
    - mynetwork

networks:
  mynetwork:
    driver: bridge

@dsole

dsole commented Dec 1, 2017

You're trying to access LocalStack with the hostname test.localstack.
Does it work if you use the hostname localstack, matching the container's name?

@rvolgers

Hey, I know this issue is old, but since I think I know the answer I'll leave it here in case anyone else hits this problem.

S3 causes trouble with LocalStack because the AWS SDK defaults to virtual-hosted-style addressing, which prepends the bucket name to the hostname. In this case the bucket name was probably 'test', so the SDK tried to resolve the host 'test.localstack'. That host alias doesn't exist, so the request fails.

The solution is to set the 's3ForcePathStyle: true' option (pass it to the AWS.S3 constructor, for example). This keeps the bucket name in the request path instead of the hostname, and S3 should then work as expected.
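To make the difference concrete, here is a minimal sketch of the two addressing styles. The helper function, the endpoint 'localstack:4572', and the bucket name 'test' are assumptions taken from this thread, not part of any SDK API:

```javascript
// Hypothetical helper illustrating how the SDK builds the request URL.
// Endpoint 'localstack:4572' and bucket 'test' are assumptions from this thread.
function s3Url(endpoint, bucket, key, forcePathStyle) {
  if (forcePathStyle) {
    // Path style: the bucket stays in the path, so the host is just the
    // container name, which Docker's embedded DNS can resolve.
    return 'http://' + endpoint + '/' + bucket + '/' + key;
  }
  // Virtual-hosted style: the bucket is prepended to the hostname, producing
  // a host ('test.localstack') that Docker's DNS cannot resolve.
  return 'http://' + bucket + '.' + endpoint + '/' + key;
}

console.log(s3Url('localstack:4572', 'test', 'data.json', false));
// http://test.localstack:4572/data.json  (unresolvable host -> UnknownEndpoint)
console.log(s3Url('localstack:4572', 'test', 'data.json', true));
// http://localstack:4572/test/data.json  (resolvable container name)
```

With the real SDK this corresponds to something like new AWS.S3({ endpoint: 'http://localstack:4572', s3ForcePathStyle: true }), where the endpoint value is an assumption based on the port mapping in the compose file above.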

@mhart
Member

mhart commented May 6, 2019

Closing this as outdated

@mhart mhart closed this as completed May 6, 2019