
Integrate with AWS API Gateway. #5

Merged: 1 commit into master on Nov 20, 2018
Conversation

osery (Contributor) commented on Nov 13, 2018

WIP on #2721.

@osery force-pushed the osery-lambda-api-gateway branch 2 times, most recently from f17c36b to 6d0c2dd on November 14, 2018 08:34
@osery requested review from @chandanmaruthi and removed the request for @johnygomez on November 14, 2018 08:53
chandanmaruthi left a comment

@osery looks good.
Here are a couple of quick pointers and things to consider:

  • Here is a link to an API spec we have been considering for scoring, just FYI: https://github.com/h2oai/h2oai/blob/69ac746dda5977bfdaeeac4394d17bd1dc1de713/h2oai_scorer/scoring-pipeline/rest_server/swagger.yaml
  • When a model is deployed for scoring, how do you know what the API endpoint is?
  • I am assuming that when multiple models are deployed there will be separate API endpoints? If the API cannot be queried to know which model it is scoring, things may get tricky quickly once you have more than one model.
  • How is an API unpublished? Assuming it is easy to create and deploy models, will there be lots of dormant APIs?
  • Assuming this is served over the internet, some minimum authentication mechanism for the API may be a good idea (a minimal sketch follows this list).
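
For illustration only, here is what a client call with the simplest API Gateway auth option (an API key passed in the `x-api-key` header) might look like. The URL, key, and payload are all placeholders; the actual request schema would be whatever the swagger spec defines:

```python
import requests  # third-party HTTP client: pip install requests

# Placeholder values; the real URL comes out of the deployment and the
# key from an API Gateway usage plan.
ENDPOINT = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/score"
API_KEY = "example-api-gateway-key"

# API Gateway validates API keys passed in the x-api-key header.
# The body below is a stand-in; the actual schema lives in the swagger spec.
response = requests.post(
    ENDPOINT,
    json={"fields": ["age"], "rows": [["42"]]},
    headers={"x-api-key": API_KEY},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```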

osery (Contributor, Author) commented on Nov 16, 2018

Thanks a lot @chandanmaruthi. Great comments.

  • This lambda code actually uses the very same API (well, the scoring request at least); see https://github.com/h2oai/dai-deployment-templates/blob/osery-lambda-evaluation/aws-lambda-scorer/lambda-template/swagger.yaml.
  • In later PRs, the lambda is deployed using Terraform. We get the resulting endpoint URL as an output of terraform apply (details here: https://github.com/h2oai/dai-deployment-templates/blob/osery-lambda-readme/aws-lambda-scorer/README.md); see the sketch after this list for reading that output programmatically.
  • Yes, at this point there would be one lambda per deployment. I still plan to change this a bit to also add the model query endpoint, but that is yet to come. We could even implement the very same API with just one model, though the id in the path might be a bit cumbersome. In principle, we could easily serve multiple models from the same lambda, but that would IMHO open a whole Pandora's box of versioning the shared lambda. So I would prefer to keep them separate unless there is a good reason not to. WDYT?
  • Great catch. We already discussed this with @mmalohlava. We can start with this "naive" approach and add optional auth a bit later. I say optional because for demo purposes (which is probably the biggest motivation for this) it may be easier to serve without auth.
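
For illustration, a minimal sketch of reading the endpoint URL back out of the Terraform state after terraform apply. The output name here is an assumption; the real one is whatever the template's Terraform config actually declares:

```python
import json
import subprocess

# Hypothetical output name; substitute whatever the template's Terraform
# config declares as its output.
OUTPUT_NAME = "api_gateway_url"

# `terraform output -json` prints every output of the last apply as JSON,
# e.g. {"api_gateway_url": {"sensitive": false, "type": "string", "value": "..."}}.
raw = subprocess.check_output(["terraform", "output", "-json"])
outputs = json.loads(raw)

endpoint = outputs[OUTPUT_NAME]["value"]
print(f"Scoring endpoint: {endpoint}")
```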

osery (Contributor, Author) commented on Nov 16, 2018

I filed issues to track the additional work: #9, #10, #11.

@osery changed the base branch from osery-lambda-evaluation to master on November 20, 2018 08:26
@osery merged commit dc6dad6 into master on November 20, 2018
@osery deleted the osery-lambda-api-gateway branch on November 20, 2018 08:31