Error in Serverless deployment with AWS Lambda #273
Comments
Hi @ji-clara Thank you for reporting the issue and including detailed steps for reproduction! From looking at the issue, I am fairly certain that we are over-processing the serverless output, which caused this. However, I am unable to reproduce it from your steps. The error you see happens after we make the deployment request to Lambda and parse the returned response. If we parse the response incorrectly, it throws an error even when the deployment to AWS Lambda was successful. Can you log into your AWS console and navigate to the Lambda page to see whether the deployed function is up and running? That will help me figure out what went wrong on your system. Thanks!
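The symptom described above matches a fragile output-parsing step. As a hypothetical sketch (the actual BentoML parsing code is not shown in this thread), assume the CLI captures the text that `serverless deploy` prints and looks up a "Service Information" header with `list.index`; if the header is absent, Python raises exactly the error reported in this issue:

```python
# Hypothetical sketch of the failure mode -- the real BentoML parser is
# not shown in this thread. The reported error, "'Service Information'
# is not in list", is what list.index() raises when the expected header
# is missing from the captured `serverless deploy` output.

def parse_service_info(deploy_output: str) -> list:
    """Return the lines that follow the 'Service Information' header."""
    lines = [line.strip() for line in deploy_output.splitlines()]
    # Raises ValueError("'Service Information' is not in list") if the
    # serverless CLI printed its output in an unexpected format, even
    # when the deployment itself succeeded.
    start = lines.index("Service Information")
    return lines[start + 1:]

good = (
    "Serverless: Packaging service...\n"
    "Service Information\n"
    "service: my-service\n"
    "region: us-west-2"
)
print(parse_service_info(good))  # ['service: my-service', 'region: us-west-2']

try:
    parse_service_info("Serverless: output in a new, unexpected format")
except ValueError as err:
    print(err)  # 'Service Information' is not in list
```

A more tolerant parser would wrap this lookup in error handling instead of letting the `ValueError` surface as a deployment failure.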
Hi @yubozhao, Thanks for your help! It was not successfully deployed. The function was not shown on the AWS Lambda page. Thanks,
I am sorry the deployment wasn't successful. Since you attempted to deploy it, BentoML keeps a record of it and snapshots it in your local file directory. You can find it at Once you navigate there, you should see the generated files for AWS Lambda and a directory with the name of your service. Inside that directory, you could use Thank you so much for reporting and helping improve this project. Bo
I ran it as you said above. Please see the error message below and let me know if you need anything else. Thanks.

Error --------------------------------------------------
STDOUT:
STDERR: ERROR: Double requirement given: bentoml==v0.3.4 (from -r /var/task/requirements.txt (line 2)) (already in bentoml (from -r /var/task/requirements.txt (line 1)), name='bentoml')
@ji-clara I see where the problem is. Our example notebook was not updated as we improved this project. In the example notebook directory, if you remove bentoml from requirements.txt, everything should work. If you could run that again and see what you end up with, that would be great. I will update the example notebook to make sure this doesn't happen anymore.
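For illustration, the pip error above ("Double requirement given") appears when the same package is listed twice in a requirements file, e.g. an unpinned `bentoml` on one line and `bentoml==v0.3.4` on another. A small, hypothetical helper (not part of BentoML) that spots such duplicates:

```python
import re

def find_duplicate_requirements(requirements_text: str) -> list:
    """Return package names listed more than once in a requirements file."""
    seen, duplicates = set(), []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        # Bare package name: everything before a version specifier,
        # extras bracket, or environment marker.
        name = re.split(r"[=<>!~\[;]", line, maxsplit=1)[0].strip().lower()
        if name in seen and name not in duplicates:
            duplicates.append(name)
        seen.add(name)
    return duplicates

# The situation from the error message: bentoml appears on two lines.
print(find_duplicate_requirements("bentoml\nbentoml==v0.3.4\nnumpy"))  # ['bentoml']
```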
The same error occurred after I removed bentoml from requirements.txt in the notebook directory. But it worked after I removed bentoml from the requirements.txt in So updating the example notebook may not be sufficient. I look forward to a solution from your end. Thanks again for your help!
Good job @ji-clara! Glad you figured out a solution. This issue was caused by our example notebooks not keeping up with the project's improvements. We are going to be more vigilant about this and will go through the examples to bring them up to date. I am going to close this issue for now. Feel free to reopen it.
@yubozhao Although I deployed the model using
@ji-clara Sounds good, let's reopen it and continue the discussion here. Could you provide me with the logs of the API gateway error? I am struggling to reproduce the issue from your suggested method. I will work with your logs for now while trying to reproduce it on my system. Thank you for following up!
@yubozhao I'll reproduce it later. I think there might be a simple way to resolve this. The call: Please note, even if Just a thought. |
Add bentoml into the I am very curious about the API gateway error you mentioned. Previously, we needed to include an additional serverless plugin to allow images to come through in base64 format for AWS Lambda, since the serverless support wasn't there. That might be one possible cause of the API gateway error. I think after serverless version 1.45.0, or a version close to it, there is official support for the gateway and its media types. I want to come back and look into this part and improve the experience.
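As a sketch of the "official support" mentioned above (a hypothetical serverless.yml fragment, not taken from BentoML's generated files): recent versions of the Serverless Framework let you declare binary media types for API Gateway directly, where older setups relied on a separate plugin such as serverless-apigw-binary:

```yaml
# Hypothetical serverless.yml fragment -- service name, runtime, and
# media types are placeholders, not BentoML's actual generated config.
service: my-bento-service

provider:
  name: aws
  runtime: python3.7
  region: us-west-2
  apiGateway:
    # Content types API Gateway should pass through to Lambda as
    # base64-encoded binary payloads (e.g. uploaded images).
    binaryMediaTypes:
      - 'image/png'
      - 'multipart/form-data'
```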
@yubozhao
Hi @yubozhao, Thanks for your recent updates. I ended up running Do you have any idea how to proceed? Thanks!
@ji-clara Thank you for continuing to work with me on this! Let's clean up your current Yatai server (the BentoML deployment server) first.
Now you can use For
For now, I think the best course forward for you is:
Now, you can run
@yubozhao Sure. After running
Oh man, my bad. I totally forgot the PR hasn't been merged yet. If you only have one deployment record, you can go into
Everything finally works now. It may help others if the instructions you gave me were added to the repo. Feel free to close this. Thank you for helping all the way through!
@ji-clara Awesome! I am glad it worked for you. Yes, you are right that we need to document all of these instructions. That's a top priority for us after the current release of the deployment server.
Describe the bug
While running:
!bentoml deploy ./model --platform aws-lambda --region us-west-2
this error appears:
[2019-08-27 17:25:40,854] INFO - Using user AWS region: us-west-2
[2019-08-27 17:25:40,855] INFO - Using AWS stage: dev
Encounter error when deploying to aws-lambda
Error: 'Service Information' is not in list
To Reproduce
sudo npm install serverless@1.49.0 --global (also tried serverless@1.50.0; it didn't work for the example below either.)
Expected behavior
Successful deployment
Environment: