
Can't get this working with "sam local" #38

Closed
radix opened this issue Dec 9, 2018 · 9 comments
Labels: bug (Something isn't working)

Comments

radix commented Dec 9, 2018

When trying to run my lambda function with sam local start-api, I get the following error about _LAMBDA_SERVER_PORT not being defined:

(sam) PS C:\Users\radix\Projects\pandt> sam local start-api --template .\SAM.yaml
2018-12-09 15:48:26 Mounting pandt at http://127.0.0.1:3000/ [GET]
2018-12-09 15:48:26 You can now browse to the above endpoints to invoke your functions. You do not need to restart/reload SAM CLI while working on your functions changes will be reflected instantly/automatically. You only need to restart SAM CLI if you update your AWS SAM template
2018-12-09 15:48:27  * Running on http://127.0.0.1:3000/ (Press CTRL+C to quit)
2018-12-09 15:48:35 Invoking lollerskates (provided)

Fetching lambci/lambda:provided Docker container image......
2018-12-09 15:48:36 Mounting C:\Users\radix\Projects\pandt\artifacts as /var/task:ro inside runtime container
thread 'main' panicked at 'failed to start runtime: environment error: the _LAMBDA_SERVER_PORT variable must specify a valid port to listen on', /home/rust/.cargo/git/checkouts/rust-aws-lambda-2a7b20b15eabd9cc/1435a4d/aws_lambda/src/lib.rs:56:31
stack backtrace:
   0: std::sys::unix::backtrace::tracing::imp::unwind_backtrace
             at libstd/sys/unix/backtrace/tracing/gcc_s.rs:49
   1: std::sys_common::backtrace::print
             at libstd/sys_common/backtrace.rs:71
             at libstd/sys_common/backtrace.rs:59
   2: std::panicking::default_hook::{{closure}}
             at libstd/panicking.rs:211
   3: std::panicking::default_hook
             at libstd/panicking.rs:227
   4: std::panicking::rust_panic_with_hook
             at libstd/panicking.rs:476
   5: std::panicking::continue_panic_fmt
             at libstd/panicking.rs:390
   6: std::panicking::begin_panic_fmt
             at libstd/panicking.rs:345
   7: aws_lambda::start::{{closure}}
   8: aws_lambda::start
   9: pandt_lambda::main
  10: std::rt::lang_start::{{closure}}
  11: std::panicking::try::do_call
             at libstd/rt.rs:59
             at libstd/panicking.rs:310
  12: __rust_maybe_catch_panic
             at libpanic_unwind/lib.rs:102
  13: std::rt::lang_start_internal
             at libstd/panicking.rs:289
             at libstd/panic.rs:392
             at libstd/rt.rs:58
  14: main
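
For context, the panic corresponds to a startup check along these lines; this is a minimal sketch implied by the panic message, not the aws_lambda crate's actual code:

use std::env;

// Minimal sketch of the startup check implied by the panic message above,
// not the aws_lambda crate's actual code: read _LAMBDA_SERVER_PORT and
// fail if it is unset or does not parse as a port number.
fn server_port() -> Result<u16, String> {
    env::var("_LAMBDA_SERVER_PORT")
        .map_err(|_| "_LAMBDA_SERVER_PORT is not set".to_string())?
        .parse()
        .map_err(|_| "_LAMBDA_SERVER_PORT is not a valid port".to_string())
}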

If I set _LAMBDA_SERVER_PORT to some arbitrary port value like 5000, it doesn't immediately crash, but I then get a timeout:

2018-12-09 15:51:06 Mounting C:\Users\radix\Projects\pandt\artifacts as /var/task:ro inside runtime container
2018-12-09 15:51:17 Function 'pandt' timed out after 10 seconds
2018-12-09 15:51:17 Function returned an invalid response (must include one of: body, headers or statusCode in the response object). Response received:
2018-12-09 15:51:17 127.0.0.1 - - [09/Dec/2018 15:51:17] "GET / HTTP/1.1" 502 -

My function isn't doing anything special:

use lambda_runtime::{self, error::HandlerError, start};
use serde::Serialize;
use serde_json::Value;

#[derive(Serialize, Debug)]
struct Response {
  secret: String,
  input_event: Value,
}

#[allow(clippy::needless_pass_by_value)]
fn main() -> Result<(), failure::Error> {
  let handler = move |event: Value, _ctx: lambda_runtime::Context| -> Result<Response, HandlerError> {
    Ok(Response {
      secret: "foo".to_string(), 
      input_event: event,
    })
  };

  start(handler, None);
  Ok(())
}
radix commented Dec 9, 2018

FWIW, I just tried this with sam local start-lambda and sam local invoke and get the same timeout error.

davidbarsky commented

@radix Ugh, sorry about that. I'm not entirely sure what's causing that. Can you share a minimal example (presumably, what you have) demonstrating the issue and I'll try to see what's up?

radix commented Dec 10, 2018

Hey @davidbarsky, I've uploaded my code to https://github.com/radix/pandt/tree/simplified-main-lambda-for-repro

There is a SAM.yaml in the root directory. It expects the executable to be at artifacts/bootstrap, and I have a Dockerfile at docker/lambda-builder.dockerfile which builds the executable with MUSL. (I have a script at ./docker/build-lambda.ps1 that runs Docker and puts the executable into place, but I won't assume you run Windows. :)

The thing I'm getting super confused about now is that I get this _LAMBDA_SERVER_PORT error even when I deploy the executable to AWS Lambda itself! I don't see any code in this repository that mentions that environment variable, so I'm worried I'm screwing something up. It may be worth noting that I recently switched to this git revision to get support for closures: a8bdf7b

radix changed the title from Can't get this working with "sam local start-api" to Can't get this working with "sam local" on Dec 10, 2018
softprops commented

There may be more than one issue going on here, but here's the one that jumps out at me.

I'm not super SAM-savvy, but based on the log examples above, I think the part of the answer you're looking for is in how you're applying SAM to your use case.

sam local, if I recall, is intended to mount a local HTTP server that dispatches to AWS Lambda functions, which are expected to take API Gateway events and yield API Gateway responses.

From your logs above:

2018-12-09 15:51:17 Function returned an invalid response (must include one of: body, headers or statusCode in the response object). Response received:

API Gateway (the proxy integration, at least) expects you to yield a response in a certain format.

From your example code above, it looks like you aren't meeting those expectations:

#[derive(Serialize, Debug)]
struct Response {
  secret: String,
  input_event: Value,
}

If you're sticking to a published version of this crate, I'd recommend a structure with body, headers, or statusCode as mentioned in the error message. If you don't mind depending on an unpublished release, there's a new lambda-http module available in master that targets API Gateway integrations specifically. It uses the http crate as its core interface, but also supports closures, as the more loosely typed Rust Lambda runtime interfaces do.
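
For illustration, a response shape along those lines might look like the following. This is a hedged sketch: ApiGatewayResponse is a made-up name rather than a type from this crate, and the exact set of fields API Gateway accepts is broader than what's shown here.

use std::collections::HashMap;

use serde::Serialize;

// Illustrative sketch only: a response carrying the statusCode/headers/body
// fields named in the SAM error message above. `ApiGatewayResponse` is a
// hypothetical name, not a type exported by this crate.
#[derive(Serialize, Debug)]
struct ApiGatewayResponse {
    #[serde(rename = "statusCode")]
    status_code: u16,
    headers: HashMap<String, String>,
    body: String,
}

fn ok_response(body: String) -> ApiGatewayResponse {
    ApiGatewayResponse {
        status_code: 200,
        headers: HashMap::new(),
        body,
    }
}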

radix commented Dec 11, 2018

@softprops The same thing is happening with sam local invoke and sam local start-lambda, which have no restrictions on the output format of the lambda, so I don't think that's related.

softprops commented

Heads up: I'm working on bringing first-class Rust support to SAM in aws/aws-lambda-builders#174

jkshtj added the bug (Something isn't working) label Jan 21, 2021
kenshih added a commit to kenshih/aws-lambda-rust-runtime that referenced this issue Jun 19, 2021
* Added unit tests for TryFrom<HeaderMap> for Context
kenshih commented Jun 19, 2021

👋 Hello! Just noting a proposed fix: #332
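
The commit referenced above names a TryFrom<HeaderMap> for Context conversion, i.e. building the invocation Context from runtime API response headers. A hypothetical sketch of that shape (the crate's real Context carries more fields, and the error type here is simplified):

use std::convert::TryFrom;

use http::HeaderMap;

// Hypothetical stand-in for the crate's Context type, shown only to
// illustrate the TryFrom<HeaderMap> conversion named in the commit above.
#[derive(Debug)]
struct Context {
    request_id: String,
    deadline_ms: u64,
}

impl TryFrom<HeaderMap> for Context {
    type Error = String;

    fn try_from(headers: HeaderMap) -> Result<Self, Self::Error> {
        // The Lambda runtime API reports invocation metadata in response
        // headers such as these; HeaderMap lookups are case-insensitive.
        let request_id = headers
            .get("lambda-runtime-aws-request-id")
            .and_then(|v| v.to_str().ok())
            .ok_or_else(|| "missing lambda-runtime-aws-request-id".to_string())?
            .to_owned();
        let deadline_ms = headers
            .get("lambda-runtime-deadline-ms")
            .and_then(|v| v.to_str().ok())
            .and_then(|v| v.parse().ok())
            .ok_or_else(|| "missing or invalid lambda-runtime-deadline-ms".to_string())?;
        Ok(Context { request_id, deadline_ms })
    }
}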

coltonweaver pushed a commit that referenced this issue Jul 27, 2021
* Added unit tests for TryFrom<HeaderMap> for Context
brainstorm commented

#332 got merged and it works now, yay! This issue can be closed, IMHO.

coltonweaver commented

Agreed, closing this as the issue is resolved. Thanks for the contribution, @kenshih!
