
Not clear on how to use sharp on sam-cli locally (cross-platform) #3473

Closed
jrhager84 opened this issue Nov 30, 2022 · 12 comments

@jrhager84

jrhager84 commented Nov 30, 2022

Question about an existing feature

What are you trying to achieve?

I can't figure out how to specify the correct architecture based on whether I'm running locally (M1 Mac, arm64) or on AWS/SAM (Linux).

I can see where to specify the architecture in the YAML file, but I'm not sure how to specify which to use based on the environment it's being invoked from, or how to figure that out.
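What I'm imagining is something like a per-platform install step. A hedged sketch (the `build:lambda` script name is hypothetical; the flags are the ones sharp's own install error suggests): reinstall sharp for the Lambda runtime before building:

```json
{
  "scripts": {
    "build:lambda": "npm install --platform=linux --arch=x64 sharp && sam build"
  }
}
```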

When you searched for similar issues, what did you find that might be related?

There is a boatload of differing opinions: globally installing sharp, claims that 0.29+ makes this a non-issue, using brew, using taps, etc. There's no authoritative source for what the most recent and successful methods are.

The issues here in the repo

Please provide a minimal, standalone code sample, without other dependencies, that demonstrates this question

import sharp from 'sharp'

// Buffer is in image.data
const compressed_image = await sharp(image.data).png({quality: 60}).toBuffer().catch((e) => console.log('Error with sharp: ', e))

Here is the error:

{"errorType":"Error","errorMessage":"\nSomething went wrong installing the \"sharp\" module\n\nCannot find module '../build/Release/sharp-linux-x64.node'\nRequire stack:\n- /var/task/node_modules/sharp/lib/sharp.js\n- /var/task/node_modules/sharp/lib/constructor.js\n- /var/task/node_modules/sharp/lib/index.js\n\nPossible solutions:\n- Install with verbose logging and look for errors: \"npm install --ignore-scripts=false --foreground-scripts --verbose sharp\"\n- Install for the current linux-x64 runtime: \"npm install --platform=linux --arch=x64 sharp\"\n- Consult the installation documentation: https://sharp.pixelplumbing.com/install","trace":["Error: ","Something went wrong installing the \"sharp\" module","","Cannot find module '../build/Release/sharp-linux-x64.node'","Require stack:","- /var/task/node_modules/sharp/lib/sharp.js","- /var/task/node_modules/sharp/lib/constructor.js","- /var/task/node_modules/sharp/lib/index.js","","Possible solutions:","- Install with verbose logging and look for errors: \"npm install --ignore-scripts=false --foreground-scripts --verbose sharp\"","- Install for the current linux-x64 runtime: \"npm install --platform=linux --arch=x64 sharp\"","- Consult the installation documentation: https://sharp.pixelplumbing.com/install","    at Object.<anonymous> (/var/task/node_modules/sharp/lib/sharp.js:34:9)","    at Module._compile (node:internal/modules/cjs/loader:1105:14)","    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1159:10)","    at Module.load (node:internal/modules/cjs/loader:981:32)","    at Function.Module._load (node:internal/modules/cjs/loader:822:12)","    at Module.require (node:internal/modules/cjs/loader:1005:19)","    at require (node:internal/modules/cjs/helpers:102:18)","    at Object.<anonymous> (/var/task/node_modules/sharp/lib/constructor.js:8:1)","    at Module._compile (node:internal/modules/cjs/loader:1105:14)","    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1159:10)"]}

It's arch-dependent, so a repo won't help. A clean install of any Node project in SAM locally with sharp will produce this result (at least on my end).
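(Side note on the sample above: a `.catch()` whose callback only logs makes the promise resolve to `undefined`, so the handler continues with no image and no thrown error, which can look like a silent failure. A minimal sketch, using a plain rejected promise in place of sharp:)

```javascript
// A .catch() that only logs converts a rejection into a resolved value of undefined.
async function demo() {
  const result = await Promise.reject(new Error('boom'))
    .catch((e) => console.log('Error with sharp: ', e.message));
  return result; // undefined, not an Error -- execution continues silently
}

demo().then((r) => console.log('result is', r)); // prints "result is undefined"
```

Rethrowing inside the `.catch()`, or using `try`/`catch` and rethrowing, keeps the failure visible to the Lambda runtime.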

Please provide sample image(s) that help explain this question

@lovell
Owner

lovell commented Dec 1, 2022

Did you see #3207 (comment) ?

@jrhager84
Author

I did. I tried it, and even though the error doesn't appear, it doesn't 'work' either. It shows no console logs, nor does it give me any success or failure. It just kind of silently fails.

Am I doing something wrong? Surely there are other people using an M1 Mac to develop Lambda functions locally with SAM.

Something just isn't clicking with me on this. Thank you for your response 😊

@lovell
Owner

lovell commented Dec 1, 2022

It shows no console logs, nor does it give me any success or failure. It just kind of silently fails.

aws/aws-sam-cli#4329

@jrhager84
Author

I should clarify (apologies):

The function silently fails. It builds correctly and runs the function (or appears to) but there are no console logs or any return logs. It even shows the correct time that the function should be running (it's a pretty long OpenAI generation)

Here is what my terminal shows:

joelhager@Joels-MacBook-Pro openai-sam % npm run local-function

> openai-sam@0.0.1 local-function
> sam build --use-container && sam local invoke openaiCompletionRequest -e event.json 2>&1 | tr '\n' ' '

Starting Build inside a container
Building codeuri: /Users/joelhager/Development/adprompt/openai-sam runtime: nodejs16.x metadata: {} architecture: x86_64 functions: openaiCompletionRequest

Fetching public.ecr.aws/sam/build-nodejs16.x:latest-x86_64 Docker container image......
Mounting /Users/joelhager/Development/adprompt/openai-sam as /tmp/samcli/source:ro,delegated inside runtime container

Build Succeeded

Built Artifacts  : .aws-sam/build
Built Template   : .aws-sam/build/template.yaml

Commands you can use next
=========================
[*] Validate SAM template: sam validate
[*] Invoke Function: sam local invoke
[*] Test Function in the Cloud: sam sync --stack-name {stack-name} --watch
[*] Deploy: sam deploy --guided
        
Running NodejsNpmBuilder:NpmPack
Running NodejsNpmBuilder:CopyNpmrcAndLockfile
Running NodejsNpmBuilder:CopySource
Running NodejsNpmBuilder:NpmInstall
Running NodejsNpmBuilder:CleanUpNpmrc
Running NodejsNpmBuilder:LockfileCleanUp
Invoking src/handlers/openai-sam.openaiCompletionRequest (nodejs16.x)
Skip pulling image and use local one: public.ecr.aws/sam/emulation-nodejs16.x:rapid-1.56.1-x86_64.

Mounting /Users/joelhager/Development/adprompt/openai-sam/.aws-sam/build/openaiCompletionRequest as /var/task:ro,delegated inside runtime container
START RequestId: e88db3fb-8ede-40d1-9067-2cd071b196a6 Version: $LATEST
END RequestId: e88db3fb-8ede-40d1-9067-2cd071b196a6
REPORT RequestId: e88db3fb-8ede-40d1-9067-2cd071b196a6  Init Duration: 5.25 ms  Duration: 32222.59 ms   Billed Duration: 32223 ms      Memory Size: 128 MB     Max Memory Used: 128 MB

@jrhager84
Author

I can also confirm: I have upgraded Docker (just in case, even though I'm not getting that issue) and it works 100% if I comment out any sharp import/usage. Otherwise, it just runs through with no logs or anything.

@lovell
Owner

lovell commented Dec 2, 2022

REPORT RequestId: e88db3fb-8ede-40d1-9067-2cd071b196a6 Init Duration: 5.25 ms Duration: 32222.59 ms Billed Duration: 32223 ms Memory Size: 128 MB Max Memory Used: 128 MB

Are you sure 128 MB is enough memory for the task? This log entry suggests the request maxed out the available memory.

The use of .png({quality: 60}) means you'll be getting lossy PNG output, which involves a quantisation step and therefore the output image must be held uncompressed in memory.
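To put rough numbers on that (the 4000×3000 dimensions below are assumed for illustration, not taken from this issue), uncompressed RGBA pixels cost about width × height × 4 bytes:

```javascript
// Rough uncompressed-pixel memory estimate for a hypothetical 4000x3000 RGBA image.
const width = 4000;
const height = 3000;
const bytesPerPixel = 4; // R, G, B, A
const uncompressedMiB = (width * height * bytesPerPixel) / (1024 * 1024);
console.log(`${uncompressedMiB.toFixed(1)} MiB`); // ~45.8 MiB, before sharp's own working buffers
```

A "3 MB" PNG on disk can therefore easily occupy a third or more of a 128 MB function once decoded.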

@jrhager84
Author

The raw image is only 3 MB and it's only processing one. I just figured 128 MB would be enough. It only holds a couple of text fields and an image.

@lovell
Owner

lovell commented Dec 2, 2022

When running CPU-bound tasks on Lambda, artificially limiting RAM is usually a false economy.

https://sharp.pixelplumbing.com/install#aws-lambda

To get the best performance select the largest memory available. A 1536 MB function provides ~12x more CPU time than a 128 MB function.
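As a sketch, in SAM that is a one-line change on the function resource (the logical ID here is assumed from the build log; other required properties are omitted):

```yaml
Resources:
  openaiCompletionRequest:
    Type: AWS::Serverless::Function
    Properties:
      MemorySize: 1536   # on Lambda, more RAM also buys proportionally more CPU
```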

@jrhager84
Author

jrhager84 commented Dec 2, 2022 via email

@lovell
Owner

lovell commented Dec 2, 2022

Thanks, this is useful extra information. It definitely sounds like you should be maintaining two functions and therefore invocations, ensuring you separate I/O bound tasks from CPU bound tasks.
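A template fragment sketching that split (both function names are hypothetical; required properties like Handler and CodeUri are omitted):

```yaml
Resources:
  openaiRequestFunction:        # I/O bound: mostly waits on the OpenAI API
    Type: AWS::Serverless::Function
    Properties:
      MemorySize: 256
  imageCompressionFunction:     # CPU bound: runs sharp, benefits from more memory/CPU
    Type: AWS::Serverless::Function
    Properties:
      MemorySize: 1536
```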

@jrhager84
Author

For sure! I guess I didn't think a single 3 MB photo would cripple a 128 MB instance, and since I already had the image in a buffer it seemed acceptable to just compress it before uploading to my bucket.

I'll separate it out. Thanks!

@lovell
Owner

lovell commented Dec 21, 2022

I hope this information helped. Please feel free to re-open with more details if further assistance is required.

@lovell lovell closed this as completed Dec 21, 2022