feat: Add local file provider and wire everything up in the file module #7134
Conversation
// TODO: For now we keep the files in memory, as that's how they get passed to the workflows
// This will need revisiting once we are closer to prod-ready v2, since with workflows and potentially
// services on other machines using streams is not as simple as it used to be.
const upload = multer({ storage: multer.memoryStorage() })
question/thought: I guess we will need to somehow integrate this part of the flow with the file provider so that we can store the uploads on a remote server e.g. S3 if you are using the S3 plugin. Do you already have thoughts on this?
@olivermrbl So what currently happens is: the entire file the user sent is kept in memory, passed through the workflow to the file provider, and then converted from in-memory binary content to a buffer and stored as a file.
If we decide that, for uploads, we want to avoid using workflows and call the module directly, then we can probably support streaming more easily, since you wouldn't need any serialization. That might be the way to go, but honestly for now it doesn't matter as much, as supporting it later won't be a major change.
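To make the trade-off concrete, here is a minimal sketch of the serialization step the comment above alludes to: workflow inputs must be plain serializable data, so a raw `Buffer` (let alone a stream) can't cross a process or machine boundary as-is, but a base64-encoded payload can. The helper names `serializeUpload`/`deserializeUpload` are purely illustrative, not Medusa APIs.

```javascript
// Hypothetical sketch, assuming a multer-style file object
// ({ originalname, mimetype, buffer }) as produced by memoryStorage().

// Encode the in-memory file into a JSON-safe payload for the workflow.
function serializeUpload(file) {
  return {
    filename: file.originalname,
    mimeType: file.mimetype,
    content: file.buffer.toString("base64"),
  };
}

// Decode the payload back into a buffer on the provider side.
function deserializeUpload(payload) {
  return {
    originalname: payload.filename,
    mimetype: payload.mimeType,
    buffer: Buffer.from(payload.content, "base64"),
  };
}

// Round-trip: what the API route would send vs what the provider receives.
const incoming = {
  originalname: "logo.png",
  mimetype: "image/png",
  buffer: Buffer.from([0x89, 0x50, 0x4e, 0x47]),
};
const restored = deserializeUpload(serializeUpload(incoming));
console.log(restored.buffer.equals(incoming.buffer)); // true
```

Calling the file module directly instead would let the route hand `file.buffer` (or a stream) straight to the provider with no encoding step, which is what makes streaming "easier" in that design.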
Overall LGTM, should be mergeable when tests are passing :)
No description provided.