
Ability to serve and upload dynamic binary assets (images, pdfs) #494

Open
faassen opened this issue Mar 7, 2022 · 4 comments

Comments

@faassen
Contributor

faassen commented Mar 7, 2022

Wasp currently has no way to let users download or upload dynamic binary assets. This is a fairly common use case, so it should be supported. Where these assets are stored (database, S3, etc.) is a related topic.

faassen changed the title from "Ability to serve dynamic binary assets (images, pdfs)" to "Ability to serve and upload dynamic binary assets (images, pdfs)" on Mar 7, 2022
@Martinsos
Member

Some of the additional conversation we had on this:

I had implemented file uploading in the past, in JS / Express / React apps, and you don't really want uploads to go through your server -> that is a lot of traffic you'd rather avoid, as it ties up your server and reduces its capacity to do other, more important work, like answering user HTTP requests.
Instead, what you ideally want is to upload the file directly from the client! So instead of sending it from the client to the server and then from the server to file storage, you send it directly to file storage -> skip the server part.
The server still plays its role, but that role becomes doing authentication / giving the client permission to upload to file storage, instead of doing the upload itself.

I guess the first question is: where do you want to store your files? Ideally you would use dedicated file storage, like S3, or a similar offering from another hosting provider.
Then, you ideally want to find a nice React library that does most of this work for you, and also explains all the details around it. Such a library will probably also advise you on how to set up your S3 bucket and what you need to do on the server. On the server, you will most likely need to implement a route or two that do some authentication with S3 and then provide those tokens to the client (React) -> see the sketch after the list of libraries below.
Related issue: Allow defining custom API routes (http) · Issue #268 · wasp-lang/wasp (https://github.com/wasp-lang/wasp/issues/268)

As for file uploading libraries:

  1. I found this one with a quick search; it looks pretty good at first glance, but it doesn't have many stars and hasn't seen much work recently: https://github.com/apideck-samples/file-picker
  2. One I used some time ago and that worked well was Fine Uploader: https://github.com/FineUploader/react-fine-uploader -> however, I see it is archived now! That sucks.
  3. There is this one: https://github.com/odysseyscience/react-s3-uploader -> it has an OK number of stars, but no work done in the last 2 years. But maybe it is stable?
  4. There is also this article, pretty fresh (2022), that covers the process in detail, so it might be quite interesting: https://blog.devgenius.io/upload-files-to-amazon-s3-from-a-react-frontend-fbd8f0b26f5 . It might be enough to just follow this.
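
To make the server part concrete, here is a minimal sketch of such a route, using AWS SDK v3 (@aws-sdk/client-s3 plus @aws-sdk/s3-request-presigner) and Express. The route path, key scheme, and env var names are my assumptions, not something Wasp provides today:

import express from "express";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const app = express();

// Credentials are picked up from the standard AWS env vars / config.
const s3 = new S3Client({ region: process.env.AWS_S3_REGION });

// The client asks the server for permission; the server answers with a
// short-lived presigned URL that lets the client PUT the bytes straight to S3.
app.get("/upload-url", async (req, res) => {
  const key = `uploads/${Date.now()}-${req.query.filename}`; // hypothetical key scheme
  const uploadUrl = await getSignedUrl(
    s3,
    new PutObjectCommand({
      Bucket: process.env.AWS_S3_FILES_BUCKET, // assumed env var name
      Key: key,
    }),
    { expiresIn: 300 } // URL valid for 5 minutes
  );
  res.json({ uploadUrl, key });
});

The client then uploads with a plain PUT to uploadUrl (e.g. axios.put(uploadUrl, file)), so the file bytes never pass through the server.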

@infomiho
Contributor

infomiho commented Mar 21, 2023

Going directly from the client to the storage provider can work for most cases, but not all. What if you are generating files on the server (reports, images etc.) and need to store them?

What I did to get it working for my use-case:

  1. Installed the @aws-sdk/client-s3 library
  2. Set up S3 or an S3-compatible storage (I went with Cloudflare's R2)
  3. Used it like this:
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// R2 is S3-compatible, so the regular S3 client works; it just needs
// the R2 endpoint and credentials.
const S3 = new S3Client({
  region: "auto",
  endpoint: process.env.R2_BUCKET_URL,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_TOKEN!,
    secretAccessKey: process.env.R2_SECRET_TOKEN!,
  },
});

// `key` is the object's identifier in the bucket; `response.data` is the
// file content generated or fetched on the server (e.g. a Buffer or stream).
const upload = await S3.send(
  new PutObjectCommand({
    Bucket: "<bucket_name>",
    Key: key,
    Body: response.data,
    // If uploading images, for example
    ContentType: "image/png",
  })
);
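
And for the "serve" half: once the file is in the bucket, the server can fetch it back and stream it to the client. A minimal sketch, assuming an Express-style res object and the S3 client from the snippet above:

import { GetObjectCommand } from "@aws-sdk/client-s3";

// Streams a stored object into an HTTP response without buffering it in memory.
// `S3` is the client configured above; `key` identifies the object in the bucket.
async function serveFile(key, res) {
  const object = await S3.send(
    new GetObjectCommand({ Bucket: "<bucket_name>", Key: key })
  );
  res.setHeader("Content-Type", object.ContentType ?? "application/octet-stream");
  // In Node.js, object.Body is a readable stream.
  object.Body.pipe(res);
}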

@Martinsos
Member

Martinsos commented Feb 6, 2024

entjustdoit shared some great info on how he did it on our Discord: https://discord.com/channels/686873244791210014/1062935979951923280/1088518391494619226.

Here it goes:

Sure! First, go to S3 and set up a bucket and an IAM user, then add these entries to your .env.server file.

AWS_S3_IAM_SECRET_KEY=
AWS_S3_FILES_BUCKET=
AWS_S3_REGION=

Then add the aws-sdk package to your list of dependencies:

dependencies: [
...
("aws-sdk", "^2.1294.0"),
...
]

In your frontend, set up the functions for downloading and uploading files.

const handleUploadFile = async () => {
  ...
  let data = await getUploadFileSignedURL(...);
  // key is the identifier of the file in S3
  const { uploadUrl, key } = data;
  // upload the actual file using the signed URL, newFile here is the file selected in the form
  await axios.put(uploadUrl, newFile);

  // store the key as a field of a file entity for later retrieval
  ...
}
const handleDownloadFile = async (file) => {
  ...
  let downloadUrl = await getDownloadFileSignedURL(...);
  // ignore my ugly code below, it's a workaround I had to do due to how I had set up my UI
  const link = document.createElement("a");
  link.download = file.filename;
  link.href = downloadUrl;
  document.body.appendChild(link);
  link.click();
  document.body.removeChild(link);
  ...
}
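
The server-side halves are elided above ("..."). As a rough sketch, here is what getUploadFileSignedURL and getDownloadFileSignedURL could look like as Wasp operations using aws-sdk v2 presigned URLs; the AWS_S3_IAM_ACCESS_KEY env var, the key scheme, and the argument shapes are my assumptions, not part of the original snippet:

import AWS from "aws-sdk";

const s3 = new AWS.S3({
  region: process.env.AWS_S3_REGION,
  credentials: {
    // assumed env var, alongside the ones listed above
    accessKeyId: process.env.AWS_S3_IAM_ACCESS_KEY,
    secretAccessKey: process.env.AWS_S3_IAM_SECRET_KEY,
  },
});

// Returns a short-lived URL the client can PUT the file bytes to, plus the S3 key.
export const getUploadFileSignedURL = async ({ filename }, context) => {
  const key = `${context.user.id}/${filename}`; // hypothetical key scheme
  const uploadUrl = await s3.getSignedUrlPromise("putObject", {
    Bucket: process.env.AWS_S3_FILES_BUCKET,
    Key: key,
    Expires: 300, // seconds
  });
  return { uploadUrl, key };
};

// Returns a short-lived URL the client can GET the file from.
export const getDownloadFileSignedURL = async ({ key }, context) => {
  return s3.getSignedUrlPromise("getObject", {
    Bucket: process.env.AWS_S3_FILES_BUCKET,
    Key: key,
    Expires: 300,
  });
};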

@Martinsos
Member

I think this is a great candidate for potential Full Stack Modules.
