Feature request: support for file upload #664
Comments
Hi @skitsanos ! Do you want to upload and get files like this?

```js
app.post('/form-upload', async (c) => {
  const formData = await c.req.formData()
  const file = formData.get('file')
  const arr = await file.arrayBuffer()
  fs.writeFile('foo.png', Buffer.from(arr), (err) => {
    if (err) throw err
  })
  return c.text('uploaded!')
})
```

This is working well on Node.js with node-server. I'll try to figure out how to implement this without `FormData`. Regardless, I think the best way is for Bun to support `FormData`. |
Yes, Yusuke, I wanted it to run via |
I think we have to use the web-standard API as much as possible. In the case of Bun, it seems to be difficult to import
Hmm... :( |
What does parsing the request body have to do with that? :) I just explained the flow of how to make it happen :) |
Ah, yes. I know it:) |
I checked more - one of the 'recipes' would be to stream the multipart payload to a temp file and then parse it in chunks... As I can see, there is a crate for this:

```rust
use multipart::server::Multipart;
use std::fs;
use std::io::{BufRead, Cursor};

fn main() {
    let body = "----------------------------605243336009903535936235\r
Content-Disposition: form-data; name=\"id\"\r
\r
123\r
----------------------------605243336009903535936235\r
Content-Disposition: form-data; name=\"name\"\r
\r
TestName\r
----------------------------605243336009903535936235\r
Content-Disposition: form-data; name=\"file\"; filename=\"demo.txt\"\r
\r
demo/demo/demo\r
----------------------------605243336009903535936235\r
Content-Disposition: form-data; name=\"body\"\r
\r
TestBody\r
----------------------------605243336009903535936235--\r";

    // std::fs has no read_binary; fs::read returns the file as Vec<u8>.
    fs::write("data/demo.txt", body.as_bytes()).unwrap();
    let body_from_file = fs::read("data/demo.txt").unwrap();
    let file = Cursor::new(body_from_file);

    // The boundary given to Multipart is the body delimiter minus the
    // leading "--".
    let mut mp = Multipart::with_body(
        file,
        "--------------------------605243336009903535936235",
    );

    // https://stackoverflow.com/questions/73235131/how-to-extract-data-from-form-data-in-rust
    while let Some(mut field) = mp.read_entry().unwrap() {
        let data = field.data.fill_buf().unwrap();
        let s = String::from_utf8_lossy(data);
        println!("headers: {:?}, data: {}", field.headers, s);
    }
}
```

|
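The Rust snippet above illustrates the raw multipart wire format. The same splitting can be sketched in a few lines of plain JavaScript (a naive, text-only parser for illustration; not a production multipart implementation, and the function name is made up here):

```javascript
// Naive multipart/form-data splitter, for illustration only: it assumes
// a text body and CRLF line endings, and does no streaming or decoding.
function parseMultipart(body, boundary) {
  const delimiter = '--' + boundary
  return body
    .split(delimiter)
    .slice(1) // drop anything before the first boundary
    .filter((part) => !part.startsWith('--')) // drop the closing "--" marker
    .map((part) => {
      const trimmed = part.replace(/^\r\n/, '')
      const [rawHeaders, ...rest] = trimmed.split('\r\n\r\n')
      return {
        headers: rawHeaders.split('\r\n'),
        data: rest.join('\r\n\r\n').replace(/\r\n$/, ''),
      }
    })
}
```

This is exactly why the comments above lean toward a real parser (a crate, busboy, or the runtime's `formData()`): handling binary data, nested boundaries, and streaming correctly is much harder than this sketch suggests.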
How about doing a "polyfill"? I have not used it, but there is a library like this: https://www.npmjs.com/package/formdata-polyfill
😱 |
You don't need to polyfill `FormData`; as I told you before, the problem is not with creating a form but with parsing incoming data. It comes in the format that I showed you in the Rust example above. You can get the whole array of incoming data:

```js
const buffer = []
for await (const data of req.body) {
  buffer.push(data)
}
console.log(Buffer.concat(buffer).toString())
```

But it will 'die' if you upload some big file, because you will run out of RAM, so you need to stream your request body to the disk first and then parse the whole thing... |
Hey @yusukebe! 👋 Any update on this? Bun now supports both FormData and Blob, wondering if there's still a blocker or if it's possible to get this added? 🙂 |
Hi @isaac-mcfadyen ! Yes, Bun now supports `FormData`. |
Yeah, I'd love for it to be supported. 😄 |
Hey, the same applies for @honojs/node. I tried using both |
Hi @isaac-mcfadyen ! I have thought about the "file upload" feature, but I think a better way would be to use the `parseBody()` method:

```js
app.post('/upload', async (c) => {
  const { image } = await c.req.parseBody()
  if (image instanceof File) {
    console.log(image.name) // show the file name
    const buffer = await image.arrayBuffer()
    // do something with the buffer
  }
  return c.text('Uploaded!')
})
```

What do you think? By the way, it is not well documented and needs to be. |
You need to identify first whether you have a single file, an array of files, or file/files mixed with other form data - see my notes above in #664 (comment). I think it is faster to 'inject' a formdata parser written in Rust and compiled into WASM, to have a universal solution :) |
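The single-file vs array-of-files vs mixed-fields distinction mentioned above can be normalized with a couple of plain helpers. A minimal sketch (the helper names are illustrative, not part of hono's API; it assumes a parsed body object where each field is either a single value or an array):

```javascript
// Duck-typed check for File/Blob-like values (both expose arrayBuffer()).
function isFileLike(v) {
  return v !== null && typeof v === 'object' && typeof v.arrayBuffer === 'function'
}

// Wrap a single value in an array so both shapes are handled uniformly.
function toArray(field) {
  if (field === undefined || field === null) return []
  return Array.isArray(field) ? field : [field]
}

// Separate file uploads from plain text fields in a parsed body object.
function splitFields(body) {
  const files = []
  const fields = {}
  for (const [name, value] of Object.entries(body)) {
    for (const item of toArray(value)) {
      if (isFileLike(item)) {
        files.push({ name, file: item })
      } else {
        fields[name] = item
      }
    }
  }
  return { files, fields }
}
```

With something like `const { files, fields } = splitFields(await c.req.parseBody())`, a route handles one upload, many uploads, and accompanying text fields the same way.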
Hello @yusukebe! Every day I use Hono, I get more amazed 🤯!
Please consider documenting this. A small note if I may: I use Node v18.16.0 and vanilla JS, not TS. Since I'm using filepond on the front end, which makes a fetch request to the server, I was able to do the following without really posting on the route where the image(s) is (are) added:

```js
// THIS IS THE ROUTE THAT FILEPOND LOOKS FOR TO MAKE A FETCH REQUEST.
// (writeFile here is Node's callback-style fs.writeFile; join is from node:path)
.post("/add-image", async (c) => {
  const { images } = await c.req.parseBody()
  const arr = [images]
  arr.forEach(async (image) => {
    console.log(image)
    const buffer = await image.arrayBuffer()
    writeFile(`${join(process.cwd())}/static/images/${image.name}`, Buffer.from(buffer), (err) => {
      if (err) throw err
    })
  })
  // THIS WILL NEVER BE EXECUTED, JUST HERE TO AVOID CONTEXT ERROR.
  return c.text("uploaded!")
})
// THIS IS THE ROUTE TO REDIRECT TO THE IMAGES PAGES.
.post("/save-image", (c) => {
  return c.redirect("/admin-gallery")
})
```

Thank you very much. |
Hi @LebCit ! Thank you for using Hono and giving nice advice. I'll consider adding that to the website. |
I just checked how file uploading works over bun:

```ts
export default async (ctx: Context) => {
  console.log('got file')
  const { req } = ctx
  // `response` here is an app-local helper, not part of hono
  return ctx.json(response.result('hello there'))
}
```

Has anyone tried to upload big files? |
The problem is that you are loading the whole file into memory, which might cause OOM exceptions. You will need to stream the file. The best way to do this is something like:

Now the main problem I saw is that you don't get code completion on the client. When using zValidator, it will actually parse the whole form body, which will again break for any larger files. @yusukebe I was therefore thinking - what if we update the validator? That way it's not reading the whole body. (It will still work for strings, numbers, etc. - but won't load files completely.) Now we can use this in the route, and this still seems to work perfectly. Any feedback would be appreciated. |
Hi @crivera ! You are right. Hono's Validator first extracts the entire contents of the |
Hi! Also, I tested with regular objects using form data and the new validator, and doing this in the code afterwards works completely fine. |
This seems to work perfectly, but how do you respond with |
@muhaimincs That method won't work when you deal with files of 100-200 GB each... The safe way of doing things would be to stream the upload to disk. |
```js
app.post('/form-upload', async (c) => {
  const formData = await c.req.formData()
  const file = formData.get('file')
  const arr = await file.arrayBuffer()
  fs.writeFile('foo.png', Buffer.from(arr), (err) => {
    if (err) throw err /** I'm just concerned about this line. **/
  })
  return c.text('uploaded!')
})
```

If it hits the |
It's because you are not awaiting the writing of the file - wrap it in a promise, then catch any error outside and return c.json(). |
Or you could just use `fs/promises`:

```js
import { writeFile } from "fs/promises";

app.post('/form-upload', async (c) => {
  const formData = await c.req.formData()
  const file = formData.get('file')
  const arr = await file.arrayBuffer()
  try {
    await writeFile('foo.png', Buffer.from(arr));
  } catch (e) {
    return c.json({ error: e.toString() }, 500);
  }
  return c.text('uploaded!')
})
```

|
Thanks for your help. Here is what I came up with:

```js
async function writingFile(path, fileName, data) {
  return new Promise((resolve, reject) => {
    // fs.mkdir is asynchronous, so the write has to happen inside its
    // callback; otherwise it can run before the directory exists. With
    // { recursive: true } it also succeeds when the directory already
    // exists, so no existsSync check is needed.
    fs.mkdir(path, { recursive: true }, (err) => {
      if (err) return reject(err)
      fs.writeFile(fileName, data, (err) => {
        if (err) return reject(err)
        resolve('Success')
      })
    })
  })
}
```

|
That will definitely work - just as an FYI, you don't need the function to be marked as async; you're not doing any awaiting in it. |
aha yes. |
I have created a simple package based on busboy to handle very large files memory-efficiently with Hono. Warning: keep in mind that this is an early version of the package; it might have some issues and breaking changes. I have tested the example with 100 GB files and it works really well, only consuming around 150-200 MB of memory. I have also tested it in a project with https://github.com/tweedegolf/storage-abstraction and it works as well. |
This approach works for me in the new version of Hono. |
@dEvAshirvad What is |
The problem with large files still stands. I just tried to upload a 14 GB file and the connection resets, but the app doesn't crash. Am I missing something? Is there a limit set somewhere? |
There should be no such limitation on the Hono side, so it may be a runtime issue. |
Yes, just verified with "blank" Bun - it is not from Hono. It is something on Bun's side. |
Okay, solved it, it seems:
|
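The snippet with the fix was lost in extraction. For what it's worth, Bun documents a `maxRequestBodySize` option on its server config (the default is around 128 MB), which matches the connection-reset symptom above; a hedged sketch of that kind of fix, assuming a Hono app served by Bun:

```javascript
// Hedged sketch, not the commenter's confirmed fix: raise Bun's
// request-body limit, which defaults to roughly 128 MB and resets
// connections on larger uploads.
export default {
  port: 3000,
  // `app` is assumed to be your Hono instance, imported from your app module.
  fetch: app.fetch,
  maxRequestBodySize: 20 * 1024 * 1024 * 1024, // accept bodies up to ~20 GB
}
```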
I have played a bit here with `bun` and `hono` to see how it works, and realized that the request body comes in raw format if I want to upload files. Surprisingly, getting the body as a buffer worked out of the box.

So I have my `buffer`, but the question is - is there any functionality to get a list of files within that buffer? In other words, maybe there is an example of implementing file uploading with `hono`, or at least some timeline on when it will be available. Thank you.
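The "body as a buffer" snippet from the original issue was lost in extraction; a minimal standalone sketch of the same idea, using only the web-standard Request API rather than anything hono-specific:

```javascript
// Read a request body into a Node Buffer via the web-standard Request API.
// Works the same on Node 18+, Bun, and other fetch-compatible runtimes.
async function bodyToBuffer(req) {
  const bytes = await req.arrayBuffer()
  return Buffer.from(bytes)
}
```

For a multipart upload this buffer holds the raw boundary-delimited payload shown in the Rust example earlier; `req.formData()` is the standard way to have it parsed into fields and files instead of handling the bytes yourself.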