
Conversation

@amcasey (Member) commented Oct 7, 2024

EDIT: now owned by @adityamandaleeka
Fixes #34575

Illustration for dotnet/aspnetcore#58233.

We may want to consider updating the sample properly, but this is just a proof of concept.


Internal previews

| 📄 File | 🔗 Preview link |
| --- | --- |
| aspnetcore/mvc/models/file-uploads.md | aspnetcore/mvc/models/file-uploads |

  private readonly long _fileSizeLimit;
  private readonly ILogger<StreamingController> _logger;
- private readonly string[] _permittedExtensions = { ".txt" };
+ private readonly string[] _permittedExtensions = { ".txt", ".vhdx" };
@amcasey (Member Author):

This was just an easy format in which to make a very large file.

streamedFileContent =
    await FileHelpers.ProcessStreamedFile(section, contentDisposition,
        ModelState, _permittedExtensions, _fileSizeLimit);
using (var memoryStream = new MemoryStream())
@amcasey (Member Author):

This will still blow up over ~2GB. I only fixed up the file impl, not the EF impl.
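The ~2 GB ceiling mentioned here comes from `MemoryStream` being backed by a single `byte[]`, whose length is an `int`. A standalone sketch (not the sample's EF code) that illustrates the cap without allocating anything:

```csharp
using System;
using System.IO;

class MemoryStreamLimit
{
    // MemoryStream stores everything in one byte[], so its length can
    // never exceed int.MaxValue (~2 GB). Larger uploads must stream to
    // disk (or another chunked sink) instead of buffering in memory.
    public static bool CanHold(long bytes)
    {
        if (bytes <= int.MaxValue) return true;
        try
        {
            using var ms = new MemoryStream();
            ms.SetLength(bytes); // argument check throws before any allocation
            return true;
        }
        catch (ArgumentOutOfRangeException)
        {
            return false;
        }
    }
}
```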


if (!ModelState.IsValid)
{
    return BadRequest(ModelState);
@amcasey (Member Author):

We should probably delete the file if this happens.

  webBuilder.UseStartup<Startup>();
+ webBuilder.ConfigureKestrel(options =>
+ {
+     options.Limits.MaxRequestBodySize = 6L << 30; // 6 GB
@amcasey (Member Author):

This actually matters. The next two should just help perf for HTTP/2 (always measure though).
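A minimal sketch of the limit configuration this diff describes. The `MaxRequestBodySize` line matches the diff; the two HTTP/2 window-size properties are an assumption about which "next two" limits the comment refers to, and they only affect throughput, so measure before relying on them:

```csharp
using Microsoft.AspNetCore.Server.Kestrel.Core;

static class KestrelLimits
{
    // Raises Kestrel's request body cap so a ~6 GiB upload isn't rejected.
    public static void ConfigureForLargeUploads(KestrelServerOptions options)
    {
        options.Limits.MaxRequestBodySize = 6L << 30; // 6 GiB, as in the diff

        // Assumed to be the "next two" HTTP/2 perf knobs (not shown in the diff):
        options.Limits.Http2.InitialConnectionWindowSize = 1024 * 1024; // 1 MiB
        options.Limits.Http2.InitialStreamWindowSize = 768 * 1024;      // 768 KiB
    }
}
```

This would be wired up from the host builder, e.g. `webBuilder.ConfigureKestrel(KestrelLimits.ConfigureForLargeUploads);`.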


  <PropertyGroup>
-   <TargetFramework>netcoreapp3.1</TargetFramework>
+   <TargetFramework>net9.0</TargetFramework>
@amcasey (Member Author):

This mostly just made debugging easier. It's possible my changes work in 3.1 as well.

  using (var memoryStream = new MemoryStream())
  {
-     await section.Body.CopyToAsync(memoryStream);
+     await section.Body.CopyToAsync(destination);
@amcasey (Member Author):

Reading into a MemoryStream was the real bottleneck, AFAICT.
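The fix can be sketched in isolation: instead of buffering the whole section into a `MemoryStream` and then writing it out, copy the request stream straight to the destination file in bounded chunks, so memory use stays flat regardless of upload size. `CopyToFileAsync` and `destinationPath` are illustrative names, not the sample's own:

```csharp
using System.IO;
using System.Threading.Tasks;

static class StreamingCopy
{
    // Copies an incoming stream directly to disk; CopyToAsync reuses a
    // small buffer, so memory use does not grow with the upload size.
    public static async Task CopyToFileAsync(Stream source, string destinationPath)
    {
        // useAsync: true requests overlapped I/O; 81920 is CopyToAsync's default buffer size.
        using (var destination = new FileStream(
            destinationPath, FileMode.Create, FileAccess.Write, FileShare.None,
            bufferSize: 81920, useAsync: true))
        {
            await source.CopyToAsync(destination);
        }
    }
}
```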

"The upload failed. Please contact the Help Desk " +
$" for support. Error: {ex.HResult}");
// Log the exception
destination.SetLength(oldLength); // Reset the stream to its original length
@amcasey (Member Author):

This clean-up may be inadequate or even counter-productive.
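The clean-up being questioned can be sketched standalone: record the destination's length before copying, and truncate back to it on failure so a partial upload doesn't leave garbage at the end. As the comment notes, whether this is adequate depends on the failure mode; the helper name and broad `catch` are illustrative, not the sample's code:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

static class AppendWithRollback
{
    // On failure, reset the destination to its pre-copy length so a
    // partial write is discarded rather than left trailing the file.
    public static async Task<bool> TryAppendAsync(Stream source, Stream destination)
    {
        long oldLength = destination.Length;
        try
        {
            await source.CopyToAsync(destination);
            return true;
        }
        catch (Exception)
        {
            destination.SetLength(oldLength); // reset the stream to its original length
            return false;
        }
    }
}
```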

  data.Position = 0;

- using (var reader = new BinaryReader(data))
+ using (var reader = new BinaryReader(data, System.Text.Encoding.UTF8, leaveOpen: true))
@amcasey (Member Author):

If we're going to operate on a real stream, we should probably not close it after validation.
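The `leaveOpen` change can be illustrated in isolation: without it, disposing the `BinaryReader` also disposes the underlying stream, so a later copy of the validated stream would fail. This is a standalone sketch, not the sample's `FileHelpers` code:

```csharp
using System.IO;
using System.Text;

static class SignatureCheck
{
    // Reads the first few bytes for signature validation, then leaves
    // the stream open and rewound so the caller can still copy it to disk.
    public static byte[] ReadHeader(Stream data, int count)
    {
        data.Position = 0;
        byte[] header;
        using (var reader = new BinaryReader(data, Encoding.UTF8, leaveOpen: true))
        {
            header = reader.ReadBytes(count);
        }
        data.Position = 0; // rewind for the next consumer
        return header;
    }
}
```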

// for files (when possible) for all file types you intend
// to allow on the system and perform the file signature
// check.
/*
@amcasey (Member Author):

I was too lazy to work out a file signature for VHDX for a sample.

- "StoredFilesPath": "c:\\files",
- "FileSizeLimit": 2097152
+ "StoredFilesPath": "D:\\tmp\\LargeFileUpload",
+ "FileSizeLimit": 6442450944
@amcasey (Member Author):

I calculated this in bytes (6 GiB) but it might actually be in MiB (i.e. 6 TiB). 🤷
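For the record, the arithmetic: 6,442,450,944 = 6 × 1024³ bytes (6 GiB), the same value as the `6L << 30` used in the Kestrel limit, while the old 2,097,152 was 2 MiB. A quick check:

```csharp
using System;

class SizeMath
{
    public static void Main()
    {
        long sixGiB = 6L << 30; // 6 * 1024 * 1024 * 1024
        long twoMiB = 2L << 20; // 2 * 1024 * 1024
        Console.WriteLine(sixGiB); // 6442450944
        Console.WriteLine(twoMiB); // 2097152
    }
}
```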

@guardrex (Collaborator) commented May 30, 2025

@DeagleGross ... Can we close this out now given the merge of #35527?

I mean even if you're still working on #35550, this PR can be closed at any time, correct?

@DeagleGross (Member):

Yep @guardrex, I wanted to close this one after all my changes were merged, but we can do it now :)

@guardrex guardrex closed this May 30, 2025


Development

Successfully merging this pull request may close these issues.

Modify file upload sample to handle very large files

5 participants