put_block() with anything other than static data? #1219
Comments
The `BlobClient::put_block` implementation requires the payload to remain available so that the underlying layer can handle retries. Currently, this means the payload must be owned data with a `'static` lifetime. In your example, if you changed the data from a borrowed reference to an owned value, the lifetime error would go away. As far as a larger example, AVML supports uploading large blobs to Azure Storage using a `SeekableStream` implementation.
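To make the retry requirement concrete, here is a minimal, hypothetical sketch (not the SDK's own code): an owned payload such as `Bytes` can be handed out again for every attempt, which is exactly what a one-shot borrowed buffer or reader cannot offer.

```rust
use bytes::Bytes;

// Hypothetical retry helper, purely illustrative: because the payload is an
// owned `Bytes`, cloning it for each attempt is a cheap reference-count bump,
// so the transport can resend the exact same body after a transient failure.
async fn send_with_retries<F, Fut>(payload: Bytes, attempts: usize, send: F) -> Result<(), String>
where
    F: Fn(Bytes) -> Fut,
    Fut: std::future::Future<Output = Result<(), String>>,
{
    let mut last_err = String::from("no attempts were made");
    for _ in 0..attempts {
        match send(payload.clone()).await {
            Ok(()) => return Ok(()),
            Err(e) => last_err = e,
        }
    }
    Err(last_err)
}
```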
I forgot to include a reference to the AVML implementation, sorry about that. https://docs.rs/avml/latest/avml/struct.BlobUploader.html
Okay, thank you. It would be fantastic to have a simpler example to follow in the docs. Also maybe one that shows how to get a seekable stream from something other than bytes in memory. But at least I have something to work with now.
It appears the same applies to `put_block_blob()`. For the file-to-bytes object, it can be as simple as:

```rust
use std::fs::File;
use std::io::{BufReader, Read};

let file = File::open(file_path)?;
let mut reader = BufReader::new(file);
let mut buffer = vec![];
// Read the whole file into an owned buffer.
reader.read_to_end(&mut buffer)?;

// *** // Credential code goes here; examples for this exist
let blob_client =
    ClientBuilder::new(account, storage_credentials).blob_client(&container, blob_name);
blob_client
    .put_block_blob(buffer)
    .content_type(content_type)
    .await?;
```

or a little more complex as:

```rust
use std::fs::File;
use std::io::{BufReader, Read};

use bytes::{BufMut, BytesMut};

let file = File::open(file_path)?;
let mut cursor = BufReader::new(file);
let mut fbytes = BytesMut::new();
let mut chunk = vec![0; 10 * 1024 * 1024];
loop {
    let bytes_read = match cursor.read(&mut chunk) {
        Ok(0) => break,
        Ok(x) => x,
        Err(_) => break,
    };
    fbytes.put(&chunk[..bytes_read]);
}

// *** // Credential code goes here; examples for this exist
let blob_client =
    ClientBuilder::new(account, storage_credentials).blob_client(&container, blob_name);
blob_client
    .put_block_blob(fbytes)
    .content_type(content_type)
    .await?;
```

Both of these work for smaller files, but anything larger (gigabytes' worth) will fail. Any feedback on this would be wonderful. For now, I'll try out the AVML approach mentioned above.
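Since the thread never spells one out, below is a rough sketch of a block-by-block upload that avoids holding the whole file in memory. It assumes the `put_block`/`put_block_list` builders and the `BlockList`/`BlobBlockType` types exposed by the `azure_storage_blobs` crate of this era; exact paths, names, and signatures may differ between versions, so treat it as a starting point rather than a verified recipe.

```rust
use std::fs::File;
use std::io::Read;

// Module paths may vary between crate versions.
use azure_storage_blobs::blob::{BlobBlockType, BlockList};
use azure_storage_blobs::prelude::*;

async fn upload_in_blocks(
    blob_client: &BlobClient,
    file_path: &str,
) -> Result<(), Box<dyn std::error::Error>> {
    let mut file = File::open(file_path)?;
    let mut chunk = vec![0u8; 10 * 1024 * 1024]; // 10 MiB per block
    let mut blocks = Vec::new();

    for index in 0u32.. {
        let n = file.read(&mut chunk)?;
        if n == 0 {
            break;
        }
        // Block IDs must be unique and of equal length within a blob.
        let block_id = format!("{:08}", index);
        // Each block is an owned Vec<u8>, so there is no borrowed-lifetime
        // problem and the client can resend the chunk on a retry.
        blob_client
            .put_block(block_id.clone(), chunk[..n].to_vec())
            .await?;
        blocks.push(BlobBlockType::new_uncommitted(block_id));
    }

    // Commit the uploaded blocks into the final block blob.
    blob_client.put_block_list(BlockList { blocks }).await?;
    Ok(())
}
```

This is essentially what AVML's `BlobUploader` (linked above) does, with concurrency and block-size tuning layered on top.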
@demoray Can we please reopen this issue for including an example implementing `SeekableStream` so that we may use this library for uploading larger files? I looked at AVML but I'd prefer using this repo if it has the built-in features already.
The library should at least provide an implementation for (e.g.) tokio’s `fs::File`.
The current API to asynchronously write a blob is pretty clunky due to the combination of `AsyncSeek` and `SeekableStream`. I'd be happy to contribute our implementation on top of those that allows for any `AsyncRead + AsyncSeek` reader to be used.
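As a rough illustration of the shape such a contribution could take (a sketch under assumptions, not the crate's actual trait or the commenter's code): a wrapper over any `AsyncRead + AsyncSeek` reader only needs to expose its length and a way to rewind so the request layer can replay the body on retry.

```rust
use std::io::SeekFrom;

use futures::io::{AsyncRead, AsyncSeek, AsyncSeekExt};

// Illustrative adapter: anything that can be read and seeked can be rewound,
// which is the essential capability a retryable upload body needs.
pub struct ResettableReader<R> {
    inner: R,
    len: usize,
}

impl<R: AsyncRead + AsyncSeek + Unpin> ResettableReader<R> {
    pub fn new(inner: R, len: usize) -> Self {
        Self { inner, len }
    }

    /// Rewind to the start so the same bytes can be re-read on a retry.
    pub async fn reset(&mut self) -> std::io::Result<()> {
        self.inner.seek(SeekFrom::Start(0)).await?;
        Ok(())
    }

    /// Total number of bytes this reader will produce.
    pub fn len(&self) -> usize {
        self.len
    }
}
```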
A lot of the complexity here can now be simplified by using Hyper's version of `Body`. I'd like to avoid using our own custom put requests with SAS URLs to allow byte streams to work. @Porges thoughts about accepting changes to the `SeekableStream` trait?
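For reference, a small sketch of the idea (illustrative only, not how the Azure crates build their request bodies): hyper 0.14's `Body::wrap_stream` (behind the `stream` feature), combined with `tokio_util::io::ReaderStream`, turns a file into a streaming body without buffering it all in memory.

```rust
use hyper::Body;
use tokio_util::io::ReaderStream;

// ReaderStream yields Result<Bytes, io::Error> chunks from any tokio
// AsyncRead, and Body::wrap_stream accepts exactly such a stream.
async fn file_to_streaming_body(path: &str) -> std::io::Result<Body> {
    let file = tokio::fs::File::open(path).await?;
    Ok(Body::wrap_stream(ReaderStream::new(file)))
}
```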
@thehydroimpulse I don't work on this project; @demoray?
Addresses Azure#1219. This builds upon Azure#1359, Azure#1358, and Azure#1357.
In addition to the changes above, see https://github.com/Azure/azure-sdk-for-rust/blob/main/sdk/storage_blobs/examples/stream_blob_02.rs for an example.
It appears that `put_block()` can only be called with data that has a static lifetime. How can I upload a file that I am reading into a buffer? Is this simply not possible at this point? rustc complains that `buffer` does not live long enough.