
Bug 27755 - Using the Subtle Crypto Interface with Streams #73

Open
mwatson2 opened this issue May 24, 2016 · 46 comments

Comments

@mwatson2 (Collaborator) commented May 24, 2016

Bug 27755:

Though the Streams API is referenced in the Informative References, the functions under window.crypto.subtle are specified with only one-shot data inputs.

Use cases: data may not be available all at once, or it may be too large to keep in memory.

For encrypt()/decrypt() it would make sense to have a streaming readable output if the input is a readable stream.

@jimsch (Collaborator) commented May 24, 2016

After listening to Ryan rage about the use of BER encoding for ASN.1 objects, I have a feeling that this should be closed as won't fix because it presents a security issue. When one looks at the encrypt/decrypt APIs for authenticated encryption, the entire stream is required to be observed on the decrypt side, and it could be argued that it needs to be observed on the encrypt side as well before emitting the processed stream. This is because, if the decryption does not validate, no output is to be produced for consumption. Allowing this to be done in a streaming fashion means that the browser potentially needs an unbounded buffer to hold the intermediate result before returning it to the client.

Similar issues hold for processing of signature values for the new X448 EdDSA algorithm where the message M is hashed twice. Allowing for an indefinite-length input means that there are potential buffer overrun problems.

@feross commented May 24, 2016

Node.js has a streaming crypto API without any security issues:

const crypto = require('crypto');
const hash = crypto.createHash('sha256');

hash.update('some data to hash');
hash.update('more data');
hash.update('even more data');
console.log(hash.digest('hex'));

Why can't the web platform?

@indutny commented May 24, 2016

I absolutely agree with @feross on this. Most (if not all) of the APIs can work in a streaming mode without any security issues. In fact, this is how these APIs are exposed in OpenSSL, so they always work in a streaming mode under the hood anyway, regardless of what the high-level API looks like.

@jimsch (Collaborator) commented May 24, 2016

All of the current hash functions that I am familiar with will allow for streaming APIs because they are built using a Merkle–Damgård construction. This means that they are processed on a block-by-block basis. However, there are algorithms for which this is not doable. For example, the EdDSA algorithm that I mentioned above computes:

R = fn( SHAKE256(dom(F, C) || prefix || M, 114) )
and then
k = SHAKE256(dom(F, C) || R || A || M, 114)

As you can see, you need all of the message M to compute R before you can start the computation of k. This means that the entire message needs to be buffered, unlike the hash example you gave above.

Note also the comment that I made on authenticated decryption where the entire message needs to be kept before doing the validation step at the end.
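
To make the ordering constraint concrete, here is a minimal sketch, assuming a hypothetical incremental shake256 helper plus stand-ins for the Ed448 pieces from RFC 8032 (dom, F, C, prefix, A, reduce, pointMul, basePoint are placeholders, not Web Crypto APIs). The second hash cannot absorb M until R is known, and R itself depends on all of M:

// Hypothetical incremental SHAKE256 helper -- not part of Web Crypto.
const h1 = shake256.create({ outputLength: 114 });
h1.update(dom(F, C));
h1.update(prefix);
for (const chunk of messageChunks) h1.update(chunk);  // first full pass over M
const R = pointMul(reduce(h1.digest()), basePoint);   // R depends on ALL of M

const h2 = shake256.create({ outputLength: 114 });
h2.update(dom(F, C));
h2.update(R);                                         // R must be absorbed before M ...
h2.update(A);
for (const chunk of messageChunks) h2.update(chunk);  // ... so M has to be buffered or read twice
const k = reduce(h2.digest());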

@indutny commented May 24, 2016

@jimsch in your description SHAKE256 appears to be just a hashing function, most of the hashing functions support streaming input. There is nothing that could prevent one from creating two streaming SHAKE256 hashes and using their digests at the end of the stream to compute R and k.

Authenticated decryption should work as well, as far as I can tell... Though, the fact that the integrity is checked only at the end of the decryption process means that the API will be kind of awkward. I don't think there are many advantages to using streams for authenticated decryption.

@jimsch (Collaborator) commented May 24, 2016

@indutny please re-read my previous post and look at the requirement to finish computing R before M can be used to compute k.

@indutny commented May 24, 2016

@jimsch oh I see it now. Sorry about that! Yeah, streaming won't work for this kind of encryption/decryption scheme indeed.

Still, many hashes and ciphers work just fine with streams.

@tanx commented May 24, 2016

A native streaming API would indeed be great. Our use case would be large-file encryption in OpenPGP.js.

@mwatson2 (Collaborator, Author) commented May 24, 2016

If we address this, I think it will not be in this version since it requires substantial work.

@mwatson2 added this to the VNext milestone May 24, 2016
@hhalpin commented Jun 20, 2016

I imagine we can close this as won't fix, but when streaming stabilizes we can revisit it as part of maintenance of the spec since, as @jimsch correctly points out, it won't work for quite a few algorithms. We could also try to test whether anyone supports streaming - any ideas?

@hhalpin commented Jun 20, 2016

v.Next.

@evilaliv3 commented Dec 13, 2016

Is there any update on this topic?

@roccomuso commented Mar 6, 2017

+1

@ericmackrodt commented Mar 30, 2017

If streaming/progressive encryption isn't implemented, it's going to hugely limit the scope of usage of the API. I really need that kind of functionality for the software I work on.

@neckaros commented May 9, 2017

+1!

@neckaros commented May 9, 2017

Privacy is a growing concern. Being able to decrypt locally without consuming too much memory is a must, I think.
For example, encrypting a huge file locally as you send it to a server, so the server never has the decrypted data.
It works well on Node.js.
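
For what it's worth, a minimal Node.js sketch of that pattern (the upload URL, file name, and the choice of aes-256-ctr are placeholders for illustration):

const crypto = require('crypto');
const fs = require('fs');
const https = require('https');

const key = crypto.randomBytes(32);
const iv = crypto.randomBytes(16);
const cipher = crypto.createCipheriv('aes-256-ctr', key, iv);

// Encrypt while uploading; the plaintext is never held in memory all at once.
const req = https.request('https://example.com/upload', { method: 'POST' });
fs.createReadStream('huge-file.bin').pipe(cipher).pipe(req);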

@alanwaketan commented Jun 10, 2017

I think digest may be a good place to start.

@thiccar commented Jul 14, 2017

+1000

@daviddias commented Aug 28, 2017

Hi all, bringing this issue back up. Any updates or recent discussion on it?

I believe that the security considerations do not hold, and what this promotes is for users to find other ways to encrypt their files as the use of browsers to share large documents grows - possibly by having to shim their own streaming encryption API, which will be considerably slower than a native one through WebCrypto.

@JulianKlug commented Jan 13, 2018

+1

@johnozbay commented Feb 16, 2018

100% agreed with @ericmackrodt & @neckaros & @diasdavid. With GDPR on the horizon this would make things a lot easier for European establishments.

@isiahmeadows commented Mar 27, 2018

@jimsch By any chance, could a streaming API be provided for those encryption schemes that can be streamed? Just because it's not possible for some doesn't make it impossible for all (and there are different tradeoffs for each). One good example of this is client-side decryption of large files on mobile (only high-end phones/tablets have the RAM available to reliably decrypt a 750 MB video download in memory).

@jimsch (Collaborator) commented Mar 27, 2018

It could; on the other hand, there may be other things that could be done as well. For example, one could do chunked encryption of large objects such as video, which is designed to be streamed, so that each chunk can be independently decrypted and streamed. The world is moving towards only using authenticated encryption algorithms, and doing streaming such as you suggest means that you are willing to consume a decrypted stream that may have been corrupted without being able to detect this.
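
For illustration, a rough sketch of that chunked approach using today's one-shot API (the chunk size, per-chunk random IV, and sequence-number additionalData are illustrative choices, not something the spec prescribes):

// Encrypt a large Blob as independent AES-GCM chunks so each chunk can be
// decrypted and authenticated on its own during streaming consumption.
async function encryptChunked(blob, key, chunkSize = 1024 * 1024) {
  const chunks = [];
  for (let offset = 0, seq = 0; offset < blob.size; offset += chunkSize, seq++) {
    const plain = await blob.slice(offset, offset + chunkSize).arrayBuffer();
    const iv = crypto.getRandomValues(new Uint8Array(12));   // fresh IV per chunk
    const aad = new TextEncoder().encode(String(seq));       // bind the chunk order
    const cipher = await crypto.subtle.encrypt(
      { name: 'AES-GCM', iv, additionalData: aad, tagLength: 128 },
      key,
      plain
    );
    chunks.push({ seq, iv, cipher });
  }
  return chunks;
}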

Additionally, one would need to get a group of people together at the W3C who are interested in doing an update to the document and then decide which algorithms could/should be streamable and which should not.

@jakearchibald commented Jan 16, 2019

@v0l

I would add the same comment to FileReader, I have no idea why FileReader doesn't expose a ReadableStream...

fwiw you can do new Response(blob).body.
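
A tiny illustration of that trick (destinationWritable stands in for whatever WritableStream sink you have):

const blob = new Blob(['hello world']);
const readable = new Response(blob).body;   // a ReadableStream over the blob's bytes
readable.pipeTo(destinationWritable);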

@jakearchibald commented Jan 16, 2019

Writable and transform streams have now shipped in Chrome. Pretty sure the streaming spec is stable enough to look at this.

I agree with others that the digest method is a good place to start.

@jakearchibald commented Jan 17, 2019

Suggested API:

const value = await crypto.subtle.digest(algorithm, readableStream);

Overload the existing method so it takes a readable stream. Example:

// Digest the HTML spec:
const request = await fetch('https://html.spec.whatwg.org/');
const value = await crypto.subtle.digest('SHA-256', request.body);

This would also allow providing the chunks manually, or using a combination of many sources:

// Digest a combination of the HTML spec, the DOM spec, and "That's all folks".
const responsesToDigest = [
  fetch('https://html.spec.whatwg.org/'),
  fetch('https://dom.spec.whatwg.org/'),
];
const { writable, readable } = new TransformStream();
const valuePromise = crypto.subtle.digest('SHA-256', readable);

for await (const response of responsesToDigest) {
  await response.body.pipeTo(writable, { preventClose: true });
}

const writer = writable.getWriter();
writer.write(new TextEncoder().encode("That's all folks"));
writer.close();

const value = await valuePromise;

@gannons commented Mar 20, 2019

It sounds like this feature has yet to be implemented. Are there any alternatives to using webcrypto?

@jimmywarting commented Jun 7, 2019

Similar issues hold for processing of signature values for the new X448 EdDSA algorithm where the message M is hashed twice

If that's the case, why can't we simply allow a Blob or File to be passed to the digest function and let the hashing code be in control of reading and seeking the content? Why would you need to create a writable stream or a single buffer out of a blob at all?

I'm not arguing against streaming support. It's a great addition for cases where hashing algorithms work in a streamable (block) fashion.


For future reference, you can get a stream from a blob using the newly added stream() method, by just calling blob.stream() instead of doing new Response(blob).body.


zip.js created a neat universal base class with some basic read/write functionality that could be added onto any kind of data (blob, string, base64, typed arrays), as long as it had this kind of method:

class Something extends zip.Reader {
  #data

  constructor() { ... }

  readUint8Array (start, length) {
    return slice_and_return_uint8(this.#data, start, length) // could also return a promise
  }
}

If we had something like this, then digest would not be limited to only a few types of acceptable data. It would be more like a stream's pull method, but with added arguments for what and where to read. When I come to think about it, it acts pretty much like a TransformStream.

Imagine using something like this where you had to read M twice...

var transformer = new TransformStream({
  transform({ start, length }, controller) {
    controller.enqueue( slice_and_return_uint8(data, start, length) )
  }
})

crypto.subtle.digest(algorithm, transformer)

@rabindranathfv commented Feb 28, 2020

(Quoting @jakearchibald's suggested crypto.subtle.digest(algorithm, readableStream) API from the comment above.)

@jakearchibald did you try to use this with Angular on the client side? I am using it and it makes all my unit tests go kaboom, haha.


@Cavitt commented Feb 4, 2021

For those who need performant hashing on the web today, the best options are WebAssembly or asm.js for files larger than your target chunk size, and the native crypto function for smaller files.

WebAssembly: https://github.com/Daninet/hash-wasm
asm.js: https://github.com/asmcrypto/asmcrypto.js

Note: I am not affiliated with either of these projects, though I do use asmcrypto.js in production.

I hope one day to have native incremental hashing in the browser; the current situation is quite a hassle.
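
As a sketch of that chunked approach with hash-wasm (assuming its createSHA256()/init()/update()/digest() API; the 8 MiB chunk size is arbitrary):

import { createSHA256 } from 'hash-wasm';

async function hashFile(file, chunkSize = 8 * 1024 * 1024) {
  const hasher = await createSHA256();
  hasher.init();
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    const buf = await file.slice(offset, offset + chunkSize).arrayBuffer();
    hasher.update(new Uint8Array(buf));   // hash the file one slice at a time
  }
  return hasher.digest();                 // hex string of the SHA-256 digest
}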

@knightcode commented Aug 6, 2021

The use case I'm interested in is forming a cipher stream for upload from a File object:

const iv = ...
const aad = ...
const key = ...
const file = document.getElementById("file_input").files[0]; // or dataTransfer.files

// maybe: new FormData()

const response = await fetch("...", {
  method: "POST",
  body: window.crypto.subtle.encrypt(
    { name: "AES-GCM", iv: iv, additionalData: aad, tagLength: 128 },
    key,
    file
  )
});

The opposite direction would be nice too... somehow attaching subtle.decrypt to an <a download> tag. But that seems less intuitive.

@jimmywarting commented Sep 27, 2021

Another solution could be to accept an AsyncIterable that yields Uint8Arrays; in that case both Node and WHATWG streams could be processed.

const request = await fetch('https://html.spec.whatwg.org/');
const value = await crypto.subtle.digest('SHA-256', request.body);

@MattiasBuelens commented Sep 27, 2021

Another solution could be to accept an AsyncIterable that yields Uint8Arrays; in that case both Node and WHATWG streams could be processed.

Not sure.

  • Having a ReadableStream as input could allow for more optimizations, e.g. using large BYOB reads (see the sketch after this list), or even bypassing the Streams API entirely and reading directly from an underlying file/socket.
  • Node 16 supports web streams (experimentally), so ReadableStream already works for both the web and Node without having to generalize to async iterables.
  • We don't yet have a precedent for a web API that accepts an async iterable. ReadableStream.from(asyncIterable) would be the first, maybe followed by the iterator helpers proposal. This isn't to say that we shouldn't have web APIs accepting async iterables; it's just that we'll have to think more carefully about the API design.
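
To illustrate the "large BYOB reads" point from the first bullet, a minimal user-land sketch (the 1 MiB buffer size and the incremental hasher object are assumptions, and this only works on byte streams such as fetch response bodies):

async function digestStream(readable, hasher) {
  // BYOB reads hand the producer a caller-owned buffer,
  // avoiding a fresh allocation for every chunk.
  const reader = readable.getReader({ mode: 'byob' });
  let buffer = new ArrayBuffer(1024 * 1024);   // 1 MiB scratch buffer
  while (true) {
    const { done, value } = await reader.read(new Uint8Array(buffer));
    if (done) break;
    hasher.update(value);      // hypothetical incremental hasher
    buffer = value.buffer;     // reclaim the buffer for the next read
  }
  return hasher.digest();
}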

@jimmywarting commented Sep 27, 2021

Hmm, good point!

I previously suggested adding blob support here: #216
But if it could accept a ReadableStream via crypto.subtle.digest('SHA-256', blob.stream()) and even bypass the Streams API, then there wouldn't be much value in adding support for Blob...

@knightcode commented Sep 27, 2021

Another solution could be to accept an AsyncIterable that yields Uint8Arrays; in that case both Node and WHATWG streams could be processed.

const request = await fetch('https://html.spec.whatwg.org/');
const value = await crypto.subtle.digest('SHA-256', request.body);

Would the user be able to save the streaming file to the file system with this API?

@MattiasBuelens commented Sep 27, 2021

Would the user be able to save the streaming file to the file system with this API?

crypto.subtle.digest() accepts a variable-size input but returns a fixed-size output; for example, SHA-256 always returns a 256-bit (32-byte) output. So there's no point in making the output a stream.

For encrypt() and decrypt(), the output size is dependent on the input size: larger inputs generate larger outputs. So here, it would make sense to also return a ReadableStream. You'll be able to write this to disk using the File System Access API:

const decryptedStream = crypto.subtle.decrypt(/* ... */); // this API doesn't exist yet
const handle = await window.showSaveFilePicker();
decryptedStream.pipeTo(await handle.createWritable());

But for now, we're only looking at digest().

@matthewjumpsoffbuildings

I have recently been dealing with workers/fetch and was very disappointed to find that the Crypto API doesn't support streams.

The implementation you proposed, @MattiasBuelens, looks fantastic; hopefully something like it comes to fruition.
