It returns a `WritableStream` to let the caller write a file to the device; the `close` handler tells the device to finish the file and returns the final result. Let's call it `PushFileStream`.
Because each packet can't exceed `packetSize` bytes, I need to split each incoming chunk. I think I can re-use the `ChunkStream` (a `TransformStream`) that's used elsewhere, so I piped its readable end to `PushFileStream` and returned the writable end of the `ChunkStream` to callers.
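The wiring described above can be sketched like this, assuming `ChunkStream` is a `TransformStream` that splits input into pieces of at most `packetSize` bytes (the helper names here are assumptions):

```typescript
// Assumed behavior of ChunkStream: split input so no output chunk
// exceeds packetSize bytes.
function createChunkStream(
  packetSize: number,
): TransformStream<Uint8Array, Uint8Array> {
  return new TransformStream({
    transform(chunk, controller) {
      for (let i = 0; i < chunk.length; i += packetSize) {
        controller.enqueue(chunk.subarray(i, i + packetSize));
      }
    },
  });
}

// The setup in question: pipe the readable end into the device sink and
// hand the writable end to the caller. Note the pipeTo promise is dropped,
// which is exactly what makes close propagation invisible to the caller.
function pushFile(
  packetSize: number,
  device: WritableStream<Uint8Array>,
): WritableStream<Uint8Array> {
  const chunker = createChunkStream(packetSize);
  void chunker.readable.pipeTo(device);
  return chunker.writable;
}
```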
It correctly chunks the incoming data and sends it to the device. However, the writable end of `ChunkStream` won't wait for `PushFileStream` to close. If the caller closes the stream and writer after finishing the push, they may be closed before `PushFileStream`'s `close` handler runs, causing the push to fail.
So how can I wait for `PushFileStream`'s `close`, or more generally, how can I use a `TransformStream` when creating a `WritableStream`?