Description
I have a Gradle build that produces a Docker image tar file with a size of 166 MiB.
I use the GitHub publish Gradle plugin and tried to upload this Docker image as an asset to a release.
This consistently resulted in an OutOfMemoryError.
The GitHub publish Gradle plugin uses this library under the hood.
I was also able to reproduce the problem easily stand-alone.
In version 1.1xx, you use an HttpsURLConnection without setting it to fixed-length or chunked streaming mode, so the JRE uses a byte-array output stream to collect the whole body in RAM in order to determine the size up front.
In version 1.3xx many things changed, but now the GitHubRequest.Builder#with(java.io.InputStream) method loads the whole file into RAM using IOUtils.toByteArray.
So both versions load the whole file into RAM at different places instead of properly streaming it.
Please at least support passing the size to methods like uploadAsset, so that it can be sent as a Content-Length header while the content itself is streamed; and, if GitHub supports it, perhaps additionally use chunked mode when no explicit size is given.
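A minimal sketch of what such a streaming upload could look like, assuming a hypothetical helper that receives the upload URL, the stream, and an optional content length (the method and parameter names are made up for illustration; this is not the library's actual code):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class StreamingUploadSketch {

    /**
     * Uploads the body without buffering it in RAM.
     * A contentLength < 0 means "unknown"; in that case chunked
     * transfer encoding is used instead of a Content-Length header.
     */
    static int upload(URL uploadUrl, InputStream body, long contentLength,
                      String contentType, String token) throws IOException {
        HttpURLConnection connection = (HttpURLConnection) uploadUrl.openConnection();
        connection.setRequestMethod("POST");
        connection.setDoOutput(true);
        connection.setRequestProperty("Authorization", "token " + token);
        connection.setRequestProperty("Content-Type", contentType);

        if (contentLength >= 0) {
            // Known size: the JRE streams the body and sets Content-Length itself,
            // instead of collecting everything in a byte-array output stream first.
            connection.setFixedLengthStreamingMode(contentLength);
        } else {
            // Unknown size: fall back to chunked transfer encoding.
            connection.setChunkedStreamingMode(64 * 1024);
        }

        try (OutputStream out = connection.getOutputStream()) {
            body.transferTo(out); // copies in small chunks, constant memory use
        }
        return connection.getResponseCode();
    }
}
```

With an API shaped like uploadAsset(String name, InputStream stream, long size, String contentType), the library could use fixed-length streaming whenever the size is known and only fall back to chunked mode otherwise.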
The problem can be reproduced quite easily using:
```java
github
    .getRepository("my/repository")
    .listReleases()
    .iterator()
    .next()
    .uploadAsset("foo.tar.gz", new FileInputStream("file/that/is/too/large/for/RAM"), "application/octet-stream");
```
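For a quick reproduction without a file of several hundred MiB, running this snippet with a deliberately small heap (for example -Xmx64m) should trigger the same OutOfMemoryError, since the whole stream is materialized as a byte array before the request is sent.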