
Unable to migrate data from constellation when payloads content is large #705

Closed · pmitra43 opened this issue Apr 17, 2019 · 1 comment · Fixed by #719
Labels: 0.10.0 · bug (Something isn't working)

Comments

@pmitra43 commented Apr 17, 2019

When the data-migration.jar utility is executed against the payloads directory of Constellation, it fails with Exception in thread "main" java.lang.OutOfMemoryError: Java heap space whenever the total data size is significantly larger than the heap memory allocated to the Java process. This happens because every file is loaded into memory before the migration starts.

Command:

java -Xmx<max-memory-allocated> -jar <dir>/data-migration.jar -storetype dir -inputpath <path-to-payloads>/ -dbuser quorum -dbpass quorum -outputfile /quorum/tessera/data -exporttype h2

Stacktrace:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.nio.file.Files.read(Files.java:3099)
at java.nio.file.Files.readAllBytes(Files.java:3158)
at com.quorum.tessera.data.migration.FilesDelegate.readAllBytes(FilesDelegate.java:18)
at com.quorum.tessera.data.migration.DirectoryStoreFile.lambda$load$2(DirectoryStoreFile.java:26)
at com.quorum.tessera.data.migration.DirectoryStoreFile$$Lambda$5/1587487668.apply(Unknown Source)
at java.util.stream.Collectors.lambda$toMap$58(Collectors.java:1321)
at java.util.stream.Collectors$$Lambda$8/1702297201.accept(Unknown Source)
at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at com.quorum.tessera.data.migration.DirectoryStoreFile.load(DirectoryStoreFile.java:24)
at com.quorum.tessera.data.migration.CmdLineExecutor.execute(CmdLineExecutor.java:107)
at com.quorum.tessera.data.migration.Main.main(Main.java:11)
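
For context, the trace shows DirectoryStoreFile.load collecting the contents of every payload file into a map via Files.readAllBytes before the migration begins. A minimal sketch of that pattern is below; only the JDK calls mirror the trace, and the class and method names are illustrative rather than the actual Tessera source:

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Sketch of the eager-loading pattern indicated by the trace above.
public class EagerLoadSketch {

    // Collects the contents of every file under the payloads directory into
    // a single map, so heap usage grows with the size of the whole store.
    static Map<Path, byte[]> loadAll(Path payloadsDir) throws IOException {
        try (Stream<Path> files = Files.list(payloadsDir)) {
            return files.collect(Collectors.toMap(
                    Function.identity(),
                    path -> {
                        try {
                            // The OutOfMemoryError surfaces here once the total
                            // payload size exceeds the configured -Xmx heap.
                            return Files.readAllBytes(path);
                        } catch (IOException e) {
                            throw new UncheckedIOException(e);
                        }
                    }));
        }
    }
}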

In some cases (most likely when the maximum heap allocation is quite low, around 128m), the failure occurs while Base32-decoding the file contents instead:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.commons.codec.binary.BaseNCodec.resizeBuffer(BaseNCodec.java:173)
at org.apache.commons.codec.binary.BaseNCodec.ensureBufferSize(BaseNCodec.java:190)
at org.apache.commons.codec.binary.Base32.decode(Base32.java:296)
at org.apache.commons.codec.binary.BaseNCodec.decode(BaseNCodec.java:321)
at org.apache.commons.codec.binary.BaseNCodec.decode(BaseNCodec.java:306)
at com.quorum.tessera.data.migration.DirectoryStoreFile.lambda$load$1(DirectoryStoreFile.java:25)
at com.quorum.tessera.data.migration.DirectoryStoreFile$$Lambda$4/769287236.apply(Unknown Source)
at java.util.stream.Collectors.lambda$toMap$58(Collectors.java:1320)
at java.util.stream.Collectors$$Lambda$8/1702297201.accept(Unknown Source)
at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at com.quorum.tessera.data.migration.DirectoryStoreFile.load(DirectoryStoreFile.java:24)
at com.quorum.tessera.data.migration.CmdLineExecutor.execute(CmdLineExecutor.java:107)
at com.quorum.tessera.data.migration.Main.main(Main.java:11)
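
A general way to avoid this class of failure (not necessarily what the eventual fix implements) is to process payload files one at a time instead of collecting them all up front. The helper below is a hypothetical sketch, not Tessera code:

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.BiConsumer;
import java.util.stream.Stream;

// Hypothetical streaming alternative: only one file's contents are held in
// memory at a time, so heap usage stays flat regardless of store size.
public class StreamingLoadSketch {

    static void forEachPayload(Path payloadsDir, BiConsumer<Path, byte[]> handler) throws IOException {
        try (Stream<Path> files = Files.list(payloadsDir)) {
            files.forEach(path -> {
                try {
                    handler.accept(path, Files.readAllBytes(path));
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }
}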

@prd-fox prd-fox added the bug Something isn't working label Apr 18, 2019
@melowe melowe self-assigned this Apr 23, 2019
Krish1979 added a commit that referenced this issue Apr 23, 2019
Fix #705 out of memory error thrown by Files.readAllBytes
@prd-fox prd-fox reopened this Apr 26, 2019
@prd-fox prd-fox assigned prd-fox and unassigned melowe Apr 26, 2019
@prd-fox (Contributor) commented Apr 29, 2019

Hi @pmitra43

I have submitted a PR to fix this issue - are you able to build it and try it out? It would be great to hear whether it fixes your issue: #719

Thanks
