Memory issue? #55
Hi @JonLaliberte! Thank you for trying the converter, and sorry that it failed on you. The app reads the enex file as a stream and parses the notes one by one, so I can't recall any shared allocated space that could cause a memory leak. But if the note itself is huge, I can imagine that parsing it on its own causes an out-of-memory exception. Regarding the later failure: was there a huge note at that point as well?
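To illustrate the point above, here is a minimal sketch (assumed, not yarle's actual code) of why per-note streaming keeps memory bounded: chunks are accumulated only until a closing `</note>` tag, the complete note is handed off, and then discarded. In real use the chunks would come from `fs.createReadStream('notes.enex', 'utf8')`; plain strings stand in for them here.

```javascript
// Accumulate stream chunks only until a full <note>...</note> element is
// available, process it, then drop it so only one note is ever buffered.
function splitNotes(chunks, onNote) {
  let buf = '';
  for (const chunk of chunks) {
    buf += chunk;
    let end;
    while ((end = buf.indexOf('</note>')) !== -1) {
      const start = buf.indexOf('<note>');
      onNote(buf.slice(start, end + '</note>'.length));
      buf = buf.slice(end + '</note>'.length); // free the processed note
    }
  }
}

// A note split across two chunks is still reassembled correctly:
const notes = [];
splitNotes(['<en-export><note>a</note><no', 'te>b</note></en-export>'],
           (xml) => notes.push(xml));
console.log(notes); // [ '<note>a</note>', '<note>b</note>' ]
```

With this shape, peak memory tracks the largest single note, not the whole export, which is consistent with the observation that only a huge individual note should be able to exhaust the heap.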
IIRC that file I removed was about 35-50 MB. Testing just now by removing notes from the enex file using BBEdit:
@JonLaliberte OK, thanks for the insights, I'll be thinking about a solution.
I just wanted to mention one other thing: this hadn't been happening with the previous release I had tested, v2.8.0. Thanks again for putting the work into this; it's the type of thing people will likely use once and then never again, so it's a bit thankless. Cheers!
:) Thanks man!
Couldn't help myself and did some more testing :) I exported just some very large notes from Evernote (large because of attachments): only 15 notes, but totaling 382 MB. Then I tried a smaller enex file that didn't contain those notes but still contained some other larger notes (up to 9 MB). This one still failed, but it failed on a smallish 230 KB note (which also seemed a bit malformed, with some bad HTML, if that matters). I'd guess there is some kind of memory leak we're hitting: not necessarily caused by large files, but made more obvious by them (the leak is more likely to surface while processing a large file, simply because it is large). I ended up dragging/dropping all of my notes into 3 batches; these enex files were 713 MB (56 notes), 227 MB (93 notes), and 122 MB (1270 notes). I processed them separately and all completed successfully!
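The batching workaround described above can be sketched as a small greedy grouping: keep adding notes to the current batch until a size cap would be exceeded, then start a new batch and convert each one in a separate run. All names and the cap value here are illustrative, not part of the converter.

```javascript
// Group note sizes into batches whose combined size stays under capBytes.
// Returns arrays of note indices; a single oversized note still gets its
// own batch rather than being dropped.
function batchBySize(sizes, capBytes) {
  const batches = [[]];
  let used = 0;
  for (const [i, size] of sizes.entries()) {
    if (used + size > capBytes && batches[batches.length - 1].length > 0) {
      batches.push([]); // close the current batch, start a fresh one
      used = 0;
    }
    batches[batches.length - 1].push(i);
    used += size;
  }
  return batches;
}

// Three notes of 700, 200, and 100 (think MB) with a 750 cap -> 2 batches.
console.log(batchBySize([700, 200, 100], 750)); // [ [ 0 ], [ 1, 2 ] ]
```

Greedy grouping is enough here because the goal is only to keep any one conversion run under the memory ceiling, not to find an optimal packing.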
Thanks @JonLaliberte for this detailed investigation. I'll try to reproduce the problem as a test case, but it will take some time, because there is a rush in my professional job currently as well.
Hi, I'm seeing the same issue. @JonLaliberte said he didn't see it in 2.8.0, so I tried with that version, and saw the issue again. (Both versions failed on the same note.) My logs, in case they help at all:
I'll try to do the same workaround that @JonLaliberte suggested (with splitting). My Evernote notebook has 2098 entries, though, so it might take a while. By the way, fantastic work with the project, @akosbalasko. I already gave you gold on reddit but let us know if there's a way to support this further. As @JonLaliberte says, this is generally a thankless thing, like a defender in football — if everything works right, nobody notices. But this does help a lot of people. |
One more thing. I tried to just increase
That's ~8 GB of memory. But this did not help. Looking at the many different replies to https://stackoverflow.com/questions/13616770/node-js-fatal-error-js-allocation-failed-process-out-of-memory-possible, it might be some infinite loop somewhere or some parser trying to parse something completely ridiculous. Fixing this might be far outside this package's scope. Maybe a
Hi @filiph, hi @JonLaliberte! First of all, thank you both for reporting this issue; you provided really detailed bug reports, which were a great help in localizing the problem. It has taken a bit more time than usual, I'm sorry about that. But finally, I think I found the problematic component (it was the XML parser, btw) and replaced it with a better one. I tested it with two cases besides all the unit tests: one contained 2560 notes with a 500 KB attachment in each, the other consisted of 128 notes with a 50 MB attachment in each. The previous version (2.9.2) failed at the 10th note of the 50 MB batch; the fixed version passes both cases. A new version is coming soon.
Using the 2.9.2 release.
It seems to keep failing at the same place, so I checked the next note in the enex doc and it did contain some large attachments. After removing that note from the enex file, it was able to make it past that note, but it failed on another later on - I assume for the same reason (haven't checked it yet).
<--- Last few GCs --->
[53900:0x110008000] 53655 ms: Mark-sweep 1106.9 (1123.8) -> 1053.4 (1122.3) MB, 88.0 / 0.0 ms (average mu = 0.921, current mu = 0.768) allocation failure scavenge might not succeed
[53900:0x110008000] 53783 ms: Mark-sweep 1056.9 (1122.3) -> 1056.0 (1102.3) MB, 122.7 / 0.0 ms (average mu = 0.799, current mu = 0.044) allocation failure scavenge might not succeed
<--- JS stacktrace --->
Cannot get stack trace in GC.
FATAL ERROR: MarkCompactCollector: semi-space copy, fallback in old gen Allocation failed - JavaScript heap out of memory
1: 0x1011d1c65 node::Abort() (.cold.1) [/usr/local/bin/node]
2: 0x10009f919 node::Abort() [/usr/local/bin/node]
3: 0x10009fa7f node::OnFatalError(char const*, char const*) [/usr/local/bin/node]
4: 0x1001e3867 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
5: 0x1001e3807 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
6: 0x10036b995 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/bin/node]
7: 0x1003c0127 v8::internal::EvacuateNewSpaceVisitor::Visit(v8::internal::HeapObject, int) [/usr/local/bin/node]
8: 0x10039d7fc void v8::internal::LiveObjectVisitor::VisitBlackObjectsNoFail<v8::internal::EvacuateNewSpaceVisitor, v8::internal::MajorNonAtomicMarkingState>(v8::internal::MemoryChunk*, v8::internal::MajorNonAtomicMarkingState*, v8::internal::EvacuateNewSpaceVisitor*, v8::internal::LiveObjectVisitor::IterationMode) [/usr/local/bin/node]
9: 0x10039d118 v8::internal::FullEvacuator::RawEvacuatePage(v8::internal::MemoryChunk*, long*) [/usr/local/bin/node]
10: 0x10039cd86 v8::internal::Evacuator::EvacuatePage(v8::internal::MemoryChunk*) [/usr/local/bin/node]
11: 0x1003c4dae v8::internal::PageEvacuationTask::RunInParallel(v8::internal::ItemParallelJob::Task::Runner) [/usr/local/bin/node]
12: 0x10038dd52 v8::internal::ItemParallelJob::Task::RunInternal() [/usr/local/bin/node]
13: 0x10038e1bf v8::internal::ItemParallelJob::Run() [/usr/local/bin/node]
14: 0x1003a0099 void v8::internal::MarkCompactCollectorBase::CreateAndExecuteEvacuationTasks<v8::internal::FullEvacuator, v8::internal::MarkCompactCollector>(v8::internal::MarkCompactCollector*, v8::internal::ItemParallelJob*, v8::internal::MigrationObserver*, long) [/usr/local/bin/node]
15: 0x10039fb90 v8::internal::MarkCompactCollector::EvacuatePagesInParallel() [/usr/local/bin/node]
16: 0x100392447 v8::internal::MarkCompactCollector::Evacuate() [/usr/local/bin/node]
17: 0x10038fe56 v8::internal::MarkCompactCollector::CollectGarbage() [/usr/local/bin/node]
18: 0x10036bcdf v8::internal::Heap::MarkCompact() [/usr/local/bin/node]
19: 0x100369533 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/usr/local/bin/node]
20: 0x100367a3e v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/usr/local/bin/node]
21: 0x10037390a v8::internal::Heap::AllocateRawWithLightRetry(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/usr/local/bin/node]
22: 0x100373991 v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/usr/local/bin/node]
23: 0x10034135a v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/usr/local/bin/node]
24: 0x100693768 v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [/usr/local/bin/node]
25: 0x1009dcb79 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit [/usr/local/bin/node]
[1] 53899 abort npm run start -- --outputDir=./jon/out --include-metadata
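Since the failures above line up with notes that carry large attachments, one way (a sketch, not part of the converter) to find the culprits before converting is to scan the raw enex text and report the length of each `<note>` element; the commented-out `readFileSync` line and its file name are placeholders.

```javascript
// Report the character length of every <note>...</note> element so
// oversized notes can be located and exported separately.
function noteSizes(xml) {
  const sizes = [];
  let pos = 0;
  for (;;) {
    const start = xml.indexOf('<note>', pos);
    if (start === -1) break;
    const end = xml.indexOf('</note>', start) + '</note>'.length;
    sizes.push(end - start);
    pos = end;
  }
  return sizes;
}

// In real use: const xml = require('fs').readFileSync('notes.enex', 'utf8');
const xml = '<en-export><note>tiny</note><note>' + 'x'.repeat(40) + '</note></en-export>';
console.log(noteSizes(xml)); // [ 17, 53 ]
```

Sorting the result (or pairing it with note titles) would point straight at the notes worth moving into their own batch.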