
Crash with no details on large saves #71

Closed
petersrinivasan opened this issue Nov 6, 2019 · 0 comments · Fixed by #112
@petersrinivasan
I am reorganizing my library of 10,020 sounds, including embedding metadata. My workflow was:

  1. Open the folder in ME.
  2. Export a Core.csv.
  3. Edit Core.csv in Excel (ensuring Excel imported everything as-is; see the diff sketch after this list).
  4. Save the edit as Core-New.csv.
  5. Close all files in ME.
  6. Import Core-New.csv.
  7. Verify the correct info was changed by scanning for green boxes.
  8. Save changes.
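
Regarding step 3, a quick way to confirm the spreadsheet round-trip only changed what was intended is to diff the two CSVs cell by cell. A minimal sketch, assuming both files are UTF-8 and keep the same row order (filenames taken from the steps above):

```python
import csv

# Filenames from the workflow above.
ORIGINAL = "Core.csv"
EDITED = "Core-New.csv"

with open(ORIGINAL, newline="", encoding="utf-8") as f:
    original_rows = list(csv.reader(f))
with open(EDITED, newline="", encoding="utf-8") as f:
    edited_rows = list(csv.reader(f))

header = original_rows[0]

# Compare row by row and report every cell that differs, so intended
# edits can be told apart from anything the spreadsheet mangled
# (stripped zeros, reformatted dates, re-escaped quotes, etc.).
for row_num, (before, after) in enumerate(zip(original_rows, edited_rows), start=1):
    for col, (a, b) in enumerate(zip(before, after)):
        if a != b:
            name = header[col] if col < len(header) else f"col {col}"
            print(f"row {row_num}, {name}: {a!r} -> {b!r}")

if len(original_rows) != len(edited_rows):
    print(f"row count differs: {len(original_rows)} vs {len(edited_rows)}")
```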

This would sometimes bring up a "Saving" progress bar; other times it wouldn't. At some point during the save, if there were enough changes (I don't know the exact cutoff; more on that later), the program would instantly close with no error message.

Some number of files would have had their metadata changed successfully. No BWFs were corrupted, so in theory I could just run the same Core-New.csv again and again. Instead, I chose to split the CSV file into parts and run each part.

2000-line CSV - 100% fail rate
1000-line CSV - ~30% fail rate
500-line CSV - ~10% fail rate
200-line CSV - 0% fail rate

Recall that a "fail" just means having to retry the CSV one or more times. I settled on 1000-line CSVs, which didn't take too long to process all the files.
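
For anyone reproducing this, a minimal sketch of the splitting step, assuming a UTF-8 CSV with a single header row (the chunk size and part filenames are illustrative):

```python
import csv

# Illustrative chunk size; I settled on 1000-line parts.
CHUNK = 1000

with open("Core-New.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.reader(f))

header, body = rows[0], rows[1:]

# Write Core-New-part1.csv, Core-New-part2.csv, ... with the header
# repeated in each file so every part imports on its own.
for i in range(0, len(body), CHUNK):
    part = i // CHUNK + 1
    with open(f"Core-New-part{part}.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(body[i:i + CHUNK])
```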

It should be noted that I don't know what triggers the crash. It could be either the number of fields being saved or the number of files being changed.

Tech specs:
Updated Windows 10
BWF MetaEdit v1.3.1
16 GB RAM
Total RAM usage 40-60% at crash; unlikely to be RAM-related.
Most WAV files are only a few MB each.
No errors on import, either in the Tech > Errors column or the in-app error log.

I didn't realize there was an option for a log file. I tested everything again with the log file active and failed to reproduce the crash: ME changed metadata for nearly 10k files successfully. Two variables were different:

  1. Logging was turned on. (It's possible but unlikely that writing to the log each time gave the processor enough buffer to finish the job; I've seen this happen in other programs.)
  2. I used Google Sheets instead of Excel to edit and generate Core-New.csv. (Unlike Excel, it has an upfront option during import to leave the data completely untouched. It's possible there was some kind of escaped character Excel was exporting wrongly; a rough byte-level check is sketched below. Now that my files are properly stamped, I don't want to continue testing.)
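
On the second point, one rough way to look for spreadsheet-introduced differences would be to inspect the exported CSV at the byte level. A minimal sketch (the filename is from the steps above; the checks are only the usual suspects, not an exhaustive test):

```python
# Check for byte-level artifacts a spreadsheet round-trip can introduce
# into a CSV: a UTF-8 BOM, changed line endings, or non-ASCII bytes
# (e.g. smart quotes substituted for plain ones).
def inspect(path, limit=10):
    with open(path, "rb") as f:
        data = f.read()
    print(path)
    print("  UTF-8 BOM:", data.startswith(b"\xef\xbb\xbf"))
    crlf = data.count(b"\r\n")
    print("  CRLF endings:", crlf, "| bare LF:", data.count(b"\n") - crlf)
    shown = 0
    for offset, byte in enumerate(data):
        if byte > 0x7F and shown < limit:
            context = data[max(0, offset - 15):offset + 15]
            print(f"  non-ASCII byte 0x{byte:02x} at offset {offset}: {context!r}")
            shown += 1

inspect("Core-New.csv")
```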