
dump_data -train creates huge files even for small input #16

Closed
zvik opened this issue Dec 19, 2018 · 1 comment

Comments


zvik commented Dec 19, 2018

It seems there is no check for whether the input ends on the second pass, so if the input data has fewer than 5M frames, it will still generate huge feature files.

I suggest changing https://github.com/mozilla/LPCNet/blob/master/src/dump_data.c#L264 to:

```c
if (!training || one_pass_completed) break;
```
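
For readers without the file open, here is a minimal sketch of how the read loop and the suggested check fit together. This is paraphrased from the discussion in this thread, not the actual src/dump_data.c; the constants, the handling of the training flag, and the loop structure are assumptions for illustration only.

```c
/* Minimal sketch of the dump_data read loop discussed above; paraphrased
 * from this thread, NOT the actual src/dump_data.c. FRAME_SIZE,
 * FRAME_SIZE_5MS and the flag handling are assumptions. */
#include <stdio.h>

#define FRAME_SIZE     160   /* assumed frame length in samples */
#define FRAME_SIZE_5MS  80   /* assumed 5 ms worth of samples */

int main(int argc, char **argv) {
    FILE *f1;
    short tmp[FRAME_SIZE];
    int training = argc > 2;      /* stand-in for the -train argument */
    int one_pass_completed = 0;
    long count = 0;

    if (argc < 2 || (f1 = fopen(argv[1], "rb")) == NULL) return 1;
    while (1) {
        size_t nread = fread(tmp, sizeof(short), FRAME_SIZE, f1);
        if (nread != FRAME_SIZE) {
            /* End of input. The suggested change: stop not only when not
             * training, but also when the input ends on the second pass,
             * instead of rewinding until the sample cap below is hit. */
            if (!training || one_pass_completed) break;
            one_pass_completed = 1;
            rewind(f1);
            continue;
        }
        count++;
        /* ... feature extraction and output writes elided ... */

        /* Pre-existing cap (the line-232 condition quoted in the next
         * comment): the loop keeps producing frames until roughly 10M
         * samples' worth of features exist, which is what inflates the
         * output for small inputs. */
        if (count * FRAME_SIZE_5MS >= 10000000) break;
    }
    fclose(f1);
    return 0;
}
```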

ahmed-fau commented Jul 2, 2019

Hi @zvik,

Thanks for this tip. I also need training/feature data for only one pass. However, I could not understand how the change you suggested should be applied, because the line you referred to now points at the allocation of an array.

Would it be possible to instead remove the condition `count*FRAME_SIZE_5MS >= 10000000` at line 232?

The only thing that changes from one pass to the next is the random number generation, isn't it?

Best
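
For comparison, the alternative raised here, dropping the size cap, would change the loop in the sketch above as follows. With the cap gone, the end-of-input check becomes the only exit while training, so a pass-completion test like `one_pass_completed` is still needed for the rewind to terminate. This is again a hypothetical fragment of the earlier sketch, not upstream code.

```c
/* Hypothetical variant of the sketch above with the size cap removed,
 * per the question about dropping count*FRAME_SIZE_5MS >= 10000000.
 * Fragment only; declarations are as in the earlier sketch. */
size_t nread = fread(tmp, sizeof(short), FRAME_SIZE, f1);
if (nread != FRAME_SIZE) {
    /* With no cap, this is the only stopping point while training,
     * so some pass-completion test is still required here. */
    if (!training || one_pass_completed) break;
    one_pass_completed = 1;
    rewind(f1);
    continue;
}
count++;
/* ... feature extraction as before; no 10M-sample break afterwards ... */
```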

jmvalin closed this as completed Oct 14, 2021