Hyperparameter Optimization for Keras model with large dataset #67
Comments
@NTNguyen13 Indeed it is something being worked on. Just curious: how large is your dataset exactly?
Hi @x94carbone, my dataset has 11000 images, each 30–40 KB.
Hmm ok. Does your model work with Keras' `fit_generator`?
I tried it with small random data and it works; it didn't work in the real case because of the large dataset.
So this likely means that there's nothing wrong with Talos at the moment; we've just gotta implement a feature to get `fit_generator` supported.
Looks like all clear, so closing here.
bump. I'm fairly new to DL and Keras (but not new to other AI and ML), and I gotta say it's amazing that most tools, examples, and academic papers are built around these teeny tiny "toy" datasets like MNIST and CIFAR. Who uses 32x32 pixel images in a real application? Please add my vote for adding Keras Sequence support in Talos so it can be used in real applications.
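For concreteness, a minimal `keras.utils.Sequence` that streams batches from disk, i.e. the kind of input Talos would need to accept, might look like the sketch below. The file-path list, image size, and labels here are assumptions for illustration, not code from this thread:

```python
import math
import numpy as np
from keras.utils import Sequence
from keras.preprocessing.image import load_img, img_to_array

class DiskImageSequence(Sequence):
    """Yields (batch_x, batch_y) pairs loaded from disk on demand,
    so the full dataset never has to sit in memory at once."""

    def __init__(self, image_paths, labels, batch_size=32, target_size=(224, 224)):
        self.image_paths = image_paths      # list of file paths, prepared by the caller
        self.labels = np.asarray(labels)
        self.batch_size = batch_size
        self.target_size = target_size

    def __len__(self):
        # Number of batches per epoch.
        return math.ceil(len(self.image_paths) / self.batch_size)

    def __getitem__(self, idx):
        lo = idx * self.batch_size
        hi = lo + self.batch_size
        batch_x = np.stack([
            img_to_array(load_img(p, target_size=self.target_size)) / 255.0
            for p in self.image_paths[lo:hi]
        ])
        return batch_x, self.labels[lo:hi]
```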
Condition Check:
[x] I'm up-to-date with the latest release.
[x] I've confirmed that my Keras model works outside of Talos.
If you still have an error, please submit the complete trace and the code:
You can provide the code in pastebin / gist or any other format you like.
I want to perform hyperparameter optimization on my Keras model. The problem is that the dataset is quite big; normally in training I use `fit_generator` to load the data in batches from disk, but Talos only supports the `fit` method. I tried to load the whole dataset into memory, using something like this:
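(A minimal sketch of that kind of eager loading; the directory layout, image size, and normalization here are assumptions for illustration, not my exact code:)

```python
import os
import numpy as np
from keras.preprocessing.image import load_img, img_to_array

# Eagerly load every image into one big in-memory array.
# Assumed layout: data/train/<class_name>/<image files>.
data_dir = "data/train"
images, labels = [], []
for class_idx, class_name in enumerate(sorted(os.listdir(data_dir))):
    class_dir = os.path.join(data_dir, class_name)
    for fname in os.listdir(class_dir):
        img = load_img(os.path.join(class_dir, fname), target_size=(224, 224))
        images.append(img_to_array(img))  # float32: 224*224*3*4 bytes ≈ 0.6 MB per image
        labels.append(class_idx)

# ~0.6 MB per image * 11000 images ≈ 6.6 GB, and dividing by
# 255.0 allocates a second full-size copy on top of that.
x = np.stack(images) / 255.0
y = np.asarray(labels)
```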
But when performing `talos.Scan()`, the OS kills the process because of excessive memory usage. I also tried undersampling my dataset to only 10%, but it is still too big. I saw that issue #11 is being worked on, but I wonder: is there any workaround strategy for performing hyperparameter optimization on a large dataset in this case?
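One possible workaround, an assumption to verify rather than something confirmed in this thread, is to shrink the in-memory footprint instead of the sample count: store the pixels as `uint8` in a `numpy.memmap` so pages are read from disk on demand, and normalize per batch inside the model (e.g. a `Lambda(lambda t: t / 255.0)` first layer) rather than over the whole array up front. A minimal sketch, where the file names, shapes, and the commented `talos.Scan` call are assumptions:

```python
import numpy as np

n_images, h, w, c = 11000, 224, 224, 3

# Assumes a one-time preprocessing step has already written all images
# into a single flat uint8 binary file ("images.dat" is a hypothetical path).
x = np.memmap("images.dat", dtype=np.uint8, mode="r", shape=(n_images, h, w, c))
y = np.load("labels.npy")  # hypothetical labels file

# uint8 is 4x smaller than float32: ~1.6 GB on disk here versus ~6.6 GB
# in RAM for the float32 version, and the memmap only pages in the rows
# that fit() actually slices.

# Whether Scan() handles a memmapped array depends on how Talos slices
# the data internally, so this needs to be tested:
# scan = talos.Scan(x=x, y=y, params=p, model=build_model)
```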