
docs(readme): try to naturalize copy #1

Merged (3 commits, May 15, 2019)

@jaredscheib (Contributor) commented on May 1, 2019:

No description provided.

@jaredscheib requested a review from @mkulaczkowski on May 1, 2019.

@jaredscheib (Author, Contributor) left a review comment:

some additional questions for @mkulaczkowski

```diff
  'use_BN': hp.choice('use_BN', [False, True]),
- # Use a first convolution which is special?
+ # Use a first convolution that is special?
```

@jaredscheib (Author, Contributor) commented on May 1, 2019:

should this have a ??

```diff
  'fc_dropout_drop_proba': hp.uniform('fc_dropout_proba', 0.0, 0.6),
- # Use batch normalisation at more places?
+ # Use batch normalization at more places?
```

@jaredscheib (Author, Contributor) commented on May 1, 2019:

should this have a ??
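
For context, the lines quoted in these two comments come from the project's Hyperopt search-space dictionary. A minimal sketch of how such entries are defined and sampled, reconstructed only from the excerpts above (the project's full space has many more entries):

```python
# Minimal sketch reconstructed from the two excerpts above; not the
# project's full search space. The hyperopt imports and calls are real API.
from hyperopt import hp
from hyperopt.pyll.stochastic import sample

space = {
    # Whether to use batch normalization:
    'use_BN': hp.choice('use_BN', [False, True]),
    # Dropout drop probability for the fully-connected part,
    # drawn uniformly from [0.0, 0.6]:
    'fc_dropout_drop_proba': hp.uniform('fc_dropout_proba', 0.0, 0.6),
}

# Draw one random configuration, as a single trial would receive it:
print(sample(space))  # e.g. {'use_BN': True, 'fc_dropout_drop_proba': 0.41...}
```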

```diff
- The final accuracy is of 67.61% in average on the 100 fine labels, and is of 77.31% in average on the 20 coarse labels.
- The results are comparable to the ones in the middle of [that list](http://rodrigob.github.io/are_we_there_yet/build/classification_datasets_results.html#43494641522d313030), under the CIFAR-100 section.
+ The final accuracy is 67.61% on average for the 100 fine labels, and is 77.31% on average for the 20 coarse labels.
+ These results are comparable to the ones in the middle of [this list](http://rodrigob.github.io/are_we_there_yet/build/classification_datasets_results.html#43494641522d313030), under the CIFAR-100 section.
```

@jaredscheib (Author, Contributor) commented on May 1, 2019:

what is the significance of this list? might be worth contextualizing. maybe it's due to my lack of familiarity, but this felt like an arbitrary list on the internet to compare against :)

README.md (outdated):

```diff
- This is an oriented random search, in contrast with a Grid Search where hyperparameters are pre-established with fixed steps increase. Random Search for Hyper-Parameter Optimization (such as what Hyperopt do) has proven to be an effective search technique. The paper about this technique sits among the most cited deep learning papers. To sum up, it is more efficient to search randomly through values and to intelligently narrow the search space rather than looping on fixed sets of values for the hyperparameters.
+ This kind of Oriented Random Search is Hyperopt's strength, as opposed to a simpler Grid Search where hyperparameters are pre-established with fixed-step increases. Random Search for Hyperparameter Optimization has proven to be an effective search technique. The paper about this technique sits among the most cited deep learning papers. In summary, it is more efficient to randomly search through values and intelligently narrow the search space, rather than looping on fixed sets of hyperparameter values.
```
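
To make the contrast concrete, here is a minimal, self-contained sketch of such an oriented random search with Hyperopt's TPE algorithm; the toy quadratic objective is a stand-in for a real training run and is not from this repository:

```python
# Hedged sketch: fmin/tpe/hp/Trials are real Hyperopt API; the quadratic
# objective below stands in for an actual train-and-evaluate run.
from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    # Pretend this trains a model and returns a validation loss to minimize.
    return (params['x'] - 0.3) ** 2

space = {'x': hp.uniform('x', -1.0, 1.0)}

trials = Trials()
best = fmin(
    fn=objective,
    space=space,
    algo=tpe.suggest,  # TPE progressively narrows the search toward good regions
    max_evals=50,
    trials=trials,
)
print(best)  # best hyperparameters found, e.g. {'x': 0.30...}
```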

@jaredscheib (Author, Contributor) commented on May 1, 2019:

what do you think about clarifying whether Hyperopt can do Grid Search? i wasn't sure myself, though i did look it up and saw that it appears not to: hyperopt/hyperopt#200 – i think it's not clear from these docs
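
(For reference: Hyperopt's built-in suggestion algorithms, tpe.suggest and rand.suggest, both sample the space rather than enumerating it, so exhaustive Grid Search indeed appears unsupported out of the box. The closest built-in approximation is a fully discrete space sampled randomly, sketched below with hypothetical grid axes.)

```python
# Sketch of the nearest built-in approximation to a grid: a discrete space
# explored by random sampling (rand.suggest), not exhaustive enumeration.
from hyperopt import fmin, rand, hp

space = {
    'lr': hp.choice('lr', [1e-4, 1e-3, 1e-2]),   # hypothetical grid axes,
    'use_BN': hp.choice('use_BN', [False, True]),  # for illustration only
}

best = fmin(
    fn=lambda p: p['lr'],  # trivial objective, for illustration only
    space=space,
    algo=rand.suggest,     # random sampling over the discrete space;
    max_evals=6,           # may revisit or miss combinations, unlike a grid
)
```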

@jaredscheib (Author, Contributor) commented on May 2, 2019:

and also, could you clarify what "fixed steps increase" meant? my change there may not be technically correct anymore. with a bit of clarification and context, i can come up with something that reads more naturally and is also technically correct.

@SebastianMeler merged commit 4273f3c into master on May 15, 2019.

1 check passed: WIP (Ready for review)

@jaredscheib (Author, Contributor) commented on May 16, 2019:

@SebastianMeler thanks for merging. could you also respond to my various questions above?
