
Supports hyperparameter tuning for nn methods #404

Closed
qychen2001 opened this issue May 23, 2024 · 2 comments

Comments
@qychen2001
Very good work!
Hyperparameter tuning is a common strategy for nn-based methods and usually improves their results. However, I couldn't find a simple implementation. Could you tell me whether support is planned, or whether there is a simple way to implement this?

@weihua916
Contributor

We have a script that does Optuna-based hyperparameter tuning for a variety of deep tabular models: https://github.com/pyg-team/pytorch-frame/blob/master/benchmark/data_frame_benchmark.py

@qychen2001
Author

Thanks for the answer! My problem has been solved.
