How do you set hyperparameters? #3
Hello, with the following command I also got a result of 68.8:
python main.py --train --ckpt_path ckpt --gpu_ids 0 --batch_size 64 --lorb base --modulation Normal --epochs 100 --dataset CREMAD --gs_flag
When I set the batch size to 16 and av_alpha to 0.55, I got 73.6, which is still far from the result in the paper. Are there any other parameters?
@Cecile-hi Hi, I set batch_size=16, epochs=200, lr_decay_step=150, and left the other hyperparameters at their defaults. I got a score of 0.800 on the CREMA-D dataset, which is as good as expected. Here is the figure of the accuracy curve:
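Combining the settings reported above into a single invocation, the command would presumably look like the following. Note that `--lr_decay_step` is an assumed flag name inferred from the parameter name mentioned here; it is not confirmed against the repository's argument parser, so check `main.py` before running.

```shell
# Hypothetical command assembling the reported settings
# (batch_size=16, epochs=200, lr_decay_step=150, other flags from the README).
# --lr_decay_step is an assumed flag name; verify it against main.py's argparse setup.
python main.py --train --ckpt_path ckpt --gpu_ids 0 \
    --batch_size 16 --lorb base --modulation Normal \
    --epochs 200 --lr_decay_step 150 \
    --dataset CREMAD --gs_flag
```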
Hello, I wonder how you set the hyperparameters, such as the learning rate, batch size, and number of epochs. We only obtained a score of 68 on the CREMA-D dataset using the command in the README:
python main.py --train --ckpt_path ckpt --gpu_ids 0 --batch_size 64 --lorb base --modulation Normal --epochs 100 --dataset CREMAD --gs_flag