A few questions about the training stage #5

Closed
jpzhai opened this issue Jul 9, 2022 · 3 comments

Comments

@jpzhai

jpzhai commented Jul 9, 2022

Hello, I'd like to ask: during the training stage, the model is already evaluated on the test set, right?
Also, in the training-stage code:

# Fix the parameters of Batch Normalization after 10000 episodes (1 epoch)
if epoch_item < 1:
    model.train()
else:
    model.eval()

What do these lines mean? And is epoch_item the loop variable taken from for epoch_item in range(opt.epochs):?

@WenbinLee
Owner

Hello. For the first question: yes, the model is evaluated on the test set during training so that we can conveniently monitor its performance.
For the second question: the purpose of this setting is to fix the parameters of the model's BN layers after the first training epoch, which improves the model's generalization ability. It is just a small trick; of course, you can remove it, but then you need to set the model's mode back to model.train() in the training code.
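For context, here is a minimal sketch of how this BN-freezing trick fits into a PyTorch epoch loop. The names epoch_item and opt.epochs follow the snippet quoted above; the set_bn_mode helper and the loop skeleton are illustrative, not the repository's exact code.

def set_bn_mode(model, epoch_item):
    # Train normally during the first epoch so the BatchNorm layers can
    # accumulate their running mean/variance statistics.
    if epoch_item < 1:
        model.train()
    else:
        # model.eval() freezes the BN running statistics: they are still
        # used in the forward pass but no longer updated. Gradients still
        # flow, so the network weights keep training as usual.
        model.eval()

# Usage inside the training script (sketch):
# for epoch_item in range(opt.epochs):
#     set_bn_mode(model, epoch_item)
#     train_one_epoch(model, train_loader, optimizer)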

@jpzhai
Author

jpzhai commented Jul 14, 2022

Thank you for your answer. There is one more thing I don't quite understand: in the training-stage code, does episodeSize correspond to the batch_size? Can its value be changed? I found that after changing it, the labels of the sampled N-way data are always 0-4; will this affect training?
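
For context on why the episode labels are always 0-4: with a standard N-way episodic sampler, the classes drawn for each episode are relabeled 0..N-1 (so 0-4 for 5-way), and the loss is computed against these per-episode labels rather than the global class ids. A minimal sketch of such a sampler follows; the function and variable names are hypothetical and not necessarily this repository's implementation.

import random

def sample_episode(class_to_images, way_num=5, shot_num=1, query_num=15):
    # Draw way_num classes and relabel them 0..way_num-1 for this episode only.
    episode_classes = random.sample(list(class_to_images.keys()), way_num)
    support, query = [], []
    for episode_label, cls in enumerate(episode_classes):
        images = random.sample(class_to_images[cls], shot_num + query_num)
        support += [(img, episode_label) for img in images[:shot_num]]
        query += [(img, episode_label) for img in images[shot_num:]]
    return support, query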

@WenbinLee
Owner

WenbinLee commented Jul 14, 2022 via email
