During eval, getting "ValueError: model_fn should return an EstimatorSpec". During training, OK #9
Thank you for BERT-multi-gpu. I'm running `run_pretraining_gpu_v2.py` on the provided dataset `sample_text.txt`. The only change I made was to the `n_gpus` flag (3, in my case). Training was fine, but I also set `--do_eval=True` (as below). The error below occurred on TF 1.14.0.

Comments

Sorry, I did not try it.

Thank you for the hint, I'll take a look.

I get the same error, and I cannot find any eval result. Can anyone help?

In `def model_fn(features, labels, mode, params):  # pylint: disable=unused-argument`, when `do_eval=True` it still returns a `TPUEstimatorSpec`, which causes the error.

Try another TensorFlow API.

Take a look at the error! Modifying
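As a comment above points out, the crash happens because `model_fn` returns a `tf.contrib.tpu.TPUEstimatorSpec` in eval mode, while a plain (multi-GPU) `Estimator` expects a `tf.estimator.EstimatorSpec`. Below is a minimal sketch of the dispatch the fix needs; it uses stand-in classes rather than real TensorFlow, so the class bodies and the `use_tpu` parameter are illustrative assumptions, not the project's actual code:

```python
# Stand-in for tf.estimator.EstimatorSpec -- what a GPU Estimator expects.
class EstimatorSpec:
    def __init__(self, mode, loss=None):
        self.mode = mode
        self.loss = loss

# Stand-in for tf.contrib.tpu.TPUEstimatorSpec -- only valid with TPUEstimator.
class TPUEstimatorSpec:
    def __init__(self, mode, loss=None):
        self.mode = mode
        self.loss = loss

def model_fn(features, labels, mode, params, use_tpu=False):
    loss = 0.0  # placeholder for the real pretraining loss
    if use_tpu:
        # TPUEstimator wants the TPU-specific spec.
        return TPUEstimatorSpec(mode=mode, loss=loss)
    # A plain Estimator (the multi-GPU path) must receive an EstimatorSpec;
    # returning TPUEstimatorSpec here is what triggers
    # "ValueError: model_fn should return an EstimatorSpec" during eval.
    return EstimatorSpec(mode=mode, loss=loss)

spec = model_fn(None, None, "eval", {}, use_tpu=False)
print(type(spec).__name__)  # EstimatorSpec
```

In real TF 1.x code, an alternative to branching is converting the TPU spec with `TPUEstimatorSpec.as_estimator_spec()` before returning it on the non-TPU path; check the version's API docs, since `tf.contrib` moved around between 1.x releases.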