Environment
Hardware: GPU (P100), Linux
Software environment 1: TF 2.7, transformer-keras 0.4.9
Software environment 2: TF 2.4, transformer-keras 0.3.1
Dataset
Hyperparameters:
Symptoms:
1. The data is read correctly and BERT loads, but model.fit raises an error.
Debug output:
2021-11-10 10:39:52,766 WARNING utils.py 80] Gradients do not exist for variables ['bert/pooler/dense/kernel:0', 'bert/pooler/dense/bias:0'] when minimizing the loss. If you're using `model.compile()`, did you forget to provide a `loss` argument?
975/Unknown - 398s 378ms/step - acc: 0.9998 - loss: 0.0051
Traceback (most recent call last):
  File "run_simcse.py", line 79, in <module>
    save_weights_only=False)
  File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 67, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/usr/local/lib/python3.7/dist-packages/keras/utils/generic_utils.py", line 896, in update
    self._values[k][0] += v * value_base
ValueError: operands could not be broadcast together with shapes (64,) (27,) (64,)
What I have tried
Changing the dataset size to an integer multiple of batch_size did not resolve the issue.
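For what it's worth, the shapes in the ValueError, (64,) vs (27,), look like a full batch of 64 clashing with a final partial batch of 27 when Keras accumulates a vector-valued (per-sample) loss or metric in its progress bar. A minimal sketch of one possible workaround, assuming a tf.data input pipeline (the dataset and names here are hypothetical, not from this repo): drop the remainder batch so every batch has exactly batch_size samples.

```python
import tensorflow as tf

# Hypothetical stand-in for the real dataset; the point is drop_remainder.
# The (64,) vs (27,) shapes in the ValueError suggest the last batch had
# 27 samples instead of 64; dropping it keeps all batches the same size.
batch_size = 64
ds = tf.data.Dataset.range(1000)
ds = ds.batch(batch_size, drop_remainder=True)

batches = list(ds)
# 1000 // 64 = 15 full batches; the 40 leftover samples are discarded.
print(len(batches), int(batches[0].shape[0]))
```

If the custom loss/metric is returning a per-sample vector, reducing it to a scalar per batch (e.g. with tf.reduce_mean) may also sidestep the shape-dependent accumulation, independent of the batching.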
davidADSP/GDL_code#63