When we save a checkpoint, the error F ./tensorflow/core/framework/embedding/value_ptr.h:256] Unsupport GlobalStep in subclass of ValuePtrBase occurs. I also find that the checkpoint is left as a temporary file: best_checkpoint/best.data-00000-of-00001.tempstate11898667549733680686.
Did you restore the model from a checkpoint that was saved with global_step in EmbeddingVariable, while there is no such configuration in the current graph?
I found that feature eviction is triggered when saving the checkpoint. Therefore, I tried enabling GlobalStepEvict for all embedding variables, and it works. So, is it necessary to enable GlobalStepEvict for all embedding variables?
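For reference, this is roughly how I enable GlobalStepEvict on an embedding variable. A minimal sketch based on DeepRec's eviction API as I understand it; the variable name, dimension, and steps_to_live value are placeholders, and this only runs under a DeepRec build of TensorFlow:

```python
import tensorflow as tf

# Hypothetical example values: evict features not touched for 4000 global steps.
evict_opt = tf.GlobalStepEvict(steps_to_live=4000)

# Attach the eviction policy to the EmbeddingVariable via its option object.
ev_opt = tf.EmbeddingVariableOption(evict_option=evict_opt)

# "my_ev" and embedding_dim=16 are illustrative, not from the original graph.
var = tf.get_embedding_variable(
    "my_ev",
    embedding_dim=16,
    initializer=tf.ones_initializer(),
    ev_option=ev_opt,
)
```

With this option set on every embedding variable, the save path no longer hits the Unsupport GlobalStep abort for me, but I am not sure whether enabling it everywhere is the intended usage or just a workaround.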