
Unsupport GlobalStep in subclass of ValuePtrBase #161

Closed
Lihengwannafly opened this issue Apr 14, 2022 · 3 comments

Comments

@Lihengwannafly

When we save a checkpoint, the error F ./tensorflow/core/framework/embedding/value_ptr.h:256] Unsupport GlobalStep in subclass of ValuePtrBase occurs, and the save fails: the checkpoint is left as a temporary file, best_checkpoint/best.data-00000-of-00001.tempstate11898667549733680686.

@liutongxuan
Member

Did you restore your model from one that was saved with global_step in its EmbeddingVariable, while there is no such configuration in your current graph?

@Lihengwannafly
Author

I found that feature eviction is triggered when saving the checkpoint. Therefore, I tried enabling GlobalStepEvict for all embedding variables, and it works. So, is it necessary to enable GlobalStepEvict for all embedding variables?
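
For reference, this is roughly how per-variable feature eviction is configured in DeepRec as I understand it from its documentation; the option names (tf.GlobalStepEvict, tf.EmbeddingVariableOption, tf.get_embedding_variable) and the values used (variable name, embedding_dim, steps_to_live) are assumptions for illustration and may differ in your build, so treat this as a sketch rather than the exact setup from this issue:

```python
import tensorflow as tf

# Assumed DeepRec API: evict features whose last update is older than
# `steps_to_live` global steps.
evict_opt = tf.GlobalStepEvict(steps_to_live=4000)
ev_opt = tf.EmbeddingVariableOption(evict_option=evict_opt)

# Hypothetical embedding variable with eviction enabled; name and
# embedding_dim are illustrative only.
emb_var = tf.get_embedding_variable(
    "item_embedding",
    embedding_dim=16,
    initializer=tf.ones_initializer(),
    ev_option=ev_opt)
```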

@Lihengwannafly
Author

It may have been an issue with our compilation environment; there is no such problem with the official image.
