Can't import meta graph: KeyError: 'InfeedEnqueueTuple' #426
This needs more details, or I vote we close it.
For me, the same problem has occurred:
I think this is due to a difference between GPU and TPU.
Things I've tried without success:
This seems to work in Colab.
Can I get some more details about what we're trying to do and why we expect it to work? I'm still not sure what the bug is, and I'd like to see something about the problem we're trying to solve here.
I didn't realize there was confusion. Problem: you can't import the .meta file. It's certainly expected that import_meta_graph should work (e.g. there's no callout in the API documentation saying it doesn't work for large models or anything). Consequence: model manipulation becomes slightly harder (for things like oneoff/l2_cost and the lz-to-mg converter). I'm 98% sure this has to do with TPUs, because the InfeedEnqueueTuple op is (as far as I know) TPU-specific and probably not loaded/supported when no TPU is found. I see this affects other people [1], but everyone seems to just suggest hacking around the problem (export as .pb, load tensors manually, ...).
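To make the failure mode concrete, here is a hedged, pure-Python toy sketch of the mechanism (the registry and function names are illustrative, not TensorFlow internals): importing a serialized graph resolves every node's op type against the runtime's table of registered ops, and a TPU-only op like InfeedEnqueueTuple has no entry in a CPU/GPU build, so the lookup raises KeyError with the op name.

```python
# Illustrative subset of ops a CPU/GPU build would have registered.
# (Toy model only -- not TensorFlow's actual op registry.)
CPU_GPU_OP_REGISTRY = {
    "VariableV2": "ok",
    "MatMul": "ok",
    "Add": "ok",
}

def import_graph(node_ops):
    # Each node's op type must resolve in the registry; a miss raises
    # KeyError carrying the op name, matching the error in this issue.
    return [CPU_GPU_OP_REGISTRY[op] for op in node_ops]

# A meta graph exported from a TPU run still contains infeed ops:
tpu_node_ops = ["VariableV2", "InfeedEnqueueTuple", "MatMul"]
try:
    import_graph(tpu_node_ops)
except KeyError as err:
    print("KeyError:", err)  # KeyError: 'InfeedEnqueueTuple'
```

The same lookup succeeds if the graph contains only ops the build knows about, which is why the import works in a TPU environment (e.g. Colab with a TPU runtime) but not locally.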
From a different thread, the proposed solution is to "use one of the export functions, e.g. (TPU)Estimator.export_savedmodel() for inference". We might try this at a later date, but the main blocker (leela-zero/leela-zero#2133) has been fixed a different way.
I tried tensorflow (installed from pip3) and tf-nightly.
Full error: