This repository was archived by the owner on Jul 7, 2023. It is now read-only.

Description
layers/common_hparams.py mentions a hyperparameter "pretrained_model_dir":
"Directory containing a checkpoint for a pretrained model. This will only be used if a new run is being started. Parameters not found in the pretrained model will be randomly initialized. Superfluous parameters in the pretrained model will be ignored."
It sounds useful; however, as far as I can tell, the hyperparameter is not actually hooked up anywhere. The name of the parameter does not appear anywhere else in the sources of tensor2tensor, tensorflow, or tensorflow_estimator.
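For context, the semantics described in the docstring (restore matching parameters, randomly initialize the rest, ignore extras) amount to an intersection over variable names, which is what one would typically feed to something like `tf.train.init_from_checkpoint`. A minimal sketch of that matching logic, using hypothetical variable-name sets rather than a real checkpoint:

```python
def build_assignment_map(graph_var_names, ckpt_var_names):
    """Map checkpoint variables onto graph variables by name.

    Variables present in both sets are restored from the checkpoint;
    graph variables missing from the checkpoint keep their random
    initialization; checkpoint variables absent from the graph
    ("superfluous parameters") are simply ignored.
    """
    return {name: name for name in graph_var_names if name in ckpt_var_names}

# Hypothetical example: 'embedding' and 'encoder/w' are restored,
# 'new_head' stays randomly initialized, and the checkpoint's
# extra 'old_head' is ignored.
graph_vars = {"embedding", "encoder/w", "new_head"}
ckpt_vars = {"embedding", "encoder/w", "old_head"}
assignment_map = build_assignment_map(graph_vars, ckpt_vars)
```

This is only an illustration of the documented behavior, not code from tensor2tensor; nothing resembling it appears to be wired to `pretrained_model_dir` in the repository.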
Is this feature obsolete or not yet implemented?