TransferLearning#nOutReplace reverts changed number of inputs to next layer if that layer is also changed #6343
The test case below fails on Windows 10 using beta 2 (both the CPU and CUDA 9.2 backends).
The test case attempts to change nOut of two consecutive layers, which results in nIn of the second layer being reverted to its original value. Inspecting the code in TransferLearning#nOutReplace confirms this: the config to modify is taken from the original graph, and the modified config is put directly into the editedConfigBuilder, overwriting any prior changes.
Taking the config to modify from the editedConfigBuilder, or from editedVertices if present, could be a way forward.
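To make the overwrite mechanism concrete, here is a minimal, self-contained toy model of the behavior described above. It is not DL4J code: the `Layer`, `nOutReplaceBuggy`, and `nOutReplaceFixed` names and the map-based "config" are all hypothetical stand-ins. The buggy variant always reads the layer config from the original graph (as nOutReplace currently does), while the fixed variant prefers the already-edited config, matching the suggested fix.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical toy model of the nOutReplace overwrite bug; NOT real DL4J code.
public class NOutReplaceSketch {
    // Minimal stand-in for a layer config: just its input and output sizes.
    static final class Layer {
        final int nIn, nOut;
        Layer(int nIn, int nOut) { this.nIn = nIn; this.nOut = nOut; }
    }

    static Map<String, Layer> original = new HashMap<>();
    static Map<String, Layer> edited   = new HashMap<>();

    // Buggy variant: the config to modify is always read from the ORIGINAL
    // graph, so an nIn change made by an earlier call is silently clobbered.
    static void nOutReplaceBuggy(String layer, int newNOut, String next) {
        Layer cfg = original.get(layer);                  // ignores prior edits
        edited.put(layer, new Layer(cfg.nIn, newNOut));
        if (next != null) {
            Layer n = original.get(next);
            edited.put(next, new Layer(newNOut, n.nOut)); // propagate nIn change
        }
    }

    // Fixed variant: prefer the already-edited config if one exists.
    static void nOutReplaceFixed(String layer, int newNOut, String next) {
        Layer cfg = edited.getOrDefault(layer, original.get(layer));
        edited.put(layer, new Layer(cfg.nIn, newNOut));
        if (next != null) {
            Layer n = edited.getOrDefault(next, original.get(next));
            edited.put(next, new Layer(newNOut, n.nOut));
        }
    }

    static int buggyNIn, fixedNIn;

    public static void main(String[] args) {
        original.put("dense1", new Layer(784, 100));
        original.put("dense2", new Layer(100, 50));

        // Buggy: the second call reverts dense2's nIn from 10 back to 100.
        edited.clear();
        nOutReplaceBuggy("dense1", 10, "dense2");
        nOutReplaceBuggy("dense2", 7, null);
        buggyNIn = edited.get("dense2").nIn;  // 100, the original value

        // Fixed: dense2 keeps nIn = 10 set by the first call.
        edited.clear();
        nOutReplaceFixed("dense1", 10, "dense2");
        nOutReplaceFixed("dense2", 7, null);
        fixedNIn = edited.get("dense2").nIn;  // 10

        System.out.println("buggy nIn=" + buggyNIn + ", fixed nIn=" + fixedNIn);
    }
}
```

The sketch prints `buggy nIn=100, fixed nIn=10`: with the current lookup order, changing nOut of two consecutive layers loses the propagated nIn on the second, while consulting the edited state first preserves it.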