Closed as not planned
Labels
user interface — Changes to the user interface and improvements in usability
Description
Currently, we record metrics (e.g. "loss") for nested networks like this:
```python
{"loss": 0.2294, "loss/inference_loss": 0.1227, "loss/summary_loss": 0.1067}
```

This is nice because:
- We can see the composition of the loss in the progress bar
- The nested metrics are individually logged to TensorBoard
- Nested metrics from different training runs can be compared directly in TensorBoard
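As a quick sketch of the first point: the flat "/"-separated keys make the composition of the total visible directly. In the example dict above, the children sum to the parent (the additive relationship is taken from the numbers shown and is only illustrative):

```python
import math

# Example metrics dict from above: parent "loss" plus its "loss/"-prefixed children.
metrics = {"loss": 0.2294, "loss/inference_loss": 0.1227, "loss/summary_loss": 0.1067}

# Collect the values of all children of "loss".
children = [v for k, v in metrics.items() if k.startswith("loss/")]

# In this example, the parent is (up to rounding) the sum of its children.
assert math.isclose(sum(children), metrics["loss"], abs_tol=1e-4)
```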
However, when no summary network is present, there is some redundancy:
```python
{"loss": 0.1227, "loss/inference_loss": 0.1227}
```

Here, the "loss/inference_loss" key is clearly unnecessary in the progress bar. We would still like to log it, though, for reasons 2 and 3 above. In this case, we would want to drop this key so we end up with just
```python
{"loss": 0.1227}
```

We can fix this issue by:
- using a custom `keras.callbacks.ProgbarLogger` or, analogously, a custom `keras.utils.Progbar`
- constructing a graph structure where child metrics like "loss/inference_loss" are assigned to their parent "loss"
- dropping single graph leaves (i.e., ones where the parent only has a single child)
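The last two steps could be sketched roughly as follows (the helper name `drop_single_children` is hypothetical; in a real fix, this filtering would live inside the custom progress-bar logger, while the unfiltered dict is still logged to TensorBoard):

```python
from collections import defaultdict

def drop_single_children(metrics: dict) -> dict:
    """Drop "parent/child" keys whose parent has exactly one child.

    Hypothetical helper: builds the parent -> children grouping from the
    "/"-separated key names and removes redundant single leaves.
    """
    children = defaultdict(list)
    for key in metrics:
        parent, sep, _ = key.rpartition("/")
        if sep and parent in metrics:
            children[parent].append(key)

    # A child is redundant if it is its parent's only child.
    redundant = {kids[0] for kids in children.values() if len(kids) == 1}
    return {k: v for k, v in metrics.items() if k not in redundant}

# Parent with two children: nothing is dropped.
full = {"loss": 0.2294, "loss/inference_loss": 0.1227, "loss/summary_loss": 0.1067}
assert drop_single_children(full) == full

# Parent with a single child: the redundant leaf is dropped.
assert drop_single_children({"loss": 0.1227, "loss/inference_loss": 0.1227}) == {"loss": 0.1227}
```

Only the progress-bar copy of the metrics would be filtered this way; the full dict still reaches TensorBoard, preserving benefits 2 and 3.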
We could also redesign how the metrics are recorded, but it is important that we do not remove the benefits shown above.
Unfortunately, with the current state of metrics, we cannot fix this issue by:
- using nested dictionaries for the metrics (not supported by keras)
- hard-coding which keys to drop (since we do not know about user-defined metrics internally)