In this mini-project, I show how eager execution can help speed up the model training process.
Steps I followed to conduct the experiments:
- I kept the environment, model configuration, and dataset (FashionMNIST) identical across the experiments; the only variable was the TensorFlow version.
- I ran thorough profiling to find out what makes execution in TensorFlow 1.14 slow, and the bottleneck turned out to be Sessions.
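The profiling step above can be sketched with Python's built-in `cProfile`. Note that `train_step` here is a hypothetical stand-in for the real per-batch work (a `session.run()` call under TensorFlow 1.14, or a direct function call under eager execution in 2.0), not the code from the actual experiments:

```python
import cProfile
import io
import pstats
import time

def train_step(batch):
    # Hypothetical stand-in for one training step. In the real experiments
    # this would be a session.run() call (TF 1.14) or an eager call (TF 2.0).
    time.sleep(0.001)
    return sum(batch)

def train(num_steps=50):
    for step in range(num_steps):
        train_step([step, step + 1])

# Profile the training loop and collect the stats into a string buffer.
profiler = cProfile.Profile()
profiler.enable()
train()
profiler.disable()

stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(5)  # show the top 5 functions by cumulative time
print(stream.getvalue())
```

Sorting by cumulative time makes it easy to spot which call dominates the loop; in the actual runs, this is how the Session overhead showed up.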
Apart from these, I used Weights and Biases to log the CPU usage and memory footprint of each experiment. I was amazed to find that TensorFlow 2.0 was much more performant in terms of CPU usage as well. Here are some snapshots: