Add a tutorial to visualize activation heatmaps
Deep learning models are black boxes by nature. From a business-oriented perspective, it is important to understand how a model arrives at a particular prediction, and more specifically, which parts of the input signal contribute to that prediction. This is helpful for all business stakeholders.
The field of computer vision has progressed rapidly thanks to modern deep learning. Today, it is entirely possible to train a high-quality image classification model in very little time. Model interpretability, however, remains a major challenge, and an effective deep learning practitioner should have the right tools to explain their models.
The objective of this task is to write a tutorial that takes a trained tf.keras
image classification model and an image (or a set of images) and shows how to produce an activation heatmap overlaid on the input image, like the one shown here. From the figure, you can see what excited the model into generating its prediction.
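Techniques such as Grad-CAM compute the heatmap itself from the model's gradients (e.g. via `tf.GradientTape`), which the tutorial would cover in full. As a minimal sketch of just the final overlay step, the NumPy snippet below shows one way to normalize a low-resolution class activation map, upsample it, and alpha-blend it over the input image. The function name, the red-to-blue color ramp, and the nearest-neighbour upsampling via `np.kron` are all illustrative choices, not part of any specific library API; a real notebook would more likely use a matplotlib colormap and `cv2.resize` or `tf.image.resize`.

```python
import numpy as np

def overlay_heatmap(image, heatmap, alpha=0.4):
    """Blend a low-resolution activation heatmap onto an RGB image.

    image:   float array in [0, 1], shape (H, W, 3).
    heatmap: float array, shape (h, w), where H and W are integer
             multiples of h and w.
    alpha:   opacity of the heatmap overlay.
    """
    # Normalize the heatmap to [0, 1]; guard against an all-zero map.
    heatmap = np.maximum(heatmap, 0.0)
    peak = heatmap.max()
    heatmap = heatmap / peak if peak > 0 else heatmap

    # Nearest-neighbour upsampling to the image resolution.
    sy = image.shape[0] // heatmap.shape[0]
    sx = image.shape[1] // heatmap.shape[1]
    heatmap = np.kron(heatmap, np.ones((sy, sx)))

    # Map intensity onto a simple blue-to-red ramp (stand-in for a
    # proper colormap such as matplotlib's "jet").
    colored = np.stack(
        [heatmap, np.zeros_like(heatmap), 1.0 - heatmap], axis=-1
    )

    # Alpha-blend the colored heatmap over the original image.
    return (1.0 - alpha) * image + alpha * colored

# Toy usage: a flat gray image with a 2x2 activation map.
image = np.full((8, 8, 3), 0.5)
heatmap = np.array([[0.0, 1.0],
                    [0.5, 0.25]])
overlay = overlay_heatmap(image, heatmap)  # shape (8, 8, 3)
```

In the actual tutorial, `image` would be the preprocessed input and `heatmap` the pooled-gradient-weighted activation map from the last convolutional layer; the blended result is what gets displayed in the notebook.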
The final deliverable for this assignment should be a Colab notebook.