Description
Issue #9412 explains how to keep dropout active at test time in order to sample from the approximate Bayesian posterior that MC dropout provides.
However, is it possible in Keras to fix which weights are dropped across a whole set of predictions (e.g. across the entire range of inputs), so that each call to model.predict(data) returns predictions from a single smooth function, rather than the current behaviour of sampling a separate function for each individual input?
Doing so would allow visualisations like those in the following blog post:
http://www.cs.ox.ac.uk/people/yarin.gal/website/blog_2248.html
As mentioned in that post, written by the inventor of MC dropout, fixing the dropped weights across all test inputs makes for a better visualisation.
Does anyone have a solution for fixing the dropout mask when using the Keras Dropout layer?
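For reference, here is a minimal NumPy sketch of the idea being asked about (not the Keras API; the network, weights, and `mc_dropout_predict` helper are all hypothetical). The key point is that the dropout mask is sampled once per "function draw" and shared across every input in the batch, so one call yields one smooth function of x:

```python
import numpy as np

def mc_dropout_predict(x, W1, b1, W2, b2, rate=0.5, rng=None):
    """One MC-dropout function draw for a tiny 1-hidden-layer MLP.
    The mask is sampled ONCE and applied to ALL inputs, so this call
    evaluates a single sampled function, not a fresh one per example.
    (Illustrative sketch only, not how Keras implements Dropout.)"""
    rng = np.random.default_rng() if rng is None else rng
    h = np.maximum(x @ W1 + b1, 0.0)          # hidden layer, ReLU
    # One Bernoulli mask per hidden unit, shared across the whole batch.
    mask = rng.random(W1.shape[1]) >= rate
    h = h * mask / (1.0 - rate)               # inverted-dropout scaling
    return h @ W2 + b2

# Usage: fixing the RNG seed fixes the mask, i.e. one function draw.
init = np.random.default_rng(0)
W1 = init.normal(size=(1, 32)); b1 = np.zeros(32)
W2 = init.normal(size=(32, 1)); b2 = np.zeros(1)
xs = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
y_a = mc_dropout_predict(xs, W1, b1, W2, b2, rng=np.random.default_rng(1))
y_b = mc_dropout_predict(xs, W1, b1, W2, b2, rng=np.random.default_rng(1))
assert np.allclose(y_a, y_b)  # same seed -> identical smooth curve
y_c = mc_dropout_predict(xs, W1, b1, W2, b2, rng=np.random.default_rng(2))
```

Repeating this with different seeds gives the family of sampled functions shown in the blog post's plots, instead of per-point noise.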
Originally posted by @hjliag in #9412 (comment)