MC dropout - fixing dropped weights at test time to sample smooth functions #14172

@charliekirkwood

Description

Issue #9412 explains how to keep dropout active at test time in order to sample from the approximate Bayesian posterior that MC dropout provides.
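To make the setup concrete, here is a minimal, framework-agnostic sketch of MC dropout sampling. The toy one-hidden-layer network, its random weights, and the dropout rate are all assumptions for illustration; the issue does not specify a model. Each forward pass draws a fresh Bernoulli mask, so repeated calls give independent samples from the approximate posterior:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical single-hidden-layer network with random weights
# (an assumption for illustration; the issue specifies no model).
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))
rate = 0.5  # dropout probability

def predict_mc(x):
    """One MC-dropout forward pass: a fresh mask is drawn on every call."""
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
    mask = rng.random(h.shape) > rate      # per-example Bernoulli keep-mask
    return (h * mask / (1.0 - rate)) @ W2  # inverted-dropout scaling

x = np.linspace(-1.0, 1.0, 50).reshape(-1, 1)

# Stacking repeated stochastic passes gives MC samples of the predictive
# distribution; their spread estimates the model's uncertainty.
samples = np.stack([predict_mc(x) for _ in range(20)])
print(samples.shape)  # (20, 50, 1)
```

Because the mask in `predict_mc` is sampled per example, every input row is effectively passed through a different thinned network, which is exactly what the question below is asking to avoid.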

However, is it possible in Keras to fix which weights are dropped for a whole set of predictions (e.g. across the full range of inputs), so that each call to predict(model, data) yields predictions from a single smooth function, rather than the current behaviour of effectively predicting from a different thinned network for each individual input?

Doing so would allow visualisations like those in the following post:

http://www.cs.ox.ac.uk/people/yarin.gal/website/blog_2248.html

As mentioned in this blog post by the inventor of MC dropout, fixing the dropped weights across all test inputs makes for better visualisations.

Does anyone have a solution for fixing the dropout mask using Keras's Dropout layer?
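One possible approach, sketched here in plain NumPy under the same assumed toy network (hypothetical weights and layer sizes, not the issue author's model): sample the mask once with a leading dimension of 1, so broadcasting applies the identical set of dropped units to every input. In Keras, the analogous trick would be a Dropout layer constructed with `noise_shape=(1, units)` and all inputs passed in a single batch, though that is a suggestion rather than something confirmed in this thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-hidden-layer network with random weights
# (illustration only; the issue specifies no model).
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))
rate = 0.5

def predict_fixed(x, mask):
    """Forward pass with a dropout mask shared across all inputs."""
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
    return (h * mask / (1.0 - rate)) @ W2  # inverted-dropout scaling

x = np.linspace(-1.0, 1.0, 50).reshape(-1, 1)

# Sample the mask ONCE with shape (1, units): broadcasting drops the same
# hidden units for every input, so the prediction is one smooth function.
mask = rng.random((1, 32)) > rate
y1 = predict_fixed(x, mask)
y2 = predict_fixed(x, mask)  # same mask -> identical, deterministic function
assert np.allclose(y1, y2)

# Resampling the mask draws a NEW function from the approximate posterior,
# which is how the smooth per-sample curves in Gal's blog are produced.
y3 = predict_fixed(x, rng.random((1, 32)) > rate)
print(y1.shape)  # (50, 1)
```

Plotting several such `y3` curves over `x`, each from its own fixed mask, reproduces the smooth function samples shown in the linked visualisation.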

Originally posted by @hjliag in #9412 (comment)


Labels

type:support (User is asking for help / asking an implementation question. Stack Overflow would be better suited.)
