Hi,
I have a trained neural network. How can I use the Uncertainty Toolbox to quantify the uncertainty of the neural network?
How should I calculate predictions_std (a vector of standard deviations)?
Hi, thanks for your interest in the toolbox! A general trained neural network on its own may not have a notion of uncertainty to quantify. Are you training the network for classification or regression? As of now, our library only supports regression.
If you are doing regression, one easy way to incorporate uncertainty for your neural network is to train it to predict a normal distribution, rather than a single point. Then, your network will predict the mean and standard deviation, which you can directly plug into the uncertainty toolbox. We have an example of this here: https://github.com/uncertainty-toolbox/simple-uq
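As a minimal sketch of that idea in plain NumPy (the data, the linear "network", and all hyperparameters below are made up for illustration, not taken from the simple-uq repo): train the model by gradient descent on the Gaussian negative log-likelihood, so it learns a mean and a standard deviation jointly. The resulting `pred_std` is exactly the vector of standard deviations the toolbox's metrics expect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with known noise std 0.5 (assumed for illustration).
x = rng.uniform(-1.0, 1.0, size=200)
y = 2.0 * x + rng.normal(0.0, 0.5, size=200)

# A toy stand-in for a network: a linear mean (w, b) plus a learned log-std.
w, b, log_std = 0.0, 0.0, 0.0
lr = 0.1

for _ in range(2000):
    mean = w * x + b
    std = np.exp(log_std)
    # Gradients of the Gaussian NLL: 0.5 * ((y - mean) / std)**2 + log_std (+ const)
    err = (mean - y) / std**2
    gw = np.mean(err * x)
    gb = np.mean(err)
    gs = np.mean(1.0 - ((y - mean) / std) ** 2)
    w -= lr * gw
    b -= lr * gb
    log_std -= lr * gs

pred_mean = w * x + b
pred_std = np.full_like(x, np.exp(log_std))  # the vector of standard deviations
```

With `pred_mean` and `pred_std` in hand, they can be passed directly to the toolbox's evaluation functions alongside the true `y`. A real deep network would replace the linear mean with its own forward pass and typically output a per-input (heteroscedastic) standard deviation rather than a single shared one.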
Hi,
I am stuck at recalibration. I got the mean and SD predictions from a DNN and computed the average calibration using this toolbox. What do I have to do now to perform recalibration? Do I need to set aside validation data to recalibrate further?
Hello, you do need to set aside a separate validation set in order to recalibrate. I would look at the toolbox tutorial if you haven't already; the end of the tutorial has an example of recalibration for this case.
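To illustrate the underlying idea (this is a generic scalar std-scaling sketch in plain NumPy, not the toolbox's own recalibration routine, and the validation data and 2x overconfidence are made-up assumptions): fit a single scale factor on the held-out set so the rescaled predictive standard deviations match the observed errors, then apply that factor to all future predictions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical held-out validation data: the model's means are fine,
# but its predicted stds are overconfident by a factor of 2.
y_val = rng.normal(0.0, 1.0, size=1000)
pred_mean = np.zeros_like(y_val)
pred_std = np.full_like(y_val, 0.5)  # too small; the true noise std is 1.0

# Std scaling: choose one scalar s so the validation z-scores
# (y - mean) / (s * std) have unit second moment, then rescale every std.
z = (y_val - pred_mean) / pred_std
s = float(np.sqrt(np.mean(z ** 2)))
recal_std = s * pred_std

# Sanity check: after rescaling, the central 68% interval (about +/- 1 std
# for a Gaussian) should cover roughly 68% of the validation points.
cov68 = float(np.mean(np.abs(y_val - pred_mean) <= recal_std))
```

The same fitted `s` is then reused on the test set's predicted stds before computing calibration metrics. The toolbox tutorial's isotonic-regression recalibration is more flexible (it can fix miscalibration that a single scalar cannot), but the workflow is the same: fit on validation data, apply to test predictions.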