FR: Enable forward method of Bayesian Layers to return value only for smoother integration with PyTorch #7

@piEsposito

Description

It would be nice if we could store the KL divergence value as an attribute of the Bayesian layers and return it from the forward method only when needed.

That would reduce friction when integrating with PyTorch, letting users "plug and play" bayesian-torch layers in deterministic models.

It would be something like this:

def forward(self, x, return_kl=False):
    ...
    # store the KL term on the layer so it can be read later
    self.kl = kl

    if return_kl:
        return out, kl
    return out

We could then retrieve the KL values from the Bayesian layers when computing the loss, with no breaking changes for existing users, which might encourage more people to try the lib.
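To make the idea concrete, here is a minimal, self-contained sketch of the proposed pattern. The layer, its toy KL computation, and the `get_kl_loss` helper are illustrative assumptions, not the actual bayesian-torch API: the point is only that `forward` returns the output alone by default, the KL term lives on the module as `self.kl`, and the loss code collects it afterwards.

```python
# Hypothetical sketch of the proposed pattern (names BayesianLinear and
# get_kl_loss are illustrative, not the real bayesian-torch API).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # Toy variational parameters; a real layer samples bias too, etc.
        self.mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.kl = torch.tensor(0.0)

    def forward(self, x, return_kl=False):
        sigma = torch.log1p(torch.exp(self.rho))          # softplus
        weight = self.mu + sigma * torch.randn_like(sigma)  # reparameterization
        out = F.linear(x, weight)
        # KL(q(w) || N(0, 1)), stored on the module instead of returned.
        self.kl = 0.5 * (sigma**2 + self.mu**2 - 2 * torch.log(sigma) - 1).sum()
        if return_kl:
            return out, self.kl
        return out

def get_kl_loss(model):
    # Sum the stored KL terms from every Bayesian layer in the model.
    return sum(m.kl for m in model.modules() if hasattr(m, "kl"))

# Plugs straight into a deterministic container with no signature changes:
model = nn.Sequential(BayesianLinear(4, 8), nn.ReLU(), BayesianLinear(8, 2))
out = model(torch.randn(3, 4))
kl = get_kl_loss(model)
loss = F.mse_loss(out, torch.zeros(3, 2)) + kl / 100  # ELBO-style objective
```

The design choice here is that the default call path is indistinguishable from a plain `nn.Linear`, so containers like `nn.Sequential` work unmodified, while `return_kl=True` preserves the current two-value behavior for callers that want it.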

I can work on that as well.

Labels: enhancement (New feature or request)