About the y_bpp and the Gaussian entropy model #326

@YuKDseele


Hello, I have a question about y_bpp and normalization.

In the implementation of the Gaussian entropy model in CompressAI (GaussianConditional._likelihood), y_bpp is computed by estimating the likelihood after normalizing the input:

half = float(0.5)

# Center the input on the predicted mean, if one is provided
if means is not None:
    values = inputs - means
else:
    values = inputs

# Clamp the scales from below for numerical stability
scales = self.lower_bound_scale(scales)

# Probability mass of the unit-width bin around each value; by the
# symmetry of the Gaussian this depends only on |values|
values = torch.abs(values)
upper = self._standardized_cumulative((half - values) / scales)
lower = self._standardized_cumulative((-half - values) / scales)
likelihood = upper - lower
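
For reference, my understanding is that this computes the probability mass of the unit-width quantization bin around y under a Gaussian with mean $\mu$ (means) and scale $\sigma$ (the lower-bounded scales):

$$
p(\hat{y}) = \Phi\!\left(\frac{0.5 - |y - \mu|}{\sigma}\right) - \Phi\!\left(\frac{-0.5 - |y - \mu|}{\sigma}\right)
           = \Phi\!\left(\frac{y - \mu + 0.5}{\sigma}\right) - \Phi\!\left(\frac{y - \mu - 0.5}{\sigma}\right),
$$

where $\Phi$ is the standard normal CDF and the second equality uses $\Phi(-x) = 1 - \Phi(x)$. y_bpp is then $-\log_2 p(\hat{y})$ averaged over pixels, so the torch.abs looks to me like a numerically convenient rewriting rather than a different model, but I would like to confirm this.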

Does this mean that during actual training it is torch.abs(y - means) / self.lower_bound_scale(scales), rather than (y - means) / scales, that is fitted to the standard normal distribution?
I need to normalize the latent variable y into an approximately standard (spherical) normal vector in order to compute spatial correlations, as in the sketch below.
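
For concreteness, this is a minimal sketch of the normalization I have in mind, using CompressAI's GaussianConditional; y, means, and scales here are dummy tensors standing in for my model's actual outputs:

import torch
from compressai.entropy_models import GaussianConditional

# Dummy stand-ins for the latent and the parameters predicted by the
# hyperprior; in a real model these come from the encoder / hyper-decoder.
y = torch.randn(1, 192, 16, 16)
means = torch.zeros_like(y)
scales = torch.full_like(y, 0.5)

gaussian_conditional = GaussianConditional(None)

# Apply the same lower bound on the scales that the entropy model uses
scales = gaussian_conditional.lower_bound_scale(scales)

# Whitened latent: if y ~ N(means, scales^2) elementwise, z should be
# approximately standard normal; note there is no torch.abs here.
z = (y - means) / scales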
