
Latent codec: more natural initialization #224

Merged (6 commits, Apr 15, 2023)

Conversation

@YodaEmbedding (Contributor) commented Apr 11, 2023

  • docs: updates for sphinx-book-theme==1.0.0 (fixes warnings)
  • latent codec: Use self-documenting keyword arguments instead of _setdefault et al.
  • latent codec: Got rid of redundant N argument.

To do:

  • Test everything again.

@YodaEmbedding force-pushed the refactor/latent_codec branch 2 times, most recently from 401284d to 5efcc84 (April 12, 2023 04:32)
Use self-documenting keyword arguments instead of `_setdefault` et al.
# A HyperpriorLatentCodec is made of "hyper" and "y" latent codecs.
latent_codec={
# Side-information branch with entropy bottleneck for "z":
"hyper": HyperLatentCodec(
N,
h_a=h_a,
h_s=h_s,
entropy_bottleneck=EntropyBottleneck(N),
Collaborator:
entropy_bottleneck=EntropyBottleneck(M) ?

I see; for the Cheng / SenseTime models, it's still N.
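For context, here is a minimal sketch of the channel bookkeeping behind this question (stub classes, not the real compressai implementation), assuming the usual hyperprior layout where h_a maps y (M channels) down to z (N channels): the hyper branch's entropy bottleneck models z, so it is built with N channels, not M.

```python
# Hedged sketch with stub classes standing in for compressai types.
# In a hyperprior model, h_a: y (M channels) -> z (N channels), so the
# EntropyBottleneck in the "hyper" branch must be built with N channels.

class EntropyBottleneck:
    """Stub for compressai's EntropyBottleneck (tracks channel count only)."""
    def __init__(self, channels: int):
        self.channels = channels

class HyperLatentCodec:
    """Stub: encodes z = h_a(y); the bottleneck width must match z's channels."""
    def __init__(self, entropy_bottleneck: EntropyBottleneck, z_channels: int):
        assert entropy_bottleneck.channels == z_channels, "channel mismatch"
        self.entropy_bottleneck = entropy_bottleneck

N, M = 192, 320  # illustrative sizes: z has N channels, y has M
hyper = HyperLatentCodec(EntropyBottleneck(N), z_channels=N)  # OK: matches z
```

The names and sizes above are illustrative only; the point is that the bottleneck's channel count is tied to z (the output of h_a), which is why N is correct here even though y has M channels.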


Before:

    self._setdefault("entropy_bottleneck", lambda: EntropyBottleneck(N))
    self._setdefault("h_a", nn.Identity)
    self._setdefault("h_s", nn.Identity)

After:

    self.entropy_bottleneck = entropy_bottleneck or EntropyBottleneck(0)
Collaborator:

What is the rationale for initializing an EntropyBottleneck with 0 channels? Can you extend it later?

In GainHyperLatentCodec, you init with
self.entropy_bottleneck = entropy_bottleneck or EntropyBottleneck()

@YodaEmbedding (Contributor, author) commented Apr 14, 2023:

You're right; this was a bit silly. I wanted it to type-check. Possible alternatives:

  1. Force callers to provide entropy_bottleneck:

    def __init__(
        self,
        entropy_bottleneck: Optional[EntropyBottleneck] = None,
    ):
        assert entropy_bottleneck is not None
        self.entropy_bottleneck = entropy_bottleneck

    Usage:

    HyperLatentCodec(
        entropy_bottleneck=EntropyBottleneck(N),
        h_a=h_a,
        h_s=h_s,
    )
  2. Restore the z_channels argument:

    def __init__(
        self,
        z_channels: Optional[int] = None,
        entropy_bottleneck: Optional[EntropyBottleneck] = None,
    ):
        self.entropy_bottleneck = entropy_bottleneck or EntropyBottleneck(z_channels)  # type: ignore

    Usage:

    HyperLatentCodec(
        z_channels=N,
        h_a=h_a,
        h_s=h_s,
    )

I think (1) seems simplest / most explicit / consistent with users already providing h_a=h_a, h_s=h_s.
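Alternative (1) can be sketched with plain Python (stub classes rather than the real compressai types; an assumption-laden illustration, not the merged code): the constructor keeps an Optional default so the keyword-only signature stays uniform across latent codecs, but it fails fast when the argument is omitted.

```python
# Hedged sketch of alternative (1): entropy_bottleneck is effectively required.
from typing import Optional

class EntropyBottleneck:
    """Stub for compressai's EntropyBottleneck (tracks channel count only)."""
    def __init__(self, channels: int):
        self.channels = channels

class HyperLatentCodec:
    """Sketch: Optional in the signature, but asserted non-None at init."""
    def __init__(self, entropy_bottleneck: Optional[EntropyBottleneck] = None):
        assert entropy_bottleneck is not None, "entropy_bottleneck is required"
        self.entropy_bottleneck = entropy_bottleneck

codec = HyperLatentCodec(entropy_bottleneck=EntropyBottleneck(192))  # OK

failed = False
try:
    HyperLatentCodec()  # omitted: fails fast at construction time
except AssertionError:
    failed = True
```

The upside of this pattern over `entropy_bottleneck or EntropyBottleneck(0)` is that a misconfigured codec errors at construction rather than silently carrying a zero-channel bottleneck into training.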

@fracape fracape merged commit 2758f12 into InterDigitalInc:master Apr 15, 2023