
Conversation

@alanhdu (Contributor) commented Jul 8, 2024

The original docstring formatting was off (looks like a bad line-wrap mangling the parameters), but I figured I might as well reformat to match the same docstring style as the rest of the file.

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Diff context for the review thread below: the old docstring, with a bad line-wrap running the `:param` entries together:

    These may be fractional. :param embedding_dim: the dimension of the output. :param max_period: controls the
    minimum frequency of the embeddings. :return: an [N x dim] Tensor of positional embeddings.

The replacement converts this to the `Args:` style used by the rest of the file.
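For illustration, a hedged sketch of what the reformatted docstring could look like in that `Args:` style (the signature shown is partial and the wording is assumed for the example, not copied from the merged diff):

```python
def get_timestep_embedding(timesteps, embedding_dim, downscale_freq_shift=1, scale=1, max_period=10000):
    """
    Create sinusoidal timestep embeddings.

    Args:
        timesteps (torch.Tensor):
            a 1-D Tensor of N indices, one per batch element. These may be fractional.
        embedding_dim (int):
            the dimension of the output.
        max_period (int):
            controls the minimum frequency of the embeddings.

    Returns:
        torch.Tensor: an [N x dim] Tensor of positional embeddings.
    """
```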
Collaborator:

thanks! Can we also add the other two arguments that are missing from the docstring?

Contributor Author:

I can try -- TBH I haven't examined the function closely enough to really understand what `downscale_freq_shift` and `scale` actually do.
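For reference, a minimal sketch of how those two arguments enter the computation (paraphrased from the function, with illustrative input values; variable names here are mine):

```python
import math

import torch

# Illustrative inputs; names mirror the function's parameters.
timesteps = torch.arange(4)
embedding_dim, max_period = 8, 10000
downscale_freq_shift, scale = 1.0, 1.0
half_dim = embedding_dim // 2

exponent = -math.log(max_period) * torch.arange(half_dim, dtype=torch.float32)
# downscale_freq_shift shifts the denominator that spreads the frequencies
# (with a shift of 1, the lowest frequency bottoms out at 1 / max_period).
exponent = exponent / (half_dim - downscale_freq_shift)

angles = timesteps[:, None].float() * torch.exp(exponent)[None, :]
# scale multiplies the angles before sin/cos are applied.
angles = scale * angles
emb = torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)  # [N x embedding_dim]
```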

This simplifies the implementation (IMO) by replacing the redundant `torch.exp(-math.log(a) * b)` with `torch.pow(a, -b)`, and updates the docstring to capture what each argument does.
assert len(timesteps.shape) == 1, "Timesteps should be a 1d-array"

half_dim = embedding_dim // 2
exponent = -math.log(max_period) * torch.arange(
    start=0, end=half_dim, dtype=torch.float32, device=timesteps.device
)
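As a quick sanity check on that claim (a standalone sketch with illustrative values): exp(-log(a) * b) equals a ** -b, so both spellings produce the same frequency table.

```python
import math

import torch

half_dim, max_period = 8, 10000  # illustrative values
k = torch.arange(half_dim, dtype=torch.float32)

# Current spelling: exponentiate a scaled log.
freqs_exp = torch.exp(-math.log(max_period) * k / half_dim)

# Proposed spelling: the same values as a direct power.
freqs_pow = torch.pow(max_period, -k / half_dim)

assert torch.allclose(freqs_exp, freqs_pow)
```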
Collaborator:

can we not change the implementation?

Contributor Author:

Sure -- I find the current implementation to be needlessly complex, but happy to revert.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@yiyixuxu yiyixuxu merged commit 673eb60 into huggingface:main Jul 11, 2024
@yiyixuxu (Collaborator):

thank you for your contribution!

@alanhdu alanhdu deleted the patch-1 branch July 11, 2024 02:10
sayakpaul pushed a commit that referenced this pull request Dec 23, 2024
* Reformat docstring for `get_timestep_embedding`



Co-authored-by: YiYi Xu <yixu310@gmail.com>