
can't find the tlt.data module? #13

Closed
RyanHuangNLP opened this issue Feb 21, 2022 · 5 comments

Comments

@RyanHuangNLP

RyanHuangNLP commented Feb 21, 2022

In the token labeling main file, I found the import `from tlt.data import create_token_label_target, TokenLabelMixup, FastCollateTokenLabelMixup, create_token_label_loader, create_token_label_dataset`, but I can't find the tlt.data module.

@Andy1621
Collaborator

Very sorry for the late response. I have fixed the bug in the new version; it happened because I had added 'data' to the .gitignore file.
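For anyone hitting the same thing: a bare `data` line in .gitignore matches a directory named `data` at any depth, so a package directory like `tlt/data` silently never gets committed. A minimal sketch reproducing the pitfall in a throwaway repo (illustrative layout, assumes `git` is on PATH; not the repo's actual history):

```python
import pathlib
import subprocess
import tempfile

# Throwaway repo with a bare "data" ignore rule and a tlt/data package.
repo = pathlib.Path(tempfile.mkdtemp())
subprocess.run(["git", "init", "-q", str(repo)], check=True)
(repo / ".gitignore").write_text("data\n")
pkg = repo / "tlt" / "data"
pkg.mkdir(parents=True)
(pkg / "__init__.py").touch()

# git check-ignore exits 0 when the path is ignored; -v prints which
# .gitignore rule matched it.
result = subprocess.run(
    ["git", "-C", str(repo), "check-ignore", "-v", "tlt/data/__init__.py"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # the matching rule comes from .gitignore line 1
```

Fix is either deleting the rule or anchoring it (e.g. `/data/` to ignore only the top-level directory).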

@Andy1621
Collaborator

As there is no more activity, I am closing the issue. Don't hesitate to reopen it if necessary.

@RyanHuangNLP
Author

@Andy1621 Thanks for responding. Another question: is the following the UniFormer-L model config?

@register_model
def uniformer_base_ls(pretrained=True, **kwargs):
    global layer_scale
    layer_scale = True
    model = UniFormer(
        depth=[5, 8, 20, 7],
        embed_dim=[64, 128, 320, 512], head_dim=64, mlp_ratio=4, qkv_bias=True,
        norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs)
    model.default_cfg = _cfg()
    return model
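As an aside on the `global layer_scale` line in that snippet: the factory flips a module-level flag before constructing the model, and the blocks read it at construction time. A minimal self-contained sketch of that pattern (illustrative names, not UniFormer's actual classes):

```python
# Module-level flag, read by blocks when they are constructed.
layer_scale = False

class Block:
    def __init__(self):
        # Snapshot the flag's value at construction time.
        self.use_layer_scale = layer_scale

def build_model_ls():
    """Factory for the LayerScale variant: flip the flag, then build."""
    global layer_scale
    layer_scale = True
    return [Block() for _ in range(2)]

model = build_model_ls()
print(all(b.use_layer_scale for b in model))  # True
```

The downside of this pattern is that the flag is process-global state, so the order in which factories run matters; passing the option through the constructor would avoid that.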

@Andy1621
Collaborator

Andy1621 commented Feb 23, 2022

Yes! The head_dim of the other models is 32, but for the large model we use 64.
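To make the relationship concrete: with a fixed per-head width, the number of attention heads per stage is `embed_dim // head_dim`. A small sketch (`heads_per_stage` is an illustrative helper, not a function from the repo):

```python
def heads_per_stage(embed_dims, head_dim):
    # One attention head per head_dim channels in each stage.
    return [dim // head_dim for dim in embed_dims]

# Large model style, as in the config above: head_dim=64
print(heads_per_stage([64, 128, 320, 512], 64))  # [1, 2, 5, 8]

# Smaller variants use head_dim=32, so twice as many (narrower) heads.
print(heads_per_stage([64, 128, 320, 512], 32))  # [2, 4, 10, 16]
```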

@RyanHuangNLP
Author

@Andy1621 Got it
