The implementation of Isotropic architecture #130
Comments
Hi, thanks for sharing this impressive work. The paper describes two architectures, an isotropic one and a pyramid one. I noticed that the code defines `reduce_ratios`, and each ratio is used by an average-pooling operation before the graph is built:

```python
self.n_blocks = sum(blocks)                  # total number of Grapher blocks
channels = opt.channels
reduce_ratios = [4, 2, 1, 1]                 # per-stage avg-pool ratios applied before graph building
dpr = [x.item() for x in torch.linspace(0, drop_path, self.n_blocks)]    # stochastic depth decay rule
num_knn = [int(x.item()) for x in torch.linspace(k, k, self.n_blocks)]   # number of knn neighbors per block
for i in range(len(blocks)):
```

I am wondering whether all I need to do to implement the isotropic architecture is set `reduce_ratios` to `[1, 1, 1, 1]`. Thanks.

Yes, set `reduce_ratios` to `[1, 1, 1, 1]`.

Hello @iamhankai, shouldn't downsampling also be disabled for the isotropic architecture, so that the number of tokens stays constant throughout the model?

Yes, the isotropic architecture does not need downsampling.

So to implement the isotropic architecture, all I need to do is set `reduce_ratios` to `[1, 1, 1, 1]` and remove this Downsample, right?

Yes. The isotropic code will be released soon.

The code for isotropic ViG has been released: https://github.com/huawei-noah/Efficient-AI-Backbones/tree/master/vig_pytorch
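To make the effect of these two changes concrete, here is a small, hypothetical sketch (not code from this repository) that tallies token counts per stage under both configurations. The input sizes (56×56 tokens for the pyramid model, 14×14 for the isotropic one) and the 2×2 spatial downsampling between pyramid stages are assumptions for illustration, based on the discussion above.

```python
def stage_token_counts(n_tokens, reduce_ratios, use_downsample):
    """For each stage, return (feature-map tokens, pooled candidate tokens
    used for graph building). A reduce ratio r average-pools the spatial
    grid by r in each dimension, so candidates shrink by r*r."""
    counts = []
    for i, r in enumerate(reduce_ratios):
        counts.append((n_tokens, n_tokens // (r * r)))
        if use_downsample and i < len(reduce_ratios) - 1:
            n_tokens //= 4  # assumed 2x2 spatial downsampling between stages
    return counts

# Pyramid: reduce_ratios=[4,2,1,1] plus downsampling; tokens shrink per stage.
pyramid = stage_token_counts(3136, [4, 2, 1, 1], True)    # 56x56 input grid
# Isotropic: reduce_ratios=[1,1,1,1] and no downsampling; tokens stay constant.
isotropic = stage_token_counts(196, [1, 1, 1, 1], False)  # 14x14 input grid
```

With the isotropic settings, every stage sees the same 196 tokens for both features and graph construction, which is exactly the constant-token behavior asked about above.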