Byol(190) #2

Open

wants to merge 23 commits into byol

Conversation

@pranavsinghps1 commented Aug 10, 2021

Still Todo:

  • Improve comments
  • Improve type hints
  • Update/set consistent and clear variable names
  • Run linter

QuentinDuval and others added 16 commits July 26, 2021 14:09
Summary:
Pull Request resolved: facebookresearch#376

Regnet128Gf configuration for 6 additional linear evaluations

Reviewed By: prigoyal

Differential Revision: D29915382

fbshipit-source-id: 636125438db2ef62ced5daaea94add72ef571fea
…rch#174)

Summary:
This PR introduces a script to automatically download Kinetics 700 and format it in the `disk_folder` and `disk_filelist` formats.
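For context, a hedged sketch of what the two target formats look like (the layout and file names below are illustrative assumptions, not the exact output of the script):

```
import numpy as np

# `disk_folder`: ImageFolder-style layout, one sub-directory per class, e.g.
#   kinetics700/train/<class_name>/<frame_or_clip>.jpg
#   kinetics700/val/<class_name>/<frame_or_clip>.jpg
#
# `disk_filelist`: image paths and labels stored as aligned numpy arrays.
image_paths = np.load("train_images.npy")  # array of image/clip paths
labels = np.load("train_labels.npy")       # array of integer class labels
assert len(image_paths) == len(labels)
```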

Pull Request resolved: fairinternal/ssl_scaling#174

Reviewed By: prigoyal

Differential Revision: D29917908

Pulled By: QuentinDuval

fbshipit-source-id: 66d244bfc6ed219ae12ad333705c9687ee08b47a
Summary:
Pull Request resolved: facebookresearch#382

The warm-up `dist.all_reduce()` call was happening before the CUDA device was set, which meant all workers were using device 0. This resulted in crashes / hangs as mentioned in https://fb.workplace.com/groups/1309000715937050/permalink/1621428588027593/
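A minimal sketch of the intended ordering (illustrative only, not the VISSL code): each process must be bound to its GPU before any collective call.

```
import torch
import torch.distributed as dist

def init_distributed(local_rank: int) -> None:
    # Bind this process to its GPU *before* any collective call; otherwise
    # every rank places its warm-up tensor on cuda:0 and the job can hang.
    torch.cuda.set_device(local_rank)
    dist.init_process_group(backend="nccl")
    dist.all_reduce(torch.zeros(1).cuda())  # warm-up now runs on the right device
```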

Reviewed By: prigoyal

Differential Revision: D30005438

fbshipit-source-id: 48087d117262dad9ee3e858f05c0f9c7206496bf
Summary: Pull Request resolved: facebookresearch#383

Reviewed By: QuentinDuval

Differential Revision: D30012758

Pulled By: prigoyal

fbshipit-source-id: c737dfbb3e7e59fc925d5615efdfdd3e9eef791c
…ures (facebookresearch#175)

Summary:
Correctly rely on config.MODEL.FEATURE_EVAL_SETTINGS.SHOULD_FLATTEN_FEATS to decide whether or not to flatten the features.

In addition:
- add options at feature-loading time to decide whether to flatten
- add unit tests to ensure the right behaviour
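A minimal sketch of the behaviour the flag controls (hypothetical helper, not the VISSL implementation):

```
import torch

def maybe_flatten(features: torch.Tensor, should_flatten: bool) -> torch.Tensor:
    # features: (N, C, H, W) activations extracted for evaluation; flatten to
    # (N, C * H * W) only when the config flag requests it.
    return torch.flatten(features, start_dim=1) if should_flatten else features
```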

Pull Request resolved: fairinternal/ssl_scaling#175

Reviewed By: iseessel

Differential Revision: D30069587

Pulled By: QuentinDuval

fbshipit-source-id: 044389c46c5c1e658141c599545dc72c5c50dff2
Summary: Add configurations for Regnet256 on linear evaluation benchmarks

Reviewed By: iseessel

Differential Revision: D30070086

fbshipit-source-id: fc20ba889443c495b64088bf88b3dfe52e97ed8a
…search#387)

Summary:
Pull Request resolved: facebookresearch#387

The slices were not created in the right place because of os.path.abspath.

Reviewed By: iseessel

Differential Revision: D30109789

fbshipit-source-id: c332fbf5f5c52241a537bd1188e3268a2f5cb966
Summary:
[enhancement] FSDP with activation checkpointing now allows specifying blocks without activations (useful for linear evaluation) and completes incomplete configurations for stage_checkpoints.
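A minimal sketch of checkpointing only selected blocks, using plain torch.utils.checkpoint for illustration (the wrapper and flag names are hypothetical, not the FSDP integration in this change):

```
import torch
from torch.utils.checkpoint import checkpoint

class MaybeCheckpointedBlock(torch.nn.Module):
    """Wraps a trunk block and recomputes its activations only when asked."""

    def __init__(self, block: torch.nn.Module, checkpoint_activations: bool):
        super().__init__()
        self.block = block
        self.checkpoint_activations = checkpoint_activations

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.checkpoint_activations and self.training:
            # Trade compute for memory: recompute this block in the backward pass.
            return checkpoint(self.block, x)
        # e.g. frozen trunk blocks in linear evaluation skip checkpointing.
        return self.block(x)
```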

Pull Request resolved: fairinternal/ssl_scaling#176

Reviewed By: prigoyal

Differential Revision: D30143386

Pulled By: QuentinDuval

fbshipit-source-id: 6fa85059d36d0bfa44ea7c07ac92994985674943
Summary:
This fixes the problem that the Barlow Twins model needs to save a function in the checkpoint.

Pull Request resolved: facebookresearch#388

Reviewed By: iseessel

Differential Revision: D30158877

Pulled By: prigoyal

fbshipit-source-id: 537d0686422148447a4a42e14b448eb6e592eec9
Summary:
Minor typo appearing on https://vissl.ai/

Pull Request resolved: facebookresearch#389

Reviewed By: iseessel

Differential Revision: D30158860

Pulled By: prigoyal

fbshipit-source-id: 0effb12f494de3067c49b19b142b30cc7e9312ff
Summary:
Pull Request resolved: facebookresearch#380

Various Instance Retrieval improvements:
1. Add support for Manifold.
2. Clean up noisy logs and add helpful logging.
3. Add DEBUG_MODE support for the Revisited Datasets.
4. Add ability to save results/logs/features.
5. Fix ROI crop bug.
6. Fix typo in benchmark_workflow.py causing benchmarks to fail.
7. Add a bunch of JSON configs to track and group multiple experiments.

Reviewed By: prigoyal

Differential Revision: D29995282

fbshipit-source-id: 2382963f39c6c61aa417b690a39754d4b30b3fe2
…ch#379)

Summary:
Pull Request resolved: facebookresearch#379

1. Fix the gem post processing logic.

Before this change, the code assumes that each non-preprocessed feature tensor has the same tensor shape:

```
    if cfg.IMG_RETRIEVAL.FEATS_PROCESSING_TYPE == "gem":
        gem_out_fname = f"{out_dir}/{train_dataset_name}_GeM.npy"
        train_features = torch.tensor(np.concatenate(train_features))
```

This is not the case, since ROxford/RParis images do not have a standard size, hence the resx layers have different heights and widths (but the same number of channels). GeM pooling transforms an image of any shape to a shape of `(num_channels)`.

The change performs gem_pooling on each individual image, as opposed to all the images at once. This should be fine because both gem and l2 normalization are to be performed per-image.
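A minimal sketch of per-image GeM pooling followed by l2 normalization (illustrative only; variable names are assumptions, not the exact VISSL code):

```
import torch
import torch.nn.functional as F

def gem_pool_one_image(feats: torch.Tensor, p: float = 3.0, eps: float = 1e-6) -> torch.Tensor:
    # feats: (C, H, W) activations for a single image; H and W can differ per image.
    pooled = feats.clamp(min=eps).pow(p).mean(dim=(1, 2)).pow(1.0 / p)  # -> (C,)
    return F.normalize(pooled, dim=0)  # per-image l2 normalization

# Pool each image separately, then stack the fixed-size descriptors:
# descriptors = torch.stack([gem_pool_one_image(f) for f in train_features])
```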

2. Transform before cropping to the bounding box (as opposed to after cropping).

The experiments show that this yields much better results. This is also what the deepcluster implementation uses: https://github.com/facebookresearch/deepcluster/blob/master/eval_retrieval.py#L44

```
Oxford: 61.57 / 41.74 / 14.33 vs. 69.65 / 48.51 / 16.41
Paris: 83.7 / 66.87 / 44.81 vs. 87.9 / 70.57 / 47.39
```
f288434289
f288438150

Reviewed By: prigoyal

Differential Revision: D29993204

fbshipit-source-id: 052a77c97a53f9dd6a969d44622cee0b25901498
Summary:
Pull Request resolved: facebookresearch#378

Revisited Oxford and Paris provide bounding boxes for the landmark queries, which they suggest using in the evaluation. Weirdly enough, the bounding boxes actually degrade performance in my experiments. Hence, adding an option to make the bounding boxes optional.

Reviewed By: prigoyal

Differential Revision: D29993208

fbshipit-source-id: cd1a00ae19d3faf61b520e00b9d05f28f60207b8
)

Summary:
Pull Request resolved: facebookresearch#381

1. Rename SHOULD_TRAIN_PCA_OR_WHITENING to TRAIN_PCA_WHITENING

2. Make l2 normalization optional.

3. Fix cfg access bugs

4. Add some more experiments.

Reviewed By: prigoyal

Differential Revision: D30002757

fbshipit-source-id: 3ec5be799a1d9bf2fa75c736fce9b2552db7966c
@pranavsinghps1 changed the title Byol(190) Byol(https://github.com/facebookresearch/vissl/issues/190) Aug 10, 2021
@pranavsinghps1 changed the title Byol(https://github.com/facebookresearch/vissl/issues/190) Byol(190) Aug 10, 2021
@pranavsinghps1 (Author)

Regarding 9ff5847:

  • Added comments and type hints in byol_loss and byol_hooks.
  • Added references in byol_8node_resnet.yaml.
  • Ran linter and fixed redundant imports and whitespace.

vissl/hooks/byol_hooks.py (review thread, outdated, resolved)
vissl/hooks/byol_hooks.py (review thread, outdated, resolved)
```
@register_loss("byol_loss")
class BYOLLoss(ClassyLoss):
    """
    This is the loss proposed in BYOL
    loss
    """
```

Owner: I would write a bit more information about how the loss is created.

```
    # Split data
```

Owner: I would write a nice comment explaining what's happening here.

@iseessel changed the base branch from master to byol on October 17, 2021 17:19
Labels: None yet
Projects: None yet
7 participants