This repository has been archived by the owner on Jan 10, 2023. It is now read-only.

Retrieval scripts and Autoencoder for obtaining query vectors #1

Open
HarmanDotpy opened this issue Apr 18, 2022 · 0 comments

@HarmanDotpy

Hi,

It was great to read your paper.

Retrieval script
I was wondering if you are going to release the retrieval script any time soon?

Autoencoder for getting image embeddings for retrieval:
What is the exact architecture of this autoencoder? Are the encoder and decoder the same as the encoder and generator used in TIM-GAN?
Could you please explain the retrieval process? In particular, my understanding is that we have an autoencoder made of an encoder E1 and a decoder D1, and we pretrain this autoencoder on the dataset. Could you tell me the exact pretraining process, loss functions, etc.? Can we use the run_pretrain.sh script for training the autoencoder?
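Just to make my assumptions concrete, this is the kind of pretraining I have in mind: a plain convolutional autoencoder (E1 + D1) trained with a reconstruction loss. The architecture, latent size, and L1 objective here are all my guesses, not something taken from the paper or repo:

```python
# Minimal sketch of the autoencoder pretraining I am assuming.
# Architecture and loss (plain L1 reconstruction) are assumptions;
# the paper's actual E1/D1 and objective may differ.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self, channels=3, latent_dim=128):
        super().__init__()
        # Encoder E1: three stride-2 convs, 64x64 -> 8x8 feature map
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, latent_dim, 4, stride=2, padding=1),
        )
        # Decoder D1: mirror of the encoder, back to 64x64
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, channels, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = ConvAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
x = torch.randn(4, 3, 64, 64)           # stand-in for a training batch
recon, z = model(x)
loss = nn.functional.l1_loss(recon, x)  # assumed reconstruction objective
loss.backward()
opt.step()
```

Is this roughly the setup, or does the pretraining use additional losses (e.g. adversarial or perceptual terms)?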

While calculating recall for your method, this is written in the paper:
[screenshot of the recall computation from the paper]

But how do we calculate recall for the other methods, i.e., what encoder is used in that case?

In my understanding, there should be a separate autoencoder trained on the dataset, independent of TIM-GAN and the other methods. Then, after all the models are trained and can generate images, we use this pretrained autoencoder to get the image representations of the generated images and use them as queries.
Could you tell me whether this is what happens in the paper, and if not, what the exact process is? I want to calculate these metrics for the other methods on my side.
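For reference, here is the Recall@k computation I would implement under that assumption: the encoder maps each generated image to a query vector, which retrieves from the gallery of real-image embeddings by cosine similarity. The function names and toy data are mine, purely for illustration:

```python
# Hedged sketch of the retrieval evaluation I am assuming:
# Recall@k = fraction of queries whose ground-truth target image
# appears among the k nearest gallery embeddings.
import numpy as np

def recall_at_k(query_vecs, gallery_vecs, true_idx, k=1):
    """query_vecs: (N, d) embeddings of generated images.
    gallery_vecs: (M, d) embeddings of real images.
    true_idx: (N,) index of each query's ground-truth gallery item."""
    # cosine similarity between every query and every gallery item
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    g = gallery_vecs / np.linalg.norm(gallery_vecs, axis=1, keepdims=True)
    sims = q @ g.T                               # (N, M) similarity matrix
    topk = np.argsort(-sims, axis=1)[:, :k]      # indices of k best matches
    hits = (topk == np.asarray(true_idx)[:, None]).any(axis=1)
    return hits.mean()

# toy usage: 3 queries, gallery of 5; each query equals its target exactly
gallery = np.random.default_rng(0).normal(size=(5, 8))
queries = gallery[[0, 2, 4]]
score = recall_at_k(queries, gallery, [0, 2, 4], k=1)
print(score)  # -> 1.0, since each query matches its target perfectly
```

If the paper instead uses Euclidean distance or a different gallery construction, that would change the numbers, which is exactly why I'd like the exact procedure.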

Thank you in advance for the help.
