
How to give a prompt when inference? #41

Open
Gang-Chen-China opened this issue May 15, 2024 · 1 comment

Comments


Gang-Chen-China commented May 15, 2024

Thanks for your impressive work! The following command takes a clear image file as input and saves a rainy output image. I want to know: can I change the amount of rainfall? Can it be varied with different prompts?

python src/inference_unpaired.py --model_name "clear_to_rainy" \
    --input_image "assets/examples/clear2rainy_input.png" \
    --output_dir "outputs"

@Gang-Chen-China Gang-Chen-China changed the title How to give a prompte when inference? How to give a prompt when inference? May 15, 2024
GaParmar (Owner) commented May 26, 2024

Hi, for our unpaired models we keep the text prompt fixed during both training and inference.
Since the model has not seen other text prompts during training, it is hard to predict what effects it will have if you modify the text prompt during inference.
Feel free to try it out and report your results here!
You can modify the fixed caption in this line to try a different caption.
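To illustrate the maintainer's point, here is a minimal, hypothetical sketch of what swapping out the fixed caption might look like. The dictionary contents, captions, and function name below are illustrative assumptions, not the repository's actual code; the real fixed caption lives in `src/inference_unpaired.py` as the maintainer describes.

```python
# Hypothetical sketch of a per-model fixed caption with an optional
# user override. Captions and names here are assumptions for
# illustration only, not taken from the img2img-turbo source.
from typing import Optional

FIXED_CAPTIONS = {
    # Assumed example captions, one per unpaired model:
    "clear_to_rainy": "driving in heavy rain",
    "day_to_night": "driving at night",
}

def get_caption(model_name: str, override: Optional[str] = None) -> str:
    """Return the fixed training caption, or a user-supplied override.

    Note: an override moves the text prompt outside the training
    distribution, so (as the maintainer notes) the effect on the
    output is hard to predict.
    """
    if override is not None:
        return override
    return FIXED_CAPTIONS[model_name]

print(get_caption("clear_to_rainy"))                           # fixed caption
print(get_caption("clear_to_rainy", "driving in light rain"))  # experimental override
```

Because the model only ever saw the fixed caption during training, an override like `"driving in light rain"` is an experiment, not a supported control knob for rainfall intensity.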

-Gaurav
