Advice about training result #585

Open
YoannRandon opened this issue Nov 14, 2023 · 9 comments

Comments

@YoannRandon

Hi, I recently started a training run and the results are not that great for synthetic-to-realistic image style transfer.
I use the master branch of the code; here are the loss curves:
[image: loss curves]

@YoannRandon (Author)

And here are some examples of what is generated:
[image: generated examples]

@YoannRandon (Author)

It's already past 50 epochs and the model doesn't seem to improve much (the problem surely comes from the lack of diversity of the training data: 3568 synthetic / 6519 realistic images), but is there something I can do to improve the model's performance?
I use masks (not bounding boxes) for the training.

@YoannRandon (Author)

My training command:
[image: training command]

@YoannRandon (Author) commented Nov 14, 2023

My training config file: seg_sem.json
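
(For reference, a quick generic way to double-check which options a JSON config like seg_sem.json actually contains before launching a long run is to load it and print the top-level entries. Only the filename comes from this comment; the rest is a sketch.)

```python
import json

# Load the training config mentioned above and list its top-level options,
# so typos or unexpected values are easy to spot before a long training run.
with open("seg_sem.json") as f:
    cfg = json.load(f)

for key, value in sorted(cfg.items()):
    print(f"{key}: {value}")
```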

@beniz (Contributor) commented Nov 14, 2023

I believe we had this conversation before, right? This dataset cannot lead to good results IMO; I can find and resend my email about it if needed.
Algorithms cannot compensate for badly prepared data. There may be ways to find hyperparameters that do better than others, but only at the margin.

@beniz (Contributor) commented Nov 14, 2023

If you wish to test on a dataset from the literature, GTA5 to Cityscapes is one, and there are others.

@YoannRandon (Author)

Is there a specific methodology? During our last meeting I think I heard that you "restart the training". Is that for when the model is doing poorly at some epochs, as detected visually through the loss curves? Is it a specific way of training, or do you just train in one go and change parameters according to the results once a training run is over?
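
(A minimal sketch of the kind of loss-curve check meant by "detected visually" here. It assumes a plain-text log with lines like `epoch 12 G_loss 1.83 D_loss 0.45`; that format and the file name are assumptions, not the project's actual output.)

```python
import re

import matplotlib.pyplot as plt

# Assumed log format (an assumption, adapt to your own logs):
#   epoch 12 G_loss 1.83 D_loss 0.45
PATTERN = re.compile(r"epoch\s+(\d+)\s+G_loss\s+([\d.]+)\s+D_loss\s+([\d.]+)")

epochs, g_losses, d_losses = [], [], []
with open("train_log.txt") as f:  # file name is a placeholder
    for line in f:
        m = PATTERN.search(line)
        if m:
            epochs.append(int(m.group(1)))
            g_losses.append(float(m.group(2)))
            d_losses.append(float(m.group(3)))

# A discriminator loss collapsing toward zero while the generator loss climbs
# is a common sign that restarting with different hyper-parameters is worth trying.
plt.plot(epochs, g_losses, label="generator loss")
plt.plot(epochs, d_losses, label="discriminator loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.savefig("loss_curves.png")
```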

@YoannRandon (Author)

I'll try GTA5-to-Cityscapes as an external reference for the feasibility of this specific use case.

@beniz (Contributor) commented Nov 14, 2023

[image]

This is to illustrate my argument: the synthetic data basically contains no trees or natural life, so the discriminator pushes for them on the fake images, which is not the goal here. The synthetic and real domains need to be meaningfully related to each other.

Also, the classes in the synthetic and real domains may not be the same, which is wrong and makes semantic conservation much more difficult.

[image: different colors mean different classes]
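
(A minimal way to check for such a class mismatch is to compare the label values present in the two mask sets. The directory names below and the assumption of single-channel index PNG masks are hypothetical; adapt them to the actual dataset layout.)

```python
from pathlib import Path

import numpy as np
from PIL import Image

def label_values(mask_dir: str) -> set:
    """Collect every label index appearing in a folder of single-channel mask PNGs."""
    values = set()
    for path in Path(mask_dir).glob("*.png"):
        values.update(int(v) for v in np.unique(np.array(Image.open(path))))
    return values

# Directory names are placeholders.
synth_classes = label_values("synth/masks")
real_classes = label_values("real/masks")

print("synthetic only:", sorted(synth_classes - real_classes))
print("real only:", sorted(real_classes - synth_classes))
print("shared:", sorted(synth_classes & real_classes))
```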
