
Advantage over training with all data instead of samples #5

Open
leo2105 opened this issue Mar 10, 2022 · 1 comment
Comments

leo2105 commented Mar 10, 2022

Hi, I just want to know: what is the difference between training with all 600 samples at once and training with 100 samples first, then 200, 300, and so on?
What does the active learning step actually do? Does it really select the best images? It isn't clear to me.

Thanks in advance

IgorSusmelj (Contributor) commented

Hi @leo2105,

Active learning, and this tutorial, are mostly aimed at situations where you don't have all of your data labeled, so you can't simply train on all images. In that case, you start by training a model on the data that is already labeled. You then use the model's predictions on the unlabeled data to decide which batch of images to label next.

Active learning is a research field whose goal is to find the next images to label such that the gain in model accuracy is maximized.
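
To make the selection step concrete, here is a minimal sketch of one common strategy, least-confidence uncertainty sampling. This is only an illustration, not the exact scoring used in the tutorial; the function name, array shapes, and the random placeholder predictions are assumptions.

```python
import numpy as np


def select_next_batch(probs: np.ndarray, batch_size: int) -> np.ndarray:
    """Pick the unlabeled samples the model is least confident about.

    probs: (n_unlabeled, n_classes) softmax predictions on the unlabeled pool.
    Returns the indices of the `batch_size` most uncertain samples.
    """
    # Least-confidence score: 1 - max class probability (higher = more uncertain).
    uncertainty = 1.0 - probs.max(axis=1)
    # The most uncertain images are the ones you would send for labeling next.
    return np.argsort(-uncertainty)[:batch_size]


# Toy example: a pool of 600 unlabeled images, 10 classes, select 100 to label.
rng = np.random.default_rng(0)
fake_probs = rng.dirichlet(np.ones(10), size=600)  # placeholder predictions
next_to_label = select_next_batch(fake_probs, batch_size=100)
print(next_to_label[:10])  # indices of the first 10 selected images
```

After labeling the selected images, you retrain on the enlarged labeled set and repeat, which is why the tutorial trains on 100, 200, 300, ... samples rather than all 600 at once.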
