
AB-98: Add a feature to select SVM parameters #257

Merged
merged 13 commits into metabrainz:master from rsh7:preferences on Aug 11, 2020

Conversation

@rsh7 rsh7 (Contributor) commented Feb 16, 2018

This is a...

  • Bug Fix
  • Feature addition
  • Refactoring
  • Minor / simple change (like a typo)
  • Other

Present Case:

When a dataset has been created, the user can submit it for evaluation.
(screenshot: evaluation screen before this change)
This screen currently has two preferences (filtering and normalization). There is no option to select which SVM parameters the user wants to use so that the search space can be reduced.

Addition

I have added additional preferences for the C, gamma, and preprocessing values, so that the user can be more specific about the SVM parameters they need. If the user doesn't provide any input, the default values are used.
Now, the evaluation screen on the AcousticBrainz server will look like:
(screenshot: evaluation screen after this change)

The additional preferences entered by the user are saved in the database and retrieved at the model-training stage. I write these values into the custom project file that is generated during the evaluation stage, before running the tests.
Thus, this feature helps users easily select which parameters they want to use.
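
A rough sketch of how such optional fields with defaults might look in webserver/forms.py, assuming WTForms-style fields (the field names mirror the view snippet further down; the form name, default values, and preprocessing choices are illustrative, not the actual values from this PR):

from flask_wtf import FlaskForm
from wtforms import SelectMultipleField, StringField
from wtforms.validators import Optional

# Illustrative defaults, applied when the user leaves a field empty.
DEFAULT_C_VALUES = "-5, -3, -1, 1, 3, 5, 7, 9, 11"
DEFAULT_GAMMA_VALUES = "3, 1, -1, -3, -5, -7, -9, -11"

class SVMParametersForm(FlaskForm):  # hypothetical form name
    c_value = StringField("C values", validators=[Optional()], default=DEFAULT_C_VALUES)
    gamma_value = StringField("gamma values", validators=[Optional()], default=DEFAULT_GAMMA_VALUES)
    preprocessing_values = SelectMultipleField("Preprocessing", validators=[Optional()], choices=[
        ("basic", "basic"),
        ("lowlevel", "lowlevel"),
        ("normalized", "normalized"),
    ])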

@paramsingh paramsingh (Member) commented Feb 16, 2018

Hey @rsh7, in general whenever we do a schema change, we add an update script too. See an example here: https://github.com/metabrainz/acousticbrainz-server/blob/master/admin/updates/20160812-user-feedback.sql

@paramsingh paramsingh (Member) commented Feb 16, 2018

Also, tests seem to be failing; please run test.sh and fix them.

@rsh7 rsh7 (Contributor, Author) commented Feb 17, 2018

@paramsingh, Fixed the failing tests and added an update script as well.

@paramsingh paramsingh (Member) left a comment

Nice work. Looks solid to me but needs some changes.

I'd like the entry of the parameters to be an optional feature hidden under an "advanced" section, if possible.

Review comments were left on:

  • admin/sql/create_tables.sql
  • dataset_eval/evaluate.py
  • dataset_eval/gaia_wrapper.py
  • db/dataset_eval.py
  • webserver/forms.py
@rsh7 rsh7 (Contributor, Author) commented Feb 21, 2018

I have added a feature flag like

FEATURE_EVAL_LOCATION = False

to only show these preference options on the AcousticBrainz site when the flag is enabled. So, if the value is set to False, it will display:

(screenshot: evaluation screen with the flag disabled)
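
A rough sketch of how such a flag might gate the new fields, assuming a Flask-style config lookup (the variable names and template wiring are illustrative; the actual view in this PR may differ):

from flask import current_app, render_template

def evaluate(dataset_id):
    # Pass the flag through to the template, which hides the extra
    # SVM-parameter fields when the flag is disabled.
    show_svm_preferences = current_app.config.get("FEATURE_EVAL_LOCATION", False)
    return render_template(
        "datasets/evaluate.html",
        dataset_id=dataset_id,
        show_svm_preferences=show_svm_preferences,
    )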

@rsh7 rsh7 changed the title AB-98: Added a feature to select SVM parameters AB-98: Add a feature to select SVM parameters Mar 4, 2018
@rsh7 rsh7 force-pushed the rsh7:preferences branch from 017aa3a to 355a19b Apr 2, 2018
@paramsingh paramsingh self-assigned this May 4, 2018
@paramsingh paramsingh (Member) left a comment

From my testing this looks good; some minor things to fix, though.

Review comments were left on:

  • dataset_eval/evaluate.py
  • dataset_eval/gaia_wrapper.py
  • db/dataset_eval.py

An inline review thread was left on this snippet:

db.dataset_eval.evaluate_dataset(
dataset_id=ds["id"],
normalize=form.normalize.data,
eval_location=form.evaluation_location.data,
c_value=c_value,
gamma_value=gamma_value,
preprocessing_values=form.preprocessing_values.data,
filter_type=form.filter_type.data,
)
flash.info("Dataset %s has been added into evaluation queue." % ds["id"])

@paramsingh paramsingh (Member) commented May 4, 2018

I think flashing a message here if the input values were bad would be a good idea too.

@rsh7 rsh7 (Author, Contributor) commented May 10, 2018

@paramsingh What if we use the default values in cases where the input values are bad, and flash a message saying "Bad input value. We are taking default values for now"?
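
A minimal sketch of that fallback idea, with illustrative helper and constant names (flask.flash stands in here for the project's flash.info helper used in the snippet above):

from flask import flash

# Illustrative default grid-search exponents, used when parsing fails.
DEFAULT_C_VALUES = [-5, -3, -1, 1, 3, 5, 7, 9, 11]
DEFAULT_GAMMA_VALUES = [3, 1, -1, -3, -5, -7, -9, -11]

def parse_exponents(raw, defaults):
    """Parse a comma-separated list of integers, falling back to defaults on bad input."""
    try:
        values = [int(v) for v in raw.split(",") if v.strip()]
        return values or defaults
    except (AttributeError, ValueError):
        flash("Bad input value. We are taking default values for now.")
        return defaults

# e.g. in the view, before calling db.dataset_eval.evaluate_dataset(...):
#   c_value = parse_exponents(form.c_value.data, DEFAULT_C_VALUES)
#   gamma_value = parse_exponents(form.gamma_value.data, DEFAULT_GAMMA_VALUES)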

Further review comments were left on:

  • webserver/templates/datasets/evaluate.html
  • webserver/views/datasets.py
@rsh7 rsh7 force-pushed the rsh7:preferences branch from 701a0bb to 925a5f9 Aug 19, 2018
@rsh7 rsh7 force-pushed the rsh7:preferences branch from 925a5f9 to 0bff3fb Feb 15, 2019
@rsh7 rsh7 (Contributor, Author) commented Feb 15, 2019

Rebased to master!

@rsh7 rsh7 force-pushed the rsh7:preferences branch from 0bff3fb to b779bb9 Mar 4, 2019
@alastair alastair force-pushed the rsh7:preferences branch from b779bb9 to 8381201 Jun 23, 2020
@pep8speaks pep8speaks commented Jun 23, 2020

Hello @rsh7! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-08-11 14:15:54 UTC
@alastair alastair merged commit ac8cf10 into metabrainz:master Aug 11, 2020
1 of 2 checks passed:

  • github-actions: build
  • brainzbot: Jenkins Build finished.
@alastair alastair (Contributor) commented Aug 11, 2020

🎉

amCap1712 added a commit that referenced this pull request Jun 29, 2021
These flags were added in #257 but we forgot to add those to the consul config.