
Add badge for code coverage in README.md #9

Closed
interesaaat opened this issue Mar 30, 2020 · 3 comments
Comments

@interesaaat
Collaborator

No description provided.

@ksaur
Collaborator

ksaur commented Mar 31, 2020

Fixed by commit for now; a more elaborate solution to come.

@ksaur ksaur closed this as completed Mar 31, 2020
@ksaur
Collaborator

ksaur commented Mar 31, 2020

Notes: GitHub Actions can regenerate a badge for you, but the badge has to go SOMEWHERE. So for now, I'm generating the .svg badge as a commit hook and uploading it as part of a commit.
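The commit-hook approach described above (generate an .svg badge locally, then commit it next to README.md) can be sketched as a small script. This is an illustrative sketch only, not the repository's actual hook: the color thresholds, badge layout, output file name, and the placeholder percentage are all assumptions.

```python
# Minimal sketch of a commit-hook badge generator: take a coverage
# percentage and emit a shields.io-style SVG that can be committed
# alongside README.md. Thresholds and colors are illustrative.

def make_badge(percent: float) -> str:
    """Return an SVG coverage badge string for the given percentage."""
    # Green >= 90%, yellow >= 75%, red otherwise (arbitrary cutoffs).
    color = "#4c1" if percent >= 90 else "#dfb317" if percent >= 75 else "#e05d44"
    value = f"{percent:.0f}%"
    return f"""<svg xmlns="http://www.w3.org/2000/svg" width="110" height="20">
  <rect width="70" height="20" fill="#555"/>
  <rect x="70" width="40" height="20" fill="{color}"/>
  <g fill="#fff" font-family="Verdana" font-size="11">
    <text x="6" y="14">coverage</text>
    <text x="76" y="14">{value}</text>
  </g>
</svg>"""

if __name__ == "__main__":
    # In a real pre-commit hook this figure would come from parsing
    # `coverage report` output; 87 is a placeholder value.
    with open("coverage.svg", "w") as f:
        f.write(make_badge(87))
```

A pre-commit hook would then run this script and `git add coverage.svg`, which is exactly the "upload it as part of a commit" step; the downside, as noted below, is that every commit touches a generated file.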

@ksaur ksaur reopened this Apr 1, 2020
@ksaur
Collaborator

ksaur commented Apr 1, 2020

This solution doesn't work well. Let's wait until we make this repo public and figure out a fancier external solution.

@interesaaat interesaaat added this to the "After switching to public" milestone Apr 6, 2020
interesaaat added a commit that referenced this issue Apr 29, 2020
* initial upload

* dev branch workflow

* Update README.md

* starting to setup coverage

* flake err cleanup

* deleted more unused code

* can't find a good githubactions coverage

* can't find a good githubactions coverage

* bug fixes

* consolidating tests

* XGB Regressor is failing

* committing lgbm regressor tests

* using params

* fixing lgbm max_depth bug

* better test output. TODO: fix the max_depth for lgbm and xgb to not fall through to None, need to compute

* adding failure case test. TODO: why does RF not have extra_config in regressor

* pinning to xgboost .90 for now

* refactoring tree's extra_config for xgb and lgbm

* fixing reversed param

* adding gbdt test file

* refactoring beam params functions

* making all beam params as numpy

* increasing coverage by shifting label starts and by deleting unused model.infer_initial_types()

* extra config for rf reg

* flake8

* more error testing

* using onnxconverter types instead of copypaste

* more consolidation

* more test coverage

* first step in refactor

* cleaning up batch params

* adding beam++ to node size 1 test

* there is a bug, documenting

* renaming trees to match paper

* test

* adding precommit hooks

* README.md

* readme update

* commit hooks

* Fixing badge link to be relative

* notebook for demo

* notebook for demo

* notebook params change

* reverting 2c95f48 and reopening issue #9; this solution is too clunky

* bumping pyt req

* Fix pytorch requirements

* Fix to brackets for alpha in xgboost

* Few minor fixes to comments in tests

* Removed unnecessary regression tests

* Add binary classification tests for gemm, tree_trav and perf_tree_trav

* Fixes to whitespaces

* updating readme

* filling out contrib section

* expanding readme example so that (1) it actually runs (2) it actually does a thing

* cleaning notebook example

* Fix to typo and update to the requirements

* Fix to flake8 errors

* readme changes from this morning

* changes based on feedback

* Few edits to contributing

* Few edits in the README file

* fixing mailto: syntax

* Remove initial_types from the converter API

* Rename Skl2PyTorch container into HBPyTorch

* Add convert_xgboost and convert_lightgbm API

* Fix to spacing

* remove pandas check (for the moment)

* fix import

* Fix readme to use the new API

* removed common directory

* add some documentation

* renamed few things

* code refactoring for trees

* refactor lightgbm and xgboost by moving stuff into gbdt_commons

* done with a pass on gbdt after moving everything to _gbdt_common

* final refactoring of gbdt classes

* rename random forest stuff into decision tree

* major refactoring for tree implementations

* some renaming here and there

* minor fix

* Add test to validate that issue #7 is closed.

* import container stuff from onnx-common

* fix the parser to use the topology in onnx-common

* remove unnecessary files

* address first chunk of Karla's comments

* fix typo in calibration

* Another round of comments addressed

* fix typo

* these two lines seem unnecessary

* moving notebooks from broken branch

* adding notebooks with new API changes

* removing comment

* removed few unnecessary code and edited some documentation

* Update CONTRIBUTING.md

* remove . from git clone

* Final pass over non-converters files documentation / API

* add constants for converters

* simplify a bit the API by using extra_config for optional parameters

* Update CONTRIBUTING.md

* done with documentation over public classes, methods

* add constants and extra config management

* addressing Karla's comments

* pip install pdoc; pdoc --html hummingbird

* pdoc3, using overrides to get extra doc if we want it

* add few tests to check that we actually pick the correct implementation

* Update README.md

* Reformat doc

* add HB logo to readme file

* Add HB logo in doc

* add assertion on model being not None

Co-authored-by: Karla Saur <karla.saur@microsoft.com>
Co-authored-by: Matteo Interlandi <mainterl@microsoft.com>
interesaaat added a commit that referenced this issue Apr 29, 2020
interesaaat added a commit that referenced this issue Aug 18, 2020