Release/0.8.1 (mckinsey#94)
* Merge back to develop

* Simplifying viz.draw syntax in tutorial notebook (mckinsey#46)

* Add non negativity constraint in numpy lasso (mckinsey#41)

* Add plotting tutorial to the documentation (mckinsey#47)

* Unpin some requirements

* Mixed type data generation (mckinsey#55)

Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
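A hedged sketch of driving such a generator is below; the `sem_generator` name, its `schema` argument format, and the toy graph are assumptions based on this description rather than a confirmed API.

```python
# Hedged sketch only: `sem_generator` and the schema format are assumed from
# the feature description above, not verified against the library.
from causalnex.structure import StructureModel
from causalnex.structure.data_generators import sem_generator

sm = StructureModel()
sm.add_edges_from([("age", "smoker"), ("smoker", "disease")])

data = sem_generator(
    graph=sm,
    schema={"age": "continuous", "smoker": "binary", "disease": "binary"},
    default_type="continuous",
    n_samples=1000,
    seed=7,
)
print(data.head())  # mixed binary/continuous columns drawn from a linear SEM
```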

* Merge back to develop (mckinsey#59)

* Pytorch NOTEARS (mckinsey#63)

* NoTears as ScoreSolver

* refactor continuous solver

* adding attribute to access weight matrix

* refactoring continuous solver

* Adding fit_lasso method

* add data_gen_continuous.py and tests (mckinsey#38)

* add data_gen.py

* rename

* wrap SM

* move data_gen_continuous, create test

* more coverage

* test fixes

* move discrete sem to another file

* node list dupe check test

* ValueError tests

* replace dag and sem functions with Ben's versions

* add Ben's tests

* fix fstring

* to_numpy_array coverage

* Ben's comments

* remove unreachable ValueError for coverage

* remove unused fixture

* remove redundant test

* remove extensions

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* docstring

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* docstring

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* docs

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* doc

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* rename file, g_dag rename to sm

* add new tests for equal weights

* docstring

* steve docstring, leq fix

* steve comments + docstrings

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* Adding check input and removing some inner functions

* Removing attribute original_ndarray

* Aligning from pandas with new implementation

* Adding tests for fit_lasso

* More tests for lasso

* wrapping tabu params in a dict

* Aligning tests with new tabu params

* Aligning from_pandas with new tabu_params

* Adding fit_intercept option to _fit method

* Adding scaling option

* fixing lasso tests

* Adding a test for fit_intercept

* scaling option only with mean

* Correction in lasso bounds

* Fix typos

* Remove duplicated bounds function

* adding comments

* add torch files from xunzheng

* add from_numpy_torch function that works like from_numpy_lasso

* lint

* add requirements

* add debug functionality

* add visual debug test

* add license

* allow running as main for viz, comments

* move to contrib

* make multi layer work a bit better

* add comment for multi layer

* use polynomial dag constraint for better speed comparison

* revert unnecessary changes to keep PR lean

* revert unnecessary changes to keep PR lean

* revert unnecessary changes to keep PR lean

* fixes

* refactor

* Integrated tests

* Checkpoint

* Refactoring

* Finished initial refactoring

* All tests passed

* Cleaning

* Git add testing

* Get adjacency matrix

* Done cleaning

* Revert change to original notears

* Revert change to original structuremodel

* Revert change to pylintrc

* Undo deletion

* Apply suggestions from Zain

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* Addressed Zain comments

* Migrated from_numpy

* Delete contrib test

* Migrated w_threshold

* Some linting

* Change to None

* Undo deletion

* List comprehension

* Refactoring scipy and remove scipy optimiser

* Refactoring

* Refactoring

* Refactoring complete

* change from np to torch tensor

* More refactoring

* Remove hnew equal to None

* Refactor again and remove commented line

* Minor change

* change to params

* Addressing Philip's comment

* Add property

* Add fc2 property weights

* Change to weights

* Docstring

* Linting

* Linting completed

* Add gpu code

* Add gpu to from_numpy and from_pandas

* cuda 0 run out of memory

* Debugging

* put 5

* debugging gpu

* shift to inner loop

* debugging not in place

* Use cuda instead of to

* Support both interfaces

* Benchmarking gpu

* Minor fix

* correct import path for test

* change gpu from 5 to 1

* Debugging

* Debugging

* Experimenting

* Linting

* Remove hidden layer and gpu

* Linting

* Testing and linting

* Correct pytorch to torch

* Add init zeros

* Change weight threshold to 0.25

* Revert requirements.txt

* Update release.md

* Address comments

* Corrected release.md

* fc1 to adjacency

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: LiseDiagneQB <60981366+LiseDiagneQB@users.noreply.github.com>
Co-authored-by: Casey Juanxi Li <50737712+caseyliqb@users.noreply.github.com>
Co-authored-by: qbphilip <philip.pilgerstorfer@quantumblack.com>
Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* Pinned sphinx-auto-doc-typehints (mckinsey#66)

* Corrected a spelling/grammar mistake (mckinsey#55)

* Fix/lint (mckinsey#73)

* Hotfix/0.4.3 (mckinsey#7) - Address broken links and grammar

* Fix documentation links in README (mckinsey#2)

* Fix links in README

* library -> libraries

* Fix github link in docs

* Clean up grammar and consistency in documentation (mckinsey#4)

* Clean up grammar and consistency in `README` files

* Add esses, mostly

* Reword feature description to not appear automatic

* Update docs/source/05_resources/05_faq.md

Co-Authored-By: Ben Horsburgh <benhorsburgh@outlook.com>

Co-authored-by: Ben Horsburgh <benhorsburgh@outlook.com>

* hotfix/0.4.3: fix broken links

Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>

* Release/0.5.0

* Plotting now backed by pygraphviz. This allows:
   * More powerful layout manager
   * Cleaner fully customisable theme
   * Out-the-box styling for different node and edge types
* Can now get subgraphs from StructureModel containing a specific node
* Bugfix to resolve issue when fitting CPDs with some missing states in data
* Minor documentation fixes and improvements

* Release/0.6.0

* Release/0.7.0 (mckinsey#57)

* Added plotting tutorial to the documentation
* Updated `viz.draw` syntax in tutorial notebooks
* Bugfix on notears lasso (`from_numpy_lasso` and `from_pandas_lasso`) where the non-negativity constraint was not being set
* Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
* Unpinned some requirements
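The 0.7.0 notes above mention the lasso NOTEARS entry points (`from_numpy_lasso`, `from_pandas_lasso`); a minimal sketch of calling one, treating `beta` as the l1-penalty argument and using toy data:

```python
# Minimal sketch of from_pandas_lasso; treating `beta` as the l1 penalty and
# using two toy columns are assumptions for illustration.
import numpy as np
import pandas as pd
from causalnex.structure.notears import from_pandas_lasso

rng = np.random.default_rng(0)
a = rng.normal(size=500)
df = pd.DataFrame({"a": a, "b": 2.0 * a + rng.normal(size=500)})

sm = from_pandas_lasso(df, beta=0.1)  # sparse weighted adjacency as a StructureModel
print(sm.edges(data=True))
```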

* black

* pin pytorch version

* pin pytorch version

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>

* Structure learning regressor (mckinsey#68)

* initial commit (local copy-paste)

* fixed minor comments

* minor bugfix

* impute from children initial commit

* bugfixes and method option

* auto thresholding

* autothreshold and bugfix

* make threshold removal explicit

* add l1 argument

* remove child imputation

* feat importance fix and tabu logic

* moved threshold till dag

* restructure with base class

* coef mask

* recipe

* enable bias fitting

* persist bias as node attribute

* allow fit_intercept

* minor PR comment fixes

* minor comment adjustment

* test coverage and l1 clarification

* recipe

* minor test fixes

* more tests

* full test coverage

* remove python 3.5/3.6 unsupported import

* add normalization option

* idiomatic typing

* correct pylint errors

* update some tests

* more typing updates

* more pylint requirements

* more pylint disable

* python 3.5 support

* try to get to work with 3.5

* full coverage and 3.5 support

* remove base class to pass test

* remove unneeded suppression

* black formatting changes

* remove unused import

* pylint suppression

* minor reformat change

* isort fix

* better defensive programming

* fix unittests

* docstring update

* do Raises docstring properly

* action SWE suggestions

* hotfixes

* minor update

* minor black formatting change

* final merge checkbox

* fix end of file

* Data Gen root node initialisation fix (mckinsey#72)

* Hotfix/0.4.3 (mckinsey#7) - Address broken links and grammar

* Fix documentation links in README (mckinsey#2)

* Fix links in README

* library -> libraries

* Fix github link in docs

* Clean up grammar and consistency in documentation (mckinsey#4)

* Clean up grammar and consistency in `README` files

* Add esses, mostly

* Reword feature description to not appear automatic

* Update docs/source/05_resources/05_faq.md

Co-Authored-By: Ben Horsburgh <benhorsburgh@outlook.com>

Co-authored-by: Ben Horsburgh <benhorsburgh@outlook.com>

* hotfix/0.4.3: fix broken links

Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>

* Release/0.5.0

* Plotting now backed by pygraphviz. This allows:
   * More powerful layout manager
   * Cleaner fully customisable theme
   * Out-the-box styling for different node and edge types
* Can now get subgraphs from StructureModel containing a specific node
* Bugfix to resolve issue when fitting CPDs with some missing states in data
* Minor documentation fixes and improvements

* Release/0.6.0

* Release/0.7.0 (mckinsey#57)

* Added plotting tutorial to the documentation
* Updated `viz.draw` syntax in tutorial notebooks
* Bugfix on notears lasso (`from_numpy_lasso` and `from_pandas_lasso`) where the non-negativity constraint was not being set
* Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
* Unpinned some requirements

* fix for continuous normal data

* generalise across all dtypes

* support fit_intercept

* fixed many test errors

* test logic fixes

* lint test fixes

* python 3.5 failure change

* minor test bugfix

* black

* pin pytorch version

* pin pytorch version

* additional test parameter

* black formatting

* requested changes

* test updates and docstring

* black format change

* disable too many lines

* change

* move recipe to tutorial folder

* releaseMD changes

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>
Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>
Co-authored-by: qbphilip <philip.pilgerstorfer@quantumblack.com>

* [1/2] Poisson data for data gen (mckinsey#61)

* Hotfix/0.4.3 (mckinsey#7) - Address broken links and grammar

* Fix documentation links in README (mckinsey#2)

* Fix links in README

* library -> libraries

* Fix github link in docs

* Clean up grammar and consistency in documentation (mckinsey#4)

* Clean up grammar and consistency in `README` files

* Add esses, mostly

* Reword feature description to not appear automatic

* Update docs/source/05_resources/05_faq.md

Co-Authored-By: Ben Horsburgh <benhorsburgh@outlook.com>

Co-authored-by: Ben Horsburgh <benhorsburgh@outlook.com>

* hotfix/0.4.3: fix broken links

Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>

* Release/0.5.0

* Plotting now backed by pygraphviz. This allows:
   * More powerful layout manager
   * Cleaner fully customisable theme
   * Out-the-box styling for different node and edge types
* Can now get subgraphs from StructureModel containing a specific node
* Bugfix to resolve issue when fitting CPDs with some missing states in data
* Minor documentation fixes and improvements

* Release/0.6.0

* Release/0.7.0 (mckinsey#57)

* Added plotting tutorial to the documentation
* Updated `viz.draw` syntax in tutorial notebooks
* Bugfix on notears lasso (`from_numpy_lasso` and `from_pandas_lasso`) where the non-negativity constraint was not being set
* Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
* Unpinned some requirements

* refactor & docstring

* remove unused helper object

* add data gen to init

* make test more robust

* add count data and test, use logs for poisson samples for stability

* fix tests

* duplicate fixtures

* remove unused fixtures

* refactor data_generators into package with core and wrappers

* move wrapper to test_wrapper

* variable name change bugfix

* fix tests

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>
Co-authored-by: angeldrothqb <angel.droth@quantumblack.com>

* [2/2] Nonlinear Data gen (mckinsey#60)

* Hotfix/0.4.3 (mckinsey#7) - Address broken links and grammar

* Fix documentation links in README (mckinsey#2)

* Fix links in README

* library -> libraries

* Fix github link in docs

* Clean up grammar and consistency in documentation (mckinsey#4)

* Clean up grammar and consistency in `README` files

* Add esses, mostly

* Reword feature description to not appear automatic

* Update docs/source/05_resources/05_faq.md

Co-Authored-By: Ben Horsburgh <benhorsburgh@outlook.com>

Co-authored-by: Ben Horsburgh <benhorsburgh@outlook.com>

* hotfix/0.4.3: fix broken links

Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>

* Release/0.5.0

* Plotting now backed by pygraphviz. This allows:
   * More powerful layout manager
   * Cleaner fully customisable theme
   * Out-the-box styling for different node and edge types
* Can now get subgraphs from StructureModel containing a specific node
* Bugfix to resolve issue when fitting CPDs with some missing states in data
* Minor documentation fixes and improvements

* Release/0.6.0

* Release/0.7.0 (mckinsey#57)

* Added plotting tutorial to the documentation
* Updated `viz.draw` syntax in tutorial notebooks
* Bugfix on notears lasso (`from_numpy_lasso` and `from_pandas_lasso`) where the non-negativity constraint was not being set
* Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
* Unpinned some requirements

* refactor & docstring

* remove unused helper object

* add data gen to init

* make test more robust

* add count data and test, use logs for poisson samples for stability

* add nonlinear

* fix tests

* duplicate fixtures

* remove unused fixtures

* refactor data_generators into package with core and wrappers

* move wrapper to test_wrapper

* add nonlinear to init

* change order in all

* change release.md

* root node fix on core + count

* nonlinear support to wrappers

* docstring update

* bugfix and reproducibility fix

* many tests and test updates

* poiss bugfix and test fix

* moar test coverage

* categorical dataframe test coverage

* full test coverage and linting

* fix linting and fstring

* black reformat

* fix unused pylint argument

* pytest fix

* FINAL linting fix

* Fix stuff (mckinsey#75)

CircleCI fixes

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>
Co-authored-by: angeldrothqb <angel.droth@quantumblack.com>
Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* update black version (mckinsey#76)

* fix black

* Fix/check for NA or Infinity when notears is used  (mckinsey#54)

* update scipy version (mckinsey#77)

* add DYNOTEARS implementation (mckinsey#50)

Adds DYNOTEARS and corresponding data generator (for testing)
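A hedged usage sketch of the dynamic entry point follows; the lag argument `p` and the toy time series are assumptions for illustration.

```python
# Hedged sketch of learning a dynamic structure; `p` (number of lags) and the
# toy time series are assumptions, not taken from the PR.
import numpy as np
import pandas as pd
from causalnex.structure.dynotears import from_pandas_dynamic

rng = np.random.default_rng(1)
ts = pd.DataFrame(rng.normal(size=(200, 3)), columns=["x", "y", "z"])

sm = from_pandas_dynamic(ts, p=1)  # intra-slice and one-lag inter-slice edges
print(sm.edges())
```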

* Pytorch NOTEARS extension - Non-Linear/Hidden Layer (mckinsey#65)

* NoTears as ScoreSolver

* refactor continuous solver

* adding attribute to access weight matrix

* refactoring continuous solver

* Adding fit_lasso method

* add data_gen_continuous.py and tests (mckinsey#38)

* add data_gen.py

* rename

* wrap SM

* move data_gen_continuous, create test

* more coverage

* test fixes

* move discrete sem to another file

* node list dupe check test

* ValueError tests

* replace dag and sem functions with Ben's versions

* add Ben's tests

* fix fstring

* to_numpy_array coverage

* Ben's comments

* remove unreachable ValueError for coverage

* remove unused fixture

* remove redundant test

* remove extensions

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* docstring

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* docstring

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* docs

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* doc

Co-Authored-By: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* rename file, g_dag rename to sm

* add new tests for equal weights

* docstring

* steve docstring, leq fix

* steve comments + docstrings

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>

* Adding check input and removing some inner functions

* Removing attribute original_ndarray

* Aligning from pandas with new implementation

* Adding tests for fit_lasso

* More tests for lasso

* wrapping tabu params in a dict

* Aligning tests with new tabu params

* Aligning from_pandas with new tabu_params

* Adding fit_intercept option to _fit method

* Adding scaling option

* fixing lasso tests

* Adding a test for fit_intercept

* scaling option only with mean

* Correction in lasso bounds

* Fix typos

* Remove duplicated bounds function

* adding comments

* add torch files from xunzheng

* add from_numpy_torch function that works like from_numpy_lasso

* lint

* add requirements

* add debug functionality

* add visual debug test

* add license

* allow running as main for viz, comments

* move to contrib

* make multi layer work a bit better

* add comment for multi layer

* use polynomial dag constraint for better speed comparison

* revert unnecessary changes to keep PR lean

* revert unnecessary changes to keep PR lean

* revert unnecessary changes to keep PR lean

* fixes

* refactor

* Integrated tests

* Checkpoint

* Refactoring

* Finished initial refactoring

* All tests passed

* Cleaning

* Git add testing

* Get adjacency matrix

* Done cleaning

* Revert change to original notears

* Revert change to original structuremodel

* Revert change to pylintrc

* Undo deletion

* Apply suggestions from Zain

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* Addressed Zain comments

* Migrated from_numpy

* Delete contrib test

* Migrated w_threshold

* Some linting

* Change to None

* Undo deletion

* List comprehension

* Refactoring scipy and remove scipy optimiser

* Refactoring

* Refactoring

* Refactoring complete

* change from np to torch tensor

* More refactoring

* Remove hnew equal to None

* Refactor again and remove commented line

* Minor change

* change to params

* Addressing Philip's comment

* Add property

* Add fc2 property weights

* Change to weights

* Docstring

* Linting

* Linting completed

* Add gpu code

* Add gpu to from_numpy and from_pandas

* cuda 0 run out of memory

* Debugging

* put 5

* debugging gpu

* shift to inner loop

* debugging not in place

* Use cuda instead of to

* Support both interfaces

* Benchmarking gpu

* Minor fix

* correct import path for test

* change gpu from 5 to 1

* Debugging

* Debugging

* Experimenting

* Linting

* Remove hidden layer and gpu

* Linting

* Testing and linting

* Correct pytorch to torch

* Add init zeros

* Change weight threshold to 0.25

* Revert requirements.txt

* Add hidden layer

* small refactor

* directional adj

* minor edits

* fix bias issues

* breaking changes update to the interface

* typo

* new regressor regularisation interface

* update forward method

* forward(X) predictions work

* working!

* bugfix data normalisation

* some fixes

* average regularisation and adj calc at end

* give credit!

Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>

* loc lin docstring update

Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>

* docstring + fc1/fc2 name updates

* moar docstring updates

* more minor updates

* remove normalize option

* plotting util

* rename to DAGRegressor

* rename and checks

* more util functions

* fix bias

* fix bias with no intercept

* fix linear adj

* add tests

* minor fix

* minor fixes

* extend interface to bias

* differentiate coef_ and feature_importances

* separate bias element

* tests

* more test coverage

* nonlinear test coverage

* test hotfix

* more test coverage

* test requirements update

* more test coverage

* formatting changes

* final pylint change

* more linting

* more bestpractice structuring

* more minor fixes

* FINAL linting updates

* actual last change

* update to reg defaults, additions to the tutorial

* nonlinear regularisation updates

* regressor tutorial

* almost finishing touches

* gradient based h function!

* soft clamp and coef feature importance separation

* small api update, closer to batchnorm

* docstring updates

* stronger soft clamping

* gradient L1 rather than L2

* fcpos neg removal, gradient optim

* revert back to create_graph=True for 2nd derivative

* remove print and test fix

* black reformatting

* new black version

* full test coverage

* isort fix

* pylint fix

* first layer h(W) for speed optimization

* fix batch norm system

* add nonlinear test

* test hotfix

* black reformat

* isort fix

* remove X requirement from h_func

* regressor tutorial final commit and black update

* LayerNorm replacement

Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>

* major changes

* add standardization

* minor changes

* fix tests

* rename reg parameters

* linting

* test coverage, docstring

* check array for infs

* fix isinstance to base type

* fix isort, add test coverage

* new tutorial

* docstring fix

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* test string match

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* assert improvement

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* SWE suggestions

* minor bugfix

* more test fixing

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: LiseDiagneQB <60981366+LiseDiagneQB@users.noreply.github.com>
Co-authored-by: Casey Juanxi Li <50737712+caseyliqb@users.noreply.github.com>
Co-authored-by: qbphilip <philip.pilgerstorfer@quantumblack.com>
Co-authored-by: Zain Patel <zain.patel@quantumblack.com>
Co-authored-by: angeldrothqb <angel.droth@quantumblack.com>
Co-authored-by: angeldrothqb <67913551+angeldrothqb@users.noreply.github.com>
Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>

* Merge release/0.8.0 back into develop for release 0.8.0 (mckinsey#82)

* Hotfix/0.4.3 (mckinsey#7) - Address broken links and grammar

* Fix documentation links in README (mckinsey#2)

* Fix links in README

* library -> libraries

* Fix github link in docs

* Clean up grammar and consistency in documentation (mckinsey#4)

* Clean up grammar and consistency in `README` files

* Add esses, mostly

* Reword feature description to not appear automatic

* Update docs/source/05_resources/05_faq.md

Co-Authored-By: Ben Horsburgh <benhorsburgh@outlook.com>

Co-authored-by: Ben Horsburgh <benhorsburgh@outlook.com>

* hotfix/0.4.3: fix broken links

Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>

* Release/0.5.0

* Plotting now backed by pygraphviz. This allows:
   * More powerful layout manager
   * Cleaner fully customisable theme
   * Out-the-box styling for different node and edge types
* Can now get subgraphs from StructureModel containing a specific node
* Bugfix to resolve issue when fitting CPDs with some missing states in data
* Minor documentation fixes and improvements

* Release/0.6.0

* Release/0.7.0 (mckinsey#57)

* Added plotting tutorial to the documentation
* Updated `viz.draw` syntax in tutorial notebooks
* Bugfix on notears lasso (`from_numpy_lasso` and `from_pandas_lasso`) where the non-negativity constraint was not being set
* Added DAG-based synthetic data generator for mixed types (binary, categorical, continuous) using a linear SEM approach.
* Unpinned some requirements

* release.md, version bump, docs

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>
Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>

* fix tests (mckinsey#87)

* Fix pygments fail (mckinsey#84)

Thanks Zain!

* update notebook beginning (mckinsey#89)

* Add Binary distribution type support (mckinsey#85)

* binary dtype folder and __init__

* dtype base class

* continuous dtype

* binary dtype

* update core

* make plural

* update interface for idx

* minor variable name change

* notears update

* python 3.5 support

* fix fstring

* remove categorical methods, docstrings

* formatting and docstrings

* remove redundant cat code

* isort

* indexerror check

* defensive check tests

* datatype loss tests

* more test coverage

* more tests and formatting

* fix test import

* remove double test

* linting

* docstring and pylint

* docstring fix

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* fix long string

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* docstring fix

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* remove relative imports

* docstring fix

* dict comprehension

* list comprehension and neatness

* remove unused import from __init__

* fix test

* remove unused return interface

* add binary f1score tests

* one datatype instance per feature

* rename dtype -> disttype, attach dists to nodes

* fix tests

* fix linting

* fix preserve node dtype

* fix tests

* fix tests

* fix tests

* final docstring and test fixes

* lint fix

* test_fix, warning

* linting

* fix test

* fix tests

* reduce threshold of test

* raise better error

* black linting

* remove warning

* remove useless suppression and import

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* Add sklearn binary classifier (mckinsey#90)

* binary dtype folder and __init__

* dtype base class

* continuous dtype

* binary dtype

* update core

* make plural

* update interface for idx

* minor variable name change

* notears update

* python 3.5 support

* fix fstring

* remove categorical methods, docstrings

* formatting and docstrings

* remove redundant cat code

* isort

* indexerror check

* defensive check tests

* datatype loss tests

* more test coverage

* more tests and formatting

* fix test import

* remove double test

* linting

* docstring and pylint

* docstring fix

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* fix long string

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* docstring fix

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* remove relative imports

* docstring fix

* dict comprehension

* list comprehension and neatness

* remove unused import from __init__

* fix test

* remove unused return interface

* new sklearn folder structure

* sklearn class outline

* new dtype interface

* docstring clarification

* inverse link function

* add binary f1score tests

* one datatype instance per feature

* rename dtype -> disttype, attach dists to nodes

* fix tests

* fix linting

* fix preserve node dtype

* fix tests

* fix tests

* fix tests

* final docstring and test fixes

* lint fix

* test_fix, warning

* linting

* fix test

* fix tests

* reduce threshold of test

* docstring clarification

* _target_dist_type injection

* docstring updates + clf fit outline

* old docstring deprecation

* clf predict_proba and predict

* return bugfix

* docstring update

* import fix, linting, clf fit finished

* args docstring and None schema

* raise better error

* black linting

* linting

* revert to public interface

* remove warning

* remove useless suppression and import

* add useless change to resolve merge conflict

* update inits

* standardization and data reconstruction

* remove unused imports

* fix clf .predict()

* regressor fit_predict

* remove useless regressor predict

* test import fix

* fix warnings

* pass series name thru

* fix schema pass thru

* better dict comprehension

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* import and comment fixes

* update to .format()

* fix sklearn is fitted test

* more dtype schema insertion

* DAGRegressor test fix

* dag regressor test

* more linting

* big test restructure

* combined test suite

* error string update

* more test coverage

* linting and isort

* move test to combined test

* return float64 preds

* moar clf tests

* remove untestable (multiclass) code

* class number error test

* black reformat

* docstrings, pylint

* fix test bug

* standard scaler for _base

* pull classes direct from LabelEncoder

* update tutorial

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* Fix/uniform discretisation (mckinsey#65)

* Fix uniform discretiser

* Fix uniform discretiser

Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>

* release.MD

* version bump

* Update causalnex/structure/pytorch/dist_type/_base.py

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

* Update causalnex/structure/pytorch/dist_type/__init__.py

Co-authored-by: Zain Patel <zain.patel@quantumblack.com>

Co-authored-by: Ben Horsburgh <Ben.Horsburgh@quantumblack.com>
Co-authored-by: GabrielAzevedoFerreiraQB <57528979+GabrielAzevedoFerreiraQB@users.noreply.github.com>
Co-authored-by: Philip Pilgerstorfer <34248114+qbphilip@users.noreply.github.com>
Co-authored-by: stevelersl <55385183+SteveLerQB@users.noreply.github.com>
Co-authored-by: LiseDiagneQB <60981366+LiseDiagneQB@users.noreply.github.com>
Co-authored-by: Casey Juanxi Li <50737712+caseyliqb@users.noreply.github.com>
Co-authored-by: qbphilip <philip.pilgerstorfer@quantumblack.com>
Co-authored-by: Zain Patel <zain.patel@quantumblack.com>
Co-authored-by: KING-SID <sidhantbendre22@gmail.com>
Co-authored-by: Zain Patel <30357972+mzjp2@users.noreply.github.com>
Co-authored-by: Nikos Tsaousis <tsanikgr@users.noreply.github.com>
Co-authored-by: Deepyaman Datta <deepyaman.datta@utexas.edu>
Co-authored-by: Jebq <jb.oger2312@gmail.com>
Co-authored-by: Shuhei Ishida <shuhei.ishida66@gmail.com>
15 people committed Sep 18, 2020
1 parent 8265f64 commit e29bc0c
Showing 25 changed files with 1,356 additions and 252 deletions.
11 changes: 10 additions & 1 deletion RELEASE.md
@@ -1,5 +1,14 @@
# Upcoming release

# Release 0.8.1

* Added `DAGClassifier` sklearn interface using the Pytorch NOTEARS implementation. Supports binary classification.
* Added binary distributed data support for pytorch NOTEARS.
* Added a "distribution type" schema system for pytorch NOTEARS (`pytorch.dist_type`).
* Rename "data type" to "distribution type" in internal language.
* Fixed uniform discretiser (`Discretiser(method='uniform')`) where all bins have identical widths.
* Fixed and updated sklearn tutorial in docs.

# Release 0.8.0

* Add DYNOTEARS (`from_numpy_dynamic`, an algorithm for structure learning on Dynamic Bayesian Networks).
@@ -52,6 +61,6 @@ The initial release of CausalNex.

## Thanks for supporting contributions
CausalNex was originally designed by [Paul Beaumont](https://www.linkedin.com/in/pbeaumont/) and [Ben Horsburgh](https://www.linkedin.com/in/benhorsburgh/) to solve challenges they faced in inferencing causality in their project work. This work was later turned into a product thanks to the following contributors:
[Yetunde Dada](https://github.com/yetudada), [Wesley Leong](https://www.linkedin.com/in/wesleyleong/), [Steve Ler](https://www.linkedin.com/in/song-lim-steve-ler-380366106/), [Viktoriia Oliinyk](https://www.linkedin.com/in/victoria-oleynik/), [Roxana Pamfil](https://www.linkedin.com/in/roxana-pamfil-1192053b/), [Nisara Sriwattanaworachai](https://www.linkedin.com/in/nisara-sriwattanaworachai-795b357/), [Nikolaos Tsaousis](https://www.linkedin.com/in/ntsaousis/), [Angel Droth](https://www.linkedin.com/in/angeldroth/), and [Zain Patel](https://www.linkedin.com/in/zain-patel/).
[Yetunde Dada](https://github.com/yetudada), [Wesley Leong](https://www.linkedin.com/in/wesleyleong/), [Steve Ler](https://www.linkedin.com/in/song-lim-steve-ler-380366106/), [Viktoriia Oliinyk](https://www.linkedin.com/in/victoria-oleynik/), [Roxana Pamfil](https://www.linkedin.com/in/roxana-pamfil-1192053b/), [Nisara Sriwattanaworachai](https://www.linkedin.com/in/nisara-sriwattanaworachai-795b357/), [Nikolaos Tsaousis](https://www.linkedin.com/in/ntsaousis/), [Angel Droth](https://www.linkedin.com/in/angeldroth/), [Zain Patel](https://www.linkedin.com/in/zain-patel/), and [Shuhei Ishida](https://www.linkedin.com/in/shuhei-i/).

CausalNex would also not be possible without the generous sharing from leading researches in the field of causal inference and we are grateful to everyone who advised and supported us, filed issues or helped resolve them, asked and answered questions or simply be part of inspiring discussions.
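The 0.8.1 notes above introduce the `DAGClassifier` sklearn interface; a minimal usage sketch on synthetic binary-target data, with default hyperparameters assumed:

```python
# Minimal sketch of the DAGClassifier interface described in the release notes
# above; the synthetic data and default settings are assumptions.
import numpy as np
from causalnex.structure.pytorch import DAGClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy binary target

clf = DAGClassifier()
clf.fit(X, y)
print(clf.predict_proba(X)[:5])  # per-class probabilities
print(clf.predict(X)[:5])        # hard 0/1 labels
```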
2 changes: 1 addition & 1 deletion causalnex/__init__.py
@@ -30,6 +30,6 @@
causalnex toolkit for causal reasoning (Bayesian Networks / Inference)
"""

__version__ = "0.8.0"
__version__ = "0.8.1"

__all__ = ["structure", "discretiser", "evaluation", "inference", "network", "plots"]
5 changes: 2 additions & 3 deletions causalnex/discretiser/discretiser.py
@@ -174,10 +174,9 @@ def fit(self, data: np.ndarray) -> "Discretiser":
x.sort()

if self.method == "uniform":
bucket_width = len(x) / self.num_buckets
bucket_width = (np.max(x) - np.min(x)) / self.num_buckets
self.numeric_split_points = [
x[int(np.floor((n + 1) * bucket_width))]
for n in range(self.num_buckets - 1)
np.min(x) + bucket_width * (n + 1) for n in range(self.num_buckets - 1)
]

elif self.method == "quantile":
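The change above swaps an index-based bucket width for a value-range width, so every bin now has the same width; a standalone sketch of the corrected split-point logic:

```python
# Standalone sketch of the corrected equal-width split-point calculation shown
# in the diff above (not the library call itself).
import numpy as np

def uniform_split_points(x: np.ndarray, num_buckets: int) -> list:
    """Return num_buckets - 1 split points giving identical bin widths."""
    bucket_width = (np.max(x) - np.min(x)) / num_buckets
    return [np.min(x) + bucket_width * (n + 1) for n in range(num_buckets - 1)]

x = np.array([0.0, 1.0, 2.0, 7.0, 10.0])
print(uniform_split_points(x, num_buckets=2))  # [5.0] -> two bins of width 5
```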
11 changes: 9 additions & 2 deletions causalnex/structure/__init__.py
@@ -30,7 +30,14 @@
``causalnex.structure`` provides functionality to define or learn structure.
"""

__all__ = ["StructureModel", "notears", "dynotears", "data_generators", "DAGRegressor"]
__all__ = [
"StructureModel",
"notears",
"dynotears",
"data_generators",
"DAGRegressor",
"DAGClassifier",
]

from .sklearn import DAGRegressor
from .pytorch import DAGClassifier, DAGRegressor
from .structuremodel import StructureModel
3 changes: 1 addition & 2 deletions causalnex/structure/dynotears.py
@@ -39,8 +39,7 @@
import scipy.optimize as sopt

from causalnex.structure import StructureModel

from .transformers import DynamicDataTransformer
from causalnex.structure.transformers import DynamicDataTransformer


def from_pandas_dynamic( # pylint: disable=too-many-arguments
3 changes: 2 additions & 1 deletion causalnex/structure/pytorch/__init__.py
@@ -30,7 +30,8 @@
``causalnex.structure.pytorch`` provides functionality to define or learn structure using pytorch.
"""

__all__ = ["from_numpy", "from_pandas", "NotearsMLP"]
__all__ = ["from_numpy", "from_pandas", "NotearsMLP", "DAGRegressor", "DAGClassifier"]

from .core import NotearsMLP
from .notears import from_numpy, from_pandas
from .sklearn import DAGClassifier, DAGRegressor
42 changes: 39 additions & 3 deletions causalnex/structure/pytorch/core.py
@@ -45,7 +45,8 @@
import torch.nn as nn
from sklearn.base import BaseEstimator

from .nonlinear import LocallyConnected
from causalnex.structure.pytorch.dist_type._base import DistTypeBase
from causalnex.structure.pytorch.nonlinear import LocallyConnected


class NotearsMLP(nn.Module, BaseEstimator):
@@ -56,9 +56,11 @@ class NotearsMLP(nn.Module, BaseEstimator):
loc_lin_layer weights are the weight of hidden layers after the first fully connected layer
"""

# pylint: disable=too-many-arguments
def __init__(
self,
n_features: int,
dist_types: List[DistTypeBase],
use_bias: bool = False,
hidden_layer_units: Iterable[int] = (0,),
bounds: List[Tuple[int, int]] = None,
@@ -70,7 +70,8 @@ def __init__(
Constructor for NOTEARS MLP class.
Args:
n_features: number of input features
n_features: number of input features.
dist_types: list of data type objects used to fit the NOTEARS algorithm.
use_bias: True to add the intercept to the model
hidden_layer_units: An iterable where its length determine the number of layers used,
and the numbers determine the number of nodes used for the layer in order.
@@ -116,6 +120,8 @@ def __init__(

# set the bounds as an attribute on the weights object
self.dag_layer.weight.bounds = bounds
# set the dist types
self.dist_types = dist_types
# type the adjacency matrix
self.adj = None
self.adj_mean_effect = None
@@ -175,6 +181,31 @@ def forward(self, x: torch.Tensor) -> torch.Tensor: # [n, d] -> [n, d]
x = x.squeeze(dim=2) # [n, d]
return x

def reconstruct_data(self, X: np.ndarray) -> np.ndarray:
"""
Performs X_hat reconstruction,
then converts latent space to original data space via link function.
Args:
X: input data used to reconstruct
Returns:
reconstructed data
"""

with torch.no_grad():
# convert the predict data to pytorch tensor
X = torch.from_numpy(X).float().to(self.device)

# perform forward reconstruction
X_hat = self(X)

# recover each one of the latent space projections
for dist_type in self.dist_types:
X_hat = dist_type.inverse_link_function(X_hat)

return np.asarray(X_hat.cpu().detach().numpy().astype(np.float64))

@property
def bias(self) -> Union[np.ndarray, None]:
"""
@@ -334,7 +365,12 @@ def _func(flat_params: np.ndarray) -> Tuple[float, np.ndarray]:
X_hat = self(X)
h_val = self._h_func()

loss = (0.5 / X.shape[0]) * torch.sum((X_hat - X) ** 2)
# preallocate loss tensor
loss = torch.tensor(0, device=X.device) # pylint: disable=not-callable
# sum the losses across all dist types
for dist_type in self.dist_types:
loss = loss + dist_type.loss(X, X_hat)

lagrange_penalty = 0.5 * rho * h_val * h_val + alpha * h_val
# NOTE: both the l2 and l1 regularization are NOT applied to the bias parameters
l2_reg = 0.5 * self.ridge_beta * self._l2_reg(n_features)
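The reconstruction loss above is now accumulated over distribution-type objects instead of being hard-coded as squared error; a toy sketch of that accumulation pattern, using a hypothetical stand-in type rather than the library classes:

```python
# Toy sketch of the per-distribution-type loss summation used in core.py above;
# _ToySquaredErrorType is hypothetical and stands in for the DistType* classes.
import torch

class _ToySquaredErrorType:
    def __init__(self, idx: int):
        self.idx = idx

    def loss(self, X: torch.Tensor, X_hat: torch.Tensor) -> torch.Tensor:
        return torch.mean((X_hat[:, self.idx] - X[:, self.idx]) ** 2)

dist_types = [_ToySquaredErrorType(0), _ToySquaredErrorType(1)]
X = torch.randn(8, 2)
X_hat = torch.zeros(8, 2)

loss = torch.tensor(0.0)
for dist_type in dist_types:  # same accumulation pattern as the diff above
    loss = loss + dist_type.loss(X, X_hat)
print(loss)
```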
45 changes: 45 additions & 0 deletions causalnex/structure/pytorch/dist_type/__init__.py
@@ -0,0 +1,45 @@
# Copyright 2019-2020 QuantumBlack Visual Analytics Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND
# NONINFRINGEMENT. IN NO EVENT WILL THE LICENSOR OR OTHER CONTRIBUTORS
# BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN AN
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF, OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# The QuantumBlack Visual Analytics Limited ("QuantumBlack") name and logo
# (either separately or in combination, "QuantumBlack Trademarks") are
# trademarks of QuantumBlack. The License does not grant you any right or
# license to the QuantumBlack Trademarks. You may not use the QuantumBlack
# Trademarks or any confusingly similar mark as a trademark for your product,
# or use the QuantumBlack Trademarks in any other manner that might cause
# confusion in the marketplace, including but not limited to in advertising,
# on websites, or on software.
#
# See the License for the specific language governing permissions and
# limitations under the License.

"""
``causalnex.pytorch.dist_type`` provides distribution type support classes for the pytorch NOTEARS algorithm.
"""

from .binary import DistTypeBinary
from .continuous import DistTypeContinuous

dist_type_aliases = {
"bin": DistTypeBinary,
"cont": DistTypeContinuous,
}


__all__ = [
"DistTypeBinary",
"DistTypeContinuous",
]
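A short sketch of how the alias table above can be turned into per-column distribution-type objects; the schema dict and the wiring are illustrative, not the library's internal code path:

```python
# Building per-column dist-type objects from the alias table above; the schema
# dict and this wiring are illustrative, not library internals.
from causalnex.structure.pytorch.dist_type import dist_type_aliases

schema = {0: "cont", 1: "bin", 2: "cont"}  # column index -> distribution alias
dist_types = [dist_type_aliases[alias](idx=idx) for idx, alias in schema.items()]
print([type(d).__name__ for d in dist_types])
# ['DistTypeContinuous', 'DistTypeBinary', 'DistTypeContinuous']
```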
79 changes: 79 additions & 0 deletions causalnex/structure/pytorch/dist_type/_base.py
@@ -0,0 +1,79 @@
# Copyright 2019-2020 QuantumBlack Visual Analytics Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND
# NONINFRINGEMENT. IN NO EVENT WILL THE LICENSOR OR OTHER CONTRIBUTORS
# BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN AN
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF, OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# The QuantumBlack Visual Analytics Limited ("QuantumBlack") name and logo
# (either separately or in combination, "QuantumBlack Trademarks") are
# trademarks of QuantumBlack. The License does not grant you any right or
# license to the QuantumBlack Trademarks. You may not use the QuantumBlack
# Trademarks or any confusingly similar mark as a trademark for your product,
# or use the QuantumBlack Trademarks in any other manner that might cause
# confusion in the marketplace, including but not limited to in advertising,
# on websites, or on software.
#
# See the License for the specific language governing permissions and
# limitations under the License.

"""
``causalnex.pytorch.dist_type._base`` defines the distribution type class interface and default behavior.
"""

from abc import ABCMeta, abstractmethod

import torch


class DistTypeBase(metaclass=ABCMeta):
""" Base class defining the distribution default behavior and interface """

def __init__(self, idx: int):
"""
Default constructor for the DistTypeBase class.
Unless overridden, provides default behavior to all subclasses.
Args:
idx: Positional index in data passed to the NOTEARS algorithm
which correspond to this datatype.
"""
self.idx = idx

@abstractmethod
def loss(self, X: torch.Tensor, X_hat: torch.Tensor) -> torch.Tensor:
"""
Args:
X: The original data passed into NOTEARS (i.e. the reconstruction target).
X_hat: The reconstructed data.
Returns:
Scalar pytorch tensor of the reconstruction loss between X and X_hat.
"""
raise NotImplementedError("Must implement the loss() method")

@abstractmethod
def inverse_link_function(self, X_hat: torch.Tensor) -> torch.Tensor:
"""
Convert the transformed data from the latent space to the original dtype
using the inverse link function.
Args:
X_hat: Reconstructed data in the latent space.
Returns:
Modified X_hat.
MUST be same shape as passed in data.
Projects the self.idx column from the latent space to the dist_type space.
"""
raise NotImplementedError("Must implement the inverse_link_function() method")
77 changes: 77 additions & 0 deletions causalnex/structure/pytorch/dist_type/binary.py
@@ -0,0 +1,77 @@
# Copyright 2019-2020 QuantumBlack Visual Analytics Limited
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND
# NONINFRINGEMENT. IN NO EVENT WILL THE LICENSOR OR OTHER CONTRIBUTORS
# BE LIABLE FOR ANY CLAIM, DAMAGES, OR OTHER LIABILITY, WHETHER IN AN
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF, OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# The QuantumBlack Visual Analytics Limited ("QuantumBlack") name and logo
# (either separately or in combination, "QuantumBlack Trademarks") are
# trademarks of QuantumBlack. The License does not grant you any right or
# license to the QuantumBlack Trademarks. You may not use the QuantumBlack
# Trademarks or any confusingly similar mark as a trademark for your product,
# or use the QuantumBlack Trademarks in any other manner that might cause
# confusion in the marketplace, including but not limited to in advertising,
# on websites, or on software.
#
# See the License for the specific language governing permissions and
# limitations under the License.

"""
``causalnex.pytorch.data_type.continuous`` defines the binary distribution type.
"""

import torch
import torch.nn as nn

from causalnex.structure.pytorch.dist_type._base import DistTypeBase


class DistTypeBinary(DistTypeBase):
""" Class defining binary distribution type functionality """

def loss(self, X: torch.Tensor, X_hat: torch.Tensor) -> torch.Tensor:
"""
https://pytorch.org/docs/stable/nn.html#torch.nn.BCEWithLogitsLoss
Uses the functional implementation of the BCEWithLogitsLoss class.
The average logit binary cross entropy loss.
Averages across sample dimension (dim=0).
Args:
X: The original data passed into NOTEARS (i.e. the reconstruction target).
X_hat: The reconstructed data.
Returns:
Scalar pytorch tensor of the reconstruction loss between X and X_hat.
"""
return nn.functional.binary_cross_entropy_with_logits(
input=X_hat[:, self.idx],
target=X[:, self.idx],
reduction="mean",
)

def inverse_link_function(self, X_hat: torch.Tensor) -> torch.Tensor:
"""
Inverse-logit (sigmoid) inverse link function for binary data.
Args:
X_hat: Reconstructed data in the latent space.
Returns:
Modified X_hat.
MUST be same shape as passed in data.
Projects the self.idx column from the latent space to the dist_type space.
"""
X_hat[:, self.idx] = torch.sigmoid(X_hat[:, self.idx])
return X_hat
