Merged
Changes from all commits (400 commits)
79b8bf6
Tests OneHotEncoder on sparse matrices
mfeurer Feb 27, 2015
61c90c6
Added the three naive bayes classifiers.
Mar 2, 2015
9f5067a
added bagged naive bayes classifier
Mar 2, 2015
1549acd
fixed negative value on log scale bug in bernoulli_nb.py
Mar 4, 2015
953dcea
adjust #hyperparameter to fix unittest
KEggensperger Mar 4, 2015
b7fbc6b
adjust to new repo name
KEggensperger Mar 4, 2015
57320f8
Naive Bayes: forbid with preprocessors which exhibit negative values
mfeurer Mar 4, 2015
378e828
Adapt tests to new NB classifiers
mfeurer Mar 4, 2015
d4c5bdb
minor
KEggensperger Mar 6, 2015
b3939a4
add adaboost for classification
KEggensperger Mar 6, 2015
ff90dc5
add input/output key; reorder inputs
KEggensperger Mar 6, 2015
a336016
add input/output key; resort imports
KEggensperger Mar 6, 2015
506bbce
add unittest to assert keys in property dict
KEggensperger Mar 6, 2015
fb8b974
minor
KEggensperger Mar 6, 2015
59cb50b
fix typo
KEggensperger Mar 6, 2015
eb8df6c
add input/output and generic searchspace generation
KEggensperger Mar 9, 2015
ae7d19a
adjust property dicts, all components should have the same keys
KEggensperger Mar 9, 2015
451e444
add a dummy preprocessor, that is always part of the cs
KEggensperger Mar 9, 2015
6d94904
Fix configuration spaces
mfeurer Mar 9, 2015
c6eff2e
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer Mar 9, 2015
4302345
fix typo
KEggensperger Mar 10, 2015
be311ca
add checks for property keys and remove no longer valid checks
KEggensperger Mar 10, 2015
f9209a6
rebuild classification searchspace with transforming preprocessing me…
KEggensperger Mar 10, 2015
219ff05
add input/output key; reorder inputs
KEggensperger Mar 6, 2015
61701d8
add input/output key; resort imports
KEggensperger Mar 6, 2015
a9f232a
add unittest to assert keys in property dict
KEggensperger Mar 6, 2015
d5caefa
minor
KEggensperger Mar 6, 2015
928b592
fix typo
KEggensperger Mar 6, 2015
09510ae
add input/output and generic searchspace generation
KEggensperger Mar 9, 2015
af73be9
adjust property dicts, all components should have the same keys
KEggensperger Mar 9, 2015
6764d1f
add a dummy preprocessor, that is always part of the cs
KEggensperger Mar 9, 2015
47739e9
fix typo
KEggensperger Mar 10, 2015
8492dae
add checks for property keys and remove no longer valid checks
KEggensperger Mar 10, 2015
9f961bc
rebuild classification searchspace with transforming preprocessing me…
KEggensperger Mar 10, 2015
54f6ea0
Merge branch 'inputoutput' of bitbucket.org:mfeurer/paramsklearn into…
KEggensperger Mar 10, 2015
78b2eb9
add random trees embedding
KEggensperger Mar 10, 2015
2deed03
adjust #hyperparameter
KEggensperger Mar 10, 2015
1d638d3
Merged inputoutput into master
mfeurer Mar 10, 2015
64728c2
Update list of classifiers and transformers
mfeurer Mar 10, 2015
8e71f68
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer Mar 10, 2015
a67e59a
Update list of transformers
mfeurer Mar 10, 2015
223f1e5
Remove unnecessary statements
mfeurer Mar 10, 2015
4008e63
FIX: configuration space building with preprocessors which transform …
mfeurer Mar 11, 2015
8788c9e
FIX: dense/sparse indicators
mfeurer Mar 11, 2015
f30304d
FIX: upper limit on the maximum number of components in truncated SVD
mfeurer Mar 11, 2015
3ec3128
FIX: kNN can handle sparse data
mfeurer Mar 11, 2015
8aabb7e
Forbid combination of densifier and classifier which can handle spars…
mfeurer Mar 11, 2015
002885e
FIX: adaboost input metainformation
mfeurer Mar 13, 2015
0a2b3fd
FIX: no_preprocessing and include_classifiers
mfeurer Mar 13, 2015
d8d3198
extract searchspace utils to make them available for regression
KEggensperger Mar 13, 2015
6b9d1ea
adjust estimator dict
KEggensperger Mar 13, 2015
aa602e5
make searchspace generation more general
KEggensperger Mar 13, 2015
6ae9678
Remove densifier hyperparameter
mfeurer Mar 13, 2015
70337cd
FIX: raise ValueError on illegal default configurations
mfeurer Mar 13, 2015
ac49926
FIX: raise Exception when trying to use a classifier which can handle…
mfeurer Mar 13, 2015
662fc19
Merge remote-tracking branch 'origin/inputoutput' and add test to tes…
mfeurer Mar 13, 2015
00232db
Do not use adaboost with feature learning
mfeurer Mar 16, 2015
a3dec18
adding gaussian process classification using GPy
mblum Mar 26, 2015
a435cb5
Gaussian process classification uses sparse GPs now
mblum Mar 30, 2015
8a98705
Add components to documentation
mfeurer Mar 31, 2015
cb47080
Fix: Circumvent 'Buffer dtype mismatch, expected 'DOUBLE' but got 'fl…
mfeurer Mar 31, 2015
8f67b41
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer Mar 31, 2015
f294ccb
FIX: MinMaxScaler has zero as minimum after fitting on training data:…
mfeurer Mar 31, 2015
ef04d7a
Remove bagged_gaussian_nb and bagged_multinomial_nb
mfeurer Mar 31, 2015
226e861
Change truncatedSVD solver to randomized
mfeurer Mar 31, 2015
10d49f2
Move from iris test dataset to digits test dataset where iris is too …
mfeurer Mar 31, 2015
d3f4c38
Do not use GP until it is stable
mfeurer Mar 31, 2015
fa20159
Move numeric stability helpers from implementation to component
mfeurer Mar 31, 2015
348ab39
Do not use PCA together with tree models
mfeurer Mar 31, 2015
2b7f75c
explicitly turned off normalization in GP
mblum Mar 31, 2015
0841cce
GP is using noise term in kernel function
mblum Mar 31, 2015
3dfe717
Merge
mblum Mar 31, 2015
405865b
Merge branch 'master' of bitbucket.org:mfeurer/paramsklearn
mblum Mar 31, 2015
d9297f9
predict: add batch_size argument
mfeurer Mar 31, 2015
aff7c3f
Import GPy only if necessary (because it takes ~4s)
mfeurer Apr 1, 2015
11b4f16
FIX: batch_size in predict doesn't cause crash on sparse matrix any more
mfeurer Apr 1, 2015
95c3938
Raise ValueError if PCA returns NaN component values.
mfeurer Apr 1, 2015
696ab9f
Raise ValueError if Select Percentile mistakenly removes all features
mfeurer Apr 1, 2015
4bcca73
FIX: bug in Min/Max Scaler; improve numerical stability of min/max sc…
mfeurer Apr 1, 2015
6bc0dcc
Merge branch 'master' of bitbucket.org:mfeurer/paramsklearn
mblum Apr 2, 2015
11912be
Preprocessing: make y a keyword argument where possible
mfeurer Apr 7, 2015
aa4085a
Update: use float32 for test data; test whether preprocessors copy th…
mfeurer Apr 7, 2015
9ca2780
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer Apr 7, 2015
5078f55
fix bugs
KEggensperger Apr 7, 2015
a3f50c7
add gpy to deps
KEggensperger Apr 7, 2015
24e0265
argument bugfix in GP classify
mblum Apr 7, 2015
72a4ff0
Merge branch 'master' of bitbucket.org:mfeurer/paramsklearn
mblum Apr 7, 2015
e6df45e
Do not cast targets of test data to int32 any more
mfeurer Apr 7, 2015
0980f63
Make imputation less memory consuming
mfeurer Apr 7, 2015
bae0626
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer Apr 7, 2015
433b030
Readd GP to the configuration space
mfeurer Apr 7, 2015
958ecea
FIX: densifier returns array instead of matrix
mfeurer Apr 7, 2015
9935f38
Adapt test fixtures
mfeurer Apr 7, 2015
83a1404
add logistic regression according to the least squares revisited paper
stokasto Apr 9, 2015
1ec4287
fix predict method for projlogit, I completely screwed that one up ;)
stokasto Apr 9, 2015
db13067
make constructor match the default
stokasto Apr 9, 2015
e7c2ea0
add new preprocessing method and fix a bug in testing different prepr…
stokasto Apr 9, 2015
d73e490
fix a few tests
stokasto Apr 9, 2015
c61ccf6
GEM doesn't seem to work for multilabel
mfeurer Apr 10, 2015
17ac2fe
Remove combination of gaussian_nb and feature learning from the confi…
mfeurer Apr 10, 2015
2138c37
fix potential bug if labels are not int
stokasto Apr 10, 2015
87e7e1b
Merge branch 'master' of bitbucket.org:mfeurer/paramsklearn
stokasto Apr 10, 2015
75f809a
Adaboost: increase n_estimators, but make them less deep
mfeurer Apr 16, 2015
271d3c6
kNN: fix bug that kNN does not use its hyperparameters
mfeurer Apr 16, 2015
bd4c333
Improve some tests
mfeurer Apr 16, 2015
428540e
mend
mfeurer Apr 16, 2015
3ecb872
Add missing classifiers
mfeurer Apr 16, 2015
235579c
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer Apr 16, 2015
2b82a0f
kNN: remove illegal configuration for sparse data
mfeurer Apr 17, 2015
b9e83ac
Fix the previous commit
mfeurer Apr 17, 2015
7152f48
Select percentile: loosen percentile hyperparameter
mfeurer Apr 17, 2015
77cc0f3
Add: no_preprocessing and normalizer
mfeurer Apr 17, 2015
f5ed072
Add DecisionTreeClassifier
mfeurer Apr 17, 2015
f165ddf
kNN remove l5 distance
mfeurer Apr 17, 2015
da46467
Fix: lda and qda: return something in predict_proba
mfeurer Apr 22, 2015
76f8336
Add missing preprocessors
mfeurer Apr 22, 2015
ff3bc3c
FIX: do not use nb without feature squashing
mfeurer Apr 22, 2015
ba1cd32
Forbid nystroem sampler with nb
mfeurer Apr 23, 2015
f6c976d
Fix bug and tests
mfeurer Apr 23, 2015
a942472
Improve stability with sparse datasets
mfeurer Apr 23, 2015
6de26d7
Feature: weighting for imbalanced classes
mfeurer May 2, 2015
9c3feb1
Remove dictionary learning and sparse filtering
mfeurer May 2, 2015
90e04e5
Remove Gaussian Process for classification
mfeurer May 2, 2015
894f1af
Forbid decision tree with feature learning
mfeurer May 2, 2015
eec7e69
Reduce unnecessary memory consumption
mfeurer May 5, 2015
e4dd557
Fix test
mfeurer May 5, 2015
27d586e
Forbid: chi^2 + normalization, fix bug with sparse matrices
mfeurer May 6, 2015
4dd0a3a
Remove non-sklearn components in this branch
mfeurer May 6, 2015
2d832c0
Forbid: chi^2 + normalization, fix bug with sparse matrices
mfeurer May 6, 2015
a1b5d90
Classification; forbidden: normalize with nb
mfeurer May 8, 2015
16ee754
Classification; forbidden: normalize with nb
mfeurer May 8, 2015
4ab60a2
fix typo
KEggensperger May 8, 2015
a814220
Merge branch 'master' of bitbucket.org:mfeurer/paramsklearn
KEggensperger May 8, 2015
9bcfd87
Remove GPy from dependencies
mfeurer May 12, 2015
05e69db
Merge branch 'master' of ssh://bitbucket.org/mfeurer/paramsklearn
mfeurer May 12, 2015
113e984
Use new version of HPOlibConfigSpace; tests run much faster now
mfeurer May 22, 2015
1d39db9
Remove non-sklearn components in this branch
mfeurer May 6, 2015
deb4a1c
Merge branch 'autoweka_comparison' of ssh://bitbucket.org/mfeurer/par…
mfeurer May 27, 2015
73d8643
Allow a kernel approximation to be the sole preprocessor
mfeurer May 31, 2015
e304385
Revert "Remove non-sklearn components in this branch"
mfeurer Oct 1, 2015
5030948
Remove sparse filtering again
mfeurer Oct 1, 2015
50f6974
Update result in first steps
mfeurer Oct 1, 2015
2e20d83
create latex table for #hyp.params
KEggensperger May 26, 2015
50a2d33
FIX: --sparse was ignored
KEggensperger May 26, 2015
802537f
remove *.aux and *.log files after using pdflatex
KEggensperger May 26, 2015
afb0dda
fix preprocessing hyperparams
KEggensperger May 27, 2015
bce6ae0
create hyperparameter table: sort alphabetically
mfeurer May 27, 2015
e4719b4
add binary
KEggensperger May 31, 2015
35d61fd
Add new test; check if OHE can deal with new values at transform time
mfeurer Jun 13, 2015
ec73e1c
Change hyperparameters for DT and RidgeClassifier
mfeurer Jun 23, 2015
ce6e065
FIX: use polynomial features
mfeurer Jun 23, 2015
ef390d1
Change parametrization of DT and fix tests
mfeurer Jun 23, 2015
6a93e91
Remove unnecessary hyperparameter class_weight in SVM models
mfeurer Jun 23, 2015
21c0450
Upgrade to sklearn 0.16.1; hyperparameters etc not yet adapted
mfeurer Jul 15, 2015
9260d6d
Add warmstarts to models where possible
mfeurer Jul 16, 2015
f5cb3e3
Allow for iterative fitting with auto-sklearn
mfeurer Jul 20, 2015
6c43b1f
REFACTOR: pipelines more flexible
mfeurer Jul 22, 2015
9caf652
Add ExtraTreesRegressor, update ExtraTreesClassifier to sklearn 0.16
mfeurer Jul 23, 2015
3f0ac49
Update random forests to sklearn 0.16
mfeurer Jul 23, 2015
53da02f
Update gradient boosting to sklearn 0.16
mfeurer Jul 23, 2015
d14fc4c
Update Ridge to sklearn 0.16
mfeurer Jul 23, 2015
ec918bf
Reactivate Support Vector Regression
mfeurer Jul 23, 2015
e23bcc6
Update AdaBoostClassifier, add AdaBoostRegressor
mfeurer Jul 23, 2015
904bfae
Update kNN, add kNN regressor
mfeurer Jul 23, 2015
0ce482d
Update DecisionTree, add DecisionTree for regression
mfeurer Jul 23, 2015
264d115
Update the list of classifiers, regressors and transformers according…
mfeurer Jul 23, 2015
b415da9
Update LibLinear_SVC, add LibLinear_SVR
mfeurer Jul 23, 2015
be9ba59
LibSVM_SVC: calibration no longer necessary for predict_proba with sp…
mfeurer Jul 23, 2015
03cc78a
Update SGD and add SGD for regression
mfeurer Jul 23, 2015
fc80312
FIX wrong casting to bool
mfeurer Jul 23, 2015
d751eaf
Update all classifiers to sklearn 0.16
mfeurer Jul 28, 2015
8fd35f5
Update preprocessors to sklearn 0.16
mfeurer Jul 28, 2015
f7153f6
LibLinear4preproc: fix penalty to L1 penalty
mfeurer Jul 28, 2015
c50f5fc
Reduce cache size of libsvm
mfeurer Jul 28, 2015
20b882c
Fix previous commit
mfeurer Jul 28, 2015
ff20389
Make test less resource hungry
mfeurer Jul 30, 2015
05ccf0c
Fix test
mfeurer Jul 30, 2015
b3bc626
Fix naive bayes algorithms iterative_fit
mfeurer Jul 30, 2015
2e09e92
Fix output metadata of KernelPCA
mfeurer Jul 31, 2015
9023191
Make search space generation more general
mfeurer Jul 31, 2015
233a06d
Make create_searchspace_util work with several chained choices
mfeurer Jul 31, 2015
5d89b4b
Move constants to own file; make them int; prepares for new constraint
mfeurer Aug 14, 2015
5758fe9
Add functionality to deal with non-negative datasets
mfeurer Aug 26, 2015
b4d2da2
OneHotEncoder: make faster and add hyperparameter
mfeurer Sep 8, 2015
9d97581
Add OneHotEncoding component and test
mfeurer Sep 8, 2015
e337e60
Add OneHotEncoding as a pipeline component
mfeurer Sep 8, 2015
481ead7
SVM/SVR adaptive cache size
mfeurer Oct 1, 2015
e51c01c
Python 3 compatibility
mfeurer Oct 1, 2015
c4cf385
Version number
mfeurer Oct 1, 2015
c471bd6
Fix typo in component name
mfeurer Oct 5, 2015
ab758ee
Fix unittests
mfeurer Oct 5, 2015
a0ff41f
Fix unittests; fix bugs
mfeurer Oct 6, 2015
b119e91
Update requirements in setup.py
mfeurer Oct 6, 2015
fe48267
Remove Ridge Regression Classifier
mfeurer Oct 6, 2015
4da44ec
Fix SVC predict_proba to return correct shape
mfeurer Oct 7, 2015
869ab3b
Change parameter in KernelPCA
mfeurer Oct 7, 2015
be173f7
Add more specific error message to FastICA
mfeurer Oct 7, 2015
642e06e
Fix: predict_proba for SVC in binary classification
mfeurer Oct 7, 2015
28fad11
Check for valid names in argument include
mfeurer Oct 8, 2015
8133573
Change default if default is forbidden
mfeurer Oct 8, 2015
3396a20
Fix metalearning if no features file present
mfeurer Oct 22, 2015
45d8863
ADD multilabel classification
KEggensperger Oct 31, 2015
6158a8a
ADD option to handle multilabel datasets
KEggensperger Oct 31, 2015
f7e2797
ADD multilabel support
KEggensperger Oct 31, 2015
17a84b9
ADD multilabel support
KEggensperger Oct 31, 2015
7d3909e
Add __repr__ to ParamSklearnBaseEstimator
mfeurer Oct 31, 2015
95509cd
Merge branch 'development' of ssh://github.com/automl/paramsklearn in…
mfeurer Oct 31, 2015
dad0904
* setting some parameters to log-scale in kNN and SGD
mlindauer Nov 12, 2015
5751426
Fix configuration spaces
mfeurer Nov 13, 2015
c013f89
ADD travis support
KEggensperger Nov 16, 2015
1d00fa3
UPDATE travis, more apt packages
KEggensperger Nov 16, 2015
2362d82
UPDATE travis; require numpy 1.9.0
KEggensperger Nov 16, 2015
08c9827
ADD landscape
KEggensperger Nov 16, 2015
318c0fb
UPDATE test and travis
KEggensperger Nov 16, 2015
0e31ed2
ADD travis-ci, landscape
KEggensperger Nov 16, 2015
a148ce5
UPDATE upgrade pip
KEggensperger Nov 16, 2015
a632f96
Merge branch 'development' of github.com:automl/paramsklearn into dev…
KEggensperger Nov 16, 2015
d57213d
ADD travis_wait to install scipy without timeout
KEggensperger Nov 17, 2015
d8a9948
FIX test
KEggensperger Nov 17, 2015
4bdbd91
ADD coveralls, except SGD did not converge
KEggensperger Nov 17, 2015
274c8c6
USE miniconda
KEggensperger Nov 17, 2015
a668b94
FIX
KEggensperger Nov 17, 2015
7f2f493
FIX
KEggensperger Nov 17, 2015
61b8713
FIX install mock
KEggensperger Nov 17, 2015
01ef49a
ADD coveralls
KEggensperger Nov 17, 2015
1c16c1f
add coverage badges
KEggensperger Nov 17, 2015
df1d1f4
ADD setup.py instead of nosetests
KEggensperger Nov 17, 2015
50be264
Merge branch 'development' of github.com:automl/paramsklearn into dev…
KEggensperger Nov 17, 2015
1a465cb
Fix unittest to work with numpy 1.10
mfeurer Nov 20, 2015
beba4fb
Merge branch 'development' of ssh://github.com/automl/paramsklearn in…
mfeurer Nov 20, 2015
e40588f
Loosen numpy version constraints; add python 3.5
mfeurer Nov 20, 2015
b1a96ad
Move import into fit(); avoid importing whole sklearn
mfeurer Nov 24, 2015
32c398b
Don't test python 3.5; not supported right now
mfeurer Nov 26, 2015
2f680e4
FIX test; lower bounds
KEggensperger Nov 27, 2015
ebb78bf
Fix bug pickling subprocess.Popen in python3 did not work
mfeurer Dec 1, 2015
702d15d
Fix binary classification score and added unittest for it
hmendozap Dec 2, 2015
57fce06
FIX could not use relative path for tmp_folder and output_folder
mfeurer Dec 3, 2015
c007b94
Update requ.txt
mfeurer Dec 9, 2015
7ccf202
Merge pull request #23 from automl/hotfix-metafeatures
mfeurer Dec 15, 2015
d524774
Merge pull request #22 from hmendozap/fix-binaryclass
mfeurer Dec 15, 2015
7b91435
Use development branch of ParamSklearn
mfeurer Dec 15, 2015
184e808
Merge branch 'development' of github.com:automl/auto-sklearn into dev…
mfeurer Dec 15, 2015
4d96590
Update balancing test
mfeurer Dec 15, 2015
9521534
Only output coverage for auto-sklearn
mfeurer Dec 15, 2015
529d07d
Merge pull request #24 from automl/master
mfeurer Dec 15, 2015
00dac61
Merge branch 'development' of /home/feurerm/projects/ParamSklearn int…
mfeurer Dec 16, 2015
b0623a5
Refactor: integrate ParamSklearn in auto-sklearn repo
mfeurer Dec 16, 2015
c8ed9d5
Remove example for ParamSklearn
mfeurer Dec 17, 2015
7c03173
Add dataset file for testing; was missing due to gitignore
mfeurer Dec 17, 2015
0d89f89
Change test, hopefully stops timeout on travis-ci
mfeurer Dec 17, 2015
Empty file added .coveralls.yml
Empty file.
7 changes: 4 additions & 3 deletions .travis.yml
@@ -39,11 +39,12 @@ install:

# Install requirements from other repos
- pip install git+https://github.com/automl/HPOlibConfigSpace.git
- pip install git+https://github.com/automl/paramsklearn.git

- python setup.py install

# command to run tests, e.g. python setup.py test
script:
# - coverage run --source autosklearn setup.py test
- cd test && nosetests -v --with-coverage
- cd test && nosetests -v --with-coverage --cover-package=autosklearn

after_success: coveralls
after_success: coveralls
Empty file added CHANGES.md
Empty file.
24 changes: 24 additions & 0 deletions LICENSE.txt
@@ -0,0 +1,24 @@
Copyright (c) 2014, Matthias Feurer
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of the <organization> nor the
names of its contributors may be used to endorse or promote products
derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL <COPYRIGHT HOLDER> BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
10 changes: 7 additions & 3 deletions autosklearn/automl.py
@@ -26,7 +26,7 @@
convert_conf2smac_string
from autosklearn.evaluation import calculate_score
from autosklearn.util import StopWatch, get_logger, setup_logger, \
get_auto_seed, set_auto_seed, del_auto_seed, submit_process, paramsklearn, \
get_auto_seed, set_auto_seed, del_auto_seed, submit_process, pipeline, \
Backend
from autosklearn.util.smac import run_smac

@@ -76,7 +76,7 @@ def _create_search_space(tmp_dir, data_info, backend, watcher, logger,
task_name = 'CreateConfigSpace'
watcher.start_task(task_name)
configspace_path = os.path.join(tmp_dir, 'space.pcs')
configuration_space = paramsklearn.get_configuration_space(
configuration_space = pipeline.get_configuration_space(
data_info,
include_estimators=include_estimators,
include_preprocessors=include_preprocessors)
@@ -614,7 +614,11 @@ def _load_models(self):
seed)

def score(self, X, y):
# fix: Consider only index 1 of second dimension
# Don't know if the reshaping should be done there or in calculate_score
prediction = self.predict(X)
if self._task == BINARY_CLASSIFICATION:
prediction = prediction[:, 1].reshape((-1, 1))
return calculate_score(y, prediction, self._task,
self._metric, self._label_num,
logger=self._logger)
@@ -695,4 +699,4 @@ def _delete_output_directories(self):
pass
else:
print("Could not delete tmp dir: %s" %
self._tmp_dir)
self._tmp_dir)
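Note on the score() hunk above: for binary classification the fix keeps only index 1 of the second dimension (the positive-class column) and reshapes it to (n_samples, 1) before calling calculate_score. A minimal NumPy-only sketch of that reshaping; the probability values below are made up for illustration:

import numpy as np

# predict()-style output for a binary task: one column per class.
prediction = np.array([[0.9, 0.1],
                       [0.2, 0.8],
                       [0.4, 0.6]])

# Keep only the positive-class column and give it an explicit
# (n_samples, 1) shape, mirroring the fix in score() above.
binary_prediction = prediction[:, 1].reshape((-1, 1))
print(binary_prediction.shape)  # (3, 1)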
2 changes: 1 addition & 1 deletion autosklearn/cli/base_interface.py
@@ -11,7 +11,7 @@
from autosklearn.data.competition_data_manager import CompetitionDataManager
from autosklearn.evaluation import CVEvaluator, HoldoutEvaluator, \
NestedCVEvaluator, TestEvaluator, get_new_run_num
from autosklearn.util.paramsklearn import get_configuration_space
from autosklearn.util.pipeline import get_configuration_space
from autosklearn.util import Backend


4 changes: 2 additions & 2 deletions autosklearn/constants.py
@@ -32,8 +32,8 @@

R2_METRIC = 10
A_METRIC = 11
REGRESSION_METRIC = [R2_METRIC, A_METRIC]
METRIC = CLASSIFICATION_METRICS + REGRESSION_METRIC
REGRESSION_METRICS = [R2_METRIC, A_METRIC]
METRIC = CLASSIFICATION_METRICS + REGRESSION_METRICS
STRING_TO_METRIC = {
'acc': ACC_METRIC,
'acc_metric': ACC_METRIC,
2 changes: 1 addition & 1 deletion autosklearn/data/abstract_data_manager.py
@@ -5,7 +5,7 @@
import numpy as np
import scipy.sparse

from ParamSklearn.implementations.OneHotEncoder import OneHotEncoder
from autosklearn.pipeline.implementations.OneHotEncoder import OneHotEncoder

from autosklearn.util import predict_RAM_usage

30 changes: 21 additions & 9 deletions autosklearn/ensemble_selection_script.py
@@ -58,15 +58,20 @@ def get_predictions(dir_path, dir_path_list, include_num_runs,
match = model_and_automl_re.search(model_name)
automl_seed = int(match.group(1))
num_run = int(match.group(2))

if model_name.endswith("/"):
model_name = model_name[:-1]
basename = os.path.basename(model_name)

if (automl_seed, num_run) in include_num_runs:
if precision == "16":
predictions = np.load(os.path.join(dir_path, model_name)).astype(dtype=np.float16)
predictions = np.load(os.path.join(dir_path, basename)).astype(dtype=np.float16)
elif precision == "32":
predictions = np.load(os.path.join(dir_path, model_name)).astype(dtype=np.float32)
predictions = np.load(os.path.join(dir_path, basename)).astype(dtype=np.float32)
elif precision == "64":
predictions = np.load(os.path.join(dir_path, model_name)).astype(dtype=np.float64)
predictions = np.load(os.path.join(dir_path, basename)).astype(dtype=np.float64)
else:
predictions = np.load(os.path.join(dir_path, model_name))
predictions = np.load(os.path.join(dir_path, basename))
result.append(predictions)
return result

@@ -249,7 +254,10 @@ def main(autosklearn_tmp_dir,
dir_ensemble_list_mtimes = []

for dir_ensemble_file in dir_ensemble_list:
dir_ensemble_file = os.path.join(dir_ensemble, dir_ensemble_file)
if dir_ensemble_file.endswith("/"):
dir_ensemble_file = dir_ensemble_file[:-1]
basename = os.path.basename(dir_ensemble_file)
dir_ensemble_file = os.path.join(dir_ensemble, basename)
mtime = os.path.getmtime(dir_ensemble_file)
dir_ensemble_list_mtimes.append(mtime)

@@ -285,14 +293,18 @@

model_idx = 0
for model_name in dir_ensemble_list:
if model_name.endswith("/"):
model_name = model_name[:-1]
basename = os.path.basename(model_name)

if precision is "16":
predictions = np.load(os.path.join(dir_ensemble, model_name)).astype(dtype=np.float16)
predictions = np.load(os.path.join(dir_ensemble, basename)).astype(dtype=np.float16)
elif precision is "32":
predictions = np.load(os.path.join(dir_ensemble, model_name)).astype(dtype=np.float32)
predictions = np.load(os.path.join(dir_ensemble, basename)).astype(dtype=np.float32)
elif precision is "64":
predictions = np.load(os.path.join(dir_ensemble, model_name)).astype(dtype=np.float64)
predictions = np.load(os.path.join(dir_ensemble, basename)).astype(dtype=np.float64)
else:
predictions = np.load(os.path.join(dir_ensemble, model_name))
predictions = np.load(os.path.join(dir_ensemble, basename))
score = calculate_score(targets_ensemble, predictions,
task_type, metric,
predictions.shape[1])
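Both hunks in this file replace the raw directory-listing entry with its basename before joining it onto the target directory, first stripping a trailing slash (after which os.path.basename would otherwise return an empty string). A small self-contained sketch of that normalization; the example entry name and directory are hypothetical:

import os

entry = "some/subdir/predictions_ensemble_1_3.npy/"  # hypothetical listing entry

# Strip a trailing slash first; os.path.basename("a/b/") would return "".
if entry.endswith("/"):
    entry = entry[:-1]
basename = os.path.basename(entry)  # "predictions_ensemble_1_3.npy"

# Joining only the basename keeps the lookup inside the intended directory.
path = os.path.join("/tmp/ensemble_dir", basename)
print(path)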
8 changes: 4 additions & 4 deletions autosklearn/evaluation/abstract_evaluator.py
@@ -7,8 +7,8 @@

import numpy as np
import lockfile
from ParamSklearn.classification import ParamSklearnClassifier
from ParamSklearn.regression import ParamSklearnRegressor
from autosklearn.pipeline.classification import SimpleClassificationPipeline
from autosklearn.pipeline.regression import SimpleRegressionPipeline
from sklearn.dummy import DummyClassifier, DummyRegressor

from autosklearn.constants import *
@@ -106,13 +106,13 @@ def __init__(self, Datamanager, configuration=None,
if self.configuration is None:
self.model_class = MyDummyRegressor
else:
self.model_class = ParamSklearnRegressor
self.model_class = SimpleRegressionPipeline
self.predict_function = self.predict_regression
else:
if self.configuration is None:
self.model_class = MyDummyClassifier
else:
self.model_class = ParamSklearnClassifier
self.model_class = SimpleClassificationPipeline
self.predict_function = self.predict_proba

if num_run is None:
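The constructor hunk above swaps the old ParamSklearn estimators for the renamed pipeline classes. A condensed, hedged sketch of that selection logic: it assumes auto-sklearn is importable, uses a hypothetical helper name, and substitutes scikit-learn's dummy estimators for the repo's MyDummyClassifier/MyDummyRegressor wrappers:

from autosklearn.pipeline.classification import SimpleClassificationPipeline
from autosklearn.pipeline.regression import SimpleRegressionPipeline
from sklearn.dummy import DummyClassifier, DummyRegressor


def choose_model_class(is_regression_task, configuration):
    # Mirrors the diff: fall back to a dummy model when no configuration has
    # been selected yet, otherwise use the renamed pipeline classes.
    if is_regression_task:
        return DummyRegressor if configuration is None else SimpleRegressionPipeline
    return DummyClassifier if configuration is None else SimpleClassificationPipeline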
2 changes: 1 addition & 1 deletion autosklearn/evaluation/util.py
@@ -47,7 +47,7 @@ def calculate_score(solution, prediction, task_type, metric, num_classes,
score = dict()
if task_type in REGRESSION_TASKS:
cprediction = sanitize_array(prediction)
for metric_ in REGRESSION_METRIC:
for metric_ in REGRESSION_METRICS:
score[metric_] = regression_metrics.calculate_score(metric_,
solution,
cprediction)
6 changes: 3 additions & 3 deletions autosklearn/metalearning/metafeatures/metafeatures.py
@@ -13,9 +13,9 @@
from sklearn.utils import check_array
from sklearn.multiclass import OneVsRestClassifier

from ParamSklearn.implementations.Imputation import Imputer
from ParamSklearn.implementations.OneHotEncoder import OneHotEncoder
from ParamSklearn.implementations.StandardScaler import StandardScaler
from autosklearn.pipeline.implementations.Imputation import Imputer
from autosklearn.pipeline.implementations.OneHotEncoder import OneHotEncoder
from autosklearn.pipeline.implementations.StandardScaler import StandardScaler

from autosklearn.util.logging_ import get_logger
from .metafeature import MetaFeature, HelperFunction, DatasetMetafeatures, \