[DOC] Compose and deep learning classifier doc tidy (#3756)
* compose package comments

* compose package comments

* deep learning package comments
Tony Bagnall committed Nov 11, 2022
1 parent 1403b80 commit fe751d5
Showing 6 changed files with 37 additions and 50 deletions.
21 changes: 8 additions & 13 deletions sktime/classification/compose/_column_ensemble.py
@@ -189,35 +189,30 @@ class ColumnEnsembleClassifier(BaseColumnEnsembleClassifier):
This estimator allows different columns or column subsets of the input
to be transformed separately and the features generated by each
transformer will be ensembled to form a single output.
Parameters
----------
estimators : list of tuples
List of (name, estimator, column(s)) tuples specifying the transformer
objects to be applied to subsets of the data.
name : string
Like in Pipeline and FeatureUnion, this allows the
transformer and its parameters to be set using ``set_params`` and searched
in grid search.
estimator : estimator or {'drop'}
Estimator must support `fit` and `predict_proba`. Special-cased
strings 'drop' and 'passthrough' are accepted as well, to
indicate to drop the columns.
column(s) : array-like of string or int, slice, boolean mask array or callable.
remainder : {'drop', 'passthrough'} or estimator, default 'drop'
By default, only the specified columns in `transformations` are
transformed and combined in the output, and the non-specified
columns are dropped. (default of ``'drop'``).
By specifying ``remainder='passthrough'``, all remaining columns
that were not specified in `transformations` will be automatically passed
through. This subset of columns is concatenated with the output of
the transformations.
By setting ``remainder`` to be an estimator, the remaining
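Because the hunk above documents the (name, estimator, column(s)) tuple format and the ``remainder`` behaviour, a minimal usage sketch may help; the component classifier, the column index, and the dataset loader below are illustrative assumptions, not part of this commit:

>>> from sktime.classification.compose import ColumnEnsembleClassifier
>>> from sktime.classification.interval_based import TimeSeriesForestClassifier
>>> from sktime.datasets import load_basic_motions
>>> X_train, y_train = load_basic_motions(split="train")  # multivariate panel data
>>> clf = ColumnEnsembleClassifier(
...     estimators=[("tsf_dim0", TimeSeriesForestClassifier(n_estimators=10), [0])]
... )  # columns not listed in the tuples are dropped (remainder="drop")
>>> clf.fit(X_train, y_train)  # doctest: +SKIP
>>> y_pred = clf.predict(X_train)  # doctest: +SKIP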
25 changes: 8 additions & 17 deletions sktime/classification/compose/_ensemble.py
@@ -28,24 +28,10 @@


class ComposableTimeSeriesForestClassifier(BaseTimeSeriesForest, BaseClassifier):
"""Time Series Forest Classifier as described in [1]_.
A time series forest is an adaptation of the random
forest for time-series data. It fits a number of decision tree
classifiers on various sub-samples of a transformed dataset and uses
averaging to improve the predictive accuracy and control over-fitting.
The sub-sample size is always the same as the original input sample size
@@ -182,6 +168,11 @@ class labels (multi-output problem).
set. If n_estimators is small it might be possible that a data point
was never left out during the bootstrap. In this case,
`oob_decision_function_` might contain NaN.
References
----------
.. [1] Deng et al., A time series forest for classification and feature extraction,
Information Sciences, 239:2013.
"""

_tags = {
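Since the passage above describes an ensemble that fits decision tree classifiers on sub-samples of a transformed dataset, a short fitting sketch in the same doctest style may be useful; the dataset loader and the reduced n_estimators value are assumptions for illustration only:

>>> from sktime.classification.compose import ComposableTimeSeriesForestClassifier
>>> from sktime.datasets import load_unit_test
>>> X_train, y_train = load_unit_test(split="train")
>>> clf = ComposableTimeSeriesForestClassifier(n_estimators=5)  # small forest, illustration only
>>> clf.fit(X_train, y_train)  # doctest: +SKIP
>>> y_pred = clf.predict(X_train)  # doctest: +SKIP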
11 changes: 6 additions & 5 deletions sktime/classification/deep_learning/cnn.py
@@ -14,7 +14,7 @@


class CNNClassifier(BaseDeepClassifier):
"""Time Convolutional Neural Network (CNN), as described in [1]_.
Parameters
----------
@@ -49,13 +49,14 @@ class CNNClassifier(BaseDeepClassifier):
Notes
-----
Adapted from the implementation of Fawaz et al.
https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/cnn.py
References
----------
.. [1] Zhao et al., Convolutional neural networks for time series classification,
Journal of Systems Engineering and Electronics, 28(1):2017.
Examples
--------
>>> from sktime.classification.deep_learning.cnn import CNNClassifier
13 changes: 7 additions & 6 deletions sktime/classification/deep_learning/fcn.py
@@ -14,7 +14,7 @@


class FCNClassifier(BaseDeepClassifier):
"""Fully Connected Neural Network (FCN), as described in [1]_.
Parameters
----------
@@ -42,12 +42,13 @@ class FCNClassifier(BaseDeepClassifier):
Notes
-----
Adapted from the implementation of Fawaz et al.
https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/fcn.py
References
----------
.. [1] Zhao et al., Convolutional neural networks for time series classification,
Journal of Systems Engineering and Electronics, 28(1):2017.
Examples
--------
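The FCN Examples section is collapsed in this view; as a hedged sketch only, assuming the usual n_epochs and batch_size keyword arguments and the optional deep learning dependency, usage might look like:

>>> from sktime.classification.deep_learning.fcn import FCNClassifier
>>> from sktime.datasets import load_unit_test
>>> X_train, y_train = load_unit_test(split="train")
>>> clf = FCNClassifier(n_epochs=5, batch_size=4)  # tiny settings, illustration only
>>> clf.fit(X_train, y_train)  # doctest: +SKIP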
15 changes: 7 additions & 8 deletions sktime/classification/deep_learning/mlp.py
@@ -14,7 +14,7 @@


class MLPClassifier(BaseDeepClassifier):
"""Multi Layer Perceptron Network (MLP), as described in [1]_.
Parameters
----------
@@ -42,16 +42,15 @@ class MLPClassifier(BaseDeepClassifier):
Notes
-----
Adapted from the implementation at
https://github.com/hfawaz/dl-4-tsc/blob/master/classifiers/mlp.py
References
----------
.. [1] Wang et al., Time series classification from
scratch with deep neural networks: A strong baseline,
International Joint Conference on Neural Networks (IJCNN), 2017.
Examples
--------
>>> from sktime.classification.deep_learning.mlp import MLPClassifier
2 changes: 1 addition & 1 deletion sktime/classification/deep_learning/tapnet.py
@@ -20,7 +20,7 @@


class TapNetClassifier(BaseDeepClassifier):
"""Time series attentional prototype network (TapNet), as described in [1]_.
Parameters
----------
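The TapNet hunk is cut off at its Parameters section; purely as a hedged sketch, assuming n_epochs and batch_size keyword arguments analogous to the other deep learning classifiers and the optional deep learning dependency, construction and fitting might look like:

>>> from sktime.classification.deep_learning.tapnet import TapNetClassifier
>>> from sktime.datasets import load_unit_test
>>> X_train, y_train = load_unit_test(split="train")
>>> clf = TapNetClassifier(n_epochs=5, batch_size=4)  # tiny settings, illustration only
>>> clf.fit(X_train, y_train)  # doctest: +SKIP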
