
Commit 87837bd: "Fix some references."
Author: vnmabus, committed Apr 5, 2023 (1 parent: b158452)
Showing 5 changed files with 48 additions and 15 deletions.
2 changes: 1 addition & 1 deletion docs/modules/ml/classification.rst
@@ -43,9 +43,9 @@ This module contains depth based estimators to perform classification.
.. autosummary::
:toctree: autosummary

-   skfda.ml.classification.MaximumDepthClassifier
    skfda.ml.classification.DDClassifier
    skfda.ml.classification.DDGClassifier
+   skfda.ml.classification.MaximumDepthClassifier

Logistic regression
-----------------------
36 changes: 36 additions & 0 deletions docs/refs.bib
@@ -263,6 +263,42 @@ @article{ghosh+chaudhuri_2005_maximum
keywords = {Bayes risk,cross-validation,data depth,elliptic symmetry,kernel density estimation,location shift model,Mahalanobis distance,misclassification rate,Vapnik Chervonenkis dimension}
}

@article{li++_2012_ddclassifier,
title = {{{DD-Classifier}}: {{Nonparametric}} Classification Procedure Based on {{DD-Plot}}},
shorttitle = {{{DD-Classifier}}},
author = {Li, Jun and {Cuesta-Albertos}, Juan A. and Liu, Regina Y.},
year = {2012},
month = jun,
journal = {Journal of the American Statistical Association},
volume = {107},
number = {498},
pages = {737--753},
issn = {0162-1459},
doi = {10.1080/01621459.2012.688462},
url = {https://doi.org/10.1080/01621459.2012.688462},
urldate = {2020-01-10},
abstract = {Using the DD-plot (depth vs. depth plot), we introduce a new nonparametric classification algorithm and call it DD-classifier. The algorithm is completely nonparametric, and it requires no prior knowledge of the underlying distributions or the form of the separating curve. Thus, it can be applied to a wide range of classification problems. The algorithm is completely data driven and its classification outcome can be easily visualized in a two-dimensional plot regardless of the dimension of the data. Moreover, it has the advantage of bypassing the estimation of underlying parameters such as means and scales, which is often required by the existing classification procedures. We study the asymptotic properties of the DD-classifier and its misclassification rate. Specifically, we show that DD-classifier is asymptotically equivalent to the Bayes rule under suitable conditions, and it can achieve Bayes error for a family broader than elliptical distributions. The performance of the classifier is also examined using simulated and real datasets. Overall, the DD-classifier performs well across a broad range of settings, and compares favorably with existing classifiers. It can also be robust against outliers or contamination.},
keywords = {Classification,Data depth,DD-classifier,DD-plot,Maximum depth classifier,Misclassification rates,Nonparametric,Robustness}
}

@article{malfait+ramsay_2003_historical,
title = {The Historical Functional Linear Model},
author = {Malfait, Nicole and Ramsay, James O.},
year = {2003},
journal = {The Canadian Journal of Statistics / La Revue Canadienne de Statistique},
volume = {31},
number = {2},
eprint = {3316063},
eprinttype = {jstor},
pages = {115--128},
publisher = {{[Statistical Society of Canada, Wiley]}},
issn = {0319-5724},
doi = {10.2307/3316063},
url = {https://www.jstor.org/stable/3316063},
urldate = {2022-07-21},
abstract = {The authors develop a functional linear model in which the values at time t of a sample of curves yi(t) are explained in a feed-forward sense by the values of covariate curves xi(s) observed at times s {$\leq$} t. They give special attention to the case s {$\in$} [t - {$\delta$}, t], where the lag parameter {$\delta$} is estimated from the data. They use the finite element method to estimate the bivariate parameter regression function {$\beta$}(s, t), which is defined on the triangular domain s {$\leq$} t. They apply their model to the problem of predicting the acceleration of the lower lip during speech on the basis of electromyographical recordings from a muscle depressing the lip. They also provide simulation results to guide the calibration of the fitting process. /// Les auteurs d\'ecrivent un mod\`ele lin\'eaire fonctionnel dans lequel les valeurs au temps t d'un \'echantillon de courbes yi(t) sont expliqu\'ees par les valeurs observ\'ees aux temps s {$\leq$} t de courbes covariables xi(s). Ils accordent une attention particuli\`ere au cas o\`u s {$\in$} [t - {$\delta$}, t], {$\delta$} repr\'esentant un param\`etre de d\'elai estim\'e \`a partir des donn\'ees. Ils emploient la m\'ethode des \'el\'ements finis pour estimer la fonction param\`etre {$\beta$}(s, t) bivari\'ee d\'efinie sur le domaine triangulaire s {$\leq$} t. Ils appliquent leur mod\`ele \`a la pr\'evision de courbes d'acc\'el\'eration de la l\`evre inf\'erieure d'un locuteur \`a partir d'enregistrements \'electromyographiques d'un muscle abaissant celle-ci. Ils pr\'esentent aussi des r\'esultats de simulation pouvant guider le processus de calibration intervenant dans l'ajustement du mod\`ele.}
}

@article{marron++_2015_functional,
title = {Functional Data Analysis of Amplitude and Phase Variation},
author = {Marron, J. S. and Ramsay, James O. and Sangalli, Laura M. and Srivastava, Anuj},
12 changes: 6 additions & 6 deletions skfda/ml/classification/_depth_classifiers.py
@@ -67,11 +67,13 @@ class DDClassifier(
BaseEstimator,
ClassifierMixin[Input, Target],
):
-    """Depth-versus-depth (DD) classifier for functional data.
+    """
+    Depth-versus-depth (DD) classifier for functional data.

     Transforms the data into a DD-plot and then classifies using a polynomial
-    of a chosen degree. The polynomial passes through zero and maximizes the
-    accuracy of the classification on the train dataset.
+    of a chosen degree\ :footcite:p:`li++_2012_ddclassifier`.
+    The polynomial passes through zero and maximizes the accuracy of the
+    classification on the train dataset.

     If a point is below the polynomial in the DD-plot, it is classified to
     the first class. Otherwise, the point is classified to the second class.
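The DD-plot decision rule described in this docstring can be sketched in a few lines. This is a toy illustration only, not skfda's implementation: `dd_classify`, its arguments, and the fixed coefficients are hypothetical, and in the real classifier the polynomial coefficients are fitted to maximize training accuracy.

```python
import numpy as np

def dd_classify(depth_class1, depth_class2, coefs):
    """Toy DD-plot rule: compare each sample's depth with respect to
    class 2 against a polynomial (through the origin) of its depth
    with respect to class 1."""
    d1 = np.asarray(depth_class1, dtype=float)
    d2 = np.asarray(depth_class2, dtype=float)
    # Polynomial through zero: p(d) = c1*d + c2*d**2 + ...
    poly = sum(c * d1 ** (k + 1) for k, c in enumerate(coefs))
    # Below the curve in the DD-plot -> first class (0),
    # otherwise -> second class (1).
    return np.where(d2 < poly, 0, 1)

# Two samples: the first is much deeper in class 1, the second in class 2.
labels = dd_classify([0.6, 0.2], [0.1, 0.5], coefs=[1.0])
# -> array([0, 1])
```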
@@ -118,9 +120,7 @@ class DDClassifier(
     :class:`~skfda.preprocessing.dim_reduction.feature_extraction._ddg_transformer`

     References:
-        Li, J., Cuesta-Albertos, J. A., and Liu, R. Y. (2012). DD-classifier:
-        Nonparametric classification procedure based on DD-plot. Journal of
-        the American Statistical Association, 107(498):737-753.
+        .. footbibliography::
     """

def __init__(
8 changes: 4 additions & 4 deletions skfda/ml/regression/_historical_linear_model.py
@@ -217,15 +217,16 @@ class HistoricalLinearRegression(
BaseEstimator,
RegressorMixin[FDataGrid, FDataGrid],
):
-    r"""Historical functional linear regression.
+    r"""
+    Historical functional linear regression.

     This is a linear regression method where the covariate and the response are
     both functions :math:`\mathbb{R}` to :math:`\mathbb{R}` with the same
     domain. In order to predict the value of the response function at point
     :math:`t`, only the information of the covariate at points :math:`s < t` is
     used. It is thus a "historical" model in the sense that, if the domain
     represents time, only the data from the past, or historical data, is used
-    to predict a given point.
+    to predict a given point\ :footcite:p:`malfait+ramsay_2003_historical`.
The model assumed by this method is:
@@ -307,8 +308,7 @@ class HistoricalLinearRegression(
array([[ 2., 0., 0., 4., 5., 5.]])
     References:
-        Malfait, N., & Ramsay, J. O. (2003). The historical functional linear
-        model. Canadian Journal of Statistics, 31(2), 115-128.
+        .. footbibliography::
"""

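The "only the past" structure of the historical model can be illustrated with a discretized prediction step. This is a rough sketch under the assumption of a common uniform grid; `historical_predict` and its arguments are hypothetical, and the real model additionally handles an intercept and a finite lag window.

```python
import numpy as np

def historical_predict(x, beta, grid):
    """Discretized historical linear prediction on a common grid:
    y(t_j) ~ sum over s_k <= t_j of beta[k, j] * x[k] * ds.
    Only past (and current) covariate values contribute to y(t_j)."""
    ds = grid[1] - grid[0]  # assume a uniform grid
    y = np.zeros_like(grid, dtype=float)
    for j in range(len(grid)):
        y[j] = np.sum(beta[: j + 1, j] * x[: j + 1]) * ds
    return y

grid = np.linspace(0, 1, 5)
x = np.ones_like(grid)          # constant covariate curve
beta = np.ones((5, 5))          # constant coefficient surface
y = historical_predict(x, beta, grid)
# y grows with t because more of the past is integrated:
# -> [0.25, 0.5, 0.75, 1.0, 1.25]
```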
5 changes: 1 addition & 4 deletions skfda/representation/basis/_bspline_basis.py
@@ -169,11 +169,8 @@ def _evaluation_knots(self) -> Tuple[float, ...]:
         Get the knots adding m knots to the boundary.

         This needs to be done in order to allow a discontinuous behaviour
-        at the boundaries of the domain [RS05]_.
+        at the boundaries of the domain (see references).

-        References:
-            .. [RS05] Ramsay, J., Silverman, B. W. (2005). *Functional Data
-                Analysis*. Springer. 50-51.
"""
return tuple(
(self.knots[0],) * (self.order - 1)
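The knot augmentation that `_evaluation_knots` performs can be mimicked standalone. `evaluation_knots` is a hypothetical helper mirroring the pattern visible in the diff: each boundary knot is repeated `order - 1` extra times so the B-spline basis may be discontinuous at the domain boundaries.

```python
def evaluation_knots(knots, order):
    """Repeat each boundary knot (order - 1) extra times, giving the
    boundary knots full multiplicity for B-spline evaluation."""
    knots = list(knots)
    return tuple(
        [knots[0]] * (order - 1) + knots + [knots[-1]] * (order - 1)
    )

# Cubic splines (order 4) on [0, 1] with one interior knot:
evaluation_knots([0.0, 0.5, 1.0], order=4)
# -> (0.0, 0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0, 1.0)
```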
