
Commit

Curate Bibtex.
vnmabus committed Apr 4, 2023
1 parent 53dfa29 commit 2f09844
Showing 22 changed files with 465 additions and 509 deletions.
828 changes: 387 additions & 441 deletions docs/refs.bib

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion examples/plot_pairwise_alignment.py
@@ -35,7 +35,7 @@
# In the case of elastic registration, the Fisher-Rao distance with a
# penalisation term is taken as the energy function, due to its
# invariance to reparameterizations of warping functions
# :footcite:p:`srivastava+klassen_2016_analysis_elastic`.
# :footcite:p:`srivastava+klassen_2016_functionala`.
#
# .. math::
# E[f \circ \gamma, g] = d_{FR} (f \circ \gamma, g)
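As a rough illustration of the energy above, the following NumPy sketch evaluates the penalised criterion for a candidate warping, using the SRSF-based form of the Fisher-Rao distance discussed elsewhere in this commit. The curves, the warping gamma and the weight lam are made-up examples, not part of the library.

import numpy as np

# Hypothetical discretised curves on a common grid t in [0, 1] and a
# candidate warping gamma (increasing, gamma(0) = 0, gamma(1) = 1).
t = np.linspace(0, 1, 200)
f = np.sin(2 * np.pi * t)
g = np.sin(2 * np.pi * t ** 2)
gamma = t ** 2

def srsf(x, t):
    # Square-root slope function: q = sign(x') * sqrt(|x'|).
    dx = np.gradient(x, t)
    return np.sign(dx) * np.sqrt(np.abs(dx))

def penalised_energy(f, g, gamma, t, lam=0.1):
    f_warped = np.interp(gamma, t, f)              # f o gamma
    q_diff = srsf(f_warped, t) - srsf(g, t)
    d_fr_sq = np.trapz(q_diff ** 2, t)             # squared Fisher-Rao distance
    penalty = np.trapz((np.sqrt(np.gradient(gamma, t)) - 1) ** 2, t)
    return d_fr_sq + lam * penalty

print(penalised_energy(f, g, gamma, t))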
2 changes: 1 addition & 1 deletion skfda/datasets/_real_datasets.py
@@ -1540,6 +1540,6 @@ def fetch_mco(

if fetch_mco.__doc__ is not None: # docstrings can be stripped off
fetch_mco.__doc__ += _mco_descr_template.format(
cite=":footcite:`ruiz++_2003_cariporide`",
cite=":footcite:`ruiz-meana++_2003_cariporide`",
bibliography=".. footbibliography::",
) + _param_descr
2 changes: 1 addition & 1 deletion skfda/exploratory/depth/_depth.py
@@ -231,7 +231,7 @@ class DistanceBasedDepth(Depth[FDataGrid], BaseEstimator):
.. math::
D(x) = [1 + M(x, \mu)]^{-1}.
as explained in :footcite:`serfling+zuo_2000_depth_function`.
as explained in :footcite:`serfling+zuo_2000_general`.
Examples:
>>> import skfda
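A minimal NumPy sketch of the depth formula above, taking the L2 distance of each curve to the sample mean as a stand-in for the distance M(x, mu); the data are synthetic.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
X = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((20, 100))   # synthetic sample

mu = X.mean(axis=0)                               # functional mean as the centre
M = np.sqrt(np.trapz((X - mu) ** 2, t, axis=1))   # L2 distance of each curve to mu
depth = 1 / (1 + M)                               # D(x) = [1 + M(x, mu)]^{-1}
print(depth.round(3))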
8 changes: 4 additions & 4 deletions skfda/exploratory/stats/_fisher_rao.py
@@ -73,15 +73,15 @@ def _fisher_rao_warping_mean(
The karcher mean :math:`\bar \gamma` is defined as the warping that
minimises locally the sum of Fisher-Rao squared distances
:footcite:`srivastava+klassen_2016_analysis_orbit`.
:footcite:`srivastava+klassen_2016_statistical`.
.. math::
\bar \gamma = argmin_{\gamma \in \Gamma} \sum_{i=1}^{n}
d_{FR}^2(\gamma, \gamma_i)
The computation is performed using the structure of the Hilbert sphere obtained
after a transformation of the warpings, see
:footcite:`srivastava++_2011_ficher-rao_orbit`.
:footcite:`srivastava++_2011_registration`.
Args:
warping: Set of warpings.
@@ -211,8 +211,8 @@ def fisher_rao_karcher_mean(
equivalence class which makes the mean of the warpings employed be the
identity.
See :footcite:`srivastava+klassen_2016_analysis_karcher` and
:footcite:`srivastava++_2011_ficher-rao_karcher`.
See :footcite:`srivastava+klassen_2016_statistical` and
:footcite:`srivastava++_2011_registration`.
Args:
fdatagrid: Set of functions to compute the
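For reference, a hedged usage sketch of the Karcher mean function touched in this file, assuming fisher_rao_karcher_mean is re-exported from skfda.exploratory.stats as in the released package; the bump curves are synthetic.

import numpy as np
import skfda
from skfda.exploratory.stats import fisher_rao_karcher_mean

# Synthetic misaligned sample: the same bump shifted along the domain.
t = np.linspace(0, 1, 100)
data = [np.exp(-((t - 0.5 - s) ** 2) / 0.01) for s in np.linspace(-0.1, 0.1, 5)]
fd = skfda.FDataGrid(data_matrix=data, grid_points=t)

mean = fisher_rao_karcher_mean(fd)   # elastic (Karcher) mean of the sample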
2 changes: 1 addition & 1 deletion skfda/exploratory/stats/_stats.py
@@ -192,7 +192,7 @@ def geometric_median(
\sum_{i=1}^N \left \| x_i-y \right \|
The geometric median in the functional case is also described in
:footcite:`gervini_2008_estimation`.
:footcite:`gervini_2008_robust`.
Instead of the proposed algorithm, however, the current implementation
uses the corrected Weiszfeld algorithm to compute the median.
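The Weiszfeld scheme mentioned above is an iteratively re-weighted mean; this sketch shows the plain (uncorrected) update on discretised curves with synthetic data, purely as an illustration of the idea, not the library's corrected variant.

import numpy as np

def weiszfeld_median(X, t, tol=1e-6, max_iter=100):
    # X: (n_samples, n_points) discretised curves; t: common grid.
    # Iteratively re-weighted mean (the classical Weiszfeld update).
    y = X.mean(axis=0)
    for _ in range(max_iter):
        dist = np.sqrt(np.trapz((X - y) ** 2, t, axis=1))
        w = 1 / np.maximum(dist, 1e-12)            # avoid division by zero
        y_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.sqrt(np.trapz((y_new - y) ** 2, t)) < tol:
            return y_new
        y = y_new
    return y

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
X = np.cos(2 * np.pi * t) + 0.2 * rng.standard_normal((30, 50))
median_curve = weiszfeld_median(X, t)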
4 changes: 2 additions & 2 deletions skfda/exploratory/visualization/_boxplot.py
@@ -117,7 +117,7 @@ class Boxplot(FDataBoxplot):
detected in a functional boxplot by the 1.5 times the 50% central region
empirical rule, analogous to the rule for classical boxplots.
For more information see :footcite:ts:`sun+genton_2011_boxplots`.
For more information see :footcite:ts:`sun+genton_2011_functional`.
Args:
fdatagrid: Object containing the data.
@@ -538,7 +538,7 @@ class SurfaceBoxplot(FDataBoxplot):
:ref:`depth measure <depth-measures>`
for functional data, it represents the envelope of the
50% central region, the median curve, and the maximum non-outlying
envelope :footcite:`sun+genton_2011_boxplots`.
envelope :footcite:`sun+genton_2011_functional`.
Args:
fdatagrid: Object containing the data.
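A short usage sketch of the functional boxplot on the Berkeley Growth data; it assumes the return_X_y fetcher option and the outliers attribute behave as in the released package.

import skfda
from skfda.exploratory.visualization import Boxplot

X, _ = skfda.datasets.fetch_growth(return_X_y=True)
boxplot = Boxplot(X)
boxplot.plot()            # envelope of the 50% central region, median and whiskers
print(boxplot.outliers)   # curves flagged by the 1.5 * central-region rule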
2 changes: 1 addition & 1 deletion skfda/exploratory/visualization/_magnitude_shape_plot.py
@@ -41,7 +41,7 @@ class MagnitudeShapePlot(BasePlot):
The outliers are detected using an instance of
:class:`MSPlotOutlierDetector`.
For more information see :footcite:ts:`dai+genton_2018_visualization`.
For more information see :footcite:ts:`dai+genton_2018_multivariate`.
Args:
fdata: Object containing the data.
4 changes: 2 additions & 2 deletions skfda/inference/hotelling/_hotelling.py
@@ -46,7 +46,7 @@ def hotelling_t2(
the discrete representation, depending on the input.
This statistic is defined in Pini, Stamm and Vantini
:footcite:`pini+stamm+vantini_2018_hotellings`.
:footcite:`pini++_2018_hotelling`.
Args:
fd1: Object with the first sample.
@@ -167,7 +167,7 @@ def hotelling_test_ind(
tested are generated randomly.
This procedure is from Pini, Stamm and Vantini
:footcite:`pini+stamm+vantini_2018_hotellings`.
:footcite:`pini++_2018_hotelling`.
Args:
fd1: First sample of data.
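Once the functions are represented by coefficient (or grid-value) vectors, the statistic reduces to the familiar pooled-covariance Hotelling T^2 with a pseudo-inverse. The following NumPy sketch computes that multivariate form on synthetic coefficients, as an illustration only.

import numpy as np

def hotelling_t2_stat(a, b):
    # Pooled-covariance Hotelling T^2 for the difference of sample means,
    # using a pseudo-inverse in case the covariance is singular.
    n1, n2 = len(a), len(b)
    diff = a.mean(axis=0) - b.mean(axis=0)
    s_pool = ((n1 - 1) * np.cov(a, rowvar=False)
              + (n2 - 1) * np.cov(b, rowvar=False)) / (n1 + n2 - 2)
    return n1 * n2 / (n1 + n2) * diff @ np.linalg.pinv(s_pool) @ diff

rng = np.random.default_rng(0)
a = rng.standard_normal((15, 4))        # hypothetical coefficient vectors, sample 1
b = rng.standard_normal((12, 4)) + 0.5  # sample 2, with shifted mean
print(hotelling_t2_stat(a, b))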
10 changes: 5 additions & 5 deletions skfda/misc/hat_matrix.py
@@ -149,12 +149,12 @@ class NadarayaWatsonHatMatrix(HatMatrix):
For smoothing, :math:`\{x_1, ..., x_n\}` are the points with known value
and :math:`\{x_1', ..., x_m'\}` are the points for which it is desired to
estimate the smoothed value. The distance :math:`d` is the absolute value
function :footcite:`wasserman_2006_nonparametric_nw`.
function :footcite:`wasserman_2006_nonparametric`.
For regression, :math:`\{x_1, ..., x_n\}` is the functional data and
:math:`\{x_1', ..., x_m'\}` are the functions for which it is desired to
estimate the scalar value. Here, :math:`d` is some functional distance
:footcite:`ferraty+vieu_2006_nonparametric_nw`.
:footcite:`ferraty+vieu_2006_functional`.
In both cases :math:`K(\cdot)` is a kernel function and :math:`h` is the
bandwidth.
@@ -224,7 +224,7 @@ class LocalLinearRegressionHatMatrix(HatMatrix):
where :math:`\{t_1, t_2, ..., t_n\}` are points with known value and
:math:`\{t_1', t_2', ..., t_m'\}` are the points for which it is
desired to estimate the smoothed value
:footcite:`wasserman_2006_nonparametric_llr`.
:footcite:`wasserman_2006_nonparametric`.
For **kernel regression** algorithm:
@@ -248,7 +248,7 @@ class LocalLinearRegressionHatMatrix(HatMatrix):
Where :math:`c_{ik}^j` is the :math:`j`-th coefficient in a truncated basis
expansion of :math:`X_i - X'_k = \sum_{j=1}^J c_{ik}^j` and :math:`d` some
functional distance :footcite:`baillo+grane_2008_llr`
functional distance :footcite:`baillo+grane_2009_local`
For both cases, :math:`K(\cdot)` is a kernel function and :math:`h` the
bandwidth.
@@ -431,7 +431,7 @@ class KNeighborsHatMatrix(HatMatrix):
In both cases, :math:`K(\cdot)` is a kernel function and
:math:`h_{i}` is calculated as the distance from :math:`x_i'` to its
``n_neighbors``-th nearest neighbor in :math:`\{x_1, ..., x_n\}`
:footcite:`ferraty+vieu_2006_nonparametric_knn`.
:footcite:`ferraty+vieu_2006_computational`.
Used with the uniform kernel, it takes the average of the closest
``n_neighbors`` points to a given point.
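A compact NumPy sketch of the Nadaraya-Watson hat matrix in the smoothing case (d the absolute value, Gaussian kernel); in the regression case d would be a functional distance instead. The data and the bandwidth are illustrative.

import numpy as np

def nadaraya_watson_hat(x_train, x_eval, h):
    # Hat matrix rows: normalised Gaussian kernel weights K(d(x'_k, x_i) / h),
    # with d the absolute value (the smoothing case).
    d = np.abs(x_eval[:, None] - x_train[None, :])
    K = np.exp(-0.5 * (d / h) ** 2)
    return K / K.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)
x_new = np.linspace(0, 1, 100)
H = nadaraya_watson_hat(x, x_new, h=0.05)
y_smooth = H @ y   # smoothed values, y_hat = H y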
8 changes: 4 additions & 4 deletions skfda/misc/metrics/_fisher_rao.py
@@ -66,7 +66,7 @@ class FisherRaoDistance():
If the observations are distributions of random variables the distance will
match with the usual Fisher-Rao distance in non-parametric form for
probability distributions :footcite:`srivastava++_2011_ficher-rao`.
probability distributions :footcite:`srivastava++_2011_registration`.
If the observations are defined in a :term:`domain` different than (0,1)
their domains are normalized to this interval with an affine
@@ -166,7 +166,7 @@ def fisher_rao_amplitude_distance(
.. math::
\mathcal{R}(\gamma) = \|\sqrt{\dot{\gamma}}- 1 \|_{\mathbb{L}^2}^2
See the :footcite:`srivastava+klassen_2016_analysis_amplitude` for a
See the :footcite:`srivastava+klassen_2016_functionala` for a
detailed explanation.
If the observations are defined in a :term:`domain` different than (0,1)
@@ -264,7 +264,7 @@ def fisher_rao_phase_distance(
where :math:`\gamma_{id}` is the identity warping.
See :footcite:`srivastava+klassen_2016_analysis_phase` for a detailed
See :footcite:`srivastava+klassen_2016_functionala` for a detailed
explanation.
If the observations are defined in a :term:`domain` different than (0,1)
@@ -348,7 +348,7 @@ def _fisher_rao_warping_distance(
d_{\Gamma}(\gamma_i, \gamma_j) = cos^{-1} \left ( \int_0^1
\sqrt{\dot \gamma_i(t)\dot \gamma_j(t)}dt \right )
See :footcite:`srivastava+klassen_2016_analysis_probability` for a detailed
See :footcite:`srivastava+klassen_2016_functionala` for a detailed
explanation.
If the warpings are not defined in [0,1], an affine transformation is made
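The warping distance above can be evaluated directly from its formula; this NumPy sketch does so for two synthetic warpings on [0, 1].

import numpy as np

def warping_distance(gamma1, gamma2, t):
    # d_Gamma(g1, g2) = arccos( integral_0^1 sqrt(g1'(t) g2'(t)) dt )
    d1 = np.gradient(gamma1, t)
    d2 = np.gradient(gamma2, t)
    inner = np.trapz(np.sqrt(np.clip(d1 * d2, 0, None)), t)
    return np.arccos(np.clip(inner, -1, 1))

t = np.linspace(0, 1, 200)
print(warping_distance(t, t ** 2, t))   # distance between the identity and t -> t^2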
2 changes: 1 addition & 1 deletion skfda/misc/metrics/_mahalanobis.py
@@ -23,7 +23,7 @@ class MahalanobisDistance(BaseEstimator):
Class that implements functional Mahalanobis distance for both
basis and grid representations of the data
:footcite:`berrendero+bueno-larraz+cuevas_2020_mahalanobis`.
:footcite:`berrendero++_2020_mahalanobis`.
Parameters:
n_components: Number of eigenvectors to keep from
6 changes: 3 additions & 3 deletions skfda/misc/operators/_srvf.py
@@ -29,7 +29,7 @@ class SRSF(
This representation is used to compute the extended non-parametric
Fisher-Rao distance between functions, which under the SRSF representation
becomes the usual :math:`\mathbb{L}^2` distance between functions.
See :footcite:`srivastava+klassen_2016_analysis_square`.
See :footcite:`srivastava+klassen_2016_functionala`.
The inverse SRSF transform is defined as
@@ -133,7 +133,7 @@ def transform(self, X: FDataGrid, y: object = None) -> FDataGrid:
Let :math:`f : [a,b] \rightarrow \mathbb{R}` be an absolutely
continuous function, the SRSF transform is defined as
:footcite:`srivastava+klassen_2016_analysis_square`:
:footcite:`srivastava+klassen_2016_functionala`:
.. math::
@@ -184,7 +184,7 @@ def inverse_transform(self, X: FDataGrid, y: None = None) -> FDataGrid:
Compute the inverse SRSF transform.
Given the srsf and the initial value the original function can be
obtained as :footcite:`srivastava+klassen_2016_analysis_square`:
obtained as :footcite:`srivastava+klassen_2016_functionala`:
.. math::
f(t) = f(a) + \int_{a}^t q(t)|q(t)|dt
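A quick NumPy check of the forward and inverse SRSF formulas quoted above; the small epsilon guarding the square root is an implementation convenience for points where f' vanishes, not part of the definition.

import numpy as np
from scipy.integrate import cumulative_trapezoid

t = np.linspace(0, 1, 200)
f = np.sin(2 * np.pi * t)

# Forward SRSF: q(t) = f'(t) / sqrt(|f'(t)|).
df = np.gradient(f, t)
q = df / np.sqrt(np.abs(df) + 1e-12)

# Inverse transform: f(t) = f(a) + integral_a^t q(s) |q(s)| ds.
f_rec = f[0] + cumulative_trapezoid(q * np.abs(q), t, initial=0)
print(np.max(np.abs(f - f_rec)))   # reconstruction error is small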
25 changes: 12 additions & 13 deletions skfda/ml/classification/_depth_classifiers.py
@@ -218,11 +218,12 @@ class DDGClassifier(
BaseEstimator,
ClassifierMixin[Input, Target],
):
r"""Generalized depth-versus-depth (DD) classifer for functional data.
r"""
Generalized depth-versus-depth (DD) classifier for functional data.
This classifier builds an interface around the DDGTransfomer.
The transformer takes a list of k depths and performs the following map:
The transformer takes a list of k depths and performs the following map
:footcite:p:`cuesta-albertos++_2017_ddgclassifier`:
.. math::
\mathcal{X} &\rightarrow \mathbb{R}^G \\
@@ -231,7 +232,6 @@ class DDGClassifier(
Where :math:`D_i^j(x)` is the depth of the point :math:`x` with respect to
the data in the :math:`i`-th group using the :math:`j`-th depth of the
provided list.
Note that :math:`\mathcal{X}` is possibly multivariate, that is,
:math:`\mathcal{X} = \mathcal{X}_1 \times ... \times \mathcal{X}_p`.
@@ -310,12 +310,8 @@ class DDGClassifier(
:class:`~skfda.preprocessing.dim_reduction.feature_extraction._ddg_transformer`
References:
Li, J., Cuesta-Albertos, J. A., and Liu, R. Y. (2012). DD-classifier:
Nonparametric classification procedure based on DD-plot. Journal of
the American Statistical Association, 107(498):737-753.
.. footbibliography::
Cuesta-Albertos, J.A., Febrero-Bande, M. and Oviedo de la Fuente, M.
(2017) The DDG-classifier in the functional setting. TEST, 26. 119-142.
"""

def __init__( # noqa: WPS234
@@ -514,15 +510,18 @@ def predict(self, X: Union[NDArrayFloat, pd.DataFrame]) -> Target:


class MaximumDepthClassifier(DDGClassifier[Input, Target]):
"""Maximum depth classifier for functional data.
"""
Maximum depth classifier for functional data.
Test samples are classified to the class where they are deeper.
Test samples are classified to the class where they are deeper
:footcite:p:`ghosh+chaudhuri_2005_maximum`.
Parameters:
depth_method:
The depth class to use when calculating the depth of a test
sample in a class. See the documentation of the depths module
for a list of available depths. By default it is ModifiedBandDepth.
Examples:
Firstly, we will import and split the Berkeley Growth Study dataset
Expand Down Expand Up @@ -557,8 +556,8 @@ class MaximumDepthClassifier(DDGClassifier[Input, Target]):
:class:`~skfda.ml.classification.DDGClassifier`
References:
Ghosh, A. K. and Chaudhuri, P. (2005b). On maximum depth and
related classifiers. Scandinavian Journal of Statistics, 32, 327–350.
.. footbibliography::
"""

def __init__(self, depth_method: Depth[Input] | None = None) -> None:
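The maximum depth rule itself is simple; this sketch uses a crude pointwise depth proxy in place of the library's ModifiedBandDepth, only to show the argmax-over-classes decision. Everything here is synthetic and hypothetical.

import numpy as np

def pointwise_depth(X, curve):
    # Crude pointwise depth proxy (the library uses proper functional depths
    # such as ModifiedBandDepth): how centrally `curve` sits within the sample.
    return min((X <= curve).mean(), (X >= curve).mean())

def maximum_depth_predict(samples_by_class, x_new):
    # Assign x_new to the class in which it is deepest.
    depths = [pointwise_depth(X_c, x_new) for X_c in samples_by_class]
    return int(np.argmax(depths))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
class0 = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((20, 50))
class1 = np.sin(2 * np.pi * t) + 0.5 + 0.1 * rng.standard_normal((20, 50))
x_new = np.sin(2 * np.pi * t) + 0.45
print(maximum_depth_predict([class0, class1], x_new))   # expected: 1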
2 changes: 1 addition & 1 deletion skfda/ml/classification/_logistic_regression.py
@@ -23,7 +23,7 @@ class LogisticRegression(
This class implements the sequential “greedy” algorithm
for functional logistic regression proposed in
:footcite:ts:`bueno++_2021_functional`.
:footcite:ts:`berrendero++_2022_functional`.
.. warning::
For now, only binary classification for functional
8 changes: 2 additions & 6 deletions skfda/preprocessing/dim_reduction/variable_selection/_rkvs.py
@@ -133,7 +133,7 @@ class RKHSVariableSelection(
misclassification error of all the classification problems with the
reduced dimensionality. For a longer discussion about the optimality and
consistency of this method, we refer the reader to the original
article [1]_.
article :footcite:`berrendero++_2018_use`.
In practice the points are selected one at a time, using
a greedy approach, so this optimality is not always guaranteed.
@@ -189,11 +189,7 @@ class RKHSVariableSelection(
(10000, 3)
References:
.. [1] J. R. Berrendero, A. Cuevas, and J. L. Torrecilla, «On the Use
of Reproducing Kernel Hilbert Spaces in Functional
Classification», Journal of the American Statistical
Association, vol. 113, no. 523, pp. 1210-1218, jul. 2018,
doi: 10.1080/01621459.2017.1320287.
.. footbibliography::
"""

@@ -134,7 +134,7 @@ class MaximaHunting(
For a longer explanation about the method, and comparison with other
functional variable selection methods, we refer the reader to the
original article :footcite:`berrendero+cuevas+torrecilla_2016_hunting`.
original article :footcite:`berrendero++_2016_variable`.
Parameters:
dependence_measure (callable): Dependence measure to use. By default,
@@ -660,7 +660,7 @@ class AsymptoticIndependenceTestStop(StoppingCondition):
Stop when the selected point is independent from the target.
It uses an asymptotic test based on the chi-squared distribution described
in :footcite:`szekely+rizzo_2010_brownian`. The test rejects independence
in :footcite:`szekely+rizzo_2009_brownian`. The test rejects independence
if
.. math::
@@ -870,7 +870,7 @@ class RecursiveMaximaHunting(
selected by :class:`MaximaHunting` alone.
This method was originally described in a special case in article
:footcite:`torrecilla+suarez_2016_hunting`.
:footcite:`torrecilla+suarez_2016_feature`.
Additional information about the usage of this method can be found in
:doc:`/modules/preprocessing/dim_reduction/recursive_maxima_hunting`.
2 changes: 1 addition & 1 deletion skfda/preprocessing/registration/_fisher_rao.py
@@ -60,7 +60,7 @@ class FisherRaoElasticRegistration(
`elastic mean`, which is the local minimum of the sum of squares of elastic
distances. See :func:`~elastic_mean`.
In :footcite:`srivastava+klassen_2016_analysis_elastic` are described
In :footcite:`srivastava+klassen_2016_functionala` are described
extensively the algorithms employed and the SRSF framework.
Args:
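A hedged usage sketch of the class touched in this file; it assumes the default template (the elastic mean) and the warping_ fitted attribute behave as in the released package.

import skfda
from skfda.preprocessing.registration import FisherRaoElasticRegistration

X, _ = skfda.datasets.fetch_growth(return_X_y=True)
reg = FisherRaoElasticRegistration()   # template defaults to the elastic mean
X_aligned = reg.fit_transform(X)
warpings = reg.warping_                # estimated warping functions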
4 changes: 2 additions & 2 deletions skfda/preprocessing/registration/_landmark_registration.py
@@ -203,7 +203,7 @@ def landmark_elastic_registration_warping(
:math:`h_i(t^*_j)=t_{ij}`.
The registered samples can be obtained as :math:`x^*_i(t)=x_i(h_i(t))`.
See :footcite:`ramsay+silverman_2005_functional_landmark`
See :footcite:`ramsay+silverman_2005_registration`
for a detailed explanation.
Args:
@@ -344,7 +344,7 @@ def landmark_elastic_registration(
The registered samples will have their features aligned, i.e.,
:math:`x^*_i(t^*_j)=x_i(t_{ij})`.
See :footcite:`ramsay+silverman_2005_functional_landmark`
See :footcite:`ramsay+silverman_2005_registration`
for a detailed explanation.
Args:
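As an illustration of the warping construction h_i(t*_j) = t_ij, the following NumPy sketch builds a piecewise-linear warping through a single matched landmark and applies x*_i(t) = x_i(h_i(t)); the curve and landmark locations are made up.

import numpy as np

def landmark_warping(landmarks_sample, landmarks_target, t):
    # Monotone warping h with h(t*_j) = t_ij, built here by piecewise-linear
    # interpolation through the matched landmarks and the domain endpoints.
    x = np.concatenate(([t[0]], landmarks_target, [t[-1]]))
    y = np.concatenate(([t[0]], landmarks_sample, [t[-1]]))
    return np.interp(t, x, y)

t = np.linspace(0, 1, 100)
sample = np.sin(2 * np.pi * (t - 0.1))   # a shifted curve, peak near t = 0.35
landmarks_sample = np.array([0.35])      # feature location in the sample
landmarks_target = np.array([0.25])      # where the feature should be placed

h = landmark_warping(landmarks_sample, landmarks_target, t)
registered = np.interp(h, t, sample)     # x*_i(t) = x_i(h_i(t))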
