DOC fix broken citeseer links as described in #24795 #24800

Merged
4 commits merged on Nov 2, 2022
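This PR replaces legacy CiteSeer `viewdoc` URLs (`?doi=…` query style) with the new citeseerx `doc_view/pid/…` format throughout the docs and docstrings. As a minimal sketch of how the remaining legacy links could be flagged (a hypothetical helper, not part of this PR), a regex over the doc sources suffices:

```python
import re

# Legacy CiteSeer URL patterns replaced by this PR: both the
# citeseer.ist.psu.edu and citeseerx.ist.psu.edu hosts, in their
# viewdoc/summary and viewdoc/download variants.
LEGACY_CITESEER = re.compile(
    r"https?://citeseerx?\.ist\.psu\.edu/viewdoc/(?:summary|download)\?doi=[\d.]+"
)


def find_legacy_links(text):
    """Return all legacy citeseer links found in *text*."""
    return LEGACY_CITESEER.findall(text)


# One legacy link and one already-migrated doc_view/pid link:
sample = (
    '* `"Five Balltree Construction Algorithms"\n'
    "  <http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.91.8209>`_,\n"
    "  <https://citeseerx.ist.psu.edu/doc_view/pid/17ac002939f8e950>`_,\n"
)

print(find_legacy_links(sample))
# → ['http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.91.8209']
```

Only the legacy `viewdoc` form is matched; the new `doc_view/pid` URLs introduced by this PR pass through untouched.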
4 changes: 2 additions & 2 deletions doc/modules/clustering.rst
@@ -563,11 +563,11 @@ graph, and SpectralClustering is initialized with `affinity='precomputed'`::
Jianbo Shi, Jitendra Malik, 2000

* `"A Random Walks View of Spectral Segmentation"
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.33.1501>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/84a86a69315e994cfd1e0c7debb86d62d7bd1f44>`_
Marina Meila, Jianbo Shi, 2001

* `"On Spectral Clustering: Analysis and an algorithm"
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.19.8100>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/796c5d6336fc52aa84db575fb821c78918b65f58>`_
Andrew Y. Ng, Michael I. Jordan, Yair Weiss, 2001

* :arxiv:`"Preconditioned Spectral Clustering for Stochastic
2 changes: 1 addition & 1 deletion doc/modules/feature_selection.rst
@@ -305,7 +305,7 @@ fit and requires no iterations.

.. [sfs] Ferri et al, `Comparative study of techniques for
large-scale feature selection
<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.24.4369&rep=rep1&type=pdf>`_.
<https://citeseerx.ist.psu.edu/doc_view/pid/5fedabbb3957bbb442802e012d829ee0629a01b6>`_.

Feature selection as part of a pipeline
=======================================
6 changes: 3 additions & 3 deletions doc/modules/linear_model.rst
@@ -794,7 +794,7 @@ is more robust to ill-posed problems.

* Section 3.3 in Christopher M. Bishop: Pattern Recognition and Machine Learning, 2006

* David J. C. MacKay, `Bayesian Interpolation <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.27.9072&rep=rep1&type=pdf>`_, 1992.
* David J. C. MacKay, `Bayesian Interpolation <https://citeseerx.ist.psu.edu/doc_view/pid/b14c7cc3686e82ba40653c6dff178356a33e5e2c>`_, 1992.

* Michael E. Tipping, `Sparse Bayesian Learning and the Relevance Vector Machine <http://www.jmlr.org/papers/volume1/tipping01a/tipping01a.pdf>`_, 2001.

@@ -836,11 +836,11 @@ Ridge Regression`_, see the example below.

.. [1] Christopher M. Bishop: Pattern Recognition and Machine Learning, Chapter 7.2.1

.. [2] David Wipf and Srikantan Nagarajan: `A new view of automatic relevance determination <https://papers.nips.cc/paper/3372-a-new-view-of-automatic-relevance-determination.pdf>`_
.. [2] David Wipf and Srikantan Nagarajan: `A New View of Automatic Relevance Determination <https://papers.nips.cc/paper/3372-a-new-view-of-automatic-relevance-determination.pdf>`_

.. [3] Michael E. Tipping: `Sparse Bayesian Learning and the Relevance Vector Machine <http://www.jmlr.org/papers/volume1/tipping01a/tipping01a.pdf>`_

.. [4] Tristan Fletcher: `Relevance Vector Machines explained <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.651.8603&rep=rep1&type=pdf>`_
.. [4] Tristan Fletcher: `Relevance Vector Machines Explained <https://citeseerx.ist.psu.edu/doc_view/pid/3dc9d625404fdfef6eaccc3babddefe4c176abd4>`_


.. _Logistic_regression:
2 changes: 1 addition & 1 deletion doc/modules/manifold.rst
@@ -268,7 +268,7 @@ The overall complexity of MLLE is
.. topic:: References:

* `"MLLE: Modified Locally Linear Embedding Using Multiple Weights"
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.70.382>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/0b060fdbd92cbcc66b383bcaa9ba5e5e624d7ee3>`_
Zhang, Z. & Wang, J.


2 changes: 1 addition & 1 deletion doc/modules/model_evaluation.rst
@@ -830,7 +830,7 @@ precision-recall curve as follows.
2008.
.. [Everingham2010] M. Everingham, L. Van Gool, C.K.I. Williams, J. Winn, A. Zisserman,
`The Pascal Visual Object Classes (VOC) Challenge
<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.157.5766&rep=rep1&type=pdf>`_,
<https://citeseerx.ist.psu.edu/doc_view/pid/b6bebfd529b233f00cb854b7d8070319600cf59d>`_,
IJCV 2010.
.. [Davis2006] J. Davis, M. Goadrich, `The Relationship Between Precision-Recall and ROC Curves
<https://www.biostat.wisc.edu/~page/rocpr.pdf>`_,
4 changes: 2 additions & 2 deletions doc/modules/naive_bayes.rst
@@ -216,12 +216,12 @@ It is advisable to evaluate both models, if time permits.

* A. McCallum and K. Nigam (1998).
`A comparison of event models for Naive Bayes text classification.
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.46.1529>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/04ce064505b1635583fa0d9cc07cac7e9ea993cc>`_
Proc. AAAI/ICML-98 Workshop on Learning for Text Categorization, pp. 41-48.

* V. Metsis, I. Androutsopoulos and G. Paliouras (2006).
`Spam filtering with Naive Bayes -- Which Naive Bayes?
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.61.5542>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/8bd0934b366b539ec95e683ae39f8abb29ccc757>`_
3rd Conf. on Email and Anti-Spam (CEAS).

.. _categorical_naive_bayes:
4 changes: 2 additions & 2 deletions doc/modules/neighbors.rst
@@ -345,8 +345,8 @@ Alternatively, the user can work with the :class:`BallTree` class directly.

.. topic:: References:

* `"Five balltree construction algorithms"
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.91.8209>`_,
* `"Five Balltree Construction Algorithms"
<https://citeseerx.ist.psu.edu/doc_view/pid/17ac002939f8e950ffb32ec4dc8e86bdd8cb5ff1>`_,
Omohundro, S.M., International Computer Science Institute
Technical Report (1989)

4 changes: 2 additions & 2 deletions doc/modules/random_projection.rst
@@ -28,7 +28,7 @@ technique for distance based method.
Kaufmann Publishers Inc., San Francisco, CA, USA, 143-151.

* Ella Bingham and Heikki Mannila. 2001.
`Random projection in dimensionality reduction: applications to image and text data. <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.24.5135&rep=rep1&type=pdf>`_
`Random projection in dimensionality reduction: applications to image and text data. <https://citeseerx.ist.psu.edu/doc_view/pid/aed77346f737b0ed5890b61ad02e5eb4ab2f3dc6>`_
In Proceedings of the seventh ACM SIGKDD international conference on
Knowledge discovery and data mining (KDD '01). ACM, New York, NY, USA,
245-250.
@@ -84,7 +84,7 @@ bounded distortion introduced by the random projection::

* Sanjoy Dasgupta and Anupam Gupta, 1999.
`An elementary proof of the Johnson-Lindenstrauss Lemma.
<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.39.3334&rep=rep1&type=pdf>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/95cd464d27c25c9c8690b378b894d337cdf021f9>`_

.. _gaussian_random_matrix:

2 changes: 1 addition & 1 deletion examples/calibration/plot_compare_calibration.py
@@ -206,5 +206,5 @@ def predict_proba(self, X):
# 1996.
# .. [3] `Obtaining calibrated probability estimates from decision trees and
# naive Bayesian classifiers
# <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.29.3039&rep=rep1&type=pdf>`_
# <https://citeseerx.ist.psu.edu/doc_view/pid/4f67a122ec3723f08ad5cbefecad119b432b3304>`_
# Zadrozny, Bianca, and Charles Elkan. Icml. Vol. 1. 2001.
2 changes: 1 addition & 1 deletion examples/ensemble/plot_adaboost_regression.py
@@ -10,7 +10,7 @@
detail.

.. [1] `H. Drucker, "Improving Regressors using Boosting Techniques", 1997.
<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.31.314>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/8d49e2dedb817f2c3330e74b63c5fc86d2399ce3>`_

"""

8 changes: 4 additions & 4 deletions sklearn/manifold/_locally_linear.py
@@ -301,9 +301,9 @@ def locally_linear_embedding(
.. [2] Donoho, D. & Grimes, C. Hessian eigenmaps: Locally
linear embedding techniques for high-dimensional data.
Proc Natl Acad Sci U S A. 100:5591 (2003).
.. [3] Zhang, Z. & Wang, J. MLLE: Modified Locally Linear
.. [3] `Zhang, Z. & Wang, J. MLLE: Modified Locally Linear
Embedding Using Multiple Weights.
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.70.382
<https://citeseerx.ist.psu.edu/doc_view/pid/0b060fdbd92cbcc66b383bcaa9ba5e5e624d7ee3>`_
.. [4] Zhang, Z. & Zha, H. Principal manifolds and nonlinear
dimensionality reduction via tangent space alignment.
Journal of Shanghai Univ. 8:406 (2004)
@@ -668,9 +668,9 @@ class LocallyLinearEmbedding(
.. [2] Donoho, D. & Grimes, C. Hessian eigenmaps: Locally
linear embedding techniques for high-dimensional data.
Proc Natl Acad Sci U S A. 100:5591 (2003).
.. [3] Zhang, Z. & Wang, J. MLLE: Modified Locally Linear
.. [3] `Zhang, Z. & Wang, J. MLLE: Modified Locally Linear
Embedding Using Multiple Weights.
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.70.382
<https://citeseerx.ist.psu.edu/doc_view/pid/0b060fdbd92cbcc66b383bcaa9ba5e5e624d7ee3>`_
.. [4] Zhang, Z. & Zha, H. Principal manifolds and nonlinear
dimensionality reduction via tangent space alignment.
Journal of Shanghai Univ. 8:406 (2004)
4 changes: 2 additions & 2 deletions sklearn/manifold/_spectral_embedding.py
@@ -523,9 +523,9 @@ class SpectralEmbedding(BaseEstimator):
Ulrike von Luxburg
<10.1007/s11222-007-9033-z>`

- On Spectral Clustering: Analysis and an algorithm, 2001
- `On Spectral Clustering: Analysis and an algorithm, 2001
Andrew Y. Ng, Michael I. Jordan, Yair Weiss
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.19.8100
<https://citeseerx.ist.psu.edu/doc_view/pid/796c5d6336fc52aa84db575fb821c78918b65f58>`_

- :doi:`Normalized cuts and image segmentation, 2000
Jianbo Shi, Jitendra Malik
2 changes: 1 addition & 1 deletion sklearn/mixture/_bayesian_mixture.py
@@ -323,7 +323,7 @@ class BayesianGaussianMixture(BaseMixture):
.. [2] `Hagai Attias. (2000). "A Variational Bayesian Framework for
Graphical Models". In Advances in Neural Information Processing
Systems 12.
<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.36.2841&rep=rep1&type=pdf>`_
<https://citeseerx.ist.psu.edu/doc_view/pid/ee844fd96db7041a9681b5a18bff008912052c7e>`_

.. [3] `Blei, David M. and Michael I. Jordan. (2006). "Variational
inference for Dirichlet process mixtures". Bayesian analysis 1.1
4 changes: 2 additions & 2 deletions sklearn/random_projection.py
@@ -101,9 +101,9 @@ def johnson_lindenstrauss_min_dim(n_samples, *, eps=0.1):

.. [1] https://en.wikipedia.org/wiki/Johnson%E2%80%93Lindenstrauss_lemma

.. [2] Sanjoy Dasgupta and Anupam Gupta, 1999,
.. [2] `Sanjoy Dasgupta and Anupam Gupta, 1999,
"An elementary proof of the Johnson-Lindenstrauss Lemma."
http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.45.3654
<https://citeseerx.ist.psu.edu/doc_view/pid/95cd464d27c25c9c8690b378b894d337cdf021f9>`_

Examples
--------
4 changes: 2 additions & 2 deletions sklearn/semi_supervised/_label_propagation.py
@@ -553,9 +553,9 @@ class LabelSpreading(BaseLabelPropagation):

References
----------
Dengyong Zhou, Olivier Bousquet, Thomas Navin Lal, Jason Weston,
`Dengyong Zhou, Olivier Bousquet, Thomas Navin Lal, Jason Weston,
Bernhard Schoelkopf. Learning with local and global consistency (2004)
http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.115.3219
<https://citeseerx.ist.psu.edu/doc_view/pid/d74c37aabf2d5cae663007cbd8718175466aea8c>`_

Examples
--------
24 changes: 12 additions & 12 deletions sklearn/svm/_classes.py
@@ -753,9 +753,9 @@ class SVC(BaseSVC):
.. [1] `LIBSVM: A Library for Support Vector Machines
<http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf>`_

.. [2] `Platt, John (1999). "Probabilistic outputs for support vector
machines and comparison to regularizedlikelihood methods."
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1639>`_
.. [2] `Platt, John (1999). "Probabilistic Outputs for Support Vector
Machines and Comparisons to Regularized Likelihood Methods"
<https://citeseerx.ist.psu.edu/doc_view/pid/42e5ed832d4310ce4378c44d05570439df28a393>`_

Examples
--------
@@ -1017,9 +1017,9 @@ class NuSVC(BaseSVC):
.. [1] `LIBSVM: A Library for Support Vector Machines
<http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf>`_

.. [2] `Platt, John (1999). "Probabilistic outputs for support vector
machines and comparison to regularizedlikelihood methods."
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1639>`_
.. [2] `Platt, John (1999). "Probabilistic Outputs for Support Vector
Machines and Comparisons to Regularized Likelihood Methods"
<https://citeseerx.ist.psu.edu/doc_view/pid/42e5ed832d4310ce4378c44d05570439df28a393>`_

Examples
--------
@@ -1233,9 +1233,9 @@ class SVR(RegressorMixin, BaseLibSVM):
.. [1] `LIBSVM: A Library for Support Vector Machines
<http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf>`_

.. [2] `Platt, John (1999). "Probabilistic outputs for support vector
machines and comparison to regularizedlikelihood methods."
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1639>`_
.. [2] `Platt, John (1999). "Probabilistic Outputs for Support Vector
Machines and Comparisons to Regularized Likelihood Methods"
<https://citeseerx.ist.psu.edu/doc_view/pid/42e5ed832d4310ce4378c44d05570439df28a393>`_

Examples
--------
@@ -1442,9 +1442,9 @@ class NuSVR(RegressorMixin, BaseLibSVM):
.. [1] `LIBSVM: A Library for Support Vector Machines
<http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf>`_

.. [2] `Platt, John (1999). "Probabilistic outputs for support vector
machines and comparison to regularizedlikelihood methods."
<http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1639>`_
.. [2] `Platt, John (1999). "Probabilistic Outputs for Support Vector
Machines and Comparisons to Regularized Likelihood Methods"
<https://citeseerx.ist.psu.edu/doc_view/pid/42e5ed832d4310ce4378c44d05570439df28a393>`_

Examples
--------
2 changes: 1 addition & 1 deletion sklearn/tree/_reingold_tilford.py
@@ -158,7 +158,7 @@ def ancestor(vil, v, default_ancestor):
# the relevant text is at the bottom of page 7 of
# "Improving Walker's Algorithm to Run in Linear Time" by Buchheim et al,
# (2002)
# http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.16.8757&rep=rep1&type=pdf
# https://citeseerx.ist.psu.edu/doc_view/pid/1f41c3c2a4880dc49238e46d555f16d28da2940d
if vil.ancestor in v.parent.children:
return vil.ancestor
else: