4 changes: 4 additions & 0 deletions doc/modules/biclustering.rst
@@ -48,6 +48,8 @@ columns:
:target: ../auto_examples/bicluster/images/sphx_glr_plot_spectral_coclustering_003.png
:align: center
:scale: 50
:alt: The graph is a square heat map, 5x5, with axes from 0 to 250. The five darkest
      squares of the heat map run diagonally from top left to bottom right.

An example of biclusters formed by partitioning rows and columns.

@@ -60,6 +62,8 @@ small:
:target: ../auto_examples/bicluster/images/sphx_glr_plot_spectral_biclustering_003.png
:align: center
:scale: 50
:alt: The graph is a square heat map, 5x5, with axes from 0 to 250. The variance of
the values within each bicluster is small, causing a checkerboard effect.

An example of checkerboard biclusters.

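The two figures above come from the spectral co-clustering and spectral biclustering examples. As a minimal, hedged sketch of how such a block-diagonal heat map is obtained (the data shape, noise level, and number of clusters below are illustrative assumptions, not the settings of the linked example)::

    import numpy as np
    from sklearn.cluster import SpectralCoclustering
    from sklearn.datasets import make_biclusters

    # Synthetic data with five planted biclusters (shape and noise are assumptions).
    data, rows, cols = make_biclusters(shape=(300, 300), n_clusters=5, noise=5,
                                       random_state=0)

    model = SpectralCoclustering(n_clusters=5, random_state=0)
    model.fit(data)

    # Reordering rows and columns by their bicluster labels produces the
    # block-diagonal heat map described in the first alt text.
    reordered = data[np.argsort(model.row_labels_)]
    reordered = reordered[:, np.argsort(model.column_labels_)]

``SpectralBiclustering`` is fitted the same way and yields the checkerboard structure described in the second alt text.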
4 changes: 4 additions & 0 deletions doc/modules/calibration.rst
@@ -53,6 +53,10 @@ by showing the number of samples in each predicted probability bin.
.. figure:: ../auto_examples/calibration/images/sphx_glr_plot_compare_calibration_001.png
:target: ../auto_examples/calibration/plot_compare_calibration.html
:align: center
:alt: Five plots comparing the calibration of classifiers. The first plot compares
      Logistic, Naive Bayes, SVC, and Random Forest classifiers against perfect
      calibration. The following four histograms show each classifier separately, with
      predicted probability on the x-axis and count on the y-axis.

.. currentmodule:: sklearn.linear_model

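To make the reliability-curve panel described above concrete, here is a minimal sketch built on ``calibration_curve``; the synthetic dataset and the choice of ``RandomForestClassifier`` are illustrative assumptions rather than the code of the linked example::

    from sklearn.calibration import calibration_curve
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic binary task split in half (sizes and classifier are assumptions).
    X, y = make_classification(n_samples=2000, random_state=0)
    X_train, X_test, y_train, y_test = X[:1000], X[1000:], y[:1000], y[1000:]

    clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    prob_pos = clf.predict_proba(X_test)[:, 1]

    # Fraction of positives per predicted-probability bin; a perfectly calibrated
    # model would track the diagonal of the first panel.
    frac_pos, mean_pred = calibration_curve(y_test, prob_pos, n_bins=10)

Plotting ``mean_pred`` against ``frac_pos`` gives one curve of the first panel, and a histogram of ``prob_pos`` gives the corresponding per-classifier panel.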
23 changes: 23 additions & 0 deletions doc/modules/clustering.rst
@@ -272,6 +272,10 @@ small, as shown in the example and cited reference.
:target: ../auto_examples/cluster/plot_mini_batch_kmeans.html
:align: center
:scale: 100
:alt: A figure of three panels with scatter plots for KMeans, MiniBatchKMeans,
      and their difference. Both methods give very similar results: the left and
      middle panels show the three identified clusters. The difference panel
      highlights the fewer than 20 points, out of the 3000, whose assignment differs.


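A minimal sketch of the comparison the alt text describes, on an assumed synthetic three-blob dataset; the label-matching step is needed because the two fits may number their clusters differently::

    import numpy as np
    from sklearn.cluster import KMeans, MiniBatchKMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import pairwise_distances_argmin

    # 3000 points in three blobs, loosely mirroring the figure (parameters assumed).
    X, _ = make_blobs(n_samples=3000, centers=3, random_state=0)

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    mbk = MiniBatchKMeans(n_clusters=3, batch_size=100, n_init=10,
                          random_state=0).fit(X)

    # Cluster indices are arbitrary, so map each MiniBatchKMeans center to its
    # nearest KMeans center before counting points assigned differently -- the
    # handful highlighted in the "difference" panel.
    km_labels = pairwise_distances_argmin(X, km.cluster_centers_)
    mbk_labels = pairwise_distances_argmin(X, mbk.cluster_centers_)
    mbk_to_km = pairwise_distances_argmin(mbk.cluster_centers_, km.cluster_centers_)
    n_different = int(np.sum(km_labels != mbk_to_km[mbk_labels]))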
.. topic:: Examples:
@@ -310,6 +314,7 @@ is given.
:target: ../auto_examples/cluster/plot_affinity_propagation.html
:align: center
:scale: 50
:alt: Three distinct clusters created using affinity propagation.


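A minimal usage sketch for the figure above; the blob layout and the ``preference`` value are illustrative assumptions::

    from sklearn.cluster import AffinityPropagation
    from sklearn.datasets import make_blobs

    # Three well-separated blobs (layout assumed for illustration).
    X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.5, random_state=0)

    af = AffinityPropagation(preference=-50, random_state=0).fit(X)
    # The number of clusters is inferred from the data rather than passed in.
    n_clusters = len(af.cluster_centers_indices_)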
Affinity Propagation can be interesting as it chooses the number of
@@ -461,10 +466,15 @@ computed using a function of a gradient of the image.
.. |noisy_img| image:: ../auto_examples/cluster/images/sphx_glr_plot_segmentation_toy_001.png
:target: ../auto_examples/cluster/plot_segmentation_toy.html
:scale: 50
:alt: An image of connected, nearly equally sized circles. The circles are in
      shades of green, blue, and yellow. The entire image is pixelated and blurry.

.. |segmented_img| image:: ../auto_examples/cluster/images/sphx_glr_plot_segmentation_toy_002.png
:target: ../auto_examples/cluster/plot_segmentation_toy.html
:scale: 50
:alt: An image of connected, nearly equally sized circles. Each circle has a
      distinct color: dark green, light green, blue, and yellow. The image is much
      clearer than the previous one.

.. centered:: |noisy_img| |segmented_img|

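Both images come from spectral clustering on a graph derived from the image gradient. A minimal sketch, with the image size, circle positions, and noise level assumed rather than taken from the linked example::

    import numpy as np
    from sklearn.cluster import spectral_clustering
    from sklearn.feature_extraction import image

    # A toy image of four noisy circles (size, positions, and noise are assumptions).
    x, y = np.indices((100, 100))
    centers = [(28, 24), (40, 50), (67, 58), (24, 70)]
    img = np.zeros((100, 100))
    for cx, cy in centers:
        img += ((x - cx) ** 2 + (y - cy) ** 2) < 16 ** 2
    img += 0.2 * np.random.RandomState(0).randn(100, 100)

    # Edge weights are a decreasing function of the gradient, so the graph cut
    # follows the circle boundaries.
    graph = image.img_to_graph(img)
    graph.data = np.exp(-graph.data / graph.data.std())
    labels = spectral_clustering(graph, n_clusters=4, eigen_solver="arpack",
                                 random_state=0)
    segmentation = labels.reshape(img.shape)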
@@ -641,10 +651,20 @@ the roll.
.. |unstructured| image:: ../auto_examples/cluster/images/sphx_glr_plot_ward_structured_vs_unstructured_001.png
:target: ../auto_examples/cluster/plot_ward_structured_vs_unstructured.html
:scale: 49
:alt: A 3D scatter plot of 1500 data points lying along a swiss-roll-shaped swirl.
      The six clusters are highlighted in different colors. The hierarchical
      clustering is performed without connectivity constraints on the structure,
      based solely on distance, so the clusters extend across multiple layers of
      the roll.

.. |structured| image:: ../auto_examples/cluster/images/sphx_glr_plot_ward_structured_vs_unstructured_002.png
:target: ../auto_examples/cluster/plot_ward_structured_vs_unstructured.html
:scale: 49
:alt: A 3D scatter plot of 1500 data points lying along a swiss-roll-shaped swirl.
      The six clusters are highlighted in different colors. The hierarchical
      clustering is performed with connectivity constraints that respect the
      structure of the roll, so the clusters form a nice parcellation along the
      roll.

.. centered:: |unstructured| |structured|

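A minimal sketch of the two fits behind these figures; the noise level and the 10-nearest-neighbors connectivity graph are illustrative assumptions::

    from sklearn.cluster import AgglomerativeClustering
    from sklearn.datasets import make_swiss_roll
    from sklearn.neighbors import kneighbors_graph

    # 1500 points on a swiss roll, as in the figures (noise level assumed).
    X, _ = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)

    # Without a connectivity constraint, Ward merges purely by distance, so
    # clusters can jump across layers of the roll.
    unstructured = AgglomerativeClustering(n_clusters=6, linkage="ward").fit(X)

    # A k-nearest-neighbors connectivity graph restricts merges to neighboring
    # points, yielding clusters that follow the roll.
    connectivity = kneighbors_graph(X, n_neighbors=10, include_self=False)
    structured = AgglomerativeClustering(
        n_clusters=6, linkage="ward", connectivity=connectivity
    ).fit(X)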
@@ -798,6 +818,9 @@ by black points below.
.. |dbscan_results| image:: ../auto_examples/cluster/images/sphx_glr_plot_dbscan_001.png
:target: ../auto_examples/cluster/plot_dbscan.html
:scale: 50
:alt: A bubble chart that maps three clusters, with the x-axis from -2.5 to 2
and the y-axis from -2 to 2.5. Outliers are indicated separately and are
evenly spaced around the clusters.

.. centered:: |dbscan_results|

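A minimal sketch of a fit producing a picture like the one above; the blob layout, ``eps``, and ``min_samples`` values are illustrative assumptions::

    import numpy as np
    from sklearn.cluster import DBSCAN
    from sklearn.datasets import make_blobs
    from sklearn.preprocessing import StandardScaler

    # Three blobs with some scatter (parameters assumed for illustration).
    X, _ = make_blobs(n_samples=750, centers=3, cluster_std=0.4, random_state=0)
    X = StandardScaler().fit_transform(X)

    db = DBSCAN(eps=0.3, min_samples=10).fit(X)

    # Label -1 marks the noise points drawn in black around the clusters.
    n_clusters = len(set(db.labels_)) - (1 if -1 in db.labels_ else 0)
    n_noise = int(np.sum(db.labels_ == -1))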
17 changes: 17 additions & 0 deletions doc/modules/covariance.rst
@@ -154,6 +154,9 @@ object to the same sample.
:target: ../auto_examples/covariance/plot_covariance_estimation.html
:align: center
:scale: 65%
:alt: A plot of the error, measured as the negative log-likelihood on test data,
      against the shrinkage coefficient (regularization parameter): shrinkage
      coefficient on the x-axis and error on the y-axis.

Bias-variance trade-off when setting the shrinkage: comparing the
choices of Ledoit-Wolf and OAS estimators
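A minimal sketch of how such an error-versus-shrinkage curve can be traced, assuming synthetic Gaussian data and an arbitrary grid of shrinkage values::

    import numpy as np
    from sklearn.covariance import ShrunkCovariance

    rng = np.random.RandomState(0)
    X_train = rng.standard_normal((40, 20))   # small-sample regime (sizes assumed)
    X_test = rng.standard_normal((200, 20))

    # Score a grid of shrinkage coefficients by log-likelihood on held-out data;
    # the negative of this score is the error plotted on the y-axis above.
    shrinkages = np.logspace(-2, 0, 30)
    scores = [ShrunkCovariance(shrinkage=s).fit(X_train).score(X_test)
              for s in shrinkages]
    best_shrinkage = shrinkages[int(np.argmax(scores))]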
@@ -178,6 +181,12 @@ object to the same sample.
:target: ../auto_examples/covariance/plot_lw_vs_oas.html
:align: center
:scale: 75%
:alt: Two figures. One compares the Mean Squared Error of a LedoitWolf and an OAS
      estimator of the covariance, with the y-axis from 0 to 60. The other compares
      the shrinkage coefficient estimated by LedoitWolf and OAS, with the y-axis
      from 0 to 1. In both figures the x-axis is the number of samples, from 5 to 30.


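A minimal sketch of the two estimators being compared, on assumed synthetic data with fewer samples than features::

    import numpy as np
    from sklearn.covariance import LedoitWolf, OAS

    rng = np.random.RandomState(0)
    X = rng.standard_normal((15, 30))  # fewer samples than features (sizes assumed)

    lw = LedoitWolf().fit(X)
    oas = OAS().fit(X)

    # Each estimator picks its own shrinkage coefficient, the quantity compared
    # in the second panel of the figure above.
    print(lw.shrinkage_, oas.shrinkage_)

Both estimators choose the shrinkage automatically; OAS assumes Gaussian data, which is why it can improve on Ledoit-Wolf in that setting.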

.. _sparse_inverse_covariance:
@@ -210,6 +219,10 @@ cross-validation to automatically set the ``alpha`` parameter.
:target: ../auto_examples/covariance/plot_sparse_cov.html
:align: center
:scale: 60%
:alt: A grid of matrices computed with different estimators: empirical covariance,
      empirical precision, Ledoit-Wolf covariance, Ledoit-Wolf precision,
      GraphicalLassoCV covariance, GraphicalLassoCV precision, true covariance,
      and true precision.


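A minimal sketch of fitting ``GraphicalLassoCV``, whose estimated covariance and precision matrices are among those shown in the figure; the dimensionality, sparsity, and sample size below are illustrative assumptions::

    import numpy as np
    from sklearn.covariance import GraphicalLassoCV
    from sklearn.datasets import make_sparse_spd_matrix

    # Sample from a Gaussian with a sparse precision matrix (sizes assumed).
    rng = np.random.RandomState(0)
    prec = make_sparse_spd_matrix(20, alpha=0.95, random_state=0)
    cov = np.linalg.inv(prec)
    X = rng.multivariate_normal(np.zeros(20), cov, size=60)

    # Cross-validation chooses the alpha penalty automatically.
    model = GraphicalLassoCV().fit(X)
    estimated_cov = model.covariance_
    estimated_prec = model.precision_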
*A comparison of maximum likelihood, shrinkage and sparse estimates of
the covariance and precision matrix in the very small samples
@@ -333,10 +346,14 @@ attributes of a :class:`MinCovDet` robust covariance estimator object.
.. |robust_vs_emp| image:: ../auto_examples/covariance/images/sphx_glr_plot_robust_vs_empirical_covariance_001.png
:target: ../auto_examples/covariance/plot_robust_vs_empirical_covariance.html
:scale: 49%
:alt: Two line charts comparing the estimation errors made when using various
      types of location and covariance estimates on contaminated, Gaussian
      distributed data sets.

.. |mahalanobis| image:: ../auto_examples/covariance/images/sphx_glr_plot_mahalanobis_distances_001.png
:target: ../auto_examples/covariance/plot_mahalanobis_distances.html
:scale: 49%
:alt: A scatter plot of data points with Mahalanobis distances indicated.



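Finally, a minimal sketch of the robust-versus-empirical comparison behind these two figures; the contamination scheme and sample sizes are illustrative assumptions::

    import numpy as np
    from sklearn.covariance import EmpiricalCovariance, MinCovDet

    # Gaussian data with a handful of contaminating outliers (proportions assumed).
    rng = np.random.RandomState(0)
    X = rng.multivariate_normal(np.zeros(2), [[1.0, 0.6], [0.6, 1.0]], size=125)
    X[:25] = rng.uniform(low=-8, high=8, size=(25, 2))

    emp = EmpiricalCovariance().fit(X)
    robust = MinCovDet(random_state=0).fit(X)

    # Mahalanobis distances under the two fits; the robust (MCD) fit separates
    # the outliers much more clearly, as in the scatter plot above.
    d_emp = emp.mahalanobis(X)
    d_robust = robust.mahalanobis(X)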