Utilities for Developers

Scikit-learn contains a number of utilities to help with development. These tools span several categories, and all of the following functions and classes live in the module :mod:`sklearn.utils`.

Warning

These utilities are meant to be used internally within the scikit-learn package. They are not guaranteed to be stable between versions of scikit-learn. Backports, in particular, will be removed as the scikit-learn dependencies evolve.

.. currentmodule:: sklearn.utils

Validation Tools

These are tools used to check and validate input. When you write a function that accepts arrays, matrices, or sparse matrices as arguments, the following should be used when applicable (a sketch combining several of them follows the list).

  • :func:`assert_all_finite`: Throw an error if array contains NaNs or Infs.
  • :func:`safe_asarray`: Convert input to array or sparse matrix. Equivalent to np.asarray, but sparse matrices are passed through.
  • :func:`as_float_array`: Convert input to an array of floats. If a sparse matrix is passed, a sparse matrix will be returned.
  • :func:`array2d`: Equivalent to np.atleast_2d, but the order and dtype of the input are maintained.
  • :func:`atleast2d_or_csr`: Equivalent to array2d, but if a sparse matrix is passed, it will be converted to CSR format. Also calls assert_all_finite.
  • :func:`check_arrays`: Check that all input arrays have consistent first dimensions. This works for an arbitrary number of arrays.
  • :func:`warn_if_not_float`: Warn if the input does not have a floating-point dtype; the input X is assumed to have a dtype attribute.
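
For instance, here is a minimal sketch of how several of these checks can be combined at the top of a function that accepts either dense or sparse input. The helper name column_means is hypothetical, and the exact signatures of these utilities may differ between scikit-learn versions:

    from sklearn.utils import as_float_array, atleast2d_or_csr, check_arrays

    def column_means(X, y):
        # hypothetical helper illustrating the validation idiom
        X, y = check_arrays(X, y)   # consistent first dimensions
        X = atleast2d_or_csr(X)     # 2-d array or CSR matrix, all values finite
        X = as_float_array(X)       # floating-point dtype, sparse passed through
        return X.mean(axis=0)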

If your code relies on a random number generator, it should never call functions like numpy.random.random or numpy.random.normal directly: doing so makes results, and hence unit tests, hard to reproduce. Instead, a numpy.random.RandomState object should be used, built from a random_state argument passed to the class or function. The function :func:`check_random_state`, below, turns that argument into a random number generator object; a sketch of the full pattern inside an estimator-like class follows the example below.

  • :func:`check_random_state`: create a np.random.RandomState object from a parameter random_state.
    • If random_state is None or np.random, then a randomly-initialized RandomState object is returned.
    • If random_state is an integer, then it is used to seed a new RandomState object.
    • If random_state is a RandomState object, then it is passed through.

For example:

>>> from sklearn.utils import check_random_state
>>> random_state = 0
>>> random_state = check_random_state(random_state)
>>> random_state.rand(4)
array([ 0.5488135 ,  0.71518937,  0.60276338,  0.54488318])
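
As a minimal sketch of this pattern (the class below is hypothetical and not part of scikit-learn), an estimator stores random_state untouched in __init__ and only converts it with :func:`check_random_state` at the point where random numbers are actually drawn:

    from sklearn.utils import check_random_state

    class RandomSubsampler(object):
        # hypothetical estimator-like class illustrating the random_state idiom
        def __init__(self, n_samples=10, random_state=None):
            # store the parameter as passed; do not draw numbers here
            self.n_samples = n_samples
            self.random_state = random_state

        def fit(self, X):
            # build the generator only where randomness is needed
            rng = check_random_state(self.random_state)
            self.indices_ = rng.permutation(len(X))[:self.n_samples]
            return self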

Efficient Linear Algebra & Array Operations

Efficient Random Sampling

Efficient Routines for Sparse Matrices

The sklearn.utils.sparsefuncs Cython module hosts compiled extensions to efficiently process scipy.sparse data.
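
As an illustration, the sketch below assumes an in-place row normalization routine named inplace_csr_row_normalize_l2 is exposed by this module; the exact module path and set of routines have varied between scikit-learn versions, so check the module contents in your version:

    import numpy as np
    import scipy.sparse as sp
    # assumption: this routine may live in sparsefuncs or sparsefuncs_fast
    # depending on the scikit-learn version
    from sklearn.utils.sparsefuncs import inplace_csr_row_normalize_l2

    X = sp.csr_matrix(np.array([[1., 0., 2.],
                                [0., 3., 0.]]))
    inplace_csr_row_normalize_l2(X)   # rows of X now have unit L2 norm (modified in place)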

Graph Routines

  • :func:`graph.single_source_shortest_path_length`: (not currently used in scikit-learn) Return the shortest path from a single source to all connected nodes on a graph. Code is adapted from networkx. If this is ever needed again, it would be far faster to use a single iteration of Dijkstra's algorithm from graph_shortest_path.
  • :func:`graph.graph_laplacian`: (used in :func:`sklearn.cluster.spectral.spectral_embedding`) Return the Laplacian of a given graph. There is specialized code for both dense and sparse connectivity matrices.
  • :func:`graph_shortest_path.graph_shortest_path`: (used in :class:`sklearn.manifold.Isomap`) Return the shortest path between all pairs of connected points on a directed or undirected graph. Both the Floyd-Warshall algorithm and Dijkstra's algorithm are available. The routine is most efficient when the connectivity matrix is a scipy.sparse.csr_matrix; a short sketch follows this list.
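
A minimal sketch of graph_shortest_path on a small weighted graph. The keyword names method and directed follow the interface used by :class:`sklearn.manifold.Isomap` and may differ slightly between versions:

    import numpy as np
    from scipy.sparse import csr_matrix
    from sklearn.utils.graph_shortest_path import graph_shortest_path

    # weighted edges 0-1 (length 1) and 1-2 (length 2); no direct 0-2 edge
    dist = csr_matrix(np.array([[0., 1., 0.],
                                [1., 0., 2.],
                                [0., 2., 0.]]))
    # 'auto' chooses between Dijkstra and Floyd-Warshall; either can be forced via method
    paths = graph_shortest_path(dist, method='auto', directed=False)
    # paths[0, 2] is 3.0: the shortest route goes 0 -> 1 -> 2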

Backports

ARPACK

  • :func:`arpack.eigs` (backported from scipy.sparse.linalg.eigs in scipy 0.10) Sparse non-symmetric eigenvalue decomposition using the Arnoldi method. A limited version of eigs is available in earlier scipy versions.
  • :func:`arpack.eigsh` (backported from scipy.sparse.linalg.eigsh in scipy 0.10) Sparse symmetric eigenvalue decomposition using the Lanczos method (see the sketch after this list). A limited version of eigsh is available in earlier scipy versions.
  • :func:`arpack.svds` (backported from scipy.sparse.linalg.svds in scipy 0.10) Sparse truncated singular value decomposition built on ARPACK. A limited version of svds is available in earlier scipy versions.
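
A minimal sketch using the eigsh backport, assuming it exposes the same call signature as scipy.sparse.linalg.eigsh:

    import numpy as np
    import scipy.sparse as sp
    from sklearn.utils.arpack import eigsh

    rng = np.random.RandomState(0)
    M = rng.rand(20, 20)
    A = sp.csr_matrix(M + M.T)               # symmetric matrix in CSR format
    # three eigenpairs with largest algebraic eigenvalues
    vals, vecs = eigsh(A, k=3, which='LA')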

Benchmarking

Testing Functions

Multiclass and multilabel utility function

Helper Functions

  • :func:`gen_even_slices`: generator to create n_packs evenly sized slices that together cover range(n) (see the sketch after this list). Used in sklearn.decomposition.dict_learning and sklearn.cluster.k_means.
  • :class:`arraybuilder.ArrayBuilder`: Helper class to incrementally build a 1-d numpy.ndarray. Currently used in sklearn.datasets._svmlight_format.pyx.
  • :func:`safe_mask`: Helper function to convert a mask to the format expected by the numpy array or scipy sparse matrix on which to use it (sparse matrices support integer indices only while numpy arrays support both boolean masks and integer indices).
  • :func:`safe_sqr`: Helper function for unified squaring (**2) of array-likes, matrices and sparse matrices.
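
A short sketch of two of the helpers above; the results noted in the comments are what the descriptions imply:

    import numpy as np
    from scipy.sparse import csr_matrix
    from sklearn.utils import gen_even_slices, safe_mask

    # three roughly even slices that together cover range(10)
    slices = list(gen_even_slices(10, 3))

    X_dense = np.arange(12).reshape(4, 3)
    X_sparse = csr_matrix(X_dense)
    mask = np.array([True, False, True, False])

    X_dense[safe_mask(X_dense, mask)]     # boolean mask used directly
    X_sparse[safe_mask(X_sparse, mask)]   # mask converted to integer indices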

Hash Functions

  • :func:`murmurhash3_32` provides a Python wrapper for the MurmurHash3_x86_32 C++ non-cryptographic hash function. This hash function is suitable for implementing lookup tables, Bloom filters, Count-Min sketches, feature hashing and implicitly defined sparse random projections:

    >>> from sklearn.utils import murmurhash3_32
    >>> murmurhash3_32("some feature", seed=0) == -384616559
    True
    
    >>> murmurhash3_32("some feature", seed=0, positive=True) == 3910350737
    True
    

    The sklearn.utils.murmurhash module can also be "cimported" from other Cython modules so as to benefit from the high performance of MurmurHash while skipping the overhead of the Python interpreter.
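
    For instance, when building a simple feature-hashing scheme, the positive variant can be folded into a fixed number of buckets with a modulo (the bucket count 2 ** 10 below is an arbitrary choice for illustration):

    >>> murmurhash3_32("some feature", seed=0, positive=True) % (2 ** 10)
    913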

Warnings and Exceptions