
Python for Scientists

A curated list of recommended Python frameworks, libraries, software and resources, all particularly useful for scientific Python users.

Intended for students and researchers in the sciences who want to get the most out of the open-source Python ecosystem. Aims to provide a list of tools useful for common tasks for scientists, without mentioning things which they are unlikely ever to need (e.g. authentication, databases, networking, NLP).

There is a section of must-haves for beginners.

List inspired by awesome-python, which is a great similar resource for anything else you might want to do with Python!

Some libraries appear multiple times where they are useful in multiple ways.


Algebra

Libraries for the manipulation of symbolic algebra, analytic integration etc.

  • SymPy - SymPy is a Python library for symbolic mathematics. It aims to become a full-featured computer algebra system (CAS) while keeping the code as simple as possible in order to be comprehensible and easily extensible.
  • sagemath - Mathematical software system with features covering multiple aspects of mathematics, including algebra, combinatorics, numerical mathematics, number theory, and calculus.
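
As a small sketch of SymPy's symbolic workflow (the expression is invented for illustration):

```python
import sympy as sp

# Declare a symbolic variable and build an expression
x = sp.symbols('x')
expr = x**3 + sp.sin(x)

# Differentiate and integrate analytically, not numerically
derivative = sp.diff(expr, x)                 # 3*x**2 + cos(x)
antiderivative = sp.integrate(sp.cos(x), x)   # sin(x)

print(derivative)
print(antiderivative)
```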


Animations

  • animatplot - A wrapper around matplotlib's FuncAnimation - makes it very easy to animate matplotlib plots.

Bayesian Analysis

  • pymc3 - Package for Bayesian statistical modeling and probabilistic machine learning which focuses on advanced Markov chain Monte Carlo and variational fitting algorithms.
  • arviz - Exploratory analysis of Bayesian models.

Code Quality

Tools to help you write neat and error-free python code

  • PEP8 - The official style guide for python code.
  • structure - The officially recommended way to structure any python project.
  • pycodestyle - A command-line tool which will tell you where you've violated PEP8's recommendations.
  • pyflakes - Checks for logical errors rather than style (e.g. unused module imports).
  • flake8 - A wrapper which runs both pycodestyle and pyflakes.
  • pylint - A tool that checks for errors in Python code, tries to enforce a coding standard and looks for code smells.

Data Storage

  • netcdf4-python - netCDF is a popular file format for multidimensional data, developed by the weather and forecasting community. Use this format unless you have a good reason not to. netcdf4-python is a Python interface to the netCDF C library.
  • xarray - xarray's data model is based on netCDF, and provides the easiest way of reading and writing netCDF4 files in python. Will also load the data lazily, which is extremely useful when dealing with large amounts of data.
  • xmitgcm - A python package for reading MITgcm binary MDS files into xarray data structures. Included as an example of how to go about loading unusual binary file formats into xarray data structures intelligently.


Debugging

  • pdb - The Python debugger. Part of the python standard library.

Development Environments

Programs to write code into. The main choice is between a software-engineering style IDE, and one intended specifically for scientific users.

  • JupyterLab - An IDE which incorporates Jupyter notebooks.
  • PyCharm - Very powerful IDE for python. Use if you want the full powers a software engineer would expect. Has a professional version, which is free for students.
  • spyder - MATLAB-like development environment for scientific python users.


Documentation

  • sphinx - Sphinx is a tool that makes it easy to create intelligent and beautiful documentation from the docstrings in your code. Originally created for documenting the python language itself.
  • nbconvert - Convert jupyter notebooks to other formats such as PDF, LaTeX, HTML.

Domain-Specific Tools

Libraries of tools developed for python users in various fields of science.

  • astropy - Various tools and functionality for astronomy and astrophysics.
  • Biopython - Tools for biological computation.
  • geoviews - Makes it easy to explore and visualize geographical, meteorological, and oceanographic datasets, such as those used in weather, climate, and remote sensing research.
  • MetPy - MetPy is a collection of tools in Python for reading, visualizing and performing calculations with weather data.
  • NetworkX - A package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks.
  • nilearn - Machine learning for Neuro-Imaging in python.
  • Parcels - Track particles along circulating ocean currents.
  • PlasmaPy - Various tools for plasma physics.
  • ProDy - Package for protein structural dynamics analysis.
  • psychopy - An open-source application allowing you to run a wide range of neuroscience, psychology and psychophysics experiments.
  • pyrocko - A seismology toolkit for python.
  • QuTIP - QuTiP is open-source software for simulating the dynamics of open quantum systems.
  • scikit-beam - Data analysis tools for X-Ray, Neutron and Electron sciences.
  • scikit-spectra - A community developed python package for spectroscopy.
  • SunPy - SunPy is a data-analysis environment specializing in providing the software necessary to analyze solar and heliospheric data in Python.
  • TomoPy - Package for tomographic data processing and image reconstruction.

Error Handling

  • errors - How to properly raise and handle errors in python.
  • warnings - Throw proper warnings instead of using print statements. Python standard library module.
  • logging - Standard library module for properly logging information about what's going on as your code runs.
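
A minimal sketch of these three stdlib tools working together (the function and its threshold are invented for illustration):

```python
import logging
import warnings

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def relative_change(old, new):
    """Return (new - old) / old, failing loudly on bad input."""
    if old == 0:
        # Raise a specific exception instead of returning a junk value
        raise ValueError("old value must be non-zero")
    if abs(old) < 1e-12:
        # Warn (rather than print) so callers can filter or escalate it
        warnings.warn("baseline is very small; result may be meaningless")
    logger.info("computing relative change from %s to %s", old, new)
    return (new - old) / old
```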


Forecasting

  • prophet - Tool for producing high-quality forecasts for time series data that has multiple seasonalities, with linear or non-linear growth. Developed by Facebook.


Gotchas

  • python-gotchas - A collection of surprising Python snippets and lesser-known features.

GPU Acceleration

  • cupy - An implementation of a NumPy-compatible multi-dimensional array on CUDA.
  • numba - Numba can compile python functions into CUDA code.

Graphical Interfaces

  • pyqt5 - Library which lets you use the Qt GUI framework (itself written in C++) from python.

Job Scheduling

  • experi - An interface for managing computational experiments with many independent variables.
  • lancet - Launch jobs, organize the output, and dissect the results.
  • papermill - A tool for parameterizing, executing, and analyzing multiple Jupyter Notebooks.

Labelled Data

  • pandas - Major library for data analysis, made more powerful through the use of labelled data.
  • xarray - N-dimensional labelled arrays and datasets. Allows you to perform operations with incredible ease and clarity:
    average_temp = data['temperature'].sel(longitude=40).mean(dim='time')
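
A sketch of the same idea in pandas (the column names and values are invented):

```python
import pandas as pd

# Hypothetical measurements, labelled by date rather than bare integer position
df = pd.DataFrame(
    {"temperature": [280.0, 283.5, 290.1], "pressure": [1013.0, 1009.5, 1001.2]},
    index=pd.to_datetime(["2019-01-01", "2019-01-02", "2019-01-03"]),
)

# Select by label, not position, and reduce with a named method
warm = df[df["temperature"] > 282.0]
mean_pressure = df["pressure"].mean()
```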

Mathematical Library Functions

  • scipy - The standard resource for all kinds of mathematical functions.
  • xrft - Discrete Fourier transform operations for xarray data structures.
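
For example, a one-line numerical integration with scipy (a minimal sketch, assuming scipy is installed):

```python
import numpy as np
from scipy.integrate import quad

# Numerically integrate sin(x) from 0 to pi; the exact answer is 2
value, abs_error = quad(np.sin, 0, np.pi)
print(value)
```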

Numerical Data

  • numpy - The fundamental package for numerical computation in python. So ubiquitous that it might as well be part of python's standard library at this point. Ultimately just a contiguous-in-memory C array, wrapped very nicely with python.
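
A small sketch of the array-oriented style numpy encourages (the temperature values are invented):

```python
import numpy as np

# A (3, 2) array of hypothetical temperatures: 3 times, 2 locations
temps = np.array([[280.0, 282.0],
                  [285.0, 287.0],
                  [290.0, 292.0]])

# Broadcasting: subtract the per-location time-mean without writing a loop
anomaly = temps - temps.mean(axis=0)
print(anomaly.shape)  # (3, 2)
```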

Optimisation Problems

  • nlopt - Library for nonlinear optimization, wrapping many algorithms for global and local, constrained or unconstrained, optimization.

Package Management

Keep track of module dependencies, python versions, and virtual environments.

  • conda - A package manager specifically intended for use by the scientific python community. Developed to manage both python packages and the underlying C/Fortran libraries which make them fast. Also obviates the need for system virtual environments.
  • anaconda - Conda, but packaged with a wide range of useful scientific python libraries, including many from this list.
  • pip - The standard way to install python packages. Use it when a package isn't available through conda; the two play nicely together.
  • setuptools - For when you make your own module, and want to install it properly into your conda environment (so you never need to touch your $PYTHONPATH!)
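
A minimal setup.py sketch (the package name and dependency below are placeholders, not from this list):

```python
# setup.py -- minimal sketch; "mypackage" and its dependency are placeholders
from setuptools import setup, find_packages

setup(
    name="mypackage",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["numpy"],
)
```

Running `pip install -e .` in the same directory then installs the package into the active (conda) environment in editable mode, so imports always pick up your latest code.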

Paper Writing

  • matplotlib2tikz - A converter that takes a matplotlib figure and spits out a TikZ/PGFplots figure for smooth integration into LaTeX. Much better than having to try and alter details of a png image later on.


Parallelization

Use all the cores of your machine, and scale up to clusters!

  • dask - Tools for splitting up computations and executing them across many processors in parallel. dask.array in particular provides a numpy-like interface to a chunked-in-memory array. Dask is especially useful for analysing datasets which are larger than your RAM.
  • xarray - Employs dask behind the scenes to parallelize most operations. Simply load your dataset in "chunks" and xarray will operate on each chunk in parallel:
    # Load data in chunks (the filename and chunk dimension are illustrative)
    ds = xarray.open_dataset('data.nc', chunks={'space': 100})
    # Operations will now run on each spatial chunk in parallel using dask

Physical Units

Keep track of which physical units your numbers are written in.

  • pint - Package to define, operate and manipulate physical quantities.
  • astropy.units - Submodule of astropy which handles units. Units multiply numpy arrays directly.


Plotting

Producing static plots of publication quality.

  • matplotlib - A 2D plotting library which produces publication quality figures in a variety of hardcopy formats and interactive environments across platforms. The standard way to plot data in python.
  • anatomy of matplotlib - Tutorials on how to use matplotlib.
  • matplotlib guide - Article explaining how matplotlib is intended to be used, and will help clarify the differences between figures and axes, object-oriented approach, MATLAB-like stateful approach, interactivity etc. Also good because it gives clear best practice recommendations.
  • scientific-matplotlib - Matplotlib stylesheets for scientific plots.
  • seaborn - A data visualisation library based on matplotlib. Produces much prettier plots than out-of-the-box matplotlib will.
  • xarray.plot - Submodule of xarray which makes plotting into a one-line job: data['density'].plot().
  • colorcet - A set of useful perceptually uniform colormaps for plotting scientific data.
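
A minimal matplotlib sketch using the object-oriented interface recommended above (the data and filename are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this runs on headless machines
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0, 2 * np.pi, 200)

# Work with explicit figure and axes objects rather than the stateful interface
fig, ax = plt.subplots()
ax.plot(x, np.sin(x), label="sin(x)")
ax.set_xlabel("x")
ax.set_ylabel("amplitude")
ax.legend()
fig.savefig("sine.png", dpi=150)  # filename is illustrative
```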

Presentations and Sharing Work

  • Binder - Online Jupyter Notebook hosting for GitHub repositories. Allows users to run Jupyter notebooks from GitHub repositories in the cloud, without Python installed locally.
  • nb_pdf_template - A more accurate representation of jupyter notebooks when converting to pdfs.
  • RISE - A plugin for Jupyter which turns notebooks into slick presentations.
  • jupyter-rise - Automatically launch the RISE plugin from Binder. Great for giving presentations remotely.

Profiling and benchmarking

  • py-spy - A profiler for python code which doesn't interfere with the running process.
  • pytest-benchmark - A pytest fixture for benchmarking code.
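
Before reaching for a third-party tool, the stdlib's cProfile is often enough. A sketch (the function being profiled is a stand-in):

```python
import cProfile
import io
import pstats

def sum_of_squares(n):
    # Deliberately simple stand-in for an expensive calculation
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
result = sum_of_squares(100_000)
profiler.disable()

# Print the five most expensive calls, sorted by cumulative time
buffer = io.StringIO()
pstats.Stats(profiler, stream=buffer).sort_stats("cumulative").print_stats(5)
print(buffer.getvalue())
```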


Scripting

Tools which are likely to be useful when writing python scripts to automate common tasks.

  • click - Run your scripts from the command line, with as little extra code as possible.
  • dateutil - Provides powerful extensions to the standard datetime module available in Python.
  • gitpython - Interact with git from python. Useful for tasks like checking if your simulation code has uncommitted changes before executing it.
  • pathlib - Use this anytime you want to do anything with a file path. Obviates the need for os and sys most of the time. A module in the python standard library.
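
A sketch of why pathlib beats string manipulation (the directory and file names are illustrative):

```python
from pathlib import Path

# Build paths with "/" instead of fragile string concatenation
out = Path("results") / "run_01" / "output.csv"   # names are illustrative

# Path objects know about their own components
print(out.suffix)                   # .csv
print(out.stem)                     # output
print(out.with_suffix(".nc").name)  # output.nc
```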


Speed

Python inevitably sacrifices some speed to gain increased clarity. Scientific programs usually have one or two functions which do 90% of the work, and there are various ways to dramatically speed these up. Use these in conjunction with parallelization through dask if you want as much speed as possible.

  • cython - A compiler which allows you to write snippets of C code into your python for massive speed increases.
  • F2PY - For calling fast, compiled Fortran subroutines from Python (part of NumPy).
  • numba - Automatic generation of fast compiled C code from Python functions.
  • bottleneck - A collection of fast numpy array functions written in C.
  • Theano - Allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently.


Statistics

  • statsmodels - Provides classes and functions for the estimation of many different statistical models, as well as for conducting statistical tests and statistical data exploration.


Testing

Check that your code actually does what you think it will do!

  • pytest - The standard unit testing framework for python. Essential - if you're not unit-testing your calculations then you are merely hoping that they are actually doing what you think they are. pytest does a lot of magic behind the scenes to make it as simple as possible to use, with no boilerplate.
  • pytest-clarity - A plugin which improves the readability of pytest output.
  • hypothesis - Property-based testing for python. Normal tests check that your function behaves as expected for some specific input. Hypothesis tests check that your function behaves as expected for any input of some type, e.g. any string, or any numpy array. Basically magic, compatible with pytest, and the algorithms used in the implementation are very interesting.
  • cosmic-ray - Mutation testing in python. Checks that your test coverage is robust by randomly changing pieces of your code and checking that this change is actually caught by a test failing.
  • flaky - pytest plugin for automatically re-running inconsistent ("flaky") tests.
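
A sketch of what a pytest unit test looks like (the function under test is invented); pytest collects any function named `test_*` and reports plain `assert` failures with full context:

```python
# test_stats.py -- run with `pytest test_stats.py`
import math

def variance(xs):
    """Population variance of a list of numbers (the function under test)."""
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

def test_variance_of_constant_data_is_zero():
    assert variance([3.0, 3.0, 3.0]) == 0.0

def test_variance_matches_a_hand_computed_value():
    assert math.isclose(variance([1.0, 2.0, 3.0]), 2.0 / 3.0)
```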


Visualisation

There are currently many competing visualisation libraries in python. 3D support is somewhat lacking, though.

  • animatplot - A wrapper around matplotlib's FuncAnimation - makes it very easy to animate matplotlib plots.
  • mayavi - 3D scientific data visualization and plotting in Python.
  • cartopy - A library for cartographic projections and plots, with matplotlib support.
  • bokeh - Bokeh is an interactive visualization library that targets modern web browsers for presentation.
  • plotly - Plotly's Python graphing library makes interactive, publication-quality graphs online.
  • holoviews - Stop plotting your data - annotate your data and let it visualize itself.
  • ipyvolume - 3d plotting for Python in the Jupyter notebook.
  • vispy - Interactive scientific visualisation in python.
  • yt - Very powerful software suite for analysing and visualising volumetric data. Written by astrophysicists, but since applied to many other domains.


Workflow

Don't just write and run python scripts. Tools to make your workflow faster, clearer, and easier to come back to later.

  • ipython - Run python interactively, like MATLAB! Forms the backend of Jupyter notebooks.
  • jupyter notebooks - In-browser notebooks made up of cells which can contain markdown, images, or executable python code! Incredibly valuable for data exploration, presentation and recording work. The perfect format to email to a supervisor for feedback. Can be version-controlled with git, so also useful for reproducibility and backing up your work.
  • jupyterlab - A development environment in which you can write Jupyter notebooks. The spiritual successor to spyder, in that it is designed specifically for scientists.
  • papermill - A tool for parameterizing, executing, and analyzing multiple Jupyter Notebooks.

Beginner Recommendations

  • First, install python through anaconda, which will also give you the packages you're about to use.
  • Write your code in either pycharm (if you want a professional IDE), spyder or jupyterlab (if you're used to MATLAB's environment).
  • Become familiar with numpy, the fundamental numeric object in python, and matplotlib, the standard way to plot.
  • Next, wrap your data into clearer, higher-level objects with either Pandas or xarray (use xarray if your data has more than one dimension).
  • Before writing new analysis functions, check if someone has already solved your problem for you in scipy , or in one of python's domain-specific scientific software packages.
  • As soon as you start writing your own analysis functions, test they're correct with unit tests written with pytest.
  • Analyse your data interactively with ipython, and record your work in a Jupyter notebook.