Hi there 👋

I recently finished my PhD (a DPhil, via the AIMS CDT) in Machine Learning at OATML at the University of Oxford. Here is my quick online CV 🤗 @blackhc · LinkedIn · Google Scholar

🎓 Education & 💼 Industry Experience

  1. DPhil Computer Science

    University of Oxford, supervised by Prof Yarin Gal, Oxford, UK, Oct 2018 -- Summer 2023
    Deep active learning and data subset selection using information theory and Bayesian neural networks.

  2. Research Engineer (Intern)

    Opal Camera, Remote, Oct 2022 -- Dec 2022
    Validation Pipeline for Gesture Control System.

  3. Resident Fellow

    Newspeak House, London, UK, Jan 2018 -- Jul 2018
    AI & Politics event series, science communication.

  4. Performance Research Engineer

    DeepMind, London, UK, Oct 2016 -- Aug 2017
    TensorFlow performance improvements (custom CUDA kernels) & profiling (e.g., for “Neural Episodic Control”); automated agent regression testing.

  5. Software Engineer

    Google, Zürich, CH, Jul 2013 -- Sep 2016
    App & testing infrastructure; latency optimization; front-end development (Dart/GWT).

  6. MSc Computer Science

    Technische Universität München, München, DE, Sep 2009 -- Oct 2012
    Thesis “Assisted Object Placement”.

  7. BSc Mathematics

    Technische Universität München, München, DE, Sep 2009 -- Mar 2012
    Thesis “Discrete Elastic Rods”.

  8. BSc Computer Science

    Technische Universität München, München, DE, Sep 2007 -- Sep 2009
    Thesis “Multi-Tile Terrain Rendering with OGL/Equalizer”.

🧑‍🔬 Research

📚 Publications

Conference Proceedings

[1] J. Mukhoti*, A. Kirsch*, J. van Amersfoort, P. H. Torr, and Y. Gal, "Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty," CVPR, 2023.

[2] F. Bickford Smith*, A. Kirsch*, S. Farquhar, Y. Gal, A. Foster, and T. Rainforth, "Prediction-Oriented Bayesian Active Learning," AISTATS, 2023.

[3] S. Mindermann*, J. M. Brauner*, M. T. Razzak*, A. Kirsch, et al., "Prioritized Training on Points that are Learnable, Worth Learning, and not yet Learnt," ICML, 2022.

[4] A. Jesson*, P. Tigas*, J. van Amersfoort, A. Kirsch, U. Shalit, and Y. Gal, "Causal-BALD: Deep Bayesian Active Learning of Outcomes to Infer Treatment-Effects from Observational Data," NeurIPS, 2021.

[5] A. Kirsch*, J. van Amersfoort*, and Y. Gal, "BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning," NeurIPS, 2019.

Journal Articles

[6] A. Kirsch, "Black-Box Batch Active Learning for Regression," TMLR, 2023.

[7] A. Kirsch, "Does ‘Deep Learning on a Data Diet’ Reproduce? Overall Yes, but GraNd at Initialization Does Not," TMLR, 2023.

[8] A. Kirsch*, S. Farquhar*, P. Atighehchian, A. Jesson, F. Branchaud-Charron, and Y. Gal, "Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning," TMLR, 2023.

[9] A. Kirsch and Y. Gal, "A Note on “Assessing Generalization of SGD via Disagreement”," TMLR, 2022.

[10] A. Kirsch and Y. Gal, "Unifying Approaches in Data Subset Selection via Fisher Information and Information-Theoretic Quantities," TMLR, 2022.

Workshop Papers

[11] D. Tran, J. Liu, M. W. Dusenberry, et al., "Plex: Towards Reliability using Pretrained Large Model Extensions," Principles of Distribution Shifts & First Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward, ICML 2022.

[12] A. Kirsch, J. Kossen, and Y. Gal, "Marginal and Joint Cross-Entropies & Predictives for Online Bayesian Inference, Active Learning, and Active Sampling," Updatable Machine Learning, ICML, 2022.

[13] A. Kirsch, J. Mukhoti, J. van Amersfoort, P. H. Torr, and Y. Gal, "On Pitfalls in OoD Detection: Entropy Considered Harmful," Uncertainty in Deep Learning, 2021.

[14] A. Kirsch, T. Rainforth, and Y. Gal, "Active Learning under Pool Set Distribution Shift and Noisy Data," SubSetML, 2021.

[15] A. Kirsch*, S. Farquhar*, and Y. Gal, "A Simple Baseline for Batch Active Learning with Stochastic Acquisition Functions," SubSetML, 2021.

[16] A. Kirsch and Y. Gal, "A Practical & Unified Notation for Information-Theoretic Quantities in ML," SubSetML, 2021.

[17] A. Kirsch, C. Lyle, and Y. Gal, "Scalable Training with Information Bottleneck Objectives," Uncertainty in Deep Learning.

[18] A. Kirsch, C. Lyle, and Y. Gal, "Learning CIFAR-10 with a Simple Entropy Estimator Using Information Bottleneck Objectives," Uncertainty in Deep Learning, 2020.

📝 Reviewing

NeurIPS 2019 (Top Reviewer), AAAI 2020, AAAI 2021, ICLR 2021, NeurIPS 2021 (Outstanding Reviewer), NeurIPS 2022, TMLR, CVPR 2023.

🎯 Interests & Skills

Active Learning, Subset Selection, Information Theory, Information Bottlenecks, Uncertainty Quantification, Python, PyTorch, Jax, C++, CUDA, TensorFlow.

Popular repositories

  1. tfpyth

     Putting TensorFlow back in PyTorch, back in TensorFlow (differentiable TensorFlow/PyTorch adapters).

     Python · 641 stars · 97 forks

  2. llm-strategy

     Directly connecting Python to LLMs via strongly-typed functions, dataclasses, interfaces & generic types.

     Python · 377 stars · 21 forks
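The pattern behind llm-strategy can be sketched as a decorator that turns a typed Python signature plus docstring into an LLM prompt and coerces the reply back to the annotated return type. The names below (`llm_function`, `stub_llm`) are illustrative, not the package's real API, and the stub stands in for an actual LLM backend:

```python
import inspect

def llm_function(backend):
    """Illustrative decorator: route a typed function's call to an LLM.

    `backend` is any callable prompt -> str; the real package wires in
    actual LLMs and richer return types (dataclasses, generics).
    """
    def decorate(fn):
        sig = inspect.signature(fn)

        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            # Build a prompt from the name, docstring, and bound arguments.
            prompt = (
                f"Function: {fn.__name__}\n"
                f"Task: {fn.__doc__}\n"
                f"Arguments: {dict(bound.arguments)}\n"
                "Answer with the result only."
            )
            raw = backend(prompt)
            # Coerce the LLM's string reply to the annotated return type.
            return sig.return_annotation(raw)

        return wrapper
    return decorate

# A stub backend standing in for a real LLM:
stub_llm = lambda prompt: "3"

@llm_function(stub_llm)
def count_words(text: str) -> int:
    """Count the number of words in the text."""

print(count_words("one two three"))  # -> 3 (as returned by the stub)
```

The point of the pattern is that the function body stays empty: the signature and docstring are the entire specification, and the type annotation guarantees a typed result rather than a raw string.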

  3. toma

     Helps you write algorithms in PyTorch that adapt to the available (CUDA) memory.

     Python · 359 stars · 9 forks
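The core idea of toma is sketched below in plain Python with a generic MemoryError: halve the batch size on an out-of-memory failure and retry until the computation fits. This is an illustrative stand-in, not toma's actual API; the real package decorates PyTorch code, catches CUDA out-of-memory exceptions, and caches the batch size that worked:

```python
def run_with_adaptive_batchsize(fn, initial_batchsize):
    """Call fn(batchsize), halving the batch size on memory errors.

    Illustrative stand-in for toma's behaviour: the real library catches
    CUDA OOM exceptions from PyTorch and memoizes the working batch size.
    """
    batchsize = initial_batchsize
    while True:
        try:
            return fn(batchsize)
        except MemoryError:
            if batchsize <= 1:
                raise  # nothing left to shrink
            batchsize //= 2  # halve and retry

# Usage: pretend anything above 16 items does not fit in memory.
def evaluate(batchsize):
    if batchsize > 16:
        raise MemoryError
    return batchsize

print(run_with_adaptive_batchsize(evaluate, 64))  # -> 16
```

This lets you state an optimistic batch size once and have the same script run on GPUs of different sizes without manual tuning.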

  4. BatchBALD

     Efficient and diverse batch acquisition for deep Bayesian active learning.

     Python · 223 stars · 52 forks
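The acquisition score underlying BatchBALD is BALD's mutual information between the model parameters and the label: the entropy of the averaged prediction minus the average entropy across posterior samples. A stdlib-only sketch for a single candidate point (BatchBALD proper extends this to joint entropies over a whole candidate batch; this is not the repository's API):

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def bald_score(probs_K_C):
    """BALD mutual information for one candidate point.

    probs_K_C: K class-probability vectors, one per posterior sample
    (e.g. MC-dropout forward passes). A high score means the samples are
    individually confident but disagree, i.e. high epistemic uncertainty.
    """
    K = len(probs_K_C)
    C = len(probs_K_C[0])
    mean_probs = [sum(p[c] for p in probs_K_C) / K for c in range(C)]
    # H[mean prediction] - mean H[prediction]
    return entropy(mean_probs) - sum(entropy(p) for p in probs_K_C) / K

# Samples that agree carry no information about the parameters...
print(round(bald_score([[0.9, 0.1], [0.9, 0.1]]), 6))  # -> 0.0
# ...while confident disagreement scores close to ln 2.
print(round(bald_score([[0.99, 0.01], [0.01, 0.99]]), 2))  # -> 0.64
```

Greedily picking the top-k points by this per-point score double-counts redundant points; BatchBALD's contribution is scoring the batch jointly so the selected points are diverse.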

  5. dart_repl

     Proof-of-concept REPL shell for Dart.

     Dart · 82 stars · 12 forks

  6. batchbald_redux

     Reusable BatchBALD implementation.

     Jupyter Notebook · 72 stars · 14 forks