
A Functional Reboot for Deep Learning

Invited talk for Summer BOB 2019 in Berlin.

You can find the slides (PDF) in my talks folder. A video (50 mins) is available on YouTube.


In this talk, I want to begin a conversation about what the essence of deep learning is and how we can best support that essence in the form of a programming interface or language. I'll give you my own impressions, and I hope to provoke an ongoing conversation. Despite the phenomenal success of deep learning, it's my sense that most of the choices made in the theory and practice of deep learning are nonessential and even harmful (unnecessarily complex and limited). I'll suggest that a very small addition to a modern typed functional programming language such as Haskell yields an ideal basis for deep learning that is much simpler, more general, and more rigorous than currently popular approaches.
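To give a flavor of what "a very small addition" can mean, here is a hedged sketch (not the construction from the talk, which is phrased categorically; this is the simpler dual-number trick) of differentiation as an ordinary Haskell function, so that any polymorphic numeric function can be differentiated without a separate graph-building framework:

```haskell
-- Illustrative only: forward-mode automatic differentiation via dual numbers.
-- A value `D x x'` carries a number together with its derivative.
data D = D Double Double deriving Show

instance Num D where
  D x x' + D y y' = D (x + y) (x' + y')
  D x x' - D y y' = D (x - y) (x' - y')
  D x x' * D y y' = D (x * y) (x' * y + x * y')  -- product rule
  negate (D x x') = D (negate x) (negate x')
  abs    (D x x') = D (abs x) (x' * signum x)
  signum (D x _ ) = D (signum x) 0
  fromInteger n   = D (fromInteger n) 0          -- constants have derivative 0

-- Derivative of a numeric function at a point.
deriv :: (D -> D) -> Double -> Double
deriv f x = let D _ x' = f (D x 1) in x'
```

With this in scope, `deriv (\x -> x*x + 3*x) 2` evaluates to `7.0`, the value of the derivative `2x + 3` at `x = 2`. The point of the sketch is that differentiation composes with ordinary functional programming; nothing about it requires the graph-and-session machinery of mainstream deep-learning frameworks.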
