info.json
{
"abstract": "Bayesian inference problems require sampling or approximating high-dimensional probability distributions. The focus of this paper is on the recently introduced Stein variational gradient descent methodology, a class of algorithms that rely on iterated steepest descent steps with respect to a reproducing kernel Hilbert space norm. This construction leads to interacting particle systems, the mean field limit of which is a gradient flow on the space of probability distributions equipped with a certain geometrical structure. We leverage this viewpoint to shed some light on the convergence properties of the algorithm, in particular addressing the problem of choosing a suitable positive definite kernel function. Our analysis leads us to considering certain nondifferentiable kernels with adjusted tails. We demonstrate significant performance gains of these in various numerical experiments.",
"authors": [
"Andrew Duncan",
"Nikolas N\u00fcsken",
"Lukasz Szpruch"
],
"emails": [
"a.duncan@imperial.ac.uk",
"nikolas.nusken@kcl.ac.uk",
"l.szpruch@ed.ac.uk"
],
"id": "20-602",
"issue": 56,
"pages": [
1,
39
],
"title": "On the geometry of Stein variational gradient descent",
"volume": 24,
"year": 2023
}