{
"alias": "video/2731/intergrating-pylearn2-and-hyperopt-taking-deep-l",
"category": "SciPy 2014",
"copyright_text": "https://www.youtube.com/t/terms",
    "description": "Deep learning algorithms have recently garnered much attention for their\nsuccesses in solving very difficult industrial machine perception\nproblems. However, for many practical purposes, these algorithms are\nunwieldy due to the rapid proliferation of \"hyperparameters\" in their\nspecification -- architectural and optimization constants which\nordinarily must be specified a priori by the practitioner. There is a\ngrowing interest within the machine learning community, and acutely so\namongst deep learning researchers, in intelligently automating the\nselection of hyperparameters for machine learning algorithms through\nthe use of sequential model-based optimization techniques.\n[Hyperopt](http://hyperopt.github.io/hyperopt/) is a software package\ndesigned for this purpose, architected as a general framework for\nhyperparameter optimization algorithms with support for complicated,\nawkward hyperparameter spaces that, e.g., involve many hyperparameters\nthat are only meaningful in the context of certain values of other\nhyperparameters.\n\n[Pylearn2](http://deeplearning.net/software/pylearn2) is a framework for\nmachine learning developed by the LISA laboratory at Universit\u00e9 de\nMontr\u00e9al; it is a research and prototyping library aimed primarily at\nmachine learning researchers, with a focus on \"deep learning\"\nalgorithms. Despite being far from a stable release, it has had\nconsiderable impact and developed a very active user community outside\nof the laboratory that birthed it.\n\nThis talk will describe recent efforts in building a flexible,\nuser-friendly bridge between Pylearn2 and Hyperopt for the purpose of\noptimizing the hyperparameters of deep learning algorithms. Briefly, it\nwill outline the relevant problem domain and the two packages, the\ntechnical challenges we've met in adapting the two for use with one\nanother, and our solutions to them, in particular the development of a\nnovel common deferred evaluation/call-graph description language based\non ``functools.partial``, which we hope to make available in the near\nfuture as a standalone package.\n",
"duration": null,
"id": 2731,
"language": "eng",
"quality_notes": "",
"recorded": "2014-07-09",
"related_urls": [
"http://deeplearning.net/software/pylearn2",
"http://hyperopt.github.io/hyperopt/"
],
"slug": "intergrating-pylearn2-and-hyperopt-taking-deep-l",
"speakers": [
"David Warde-Farley"
],
"summary": "This talk/poster will present recent work in integrating\nHyperopt, a package for the optimization of the hyperparameters of\nmachine learning algorithms, with Pylearn2, a machine learning research\nand prototyping framework focused on \"deep learning\" algorithms, as\nwell as the technical challenges we faced and how we addressed them.\n",
"tags": [
"machine learning"
],
"thumbnail_url": "https://i1.ytimg.com/vi/t50CGzbtcrY/hqdefault.jpg",
"title": "Integrating Pylearn2 and Hyperopt: Taking Deep Learning Further with Hyperparameter Optimization",
"videos": [
{
"length": 0,
"type": "youtube",
"url": "https://www.youtube.com/watch?v=t50CGzbtcrY"
}
]
}