---
title: "HyperModel subclass"
output:
  rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{HyperModel subclass}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---
```{r setup, include=FALSE}
knitr::opts_chunk$set(echo = TRUE, eval = FALSE)
```
The [reticulate library](https://rstudio.github.io/reticulate/) makes it possible to define a Python class directly from R via `reticulate::PyClass()`. This lets us use __a HyperModel subclass__ instead of a model-building function, which makes hypermodels easy to share and reuse.

A HyperModel subclass only needs to implement a ```build(self, hp)``` method, and, as with a model-building function, ```build``` must return a compiled model.
```{r}
MyHyperModel <- reticulate::PyClass(
  "HyperModel",
  inherit = kerastuneR::HyperModel_class(),
  list(
    `__init__` = function(self, num_classes) {
      self$num_classes <- num_classes
      NULL
    },
    build = function(self, hp) {
      model <- keras_model_sequential()
      model %>%
        layer_dense(units = hp$Int('units',
                                   min_value = 32L,
                                   max_value = 512L,
                                   step = 32L),
                    activation = 'relu') %>%
        layer_dense(as.integer(self$num_classes), activation = 'softmax') %>%
        compile(
          optimizer = tf$keras$optimizers$Adam(
            hp$Choice('learning_rate',
                      values = c(1e-2, 1e-3, 1e-4))),
          loss = 'categorical_crossentropy',
          metrics = 'accuracy')
      # build() must return the compiled model
      model
    }
  )
)
```
## Full code
```{r}
library(keras)
library(tensorflow)
library(kerastuneR)

# generate some data
x_data <- matrix(data = runif(500, 0, 1), nrow = 50, ncol = 5)
y_data <- ifelse(runif(50, 0, 1) > 0.6, 1L, 0L) %>% as.matrix()

x_data2 <- matrix(data = runif(500, 0, 1), nrow = 50, ncol = 5)
y_data2 <- ifelse(runif(50, 0, 1) > 0.6, 1L, 0L) %>% as.matrix()

# subclass
MyHyperModel <- reticulate::PyClass(
  "HyperModel",
  inherit = kerastuneR::HyperModel_class(),
  list(
    `__init__` = function(self, num_classes) {
      self$num_classes <- num_classes
      NULL
    },
    build = function(self, hp) {
      model <- keras_model_sequential()
      model %>%
        layer_dense(units = hp$Int('units',
                                   min_value = 32L,
                                   max_value = 512L,
                                   step = 32L),
                    activation = 'relu') %>%
        layer_dense(as.integer(self$num_classes), activation = 'softmax') %>%
        compile(
          optimizer = tf$keras$optimizers$Adam(
            hp$Choice('learning_rate',
                      values = c(1e-2, 1e-3, 1e-4))),
          loss = 'categorical_crossentropy',
          metrics = 'accuracy')
      # build() must return the compiled model
      model
    }
  )
)

# Random Search
hypermodel <- MyHyperModel(num_classes = 10)

tuner <- RandomSearch(
  hypermodel,
  objective = 'val_accuracy',
  max_trials = 10,
  directory = 'my_dir',
  project_name = 'helloworld')

# Run the search
tuner %>% fit_tuner(x_data, y_data,
                    epochs = 5,
                    validation_data = list(x_data2, y_data2))
```
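
Once the search finishes, the trial results and the winning model can be inspected. A minimal sketch, assuming the installed version of `kerastuneR` exports `results_summary()` and `get_best_models()` (wrappers around the corresponding Keras Tuner methods):

```{r}
# print a summary of the completed trials and their scores
tuner %>% results_summary(num_trials = 10)

# retrieve the single best model found during the search
best_model <- tuner %>% get_best_models(num_models = 1)
```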