Merged
@@ -324,7 +324,7 @@
"In `__init__` the suggested structure is as follows:\n",
"* Create and store the necessary dictionaries (this has been done for you)\n",
"* Define an LSTM layer that takes as params: an input size (the number of characters), a hidden layer size `n_hidden`, a number of layers `n_layers`, a dropout probability `drop_prob`, and a batch_first boolean (True, since we are batching)\n",
-    "* Define a dropout layer with `dropout_prob`\n",
+    "* Define a dropout layer with `drop_prob`\n",
"* Define a fully-connected layer with params: input size `n_hidden` and output size (the number of characters)\n",
"* Finally, initialize the weights (again, this has been given)\n",
"\n",
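The `__init__` structure described in the bullets above can be sketched as follows. This is a minimal sketch, not the notebook's solution; the class name `CharRNN`, the `tokens` argument, and the attribute names are assumptions:

```python
import torch.nn as nn

class CharRNN(nn.Module):
    # Hypothetical class/argument names; `tokens` is the set of characters.
    def __init__(self, tokens, n_hidden=256, n_layers=2, drop_prob=0.5):
        super().__init__()
        self.n_hidden = n_hidden
        self.n_layers = n_layers
        self.drop_prob = drop_prob

        # Create and store the necessary dictionaries
        self.chars = tokens
        self.int2char = dict(enumerate(self.chars))
        self.char2int = {ch: ii for ii, ch in self.int2char.items()}

        # LSTM layer: input size is the number of characters;
        # batch_first=True since we are batching
        self.lstm = nn.LSTM(len(self.chars), n_hidden, n_layers,
                            dropout=drop_prob, batch_first=True)

        # Dropout layer with drop_prob
        self.dropout = nn.Dropout(drop_prob)

        # Fully-connected layer: n_hidden in, number of characters out
        self.fc = nn.Linear(n_hidden, len(self.chars))
```

Note that `nn.LSTM`'s `dropout` argument applies between stacked LSTM layers, while the separate `nn.Dropout` layer is applied to the LSTM's output before the fully-connected layer.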
@@ -557,7 +557,7 @@
},
"outputs": [],
"source": [
-    "## TODO: set you model hyperparameters\n",
+    "## TODO: set your model hyperparameters\n",
"# define and print the net\n",
"n_hidden=\n",
"n_layers=\n",
@@ -383,7 +383,7 @@
"In `__init__` the suggested structure is as follows:\n",
"* Create and store the necessary dictionaries (this has been done for you)\n",
"* Define an LSTM layer that takes as params: an input size (the number of characters), a hidden layer size `n_hidden`, a number of layers `n_layers`, a dropout probability `drop_prob`, and a batch_first boolean (True, since we are batching)\n",
-    "* Define a dropout layer with `dropout_prob`\n",
+    "* Define a dropout layer with `drop_prob`\n",
"* Define a fully-connected layer with params: input size `n_hidden` and output size (the number of characters)\n",
"* Finally, initialize the weights (again, this has been given)\n",
"\n",