# Add Time Series Block #239
```diff
@@ -1,3 +1,9 @@
+function tabular2rnn(X::AbstractArray{Float32, 3})
+    X = permutedims(X, (2, 1, 3))
+    X = [X[t, :, :] for t ∈ 1:size(X, 1)]
+    return X
+end
+
 """
     RNNModel(recbackbone, outsize, recout[; kwargs...])
```
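To illustrate the transform under discussion: assuming the input layout is `(features, timesteps, batchsize)`, `tabular2rnn` reorders the dimensions and slices along time, yielding the vector-of-matrices format that Flux's recurrent layers consume one step at a time. A minimal self-contained sketch (the array sizes below are illustrative, not from the PR):

```julia
# Dense (features, timesteps, batchsize) array -> vector of per-timestep slices.
function tabular2rnn(X::AbstractArray{Float32, 3})
    X = permutedims(X, (2, 1, 3))              # -> (timesteps, features, batchsize)
    return [X[t, :, :] for t in 1:size(X, 1)]  # one (features, batchsize) matrix per step
end

X = rand(Float32, 4, 10, 8)  # 4 features, 10 timesteps, batch of 8
seq = tabular2rnn(X)
length(seq)    # 10 timesteps
size(seq[1])   # (4, 8): features × batch
```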
```diff
@@ -24,5 +30,5 @@ function RNNModel(recbackbone,
                  dropout_rate = 0.0)

     dropout = dropout_rate == 0 ? identity : Dropout(dropout_rate)
-    Chain(recbackbone, dropout, finalclassifier)
+    Chain(tabular2rnn, recbackbone, dropout, finalclassifier)
 end
```

**Reviewer:** I don't think we should be doing this dense-to-slices transform in the model itself. If you just need something RNN-friendly, relying on the built-in support for dense inputs should be enough.

**Reviewer:** An alternative for now is to make the data pipeline spit out the vector of arrays. We can then revisit if/when you add models like CNNs, which expect dense inputs.

**Author:** I think the "ideal" place to do this transform would be inside the training loop. I am not sure how exactly to do that for FastAI.jl. Is there a way? Since the second phase would involve using some CNNs, using the data pipeline to spit out a vector of arrays would not work.
**Reviewer:** I think "tabular" is a bit of a misnomer here, but naming is not a priority.
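For reference, the pieces in the diff compose as sketched below. To keep the example runnable without Flux, `Chain` is replaced by a plain function-composition stand-in, and the backbone and classifier are toy placeholders (the names mirror the diff, but their bodies are hypothetical):

```julia
# Function-composition stand-in for Flux's Chain.
chain(layers...) = x -> foldl((acc, f) -> f(acc), layers; init = x)

# Dense-to-slices transform from the diff.
function tabular2rnn(X::AbstractArray{Float32, 3})
    X = permutedims(X, (2, 1, 3))
    return [X[t, :, :] for t in 1:size(X, 1)]
end

# Toy stand-ins: a "backbone" that keeps the last timestep's output,
# and a classifier that collapses the feature dimension.
recbackbone(xs) = xs[end]
finalclassifier(x) = sum(x; dims = 1)

dropout_rate = 0.0
# Real code: dropout_rate == 0 ? identity : Dropout(dropout_rate)
dropout = identity

model = chain(tabular2rnn, recbackbone, dropout, finalclassifier)
y = model(rand(Float32, 4, 10, 8))
size(y)  # (1, 8): one score per batch element
```

This also shows why the placement question matters: with `tabular2rnn` first in the chain, the model itself expects a dense 3-D array, whereas moving the transform into the data pipeline would make the model expect the vector-of-slices format instead.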