Untraceable IndexOutOfRangeException. #10
@smoothdeveloper I still couldn't comprehend what is going on. I combined DiffSharp and Hype into one project, did what you suggested, and debugged. It is extremely difficult for me to find at which point we get an unexpected value. And apparently the library is written by a mathematician =) so the notation and naming make it a little harder for me to debug.
@zgrkpnr to debug, this is what I used: a .fsx file I've put in the Hype/docs/input folder (just so you get the paths right), and I've built DiffSharp in debug (the library is in its own folder).

```fsharp
#r "../../../../DiffSharp/DiffSharp/src/DiffSharp/bin/Debug/DiffSharp.dll"
#r "../../src/Hype/bin/Release/Hype.dll"

open DiffSharp.AD.Float32
open Hype

type Activation =
    | Sigm
    | Softmax
    | Linear
    | Tanh
    member this.funDM =
        match this with
        | Sigm -> DM.Sigmoid
        | Softmax -> DM.mapCols DV.SoftMax
        | Tanh -> DM.Tanh
        | Linear -> id
    member this.funDV =
        match this with
        | Sigm -> DV.Sigmoid
        | Softmax -> DV.SoftMax
        | Tanh -> DV.Tanh
        | Linear -> id

let inline ( *+ ) dv f = DV.Append(dv, toDV [|f|])

type AutoEncoder(i, h, a:Activation) =
    let removeBias (X:DM) = X.[0..X.Rows-2, *]
    let replaceBias (X:DM) = X.[0..X.Rows-2, *] |> DM.appendRow (DV.create X.Cols 1)
    let appendBiasDM (X:DM) = DM.appendRow (DV.create X.Cols 1) X
    /// Weights W:DM. Since the autoencoder has tied weights, both layers use the same weight matrix, one the transpose of the other.
    member val W = Rnd.NormalDM(h+1, i+1, D 0.f, D 0.1f) with get, set
    /// Flattened weights W':DV
    member this.W' with get() = DM.toDV this.W and set W' = this.W <- DV.toDM (h+1) W'
    /// Forward propagate the data
    member this.RunDM (X:DM) =
        let h = X |> appendBiasDM |> (*) this.W |> replaceBias
        (DM.Transpose this.W) * h |> removeBias
    /// Forward propagation when W' is provided by the optimization algorithm
    member this.Run (W':DV) (X:DM) =
        this.W' <- W'
        this.RunDM X
    /// Encode data and get the hidden units
    member this.Encode (X:DM) = X |> appendBiasDM |> (*) this.W |> replaceBias

// TEST
let ae = AutoEncoder(3, 2, Activation.Sigm)
let p = { Params.Default with Regularization = NoReg; Loss = Loss.Quadratic }
let X' = ((toDM [[1.f; 5.f; 2.f]; [8.f; 2.f; 2.f]; [1.f; 5.f; 2.f]; [8.f; 2.f; 2.f];
                 [1.1f; 5.2f; 2.f]; [8.1f; 2.1f; 2.f]; [0.9f; 4.9f; 2.f]; [7.9f; 1.9f; 2.f]])) / 10 |> DM.Transpose
let ds' = Dataset(X', X'.Copy())
let a, b, _, _ = Optimize.Train(ae.Run, ae.W', ds', p)
```

I evaluated all but the last line with "execute in interactive", opened …
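As an aside, the shape bookkeeping in the forward pass above can be sanity-checked outside of DiffSharp. Here is a minimal NumPy sketch of the same tied-weight pass (illustrative only; the names and helpers are mine, not the Hype/DiffSharp API, and activations are omitted):

```python
import numpy as np

i, h, n = 3, 2, 8          # visible units, hidden units, samples

rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.1, size=(h + 1, i + 1))   # tied weights, (h+1) x (i+1)
X = rng.random((i, n))                          # data, one sample per column

def append_bias(M):
    """Append a row of ones (bias inputs) at the bottom, like appendBiasDM."""
    return np.vstack([M, np.ones((1, M.shape[1]))])

def replace_bias(M):
    """Drop the last row and append a row of ones instead, like replaceBias."""
    return append_bias(M[:-1, :])

def remove_bias(M):
    """Drop the bias row, like removeBias."""
    return M[:-1, :]

# Tied-weight forward pass, mirroring RunDM:
H = replace_bias(W @ append_bias(X))   # hidden layer, (h+1) x n
Y = remove_bias(W.T @ H)               # reconstruction, i x n

assert X.shape == Y.shape == (i, n)    # input and output shapes agree
```

If the assertion holds, the dimensions flow through the pass consistently, so a shape mismatch is unlikely to be the source of the exception here.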
yes, but at the same time, if I had to implement those algorithms based on what I read in math papers (which I'd probably have a difficult time comprehending), the code would probably use the same kind of conventions :) I've noticed that in a few spots the library takes …
@smoothdeveloper my problem was not about debugging, actually. I debugged and checked the dimensions of all the matrices and vectors. Then I thought I might be missing some point along the way, so I wrote down all the expected dimensions on paper. (Yes, on paper =)) All the dimensions are as expected. I know this is probably my lack of understanding of the underlying implementation, so someone should reproduce the issue and find out whether the bug is in my code, in DiffSharp, or in Hype.
I was just pointing out the detailed steps because of "I combined DiffSharp and Hype into one project" in your comment (which I understood as you putting the code of both libraries into a custom project). Looking at your code, there are the removeBias / replaceBias and appendBiasDM functions, which use 2 and 1; is that correct? It looks like they could alter the matrix sizes.
@smoothdeveloper exactly. removeBias and appendBiasDM alter the sizes: one adds a row while the other removes it. However, they are intermediate operations. The …
You can use this line to verify that the input and output have the same dimension. The intermediate operations are also compatible with each other because …
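One more place worth double-checking in code like this is the flatten/unflatten round trip for the weights (the `W'` getter/setter), since the optimizer only ever sees the flat vector and rebuilds the matrix from it with a fixed row count. A hedged NumPy sketch of that round trip (illustrative only, not the DiffSharp API):

```python
import numpy as np

h, i = 2, 3
W = np.arange((h + 1) * (i + 1), dtype=float).reshape(h + 1, i + 1)

# Flatten row by row (analogous to DM.toDV) ...
W_flat = W.ravel()

# ... and rebuild with a fixed row count (analogous to DV.toDM (h+1)).
# If the vector length is not divisible by the row count, the rebuild
# cannot produce a well-formed matrix, which is one way to get
# out-of-range indexing downstream.
assert W_flat.size % (h + 1) == 0, "length must be divisible by the row count"
W_back = W_flat.reshape(h + 1, -1)

assert np.array_equal(W, W_back)   # lossless round trip
```

If the optimizer ever hands back a vector whose length no longer matches `(h+1) * (i+1)`, the reconstructed matrix would be malformed, so verifying this invariant is a cheap check.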
Here is my simple autoencoder code.
This code produces the following error.
I also tried to debug with the source code of Hype and DiffSharp but couldn't figure out where things went wrong.