
ND4J: INDArray.toString() - scalars (rank 1+) should have brackets to avoid ambiguity #8382

Closed
kris1710 opened this issue Nov 12, 2019 · 2 comments · Fixed by KonduitAI/deeplearning4j#43

@kris1710 kris1710 commented Nov 12, 2019

Hi, I am trying to implement a factorization machine in DL4J (this is with reference to the issue). I am concatenating the inputs along dimension 1 and then using a tensor-sum layer to add them up.

```scala
import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLambdaLayer
import org.nd4j.autodiff.samediff.{SDVariable, SameDiff}

class tensors_sum extends SameDiffLambdaLayer {
  override def defineLayer(sd: SameDiff, x: SDVariable): SDVariable = sd.sum(x, false, 1)
}

class tensors_square extends SameDiffLambdaLayer {
  override def defineLayer(sd: SameDiff, x: SDVariable): SDVariable = sd.math.square(x)
}

class tensors_by_2 extends SameDiffLambdaLayer {
  override def defineLayer(sd: SameDiff, x: SDVariable): SDVariable = x.mul(0.5)
}
```

The configuration code:

```scala
val conf = new NeuralNetConfiguration.Builder()
  .updater(new Sgd(0.01))
  .graphBuilder()
  .addInputs("input1", "input2", "input3") // can use any label for this
  .addLayer("e_1", new EmbeddingLayer.Builder().nIn(10).nOut(5).build(), "input1")
  .addLayer("e_2", new EmbeddingLayer.Builder().nIn(10).nOut(5).build(), "input2")
  .addLayer("d_1", new DenseLayer.Builder().nIn(1).nOut(5).build(), "input3")
  .addVertex("e_1_reshape", new ReshapeVertex(-1, 1, 5), "e_1")
  .addVertex("e_2_reshape", new ReshapeVertex(-1, 1, 5), "e_2")
  .addVertex("d_1_reshape", new ReshapeVertex(-1, 1, 5), "d_1")
  .addVertex("stacking_1", new MergeVertex(), "e_1_reshape", "e_2_reshape", "d_1_reshape")
  .addLayer("a_plus_b", new tensors_sum, "stacking_1")
  .addLayer("a_plus_b_square", new tensors_square, "a_plus_b")
  .addLayer("a_square_b_square", new tensors_square, "stacking_1")
  .addLayer("a_sq_plus_b_sq", new tensors_sum, "a_square_b_square")
  .addVertex("2ab", new ElementWiseVertex(Op.Subtract), "a_plus_b_square", "a_sq_plus_b_sq")
  .addLayer("ab", new tensors_by_2, "2ab")
  .addLayer("ab_sum", new tensors_sum, "ab")
  .addVertex("2d_out", new ReshapeVertex(-1, 1), "ab_sum")
  .setOutputs("2d_out")
  .build()
```

However, when I run:

```scala
val x = Nd4j.ones(1, 1)
println(net.outputSingle(x, x, x))
```

I get the output as 3.8056, which looks like a scalar; however, the output should be [[3.8056]], a (1,1) tensor. Can anyone please help? Additionally, I wrote my code on the understanding that the reshape vertex takes the full dimensions, i.e. (batchSize, dim1, dim2), unlike Keras. So if I have to reshape an output of the embedding layer from (batchSize, 5), I have to write something like ReshapeVertex(-1, 1, 5) (the output shape would then be (batchSize, 1, 5)).
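For reference, the second-order term this graph computes is the standard factorization-machine pairwise-interaction identity: the sum over all pairs x_i * x_j equals 0.5 * ((Σ x_i)² − Σ x_i²), which is what the a_plus_b_square / a_sq_plus_b_sq / Subtract / multiply-by-0.5 chain implements. A minimal plain-Java sketch of the identity (class and method names here are illustrative, not part of DL4J):

```java
public class FmInteraction {
    // Factorization-machine identity:
    // sum_{i<j} x_i * x_j = 0.5 * ((sum_i x_i)^2 - sum_i x_i^2)
    // This mirrors the graph above: square-of-sum minus sum-of-squares,
    // subtracted via the "2ab" vertex, then halved by the "ab" lambda layer.
    static double pairwiseSum(double[] x) {
        double sum = 0.0, sumSq = 0.0;
        for (double v : x) { sum += v; sumSq += v * v; }
        return 0.5 * (sum * sum - sumSq);
    }

    // Naive O(n^2) reference implementation for comparison.
    static double pairwiseSumNaive(double[] x) {
        double total = 0.0;
        for (int i = 0; i < x.length; i++)
            for (int j = i + 1; j < x.length; j++)
                total += x[i] * x[j];
        return total;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0};
        System.out.println(pairwiseSum(x));      // 11.0
        System.out.println(pairwiseSumNaive(x)); // 11.0
    }
}
```

Both methods agree on every input; the closed form is the one worth computing in a graph because it avoids the quadratic pairwise loop.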

@AlexDBlack AlexDBlack self-assigned this Nov 12, 2019
@AlexDBlack AlexDBlack commented Nov 13, 2019

So, this looks like a simple formatting issue in INDArray.toString().
Using your configuration and input, I get the following:

```java
System.out.println(out);
System.out.println(out.shapeInfoToString());
```

Output:

```
4.0139
Rank: 2, DataType: FLOAT, Offset: 0, Order: c, Shape: [1,1],  Stride: [1,1]
```

In other words, it has the shape you expect.

Full code: https://gist.github.com/AlexDBlack/68403bf29ea4c4b1e91651af9ef68f6f
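To illustrate the ambiguity: a (1,1) array printed without brackets as 4.0139 is indistinguishable from a rank-0 scalar, which is why the issue title asks for brackets. A minimal plain-Java sketch of bracketed formatting (this is an illustration, not ND4J's actual formatter):

```java
public class BracketDemo {
    // Formats a 2-D array with one pair of brackets per dimension, so a
    // (1,1) array prints as "[[4.0139]]" rather than the ambiguous "4.0139".
    static String format(double[][] a) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < a.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append("[");
            for (int j = 0; j < a[i].length; j++) {
                if (j > 0) sb.append(", ");
                sb.append(a[i][j]);
            }
            sb.append("]");
        }
        return sb.append("]").toString();
    }

    public static void main(String[] args) {
        System.out.println(format(new double[][]{{4.0139}})); // [[4.0139]]
        System.out.println(format(new double[][]{{1.0, 2.0}, {3.0, 4.0}}));
    }
}
```

With this convention, the bracket nesting depth alone tells you the rank of what was printed.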

> Additionally I wrote my code on the idea that reshape layer takes in the full dimensions ie batchsize,dim1,dim2 unlike keras.

Right, DL4J always has a leading batch dimension everywhere.
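As a sketch of that semantics (plain Java arrays, not ND4J's actual ReshapeVertex implementation): the target shape is given in full, with -1 standing for the inferred batch size, unlike Keras where the batch dimension is implicit and omitted.

```java
public class ReshapeDemo {
    // Mimics ReshapeVertex(-1, 1, 5): the -1 is inferred as the batch size,
    // and the remaining dimensions (d1, d2) are stated in full.
    static double[][][] reshapeTo3d(double[][] in, int d1, int d2) {
        int batch = in.length; // -1 resolves to the leading batch dimension
        double[][][] out = new double[batch][d1][d2];
        for (int b = 0; b < batch; b++) {
            int k = 0;
            for (int i = 0; i < d1; i++)
                for (int j = 0; j < d2; j++)
                    out[b][i][j] = in[b][k++]; // row-major (c-order) copy
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] embeddingOut = new double[3][5]; // shape (batchSize=3, 5)
        double[][][] reshaped = reshapeTo3d(embeddingOut, 1, 5);
        // prints 3 1 5, i.e. shape (batchSize, 1, 5)
        System.out.println(reshaped.length + " " + reshaped[0].length + " "
                + reshaped[0][0].length);
    }
}
```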

@AlexDBlack AlexDBlack changed the title shape issue, Unable to get the correct shape of the output vector ND4J: INDArray.toString() - scalars (rank 1+) should have brackets to avoid ambiguity Nov 13, 2019
@AlexDBlack AlexDBlack added this to the 1.0.0-beta6 milestone Nov 13, 2019
@kris1710 kris1710 commented Nov 13, 2019

Thanks Alex :)

AlexDBlack added a commit to KonduitAI/deeplearning4j that referenced this issue Nov 13, 2019
Signed-off-by: AlexDBlack <blacka101@gmail.com>
AlexDBlack added a commit to KonduitAI/deeplearning4j that referenced this issue Nov 14, 2019
AlexDBlack added a commit to KonduitAI/deeplearning4j that referenced this issue Nov 14, 2019

* eclipse#8172 Enable DL4J MKLDNN batch norm backward pass
* eclipse#8382 INDArray.toString() rank 1 brackets / ambiguity fix
* eclipse#8308 Fix handful of broken links (inc. some in errors)
* Unused dependencies, round 1
* Unused dependencies, round 2
* Unused dependencies, round 3
* Small fix
* Uniform distribution TF import fix