ConcatTable Identity and CAddTable #55
Conversation
output = if (inplace) {
  input.get[Tensor[T]](1).get
} else {
  val input1 = input.get[Tensor[T]](1).get
You can use Table.apply here; the same applies in the other places like this.
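A sketch of the suggestion, assuming the project's Table defines an apply method that unwraps the Option returned by get:

output = if (inplace) {
  input[Tensor[T]](1)                // was: input.get[Tensor[T]](1).get
} else {
  val input1 = input[Tensor[T]](1)   // was: input.get[Tensor[T]](1).get
  // ... rest of the branch unchanged
}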
  i += 1
}

while (i <= gradInput.length()) {
What's this line for? Shouldn't gradInput.length equal input.length?
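If the trailing loop is meant to drop stale entries left over from an earlier call whose input table was longer, a comment would make that clear. A minimal sketch of that assumed intent (the remove call is an assumption about this Table API):

// Drop stale tail entries from a previous call with a longer input,
// so gradInput.length() ends up equal to input.length().
while (gradInput.length() > input.length()) {
  gradInput.remove(gradInput.length())
}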
class CAddTable[@specialized(Float, Double) T: ClassTag](val inplace: Boolean = false)(
  implicit ev: TensorNumeric[T]) extends Module[Table, Tensor[T], T] {

  gradInput = T()
Make it null instead of allocating an empty table here?
class ConcatTable[T: ClassTag](implicit ev: TensorNumeric[T])
  extends Container[Activities, Activities, T] {

  output = T()
Same here: make it null?
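A sketch of the pattern both comments are asking for: start the field at null and allocate only on first use, instead of eagerly building an empty table in the constructor:

output = null  // instead of output = T()

// later, at the first write:
if (output == null) {
  output = T()
}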
var i = 0
while (i < modules.length) {
  val currentOutput = modules(i).updateOutput(input)
  if (!output.toTable().contains(i + 1)) {
Put output.toTable() into a local variable instead of calling it on every iteration?
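A sketch of the hoisted call, so toTable() runs once rather than inside the loop:

val outputTable = output.toTable()
var i = 0
while (i < modules.length) {
  val currentOutput = modules(i).updateOutput(input)
  if (!outputTable.contains(i + 1)) {
    // ... store currentOutput as before
  }
  i += 1
}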
} else {
  var i = 1
  while (i <= out.toTable().length()) {
    addTable(out.toTable().get[Activities](i).get, in.toTable().get[Activities](i).get)
Use apply here too, instead of get(...).get.
if (in.isInstanceOf[Tensor[T]]) {
  in.toTensor[T]().clone()
} else {
  val out = T()
Use the Table clone method instead of building a new table by hand.
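A sketch of the suggestion, assuming Table implements a clone method as the comment implies:

if (in.isInstanceOf[Tensor[T]]) {
  in.toTensor[T]().clone()
} else {
  in.toTable().clone()  // instead of building a fresh T() and copying element by element
}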
  }
}

def backward(method: String, input: Activities, gradOutput: Activities,
Can we avoid overriding this?
It's not an override, but that makes it even more confusing.
    scale: Double = 1.0): Activities = {

  val isTable = input.isInstanceOf[Table]
  val wasTable = gradInput.isInstanceOf[Table]
Rename the variables to isInputTable/isGradInputTable. The current names are silly!
}
var i = 0
while (i < modules.length) {
  method match {
Don't dispatch on a method-name string like this in Java/Scala!
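One way to resolve both threads (the string-typed backward and the match inside the loop) is to split the two code paths into separate private methods. A hedged sketch with hypothetical names; the per-module gradOutput selection is elided:

// Hypothetical refactor: one method per path, no string dispatch.
private def updateGradInputImpl(input: Activities, gradOutput: Activities): Activities = {
  var i = 0
  while (i < modules.length) {
    modules(i).updateGradInput(input, gradOutput)  // pick this module's slice as before
    i += 1
  }
  gradInput
}

private def accGradParametersImpl(input: Activities, gradOutput: Activities,
    scale: Double = 1.0): Unit = {
  var i = 0
  while (i < modules.length) {
    modules(i).accGradParameters(input, gradOutput, scale)
    i += 1
  }
}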
7a352b4 to 2cdbabf (Compare)
* feat: mkl-dnn initialize
* fix: structure of building
* fix: public final static
* fix: delete the dependencies of environments
* fix: skip tests
* add update dnn wrappers
* fix: dynamic load iomp5
* feat linear supports and some fix
* add more wrapper
* add lrn api
* fix: add bn and softmax
* fix: some fixes
* fix: mkl-dnn build
* feat: add get format api
* fix: add getSize
* feat: aligned memory
* add conv fuse relu api
* fix: add aligned storage
* add concat api
* fix: mkl envs for lib mkldnn
* fix: add mkl add method with 2 ptrs
* fix: update to Release
* fix: batch norm infer mode
* fix: update 0.5.0 -> 0.6.0
* add free (intel-analytics#5)
* feat: affinity for java thread
* fix: update core branch
* fix: delete the memset constant value for debug, and add affinity
* feat: add mkl-dnn fusion
* fix: memory format enum consistent with dnn
* feat: add auto format
* refactor: delete the MemoryFormat in MklDnn
* Memory should load MKLDnn (intel-analytics#6)
* refactor: move enums to seprate classes (intel-analytics#7)
* feat: add GetShape and GetFormat api
* fix: delete printf
* fix a bug
* add sum
* refactor: change name
* refactor: change submodule infos
* fix: set block time by default. A property to control to disable it
eltwise layer (sum) in caffe