This repository has been archived by the owner on Nov 17, 2023. It is now read-only.
[R] Progress Issue Tracking on R-package Documentation #242
tqchen changed the title from "[R] Progress Issue Tracking on R-package" to "[R] Progress Issue Tracking on R-package Documentation" on Oct 9, 2015.
Please use the following script to list the functions in a specific environment:
```r
> where <- c(paste("package:", "mxnet", sep = ""))
> lapply(where, find.funs)
[[1]]
 [1] "arguments"                      "as.array"
 [3] "init.ndarray.methods"           "init.symbol.methods"
 [5] "is.MXNDArray"                   "is.MXSymbol"
 [7] "mx.apply"                       "mx.cpu"
 [9] "mx.gpu"                         "mx.nd.array"
[11] "mx.nd.clip"                     "mx.nd.dot"
[13] "mx.nd.internal.copyto"          "mx.nd.internal.div"
[15] "mx.nd.internal.div.scalar"      "mx.nd.internal.load"
[17] "mx.nd.internal.minus"           "mx.nd.internal.minus.scalar"
[19] "mx.nd.internal.mul"             "mx.nd.internal.mul.scalar"
[21] "mx.nd.internal.plus"            "mx.nd.internal.plus.scalar"
[23] "mx.nd.internal.random.gaussian" "mx.nd.internal.random.uniform"
[25] "mx.nd.internal.rdiv.scalar"     "mx.nd.internal.rminus.scalar"
[27] "mx.nd.internal.save"            "mx.nd.internal.set.value"
[29] "mx.nd.load"                     "mx.nd.save"
[31] "mx.symbol.Activation"           "mx.symbol.BatchNorm"
[33] "mx.symbol.Concat"               "mx.symbol.Convolution"
[35] "mx.symbol.Dropout"              "mx.symbol.ElementWiseSum"
[37] "mx.symbol.Flatten"              "mx.symbol.FullyConnected"
[39] "mx.symbol.Group"                "mx.symbol.LeakyReLU"
[41] "mx.symbol.load"                 "mx.symbol.load.json"
[43] "mx.symbol.LRN"                  "mx.symbol.Pooling"
[45] "mx.symbol.Reshape"              "mx.symbol.save"
[47] "mx.symbol.SliceChannel"         "mx.symbol.Softmax"
[49] "mx.symbol.Variable"             "mx.varg.symbol.Activation"
[51] "mx.varg.symbol.BatchNorm"       "mx.varg.symbol.Concat"
[53] "mx.varg.symbol.Convolution"     "mx.varg.symbol.Dropout"
[55] "mx.varg.symbol.ElementWiseSum"  "mx.varg.symbol.Flatten"
[57] "mx.varg.symbol.FullyConnected"  "mx.varg.symbol.Group"
[59] "mx.varg.symbol.internal.Div"    "mx.varg.symbol.internal.Minus"
[61] "mx.varg.symbol.internal.Mul"    "mx.varg.symbol.internal.Plus"
[63] "mx.varg.symbol.LeakyReLU"       "mx.varg.symbol.LRN"
[65] "mx.varg.symbol.Pooling"         "mx.varg.symbol.Reshape"
[67] "mx.varg.symbol.SliceChannel"    "mx.varg.symbol.Softmax"
[69] "outputs"
```
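`find.funs` is not a base-R function, so for anyone reproducing this listing, a minimal equivalent (an assumption about its behavior, not the actual helper used above) can be built from `ls()` and `is.function()`:

```r
# Hypothetical stand-in for find.funs: list the names of all function
# objects visible in a search-path environment such as "package:mxnet".
find.funs <- function(where) {
  env <- as.environment(where)   # resolve "package:..." to an environment
  nms <- ls(env)                 # visible (non-hidden) bindings only
  sort(nms[vapply(nms, function(n) is.function(get(n, envir = env)),
                  logical(1))])
}
```

Applied to `"package:stats"`, for example, this returns function names such as `lm` while skipping non-function exports.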
```r
> lsall(where)
 [1] "arguments"                      "as.array"
 [3] ".__C__Rcpp_MXNDArray"           ".__C__Rcpp_MXSymbol"
 [5] "init.ndarray.methods"           "init.symbol.methods"
 [7] "is.MXNDArray"                   "is.MXSymbol"
 [9] "mx.apply"                       "mx.cpu"
[11] "mx.gpu"                         "mx.nd.array"
[13] "MXNDArray"                      "mx.nd.clip"
[15] "mx.nd.dot"                      "mx.nd.internal.copyto"
[17] "mx.nd.internal.div"             "mx.nd.internal.div.scalar"
[19] "mx.nd.internal.load"            "mx.nd.internal.minus"
[21] "mx.nd.internal.minus.scalar"    "mx.nd.internal.mul"
[23] "mx.nd.internal.mul.scalar"      "mx.nd.internal.plus"
[25] "mx.nd.internal.plus.scalar"     "mx.nd.internal.random.gaussian"
[27] "mx.nd.internal.random.uniform"  "mx.nd.internal.rdiv.scalar"
[29] "mx.nd.internal.rminus.scalar"   "mx.nd.internal.save"
[31] "mx.nd.internal.set.value"       "mx.nd.load"
[33] "mx.nd.save"                     "MXSymbol"
[35] "mx.symbol.Activation"           "mx.symbol.BatchNorm"
[37] "mx.symbol.Concat"               "mx.symbol.Convolution"
[39] "mx.symbol.Dropout"              "mx.symbol.ElementWiseSum"
[41] "mx.symbol.Flatten"              "mx.symbol.FullyConnected"
[43] "mx.symbol.Group"                "mx.symbol.LeakyReLU"
[45] "mx.symbol.load"                 "mx.symbol.load.json"
[47] "mx.symbol.LRN"                  "mx.symbol.Pooling"
[49] "mx.symbol.Reshape"              "mx.symbol.save"
[51] "mx.symbol.SliceChannel"         "mx.symbol.Softmax"
[53] "mx.symbol.Variable"             "mx.varg.symbol.Activation"
[55] "mx.varg.symbol.BatchNorm"       "mx.varg.symbol.Concat"
[57] "mx.varg.symbol.Convolution"     "mx.varg.symbol.Dropout"
[59] "mx.varg.symbol.ElementWiseSum"  "mx.varg.symbol.Flatten"
[61] "mx.varg.symbol.FullyConnected"  "mx.varg.symbol.Group"
[63] "mx.varg.symbol.internal.Div"    "mx.varg.symbol.internal.Minus"
[65] "mx.varg.symbol.internal.Mul"    "mx.varg.symbol.internal.Plus"
[67] "mx.varg.symbol.LeakyReLU"       "mx.varg.symbol.LRN"
[69] "mx.varg.symbol.Pooling"         "mx.varg.symbol.Reshape"
[71] "mx.varg.symbol.SliceChannel"    "mx.varg.symbol.Softmax"
[73] "outputs"                        ".__T__as.array:base"
[75] ".__T__-:base"                   ".__T__/:base"
[77] ".__T__[<-:base"                 ".__T__[:base"
[79] ".__T__[[<-:base"                ".__T__$<-:base"
[81] ".__T__$:base"                   ".__T__*:base"
[83] ".__T__+:base"
```
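The difference from the first listing is that `lsall` also returns hidden bindings, including the S4 metadata objects the methods package installs (`.__C__*` class representations and `.__T__*` generic method tables). Assuming `lsall` simply lists every name including hidden ones (a sketch, not the actual helper), it could look like:

```r
# Hypothetical lsall: list every binding, hidden names included, for the
# given search-path positions (e.g. "package:mxnet"). Hidden names such
# as .__C__* and .__T__* are S4 class/method metadata.
lsall <- function(where) {
  sort(unlist(lapply(where,
                     function(w) ls(as.environment(w), all.names = TRUE))))
}
```

With `all.names = TRUE`, the result is always a superset of the plain `ls()` listing for the same environment.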
stefanhenneking pushed a commit to stefanhenneking/mxnet that referenced this issue on Jun 30, 2017:
* half2 class implemented
* Added kPack
* Fixed operator names in comments
* Removed kFloat16x2 type; added half2(int) constructor
* Removed dead code
* Fixed lint error; removed __device__ definition from host compiles
* Changed kPack to kLanes
eric-haibin-lin pushed a commit to eric-haibin-lin/mxnet that referenced this issue on Dec 2, 2017.
iblislin pushed a commit to iblislin/incubator-mxnet that referenced this issue on Mar 18, 2018:
Previously, when providing a KVStore object to mx.fit, the method would crash with an UndefVarError. This is now fixed by moving the definition of update_on_kvstore from _create_kvstore to fit.
The varg functions will be generated into mxnet_generated.R, and the docstrings of the non-internal functions are written to the comments.