Should contain information such as:

- What should be a `Transformer` vs. `Estimator` vs. `LabelEstimator` vs. `FunctionNode` vs. `Evaluator` vs. `Loader` vs. a util method
- What packages to put things in
- (Until we figure out something better) ensure that `Transformer` RDD method implementations preserve each item and the partitioning, i.e. are functionally equivalent to `rdd.map(x => apply(x))`
- Using `MatrixUtils.rowsToMatrix` instead of `DenseMatrix(x:_*)`
- When to use `Vector` vs. `DenseVector`, and when to use `Matrix` vs. `DenseMatrix`
- (Eventually) when and how to use `NumericTransformer`
- etc.
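The per-item contract on `Transformer` RDD methods could look roughly like the sketch below. Names here (`Transformer`, `Scaler`) are illustrative assumptions, not necessarily the codebase's actual signatures; the point is that any batch-optimized override must stay functionally equivalent to mapping `apply` over each element, without reordering, dropping, or repartitioning:

```scala
// Hypothetical sketch, not the actual KeystoneML API.
import org.apache.spark.rdd.RDD

abstract class Transformer[A, B] extends Serializable {
  def apply(in: A): B

  // Default bulk implementation: a per-item map, which preserves
  // both the set of items and the partitioning.
  def apply(in: RDD[A]): RDD[B] = in.map(apply)
}

// A batch-optimized override is fine as long as it stays equivalent:
// mapPartitions with no shuffle keeps item order and partitioning.
class Scaler(factor: Double) extends Transformer[Double, Double] {
  override def apply(in: Double): Double = in * factor

  override def apply(in: RDD[Double]): RDD[Double] =
    in.mapPartitions(_.map(apply)) // same result as in.map(apply)
}
```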
Also, we should decide and be consistent about whether nodes:

- are normal classes
- are case classes
- are normal classes w/ companion object constructor
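For reference, the three styles under discussion look like this in Scala (`Shift` is a made-up node for illustration, not one from the codebase):

```scala
// 1. Normal class: instantiated with `new`.
class ShiftA(amount: Double) {
  def apply(x: Double): Double = x + amount
}

// 2. Case class: gets structural equality, toString, copy, and a
// generated companion apply, so call sites can write ShiftB(1.0).
case class ShiftB(amount: Double) {
  def apply(x: Double): Double = x + amount
}

// 3. Normal class with a companion-object constructor: the call site
// looks like the case class (ShiftC(1.0)) without committing to
// case-class semantics (copy/unapply/equality).
class ShiftC(amount: Double) {
  def apply(x: Double): Double = x + amount
}
object ShiftC {
  def apply(amount: Double): ShiftC = new ShiftC(amount)
}
```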
Should also contain a link to the Spark style guide (which is our coding standard), plus brief notes on expected doc formatting and desired test coverage.