[SPARK-31768][ML] add getMetrics in Evaluators
### What changes were proposed in this pull request?

Add `getMetrics` in Evaluators to get the corresponding Metrics instance, so users can use it to get any of the metrics scores. For example:

```
val trainer = new LinearRegression
val model = trainer.fit(dataset)
val predictions = model.transform(dataset)

val evaluator = new RegressionEvaluator()
val metrics = evaluator.getMetrics(predictions)
val rmse = metrics.rootMeanSquaredError
val r2 = metrics.r2
val mae = metrics.meanAbsoluteError
val variance = metrics.explainedVariance
```

### Why are the changes needed?

Currently, `Evaluator.evaluate` gives access to only a single metric, but most users need several. This PR adds `getMetrics` to all the Evaluators, so users can obtain an instance of the corresponding Metrics class and read any of the metrics they want from it.

### Does this PR introduce _any_ user-facing change?

Yes. `getMetrics` is added to the Evaluators. For example:

```
/**
 * Get a RegressionMetrics, which can be used to get any of the regression
 * metrics such as rootMeanSquaredError, meanSquaredError, etc.
 *
 * @param dataset a dataset that contains labels/observations and predictions.
 * @return RegressionMetrics
 */
@Since("3.1.0")
def getMetrics(dataset: Dataset[_]): RegressionMetrics
```

### How was this patch tested?

Added new unit tests.

Closes #28590 from huaxingao/getMetrics.

Authored-by: Huaxin Gao <huaxing@us.ibm.com>
Signed-off-by: Sean Owen <srowen@gmail.com>
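For readers unfamiliar with what a `RegressionMetrics` instance computes, the quantities named above (`rootMeanSquaredError`, `r2`, `meanAbsoluteError`, `explainedVariance`) can be sketched in plain Python. This is an illustration only, not the Spark API: the function name `regression_metrics` is hypothetical, and the `explainedVariance` formula follows the definition in the MLlib `RegressionMetrics` documentation (mean squared deviation of predictions from the mean label).

```python
import math

def regression_metrics(labels, predictions):
    """Plain-Python sketch (not Spark API) of the metrics a
    RegressionMetrics instance exposes for paired labels/predictions."""
    n = len(labels)
    residuals = [y - p for y, p in zip(labels, predictions)]

    # rootMeanSquaredError: sqrt of the mean squared residual
    rmse = math.sqrt(sum(r * r for r in residuals) / n)

    # meanAbsoluteError: mean of absolute residuals
    mae = sum(abs(r) for r in residuals) / n

    # r2: 1 - (residual sum of squares / total sum of squares)
    mean_y = sum(labels) / n
    ss_res = sum(r * r for r in residuals)
    ss_tot = sum((y - mean_y) ** 2 for y in labels)
    r2 = 1.0 - ss_res / ss_tot

    # explainedVariance, per the MLlib docs: sum((pred_i - mean(y))^2) / n
    explained_variance = sum((p - mean_y) ** 2 for p in predictions) / n

    return {"rmse": rmse, "mae": mae, "r2": r2,
            "explainedVariance": explained_variance}
```

With perfect predictions the residuals vanish, so `rmse` and `mae` are 0 and `r2` is 1; the motivation for the PR is that Spark computes all of these in one pass over the dataset, and `getMetrics` exposes them together instead of requiring one `evaluate` call per metric.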