
[DOC] BaseForecaster score function uses MAPE instead of sMAPE as metric #4947

Closed
MBristle opened this issue Jul 24, 2023 · 1 comment · Fixed by #4948
Comments

@MBristle (Contributor)

Describe the issue linked to the documentation

The docstring of the BaseForecaster score function suggests that the metric used for evaluation is sMAPE:

Returns
-------
score : float
    sMAPE loss of self.predict(fh, X) with respect to y_test.

The code calls the mean_absolute_percentage_error function without any further parameters.

from sktime.performance_metrics.forecasting import (
    mean_absolute_percentage_error,
)

return mean_absolute_percentage_error(y, self.predict(fh, X))

But the default values of the mean_absolute_percentage_error function are:

def mean_absolute_percentage_error(
    y_true,
    y_pred,
    horizon_weight=None,
    multioutput="uniform_average",
    symmetric=False,
    **kwargs,
):

and the doc states:

... 
    symmetric : bool, default=False
        Calculates symmetric version of metric if True.
...

Thus I conclude that MAPE is returned instead of sMAPE.
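
To illustrate the discrepancy, here is a minimal sketch (with made-up sample values, not taken from the issue) showing that the default call returns plain MAPE, which differs from the value obtained with symmetric=True:

import numpy as np

from sktime.performance_metrics.forecasting import mean_absolute_percentage_error

# illustrative values only
y_true = np.array([3.0, 2.0, 4.0, 5.0])
y_pred = np.array([2.5, 2.5, 3.5, 6.0])

# default call, as in BaseForecaster.score: symmetric=False, i.e. plain MAPE
mape = mean_absolute_percentage_error(y_true, y_pred)

# sMAPE is only obtained when symmetric=True is passed explicitly
smape = mean_absolute_percentage_error(y_true, y_pred, symmetric=True)

print(mape, smape)  # the two values differ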

Suggest a potential alternative/fix

Either adjust the documentation so that it states that the MAPE metric is used, or adjust the implementation by passing the symmetric parameter (see the sketch below). I did not investigate whether there are other places with this documentation error. ;) Many thanks!!
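
For concreteness, a hypothetical sketch of the code-side alternative (this is not the change that was eventually merged, and the score signature shown here is an assumption about the sktime code base):

def score(self, y, X=None, fh=None):
    """Scores forecast against ground truth, using sMAPE."""
    from sktime.performance_metrics.forecasting import (
        mean_absolute_percentage_error,
    )

    # passing symmetric=True would make the existing docstring accurate,
    # but changes the returned value for all downstream users
    return mean_absolute_percentage_error(y, self.predict(fh, X), symmetric=True)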

MBristle added the documentation label on Jul 24, 2023
@fkiraly (Collaborator) commented Jul 24, 2023

Good spot! I think the default for MAPE (non-symmetric) was changed a while ago, which affected the score function.

In order not to break downstream functionality, it's better to change the docstring than the code logic.

Would you like to make a PR to change the docstring? Would be much appreciated!
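
A sketch of what the docstring-side fix could look like (the exact wording of the merged change in #4948 may differ):

def score(self, y, X=None, fh=None):
    """Scores forecast against ground truth, using MAPE (non-symmetric).

    Returns
    -------
    score : float
        MAPE loss of self.predict(fh, X) with respect to y_test.
    """
    from sktime.performance_metrics.forecasting import (
        mean_absolute_percentage_error,
    )

    # symmetric=False made explicit to match the documented (non-symmetric) MAPE
    return mean_absolute_percentage_error(y, self.predict(fh, X), symmetric=False)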

fkiraly added the module:forecasting label on Jul 24, 2023
MBristle added a commit to MBristle/sktime that referenced this issue Jul 24, 2023
The default value of the mean_absolute_percentage_error function changed to the non-symmetric MAPE. This has been made explicit, and the doc-string is adjusted.

extend contributors list

fixes sktime#4947
fkiraly pushed a commit that referenced this issue Jul 26, 2023
…f non-symmetric MAPE (#4948)

Fixes #4947

The default value of the `mean_absolute_percentage_error` function changed
to the non-symmetric MAPE. This has been made explicit, and the
doc-string is adjusted.