show median in benchmark results #67070

Closed
scoder opened this issue Nov 16, 2014 · 16 comments
Labels
performance (Performance or resource usage), type-feature (A feature request or enhancement)

Comments

@scoder
Contributor

scoder commented Nov 16, 2014

BPO 22881
Nosy @pitrou, @scoder, @vstinner, @serhiy-storchaka, @wm75
Files
  • show_median.patch: show median in addition to min and avg timings
  • show_median.patch: update: use average of middle values for even sample size
  • show_median.patch: update: fix median calculation for even number of samples
  • Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.

    GitHub fields:

    assignee = None
    closed_at = <Date 2016-09-01.19:27:35.757>
    created_at = <Date 2014-11-16.09:50:17.907>
    labels = ['type-feature', 'performance']
    title = 'show median in benchmark results'
    updated_at = <Date 2016-09-01.19:27:35.756>
    user = 'https://github.com/scoder'

    bugs.python.org fields:

    activity = <Date 2016-09-01.19:27:35.756>
    actor = 'scoder'
    assignee = 'none'
    closed = True
    closed_date = <Date 2016-09-01.19:27:35.757>
    closer = 'scoder'
    components = ['Benchmarks']
    creation = <Date 2014-11-16.09:50:17.907>
    creator = 'scoder'
    dependencies = []
    files = ['37206', '37207', '39279']
    hgrepos = []
    issue_num = 22881
    keywords = ['patch']
    message_count = 16.0
    messages = ['231239', '231241', '231243', '242050', '242078', '242079', '242080', '242081', '242461', '242463', '242464', '242465', '242466', '242467', '242469', '273926']
    nosy_count = 5.0
    nosy_names = ['pitrou', 'scoder', 'vstinner', 'serhiy.storchaka', 'wolma']
    pr_nums = []
    priority = 'normal'
    resolution = 'fixed'
    stage = 'patch review'
    status = 'closed'
    superseder = None
    type = 'enhancement'
    url = 'https://bugs.python.org/issue22881'
    versions = []

    @scoder
    Contributor Author

    scoder commented Nov 16, 2014

    The median tends to give a better idea about benchmark results than an average as it inherently ignores outliers.
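
    A small, hypothetical illustration of that point (Python 3, not part of the attached patches): with a single slow outlier run, the mean shifts noticeably while the median barely moves.

        import statistics

        # Five benchmark timings in seconds; the last run was disturbed by an outlier.
        timings = [1.02, 1.03, 1.01, 1.02, 2.50]

        print(statistics.mean(timings))    # 1.316 -- pulled upwards by the outlier
        print(statistics.median(timings))  # 1.02  -- unaffected by the single slow run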

    @scoder scoder added the performance (Performance or resource usage) and type-feature (A feature request or enhancement) labels Nov 16, 2014
    @serhiy-storchaka
    Member

    In the case of an even number of samples, the median value is calculated as the arithmetic mean of the two middle samples.

    med_base = (base_times[len(base_times)//2] + base_times[(len(base_times)-1)//2]) / 2
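
    For context, a minimal sketch of the complete median computation being discussed (the snippet above assumes base_times is already sorted; the names here are illustrative, not taken from the patch):

        def median(times):
            # Sort a copy so the caller's list is left untouched.
            times = sorted(times)
            n = len(times)
            if n % 2:
                # Odd number of samples: the single middle value.
                return times[n // 2]
            # Even number of samples: arithmetic mean of the two middle values.
            return (times[n // 2 - 1] + times[n // 2]) / 2.0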

    @scoder
    Contributor Author

    scoder commented Nov 16, 2014

    Fair enough, patch updated.

    @scoder
    Contributor Author

    scoder commented Apr 26, 2015

    Any more comments on the patch, or can it be applied?

    @wm75
    Mannequin

    wm75 mannequin commented Apr 26, 2015

    For the even-number case, I think you shouldn't do // 2, but / 2.

    In general, wouldn't it be good to let the statistics module do all the stats calculations?

    @scoder
    Contributor Author

    scoder commented Apr 26, 2015

    In general, wouldn't it be good to let the statistics module do all the stats calculations?

    It's not available in older Python versions, e.g. 2.6.
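
    (For reference, the standard-library call being suggested, available since Python 3.4, would simply be:

        import statistics
        med_base = statistics.median(base_times)   # base_times: illustrative name for the timing list

    which also handles the even-sample case internally.)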

    @wm75
    Mannequin

    wm75 mannequin commented Apr 26, 2015

    It's not available in older Python versions, e.g. 2.6.

    I know, I was talking about 3.5+, of course. This would not be backported to Python 2 anyway, would it?

    @wm75
    Mannequin

    wm75 mannequin commented Apr 26, 2015

    Ah, sorry, it's late here already and I forgot what file this change is about. So forget my last comment, then.

    @scoder
    Contributor Author

    scoder commented May 3, 2015

    For the even-number case, I think you shouldn't do // 2, but / 2.

    Right. I updated the patch.

    @pitrou
    Member

    pitrou commented May 3, 2015

    Have you found the median to be more stable than the minimum here?

    @scoder
    Contributor Author

    scoder commented May 3, 2015

    I'm actually not sure how it relates to the minimum. The more runs you have, the higher the chance of hitting the actual minimum at least once. And if none of the runs hits the real minimum, you're simply out of luck.

    However, it should tend to give a much better result than the (currently printed) average, which suffers from outliers. And outliers are almost always too high for benchmarks and never too low, due to various external influences.

    @pitrou
    Member

    pitrou commented May 3, 2015

    Then let's just replace the average with the median? I don't think it makes sense to add more statistical information to the output (IMHO, there is already too much of it :-)).

    @serhiy-storchaka
    Member

    Maybe just drop the largest 5% of the values to avoid the impact of outliers?

    See also bpo-23552.
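
    A quick sketch of that idea (illustrative only, not code from any of the attached patches): sort the timings and discard the largest 5% before averaging.

        def drop_top_outliers(times, fraction=0.05):
            # Discard the largest ``fraction`` of the samples (rounded down),
            # then average what is left.
            times = sorted(times)
            n_drop = int(len(times) * fraction)
            kept = times[:len(times) - n_drop]
            return sum(kept) / len(kept)

    Note that with fewer than 20 samples the 5% cut rounds down to dropping nothing, which is the limitation raised in the next comment about fast mode.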

    @scoder
    Contributor Author

    scoder commented May 3, 2015

    Well, we can apply a kludge, or apply statistics.

    @pitrou
    Member

    pitrou commented May 3, 2015

    Maybe just drop the largest 5% of the values to avoid the impact of outliers?

    In fast mode (option "-f"), there may not be enough samples for that.

    @vstinner
    Member

    The new https://github.com/python/performance benchmark suite now displays the median rather than the arithmetic mean (average) by default (it also displays the standard deviation).

    See the perf issue for a discussion about median vs mean:
    psf/pyperf#1

    Can we now close this issue?

    @scoder scoder closed this as completed Sep 1, 2016
    @ezio-melotti ezio-melotti transferred this issue from another repository Apr 10, 2022