
Conversation

matteosantama (Contributor)

Partially closes #34677

WillAyd (Member) commented Jun 10, 2020

I think we could also just remove this entry altogether

WillAyd added the Benchmark label (Performance (ASV) benchmarks) on Jun 10, 2020
matteosantama (Contributor, Author)

Just tested and that works for me as well. You think that's the better route to take?

WillAyd (Member) commented Jun 10, 2020

Yea I think so

TomAugspurger (Contributor)

We need to verify that changing this won't cause a break in the timeseries at https://pandas.pydata.org/speed/pandas/.

matteosantama (Contributor, Author)

How do we do that? I have another PR pending that needs benchmarking

TomAugspurger (Contributor)

I'm not sure what conditions cause asv to consider it a different environment.

> I have another PR pending that needs benchmarking

Hopefully that isn't blocked by this issue. You can make the changes locally and run the ASVs.
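Running the ASVs locally might look like the sketch below, assuming the standard pandas asv_bench setup with an `upstream` remote; the `-f` threshold and the `algorithms` benchmark pattern are illustrative, not taken from this thread.

```shell
# Run the benchmarks for your branch locally, without waiting on the
# public speed dashboard. Executed from the pandas repository root.
cd asv_bench

# Compare HEAD against upstream/master, flagging regressions worse
# than a 1.1x slowdown, restricted to benchmarks matching "algorithms".
asv continuous -f 1.1 upstream/master HEAD -b algorithms

# Or do a quick single run of one benchmark suite, with no comparison:
asv run --quick --bench algorithms
```

`asv continuous` builds and benchmarks both revisions in its own environments, so results are comparable even if your working environment differs from the dashboard's.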

matteosantama (Contributor, Author)

Oh duh yeah thanks

jorisvandenbossche (Member)

I think this will cause a break in the timeseries (in a similar way as the Cython version pin caused a new timeseries; see the two colors at e.g. https://pandas.pydata.org/speed/pandas/#algorithms.Hashing.time_frame), but I don't think there is a way around that. We need to do this update at some point, so the question is maybe whether we are OK with doing it now.

WillAyd (Member) commented Jul 8, 2020

Even if this breaks the timeseries I would be OK with doing it. Maybe after the next release, so we get a clean delineation?

jorisvandenbossche (Member)

Yes, I think waiting until after the release sounds like a good idea.


Labels

Benchmark Performance (ASV) benchmarks

Development

Successfully merging this pull request may close these issues.

BUG: ASV Benchmarks failing to build

4 participants