
Release 1.12.1 #197

Merged
merged 24 commits into master on Dec 16, 2021

Conversation

@Morwenn (Owner) commented Dec 16, 2021

No description provided.

Passing the size directly lowers the number of operations performed by bidirectional stable_partition in the general case (when the collection is not already partitioned or mostly partitioned). This in turn improves the speed of bidirectional slabsort over random data.
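A minimal sketch of the idea, assuming the usual rotate-based divide-and-conquer scheme for an in-place stable partition (this is not cpp-sort's actual implementation and the names are illustrative): when the caller already knows the size, each recursion level can split on that number instead of recomputing std::distance(first, last), which costs a linear walk for bidirectional iterators.

```cpp
#include <algorithm>
#include <iterator>

// Illustrative size-taking stable partition over bidirectional iterators.
// Returns an iterator to the beginning of the "false" group.
template<typename BidirectionalIterator, typename Predicate>
auto stable_partition_sized(BidirectionalIterator first, BidirectionalIterator last,
                            typename std::iterator_traits<BidirectionalIterator>::difference_type size,
                            Predicate pred)
    -> BidirectionalIterator
{
    if (size == 0) return first;
    if (size == 1) return pred(*first) ? std::next(first) : first;

    // The size is threaded through the recursion, so no level needs to call
    // std::distance(first, last), an O(n) operation for bidirectional iterators
    auto half = size / 2;
    auto middle = std::next(first, half);
    auto left = stable_partition_sized(first, middle, half, pred);
    auto right = stable_partition_sized(middle, last, size - half, pred);

    // Move the "false" elements of the left half past the "true" elements of
    // the right half while preserving relative order
    return std::rotate(left, middle, right);
}
```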
This avoids a full collection scan when slab_sorter is given a full std::list to sort. With bidirectional iterators it potentially avoids the n operations needed to compute the size of the sequence.
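As a sketch of where the saving comes from (the function names below are hypothetical and do not reflect cpp-sort's actual slab_sorter interface): std::list::size() has been O(1) since C++11, so a container-level entry point can forward that value directly instead of walking the list with std::distance.

```cpp
#include <iterator>
#include <list>

// Hypothetical size-taking entry point standing in for the real sorter
template<typename BidirectionalIterator, typename Size>
void sort_with_known_size(BidirectionalIterator first, BidirectionalIterator last, Size size)
{
    // Placeholder body: a real sorter would use `size` to drive its recursion
    // (see the stable_partition sketch above) instead of recomputing it
    (void)first; (void)last; (void)size;
}

template<typename Container>
void sort_container(Container& collection)
{
    // std::list::size() is O(1) since C++11: no full scan of the collection
    sort_with_known_size(collection.begin(), collection.end(), collection.size());
}

template<typename BidirectionalIterator>
void sort_iterator_range(BidirectionalIterator first, BidirectionalIterator last)
{
    // Only iterators available: computing the size costs n increments
    sort_with_known_size(first, last, std::distance(first, last));
}
```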
The internal traits were accidentally guarded by __cpp_lib_range since
version 1.9.0, because I put the #endif in the wrong place.
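The shape of the bug, shown in a purely illustrative form (the trait names are hypothetical, and the standard feature-test macro is spelled __cpp_lib_ranges, which the sketch uses): a misplaced #endif leaves unrelated traits inside the guard, so they silently disappear whenever the macro is not defined.

```cpp
#include <iterator>
#include <type_traits>

#ifdef __cpp_lib_ranges
#   include <ranges>

    // ranges-specific helper, legitimately behind the feature-test macro
    template<typename Range>
    using range_iterator_t = std::ranges::iterator_t<Range>;

    // BUG: the #endif was meant to go right here...

    // ...so this generic trait, which has nothing to do with ranges,
    // accidentally vanishes when __cpp_lib_ranges is not defined
    template<typename Iterator>
    struct has_bidirectional_category:
        std::is_base_of<
            std::bidirectional_iterator_tag,
            typename std::iterator_traits<Iterator>::iterator_category
        >
    {};
#endif
```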
This avoids accidentally using a different iterator tag for a sorter's iterator_category and its internal static_assert, leading to fewer potential bugs.
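A hedged sketch of the pattern (hypothetical_sorter is not a real cpp-sort component): the required tag is named once and reused for both the public iterator_category typedef and the internal static_assert, so the two can no longer disagree.

```cpp
#include <functional>
#include <iterator>
#include <type_traits>

struct hypothetical_sorter
{
    // Single source of truth for the tag the sorter requires
    using iterator_category = std::bidirectional_iterator_tag;

    template<typename Iterator, typename Compare = std::less<>>
    auto operator()(Iterator first, Iterator last, Compare compare={}) const
        -> void
    {
        static_assert(
            std::is_base_of<
                iterator_category,  // same alias as the typedef above
                typename std::iterator_traits<Iterator>::iterator_category
            >::value,
            "hypothetical_sorter requires at least bidirectional iterators"
        );
        // ... actual sorting logic would go here ...
        (void)first; (void)last; (void)compare;
    }
};
```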
As a result, slabsort could only work with iterators for which an
ADL-found iter_swap was available.
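The usual cure for this class of bug, shown as a hedged sketch rather than the exact change made in the library: bring std::iter_swap into scope before the unqualified call, so iterators without their own ADL-visible iter_swap still work, while a more specific overload in the iterator's namespace keeps being picked up.

```cpp
#include <algorithm>  // std::iter_swap

// Illustrative helper, not cpp-sort's actual internal function
template<typename Iterator1, typename Iterator2>
auto generic_iter_swap(Iterator1 lhs, Iterator2 rhs)
    -> void
{
    using std::iter_swap;   // fallback for iterators with no dedicated iter_swap
    iter_swap(lhs, rhs);    // unqualified call: an overload found by ADL in the
                            // iterator's own namespace is still preferred when
                            // it is more specialized
}
```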
Fixed/tweaked the following things about the group_iterator class used in the implementation of merge_insertion_sort:
- Removed the non-const operator[] (iterators aren't supposed to have one)
- Ensured it can wrap forward iterators
- Killed a warning in make_group_iterator when difference_type < int (see the sketch after this list)
- Reduced type & code indirections where they are not needed
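A small sketch of the warning fix mentioned in the list above; group_iterator and make_group_iterator are simplified stand-ins for the real ones in cpp-sort. When the wrapped iterator's difference_type is narrower than the integer passed in, the implicit conversion triggers a truncation warning, so the factory function casts explicitly.

```cpp
#include <iterator>

// Simplified stand-in for the real group_iterator
template<typename Iterator>
struct group_iterator
{
    using difference_type = typename std::iterator_traits<Iterator>::difference_type;

    Iterator base;
    difference_type size;  // number of elements in a group
};

template<typename Iterator, typename Size>
auto make_group_iterator(Iterator iterator, Size size)
    -> group_iterator<Iterator>
{
    using difference_type = typename std::iterator_traits<Iterator>::difference_type;
    // Explicit cast: when difference_type is narrower than the passed integer,
    // the implicit conversion would trigger a -Wconversion style warning
    return {iterator, static_cast<difference_type>(size)};
}
```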
@Morwenn merged commit 3f4044e into master on Dec 16, 2021