
Improve OrderPreservingMultiDictionary. #11254

Merged
Merged 5 commits into dotnet:master from metadataReading3 on May 12, 2016

Conversation

CyrusNajmabadi
Member

  1. Support allocation-free enumeration of the dictionary.
  2. Pool internal data so that less garbage is churned when creating and releasing dictionaries.

@CyrusNajmabadi
Member Author

tagging @dotnet/roslyn-ide @dotnet/roslyn-compiler

The IDE would like to use this API to get a multidictionary with low memory-churn characteristics. The problems with the current API were that:

  1. There was no allocation-free way of enumerating the values within. You could request "Keys", but that would allocate an object for the KeyCollection.
  2. Even though the dictionary exposed a pooling API, the pooling only went as far as the top-level dictionary itself. Internal data (specifically the ArrayBuilders used to store values) was not pooled and would cause GC churn.

This PR addresses both of these concerns.
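To make the first point concrete, here is a minimal sketch of the general pattern for allocation-free enumeration; it is not the Roslyn code, and the names SimpleMultiMap, ValueEnumerable, and Enumerator are illustrative. Because the indexer returns a struct with a struct enumerator, foreach binds to MoveNext/Current on the value types directly and never boxes an IEnumerator<V>.

using System.Collections.Generic;

// Illustrative stand-in, not the Roslyn type.
public sealed class SimpleMultiMap<K, V>
{
    private readonly Dictionary<K, List<V>> _map = new Dictionary<K, List<V>>();

    public void Add(K key, V value)
    {
        if (!_map.TryGetValue(key, out var list))
        {
            list = new List<V>();
            _map.Add(key, list);
        }

        list.Add(value);
    }

    // Returning a struct is what makes 'foreach (var v in map[key])' allocation-free.
    public ValueEnumerable this[K key]
        => new ValueEnumerable(_map.TryGetValue(key, out var list) ? list : null);

    public readonly struct ValueEnumerable
    {
        private readonly List<V> _list;
        public ValueEnumerable(List<V> list) => _list = list;
        public Enumerator GetEnumerator() => new Enumerator(_list);
    }

    public struct Enumerator
    {
        private readonly List<V> _list;
        private int _index;

        public Enumerator(List<V> list)
        {
            _list = list;
            _index = -1;
        }

        public V Current => _list[_index];
        public bool MoveNext() => _list != null && ++_index < _list.Count;
    }
}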

// the single item already in _value, and to store the item we're adding.
// In general, we presume that the number of values per key will be low, so this
// means we have very little overhead even when there are multiple values per key.
private static ObjectPool<ArrayBuilder<V>> s_builderPool = new ObjectPool<ArrayBuilder<V>>(
Member

Isn't there a pooled version of array builder? e.g. ArrayBuilder<V>.GetInstance()

Member Author

Nice catch! I've switched to using that instead!
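For readers following along, both the hand-rolled pool and ArrayBuilder<V>.GetInstance() reduce to the same rent/clear/return shape. The sketch below is a self-contained stand-in (ArrayBuilder<T> is internal to Roslyn, so a List<T>-based pool with the hypothetical name PooledList is used) showing that shape:

using System.Collections.Concurrent;
using System.Collections.Generic;

// Illustrative stand-in for a pooled builder; not the Roslyn implementation.
public static class PooledList<T>
{
    private static readonly ConcurrentBag<List<T>> s_pool = new ConcurrentBag<List<T>>();

    // Rent: reuse a previously freed builder when one is available.
    public static List<T> GetInstance()
        => s_pool.TryTake(out var list) ? list : new List<T>();

    // Return: clear the builder and put it back so its backing array gets reused.
    public static void Free(List<T> list)
    {
        list.Clear();
        s_pool.Add(list);
    }
}

Reusing the existing ArrayBuilder<V>.GetInstance() pool, rather than a type-local ObjectPool<ArrayBuilder<V>>, also lets the builders be shared with other callers instead of sitting in a pool dedicated to this one dictionary.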

/// Each value is either a single V or an <see cref="ArrayBuilder{V}"/>.
/// Never null.
/// </summary>
private readonly object _value;
Member

Is it common to have multiple items with the same key? If not, consider using OneOrMany<T> instead, which stores the values as T or ImmutableArray<T>.

Member Author

Is that any better though? In the current model, 1, 2, and many are handled fine. In the OneOrMany case, the moment you go past 2, you're always going to allocate on each subsequent add.
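A small sketch of the allocation trade-off being weighed here, with List<int> standing in for the mutable builder and only the two Add paths shown:

using System.Collections.Generic;
using System.Collections.Immutable;

static class GrowthComparison
{
    // Immutable path: every Add copies the existing elements into a brand-new
    // array of length + 1, so accumulating N values performs N array allocations.
    static ImmutableArray<int> AddViaImmutable(ImmutableArray<int> values, int item)
        => values.Add(item);

    // Mutable-builder path: Add is amortized O(1); the backing array reallocates
    // only when capacity is exceeded, and the builder itself can be pooled.
    static void AddViaBuilder(List<int> builder, int item)
        => builder.Add(item);
}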

@gafter
Copy link
Member

gafter commented May 12, 2016

LGTM, after the noted change to use an existing pool.

{
get
{
Debug.Assert(this.Count >= 1);
Member

Is it common to fetch the same "Items" twice?
Perhaps it is worth storing the array once it is created (the array can be dropped when the set is modified).

@VSadov
Copy link
Member

VSadov commented May 12, 2016

LGTM.
Consider caching the array created in "Items". It could save some more allocations if re-reading the same arrays is common enough.
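A minimal sketch of that caching idea, using the hypothetical type CachedValueSet rather than the PR's code: materialize the values into an array once, hand the same array back on repeated reads, and drop the cache whenever the set is mutated.

using System.Collections.Generic;

public sealed class CachedValueSet<V>
{
    private readonly List<V> _values = new List<V>();
    private V[] _cachedItems; // null means the cache has not been built or is stale

    public void Add(V value)
    {
        _values.Add(value);
        _cachedItems = null; // any mutation invalidates the cached array
    }

    // The first read materializes the array; later reads return the same
    // instance until the next Add.
    public V[] Items => _cachedItems ?? (_cachedItems = _values.ToArray());
}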

@CyrusNajmabadi CyrusNajmabadi merged commit 118bf7e into dotnet:master May 12, 2016
@CyrusNajmabadi CyrusNajmabadi deleted the metadataReading3 branch May 12, 2016 20:13
@CyrusNajmabadi CyrusNajmabadi restored the metadataReading3 branch May 30, 2016 21:04
@CyrusNajmabadi CyrusNajmabadi deleted the metadataReading3 branch January 25, 2020 00:37