
Performance optimization #496

Merged
merged 2 commits into from Oct 22, 2023

Conversation

kraendavid
Copy link
Contributor

No description provided.

@andyjefferson andyjefferson added this to the 6.0.6 milestone Oct 22, 2023
@andyjefferson andyjefferson merged commit 8c46edf into datanucleus:master Oct 22, 2023
@kraendavid kraendavid deleted the perf branch October 23, 2023 06:02
@chrisco484
Copy link

@kraendavid I wasn't sure what the purpose of the temporary 'updated' Set was. I'm sure there's a good reason, but I couldn't work it out myself. Maybe due to recursion a recursive call needs to see the original Set? (But then couldn't that be achieved by only adding to the Set after the recursive call?) I don't know, but I'm losing sleep not being able to work it out ;)

@kraendavid
Copy link
Contributor Author

Hi @chrisco484
HashSet has faster lookup than the concurrent Set implementations - that is the reason for using HashSet.
But since this is used in a concurrent environment, we have to make the field volatile and assign a newly built HashSet whenever the contents change.
So we get the fast lookup of HashSet while still working correctly in a concurrent environment.
This pays off because the set very quickly becomes stable and fully initialized - after that point it is only read, never modified.
So we take the cost of being a little slower until it is fully initialized, but once initialized it is the fastest option at runtime.
Did that explain it?
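The pattern described above (a volatile field holding a plain HashSet, replaced wholesale on every write) can be sketched roughly as follows. Note this is an illustrative sketch, not the actual DataNucleus code; the class and field names here are hypothetical:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of the copy-on-write HashSet pattern: plain HashSet for fast reads,
// with writers publishing a fresh copy through a volatile field.
public class CopyOnWriteHashSetSketch {
    // volatile ensures readers always see a fully constructed set
    private volatile Set<String> names = new HashSet<>();

    // Writers never mutate the published set; they copy it, mutate the
    // copy (the temporary "updated" set), then publish the copy.
    public synchronized void add(String name) {
        Set<String> updated = new HashSet<>(names); // copy current contents
        updated.add(name);
        names = updated; // atomic publish via the volatile field
    }

    // Readers do a plain HashSet lookup with no locking at all.
    public boolean contains(String name) {
        return names.contains(name);
    }
}
```

Readers pay no synchronization cost on `contains`, which is why this beats `ConcurrentHashMap.newKeySet()` once the set stops changing; the copy in `add` is the "short term pain" paid only during initialization.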

@chrisco484
Copy link

chrisco484 commented Oct 24, 2023

Hi @chrisco484 Did that explain it?

@kraendavid Yes, that explains it, thanks.

We have a massive enterprise app with ~3500 persistent classes, so I was worried that the HashSet "copy from collection" constructor would be doing a lot of copying up until the point that initialization of all classes is complete.

Probably a case of short term pain for long term gain ;)
