
Move cache creation to be lazy #4781

Merged
merged 1 commit into main from lazy-cache-creation on Mar 23, 2023

Conversation

BoD (Contributor) commented on Mar 21, 2023:

Related to #4780

BoD requested a review from martinbonnin as a code owner on March 21, 2023 at 17:26.
netlify bot commented on Mar 21, 2023:

Deploy Preview for apollo-android-docs canceled.

🔨 Latest commit: cc5a2ec
🔍 Latest deploy log: https://app.netlify.com/sites/apollo-android-docs/deploys/641c05de14182c0008f788ed

Review comment on the test diff (excerpt):

        return MemoryCacheFactory().create()
      }
    }).build()
    assertNull(cacheCreateThreadName)
A reviewer (Contributor) commented:

Nice test!
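
As context for the excerpt, here is a hedged reconstruction of the kind of test it comes from. The class name, test name, server URL, and builder setup are assumptions, not the exact test from the PR, and the imports assume Apollo Kotlin 3's package layout. The idea is that the factory records which thread, if any, called create(), and the assertion verifies that building the client did not create the cache.

```kotlin
import com.apollographql.apollo3.ApolloClient
import com.apollographql.apollo3.cache.normalized.api.MemoryCacheFactory
import com.apollographql.apollo3.cache.normalized.api.NormalizedCache
import com.apollographql.apollo3.cache.normalized.api.NormalizedCacheFactory
import com.apollographql.apollo3.cache.normalized.normalizedCache
import kotlin.test.Test
import kotlin.test.assertNull

class LazyCacheCreationTest {
  @Test
  fun cacheIsNotCreatedAtBuildTime() {
    // Records the thread on which create() runs, or stays null if it never runs.
    var cacheCreateThreadName: String? = null

    ApolloClient.Builder()
        .serverUrl("https://example.com/graphql") // hypothetical URL, the test never hits it
        .normalizedCache(object : NormalizedCacheFactory() {
          override fun create(): NormalizedCache {
            cacheCreateThreadName = Thread.currentThread().name
            return MemoryCacheFactory().create()
          }
        })
        .build()

    // With lazy cache creation, building the client must not touch the factory.
    assertNull(cacheCreateThreadName)
  }
}
```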

The change under review (excerpt):

    -  private val cache: OptimisticCache = OptimisticCache().chain(normalizedCacheFactory.createChain()) as OptimisticCache
    +  // Keeping this as lazy to avoid accessing the disk at initialization which usually happens on the main thread
    +  private val cache: OptimisticCache by lazy {
    +    OptimisticCache().chain(normalizedCacheFactory.createChain()) as OptimisticCache
    +  }
A reviewer (Contributor) commented:

Is lazy thread safe? I remember reading something, or some rumour, saying that lazy generates a lot of bytecode. Might be worth making it a lateinit and using regular locks?

BoD (Contributor, author) replied:

By default, by lazy uses a synchronized block on the JVM, and on native it looks like it's using a Lock.

For the bytecode: yes, it can be "verbose" (👀 this from 2019!).

Pushed a commit doing this manually, LMKWYT!
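
As a side note (not part of the PR), here is a minimal sketch of what the default thread safety of by lazy means in practice; ExpensiveHolder and its property are made up for illustration:

```kotlin
// Minimal sketch (illustrative only): `lazy {}` defaults to
// LazyThreadSafetyMode.SYNCHRONIZED, so the initializer runs at most once
// even if several threads race on the first access.
class ExpensiveHolder {
    // Explicitly spelling out the default mode; `by lazy { ... }` is equivalent.
    val value: String by lazy(LazyThreadSafetyMode.SYNCHRONIZED) {
        println("initializing on ${Thread.currentThread().name}")
        "expensive value"
    }
}

fun main() {
    val holder = ExpensiveHolder()
    // Nothing has been initialized yet: the work is deferred until first access.
    println(holder.value) // runs the initializer
    println(holder.value) // returns the cached value; the initializer does not run again
}
```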

The reviewer replied:

It's probably OK to keep lazy here according to Chet's guidance. It's quite a heavy object and the allocation is done only once, so both approaches work for me!

Comment on lines 43 to 49
    if (_cache == null) {
      lock.write {
        if (_cache == null) {
          _cache = OptimisticCache().chain(normalizedCacheFactory.createChain()) as OptimisticCache
        }
      }
    }
The reviewer commented:

That doesn't work though: the whole if {} needs to be locked, or you could create 2 caches.

And followed up:

Ah, never mind, there's a double if!
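
For readers following the exchange, here is a self-contained sketch of the double-checked locking pattern being discussed; the LazyHolder class and its names are hypothetical, not the PR's code. The first, unsynchronized null check is the fast path, and the second check inside the write lock ensures the value is created exactly once.

```kotlin
import java.util.concurrent.locks.ReentrantReadWriteLock
import kotlin.concurrent.write

// Hypothetical holder for illustration; in the PR the value is the OptimisticCache
// built from the NormalizedCacheFactory chain.
class LazyHolder<T : Any>(private val create: () -> T) {
    private val lock = ReentrantReadWriteLock()

    // @Volatile so the unsynchronized fast-path read below observes a fully published value
    // (an assumption of this sketch; the PR's actual field declaration may differ).
    @Volatile
    private var _value: T? = null

    val value: T
        get() {
            if (_value == null) {          // fast path: no locking once the value exists
                lock.write {
                    if (_value == null) {  // re-check inside the lock so only one thread runs create()
                        _value = create()
                    }
                }
            }
            return _value!!
        }
}

fun main() {
    val holder = LazyHolder { "expensive cache" }
    println(holder.value) // create() runs here, exactly once
    println(holder.value) // cached value; create() does not run again
}
```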

BoD added this pull request to the merge queue on Mar 23, 2023.
BoD merged commit 99126fd into main on Mar 23, 2023.
BoD deleted the lazy-cache-creation branch on March 23, 2023 at 07:55.
martinbonnin pushed a commit that referenced this pull request on Mar 28, 2023.
martinbonnin mentioned this pull request on Apr 3, 2023.
Labels: none yet
Projects: none yet
Linked issues: none yet
2 participants