
[Coil 3] Upgrade from Coil 2 to Coil 3: disk cache load failure #1998

Closed · wthee opened this issue Jan 3, 2024 · 1 comment · Fixed by #1999

Comments


wthee commented Jan 3, 2024

Describe the bug
After upgrading from Coil 2 to Coil 3, loading from the disk cache fails and AsyncImage's onError receives: java.lang.IllegalArgumentException: Unexpected header: 10

  • ImageLoader

    @OptIn(ExperimentalCoilApi::class)
    override fun newImageLoader(context: PlatformContext): ImageLoader {
        return ImageLoader.Builder(context)
            .components {
                add(NetworkFetcher.Factory())
            }
            .allowHardware(false)
            .memoryCachePolicy(CachePolicy.DISABLED)
            .diskCache {
                val path = context.filesDir.resolve(Constants.COIL_DIR).toOkioPath()
                DiskCache.Builder()
                    .maxSizePercent(0.04)
                    .directory(path)
                    .build()
            }
            .build()
    }

To Reproduce
Upgrade from Coil 2 to Coil 3 with the disk cache enabled; images previously cached by Coil 2 fail to load.

Logs/Screenshots
java.lang.IllegalArgumentException: Unexpected header: 10

Version
Coil 3.0.0-alpha01
ktor 2.3.7

@colinrtwhite (Member)

Thanks - the disk cache entry format changed between Coil 2 and Coil 3. #1999 ensures the cache is cleared on upgrade. You can also clear the disk cache manually to work around this.
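
For the manual workaround, a minimal sketch of clearing the cache programmatically (the helper name is made up here; the coil3 import paths and the diskCache/clear() API are assumed to carry over from Coil 2 into this alpha, where clear() is still marked experimental):

    import coil3.ImageLoader
    import coil3.annotation.ExperimentalCoilApi

    // Clears every entry in the ImageLoader's disk cache so entries written by
    // Coil 2 are never parsed by Coil 3. Call once after the app upgrade,
    // before the first image request. (Hypothetical helper for illustration.)
    @OptIn(ExperimentalCoilApi::class)
    fun clearStaleDiskCache(imageLoader: ImageLoader) {
        imageLoader.diskCache?.clear()
    }

Alternatively, deleting the cache directory itself (Constants.COIL_DIR in the config above) before building the new ImageLoader has the same effect.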
