
compress/flate: runtime error: index out of range #41947

Open

agnivade opened this issue Oct 13, 2020 · 4 comments

@agnivade (Contributor) commented Oct 13, 2020

What version of Go are you using (go version)?

$ go version
Go 1.14.6

Does this issue reproduce with the latest release?

The code is still the same on master, so I'd say yes.

What operating system and processor architecture are you using (go env)?

This was a Sentry report, so I am just filling in what I have:

macOS version: 10.14.6
Architecture: amd64

What did you do?

We have this Sentry crash report which points to an out-of-bounds access in the compress/flate library.

What did you expect to see?

No crash

What did you see instead?

runtime.boundsError: runtime error: index out of range [1106474014] with length 643
  File "deflate.go", line 259, in (*compressor).findMatch
  File "deflate.go", line 443, in (*compressor).deflate
  File "deflate.go", line 646, in (*compressor).close
  File "deflate.go", line 732, in (*Writer).Close
  File "gzip.go", line 242, in (*Writer).Close
  File "gzip.go", line 241, in (*GzipResponseWriter).Close
  File "deflate.go", line 259, in (*compressor).findMatch
  File "deflate.go", line 443, in (*compressor).deflate
  File "deflate.go", line 554, in (*compressor).write
  File "deflate.go", line 712, in (*Writer).Write
  File "gzip.go", line 196
  File "gzip.go", line 103, in (*GzipResponseWriter).Write
  File "io.go", line 407, in copyBuffer
  File "io.go", line 364, in Copy
  File "io.go", line 340, in CopyN
  File "fs.go", line 298, in serveContent
  File "fs.go", line 625, in serveFile
  File "fs.go", line 728, in (*fileHandler).ServeHTTP
  File "server.go", line 2080, in StripPrefix.func1
  File "server.go", line 2041, in HandlerFunc.ServeHTTP
  File "static.go", line 86
  File "server.go", line 2041, in HandlerFunc.ServeHTTP
  File "gzip.go", line 336, in GzipHandlerWithOpts.func1.1
  File "server.go", line 2041, in HandlerFunc.ServeHTTP
  File "mux.go", line 210, in (*Router).ServeHTTP
  File "server.go", line 2041, in HandlerFunc.ServeHTTP
  File "server.go", line 2836, in serverHandler.ServeHTTP
  File "server.go", line 1924, in (*conn).serve

Crash is happening here:

if wEnd == win[i+length] {

The codebase is https://github.com/mattermost/mattermost-server. Unfortunately, I cannot deduce which version of the codebase this was triggered from, but we have a gzip response handler which compresses outgoing responses, and the panic is coming from that path.

I also suspect that the stack trace is trashed somehow because a portion of it is duplicated. There's no way (*compressor).findMatch calls (*GzipResponseWriter).Close. But this is what I have.

/cc @klauspost @dsnet

agnivade added this to the Backlog milestone Oct 13, 2020
@klauspost (Contributor) commented Oct 13, 2020

Yes, this looks impossible, and this piece of code hasn't been touched in many years. My immediate guess would be a race: the Writer being used concurrently by two goroutines.

@agnivade (Contributor, Author) commented Oct 13, 2020

A concurrent write is not entirely impossible. Unfortunately, I only have this report, so I cannot reproduce it myself.

@klauspost (Contributor) commented Oct 13, 2020

I don't suppose you have a stack trace with paths? It is pretty much impossible for me to track down the call path by generic filenames alone.

FWIW, there is nothing protecting against concurrent writes in the GzipResponseWriter - not that this in any way indicates it is why the crash is happening.

@agnivade (Contributor, Author) commented Oct 13, 2020

I don't suppose you have a stack trace with paths?

Unfortunately, I don't. This is all that Sentry generates.

However, I can fill it out for you if it makes things easier.
