
database/sql: data race on rows.releaseConn #9716

Closed
dvyukov opened this issue Jan 28, 2015 · 12 comments
@dvyukov (Member) commented Jan 28, 2015

Race builder says:

WARNING: DATA RACE
Write by goroutine 71:
  database/sql.(*Stmt).Query()
      go/src/database/sql/sql.go:1436 +0x6f3
  database/sql.func·039()
      go/src/database/sql/sql_test.go:1957 +0x1ea
  testing.func·002()
      go/src/testing/benchmark.go:418 +0x18f

Previous read by goroutine 61:
  database/sql.(*Stmt).Query()
      go/src/database/sql/sql.go:1435 +0x51f
  database/sql.func·039()
      go/src/database/sql/sql_test.go:1957 +0x1ea
  testing.func·002()
      go/src/testing/benchmark.go:418 +0x18f
  runtime.goexit()
      go/src/runtime/asm_amd64.s:2430 +0x0
  reflect.ValueOf()
      go/src/reflect/value.go:2081 +0xfc
  database/sql/driver.int32Type.ConvertValue()
      go/src/database/sql/driver/types.go:108 +0x80
  database/sql/driver.(*int32Type).ConvertValue()
      <autogenerated>:12 +0xf9
  database/sql.driverArgs()
      go/src/database/sql/convert.go:68 +0xb3d
  database/sql.resultFromStatement()
      go/src/database/sql/sql.go:1310 +0x372
  database/sql.(*Stmt).Exec()
      go/src/database/sql/sql.go:1289 +0x31c
  database/sql.(*concurrentTxStmtExecTest).test()
      go/src/database/sql/sql_test.go:1689 +0x1a0
  database/sql.func·035()
      go/src/database/sql/sql_test.go:1750 +0xe2

Goroutine 71 (running) created at:
  testing.(*B).RunParallel()
      go/src/testing/benchmark.go:410 +0x46e
  database/sql.BenchmarkManyConcurrentQueries()
      go/src/database/sql/sql_test.go:1964 +0x496
  testing.(*B).runN()
      go/src/testing/benchmark.go:124 +0xf2
  testing.(*B).launch()
      go/src/testing/benchmark.go:216 +0x1d0

Goroutine 61 (running) created at:
  testing.(*B).RunParallel()
      go/src/testing/benchmark.go:410 +0x46e
  database/sql.BenchmarkManyConcurrentQueries()
      go/src/database/sql/sql_test.go:1964 +0x496
  testing.(*B).runN()
      go/src/testing/benchmark.go:124 +0xf2
  testing.(*B).launch()
      go/src/testing/benchmark.go:216 +0x1d0

http://build.golang.org/log/bfca7707a0ccb2dd15e8228eb5b97434c33d1143

There is a runtime error after that, but the race could still be real.
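For anyone trying to reproduce this locally, the benchmark in question can be run under the race detector with the standard go test tooling (a sketch; this assumes a Go checkout and a platform where the race detector is supported):

go test -race -run=NONE -bench=ManyConcurrentQueries database/sql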

@dvyukov (Member, Author) commented Jan 28, 2015

May be related to:
https://go-review.googlesource.com/#/c/2207/
It is the only recent change to database/sql.

@dvyukov dvyukov added this to the Go1.5 milestone Jan 28, 2015
@johto (Contributor) commented Jan 28, 2015

Is there something I'm missing w.r.t. the line numbers in the race report? They seem completely bogus. E.g.:

database/sql.func·039()
      /tmp/buildlet-scatch184358971/go/src/database/sql/sql_test.go:1957 +0x1ea

my line 1957 looks like this: loses0 := drv.closeCount

@johto (Contributor) commented Jan 28, 2015

> Is there something I'm missing w.r.t. the line numbers in the race report? They seem completely bogus.

Oh, my bad. It was my tree that was completely bogus. Please ignore that.

@methane (Contributor) commented Jan 29, 2015

It's very strange.

Goroutine 61 runs database/sql.(*concurrentTxStmtExecTest).test(), but goroutine 61 was created from database/sql.BenchmarkManyConcurrentQueries().

@dvyukov (Member, Author) commented Jan 29, 2015

It looks like another incarnation of a bug fixed upstream by:
http://llvm.org/viewvc/llvm-project?view=revision&revision=224702

Just ignore everything below runtime.goexit in the stack of goroutine 61. Then it looks pretty consistent.

@minux (Member) commented Jan 29, 2015

So it's time to update the race runtime again? Or are you waiting until it's closer to the 1.5 release? Do you want me to file an issue to remind us to update the race runtime?

@methane (Contributor) commented Jan 29, 2015

I can't see how lines 1435 and 1436 could cause a race condition.

1426       rowsi, err = rowsiFromStatement(driverStmt{dc, si}, args...)
1427       if err == nil {
1428          // Note: ownership of ci passes to the *Rows, to be freed
1429          // with releaseConn.
1430          rows := &Rows{
1431             dc:    dc,
1432             rowsi: rowsi,
1433             // releaseConn set below
1434          }
1435          s.db.addDep(s, rows)
1436          rows.releaseConn = func(err error) {
1437             releaseConn(err)
1438             s.db.removeDep(s, rows)
1439          }
1440          return rows, nil
1441       }
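For reference, here is a minimal standalone sketch of the pattern at lines 1430-1439 (the type and function names are hypothetical stand-ins, not the real database/sql code). The rows value is freshly allocated and is only registered and assigned on the goroutine that created it, so a race between lines 1435 and 1436 should be impossible absent memory corruption:

package main

import "sync"

// rows and db are simplified, hypothetical stand-ins for sql.Rows and sql.DB.
type rows struct {
	releaseConn func(error)
}

type db struct {
	mu   sync.Mutex
	deps map[*rows]bool
}

// addDep mirrors the shape of DB.addDep: register a dependency under db.mu.
func (d *db) addDep(r *rows) {
	d.mu.Lock()
	defer d.mu.Unlock()
	d.deps[r] = true
}

// query mirrors the shape of the code quoted above.
func query(d *db) *rows {
	r := &rows{}                   // freshly allocated, local to this goroutine (1430)
	d.addDep(r)                    // corresponds to line 1435
	r.releaseConn = func(error) {} // corresponds to line 1436
	return r
}

func main() {
	d := &db{deps: make(map[*rows]bool)}
	var wg sync.WaitGroup
	for i := 0; i < 8; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			_ = query(d)
		}()
	}
	wg.Wait() // running this with `go run -race` reports no race
}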

@dvyukov (Member, Author) commented Jan 29, 2015

minux, you may file an issue if you wish.
Updating race runtimes is difficult; I need to figure out what I want to do for the race detector in this release cycle and then update the race runtime once.

@dvyukov (Member, Author) commented Jan 29, 2015

methane, it does not make sense to me either.
If it does not repeat on the builders, let's assume it is due to the heap corruption that caused the crash later.

@dvyukov (Member, Author) commented Jan 29, 2015

Here is another crash in the same place. This is starting to look alarming.

fatal error: all goroutines are asleep - deadlock!

goroutine 1 [chan receive]:
testing.(*B).run(0xc20807a200, 0x0, 0x0, 0x0, 0x0, 0x0)
    src/testing/benchmark.go:180 +0x91
testing.RunBenchmarks(0x6d2698, 0x785cc0, 0x9, 0x9)
    src/testing/benchmark.go:312 +0x7ec
testing.(*M).Run(0xc20800a140, 0x7ade40)
    src/testing/testing.go:495 +0x286
main.main()
    database/sql/_test/_testmain.go:158 +0x28d

goroutine 295 [semacquire]:
sync.runtime_Semacquire(0xc2082dfca0)
    src/runtime/sema.go:43 +0x2d
sync.(*WaitGroup).Wait(0xc2080445a0)
    src/sync/waitgroup.go:132 +0x1e8
testing.(*B).RunParallel(0xc20807a200, 0xc208044580)
    src/testing/benchmark.go:421 +0x497
database/sql.BenchmarkManyConcurrentQueries(0xc20807a200)
    src/database/sql/sql_test.go:1964 +0x497
testing.(*B).runN(0xc20807a200, 0x3e8)
    src/testing/benchmark.go:124 +0xf3
testing.(*B).launch(0xc20807a200)
    src/testing/benchmark.go:216 +0x1d1
created by testing.(*B).run
    src/testing/benchmark.go:179 +0x57

goroutine 487 [semacquire]:
sync.runtime_Semacquire(0xc208516cdc)
    src/runtime/sema.go:43 +0x2d
sync.(*Mutex).Lock(0xc208516cd8)
    src/sync/mutex.go:66 +0xfd
database/sql.(*DB).addDep(0xc208516cb0, 0x8007e7a80, 0xc20819b298, 0x63cf60, 0xc208502ba0)
    src/database/sql/sql.go:370 +0x46
database/sql.(*Stmt).Query(0xc20829a000, 0xc2081b5f68, 0x2, 0x2, 0x0, 0x0, 0x0)
    src/database/sql/sql.go:1435 +0x5f3
database/sql.func·039(0xc2082e6620)
    src/database/sql/sql_test.go:1957 +0x1eb
testing.func·002()
    src/testing/benchmark.go:418 +0x190
created by testing.(*B).RunParallel
    src/testing/benchmark.go:419 +0x46e

goroutine 478 [chan receive]:
database/sql.(*DB).connectionOpener(0xc20800dd60)
    src/database/sql/sql.go:595 +0x6d
created by database/sql.Open
    src/database/sql/sql.go:458 +0x41e

http://build.golang.org/log/045696ee3a83e4bc4055222ffc905f1c5e4d8086

@johto (Contributor) commented Jan 29, 2015

That suggests that either db.mu is leaked, or there's something terribly wrong with the compiler or runtime. Given the volume of problems lately, I'd be inclined to think that the latter is more likely.
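As an illustration of the first possibility (a hypothetical sketch of the failure mode, not something actually found in database/sql): if any path locked db.mu and returned without unlocking it, every later addDep/removeDep would block in semacquire forever, and the runtime would eventually report the same "all goroutines are asleep" fatal error seen above:

package main

import "sync"

func main() {
	var mu sync.Mutex // stands in for db.mu
	mu.Lock()         // imagine a code path that returns without Unlock

	done := make(chan struct{})
	go func() {
		mu.Lock() // blocks forever, like goroutine 487 in DB.addDep
		defer mu.Unlock()
		close(done)
	}()

	<-done // main blocks too: fatal error: all goroutines are asleep - deadlock!
}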

@rsc (Contributor) commented Jul 14, 2015

There's a lot of skepticism here and there are no later reports, so assuming this is some kind of corruption that has been fixed.

@rsc rsc closed this as completed Jul 14, 2015
@golang golang locked and limited conversation to collaborators Jul 13, 2016