Description
Currently, defer runs once the enclosing function exits.
I argue that it would be much more intuitive if deferred calls ran once their enclosing block went out of scope.
For instance, here is a "solution" to a classic concurrent-programming exercise: adding 2,000,000 to an integer, one at a time, from two goroutines:
package main

import (
    "fmt"
    "sync"
)

func main() {
    mux := new(sync.Mutex)
    wg := new(sync.WaitGroup)
    wg.Add(2)
    var x int
    go addOneMillion(&x, mux, wg)
    go addOneMillion(&x, mux, wg)
    wg.Wait()
    fmt.Println(x)
}

func addOneMillion(x *int, mux *sync.Mutex, wg *sync.WaitGroup) {
    for i := 0; i < 1000000; i++ {
        mux.Lock()
        defer mux.Unlock()
        *x = *x + 1
    }
    wg.Done()
}

Intuitively, you would expect each loop to run 1,000,000 times, occasionally waiting for the other goroutine to finish its increment.
Instead, the program gets stuck. A goroutine runs mux.Lock(), but the deferred mux.Unlock() will not run until the whole function returns; before that can happen, the next iteration calls mux.Lock() again, and the goroutine deadlocks on a mutex it already holds.
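For comparison, a minimal sketch of the usual workaround in current Go is to wrap the critical section in a function literal (or to call mux.Unlock() explicitly), so the deferred call fires once per iteration rather than at function exit:

// A common workaround today: the critical section lives in a function
// literal, so the deferred Unlock runs when that inner function returns,
// i.e. at the end of every iteration rather than at the end of addOneMillion.
func addOneMillion(x *int, mux *sync.Mutex, wg *sync.WaitGroup) {
    defer wg.Done()
    for i := 0; i < 1000000; i++ {
        func() {
            mux.Lock()
            defer mux.Unlock()
            *x = *x + 1
        }()
    }
}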
The other most common use for defer is freeing resources. Resources should be released as soon as possible for optimal performance, and a resource cannot be used outside its variable's scope anyway, so a defer should free the resource as soon as that scope is exited.
For instance:
func main() {
    // ...
    if condition {
        r, err := getSomeResource()
        if err != nil {
            panic(err)
        }
        defer r.Close()
        // do stuff with `r`...
        // under this proposal, r.Close() would be run right here
    }
    // we cannot do stuff with `r` as it is now out of scope,
    // so it should be freed at this point.
    someLongRunningOperation()
    // in current Go, r.Close() would be run all the way down here. What a waste!
}
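For contrast, here is a rough sketch of how the early release has to be written in current Go, reusing the hypothetical condition, getSomeResource, and someLongRunningOperation from the example above and introducing a hypothetical useSomeResource helper: the block is factored into its own function so the deferred Close fires at the right time.

// Current Go: to release r before someLongRunningOperation, the block has
// to be moved into its own (hypothetical) function, because defer only
// fires when the enclosing function returns.
func useSomeResource() error {
    r, err := getSomeResource()
    if err != nil {
        return err
    }
    defer r.Close()
    // do stuff with `r`...
    return nil // r.Close() runs here, as soon as the work is done
}

func main() {
    // ...
    if condition {
        if err := useSomeResource(); err != nil {
            panic(err)
        }
    }
    someLongRunningOperation() // r has already been closed by now
}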