investigating an error when import.ova occurs #1972
Comments
Can you run with Does this happen only when run by the job, or are you seeing it when run manually from a terminal too? We should probably have a way to disable the progress logger regardless of this issue, which could also serve as a workaround here if the problem/fix isn't clear. |
We are going to add the environment variable. As for a possible fix, without fully understanding what's going on, one option is to wrap the channel write operation in a guard to ensure the channel can be written to before attempting the send. I'd like to figure out, though, why a channel is being written to when the system implied it is |
We got the following stack trace:

$ govc import.ova -options=/tmp/options.json340456251 image/ops-manager-vsphere-2.9.1-build.121.ova
[07-05-20 22:30:00] Uploading pivotal-ops-manager-disk1.vmdk... OK
panic: send on closed channel
goroutine 33 [running]:
github.com/vmware/govmomi/vim25/progress.(*reader).Read(0xc0003b34c0, 0xc000236000, 0x2000, 0x2000, 0xc00040db18, 0x408b9b, 0xc00001e000)
/home/vmware/src/github.com/vmware/govmomi/vim25/progress/reader.go:116 +0x24c
io/ioutil.devNull.ReadFrom(0x0, 0x1431220, 0xc0003b34c0, 0x101e7a0, 0x203001, 0x7fd2b23154a8)
/home/dougm/golang-go/src/io/ioutil/ioutil.go:147 +0x92
io.copyBuffer(0x1433300, 0x1e86af8, 0x1431220, 0xc0003b34c0, 0x0, 0x0, 0x0, 0x158946800, 0x0, 0x0)
/home/dougm/golang-go/src/io/io.go:388 +0x2ed
io.Copy(...)
/home/dougm/golang-go/src/io/io.go:364
net/http.(*transferWriter).doBodyCopy(0xc000097900, 0x1433300, 0x1e86af8, 0x1431220, 0xc0003b34c0, 0x158946800, 0x0, 0x0)
/home/dougm/golang-go/src/net/http/transfer.go:400 +0x6a
net/http.(*transferWriter).writeBody(0xc000097900, 0x142d940, 0xc0003b3700, 0x2, 0x2)
/home/dougm/golang-go/src/net/http/transfer.go:364 +0x733
net/http.(*Request).write(0xc000346f00, 0x142d940, 0xc0003b3700, 0x0, 0xc0003faa50, 0x0, 0x0, 0x0)
/home/dougm/golang-go/src/net/http/request.go:682 +0x6d3
net/http.(*persistConn).writeLoop(0xc00009fd40)
/home/dougm/golang-go/src/net/http/transport.go:2207 +0x1c8
created by net/http.(*Transport).dialConn
/home/dougm/golang-go/src/net/http/transport.go:1575 +0xb23
goroutine 1 [runnable]:
runtime.SetFinalizer(0xfdf020, 0xc0002618c0, 0x0, 0x0)
/home/dougm/golang-go/src/runtime/mfinal.go:329 +0x89
os.(*file).close(0xc0002618c0, 0x14387f8, 0xc000247700)
/home/dougm/golang-go/src/os/file_unix.go:252 +0x101
os.(*File).Close(0xc0002a4270, 0xc000247828, 0xc000247700)
/home/dougm/golang-go/src/os/file_unix.go:233 +0x33
github.com/vmware/govmomi/govc/importx.(*TapeArchiveEntry).Close(0xc0003fa180, 0x0, 0x0)
/home/vmware/src/github.com/vmware/govmomi/govc/importx/archive.go:92 +0x34
github.com/vmware/govmomi/govc/importx.(*ovfx).Upload(0xc00017f810, 0x1442b60, 0xc000038070, 0xc00029b680, 0xc00003c930, 0x2c, 0xc0003d99a0, 0x1e, 0x0, 0x0, ...)
/home/vmware/src/github.com/vmware/govmomi/govc/importx/ovf.go:323 +0x375
github.com/vmware/govmomi/govc/importx.(*ovfx).Import(0xc00017f810, 0x127ab2b, 0x5, 0x0, 0x0, 0x0)
/home/vmware/src/github.com/vmware/govmomi/govc/importx/ovf.go:297 +0x9e6
github.com/vmware/govmomi/govc/importx.(*ova).Import(...)
/home/vmware/src/github.com/vmware/govmomi/govc/importx/ova.go:62
github.com/vmware/govmomi/govc/importx.(*ova).Run(0xc00000e160, 0x1442b60, 0xc000038048, 0xc000074840, 0x0, 0x0)
/home/vmware/src/github.com/vmware/govmomi/govc/importx/ova.go:51 +0x10e
github.com/vmware/govmomi/govc/cli.Run(0xc0000300d0, 0x3, 0x3, 0xc00003e0b8)
/home/vmware/src/github.com/vmware/govmomi/govc/cli/command.go:165 +0x56e
main.main()
/home/vmware/src/github.com/vmware/govmomi/govc/main.go:97 +0x64
goroutine 6 [syscall, 2 minutes]:
os/signal.signal_recv(0xc000054787)
/home/dougm/golang-go/src/runtime/sigqueue.go:147 +0x9c
os/signal.loop()
/home/dougm/golang-go/src/os/signal/signal_unix.go:23 +0x22
created by os/signal.init.0
/home/dougm/golang-go/src/os/signal/signal_unix.go:29 +0x41
goroutine 19 [IO wait]:
internal/poll.runtime_pollWait(0x7fd2b2351e68, 0x72, 0xffffffffffffffff)
/home/dougm/golang-go/src/runtime/netpoll.go:184 +0x55
internal/poll.(*pollDesc).wait(0xc0002a0098, 0x72, 0x1c00, 0x1c98, 0xffffffffffffffff)
/home/dougm/golang-go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
/home/dougm/golang-go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0xc0002a0080, 0xc00031e000, 0x1c98, 0x1c98, 0x0, 0x0, 0x0)
/home/dougm/golang-go/src/internal/poll/fd_unix.go:169 +0x1cf
net.(*netFD).Read(0xc0002a0080, 0xc00031e000, 0x1c98, 0x1c98, 0x203000, 0x0, 0x1c8b)
/home/dougm/golang-go/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc00000e3b8, 0xc00031e000, 0x1c98, 0x1c98, 0x0, 0x0, 0x0)
/home/dougm/golang-go/src/net/net.go:184 +0x68
crypto/tls.(*atLeastReader).Read(0xc0003372a0, 0xc00031e000, 0x1c98, 0x1c98, 0xc0000888c0, 0x41714e, 0xc0000888a0)
/home/dougm/golang-go/src/crypto/tls/conn.go:780 +0x60
bytes.(*Buffer).ReadFrom(0xc00022f058, 0x142daa0, 0xc0003372a0, 0x409c45, 0x101a5c0, 0x11e9680)
/home/dougm/golang-go/src/bytes/buffer.go:204 +0xb4
crypto/tls.(*Conn).readFromUntil(0xc00022ee00, 0x1432240, 0xc00000e3b8, 0x5, 0xc00000e3b8, 0x23)
/home/dougm/golang-go/src/crypto/tls/conn.go:802 +0xec
crypto/tls.(*Conn).readRecordOrCCS(0xc00022ee00, 0x0, 0x0, 0x3)
/home/dougm/golang-go/src/crypto/tls/conn.go:609 +0x124
crypto/tls.(*Conn).readRecord(...)
/home/dougm/golang-go/src/crypto/tls/conn.go:577
crypto/tls.(*Conn).Read(0xc00022ee00, 0xc000312000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/home/dougm/golang-go/src/crypto/tls/conn.go:1255 +0x161
net/http.(*persistConn).Read(0xc000187b00, 0xc000312000, 0x1000, 0x1000, 0xc00003e660, 0xc000088c20, 0x404cf5)
/home/dougm/golang-go/src/net/http/transport.go:1752 +0x75
bufio.(*Reader).fill(0xc0002e42a0)
/home/dougm/golang-go/src/bufio/bufio.go:100 +0x103
bufio.(*Reader).Peek(0xc0002e42a0, 0x1, 0x0, 0x0, 0x1, 0xc00003e200, 0x0)
/home/dougm/golang-go/src/bufio/bufio.go:138 +0x4f
net/http.(*persistConn).readLoop(0xc000187b00)
/home/dougm/golang-go/src/net/http/transport.go:1905 +0x1d6
created by net/http.(*Transport).dialConn
/home/dougm/golang-go/src/net/http/transport.go:1574 +0xafe
goroutine 20 [select]:
net/http.(*persistConn).writeLoop(0xc000187b00)
/home/dougm/golang-go/src/net/http/transport.go:2204 +0x123
created by net/http.(*Transport).dialConn
/home/dougm/golang-go/src/net/http/transport.go:1575 +0xb23
goroutine 22 [select, 2 minutes]:
github.com/vmware/govmomi/nfc.(*LeaseUpdater).waitForProgress(0xc0003fa150, 0xc00003c930, 0x2c, 0xc0003d99a0, 0x1e, 0x0, 0x0, 0x0, 0x158946800, 0x11, ...)
/home/vmware/src/github.com/vmware/govmomi/nfc/lease_updater.go:95 +0x11f
created by github.com/vmware/govmomi/nfc.newLeaseUpdater
/home/vmware/src/github.com/vmware/govmomi/nfc/lease_updater.go:79 +0x1a3
goroutine 23 [select]:
github.com/vmware/govmomi/nfc.(*LeaseUpdater).run(0xc0003fa150)
/home/vmware/src/github.com/vmware/govmomi/nfc/lease_updater.go:125 +0x148
created by github.com/vmware/govmomi/nfc.newLeaseUpdater
/home/vmware/src/github.com/vmware/govmomi/nfc/lease_updater.go:84 +0x1fc |
What version of govc are you using? The "send on closed channel" is at |
The latest stable release with official binaries -- v0.22.1. I see v0.22.2 has been tagged on |
If you are looking for release management with Go, our team has been really happy with goreleaser. It provides a consistent way of building binaries and publishing them to GitHub, etc. |
Right, v0.22.2 didn't impact govc, it was just a tag for govmomi on the release-0.22 branch. Regarding goreleaser, @frapposelli added support for that, but we had to revert to bash + github-release at some point. He started to bring back goreleaser in #1375 , but that's on the back burner too. We'd love some help getting that done if your team is interested in helping! |
The fix that you pointed out says it was only released in v0.22.2. Does it affect |
No, that fix has been included since v0.17.1: v0.17.0...v0.17.1 I think what you're referring to is the GitHub UI displaying the latest tag that includes that commit. If you click on the v0.22.2 only includes one commit since v0.22.1: v0.22.1...v0.22.2 |
If the fix was released in v0.22.1, then we are experiencing something different. I admit that navigating the code to "hunt down the bug" is difficult. The CLI is request and response, albeit via SOAP, and there doesn't appear to be streaming. The layering of goroutines and channels for a single request and response makes it difficult to navigate. We know that the reader is closed; the initiate happens on a The asynchronous coordination of the channels makes it hard to know when |
Also, I tried to illustrate that here.

```go
package main

import (
	"fmt"
)

type Namer interface {
	Name() string
}

type A struct {
	name string
}

func (n A) Name() string {
	return n.name
}

func main() {
	a := A{"Mary"}
	w := make(chan Namer)
	go func() { w <- a }()
	fmt.Printf("a = %#v\n", a)
	b := <-w
	fmt.Printf("b = %#v\n", b)
}
```

Outputs that
I think this means |
We are going to set |
I haven't looked at the progress code much myself; I agree it is tough to follow. But it was written 6+ years ago, and the only other issue in that time was #1057 (which someone else fixed). I still think that, regardless of this issue, we should have an option to disable the progress reporting. |
This may not be the only other issue; it may just be the only other reported issue. So far, setting |
I'm not doubting there are bug(s) in the progress code. My point was just that I don't know the progress code myself, as there hasn't been a need. Glad you found a workaround. You mentioned this only happens in CI and you weren't able to reproduce it when running manually. |
It was not on the same system. The reason being that the network connection from my workstation is too slow to upload 5 GB to the DC; the CI system has a direct connection. For human reasons, I didn't want to wait.

16, a large worker. It is running many other jobs in parallel. Nothing CPU intensive, mainly network I/O. |
I'm closing this as we were able to work around the problem by setting |
We have a job that runs a `govc import.ova` periodically. We are experiencing a bug (20%) with the upload. The upload is successful, the `import.ova` is complete. I'm trying to capture this for SEO and to start our investigation.