I'm looking for ideas/guidance on what needs to be done to support generating documentation for large modules that exceed the current limits. I still need to look into the code, but I assume some sort of lazy module loading would be required. Guidance, ideas, and thoughts are highly appreciated. I'd be happy to implement support for large modules.
Thanks for reporting this. As you point out, there are limits on the number of packages per module (10,000). Does your module exceed that size? I am told that the number of Go source files shouldn't by itself be a problem, though individual very large files are skipped. We're curious to hear as many details about this monster module as you're comfortable sharing.
Yes, the module contains considerably more packages than the 10k-packages-per-module limit. It's a large monorepo structured as a single Go module for centralized dependency management and compatibility reasons. I will share the rough number of packages the module contains once I have those numbers.
That's quite surprising: the packages.Load operation is essentially a wrapper around go list -json, which retrieves the metadata (directory, filenames, package name, etc.) for each package in the workspace and is usually efficient and reliable. Are you able to print the list of arguments passed to packages.Load? If so, could you try running these two commands:
$ time go run golang.org/x/tools/go/packages/gopackages@master ./...
$ time go list -json ./...
(Replace ./... with the arguments you observe in the call to packages.Load.) I would expect both to run to completion rapidly, on the order of several hundred packages per second, so a 10K-package repo should take around half a minute.