Excessive memory usage in Go: A concrete example #647
This test was run on a Mac mini with a 2 GHz Intel Core 2 Duo processor (running Mac OS X 10.6) and under Linux (Ubuntu 8.04 LTS) on a Pentium processor. The result is the same: Go uses much more memory than Python and executes at about the same speed (which is not very fast for a compiled language). What puzzles me most is that when I start godoc -http=:8080 in order to browse the docs in Mozilla, the resulting process uses up about 300 megabytes of RAM (one third of the available RAM), which I cannot explain. This can be seen in the Activity Monitor or by running "top -U <username>" from the command line. How much RAM does "godoc -http=:8080" use on your PC, then? Serge.
One last comment (for completeness): The effect described above becomes a real handicap if one parses a text file which is about 10 MB large (or more), using the test program listed above or something similar. Presently, Python and C++ can easily parse text files which are over 100 MB large, whereas Go simply grinds to a halt because it uses gigabytes of RAM to process a file which is merely 100 MB large (using a routine similar to the naive one listed above). The whole point of this test is to assess the capacity of Go to process large text files, in order to determine if Go can be used to implement natural language processing applications better than Python or C++. In that field, processing large corpora is a routine task. Unless I did something wrong or missed an important point, it appears that Go, in its present form, cannot compete with Python, C++, or C at processing large text files. Serge.
I'd try running the latest version -- 'hg sync' in $GOROOT and rebuild. With the latest version I'm not seeing memory usage go past 10 MB. You can monitor memory usage using runtime.MemStats:

```go
func monitor() {
	for {
		println(runtime.MemStats.Alloc) // bytes of allocated heap
		time.Sleep(1e9)                 // 1e9 ns = 1 second
	}
}
```

And at the top of main, just add:

```go
runtime.GOMAXPROCS(2) // use two cores
go monitor()
```

When I run that on a 3.9 million line file (just Hamlet concatenated a bunch of times), the highest mem usage I get is 10.9 MB.
Thanks for taking the time to do a fair comparison. This is indeed a bug in the garbage collector policy. http://golang.org/cl/257041 fixes the bug, which cuts the memory usage from 50 MB to 8 MB when reading Project Gutenberg's shaks12.txt. (The Python program is somewhere just north of 6 MB.) Owner changed to r...@golang.org.
This issue was closed by revision 8ddd6c4. Status changed to Fixed.
Reported by serge.hulne.