The current tests are an inconsistent mess. Some files are hand-written. Others are auto-generated and enormous, and could be generated on the fly instead of being checked in. Some architectures are well tested. Others are not. Some subsets of architectures (AVX) are exhaustively tested. Others are not.
I would like a proper discussion about how to test the assembler (and the compiler, if you like). Without a clear picture, the tests will become enormous, wildly inconsistent, messy, and difficult to maintain.
Thank you for opening an issue.
Please, correct me where I'm wrong.
(quotes from https://go-review.googlesource.com/c/go/+/115858)
From @ianlancetaylor :
We had the x86test program, written by Russ, that generated the amd64enc.s test suite.
Many of those were commented out with TODO notes; I implemented the missing instructions and uncommented them automatically with the x86avxgen program (for legacy instructions, this was done by hand).
Because legacy x86 instructions already had coverage in amd64enc.s.
That may sound unfair, but if I'm working on the x86 architecture, it makes sense that I care more about that one.
To conclude: the x86 test suite has auto-generated tests for both new and older instructions.
I agree on this one, but when I joined the project, it already had a big auto-generated asm test file. I thought it was logical and consistent to add another one to cover the new instructions.
I'll describe the current situation to make this discussion easier, just in case any details are missing from my previous responses.
With x86, there were "almost" exhaustive auto-generated tests for both legacy and AVX instructions.
Hand-written tests also catch regression bugs.
This is the current approach; we can discuss alternatives and improvements now, if the situation is clear.
Where is the code for
I don't personally see the need to generate these test files on the fly. I tend to have a preference for keeping the files in the repo, but with a CI job that confirms they are in sync with the generator (for example run
However, I think it is problematic if the programs that generated them are unavailable or out of sync.
I don't have the same perspective as others here, so apologies if I'm speaking out of turn. It seems to me it would be great to get to the point where all x86 generated files in the Go repo can be reproduced from tools in
Ideally, some form of https://golang.org/cl/104496 could be landed as well. The goal would be that all these code generators take their input from the same (canonical) database of Go x86 instructions.
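As a rough illustration of the "single canonical database" idea, each generator could walk one shared instruction table and render its own output from it; the field names and the two sample encodings below are illustrative assumptions, not the real instruction-database schema:

```go
// Illustrative sketch: derive assembler encoding-test lines from
// one shared instruction table. Field names and the sample rows
// are assumptions for demonstration only.
package main

import "fmt"

type form struct {
	op   string // Go assembler mnemonic
	args string // operand list, in Go assembler order
	enc  string // expected machine code, hex
}

// table stands in for the canonical x86 instruction database.
var table = []form{
	{"ADDL", "AX, BX", "01c3"},
	{"ADDQ", "AX, BX", "4801c3"},
}

// testLine renders one amd64enc.s-style test line.
func testLine(f form) string {
	return fmt.Sprintf("\t%s %s // %s", f.op, f.args, f.enc)
}

func main() {
	for _, f := range table {
		fmt.Println(testLine(f))
	}
}
```

With all generators reading the same table, adding an instruction in one place would update the encoder, the decoder tables, and the test suites consistently.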
Thoughts? I'm interested in contributing in these areas if I can be of use :)
Great, thanks @quasilyte! From my point of view it doesn't need to be clean at all, just throw up whatever you have on your machine :) The initial goal is just to get to the point where we can reproduce all these generated files, then we can consider any cleanup/refactor.