spvBinaryParse API deficient: client needs to handle endianness; not easily composable #1
dneto0 added a commit to dneto0/SPIRV-Tools that referenced this issue on Nov 20, 2015:

Add members:
- words: a pointer to an array of the words in the instruction, in host-native endianness.
- num_words: the number of entries in the words array.

Remove member:
- offset

This simplifies clients of spvBinaryParse, because they don't have to handle endianness translation. It also makes the binary parse API more composable, allowing easy chaining of binary parse clients: a client is handed the array of words directly instead of having to index into some external array of all the words in the SPIR-V binary. It also allows a binary parse client to mutate the instruction stream before handing off to a downstream consumer.

TODO(dneto): Still need to write the unit tests for spvBinaryParse.

Fixes: KhronosGroup#1
dneto0 added a commit to dneto0/SPIRV-Tools that referenced this issue on Nov 20, 2015.
benvanik pushed a commit to xenia-project/SPIRV-Tools that referenced this issue on Nov 29, 2015.
dnovillo referenced this issue in dnovillo/SPIRV-Tools on Sep 21, 2017.
This was referenced Nov 24, 2017
s-perron added a commit that referenced this issue on Feb 23, 2018:
In some shaders there are a lot of very large and deeply nested structures, which creates a lot of work for scalar replacement. Also, since commit ca4457b we have been very aggressive about rewriting variables. This has caused a large increase in compile time from creating and then deleting instructions.

To help lower the cost, I want to run a cleanup of some of the easy loads and stores to remove. This reduces the number of symbols sroa has to work on. It also reduces the amount of code the simplifier has to simplify, because that code was not generated by sroa.

To confirm the improvement, I ran numbers on three different sets of shaders.

Time to run --legalize-hlsl:
- Set #1: 55.89s -> 12.0s
- Set #2: 1m44s -> 1m40.5s
- Set #3: 6.8s -> 5.7s

Time to run -O:
- Set #1: 18.8s -> 10.9s
- Set #2: 5m44s -> 4m17s
- Set #3: 7.8s -> 7.8s

Contributes to #1328.
This was referenced Jan 16, 2019
jaebaek added a commit to jaebaek/SPIRV-Tools that referenced this issue on Oct 15, 2020:
`DebugInfoManager::AddDebugValueIfVarDeclIsVisible()` adds an OpenCL.DebugInfo.100 DebugValue from a DebugDeclare only when the DebugDeclare is visible to the given scope. This helps us correctly handle a reference variable, e.g.:

    { // scope #1
      int foo = /* init */;
      { // scope #2
        int& bar = foo;
        ...

In the above code, we must not propagate the DebugValue of `int& bar` for store instructions in scope #1, because it is alive only in scope #2.

We have an exception: if the given DebugDeclare is used for a function parameter, `DebugInfoManager::AddDebugValueIfVarDeclIsVisible()` always has to add the DebugValue instruction regardless of the scope. This is because the initializer (store instruction) for the function parameter can be outside the parameter's scope (the function), in particular when the function was inlined. Without this change, the function parameter's value information always disappears whenever we run the function inlining pass followed by `DebugInfoManager::AddDebugValueIfVarDeclIsVisible()`.
jaebaek added a commit that referenced this issue on Oct 16, 2020.
The spvBinaryParse API isn't quite right, in two ways.

First, parsed instructions are delivered in the endianness of the input module, which might not be the host's, so every client has to do its own endianness translation.

Second, the parsed instruction refers to its contents only by offset. To read values from instructions, the client must have retained the base pointer to the module, and then index into it via the instruction offset and the operand offset. That's fine as far as it goes, but it isn't very composable. For example, suppose you want to chain multiple transforms together. The consumer at the end of the pipeline needs a base pointer somewhere to read values out of the instructions. But that means the last transform has to form a pointer relative to some array of words that was meaningful only at the beginning of the pipeline.
I think we can fix both problems in a reasonable way, as follows:
The spv_parsed_instruction_t should gain a "words" member of type "const uint32_t*". It points to the native-endian sequence of SPIR-V words for the entire instruction. (So, for example, words[0] contains the combined word count and opcode.) As before, the contract is that the words array is only valid for as long as the callback is executing, after which it may be discarded or overwritten.
In the common case where the original module is in the machine's native endianness, no copying is required: the words pointer is just computed from the original module's base pointer plus the instruction's offset. In the non-native-endian case there is some copying, but it's done by the parser and not the client. And since the lifetime of the data is restricted to the duration of the callback, we can reuse that space for the next instruction.
With the new design it becomes easy to build pipelines of analysis or transform steps in a space-efficient way. It should also be generally time-efficient, since the chain of callbacks will usually be working on the same chunk of instruction storage, which should stay hot in the cache.