Consider exposing low-level Shader / ConstantBuffer API #3471
I see from #761 and this comment that there is a long-term plan to expose a Shader API to users. I wanted to add my +1 to this idea, and get a discussion going about whether - and if so, how best - to do so.
(As background: up to v3.x, XNA did have APIs for working directly with shaders, but these APIs were removed in v4.0. You could create VertexShader and PixelShader objects, and there were corresponding sets of methods on `GraphicsDevice` for setting vertex and pixel shader constants.)
The existing internal `Shader` and `ConstantBuffer` classes seem like a natural starting point for such an API.
One question I want to ask, and my apologies if this has been discussed already: are there any plans in the medium/long-term to move to a Direct3D 11-style (or more specifically, Direct3D 11 as interpreted by SharpDX) API, where there are separate classes for each pipeline stage? I can see arguments both ways, and it certainly doesn't have to be decided as part of this issue, but I wanted to raise it as it might affect the public API for shaders and constant buffers.
I don't know how controversial the idea of exposing a low-level shader API is going to be, so I'd like to hear what others think.
Are there any platforms currently supported by MonoGame that don't support low-level shaders, but do support effects? (I assume not, but it's worth asking.)
An important consideration is how this will affect the content pipeline. I think it would make sense to make shaders first-class citizens in the content pipeline, with a dedicated importer and processor for them.
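To make that concrete, here's a rough sketch of what a dedicated processor could look like. The `CompiledShaderContent` output type and the compile step are placeholders, not existing MonoGame API:

```csharp
using Microsoft.Xna.Framework.Content.Pipeline;

// Hypothetical output type: just the compiled bytecode and the profile it targets.
public class CompiledShaderContent
{
    public byte[] Bytecode;
    public string Profile;
}

// Hypothetical sketch of a first-class shader processor. The actual compile step
// would call into the platform's shader compiler; it is left as a stub here.
[ContentProcessor(DisplayName = "Shader - Hypothetical")]
public class ShaderProcessor : ContentProcessor<string, CompiledShaderContent>
{
    public string Profile { get; set; }

    public override CompiledShaderContent Process(string shaderSource, ContentProcessorContext context)
    {
        byte[] bytecode = CompileForCurrentPlatform(shaderSource, Profile);
        return new CompiledShaderContent { Bytecode = bytecode, Profile = Profile };
    }

    private static byte[] CompileForCurrentPlatform(string source, string profile)
    {
        // Placeholder - the real implementation would invoke fxc/MojoShader/etc.
        throw new System.NotImplementedException();
    }
}
```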
(This paragraph is not directly related to this discussion, but I wanted to mention why I'm asking for this feature: I have a shader generation system, running as a content pipeline processor, which takes as input a surface shader, and produces quite a few (currently 36) shader permutations. Up until recently, I've been writing all these permutations out into a single temporary effect file.)
Now, I realise that this example is fairly extreme, so I'm hoping others can chime in with their thoughts too, so we get a balanced view of how useful this change would be. Obviously, additional public APIs mean additional maintenance overhead, so the decision needs to be made carefully.
I generally think so, but we need to finish some work first. If you look at the current `EffectPass` implementation, you will see it directly uses the internal `Shader` and `ConstantBuffer` classes.
Also I want to take a closer look at the Apple Metal and DirectX 12 APIs and see if the basic concepts we're exposing still make sense for them.
I never understood the benefit of that. Can you explain why we might want that?
In theory they could use either and be fine... or even mix them. Really I think all we can do is depend on documentation to explain things.
The PlayStation Mobile stuff is the only oddity we have when it comes to shaders. But it shouldn't stop us from making progress.
Right, we want importers and processors that work on individual shaders.
When I wrote the current MGFX system and added these low-level shader APIs, I was specifically thinking of your situation. It is why `Shader` and `ConstantBuffer` exist as separate internal classes rather than being baked into `Effect`.
I have been looking at those myself for a while and came to the conclusion that MonoGame (in its current state) will be hard to transition to those APIs without a lot of hacking and losing some of the advantages those APIs offer.
That being said, I am curious to hear from other people on this subject.
IMO we have three things we should do.
First, catch up to DX11-style deferred contexts for multithreading. This would be like adding a deferred context object alongside the immediate one, so rendering commands can be recorded from multiple threads.

Second, we need to be able to use DX12/Metal under the current XNA-style APIs. We won't get very much performance improvement from it... but it allows compatibility for old XNA code. This also sets us up to be able to drop DX11 and GLES on iOS.

Last, we enhance the APIs to support DX12/Metal more directly. These would be optional APIs to start with, but could eventually become the new API replacing the XNA4 ones.
Yeah... if the platform doesn't support DX12/Metal style APIs then they have to use the old XNA4 style APIs.
I guess we should discuss exactly what the public API would look like. I was imagining it would be close to the underlying Direct3D (and I think OpenGL?) API, something like:
```csharp
GraphicsDevice.SetVertexShaderConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.SetVertexShaderConstantBuffers(int slot, ConstantBuffer[] buffers);
GraphicsDevice.SetPixelShaderConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.SetPixelShaderConstantBuffers(int slot, ConstantBuffer[] buffers);
// etc.
```
It's only an aesthetic difference, really. Instead of the above, we could have:
```csharp
GraphicsDevice.VertexShader.Shader = shader;
GraphicsDevice.VertexShader.SetConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.VertexShader.SetConstantBuffers(int slot, ConstantBuffer[] buffers);
GraphicsDevice.PixelShader.Shader = shader;
GraphicsDevice.PixelShader.SetConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.PixelShader.SetConstantBuffers(int slot, ConstantBuffer[] buffers);
// etc.
```
where most of the methods are defined in a common shader stage base class.
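As a very rough sketch (none of these types exist today; `Shader` and `ConstantBuffer` here are the proposed public types from this discussion):

```csharp
// Illustrative only - a common base class shared by every programmable stage.
public abstract class ShaderStage
{
    // The shader currently bound to this pipeline stage.
    public Shader Shader { get; set; }

    public abstract void SetConstantBuffer(int slot, ConstantBuffer buffer);
    public abstract void SetConstantBuffers(int slot, ConstantBuffer[] buffers);
}
```

`GraphicsDevice.VertexShader` and `GraphicsDevice.PixelShader` would then just be two instances (or subclasses) of this, so the per-stage surface stays identical.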
While we're talking about this, I think we should consider #2602 as well. Currently we have `GraphicsDevice.SamplerStates` and `GraphicsDevice.Textures`.
But we could think about:
```csharp
GraphicsDevice.VertexShader.SamplerStates;
GraphicsDevice.VertexShader.Textures;
GraphicsDevice.PixelShader.SamplerStates;
GraphicsDevice.PixelShader.Textures;
```
(For backwards compatibility, we could keep the existing properties on `GraphicsDevice` and have them forward to the pixel shader stage, as in XNA.)
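A minimal sketch of that forwarding, assuming the per-stage collections above exist:

```csharp
// Illustrative only: the existing GraphicsDevice properties would simply forward
// to the pixel shader stage, matching how XNA treated Textures/SamplerStates.
public TextureCollection Textures
{
    get { return PixelShader.Textures; }
}

public SamplerStateCollection SamplerStates
{
    get { return PixelShader.SamplerStates; }
}
```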
And on a related note...
I know about those APIs at a high level, but I haven't looked in detail at their API surfaces (I'm not sure if the Direct3D 12 API is public yet?). I agree, it's important that whatever changes we make now will at least have a migration path to these more modern APIs. Do you have beta access to the Direct3D 12 SDK? (or maybe if you do, you can't say!)
I think I see what you are getting at.
... or maybe also something more low-level like...
The point being, if you are working with the low-level API directly, it is obvious you are not going through the regular `Effect` path.
This also helps clear up the confusion on what to use...
Yeah... that is a good reason to do that.
I do... but can't share anything I learn about it here. :(
Yes, exactly - although for consistency with `VertexBuffer` and `IndexBuffer` it could look something like:
```csharp
new ConstantBuffer(GraphicsDevice device, int sizeInBytes);

// Most common usage - setting a struct, which has the same layout as the shader cbuffer, at offset 0.
ConstantBuffer.SetData<T>(int offset, T data);

// Not sure when people would use this overload - MonoGame doesn't offer any public
// APIs for converting between structures and byte arrays.
ConstantBuffer.SetData(int offset, byte[] data, int count);
```
Since (in Direct3D at least) we create our constant buffers with a fixed size up front, the `sizeInBytes` constructor parameter maps onto that directly.
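To make the intended usage concrete, here's a hypothetical caller. The cbuffer layout and all names are made up for the example, and `ConstantBuffer`, `SetData` and `SetVertexShaderConstantBuffer` are the proposed APIs from this thread, not existing MonoGame members:

```csharp
using System.Runtime.InteropServices;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Hypothetical CPU-side mirror of a shader cbuffer. Sequential layout so the
// uploaded bytes match what the shader expects.
[StructLayout(LayoutKind.Sequential)]
struct PerFrameConstants
{
    public Matrix WorldViewProjection;  // 64 bytes
    public Vector4 TintColor;           // 16 bytes
}

static class ConstantBufferExample
{
    public static ConstantBuffer Create(GraphicsDevice device)
    {
        // Size the buffer from the struct, matching the proposed constructor.
        return new ConstantBuffer(device, Marshal.SizeOf(typeof(PerFrameConstants)));
    }

    public static void Draw(GraphicsDevice device, ConstantBuffer perFrame)
    {
        var data = new PerFrameConstants
        {
            WorldViewProjection = Matrix.Identity,
            TintColor = Color.White.ToVector4(),
        };

        // Upload the struct at offset 0 and bind the buffer to vertex shader slot 0.
        perFrame.SetData(0, data);
        device.SetVertexShaderConstantBuffer(0, perFrame);
        // ... bind shaders, vertex buffers, and issue the draw call as usual.
    }
}
```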
Something else I just thought about: do MGFX and MojoShader support `cbuffer` declarations in the shader source?
Remember, the first customer for the low-level shader API is our own `Effect` system.
Good... there is no reason to ever do that.
MojoShader does not... it is a SM3.0 system.
For HLSL platforms we can, as we use the DirectX shader compiler and target shader model 4.0 and up.
Not sure what you mean by "built-in effects". The Microsoft Effect system was deprecated in DX11. We cannot count on it in the future.
That is another conversation... shouldn't muddy this up with that.
Oh yes, good point. Do you think the raw byte overload should be exposed immediately though, or should it wait until we do have public APIs to convert between structures and byte arrays?
Sorry, I meant the stock effects that ship with MonoGame (`BasicEffect` and friends).
In which case, until we have a way to compile GLSL shaders directly (is there an issue for that? I know it's been mentioned a few times), you won't be able to use constant buffers on the OpenGL platforms.
C# has APIs that can be used for that, so I don't see why we need to provide any.
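For example, something along these lines works with plain .NET today (sketch only):

```csharp
using System.Runtime.InteropServices;

static class StructBytes
{
    // Copies any blittable struct into a byte array using only existing .NET APIs,
    // so MonoGame itself would not need to ship a converter.
    public static byte[] ToBytes<T>(T value) where T : struct
    {
        var bytes = new byte[Marshal.SizeOf(typeof(T))];
        var handle = GCHandle.Alloc(bytes, GCHandleType.Pinned);
        try
        {
            Marshal.StructureToPtr(value, handle.AddrOfPinnedObject(), false);
        }
        finally
        {
            handle.Free();
        }
        return bytes;
    }
}
```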
They don't need to be changed at all.
We can use them right now with MojoShader... so we need to preserve that functionality in the short term. MojoShader just builds some fake constant buffers for float/int by packing all the data into an array.
Yes - I was only using that example to confirm that MG already supports constant buffers internally across platforms.
That's the part I was getting at - it depends on how MojoShader does that. If it does it in a predictable way, such that you can write a struct that matches the packed layout, then the same `SetData` approach could work there too.
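For illustration, assuming everything is packed into consecutive float4 registers (which is how SM3.0-style shader constants are laid out), the CPU-side struct might look like:

```csharp
using System.Runtime.InteropServices;
using Microsoft.Xna.Framework;

// Illustrative only: a struct mirroring uniforms packed into consecutive float4
// registers. Every field here already fills whole registers; smaller parameters
// would need explicit padding, and matrices may need transposing depending on
// what the shader expects.
[StructLayout(LayoutKind.Sequential)]
struct PackedUniforms
{
    public Matrix WorldViewProjection;  // registers c0-c3
    public Vector4 LightDirection;      // register c4
    public Vector4 DiffuseColor;        // register c5
}
```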
What sort of timeline do you think is sensible for this? Presumably after 3.3, at least?