Consider exposing low-level Shader / ConstantBuffer API #3471

Open
tgjones opened this issue Jan 28, 2015 · 11 comments
Labels
Design (Requires API decisions) · MGFX (Shader compilation tool)

Comments

@tgjones
Contributor

tgjones commented Jan 28, 2015

I see from #761 and this comment that there is a long-term plan to expose a Shader API to users. I wanted to add my +1 to this idea, and get a discussion going about whether - and if so, how best - to do so.

(As background: up to v3.x, XNA did have APIs for working directly with shaders, but these APIs were removed in v4.0. You could create VertexShader and PixelShader objects, and there were corresponding sets of Get*Shader*Constant and Set*ShaderConstant methods on GraphicsDevice.)

The existing - internal - Shader and ConstantBuffer classes already look well-factored to me, and GraphicsDevice already has - internal - properties for VertexShader and PixelShader, and a SetConstantBuffer method.

One question I want to ask, and my apologies if this has been discussed already: are there any plans in the medium/long-term to move to a Direct3D 11-style (or more specifically, Direct3D 11 as interpreted by SharpDX) API, where there are separate classes for each pipeline stage? I can see arguments both ways, and it certainly doesn't have to be decided as part of this issue, but I wanted to raise it as it might affect the public API for shaders and constant buffers.

I don't know how controversial the idea of exposing a low-level Shader / ConstantBuffer API is - there's certainly potential for confusing new users, who would need to choose between shaders and effects. Any thoughts on this?

Are there any platforms currently supported by MonoGame that don't support low-level shaders, but do support effects? (I assume not, but it's worth asking.)

An important consideration is how this will affect the content pipeline. I think it would make sense to make shaders first-class citizens in the content pipeline, with a ShaderImporter, ShaderProcessor, etc. I suspect there would be more work involved here than in the runtime side of the API, but if this is the route that we go down, I'd be happy to help. This was always the missing piece back when XNA did have a runtime API for shaders - as far as I know, there was no content pipeline support for these low-level shader objects; you had to compile the shader yourself, and then pass the bytecode to the relevant constructor in the runtime types.
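
For illustration, here's a rough sketch of what that pipeline support might look like - ShaderImporter, ShaderProcessor and CompiledShaderContent are all hypothetical names, modelled on the existing ContentImporter / ContentProcessor base classes:

using System.IO;
using Microsoft.Xna.Framework.Content.Pipeline;

// Hypothetical output type: the compiled bytecode for a single shader.
public class CompiledShaderContent
{
    public byte[] Bytecode;
}

[ContentImporter(".hlsl", DisplayName = "Shader Importer", DefaultProcessor = "ShaderProcessor")]
public class ShaderImporter : ContentImporter<string>
{
    // Like the effect importer, just hand the raw source text to the processor.
    public override string Import(string filename, ContentImporterContext context)
    {
        return File.ReadAllText(filename);
    }
}

[ContentProcessor(DisplayName = "Shader Processor")]
public class ShaderProcessor : ContentProcessor<string, CompiledShaderContent>
{
    public override CompiledShaderContent Process(string input, ContentProcessorContext context)
    {
        // Compile a single vertex/pixel shader to platform bytecode here,
        // rather than a whole effect.
        throw new System.NotImplementedException();
    }
}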


(This paragraph is not directly related to this discussion, but I wanted to mention why I'm asking for this feature: I have a shader generation system, running as a content pipeline processor, which takes as input a surface shader, and produces quite a few (currently 36) shader permutations. Up until recently, I've been writing all these permutations out into a single temporary .mgfx file, and then calling MG's EffectProcessor to compile it, and loading it as an Effect at runtime (and then converting it back into my own hierarchy of objects so that I can manage the permutations). But now that I need some of those permutations to have different vertex shaders, it's becoming unwieldy to use a single .mgfx file. I want to go with the classic uber-shader approach of pre-processor #defines, but that means each shader permutation will need to be compiled separately (compiling each shader separately is exactly what the MGFX compiler is doing internally, but that's not exposed, which makes sense, because there's currently no runtime support for individual shaders). My workaround is to compile each permutation into its own Effect, but since most of the vertex shaders are the same, that means there is a lot of duplication (unless MonoGame de-dupes at a Shader level, but I don't think it does). My ideal situation is to work with Shader objects directly, both in the content pipeline and at runtime, because my shader permutation system doesn't really fit into the existing effects framework.)


Now, I realise that this example is fairly extreme, so I'm hoping others can chime in with their thoughts too, so that we get a balanced view of how useful this change would be. Obviously, additional public APIs mean additional maintenance overhead, so the decision needs to be made carefully.

@tomspilman
Member

classes already look well-factored to me

I generally think so, but we need to finish some work first. If you look at ConstantBuffer:

https://github.com/mono/MonoGame/blob/develop/MonoGame.Framework/Graphics/Shader/ConstantBuffer.cs#L98

You will see it directly uses EffectParameter types. That needs to be fixed... but I'm not sure exactly how. I don't think we want a ShaderParameter. We probably just want an API like SetParameter(offset,byte[]) and let the caller be responsible for the rest. I think we need to make that change before we make any shader APIs public.
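
Something as simple as this, say (just a sketch - _buffer and _dirty stand in for whatever backing store ConstantBuffer ends up with):

// Inside ConstantBuffer: the caller is responsible for layout; we just blit bytes.
public void SetParameter(int offset, byte[] data)
{
    System.Buffer.BlockCopy(data, 0, _buffer, offset, data.Length);
    _dirty = true; // re-upload to the GPU before the next draw
}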

Also I want to take a closer look at the Apple Metal and DirectX 12 APIs and see if the basic concepts we're exposing still make sense for them.

separate classes for each pipeline stage?

I never understood the benefit of that. Can you explain why we might want that?

potential for confusing new users, if you need to choose between shaders and effects

In theory they could use either and be fine... or even mix them. Really I think all we can do is depend on documentation to explain things.

Are there any platforms currently supported by MonoGame that don't support low-level shaders

The PlayStation Mobile stuff is the only oddity we have when it comes to shaders. But it shouldn't stop us from making progress.

how this will affect the content pipeline.

Right, we want importers and processors that work on .hlsl, .glsl, etc.

I have a shader generation system

When I wrote the current MGFX system and added these low level shader APIs I was specifically thinking of your situation. It is why Shader and ConstantBuffer exist! :)

@prollin
Contributor

prollin commented Jan 29, 2015

Also I want to take a closer look at the Apple Metal and DirectX 12 APIs and see if the basic concepts we're exposing still make sense for them.

I have been looking at those myself for a while and came to the conclusion that MonoGame (in its current state) will be hard to transition to those APIs without a lot of hacking and losing some of the advantages those APIs offer.
In my opinion, a better approach would be to mimic those APIs and implement the missing parts in software for the older platforms (or simply state that certain features are not supported).

That being said, I am curious to hear from other people on this subject.

@tomspilman
Member

I am curious to hear from other people on this subject.

IMO we have three things we should do.

First, catch up to DX11-style deferred contexts for multithreading. This would be like adding a GraphicsContext which has all the rendering stuff, and changing GraphicsDevice to inherit from it (sketched after these three points). This way old XNA code still works identically. This style of threading has limited benefits... but it has some.

Second, we need to be able to use DX12/Metal under the current XNA-style APIs. We won't get very much performance improvement from it... but it allows compatibility with old XNA code. This also sets us up to be able to drop DX11 and GLES on iOS.

Last, we enhance the APIs to support DX12/Metal more directly. These would be optional APIs to start with, but could eventually become the new API, replacing the XNA4 ones.
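
To sketch that first point (GraphicsContext and CreateDeferredContext are hypothetical names):

public class GraphicsContext
{
    // All the rendering state and draw calls would move here:
    // SetVertexBuffer, Textures, SamplerStates, DrawIndexedPrimitives, etc.
}

public class GraphicsDevice : GraphicsContext
{
    // Device-wide concerns (resource creation, Present) stay here, so code
    // that only knows GraphicsDevice keeps working identically.

    // Worker threads would record commands on deferred contexts:
    public GraphicsContext CreateDeferredContext()
    {
        throw new System.NotImplementedException();
    }
}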

or simply state that certain features are not supported

Yeah... if the platform doesn't support DX12/Metal style APIs then they have to use the old XNA4 style APIs.

@tgjones
Contributor Author

tgjones commented Jan 31, 2015

We probably just want an API like SetParameter(offset,byte[]) and let the caller be responsible for the rest.

I guess we should discuss exactly what the public API would look like. I was imagining it would be close to the underlying Direct3D (and I think OpenGL?) API, something like:

GraphicsDevice.SetVertexShaderConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.SetVertexShaderConstantBuffers(int slot, ConstantBuffer[] buffers);
GraphicsDevice.SetPixelShaderConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.SetPixelShaderConstantBuffers(int slot, ConstantBuffer[] buffers);
// etc.

where ConstantBuffer is basically just a bag of bytes. If so, I don't see a need for a SetParameter API. But were you thinking we'd instead expose methods on ConstantBuffer to let you set a parameter by name, instead of setting a whole struct at a time (which would require some SharpDX-style interop methods)? (Looking at the ConstantBuffer code in more detail, I guess this is what you were thinking.)

separate classes for each pipeline stage?

I never understood the benefit of that. Can you explain why we might want that?

It's only an aesthetic difference, really. Instead of the above, we could have:

GraphicsDevice.VertexShader.Shader = shader;
GraphicsDevice.VertexShader.SetConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.VertexShader.SetConstantBuffers(int slot, ConstantBuffer[] buffers);
GraphicsDevice.PixelShader.Shader = shader;
GraphicsDevice.PixelShader.SetConstantBuffer(int slot, ConstantBuffer buffer);
GraphicsDevice.PixelShader.SetConstantBuffers(int slot, ConstantBuffer[] buffers);
// etc.

where most of the methods are defined in a CommonShaderStage base class. It keeps the GraphicsDevice class a bit tidier when we add other shader stages in the future, and it emphasises the similarity between shader stages (which may or may not work for us, depending on whether this maps well to platforms other than Direct3D). In any case, it's not a big deal.
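
To make that concrete, a sketch (all type names hypothetical):

public abstract class CommonShaderStage
{
    public Shader Shader { get; set; }

    public void SetConstantBuffer(int slot, ConstantBuffer buffer) { /* bind one */ }
    public void SetConstantBuffers(int slot, ConstantBuffer[] buffers) { /* bind several */ }
}

public sealed class VertexShaderStage : CommonShaderStage { }
public sealed class PixelShaderStage : CommonShaderStage { }

// GraphicsDevice would then just expose:
//   public VertexShaderStage VertexShader { get; }
//   public PixelShaderStage PixelShader { get; }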

While we're talking about this, I think we should consider #2602 as well. Currently we have:

GraphicsDevice.SamplerStates;
GraphicsDevice.Textures;

But we could think about:

GraphicsDevice.VertexShader.SamplerStates;
GraphicsDevice.VertexShader.Textures;
GraphicsDevice.PixelShader.SamplerStates;
GraphicsDevice.PixelShader.Textures;

(For backwards compatibility, we could keep the existing properties on GraphicsDevice as well.)

And on a related note...

I want to take a closer look at the Apple Metal and DirectX 12 APIs and see if the basic concepts we're exposing still make sense for them.

I know about those APIs at a high level, but I haven't looked in detail at their API surfaces (I'm not sure if the Direct3D 12 API is public yet?). I agree, it's important that whatever changes we make now will at least have a migration path to these more modern APIs. Do you have beta access to the Direct3D 12 SDK? (Or maybe if you do, you can't say!)

When I wrote the current MGFX system and added these low level shader APIs I was specifically thinking of your situation. It is why Shader and ConstantBuffer exist! :)

Awesome!

@tomspilman
Member

where ConstantBuffer is basically just a bag of bytes. If so, I don't see a need for a SetParameter API.

I think I see what you are getting at.

ConstantBuffer and Shader should be hard to use directly. There is no way to do ConstantBuffer.SetParameter(name, data) if you are working with ConstantBuffer and Shader directly. If you want that sort of API use Effect.

ConstantBuffer just exposes something like...

ConstantBuffer.CopyBytes(int offset, byte[] data, int count);
ConstantBuffer.CopyStruct<T>(int offset, T dataStruct) where T : struct;

... or maybe also something more low level like...

byte[] ConstantBuffer.Data { get }
ConstantBuffer.MarkDirty();

The point being if you are working with ConstantBuffer there is no concept of parameters.

This also helps clear up the confusion on what to use... Effect is easy to use while Shader is for more advanced users.
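
For what it's worth, CopyStruct<T> could be implemented with nothing but Marshal (a sketch - _buffer and _dirty are again whatever internal state ConstantBuffer keeps):

using System.Runtime.InteropServices;

public void CopyStruct<T>(int offset, T dataStruct) where T : struct
{
    int size = Marshal.SizeOf(typeof(T));
    // Pin the (boxed) struct so we can copy its raw bytes into the CPU-side buffer.
    var handle = GCHandle.Alloc(dataStruct, GCHandleType.Pinned);
    try
    {
        Marshal.Copy(handle.AddrOfPinnedObject(), _buffer, offset, size);
        _dirty = true;
    }
    finally
    {
        handle.Free();
    }
}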

It keeps the GraphicsDevice class a bit tidier when we add other shader stages in the future

Yeah... that is a good reason to do that.

Do you have beta access to the Direct3D 12 SDK?

I do... but can't share anything I learn about it here. :(

@tgjones
Contributor Author

tgjones commented Feb 1, 2015

The point being if you are working with ConstantBuffer there is no concept of parameters.

Yes, exactly - although for consistency with VertexBuffer and IndexBuffer I might suggest something like:

new ConstantBuffer(GraphicsDevice device, int sizeInBytes);

// Most common usage - setting a struct, which has the same layout as the shader cbuffer, at offset 0.
ConstantBuffer.SetData<T>(int offset, T data);

// Not sure when people would use this overload - MonoGame doesn't offer any public APIs for converting between structures and byte arrays
ConstantBuffer.SetData(int offset, byte[] data, int count);

Since (in Direct3D at least) we create our constant buffers using CpuAccessFlags.None, we wouldn't be able to offer ConstantBuffer.GetData methods. Is this okay? Or do we want to expose some of those details to the user?
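
For the common case, usage would look something like this (PerObjectConstants and wvp are hypothetical; the struct mirrors the shader's cbuffer layout):

using Microsoft.Xna.Framework;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct PerObjectConstants   // matches: cbuffer { float4x4 WorldViewProj; float4 DiffuseColor; }
{
    public Matrix WorldViewProj;   // 64 bytes
    public Vector4 DiffuseColor;   // 16 bytes
}

var buffer = new ConstantBuffer(GraphicsDevice, Marshal.SizeOf(typeof(PerObjectConstants)));
buffer.SetData(0, new PerObjectConstants { WorldViewProj = wvp, DiffuseColor = Vector4.One });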

Something else I just thought about: do MGFX and MojoShader support cbuffer declarations? I think MGFX is fine - it simply skips over code it doesn't know about, and indeed we use cbuffer in the built-in effects - but I'm not sure about MojoShader. I guess, for now, we'd stick with our current system of compiling multiple input languages (HLSL, PSSL, etc.) - just raw shaders this time, instead of effects - on Windows, and automatically converting HLSL to GLSL using MojoShader. Unless MojoShader doesn't support cbuffers, in which case we couldn't offer that automatic conversion. (This whole area is an interesting discussion, but perhaps for another time! I have some specific ideas, but not yet any time to explore them.)

@tomspilman
Member

Remember the first customer for the ConstantBuffer and Shader APIs is the Effect system. So we need to ensure we have enough of an API to support it.

we wouldn't be able to offer ConstantBuffer.GetData methods.

Good... there is no reason to ever do that.

do MGFX and MojoShader support cbuffer declarations?

MojoShader does not... it is a SM3.0 system.

For HLSL platforms we can, as we use the DirectX shader compiler, and cbuffers are supported there.

indeed we use cbuffer in the built-in effects

Not sure what you mean by "built-in effects". The Microsoft Effect system was deprecated in DX11. We cannot count on it in the future.

we couldn't offer that automatic conversion

That is another conversation... shouldn't muddy this up with that.

@tgjones
Contributor Author

tgjones commented Feb 1, 2015

Remember the first customer for the ConstantBuffer and Shader APIs is the Effect system.

Oh yes, good point. Do you think the raw byte overload should be exposed immediately, though, or should it wait until we do have public APIs to convert between structures and byte arrays?

Not sure what you mean by "built-in effects". The Microsoft Effect system was deprecated in DX11. We cannot count on it in the future.

Sorry, I meant BasicEffect, DualTextureEffect, etc.

MojoShader does not... it is a SM3.0 system.

In which case, until we have a way to compile GLSL shaders directly (is there an issue for that? I know it's been mentioned a few times), you won't be able to use the Shader / ConstantBuffer APIs in OpenGL?

@tomspilman
Member

public APIs to convert between structures and byte arrays?

C# has APIs that can be used for that, so I don't see why we need to provide any.
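
For example, something like this works today (a sketch; StructToBytes is just an illustrative helper name):

using System.Runtime.InteropServices;

static byte[] StructToBytes<T>(T value) where T : struct
{
    var bytes = new byte[Marshal.SizeOf(typeof(T))];
    var handle = GCHandle.Alloc(bytes, GCHandleType.Pinned); // pin the destination array
    try
    {
        Marshal.StructureToPtr(value, handle.AddrOfPinnedObject(), false);
    }
    finally
    {
        handle.Free();
    }
    return bytes;
}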

Sorry, I meant BasicEffect, DualTextureEffect, etc.

They don't need to be changed at all.

you won't be able to use the Shader / ConstantBuffer APIs in OpenGL

We can use them right now with MojoShader... so we need to preserve that functionality in the short term. MojoShader just builds some fake constant buffers for float/int by packing all the data into an array.

@tgjones
Contributor Author

tgjones commented Feb 1, 2015

Sorry, I meant BasicEffect, DualTextureEffect, etc.

They don't need to be changed at all.

Yes - I was only using that example to confirm that MG already supports cbuffer in HLSL source code.

MojoShader just builds some fake constant buffers for float/int by packing all the data into an array.

That's the part I was getting at - it depends on how MojoShader does that. If it does it in a predictable way, such that you can write a struct in C# and have it match the one that MojoShader generates, then that's fine, but otherwise it won't really work in OpenGL.

What sort of timeline do you think is sensible for this? Presumably after 3.3, at least?

@tomspilman
Member

Yeah, after 3.3 for certain.
