-
I am trying to implement a shader that depends on the horizontal resolution of the pixel buffer. It expects 640 pixels, or it doesn't work right. If I set my window to 640x400 everything is fine, but the fragment shader gives incorrect results as soon as the window is resized, even when the window hasn't been resized far enough for the pixel buffer to be scaled up. I suppose this is more of a question than a bug report: how can I run a fragment shader that operates on the logical pixel buffer rather than on the viewport window size?
-
No problem! We have discussions for these kinds of questions!
The order you call the renderers in `pixels.render_with()` matters. If you call your renderer first, it receives the unscaled pixel-buffer texture; if you call it after the default scaling renderer, it receives the scaled texture. You probably want to call your renderer first, then pass its output to the scaling renderer.
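In sketch form, that looks roughly like the code below. `render_with()` and `scaling_renderer` are part of the pixels API, but `ntsc_renderer` is a hypothetical custom renderer you would write yourself with wgpu, so take this as an illustration of the ordering rather than a drop-in recipe:

```rust
// `pixels` is your pixels::Pixels instance; `ntsc_renderer` is a hypothetical
// renderer type of your own (it is NOT part of the pixels crate). It is
// assumed to sample the unscaled pixel-buffer texture and write its result
// into an intermediate texture that the scaling pass then reads.
// (This runs inside a function that returns Result<(), pixels::Error>.)
pixels.render_with(|encoder, render_target, context| {
    // 1. Custom pass first: it runs at the logical resolution (640x400 here),
    //    so the fragment shader always sees exactly 640 pixels across,
    //    no matter how large the window is.
    ntsc_renderer.render(encoder, context);

    // 2. Default scaling pass last: stretches the filtered image up to the
    //    window. The stock scaling renderer samples pixels' own buffer
    //    texture, so pointing it at your filter's output is the part you
    //    have to wire up yourself.
    context.scaling_renderer.render(encoder, render_target);

    Ok(())
})?;
```

Running the custom pass at the logical resolution also keeps it cheap: the filter only ever shades 640x400 texels, no matter how big the window gets.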
-
To add an example, I'm currently trying to make a PAL/NTSC filter for my emulator. That filter translates a binary video signal into colours (it's an Apple 2 emulator). That kind of operation must be done at the pixel-buffer level, not at the screen level. I have not started yet (I just spent an hour trying to understand how wgpu/pixels do their job), but my initial feeling is that I'll probably try to make the default scaling shader accept an arbitrary texture. My idea is that I might do this with a simple predefined pipeline:
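Very roughly, and only as a guess at this stage (the names are mine, and the descriptor fields assume a reasonably recent wgpu), the important part is that the filter's render target is allocated at the logical 640x400 resolution rather than at the window size:

```rust
// Hypothetical: the NTSC filter renders into this texture, and the (modified)
// scaling shader samples it instead of the default pixel-buffer texture.
// `device` is the wgpu::Device, reachable through pixels' PixelsContext.
let filter_output = device.create_texture(&wgpu::TextureDescriptor {
    label: Some("ntsc_filter_output"),
    // Logical resolution of the emulated display, not the window size, so the
    // fragment shader always works on exactly 640 columns per scanline.
    size: wgpu::Extent3d {
        width: 640,
        height: 400,
        depth_or_array_layers: 1,
    },
    mip_level_count: 1,
    sample_count: 1,
    dimension: wgpu::TextureDimension::D2,
    format: wgpu::TextureFormat::Rgba8UnormSrgb,
    // Rendered into by the filter pass, sampled by the scaling pass.
    usage: wgpu::TextureUsages::RENDER_ATTACHMENT | wgpu::TextureUsages::TEXTURE_BINDING,
    view_formats: &[],
});
```

The filter pass would write into filter_output, and the scaling shader would sample filter_output instead of the buffer texture, which is exactly the "accept an arbitrary texture" change mentioned above.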
There are 2 points here: