Viewports cause inputs to be off #29570
In #28796 there are some additions to viewports that are worth keeping in mind when looking into fixing this, to make sure it also works with the new attach-to-screen functionality. I don't know if it even makes a difference, but it should be verified against that as well.
For a more in-depth example of how multiple viewports and the new attach-to-screen functionality can be used, take a look here: https://github.com/Keetz/ViewportSplit There are some input calculations that don't really work as intended, but the overall use of viewports is a minimal example of a setup I use.
The issue in your example is that the Viewport has no knowledge about the input because you are clicking on a texture. If you use a ViewportContainer it works. How is the engine supposed to automatically know your intentions? You click in the middle of the screen and the event is mirrored in each texture? What if you want to capture the event in one texture and show it in another (this would need 2 viewports, but the issue is still there)? What if the viewport texture is used in a 3D model?
@mrcdk maybe my example is bad, and yeah, it works with the ViewportContainer indeed. That isn't always the case because of viewport size, stretch settings and so on. You have a point about using a viewport for a "picture in picture" setup where you want to replicate something happening in another "window"; I didn't think about that. The project I linked to on my GitHub has another example in the scene called "Node.tscn", though it requires some new features from the master branch of Godot; it doesn't use TextureRects at all.
But, again, the control (or any other node/material) is just showing a texture; it doesn't have any knowledge of whether the texture comes from a viewport or not. It's the developer's job to wire them together. I really don't understand what you are doing in your second example (the GitHub one). You are using a viewport to draw both parts, and then each TextureRect is rendering part of the viewport? How would Godot transform the event automatically in this case? The viewport doesn't know where it's being rendered, it just gets a texture ready, and the TextureRects don't know that they are rendering part of a viewport texture. I don't really see how Godot could figure that out automatically.
Just ignore the "why" in my ViewportSplit project; it is a minimal project I used for something else. The project itself is a reflection of a setup I use on some embedded hardware where the game has to fill two monitors, but it has to be set up side by side even though the result is on top of each other. Hard to explain, but again the "why" isn't really that important 😄 Main.tscn does use the TextureRects, but as I said in the previous comment, look at Node.tscn (remember to be on master and not 3.1.1 for that to work). To really simplify this whole issue: I just think it is really unintuitive that I can't rely on an event being at the position I see and click. Again, looking at your picture, if I see that in the game window, why should it matter what kind of viewport setup I did to achieve it? Why is the mouse event in this example "captured" by some viewport, with the event position ending up somewhere completely different from what I visually click? I might be terribly wrong, but I can't see why the engine shouldn't be able to calculate this.
The viewport doesn't capture any input; in the example, what's capturing the input is a control showing a texture. The viewport doesn't have a position, it only has a size. It doesn't know where it's being rendered: the engine tells it to render and it just renders its content to a texture. You may not even use that texture to render to the screen, and instead use it in a shader or something.

The viewport doesn't even know there's been an event to begin with (apart from the main one), and knowing that there's been one wouldn't matter because it doesn't have enough information to process it, like the position, so it can't know that an event at 100, 100 in the main viewport needs to be transformed to 10, 10 in its own coordinates. Even if it had all the information to process it, the event is still happening on a node rendering a texture, a texture that can be in different places, that can be stretched, that may only be partially rendered, that can be offset, rotated, scaled,... You are in charge of transforming the event with all that information and sending it to the correct viewport to be consumed. The engine has no idea what you want to do.
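To illustrate the "you are in charge of transforming the event" point, here is a minimal sketch (not from the linked project; the node path and the drawn rectangle are assumptions about your own layout) of manually mapping a window-space mouse event into a sub-viewport in Godot 3.x:

```gdscript
extends Node

# Sketch only: `draw_rect` stands in for wherever your code draws the
# viewport texture on screen -- that mapping is the part only you know.
onready var sub_viewport = $Viewport

func _unhandled_input(event):
    if not event is InputEventMouse:
        return
    var draw_rect = Rect2(Vector2(100, 100), sub_viewport.size)  # assumed layout
    if not draw_rect.has_point(event.position):
        return
    # Shift the event so (0, 0) matches the top-left of the drawn texture,
    # then let the sub-viewport consume it as its own input.
    var local_event = event.xformed_by(Transform2D(0, -draw_rect.position))
    sub_viewport.input(local_event)
```

This handles only the offset; if the texture is stretched, rotated, or partially shown, the `Transform2D` would need to undo that too.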
I checked the sample project; the events aren't even forwarded here. As @mrcdk explained, you are using a viewport as a texture here. The thing is that this control cannot forward the input event messages to its texture, since textures are not meant for that. Even if we wanted to implement event forwarding to the texture, that would require doing so on all nodes that can display a texture, which would be hard to maintain and a bit bloated. If you want to forward events to a viewport, you need to put your viewport inside a ViewportContainer. This kind of Control forwards input events to the child viewport, and it should likely solve your problem. If there is a lack of flexibility with this implementation, that could likely be solved, I believe (maybe by allowing a reference to the viewport node instead of requiring it as a child).
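A minimal sketch of the ViewportContainer setup described above (node names are illustrative, and it is built in code here for brevity; you would normally arrange this in the scene tree):

```gdscript
extends Control

func _ready():
    var container = ViewportContainer.new()
    container.stretch = true  # keep the child viewport sized to the container
    container.rect_size = get_viewport_rect().size
    var viewport = Viewport.new()
    container.add_child(viewport)
    add_child(container)
    # Controls placed under `viewport` now receive input events already
    # transformed into the viewport's coordinate space.
```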
If I understand correctly, the problem is that a viewport attached to the screen directly is not getting input events? If that is the case, this may be a bug, as viewports attached to the screen should always get input events.
@reduz we do |
Just to repeat myself for clarity. The best example is probably to get the project here: https://github.com/Keetz/ViewportSplit Game.tscn represents how our games are set up; here we have some logic that moves a few nodes around to keep the game arranged as we want. I have some logic in Game.gd to handle input, but this isn't actually working, as explained here: #29570 (comment) Basically everything under these lines can be ignored:
^ What I do here appears to work, as it updates the position for the particle I move around further down in the _input() function, but it doesn't actually update the event globally.

Main.tscn is our "old" way of handling a specific setup we use on some embedded hardware, where we need to split the top and the bottom part of the game and set them side by side without losing the look and feel of Game.tscn (see how the particles move out of the top of the left half of the screen and come in from the bottom on the right side of the screen if the window is small enough).

Node.tscn is our "new" way of handling what we do in Main.tscn. This setup was done by @clayjohn, and he also made some improvements to Godot itself that make this perform better (that is why we need Godot from master). This approach is what gives me the most problems with input: the root viewport has its size set to (0, 0), so it never gets any input, which means that I cannot catch the input, transform it and send it on to the other viewports.

Hopefully this explanation clarifies things further. I will gladly provide more info or try to explain what is going on in some other way, maybe in a more direct conversation on IRC with the project running, if needed. I guess you could say that Godot should use global positions relative to root for inputs, but I am not sure that makes sense when we do something like
The issue is still unclear to me. But if I understand correctly, the original problem is being able to handle input events in a global way (the input events are not transformed by the viewport's transform).
So basically, you would be able to handle the events if the viewport were not of size (0, 0). But this cannot happen because you don't want the viewport to render, as it would kill performance. So in that case, why not simply allow the possibility to completely disable the rendering of a viewport? That would still let you handle the events without hurting performance, no?
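For reference, Godot 3.x already exposes a per-viewport update mode that can stop re-rendering; whether it addresses the performance concern in this particular setup is an open question in this thread, but the knob looks like:

```gdscript
# Stop a viewport from re-rendering its target each frame.
# Viewport.render_target_update_mode and UPDATE_DISABLED are existing
# Godot 3.x API; whether this alone recovers the lost performance in
# the split-screen setup discussed here is untested.
func disable_viewport_rendering(viewport):
    viewport.render_target_update_mode = Viewport.UPDATE_DISABLED
```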
If I understand it correctly, we don't render them to the global root viewport. The global viewport is set not to render by
Instead, we render the two viewports directly to the screen, see #26440 and #28796 (I can't explain more than what is written there already). I am not sure about disabling the rendering of a viewport completely; I just don't have enough knowledge about all that rendering stuff and how it works, how Godot does it, how viewports work and so on. I did some small updates to my ViewportSplit project where I commented out the input code that wasn't really working, and I added two buttons to Game.tscn. To really try and simplify my issue, I can cook it down to this:
Every time I try to explain my problem, I end up writing quite a lot. It is hard to keep it simple, but to cook it down even more, the complete basic problem: how do I get inputs to work when running Node.tscn? If I am overlooking an easy solution, fair enough. If there is no way, or no easy way, that is what should be made/fixed.
I just found out the problem with the example project might be partly due to #28855.
to
That should cause the event to be (more or less) relative to the left (named "Bottom") viewport. There is still an event scaling problem there, though. The main problem (outside the issue with #28855) is likely to smartly select the correct viewport to forward the event to, depending on the event's position in the window.
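The remaining scaling step could be handled alongside the offset when transforming the event. A hedged sketch, where `drawn_origin` and `drawn_size` are assumptions about the on-screen layout rather than engine properties:

```gdscript
# Map a window-space event into a viewport drawn at `drawn_origin` with
# on-screen size `drawn_size`, rescaling into the viewport's own
# resolution. Sketch only; your layout supplies the two parameters.
func to_viewport_event(event, viewport, drawn_origin, drawn_size):
    var scale = viewport.size / drawn_size  # component-wise Vector2 division
    # The combined transform computes scale * (pos - drawn_origin):
    # offset into the drawn rectangle first, then rescale.
    var xform = Transform2D(0, -drawn_origin).scaled(scale)
    return event.xformed_by(xform)
```

Selecting which viewport should receive the event, based on where in the window it landed, would still have to happen before calling something like this.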
This seems to be fixed in 963d3a0 (except the demo is quite broken in 4.0 and difficult to test properly) Can anyone else confirm? >_> |
@KoBeWi I tried it with the current master branch and can indeed confirm that the bug actually disappeared. |
So this should be fine to close. |
Godot version:
Godot 3.1.1-stable
Issue description:
Using multiple viewports easily becomes a daunting task because you have to calculate input offsets yourself. Depending on your setup with different viewports, different scale and stretch options and whatever else you can imagine, it can be extremely hard to handle inputs correctly.
My suggestion is that it should be the engine's responsibility to make sure that inputs are translated to be 1:1 with the window position, if that makes sense. I'll try to elaborate.
If we have a window with the size (500, 500) and I click the left mouse button in the middle of the window at (250, 250), I should always get the click event at that position, no matter how many viewports I use within the game or how things are scaled and stretched.
If that can always be expected to be the default event position, the user can then apply whatever transform they want if they need the event offset, but I really think it would make sense if Godot could guarantee that an event on the game window corresponds to the event inside the game.
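A worked sketch of the 1:1 expectation (a hypothetical helper; `view_offset` and `view_scale` stand for whatever the viewport setup does to coordinates): with no offset and no scaling, a click at (250, 250) in a 500x500 window maps straight back to (250, 250).

```gdscript
# The kind of bookkeeping this proposal would make unnecessary:
# undoing the viewport layout to recover the on-screen click position.
func window_to_game(click_pos, view_offset, view_scale):
    return (click_pos - view_offset) / view_scale

# With an identity setup, window_to_game(Vector2(250, 250), Vector2.ZERO,
# Vector2.ONE) gives back Vector2(250, 250) -- the behaviour this issue
# asks Godot to guarantee by default.
```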
Minimal reproduction project:
I created a small project to show how easy it is to end up with mouse events completely gone.
It consists of two scenes, Main and Game. Run the Game scene and everything works as expected. Run the Main scene and notice how you can't click the button in the middle, nor can you get the particles to spawn on the mouse cursor when clicking and holding the left mouse button.
InputOffset.tar.gz
Side notes:
The minimal project is a really simple case, but in practice I have encountered this problem multiple times, and it annoys me every time that I can't rely on mouse events being at the position I actually see on my monitor. I have to calculate an offset depending on the viewports manually, and in some cases it can be really daunting to do.