
Viewports causes inputs to be off #29570

Closed
Keetz opened this issue Jun 7, 2019 · 18 comments

Comments

@Keetz
Contributor

Keetz commented Jun 7, 2019

Godot version:

Godot 3.1.1-stable

Issue description:

Using multiple viewports easily becomes a daunting task because you have to calculate input offsets yourself. Depending on your setup with different viewports, different scale and stretch options and whatever else you can imagine, it can be extremely hard to handle inputs correctly.

My suggestion is that it should be the engine's responsibility to make sure that inputs are translated to be 1:1 with the window position, if that makes sense. I'll try to elaborate.

If we have a window with the size (500, 500) and I click the left mouse button in the middle of the window at (250, 250), I should always get the click event at that position, no matter how many viewports I use within the game or how things are scaled and stretched.

If that can always be expected to be the default event position, the user can then apply whatever transform they want to the event if they want it offset, but I really think it would make sense if Godot could make sure that an event on the game window corresponds to the event inside the game.
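For concreteness, here is a minimal sketch of the behaviour being asked for (a hypothetical test script, not code from the project): attach it to any node, and under this proposal the printed position would always match the window-space click position, regardless of the viewport setup.

```gdscript
# Minimal sketch of the expected behaviour (hypothetical test script).
# Under the proposal, event.position would always be the window-space
# click position, e.g. (250, 250) for a click in the middle of a
# 500x500 window, no matter how viewports are nested or stretched.
extends Node

func _input(event):
    if event is InputEventMouseButton and event.pressed:
        print("click reported at: ", event.position)
```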

Minimal reproduction project:

I created a small project to show how easy it is to end up with mouse events completely gone.
It consists of two scenes, Main and Game. Run the Game scene and everything works as expected. Run the Main scene and notice that you can't click the button in the middle, nor can you get the particles to spawn on the mouse cursor when clicking and holding the left mouse button.

InputOffset.tar.gz

Side notes:
The minimal project is a really simple case, but in practice I have encountered this problem multiple times, and it annoys me every time that I can't rely on mouse events being at the position that I actually see on my monitor. I have to manually calculate an offset depending on the viewports, and in some cases that can be really daunting to do.

@Keetz
Contributor Author

Keetz commented Jun 7, 2019

In #28796 there are some additions to Viewport that might be good to remember when looking into fixing this. Just to make sure that it also works with the new attach-to-screen functionality.

I don't know if it even makes a difference, but it's worth remembering to make sure it also works.

@Keetz
Contributor Author

Keetz commented Jun 7, 2019

For a more in-depth example of how multiple viewports and the new attach-to-screen feature can be used, take a look here: https://github.com/Keetz/ViewportSplit

There are some input calculations that don't really work as intended, but the overall use of viewports is a minimal example of a setup I use.

@mrcdk
Contributor

mrcdk commented Jun 7, 2019

The issue in your example is that the Viewport has no knowledge about the input because you are clicking on a texture. If you use a ViewportContainer the input will be taken care of for you, but if you use the texture directly you are in charge of sending the correctly transformed event. It's not possible to do it automatically, because a viewport can be of any size, so there's no 1:1 conversion, and even if it's the same size as the main viewport, it would only work in a few cases and cause a lot of issues in others. For example, what if you use the viewport texture in multiple places, like this:

https://i.imgur.com/gD8gu21.png

How is that supposed to automatically know your intentions? You click in the middle of the screen and the event is mirrored in each texture? What if you want to capture the event in one texture and show it in another (this would need 2 viewports, but the issue is still there)? What if the viewport texture is used on a 3D model?
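For reference, the manual forwarding mrcdk describes (the developer transforms the event into the sub-viewport's coordinate space and hands it over) might look roughly like this in Godot 3.x. The node path and the simple scale-only mapping are assumptions for illustration, not code from the sample project:

```gdscript
# Hypothetical TextureRect that displays a sub-viewport's texture and
# forwards mouse events to it. The engine cannot do this mapping
# automatically; the script encodes the developer's intent.
extends TextureRect

onready var target_viewport = get_node("../Viewport")  # assumed path

func _gui_input(event):
    if event is InputEventMouse:
        var ev = event.duplicate()
        # _gui_input positions are local to this control; rescale them
        # into the sub-viewport's coordinate space (scale only, no
        # rotation or offset handled here).
        ev.position = event.position * (target_viewport.size / rect_size)
        target_viewport.input(ev)
```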

@Keetz
Contributor Author

Keetz commented Jun 7, 2019

@mrcdk maybe my example is bad, and yeah, it works with the ViewportContainer indeed.
Looking at your picture, it isn't about knowing my intentions. If I run the game and see the same as your picture, I would expect that if I click on the position where I see a button, the event would be there too, and therefore I'd actually be clicking the button.

That isn't always the case because of viewport size, stretch settings and so on.

You have a point about using a viewport kind of like a "picture in picture" setup where you want to replicate something happening in another "window"; I didn't think about that.

The project I linked to on my GitHub has another example in the scene called "Node.tscn". It requires some new features from the master branch of Godot, but it doesn't use TextureRects at all.

EDIT
When running Node.tscn it first appears to work, but that is because I have done some offset calculations in _input(), and it doesn't actually adjust the event correctly. Try adding a button anywhere, and you will see that you can't click it because the actual event is somewhere else.

@mrcdk
Contributor

mrcdk commented Jun 7, 2019

But, again, the control (or any other node/material) is just showing a texture; it doesn't have any knowledge about whether the texture comes from a viewport or not. It's the developer's job to tie them together.

I really don't understand what you are doing in your second example (the GitHub one). You are using a viewport to draw both parts, and then each TextureRect renders part of the viewport? How would Godot transform the event automatically in this case? The viewport doesn't know where it's being rendered; it just gets a texture ready. The TextureRects don't know that they are rendering part of a viewport texture. I don't really see how Godot could understand that automatically.

@Keetz
Contributor Author

Keetz commented Jun 7, 2019

Just ignore the "why" in my ViewportSplit project; it is a minimal project I used for something else. The project itself is a reflection of a setup I use on some embedded hardware where the game has to fill two monitors, but it has to be set up side by side even though the result is on top of each other. Hard to explain, but again the "why" isn't really that important 😄

But yeah, Main.tscn uses the TextureRects, but as I said in the previous comment, look at Node.tscn (remember to be on master and not 3.1.1 for that to work).

To really simplify this whole issue, I just think it is really unintuitive that I can't rely on an event being at the position I see and click. Again, looking at your picture: if I see that in the game window, why should it matter what kind of viewport setup I used to achieve it? Why is the mouse event, in this example, "captured" by some viewport and then the event position is somewhere completely different than what I visually click?

I might be terribly wrong, but I can't see why the engine shouldn't be able to calculate this.

@mrcdk
Contributor

mrcdk commented Jun 7, 2019

The viewport doesn't capture any input. In the example, what's capturing the input is a control showing a texture. The viewport doesn't have a position, only a size. It doesn't know where it's being rendered; the engine tells it to render and it just renders its content to a texture. You may not even use that texture to render to the screen, and instead use it for a shader or something. The viewport doesn't even know there's been an event to begin with (apart from the main one), and knowing there's been one wouldn't matter, because it doesn't have enough information to process it, like the position, so it can't know that an event at (100, 100) in the main viewport needs to be transformed to (10, 10) in its own coordinates. Even if it had all the information to process it, the event is still happening on a node rendering a texture, a texture that can be in different places, that can be stretched, that can have only part of it rendered, that can be offset, rotated, scaled,... You are in charge of transforming the event with all that information and sending it to the correct viewport to be consumed. The engine has no idea what you want to do.

@groud
Member

groud commented Jun 9, 2019

I checked the sample project; the events aren't even forwarded here. As @mrcdk explained, you are using a viewport as a texture here. The thing is that this control cannot forward the input event messages to its texture; textures are not meant for that. Even if we wanted to implement event forwarding to the texture, that would require doing so on all nodes that can display a texture, which would be hard to maintain and a little bloated.

If you want to forward the event to a viewport, you need to put your viewport inside a ViewportContainer. This kind of Control will forward the input events to the child viewport, and it should likely solve your problem. If there is a lack of flexibility with this implementation, that could likely be solved, I believe (maybe by allowing a reference to the viewport node instead of requiring it as a child).
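For completeness, the ViewportContainer route needs no forwarding code at all; a sketch of the scene structure (node names are illustrative, not taken from the sample project):

```gdscript
# Scene layout where input forwarding is automatic:
#
#   Main (Node)
#   └── ViewportContainer        <- forwards input to its child
#       └── Viewport
#           └── ... game content (buttons, particles, etc.)
#
# Optionally let the container drive the child viewport's size:
extends ViewportContainer

func _ready():
    stretch = true  # resize the child Viewport to the container's rect
```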

@uldall
Contributor

uldall commented Jun 10, 2019

@clayjohn Would it be possible to make use of a ViewportContainer together with the optimizations we did in #26440 (And in general the optimization you made in Keetz's ViewportSplit project) ?

@reduz
Member

reduz commented Jun 10, 2019

If I understand correctly, the problem is that a viewport attached to the screen directly is not getting input events? If this is the case, this may be a bug, as viewports attached to the screen should always get input events.

@Keetz
Contributor Author

Keetz commented Jun 11, 2019

@reduz we do get_viewport().set_attach_to_screen_rect(Rect2()) and because of that it doesn't get any input at all. I thought that was expected, as we set its size to (0, 0)?

@Keetz
Contributor Author

Keetz commented Jun 11, 2019

Just to repeat myself for clarity.

The best example is probably to get the project here: https://github.com/Keetz/ViewportSplit
The project consists of three scenes. Main.tscn, Game.tscn and Node.tscn.

Game.tscn is to represent how our games are setup, here we have some logic that moves around a few nodes to keep the game setup as we want.

I have some logic in Game.gd to handle input, but this isn't actually working, as explained here: #29570 (comment)

Basically everything under these lines can be ignored:

func _input(event):
	if get_tree().get_root().has_node("Node"):

^ What I do here appears to work, as it updates the position of the particle I move around further down in the _input() function, but it doesn't actually update the event globally.

Main.tscn shows our "old" way of handling a specific setup we use on some embedded hardware, where we need to split the top and bottom parts of the game and set them side by side without losing the look and feel of Game.tscn (see how the particles move out at the top of the left half of the screen and come in from the bottom on the right side of the screen if the window is small enough).
In Main.gd I catch the event in _input(), calculate the new event position and send it to the Viewport in the scene. This approach works, but I still can't see why the engine couldn't be better at this.
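The Main.gd approach described above (catch the event on the root viewport, recompute its position, then pass it to the sub-viewport) can be sketched like this; the node path and the half-screen offset are simplified assumptions, not the exact code from the linked project:

```gdscript
# Hypothetical simplification of the Main.gd approach: intercept input
# on the root viewport, shift it into the sub-viewport's coordinate
# space, and forward it manually.
extends Node

onready var viewport = $Viewport  # assumed path to the sub-viewport

func _input(event):
    if event is InputEventMouse:
        var ev = event.duplicate()
        # Clicks on the right half of the side-by-side split belong to
        # the lower part of the game; shift them back by half a screen.
        if ev.position.x > viewport.size.x:
            ev.position = Vector2(ev.position.x - viewport.size.x, ev.position.y)
        viewport.input(ev)
```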

Node.tscn is our "new" way of handling what we do in Main.tscn. This setup is done by @clayjohn and he also did some improvements to Godot itself that makes this better for performance (that is why we need Godot from master).

This approach is what gives me the most problems with input, as I can't do the same as in Main.tscn. The root viewport has its size set to (0, 0), so it never gets any input, which means that I cannot catch the input, transform it and send it on to the other viewports.

Hopefully my trying to explain this in more and other ways clarifies it even more. I will gladly provide more info or try to explain what is going on in some other way, maybe in a more direct conversation on IRC with the project running, if needed.

I guess you could say that Godot should use global positions relative to root for inputs, but I am not sure that makes sense when we do something like get_viewport().set_attach_to_screen_rect(Rect2()).

@groud
Member

groud commented Jun 14, 2019

The issue is still unclear to me. But if I understand correctly, the original problem is handling input events in a global way (the input events are not transformed by the viewport's transform).
In a common situation, this would be possible by using a normal root Viewport + 2 ViewportContainers and their viewports, but you could not use the optimizations made by @clayjohn in that case (as it requires rendering the 2 viewports into dedicated buffers before rendering them to the global root viewport, thus not rendering directly to screen). Am I right?

This approach is what gives me the most problems with input as I can't do the same as in Main.tscn. The root viewport has its size set to (0, 0) so it never gets any input, which mean that I cannot catch the input, transform it and send it on to the other viewports.

So basically, you would be able to handle the events if the viewport was not of size (0, 0). But this cannot happen because you don't want the viewport to render, as it would kill performance. So in that case, why not simply allow completely disabling the rendering of a viewport? That would still let you handle the events without hurting performance, no?

@Keetz
Contributor Author

Keetz commented Jun 14, 2019

as it requires to render the 2 viewports into dedicated buffers before rendering them to the global root viewport, thus not rendering directly to screen

If I understand it correctly, we don't render them to the global root viewport. The global viewport is set not to render by get_viewport().set_attach_to_screen_rect(Rect2()).

Instead we render the two viewports directly to screen, see #26440 and #28796 (I can't explain more than what is written there already).

I am not sure about disabling rendering of a viewport completely; I just don't have enough knowledge about all that rendering stuff and how it works, how Godot does it, how viewports work and so on.

I did some small updates to my ViewportSplit project where I commented out the input code that wasn't really working, and I added two buttons to the Game.tscn

To really try and simplify my issue, I can cook it down to this:

  1. In Main.tscn, regardless of my re-sizing of the viewport, why do I have to manually calculate input position?
    I understand that ViewportContainer can help in this case, but let's forget that for a moment and just pass on the input as I do, by getting the viewport and calling its _input() function manually with the input from the root viewport. If I didn't do the calculations I do in _input(), the event position would be completely off. Try commenting out everything but viewport.input(event) in Main.gd.

  2. In Node.tscn how do I get inputs to work here? I can't "catch" them from the root viewport and pass them on as in Main.

Every time I try to explain my problem, I end up writing quite a lot. It is hard to keep it simple, but to cook it down even more:

Complete basic problem... How do I get inputs to work when running Node.tscn? If I am overlooking an easy solution, fair enough. If there is no way, or no easy way, that is what should be made/fixed.

@groud
Member

groud commented Jun 14, 2019

I just found out the problem with the example project might be partly due to #28855.
To partly solve the problem in the Game.gd file, you should first change:

mouse_particles.set_position(Vector2(event.position.x, event.position.y))

to

mouse_particles.global_position = get_global_mouse_position()

That should cause the event to be (more or less) relative to the left (named "Bottom") viewport. There is still an event scaling problem there though.

The main problem (outside the issue with #28855) is likely to smartly select the correct viewport to forward the event to, depending on the event's position in the window.

@KoBeWi
Member

KoBeWi commented Jul 24, 2020

This seems to be fixed in 963d3a0 (except the demo is quite broken in 4.0 and difficult to test properly)

Can anyone else confirm? >_>

@juse4pro

juse4pro commented Jan 24, 2021

@KoBeWi I tried it with the current master branch and can indeed confirm that the bug has actually disappeared.
I mean, I have some different problems now (e.g. scaling the window below the override size results in nothing rendering at all), but that is another topic.
So let's hope that with the 4.0 release we don't have trouble anymore. :)

@KoBeWi
Member

KoBeWi commented Jan 30, 2021

So this should be fine to close.

9 participants