Divide by UiScale when converting UI coordinates from physical to logical #8720
Conversation
@ickshonpe can you resolve conflicts? I think this is likely to get merged preferentially.
Yep done.

This seems like the right approach. Once users have set up their window and set …

I can't see where there are any tangible benefits from the extra precision of using f64s for the scaling; I think that's one of the first things we should deal with after 0.11 is done. I'll make an issue.
Checked all the examples and they all seem fine. Just going to fix up the …
Added an extract parameter for `UiScale`. The window's logical size is now divided by `UiScale::scale` to calculate the logical size of the UI's viewport for the border thickness resolution calculations.

`viewport_debug`
Doubled the physical resolution of the window for this example and set `UiScale` to `2.` so the logical size of the UI's viewport remains the same. Also simplified the example: instead of spawning and despawning the two different UI node trees on state changes, it now uses the `Display` style property to disable the trees one at a time.
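A minimal sketch of that `Display`-toggling pattern, with illustrative marker components and state names (not the actual example's code):

```rust
use bevy::prelude::*;

#[derive(Component)]
struct TreeA;
#[derive(Component)]
struct TreeB;

#[derive(States, Debug, Clone, Copy, PartialEq, Eq, Hash, Default)]
enum Coords {
    #[default]
    Viewport,
    Pixel,
}

/// Instead of despawning and respawning the two UI trees, flip their
/// `Display` property whenever the state changes.
fn toggle_trees(
    state: Res<State<Coords>>,
    mut a: Query<&mut Style, (With<TreeA>, Without<TreeB>)>,
    mut b: Query<&mut Style, (With<TreeB>, Without<TreeA>)>,
) {
    let show_a = *state.get() == Coords::Viewport;
    for mut style in &mut a {
        style.display = if show_a { Display::Flex } else { Display::None };
    }
    for mut style in &mut b {
        style.display = if show_a { Display::None } else { Display::Flex };
    }
}
```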
Updated. Shouldn't be any other problems.
Divide cursor position by UiScale, not multiply.
Renamed `logical_viewport_size` to `ui_logical_viewport_size` to be extra explicit. Added a comment explaining why we have to divide by `UiScale` here.
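A hedged sketch of that cursor adjustment (the standalone helper is illustrative; only `Window::cursor_position` and `UiScale::scale` are assumed API here):

```rust
use bevy::prelude::*;

/// Illustrative only: the cursor position reported by `Window` is in window
/// logical coordinates, so it is divided by `UiScale` (not multiplied) to
/// land in the same logical space as the computed UI node rects.
fn ui_cursor_position(window: &Window, ui_scale: &UiScale) -> Option<Vec2> {
    window
        .cursor_position()
        .map(|cursor| cursor / ui_scale.scale as f32)
}
```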
This looks good to me / seems like the right fixes.
Objective
After the UI layout is computed, when the coordinates are converted back from physical coordinates to logical coordinates, `UiScale` is ignored. This results in a confusing situation where we have two different systems of logical coordinates.

Example:
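A minimal sketch of the setup being described, assuming a Bevy 0.11-era API (the camera, colour, and system registration are illustrative):

```rust
use bevy::prelude::*;

fn main() {
    App::new()
        .add_plugins(DefaultPlugins)
        // With UiScale set to 4, the node below used to report a logical
        // size of 400x400 even though it asked for 100x100.
        .insert_resource(UiScale { scale: 4.0 })
        .add_systems(Startup, setup)
        .run();
}

fn setup(mut commands: Commands) {
    commands.spawn(Camera2dBundle::default());
    commands.spawn(NodeBundle {
        style: Style {
            width: Val::Px(100.0),
            height: Val::Px(100.0),
            ..default()
        },
        background_color: Color::RED.into(),
        ..default()
    });
}
```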
result:
We asked for a 100x100 UI node, but the Node's size is multiplied by the value of `UiScale` to give a logical size of 400x400.

Solution
Divide the output physical coordinates by `UiScale` in `ui_layout_system`, and multiply the logical viewport size by `UiScale` when creating the projection matrix for the UI's `ExtractedView` in `extract_default_ui_camera_view`.
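Conceptually, the conversion amounts to the following (a hedged sketch with an illustrative helper name, not the engine's actual internals):

```rust
use bevy::math::Vec2;

/// Illustrative helper, not part of Bevy's API: convert a position produced
/// by the layout pass (in physical pixels) into UI logical coordinates by
/// dividing out both the window scale factor and `UiScale`.
fn physical_to_ui_logical(physical: Vec2, window_scale_factor: f32, ui_scale: f32) -> Vec2 {
    physical / (window_scale_factor * ui_scale)
}

fn main() {
    // With a window scale factor of 2.0 and a UiScale of 4.0, a node asked
    // to be 100x100 occupies 800x800 physical pixels, but its logical size
    // now comes back as 100x100.
    let logical = physical_to_ui_logical(Vec2::new(800.0, 800.0), 2.0, 4.0);
    assert_eq!(logical, Vec2::new(100.0, 100.0));
}
```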
Changelog
- The coordinates output by the layout system are now divided by `UiScale` when converting them back to logical coordinates. The logical size of UI nodes now matches the values given to their size constraints.
- The logical viewport size is multiplied by `UiScale` before creating the projection matrix for the UI's `ExtractedView` in `extract_default_ui_camera_view`.
- In `ui_focus_system` the cursor position returned from `Window` is divided by `UiScale`.
- Added `Node::physical_size` and `Node::physical_rect`.
- `viewport_debug` now uses a `UiScale` of 2. to ensure that viewport coordinates are working correctly with a non-unit `UiScale`.
Migration Guide
Physical UI coordinates are now divided by both the `UiScale` and the window's scale factor to compute the logical sizes and positions of UI nodes.

This ensures that UI Node size and position values, held by the `Node` and `GlobalTransform` components, conform to the same logical coordinate system as the style constraints from which they are derived, irrespective of the current `scale_factor` and `UiScale`.
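As a rough illustration of that guarantee (a sketch only; the query shape and the tolerance are arbitrary):

```rust
use bevy::prelude::*;

/// Sketch: the logical size stored in `Node` should now match the logical
/// pixel values requested in `Style`, regardless of scale_factor or UiScale.
fn check_logical_sizes(nodes: Query<(&Node, &Style)>) {
    for (node, style) in &nodes {
        if let (Val::Px(w), Val::Px(h)) = (style.width, style.height) {
            let expected = Vec2::new(w, h);
            // Allow for rounding to physical pixels and back.
            assert!((node.size() - expected).length() < 1.0);
        }
    }
}
```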