NGUI 3 Support / Interaction of UI root resolution vs game resolution
« on: October 25, 2014, 01:24:10 PM »
Can someone comment on how resolutions of the UI and of the Unity viewport interact? Let's consider three cases:
- Viewport resolution of 1280x720, UI resolution of 1280x720
- Viewport resolution of 1280x720, UI resolution of 3840x2160
- Viewport resolution of 3840x2160, UI resolution of 3840x2160
Provided we follow the proper auto-layout practices advocated by Google, Apple, NGUI, and everyone else who has tried to develop for multiple screens, all three cases should look absolutely identical when rendered into a 1280x720 screenshot. Now, what I'm interested in is not how the end result looks, but how difficult it was to render from a UI standpoint. I'm not interested in actual per-pixel shader cost; it's obvious that as viewport resolution goes up, your GPU will have more work to do. I'm interested in the performance hit of the UI resolution itself.
From what I'm seeing so far, UI resolution affects only the scaling of objects inside the UI root, and consequently controls the scale of a pixel unit in relation to the real pixels of your screen. That alone is extremely unlikely to cause any difference in performance, I would guess - you don't get performance hits from changing transform scales and storing higher values in your int variables. Okay, what else... labels, textures, sprites. From what I'm seeing, TrueType labels in NGUI are rasterized at the viewport resolution, never at the UI resolution, so it's reasonable to guess that the performance hit of labels is no different between cases 1 and 2. Sprites and textures, as well as bitmap font labels, are already rasterized; all that happens at runtime is your GPU sampling them, and between cases 1 and 2 that will have an identical flat cost no matter how high or low the UI resolution is.
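To illustrate the scaling argument above, here's a minimal sketch (in Python, just for the arithmetic). It assumes NGUI-style root scaling where an orthographic camera of size 1 spans 2 world units vertically, so the root's localScale works out to 2 / uiHeight - that factor is an assumption about NGUI internals, but the takeaway holds for any uniform root scale: the UI resolution only changes a multiplier, so the same widget covers the same fraction of the screen in cases 1 and 2.

```python
# Sketch: UI resolution is just a scale factor on the root, not a
# per-pixel cost. Assumes the screen spans 2 world units vertically
# (orthographic camera of size 1), so root scale = 2 / ui_height.

def root_scale(ui_height):
    """Uniform scale applied to the UI root for a given UI resolution."""
    return 2.0 / ui_height

def on_screen_fraction(widget_px, ui_height):
    """Fraction of screen height a widget covers - independent of viewport."""
    return widget_px * root_scale(ui_height) / 2.0

# A 100 px widget under a 720-high root and a 300 px widget under a
# 2160-high root cover the same fraction of the screen:
print(on_screen_fraction(100, 720))   # case 1
print(on_screen_fraction(300, 2160))  # case 2: same value, ~0.1389
```

Nothing here scales with the UI resolution per pixel - only the widget's stored pixel sizes and the root's single scale factor change.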
With that in mind, am I correct in assuming that cases 1 and 2 will have identical performance on every device? Essentially, I'm asking whether I overlooked any per-pixel calculations NGUI might perform in UI-resolution space with a performance impact proportional to that resolution. So far I'm seeing none.
The reason I'm asking is simple - I would like to adopt DP instead of pixels, keep one high-res atlas, and use root scaling to control the size of elements in proportion to the screen, in line with how native UIs on Android and iOS do it depending on the DPI of the device. Knowing I can run a 4K UI root without any performance impact would relieve a great burden from my mind there.
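For concreteness, the DP idea above can be sketched like this (Python, arithmetic only; `dp_root_height` is a hypothetical helper, and the 160 dpi baseline is the standard Android convention, not anything NGUI provides). The point is that you pick the UI root height from the physical DPI, so one UI pixel corresponds to one density-independent pixel:

```python
# Hypothetical sketch of a DP-based UI root: choose the root height so
# that 1 UI pixel = 1 dp, using Android's 160 dpi baseline density.

BASELINE_DPI = 160.0

def dp_root_height(screen_height_px, dpi):
    """UI root height in dp for a given physical screen."""
    return screen_height_px * BASELINE_DPI / dpi

# A 1920 px-high phone screen at 480 dpi gets a 640 dp-high root, so a
# 48 dp button renders at the same physical size as on a 160 dpi device:
print(dp_root_height(1920, 480))  # → 640.0
print(dp_root_height(1280, 160))  # → 1280.0
```

Feeding a value like this into the root's manual height would reproduce the native-UI behavior: elements keep a constant physical size, and the atlas is simply sampled at whatever the viewport demands.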