Hi,
I am new to NGUI, and I am trying to build a UI element with a straightforward behaviour (one we have seen in many games). Before starting my own implementation, I wanted to ask whether a ready-made implementation already exists inside NGUI.
I have a simple sprite with a collider, and when you click on it, a small texture is drawn at the clicked position (like an indicator). You can also get the relative position of the click inside the sprite: for example, with a 100x100 sprite you could read the clicked point as (Clicked = 26, 12).
I could attach a script to the sprite and receive its OnPress event. Inside that handler I can get the mouse position; that part is easy. But what about the next steps?
1. For drawing the texture, should I draw directly at the mouse's screen coordinates with Unity's own texture drawing, or is there a way to position another NGUI sprite at the mouse position? Maybe UICamera.lastTouchPosition, but would setting the UISprite's transform position to it be enough?
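Setting the transform alone is not quite enough, because UICamera.lastTouchPosition is in screen pixels while the sprite's transform lives in the UI camera's world space. A minimal sketch of one way to do it, assuming NGUI 3.x (UICamera.lastWorldPosition already holds the press point converted into the UI camera's world space; the `marker` sprite is a hypothetical widget you assign in the inspector, on the same panel as this one):

```
using UnityEngine;

public class ClickMarker : MonoBehaviour
{
    // Hypothetical indicator sprite, assigned in the inspector (assumption).
    public UISprite marker;

    // NGUI sends OnPress to the game object whose collider was pressed.
    void OnPress(bool pressed)
    {
        if (!pressed) return;

        // lastWorldPosition is the press point already unprojected into
        // the event camera's world space, so it can be assigned directly.
        marker.transform.position = UICamera.lastWorldPosition;

        // Equivalent manual conversion from screen coordinates:
        // marker.transform.position =
        //     UICamera.currentCamera.ScreenToWorldPoint(UICamera.lastTouchPosition);
    }
}
```

Since both sprites sit under the same UI root, this stays correct at any resolution without extra scaling work.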
2. The same question applies to getting the relative position of the mouse inside a UI sprite. Think of it as a mini-map in the GUI: you map the mini-map onto the real world map, and clicking on it triggers navigation or some other action on the map. Of course, it should also work at different resolutions and in full-screen mode.
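For the mini-map case, one approach (again a sketch, assuming NGUI 3.x and a center pivot on the sprite) is to convert the press point into the sprite's local space; because the result is in the sprite's own pixel coordinates, it is independent of screen resolution and windowed vs. full-screen mode:

```
using UnityEngine;

public class MiniMapClick : MonoBehaviour
{
    UISprite sprite;

    void Awake() { sprite = GetComponent<UISprite>(); }

    void OnPress(bool pressed)
    {
        if (!pressed) return;

        // World-space press point -> this sprite's local space.
        Vector3 local = transform.InverseTransformPoint(UICamera.lastWorldPosition);

        // With a center pivot, local.x/y range over [-w/2, w/2] x [-h/2, h/2];
        // shift so (0,0) is the bottom-left corner, matching the 100x100 example.
        float x = local.x + sprite.width  * 0.5f;
        float y = local.y + sprite.height * 0.5f;
        Debug.Log("Clicked = " + x + ", " + y);

        // Normalized coordinates in [0,1], ready to map onto the world map:
        // float u = x / sprite.width;
        // float v = y / sprite.height;
    }
}
```

The normalized (u, v) pair is what you would multiply by the real map's dimensions to perform the navigation action.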
Thanks!