Hey all,
So my game has various visible HUD elements as well as various modal windows. Up until now I've used Unity's Input.GetButton to map my controls: for example, the W key is mapped to Accelerate, and while W is held down the player accelerates. However, I want to support remapping of keys, so someone could potentially remap Accelerate to Mouse 2.
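To give you an idea, here's roughly what my remappable input layer looks like — a minimal sketch, with hypothetical names (ActionMap, GetAction, Rebind), using Input.GetKey against a KeyCode dictionary instead of Input.GetButton so bindings can change at runtime:

```csharp
using UnityEngine;
using System.Collections.Generic;

// Hypothetical action map: instead of hard-coding Input.GetButton("Accelerate"),
// look the binding up in a dictionary so the player can remap it at runtime.
public class ActionMap : MonoBehaviour
{
    // Action name -> currently bound key. Mouse buttons are KeyCode.Mouse0..Mouse6,
    // so "remap Accelerate to Mouse 2" is just another dictionary entry.
    Dictionary<string, KeyCode> bindings = new Dictionary<string, KeyCode>
    {
        { "Accelerate", KeyCode.W },
        { "Fire",       KeyCode.Mouse0 },
    };

    // Equivalent of Input.GetButton, but routed through the remappable table.
    public bool GetAction(string action)
    {
        KeyCode key;
        return bindings.TryGetValue(action, out key) && Input.GetKey(key);
    }

    // Called from the options screen, e.g. Rebind("Accelerate", KeyCode.Mouse1).
    public void Rebind(string action, KeyCode newKey)
    {
        bindings[action] = newKey;
    }
}
```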
My problem comes from NGUI not swallowing events when it handles them. For example, the player clicks on the HUD and still fires their weapon. I've read a fair amount, and it seems that fallthrough is the 'solution' to this problem.
Now, I've thought about using fallthrough and just catching all the events, but how would I deal with this in a GetButton scenario? NGUI sees clicks and OnKey events, but not the overlying Input.GetButton calls. So if I let OnClick fall through and handle it as a click, I have no way of knowing what that click is mapped to in the Input.GetButton context. Plus, I've tried catching OnKey events with fallthrough and it doesn't seem to work at all: my fallthrough processor catches mouse clicks (OnPress), but when I hit a key on the keyboard, the corresponding OnKey on my fallthrough GameObject is never triggered.
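In case it helps, this is roughly my fallthrough receiver (assigned via UICamera.fallThrough; the class name is mine). OnPress fires for mouse buttons, but OnKey never seems to arrive for keyboard keys:

```csharp
using UnityEngine;

// Receiver for NGUI events that no widget handled.
// Assigned as UICamera.fallThrough so unhandled events get routed here.
public class InputFallThrough : MonoBehaviour
{
    void Start()
    {
        UICamera.fallThrough = gameObject;
    }

    // This one triggers fine when I click outside the GUI.
    void OnPress(bool isDown)
    {
        Debug.Log("Fallthrough OnPress: " + isDown);
    }

    // This one never triggers for plain keyboard input.
    void OnKey(KeyCode key)
    {
        Debug.Log("Fallthrough OnKey: " + key);
    }
}
```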
Another thing I've tried is checking UICamera.hoveredObject: if it's null, I process my Input.GetButton calls; otherwise I assume the mouse is over the GUI. This doesn't quite work either, because the player could be hovering over the minimap or HUD with the mouse while holding W to accelerate.
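Concretely, the hoveredObject check I tried looks like this sketch — and you can see the flaw: hovering the HUD blocks all input, including keys that have nothing to do with the mouse:

```csharp
using UnityEngine;

// Gate all game input on whether the mouse is over an NGUI widget.
public class GameInput : MonoBehaviour
{
    void Update()
    {
        // UICamera.hoveredObject is null when the cursor isn't over any widget.
        if (UICamera.hoveredObject == null)
        {
            if (Input.GetButton("Accelerate"))
            {
                // accelerate the ship...
            }
        }
        // else: mouse is over the GUI, so everything is ignored --
        // even holding W while glancing at the minimap stops acceleration.
    }
}
```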
I've also read a bit about using NGUI's event system to handle actual game events and not just GUI events, but I fail to understand how that would work with mappable keys and buttons.
Does this make sense? Do I need to create a whole new control system now that I'm integrating NGUI? Does anyone have any suggestions on how I can handle keyboard and mouse input in a mappable manner... what's the best practice here?
Thanks,
Yard