Author Topic: 2D GUI + World Camera multi touch  (Read 1775 times)


  • Jr. Member
  • Posts: 83
2D GUI + World Camera multi touch
« on: January 27, 2013, 12:31:38 AM »
So I've got a game on iOS where I want the player to be able to touch a Left or Right arrow to move an object left and right, while also being able to swipe objects rendered by a secondary camera that I've added UICamera to.

Right now I'm doing custom raycasts to detect the objects under my touch events, but per the documentation and forum posts I've read, this doesn't work with NGUI.  How do I simulate a swipe on a main Camera with UICamera attached, get the equivalent of an OnHover() event, and call a function on the object?

For example, a user could be touching the 2D GUI buttons while swiping the screen with another finger at the same time; if a piece of treasure with a collider on it is detected under the swipe, the swipe calls the treasure's HitDetected() function and does its thing (only once).


  • Administrator
  • Hero Member
  • Posts: 22,128
  • Toronto, Canada
Re: 2D GUI + World Camera multi touch
« Reply #1 on: January 27, 2013, 12:45:07 AM »
You don't need to raycast. If you set UICamera.genericEventListener to a game object that has a script with an OnDrag function implemented, that function will receive a copy of all your drag events, letting you check what's underneath (UICamera.currentTouch.current).
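A minimal sketch of the approach described above, assuming the NGUI version in use exposes the field under this name (some versions call it UICamera.genericEventHandler instead). The "HitDetected" message name is taken from the question; everything else is illustrative:

```csharp
using UnityEngine;
using System.Collections.Generic;

// Sketch: attach to any GameObject and it registers itself as the
// generic event listener, receiving a copy of all NGUI drag events.
public class SwipeListener : MonoBehaviour
{
    // Remember what was already hit so HitDetected fires only once per object.
    HashSet<GameObject> alreadyHit = new HashSet<GameObject>();

    void Start ()
    {
        UICamera.genericEventListener = gameObject;
    }

    void OnDrag (Vector2 delta)
    {
        // The collider currently under the dragging touch, if any.
        GameObject hit = UICamera.currentTouch.current;
        if (hit == null) return;

        // Only notify each object once; SendMessage does nothing if the
        // target has no HitDetected() function.
        if (alreadyHit.Add(hit))
            hit.SendMessage("HitDetected", SendMessageOptions.DontRequireReceiver);
    }
}
```

Because the listener gets a copy of every drag event regardless of which finger produced it, this coexists with GUI buttons being pressed by another finger at the same time.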

There is no OnHover on touch-based devices.

You don't need to simulate anything in your case. Assuming you have a draggable panel, you can just give it some momentum (UIDraggablePanel.currentMomentum).
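For the momentum part, a hypothetical one-liner along these lines would nudge the panel as if the user had swiped it (the vector and component lookup are illustrative):

```csharp
using UnityEngine;

// Sketch: push a draggable panel to the left programmatically.
public class PanelNudge : MonoBehaviour
{
    void NudgeLeft ()
    {
        UIDraggablePanel panel = GetComponent<UIDraggablePanel>();
        if (panel != null)
            panel.currentMomentum = new Vector3(-100f, 0f, 0f);
    }
}
```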