I'm aware that Unity Remote sends both mouse and touch events, as mentioned here and here, but I need the ability to debug touch events in the editor, which makes Unity Remote a necessity.
What's happening is that with both Mouse and Touch enabled as Event Sources on UICamera, UICamera.currentTouch.pos returns the average position of all active touch points, and the OnDrag delta is identical for every OnDrag call within a frame. That means two fingers dragging in opposite directions report the same delta.
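For anyone who wants to reproduce this, a quick throwaway logger like the one below (TouchDebugLogger is just an illustrative name, not anything from NGUI) should show the raw Input touches still arriving with distinct per-finger positions while Input.mousePosition reports only the single, seemingly averaged point:

using UnityEngine;

// Throwaway diagnostic: logs each raw touch alongside Input.mousePosition
// so the two can be compared while Unity Remote is connected.
public class TouchDebugLogger : MonoBehaviour
{
    void Update()
    {
        // Raw touches from Unity Remote still carry distinct per-finger data.
        for (int i = 0; i < Input.touchCount; ++i)
        {
            Touch t = Input.GetTouch(i);
            Debug.Log("Touch " + t.fingerId + " pos " + t.position + " delta " + t.deltaPosition);
        }

        // The synthesized mouse position is a single point; with several
        // fingers down it matches the averaged position NGUI reports.
        if (Input.touchCount > 1)
            Debug.Log("Mouse pos: " + Input.mousePosition);
    }
}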
Unchecking Mouse as an Event Source kills touch functionality altogether, whereas I had hoped it would simply ignore the mouse events sent by Unity Remote and let touch events work unhindered.
Digging around in UICamera I found this block of code:
if (useTouch)
{
    if (mIsEditor)
    {
        // Only process mouse events while in the editor
        ProcessMouse();
    }
    else
    {
        // Process touch events first
        ProcessTouches();

        // If we want to process mouse events, only do so if there are no active touch events,
        // otherwise there is going to be event duplication as Unity treats touch events as mouse events.
        if (useMouse && Input.touchCount == 0)
            ProcessMouse();
    }
}
That looks like why disabling Mouse as an Event Source also kills touch functionality when using Unity Remote: in the editor, which is exactly where Unity Remote connects, ProcessTouches() is never called and everything is routed through ProcessMouse() instead.
So how would you recommend fixing this? My guess involves removing the mIsEditor block (roughly sketched below), but of course I'm wary about editing NGUI source.
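For concreteness, this is the kind of edit I'm imagining, a minimal, untested sketch of the block above that only falls back to mouse processing in the editor when no real touches are present:

if (useTouch)
{
    // Sketch: in the editor, prefer real touches (e.g. from Unity Remote)
    // and only fall back to mouse processing when no touches are active.
    if (mIsEditor && Input.touchCount == 0)
    {
        ProcessMouse();
    }
    else
    {
        ProcessTouches();

        // Avoid event duplication: Unity also reports touches as mouse events.
        if (useMouse && Input.touchCount == 0)
            ProcessMouse();
    }
}

That would preserve normal in-editor mouse behaviour while letting Unity Remote's touches reach ProcessTouches(), but I don't know what else might depend on the mIsEditor path, which is why I'm asking before hacking at it.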