So I've seen some questions about this, but I haven't been able to find a definitive answer that worked for me.
My setup: I raycast from my Main Camera (which shows the level) to the current mouse position in the world. When I click, I fire a second raycast that tests what I clicked on and returns the gameObject.tag, so I can do different things depending on the type of object.
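For reference, here's a minimal sketch of the click handling I'm describing (the tag name "Placeable" is just a placeholder, not my real tag):

```csharp
using UnityEngine;

public class ClickSelector : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Cast from the Main Camera through the mouse position into the world.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                // Branch on the tag of whatever was clicked.
                switch (hit.collider.gameObject.tag)
                {
                    case "Placeable": // hypothetical tag
                        // ... do stuff for this object type
                        break;
                }
            }
        }
    }
}
```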
What I THINK is wrong:
My NGUI stuff is located at 0,0,0 over in a corner of the world. It looks normal in game and works like it should as far as clicking and making menus pop up. However, since I have two cameras and the raycast comes from the Main Camera, it doesn't register that there's a button under the mouse, because the NGUI elements' actual world position is 0,0,0, not where they appear on screen.
I'm not sure whether I should parent the UI Root to the Main Camera to make it line up that way, or whether there's a way to make the raycast detect those buttons anyway (treat them as being where they appear on screen at runtime, even though that's not where they physically sit in the scene). I really need the raycast not to pass through those on-screen elements, because right now it lets the player place objects where they shouldn't be able to, which causes all sorts of bugs.
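One approach I've been considering (a sketch, assuming NGUI 3.x, where the event system tracks the widget under the pointer in `UICamera.hoveredObject`): check whether the mouse is over any NGUI widget before doing the world raycast at all, and swallow the click if it is.

```csharp
using UnityEngine;

public class ClickSelector : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // If NGUI reports a widget under the pointer, don't raycast
            // into the world -- the click belongs to the UI.
            if (UICamera.hoveredObject != null)
                return;

            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                // ... place the object / read hit.collider.tag as before
            }
        }
    }
}
```

The alternative would be putting the UI on its own layer and doing a layer-masked raycast from the UI camera first, but I'm not sure which is the intended way.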
Any help would be GREATLY appreciated.