That seems backwards. Why wouldn't you click the weapon to select the type of attack/action you want to take, and then have a tap in the game client area mean "fire that type of weapon"?
Unless it somehow makes more sense to have a target reticle that is placed first... but it does, indeed, seem a frustrating way to trigger actions. You'd have to estimate the positions of moving targets and place the reticle out in front of them, then hope you can click the attack type and get the timing right. Of course, your fingers are out of the way, so it is easier to see the result.
Having the order reversed means that the attack is initiated on the tap of the target rather than on the tap of the weapon selection, which means the user's fingers may be over the screen and obscuring part of the display at the critical moment.
Really, this isn't an NGUI question. It's a game design question.
(I may have missed the intent of your question, though. Any object with a collider receives NGUI events if it's on a layer included in the UICamera's event mask. So it should be fairly easy to have your playfield incorporate a collider that receives the events, then grab the event's coordinates to place the target, fire the weapon, or whatever.)
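To illustrate that last point, here's a minimal sketch of a playfield script. It assumes NGUI's usual convention of delivering OnClick to the collider under the touch and exposing the hit via UICamera.lastHit; the PlayfieldTap class name and targetReticle field are hypothetical, not part of NGUI.

```csharp
using UnityEngine;

// Hypothetical example: attach to a playfield object that has a collider
// on a layer included in the UICamera's event mask.
public class PlayfieldTap : MonoBehaviour
{
    public Transform targetReticle; // hypothetical reticle object, assigned in the inspector

    // NGUI sends OnClick to the collider under the tap/click.
    void OnClick ()
    {
        // UICamera.lastHit holds the raycast hit for the event that reached us,
        // so its point gives the world-space position of the tap on the playfield.
        Vector3 tapPosition = UICamera.lastHit.point;
        targetReticle.position = tapPosition;
        // ...or skip the reticle and fire the currently selected weapon at tapPosition.
    }
}
```

With that in place, the weapon buttons only need to set which attack is "current"; the playfield collider handles the where.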