I'm using a WebCamTexture to update an NGUI UITexture with whatever the device camera sees.
If I assign the WebCamTexture to a standard Unity plane's renderer material, the texture updates and shows the camera feed, both in the editor and on my Android device.
However, if I assign it to a UITexture, the camera feed shows on the UITexture in the editor, but on my Android device the texture is just black.
I am using NGUI 3.9.0. This is my script:
using UnityEngine;

public class WebcamTest : MonoBehaviour {
    public string deviceName;
    WebCamTexture wct;
    [SerializeField] UITexture _uiTexture;

    void Start () {
        // Grab the first available camera device
        WebCamDevice[] devices = WebCamTexture.devices;
        deviceName = devices[0].name;
        wct = new WebCamTexture(deviceName, 400, 300, 12);

        // This works on both editor and Android:
        //GetComponent<Renderer>().material.mainTexture = wct;

        // Tried assigning via the material too, same result:
        //_uiTexture.material.mainTexture = wct;

        _uiTexture.mainTexture = wct;
        wct.Play();
    }
}
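One thing I was going to try next, in case it matters: explicitly requesting camera permission before touching the WebCamTexture. This is only a guess at the cause (a denied permission can leave the texture black on device); `Application.RequestUserAuthorization` and `UserAuthorization.WebCam` are the standard Unity APIs, and the rest just mirrors my script above:

using System.Collections;
using UnityEngine;

public class WebcamTest : MonoBehaviour {
    public string deviceName;
    WebCamTexture wct;
    [SerializeField] UITexture _uiTexture;

    // Start can be a coroutine, so we can yield on the permission request
    IEnumerator Start () {
        // Ask for camera access first; without it the feed may stay black on device
        yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);
        if (!Application.HasUserAuthorization(UserAuthorization.WebCam)) {
            Debug.LogWarning("Camera permission denied");
            yield break;
        }

        WebCamDevice[] devices = WebCamTexture.devices;
        deviceName = devices[0].name;
        wct = new WebCamTexture(deviceName, 400, 300, 12);
        _uiTexture.mainTexture = wct;
        wct.Play();
    }
}

I haven't confirmed yet whether this changes anything on Android, so if the problem is NGUI-specific rather than permission-related, I'd appreciate a pointer.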