As an effect in our game we're using a simple script that creates a 500x500 texture, fills each pixel in that texture with a random color, and then assigns it to a UITexture object that uses an Unlit/Texture shader. Here's a simplified version of our script:
using UnityEngine;
using System.Collections;
public class BlitTest : MonoBehaviour {

    // Use this for initialization
    void Start ()
    {
        Texture2D texture = new Texture2D( 500, 500, TextureFormat.ARGB32, false );

        for( int x = 0; x < 500; x++ )
        {
            for( int y = 0; y < 500; y++ )
            {
                texture.SetPixel( x, y, new Color( Random.Range( 0f, 1f ), Random.Range( 0f, 1f ), Random.Range( 0f, 1f ) ) );
            }
        }

        texture.Apply();
        GameObject.FindObjectOfType<UITexture>().mainTexture = texture;
    }
}
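For completeness, the Unlit/Texture shader mentioned above is assigned along these lines in the full script (a simplified sketch; the uiTexture variable name is just illustrative):

// Sketch: grab the UITexture, point it at the Unlit/Texture shader, then hand it the generated texture.
UITexture uiTexture = GameObject.FindObjectOfType<UITexture>();
uiTexture.shader = Shader.Find( "Unlit/Texture" );
uiTexture.mainTexture = texture;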
When using NGUI 3.10, the results are as expected:

However, when we upgrade to NGUI 3.11, the UITexture no longer shows up in the game:

The only difference between the two screenshots is that the second one uses NGUI 3.11 instead of 3.10.
I noticed in the readme that there are a lot of changes within NGUI regarding UVs, geometry, etc. Could one of those be causing this issue, and is there a way to resolve it? I can provide the test project I used for this demo if necessary.
Thanks
-Mo