Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Topics - bac9

NGUI 3 Support / Any alternatives to UIPanels for hard edge clipping?
« on: October 11, 2014, 04:20:08 AM »
I'm implementing a reusable UI based on Android interface guidelines. Many animated elements from the guidelines can easily be implemented using simple layered sprites. For example, here are standard switches using three simple NGUI sprites each:

With other elements, there is a tiny little problem, though: the extreme use of clipped effects on rectangular clickable elements. Check the following sample animations from the guidelines:

Every touch creates a ripple that must be clipped within the touched element. That's where things get tricky. You can imitate it pretty easily using a UIPanel, of course:

Except you definitely do not want a UIPanel in the hierarchy of every single button, tab, field and card: you do not want to isolate parts of your widgets into an entirely separate branch of the depth hierarchy, and you do not want to clutter your panel list with those tiny clip areas. So, is there a lightweight alternative to UIPanel that does only clipping, without all the other rich functionality and hierarchy fragmentation UIPanels bring with them? Even extremely crude hard-edge rectangular clipping (for example, depth based) would suffice - no need for texture-defined masking, circular clipping or anything like that.

NGUI 3 Support / Simple anchor setup question
« on: October 09, 2014, 05:39:02 AM »
I was not able to figure it out from the documentation, so I apologize if it is obvious. The question is: how can I (through code) set the reference points of a widget's Unified anchoring to the Top/Bottom/Left/Right enum values that the NGUI inspector UI offers? I have not found an argument like that in the SetAnchor method or among the properties of individual anchors, which leaves me reliant on a rather ugly hack: I match the pixel dimensions and position of an anchored widget to those of its parent and hope that NGUI auto-selects the appropriate reference mode on each of the four anchor points. It usually does, because the mode I want has the closest relative offset (zero), but I'd rather avoid doing that.

I understand that those handy enums, and the Unified anchoring mode as a whole, are probably just editor-side wrapping over auto-generated Transforms and the underlying Advanced anchoring mode, and I see the methods allowing you to set reference Transforms for anchors, but I'm not seeing any way to fetch those points in the first place. I can calculate the 4 corner points and 4 side midpoints myself, of course, and create my own transforms there, but that would be very dirty and I'd much rather find a way to get a reference to the existing ones.
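For reference, the closest I've gotten is setting each anchor directly. This is an untested sketch that assumes the inspector's Left/Right/Top/Bottom options are just sugar over each anchor's relative value (0 = left/bottom edge, 0.5 = center, 1 = right/top edge of the target) - which is exactly the assumption I'd like confirmed:

```csharp
using UnityEngine;

public class AnchorSetupExample : MonoBehaviour
{
    public UIWidget widget;   // the widget to anchor
    public Transform target;  // e.g. the parent container

    void Start ()
    {
        // Pin each side of the widget to the matching side of the target,
        // with a 16-pixel inset on every edge.
        widget.leftAnchor.target = target;
        widget.leftAnchor.Set(0f, 16f);    // relative 0 = target's left edge?
        widget.rightAnchor.target = target;
        widget.rightAnchor.Set(1f, -16f);  // relative 1 = target's right edge?
        widget.bottomAnchor.target = target;
        widget.bottomAnchor.Set(0f, 16f);
        widget.topAnchor.target = target;
        widget.topAnchor.Set(1f, -16f);

        widget.UpdateAnchors();  // assumption: forces re-evaluation right away
    }
}
```

If that's how Unified mode works under the hood, I can live without the enums, but confirmation would be nice.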

Components placed on UI objects with colliders enjoy the extremely useful OnHover and OnPress messages, which can be used to call void methods with isOver and isDown bool arguments. This allows me to easily hook complex animated behavior outside the UIButton scope to buttons, for example.

Unfortunately there is an inconvenience: I'm creating prefabbed UI elements with wrapper components (like ready-to-use buttons with labels), and I'd prefer the wrapper component to sit alone on a parent object, with the label, sprite, collider and other stuff in the hierarchy under it. Naturally, those neat messages stop arriving at my component when there is no collider on it. As a workaround, I'm trying to subscribe that component on the parent GameObject to the UIButton events, but I'm not sure the same functionality is supported through them. I'm using the following simple code:

  referenceButton.onClick.Add (new EventDelegate (this, "OnPress"));

But as far as I understand, the button can't send that event with bool arguments describing whether a press has started or ended, and there is no hover event at all. Is there any way to get four-state input response (hover entered the area, hover left the area, click started, click ended) on a remote object, or should I abandon events altogether and use a collider on the parent to intercept all those messages directly?

The very obvious workaround is to create a component placed on the child GameObject with the UISprite/collider that does absolutely nothing but catch the OnHover/OnPress messages and relay them to a reference to my wrapper component - but it would be very nice if I could avoid cluttering the project with classes of that sort.
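To be concrete, this is the kind of relay class I mean - a minimal sketch where UIButtonWrapper is a hypothetical stand-in for my wrapper component's class name:

```csharp
using UnityEngine;

// Sits on the child object that has the collider, and forwards NGUI's
// OnHover/OnPress messages up to the wrapper component on the parent.
public class EventRelay : MonoBehaviour
{
    public UIButtonWrapper wrapper;  // assigned in the prefab (hypothetical class)

    void OnHover (bool isOver) { if (wrapper != null) wrapper.OnHover(isOver); }
    void OnPress (bool isDown) { if (wrapper != null) wrapper.OnPress(isDown); }
}
```

It works, but one of these per prefab type is exactly the clutter I'd like to avoid.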


Edit: Oh, I see I have overlooked UIEventListener class.
I'm not sure I follow how it works though. The example contained in the description of that class is not working:

  UIEventListener.Get(gameObject).onClick += MyClickFunction;
Error: Operator `+=' cannot be applied to operands of type `UIEventListener.VoidDelegate' and `method group'


Edit 2: I think I got it working - it looks like every method called through the listener needs a GameObject argument in addition to the standard arguments like the bools described in the event tutorial.
Could it be down to the fact that I'm subscribing on creation of the button in the editor? Maybe it only works when I subscribe ingame?


Edit 3: Yeah, that was it - any and all subscriptions should be performed while the game is running.
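For anyone hitting the same wall, here's the kind of subscription that compiles for me now - a sketch assuming UIEventListener's onClick delegate takes (GameObject) and onHover/onPress take (GameObject, bool):

```csharp
using UnityEngine;

public class ButtonHook : MonoBehaviour
{
    public GameObject buttonObject;  // the child object with the collider

    void Start ()  // subscribing at runtime, not from the editor
    {
        UIEventListener listener = UIEventListener.Get(buttonObject);
        listener.onHover += OnButtonHover;
        listener.onPress += OnButtonPress;
    }

    // Note the GameObject argument in addition to the usual bool
    void OnButtonHover (GameObject go, bool isOver) { Debug.Log(go.name + " hover: " + isOver); }
    void OnButtonPress (GameObject go, bool isDown) { Debug.Log(go.name + " press: " + isDown); }
}
```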

I'm encountering a pretty strange problem. My UI camera is rendering to a texture, which I then use in a material applied to a curved surface attached to the player. It's a standard approach for UI rendering with Oculus, Elite-style world-space UI and other use cases like that. Everything works fine except for one little thing: for some strange reason, not a single UISprite is written to the alpha of that RenderTexture. All UITexture and UILabel objects are rendered perfectly, but sprites vanish. To illustrate:

Original UI:

What I get in the render texture:

I'm not sure what could be the reason for that, frankly.

  • I have checked the source textures - the problem still appears when an image that works fine on a UITexture is used on a UISprite, so it's not an all-black alpha channel in the source
  • UITextures don't have a custom shader/material, so they are rendered through the exact same shader the atlased sprites use, which rules out a shader issue

Any ideas?

If it helps, I'm using the 3D UI prefab NGUI creates from the menu, with minimal changes. The UI camera is using the Solid Color clear mode with the alpha of the background color set to zero (it's the only clearing configuration that works properly with render textures; anything else leaves trash from previous frames in the texture).


Edit: Looks like I overlooked the shader difference between UITexture and UISprite - the latter is using a premultiplied shader. Can you fill me in on why a premultiplied shader can't be rendered into a texture, and what I would lose by switching the atlas to the unlit colored shader UITexture uses by default?

Quite a minor problem, but I'm curious if it's possible to fix it. At the moment you add your custom packets directly into the TNet code, which obviously becomes a problem if an update contains any of those files, as that will wipe your packets out. I'm relatively safe thanks to version control, as I can check out the old version and copy the missing stuff back, but that won't be the case for every user. Is there a way to separate custom packets into standalone files, so this situation can't arise no matter what TNet files are updated in coming versions?

Quite a simple and straightforward problem: I have noticed that some of the UITextures in my scenes are sometimes automatically converted to UISprites if the textures they use are present in some atlas. I guess it's a helpful optimization that saves time in some use cases (e.g. design a UI with a scattered collection of texture files, generate an atlas from what you actually ended up using, and get the whole scene instantly converted to use it properly), but in other use cases it's unfortunately harmful: in particular, when you are using a UITexture to allow custom shading of a particular UI element (for example, a UITexture with a custom GrabPass-based UI shader for semi-transparent blurry window backgrounds).

So, is there a way to disable that option?

NGUI 3 Support / Feature request: inverse sliced sprite rendering
« on: July 15, 2014, 09:51:57 AM »
A few days ago I had to implement Android L-style UI elements of arbitrary size which can change their depth and have to cast an appropriately scaled/blurred shadow as that depth slides around during animations. This turned out to be a bigger problem than I expected, because while the Sliced sprite type allows you to make a shadow that perfectly fits a rectangle of arbitrary size (using anchoring), it does not allow you to scale the border quads of the shadow sprite while anchoring the center quad to a parent object.

The problem stems from the fact that Sliced sprites, of course, get their full scale from their height/width properties and calculate the size of the border quads by subtracting offsets from the resulting area. So, if you want to create a shadow with a gradient covering 16 pixels around a rectangle, you have to create a sliced sprite with 16px gradients, set up its border offsets to 16px, then use anchoring with a 16px offset. Now, if you want the very same rectangle to cast a 32px shadow, you can't reuse the same shadow sprite.

I think this issue can be fairly easily fixed, opening a way to a very rich variety of UI elements ranging from shadows with dynamically adjustable radius to reusable frames, animated pixel-perfect edge highlights and so on. Here is how the proposed additional sprite rendering mode can work.


  • Use UISprite properties to set up the dimensions of the center quad instead of the total sprite dimensions
  • Use the pixel size of the borders the sprite has listed in the atlas properties only to set up UV mapping of the 9-quad sprite, and not for the actual size of the border quads
  • Set up the size of the 8 border quads using an additional property ("border size" or something like that), be it a single int for a uniform offset or a Vector4 for anchor-like flexible offsets

How feasible is it to add something like this to NGUI?

TNet 3 Support / OnNetworkDisconnect not being called
« on: June 23, 2014, 10:10:11 AM »
Maybe I have overlooked something in the documentation, but is OnNetworkDisconnect supposed to be called in the latest version of TNet? As far as I can see, only OnNetworkConnect with a false result argument is actually called when a client is disconnected from a server, and OnNetworkDisconnect remains inactive no matter the scenario. This reproduces in the AutoJoin example shipped with TNet.

This isn't causing me any obstacles - keeping both connect- and disconnect-related code under OnNetworkConnect is even somewhat convenient. But I'm curious whether OnNetworkDisconnect was deprecated/deactivated, or whether I'm doing something wrong (like, maybe it's only supposed to work on an object with an active TNObject)?

TNet 3 Support / Building a modified server executable
« on: June 18, 2014, 06:21:50 AM »
Sorry if this was brought up before, but is there a page in the documentation or a tutorial available somewhere covering how to modify the standalone server code and build it? I'm not terribly familiar with this aspect of VS - I have only ever used it to work on code and never touched the build options and project management. I imagine there are dozens of tutorials on the subject if I knew what to search for. In particular, I'm wondering about two things:

1. How do I get the TNetServer project working? I understand that TNet ships with the source code of the server, and I see Assets/TNet/TNetServer/TNetTest/ServerMain.cs plus the .csproj/.sln in the server archive, but I'm not sure if I'm opening them correctly. I'm not seeing how ServerMain.cs is supposed to work with "using TNet;" when its classes aren't there in the TNetTest folder. Am I supposed to somehow modify the project settings, or move the TNet classes from Assets/TNet to the Assets/TNet/TNetServer/TNetTest folder?

2. How do I build the TNetServer project properly? It's probably something straightforward like hitting a hotkey and choosing the name/location of the executable, but there are plenty of branching menus in VS and I'm not yet sure where to look for the build option.

The reason I'm touching the subject in the first place is that I need to implement a simple authentication system. I already have it up and running on the client, and it's possible to leave it this way by using very dirty tricks like distributing the username/password DB to every single client without ever touching the server logic, but it would be nicer to have it in one place. Hence the questions above.

I have just started learning TNet, and I'm encountering a strange issue with the TNAutoJoin component used in the second TNet tutorial and in an example scene. Any attempt to start the game is interrupted by the following error:

  Exception has been thrown by the target of an invocation. (TNAutoJoin.OnNetworkConnect)
  UnityEngine.Debug:LogError(Object, Object)
  TNet.UnityTools:Broadcast(String, Object[]) (at Assets/TNet/Client/TNUnityTools.cs:216)
  TNManager:OnConnect(Boolean, String) (at Assets/TNet/Client/TNManager.cs:977)
  TNet.GameClient:ProcessPacket(Buffer, IPEndPoint) (at Assets/TNet/Client/TNGameClient.cs:665)
  TNet.GameClient:ProcessPackets() (at Assets/TNet/Client/TNGameClient.cs:633)
  TNManager:Update() (at Assets/TNet/Client/TNManager.cs:962)

The server logs the connection, though. I've tried checking a few options, like changing the IP and ports of the server in the server config and the TNAutoJoin inspector, but those parameters don't seem to be related to the issue. What else could be happening, and is there a way to make the "Exception has been thrown by the target of an invocation." error more informative? (I've seen lots of very varied cases on the net mentioning it, none applying to this situation as far as I can tell, so it might be a generic error covering a very broad range of potential causes.)
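From what I can tell, that message is the generic System.Reflection.TargetInvocationException that .NET wraps around any exception thrown inside a method called via MethodInfo.Invoke - the real error sits in InnerException. A sketch of how a Broadcast-style helper could unwrap it (the actual TNet code in TNUnityTools.cs may be structured differently):

```csharp
using System.Reflection;
using UnityEngine;

public static class ReflectionDebug
{
    public static void Invoke (MonoBehaviour target, string methodName, params object[] args)
    {
        MethodInfo method = target.GetType().GetMethod(methodName,
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);
        if (method == null) { Debug.LogError("No such method: " + methodName); return; }

        try
        {
            method.Invoke(target, args);
        }
        catch (TargetInvocationException ex)
        {
            // Log the wrapped exception instead of the useless outer message
            Debug.LogError(ex.InnerException.Message + "\n" + ex.InnerException.StackTrace);
        }
    }
}
```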

Update: Rechecked it in an empty project - the error does not happen there, so it's most likely down to differences in the TNAutoJoin public parameters I was attempting to use. The question about making the "Exception" error more informative still stands, though, as I'm pretty much in the dark about the cause.

Update 2: Reimporting TNet somehow fixed the issue; the examples now work even in the current project. Not sure if it's related or not, but I'm now getting a strange error in the server log that doesn't seem to affect the connection:

  has connected
  The requested address is not valid in its context.

This happens on every single connection I make with the server and client situated on the same PC.

NGUI 3 Support / Color tint support for custom shaders
« on: June 10, 2014, 09:21:41 AM »
Can someone point me to the proper way of exposing a shader tint color to NGUI? Not a single shader shipping with NGUI has such a property, yet somehow every texture and every sprite still supports tinting. How is that achieved? I'm mostly interested in this to get UITexture elements with custom shaders working with NGUI inspector tinting, as creating materials and dragging them into the UITexture material slot every time you need to update a color isn't the most convenient workflow.

I tried a few widespread naming conventions like _Color and _Tint, but NGUI doesn't seem to look for properties with those names. What is it using, then?

I'm encountering a very weird issue and I'm not sure where to start digging for the reason behind it. Any help would be appreciated. In short, what's happening is this: some of the panels in my application (a relatively simple 2D game built entirely with NGUI panels) disappear during prolonged play with no clear trigger or reason. Here is how one of the screens looks normally:

And here is how it looks once the glitch happens:

Unfortunately, over a week of testing, I was only able to encounter the issue on iOS and Android devices and never in the editor, which severely complicates things, as I'm unable to inspect the state of the missing objects to figure out how they disappear. My theories so far:

- Something is changing the depth of a UIPanel, hiding it behind the background (I proved that unlikely by checking the glitch in a build with a transparent background - nothing was there)
- Something is changing the alpha of a UIPanel to 0, making it invisible
- Something is disabling or destroying the GameObject housing a UIPanel (might be likely, as one of the managers hangs if the glitch happens during interaction with one of the plants, indicating a possible NRE)
- Something is moving a UIPanel GameObject out of view
- Some internal issue corrupts the UIPanel composite texture, making it invisible

As I've said, though, it's very hard to know what exactly happens when I'm unable to reproduce the issue in the Editor or during on-device tests with Xcode tethering. The only things I know so far are:

- The glitch usually happens more than half an hour into non-stop gameplay
- The glitch consistently affects only certain UI elements, absolutely never touching the background panel, upper menu panel or main menu panel
- The glitch happens all at once, never gradually

Another problem is that I'm absolutely sure I don't have a single line in my project code that controls UIPanel depth or alpha, and I don't have a single line moving or destroying the GameObjects that are disappearing. Yet they somehow do disappear.

How can I approach debugging this? I have implemented an ingame debug console, but obviously I can't just dump the state of every disappearing UIPanel/GameObject on every Update() and hope to get a readable log out of it. Is there a cleaner method? For example, a way to hook a debug message to every alpha change, depth change and other change happening to UIPanel properties (if those have getters/setters and aren't just public fields).
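The closest I've come up with without touching NGUI internals is a change-only watcher: poll the suspect UIPanel once per frame and log only when alpha, depth, position or active state actually change, so the console stays readable. A sketch of the idea:

```csharp
using UnityEngine;

public class PanelWatcher : MonoBehaviour
{
    public UIPanel panel;  // the panel that keeps disappearing

    float lastAlpha;
    int lastDepth;
    Vector3 lastPosition;
    bool lastActive;

    void Start ()
    {
        // Capture the initial state so the first frame doesn't spam the log
        lastAlpha = panel.alpha;
        lastDepth = panel.depth;
        lastPosition = panel.transform.position;
        lastActive = panel.gameObject.activeInHierarchy;
    }

    void Update ()
    {
        if (panel == null) { Debug.LogWarning(name + ": watched panel destroyed"); enabled = false; return; }

        if (panel.alpha != lastAlpha) Debug.Log(panel.name + " alpha: " + lastAlpha + " -> " + panel.alpha);
        if (panel.depth != lastDepth) Debug.Log(panel.name + " depth: " + lastDepth + " -> " + panel.depth);
        if (panel.transform.position != lastPosition) Debug.Log(panel.name + " moved to " + panel.transform.position);
        if (panel.gameObject.activeInHierarchy != lastActive) Debug.Log(panel.name + " active: " + panel.gameObject.activeInHierarchy);

        lastAlpha = panel.alpha;
        lastDepth = panel.depth;
        lastPosition = panel.transform.position;
        lastActive = panel.gameObject.activeInHierarchy;
    }
}
```

It won't catch the destroy case with a stack trace, but it should at least tell me which of my theories is the right one.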

I'm wondering if there is a good way to achieve the effect depicted below without resorting to huge atlases with manually blurred screenshots. I know iOS/OSX do the effect by applying real blur to their render textures - maybe there is a way to alter the UIPanel code and NGUI shaders to achieve the same thing? Like allowing a UIPanel to carry a blur value that a shader uses to blur the texture.

I'm making a stylized pseudo-3D UI using a fisheye non-orthographic camera. Here is a simple gif:

My approach is pretty simple - I keep a traditional orthographic camera on while editing, and it's replaced by another camera with all those distortions as the game starts, so I'm spared from dealing with alignment issues and weird distortions while designing the UI content. Still, are there any potential problems that could arise from that setup I should be aware of? For instance, would raycasting continue to work correctly in all cases?

NGUI 3 Support / Is there a way to use anchors to rotate an object?
« on: March 26, 2014, 05:18:40 AM »
I'm wondering if it's possible to use the NGUI anchor system not just for horizontal/vertical alignment of elements but also to rotate something. Let's say I have a rectangle-based line with a 4px Y size (height) and variable X size (length) that I want to keep attached to a corner of one object and stretched to a corner of another object, even when those corners are not horizontally aligned.

I would guess it's not possible using only the anchors, and I'm better off writing a bit of custom code doing the rotation and X scaling? Something similar to widespread "look at" implementations, with the anchors left to handle just the origin point of the line and nothing else, I guess?
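In case anchors can't do it, this is roughly the "look at" code I have in mind - a sketch that assumes the line widget's pivot sits at the center of its left edge, and that pointA/pointB are hypothetical transforms marking the two corners:

```csharp
using UnityEngine;

public class LineConnector : MonoBehaviour
{
    public UIWidget line;      // the 4px-high rectangle sprite
    public Transform pointA;   // corner of the first object
    public Transform pointB;   // corner of the second object

    void LateUpdate ()
    {
        // Work in the line's parent space so widget dimensions stay in pixels
        Vector3 a = line.transform.parent.InverseTransformPoint(pointA.position);
        Vector3 b = line.transform.parent.InverseTransformPoint(pointB.position);
        Vector3 delta = b - a;

        line.transform.localPosition = a;
        line.transform.localRotation = Quaternion.Euler(0f, 0f,
            Mathf.Atan2(delta.y, delta.x) * Mathf.Rad2Deg);
        line.width = Mathf.RoundToInt(delta.magnitude);  // X size = distance
        line.height = 4;
    }
}
```

If there's a built-in way to get the same result from the anchor system alone, I'd prefer that over yet another per-frame component.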
