Dev Blog / Apr 17, 2016 - Once Upon a BRDF...
« on: April 17, 2016, 08:02:20 PM »
As I predicted in my previous post, this week was spent texturing the planetary terrain. Little did I expect to delve as deeply into Unity's graphics as I did, though. It was a classic example of "one thing leads to another"...

The week started with me simply writing code to generate cylindrical mapping UVs alongside the vertices in order to stretch a basic texture over the terrain and see how it looks. It wasn't bad per se, but it wasn't quite good either:





The shadows were way too dark, which makes perfect sense really: there was no ambient lighting aside from what was coming from the Milky Way skybox. It was most noticeable during sunset/sunrise. Take the following image for example:



As you can see, the entire terrain is just a black blob. The sky is still fairly well lit, meaning there should be plenty of indirect illumination coming from it that should light the terrain. So the question was, how to address it? A simple solution would be to change the ambient lighting from the space skybox to a color instead, but this approach wouldn't work well because it would give everything a flat, uniform tint.

Another approach within Unity is to specify a gradient. This would work well, wouldn't it? Sure would, if the terrain were flat. Perhaps I'm wrong, but I saw no matrix in there that could be set to actually transform the direction of the gradient. In other words, since there is no way to choose an "up" vector, this approach wouldn't work either.

The same problem prevented me from using Unity's "default skybox" as well. There is simply no way to change the "up" vector.

This leaves just the static cubemap skybox-based approach that I was already using. If only there were a way to update the contents of this skybox in real-time based on what's around the player, right? Well, turns out there is. The legacy approach, which I started with first, was to just render into a cubemap and then use that cubemap in the shader somewhere. It was promising, but unfortunately cubemaps rendered this way don't have mip-maps, meaning LOD downsampling wasn't working. If you're wondering why this is necessary... think of indirect lighting as a very blurry 360-degree photo. If you don't blur it, then you will be able to see reflections of the world instead of soft ambient lighting. In fact, it's that last part -- reflections -- that triggered a thought within my head: what about reflection probes?
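For reference, the legacy path boils down to a few lines -- a minimal sketch, assuming a disabled helper camera parked wherever the probe should be (the class and field names here are mine):

using UnityEngine;

public class LegacyCubemapProbe : MonoBehaviour
{
	public Cubemap targetCubemap; // rendered into directly -- and no mip chain gets generated
	Camera cam;

	void Start ()
	{
		// A disabled camera used purely for cubemap rendering
		cam = gameObject.AddComponent<Camera>();
		cam.enabled = false;
	}

	void LateUpdate ()
	{
		// Renders all 6 faces from this transform's position
		cam.RenderToCubemap(targetCubemap);
	}
}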

Unity now has a feature called reflection probes that you can sprinkle liberally throughout your scene in order to get much more realistic-looking reflections on objects. It works by rendering cubemaps at multiple points in the scene, then simply blending between them to apply more realistic reflections to objects moving through the scene. There is something called a "light probe group" as well, but it didn't seem to actually do anything when I tried to use it, and in any case, based on what I've read, light probes seem geared more toward static scenes.

Reflection probes though -- they are most certainly not limited to being static. In fact, simply adding a probe to a blank scene immediately changes how objects using the Standard shader look (assuming the material is smooth). Better still, it was possible to sample the reflection probe's cubemap at reduced LOD levels, thus getting that blurry ambient lighting cubemap I was looking for! So that's when I thought to myself: why not just have a real-time reflection probe follow the main camera around, rendering only the really big objects into it, such as the planet (using a simple sphere), its atmosphere, and the skybox itself? The result was immediately promising:
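Here is roughly what that setup looks like -- a minimal sketch, assuming the big stand-ins (planet sphere, atmosphere, skybox) live on their own layer; the names are mine:

using UnityEngine;
using UnityEngine.Rendering;

public class FollowProbe : MonoBehaviour
{
	public Transform target; // the main camera

	void Start ()
	{
		var probe = gameObject.AddComponent<ReflectionProbe>();
		probe.mode = ReflectionProbeMode.Realtime;
		probe.refreshMode = ReflectionProbeRefreshMode.EveryFrame;
		probe.size = new Vector3(100000f, 100000f, 100000f);
		// Only render the huge objects: planet sphere, atmosphere, skybox
		probe.cullingMask = LayerMask.GetMask("PlanetaryScale");
	}

	void LateUpdate () { transform.position = target.position; }
}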





Unfortunately, as I quickly learned, there were downsides that ultimately forced me to write a custom BRDF. In technical terms, BRDF stands for "Bidirectional Reflectance Distribution Function". In more common terms, it's just a function that accepts various parameters related to the material and light and spits out the final color the material should have. Unity 5 started using them, and there are 3 different versions available. I tried all 3, but none had the look I was after. The default BRDF was the most promising for well-lit scenes, but its handling of specular lighting was just flat out weird. Take this picture for example:



The tops of those hills are all a uniform greyish color. In fact, as the sun's angle gets more and more shallow and the specular grazing angle increases, the material becomes more and more washed out, and there is no way to turn this off. Take the following 100% black sphere as another example. Albedo and specular are both pure black, and yet when you look at it from a shallow angle to the light, it starts becoming brighter and brighter:



Where is that color coming from? The light -- its specular contribution. But why isn't there any way to get rid of it? Not every material has such strong specular reflections. What about materials like Vantablack, for when it's time to create a stealth coating? Or the more common case: eliminating specular highlights from the terrain while leaving them intact on the water, like this:



Worse still, the standard BRDF's ambient lighting comes from what seems to be a static source. That is, if the light moves, creating different lighting conditions at run-time, materials lit by the Standard shader don't seem to notice. Take the following sphere placed in a blank scene with the default skybox and a reflection probe set to render only that skybox, for example. I simply hit Play, then changed the angle of the light to that of a sunset:





The sphere is still lit like it was daytime! That's obviously completely unacceptable. Playing around with the settings reveals that the reflections come from the reflection probe and look correct, but the global illumination term seems to come from a static texture of some kind, and as such doesn't change at run-time.

But hey, the best part about the new rendering pipeline is that it's fully exposed for modification, so I got started on my own custom lighting path. The first thing I did was replace the "ShadeSHPerPixel" call in my copy of the UnityGI_Base function, found in UnityGlobalIllumination.cginc, with a call that samples the reflection probe's cubemap. Since I only have one reflection probe and don't plan on having more, I didn't need to do any blending and only sampled the first one:
#if UNITY_SHOULD_SAMPLE_SH
	// BUG: This flat out doesn't work when a skybox-based ambient lighting is used with the skybox updated at run-time
	//gi.indirect.diffuse = ShadeSHPerPixel (normalWorld, data.ambient);

	// This gives similar results and also happens to work properly
	float4 diffuse = UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, normalWorld, 0.8 * UNITY_SPECCUBE_LOD_STEPS);
	gi.indirect.diffuse = DecodeHDR(diffuse, unity_SpecCube0_HDR).rgb;
#endif
This gave nice enough results, but just to make it extra blurry and eliminate the visible corners in the cubemap, I wrote code to do a quick blur:
// Fast, imprecise version:
//float4 diffuse = UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, normalWorld, 0.8 * UNITY_SPECCUBE_LOD_STEPS);
//gi.indirect.diffuse = DecodeHDR(diffuse, unity_SpecCube0_HDR).rgb;

// Smoother but slower version:
float3 right = normalize(cross(float3(0.0, 1.0, 0.0), normalWorld));
float3 up = normalize(cross(normalWorld, right));
const float sampleFactor = 0.9 * UNITY_SPECCUBE_LOD_STEPS;
const float jitterFactor = 0.3;

float4 diffuse = (UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, normalWorld, sampleFactor) +
	UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld,  up, jitterFactor), sampleFactor) +
	UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld, -up, jitterFactor), sampleFactor) +
	UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld,  right, jitterFactor), sampleFactor) +
	UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld, -right, jitterFactor), sampleFactor)) * 0.2;

gi.indirect.diffuse = DecodeHDR(diffuse, unity_SpecCube0_HDR).rgb;
With the global illumination taken care of, it was time to tackle the specular issues. I took care of that by creating a custom BRDF and extending the output struct's specular color to be a half4 instead of a half3 -- that is, I simply wanted to take advantage of its unused alpha channel. In the last part of the BRDF function I made good use of it:
specularTerm = max(0, specularTerm * specColor.a * specColor.a * nl);

half diffuseTerm = disneyDiffuse * nl;
half grazingTerm = saturate(oneMinusRoughness * specColor.a + (1.0 - oneMinusReflectivity));
return half4(diffColor * (gi.diffuse + light.color * diffuseTerm) +
	light.color * FresnelTerm (specColor.rgb, lh) * specularTerm +
	surfaceReduction * gi.specular * FresnelLerp (specColor.rgb, grazingTerm, nv) * specColor.a, 1.0);
In simple terms, this merely attenuates the specular strength by the specular color's alpha, making it possible to get rid of it completely. The exact values are based on observation and tweaking to get the results I want. Now, with a simple custom specular shader that uses the texture's alpha channel to control what should be affected by specular highlights and what shouldn't, I was able to get the "custom" results in the screenshots above and below.
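For example, to make a material completely matte under this BRDF, a script (or the material inspector) just has to zero out the specular color's alpha -- a hypothetical usage sketch, assuming the shader exposes the usual _SpecColor property:

using UnityEngine;

public class MatteMaterial : MonoBehaviour
{
	void Start ()
	{
		// Alpha of 1 = full specular contribution, 0 = none at all
		var mat = GetComponent<Renderer>().material;
		var spec = mat.GetColor("_SpecColor");
		spec.a = 0f;
		mat.SetColor("_SpecColor", spec);
	}
}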

The last thing I did was write a custom fog version that also samples the same exact diffuse term, so that the fog is colored by the ambient lighting as well. This creates a colored fog that blends much better with the environment around it.





I'm attaching the BRDF + specular shader for those that need it.


Dev Blog / Apr 10, 2016 - Planetary terrain
« on: April 10, 2016, 12:28:44 PM »
With the atmospheric shaders in solid shape, I decided it was time to go back to planetary terrain generation. In the previous post on this topic I explained how I succeeded in reducing memory usage down to a fraction of what it used to be, but I still had a long way to go before the planetary terrain was anything close to usable.

First, the seams. Due to how graphics work, when vertices on one mesh don't perfectly overlap vertices on an adjacent mesh, seams occur which are easily visible in the game. The last screenshot of the above-mentioned post shows exactly that issue -- there are vertices on one mesh that don't have a corresponding vertex on an adjacent mesh. Fortunately it's easy enough to fix by trimming edge triangles on the denser of the two meshes when it's adjacent to a mesh with a lower subdivision level:



With that out of the way, I moved the entire subdivision process onto worker threads using the handy WorkerThread class I created a while back, which is capable of spawning threads up to a limit of 2x the CPU core count, after which point it queues up functions and executes them as threads free up. The only thing that still needed to be done on the main thread was the actual setting of mesh geometry and the collider mesh. Unfortunately the latter was always the performance hog, taking twice as long as all the other operations executed in the same frame combined:



How to speed this up? Well, first, I don't actually need colliders on anything but the deepest subdivision patches. This eliminated the majority of them right away. The rest? I simply staggered them out -- rather than creating all colliders when the data becomes available, I made it so that only one collider can be created per frame. This only takes about 2 milliseconds per frame, which is perfectly acceptable. It effectively makes the entire planet generate seamlessly and without any hiccups all the way down to the 10th subdivision:
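The staggering itself can be as simple as a queue drained once per frame -- a minimal sketch, assuming worker threads hand finished collision meshes to the main thread (the names are mine):

using System.Collections.Generic;
using UnityEngine;

public class ColliderStagger : MonoBehaviour
{
	// Patches whose collision mesh is ready but not yet applied
	readonly Queue<KeyValuePair<MeshCollider, Mesh>> pending = new Queue<KeyValuePair<MeshCollider, Mesh>>();

	public void Enqueue (MeshCollider col, Mesh mesh)
	{
		pending.Enqueue(new KeyValuePair<MeshCollider, Mesh>(col, mesh));
	}

	void Update ()
	{
		// Assigning a collision mesh bakes PhysX data -- expensive,
		// so apply at most one per frame to avoid hiccups
		if (pending.Count > 0)
		{
			var kvp = pending.Dequeue();
			if (kvp.Key != null) kvp.Key.sharedMesh = kvp.Value;
		}
	}
}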



The next step was to actually generate a terrain based on some useful data. Naturally, since I already had a high-res height map of Earth (8192x4096), I simply wrote a script that samples its height values:

using UnityEngine;
using TNet;

[RequireComponent(typeof(QuadSphere))]
public class EquirectangularHeightmap : MonoBehaviour
{
	public Texture2D texture;

	public double lowestPoint = 0d;
	public double highestPoint = 8848d;
	public double planetRadius = 6371000d;

	const float invPi = 1f / Mathf.PI;
	const float invTwoPi = 0.5f / Mathf.PI;

	void Awake ()
	{
		if (texture != null)
		{
			float[] data;
			{
				Color[] cols = texture.GetPixels();
				data = new float[cols.Length];
				for (int i = 0, imax = cols.Length; i < imax; ++i) data[i] = cols[i].r;
			}

			int width = texture.width;
			int height = texture.height;
			var sphere = GetComponent<QuadSphere>();
			var min = (float)(sphere.radius * lowestPoint / planetRadius);
			var max = (float)(sphere.radius * highestPoint / planetRadius);

			sphere.onSampleHeight = delegate(ref Vector3d normal)
			{
				var longitude = Mathf.Atan2((float)normal.x, (float)normal.z);
				var latitude = Mathf.Asin((float)normal.y);
				longitude = Mathf.Repeat(0.5f - longitude * invTwoPi, 1f);
				latitude = latitude * invPi + 0.5f;
				return Mathf.Lerp(min, max, Interpolation.BicubicClamp(data, width, height, longitude, latitude));
			};
		}
	}
}

It looked alright from high orbit, but what about zooming in? Not so much:



Fortunately, a couple of years ago I was working on a game prototype for which I wrote various interpolation techniques. The screenshot above uses basic bilinear filtering. Switching the code to bicubic filtering produces a much more pleasant result:



Hermite spline filtering is even better:



Still very blurry though, and increasing the texture resolution is not an option. The solution? Add some noise! I opted to go with a pair of noises -- a 5-octave ridged multifractal and a 5-octave Perlin noise -- which results in this terrain:



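For reference, combining such a pair might look like the sketch below -- hedged, since the exact class names and namespaces vary between LibNoise ports, and the blend weights here are made up:

using UnityEngine;
using LibNoise; // some ports put the module classes in LibNoise.Generator instead

public class TerrainNoise
{
	RidgedMultifractal ridged = new RidgedMultifractal();
	Perlin perlin = new Perlin();

	public TerrainNoise ()
	{
		ridged.OctaveCount = 5;
		perlin.OctaveCount = 5;
	}

	// 'normal' is a unit vector on the sphere; the result is a height offset
	public double Sample (Vector3 normal)
	{
		double r = ridged.GetValue(normal.x, normal.y, normal.z);
		double p = perlin.GetValue(normal.x, normal.y, normal.z);
		return r * 0.75 + p * 0.25; // hypothetical blend weights
	}
}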
Combining the Hermite-filtered texture with the noise results in this:



It looks excellent all the way down to ground level with 16 subdivisions:



At that subdivision level the vertex resolution is 19.2 meters. Since I'll most likely go with a planetary scale of 1/10th of their actual size for gameplay reasons, that will give me a resolution of under 2 meters per vertex, which should make it possible to have a pretty detailed terrain.

The downside of this approach right now is having to sample that massive texture... 8192 * 4096 * 4 = 134.2 megabytes needs to be allocated just to parse it via texture.GetPixels32(). Another 134.2 MB is needed because I have to convert each Color32 to a float before it can be used for interpolation. Not nice... I wish there was a way to read just a single channel of a texture... The obvious way around it would be to use smaller textures, but that would cause even more detail to be lost. I'm thinking of simply caching the result in a file after parsing the texture once.
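That caching idea might look something like this -- a minimal sketch; the file layout and names are mine:

using System.IO;
using UnityEngine;

public static class HeightCache
{
	// Extracts the red channel once, then caches it on disk as raw floats
	public static float[] Load (Texture2D tex, string path)
	{
		int count = tex.width * tex.height;
		var data = new float[count];

		if (File.Exists(path))
		{
			var bytes = File.ReadAllBytes(path);
			System.Buffer.BlockCopy(bytes, 0, data, 0, bytes.Length);
			return data;
		}

		var cols = tex.GetPixels32();
		for (int i = 0; i < count; ++i) data[i] = cols[i].r / 255f;

		var raw = new byte[count * 4];
		System.Buffer.BlockCopy(data, 0, raw, 0, raw.Length);
		File.WriteAllBytes(path, raw);
		return data;
	}
}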

In case you're wondering, it takes 50 milliseconds to generate a planet going down to 16th subdivision level in the Unity Editor:



The memory usage goes up by 300 MB when generating the planet, and since 134*2=268 MB of that is due to sampling the epic heightmap texture, that means the entire planet only takes ~30 MB in memory. Not bad!

I'm looking forward to seeing how it will look when I put some textures on it -- but that's going to be my focus next week.

Speaking of textures, I also tweaked the atmospheric shader from the previous post a little, making it more vibrant and improving the day/night transition to make it more realistic:



I achieved the improved fidelity by splitting the scattering from one color into two: one color is used for atmospheric scattering, and the other for scattering close to the ground (terrain, clouds). I had to do it because, with just one color, it doesn't seem possible to have both a vibrant blue sky and a reddish night-time cloud transition. A blue sky causes the night-time transition to look yellow, and to get a reddish transition I had to make the sky a light turquoise, which obviously looks pretty terrible. Using two scattering colors gave me the best of both worlds:










Dev Blog / Apr 2, 2016 - In pursuit of better skies
« on: April 02, 2016, 11:55:36 AM »
In my previous post I mentioned that the atmosphere was "good enough" to move on to other tasks. But... the transition from day to night really bugged me... Simply put, I thought the transition looked too washed out, and I wanted to fix that.

My solution involved restarting from scratch and redoing the shaders, paying closer attention to the scattering process. The finalized effect proved to be much better than my first attempt, although with the downside of being limited to the 2.5% atmosphere height again. The visual thickness of 2.5% using the new shader was actually greater than my first attempt set to 5%, though, so I just left it at that. The visual benefit was well worth it:



The next step was fixing the clouds... The cylindrical texture I used before was not only low quality, but it looked like crap around the poles. I needed something that not only looked good, but was capable of zooming in all the way down to ground level without looking seriously blurred out. Of course this isn't something that's possible using higher resolution textures alone. Some basic math: the circumference of Earth is 40,075 km, and the max texture size in Unity is 8192, which gives an equatorial resolution of 4.89 km per pixel. That's pretty terrible. Another approach was needed.

After some thought, I decided to turn the cylindrical cloud map texture into a tileable square with a base size of 4096x4096. I used the basic projection logic (the same one I used for the quad sphere) to calculate the UV coordinates both horizontally and at the poles. This gave me better per-km resolution than using an 8192 texture, with the added benefit of looking great both at the equator and at the poles:



If only it would still look great when zoomed in, right? Well... the most obvious way of adding detail at zoomed levels is to use detail textures. But... we're dealing with clouds, so what would the detail texture be? Why... the same exact texture! Instead of adding detail to an existing blurred texture, I decided to blend between the same exact texture, but sampled at two different resolutions (1x zoom and 4x zoom). The result was just denser clouds that could be zoomed in farther, but eventually still looked washed out when zoomed in enough.

That's when I thought to myself: why not do this continuously? Use the camera's height to determine the 2 closest zoom levels and feed these values to the shader. The shader will then use the 2 UVs to sample the cloud texture, then blend between them using the height-based blending value. I can basically continuously blend between 2 textures in an ever-increasing zoom level based on the camera's height:
if (heightFactor < 0.5f)
{
	atmosphericBlending.x = 16f;	// Tex0 UV coordinate multiplier
	atmosphericBlending.y = 1f;	// Tex0 blending weight
	atmosphericBlending.z = 8f;	// Tex1 UV coordinate multiplier
	atmosphericBlending.w = 0f;	// Tex1 blending weight
}
else if (heightFactor < 2f)
{
	float f = (heightFactor - 0.5f) / 1.5f;
	atmosphericBlending.x = 16f;
	atmosphericBlending.y = 1f - f;
	atmosphericBlending.z = 8f;
	atmosphericBlending.w = f;
}
else if (heightFactor < 6f)
{
	float f = (heightFactor - 2f) / 4f;
	atmosphericBlending.x = 8f;
	atmosphericBlending.y = 1f - f;
	atmosphericBlending.z = 4f;
	atmosphericBlending.w = f;
}
else if (heightFactor < 18f)
{
	float f = (heightFactor - 6f) / 12f;
	atmosphericBlending.x = 4f;
	atmosphericBlending.y = 1f - f;
	atmosphericBlending.z = 2f;
	atmosphericBlending.w = f;
}
else if (heightFactor < 54f)
{
	float f = (heightFactor - 18f) / 36f;
	atmosphericBlending.x = 2f;
	atmosphericBlending.y = 1f - f;
	atmosphericBlending.z = 1f;
	atmosphericBlending.w = f;
}
else
{
	atmosphericBlending.x = 2f;
	atmosphericBlending.y = 0f;
	atmosphericBlending.z = 1f;
	atmosphericBlending.w = 1f;
}
And just like that, I had clouds that could be zoomed in all the way to the ground that looked fantastic both far away and up close. The blending between them was very subtle when traveling at the speed of a real-world rocket -- so subtle as to not be noticeable. Even with the camera test I set up that was moving very quickly the transitions were all smooth:
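Feeding those four values to the shader is then a one-liner per frame -- a sketch, assuming the cloud material exposes an _AtmosphericBlending vector property (the property and method names are mine):

using UnityEngine;

public class CloudBlending : MonoBehaviour
{
	public Material cloudMaterial;
	Vector4 atmosphericBlending;

	void Update ()
	{
		float heightFactor = ComputeHeightFactor();
		// ...the if/else chain from above fills in atmosphericBlending here...
		// Shader side, roughly: tex2D(_CloudTex, uv * b.x) * b.y + tex2D(_CloudTex, uv * b.z) * b.w
		cloudMaterial.SetVector("_AtmosphericBlending", atmosphericBlending);
	}

	// Placeholder for the camera height calculation
	float ComputeHeightFactor () { return 1f; }
}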



I made the clouds be affected by the scattering color for a more natural looking transition:



Next up came the night lights. The square nature of the pixels was very noticeable up close. Something had to be done. The approach I settled on simply took the night lights and blurred them slightly, resulting in smoother edges. I then blended a detail texture with the night lights map to create a much higher apparent resolution than the original:





It still looks great even from far away:



The last thing I did was duplicate the cloud detail code in the shader and make it sample a noise map texture that I then applied to the terrain itself. This added a subtle variation to the terrain's texture that improves how it looks when zoomed in. The snow-covered mountain peaks still looked rather bad up close because there was a lot of contrast between the colors of the pixels, but I addressed that by simply adding extra snow to make the transition less jarring. I did that by sampling the height map, disturbing it a little using my trusty LOD-based noise map explained above, and adding extra whiteness around the cliffs:



Unfortunately there was nothing I could do to make the shoreline less blurry when zoomed in... but I have some ideas on how to address that, which I will explore in a future post. I'm sure simply adding terrain deformations based on a heightmap + noise will improve the look quite a bit. In the meantime, I have some decent new 3440x1440 backgrounds!






Dev Blog / Mar 27, 2016 - Why is the sky blue?
« on: March 27, 2016, 07:07:07 AM »
After the strenuous trials of the basics of planetary generation, I decided that I'd had enough math for a bit and it was time to work on something shiny instead.

Coincidentally, while I was banging my head against the planetary wall, Neil was learning the basics of atmospheric scattering. He, like many before him, started with the GPU Gems 2 article by Sean O'Neil from over a decade ago. It certainly seemed promising. Needing no texture lookups or any kind of pre-generated data not only means the planetary atmospheric properties can easily be modded by players, but also opens up the possibility of terraforming planets over time. Who's to say that, given a concentrated player effort, it shouldn't be possible to terraform Mars from its CO2 atmosphere into a nitrogen-oxygen mix? Doing so would certainly change its reddish hue to a blue one. It would be nice to make it change gradually over time, wouldn't it?

Anyhow, I digress... So Neil started with the GPU Gems approach... and quickly ran into several limitations. First, the shader assumes that the atmosphere's radius is always going to be 2.5% larger than the planet's radius, and I thought it would be much nicer not to have this limitation. 5% looks better, and I'm still not fully committed to a 1:1 scale for planets -- but that's something I'll delve into in a later post.

Another limitation is that O'Neil's approach uses several shaders: one set for when the observer is inside the atmosphere, and another for when the observer is outside it. While swapping between the two is not a big issue, there are enough differences between the shaders to make the change noticeable. That, to me, is unacceptable. All transitions must be completely seamless.

The last limitation was something I discovered when I delved into the task of atmospheric rendering myself -- the inability to control the colors properly, especially during the sunset. The screenshots on the GPU Gems page show a proper sunset, but no matter what I tried, I couldn't get it to look quite like that.

Of course I've also surveyed the available assortment of similar solutions on the Asset Store. The most promising of them -- AS3 atmospheric scatter -- was broken in Unity 5, used per-vertex shading along with pre-generated textures, and looked even worse than the GPU Gems approach. It also still suffered from the visible shader transition from the outer to the inner atmosphere. Worse still, adjusting the light properties at run-time had no effect on how the planet looked. Apparently the light's color and intensity get baked into whatever lookup textures it uses. So, in other words -- scrap:



Years ago I also picked up FORGE3D's planets. I'll admit, it was -- and still is -- a nice pack. It has a nice assortment of beautiful planets with thoughtful little details, such as having separate cloud maps for the sides of the planet and top/bottom to avoid the visual artifacts. I'm sure I'll be able to use some of the textures from it in the future. The way the shaders are structured also makes them quite suitable for procedurally generated content -- just specify the textures to use for the shader, and you've got a beautiful planet. The downside? Their planets are all external-view only, and everything is very art-side -- meaning lots of very high res textures.

Other kits were less useful. Etherea1 is completely broken in Unity 5. Space Graphics Toolkit is also partially broken in Unity 5. The Blacksmith atmospheric scattering is planet-side only, is not realistic, and overall rather unwieldy... I can go on, but needless to say, I was not able to find anything that actually does the job I was looking for, so I had to give it a go myself.

So far I've spent just over a week on it. I continued what Neil started with the GPU Gems approach, as it's the one that was "almost" there, and got it to a state that both simplified the code and made it more robust. I was able to eliminate the 2.5% limit I mentioned and fix the outer-inner transition issue (in fact, only one shader is needed to draw the atmosphere now, not two)... and I got it all looking moderately acceptable. The most challenging part for me was getting my head around the whole "scattering" deal -- because it was difficult for me to visualize. I had to resort to writing a shader to show me what happens:



At the time of this writing, the atmosphere is still very much a work in progress. Unfortunately, from my experience, the GPU Gems shader-based scattering is not very flexible. There is still the matter of a discontinuity between the atmosphere and surface shaders that I want to eliminate (I want one shader to "just work" for both), and it's rather difficult to get the effects I want. I've also not been successful in getting the sunset to look realistic. I am not seeing an orange sky until after the sun has set, and even then it's limited to the horizon where the sun disappeared, while the rest of the sky remains completely black. I will keep at it and see if I can improve it, but for now it looks acceptable enough to move forward.

One thing's for sure though... I'm going to need to get some better textures.








Dev Blog / Mar 27, 2016 - How (not) to generate a planet
« on: March 27, 2016, 04:40:27 AM »
Has it really been a month already since the last log entry? Time certainly flew by...

Let's see what happened since then... After a bit of a struggle, I finally managed to migrate to Unity 5. The reason ended up being the desire to have more control over the rendering process. There were visual rendering artifacts with the tri-planar approach when specularity was added to the mix; I wasn't able to get them fixed on Unity 4's side, but I did fix them in Unity 5 by normalizing... the s.Normal in the lighting function, I think? I don't even remember anymore... I updated the shader included at the end of the previous post after I got it working.

Curiously enough, the terrain generation code is a lot slower in Unity 5 -- but only in the editor. In Unity 4 it was taking an average of 1150 milliseconds to generate the cratered terrain from the first post. The same task takes Unity 5 anywhere from 1650 to 3600 milliseconds to complete. Oddly enough, doing a build produces the opposite result: a Unity 5 64-bit stand-alone build created the terrain 15% faster than Unity 4 -- which is why I ultimately decided to ignore the issue.

Moving forward, I created two sphere generation algorithms -- one using the quad sphere described in the previous post, and another using an icosahedron. As it turned out, the icosahedron's triangles are not perfect after all: they do get skewed -- which is something I should have realized, in hindsight. There's less skewing than with the quad sphere, but it still happens:



The quad sphere has a nice advantage over the icosahedron sphere: its UV coordinates are very simple, and if desired, one could even map 6 square textures to it without using any kind of projection or blending, just because it's always about working with quads. There was still the outstanding issue of vertices near the corners being compressed to less than a quarter of the size of those in the center, but I was able to resolve it by simply pre-skewing all vertices using a simple mathematical operation:
x = (x * 0.7 + x * x * 0.3);
The hardest part was figuring out the inverse of that operation, as I needed to be able to take any 3D world position on the sphere and convert it to a 2D position on the sphere's side -- both for determining how to subdivide the sphere and for knowing which regions the player should currently listen to (multiplayer). After busting my head a bit over this high-school math problem that was so far in the past as to almost appear beyond me, I was able to figure it out:
// Inverse of x' = 0.7x + 0.3x^2: solving 0.3y^2 + 0.7y - x = 0 with the
// quadratic formula gives y = sqrt(x / 0.3 + (0.7 / 0.6)^2) - 0.7 / 0.6
static double InverseTransform (double x)
{
	const double d233 = 0.7 / 0.3;
	const double d116 = d233 * 0.5;
	const double d136 = d116 * d116;

	if (x < 0.0) return -(System.Math.Sqrt(-x / 0.3 + d136) - d116);
	return System.Math.Sqrt(x / 0.3 + d136) - d116;
}
After I was done with that math problem, I went straight into... more math. How to subdivide the quad sphere into patches? How to do it so that no two adjacent patches differ by more than one subdivision level? And more importantly, how to do it in a memory-efficient manner? Naturally I didn't want to do any of this math stuff... too much math already. I figured I'd just start coding first and do that silly unnecessary math later.

I first started with the most naive approach, just to see it working. I figured -- hey, KSP used a 10-level-deep subdivision for their planets. I'll just go ahead and pre-generate all the nodes right away, making it easy to know which node has which neighbor and parent, so traversing them will be a breeze!

Well, here's how it went... The first time I hit Play, I sat there twiddling my thumbs for over 10 seconds while it generated those nodes. They weren't game objects, mind you -- not at all. They were simple instances of a custom QuadSphereNode class I created that had a bunch of references (to siblings, to potential renderers, meshes, etc.). No geometry was ever generated -- not yet. I was merely creating the nodes. So naturally, the 10-second wait to generate enough nodes for just one planet was a surprise to me. And then I looked at my memory usage... Unity was at 3.5 gigabytes of RAM. Ouch!

That's when I decided to get my head out of my ass and do some math, again.
6 nodes, 10 subdivisions:
0 = 6
1 = 24
2 = 96
3 = 384
4 = 1536
5 = 6144
6 = 24576
7 = 98304
8 = 393216
9 = 1572864
Total = 6 * (4^10 - 1) / 3 = 2,097,150 nodes
So to generate 2 million nodes, it was taking Unity 10 seconds, and it was eating up over 3 gigs of RAM -- which works out to roughly 1,500 bytes per node. The word "unacceptable" didn't quite cover it.

Not willing to delve into the whole "doing subdivision properly" logic just yet, I decided to see if I could reduce the size of my classes first. That's when I learned that SizeOf() doesn't work properly in Unity -- it doesn't take into account unassigned fields pointing to class types. If you ever need to use it, use it in Visual Studio instead. Long story short, by simply moving the geometry-related data out into its own class, then leaving a null-by-default field pointing to that geometry in each node, I was able to reduce memory usage to a third (1 gigabyte) and generation time to 2.3 seconds.

Of course that was still too much to be used in practice, and so I finally decided to delve into proper subdivision logic. On the bright side, it didn't take as long as I thought it would to get the "what's a neighbor of what" code working in a very efficient manner and to generate meshes on the fly, on demand. On the downside, at the time of this writing I still haven't actually finished, per se. What the code does right now is, given the directional vector and the distance to the surface of the planet, figure out what subdivision level is required. It then dives right into the subdivision logic, creating only the nodes needed to reach the 4 nodes closest to the observer. After that's done, it propagates outwards from those 4 nodes, creating higher and higher level neighbors until the entire sphere is covered. The code had to take into account neighboring sides of the sphere (and account for their varied rotation) and ensure that no patch sits next to another patch that's more than 1 subdivision level above or below it.

Long story short, memory usage is negligible now, and generation time takes a mere 134 milliseconds to generate a planet going down all the way to the 10th subdivision -- including all the meshes necessary to render it, all the renderers and all the colliders. Over half of that time is spent baking PhysX collision data according to the profiler:



You may notice the memory usage in there as well. A mere 12.3 MB was allocated during the planet's creation. So yes... from 10,000 milliseconds and 3 gigabytes not including geometry, down to 63 milliseconds and 12.3 megabytes including the geometry. Not bad! The final mesh looks like this:


Dev Blog / Mar 1, 2016 - Terrain Generation (Part 2)
« on: March 01, 2016, 08:37:23 AM »
In the previous post I mentioned that I decided to scrap using Unity's terrain system in favor of a homebrew solution, and now I'm going to explain why.

First, let's examine what the Unity system can do. It accepts a heightmap and expects splat maps (basically textures) for any additional information you need to pass to it. It handles LOD for you silently, with minimal control over it, and the terrains are expected to be flat (that is, they can't curve from edge to edge). The terrains it generates are always rectangular or square. Still, as shown in the previous post, even with these limitations it's possible to create interesting-looking terrains. The fairly short far clipping plane gives the illusion of curvature where there isn't one:



Regardless, what I'd like to have is a terrain system on a planetary scale. While I don't necessarily want the ability to go from orbit to the planet's surface, it would be nice to have the tech that can do it -- simply because it would be cool. The terrain I'd like would need to be curved, to make it possible to create round planets and travel from any point on the planet to any other point without loading times, given enough patience and a fast enough vehicle. I could, in theory, fill the terrain data in such a way that I can still use rectangular terrains to circumnavigate the planet (think of a rectangle that simply gets projected onto a sphere that I can move around freely)... but what about supporting smaller celestial bodies, such as large asteroids with a diameter of a few hundred meters? The curvature would need to be very evident in that case, and it would be more difficult to pull off with the existing terrain. Doable though -- I could just write a shader that adjusts the vertex positions as they get farther from the player.

In any case, thinking about all of this made me consider other possibilities, such as generating my own planetary terrain. A quick glance at the Asset Store revealed two packages that could help. Unfortunately, neither of them panned out, for reasons ranging from flat-out compile errors and naive approaches (a 10,000-unit diameter sphere? ouch!) to the vertex position code being hidden somewhere I couldn't even find it. Next step? Playing around with it myself, of course.

First, how does one generate a spherical planet and deform it? Well, LibNoise's noise generation is already 3D by default, so it's really just a matter of generating a sphere. There are 3 common approaches to generating one. The most common is the regular sphere one can create using the GameObject menu in Unity. It looks like this:



The most obvious problem with this sphere is the vastly uneven triangle sizes and the difficulty of subdividing them to create a denser mesh. The latter is necessary in order to improve detail around the camera. So... that's out.

The second approach is to start with a cube made up of 6 planes like so:



... then simply normalize the position of every vertex, thus creating a sphere:



That's better! Subdividing each face is a trivial matter of splitting it into 4 child faces, so it would be easy to improve the level of detail of the terrain closer to the camera. Additionally, since each side is technically still a square, putting textures on it is a trivial matter. There is just one small problem... the sizes of the triangles can vary by quite a bit, depending on whether the triangle is close to the center or the corners of the mesh. Take this picture for example, taken from the same distance to the sphere's surface, just at two different points:



As you can see, not only are the sizes of the triangles vastly different, but the regions that are square on the left actually look like skewed rectangles on the right. If the player were standing at the intersection point on the right-hand side, he would be surrounded by 3 very skewed regions! While not a total deal breaker, it's still something that would be nice to avoid, if possible.
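The normalization step at the heart of this approach really is trivial, by the way -- a minimal sketch, assuming a mesh whose vertices lie on a unit cube centered on the origin:

using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class CubeToSphere : MonoBehaviour
{
	public float radius = 1f;

	void Start ()
	{
		// Push every cube vertex out to the sphere's surface
		var mesh = GetComponent<MeshFilter>().mesh;
		var verts = mesh.vertices;
		for (int i = 0; i < verts.Length; ++i)
			verts[i] = verts[i].normalized * radius;
		mesh.vertices = verts;
		mesh.RecalculateNormals();
		mesh.RecalculateBounds();
	}
}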

Enter solution #3: the icosahedron. The best thing about using icosahedrons is that they're made up of equilateral triangles -- that is, perfect triangles with all 3 sides of equal length. This means that every triangle is going to be the same exact size as every other. No skewing of any kind! Subdividing each triangle is a trivial matter as well -- simply split each one into 4 by dividing each edge down the middle. There's just one problem... it's not possible to use the same kind of texture mapping for an icosahedron that can be used for other shapes, as there are no squares to work with:



So how would one do texture mapping? Well... while it's true that using per-vertex texture coordinates may be a problem, one could still use the normalized position of the vertex to figure out where the resulting vector projects onto the 6-sided cube, thus producing usable UV coordinates. This kind of approach is commonly known as tri-planar mapping:



As with the terrain options before it, I also checked the Asset Store offerings... Unfortunately, doing tri-planar mapping properly involves sampling each side of the cube independently, or normal mapping doesn't work properly and you end up with pixels that get lit even though they are supposed to be shadowed, and vice versa. All the Asset Store offerings for Unity 4 suffered from this issue -- even my old deprecated Space Game Starter Kit tri-planar asteroid shader, amusingly enough. Also, to do tri-planar mapping properly, it's important to ensure that no two adjacent surfaces have tangents pointing in opposite directions, or visible rendering artifacts can appear on those edges (especially when specular lighting is used). So... I had to create a new shader that does what I wanted:



The shader itself is below in case you need it.

Shader "SW/Triplanar Bumped"
{
	Properties
	{
		_Color ("Main Color", Color) = (1,1,1,1)
		_MainTex ("Base (RGB)", 2D) = "white" {}
		_BumpMap ("Normal Map", 2D) = "bump" {}
		_BlendPower ("Blend Power", Float) = 64.0
		_SpecColor ("Specular Color", Color) = (0.5, 0.5, 0.5, 1)
		_Shininess ("Shininess", Range (0.03, 1)) = 0.078125
	}

	SubShader
	{
		LOD 300
		Tags { "RenderType" = "Opaque" }

CGPROGRAM
#pragma target 3.0
#pragma surface surf Proper vertex:vert

sampler2D _MainTex;
sampler2D _BumpMap;
half4 _MainTex_ST;
half4 _Color;
half _BlendPower;
half _Shininess;

struct Input
{
	float3 worldPos;
	float3 normal;
};

half4 LightingProper (SurfaceOutput s, half3 lightDir, half3 viewDir, half atten)
{
	half3 n = normalize(s.Normal);
	half3 h = normalize(lightDir + viewDir);
	half diff = max(0.0, dot (n, lightDir));
	half nh = max(0.0, dot (n, h));
	half spec = pow(nh, s.Specular * 128.0) * s.Gloss;

	half4 c;
	c.rgb = (s.Albedo * _LightColor0.rgb * diff + _LightColor0.rgb * _SpecColor.rgb * spec) * atten;
	c.a = s.Alpha + _LightColor0.a * _SpecColor.a * spec * atten;
	return c;
}

inline float3 GetContribution (float3 normal)
{
	half3 contribution = pow(abs(normalize(normal)), _BlendPower);
	return contribution / dot(contribution, 1.0);
}

void vert (inout appdata_full v, out Input o)
{
	UNITY_INITIALIZE_OUTPUT(Input, o);
	o.normal = v.normal;
	half3 signage = sign(v.normal);
	half3 contribution = GetContribution(v.normal);
	v.tangent = half4(contribution.y, -signage.z * contribution.z, signage.x * contribution.x, -1.0);
}

void surf (Input IN, inout SurfaceOutput o)
{
	half3 signage = sign(IN.normal) * 0.5 + 0.5;
	half3 contribution = GetContribution(IN.normal);
	half3 pos = mul(_World2Object, float4(IN.worldPos, 1.0)).xyz;

	half2 tc0xy = half2(-pos.z,  pos.y) * _MainTex_ST.xy + _MainTex_ST.zw;
	half2 tc0zw = half2( pos.z,  pos.y) * _MainTex_ST.xy + _MainTex_ST.zw;
	half2 tc1xy = half2( pos.x, -pos.z) * _MainTex_ST.xy + _MainTex_ST.zw;
	half2 tc1zw = half2( pos.x,  pos.z) * _MainTex_ST.xy + _MainTex_ST.zw;
	half2 tc2xy = half2( pos.y, -pos.x) * _MainTex_ST.yx + _MainTex_ST.wz;
	half2 tc2zw = half2(-pos.y, -pos.x) * _MainTex_ST.yx + _MainTex_ST.wz;

	half4 c =
		lerp(tex2D(_MainTex, tc0xy), tex2D(_MainTex, tc0zw), signage.x) * contribution.x +
		lerp(tex2D(_MainTex, tc1xy), tex2D(_MainTex, tc1zw), signage.y) * contribution.y +
		lerp(tex2D(_MainTex, tc2xy), tex2D(_MainTex, tc2zw), signage.z) * contribution.z;

	o.Albedo = c.rgb * _Color.rgb;
	o.Alpha = 1.0;

	o.Normal = UnpackNormal(
		lerp(tex2D(_BumpMap, tc0xy), tex2D(_BumpMap, tc0zw), signage.x) * contribution.x +
		lerp(tex2D(_BumpMap, tc1xy), tex2D(_BumpMap, tc1zw), signage.y) * contribution.y +
		lerp(tex2D(_BumpMap, tc2xy), tex2D(_BumpMap, tc2zw), signage.z) * contribution.z);

	o.Gloss = c.a * _SpecColor.a;
	o.Specular = _Shininess;
}
ENDCG
	}
	Fallback "Diffuse"
}

Dev Blog / Feb 26, 2016 - What I learned from developing Windward
« on: February 26, 2016, 03:50:31 AM »
Version control everything
Starting your next project? The first thing you should do is set up a local Git repository. Don't wait, and don't put it off for later. Do it first thing, even when working alone. Even if you don't do offsite backups, a local repository is a fantastic way of creating restore points, with the important side benefit of letting you compare files to see exactly what changed to cause the current horribly broken behaviour. Also, if you do change something for the worse, with a repository you can always selectively restore a previous version -- whether it's the entire project folder or just a couple of lines of code in one of your files. Better still, getting savvy with branches will allow you to prototype different things without messing up the main (hopefully stable) branch, which grants you far more freedom and greatly reduces change anxiety.

Follow your vision
Every developer will quickly learn that everyone has ideas for games — whether it’s brand-new ones, or simply improvements to existing games. Some of them may actually be very good, and it’s always a good idea to listen to your players when they say that something isn’t working well and needs to be changed. That said though, never let your players pull your game in a direction that takes it away from your original design of what it should be. Strive to make a game that you yourself want to play. Incorporate ideas, but keep true to your own creative vision.

Don’t sweat the small stuff
You may care about making sure that every bolt and nail is modelled on your tiny in-game model that will never take more than half an inch on your screen, but your players won’t. Worse still, they will hate you for wasting your time on something that they simply don’t care about. And guess what? They will be 100% right. Surprising as some may find this, nothing is more important in a game than its gameplay. Your game can have a world made out of cubes and go on to sell millions of copies simply because it gives your players the creative freedom to express themselves with those simple cubes. When developing a game, always aim for the least needed effort to get your vision across. Nobody will care that your fancy starship has a velvet-wrapped gold-encrusted toilet seat if it handles like crap and your game chugs at 2 FPS on a NASA-grade supercomputer. Ask yourself a simple question: what will your players focus on during their gameplay? If your game is about the challenges of taking a satisfying dump in null-G, then such attention to detail will probably help. If, on the other hand, your game is about space-based dogfighting, then you’re simply wasting your time and money. So don’t be that guy. Shift your focus — at least until more important tasks are out of the way.

Play your game
I can’t stress this one enough. The most important aspect of making a good game is playing it — ideally alongside your players. Not only will you gather valuable feedback but you will also see first hand what your players see: the good, the bad, and the horribly broken. I can’t even count the number of times I came up with a great idea that I added to the game, only to find it lacking after thorough play-testing, or severely overbalanced, or simply not working at all. Don’t wait for QA to point out your mistakes. Take the proactive approach and play through your additions yourself first. And after you test it, test it again — then make sure you didn’t break something else in the process. Because trust me, you probably did. That’s just how programming works…

Multiplayer
Adding multiplayer is a great way of creating a community for your game. Even mediocre games can be made great with the addition of co-op multiplayer. Not all games benefit equally, of course -- real-time games are more suitable for it than puzzle games, for example. Regardless, if you are considering multiplayer for your game, do it early on -- ideally as the first thing you do. Adding multiplayer is generally as simple as changing function calls to remote procedure calls. In layman's terms, instead of simply doing something (calling a function, setting a variable, etc.), you send a message to other players saying that that function should be called or that variable should be changed. That's it! That's literally all it takes with advanced networking and serialization tools like TNet.
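As a sketch of what that looks like with TNet (the class and RFC names here are made up for illustration):

using UnityEngine;
using TNet;

public class Door : TNBehaviour
{
	bool isOpen = false;

	// Instead of calling Toggle() directly, ask everyone to call it
	public void RequestToggle ()
	{
		tno.Send("Toggle", Target.All, !isOpen);
	}

	// Remote procedure call, executed for all connected players
	[RFC]
	void Toggle (bool open)
	{
		isOpen = open;
		// ...actual open/close logic goes here
	}
}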

Empower your players
Some players will like your game, some will hate it, and then there will be a few who will like it A LOT. They will spend hundreds or even thousands of hours playing it, becoming your “veterans”. Cherish them. Empower them. Give them special things — whether unique items that don’t imbalance the game, or better yet — moderator privileges if your game happens to be online. With the right tools such players can quickly become your secret army — guiding new players, keeping troublemakers at bay, and even creating content and running special events — but only if you’ll let them. So do.

Modding support
Once your game is out (and indeed, often even while it's still in beta!), your players will want to change it. It may be simple balancing changes desired by an individual with a slightly different vision, or it may be a complete overhaul in the style of XCOM's Long War or Arma 2's DayZ. Making your game easily moddable is an important step toward greatly increasing its shelf life. Even if the game is mediocre, simply giving your players the tools to make it better will have a tremendous effect on your game's sales. If you choose not to add modding support due to security concerns or to prevent cheating, at least consider adding a way for players to customize their experience -- give them hats, flags, paint jobs, etc. -- just let them make their experience more unique.

Value the "non-mainstream" press
YouTube Let's Plays and Twitch live streaming have become the go-to sources of information for players considering purchasing games these days -- especially around launch day. When you release your game you can expect a horde of people to contact you in hopes of obtaining a free copy, or letting you know that they've already released a video or five about it. Some of those -- Twitch streamers especially -- can have quite a notable effect, which you can easily observe by looking at the sales graph during and after the hours the content was streamed or put live. As a rule of thumb though... if whoever contacts you has to ask for a free copy, it's highly unlikely they can have any noticeable effect on your game's visibility. Worse still, from my own experience with Windward, 19 out of every 20 requests were coming from scammers -- individuals pretending to be YouTubers and Twitch streamers in hopes that I would be too busy to do a basic identity check. When you release your game and start getting such requests, always take the time to double-check the contact page of the channel they claim to be. I had a polite generic response to such obviously fake requests: "Certainly, I have a couple of keys for you and your friends -- please contact me from that channel via YouTube and they are yours." Simple, polite, and it always sent a clear message: scammers need not apply.

Embrace the piracy
Your game will be pirated. You can fight it, but you can't stop it. Worse still, when you fight it, chances are you will end up harming the experience of legitimate customers, either by making your game slower or by inadvertently causing stability issues. My suggestion for fighting piracy is simple -- just don't. Simply accept the fact that your game will be available for free somewhere and work to enhance the experience of legitimate customers instead. For example, in Windward I was releasing frequent updates that would add new content, but that would also have the side effect of preventing older versions of the game from connecting to updated multiplayer servers. Those who chose to pirate the game could still play it solo -- but to play online they would need to go legit first. Also keep in mind that even without such incentives, every pirate is a potential source of free advertising. They may not pay for the game, but if they like it they will tell their friends about it -- and their friends might just buy it!

Dev Blog / Feb 26, 2016 - Terrain Generation (Part 1)
« on: February 26, 2016, 01:29:53 AM »
So the new game I am working on is supposed to feature, among other things, an endless, seamless terrain system for planetary bodies. I toyed with the idea of adding it to Windward on several occasions, but the back-end support for it simply wasn't there, and I decided it would wait until the next project (the one I am working on now). Before I could work on the endless terrain system, TNet first needed to support multiple channels seamlessly, with the ability to enter/join channels (think: regions) at will. Well, TNet 3 supports that feature -- and so I got started.

The idea behind it is pretty straightforward. I want it to be possible to pick a landing site on a planet or a moon, then drive from the landing site to any other point on that celestial body without any loading times -- and I want to be able to do it in multiplayer.

So how would one approach a task like this? Well, in my case I'd been using Unity's terrain system with Windward, so naturally continuing to use it was the logical choice. Problem is... it's quite obviously impossible to generate a single terrain that spans the entire planet's surface, so the content has to be streamed in somehow. Instead of using one terrain like in Windward, I'd have to create multiple terrains around the player and stream in their content as the player travels around the world. I actually did something like that back in... 2009, I think? It was for a game prototype involving tanks, and it wasn't too much of a challenge. Doing it without impacting performance was the only difficulty, and if the actual terrain is generated on a separate thread, it shouldn't have any visible impact on the game itself.

To quickly test this idea's feasibility and refresh my memory, I wrote a simple test that uses LibNoise to generate multi-layered noise on a worker thread, then fills the TerrainData on the main thread. Timing it showed that a 256x256 terrain using 3 perlin noise generators (3 octaves each) + 1 simple noise (3 octaves) takes 242 milliseconds to generate, with another 16 milliseconds spent setting the TerrainData and 18 more spent setting the terrain's splat maps. Since setting the terrain data and the splat maps are separate actions, the main thread needs 2 frames to apply the content, and the longest stutter would be roughly 1 frame (1000/18 = 55.56 FPS).
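Since this split comes up again below, here's roughly what it looks like -- a minimal sketch rather than the actual code, with LibNoise swapped for a stand-in and the names made up:

using System.Threading;
using UnityEngine;

public class ThreadedTerrainFill : MonoBehaviour
{
	public Terrain terrain;

	float[,] mHeights;
	volatile bool mReady = false;

	void Start ()
	{
		int size = terrain.terrainData.heightmapResolution;
		new Thread(() => Generate(size)).Start();
	}

	// Worker thread: pure math only, no Unity object access
	void Generate (int size)
	{
		float[,] heights = new float[size, size];
		for (int y = 0; y < size; ++y)
			for (int x = 0; x < size; ++x)
				heights[y, x] = Mathf.PerlinNoise(x * 0.01f, y * 0.01f); // stand-in for the LibNoise octaves
		mHeights = heights;
		mReady = true;
	}

	// Main thread: this is where the ~16 ms SetHeights hit happens
	void Update ()
	{
		if (mReady)
		{
			mReady = false;
			terrain.terrainData.SetHeights(0, 0, mHeights);
		}
	}
}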

Two problems here. I will focus on the first problem in this post: the terrain looked rather "meh":



The second problem was that even a 1-frame stutter is still going to be unpleasant, especially when it happens 2 frames in a row. But hey, one problem at a time. I'm a visual person first. I want to see what I'm working with!

To address the terrain being all boring, I have two solutions. First, I can use different types of noise like I did in Windward, using one noise to control the blending of the others. This was used in Windward to great effect to create cliffs next to rolling plains and idyllic sandy beaches. The only issue is that in this case I am creating terrain for airless celestial bodies, not ridged cliffs towering over sandy beaches.

(Plus, have you ever tried driving a vehicle in a low-G environment on a rocky surface? I'll give you a hint: I hope you like flipping over!)

Taking Luna as an example, it's pock-marked with craters, and as this KSP dev blog post explains, the easiest way to generate craters is to use Voronoi noise.

In simplest terms, Voronoi noise can be used to generate a cell-like pattern where each cell has a center that you can access and calculate a distance to.
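If you've never implemented it, the whole thing fits on a page. This is my own minimal take on it (not LibNoise's implementation): one pseudo-random feature point per grid cell, and the result is the distance to the nearest one.

using UnityEngine;

public static class Voronoi
{
	public static float Distance (float x, float y, int seed)
	{
		int cx = Mathf.FloorToInt(x), cy = Mathf.FloorToInt(y);
		float best = float.MaxValue;

		// The nearest feature point always lies in this cell or one of its 8 neighbors
		for (int j = cy - 1; j <= cy + 1; ++j)
		{
			for (int i = cx - 1; i <= cx + 1; ++i)
			{
				float px = i + Hash(i, j, seed);
				float py = j + Hash(i, j, seed + 1);
				float dx = px - x, dy = py - y;
				float d = Mathf.Sqrt(dx * dx + dy * dy);
				if (d < best) best = d;
			}
		}
		return best; // 0 at a cell's feature point, growing toward the cell borders
	}

	// Cheap deterministic integer hash mapped to [0, 1)
	static float Hash (int x, int y, int seed)
	{
		int n = x * 73856093 ^ y * 19349663 ^ seed * 83492791;
		n = (n << 13) ^ n;
		n = (n * (n * n * 15731 + 789221) + 1376312589) & 0x7fffffff;
		return n / 2147483648f;
	}
}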



Imagining each cell as a crater, the distance from the center can be easily converted to depth. Properly clamped (limiting the maximum distance from the center), it can be used to figure out where the depressions in the terrain should be:
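In code that's little more than a clamp -- a fragment of a hypothetical generator class, with craterRadius and craterDepth being made-up names:

static float ApplyCrater (float height, float distToCenter, float craterRadius, float craterDepth)
{
	float f = Mathf.Clamp01(distToCenter / craterRadius); // 0 at the crater's center, 1 at the rim and beyond
	return height - (1f - f) * craterDepth; // linear falloff for now
}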



Next step is to add a curve to the crater, as craters not only lower the terrain in the center, but they also raise the terrain at the edges, creating visible ridges. Fortunately creating a curve in Unity is a trivial matter, and adding a public AnimationCurve to the generator class, then sampling it in the code using curve.Evaluate() does the job nicely.
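So the fragment above turns into something like this -- hypothetical names again, a piece of the same imaginary generator class:

public AnimationCurve craterProfile; // authored in the inspector: dips below 0 in the bowl, rises above 0 at the rim

float ApplyCrater (float height, float distToCenter, float craterRadius)
{
	// The curve shapes the entire profile, raising the rim as well as sinking the bowl
	float f = Mathf.Clamp01(distToCenter / craterRadius);
	return height + craterProfile.Evaluate(f);
}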



Looking at photos of various moons, it's quite clear that the centers of craters are generally darker than the surrounding area, as the lighter dust gets blown out of the crater, exposing the bedrock underneath. Figuring out which areas should be bright and which should be dark is a trivial matter using the data already present:



Finally, the edges of the crater should be brighter in order to give it some visible contrast (and because in reality they usually are anyway). Again, using the existing data we can easily figure out where the edges should be:



Combining the darkened and brightened areas gives this:
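Both passes can be driven by the same normalized distance used for the height. A hedged sketch -- the constants here are my guesses for illustration, not the actual values:

static float CraterBrightness (float f) // f = distToCenter / craterRadius, clamped to 0..1
{
	float dark = Mathf.Clamp01(1f - f * 2f); // strongest at the center, gone by mid-crater
	float bright = Mathf.Clamp01(1f - Mathf.Abs(f - 0.9f) * 10f); // narrow band hugging the rim
	return 1f - 0.5f * dark + 0.3f * bright; // 1 = untouched terrain color
}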



Just to make it prettier, I apply some texturing on top along with some normal maps:



It looks better from the ground:





Unfortunately, after applying all of that, terrain generation time is now up to 1155 milliseconds. Since this happens on a separate thread it's not that much of an issue -- but there are other limitations with Unity's built-in terrain system that ultimately made me decide to roll my own, which I will examine more closely in the next post.

24
TNet 3 Support / Where I am taking TNet
« on: October 07, 2015, 08:09:01 PM »
Originally TNet started as a simple networking library I needed for Windward, but it has evolved greatly since then. The addition of the DataNode class greatly improved object serialization: it not only lets anyone send entire tree hierarchies of objects, but also save them to disk and load them later, in both text and binary formats.
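For illustration only -- this is not TNet's actual DataNode, just the general shape of the idea: a named value plus a list of children gives you an arbitrarily deep tree that is trivial to write out and rebuild later.

using System.Collections.Generic;
using System.IO;

public class Node
{
	public string name;
	public object value;
	public List<Node> children = new List<Node>();

	// Text form: one "name = value" line per node, children indented below it
	public void Write (TextWriter writer, int indent = 0)
	{
		writer.WriteLine(new string('\t', indent) + name + " = " + value);
		foreach (Node child in children) child.Write(writer, indent + 1);
	}
}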

Over the past while I've been taking it further still. Currently the TNet library on my Windward dev branch is able to serialize entire game object hierarchies into DataNode, which can then be saved to disk or sent across the network. What can be serialized? Right now, quite a bit. User scripts, renderers, entire meshes, textures... everything can be serialized automatically.

Just as a test, I am currently using it in Windward to export entire ships into tiny LZMA-compressed files -- complete with all the meshes (vertices, skinning info, etc.), the materials, the textures referenced by those materials, colliders, and of course all user script values -- all saved into one complete file. Sort of like an Asset Bundle, but one that can be created at run time and parsed easily. DataNode can even serialize references -- references to prefabs, references to game objects, components in the scene, etc. Have you ever wanted to take a snapshot of your entire scene at run time in a stand-alone game, then load it in the Editor? Well, this addition should make it possible.

Better yet, since it's all saved in the DataNode format, you can easily send it via any RFC.

In fact, the only things I am not able to serialize properly right now are particle systems, but that's due to a major limitation in Unity -- most values you see in the inspector when selecting a particle system are simply not exposed to scripting at all (which I believe finally gets fixed in Unity 5.3 sometime? Not sure...).

Anyway, what I am trying to say is that TNet has evolved way beyond its "networking" scope and is currently more of an editor extension offering robust data serialization, with networking as one of its features. As such, I will be repositioning it that way on the Asset Store in the near future, which also means that it will be licensed per-seat like NGUI. Hopefully this will happen around the same time I release an expanded set of video tutorials covering all the new features and how to use them.

Thoughts/comments? Let me know!

25
NGUI 3 Support / InControl + NGUI
« on: May 09, 2015, 04:06:31 AM »
I recently decided to add the InControl plugin to Windward to handle different controller types. Although it doesn't seem to support the NVidia Shield controller on Windows properly, it does seem to work quite well otherwise. If anyone else is doing something similar and wants an example of how to integrate it with NGUI, this script should give you an idea:
using UnityEngine;
using InControl;

[RequireComponent(typeof(InControlManager))]
public class InControlNGUI : MonoBehaviour
{
	void OnEnable ()
	{
		UICamera.GetKey = GetKey;
		UICamera.GetKeyDown = GetKeyDown;
		UICamera.GetKeyUp = GetKeyUp;
		UICamera.GetAxis = GetAxis;
	}

	static bool GetKeyDown (KeyCode key)
	{
		if (key >= KeyCode.JoystickButton0)
		{
			InputDevice dev = InputManager.ActiveDevice;

			switch (key)
			{
				case KeyCode.JoystickButton0: return dev.GetControl(InputControlType.Action1).WasPressed;
				case KeyCode.JoystickButton1: return dev.GetControl(InputControlType.Action2).WasPressed;
				case KeyCode.JoystickButton2: return dev.GetControl(InputControlType.Action3).WasPressed;
				case KeyCode.JoystickButton3: return dev.GetControl(InputControlType.Action4).WasPressed;
				case KeyCode.JoystickButton4: return dev.GetControl(InputControlType.LeftBumper).WasPressed;
				case KeyCode.JoystickButton5: return dev.GetControl(InputControlType.RightBumper).WasPressed;
				case KeyCode.JoystickButton6: return dev.GetControl(InputControlType.Back).WasPressed;
				case KeyCode.JoystickButton7: return dev.GetControl(InputControlType.Start).WasPressed;
				case KeyCode.JoystickButton8: return dev.GetControl(InputControlType.LeftStickButton).WasPressed;
				case KeyCode.JoystickButton9: return dev.GetControl(InputControlType.RightStickButton).WasPressed;
			}
		}
		return Input.GetKeyDown(key);
	}

	static bool GetKey (KeyCode key)
	{
		if (key >= KeyCode.JoystickButton0)
		{
			InputDevice dev = InputManager.ActiveDevice;

			switch (key)
			{
				case KeyCode.JoystickButton0: return dev.GetControl(InputControlType.Action1).IsPressed;
				case KeyCode.JoystickButton1: return dev.GetControl(InputControlType.Action2).IsPressed;
				case KeyCode.JoystickButton2: return dev.GetControl(InputControlType.Action3).IsPressed;
				case KeyCode.JoystickButton3: return dev.GetControl(InputControlType.Action4).IsPressed;
				case KeyCode.JoystickButton4: return dev.GetControl(InputControlType.LeftBumper).IsPressed;
				case KeyCode.JoystickButton5: return dev.GetControl(InputControlType.RightBumper).IsPressed;
				case KeyCode.JoystickButton6: return dev.GetControl(InputControlType.Back).IsPressed;
				case KeyCode.JoystickButton7: return dev.GetControl(InputControlType.Start).IsPressed;
				case KeyCode.JoystickButton8: return dev.GetControl(InputControlType.LeftStickButton).IsPressed;
				case KeyCode.JoystickButton9: return dev.GetControl(InputControlType.RightStickButton).IsPressed;
			}
		}
		return Input.GetKey(key);
	}

	static bool GetKeyUp (KeyCode key)
	{
		if (key >= KeyCode.JoystickButton0)
		{
			InputDevice dev = InputManager.ActiveDevice;

			switch (key)
			{
				case KeyCode.JoystickButton0: return dev.GetControl(InputControlType.Action1).WasReleased;
				case KeyCode.JoystickButton1: return dev.GetControl(InputControlType.Action2).WasReleased;
				case KeyCode.JoystickButton2: return dev.GetControl(InputControlType.Action3).WasReleased;
				case KeyCode.JoystickButton3: return dev.GetControl(InputControlType.Action4).WasReleased;
				case KeyCode.JoystickButton4: return dev.GetControl(InputControlType.LeftBumper).WasReleased;
				case KeyCode.JoystickButton5: return dev.GetControl(InputControlType.RightBumper).WasReleased;
				case KeyCode.JoystickButton6: return dev.GetControl(InputControlType.Back).WasReleased;
				case KeyCode.JoystickButton7: return dev.GetControl(InputControlType.Start).WasReleased;
				case KeyCode.JoystickButton8: return dev.GetControl(InputControlType.LeftStickButton).WasReleased;
				case KeyCode.JoystickButton9: return dev.GetControl(InputControlType.RightStickButton).WasReleased;
			}
		}
		return Input.GetKeyUp(key);
	}

	static float GetAxis (string name)
	{
		InputDevice dev = InputManager.ActiveDevice;

		switch (name)
		{
			case "LX": return dev.LeftStickX; // Match UICamera.horizontalAxisName
			case "LY": return dev.LeftStickY; // Match UICamera.verticalAxisName
			case "RX": return dev.RightStickX; // Match UICamera.horizontalPanAxisName
			case "RY": return dev.RightStickY; // Match UICamera.verticalPanAxisName
			case "DX": return dev.DPadX;
			case "DY": return dev.DPadY;
			case "TX": return dev.RightTrigger - dev.LeftTrigger;
		}
		return Input.GetAxis(name);
	}
}
Keep in mind that this is easier if you started with InControl from the beginning; if you didn't (like me) and used XBox controller mappings instead, you'll need to do what I did above. Everywhere I need to poll an axis in code, I use UICamera.GetAxis rather than Input.GetAxis.

26
Misc Archive / Windward is now out on Steam
« on: October 11, 2014, 08:37:53 PM »
Windward has been redone pretty much from scratch in under 2.5 months, and is now out on Steam Early Access. Read more here:

http://www.tasharen.com/?p=4730

27
NGUI 3 Support / Never use Instantiate(). Use gameObject.AddChild.
« on: September 19, 2014, 08:24:07 PM »
I have to repeat myself every single day here. Never use Object.Instantiate() when working with NGUI. Use gameObject.AddChild instead. Why? Because AddChild does this:

1. Sets the game object's layer (VERY IMPORTANT!)
2. Sets the game object's parent (also very important!)
3. Resets the position/scale
4. Works with Undo functionality

If you use Instantiate, then you are shooting yourself in the foot by creating your UI element in the middle of nowhere, orphaned without a parent. Think of the children! Save a life by using AddChild instead.
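For reference, typical usage looks like this (the UILabel part is just an assumption about what's on your prefab):

using UnityEngine;

public class ExampleSpawner : MonoBehaviour
{
	public GameObject itemPrefab;

	void Start ()
	{
		// Inherits this object's layer and parent, and resets the transform
		GameObject item = gameObject.AddChild(itemPrefab);
		item.GetComponentInChildren<UILabel>().text = "Properly parented!";
	}
}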

28
NGUI 3 Support / Limited support August 28th - September 6th
« on: August 28, 2014, 09:54:55 PM »
...on account of me being in Italy for a week. Wife made me go on another vacation. :(

Sucks too, because I'm on such a roll with Windward, and now I have to take a forced break. Arg!

I'm not sure what my wifi situation will be like there, but I'll try to keep the forum responses up to date.

29
NGUI 3 Support / MOVED: NGUI HUDText is a bit shaky
« on: August 08, 2014, 04:46:26 PM »

30
NGUI 3 Support / Dynamic Font issues -- call for a clean repro case
« on: July 25, 2014, 02:43:48 AM »
Hey guys,

Ricardo Arango from Unity Tech is currently looking into the issues with dynamic fonts. He has tracked the problem down to Unity keeping track of when each character was last used (with "used" basically meaning the last time it was requested). Since NGUI caches the result and doesn't re-request characters every frame (doing so would be pretty crazy for performance, to say the least), Unity's code believes that certain characters are no longer in use and discards them, which is what causes the issues some of you have encountered.
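To make that concrete, a repro essentially boils down to something like this sketch (hypothetical, not NGUI's code): request glyphs once, cache the result, and never re-request -- then watch the cached UVs go stale once Unity discards the glyphs.

using UnityEngine;

public class DynamicFontRepro : MonoBehaviour
{
	public Font font; // any dynamic (TTF) font
	public string text = "Hello";

	CharacterInfo[] mCached;

	void Start ()
	{
		// Requested exactly once -- if Unity later decides these glyphs are
		// "unused" and discards them, the cached entries will point at stale
		// regions of the font texture and the displayed text will corrupt
		font.RequestCharactersInTexture(text, 32, FontStyle.Normal);
		mCached = new CharacterInfo[text.Length];
		for (int i = 0; i < text.Length; ++i)
			font.GetCharacterInfo(text[i], out mCached[i], 32, FontStyle.Normal);
	}
}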

All this aside, Ricardo has asked for a clean repro case to test the potential fixes with. If some of you have done your fair share of investigating the problem in the past and have a clean project for him to look at, please PM me with a download link. It would be fantastic if this particular bug was finally squashed. :)
