Rope Physics and Rope Rendering

April 8, 2017 - devlog

I recently added snare turrets to Jettomero. They attach cables to Jettomero’s arms when he’s in range and pull on them until the tension gets high enough that the turret gets yanked out of the ground. I’m sure there are many different ways to handle something like this, but here’s a quick tutorial for how I set it up.

Using Unity3D’s spring joints proved effective for the physics side of this system. For the rendering I used a single Line Renderer. Since the Line Renderer component relies on specific points to draw to, I need to keep assigning new positions in my Update() function. I assign the first point in the line to the origin of the cable, where the first joint lives. Then I run a for loop over each joint in chain order and assign each additional point in the line to the position of that joint’s connected rigidbody, the last of which is Jettomero’s arm. That way the cable gets rendered along the entire length of the joint system.
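
Here’s a rough sketch of that loop – ‘joints’ and ‘line’ are placeholder names for the list of SpringJoints (in chain order) and the Line Renderer, with the line’s position count set to joints.Count + 1 when the cable is created:

// A minimal sketch – 'joints' holds the SpringJoints in chain order and
// 'line' is the LineRenderer; both are placeholder names, not from the
// actual project.
void Update()
{
    line.SetPosition(0, transform.position); // cable origin at the turret
    for (int i = 0; i < joints.Count; i++)
    {
        // each subsequent point tracks the rigidbody the joint connects to;
        // the last one is Jettomero's arm
        line.SetPosition(i + 1, joints[i].connectedBody.position);
    }
}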


You’ll probably want to make sure the line renderer has the Use World Space toggle selected so that all the transform.position references are in the proper space.

As for the RigidBody and SpringJoint components, this was what my settings looked like once I got things working how I wanted.

All the joints are set up the same, except on my turret base I freeze the position and rotation so it can pull against Jettomero’s arms. Then I check joint.currentForce.magnitude against a pre-defined threshold. When the force is high enough I disable the constraints on the base and it comes flying out of the ground towards Jettomero.
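
A bare-bones sketch of that check – ‘baseJoint’, ‘baseBody’ and ‘forceThreshold’ are placeholder names:

// Release the turret base once the cable tension gets high enough.
// baseJoint, baseBody and forceThreshold are placeholder names.
void FixedUpdate()
{
    if (baseJoint.currentForce.magnitude > forceThreshold)
    {
        // removing the constraints lets the base come flying out of the ground
        baseBody.constraints = RigidbodyConstraints.None;
    }
}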

The joints took me a while to figure out – the Auto Configure option was throwing me off for a while. It really depends on how you’re setting up your scene, so sometimes it’s best to run the game and just mess with settings until it’s working how you want. Then right-click on the component in the inspector, copy the component, stop the game, and paste the runtime component settings back in. Playing with physics takes patience and lots of tuning and iteration.

That’s essentially it. If I missed something or you have any questions feel free to reach out to me on twitter @GhostTimeGames.

New Generative Textures using Particles

April 7, 2017 - devlog

I switched to a new fully generative system for creating my planet textures yesterday. I was previously using a few hand-drawn source files and manipulating those, but now I’m using a particle system to form my source, largely thanks to the new Noise and Trails modules in Unity3D’s Shuriken system. There are some neat features that let me do stuff like this:

So I can manipulate the lifetime, size over lifetime, overall size, and noise distortion of these lines. Then I render the result into a new texture using a render texture and stack it 10 times with some randomized rotation and random tiling/scaling in the material, which gives me stuff like this:

Then I just throw that texture onto the planet and my shader will threshold it and adjust the lines based on the lighting. So it ends up like this:

It’s very similar to the look I had before, but since I’m no longer limited to a range of hand-drawn textures there’s lots of opportunity for greater random variety. It’s exciting creating a system that still has room for surprises each time I visit a new planet. I’ll continue to experiment with the particles to create an even wider range of possibilities.
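
For anyone curious about the stacking step, here’s roughly how it could work – a sketch assuming a dedicated camera that renders only the particle system into a RenderTexture (all names are placeholders, and the random tiling/scaling would be applied to the particle material between passes):

// Accumulate 10 rotated renders of the particle system into one texture.
// 'sourceCamera' is a placeholder for a camera that sees only the particles.
RenderTexture rt = new RenderTexture(512, 512, 16);
sourceCamera.targetTexture = rt;
for (int i = 0; i < 10; i++)
{
    // only clear on the first pass so the layers stack up
    sourceCamera.clearFlags = (i == 0) ? CameraClearFlags.SolidColor : CameraClearFlags.Nothing;
    // random rotation between passes
    sourceCamera.transform.Rotate(0f, 0f, Random.Range(0f, 360f));
    sourceCamera.Render();
}
sourceCamera.targetTexture = null;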

A Disclaimer for Jettomero

February 16, 2017 - devlog

Please note this is not a tutorial. This is me attempting to explain a number of things to both you and me, to alleviate certain expectations about Jettomero that I’m concerned some people may have – and that may also be causing me stress/depression. I know there are lots of great games out there that don’t conform to any pre-existing ideas of what a video game should be – even so, I find it difficult at times to feel confident in pushing that myself. Before I start hyping the game for launch I’d like to make a few things clear.

  1. Jettomero probably isn’t the game you think it might be. This is intentional. The core concept of the game (long before it ever took any sort of shape) is still at the base of everything and is very important to me. In order for it to work I do need to undermine the player’s expectations. Some people may not like this. So be it.
  2. Jettomero is more of an experience than a game. This relates back to my first point – I’ll be playing with some gaming tropes explicitly to flip them on their heads.
  3. Jettomero will probably be confusing when you first pick it up. I’ll tell you how to play, but I won’t tell you why. I’m not going to give you a mission to complete. The meaning of your actions will need to come from you.
  4. Jettomero is only intended to be 2-3 hours long. This is all the time I need to do the things I want to do and I don’t see any reason to drag it out.

That being said, I do hope you enjoy Jettomero.

World Space to Screen Space UI

December 20, 2016 - devlog

It’s not actually that complicated – this is a simple technique for achieving the effect of 2D UI tracking world-space positions. If you’re new to Unity then this tutorial may be helpful to you.

The Unity3D UI offers great flexibility, allowing easy setup of multiple canvases in screen space and world space. World space UI works exceptionally well for things like interactive computer screens, but may not always be exactly what you want due to scaling/orientation. In my case, I needed a way to clearly mark all the planets in the system, so I render the markers in the main screen space UI and use the setup described below.

Step 1) The parent rect transform should extend to all edges of the screen, otherwise this will not work.

Step 2) The UI object should be configured as such:

The important details to note are that the anchor is in the lower left corner and the pivot is set to 0, 0. This ensures that the origin of the UI element always tracks the exact screen point. If you want it off-centered you can modify the children of the main tracking UI element.

Step 3) Now in your update function you’ll want to take the world position of your target, convert it to viewport space, and set the UI RectTransform’s anchored position. Here’s the example from my code (where I’m iterating over a list of available planets):

Vector3 viewPos = Camera.main.WorldToViewportPoint(planets[i].transform.position);
planetLabels[i].anchoredPosition = new Vector2(viewPos.x * canvasSize.x, viewPos.y * canvasSize.y);

planetLabels[i].gameObject.SetActive(viewPos.z > 0);

The viewPos x and y values are normalized viewport coordinates – between 0 and 1 whenever the target is on screen. So when you apply them to the anchored position of the UI you’ll need to multiply by the canvas’s size. The size won’t always match the screen’s dimensions, so to find the canvas size you should use the Canvas component’s pixelRect.width and pixelRect.height – referencing the Canvas object in the UI element’s upper hierarchy.

It’s also very important to note that if viewPos.z is less than 0 the object is behind the camera, so unless you take this into account in your handling (in my case I disable the game object), the UI will show up in a mirrored position when the target is behind the camera.
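
Putting it all together, the loop looks something like this (‘canvas’ here is a placeholder reference to the parent Canvas component):

// Full tracking loop – 'planets', 'planetLabels' and 'canvas' are fields
// the script already holds; 'canvas' is a placeholder name.
void Update()
{
    Vector2 canvasSize = new Vector2(canvas.pixelRect.width, canvas.pixelRect.height);
    for (int i = 0; i < planets.Count; i++)
    {
        Vector3 viewPos = Camera.main.WorldToViewportPoint(planets[i].transform.position);
        // hide labels for planets behind the camera so they don't appear mirrored
        planetLabels[i].gameObject.SetActive(viewPos.z > 0);
        planetLabels[i].anchoredPosition = new Vector2(viewPos.x * canvasSize.x, viewPos.y * canvasSize.y);
    }
}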

That’s all there is to it. In the gif above I also modify the alpha of the parent CanvasGroup to fade in and out the elements. As always, let me know if you have any questions @GhostTimeGames on twitter.

Runtime crater mesh deformations

November 12, 2016 - devlog

Since I’m already deforming my planets at runtime using a noise map, it made sense to include the ability to add craters to the planets when a large event occurs (landing or blasting off for example).

[gif: landing crater deformation]

It’s not entirely ideal but my system seems to work alright – the biggest issue is that I use the deformed surfaces for my mesh collider as well so I need to be careful how much I warp the surface or it might cause issues with character mobility.


Here’s the gist of it:

float craterSize = 5.0f;

// requires using System.Linq and System.Collections.Generic;
// craterOrigin is the world-space centre of the impact
Mesh mesh = GetComponent<MeshFilter>().mesh;
List<Vector3> vertices = mesh.vertices.ToList();
for (int i = 0; i < vertices.Count; i++) 
{
     // work in world space so we can compare against the crater origin
     vertices[i] = transform.TransformPoint(vertices[i]);
     float distFromCrater = Vector3.Distance(vertices[i], craterOrigin);
     if (distFromCrater < craterSize)
     {
          Vector3 dirFromCrater = (craterOrigin - vertices[i]).normalized;
          // push the vertex away from the crater – strongest at the centre
          vertices[i] = vertices[i] - dirFromCrater * (craterSize - distFromCrater) / 2.0f;
     }
     vertices[i] = transform.InverseTransformPoint(vertices[i]);
}
mesh.SetVertices(vertices);
mesh.RecalculateNormals();


So once I have an array of all the vertices I convert each one into world space (using transform.TransformPoint()) and check how close each vertex is to the source of the crater explosion. If a vertex is within range of the crater explosion then I move it back away from the crater origin based on its proximity – so it’s most impacted at the center and the effect becomes gentler towards the edges. Then I convert the vertices back to local space using transform.InverseTransformPoint() and assign my modified vertex array back to the mesh. It’s also very important to recalculate the normals.
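
One extra note – since I use the same deformed mesh for my mesh collider, the collider needs a nudge to pick up the new shape. Reassigning the sharedMesh forces it to rebuild its collision data (a quick sketch):

// Refresh the MeshCollider so it matches the deformed surface.
MeshCollider col = GetComponent<MeshCollider>();
col.sharedMesh = null;
col.sharedMesh = mesh;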

That’s essentially all there is to it. I use a particle system with rock models to represent the displaced earth, and mask the sudden model change with the explosion effects as well. I’ll likely need to limit the number of craters that can be made in a single area to avoid the terrain becoming completely distorted, since it’s definitely possible to start making things weird with enough repeated deformations.

Shader-driven Flame Effect

October 19, 2016 - devlog

I saw a cool gif of some animated flames the other day and wondered if I’d be able to recreate something similar in Unity3D. I’ve been using particle effects for almost everything up until now because I’m terrible at writing shaders but I thought I’d see what I could do. So here’s my less than perfect shader which you can copy and tweak as you desire.

[gif: flame shader example]

I’ll try my best to explain what I’m doing in the shader so you’ll know the best places to start tweaking things.

First things first – my base texture was literally created by opening Photoshop and selecting Filter->Render->Difference Clouds. I just needed something quick to test with and this ended up being perfect.

[image: difference clouds texture]

You could try out different textures and see how that changes the effect. For this example I think you’ll want to make sure you select the Alpha from Grayscale and Transparency from Alpha options in the texture import settings.

[screenshot: texture import settings]

Now for the shader code. Remember, I don’t really know what I’m doing here. But it works.

Shader "GhostTime/FlameCutoffShader"
{
     Properties {
     _Color ("Main Color", Color) = (1,1,1,1)
     _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
     _SecTex ("Second (RGB) Trans (A)", 2D) = "white" {}
     _Scale ("Scale", Float) = 1
     }
 
     SubShader {
     Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" "DisableBatching"="True" }
     LOD 200
     Lighting Off
     Cull Off
 
     CGPROGRAM
     #pragma surface surf NoLighting noforwardadd alpha
     
     sampler2D _MainTex;
     sampler2D _SecTex;
     fixed4 _Color;
     float _Scale;
     
     struct Input {
         float2 uv_MainTex;
         float3 worldPos;
     };

     fixed4 LightingNoLighting(SurfaceOutput s, fixed3 lightDir, fixed atten)
     {
         fixed4 c;
         c.rgb = s.Albedo; 
         c.a = s.Alpha;
         return c;
     }
     
     void surf (Input IN, inout SurfaceOutput o) {
        float4 time = _Time;

        // scroll the main texture upward, and the second texture
        // slowly left and right
        fixed4 c = tex2D(_MainTex, IN.uv_MainTex + float2(0, time.r * 10));
        float cRight = tex2D(_SecTex, IN.uv_MainTex + float2(time.r * 0.1, 0)).a;
        float cLeft = tex2D(_SecTex, IN.uv_MainTex + float2(time.r * -0.1, 0)).a;

        float mixedAlpha = (c.a + 0.1) * (cRight + 0.5) * (cLeft + 0.5);

        // distance of this fragment from the object's origin
        float3 worldPos = mul(_Object2World, float4(0,0,0,1)).xyz;
        float wDist = distance(worldPos, IN.worldPos);

        float distCut = _Scale/wDist;
         
        if (mixedAlpha < distCut)
        {
            o.Alpha = 1.0;
            o.Albedo = _Color + distCut;
        }
        else if (mixedAlpha - 0.01 < distCut)
        {
            o.Alpha = 1.0;
            o.Albedo = float3(0, 0, 0);
        }
        else
            o.Alpha = 0;
     }
     ENDCG
     }
 
     Fallback "Transparent/VertexLit"
 }

Firstly, you’ll notice there are two texture slots in the properties – I used the same texture for both, but you could change them up if you want to experiment with the effect. The first texture is the main flame source. The _Scale property is important for adjusting the reach of the flames. You’ll need to play with it based on the size of the mesh you’re using. Unfortunately I couldn’t find a good way to automatically get the scale/bounds of the mesh from inside the shader, so you’ll probably need to set this property from script.
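
For example, something along these lines – a sketch where ‘flameRenderer’ is a placeholder name, and the mapping from mesh bounds to _Scale will still need tuning by hand:

// Set the flame reach from script based on the mesh's local bounds.
// 'flameRenderer' is a placeholder for the renderer using this shader.
void Start()
{
    float reach = GetComponent<MeshFilter>().mesh.bounds.extents.magnitude;
    flameRenderer.material.SetFloat("_Scale", reach);
}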

So in the surf function we get the time from _Time – this is essential for animating the flame. Below that is where we sample the textures using offsets adjusted by the time.

fixed4 c = tex2D(_MainTex, IN.uv_MainTex + float2(0, time.r * 10));
float cRight = tex2D(_SecTex, IN.uv_MainTex + float2(time.r * 0.1, 0)).a;
float cLeft = tex2D(_SecTex, IN.uv_MainTex + float2(time.r * -0.1, 0)).a;

So the main texture sample is offset in the Y axis at 10 times the normal time rate. The cRight and cLeft values are offset in the X direction at a much slower rate, and I only need their alpha. It’s worth noting that these time multipliers are entirely arbitrary, and you may want to expose them as customizable properties at the head of the shader.

Now I multiply the alphas of all three of these samples together to create a combined, shifting alpha. I add an arbitrary amount to each one to prevent the result from being reduced too drastically.

float mixedAlpha = (c.a + 0.1) * (cRight + 0.5) * (cLeft + 0.5);

The left and right samples help keep the flames a little more random and flickery, instead of just scrolling a single texture straight up the mesh.

Now I get the distance of each point from the origin of the object, because we want the flames to start dithering as they travel farther from the source.
float3 worldPos = mul(_Object2World, float4(0,0,0,1)).xyz;
float wDist = distance(worldPos, IN.worldPos);

Now that we have the distance we use it with our _Scale property to create the cutoff value. If you’re not seeing any flames on your mesh you’ll need to adjust the _Scale property in the material properties.
float distCut = _Scale/wDist;

Now we do our check against the cutoff value.
if (mixedAlpha < distCut)
{
    o.Alpha = 1.0;
    o.Albedo = _Color + distCut;
}
else if (mixedAlpha - 0.01 < distCut)
{
    o.Alpha = 1.0;
    o.Albedo = float3(0, 0, 0);
}
else
    o.Alpha = 0;

So first thing to point out here – if we are setting our alpha to 1 then we also slightly modify the colour by the distance value. This creates the effect of the flame becoming darker the further away it gets from the origin. It’s a subtle gradient, but you could also consider doing a lerp to a secondary colour here, etc.

Next we do an else-if check where we modify the original alpha value by -0.01 (another arbitrary value). This section determines the edge of the flame, which I’ve coloured black. It’s a thin line, but you could replace the -0.01 with a customizable property to adjust the width of the edge. The same goes for the edge colour.

And that’s pretty much it. If you have any questions I’ll see what I can do, but I’m also open to any suggestions for making this shader better. I’m sure it has lots of room for improvement but hopefully this is a helpful starting point for you.

Creating Lazer Eyes

September 29, 2016 - devlog

Yes, I know I spelled ‘laser’ wrong. But that’s just how I spell lazer because Z is a cool futuristic letter and lazers are also cool and futuristic. Anyway… I’m going to outline a couple techniques I used to produce this lazer vision effect in Jettomero using Unity3D’s line renderer component.
[gif: lazer eyes demo]
Ignoring the fact that each eye has its own beam, let’s focus on a single one. The beam is made up of 3 line renderers: one is a solid line – the other two use a jagged lightning-like texture which I animate from script.
[image: super simple lightning texture]
When using the Unity line renderer it’s a good idea to store your Vector3[] of line points in its own separate array, since the line renderer annoyingly doesn’t have a property or function to retrieve them otherwise. All 3 line renderers in the beam should use the same start and end points – that much should be obvious. Now, we’ll want the 2 points of the beam so we can find the distance between them. Why? So we can adjust the tiling on the lightning material and it doesn’t get stretched out as the beam becomes shorter or longer. Take a look at the Update function on a script I attach to the lightning lines.

    // currentOffset, scrollSpeed and positions are fields on this script –
    // positions holds the beam's two points, set wherever the beam is aimed
    void Update () 
    {
        // scroll the lightning texture sideways over time
        currentOffset -= Vector2.right * Time.deltaTime * scrollSpeed;
        float length = Vector3.Distance(positions[0], positions[1]);
        // re-tile the texture so the arcs don't stretch with the beam's length
        GetComponent<LineRenderer>().material.mainTextureScale = new Vector2(length/5.0f, 1);
        GetComponent<LineRenderer>().material.mainTextureOffset = currentOffset;
    }

You can see I calculate the length of the line each frame and then apply it to the mainTextureScale of the line renderer’s material (dividing by 5.0f is just an arbitrary choice for how much distance each tile covers). That keeps the lightning arcs consistent across any length.

The other thing I do in the update function is animate the texture offset. I store a Vector2 on the script which is continually changed over time using a speed that I randomize within a range in my Start function. If you’re using multiple lines for an animated effect you’ll want to randomize both the speed and the start offset so that they aren’t all running on top of one another – you’ll get a much nicer effect. Here’s my Start function:

    void Start () 
    {
        // randomize the offset and speed so multiple lines don't animate in sync
        randomOffset = Random.Range(0.0f, 1.0f);
        currentOffset = new Vector2(randomOffset, 0.0f); // assign the field, not a new local
        scrollSpeed = Random.Range(4.0f, 6.0f);
    }
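
And since I mentioned storing the line points in your own array – here’s a rough sketch of pushing a shared array to all three line renderers in the beam (‘beamLines’ is a placeholder name):

// Update the start/end points of every line renderer in the beam at once.
Vector3[] positions = new Vector3[2];
void SetBeam(Vector3 start, Vector3 end)
{
    positions[0] = start;
    positions[1] = end;
    foreach (LineRenderer lr in beamLines) // the solid line + 2 lightning lines
        lr.SetPositions(positions);
}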

Finally, for the end of the lazer I have two particle systems. One disperses sparks from a sphere emitter; the other spawns the spark lines – a thin cross texture emitted at a single point with random rotation, random start size, and a size over lifetime going from 0 to 1. It’s the same effect I use for the sun’s rays in Jettomero, only much faster.

[screenshot: spark particle settings]

If there’s anything here that isn’t totally clear please let me know and I’ll try to elaborate more thoroughly. This technique isn’t too fancy but it is a very simple way to get an interesting effect.

Hacking the lighting in a sandbox solar system

September 28, 2016 - devlog

I’ve been able to get most things working reasonably well in Unity3D, but this is one obstacle I’ve run up against that I haven’t found an effective solution for yet. The issue? With a sun sitting right in the middle of my scene I need a point light in order to shine light evenly on all my planets. However, a point light’s shadows don’t work the same way a directional light’s shadows do – so I actually do need to use directional lights (as far as I’ve been able to tell so far).

Let’s look at some comparison screens – point light on the left, directional on the right:

[screenshots: point light (left) vs directional light (right), seen from space]
As you can see here, everything looks about the same from space except for the angle of the light. But when we land on a planet you’ll see…

[screenshots: point vs directional light on a planet’s surface]
The directional light looks way better, right!? Even with shadows turned on for the point light, the vast distance between the sun’s light and the planet means the shadows never appear. At one point I had set things up so that each planet had its own point light in very close proximity so it could cast some shadows – but point lights and directional lights cast shadows very differently, and ultimately I needed the directional effect.

[screenshot: point light leaking through to the dark side of the planet]
And then there’s the other issue with point lights – the light isn’t occluded by objects, so all my objects on the dark side of the planet still receive light right through the planet’s body – which is not at all what I want. So the directional light works pretty well for me on planets, except that…

[screenshot: directional light looking wrong for the other planets]
If I now look at the other planets from this planet, you’ll notice the light direction on them isn’t accurate at all – my script always points the directional light at the player’s current planet.

My solution up until now has been to switch my light source between point and directional whenever I land on or take off from a planet, and try to hide the switch behind some particle effects. One potential alternative might be to keep both lights on at all times and change the layers of the lights’ culling masks and the target objects – but that might complicate my existing layer system. I spent a long time cracking my head against the wall on this one, and this is the best solution I’ve come up with so far. It’s super hacky, but unfortunately it’s a problem unique to using this open solar system setup.
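
In case it’s useful, a bare-bones sketch of that switch – ‘pointLight’, ‘directionalLight’ and ‘sunPosition’ are placeholder names:

// Swap light sources when landing on / taking off from a planet.
void SetPlanetLighting(bool onPlanet, Transform planet)
{
    pointLight.enabled = !onPlanet;
    directionalLight.enabled = onPlanet;
    if (onPlanet)
    {
        // aim the directional light from the sun towards the current planet
        directionalLight.transform.rotation =
            Quaternion.LookRotation(planet.position - sunPosition);
    }
}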

I’d love to hear alternative ideas from anyone who might have one.

UPDATE – While playing with the lights some more I discovered that setting the Render Mode on the point light to ‘Important’ makes the shadows work properly on the dark side of the planet. If I set up a spot light for each planet (which I can turn on/off when appropriate) then I can get my directional shadows this way as well – as long as the spot light is also set to ‘Important’.

Creating a Steady Cam Effect

September 24, 2016 - devlog

Here’s a quick overview of how I manage the idle camera movement in Jettomero. I really enjoy spending a lot of time on camera behaviour to get things feeling natural and effective at motivating gameplay. To capture that ‘giant monster’ movie feel I wanted a constant drift to the camera – almost a handheld camera simulation.

My camera control in Jettomero is actually surprisingly complex since I have 2 objects which use smoothing for their position and rotation – the camera object itself, and a camera target object that changes behaviour based on the game context (whether we’re flying around space or walking on a planet). I may go into depth about the camera system in another post but for now we’ll just look at the idle movement.

[gif: steady cam drift]

I was originally modifying position, rotation and zoom level all using the same technique, but I’ve since simplified it to only adjust rotation, because I found there was too much movement happening. So in the Start() function of my SteadyCam script I call this coroutine.

    IEnumerator RotFlux()
    {
        while (true)
        {
            yield return new WaitForSeconds(Random.Range(0.5f, 1.5f));
            // pick a new small random rotation offset on each axis
            rotOffsetTarget = Quaternion.Euler(Random.Range(-rotRange, rotRange), Random.Range(-rotRange, rotRange), Random.Range(-rotRange, rotRange));
            yield return null;
        }
    }

The variable rotOffsetTarget is a Quaternion declared at the top of the script, so this coroutine modifies it every 0.5 to 1.5 seconds – using a range (rotRange) that I’ve also defined at the top of the script as 3.0f. Now that I have a randomized rotation variation being set frequently, I can use it in my Update function.
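
For reference, the relevant declarations at the top of the script look roughly like this (reconstructed from the values mentioned in this post):

Quaternion rotOffsetTarget = Quaternion.identity;
Quaternion rotOffset = Quaternion.identity;
float rotRange = 3.0f; // max random rotation per axis, in degrees
float rotSpeed = 2.0f; // how quickly the camera settles into a new rotation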

To simplify things – these are the two lines that affect my camera rotation in the update function:

rotOffset = Quaternion.Lerp(rotOffset, rotOffsetTarget, Time.deltaTime);

transform.rotation = Quaternion.Lerp(transform.rotation, CameraFollow.Instance.transform.rotation * rotOffset, Time.deltaTime * rotSpeed);

First, I take my rotOffsetTarget and smooth to it using a Quaternion Lerp from rotOffset. Yes, I do a lot of smoothing.

As I had mentioned earlier, I have 2 levels of smoothing on the camera, so CameraFollow.Instance.transform refers to the second level of smoothing, which I want my steady cam to follow and mimic. I take the rotation of that transform, add my random rotation variation to it (quaternions are combined through multiplication) and then Lerp between my camera object’s current rotation and this target rotation. My variable rotSpeed (set to 2.0f earlier in my script) determines how quickly I settle into the new rotation.

I don’t have a professional background in code so I’m sure there’s a more effective way to do all this, but I’m happy to share my technique. If you have any further questions about my Camera behaviour feel free to drop me a line on Twitter.

Using a Particle System to Create Static Decorations

September 23, 2016 - devlog

Here’s a quick look at a useful trick for using the Shuriken Particle System in Unity3D to quickly make procedural arrangements of static elements.

I was playing with the idea of rings around planets when someone on twitter suggested an asteroid belt around the planets. This was an appealing idea, so I considered how to quickly prototype it to get an idea of how it might work. Fortunately the Shuriken particle system in Unity3D is perfect for getting this up and running quickly. First I whipped up a ring mesh in Blender to use as the Mesh emitter, so all my asteroids would originate somewhere on the mesh (the edges, in this particular case).

[screenshot: ring mesh in Blender]

Here you can see the frame of the mesh that I use as my emission source.

Now, there are several important settings to use if you want the particle system to behave more like a prop than a dynamic effect.

1- Set the Start Lifetime to the max – for me Unity capped the value at 100000 seconds, which is ~27 hours – so those particles aren’t dying any time soon.

2- Turn off the emission rate and use a single burst to spawn all your particles at once (also be sure to turn off looping). Here you can choose the range of how many particles you’d like to decorate with. For my asteroid belts I set it between 0 and 300 so that some planets would only have a few scattered here and there.

[screenshot: emission burst settings]

3- Use mesh renderers under the Render settings (optional). I wanted to use the same material as my planet for the asteroids, and I also didn’t want the default billboarding effect here, so I selected a few different meshes to randomly assign to each particle. (The same setup can also be applied from script – see the sketch below.)

[screenshot: particle renderer mesh settings]
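
For reference, the same settings could be applied from script, roughly like this – a sketch assuming a recent Unity version with the module-struct particle API (‘asteroidMesh’ is a placeholder name):

// Configure a particle system as a static asteroid-belt prop from script.
ParticleSystem ps = GetComponent<ParticleSystem>();

var main = ps.main;
main.loop = false;
main.startLifetime = 100000f; // effectively immortal particles

var emission = ps.emission;
emission.rateOverTime = 0f;
// a single burst at t=0 spawning anywhere from 0 to 300 particles
emission.SetBursts(new ParticleSystem.Burst[] {
    new ParticleSystem.Burst(0f, (short)0, (short)300)
});

var psRenderer = ps.GetComponent<ParticleSystemRenderer>();
psRenderer.renderMode = ParticleSystemRenderMode.Mesh;
psRenderer.mesh = asteroidMesh; // more meshes can be assigned in the inspector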

Even if you want more control over your scene decoration at a later point, using Shuriken as a quick visual prototype is fast, clean and ideal for rapid iteration. In my case I might end up just sticking with my particle belt because it does pretty much everything I need it to do.

[screenshot: asteroid belt around a planet]

There’s lots of great ways to use this little hack, so be creative and see what you can do with it.