Using a Particle System to Create Static Decorations

September 23, 2016 - devlog

Here’s a useful trick for using the Shuriken particle system in Unity3D to quickly make procedural arrangements of static elements.

I was playing with the idea of rings around planets when someone on Twitter suggested an asteroid belt instead. This was an appealing idea, so I considered how to prototype it quickly and get a feel for how it might work. Fortunately, the Shuriken particle system in Unity3D is perfect for getting something like this up and running fast. First I whipped up a ring mesh in Blender to use as the Mesh emitter, so all my asteroids would originate somewhere on the mesh (on its edges, in this particular case).

screen-shot-2016-09-22-at-10-06-53-pm

Here you can see the frame of the mesh that I use as my emission source.

Now, there are several important settings to use if you want the particle system to behave more like a prop than a dynamic effect.

1- Set the Start Lifetime to the maximum (for me Unity capped the value at 100000 seconds, which is roughly 27 hours), so those particles aren’t dying any time soon.

2- Turn off the emission rate and use a single burst to spawn all your particles at once (also be sure to turn off looping). Here you can choose the range of how many particles you’d like to decorate with. For my asteroid belts I set it between 0 and 300 so that some planets would only have a few scattered here and there.

screen-shot-2016-09-22-at-10-07-21-pm

3- Use mesh rendering under the Renderer settings (optional). I wanted to use the same material as my planet for the asteroids, and I also didn’t want the default billboarding effect here, so I selected a few different meshes to be randomly assigned to each particle.

screen-shot-2016-09-22-at-10-17-07-pm
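In newer versions of Unity these modules are also scriptable, so the same setup can be wired up in code. Here’s a minimal sketch of the three steps above; the component name and its fields (decorationMeshes, emitterMesh) are hypothetical, and the exact module API depends on your Unity version:

```csharp
using UnityEngine;

[RequireComponent(typeof(ParticleSystem))]
public class StaticDecorationSetup : MonoBehaviour
{
    public Mesh emitterMesh;        // the ring mesh whose edges act as the emission source
    public Mesh[] decorationMeshes; // meshes to assign randomly per particle

    void Start()
    {
        ParticleSystem ps = GetComponent<ParticleSystem>();

        // 1 - max out the lifetime so the particles never die during play
        var main = ps.main;
        main.loop = false;
        main.startLifetime = 100000.0f; // Unity's cap, roughly 27 hours

        // 2 - no continuous emission, just a single burst of 0-300 particles
        var emission = ps.emission;
        emission.rateOverTime = 0.0f;
        emission.SetBursts(new ParticleSystem.Burst[] {
            new ParticleSystem.Burst(0.0f, 0, 300)
        });

        // emit from the edges of the ring mesh
        var shape = ps.shape;
        shape.shapeType = ParticleSystemShapeType.Mesh;
        shape.meshShapeType = ParticleSystemMeshShapeType.Edge;
        shape.mesh = emitterMesh;

        // 3 - render each particle as a mesh instead of a billboard
        var psRenderer = GetComponent<ParticleSystemRenderer>();
        psRenderer.renderMode = ParticleSystemRenderMode.Mesh;
        psRenderer.SetMeshes(decorationMeshes);
    }
}
```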

Even if you want more control over your scene decoration at a later point, using Shuriken as a visual prototype is fast, clean, and ideal for rapid iteration. In my case I might end up just sticking with my particle belt, because it does pretty much everything I need it to do.

screen-shot-2016-09-22-at-10-00-58-pm

There are lots of great ways to use this little hack, so be creative and see what you can do with it.

Handling Audio Fades in Coroutines

September 17, 2016 - devlog

Hi, here’s another quick demo of some C# code I wrote for Jettomero in Unity3D.

I was adding some new functionality to my music system and needed to be able to transition volume levels on a given audio source. This is simple to accommodate using a coroutine: I can pass an AudioSource and a target volume to a generic function to keep things clean and reusable. However, I needed to be able to run these volume transitions on potentially multiple audio sources at once, and also to stop any given transition if that audio source received an updated transition call.

Fortunately, Unity’s StopCoroutine accepts an IEnumerator as an argument, so I can hold a reference to every coroutine before I start it and use that reference to stop it later on. Since every audio source only ever has a single active transition, I decided to use a dictionary to store paired AudioSource and IEnumerator values. This way I can easily check the dictionary for an existing transition given the audio source. As long as I’m careful to remove completed transitions from the dictionary, this technique works perfectly for my needs. The code could obviously be adapted for non-audio purposes as well.

Here’s what the code looks like, (remember to add “using System.Collections.Generic” at the top of your .cs script to import the Dictionary support):


    Dictionary<AudioSource, IEnumerator> activeTransitions = new Dictionary<AudioSource, IEnumerator>();

    void ChangeVolume(AudioSource source, float targetVolume, float transitionTime)
    {
        // if this source already has a running transition, stop it first
        IEnumerator activeTransition;
        if (activeTransitions.TryGetValue(source, out activeTransition))
        {
            StopCoroutine(activeTransition);
            activeTransitions.Remove(source);
        }

        // keep a reference to the coroutine so it can be stopped later
        IEnumerator audioTransition = TransitionVolume(source, targetVolume, transitionTime);
        StartCoroutine(audioTransition);
        activeTransitions.Add(source, audioTransition);
    }

    IEnumerator TransitionVolume(AudioSource source, float targetVolume, float transitionTime)
    {
        float startVolume = source.volume;
        float progress = 0.0f;
        while (progress < 1.0f)
        {
            progress += Time.deltaTime / transitionTime;
            // Mathf.Lerp clamps progress, so the last frame lands exactly on targetVolume
            source.volume = Mathf.Lerp(startVolume, targetVolume, progress);
            yield return null;
        }

        // clean up so the dictionary only ever holds active transitions
        activeTransitions.Remove(source);
    }
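Because each AudioSource tracks its own transition, crossfading two music tracks is just two calls. A hypothetical usage sketch, assuming musicA and musicB are AudioSource fields assigned in the Inspector:

```csharp
// Fade the current track out while the new one fades in over 2 seconds.
// Calling this again mid-fade is safe: ChangeVolume stops any
// in-progress transition on each source before starting a new one.
void CrossfadeToB()
{
    ChangeVolume(musicA, 0.0f, 2.0f);
    ChangeVolume(musicB, 1.0f, 2.0f);
}
```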

Creating Semi-Procedural Textures

August 8, 2016 - devlog

Hello, this is the first of what I hope will be a series of development logs documenting some of the process of creating my projects. Some of the posts may be specific to certain applications and others may just be rants on broader game dev subjects. This particular log outlines a system that I set up in Unity3D. The technique relies on some familiarity with Unity3D and is mostly just a creative solution for something I needed in my game. Hopefully you find it interesting and/or inspiring.

The planets in Jettomero are procedurally generated using multiple techniques. In this dev log I’ll explain how the textures for each planet are created. I may still adjust the system in the future to allow for greater diversity, but in the meantime…

This is the hierarchy view of my texture generation process. A disabled camera lives on the parent object and points at a set of 10 quads that live underneath. By default each quad uses an identical texture with a threshold shader, so that white is transparent and the black lines all overlap on top of a single white background quad.

screen-shot-2016-09-15-at-7-22-03-pm

Here you can see what the layered quads look like in the scene view. It’s not much to look at yet. The lines are taken from a hand-drawn sketch and were additionally thresholded to black/white in Photoshop before being imported into Unity.

screen-shot-2016-09-15-at-7-24-50-pm

If I were to wrap a texture like this around one of my planets right now, it would look like this:

screen-shot-2016-09-15-at-7-32-28-pm

It still looks kind of cool because of the thresholded shader I’m using but I want more detail and more randomness.

Now, when I run the game, every time a new planet is created it requests a new texture from my texture generator script. So here’s what the layered quads look like when they generate a new texture:

screen-shot-2016-09-15-at-7-35-48-pm

For each quad renderer I assign one of my inkline textures and then randomize the texture scale and offset while also rotating the quads. This creates a big old mess of lines that will never be the same again. Note that the layers are all visible because I use a shader on the quads that makes white transparent.

From here my disabled camera gets rendered to a Render Texture, and I can copy that into a Texture2D which can be passed along to the planet that requested it.
screen-shot-2016-09-15-at-7-42-11-pm

The code for this is all fairly simple:

    public Texture2D NewGeneratedTexture(float scale = 0.0f)
    {
        // pick one of the hand-drawn inkline textures at random
        Texture2D refTex = refTextures[Random.Range(0, refTextures.Count)];

        foreach (Renderer r in GetComponentsInChildren<Renderer>())
        {
            r.material.mainTexture = refTex;

            // randomize the tiling, offset and rotation of each quad
            float randomScale = scale == 0.0f ? Random.Range(4.0f, 8.0f) : scale;
            float randomOffset = Random.Range(0.0f, 1.0f);
            r.material.mainTextureScale = Vector2.one * randomScale;
            r.material.mainTextureOffset = Vector2.one * randomOffset;

            r.transform.localEulerAngles = new Vector3(0.0f, 0.0f, Random.Range(0.0f, 360.0f));
        }

        // render the disabled camera's view and copy it into a new Texture2D
        Texture2D newTex = new Texture2D(defaultSize, defaultSize);

        RenderTexture.active = GetComponent<Camera>().targetTexture;
        GetComponent<Camera>().Render();

        newTex.ReadPixels(new Rect(0, 0, defaultSize, defaultSize), 0, 0);
        newTex.Apply();
        RenderTexture.active = null; // restore the active render texture

        return newTex;
    }

I keep a list of different inkline textures, one of which is chosen at random at the start (although I could mix and match if I wanted). Then, for each quad renderer, you can see I assign the texture to that renderer’s material and proceed to warp it in basic ways. Finally I create an empty texture and render my extra camera’s view into it.

I’m not sure if this is the most efficient code or the best solution for what I’m doing, but I’ve been happy with the results so far. My original approach generated a texture completely in code using Perlin noise (which is what I still use for my terrain deformation maps), but this new technique lets me easily use a texture with a hand-drawn feel, which creates what I find to be a very appealing image on its own.
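For comparison, that earlier fully code-driven approach looks something like this. This is only a reconstruction sketch of the idea (the method name and threshold value are my own, not from the actual project): fill a Texture2D pixel by pixel from Mathf.PerlinNoise and threshold it to black/white.

```csharp
using UnityEngine;

public static class PerlinTextureSketch
{
    public static Texture2D NewPerlinTexture(int size, float scale)
    {
        Texture2D tex = new Texture2D(size, size);

        // random offset so each generated texture samples a different noise region
        Vector2 offset = new Vector2(Random.Range(0.0f, 100.0f), Random.Range(0.0f, 100.0f));

        for (int y = 0; y < size; y++)
        {
            for (int x = 0; x < size; x++)
            {
                // PerlinNoise returns roughly 0..1; threshold it to black/white
                float n = Mathf.PerlinNoise(offset.x + x * scale, offset.y + y * scale);
                tex.SetPixel(x, y, n > 0.5f ? Color.white : Color.black);
            }
        }

        tex.Apply();
        return tex;
    }
}
```

It works, but it lacks the hand-drawn line quality that the layered-quad approach gives for free.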

screen-shot-2016-09-15-at-7-52-36-pm

If you have any questions feel free to follow up with me by email @ ghosttimegames at gmail dot com or via Twitter @GhostTimeGames.

screen-shot-2016-09-15-at-7-54-45-pm