Shaders have always been something that I have been particularly interested in but never understood until recently. This trimester it's been my goal to actually start messing around with shaders to get a better understanding.
Rendering in Unity
The diagram below loosely represents the three different entities that play a role in Unity's rendering workflow:
3D models are, essentially, collections of 3D coordinates called vertices, which are connected together to make triangles. Each vertex can carry other information as well, such as a colour, the direction it points towards (called the normal) and coordinates for mapping textures onto it (called UV data).
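In Unity's Cg/HLSL shaders this per-vertex data typically arrives in an input struct. As a sketch (the struct name and field names below are the common convention, not code from this project):

```hlsl
// Conventional per-vertex input for a Unity Cg/HLSL shader.
struct appdata
{
    float4 vertex : POSITION;    // the 3D coordinate of the vertex
    float3 normal : NORMAL;      // the direction the vertex points towards
    float2 texcoord : TEXCOORD0; // UV data for mapping textures onto it
    fixed4 color : COLOR;        // optional per-vertex colour
};
```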
Models cannot be rendered without a material. Materials are wrappers which contain a shader and the values for its properties. Hence, different materials can share the same shader, feeding it with different data.
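A shader declares the properties a material fills in. As an illustrative sketch (the shader and property names here are made up, not from this project):

```hlsl
// Two different materials can reference this one shader while
// assigning different values to each of these properties.
Shader "Custom/ExampleShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Colour  ("Tint", Color) = (1,1,1,1)
        _Speed   ("Speed", Float) = 1.0
    }
    // SubShader with the actual vertex/fragment programs goes here
}
```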
Water Shader 2D
This shader creates the effect of objects merging together.
First off, we sample the texture colour:
fixed4 texcol = tex2D(_MainTex, i.uv);
Then we check the alpha to determine whether there is a gap.
Then we set the alpha as if the pixel were at the centre of the metaball and adjust the colour accordingly.
And finally we return that finalColour.
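Putting those three steps together, a minimal sketch of the fragment function might look like this (the _Cutoff threshold and _Colour property are assumptions for illustration, not the post's exact code):

```hlsl
fixed4 frag (v2f i) : SV_Target
{
    // Step 1: sample the texture colour.
    fixed4 texcol = tex2D(_MainTex, i.uv);

    // Step 2: check the alpha to determine a gap.
    if (texcol.a < _Cutoff)
        discard; // below the metaball threshold, draw nothing

    // Step 3: treat this pixel as solid metaball interior and
    // adjust the colour accordingly.
    fixed4 finalColour = _Colour;
    finalColour.a = 1;

    return finalColour;
}
```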
This is what it should look like:
How does this work in Unity?
A texture is used to supply the colour instead of defining each blob inside the shader. This way the shader doesn't need to compute every blob for each point, because the texture can act as a mask instead.
Similar to the one below:
This, combined with the metaball shader, should look more like this:
Our “Water Drops” are going to use sphere colliders with rigidbodies, and we can set the friction or density to the desired configuration. Then we create the collider smaller than the texture, like so:
A render texture behaves like a regular texture, so we can apply any shader to it. If we apply our metaball shader to it, anything white the camera sees will have its colours merged.
But we don’t want this merging every white colour in the scene. To prevent that, we create two cameras: one that looks at the regular scene and one that looks only at the layer containing the metaballs.
Once we have created that, we just need to layer the plane with the metaball render texture over our current scene, and it should do the rest for us.
So how do you make something seem like it's softer and squishier?
I went with the basics of trying to make it seem amorphous by squishing the vertices at the sides of the mesh.
By adding the sign of the vertex's x coordinate multiplied by a sine of time, each side shrinks or grows depending on where it sits in the sine wave.
v.vertex.x += sign(v.vertex.x) * sin(_Time.w) / _Speed;
A similar thing was done with the y coordinate, but using cos instead to give a phase offset.
v.vertex.y += sign(v.vertex.y) * cos(_Time.w) / _Speed;
Then the transformed vertex and the UVs were passed along to the rest of the pipeline:
o.pos = UnityObjectToClipPos(v.vertex);
o.uv = v.texcoord;
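Assembled into one vertex function, the squish effect above might be sketched like this (assuming the usual appdata/v2f structs and a _Speed property):

```hlsl
v2f vert (appdata v)
{
    v2f o;

    // Squish the sides: offset x by a sine of time and y by a cosine,
    // scaled by the sign of the position so opposite sides move in
    // opposite directions.
    v.vertex.x += sign(v.vertex.x) * sin(_Time.w) / _Speed;
    v.vertex.y += sign(v.vertex.y) * cos(_Time.w) / _Speed;

    // Transform to clip space and pass the UVs through.
    o.pos = UnityObjectToClipPos(v.vertex);
    o.uv = v.texcoord;
    return o;
}
```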
And this was the result: