Giving nuance to a PBR texture across an environment is a common challenge. One solution is to blend multiple textures together. Here I describe my process for creating artwork that uses such a blended texture.
Maya ShaderFX
One of the issues facing our workflow is that no multi-layer shader format is currently shared between DCC packages and engines such as Unity or Unreal. So we have to build an analog of the final shader we are going to use in our game engine, letting us preview what the final output will look like while still working with the context and tools of our DCC scene. In Maya, we use a ShaderFX graph to build such a shader.
The structure of the graph is quite basic. Essentially, we have three texture maps, which we feed into the diffuse property of our output shader through two cascading lerp (linear interpolation) nodes. The red and blue channels of the geometry's vertex colors, read via the VertexColor node, control how much of each texture appears.
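The blend math of the two cascading lerp nodes can be sketched in a few lines of Python. This is a minimal sketch, not the ShaderFX graph itself; which channel drives which lerp is an assumption, since only the red/blue channel assignment is described above, not the exact wiring order.

```python
def lerp(a, b, t):
    """Linear interpolation, mirroring a ShaderFX/HLSL Lerp node per channel."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_diffuse(tex_a, tex_b, tex_c, vertex_color):
    """Two cascading lerps driven by the red and blue vertex-color channels.

    The assignment of red to the first lerp and blue to the second is an
    assumed wiring; the graph could equally cascade them the other way.
    """
    r, g, b, a = vertex_color
    first = lerp(tex_a, tex_b, r)  # red channel blends texture A toward B
    return lerp(first, tex_c, b)   # blue channel blends the result toward C

# Pure red vertex color fully selects texture B:
# blend_diffuse((1,0,0), (0,1,0), (0,0,1), (1, 0, 0, 1)) → (0, 1, 0)
```

Painting vertex colors on the mesh then acts as a per-vertex mask: black shows texture A, red shows B, and blue shows C, with intermediate values giving smooth transitions.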
We can export this result directly to our Unity project.
Unity Shader Graph
UV Mapping
While this workflow uses a three-layer texture format, the technique can be extended to up to five textures, with each of red, green, blue, and alpha, plus black (the absence of any channel), associated with its own texture. However, since we are targeting a real-time output format, every extra layer adds texture samples and blend operations to the final shader, so it's a balancing act between variety and performance.
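The five-texture extension can be sketched the same way, as a chain of lerps: black (all channels zero) leaves the base texture visible, and each RGBA channel blends in its own texture. This is a hedged sketch of one plausible wiring; the actual graph layout and channel-to-texture mapping may differ.

```python
def lerp(a, b, t):
    """Linear interpolation per channel, as in a ShaderFX/HLSL Lerp node."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_five(base, tex_r, tex_g, tex_b, tex_a, vertex_color):
    """Five-way blend: RGBA vertex-color channels each drive one texture;
    black vertex color (0,0,0,0) keeps the base texture.

    The cascade order (R, then G, then B, then A) is an assumption.
    """
    out = base
    for tex, weight in zip((tex_r, tex_g, tex_b, tex_a), vertex_color):
        out = lerp(out, tex, weight)  # each channel lerps in its texture
    return out
```

Note the cost: each additional layer means one more texture sample and one more lerp per pixel, which is exactly the optimization trade-off mentioned above.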