I've been working on the material system lately. A material handles the textures and shaders used when rendering a mesh. Every material has its own shading program, which consists of a vertex shader, a fragment shader and an optional geometry shader. Some of the material's properties, like its number of texture layers, are compiled directly into the shaders, while others, like material colors, can be updated at run-time using uniform variables.
Materials can consist of several layers that are blended on top of each other with a variety of blend modes, much like layers in Photoshop. A texture layer can contain a diffuse texture with an optional alpha channel (for transparency effects), a normal map with a height component stored in the alpha channel (for parallax mapping and similar effects) and a specular map. Internally, texture layers are represented by array textures. The advantage is that only one texture object has to be bound per layer instead of three, cutting the number of texture binds by two thirds: the diffuse, normal and specular maps are all addressed through the same sampler in the shader.
In addition, texture layers can be scrolled, stretched or rotated for effects like animated computer screens. Quake 3 made heavy use of this technique, for example. I'll show an example of that later (if I don't forget, as usual).
The wall in the screenshot is composed of four images (shown on the left). You can see the material definition in the upper left corner.