Effie

So, I finally finished a model.

I wanted to make my Mariku model, but I decided to do a test model of another character first for practice and to see how my workflow would go. I decided to make one of Effie from Descender, before she was Queen Between.

I did two different lighting angles to test it. I’m fairly satisfied with my shader at this point… but not with how I’m using it. I need to get more used to it, and to the adjustments I need to make to account for it. For example, the legs show through in several places on both images where I forgot to turn on the clipping mask. I’d like a way to render them all as if they were 2D, all on one alpha layer, but I don’t know how I’d go about that with them as separate objects.
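The underlying maths of “one flat alpha layer” is just alpha-over accumulation across the objects’ masks: once everything has been merged into a single mask, overlapping geometry saturates instead of showing through. A minimal sketch in plain Python, purely illustrative and not the Blender API:

```python
def flatten_alphas(layers):
    """Union several per-object alpha masks into one flat 2D mask.

    Each layer is a list of alpha values in [0, 1] covering the same
    pixels. Overlapping geometry saturates toward 1.0 rather than
    "showing through" the layer in front of it.
    """
    flat = [0.0] * len(layers[0])
    for layer in layers:
        for i, a in enumerate(layer):
            # standard "over" accumulation: flat + a * (1 - flat)
            flat[i] = flat[i] + a * (1.0 - flat[i])
    return flat
```

Whether Blender can be coaxed into producing this per-character mask directly at render time is a separate question; this only shows the compositing behaviour I’m after.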

I also see I should change my linework. I was quite happy with the sketchy effect I was getting, but in practice I don’t have the control I want; the nostrils, for example, get lines where I wouldn’t want them, and it’s just… inconvenient having to wait to see how it’ll look. I want feedback as close to real time as I can get.

I also need to work out the rigging. I used a proper armature this time, then used the Data Transfer modifier to transfer the weights onto the clothes. She lacks her jacket in the render because I couldn’t get it to copy the weights and didn’t want to stop and rig it manually. I wanted to use the Mesh Deform modifier, but it was finicky and unreliable on this model, and having to unbind and rebind everything every time I changed the cage would be a pain. I also haven’t found a solution for the unexpected behaviour triggered by combining a mesh deform with an armature on the same model.
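For reference, the core of what a nearest-vertex weight transfer does can be sketched in plain Python. This is a toy stand-in for the Data Transfer modifier’s behaviour, not the bpy API, and all names are illustrative:

```python
import math

def transfer_weights(src_verts, src_weights, dst_verts):
    """Copy bone weights from a source mesh to target vertices by
    nearest-vertex lookup -- a crude stand-in for what a data-transfer
    step does with a "nearest vertex" mapping.

    src_verts/dst_verts: lists of (x, y, z) tuples.
    src_weights: one weight dict per source vertex, e.g. {"Bone.L": 0.7}.
    """
    result = []
    for dv in dst_verts:
        # find the source vertex closest to this clothing vertex
        nearest = min(range(len(src_verts)),
                      key=lambda i: math.dist(src_verts[i], dv))
        result.append(dict(src_weights[nearest]))
    return result
```

Real transfers interpolate across faces rather than snapping to one vertex, which is part of why loose clothing like the jacket is harder: its vertices sit far from the body surface, so the nearest-element lookup picks poor sources.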

I also just need to get better. The mesh quality isn’t good enough; it doesn’t hold up to close-ups, and the anatomy is lacking, especially the legs. I’m not good at them.

Plus, I need to work out a better way to do the colours. I mixed them individually where needed, but because each colour has its own set of values to go with it, it’s inconvenient and awkward. I could use two masks, one for the overall main colour and one for the overall shadow, but I worry that would damage the watercolour effect I’m going for by being too perfect and clean, whereas with multiple colours the masks can be modified individually, such as adding edge soak at the borders.

Lastly, I need to change how I alter shading. The shader just won’t perform well enough to modify it in real time with vertex paint as I’d wanted to. My experience of Blender’s texture painting makes me think it probably won’t handle the texture-based method in a real scene either. Normal editing is a nuisance, though. I’m uncertain what to do about it.

Still, I want to work something out. I’m making progress. I just wish I didn’t feel like I was constantly at the “if it was just a bit better…” stage.

Pseudowatercolour V2 and Effie

I’ve been working hard recently. Real life has been a bit tricky with the coronavirus making people panic; work has been more hectic because of it, so I’ve had more overtime and have been trying to make the most of my free time. With that, I’ve made a fair few changes to my pseudowatercolour shader, and I’m liking the results.

A problem I had before was that it didn’t work well for dark colours. I tried applying it to my Mariku model, but it looked so wrong. The problem was that it was also blending to the background colour, white. But in real paint, it probably wouldn’t do that; darker pigments seem to stain more, so a dark brown would probably just fade at the edges to a less dark brown. So, I added a new input for it, Maximum Blend, that defines how much the colour can blend with the canvas colour at most. This doesn’t affect the transparency, allowing me to keep my uneven transparent outer edge, while avoiding it looking too unnatural. I tried some more tests on my test models, including multiple colours this time.
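The Maximum Blend idea boils down to capping the interpolation factor toward the canvas colour. A minimal sketch of the maths, assuming simple linear blending (the node’s actual internals may differ):

```python
def watercolour_blend(colour, canvas, blend, max_blend):
    """Blend a pigment colour toward the canvas colour, capping the
    blend factor so dark pigments only fade at the edges instead of
    washing all the way out to white.

    colour/canvas: same-length tuples of channel values in [0, 1].
    blend: the raw edge-blend amount; max_blend: the new cap.
    """
    t = min(blend, max_blend)  # the Maximum Blend clamp
    return tuple(c + (k - c) * t for c, k in zip(colour, canvas))
```

With `max_blend` at, say, 0.3, a dark brown near the edge only drifts 30% of the way toward white, so it reads as a lighter brown stain rather than disappearing; transparency is handled separately, so the uneven transparent outer edge survives.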

I think it’s more effective than before, and I’m quite satisfied, mostly, with how the dark colours look here. The lighter ones look better, though.

In any case, there are still problems, the main one being performance. Frankly, it’s insufficient. I might be using a laptop, but even so, it’s far too slow for my liking when I alter values or use multiple colours. In this instance I used two copies of the shader mixed with a texture, so it’s running the entire thing twice, and I found it slow.

A more efficient way would be to mix the values the shader uses beforehand (the colour ramp inputs, the colours, and so on) and run the shader once on the result. I tried making a node group for it.
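The idea, sketched in plain Python: interpolate the parameter sets first, then evaluate the expensive shader once on the mixed values, rather than evaluating it twice and mixing the outputs. The parameter names here are made up for illustration, not the shader’s actual sockets:

```python
def lerp(a, b, t):
    """Linear interpolation between two scalars."""
    return a + (b - a) * t

def mix_params(p1, p2, t):
    """Blend two full shader parameter sets into one dict, so the
    (expensive) shader body only needs to run once on the mixed
    values instead of running twice and mixing the results.
    """
    return {key: lerp(p1[key], p2[key], t) for key in p1}
```

This is only equivalent where the shader responds roughly linearly to its inputs, which is part of why the results need testing rather than assuming it just works.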

Unfortunately, as you can see, it’s obscenely long. It’s ridiculous. Technically it works, but it’s just not sufficient: it’s clunky, hard to use, and takes a long time to plug in. So, my next move will be to make it more efficient. I’m going to pack as many of those values into single inputs and outputs as possible. For example, I could combine the values for Silhouette Ramp Low, Silhouette Ramp High, and Maximum Blend into a colour. Colour inputs clamp values at 0, though, so I think I’ll use vectors instead. I’ll need one function to convert the full value set into the packed version, another to unpack it, and a version that mixes two value sets using the packed versions as inputs. Then I should have less to plug in to each thing.
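The pack/unpack/mix trio could look like this in plain Python. The three socket names are the ones mentioned above; the packing order itself is just an assumed convention:

```python
def pack_silhouette(ramp_low, ramp_high, max_blend):
    """Pack three related shader values into one 3-vector so a single
    socket can carry them between node groups."""
    return (ramp_low, ramp_high, max_blend)

def unpack_silhouette(vec):
    """Split a packed vector back into its three named values."""
    return vec[0], vec[1], vec[2]

def mix_packed(v1, v2, t):
    """Mix two packed vectors component-wise, so blending happens on
    the packed representation without unpacking first."""
    return tuple(a + (b - a) * t for a, b in zip(v1, v2))
```

In node terms, each of these maps to a small utility group: Combine XYZ for packing, Separate XYZ for unpacking, and a vector mix for the blend, which is why vectors are more convenient than colours here (no clamping at 0).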

I’m also going to add back some features I took out. I thought that modifying things by viewing angle was making the node too large, but after thinking about it and testing it, it would be useful; I just need to use it more carefully. I’m also not currently seeing as much use for the depth as I’d thought before, so I’ll probably remove it. Since I can get the depth from the node group I made, I could feed it in as an input by using it as the override mask, rather than it needing to be internal.

I have more to write soon, but this is it for now. I finally, mostly, solved my sculpting problem with smoothing, so I’ll have proper models to show soon. I’m currently working on one of Effie from Descender, for fun and practice. If I can make this work, I can definitely make Mariku work.