Character face rig!


Geril here. This month in my free time, just to expand our portfolio a little, I've created a new character. It's still heavily work-in-progress, but it's gotten interesting enough to show you! It's a standard character for use in Unreal 4 and made in Blender, as usual, but this time instead of creating our own skeleton from scratch, I made the skeleton based on this .blend file recommended by Epic for Unreal 4.

While I kept the bone structure and names of the original skeleton, I modified everything necessary to suit the character, and - as always - I made a specific and unique facial rig. If you've been a reader of ours for a while, you might have noticed that we prefer using shape keys/morph targets for our characters' faces. It's true that they're easier to use in games, but they also limit things quite a bit. And even though I love limitations - limitations force me to be creative - it was about time to step out of my comfort zone and create a dedicated video game character complete with a bone-driven face rig. So I did. The face has a lot of bones that are constrained to a handful of driver bones for easier use, so within a few seconds I can animate stupid faces for the poor character.
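We can't paste the actual Blender driver setup here, but the core idea - one driver bone fanning out to many constrained face bones through per-bone weights - can be sketched in plain Python (all bone names and weights below are made up for illustration):

```python
# Sketch of a driver-bone fan-out: one control value drives several
# constrained face bones, each with its own influence weight.
# Bone names and weights are invented for illustration only.

def apply_driver(driver_value, influences):
    """Map a single driver bone value to per-bone pose values."""
    return {bone: driver_value * weight for bone, weight in influences.items()}

smile_driver = 0.8  # 0 = neutral, 1 = full smile
smile_influences = {
    "lip_corner_L": 1.0,
    "lip_corner_R": 1.0,
    "cheek_L": 0.4,
    "cheek_R": 0.4,
}

pose = apply_driver(smile_driver, smile_influences)
```

In Blender proper this would be done with drivers on the constrained bones' channels; the point is just that the animator only ever touches a handful of controls.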

At least that's how it started.

The pictures and the attached video currently show a face with standard driver bones. The eyes will move via texture panning, but what will really be interesting is that we plan to swap out parts of the skeletal mesh at runtime. At first this might sound strange, especially for a rig like this, but we want to see how well the Unreal Engine handles this kind of thing.

The mesh swaps will happen on the fly: the face will be split into multiple smaller meshes (the splits will be invisible, of course), and, for example, when the character blinks, the eye meshes will switch to a closed-eyed version instead of the eyelids stretching over the eyes. We will need multiple meshes spread out between the open and closed eye states to make the animation smooth, and then we still have to solve how to handle the textures for these meshes. The upside is that we don't have to worry about texture stretching, because we'll be using separate meshes for O and U sounds, and even for sticking the tongue out.
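Picking which pre-made in-between mesh to show for a given blink amount is the simple part of this plan; a minimal sketch (mesh names are hypothetical - the real swap would happen on the skeletal mesh components in Unreal) could look like:

```python
# Sketch: choose a pre-made eye mesh for a given blink amount.
# Mesh names are illustrative; in-engine these would be separate
# skeletal mesh pieces swapped at runtime.

EYE_MESHES = ["eye_open", "eye_25", "eye_50", "eye_75", "eye_closed"]

def eye_mesh_for(blink):
    """blink: 0.0 = fully open, 1.0 = fully closed."""
    blink = min(max(blink, 0.0), 1.0)
    index = round(blink * (len(EYE_MESHES) - 1))
    return EYE_MESHES[index]
```

More in-between meshes simply mean more list entries, at the cost of more assets to texture.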

But first we'll need a finished and stable face rig that could work standalone as well. The separate mesh pieces will be created using these weights too, and then modified in a way that leaves most of the weights intact. That way if the lips have a smile on them, and the mesh changes to an O-sound, the smile will stay. It sounds weird in text, but if all goes according to our plans, it should work pretty well. We'll have to change our animation style a bit to fit this technology, maybe make it snappier so the mesh changes won't look abrupt. The only question remaining is how UE4 will react to this all.

While I'm still working on the rig, Lussy's making the textures for the character. This takes up most of our free time now, so look forward to more posts about it in the coming months!


Still busy...


Okay, the fact that we are busy hasn't changed since last month... But at least now I have some time to write about our latest Unreal 4 adventures.

We had to script a horseman. The horse mesh was given to us, complete with materials and animations. The challenge was putting an already heavily script-based human on top of the horse and syncing up their animations and movements.

The movement part was easy: we attached the human's pelvis to a socket on the horse's saddle. The animations, however, were trickier, and we still think the whole system could have been done better - we're just not sure how.

This was our solution:

We united the horse and human skeletons in Blender, and animated the human on top of the horse that way. We then exported the unified skeleton along with the animations, and assigned the horse to that skeleton. An AnimBlueprint was built for the unified skeleton that contains IKs for placing the reins in the human's hands and the stirrups on his feet (if the IKs relied on any variables outside their respective AnimBlueprint, they lagged behind by about one frame).

Now comes the hacky part: we stuck the human on top of the horse, retargeted the riding animations of the merged skeleton onto the human's skeleton, then built the same AnimBlueprint for its skeleton too, sans the IKs. We rigged the whole thing so that the merged skeleton's BP sends signals to the human's BP to change anim states, so the changes occur at the same time and the animations stay in sync. Occasional errors still occur once in a blue moon (oh crap, today's a blue moon!), but we suspect those are framerate-related.
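The signal-based sync above boils down to a master/follower state machine: the merged skeleton's BP owns the state and pushes every change to the human's BP so both switch on the same frame. A toy sketch of that pattern (class and state names are illustrative, not actual UE4 API):

```python
# Toy sketch of the master/follower anim-state sync described above.
# The merged-skeleton "blueprint" owns the state; the human's
# "blueprint" only ever receives pushed state changes.

class AnimStateMachine:
    def __init__(self):
        self.state = "idle"
        self.listeners = []  # follower state machines to notify

    def set_state(self, new_state):
        self.state = new_state
        for listener in self.listeners:
            listener.set_state(new_state)

horse_bp = AnimStateMachine()   # merged-skeleton BP (master, has the IKs)
human_bp = AnimStateMachine()   # human BP (follower, sans IKs)
horse_bp.listeners.append(human_bp)

horse_bp.set_state("gallop")    # both switch in the same update
```

Pushing the change instead of having the follower poll is what keeps the two AnimBlueprints from drifting a frame apart.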

This system feels bloated and makes any later changes risky; however, we couldn't come up with anything more stable at the moment, under our current time pressure. Do you guys have ideas on how to make a system like this more stable?


Busy busy...

This has been a terribly busy month for us, so we don't have time to write a proper post now. However, we've learned a lot about making two AnimBlueprints (a horse and a horseman) precisely sync up in Unreal 4, so we'll have something to write about next month!

Happy holidays, everyone!


Monthly ramble!

This month we've honestly just worked our asses off so there isn't much else we could write about, and we can't really write about that either, so...

We've upgraded our PCs - I (Lussy) only got a GTX 1070 video card, but Geril replaced his whole machine and it was painful.
We got him a socket TR4 motherboard, a Ryzen CPU, a Vega 64 video card and some RAM, but when we first put them together, the PC wouldn't start up. We had to get another stick of RAM to actually get into the UEFI. Then we had to flash the BIOS to get the video card and the other stick of RAM to work. This took quite a while, and now he has 32GB of RAM instead of the planned 16...
So if you buy a recently released motherboard, just be prepared to deal with this.

When it's turned on, the whole thing lights up like a Christmas tree.
In other news, we've barely had time and energy to sit in front of a TV to play games, so we use the Switch a lot. We've seen lots of people just shrug off the portability factor of the Switch, but for us and our lifestyle, it is a godsend - we prefer anything portable over having to sit still in front of a huge screen for long periods of time, since that's what we do for a living, anyways... And we like taking long baths and now we can even play Skyrim during them.

We also try to play our PS4 frequently, but it's really inconvenient that we have to charge the controller every 2-3 hours (seriously, what's up with that battery life?), and, since we finish work well after midnight, we have to turn the volume all the way down and kill the subwoofer. With the Switch, the screen is just the right size that we don't have to sit too close to it and it's not tiring to hold, either. It's also pretty convenient that we can bring it with us to long train rides and family gatherings, where we can split the controller and play Mario Kart with other people.

As developers, though, we have to say that the Switch, while more powerful than last-gen consoles, isn't as powerful as the current generation of home consoles. That won't stop us from aiming to develop for it, however, because Unreal 4 seems to get excellent support on the platform. We always keep in mind how we can tone down the stuff we make so that it will run on the Switch. Right now we're using the GPD Win as a comparison point: if it runs on the GPD, it will surely run on the Switch, too.

That's it for this month's ramble, we hope we'll be able to talk about or show more interesting stuff in the future.


A GPD Win post!


This month we've taught horses to follow paths, made a character creation system and also put together some complex interactive materials. All that in Unreal 4. Too bad we can't show any of this right now, because it's for our current job. In any case, we're learning a lot working full-time in Unreal 4!

There's a thing we only briefly mentioned in our previous post but didn't elaborate on: the GPD Win that Lussy created the rhythm game project with, using Unreal 4.

The GPD Win is advertised as a portable Windows 10 PC, and Geril bought one about 3 months ago. At first he considered getting a LattePanda, but figured it would be too much trouble to make it stable. (Since, funnily enough, Geril bought it for work, stability is a huge concern)

Even though our GPD is for work, we still use it to play some games: we play World of Warcraft on it over random public Wi-Fi hotspots, plus Fallout: New Vegas and various indie games. Cuphead, for example, runs at a stable 60fps.

To play any "serious" modern games on it, we need to do some heavy tweaking just to get them running in a bearable manner, so we don't do it very often. It's too much effort - but that's about what we expected.

But as for Unreal 4, we had no clue how it would run. It's a huge relief that it runs well, surprisingly well. Of course everything that can be set to low has to be set to low, and everything takes a really long time to load and compile.

At first, Geril created simple materials, material instances, particle effects and levels. Editing materials is fine, but compiling shaders is a problem - probably because the built-in storage is really slow. It also gets very hot when compiling shaders (and even the maximum fan speed doesn't help), so we actually had to leave it in the fridge for an hour to complete compiling about 5000 shaders. Well, at least anyone who opened the fridge had a laugh.

The framerate immediately drops in the editor when working with any lights in the scene. Funnily enough, playing in editor produces better framerates than not starting the scene at all.

What really works well on it though are blueprints. Last month's practice project was created almost entirely on the GPD, and apart from the controls (using blueprints with analog sticks...), there wasn't much to complain about.

We could talk about what's actually in the GPD, but we think it's irrelevant - it's basically a 64 bit PC, and Windows PCs are known for their customization options, therefore any issues can ultimately be solved. We were positively surprised how agile the Unreal Engine is, and how much work can be done using such a tiny machine.

So we can work even on the go! Yay... I guess?

With that said, Happy Halloween! (The Halloween spirit evaded us this year)


Keeping the beat in Unreal 4!

We're really looking forward to not beginning our posts with this: we've been very busy this month, and had almost no time for our personal projects.

That's only almost, though. We did have time for an approximately 3-day practice project in Unreal 4 (about 1.5 days of it were done on a GPD Win, which, impressively, can run Unreal 4 and is also portable, so we could work on it during a long trip). I (Lussy) have finally figured out a way to bind things to a soundwave's position (in seconds) in Blueprints, so we made a replica of the rhythm game Taiko. Because we didn't have to come up with new gameplay, only implement the existing mechanics, we've been messing around with binding things to BPM and working on sickeningly colorful graphics.
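We can't show the Blueprint graph itself, but the core trick - judging everything off the audio playback position instead of accumulating tick deltas - can be sketched like this (the timing windows are invented example values, not what our project uses):

```python
# Hit judgement driven purely by the song's playback position in
# seconds, never by frame ticks, so a framerate dip can't drift
# the timing. Window sizes below are made-up examples.

def beat_time(beat_index, bpm, offset=0.0):
    """Time in seconds at which a given beat falls."""
    return offset + beat_index * 60.0 / bpm

def judge(hit_time, target_time, perfect=0.05, good=0.12):
    """Grade a player input against a note's target time."""
    error = abs(hit_time - target_time)
    if error <= perfect:
        return "perfect"
    if error <= good:
        return "good"
    return "miss"
```

Because both the note times and the input times come from the same audio clock, a dropped frame only delays when the result is *displayed*, not what the result *is*.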

Excuse me for the sub-par play, it was late and oh yeah we had to show off the 'Miss' particle effects!

The song and beatmap in the video are from one of the original Taiko games, but we used this osu file to bring them into our project by exporting the contents to .csv and importing it into a data table.
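The import path - hit objects dumped to .csv, then read into rows a UE4 data table can ingest - is straightforward; a sketch with a guessed two-column layout (real osu files store considerably more per hit object) might look like:

```python
# Sketch of loading an exported beatmap .csv into (seconds, note type)
# tuples, the shape a data-table row would take. The column layout
# here is a simplified guess for illustration.

import csv
import io

SAMPLE_CSV = """time_ms,type
500,don
750,ka
1000,don
"""

def load_beatmap(text):
    rows = csv.DictReader(io.StringIO(text))
    return [(int(r["time_ms"]) / 1000.0, r["type"]) for r in rows]

beatmap = load_beatmap(SAMPLE_CSV)
```

Converting to seconds at import time keeps the gameplay code in one unit, matching the soundwave-position binding.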

We tried to only include really light, unlit-only graphics that wouldn't impact gameplay. We even thought about only using 2D sprites or just widgets, but that just didn't seem right for an Unreal project. In any case, the gameplay stays stable even if the framerate dips. I can't overstate this: we're REALLY happy about the whole thing syncing up, and we never had to use ticks. I've been trying for years to accomplish this.

So where to from here? This is a really low-priority, tiny project, but we have a few ideas for gameplay mechanics of our own that we're going to replace the Taiko mechanics with. Only the timing mechanism will stay. Until then: we managed to actually sync things up to a given song and rhythm without the framerate messing things up! Woooo!


Summer's over already!...


We're going through a pretty rough transitional period at the moment. We did find time to write this article for Sketchfab, however, so enjoy:

This is related to the Fisheye Placebo fanart lipsync scene we've done. We show off some making of pics in the article, and describe our workflow.

There's also this little test we've cooked up, it's an Unreal 4 Blueprint AI system with very basic pathfinding and a few interactive objects.

The interactive objects work via a point-of-interest system. First, we created an offset for the head that follows the point of interest the character is closest to (or that has more influence). After that, we added attractors and repellers as child blueprints of the PoIs. The attractor has a chance to summon nearby characters to itself and make them clap, while the repeller can explode and kill characters caught in the blast, and/or send the rest fleeing and cowering in fear. These are all pretty rough and basic, but functional nonetheless. We spent about a day working on them as a tech demo.
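The "closest, or more influential" pick above is just a weighted falloff; a toy sketch (the falloff formula and all names/values are invented for illustration - the real thing lives in Blueprints):

```python
# Sketch of the point-of-interest pick: each PoI's pull on a character
# falls off with distance, and the head-look offset tracks whichever
# PoI currently pulls hardest. Falloff and values are illustrative.

def strongest_poi(character_pos, pois):
    """pois: list of (name, (x, y), influence). Returns the winning name."""
    def pull(poi):
        name, pos, influence = poi
        dist = ((pos[0] - character_pos[0]) ** 2
                + (pos[1] - character_pos[1]) ** 2) ** 0.5
        return influence / (1.0 + dist)  # stronger when closer
    return max(pois, key=pull)[0]

pois = [
    ("campfire", (0.0, 0.0), 1.0),
    ("explosion", (10.0, 0.0), 5.0),
]
```

A high-influence PoI like the repeller's explosion can win a character's attention even from farther away, which matches the behavior described above.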

Oh yeah, and the assets are from World of Warcraft.