I've been working on a *rapid* way to edit facial performance capture recorded with the Live Link Face App from Epic Games for UE4. After a few days of experimentation, I arrived at a solution using Blueprint integration with Sequencer. I'll be expanding the system with a UI at some point. Note that this is not a traditional face rig.
On the LEFT is the raw data streaming into UE4 via the Live Link Face App, which uses the Apple ARKit blendshapes.
On the RIGHT is the same take after a few minutes of work with the Blueprint and Sequencer to refine the performance.
There's still a lot of work to be done on the refinement side; in some cases I think it went a bit overboard... but this is the first test shot and export, and I wanted to share :)
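For anyone curious about the general idea behind this kind of refinement: the raw ARKit blendshape weights arriving over Live Link are per-frame values between 0 and 1, and they tend to be jittery. The Blueprint/Sequencer setup in the video is visual, so as a rough stand-in, here's a small standalone Python sketch of one common cleanup pass, a centered moving average over a noisy curve. All names here (the function, the sample "jawOpen" data) are illustrative, not part of any UE4 API:

```python
# Illustrative sketch only -- the actual cleanup shown in the video is done
# with Blueprints and Sequencer inside UE4. This just demonstrates the idea
# of smoothing a jittery blendshape weight curve (per-frame values in 0..1)
# of the kind the Live Link Face App streams in.

def smooth_curve(values, window=5):
    """Smooth a list of per-frame blendshape weights with a centered moving average."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        lo = max(0, i - half)          # clamp the window at the start of the take
        hi = min(len(values), i + half + 1)  # ...and at the end
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

# A made-up, jittery "jawOpen" curve from a raw capture:
raw_jaw_open = [0.10, 0.42, 0.15, 0.48, 0.20, 0.50, 0.22, 0.47]
clean_jaw_open = smooth_curve(raw_jaw_open, window=3)
```

In practice you'd also want per-channel gain/offset controls and the ability to zero out individual blendshapes, which is the kind of thing the Sequencer-driven approach makes fast to iterate on.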
Awesome stuff, Deepak. Will you be making a course on this for UE4? I just finished your Independent Filmmaking course and it blew my mind. Looking forward to more of your courses!