BlendShapes don’t animate while playing animation in RealityKit

Hi everyone,

I’m running into an issue with RealityKit when trying to animate BlendShapes (ShapeKeys) while a skeletal animation is playing. The model is a rigged character in .usdz format with both predefined skeletal animations and BlendShapes (exported from Blender).

The problem: when I play any animation using entity.playAnimation(...), the BlendShapes stop responding. Calling setBlendShapes(...) still logs that the weights are being updated, but no visual change occurs.

The exact same blend shape animation works perfectly when no animation is playing.
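
For context, setBlendShapes(...) is a small helper of mine, not a RealityKit API. Its core boils down to something like this sketch (orderedWeights is an illustrative name): turn a name-to-weight dictionary into the full weight array, ordered to match the mesh's blend-shape names and clamped to the valid range.

```swift
// Simplified sketch of the logic behind a setBlendShapes(...)-style helper.
// Given the mesh's ordered blend-shape names and a name-to-weight map,
// build the full weight array, clamping each weight to 0...1.
// (orderedWeights is an illustrative name, not RealityKit API.)
func orderedWeights(shapeNames: [String],
                    updates: [String: Float]) -> [Float] {
    shapeNames.map { name in
        let weight = updates[name] ?? 0   // untouched shapes stay at 0
        return min(max(weight, 0), 1)
    }
}
```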

In SceneKit the same model works as expected: shape keys animate during animation playback. In RealityKit they do not.

Still, as soon as an animation starts, the shape keys stop animating.

Here’s the test project on GitHub that demonstrates the issue clearly: https://github.com/IAMTHEBURT/RealityKitWitnBlendShapesSample

The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing.

Is this a known limitation in RealityKit? Or is there a recommended way to combine skeletal animations with real-time BlendShape updates?

Thanks in advance for any insights.

The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing.

I'm actually working on something similar and wondering the same thing.

  • Model imported from .usdz with a list of animations (walk, idle, etc.), played with e.g. entity.playAnimation(animations[index], transitionDuration: 0.2, startsPaused: false)
  • I can programmatically manipulate joints like the neck or jaw to adjust the model, by doing:
// input variable mouthOpen: Float (0 = closed, 1 = fully open)
let target = "Root_M/.../Jaw_M"

var newPose = basePose
guard let index = newPose.jointNames.firstIndex(of: target) else { return }

// Rotate the jaw joint around its local z-axis by up to maxAngle degrees.
let baseTransform = basePose.jointTransforms[index]
let maxAngle: Float = 40
let angle: Float = maxAngle * mouthOpen * (.pi / 180)
let extraRot = simd_quatf(angle: angle, axis: simd_float3(x: 0, y: 0, z: 1))
newPose.jointTransforms[index] = Transform(
    scale: baseTransform.scale,
    rotation: baseTransform.rotation * extraRot,
    translation: baseTransform.translation
)
skeletalComponent.poses.default = newPose
creatureMeshEntity.components.set(skeletalComponent)
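
The angle math above can be pulled into a small standalone helper, which also clamps mouthOpen so the jaw can't over-rotate (jawAngle is an illustrative name of mine, not an API):

```swift
// Map a normalized mouth-open amount (0...1) to a jaw rotation in radians,
// capped at maxAngleDegrees. Out-of-range input is clamped.
func jawAngle(mouthOpen: Float, maxAngleDegrees: Float = 40) -> Float {
    let t = min(max(mouthOpen, 0), 1)
    return maxAngleDegrees * t * (.pi / 180)
}
```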

I also plan on making the head look at a specific point by manually setting the neck or eye joints rotation.
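
The look-at rotation itself can be derived with plain trigonometry before wrapping it in a quaternion. A sketch (lookAtAngles and the axis convention of +z forward / +y up are my own assumptions about the rig, not an API):

```swift
import Foundation

// Yaw (around y) and pitch (around x) angles, in radians, that aim a joint's
// +z forward axis at a direction expressed in the joint's local space.
func lookAtAngles(dx: Float, dy: Float, dz: Float) -> (yaw: Float, pitch: Float) {
    let yaw = atan2(dx, dz)
    // Negative sign: with a right-handed +x axis, looking up corresponds to a
    // negative rotation of the +z forward vector toward +y.
    let pitch = atan2(-dy, (dx * dx + dz * dz).squareRoot())
    return (yaw, pitch)
}
```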

The problem is that playing an animation via entity.playAnimation() overwrites the joint transforms, which blocks any programmatic joint rotation.

Playing a character's walk/idle animation while making them look at a specific spot is a pretty common use case, isn't it?

The docs on AnimationGroup say:

If two animations on the same property overlap durations at runtime, the one that the framework processes second overwrites the first.

Does that mean I'd have to edit the animation in the .usdz using Blender so it doesn't touch the jaw or neck joints? Then I should be able to animate the jaw/neck alongside the idle animation from my asset using an AnimationGroup?
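
That overwrite rule can be illustrated with a toy model in plain Swift (not RealityKit API): each "clip" maps joint names to an animated value, and later clips win on overlapping joints, so two clips touching disjoint joint sets never conflict.

```swift
// Toy illustration of the documented rule: when two animations target the
// same property, the one processed second overwrites the first.
func applyClips(_ clips: [[String: Float]]) -> [String: Float] {
    var pose: [String: Float] = [:]
    for clip in clips {
        pose.merge(clip) { _, newer in newer }   // later clip wins
    }
    return pose
}
```

So as long as the idle clip never writes the jaw or neck joints, a second clip (or a manual pose update) can drive them freely.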

I've only been able to get simultaneous animations playing on a rigged model by using hard-coded FromToByAnimations, ensuring the animations don't collide in terms of which joints they animate, and then passing a blend layer offset to each animation group when calling playAnimation. I haven't tried this approach with blend shapes from Blender, though.
