The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing.
I'm actually working on something similar and wondering the same thing.
My model is imported from a USDZ file with a list of animations (walk, idle, etc.), which I play with, e.g., entity.playAnimation(animations[index], transitionDuration: 0.2, startsPaused: false)
I can programmatically manipulate joints such as the neck or jaw to adjust the model, like this:
// Inputs: mouthOpen: Float in 0...1, basePose: the skeleton's rest pose,
// skeletalComponent: the entity's SkeletalPosesComponent
let target = "Root_M/.../Jaw_M"
var newPose = basePose
guard let index = newPose.jointNames.firstIndex(of: target) else { return }
let baseTransform = basePose.jointTransforms[index]

// Rotate the jaw around its local z-axis by up to 40 degrees
let maxAngle: Float = 40
let angle: Float = maxAngle * mouthOpen * (.pi / 180)
let extraRot = simd_quatf(angle: angle, axis: simd_float3(x: 0, y: 0, z: 1))
newPose.jointTransforms[index] = Transform(
    scale: baseTransform.scale,
    rotation: baseTransform.rotation * extraRot,
    translation: baseTransform.translation
)

// Write the modified pose back to the entity
skeletalComponent.poses.default = newPose
creatureMeshEntity.components.set(skeletalComponent)
I also plan on making the head look at a specific point by manually setting the neck or eye joint rotations.
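For the look-at part, a minimal sketch of the rotation math I have in mind, assuming the joint's rest orientation points its forward axis along +z in joint-local space (this varies per rig) and glossing over the world-to-joint-space conversion; the function name and parameters are placeholders of mine:

```swift
import simd

// Compute a shortest-arc rotation that turns the joint's rest forward
// direction toward a target point. The result would be multiplied into the
// joint's base rotation, the same pattern as the jaw example above.
func lookAtRotation(jointPosition: SIMD3<Float>,
                    targetPosition: SIMD3<Float>,
                    restForward: SIMD3<Float> = [0, 0, 1]) -> simd_quatf {
    let toTarget = normalize(targetPosition - jointPosition)
    // simd_quatf(from:to:) gives the shortest rotation between two unit vectors
    return simd_quatf(from: restForward, to: toTarget)
}
```

Both positions would need to be expressed in the same (joint-local) space before computing the rotation, which I've left out here.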
The problem is that playing an animation via entity.playAnimation() overwrites the joint transforms, which blocks programmatic rotation of the joints.
Playing a character's walk/idle animation while making them look at a specific spot is a pretty common use case, isn't it?
Topic:
Graphics & Games
SubTopic:
RealityKit