Post

Replies

Boosts

Views

Activity

Reply to Anchoring a Prim to the face doesn't work
So it seems just putting the line token preliminary:anchoring:type = "face" in was not enough. What worked was writing a Python script (from pxr import Usd, Sdf, UsdGeom, Kind) that creates a scene hierarchy /Root/Scenes/Scene/Children/MyModel, where the Scene prim gets the anchoring token. The USD class reference (https://graphics.pixar.com/usd/docs/api/class_usd_stage.html) helped a little, but it was way more complicated than I expected it to be.
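For reference, a minimal sketch of the layer such a script could produce; the reference path MyModel.usdc and the prim types are assumptions on my part, but the hierarchy and the anchoring token match what's described above:

```usda
#usda 1.0
(
    defaultPrim = "Root"
)

def Xform "Root"
{
    def Xform "Scenes"
    {
        def Xform "Scene"
        {
            # anchor this scene to a detected face
            token preliminary:anchoring:type = "face"

            def Xform "Children"
            {
                def "MyModel" (
                    prepend references = @MyModel.usdc@
                )
                {
                }
            }
        }
    }
}
```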
Topic: App & System Services SubTopic: General Tags:
Jun ’21
Reply to SAFARI 15.2 WEBGL performance disaster
Seeing the same issue here. Our test scene used to render at 60 fps back in April 2021; now, without changing any code, it's at 2 fps and my whole MacBook Air freezes. Test scene: https://test.looc.io/forest/index.html
Topic: Safari & Web SubTopic: General Tags:
Feb ’22
Reply to Safari, JavaScript and WebGL extremely bad performance?
We're seeing the same issue in all our WebGL applications. It definitely worked in April 2021; since then, without changing any code, the Safari performance went from 60 fps to 2 fps. Example scene built in Unity (with a camera texture, since camera input is important for our AR applications): https://test.looc.io/forest/index.html
Topic: Safari & Web SubTopic: General Tags:
Feb ’22
Reply to Opening a new terminal window or tab is extremely slow
Turns out Homebrew seems to have spammed hundreds of copies of the line eval "$(/opt/homebrew/bin/brew shellenv)" into my ~/.zprofile, which I didn't even know existed.
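In case someone wants to check their own dotfiles, here's a small sketch (Python, standard library only; the helper name is my own) that counts duplicated lines in a file like ~/.zprofile:

```python
from collections import Counter

def duplicated_lines(text, threshold=2):
    """Return (line, count) pairs for non-empty lines appearing `threshold` or more times."""
    counts = Counter(line.strip() for line in text.splitlines() if line.strip())
    return [(line, n) for line, n in counts.most_common() if n >= threshold]

# Example: scan your own ~/.zprofile
# from pathlib import Path
# print(duplicated_lines((Path.home() / ".zprofile").read_text()))
```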
Topic: App & System Services SubTopic: Core OS Tags:
Jun ’22
Reply to How to disable wide-angle for ARKit Face recognition?
So, now that we have an iPad 5th Generation with the 12 MP camera, I can see the problem more clearly. Basically, what we'd need is for the ARSCNView to also have the Center Stage function that Apple offers in FaceTime. The full wide-angle camera image is too big.
Topic: Spatial Computing SubTopic: ARKit Tags:
Jul ’22
Reply to Crash in `outlined init with copy of` when run in Release Mode
PS: If we remove one of the fields from Resource, say delete the authToken field, the method no longer crashes. I believe that reduces the size of the struct Resource to the point where the init method is no longer outlined.
Topic: Programming Languages SubTopic: Swift Tags:
Aug ’22
Reply to Crash in `outlined init with copy of` when run in Release Mode
PS: If I change

enum HTTPMethod<Payload> {
    case get
    case post(Payload)
    case patch(Payload)
}

to

enum HTTPMethod<Payload> {
    case get(Payload)
    case post(Payload)
    case patch(Payload)
}

the problem also goes away. Filed a bug report using Feedback Assistant; there shouldn't be any reason why the .get case needs an associated value.
Topic: Programming Languages SubTopic: Swift Tags:
Aug ’22
Reply to Glass material in USDZ
I think it would help to specify more clearly which shader system you are using. ARKit is only the framework that matches your camera to the 3D scene, i.e. it handles the odometry etc. The materials are rendered by the 3D shaders in use; technically you can use ARKit with shaders from RealityKit, SceneKit, and MetalKit. Here's a nice writeup on their differences on Stack Overflow.
Topic: Spatial Computing SubTopic: ARKit Tags:
Dec ’22
Reply to Glass material in USDZ
PS: Your title says usdz, but you tagged it ARKit. Are you trying to show your object inside an app that uses ARKit? Or are you trying to show a usdz using QuickLook on a website?
Topic: Spatial Computing SubTopic: ARKit Tags:
Dec ’22
Reply to Model Guardrails Too Restrictive?
I had a similar experience in Beta 3: even questions like "What is the capital of France?" were hitting guardrails. I tried the same question with a number of real countries, and it was always guardrailed. Then I tried Gondor and Westeros, and for those fictional countries the model sent a response. I'm assuming mentioning real country names must have triggered guardrails against political topics. As of Beta 4, my test questions for capitals work for both real and fictional countries.
Jul ’25
Reply to [26] audioTimeRange would still be interesting for .volatileResults in SpeechTranscriber
Turns out it was my bad: I had a bug in looking through the runs of the AttributedString. I now found all the audioTimeRanges.
Topic: Media Technologies SubTopic: Audio Tags:
Jul ’25
Reply to There's wrong with speech detector ios26
SpeechAnalysisModule doesn't exist; the SpeechAnalyzer init parameter is called SpeechModule. Doing

let modules: [any SpeechModule] = [detector, transcriber]

also doesn't work, since it's obviously: Cannot convert value of type 'SpeechDetector' to expected element type 'any SpeechModule'. This compiles and runs:

let detector = SpeechDetector(detectionOptions: SpeechDetector.DetectionOptions(sensitivityLevel: .medium), reportResults: true)
let modules: [any SpeechModule] = [detector as! (any SpeechModule), transcriber]
let analyzer = SpeechAnalyzer(modules: modules, options: SpeechAnalyzer.Options(priority: .high, modelRetention: .processLifetime))

but honestly, I see no difference with or without the detector. Actually testing the results via:

let detector = SpeechDetector(detectionOptions: SpeechDetector.DetectionOptions(sensitivityLevel: .medium), reportResults: true)
let modules: [any SpeechModule] = [detector as! (any SpeechModule), transcriber]
Task {
    for try await result in detector.results {
        print("result: \(result.description)")
    }
}

also doesn't yield any log lines, so I think that while force-casting it to SpeechModule doesn't make the app crash, the detector is just ignored.
Topic: Media Technologies SubTopic: Audio Tags:
Jul ’25
Reply to There's wrong with speech detector ios26
Thank you Greg! @DTS Engineer When I use it with the retroactive protocol conformance, it seems to work, but I never see any results (for reportResults: true). When I try:

let detector = SpeechDetector(detectionOptions: SpeechDetector.DetectionOptions(sensitivityLevel: .medium), reportResults: true)
if analyzer == nil {
    analyzer = SpeechAnalyzer(modules: [detector, transcriber], options: SpeechAnalyzer.Options(priority: .high, modelRetention: .processLifetime))
}
Task {
    for try await result in detector.results {
        print("result: \(result.description)")
    }
}

I never see any of the result prints in the log, while the transcription works fine. Is detector.results supposed to be used like that, and if so, does it show any results for others?
Topic: Media Technologies SubTopic: Audio Tags:
Jul ’25
Reply to BlendShapes don’t animate while playing animation in RealityKit
The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing. I'm actually working on something similar, wondering the same thing. My model is imported from usdz with a list of animations (walk, idle, etc.), e.g.

entity.playAnimation(animations[index], transitionDuration: 0.2, startsPaused: false)

I can manipulate joints for the neck or jaw programmatically to adjust the model by doing:

// input variable mouthOpen: Float
let target = "Root_M/.../Jaw_M"
var newPose = basePose
guard let index = newPose.jointNames.firstIndex(of: target) else { return }
let baseTransform = basePose.jointTransforms[index]
let maxAngle: Float = 40
let angle: Float = maxAngle * mouthOpen * (.pi / 180)
let extraRot = simd_quatf(angle: angle, axis: simd_float3(x: 0, y: 0, z: 1))
newPose.jointTransforms[index] = Transform(
    scale: baseTransform.scale,
    rotation: baseTransform.rotation * extraRot,
    translation: baseTransform.translation
)
skeletalComponent.poses.default = newPose
creatureMeshEntity.components.set(skeletalComponent)

I also plan on making the head look at a specific point by manually setting the neck or eye joint rotations. The problem is that playing an animation via entity.playAnimation() will overwrite the joint transforms and so block the programmatic rotation of joints. Playing a character's walk/idle animation while making them look at a specific spot is a pretty common use case, isn't it?
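For anyone puzzling over the rotation math above, here is a framework-agnostic sketch of the same computation in plain Python; the quaternion helpers are my own stand-ins for simd_quatf, with quaternions stored as (w, x, y, z) and composed by the Hamilton product, as RealityKit's * operator does.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax * ax + ay * ay + az * az)
    s = math.sin(angle / 2.0) / n
    return (math.cos(angle / 2.0), ax * s, ay * s, az * s)

def quat_mul(a, b):
    """Hamilton product a * b (apply b's rotation in a's local frame)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

def jaw_rotation(base_rotation, mouth_open, max_angle_deg=40.0):
    """Compose the base joint rotation with an extra z-axis rotation scaled by mouth_open (0..1)."""
    angle = max_angle_deg * mouth_open * math.pi / 180.0
    extra = quat_from_axis_angle((0.0, 0.0, 1.0), angle)
    return quat_mul(base_rotation, extra)
```

With mouth_open = 1.0 this yields the full 40° jaw rotation about z; with 0.0 it returns the base rotation unchanged.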
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’25
Reply to BlendShapes don’t animate while playing animation in RealityKit
The docs on AnimationGroup say: "If two animations on the same property overlap durations at runtime, the one that the framework processes second overwrites the first." That means I'll have to adjust the animation in the usdz using Blender so it does not use the jaw or neck joints? Then I should be able to animate the jaw/neck simultaneously with the idle animation from my asset, using AnimationGroup?
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’25