
How to mix Animation and IKRig in RealityKit
I want an AR character to be able to look at a position while still playing the character's animation. So far, I have managed to manually adjust a single bone rotation using

skeletalComponent.poses.default = Transform(
    scale: baseTransform.scale,
    rotation: lookAtRotation,
    translation: baseTransform.translation
)

which I run at every rendering update while a full-body animation is playing. But of course, hard-coding a single joint (in my case the head) to point in a direction does not look as nice as running inverse kinematics that includes the hips, neck, and head joints. I found some good IKRig code in "Composing interactive 3D content with RealityKit and Reality Composer Pro". But when I try to adjust the rig while animations are playing, the animations usually win over the IKRig changes to the mesh.
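One approach worth trying, sketched below under the assumption that an IKComponent has already been set up as in that sample: RealityKit's IK constraints expose an animationOverrideWeight, and raising it per frame lets the solver win over the sampled animation pose for just the constrained joints. The constraint name "headConstraint" is a placeholder for whatever your IKRig defines.

```swift
import RealityKit

// Hypothetical per-frame update (e.g. from a SceneEvents.Update subscription).
// Assumes `character` already has an IKComponent built from an IKResource.
func updateLookAt(for character: Entity, target: SIMD3<Float>) {
    guard var ik = character.components[IKComponent.self] else { return }

    // "headConstraint" is a placeholder; use the name defined in your rig.
    if let constraint = ik.solvers.first?.constraints["headConstraint"] {
        constraint.target.translation = target
        // 0 = the animation fully wins, 1 = IK fully wins for this constraint.
        constraint.animationOverrideWeight.position = 1.0
        constraint.animationOverrideWeight.rotation = 1.0
    }

    // Re-setting the component is what applies the modified constraints.
    character.components.set(ik)
}
```

The key detail is the override weight: with it left at the default, the animation pose is applied after the IK solve, which would match the "animation wins" behavior described above.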
Replies 1 · Boosts 0 · Views 534 · Aug ’25
Sometimes stream via AVPlayer plays Audio but Video is just grey
In our app, we stream short HLS streams to local AVPlayers. The view has an AVPlayerLayer connected to the AVPlayer we start, and usually we hear audio and see video. Sometimes, though, we see a grey screen instead of video while still hearing the audio. The AVPlayerLayer shows a shade of grey that is not a shade coming from our app; if, for testing purposes, we don't connect the player to the player layer, the grey color is absent, so it is definitely coming from the AVPlayerLayer. If this is somehow related to an HLS stream becoming corrupted, which APIs in AVFoundation could we use to debug it? So far, when this happens, we see no unusual catch blocks or errors being thrown in our debug logs.
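HLS-level failures often never surface as a thrown error; AVPlayerItem's error and access logs are the usual place they show up. A minimal sketch of wiring those up, assuming an already-playing AVPlayer:

```swift
import AVFoundation

// Sketch: attach diagnostics to an existing AVPlayer to catch silent
// video failures that don't throw or set `player.error`.
func attachDiagnostics(to player: AVPlayer) {
    guard let item = player.currentItem else { return }

    // Fired whenever a new entry lands in the HLS error log
    // (segment download failures, variant errors, etc.).
    NotificationCenter.default.addObserver(
        forName: AVPlayerItem.newErrorLogEntryNotification,
        object: item, queue: .main
    ) { _ in
        if let event = item.errorLog()?.events.last {
            print("HLS error:", event.errorStatusCode,
                  event.errorComment ?? "(no comment)")
        }
    }

    // The access log records which variant was selected, stall counts,
    // dropped frames, and observed bitrate.
    if let access = item.accessLog()?.events.last {
        print("indicated bitrate:", access.indicatedBitrate,
              "stalls:", access.numberOfStalls)
    }
}
```

Checking whether the grey-screen case coincides with error-log entries (or a variant switch in the access log) should narrow down whether the stream or the layer is at fault.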
Replies 0 · Boosts 0 · Views 1.2k · Mar ’21
'ModuleNotFoundError: No module named '_usd'' when importing USD
So, I'm using a small Python script that imports

from pxr import Usd, Sdf, UsdGeom, Kind

which used to work fine a few months ago. I got the USD tools from https://developer.apple.com/augmented-reality/tools/ and always had to run them in a Rosetta terminal (plus making sure the USD tools are in PATH and PYTHONPATH), but it worked. Now, whatever I do, I always get

  File "create_scene.py", line 2, in <module>
    from pxr import Usd, Sdf, UsdGeom, Kind
  File "/Applications/usdpython/USD/lib/python/pxr/Usd/__init__.py", line 24, in <module>
    import _usd
ModuleNotFoundError: No module named '_usd'

Does anyone have any clues what this _usd is about?
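_usd is the compiled C extension module behind the pxr.Usd Python package, and this error usually means Python could not load that binary, commonly an architecture mismatch (e.g. a native arm64 Python trying to load the x86_64 .so outside Rosetta). A few diagnostic checks, assuming the default /Applications/usdpython install location:

```shell
# Which architecture is the compiled extension built for?
file /Applications/usdpython/USD/lib/python/pxr/Usd/_usd*.so

# Which architecture is the Python you're running? (should match the .so)
python3 -c "import platform; print(platform.machine())"

# Force an x86_64 process under Rosetta and retry the import
export PYTHONPATH="/Applications/usdpython/USD/lib/python:$PYTHONPATH"
arch -x86_64 python3 -c "from pxr import Usd; print(Usd.GetVersion())"
```

If the machine reports arm64 while the .so is x86_64-only, a Python update (which often resets the default architecture) would explain why a previously working setup broke.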
Replies 0 · Boosts 0 · Views 1.5k · Jan ’22
Transmission property for transparent materials ("frosted")
Today I was surprised that the latest version of Three.js (a browser-based 3D engine) actually supports a transmission property for transparent materials that looks pretty good and is really performant. So far I have only used this material property when rendering in Blender, not in a realtime engine. But if Three.js can run this at 60 fps in the browser, there must be a way of achieving the same in ARKit/SceneKit/RealityKit? I just haven't found anything in the SCNMaterial documentation about transmission, nor anyone mentioning relevant shader modifiers. Especially for presenting objects with frosted glass effects in AR, this would be super useful. (The Three.js app for reference: https://q2nl8.csb.app/; I learned about this in an article by Kelly Mulligan: https://tympanus.net/codrops/2021/10/27/creating-the-effect-of-transparent-glass-and-plastic-in-three-js/)
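RealityKit has no transmission parameter either, but a rough frosted-glass approximation is possible with a transparent PBR material. This is only a visual approximation under those constraints, not real refraction or background blur:

```swift
import RealityKit
import UIKit

// Approximation only: a semi-transparent PBR material with moderate
// roughness and a clearcoat layer reads as frosted glass, though nothing
// behind it is actually refracted or blurred.
var frosted = PhysicallyBasedMaterial()
frosted.baseColor = .init(tint: .white)
frosted.blending = .transparent(opacity: .init(floatLiteral: 0.35))
frosted.roughness = 0.45                      // more roughness = more "frost"
frosted.metallic = 0.0
frosted.clearcoat = .init(floatLiteral: 1.0)  // glossy outer layer

let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [frosted]
)
```

True transmission like Three.js's (which renders the scene behind the object into a texture and samples it with blur) would need a custom shader with access to the framebuffer, which these Apple engines don't expose through their standard material APIs.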
Replies 0 · Boosts 0 · Views 1.5k · Dec ’22
RealityView doesn't free up memory after disappearing
Basically, take just the Xcode 26 AR App template, with the ContentView as the detail end of a NavigationStack. On opening, the app uses < 20 MB of memory. Tapping Open AR raises memory usage to ~700 MB for the AR scene. Tapping back, the memory stays at ~700 MB. Checking with the Debug Memory Graph, I can still see all the RealityKit classes in memory, such as ARView, ARRenderView, and ARSessionManager. Here's the sample app to illustrate the issue. PS: To keep memory pressure on the system low, there should be a way of freeing all the memory AR uses, for apps that only occasionally show AR scenes.
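A workaround that sometimes helps, sketched here assuming the template's view still reaches ARKit through a UIViewRepresentable-wrapped ARView: tear the AR session and scene down explicitly when SwiftUI discards the view, rather than relying on deallocation.

```swift
import SwiftUI
import RealityKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // ... scene and session setup as in the template ...
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    // Called when SwiftUI discards the view; release AR resources
    // explicitly instead of waiting for deinit.
    static func dismantleUIView(_ uiView: ARView, coordinator: ()) {
        uiView.session.pause()
        uiView.scene.anchors.removeAll()
        uiView.removeFromSuperview()
    }
}
```

If the RealityKit objects still appear in the memory graph after this, that points at a retain cycle inside the framework rather than in app code, which is worth a feedback report.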
Replies 0 · Boosts 1 · Views 189 · Sep ’25
ARCoachingOverlayView replacement for RealityView
I thought ARCoachingOverlayView was a nice touch, so that each app's ARKit coaching was recognizable, and I used it in my ARView/ARSCNView-based apps. Now, with RealityView, is there any replacement planned? Or should we just wrap ARCoachingOverlayView in a UIViewRepresentable?
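Absent an announced replacement, the UIViewRepresentable route is straightforward. A minimal sketch, assuming you have access to the underlying ARSession (on iOS, where RealityView doesn't hand one out directly, you may need to run your own):

```swift
import SwiftUI
import ARKit

// Sketch: wrap ARCoachingOverlayView so it can sit in a ZStack
// over a RealityView. The session is assumed to be managed elsewhere.
struct CoachingOverlay: UIViewRepresentable {
    let session: ARSession
    var goal: ARCoachingOverlayView.Goal = .horizontalPlane

    func makeUIView(context: Context) -> ARCoachingOverlayView {
        let overlay = ARCoachingOverlayView()
        overlay.session = session
        overlay.goal = goal
        overlay.activatesAutomatically = true
        return overlay
    }

    func updateUIView(_ uiView: ARCoachingOverlayView, context: Context) {
        uiView.goal = goal
    }
}
```

With activatesAutomatically left on, the overlay shows and hides itself based on session tracking state, same as in the ARView days.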
Replies 1 · Boosts 0 · Views 533 · Sep ’25