Reality Composer Pro


Prototype and produce content for AR experiences using Reality Composer Pro.


Posts under Reality Composer Pro subtopic


How to get multiple animations into USDZ
Most models are only available as glb or fbx, so I usually re-export them to USDZ using Blender. When I import them into Reality Composer Pro, the mesh, textures, etc. look great, but in the Animation Library subsection all I can see is one "default subtree animation". In Blender I can see all available animations and play them individually. The default subtree animation just plays the default idle animation. In fact, when I open the nonlinear animation view in Blender and select a different animation as the default, the exported USDZ shows the newly selected animation as the default subtree animation. I can see in the Apple sample apps that models can have multiple animations in their Animation Library. I'm using the latest Blender 4.5, so shouldn't the USDZ exporter be working properly?
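A quick way to check what the exporter actually wrote is to load the USDZ in RealityKit and list its animations; a minimal diagnostic sketch, assuming the file ships in the app bundle as "model.usdz" (hypothetical name):

import RealityKit

// List whatever animations survived the Blender export. If this prints only
// one entry, the problem is in the USDZ itself, not in Reality Composer Pro.
func dumpAnimations() async throws {
    let entity = try await Entity(named: "model")
    print("Animations found: \(entity.availableAnimations.count)")
    for animation in entity.availableAnimations {
        print("-", String(describing: animation.name))
    }
}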
3 replies · 1 boost · 695 views · Oct ’25
What's in Reality Composer Pro 26
That title would have made a great WWDC session. Unfortunately, it seems like nothing is new in Reality Composer Pro this year. I've noticed that all versions of the Xcode beta this summer have shipped with Reality Composer Pro version 2.0. There have been slight bumps in the build number, but I haven't found any new features or seen any documentation to indicate that anything has changed. So the question is: what is the state of Reality Composer Pro? Should we continue to use this tool, or start doing everything in code? A huge number of sample projects use Reality Composer Pro, so it seems like Apple is still using it even if they didn't update it this year.
1 reply · 3 boosts · 495 views · Aug ’25
Exporting .reality files from Reality Composer Pro
I've been using the macOS Xcode Reality Composer to export interactive .reality files that can be hosted on the web and linked to, triggering Quick Look to open the interactive AR experience. That works really well. I've just downloaded the Xcode 15 beta, which ships with the new Reality Composer Pro, and I can't see a way to export .reality files anymore. It seems that this is only for building content that ships inside native iOS etc. apps, rather than content that can be viewed in Quick Look. Am I missing something, or is it no longer possible to export .reality files? Thanks.
3 replies · 2 boosts · 2k views · Jul ’25
Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
Hi, I'm very new to 3D and am currently porting a SwiftUI iOS app to visionOS 2.0. I saw WWDC24 feature Blender in multiple spatial videos, and I have begun integrating Blender models and animations into my visionOS app (I would also like to integrate skeletons and programmatic rigging, but more on that later). I'm wondering if there are best practices for this workflow, from Blender to USD to RCP 2.0 to visionOS 2 in Xcode. I've cobbled together the following, which has some obvious holes:

I've been able to find some pre-rigged and pre-animated models online that can serve as a great starting point. As a reference, here is a free model from Sketchfab, a simple rigged skeleton with 6 built-in animations: https://sketchfab.com/3d-models/skeleton-character-low-poly-8856e0138f424d68a8e0b40e185951f6

When exporting to USD from Blender, I haven't been able to export more than one animation per USD file. Is there a workflow to export multiple animations in a single USDC file, or is this just not possible? As a temporary workaround, here is a Python script I've been using to loop through all Blender animations and export a model for each animation:

import bpy
import os

# Set the directory where you want to save the USD files
output_directory = "/path/to/export"

# Ensure the directory exists
if not os.path.exists(output_directory):
    os.makedirs(output_directory)

# Function to export the current scene as USD
def export_scene_as_usd(output_path, start_frame, end_frame):
    bpy.context.scene.frame_start = start_frame
    bpy.context.scene.frame_end = end_frame
    # Export the scene as a USD file
    bpy.ops.wm.usd_export(
        filepath=output_path,
        export_animation=True
    )

# Save the current scene name
original_scene = bpy.context.scene.name

# Iterate through each action and export it as a USD file
for action in bpy.data.actions:
    # Create a new scene for each action
    bpy.context.window.scene = bpy.data.scenes[original_scene].copy()
    new_scene = bpy.context.scene

    # Link the action to all relevant objects
    for obj in new_scene.objects:
        if obj.animation_data is not None:
            obj.animation_data.action = action

    # Determine the frame range for the action
    start_frame, end_frame = action.frame_range

    # Export the scene as a USD file
    output_path = os.path.join(output_directory, f"{action.name}.usdc")
    export_scene_as_usd(output_path, int(start_frame), int(end_frame))

    # Delete the temporary scene to free memory
    bpy.data.scenes.remove(new_scene)

print("Export completed.")

I have also been able to successfully export rigging armatures as a single skeleton, with each "bone" getting imported into Reality Composer Pro 2.0 when exporting/importing manually.

I would like to have all of these animations available in a single scene to be used in a RealityView in visionOS, so I have placed all the animation models in an RCP scene and created named Timeline Action animations for each, showing the correct model and hiding the rest when triggering specific animations. I apply materials/textures to each using Shader Graph so they appear the same. Then in SwiftUI I use notifications (as shown here: https://forums.developer.apple.com/forums/thread/756978) to trigger each RCP Timeline Action animation from code; a sketch of that notification call appears at the end of this post.

Two questions:

1. Is there a better way than having multiple models of the same skeleton, each with a different animation, in a scene in order to trigger multiple animations? Or would this require recreating the Blender animations using skeleton rigging and keyframes from within RCP Timelines?

2. If I want to programmatically create custom animations and move parts of the skeleton/armatures, do I need to do this by defining custom components in RCP, using IKRig, and defining the movement of each of the "bones" in Xcode?

I'm looking for any tips/tricks/workflows from experienced engineers or 3D artists that could make for a more efficient/optimized workflow using Blender, USD, RCP 2, and visionOS 2 with SwiftUI. Thanks so much, I appreciate any help! I am very excited about all the new tools that keep evolving to make spatial apps really fun to build!
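For readers who haven't opened the linked thread, the notification call referenced above looks roughly like this; a minimal sketch, where "SkeletonWalk" is a hypothetical Timeline identifier configured in RCP:

import Foundation
import RealityKit

// Post the notification that an RCP Behaviors component's OnNotification
// trigger listens for. The two userInfo keys are the documented trigger keys;
// the identifier string is a placeholder for whatever is set in RCP.
func triggerTimeline(on entity: Entity, identifier: String = "SkeletonWalk") {
    guard let scene = entity.scene else { return }
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}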
4 replies · 2 boosts · 1.2k views · Apr ’25
How to trigger actions by OnCollision in Behaviors Component
It's all about notifications to trigger actions from RCP's new Timeline system. After watching Compose interactive 3D content in Reality Composer Pro, I am actually starting to wonder why there was a need to use Entity.applyTapForBehaviors in code to trigger content in a Behaviors component, given that in the Behaviors component we have already chosen OnTap to allow a "Tap Notification" to trigger our action (on a selected target object). By that logic, after selecting the OnCollision trigger I should write something like CollisionEvent.entityA.applyCollisionForBehaviors, which we don't have. And of course the collision on my object won't trigger the action (because I only set things up in RCP, not in code). Setting aside for now that this post has pointed out we could use the Behaviors component's OnNotification trigger: I found that I could still use the OnTap trigger by calling Entity.applyTapForBehaviors inside my subscription to the collision's begin event. That actually works better than OnCollision (see the sketch below). So what are the design principles here? And how could I trigger a collision notification so that my Behaviors component's OnCollision trigger actually works?
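A minimal sketch of that workaround, assuming a RealityView whose content is in scope and an entity tappableEntity (hypothetical) carrying both a CollisionComponent and a Behaviors component with an OnTap trigger:

import RealityKit

// Keep a strong reference to the subscription, e.g. in app state.
var collisionSubscription: EventSubscription?

// Inside RealityView { content in ... }:
collisionSubscription = content.subscribe(to: CollisionEvents.Began.self, on: tappableEntity) { event in
    // Reuse the OnTap trigger: fire the entity's tap behaviors when a
    // collision begins, since there is no applyCollisionForBehaviors.
    event.entityA.applyTapForBehaviors()
}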
1 reply · 2 boosts · 630 views · Mar ’25
Multiple-frames BlendShape (failed) Animation in Reality Composer Pro
Goal: to render, in an Apple Vision Pro app, the solid-mechanics 3D simulation results coming from an FEA code.

Starting point: I have surface VTKs with deformations on each node. Each time step has a mesh with the nodal coordinates. This is straightforwardly translatable to a USD MeshSequence. Unfortunately, the results cannot be simplified to a scaling or linear transformation, as you would do with other game-oriented animations.

Tools: right now, I am using Xcode and Reality Composer Pro (RCP) to build the scenes.

Technical limitations: I am aware that RCP can do animations with BlendMesh and skeletons, and that MeshSequence is not a problem.

Progress: converting the sequence of VTK meshes to a USD MeshSequence is straightforward. This animates correctly in Preview and Blender (see screenshot). I managed to convert from the MeshSequence to multiple keys and a BlendMesh. This also animates correctly in Blender and Preview. Unfortunately, the BlendMesh of multiple blended meshes shows a zero animation time in RCP (see screenshot below).

Also, see below the usda file scheme for the animation. Of course, I am not showing full vectors such as faceVertexCounts, faceVertexIndices, and normals.

Question: what is the right setup to create a BlendMesh animation that RCP will correctly import and animate, from a set of meshes or multiple key shapes?

(Screenshot: Blender animation)
(Screenshot: time-zero RCP "animations")

#usda 1.0
(
    defaultPrim = "BlendMeshRoot"
    doc = "Blender v4.5.3 LTS"
    endTimeCode = 48
    framesPerSecond = 24
    metersPerUnit = 1
    startTimeCode = 0
    timeCodesPerSecond = 24
    upAxis = "Z"
)

def Xform "BlendMeshRoot" (
    customData = {
        dictionary Blender = {
            bool generated = 1
        }
    }
)
{
    def SkelRoot "Mesh"
    {
        custom string userProperties:blender:object_name = "Mesh"
        float3 xformOp:rotateXYZ = (89.99999, -0, 0)
        float3 xformOp:scale = (0.009999999, 0.01, 0.01)
        double3 xformOp:translate = (0, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:rotateXYZ", "xformOp:scale"]

        def Mesh "Mesh" (
            active = true
            prepend apiSchemas = ["MaterialBindingAPI", "SkelBindingAPI"]
        )
        {
            uniform bool doubleSided = 1
            float3[] extent = [(25.091871, -34.121277, -13.298501), (299.94482, 245.10088, 202.35126)]
            int[] faceVertexCounts = [3, 3, ...
            int[] faceVertexIndices = [0, 10293, ...
            rel material:binding = </BlendMeshRoot/_materials/MeshSequence_Default>
            normal3f[] normals = [(-0.3632836, -0.9102419, -0.19870725), ...
            point3f[] points = [(244.41148, 155.42062, 70.454926), ...
            float3[] primvars:node_displacement = [(93.54703, 110.9341, 48.37992), ...
            float3[] primvars:Normals = [(-0.0050530406, -0.9910114, -0.13368203), ...
            int[] primvars:skel:jointIndices = [0, 0, 0, 0, 0, ...
            float[] primvars:skel:jointWeights = [1, 1, 1, 1, 1, ...
            uniform token[] skel:blendShapes = ["frame_0000", "frame_0001", "frame_0002", "frame_0003", "frame_0004", "frame_0005"]
            rel skel:blendShapeTargets = [
                </BlendMeshRoot/Mesh/Mesh/frame_0000>,
                ...
                </BlendMeshRoot/Mesh/Mesh/frame_0005>,
            ]
            prepend rel skel:skeleton = </BlendMeshRoot/Mesh/Skel>
            uniform token subdivisionScheme = "none"
            custom string userProperties:blender:data_name = "Mesh"
            custom float userProperties:originalTime
            float userProperties:originalTime.timeSamples = {
                0: 0,
            }

            def BlendShape "frame_0000"
            {
                uniform vector3f[] offsets = [(0, 0, 0), (0, 0, 0), ...
                uniform int[] pointIndices = [0, 1, 2, ...
            }

            #### BlendShape defs continue through frame_0005 ####
        }

        def Skeleton "Skel" (
            prepend apiSchemas = ["SkelBindingAPI"]
        )
        {
            uniform matrix4d[] bindTransforms = [( (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1) )]
            uniform token[] joints = ["joint1"]
            uniform matrix4d[] restTransforms = [( (1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1) )]
            prepend rel skel:animationSource = </BlendMeshRoot/Mesh/Skel/Anim>

            def SkelAnimation "Anim"
            {
                uniform token[] blendShapes = ["frame_0000", "frame_0001", "frame_0002", "frame_0003", "frame_0004", "frame_0005"]
                float[] blendShapeWeights.timeSamples = {
                    0: [1, 0, 0, 0, 0, 0],
                    1: [0.9697085, 0.03029152, 0, 0, 0, 0],
                    2: [0.88787615, 0.11212383, 0, 0, 0, 0],
                    ...
                    46: [0, 0, 0, 0, 0.11212379, 0.8878762],
                    47: [0, 0, 0, 0, 0.030291557, 0.96970844],
                    48: [0, 0, 0, 0, 0, 1],
                }
            }
        }
    }

    def Scope "_materials"
    {
        def Material "MeshSequence_Default"
        {
            token outputs:surface.connect = </BlendMeshRoot/_materials/MeshSequence_Default/Principled_BSDF.outputs:surface>
            custom string userProperties:blender:data_name = "MeshSequence_Default"

            def Shader "Principled_BSDF"
            {
                uniform token info:id = "UsdPreviewSurface"
                float inputs:clearcoat = 0
                float inputs:clearcoatRoughness = 0.03
                color3f inputs:diffuseColor = (0.8, 0.4, 0.3)
                float inputs:ior = 1.5
                float inputs:metallic = 0
                float inputs:opacity = 1
                float inputs:roughness = 0.5
                float inputs:specular = 0.2
                token outputs:surface
            }
        }
    }

    def Scope "AnimationClips"
    {
        custom rel animations = </BlendMeshRoot/Mesh/Skel/Anim>
    }

    def RealityKitComponent "AnimationLibrary"
    {
        custom rel animations = </BlendMeshRoot/Mesh/Skel/Anim>
        custom token info:id = "RealityKit.AnimationLibrary"
        custom double realitykit:approximateDuration = 2
        custom double[] realitykit:clipDurations = [2]
        custom string[] realitykit:clipNames = ["Anim"]
        custom rel realitykit:clipTargets = </BlendMeshRoot/Mesh/Skel/Anim>
        custom double realitykit:frameRate = 24
        custom bool realitykit:isAnimationLibrary = 1
    }
}
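One way to separate the RCP preview from RealityKit itself is to load the stage in code and ask for the clip directly; a minimal diagnostic sketch, assuming the usda above ships in a Reality Composer Pro package whose generated bundle is realityKitContentBundle (the Xcode template default), and that the clip is exposed by name through AnimationLibraryComponent:

import RealityKit
import RealityKitContent

// Bypass the RCP preview: load the prim hierarchy and look up the "Anim"
// clip declared in the AnimationLibrary component of the usda above.
func checkBlendClip() async throws {
    let root = try await Entity(named: "BlendMeshRoot", in: realityKitContentBundle)
    if let library = root.components[AnimationLibraryComponent.self],
       let clip = library.animations["Anim"] {
        let controller = root.playAnimation(clip)
        print("Clip duration: \(controller.duration)")  // expect ~2 s, not 0
    }
}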
2 replies · 1 boost · 464 views · Oct ’25
Auto Rig Pro Mixamo Animation to RC Pro
Hi! I have been struggling with this for a little while, and most of what I've found has not helped much. I hope to find more success here. Essentially, I have a model I've made in Blender. I rigged it using Auto Rig Pro, and I've also used ARP to add a Mixamo animation to it. That all works fine in Blender. However, when I try to import this model into RCP, I don't get the animation; the "default subtree animation" is completely empty. I attribute this to my lack of experience in this field, but here's what I've attempted thus far:

- Pushing the Mixamo keyframes into an NLA strip. I'm pretty sure this is the correct line of action, but I'm definitely not doing something right.
- Baking the animation (?)
- Making sure that I have animation checked when I export the model!

Any ideas or reference projects would be lovely. I haven't really found much that has pushed me in the right direction. This project is, unfortunately, kind of time sensitive, so I would appreciate help ASAP. Thank you, and let me know if I can add any more context!
3 replies · 0 boosts · 408 views · 3w ago
Stop Reality Composer
Is there a way to stop a Reality Composer Pro timeline and restart it later on? For me it looks like you can only start a timeline via notification. Related to that, I have the issue that if the notification to start a certain timeline fires twice or more, the timeline appears to play multiple times and animations start to glitch and jump. What is best practice here to avoid this?
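One pattern worth trying (a sketch under assumptions, not a confirmed recommendation): if the timeline shows up in the root entity's AnimationLibraryComponent under the name given in RCP ("MyTimeline" here is a placeholder), you can play it from code instead of via notification, keep the playback controller, and use it both to stop the timeline and to guard against double starts:

import RealityKit

var timelineController: AnimationPlaybackController?

func startTimeline(on root: Entity) {
    // Don't restart a timeline that is already running.
    if timelineController?.isPlaying == true { return }
    if let library = root.components[AnimationLibraryComponent.self],
       let clip = library.animations["MyTimeline"] {
        timelineController = root.playAnimation(clip)
    }
}

func stopTimeline() {
    timelineController?.stop()
}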
1 reply · 0 boosts · 450 views · 1w ago
ShaderGraphMaterial with Occlusion Surface Output fails to load on iOS and macOS
A ShaderGraphMaterial with an Occlusion Surface Output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error:

RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)

This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit)

RealityView { content in
    do {
        let bgEntity = ModelEntity(
            mesh: .generateCone(height: 0.5, radius: 0.1),
            materials: [SimpleMaterial(color: .red, isMetallic: true)]
        )
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        let occlusionMaterial = try await ShaderGraphMaterial(
            named: "/Root/OcclusionMaterial",
            from: "OcclusionMaterial"
        )
        let testEntity = ModelEntity(
            mesh: .generateSphere(radius: 0.4),
            materials: [occlusionMaterial]
        )
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)

Feedback ID: FB15081296
2 replies · 1 boost · 1.3k views · Nov ’25
[xrsimulator] Exception thrown during compile: cannotGetRkassetsContents
My friend cannot build my visionOS project in the simulator. He gets the following error:

Error: [xrsimulator] Exception thrown during compile: cannotGetRkassetsContents(path: "/Users/path/to/Packages/RealityKitContent/Sources/RealityKitContent/RealityKitContent.rkassets")

In Xcode, he is able to open the RealityKitContent package in Reality Composer Pro by clicking on the Package.realitycomposerpro file. No warnings related to this error show up in RCP either, and all scenes appear to be usable/navigable in RCP. The error only comes up when he tries to build the project in Xcode (Command-B). There is no other information in the Report Navigator's build logs for this error. The error is always followed by this next error:

Error: Tool exited with code 1

Yikes, please help!
1 reply · 1 boost · 513 views · Feb ’25
Launching a timeline on a specific model via notification
Hello! I’m familiar with the discussion on “Sending messages to the scene”, and I’ve successfully used that code. However, I have several instances of the same model in my scene. Is it possible to make only one specific model respond to a notification? For example, can I pass something like RealityKit.NotificationTrigger.SourceEntity in userInfo or use another method to target just one instance?
1 reply · 1 boost · 152 views · May ’25
Reality Composer to Reality Composer Pro?
I sketched an idea for a project in Reality Composer on my iPad, thinking that when I had a chance to sit down I would work it up in Xcode. However, when I got back to my computer, I discovered I cannot open a file created in Reality Composer (or the exported .reality file) in Reality Composer Pro. Am I missing something obvious here? Because this seems like a huge oversight. If anyone can let me know how to open a file created in Reality Composer in Reality Composer Pro, I would greatly appreciate it, partly because there seem to be objects available in Reality Composer that are not in Reality Composer Pro. Thanks, Stan
2 replies · 1 boost · 258 views · Jun ’25
Unexpected behavior when writing entities and loading .reality files
I have a simple visionOS app that creates an Entity, writes it to the device, and then attempts to load it. However, when the entity file gets overwritten, it affects the app's ability to load it correctly. Here is my code for saving the entity:

import SwiftUI
import RealityKit
import UniformTypeIdentifiers

struct ContentView: View {
    var body: some View {
        VStack {
            ToggleImmersiveSpaceButton()
            Button("Save Entity") {
                Task {
                    // if let entity = await buildEntityHierarchy(from: urdfPath) {
                    let type = UTType.realityFile
                    let filename = "testing.\(type.preferredFilenameExtension ?? "bin")"
                    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
                    let fileURL = documentsURL.appendingPathComponent(filename)
                    do {
                        let mesh = MeshResource.generateBox(size: 1, cornerRadius: 0.05)
                        let material = SimpleMaterial(color: .blue, isMetallic: true)
                        let modelComponent = ModelComponent(mesh: mesh, materials: [material])
                        let entity = Entity()
                        entity.components.set(modelComponent)
                        print("Writing \(fileURL)")
                        try await entity.write(to: fileURL)
                    } catch {
                        print("Failed writing")
                    }
                }
            }
        }
        .padding()
    }
}

Every time I press "Save Entity", I see a warning similar to:

Writing file:///var/mobile/Containers/Data/Application/1140E7D6-D365-48A4-8BED-17BEA34E3F1E/Documents/testing.reality
Failed to set dependencies on asset 1941054755064863441 because NetworkAssetManager does not have an asset entity for that id.

When I open the immersive space, I attempt to load the same file:

import SwiftUI
import RealityKit
import UniformTypeIdentifiers

struct ImmersiveView: View {
    @Environment(AppModel.self) private var appModel

    var body: some View {
        RealityView { content in
            guard let type = UTType.realityFile.preferredFilenameExtension else { return }
            let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            let fileURL = documentsURL.appendingPathComponent("testing.\(type)")

            guard FileManager.default.fileExists(atPath: fileURL.path) else {
                print("❌ File does not exist at path: \(fileURL.path)")
                return
            }

            if let entity = try? await Entity(contentsOf: fileURL) {
                content.add(entity)
            }
        }
    }
}

I also get errors after I overwrite the entity (by pressing "Save Entity" after I have successfully loaded it once). The warnings that appear when the immersive space attempts to load the new entity are:

Asset 13277375032756336327 Mesh (RealityFileAsset)URL/file:///var/mobile/Containers/Data/Application/1140E7D6-D365-48A4-8BED-17BEA34E3F1E/Documents/testing.reality/Mesh_0.compiledmesh failure: Asset provider load failed: type 'RealityFileAsset' -- RERealityArchive: Failed to open load stream for entry 'assets/Mesh_0.compiledmesh'.
Asset 8308977590385781534 Scene (RealityFileAsset)URL/file:///var/mobile/Containers/Data/Application/1140E7D6-D365-48A4-8BED-17BEA34E3F1E/Documents/testing.reality/Scene_0.compiledscene failure: Asset provider load failed: type 'RealityFileAsset' -- RERealityArchive: Failed to read archive entry.
AssetLoadRequest failed because asset failed to load '13277375032756336327 Mesh (RealityFileAsset)URL/file:///var/mobile/Containers/Data/Application/1140E7D6-D365-48A4-8BED-17BEA34E3F1E/Documents/testing.reality/Mesh_0.compiledmesh' (Asset provider load failed: type 'RealityFileAsset' -- RERealityArchive: Failed to open load stream for entry 'assets/Mesh_0.compiledmesh'.)
The order of operations to make this happen:

1. Launch the app.
2. Press "Save Entity" to save the entity.
3. "Open Immersive Space" to view the entity.
4. Press "Save Entity" to overwrite the entity.
5. "Open Immersive Space" to view the entity; the asset load request fails.

Also:

1. Launch the app; the entity should still be saved from the last time the app ran.
2. "Open Immersive Space" to view the entity.
3. Press "Save Entity" to overwrite the entity.
4. "Open Immersive Space" to view the entity; the asset load request fails.

NOTE: It appears I can get it to work slightly better by pressing the "Save Entity" button twice before attempting to view it again in the immersive space.
0 replies · 1 boost · 207 views · Aug ’25
Xcode Cloud builds don't work with *.usdz files in a RealityComposer package
In sessions like Compose interactive 3D content in Reality Composer Pro, RealityKit engineers recommended working with Reality Composer Pro to create RealityKit packages to embed in our RealityKit Xcode projects. And, comparing the workflow to Unity/Unreal, I can see the reasoning, since it is nice to prepare scenes/materials/assets visually. But when we also want to run an Xcode Cloud CI/CD pipeline, this seems to come into conflict: when adding a basic *.usdz to the RealityKitContent.rkassets folder, every build we run on Xcode Cloud fails with:

Compile Reality Asset RealityKitContent.rkassets
❌ realitytool requires Metal for this operation and it is not available in this build environment

I have also found this related forum post here, but it was specifically about compiling a *.skybox.
4 replies · 1 boost · 509 views · Sep ’25
Scene not found after changes in Reality Composer Pro
Randomly, the app does not work after small changes in Reality Composer Pro, small changes like scaling an object a tiny bit. To fix the error, I have to change another element in Reality Composer Pro and hope for the best. If this does not help, I change (transform) something else, or deactivate/activate something, to get the project working again. I can't see a pattern for why the Reality Composer Pro project sometimes gets into a state where it does not compile anymore.
3 replies · 0 boosts · 1.9k views · 3w ago
Reality Composer Timeline unfinished
The timeline editor often feels unfinished:

- Setting the time cursor to a different time is often not reflected in the preview; you either have to click a clip or wait. Sometimes the cursor even disappears, e.g. when switching tabs to the Shader Graph.
- The ability to select and move multiple clips is missing.
- There is no snapping to clips or to the time cursor, as found in other tools.
- And then there is the timeline compile bug: https://developer.apple.com/forums/thread/810868

The timeline as it is, is a good start, but it definitely needs some more love to be on par with other commercial tools like Unity or After Effects.
1 reply · 1 boost · 185 views · 2w ago
Animations exported from Blender do not show in Reality Composer Pro
I made an animation in Blender using geometry nodes that I exported to a USDC file (then I used Reality Converter to convert it to USDZ), and I can see the animation when viewing the file in the Finder, but it does not play after importing into RCP. Any idea how I can play the animation? Or can the animation be accessed through Xcode? Thanks!
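On the Xcode side, if the animation survived the export/conversion at all, RealityKit should expose it directly; a minimal sketch, assuming the converted file ships in the app bundle as "model.usdz" (hypothetical name):

import RealityKit

// Load the converted USDZ, add it to a RealityView's content, and play
// whatever animations it carries. If availableAnimations is empty, the
// animation was lost during export/conversion rather than in RCP.
func playAllAnimations(in content: RealityViewContent) async throws {
    let entity = try await Entity(named: "model")
    content.add(entity)
    for animation in entity.availableAnimations {
        entity.playAnimation(animation.repeat(duration: .infinity))
    }
}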
4 replies · 0 boosts · 1.1k views · Apr ’25