
Reply to Using a scene from Reality Composer Pro in an IOS app?
Well, I figured it out from the Happy Beam demo code here: Happy Beam Docs

Problem: the Bundle var wasn't found in scope.

Solution:

1. Make sure that your Reality Composer Pro package has been added as a framework in the General project settings.
2. Import (your package name).
3. In the Sources directory that Reality Composer Pro created, there is a Swift file that contains the var you're looking for, usually your project name + "Bundle" (i.e. "projectnameBundle").
4. Load by creating an entity: scene = Entity(named: "Scene", in: projectnameBundle)
5. Add the entity to your RealityView: content.add(scene)

Note: this will place the scene at your camera's location. So, be sure to move the camera away from the starting point to verify, but it'd be best to add a horizontal anchor, add the entity to the anchor, then add the anchor to the RealityView to be less confusing visually.

import SwiftUI
import RealityKit
import projectName

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Create a horizontal plane anchor
            let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
            // Load the scene from the Reality Composer Pro package
            do {
                let scene = try await Entity(named: "Scene", in: projectnameBundle)
                // Add the model to the anchor
                anchor.addChild(scene)
                // Add the anchor to the RealityView
                content.add(anchor)
            } catch is CancellationError {
                // The entity initializer can throw this error if an enclosing
                // RealityView disappears before the model loads. Exit gracefully.
                return
            } catch let error {
                // Other errors indicate unrecoverable problems.
                print("Failed to load scene: \(error)")
            }
            // View settings
            content.camera = .spatialTracking
        }
        .edgesIgnoringSafeArea(.all)
    }
}
Dec ’24
Reply to Uploading usdz model to AR Quick Look at https link
I was struggling to get QuickLook functioning in my app. It would display with the text "No File to View." The OP helped me fix my issues. My issues were in the coordinator setup. So, here's a functioning (as of 12/'24) QuickLook wrapped in SwiftUI, using URL pass-through from a button, if anyone needs it.

import SwiftUI
import QuickLook
import ARKit

// 1. Create a SwiftUI view that will present the QLPreviewController.
struct QuickLookView: View {
    var fileURL: URL // Accept the URL as a parameter from the button

    var body: some View {
        QuickLookPreview(fileURL: fileURL) // Pass the fileURL to the QuickLookPreview
            .edgesIgnoringSafeArea(.all)
    }
}

// 2. Create a UIViewControllerRepresentable that wraps the UIKit QLPreviewController.
struct QuickLookPreview: UIViewControllerRepresentable {
    var fileURL: URL // Accept the fileURL parameter

    func makeUIViewController(context: Context) -> QLPreviewController {
        let previewController = QLPreviewController()
        previewController.dataSource = context.coordinator
        return previewController
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {
        // No updates required
    }

    // 3. Create a Coordinator to handle the QLPreviewControllerDataSource methods.
    class Coordinator: NSObject, QLPreviewControllerDataSource {
        var fileURL: URL // Store the fileURL in the Coordinator

        init(fileURL: URL) {
            self.fileURL = fileURL
        }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
            return 1
        }

        func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
            return fileURL as NSURL // NSURL conforms to QLPreviewItem (URL itself does not)
        }
    }

    // Create a Coordinator instance
    func makeCoordinator() -> Coordinator {
        return Coordinator(fileURL: fileURL) // Pass the fileURL to the Coordinator
    }
}

And then the button:

NavigationLink(destination: QuickLookView(fileURL: Bundle.main.url(forResource: "modelName", withExtension: "usdz")!)) {
    Text("Quick Look")
        .font(.system(size: 22))
        .fontWeight(.bold)
        .padding(.horizontal, 40)
        .padding(.vertical, 20)
        .background(Color.purple)
        .foregroundColor(.white)
        .cornerRadius(6)
}
Dec ’24
Reply to RealityView and Persistent World Data?
I wanted to update this post with resources I found. It appears the automation for persistent anchors and world data maps has been implemented as WorldAnchors. Currently, it looks like this is only supported in visionOS. https://developer.apple.com/documentation/visionos/tracking-points-in-world-space

It appears that by simply adding a WorldAnchor, visionOS automatically tracks the world map, unloading and loading it based on your location in the background. This is amazing. Though I'm not sure why this wouldn't be supported on iOS and iPadOS as well. Perhaps in the future it will be implemented as a core ARKit feature too.

To the best of my limited knowledge, it appears we will have to continue to use the previous methods for persistent data, which can be found here: https://developer.apple.com/documentation/arkit/arkit_in_ios/data_management/saving_and_loading_world_data

However, I still have to try to implement this with RealityView, as it is my understanding that only RealityView supports Reality Composer Pro packages. The goal here is to simply place a Reality Composer Pro package with AR persistence...
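For anyone landing here, this is roughly what the visionOS side looks like; a minimal sketch, assuming you already computed a placement transform (the names session, worldTracking, and placePersistentAnchor are mine for illustration, not from the docs page above):

import ARKit
import RealityKit

// visionOS only. WorldAnchor persistence is handled by the system.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func placePersistentAnchor(at transform: simd_float4x4) async throws {
    try await session.run([worldTracking])
    // Once added, visionOS persists the WorldAnchor and re-localizes it
    // across app launches in the background.
    let anchor = WorldAnchor(originFromAnchorTransform: transform)
    try await worldTracking.addAnchor(anchor)
}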
Dec ’24
Reply to ARView vs RealityView (iOS, iPadOS)
@lijiaxu Thank you for your input. Definitely ARView for now on iOS and iPadOS. In the project I was working on, I had decided to abandon RealityView. The question was still lingering in my mind, but you have answered it here. I think I was worried about missing out on the latest realism/rendering features, but I don't think there are any missing in ARView at this point. Perhaps it was wishful thinking on my part that things like persistent data would become automated in the background with a simple boolean control. That said, with visionOS that appears to be the direction, and that is exciting. That was the primary functionality I was after.

To correct or update my original note: when using RealityView, after applying the anchor to a specific surface type, such as floor or table, the tracking was much improved and matched what I am familiar with in ARView (see the sketch below).
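For reference, the surface-type anchoring I mean is just the plane target's classification; a minimal sketch, assuming a RealityView content closure (content comes from that closure, and the minimum bounds are placeholders):

// Anchor to a specific surface type instead of .any for tighter tracking.
let floorAnchor = AnchorEntity(.plane(.horizontal,
                                      classification: .floor,
                                      minimumBounds: SIMD2<Float>(0.5, 0.5)))
content.add(floorAnchor)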
Jan ’25
Reply to How to set Reality Composer Pro Entity/Anchor Transform/Position? iOS
I am no longer at a loss! haha

Entity() is different from ModelEntity():

Entity() keeps the hierarchy (independent model transforms).
ModelEntity() flattens it (a single model transform).

You have to make sure you're interacting with your root element in the RCP scene. If you're not, the anchor or model transforms will have no effect.

// Import your bundle
import yourproject

// Load asynchronously
let scene = try await Entity(named: "Scene", in: yourprojectBundle)

// Target the root object
let rootEntity = scene.children.first?.children.first(where: { $0.name == "rootObject" })

Bonus: you can print out your hierarchy with this...

// Helper function to print the entity hierarchy
func printEntityHierarchy(_ entity: Entity, level: Int) {
    let indent = String(repeating: " ", count: level)
    print("\(indent)Entity: \(entity.name) Transform: \(entity.transform)")
    for child in entity.children {
        printEntityHierarchy(child, level: level + 1)
    }
}

Note: the same thing applies for gesture control. You have to make sure you're targeting the root object or the specific object you want (see the sketch below).
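To make that concrete, here's a minimal sketch of targeting the root for a drag gesture, assuming rootEntity was resolved as above (the collision box size is a placeholder):

// Make the root entity hittable by gestures.
rootEntity?.components.set(InputTargetComponent())
rootEntity?.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))

// On the SwiftUI view containing the RealityView:
.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            // Only the root has input/collision components here, so it is
            // what gets hit, and the whole hierarchy moves with it.
            if let parent = value.entity.parent {
                value.entity.position = value.convert(value.location3D, from: .local, to: parent)
            }
        }
)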
Feb ’25
Reply to RealityView and Persistent World Data?
I wanted to update this thread since I have learned more about RealityView. Technically there is a way to use RealityView and access the world map data on iOS and iPadOS, but I can't seem to figure that out. Someone with a higher degree of experience would be better suited to answer that question. It would involve knowing how to bridge between the low-level and high-level kits, and at this point in time, at least on iOS and iPadOS, RealityView doesn't necessarily provide any additional benefits. Though I think it will in the future, if only for simplicity and ease of use.

The other correction is that ARView does support RCP projects; you just need to use Entity() to load them from the bundle. So, using the known methods for persistent data with RCP projects will still get you there (sketched below).

import yourproject

let rcpProject = try await Entity(named: "Scene", in: yourprojectBundle)
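For completeness, the "known methods" are the classic ARWorldMap save/load path; a minimal sketch, assuming an existing ARView and a file URL of your choosing (the function names are mine):

import ARKit
import RealityKit

func saveWorldMap(from arView: ARView, to url: URL) {
    arView.session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Couldn't get world map: \(String(describing: error))")
            return
        }
        do {
            // ARWorldMap supports secure coding, so it can be archived to disk.
            let data = try NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            print("Failed to save world map: \(error)")
        }
    }
}

func loadWorldMap(into arView: ARView, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data) else { return }
    // Relocalize the session against the saved map.
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}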
Feb ’25
Reply to Best Practices for taking Blender 3D objects and importing them into Reality Composer Pro.
In my experience, there are three options:

1. Export as USDC from Blender. Import into Reality Converter to check, change, or add any texture files. Export as USDZ (compressed). Import into Reality Composer Pro.
2. Export as USDC or USDZ (simply change the "c" to a "z" in the file extension when saving). Import into Reality Composer Pro.
3. Lastly, for efficiency and the most control, export from Blender as USDC with no textures or materials, import into Reality Composer Pro, and rebuild the shaders/textures inside Reality Composer Pro.
Apr ’25
Reply to Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
I figured it out!! I spent a few days on this. I'm using the current versions of Blender and RCP as of April 3rd, 2025. At first this did not work, but I repeated my steps and tried again, and then it started working. I think there are some memory issues in RCP when previewing the scene (especially if loading assets with the same name and replacing what's there). I replicated the workflow with all new files and names, and it went smoothly.

Note: Use the Star icon in the NLA strip editor in Blender, or have the action you want to export active/selected/in the stash. This is the one that will be exported.

1. Export your model as a USDC from Blender. Uncheck Animation; include textures/materials. This will be your visible model object.
2. In Blender, star the NLA strip you want to export. Select the model. Export as a USDC, Selected Only, check Animation, uncheck everything else (materials, textures). However, you MUST include the Mesh.
3. Repeat step 2 for each animation.

Note: you will see your object as a pink striped object in the RCP Project Browser, and the correct animation should be visibly playing in the preview window on the right when selected.

Notice: Keep the following steps on the model parent object in the RCP scene tree.

4. Add the Animation Library Component to the model parent.
5. Click the + to add an animation. Choose the "animation" USDCs you exported from Blender. This will provide access to the multiple animations on the single model object.

To see these animations in the RCP preview, you'll need to create a timeline. To do that, follow these steps:

1. Create a new Timeline.
2. Set the playhead where you'd like.
3. Select your parent model object in the scene tree.
4. In the Animation Library, right-click on the animation you want to play and choose Insert into Timeline.

Note: you can manually drag an Animation Action from the Timeline Actions panel if you'd like and set it up that way, but this is quicker.

Next, you'll need to play your Timeline in RCP. The easiest way to do that is:

1. Add a Behavior Component to the model.
2. Add an OnAddedToScene behavior.
3. Choose your Timeline as the action.

When you preview the RCP scene in RCP, you should now see your animation playing on your model object. These animations should also be accessible in Xcode by using the names they have been given in the Animation Library (see the sketch below). I believe you can also edit those names there as needed.
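On the Xcode side, accessing a named animation from the library looks roughly like this; a minimal sketch, assuming the loaded model entity is called model and one library entry was named "Wave" in RCP (both hypothetical names):

// Fetch a named animation from the Animation Library Component and play it.
if let library = model.components[AnimationLibraryComponent.self],
   let wave = library.animations["Wave"] {
    model.playAnimation(wave.repeat())
}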
Apr ’25
Reply to Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
Well, while this definitely works inside Reality Composer Pro and in the exported Scene.usdz, it failed when loading the bundle in Xcode into a RealityView and testing on device. The animation (a shape key in my case) fails to play, and I can't play it programmatically either. There seems to be some kind of mismatch taking place with the blendWeights, and I'm not sure why yet...
Apr ’25
Reply to Blender to Reality Composer Pro 2.0 to SwiftUI + RealityKit visionOS Best Practices
Just to add: I loaded the exported Scene.usdz from RCP into the RealityView instead of referencing the package/bundle, and everything performed as it does in RCP (see the sketch below). There is some kind of mismatch taking place between the resource naming, shape key/animation naming/paths, and blendWeights when compiling from the RCP bundle in Xcode vs. exporting the USDZ from RCP. I don't know if this is a bug or not, but it is quite odd. Stick to the exported USDZ for now. Maybe an Apple dev can chime in at some point.
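Loading the exported file directly looks like this; a minimal sketch, assuming Scene.usdz was added to the app target rather than referenced through the RCP package, inside a RealityView content closure:

// Load the exported USDZ from the app bundle instead of the RCP package bundle.
if let url = Bundle.main.url(forResource: "Scene", withExtension: "usdz") {
    let scene = try await Entity(contentsOf: url)
    content.add(scene)
}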
Apr ’25