
Reply to Having an issue with using a custom component with Reality Composer Pro
I was typing up the original post over a very slow internet connection, so there is some wonky grammar. I have been modifying the Package.swift file for the RealityKit project & I have been using relative paths, but it does not work for some reason.

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2), .macOS(.v15), .iOS(.v18)
    ],
    products: [
        // Products define the executables and libraries a package produces, and make them visible to other packages.
        .library(
            name: "RealityKitContent",
            targets: ["RealityKitContent"]),
    ],
    dependencies: [
        // Dependencies declare other packages that this package depends on.
        // .package(url: /* package url */, from: "1.0.0"),
    ],
    targets: [
        // Targets are the basic building blocks of a package. A target can define a module or a test suite.
        // Targets can depend on other targets in this package, and on products in packages this package depends on.
        .target(
            name: "RealityKitContent",
            dependencies: []),
    ]
)

I tried setting a path like

dependencies: [
    .package(path: "../../../../MyFramework"),
],
targets: [
    .target(
        name: "RealityKitContent",
        dependencies: [
            .product(name: "MyFramework", package: "MyFramework")
        ]),
]

At this time, the Swift package framework is being referenced as a local package. Eventually, there will be a public GitHub repo.
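For completeness, here is a rough sketch of how the full manifest would look with the local dependency in place. The swift-tools-version line is an assumption (keep whatever Xcode generated), and the relative path is meant to resolve from this package's own directory:

Swift
// swift-tools-version:6.0  (assumed; keep the version from the generated manifest)
import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2), .macOS(.v15), .iOS(.v18)
    ],
    products: [
        .library(
            name: "RealityKitContent",
            targets: ["RealityKitContent"]),
    ],
    dependencies: [
        // The path is resolved relative to this package's directory.
        .package(path: "../../../../MyFramework"),
    ],
    targets: [
        .target(
            name: "RealityKitContent",
            dependencies: [
                .product(name: "MyFramework", package: "MyFramework")
            ]),
    ]
)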
Mar ’25
Reply to Reality Converter - Unlit Material
I would also like to know if this is possible with Reality Composer Pro v1 - is something like this possible with Shader Graph? Does RCP v2 with Xcode 16 have any new options? In my case, I created a plane in Blender with a texture image that has an alpha mask. When I view the USDZ in visionOS on a device or in the simulator, the mesh is quite dark. I am loading the scene with Model3D & not a RealityKit view (for various reasons), roughly as sketched below.
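For reference, a rough sketch of how I am loading the asset with Model3D (the "Plane" asset name is a placeholder for the USDZ exported from Blender):

Swift
import SwiftUI
import RealityKit

struct PlaneView: View {
    var body: some View {
        // Loads the USDZ from the app bundle; "Plane" is a placeholder name.
        Model3D(named: "Plane") { model in
            model
                .resizable()
                .scaledToFit()
        } placeholder: {
            ProgressView()
        }
    }
}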
Aug ’24
Reply to Using a scene from Reality Composer Pro in an IOS app?
I took a look at the Hello World sample & I see I made 2 mistakes: use Entity(named:) and use the appropriate path:

Swift
struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                let scene = try await Entity(named: "Scenes/HOF", in: HeadsOnFire.headsOnFireBundle)
                content.add(scene)
            } catch is CancellationError {
                // The entity initializer can throw this error if an enclosing
                // RealityView disappears before the model loads. Exit gracefully.
                return
            } catch let error {
                // Other errors indicate unrecoverable problems.
                print("Failed to load cube: \(error)")
            }
        }
    }
}
Jun ’24
Reply to How to Fix Cracking and Popping Sound ?
I have been experiencing this issue with a MacBook Pro 14" model since I bought it late last year (2021). I went to the Apple Store & at the Genius Bar they checked my MacBook; I was then referred to technical support. I had uploaded a video recording showing the audio issues & also uploaded system logs. This was back in January when I was last in contact with technical support; the last I heard was that it was being investigated by engineering at Apple...
Topic: App & System Services SubTopic: Core OS
May ’22
Reply to Using SCNTechnique with AR Kit to create a chromatic aberration visual effect?
Happy days, I managed to get something working by hacking together some bits from a few places.

// Based on code from https://github.com/JohnCoates/Slate
fragment half4 scene_filter_fragment_chromatic_abberation(VertexOut vert [[stage_in]],
                                                           texture2d<half, access::sample> scene [[texture(0)]])
{
    float2 coordinates = vert.texcoord;
    constexpr sampler samp = sampler(coord::normalized, address::repeat, filter::nearest);
    half4 color = scene.sample(samp, coordinates);
    float2 offset = (coordinates - 0.4) * 2.0;
    float offsetDot = dot(offset, offset);
    const float strength = 5.0;
    float2 multiplier = strength * offset * offsetDot;
    float2 redCoordinate = coordinates - 0.003 * multiplier;
    float2 blueCoordinate = coordinates + 0.01 * multiplier;
    half4 adjustedColor;
    adjustedColor.r = scene.sample(samp, redCoordinate).r;
    adjustedColor.g = color.g;
    adjustedColor.b = scene.sample(samp, blueCoordinate).b;
    adjustedColor.a = color.a;
    return adjustedColor;
}

And I need to turn off the camera grain to get the filter to apply to the entire scene:

Swift
if #available(iOS 13.0, *) {
    sceneView.rendersCameraGrain = false
}

I have a repo here: https://github.com/ManjitBedi/SCNTechnique-Experiment

This project on GitHub really helped me out as well: https://github.com/2RKowski/ARSCNViewImageFiltersExample
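For anyone wiring this up, a rough sketch of attaching the technique to the view. The "ChromaticAberration" plist name is a placeholder for a technique definition that references the scene_filter_fragment_chromatic_abberation Metal function above:

Swift
import ARKit
import SceneKit

func applyChromaticAberrationTechnique(to sceneView: ARSCNView) {
    // Load an SCNTechnique definition from a property list in the app bundle.
    // "ChromaticAberration" is a placeholder file name.
    guard let url = Bundle.main.url(forResource: "ChromaticAberration", withExtension: "plist"),
          let dictionary = NSDictionary(contentsOf: url) as? [String: Any],
          let technique = SCNTechnique(dictionary: dictionary) else {
        print("Failed to load the SCNTechnique definition")
        return
    }
    // Apply the multi-pass technique to the AR scene view.
    sceneView.technique = technique
}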
Topic: Graphics & Games SubTopic: General
Apr ’21