Post · Replies · Boosts · Views · Activity

SPM Traits not working correctly with multiple targets in same Xcode workspace (App + App Clip)
Hi, I've just migrated to Swift Tools 6.2 and package traits, and I'm encountering an issue when using traits with multiple targets in the same Xcode workspace.

Setup:
- Main iOS app target
- App Clip target
- Both consume the same local packages (e.g., UIComponents)

What I'm trying to achieve:
- Main app imports packages without the COMPACT_BUILD trait
- App Clip imports packages with the COMPACT_BUILD trait enabled

Package configuration (simplified):

// UIComponents/Package.swift
let package = Package(
    name: "UIComponents",
    platforms: [.iOS(.v18)],
    traits: [
        .trait(name: "COMPACT_BUILD", description: "Minimal build for App Clips"),
    ],
    // ...
    targets: [
        .target(
            name: "UIComponents",
            dependencies: [...],
            swiftSettings: [
                .define("COMPACT_BUILD", .when(traits: ["COMPACT_BUILD"])),
            ]
        ),
    ]
)

In the code:

#if COMPACT_BUILD
// Excluded from App Clip
#endif

The consumer packages:
- Main app's package imports it without the trait: .package(path: "../UIComponents")
- App Clip's package imports it with the trait: .package(path: "../UIComponents", traits: ["COMPACT_BUILD"])

The problem: When building the main app target, the COMPACT_BUILD compiler condition is unexpectedly active, even though the main app's dependency chain never enables that trait. It seems like the trait enabled by the App Clip target is "leaking" into the main app build. I confirmed this by adding #error("COMPACT_BUILD is active"): it triggers when building the main app, which shouldn't happen. If I disable the App Clip target from the build scheme, the main app builds correctly with COMPACT_BUILD not defined. I am also able to build the App Clip separately.

Environment:
- Xcode 26.2
- swift-tools-version: 6.2
- iOS 26.2

Questions:
- Is this expected behavior with Xcode's SPM integration? Are traits resolved workspace-wide rather than per-target?
- Is there a workaround to have different trait configurations for different targets consuming the same package? Or do I need to fall back to separate package targets (e.g., UIComponents and UIComponentsCompact) to achieve this?

Any guidance would be appreciated. Thanks!
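For reference, a consolidated sketch of the App Clip's consumer-side declaration (the AppClipFeatures package name and product wiring are placeholders, not the real manifest):

// swift-tools-version: 6.2
// AppClipFeatures/Package.swift (sketch; names are placeholders)
import PackageDescription

let package = Package(
    name: "AppClipFeatures",
    platforms: [.iOS(.v18)],
    products: [
        .library(name: "AppClipFeatures", targets: ["AppClipFeatures"]),
    ],
    dependencies: [
        // Only this consumer requests the trait; the main app's package
        // declares the same path dependency without the traits parameter.
        .package(path: "../UIComponents", traits: ["COMPACT_BUILD"]),
    ],
    targets: [
        .target(
            name: "AppClipFeatures",
            dependencies: [
                .product(name: "UIComponents", package: "UIComponents"),
            ]
        ),
    ]
)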
0
0
33
2d
ShaderGraphMaterial with Occlusion Surface Output fails to load on iOS and macOS
A ShaderGraphMaterial with an Occlusion Surface Output generated with Reality Composer 2 fails to load on iOS 18 and macOS 15 with the following error:

RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound
(https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)

This happens with both
https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit)
and
https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit)

RealityView { content in
    do {
        let bgEntity = ModelEntity(
            mesh: .generateCone(height: 0.5, radius: 0.1),
            materials: [SimpleMaterial(color: .red, isMetallic: true)]
        )
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        let occlusionMaterial = try await ShaderGraphMaterial(named: "/Root/OcclusionMaterial", from: "OcclusionMaterial")
        let testEntity = ModelEntity(mesh: .generateSphere(radius: 0.4), materials: [occlusionMaterial])
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)

Feedback ID: FB15081296
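As a possible interim workaround (just a sketch, not a confirmed fix), I'm considering falling back to RealityKit's built-in OcclusionMaterial on the platforms where the shader graph fails to load:

import RealityKit

func makeOcclusionMaterial() async -> any Material {
    do {
        // The Reality Composer material that currently throws invalidTypeFound on iOS/macOS
        return try await ShaderGraphMaterial(named: "/Root/OcclusionMaterial",
                                              from: "OcclusionMaterial")
    } catch {
        // Built-in occlusion material as a stand-in where the graph fails to load
        return OcclusionMaterial()
    }
}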
2
1
1.3k
2w
realitytool requires Metal for this operation and it is not available in this build environment
Hello, I'm getting started with Xcode Cloud for my project. Since I upgraded to the macOS Sequoia beta and Xcode 16, it now refuses to archive builds for TestFlight. Very late in the build process I get the following error:

realitytool requires Metal for this operation and it is not available in this build environment

The log says this happens at:

Compile Skybox urban.skybox

My project uses RealityKit. How can I fix this issue? Thanks!
5
5
927
Oct ’25
SwiftUI Slider will cause app to crash on macOS Tahoe RC
Hello, creating a simple-as-it-gets Slider in SwiftUI and then running that app on Mac Catalyst with the macOS idiom enabled crashes the app:

struct ContentView: View {
    @State private var sliderValue: Double = 0.4

    var body: some View {
        VStack {
            Slider(value: $sliderValue)
        }
        .padding()
    }
}

Running this results in an exception:

_setMinimumEnabledValue: is not supported on UISlider when running Catalyst apps in the Mac idiom. See UIBehavioralStyle for possible alternatives.

This is obviously not ideal and also apparently not documented. Is there a workaround for this? It used to work on macOS Sonoma.

macOS 26 RC
Xcode 26 RC
FB20191635

Thanks!
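As a stopgap I'm experimenting with wrapping UISlider directly via UIViewRepresentable (untested sketch, assuming the unsupported minimum-enabled-value call is only made by SwiftUI's Slider):

import SwiftUI
import UIKit

struct CatalystSlider: UIViewRepresentable {
    @Binding var value: Double

    func makeUIView(context: Context) -> UISlider {
        let slider = UISlider()
        slider.addTarget(context.coordinator,
                         action: #selector(Coordinator.changed(_:)),
                         for: .valueChanged)
        return slider
    }

    func updateUIView(_ slider: UISlider, context: Context) {
        slider.value = Float(value)
    }

    func makeCoordinator() -> Coordinator { Coordinator(value: $value) }

    final class Coordinator: NSObject {
        let value: Binding<Double>
        init(value: Binding<Double>) { self.value = value }
        @objc func changed(_ sender: UISlider) { value.wrappedValue = Double(sender.value) }
    }
}

Used as CatalystSlider(value: $sliderValue) in place of Slider(value: $sliderValue).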
4
4
355
Oct ’25
Shared/GroupImmersive Space – Query Local Device Transform
Hi, I am in the process of implementing SharePlay in our app. The shared experience opens an Immersive Space and we set

systemCoordinator.configuration.supportsGroupImmersiveSpace = true

so visionOS establishes a shared coordinate space for the immersive space. From the docs: "To achieve consistent positioning of RealityKit entities across multiple devices in an immersive space during a SharePlay session".

There are cases where we want to position content in front of the user (independent of the shared session, and for each user individually). Normally we use the transform retrieved via worldTrackingProvider.queryDeviceAnchor.originFromAnchorTransform to position content in front of the user (plus some Z offset and smooth interpolation). This works fine in non-SharePlay instances and the device transform is where I would expect it to be, but during the FaceTime call deviceAnchor.originFromAnchorTransform seems to use the shared origin of the immersive space, and I end up with a transform that might be offset.

Here is a video of the issue in action: https://streamable.com/205r2p

The blue rect is placed using AnchorEntity(.head, trackingMode: .continuous). This works regardless of the call and the entity is always placed based on the head position. The green rect is adjusted on every frame using the transform I get from worldTrackingProvider.queryDeviceAnchor (see the sketch below). As you can see, it's offset.

Is there any way I can query this transform locally for the user during a FaceTime call?

I would also like to know whether it's possible to disable this automatic entity transform syncing behavior. Setting entity.synchronization = nil results in the entity not showing up at all (https://developer.apple.com/documentation/realitykit/synchronizationcomponent). Is SynchronizationComponent only relevant for the legacy MultipeerConnectivity approach?

Thank you!
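For context, the green rect is updated roughly like this each frame (simplified sketch; the names and the 0.5 m offset are placeholders, and the interpolation is omitted):

import ARKit
import QuartzCore
import RealityKit

func updatePlacement(of entity: Entity, using worldTrackingProvider: WorldTrackingProvider) {
    guard let deviceAnchor = worldTrackingProvider.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
    // During the SharePlay session this transform appears to be expressed relative
    // to the shared group origin rather than the local device origin.
    var transform = Transform(matrix: deviceAnchor.originFromAnchorTransform)
    // Push the content roughly 0.5 m in front of the device along its forward axis.
    transform.translation += transform.rotation.act(SIMD3<Float>(0, 0, -0.5))
    entity.transform = transform
}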
2
0
270
Oct ’25
NSToolbar Space item rendered with Liquid Glass Background
Hi, I have an NSToolbar in my Mac Catalyst app with a space and a flexible space item in it (https://developer.apple.com/documentation/appkit/nstoolbaritem/identifier/space). On macOS Tahoe the space item is rendered with a Liquid Glass effect and seems to be automatically grouped with the previous item. Is there a way to prevent this? It basically adds some undesired padding next to the previous item and looks odd. The flexible space is rendered normally, as before. I am talking about the space right next to the back chevron item. Thanks for any hints!
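For context, the toolbar items are declared roughly like this (sketch; the identifiers other than .space and .flexibleSpace are placeholders):

import AppKit

final class ToolbarDelegate: NSObject, NSToolbarDelegate {
    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        // The .space right after the back item is the one that now picks up
        // the Liquid Glass background and gets grouped with it on macOS Tahoe.
        [NSToolbarItem.Identifier("back"), .space, .flexibleSpace, NSToolbarItem.Identifier("share")]
    }

    func toolbarAllowedItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        toolbarDefaultItemIdentifiers(toolbar)
    }
}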
1
0
142
Sep ’25
CustomMaterial disable unlit tone mapping
Hi, since iOS 18 UnlitMaterial and ShaderGraphMaterial have the option to disable tone mapping, e.g. via https://developer.apple.com/documentation/realitykit/unlitmaterial/init(applypostprocesstonemap:)

Is it possible to do the same for CustomMaterial? I tried initializing a CustomMaterial based on an UnlitMaterial with tone mapping disabled, like so:

let unlitMat = UnlitMaterial(applyPostProcessToneMap: false)
let customMaterial = try CustomMaterial(
    from: unlitMat,
    surfaceShader: surfaceShader,
    geometryModifier: geometryModifier
)

but that does not seem to work. The colors of my texture still look altered compared to a plain UnlitMaterial or a ShaderGraphMaterial where it's disabled. Any hints? Thank you!
1
0
118
Jun ’25
Symbol not found: _$sSo22CLLocationCoordinate2DVSE12CoreLocationMc when building for visionOS 2.5 with Xcode 16.3
Hello, I'm encountering a runtime crash when building my visionOS app with Xcode 16.3 for visionOS 2.5. Our existing App Store/TestFlight app also crashes instantly on visionOS 2.5 when opened, but works fine on e.g. visionOS 2.4. The app builds successfully but crashes on launch with this symbol lookup error (slightly adjusted because the forum complained about sensitive data):

Symbol not found: _$sSo22CLLocationCoordinate2DVSE12CoreLocationMc
Referenced from: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /private/var/containers/Bundle/Application/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/MyApp.app/MyApp.debug.dylib
Expected in: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /usr/lib/swift/libswiftCoreLocation.dylib

dyld config: DYLD_LIBRARY_PATH=/usr/lib/system/introspection DYLD_INSERT_LIBRARIES=/usr/lib/libLogRedirect.dylib:/usr/lib/libBacktraceRecording.dylib:/usr/lib/libMainThreadChecker.dylib:/System/Library/PrivateFrameworks/GPUToolsCapture.framework/GPUToolsCapture:/usr/lib/libViewDebuggerSupport.dylib

I've already implemented my own Codable conformance for CLLocationCoordinate2D:

extension CLLocationCoordinate2D: Codable {
    // implementation details...
}

This worked fine on previous visionOS/Xcode versions. Has anyone encountered this issue or found a solution?

System details:
- macOS version: 15.3.2
- Xcode version: 16.3
- visionOS target: 2.5

Thank you!
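For reference, the conformance follows the usual latitude/longitude shape (illustrative sketch only; the actual implementation and key names may differ):

import CoreLocation

extension CLLocationCoordinate2D: Codable {
    enum CodingKeys: String, CodingKey { case latitude, longitude }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.init(latitude: try container.decode(CLLocationDegrees.self, forKey: .latitude),
                  longitude: try container.decode(CLLocationDegrees.self, forKey: .longitude))
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(latitude, forKey: .latitude)
        try container.encode(longitude, forKey: .longitude)
    }
}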
2
0
99
May ’25
Mac Catalyst SwiftUI – .focused() not working
Hello, given the following simple SwiftUI setup:

struct ContentView: View {
    var body: some View {
        CustomFocusView()
    }
}

struct CustomFocusView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        color
            .frame(width: 128, height: 128)
            .focusable(true)
            .focused($isFocused)
            .onTapGesture {
                isFocused.toggle()
            }
            .onKeyPress("a") {
                print("A pressed")
                return .handled
            }
    }

    var color: Color {
        isFocused ? .blue : .red
    }
}

If I run this via "Mac – Designed for iPad", the CustomFocusView toggles focus as expected and cycles between red and blue. If I run the same exact code via Mac Catalyst, absolutely nothing happens, and so far I haven't been able to get this view to accept focused state. Is this expected? I would appreciate any hints on how to get this working. Thanks and best regards!
5
0
548
Apr ’25
Loading USDZ with particle system crashes on Intel Macs
Hello, we have a RealityKit app that also runs on macOS via Catalyst. For specific USD assets containing particle systems we have observed a reproducible crash.

Steps to reproduce:
1. Open Reality Composer Pro
2. Create a new file
3. Create a simple particle system (the default one is fine)
4. Export as USDZ
5. Create a project in Xcode
6. Call Entity.load(… and pass in your USD (see the minimal repro sketch below)

Running this on an Intel iMac with macOS Sequoia 15.3 leads to a crash with the following console log (the assertion is printed several times, interleaved across threads):

validateWithDevice:4704: failed assertion `Render Pipeline Descriptor Validation depthAttachmentPixelFormat (MTLPixelFormatDepth32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match.'

Xcode version: 16.2.0
iMac 2020, 3.8 GHz Intel Core i7
macOS Sequoia 15.3
FB16477373

It would be great if this could be fixed quickly or a workaround provided, since it affects our production app. Thank you!
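Minimal repro sketch for step 6 (the asset name is a placeholder for the exported particle-system USDZ in the app bundle):

import RealityKit

func loadParticleAsset() {
    do {
        // Crashes with the Metal pipeline assertion above on Intel Macs under Catalyst
        let entity = try Entity.load(named: "ParticleSystemTest")
        print("Loaded:", entity)
    } catch {
        print("Load failed:", error)
    }
}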
1
0
420
Mar ’25
What causes »ARSessionDelegate is retaining X ARFrames« console warning?
Hi, since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped« even for rather simple projects using RealityKit and ARKit.

Could someone from the ARKit team please elaborate on what causes this warning and what can be done to avoid it? If I remember correctly, I didn't even assign an ARSessionDelegate. Thank you!
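For illustration, this is the kind of delegate pattern that, as far as I understand, retains frames and could trigger the warning (my project doesn't do this, hence the question):

import ARKit

final class FrameHandler: NSObject, ARSessionDelegate {
    // Keeping whole ARFrames alive like this prevents the fixed-size camera
    // buffer pool from recycling and is the usual cause of the warning.
    var retainedFrames: [ARFrame] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        retainedFrames.append(frame) // anti-pattern: the frame outlives the callback
        // Copying just the values you need (e.g. frame.camera.transform) and
        // letting the frame deallocate avoids it.
    }
}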
4
1
3.4k
Feb ’25
Building visionOS app with Firebase Source Distribution in Xcode Cloud
I have a working Xcode Cloud setup for my iOS and macOS targets, and I'm trying to add visionOS support. The issue is that Firebase requires using their source distribution for visionOS (instead of their default binary distribution).

Locally, this works by launching Xcode with:

open -a Xcode --env FIREBASE_SOURCE_FIRESTORE project.xcodeproj

For Xcode Cloud, I've added a ci_post_clone.sh script that sets this environment variable for visionOS builds:

#!/bin/bash
if [[ $CI_PRODUCT_PLATFORM == "xrOS" ]]; then
    echo "Running setup for visionOS..."
    export FIREBASE_SOURCE_FIRESTORE=1
fi

However, I'm getting the following error during the build:

an out-of-date resolved file was detected at /.../project.xcworkspace/xcshareddata/swiftpm/Package.resolved, which is not allowed when automatic dependency resolution is disabled

Setting FIREBASE_SOURCE_FIRESTORE=1 changes which SPM dependencies are required:
- The normal setup uses: abseil-cpp-binary, grpc-binary
- The source distribution needs: abseil-cpp-swiftpm, grpc-ios, boringssl-swiftpm

What's the recommended way to handle this in Xcode Cloud while maintaining builds for all platforms? Should I be using separate workflows with different branches for different platforms? Or is there a better approach?

System:
- Xcode 16.2
- Using SPM for dependency management
- Firebase iOS SDK 10.29.0
- Building for iOS, macOS, and visionOS

Thanks in advance for any guidance!
0
1
478
Feb ’25
USDZ with Blend Shapes Workflow Recommendations
Hi, since RealityKit 4 now supports blend shapes, I was wondering if there are any workflow or tooling recommendations for baking/exporting them into a USDZ. Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)? So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
3
0
1.4k
Jan ’25