Hi, I've just migrated to Swift Tools 6.2 and package traits, and I'm encountering an issue when using traits with multiple targets in the same Xcode workspace.
Setup:
Main iOS app target
App Clip target
Both consume the same local packages (e.g., UIComponents)
What I'm trying to achieve:
Main app imports packages without the COMPACT_BUILD trait
App Clip imports packages with the COMPACT_BUILD trait enabled
Package configuration (simplified):
// UIComponents/Package.swift
let package = Package(
    name: "UIComponents",
    platforms: [.iOS(.v18)],
    traits: [
        .trait(name: "COMPACT_BUILD", description: "Minimal build for App Clips"),
    ],
    // ...
    targets: [
        .target(
            name: "UIComponents",
            dependencies: [...],
            swiftSettings: [
                .define("COMPACT_BUILD", .when(traits: ["COMPACT_BUILD"])),
            ]
        ),
    ]
)
In the code:
#if COMPACT_BUILD
// Excluded from App Clip
#endif
The consumer packages:
Main app's package imports without trait:
.package(path: "../UIComponents")
App Clip's package imports with trait:
.package(path: "../UIComponents", traits: ["COMPACT_BUILD"])
The problem:
When building the main app target, the COMPACT_BUILD compiler condition is unexpectedly active — even though the main app's dependency chain never enables that trait. It seems like the trait enabled by the App Clip target is "leaking" into the main app build.
I confirmed this by adding #error("COMPACT_BUILD is active") — it triggers when building the main app, which shouldn't happen.
If I disable the App Clip target from the build scheme, the main app builds correctly with COMPACT_BUILD not defined.
I am also able to build the App Clip separately.
Environment:
Xcode 26.2
swift-tools-version: 6.2
iOS 26.2
Questions:
Is this expected behavior with Xcode's SPM integration? Are traits resolved workspace-wide rather than per-target?
Is there a workaround to have different trait configurations for different targets consuming the same package?
Or do I need to fall back to separate package targets (e.g., UIComponents and UIComponentsCompact) to achieve this?
Any guidance would be appreciated. Thanks!
When assigning a ManipulationComponent to an Entity, SceneEvents.WillRemoveEntity will be called for that Entity.
Expected behavior: the Entity is not removed from the Scene (not even temporarily), and no SceneEvents are triggered as a result of assigning a ManipulationComponent.
FB20872220
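A minimal repro sketch of the setup I mean (assuming the default ManipulationComponent() initializer; the box entity is arbitrary):

import SwiftUI
import RealityKit

struct ReproView: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.1))
            content.add(box)

            subscription = content.subscribe(to: SceneEvents.WillRemoveEntity.self) { event in
                // Unexpectedly fires for `box` when the component below is assigned.
                print("WillRemoveEntity: \(event.entity.name)")
            }

            box.components.set(ManipulationComponent())
        }
    }
}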
Sidebars for Mac Catalyst apps running with the UIDesignRequiresCompatibility flag render their active items with a white background tint, resulting in labels and icons not being visible.
macOS Tahoe 26.1 Beta 3 (25B5062e)
FB20765036
Example (Apple Developer App):
Hi,
I am in the process of implementing SharePlay in our app. The shared experience opens an Immersive Space, and we set systemCoordinator.configuration.supportsGroupImmersiveSpace = true
Now visionOS establishes a shared coordinate space for the immersive space.
From the docs:
To achieve consistent positioning of RealityKit entities across multiple devices in an immersive space during a SharePlay session
There are cases where we want to position content in front of the user (independent of the shared session, and individually for each user). Normally we do that with the transform retrieved via worldTrackingProvider.queryDeviceAnchor.originFromAnchorTransform, plus some Z offset and smooth interpolation.
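Roughly, the placement logic looks like this (a simplified sketch — the function name, the fixed 1 m offset, and the missing smoothing are illustrative only):

import ARKit
import RealityKit
import QuartzCore

func placeInFrontOfDevice(_ entity: Entity, using worldTracking: WorldTrackingProvider) {
    guard let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
        return
    }

    // Device pose relative to the app origin. During the SharePlay session this
    // appears to be expressed relative to the shared immersive-space origin instead.
    let originFromDevice = deviceAnchor.originFromAnchorTransform

    // Offset 1 m along the device's -Z axis (i.e. in front of the user).
    var deviceFromContent = matrix_identity_float4x4
    deviceFromContent.columns.3 = SIMD4<Float>(0, 0, -1, 1)

    entity.setTransformMatrix(originFromDevice * deviceFromContent, relativeTo: nil)
}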
This works fine in non-SharePlay sessions and the device transform is where I would expect it to be, but during the FaceTime call deviceAnchor.originFromAnchorTransform seems to use the shared origin of the immersive space, so I end up with a transform that can be offset.
Here is a video of the issue in action: https://streamable.com/205r2p
The blue rect is placed using AnchorEntity(.head, trackingMode: .continuous). This works regardless of the call and the entity is always placed based on the head position.
The green rect is adjusted on every frame using the transform I get from worldTrackingProvider.queryDeviceAnchor. As you can see it's offset.
Is there any way I can query this transform locally for the user during a FaceTime call?
Also I would like to know if it's possible to disable this automatic entity transform syncing behavior?
Setting entity.synchronization = nil results in the entity not showing up at all.
https://developer.apple.com/documentation/realitykit/synchronizationcomponent
Is SynchronizationComponent only relevant for the legacy MultiPeerConnectivity approach?
Thank you!
Hi,
I have a NSToolbar in my Mac Catalyst app with a space and flexible space item in it (https://developer.apple.com/documentation/appkit/nstoolbaritem/identifier/space).
On macOS Tahoe the space item is being rendered with a Liquid Glass effect and seems to be automatically grouped with the previous item. Is there a way to prevent this?
It basically adds some undesired padding next to the previous item and looks odd. The flexible space is rendered normally and as before.
I am talking about the space right next to the back chevron item.
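For reference, the toolbar setup is along these lines — a sketch; the back item identifier is a placeholder for my real custom item:

#if targetEnvironment(macCatalyst)
import AppKit
import UIKit

final class ToolbarController: NSObject, NSToolbarDelegate {
    static let backItem = NSToolbarItem.Identifier("com.example.back") // hypothetical

    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        // On Tahoe, the .space ends up glass-grouped with the preceding back item.
        [Self.backItem, .space, .flexibleSpace]
    }

    func toolbarAllowedItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        toolbarDefaultItemIdentifiers(toolbar)
    }

    func toolbar(_ toolbar: NSToolbar,
                 itemForItemIdentifier itemIdentifier: NSToolbarItem.Identifier,
                 willBeInsertedIntoToolbar flag: Bool) -> NSToolbarItem? {
        NSToolbarItem(itemIdentifier: itemIdentifier) // real item configuration omitted
    }
}
#endif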
Thanks for any hints!
Hello,
when creating a simple-as-it-gets Slider in SwiftUI and then running that app on Mac Catalyst with the macOS idiom enabled, the app crashes:
struct ContentView: View {
    @State private var sliderValue: Double = 0.4

    var body: some View {
        VStack {
            Slider(value: $sliderValue)
        }
        .padding()
    }
}
running this will result in an exception:
_setMinimumEnabledValue: is not supported on UISlider when running Catalyst apps in the Mac idiom. See UIBehavioralStyle for possible alternatives.
This is obviously not ideal and also apparently not documented.
Is there a workaround for this?
It used to work on macOS Sonoma.
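One possible workaround might be to drop down to UIKit and opt the slider back into the iOS behavioral style — only a sketch, under the assumption that UISlider's preferredBehavioralStyle = .pad avoids the Mac-idiom code path that throws:

import SwiftUI
import UIKit

struct PadStyleSlider: UIViewRepresentable {
    @Binding var value: Double

    func makeUIView(context: Context) -> UISlider {
        let slider = UISlider()
        // Keep the iOS-style slider behavior when running under Catalyst (assumption).
        slider.preferredBehavioralStyle = .pad
        slider.addTarget(context.coordinator,
                         action: #selector(Coordinator.valueChanged(_:)),
                         for: .valueChanged)
        return slider
    }

    func updateUIView(_ slider: UISlider, context: Context) {
        slider.value = Float(value)
    }

    func makeCoordinator() -> Coordinator { Coordinator(value: $value) }

    final class Coordinator: NSObject {
        let value: Binding<Double>
        init(value: Binding<Double>) { self.value = value }

        @objc func valueChanged(_ sender: UISlider) {
            value.wrappedValue = Double(sender.value)
        }
    }
}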
macOS 26 RC
Xcode 26 RC
FB20191635
Thanks!
Hi,
since iOS 18, UnlitMaterial and ShaderGraphMaterial have the option to disable tone mapping, e.g. via https://developer.apple.com/documentation/realitykit/unlitmaterial/init(applypostprocesstonemap:)
Is it possible to do the same for CustomMaterial? I tried initializing a CustomMaterial based on an UnlitMaterial where tone mapping is disabled, like so:
let unlitMat = UnlitMaterial(applyPostProcessToneMap: false)
let customMaterial = try CustomMaterial(
    from: unlitMat,
    surfaceShader: surfaceShader,
    geometryModifier: geometryModifier
)
but that does not seem to work. The colors of my texture still look altered in comparison to a plain UnlitMaterial or a ShaderGraphMaterial where it's disabled.
Any hints? Thank you!
Hello,
I'm encountering a runtime crash when building my visionOS app with Xcode 16.3 for visionOS 2.5. Our existing App Store/TestFlight app also crashes instantly when opened on visionOS 2.5, but works fine on visionOS 2.4, for example.
The app builds successfully but crashes on launch with this symbol lookup error (slightly adjusted because the forum complained regarding sensitive data):
Symbol not found: _$sSo22CLLocationCoordinate2DVSE12CoreLocationMc
Referenced from: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /private/var/containers/Bundle/Application/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/MyApp.app/MyApp.debug.dylib
Expected in: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /usr/lib/swift/libswiftCoreLocation.dylib
dyld config: DYLD_LIBRARY_PATH=/usr/lib/system/introspection DYLD_INSERT_LIBRARIES=/usr/lib/libLogRedirect.dylib:/usr/lib/libBacktraceRecording.dylib:/usr/lib/libMainThreadChecker.dylib:/System/Library/PrivateFrameworks/GPUToolsCapture.framework/GPUToolsCapture:/usr/lib/libViewDebuggerSupport.dylib
I've already implemented my own Codable conformance for CLLocationCoordinate2D:
extension CLLocationCoordinate2D: Codable {
    // implementation details...
}
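For context, the conformance is the usual latitude/longitude pair — a sketch of what such an extension typically looks like (not necessarily our exact implementation):

import CoreLocation

extension CLLocationCoordinate2D: Codable {
    enum CodingKeys: String, CodingKey {
        case latitude, longitude
    }

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.init(
            latitude: try container.decode(CLLocationDegrees.self, forKey: .latitude),
            longitude: try container.decode(CLLocationDegrees.self, forKey: .longitude)
        )
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(latitude, forKey: .latitude)
        try container.encode(longitude, forKey: .longitude)
    }
}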
This worked fine on previous visionOS/Xcode versions. Has anyone encountered this issue or found a solution?
System details:
macOS version: 15.3.2
Xcode version: 16.3
visionOS target: 2.5
Thank you!
Hello,
we have a RealityKit app that also runs on macOS via Catalyst.
For specific USD assets containing particle systems we have observed a reproducible crash.
Steps to reproduce:
Open Reality Composer Pro
Create new file
Create simple particle system (default one is fine)
Export as USDZ
Create project in Xcode
Call Entity.load(… and pass in your USD
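The last step in code, for reference (a sketch — "Particles.usdz" stands in for the exported file):

import RealityKit

func loadParticleAsset() throws -> Entity {
    guard let url = Bundle.main.url(forResource: "Particles", withExtension: "usdz") else {
        fatalError("Missing USDZ in bundle")
    }
    // Loading this asset triggers the crash described below.
    return try Entity.load(contentsOf: url)
}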
Running this on an Intel iMac with macOS Sequoia 15.3 will lead to a crash with the following console log:
validateWithDevice:4704: failed assertion `Render Pipeline Descriptor Validation
depthAttachmentPixelFormat (MTLPixelFormatDepth32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match.
'
(the same assertion is printed several times, interleaved)
Xcode version: 16.2.0
iMac 2020 3,8GHz Intel Core i7
macOS Sequoia 15.3
FB16477373
It would be great if this could be fixed quickly or a workaround provided, since it affects our production app. Thank you!
I have a working Xcode Cloud setup for my iOS and macOS targets, and I'm trying to add visionOS support. The issue is that Firebase requires using their source distribution for visionOS (instead of their default binary distribution).
Locally, this works by launching Xcode with:
open -a Xcode --env FIREBASE_SOURCE_FIRESTORE project.xcodeproj
For Xcode Cloud, I've added a ci_post_clone.sh script that sets this environment variable for visionOS builds:
#!/bin/bash
if [[ $CI_PRODUCT_PLATFORM == "xrOS" ]]; then
    echo "Running setup for visionOS..."
    export FIREBASE_SOURCE_FIRESTORE=1
fi
However, I'm getting the following error during build:
an out-of-date resolved file was detected at /.../project.xcworkspace/xcshareddata/swiftpm/Package.resolved, which is not allowed when automatic dependency resolution is disabled
So since setting FIREBASE_SOURCE_FIRESTORE=1 changes which SPM dependencies are required:
Normal setup uses: abseil-cpp-binary, grpc-binary
Source distribution needs: abseil-cpp-swiftpm, grpc-ios, boringssl-swiftpm
What's the recommended way to handle this in Xcode Cloud when maintaining builds for all platforms? Should I be using separate workflows with different branches for different platforms? Or is there a better approach?
System:
Xcode 16.2
Using SPM for dependency management
Firebase iOS SDK 10.29.0
Building for iOS, macOS, and visionOS
Thanks in advance for any guidance!
Hello,
given this following simple SwiftUI setup:
struct ContentView: View {
    var body: some View {
        CustomFocusView()
    }
}

struct CustomFocusView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        color
            .frame(width: 128, height: 128)
            .focusable(true)
            .focused($isFocused)
            .onTapGesture {
                isFocused.toggle()
            }
            .onKeyPress("a") {
                print("A pressed")
                return .handled
            }
    }

    var color: Color {
        isFocused ? .blue : .red
    }
}
If I run this via Mac – Designed for iPad, the CustomFocusView toggles focus as expected and cycles through red and blue.
Now if I run this exact same code via Mac Catalyst, absolutely nothing happens, and so far I haven't been able to get this view to accept focus at all. Is this expected? I would appreciate any hints on how to get this working.
Thanks and best regards!
Hi,
since RealityKit 4 now supports Blend Shapes, I was wondering whether there are any workflow or tooling recommendations for baking/exporting them into a USDZ.
Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)?
So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
Hi,
We've been leveraging App Clips on iOS for a while now to distribute native-app-quality AR experiences (utilising ARKit and RealityKit) with the accessibility of a website.
This has been a crucial differentiator for us and is a core driver for our business.
Since our authoring tools also allow running the same AR experiences on Vision Pro, it would be amazing if they could be triggered by App Clips there as well. We've received this feedback from clients and users multiple times, and since there already seems to be some basic App Clips support integrated into the system (e.g. when registering the custom lens inserts), we would immensely appreciate it if this feature could be opened up for 3rd-party developers as well.
Associated feedback ID: FB13348462
Thank you!
Devices running iOS 18 using RealityKit do not seem to receive lighting supplied via ARKit Environment Texturing (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing).
Instead just a default IBL is used by RealityKit.
This happens with RealityView as well as ARView.
It also happens when I explicitly opt in to environment texturing:
let worldTrackingConfig = ARWorldTrackingConfiguration()
worldTrackingConfig.environmentTexturing = .automatic
arView.session.run(worldTrackingConfig)
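For what it's worth, a quick way to check whether the session produces environment probes at all — a diagnostic sketch using the standard ARSessionDelegate callback (the class name is arbitrary):

import ARKit

final class ProbeLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for probe in anchors.compactMap({ $0 as? AREnvironmentProbeAnchor }) {
            // If probes arrive here but rendering still uses the default IBL,
            // the problem is on the RealityKit side rather than the ARKit session.
            print("Probe added, extent: \(probe.extent), has texture: \(probe.environmentTexture != nil)")
        }
    }
}

Assign an instance as arView.session.delegate before running the configuration.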
Even the Xcode AR Template has this issue.
I'm attaching a screenshot of the sample app running on iOS 18 where it's broken and from iOS 17 where it works as expected.
I hope this can get resolved quickly since I see it as a major regression.
Feedback ID: FB15091335
UPDATE:
It works on my older iPhone XS (iOS 18 22A5282m)
Broken on iPad Pro (11-inch) (3rd generation) (iPadOS 18.0 (22A5350a))
Maybe it's related to LiDAR?
Thank you!
iOS 17 (works):
iOS 18 (broken):
Hi,
I have a RealityKit app that I am building with Xcode 16. The app has a minimum deployment target of iOS 17. If I run it on an iOS 17 device the app crashes:
dyld[15716]: Symbol not found: _$s10RealityKit13ShapeResourceC14generateConvex4fromAcA04MeshD0C_tYaKFZ
Referenced from: …
Expected in: …/System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
My code looks something like this:
@available(iOS, introduced: 13.0, obsoleted: 18.0)
@MainActor @preconcurrency func generateNonAsyncConvexShapeResource(from meshResource: MeshResource) throws -> ShapeResource {
    ShapeResource.generateConvex(from: meshResource)
}

@available(iOS 18.0, *)
func generateConvexShapeAsync(from meshResource: MeshResource) async throws -> ShapeResource {
    // This will only be available for iOS 18 and above
    return try await ShapeResource.generateConvex(from: meshResource)
}

if let meshResource = try? modelEntity.model?.mesh.applying(transform: transform.matrix) {
    if #available(visionOS 1.0, iOS 18.0, *) {
        try? await generateConvexShapeAsync(from: meshResource) // await shapeResources.append(.generateConvex(from: meshResource))
    } else {
        try? generateNonAsyncConvexShapeResource(from: meshResource)
    }
}
So I actually do check the OS version and only call the async variant on iOS 18.
Any hints on how to fix that?
Thanks!