Hello,
after an Apple Maps update last week, the satellite imagery now looks a lot worse than before.
What previously looked lifelike and lush now looks very sad and wintry. Also, the contrast seems way too extreme.
Attached is a sample image.
FB: FB11716831
Any chance that this could be reverted to the old version?
Hello,
I recently converted from manual dictionary-based JSON serialisation to Codable and noticed that this resulted in a pretty significant growth in binary size (measured via the App Thinning Size Report).
The difference between the Codable and non-Codable builds is ~800 KB.
Since our app also supports App Clips, I can no longer stay under the 10 MB universal bundle size limit.
Is there any way I can make this a little leaner? Would it help to manually implement all Codable methods instead of relying on compiler synthesis?
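For concreteness, this is the kind of manual conformance I have in mind (a minimal sketch with a hypothetical model type):
struct User: Codable {
    var id: Int
    var name: String

    private enum CodingKeys: String, CodingKey {
        case id, name
    }

    // Hand-written instead of compiler-synthesized.
    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        id = try container.decode(Int.self, forKey: .id)
        name = try container.decode(String.self, forKey: .name)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
    }
}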
Thanks for any hints!
Hello,
I've done a couple of tests and noticed that when I run an ARKit session with, for example, a world- or geo-tracking configuration that has environment texturing set to either .manual or .automatic, and I walk around with the device for an extended distance, there is a pretty noticeable increase in memory usage. I'm not implying that it's a leak, but it seems like the system creates lots and lots of environment probes and does not remove older ones.
An example:
This is a barebones Xcode RealityKit starter project that spawns a couple of cubes with an ARGeoTracking configuration. After ~7 minutes of walking around (roughly a distance of 100-200 meters), it uses an additional 300 MB of RAM.
I've seen cases where users walked around for a while and the app eventually crashed because of this.
When environment texturing is disabled, RAM usage pretty much stays the same, no matter how far I walk.
Is there a recommended way to handle this? Should I remove probes manually after a while, or eventually disable environment texturing altogether at some point (and will that preserve the current cube map)?
I would appreciate any guidance.
With the Xcode example project you can easily recreate the issue by modifying it just a little and then walking around for a while with statistics enabled.
import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.automaticallyConfigureSession = false
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let config = ARWorldTrackingConfiguration()
        config.environmentTexturing = .manual
        arView.session.run(config)

        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
}
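For the record, here is a sketch of what manual probe pruning might look like. It assumes that removing an AREnvironmentProbeAnchor via session.remove(anchor:) also frees its cube map, which I haven't verified, and the cap of 10 probes is arbitrary:
import ARKit

class ProbePruner: NSObject, ARSessionDelegate {

    private var probeAnchors: [AREnvironmentProbeAnchor] = []
    private let maxProbes = 10 // arbitrary cap, untested

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let probe as AREnvironmentProbeAnchor in anchors {
            probeAnchors.append(probe)
        }
        // Drop the oldest probes once we exceed the cap.
        while probeAnchors.count > maxProbes {
            session.remove(anchor: probeAnchors.removeFirst())
        }
    }

    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
        // Keep our bookkeeping in sync with the session.
        probeAnchors.removeAll { probe in
            anchors.contains { $0.identifier == probe.identifier }
        }
    }
}
This would be hooked up via arView.session.delegate; whether removing probes this way actually releases the memory is exactly what I'm unsure about.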
Using OcclusionMaterial on macOS and iOS works fine in non-AR mode when I set the background to a simple color (https://developer.apple.com/documentation/realitykit/arview/environment-swift.struct/color), but when I set a custom skybox (https://developer.apple.com/documentation/realitykit/arview/environment-swift.struct/background-swift.struct/skybox(_:)), the OcclusionMaterial renders as fully black. I would expect it to properly occlude the content and let the skybox behind it show through.
This happens with both ARView and RealityView, on the current iOS/macOS betas as well as on older systems, e.g. iOS 17 and macOS Sonoma.
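A minimal sketch of the setup I'm describing (non-AR ARView on iOS; the "Skybox" resource name is a placeholder):
import RealityKit
import UIKit

// With a .color background the occluder works; with .skybox it renders black.
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)

if let skybox = try? EnvironmentResource.load(named: "Skybox") {
    arView.environment.background = .skybox(skybox)
}

// Explicit camera so the non-AR scene has a defined viewpoint.
let cameraAnchor = AnchorEntity(world: SIMD3<Float>(0, 0, 0))
cameraAnchor.addChild(PerspectiveCamera())
arView.scene.anchors.append(cameraAnchor)

// Sphere with OcclusionMaterial 1 m in front of the camera.
let occluder = ModelEntity(
    mesh: .generateSphere(radius: 0.2),
    materials: [OcclusionMaterial()]
)
let anchor = AnchorEntity(world: SIMD3<Float>(0, 0, -1))
anchor.addChild(occluder)
arView.scene.anchors.append(anchor)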
Feedback ID: FB15081053
Hello,
I'm encountering a runtime crash when building my visionOS app with Xcode 16.3 for visionOS 2.5. Our existing App Store/TestFlight app also crashes instantly on launch on visionOS 2.5 but works fine on e.g. visionOS 2.4.
The app builds successfully but crashes on launch with this symbol lookup error (slightly adjusted because the forum complained about sensitive data):
Symbol not found: _$sSo22CLLocationCoordinate2DVSE12CoreLocationMc
Referenced from: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /private/var/containers/Bundle/Application/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/MyApp.app/MyApp.debug.dylib
Expected in: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /usr/lib/swift/libswiftCoreLocation.dylib
dyld config: DYLD_LIBRARY_PATH=/usr/lib/system/introspection DYLD_INSERT_LIBRARIES=/usr/lib/libLogRedirect.dylib:/usr/lib/libBacktraceRecording.dylib:/usr/lib/libMainThreadChecker.dylib:/System/Library/PrivateFrameworks/GPUToolsCapture.framework/GPUToolsCapture:/usr/lib/libViewDebuggerSupport.dylib
I've already implemented my own Codable conformance for CLLocationCoordinate2D:
extension CLLocationCoordinate2D: Codable {
    // implementation details...
}
This worked fine on previous visionOS/Xcode versions. Has anyone encountered this issue or found a solution?
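In case it helps others, one workaround I'm considering is dropping the retroactive conformance entirely in favor of a wrapper type (a hypothetical sketch; CodableCoordinate is not a real API):
import CoreLocation

// Encode/decode through a wrapper instead of extending CLLocationCoordinate2D.
struct CodableCoordinate: Codable {
    var latitude: CLLocationDegrees
    var longitude: CLLocationDegrees

    init(_ coordinate: CLLocationCoordinate2D) {
        latitude = coordinate.latitude
        longitude = coordinate.longitude
    }

    var coordinate: CLLocationCoordinate2D {
        CLLocationCoordinate2D(latitude: latitude, longitude: longitude)
    }
}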
System details:
macOS version: 15.3.2
Xcode version: 16.3
visionOS target: 2.5
Thank you!
Hi,
since iOS 18, UnlitMaterial and ShaderGraphMaterial have an option to disable tone mapping, e.g. via https://developer.apple.com/documentation/realitykit/unlitmaterial/init(applypostprocesstonemap:)
Is it possible to do the same for CustomMaterial? I tried initializing a CustomMaterial based on an UnlitMaterial where tone mapping is disabled, like so:
let unlitMat = UnlitMaterial(applyPostProcessToneMap: false)
let customMaterial = try CustomMaterial(
    from: unlitMat,
    surfaceShader: surfaceShader,
    geometryModifier: geometryModifier
)
but that does not seem to work. The colors of my texture still look altered compared to a plain UnlitMaterial or a ShaderGraphMaterial where it's disabled.
Any hints? Thank you!
Hi,
I am in the process of implementing SharePlay in our app. The shared experience opens an Immersive Space and we set systemCoordinator.configuration.supportsGroupImmersiveSpace = true
Now visionOS establishes a shared coordinate space for the immersive space.
From the docs:
To achieve consistent positioning of RealityKit entities across multiple devices in an immersive space during a SharePlay session
There are cases where we want to position content in front of the user (independent of the shared session, and for each user individually). Normally we use the transform retrieved via worldTrackingProvider.queryDeviceAnchor.originFromAnchorTransform to position content in front of the user (plus some Z offset and smooth interpolation).
This works fine in non-SharePlay instances, and the device transform is where I would expect it to be, but during the FaceTime call deviceAnchor.originFromAnchorTransform seems to use the shared origin of the immersive space, so I end up with a transform that might be offset.
Here is a video of the issue in action: https://streamable.com/205r2p
The blue rect is placed using AnchorEntity(.head, trackingMode: .continuous). This works regardless of the call, and the entity is always placed based on the head position.
The green rect is adjusted on every frame using the transform I get from worldTrackingProvider.queryDeviceAnchor, roughly as sketched below. As you can see, it's offset.
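For reference, this is approximately the per-frame update for the green rect (simplified: no smoothing, and the 1 m offset is hardcoded):
import ARKit
import RealityKit
import QuartzCore

// Place the entity 1 m in front of the device, based on the device anchor.
func updatePlacement(of entity: Entity, using worldTracking: WorldTrackingProvider) {
    guard worldTracking.state == .running,
          let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
    else { return }

    // During the FaceTime call this transform appears to be expressed
    // relative to the shared immersive-space origin, not the local one.
    var offset = matrix_identity_float4x4
    offset.columns.3.z = -1 // 1 m along the device's -Z axis
    entity.setTransformMatrix(device.originFromAnchorTransform * offset, relativeTo: nil)
}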
Is there any way I can query this transform locally for the user during a FaceTime call?
Also, I would like to know whether it's possible to disable this automatic entity transform syncing behavior.
Setting entity.synchronization = nil results in the entity not showing up at all.
https://developer.apple.com/documentation/realitykit/synchronizationcomponent
Is SynchronizationComponent only relevant for the legacy MultiPeerConnectivity approach?
Thank you!
Hi, I've just migrated to Swift Tools 6.2 and package traits, and I'm encountering an issue when using traits with multiple targets in the same Xcode workspace.
Setup:
Main iOS app target
App Clip target
Both consume the same local packages (e.g., UIComponents)
What I'm trying to achieve:
Main app imports packages without the COMPACT_BUILD trait
App Clip imports packages with the COMPACT_BUILD trait enabled
Package configuration (simplified):
// UIComponents/Package.swift
let package = Package(
    name: "UIComponents",
    platforms: [.iOS(.v18)],
    traits: [
        .trait(name: "COMPACT_BUILD", description: "Minimal build for App Clips"),
    ],
    // ...
    targets: [
        .target(
            name: "UIComponents",
            dependencies: [...],
            swiftSettings: [
                .define("COMPACT_BUILD", .when(traits: ["COMPACT_BUILD"])),
            ]
        ),
    ]
)
In the code:
#if !COMPACT_BUILD
// Excluded from App Clip
#endif
The consumer packages:
Main app's package imports without trait:
.package(path: "../UIComponents")
App Clip's package imports with trait:
.package(path: "../UIComponents", traits: ["COMPACT_BUILD"])
The problem:
When building the main app target, the COMPACT_BUILD compiler condition is unexpectedly active — even though the main app's dependency chain never enables that trait. It seems like the trait enabled by the App Clip target is "leaking" into the main app build.
I confirmed this by adding #error("COMPACT_BUILD is active") — it triggers when building the main app, which shouldn't happen.
If I disable the App Clip target from the build scheme, the main app builds correctly with COMPACT_BUILD not defined.
I am also able to build the App Clip separately.
Environment:
Xcode 26.2
swift-tools-version: 6.2
iOS 26.2
Questions:
Is this expected behavior with Xcode's SPM integration? Are traits resolved workspace-wide rather than per-target?
Is there a workaround to have different trait configurations for different targets consuming the same package?
Or do I need to fall back to separate package targets (e.g., UIComponents and UIComponentsCompact, as sketched below) to achieve this?
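For reference, the fallback I have in mind would look roughly like this (an untested sketch; since SwiftPM doesn't allow two targets to share a source directory, any COMPACT_BUILD-guarded code would have to live in the compact target's own sources):
// swift-tools-version: 6.2
import PackageDescription

// Hypothetical fallback manifest without traits.
let package = Package(
    name: "UIComponents",
    platforms: [.iOS(.v18)],
    products: [
        .library(name: "UIComponents", targets: ["UIComponents"]),
        .library(name: "UIComponentsCompact", targets: ["UIComponentsCompact"]),
    ],
    targets: [
        // Shared code used by both variants.
        .target(name: "UIComponentsCore"),
        // Full build for the main app.
        .target(name: "UIComponents", dependencies: ["UIComponentsCore"]),
        // Minimal build for the App Clip; the define only applies to
        // this target's own sources.
        .target(
            name: "UIComponentsCompact",
            dependencies: ["UIComponentsCore"],
            swiftSettings: [.define("COMPACT_BUILD")]
        ),
    ]
)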
Any guidance would be appreciated. Thanks!