TL;DR: RealityKit and Reality Composer Pro aren't forward or backward compatible with each other, and the resulting error message is terse and unhelpful. (FB14828873)
So far, I've been sticking with Xcode 16.4 for development and only using Xcode 26.0 beta experimentally.
Yesterday, I used xcode-select to switch to Xcode 26.0 beta 3 to test it, but I forgot to switch back.
Consequently, this morning I unintentionally used the future Reality Composer Pro (the version included with Xcode 26) to make a small change to a USD file.
Now I realize that, if I'm unlucky, Reality Composer Pro may have silently introduced a small change into the USD file that will make RealityKit fail to read it on iOS 18 and visionOS 2. In the past, this kind of failure has cost me hours of debugging to track down its source, often a single line in the USD file, because RealityKit reports nothing more specific than "the operation couldn't be completed".
As an analogy: it's as if, during ordinary development (not involving Reality Composer Pro), Xcode didn't warn you about specific API availability conflicts, and instead failed with a generic error message, without highlighting the line in your Swift file that caused the error.
I've loaded a ShaderGraphMaterial from a RealityKit content bundle and I'm attempting to access the initial values of its parameters using getParameter(handle:), but this method appears to always return nil:
let shaderGraphMaterial = try await ShaderGraphMaterial(named: "MyMaterial", from: "MyFile")
let namedParameterValue = shaderGraphMaterial.getParameter(name: "myParameter")
// This prints the value of the `myParameter` parameter, as expected.
print("namedParameterValue = \(namedParameterValue)")
let handle = ShaderGraphMaterial.parameterHandle(name: "myParameter")
let handleParameterValue = shaderGraphMaterial.getParameter(handle: handle)
// Expected behavior: prints the value of the `myParameter` parameter, as above.
// Observed behavior: prints `nil`.
print("handleParameterValue = \(handleParameterValue)")
Is this expected behavior?
Based on the documentation at https://developer.apple.com/documentation/realitykit/shadergraphmaterial/getparameter(handle:) I'd expect getParameter(handle:) to return the value of the parameter, just as getParameter(name:) does.
I've tested this on iOS 18.5 and iOS 26.0 beta 2.
Assuming getParameter(handle:) is working as designed, is the following ShaderGraphMaterial extension an appropriate workaround, or can you recommend a better approach?
Thank you.
public extension ShaderGraphMaterial {
    /// Reassigns the values of all named material parameters using the handle-based API.
    ///
    /// This works around an issue where, at least as of RealityKit 26.0 beta 2 and
    /// earlier, `getParameter(handle:)` always returns `nil` when used to read the
    /// initial value of a shader graph material parameter on a material loaded via
    /// `ShaderGraphMaterial(named:from:in:)`, whereas `getParameter(name:)` works
    /// as expected.
    private mutating func copyNamedParametersToHandles() {
        for parameterName in self.parameterNames {
            if let value = self.getParameter(name: parameterName) {
                let handle = ShaderGraphMaterial.parameterHandle(name: parameterName)
                do {
                    try self.setParameter(handle: handle, value: value)
                } catch {
                    assertionFailure("Cannot set value for parameter \"\(parameterName)\": \(error)")
                }
            }
        }
    }
}
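For completeness, here's how I'm invoking it: a convenience loader declared in the same file (so it can call the private method). The name loadWithHandleWorkaround is my own, not part of RealityKit:

public extension ShaderGraphMaterial {
    /// Loads a shader graph material and immediately re-registers its named
    /// parameters with the handle-based API, as a workaround for the issue above.
    static func loadWithHandleWorkaround(named name: String,
                                         from bundleFileName: String) async throws -> ShaderGraphMaterial {
        var material = try await ShaderGraphMaterial(named: name, from: bundleFileName)
        material.copyNamedParametersToHandles()
        return material
    }
}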
Is there a way to permanently disable PHASE SDK logging? It seems to be a lot chattier than Apple's other SDKs.
While developing a RealityKit app that uses AudioPlaybackController, I must manually hide the PHASE SDK log output several times each day so I can see my app's log messages.
Thank you.
On iPadOS 26 beta, the navigation bar can appear inset underneath the status bar (FB18241928)
This bug does not occur on iOS 18.
It occurs when a full-screen modal view controller that hides the status bar is presented, the device orientation changes, and the modal is then dismissed.
It appears to affect only iPad, not iPhone.
It reproduces both in the simulator and on device.
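Here's a minimal sketch of the presentation sequence (my own repro code; the five-second delay just stands in for a manual rotation):

import UIKit

// A full-screen modal that hides the status bar.
final class ModalViewController: UIViewController {
    override var prefersStatusBarHidden: Bool { true }
}

extension UIViewController {
    // Present the modal, rotate the device while it's on screen, then dismiss;
    // the presenting controller's navigation bar can then appear inset
    // underneath the status bar.
    func reproduceNavigationBarBug() {
        let modal = ModalViewController()
        modal.modalPresentationStyle = .fullScreen
        present(modal, animated: true)
        // Rotate the device here (e.g. portrait to landscape), then:
        DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
            modal.dismiss(animated: true)
        }
    }
}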
Thank you for investigating this issue.
I'm watching the session video "What's new in Xcode 26" and when I try out the new #Playground macro (in the context of a local package in my project) I see only this activity indicator for about 10-15 minutes:
Eventually I do see this error:
Despite the error, my local package does not import SwiftUI and has no dependencies other than Foundation.
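For reference, the playground itself is trivial, along these lines (the contents are just a placeholder):

import Playgrounds

#Playground {
    // A trivial computation, just to exercise the macro.
    let squares = (1...5).map { $0 * $0 }
    print(squares)
}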
I checked the Xcode 26 release notes and I don't see this issue mentioned.
A clean build of my project takes 55 seconds. Is it expected behavior that #Playground would require 10+ minutes to spin up?
Is there anything I can do to make the new #Playground macro work correctly?
Thank you.
I've installed Xcode 26.0 beta (17A5241e), and if I select my project from the Welcome to Xcode dialog, Xcode crashes.
I've attempted this 10 times with the same result each time.
What should I do?
Thank you.
We're using RealityKit to create a science education AR app for iOS, iPadOS, and visionOS.
In the WWDC25 session video "Bring your SceneKit project to RealityKit" https://developer.apple.com/videos/play/wwdc2025/288 at 8:15, it's explained that when using RealityKit, RealityView should be used in all cases, whereas in the past, SceneKit required SCNView, SceneView, or ARSCNView, depending on an app's requirements.
Because the initial development of our app on iOS predates iOS 18's RealityView, our app currently uses ARView to render RealityKit AR content on iOS and iPadOS.
Is it recommended that we migrate to RealityView, or can we safely continue using our existing ARView implementation? We'd prefer to avoid unnecessary development cost.
If migrating from ARView to RealityView is recommended, what specific benefits should we expect from this transition?
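For context, here's our current understanding of the minimal RealityView-based equivalent on iOS, as a sketch (assuming iOS 18+; the entity name "Scene" is a placeholder):

import SwiftUI
import RealityKit

// A minimal iOS AR scene using RealityView instead of ARView.
struct ARContentView: View {
    var body: some View {
        RealityView { content in
            // Opt in to the camera feed with spatial tracking, roughly
            // equivalent to ARView's .ar camera mode (our understanding).
            content.camera = .spatialTracking

            if let entity = try? await Entity(named: "Scene") {
                content.add(entity)
            }
        }
    }
}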
Thank you.
Liquid Glass was introduced as a universal design language for all platforms, but isn't supported by visionOS 26 beta.
For a small team creating a visionOS app targeted for release in fall 2026, should we focus our design work on Liquid Glass or on the existing visionOS design language?
In Reality Composer Pro 2.0 (448.120.2), if I use the "Create new scene in the project" button more than once, this feature doesn't seem to work.
The second time I use "Create new scene in the project", Reality Composer Pro displays a new empty USDA file in the project browser with the wrong icon (a yellow document icon instead of a 3D box icon).
Also, it doesn't create the new scene's USDA file on disk, display the scene in the entity hierarchy browser, or open a new tab.
Consequently, whenever I want to add more than one new scene to my project, I have to repeatedly quit and restart Reality Composer Pro.
Just now, for example, I had to quit and restart Reality Composer Pro six times to create seven new scenes.
Is this a known issue?
Repro steps:
In a Reality Composer Pro project, create a new folder using the "add folder" icon button in the project browser.
Inside the new folder, click the "Create new scene in the project" icon button.
Click the "Create new scene in the project" icon button a second time.
Expected behavior:
A new USDA file is created on disk. The new USDA file's root entity appears in the entity hierarchy browser and a corresponding new tab is created.
Observed behavior:
The USDA file for this scene is not created on disk, and it does not appear in the entity hierarchy browser or in a new tab. In the project browser, a yellow document icon appears, but it does not appear to correspond to an actual USDA file.
Thank you for any insight you can provide about this issue.
My experience has been that ModelEntity(named:in:) can be used to load a USD file with a simple structure consisting of entities and model entities, and, critically, it will flatten the entity hierarchy down to a single ModelEntity, presumably reducing the number of draw calls.
However, can anyone verify that the following is true?
If ModelEntity(named:in:) is used to load a USD file from a RealityKit content bundle, it may fail when the USD file contains more complex data, such as shader graph material definitions, or perhaps for some other reason. I am not sure.
AND the error that ModelEntity(named:in:) throws in this case is
Cannot load RealityKitContent entity: Failed to find resource with name "<name>" in bundle
which would literally suggest that the file does not exist, instead of what I assume the error actually is, which is "the file exists but its entity hierarchy could not be flattened to a single ModelEntity" ?
Is that an accurate description of the known behavior of ModelEntity(named:in:)?
I understand that I could use Entity(named:in:) instead, without the flattening feature. My question is really more about the seemingly misleading error message.
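For context, here's the fallback pattern I'm currently using (my own workaround, written on the assumption that the "Failed to find resource" error can actually mean "could not flatten to a single ModelEntity"; realityKitContentBundle comes from my RealityKit content package):

import RealityKit
import RealityKitContent

// Attempts the flattened ModelEntity load first; if that throws, falls back
// to Entity(named:in:), which preserves the hierarchy.
func loadEntity(named name: String) async throws -> Entity {
    do {
        return try await ModelEntity(named: name, in: realityKitContentBundle)
    } catch {
        // Despite the misleading error text, the file may exist but be too
        // complex to flatten, so retry without flattening.
        return try await Entity(named: name, in: realityKitContentBundle)
    }
}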
Thank you for any clarification you can provide.
In Xcode, when you type a word like "don't", a second apostrophe is automatically inserted after the word, and you have to delete it manually.
Repro steps:
Create a new empty Markdown file or XML file.
Open the file in Xcode.
Type the word “don’t”.
Expected behavior: The word “don’t” appears.
Observed behavior: The word “don’t” appears, followed by an extra single quote.
Starting with iOS 18.0 beta 1, I've noticed that RealityKit frequently crashes in the simulator when an app launches and presents an ARView.
I was able to create a small sample app with repro steps that demonstrates the issue, and I've submitted feedback: FB16144085
I've included a crash log with the feedback.
If possible, I'd appreciate it if an Apple engineer could investigate and suggest a workaround. It's awkward to be restricted to the iOS 17 simulator, which does not exhibit this behavior.
Please let me know if there's anything I can do to help.
Thank you.
In Xcode 16, when I use File > New > Project to create a new iOS app project, Xcode automatically creates an asset catalog file called Assets.xcassets in the project.
However, if I later use File > New > File from Template, for example to add an asset catalog to an embedded Swift package within a project, Xcode suggests the default filename Media.xcassets for the file, instead of Assets.xcassets.
Why is the default name for this file different in each context?
Thank you for explaining this troubling inconsistency, which causes me to lie awake at night.
For a UIKit app based on scenes (UIScene), is it safe to reference UserDefaults in code that is executed from UIApplicationDelegate/application(_: didFinishLaunchingWithOptions:) ?
I've read that in iOS 15, there were undocumented scenarios involving app prewarming that would cause UserDefaults reads to fail within a window of time after device reboots, as described at https://christianselig.com/2024/10/beware-userdefaults/
The failure mode is that an app would be released, and months later, a small fraction of users would report failures consistent with UserDefaults reads unexpectedly returning nil, causing a loss of data. The user experience is bad, and debugging this behavior is then challenging because of how rarely it occurs.
Apple's https://developer.apple.com/documentation/uikit/app_and_environment/responding_to_the_launch_of_your_app/about_the_app_launch_sequence#3894431 seems to suggest that prewarming only executes an app "up until, but not including when main() calls UIApplicationMain(_:_:_:_:)", but https://stackoverflow.com/questions/71025205/ios-15-prewarming-causing-appwilllaunch-method-when-prewarm-is-done documents that UIApplicationDelegate/application(_: didFinishLaunchingWithOptions:) has in fact been observed executing during app prewarming in scene-based apps.
So, my question: In an app based on scenes, if I'd like to reference UserDefaults within UIApplicationDelegate/application(_: didFinishLaunchingWithOptions:), when is it safe to do this?
I'm guessing the answer is one of these:
Never.
Only in apps that don't support scenes.
Only in iOS 16 or later.
Only in iOS 17 or later.
Is it guaranteed safe to reference UserDefaults in UIWindowSceneDelegate/scene(_:willConnectTo:options:) or later?
Is there documentation from Apple regarding this issue?
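In the meantime, the defensive pattern I'm considering is to defer the first UserDefaults read until scene connection, on the (unconfirmed) assumption that prewarming never reaches scene(_:willConnectTo:options:). The view controller names here are hypothetical placeholders:

import UIKit

// Hypothetical placeholder view controllers.
final class MainViewController: UIViewController {}
final class OnboardingViewController: UIViewController {}

final class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }

        // The first UserDefaults read happens here rather than in
        // application(_:didFinishLaunchingWithOptions:), assuming (unconfirmed)
        // that prewarming never reaches scene connection.
        let hasOnboarded = UserDefaults.standard.bool(forKey: "hasOnboarded")

        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = hasOnboarded ? MainViewController() : OnboardingViewController()
        window.makeKeyAndVisible()
        self.window = window
    }
}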
Thank you.
New in iOS 17.5 is UIImpactFeedbackGenerator/impactOccurred(at:), which generates haptic feedback at a specific screen location.
https://developer.apple.com/documentation/uikit/uiimpactfeedbackgenerator/4403143-impactoccurred
Which devices support this?
How does this work if the Taptic Engine has a fixed physical location?
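For reference, here's my minimal usage sketch, based on my reading of the iOS 17.5 API: init(view:) associates the generator with a view, and the point passed to impactOccurred(at:) is interpreted in that view's coordinate space. Treat this as an assumption, not confirmed behavior:

import UIKit

final class HapticsDemoViewController: UIViewController {
    // Associate the generator with a view so locations can be resolved.
    private lazy var feedbackGenerator = UIImpactFeedbackGenerator(view: view)

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        view.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        // My understanding: the point is in the associated view's coordinate space.
        let location = recognizer.location(in: view)
        feedbackGenerator.impactOccurred(at: location)
    }
}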