Hello,
PhysicallyBasedMaterial in RealityKit 2 contains a blending property to adjust the transparency of a material.
Is there a way to animate this over time to fade entities in and out?
I've tried the new FromToByAnimation API but could not figure out if there is a supported BindPath for the transparency.
Ideally what I would like to achieve is something similar to SceneKit's SCNAction.fadeIn(duration: …), which also worked on a whole node.
I figured I could also go the route of a custom fragment shader here, though that seems overkill.
As Reality Composer also supports fade actions, I would assume that this is at least supported behind the scenes.
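For reference, here is a rough sketch of the manual fallback I have in mind: a custom System that reassigns the blending opacity every frame. FadeComponent and FadeSystem are my own names (not part of RealityKit), both types would need to be registered at startup, and this assumes that reassigning materials per frame is acceptable.
import RealityKit

// Hypothetical fade-in via a custom System; register with
// FadeComponent.registerComponent() and FadeSystem.registerSystem() at app launch.
struct FadeComponent: Component {
    var duration: Float = 1.0
    var elapsed: Float = 0
}

class FadeSystem: System {
    private static let query = EntityQuery(where: .has(FadeComponent.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard var fade = entity.components[FadeComponent.self],
                  var model = entity.components[ModelComponent.self] else { continue }

            fade.elapsed += Float(context.deltaTime)
            let opacity = min(fade.elapsed / fade.duration, 1)

            // Reassign the blending opacity on every physically based material.
            model.materials = model.materials.map { material in
                guard var pbr = material as? PhysicallyBasedMaterial else { return material }
                pbr.blending = .transparent(opacity: .init(floatLiteral: opacity))
                return pbr
            }
            entity.components.set(model)

            if opacity >= 1 {
                entity.components[FadeComponent.self] = nil
            } else {
                entity.components.set(fade)
            }
        }
    }
}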
Thanks for any help!
Hello,
is there a recommended way to render Menu items, e.g. in a SwiftUI context menu, with an icon (SF Symbols)?
Let's say I have the following setup:
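Roughly like this (illustrative reconstruction, since the exact snippet isn't included here):
import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Long press me")
            .contextMenu {
                // Two menu buttons, each with an SF Symbol via Label.
                Button {
                    // action
                } label: {
                    Label("Duplicate", systemImage: "plus.square.on.square")
                }
                Button(role: .destructive) {
                    // action
                } label: {
                    Label("Delete", systemImage: "trash")
                }
            }
    }
}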
Both buttons render fine on native macOS (e.g. Sonoma), but Catalyst refuses to render the symbol at all. I tried every possible combination I could think of. The only way I found was to directly copy and paste a symbol from the SF Symbols app and inline it in the label string as Unicode. Unfortunately I have a couple of custom SF symbols, so this isn't really an option for me.
I feel like this is a perfectly valid use case, as it makes the menu visually much easier to scan.
With UIKit and Ventura this at least worked for menu bar items, but it now also seems broken on Sonoma.
I would greatly appreciate any hints. Thanks!
Hello,
I have a macOS Catalyst app that I now began updating and building against the iOS 18/macOS Sequoia SDKs. Most things appear to be working just fine as before, apart from my NSToolbar.
At the root of my app I am presenting a UISplitViewController which gets a custom SidebarViewController and a UITabBarController as its viewControllers.
Then at some point in the app's lifecycle the UITabBarController presents another view controller modally. I then associate the controller's window with a custom NSToolbar like this:
// titleBar refers to the window scene's titlebar (window.windowScene?.titlebar)
let toolbar = NSToolbar(identifier: "mainToolbar")
toolbar.displayMode = .iconAndLabel
toolbar.delegate = self
toolbar.allowsUserCustomization = false
titleBar.toolbarStyle = .automatic
titleBar.titleVisibility = .hidden
titleBar.toolbar = toolbar
I also disable automatic NSToolbar hosting via: https://developer.apple.com/documentation/uikit/uinavigationbardelegate/3987959-navigationbarnstoolbarsection (returning .none).
Now all of this worked fine on macOS Sonoma and previous versions but on Sequoia my custom toolbar refuses to show up.
My suspicion is that it has something to do with the new tab and sidebar behaviour introduced with the new SDKs (https://developer.apple.com/documentation/uikit/uinavigationbardelegate/3987959-navigationbarnstoolbarsection).
For now within my UITabBarController I was able to revert to the old look using:
if #available(iOS 18.0, *) {
    mode = .tabSidebar
    sidebar.isHidden = true
    isTabBarHidden = true
}
This results in a look similar to the previous macOS version, but my NSToolbar unfortunately remains hidden.
Is there an easy fix for this? Since I am a solo developer, I would currently prefer to spend my available resources on other features and adopt the new tab/sidebar behaviour a couple of months down the line.
Appreciate any help and hints, thanks!
There used to be a toolbar here on the right side. ↑
HoverEffectComponent on macOS 15 and iOS 18 works fine with RealityView, but seems to be ignored when an ARView is used (even when wrapped in a SwiftUI UIViewRepresentable).
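For reference, a simplified sketch of the entity setup involved (illustrative; the input target and collision components are the usual prerequisites for hover effects):
import RealityKit

func makeHoverableEntity() -> ModelEntity {
    let entity = ModelEntity(mesh: .generateSphere(radius: 0.1),
                             materials: [SimpleMaterial(color: .blue, isMetallic: false)])
    entity.components.set(InputTargetComponent())
    entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
    // Shows a hover effect inside RealityView, but appears to be ignored inside ARView.
    entity.components.set(HoverEffectComponent())
    return entity
}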
Feedback ID: FB15080805
A ShaderGraphMaterial with an Occlusion Surface Output generated with RealityComposer 2 fails to load on iOS 18 and macOS 15 with the following error:
RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)
This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit)
RealityView { content in
    do {
        // Background entity to verify the occlusion against.
        let bgEntity = ModelEntity(mesh: .generateCone(height: 0.5, radius: 0.1), materials: [SimpleMaterial(color: .red, isMetallic: true)])
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        // Loading the occlusion ShaderGraphMaterial throws LoadError.invalidTypeFound here.
        let occlusionMaterial = try await ShaderGraphMaterial(named: "/Root/OcclusionMaterial", from: "OcclusionMaterial")
        let testEntity = ModelEntity(mesh: .generateSphere(radius: 0.4), materials: [occlusionMaterial])
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)
Feedback ID: FB15081296
Hi,
I have a RealityKit app that I am building with Xcode 16. The app has a minimum deployment target of iOS 17. If I run it on an iOS 17 device the app crashes:
dyld[15716]: Symbol not found: _$s10RealityKit13ShapeResourceC14generateConvex4fromAcA04MeshD0C_tYaKFZ
Referenced from: …
Expected in: …/System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
My code looks something like this:
@available(iOS, introduced: 13.0, obsoleted: 18.0)
@MainActor @preconcurrency func generateNonAsyncConvexShapeResource(from meshResource: MeshResource) throws -> ShapeResource {
    ShapeResource.generateConvex(from: meshResource)
}

@available(iOS 18.0, *)
func generateConvexShapeAsync(from meshResource: MeshResource) async throws -> ShapeResource {
    // This will only be available for iOS 18 and above
    return try await ShapeResource.generateConvex(from: meshResource)
}

if let meshResource = try? modelEntity.model?.mesh.applying(transform: transform.matrix) {
    if #available(visionOS 1.0, iOS 18.0, *) {
        try? await generateConvexShapeAsync(from: meshResource) // await shapeResources.append(.generateConvex(from: meshResource))
    } else {
        try? generateNonAsyncConvexShapeResource(from: meshResource)
    }
}
So I do check the OS version and only call the async variant on iOS 18.
Any hints on how to fix this?
Thanks!
Hi,
We've been leveraging App Clips on iOS for a while now to distribute native-app-quality AR experiences (utilising ARKit and RealityKit) with the accessibility of a website.
This has been a crucial differentiator for us and is a core driver for our business.
Since our authoring tools also allow running the same AR experiences on Vision Pro, it would be amazing if they could be triggered by App Clips there as well. We've received this feedback from clients and users multiple times, and since there already seems to be some basic App Clips support integrated into the system (e.g. when registering the custom lens inserts), we would immensely appreciate it if this feature could be opened up for third-party developers as well.
Associated feedback ID: FB13348462
Thank you!
Hello,
I'm encountering a runtime crash when building my visionOS app with Xcode 16.3 for visionOS 2.5. Our existing App Store/TestFlight app also crashes instantly on visionOS 2.5 when opened, but works fine on e.g. visionOS 2.4.
The app builds successfully but crashes on launch with this symbol lookup error (slightly adjusted because the forum complained about sensitive data):
Symbol not found: _$sSo22CLLocationCoordinate2DVSE12CoreLocationMc
Referenced from: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /private/var/containers/Bundle/Application/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/MyApp.app/MyApp.debug.dylib
Expected in: <XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX> /usr/lib/swift/libswiftCoreLocation.dylib
dyld config: DYLD_LIBRARY_PATH=/usr/lib/system/introspection DYLD_INSERT_LIBRARIES=/usr/lib/libLogRedirect.dylib:/usr/lib/libBacktraceRecording.dylib:/usr/lib/libMainThreadChecker.dylib:/System/Library/PrivateFrameworks/GPUToolsCapture.framework/GPUToolsCapture:/usr/lib/libViewDebuggerSupport.dylib
I've already implemented my own Codable conformance for CLLocationCoordinate2D:
extension CLLocationCoordinate2D: Codable {
// implementation details...
}
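For context, such a retroactive conformance typically looks something like this (illustrative sketch, not the exact implementation):
import CoreLocation

extension CLLocationCoordinate2D: Codable {
    enum CodingKeys: String, CodingKey {
        case latitude, longitude
    }

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        self.init(latitude: try container.decode(CLLocationDegrees.self, forKey: .latitude),
                  longitude: try container.decode(CLLocationDegrees.self, forKey: .longitude))
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(latitude, forKey: .latitude)
        try container.encode(longitude, forKey: .longitude)
    }
}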
This worked fine on previous visionOS/Xcode versions. Has anyone encountered this issue or found a solution?
System details:
macOS version: 15.3.2
Xcode version: 16.3
visionOS target: 2.5
Thank you!
Hi,
I am in the process of implementing SharePlay into our app. The shared experience opens an Immersive Space and we set systemCoordinator.configuration.supportsGroupImmersiveSpace = true
Now visionOS establishes a shared coordinate space for the immersive space.
From the docs:
To achieve consistent positioning of RealityKit entities across multiple devices in an immersive space during a SharePlay session
There are cases where we want to position content in front of the user (independent of the shared session, and for each user individually). Normally we do that using the transform retrieved via worldTrackingProvider.queryDeviceAnchor.originFromAnchorTransform, plus some Z offset and smooth interpolation.
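Simplified, the placement logic looks something like this (a sketch with assumed setup; the ARKitSession running the WorldTrackingProvider is omitted and the 1 m offset is illustrative):
import ARKit
import QuartzCore
import RealityKit
import simd

let worldTracking = WorldTrackingProvider()

func placeInFrontOfDevice(_ entity: Entity) {
    guard let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
    var transform = Transform(matrix: deviceAnchor.originFromAnchorTransform)
    // Offset the entity along the device's forward (-Z) axis (smoothing omitted).
    let forward = -simd_make_float3(deviceAnchor.originFromAnchorTransform.columns.2)
    transform.translation += forward * 1.0
    entity.setTransformMatrix(transform.matrix, relativeTo: nil)
}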
This works fine in non-SharePlay instances, and the device transform is where I would expect it to be. During the FaceTime call, however, deviceAnchor.originFromAnchorTransform seems to use the shared origin of the immersive space, and I end up with a transform that might be offset.
Here is a video of the issue in action: https://streamable.com/205r2p
The blue rect is placed using AnchorEntity(.head, trackingMode: .continuous). This works regardless of the call, and the entity is always placed based on the head position.
The green rect is adjusted on every frame using the transform I get from worldTrackingProvider.queryDeviceAnchor. As you can see it's offset.
Is there any way I can query this transform locally for the user during a FaceTime call?
I would also like to know whether it's possible to disable this automatic entity transform syncing behavior.
Setting entity.synchronization = nil results in the entity not showing up at all.
https://developer.apple.com/documentation/realitykit/synchronizationcomponent
Is SynchronizationComponent only relevant for the legacy MultiPeerConnectivity approach?
Thank you!
Hi,
please let me know if I should rather file feedback for this, but I figured it's worth flagging one way or another:
Test Xcode Version: 14.0 beta 6 (14A5294g)
1. Project »Altering RealityKit Rendering with Shader Functions«
This project crashes right away when running it on a device (iOS 15 and 16).
Screenshot:
2. Project »Using object capture assets in RealityKit«
This one suffers from pretty bad performance when run on a device, barely reaching 20-25 fps on an iPhone 12 Pro, and even less on an iPhone XS.
Screenshot:
As these are official sample projects, I feel like they should work flawlessly out of the box.
Best
Arthur
Devices running iOS 18 using RealityKit do not seem to receive lighting supplied via ARKit Environment Texturing (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing).
Instead just a default IBL is used by RealityKit.
This happens with RealityView as well as ARView.
It also happens when I explicitly opt in to environment texturing:
let worldTrackingConfig = ARWorldTrackingConfiguration()
worldTrackingConfig.environmentTexturing = .automatic
arView.session.run(worldTrackingConfig)
Even the Xcode AR Template has this issue.
I'm attaching screenshots of the sample app running on iOS 18, where it's broken, and on iOS 17, where it works as expected.
I hope this can get resolved quickly since I see it as a major regression.
Feedback ID: FB15091335
UPDATE:
It works on my older iPhone XS (iOS 18 22A5282m)
Broken on iPad Pro (11-inch) (3rd generation) (iPadOS 18.0 (22A5350a))
Maybe it's related to LiDAR?
Thank you!
iOS 17 (works):
iOS 18 (broken):
Hi,
since RealityKit 4 now supports Blend Shapes I was wondering if there are any workflow or tooling recommendations to bake/export them into a USDZ.
Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)?
So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
Hi,
since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped«, even for rather simple projects using RealityKit and ARKit. Could someone from the ARKit team please elaborate on what causes this warning and what can be done to avoid it?
If I remember correctly, I didn't even assign an ARSessionDelegate.
Thank you!
Hello,
On iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after reassigning the updated material.
My use case is that I want to fade in a ModelEntity through a custom System and therefore need to repeatedly reassign the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue.
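Simplified, the update I perform looks something like this (a sketch with assumed setup, handling only PhysicallyBasedMaterial here; not the exact code from the sample project):
import RealityKit

func setOpacity(_ opacity: Float, on entity: ModelEntity) {
    guard var model = entity.model else { return }
    model.materials = model.materials.map { material in
        guard var pbr = material as? PhysicallyBasedMaterial else { return material }
        // On iOS 16 the color / baseColor texture disappears after this reassignment.
        pbr.blending = .transparent(opacity: .init(floatLiteral: opacity))
        return pbr
    }
    entity.model = model
}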
On iOS 15 this works as expected. Please let me know if there is any workaround, as this seems like a major regression to me, and ideally I need this to work once iOS 16 is released to the public.
The radar number including a sample project is: FB11420976
Thank you!
Hello,
when creating a simple-as-it-gets Slider in SwiftUI and then running that app on Mac Catalyst with the macOS idiom enabled, the app crashes:
struct ContentView: View {
    @State private var sliderValue: Double = 0.4

    var body: some View {
        VStack {
            Slider(value: $sliderValue)
        }
        .padding()
    }
}
running this will result in an exception:
_setMinimumEnabledValue: is not supported on UISlider when running Catalyst apps in the Mac idiom. See UIBehavioralStyle for possible alternatives.
This is obviously not ideal and also apparently not documented.
Is there a workaround for this?
It used to work on macOS Sonoma.
macOS 26 RC
Xcode 26 RC
FB20191635
Thanks!