Hi,
is there a way in visionOS to anchor an entity to the POV via RealityKit?
I need an entity which is always fixed to the 'camera'.
I'm aware that this is discouraged from a design perspective as it can be visually distracting. In my case though I want to use it to attach a fixed collider entity, so that the camera can collide with objects in the scene.
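Something like the following is what I have in mind – an untested sketch, assuming that AnchorEntity(.head) tracks the device pose the way I'd need it to:

// Untested sketch: anchor an invisible collider entity to the head/device pose.
let headAnchor = AnchorEntity(.head)
let collider = Entity()
collider.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.25)]))
headAnchor.addChild(collider)
content.add(headAnchor) // `content` being the RealityViewContent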
Edit:
ARView on iOS has a lot of very useful helper properties and functions like cameraTransform (https://developer.apple.com/documentation/realitykit/arview/cameratransform)
How would I get this information on visionOS? RealityView's content does not seem to offer anything comparable.
An example use case: I would like to add an entity to the scene at my user's eye level, basically depending on their height.
I found https://developer.apple.com/documentation/realitykit/realityrenderer which has an activeCamera property but so far it's unclear to me in which context RealityRenderer is used and how I could access it.
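For the eye-level use case, the closest thing I've found so far would be querying the device pose via ARKit's WorldTrackingProvider – a rough, untested sketch of what I mean:

// Rough sketch (untested): query the device/head transform via ARKit on visionOS.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()
try await session.run([worldTracking])

if let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
    let devicePose = deviceAnchor.originFromAnchorTransform // 4x4 device transform
    entity.position.y = devicePose.columns.3.y              // place entity at eye level
}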
Appreciate any hints, thanks!
Hello,
in this project https://developer.apple.com/documentation/arkit/content_anchors/tracking_and_visualizing_faces there is some sample code that describes how to map the camera feed to an object with SceneKit and a shader modifier.
I would like to know if there is an easy way to achieve the same thing with a CustomMaterial and RealityKit 2.
Specifically I'm interested in what would be the best way to pass in the background of the RealityKit environment as a texture to the custom shader.
In SceneKit this was really easy as one could just do the following:
material.diffuse.contents = sceneView.scene.background.contents
As the texture input for custom material requires a TextureResource I would probably need a way to create a CGImage from the background or camera feed on the fly.
What I've tried so far is accessing the captured image from the camera feed and creating a CGImage from the pixel buffer like so:
guard
    let frame = arView.session.currentFrame,
    let cameraFeedTexture = CGImage.create(pixelBuffer: frame.capturedImage),
    let textureResource = try? TextureResource.generate(from: cameraFeedTexture, withName: "cameraFeedTexture", options: .init(semantic: .color))
else {
    return
}

// Assign the generated texture to the custom material.
customMaterial.custom.texture = .init(textureResource)

import VideoToolbox // for VTCreateCGImageFromCVPixelBuffer

extension CGImage {
    /// Creates a CGImage from a CVPixelBuffer (e.g. ARFrame.capturedImage).
    public static func create(pixelBuffer: CVPixelBuffer) -> CGImage? {
        var cgImage: CGImage?
        VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        return cgImage
    }
}
This seems wasteful though and is also quite slow.
Is there any other way to accomplish this efficiently or would I need to go the post processing route?
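One direction I'm currently eyeing is TextureResource.DrawableQueue, which looks like it could avoid the CGImage round trip entirely – an untested sketch (the dimensions are placeholders and would need to match the camera feed):

// Untested sketch: stream camera frames into the material via a DrawableQueue
// instead of regenerating a TextureResource on every frame.
let descriptor = TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .bgra8Unorm,
    width: 1920,  // placeholder
    height: 1440, // placeholder
    usage: [.shaderRead, .renderTarget],
    mipmapsMode: .none
)
let drawableQueue = try TextureResource.DrawableQueue(descriptor)
textureResource.replace(withDrawables: drawableQueue)
// Per frame: grab drawableQueue.nextDrawable(), copy/convert the camera pixel
// buffer into drawable.texture with Metal, then call drawable.present().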
In the sample code the displayTransform for the view is also being passed as an SCNMatrix4. CustomMaterial's custom.value only accepts a SIMD4, though. Is there another way to pass in the matrix?
Another idea I've had was to create a CustomMaterial from an OcclusionMaterial which already seems to contain information about the camera feed but so far had no luck with it.
Thanks for the support!
Hello,
in our app we are downloading some user generated content (.reality files and USDZs) and displaying it within the app.
This worked without issues in iOS 14 but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can see USDZ files still work.
I've created a little test project and the error message log is not really helpful.
2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error:Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307608+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307712+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307753+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307790+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307907+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307955+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.308155+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.308194+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
▿ Failed to load loadRequest.
- generic: "Failed to load loadRequest."
Basic code structure that is used for loading:
cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }
Is there any way to force it to load in a mode that enforces compatibility?
As mentioned this only happens on iOS 15. Even ARQuickLook can't display the files anymore (no issues on iOS 14).
Thanks for any help!
Sidebars in Mac Catalyst apps running with the UIDesignRequiresCompatibility flag render their active items with a white background tint – resulting in labels and icons not being visible.
macOS Tahoe 26.1 Beta 3 (25B5062e)
FB20765036
Example (Apple Developer App):
Hello,
I have a cylinder with an UnlitMaterial and a base color set. Now I want to apply a gradient as the alpha mask so I get that kind of halo, GTA-checkpoint-like look.
The code:
var baseMaterial = UnlitMaterial(color: UIColor.red)
baseMaterial.blending = .transparent(opacity: .init(scale: 100, texture: .init(maskTextureResource))) // maskTextureResource is the gradient mask
baseMaterial.opacityThreshold = 0
This works but unfortunately leads to some ugly visible gradient banding. I've also tried to play with the scale of the blending texture but that did not help.
As an alternative approach I tried to solve this via a custom surface shader.
Code below:
[[visible]]
void gradientShader(realitykit::surface_parameters params)
{
    auto surface = params.surface();
    float2 uv = params.geometry().uv0();
    float h = 0.5; // adjust position of middleColor

    half startAlpha = 0.001;
    half middleAlpha = 1;
    half endAlpha = 0.001;

    // Blend start→middle over the lower half of the UV range and
    // middle→end over the upper half; step() picks the active half.
    half alpha = mix(mix(startAlpha, middleAlpha, half(uv.y / h)),
                     mix(middleAlpha, endAlpha, half((uv.y - h) / (1.0 - h))),
                     half(step(h, uv.y)));

    surface.set_emissive_color(half3(params.material_constants().emissive_color()));
    surface.set_base_color(half3(params.material_constants().base_color_tint()));
    surface.set_opacity(alpha);
}
The result looks really nice and smooth, but unfortunately this now also culls the inner part of the cylinder, even on the semitransparent parts.
What I want is to have the effect visible on both the outer and the inner surface of the cylinder, so that the transparent parts of the outside let you see through to the inside.
I've got this working by using a PhysicallyBasedMaterial instead of an UnlitMaterial (which does not support blending out of the box), but then ran into the same banding issue again.
On my CustomMaterial, faceCulling is set to .none.
Here is how it currently looks – as you can see in the left one the alpha mask is not smooth and has banding artefacts:
Thank you for any help!
Hello,
I've been tinkering with PortalComponent on visionOS a bit but noticed that the content of the WorldComponent is always clipped to the mesh geometry of whatever entities have the PortalComponent applied. Now I'm wondering if there is any way or trick to allow contents of the portal to peek out – similar to the Encounter Dinosaurs experience on Vision Pro (I assume it also uses PortalComponent?).
I saw that PortalComponent has a clippingPlane property (https://developer.apple.com/documentation/realitykit/portalcomponent/clippingplane-swift.property). But so far I haven't been able to achieve a perceptible visual difference with it.
If possible I would like to avoid hacky tricks using duplicate meshes or similar to achieve this.
Thanks for any hints!
Hello,
I'm trying to get started with Xcode Cloud for my project, but since I upgraded to the macOS Sequoia beta, Xcode 16 now refuses to archive builds for TestFlight.
Somewhere very late in the build process I get the following error:
realitytool requires Metal for this operation and it is not available in this build environment
The log says this happens at:
Compile Skybox urban.skybox
My project uses RealityKit. How can I fix this issue?
Thanks!
Hello,
given this following simple SwiftUI setup:
struct ContentView: View {
    var body: some View {
        CustomFocusView()
    }
}

struct CustomFocusView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        color
            .frame(width: 128, height: 128)
            .focusable(true)
            .focused($isFocused)
            .onTapGesture {
                isFocused.toggle()
            }
            .onKeyPress("a") {
                print("A pressed")
                return .handled
            }
    }

    var color: Color {
        isFocused ? .blue : .red
    }
}
If I run this via Mac – Designed for iPad, the CustomFocusView toggles focus as expected and cycles through red and blue.
Now if I run the exact same code via Mac Catalyst, absolutely nothing happens, and so far I haven't been able to get this view to accept focus at all. Is this expected? I would appreciate any hints on how to get this working.
Thanks and best regards!
Hi,
since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped« even for rather simple projects using RealityKit and ARKit. Could someone from the ARKit team please elaborate what causes this warning and what can be done to avoid it?
If I remember correctly I didn't even assign an ARSessionDelegate.
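From what I understand, the warning is normally triggered by code along these lines, which keeps strong references to frames (each ARFrame retains its capturedImage buffer from a small fixed pool) – but I don't think I'm doing anything like this:

// The usual trigger, as far as I understand it: holding on to ARFrames.
class SessionHandler: NSObject, ARSessionDelegate {
    var recentFrames: [ARFrame] = [] // retaining frames starves the buffer pool

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        recentFrames.append(frame) // copy out the needed data instead of keeping the frame
    }
}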
Thank you!
Hello,
On iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after reassigning the updated material.
My use case is that I want to fade in a ModelEntity through a custom System and therefore need to repeatedly reassign the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue.
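A minimal sketch of the kind of reassignment I mean (fadeAlpha being a value my system updates every frame):

// Minimal repro sketch – reassigning the material drops the texture on iOS 16.
if var material = modelEntity.model?.materials.first as? PhysicallyBasedMaterial {
    material.blending = .transparent(opacity: .init(floatLiteral: fadeAlpha))
    modelEntity.model?.materials[0] = material // baseColor texture disappears here
}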
On iOS 15 this works as expected. Please let me know if there is any workaround, as this seems to me like a major regression and ideally I need this to work once iOS 16 gets released to the public.
The radar number including a sample project is: FB11420976
Thank you!
Hello,
when creating a simple-as-it-gets Slider in SwiftUI and then running that app on Mac Catalyst with the macOS idiom enabled, the app crashes:
struct ContentView: View {
    @State private var sliderValue: Double = 0.4

    var body: some View {
        VStack {
            Slider(value: $sliderValue)
        }
        .padding()
    }
}
running this will result in an exception:
_setMinimumEnabledValue: is not supported on UISlider when running Catalyst apps in the Mac idiom. See UIBehavioralStyle for possible alternatives.
This is obviously not ideal, and apparently also not documented.
Is there a workaround for this?
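The only idea I've had so far is forcing the iPad behavioral style onto UISlider via the appearance proxy – untested, and I'm not sure the proxy even applies to this property:

// Untested workaround sketch: force the iPad style for bridged sliders.
#if targetEnvironment(macCatalyst)
UISlider.appearance().preferredBehavioralStyle = .pad
#endif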
It used to work on macOS Sonoma.
macOS 26 RC
Xcode 26 RC
FB20191635
Thanks!
Hi,
please let me know if I should rather file feedback for this, but I figured it's worth flagging one way or another:
Test Xcode Version: 14.0 beta 6 (14A5294g)
1. Project »Altering RealityKit Rendering with Shader Functions«
This project crashes right away when running it on a device (iOS 15 and 16).
Screenshot:
2. Project »Using object capture assets in RealityKit«
Suffers from pretty bad performance when run on a device – barely scratching 20–25 fps on an iPhone 12 Pro, and even less on an iPhone XS.
Screenshot:
As these are official sample projects, I feel they should work flawlessly out of the box.
Best
Arthur
Devices running iOS 18 using RealityKit do not seem to receive lighting supplied via ARKit Environment Texturing (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing).
Instead, RealityKit just uses a default IBL.
This happens with RealityView as well as ARView.
It also happens when I explicitly opt in to environment texturing:
let worldTrackingConfig = ARWorldTrackingConfiguration()
worldTrackingConfig.environmentTexturing = .automatic
arView.session.run(worldTrackingConfig)
Even the Xcode AR Template has this issue.
I'm attaching screenshots of the sample app running on iOS 18, where it's broken, and on iOS 17, where it works as expected.
I hope this can get resolved quickly since I see it as a major regression.
Feedback ID: FB15091335
UPDATE:
It works on my older iPhone XS (iOS 18 22A5282m)
Broken on iPad Pro (11-inch) (3rd generation) (iPadOS 18.0 (22A5350a))
Maybe it's related to LiDAR?
Thank you!
iOS 17 (works):
iOS 18 (broken):
Hi,
since RealityKit 4 now supports Blend Shapes I was wondering if there are any workflow or tooling recommendations to bake/export them into a USDZ.
Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)?
So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
Hello,
PhysicallyBasedMaterial in RealityKit 2 contains a blending property to adjust the transparency of a material.
Is there a way to animate this over time to fade entities in and out?
I've tried the new FromToByAnimation API but could not figure out if there is a supported BindPath for the transparency.
Ideally what I would like to achieve is something similar to SceneKit's SCNAction.fadeIn(duration: …), which also worked on a whole node.
I figured I could also go the route of a custom fragment shader here, though that seems overkill.
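For completeness, the manual fallback I have in mind would be driving the opacity from scene update events – a rough sketch (subscription, arView, and modelEntity being surrounding context):

// Rough sketch of the manual fallback: lerp the opacity on every scene update.
var fadeProgress: Float = 0
let fadeDuration: Float = 2 // seconds

subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { event in
    fadeProgress = min(fadeProgress + Float(event.deltaTime) / fadeDuration, 1)
    guard var material = modelEntity.model?.materials.first as? PhysicallyBasedMaterial else { return }
    material.blending = .transparent(opacity: .init(floatLiteral: fadeProgress))
    modelEntity.model?.materials[0] = material
}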
As Reality Composer also supports fade actions, I would assume this is at least supported behind the scenes.
Thanks for any help!