I have two planes with textures on them. I want these planes to intersect [ –|– ], and I want the blend mode to be additive. Currently I get z-fighting where the planes cross, and I can't see how to set blend modes.
I've done this before in Unity and Godot in a fairly straightforward manner.
How do I accomplish this with RealityKit, preferably using code only (my scene is quite dynamic)?
Do I need to write a shader manually? How can I stop the z-fighting?
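For reference, a minimal sketch of the setup I have (the texture name is a placeholder, and the blending line shows the only modes I can find — nothing additive):

import RealityKit

// Two textured planes crossed at 90°.
let mesh = MeshResource.generatePlane(width: 0.5, depth: 0.5)
var material = UnlitMaterial()
if let texture = try? TextureResource.load(named: "glow") {
    material.color = .init(texture: .init(texture))
}
// Only .opaque and .transparent seem to be exposed here.
material.blending = .transparent(opacity: .init(floatLiteral: 1.0))

let planeA = ModelEntity(mesh: mesh, materials: [material])
let planeB = ModelEntity(mesh: mesh, materials: [material])
planeB.orientation = simd_quatf(angle: .pi / 2, axis: [1, 0, 0])
planeA.addChild(planeB)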
Hi team, I'm looking for the RealityKit debugger in Xcode 26 beta 3. I'm running a RealityKit app on an iPad with iPadOS 26 beta 3, but the debugger option doesn't appear in Xcode.
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools already allow Apple Music to work in the simulator. I develop for the Vision Pro, so the usual 'fix' of running on the device is a bit of a hard ask.
At the very least, a small sample library that works in the simulator would be welcome (similar to how Photos works).
Cheers
Hi,
In my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library collection. With this response I then fetch each artist's albums: artist.with([.albums]).
The response from this only gives albums in the user's library collection. I would like to augment it with all of the albums for an artist from the full catalogue.
I'm using MusicKit and targeting iOS 18 and visionOS 2.
Could someone please point me towards the best way to approach this?
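For reference, a minimal sketch of the current approach (it assumes MusicKit authorization has already been granted):

import MusicKit

// Fetch every artist in the user's library, then load each one's albums.
let request = MusicLibraryRequest<Artist>()
let response = try await request.response()

for artist in response.items {
    let detailed = try await artist.with([.albums])
    // detailed.albums only contains library albums here — the open
    // question is how to also pull the artist's full catalogue albums.
}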
I have a scene with multiple RealityKit entities. There is a blue cube which I want to rotate along with all of its children (it's partly transparent).
Inside the cube are a number of child entities (red) that I want to tap.
The cube and red objects all have collision components as is required for gestures to work.
I can't both rotate the blue cube and tap the red objects, because the blue cube's collision component intercepts the taps.
Is there a way of accomplishing what I want?
I'm targeting visionOS 2, and my scene is in a volume.
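For reference, a minimal sketch of how the entities are set up (names and sizes are placeholders):

import RealityKit

// Both the cube and its children need CollisionComponent +
// InputTargetComponent for gestures to target them.
let blueCube = ModelEntity(
    mesh: .generateBox(size: 0.2),
    materials: [SimpleMaterial(color: .blue.withAlphaComponent(0.3), isMetallic: false)])
blueCube.components.set(InputTargetComponent())
blueCube.generateCollisionShapes(recursive: false)

let redObject = ModelEntity(
    mesh: .generateSphere(radius: 0.02),
    materials: [SimpleMaterial(color: .red, isMetallic: false)])
redObject.components.set(InputTargetComponent())
redObject.generateCollisionShapes(recursive: false)
blueCube.addChild(redObject)
// Taps aimed at redObject hit the cube's collision shape first.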
I have some entities which use attachments to show a label next to them. I would like to change this to only show the label when the entity is being looked at / hovered over. I have the new HoverEffect component on my entity, and it works nicely, but I can't see how to toggle the visibility of the labels.
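For reference, a minimal sketch of what I have now (the entity and collision shape are placeholders; the label itself is defined in my RealityView's attachments closure):

import RealityKit

// The hover effect works, but I can't find a hook to toggle the label.
entity.components.set(HoverEffectComponent())
entity.components.set(InputTargetComponent())
entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))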
I have code such as the following. The performance on the Vision Pro seems to get quite bad once I hit a few thousand of these models. It feels like I should be able to optimise this somehow, perhaps using instancing. Is that possible with RealityKit in visionOS 2?
// Template sphere; each star is a clone of this model.
let material = UnlitMaterial(color: .white)
let sphereModel = ModelEntity(
    mesh: .generateSphere(radius: 0.001),
    materials: [material])

for index in 0..<5000 {
    let point = generatedPoints[index]
    let model = sphereModel.clone(recursive: false)
    model.position = [point.x, point.y, point.z]
    parent.addChild(model)
}
I'm porting a SceneKit app to RealityKit, eventually offering an AR experience there. I noticed that when I run it on my iPhone 15 Pro and iPad Pro with the 120Hz screen, the frame rate seems to be limited to 60fps. Is there a way to increase the target frame rate to 120, like I can with SceneKit?
I'm setting up my ARView like so:
@IBOutlet private var arView: ARView! {
    didSet {
        // Non-AR mode with the stats overlay for profiling.
        arView.cameraMode = .nonAR
        arView.debugOptions = [.showStatistics]
    }
}
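For comparison, this is the SceneKit behaviour I'm trying to match (sketch):

import SceneKit

// SceneKit exposes the target directly:
let scnView = SCNView()
scnView.preferredFramesPerSecond = 120
// If I understand correctly, iPhone apps also need the
// CADisableMinimumFrameDurationOnPhone Info.plist key to go above 60Hz.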
My app was recently transferred from account X to account Y. We went through all of the documentation and it's gone OK, bar one big issue.
The app update that we last sent to TestFlight no longer has access to our app's App Group shared user defaults.
It's been suggested to me to delete the app group ID from the old account and recreate it in our new account. I'd like to confirm a couple of points before I proceed.
Will the production version of the app be affected if I delete the app group from the old account?
After recreating the app group in the new account, will the data in shared user defaults become available again?
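For context, the shared defaults are accessed in the usual way (sketch; the group ID and keys here are placeholders):

import Foundation

// Reads and writes go through the App Group container.
let shared = UserDefaults(suiteName: "group.com.example.myapp")
shared?.set(true, forKey: "hasOnboarded")
let flag = shared?.bool(forKey: "hasOnboarded") ?? false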
Hi all, is there anything available in the iOS SDKs to allow me to find and connect to a Bluetooth speaker? At the moment I have to direct users to iOS Settings and do it from there, but I would like to have an in-app experience for this.
Thanks!
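One partial option I'm aware of (an assumption on my part, and not a full find-and-connect flow) is AVKit's route picker, which at least shows an in-app picker for audio output routes; `view` here is a placeholder:

import AVKit
import UIKit

// Shows the system's audio route picker inside the app.
let picker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
view.addSubview(picker)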
I can see my Pi listed in iOS Settings, and I can connect to it that way. I'm also trying to connect using CoreBluetooth, but the Pi never appears as a peripheral I can connect to.
Should I be taking another approach? Would it appear as an ExternalAccessory instead?
I have a Python server on the device that I'd like to connect to.
Thanks!
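Here's a minimal sketch of the scanning code I'm using (plain CoreBluetooth, nothing exotic):

import CoreBluetooth

final class BLEScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Scan for everything once Bluetooth is powered on.
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // The Pi never shows up in this callback.
        print("Discovered:", peripheral.name ?? "unknown")
    }
}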
I have a 3D scene with a perspective camera and I'd like some of the elements to be projected using an orthographic projection instead.
My use case is that I have some 3D elements with attached text nodes. I'd like the text on these nodes to always be the same size no matter how far away the camera is. Is there a way I can use SceneKit to mix and match? Or is there another technique I can use?
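One idea I've been sketching (unverified; `textNodes` and the scale constant are placeholders) is to counter-scale the text nodes against camera distance from the render delegate, rather than mixing projections:

import SceneKit

// In an SCNSceneRendererDelegate: scale each text node with its distance
// from the camera so its on-screen size stays roughly constant.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let cameraNode = renderer.pointOfView else { return }
    for node in textNodes {
        let distance = simd_distance(node.simdWorldPosition,
                                     cameraNode.simdWorldPosition)
        node.simdScale = SIMD3<Float>(repeating: distance * 0.1) // tuning constant
    }
}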
Hi team, are there any plans to add a small detent to this controller? Or the ability to set a custom height? I love that this is here, but without one of those features I fear I won't be able to use it as much as I'd like to. Thanks!
https://developer.apple.com/documentation/uikit/uisheetpresentationcontroller/detent
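For context, a sketch of how I'm configuring the sheet today — only the two system detents seem to be available (`viewController` is a placeholder for the presented controller):

import UIKit

if let sheet = viewController.sheetPresentationController {
    sheet.detents = [.medium(), .large()]
}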
With the iOS privacy changes, I'm a bit confused about when we need to show the privacy pop-up to request permission, and what needs to be gated behind the answer.
We use Mixpanel/GA, and from my reading of the iOS 14 documents it seems we may need to ask permission even if the IDFA is not used. Is this the correct interpretation?
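For reference, the prompt in question comes from the AppTrackingTransparency framework (sketch):

import AppTrackingTransparency

// Shows the system tracking-permission alert and reports the user's choice.
ATTrackingManager.requestTrackingAuthorization { status in
    if status == .authorized {
        // Tracking allowed; the IDFA is available via AdSupport.
    } else {
        // Denied, restricted, or not determined: no tracking allowed.
    }
}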
Looks like the particle system template for SceneKit is not available anymore in Xcode 11. I can only see the SpriteKit version. Is this by design or a mistake, or is there another way to create one?
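A workaround I'm considering (sketch; `node` is a placeholder and all the values are arbitrary) is building the particle system in code instead of from a template:

import SceneKit

let particles = SCNParticleSystem()
particles.birthRate = 50
particles.particleLifeSpan = 2
particles.particleSize = 0.05
particles.emitterShape = SCNSphere(radius: 0.1)
node.addParticleSystem(particles)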