SwiftUI TextField selection - strange initial values with iOS
When using a TextField with axis set to .vertical on iOS, the bound selection parameter is given an erroneous initial value, whereas on macOS it behaves as expected. Take the following code:

    import SwiftUI

    @main
    struct SelectionTestApp: App {
        var body: some Scene {
            WindowGroup {
                ContentView()
            }
        }
    }

    struct ContentView: View {
        @FocusState private var isFocused: Bool
        @State private var text = ""
        @State private var textSelection: TextSelection? = nil

        var body: some View {
            TextField("Label", text: $text, selection: $textSelection, axis: .vertical)
                .onChange(of: textSelection, initial: true) {
                    if let textSelection {
                        print("textSelection = \(textSelection)")
                    } else {
                        print("textSelection = nil")
                    }
                }
                .focused($isFocused)
                .task {
                    isFocused = true
                }
        }
    }

Running this on the macOS target gives the following on the console:

    textSelection = nil
    textSelection = TextSelection(indices: SwiftUI.TextSelection.Indices.selection(Range(0[any]..<0[any])), affinity: SwiftUI.TextSelectionAffinity.downstream)

Running the code on iOS gives:

    textSelection = TextSelection(indices: SwiftUI.TextSelection.Indices.selection(Range(1[any]..<1[any])), affinity: SwiftUI.TextSelectionAffinity.upstream)
    textSelection = TextSelection(indices: SwiftUI.TextSelection.Indices.selection(Range(1[any]..<1[any])), affinity: SwiftUI.TextSelectionAffinity.upstream)

Note that here the range is 1..<1, which is incorrect for an empty string. Also of side interest, this behaviour changes if you remove the axis parameter:

    textSelection = nil

Am I missing something, or is this a bug?
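One workaround worth trying (an untested sketch, assuming the TextSelection(insertionPoint:) initialiser behaves as documented) is to reset the selection to a known-good insertion point after focusing the field:

    .task {
        isFocused = true
        // Hypothetical workaround: put the insertion point back at the start
        // of the (empty) string so the binding begins with a valid range.
        textSelection = TextSelection(insertionPoint: text.startIndex)
    }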
2 replies · 0 boosts · 137 views · Oct ’25
SwiftUI Picker layout under macOS 26
Prior to macOS 26, multiple Pickers could be laid out with a uniform width. For example:

    struct LayoutExample: View {
        let fruits = ["apple", "banana", "orange", "kiwi"]
        let veg = ["carrot", "cauliflower", "peas", "Floccinaucinihilipilification Cucurbitaceae"]
        @State private var selectedFruit: String = "kiwi"
        @State private var selectedVeg: String = "carrot"

        var body: some View {
            VStack(alignment: .leading) {
                Picker(selection: $selectedFruit) {
                    ForEach(fruits, id: \.self, content: Text.init)
                } label: {
                    Text("Fruity choice")
                        .frame(width: 150, alignment: .trailing)
                }
                .frame(width: 300)

                Picker(selection: $selectedVeg) {
                    ForEach(veg, id: \.self, content: Text.init)
                } label: {
                    Text("Veg")
                        .frame(width: 150, alignment: .trailing)
                }
                .frame(width: 300)
            }
        }
    }

This renders with uniformly sized pickers prior to macOS 26 (screenshot omitted), but looks different under macOS 26 (screenshot omitted). Is there any way to control the size of the picker selection in macOS 26?
1 reply · 0 boosts · 78 views · Sep ’25
Custom EntityAction - different behaviour VisionOS 2.6 vs 26
I implemented an EntityAction to change the baseColor tint, and had it working on visionOS 2.x:

    import RealityKit
    import UIKit

    typealias Float4 = SIMD4<Float>

    extension UIColor {
        var float4: Float4 {
            if cgColor.numberOfComponents == 4, let c = cgColor.components {
                Float4(Float(c[0]), Float(c[1]), Float(c[2]), Float(c[3]))
            } else {
                Float4()
            }
        }
    }

    struct ColourAction: EntityAction {
        // MARK: - PUBLIC PROPERTIES
        let startColour: Float4
        let targetColour: Float4

        // MARK: - PUBLIC COMPUTED PROPERTIES
        var animatedValueType: (any AnimatableData.Type)? { Float4.self }

        // MARK: - INITIATION
        init(startColour: UIColor, targetColour: UIColor) {
            self.startColour = startColour.float4
            self.targetColour = targetColour.float4
        }

        // MARK: - PUBLIC STATIC FUNCTIONS
        @MainActor static func registerEntityAction() {
            ColourAction.subscribe(to: .updated) { event in
                guard let animationState = event.animationState else { return }
                let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
                animationState.storeAnimatedValue(interpolatedColour)
            }
        }
    }

    extension Entity {
        // MARK: - PUBLIC FUNCTIONS
        func changeColourTo(_ targetColour: UIColor, duration: Double) {
            guard let modelComponent = components[ModelComponent.self],
                  let material = modelComponent.materials.first as? PhysicallyBasedMaterial else { return }
            let colourAction = ColourAction(startColour: material.baseColor.tint, targetColour: targetColour)
            if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction, duration: duration, bindTarget: .material(0).baseColorTint) {
                playAnimation(colourAnimation)
            }
        }
    }

This doesn't work in visionOS 26. My current fix is to set the material's base colour directly, but this feels like the wrong approach:

    @MainActor static func registerEntityAction() {
        ColourAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState,
                  let entity = event.targetEntity,
                  let modelComponent = entity.components[ModelComponent.self],
                  var material = modelComponent.materials.first as? PhysicallyBasedMaterial else { return }
            let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
            material.baseColor.tint = UIColor(interpolatedColour)
            entity.components[ModelComponent.self]?.materials[0] = material
            animationState.storeAnimatedValue(interpolatedColour)
        }
    }

So, before I raise this as a bug: was I doing anything wrong in the former version and just got lucky? Is there a better approach?
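For completeness, a minimal usage sketch of the above (assuming the action has been registered once, e.g. at app start, and that the entity carries a ModelComponent with a PhysicallyBasedMaterial):

    // Register the action once, before any colour animation is played.
    ColourAction.registerEntityAction()

    // Later, tint an entity over one second.
    entity.changeColourTo(.systemRed, duration: 1.0)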
0 replies · 0 boosts · 88 views · Sep ’25
Accessing an actor's isolated state from within a SwiftUI view
I'm trying to understand a design pattern for accessing the isolated state held in an actor type from within a SwiftUI view. Take this naive code:

    import SwiftUI

    actor Model: ObservableObject {
        @Published var num: Int = 0

        func updateNumber(_ newNum: Int) {
            self.num = newNum
        }
    }

    struct ContentView: View {
        @StateObject var model = Model()

        var body: some View {
            Text("\(model.num)") // <-- Compiler error: Actor-isolated property 'num' can not be referenced from the main actor
            Button("Update number") {
                Task.detached() {
                    await model.updateNumber(1)
                }
            }
        }
    }

Understandably, I get the compiler error "Actor-isolated property 'num' can not be referenced from the main actor" when I try to access the isolated value, yet I can't understand how to display this data in a view. I wonder if I need a ViewModel that observes the actor and updates itself on the main thread, but then I get the compile-time error "Actor-isolated property '$num' can not be referenced from a non-isolated context":

    import Combine

    class ViewModel: ObservableObject {
        let model: Model
        @Published var num: Int
        let cancellable: AnyCancellable

        init() {
            let model = Model()
            self.model = model
            self.num = 0
            self.cancellable = model.$num // <-- compile-time error: Actor-isolated property '$num' can not be referenced from a non-isolated context
                .receive(on: DispatchQueue.main)
                .sink { self.num = $0 }
        }
    }

Secondly, imagine this code did compile: I would then presumably get another error when clicking the button, because the interface would not be updated on the main thread. Again, I'm not sure how to arrange this from within the actor.
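For what it's worth, here is a sketch of the kind of pattern I'm imagining (a @MainActor view model that mirrors the actor's state after each mutation), though I don't know whether this is the recommended approach and the naming is mine:

    import SwiftUI

    actor Model {
        private(set) var num: Int = 0

        func updateNumber(_ newNum: Int) {
            num = newNum
        }
    }

    @MainActor
    final class ViewModel: ObservableObject {
        @Published private(set) var num: Int = 0
        private let model = Model()

        // Mutate on the actor, then copy the result back onto the main actor
        // so SwiftUI observes the change from the right context.
        func updateNumber(_ newNum: Int) async {
            await model.updateNumber(newNum)
            num = await model.num
        }
    }

    struct MirroredContentView: View {
        @StateObject private var viewModel = ViewModel()

        var body: some View {
            Text("\(viewModel.num)")
            Button("Update number") {
                Task { await viewModel.updateNumber(1) }
            }
        }
    }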
3 replies · 2 boosts · 9.7k views · Apr ’25
When to use an AnchorEntity or HandTrackingProvider in VisionOS
As I understand it, there are two ways I can track a hand, or a joint, in RealityKit. Either create an AnchorEntity, for example:

    AnchorEntity(.hand(.left, location: .palm))

or set up an ARKitSession with a HandTrackingProvider (a lot more code, which I haven't repeated here). Assuming this is correct, when would I want to use one over the other?
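For reference, the HandTrackingProvider route being compared against looks roughly like this (an illustrative sketch, not the code omitted above; it assumes an immersive space and the hand-tracking usage description in Info.plist):

    import ARKit

    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    func startHandTracking() async {
        guard HandTrackingProvider.isSupported else { return }
        do {
            try await session.run([handTracking])
            for await update in handTracking.anchorUpdates {
                let anchor = update.anchor
                guard anchor.isTracked, anchor.chirality == .left else { continue }
                // Joint transforms are relative to the hand anchor, so compose
                // them with the anchor's own transform to get world space.
                if let joint = anchor.handSkeleton?.joint(.middleFingerKnuckle) {
                    let worldTransform = anchor.originFromAnchorTransform * joint.anchorFromJointTransform
                    _ = worldTransform // drive an entity's transform from this, etc.
                }
            }
        } catch {
            print("Hand tracking failed: \(error)")
        }
    }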
2 replies · 0 boosts · 409 views · Mar ’25
RealityKit particleEmitter delay starting when toggling isEmitting
I have a scene built in Reality Composer Pro, in which I've added a ParticleEmitter with isEmitting set to false and 'Loop' set to true. In my app, when I toggle isEmitting to true there can be a delay of a few seconds before the ParticleEmitter starts. However, if I programmatically add the emitter in code at that point, it starts immediately. To be clear, I'm seeing this on the visionOS simulator; I don't have access to a device at this time. Am I misunderstanding how to control the ParticleEmitter when I need precise control over when it starts?
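For illustration, the kind of toggle involved is along these lines (a sketch, not necessarily the exact app code; ParticleEmitterComponent is a value type, so the modified copy has to be written back to the entity):

    import RealityKit

    /// Illustrative helper: flips isEmitting on an entity whose
    /// ParticleEmitterComponent was authored in Reality Composer Pro.
    func setEmitting(_ emitting: Bool, on entity: Entity) {
        guard var emitter = entity.components[ParticleEmitterComponent.self] else { return }
        emitter.isEmitting = emitting
        entity.components.set(emitter) // write the modified copy back
    }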
1 reply · 0 boosts · 540 views · Feb ’25
EntityAction for MaterialBaseTint - incorrect colours
Hello, I'm writing an EntityAction that animates a material's base colour tint between two different colours. However, the colour that actually gets set differs in RGB values from the one requested. For example, trying to set an end target of R 0.5, G 0.5, B 0.5 results in a value of R 0.735357, G 0.735357, B 0.735357. I can also see during the animation cycle that the intermediate actual tint values are incorrect versus those being set.

My understanding is that the values of the material base colour are passed as a SIMD4, so I have a couple of helper extensions to convert a UIColor into this format and mix between two colours. Note, however, that I don't think the issue is with these functions: even if their outputs were wrong, the final value of the base tint still doesn't match the value being set. I wondered if this was a colour space issue?

    import simd
    import RealityKit
    import UIKit

    typealias Float4 = SIMD4<Float>

    extension Float4 {
        func mixedWith(_ value: Float4, by mix: Float) -> Float4 {
            Float4(
                simd_mix(x, value.x, mix),
                simd_mix(y, value.y, mix),
                simd_mix(z, value.z, mix),
                simd_mix(w, value.w, mix)
            )
        }
    }

    extension UIColor {
        var float4: Float4 {
            var r: CGFloat = 0.0
            var g: CGFloat = 0.0
            var b: CGFloat = 0.0
            var a: CGFloat = 0.0
            getRed(&r, green: &g, blue: &b, alpha: &a)
            return Float4(Float(r), Float(g), Float(b), Float(a))
        }
    }

    struct ColourAction: EntityAction {
        let startColour: SIMD4<Float>
        let targetColour: SIMD4<Float>

        var animatedValueType: (any AnimatableData.Type)? { SIMD4<Float>.self }

        init(startColour: UIColor, targetColour: UIColor) {
            self.startColour = startColour.float4
            self.targetColour = targetColour.float4
        }

        static func registerEntityAction() {
            ColourAction.subscribe(to: .updated) { event in
                guard let animationState = event.animationState else { return }
                let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
                animationState.storeAnimatedValue(interpolatedColour)
            }
        }
    }

    extension Entity {
        func updateColour(from currentColour: UIColor, to targetColour: UIColor, duration: Double) {
            let colourAction = ColourAction(startColour: currentColour, targetColour: targetColour)
            if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction, duration: duration, bindTarget: .material(0).baseColorTint) {
                playAnimation(colourAnimation)
            }
        }
    }

The EntityAction can only be applied to an entity with a ModelComponent (because of the material), so it can be called like so:

    guard let modelComponent = entity.components[ModelComponent.self],
          let material = modelComponent.materials.first as? PhysicallyBasedMaterial else { return }

    let currentColour = material.baseColor.tint
    let targetColour = UIColor(_colorLiteralRed: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)

    entity.updateColour(from: currentColour, to: targetColour, duration: 2)
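As a sanity check on the colour-space suspicion (my own arithmetic, not a confirmed diagnosis): 0.735357 is almost exactly what the standard sRGB transfer function produces for a linear-light value of 0.5, which points to a linear-vs-sRGB mismatch somewhere in the pipeline.

    import Foundation

    // Standard sRGB encoding of a linear-light component (IEC 61966-2-1).
    func sRGBEncode(_ linear: Double) -> Double {
        linear <= 0.0031308 ? 12.92 * linear : 1.055 * pow(linear, 1.0 / 2.4) - 0.055
    }

    print(sRGBEncode(0.5)) // ≈ 0.735357, matching the observed tint value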
1 reply · 0 boosts · 563 views · Feb ’25
Subdivision shows in RealityComposerPro but not when loaded in Simulator
Hello, I am trying to use the subdivision mesh rendering option. I can see it working in Reality Composer Pro (screenshot omitted), but not when loading the asset and displaying it in the Simulator (screenshot omitted), using this code:

    import SwiftUI
    import RealityKit
    import RealityKitContent

    struct AirspaceView: View {
        // MARK: - VIEW BODY
        var body: some View {
            RealityView { content in
                if let a = try? await Entity(named: "Models/Test/Test.usdc", in: realityKitContentBundle) {
                    content.add(a)
                }
            }
        }
    }

Any ideas why?
2 replies · 1 boost · 551 views · Feb ’25
Animating a RealityComposerPro shader's uniform input value
I'm trying to build a shader in Reality Composer Pro that updates from a start time. Initially I tried the following (shader graph screenshot omitted). The idea was that when startTime was 0 the output would be 0, but then I would set startTime from within code, it would be compared with the current GPU time, and the difference used to drive another part of the shader graph:

    if let testEntity = root.findEntity(named: "Test"),
       var shaderGraphMaterial = testEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial {
        let time = CFAbsoluteTimeGetCurrent()
        try! shaderGraphMaterial.setParameter(name: "StartTime", value: .float(Float(time)))
        testEntity.components[ModelComponent.self]?.materials[0] = shaderGraphMaterial
    }

However, I haven't found a reference to the time the shader would be using, so now I am trying to write an EntityAction to achieve the same effect. Instead of comparing a start time to the GPU's time, I'm trying to animate one of the shader's uniform inputs, but I'm not sure how to specify the bind target. Here's my attempt so far:

    import RealityKit

    struct ShaderAction: EntityAction {
        let startValue: Float
        let targetValue: Float

        var animatedValueType: (any AnimatableData.Type)? { Float.self }

        static func registerEntityAction() {
            ShaderAction.subscribe(to: .updated) { event in
                guard let animationState = event.animationState else { return }
                let value = simd_mix(event.action.startValue, event.action.targetValue, Float(animationState.normalizedTime))
                animationState.storeAnimatedValue(value)
            }
        }
    }

    extension Entity {
        func updateShader(from startValue: Float, to targetValue: Float, duration: Double) {
            let fadeAction = ShaderAction(startValue: startValue, targetValue: targetValue)
            if let shaderAnimation = try? AnimationResource.makeActionAnimation(for: fadeAction, duration: duration, bindTarget: .material(0).customValue) {
                playAnimation(shaderAnimation)
            }
        }
    }

Currently, when I run this I get an assertion failure: 'Index out of range (operator[]:line 797) index = 260, max = 8'. Furthermore, even if it didn't crash, I don't understand how to pass a binding to the custom shader value "startValue". Any clues on how to achieve this effect, even if it's a completely different way?
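One completely different route (a rough sketch using hypothetical component and system names, not verified against the graph above) is to skip the action and bind target entirely and drive the shader input per frame from a custom RealityKit System via setParameter:

    import RealityKit

    // Hypothetical marker component holding the elapsed time for the shader input.
    struct ShaderClockComponent: Component {
        var elapsed: Float = 0
    }

    struct ShaderClockSystem: System {
        static let query = EntityQuery(where: .has(ShaderClockComponent.self))

        init(scene: RealityKit.Scene) {}

        func update(context: SceneUpdateContext) {
            for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
                guard var clock = entity.components[ShaderClockComponent.self],
                      var material = entity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial else { continue }
                clock.elapsed += Float(context.deltaTime)
                // "StartTime" is the graph input named above; adjust to suit.
                try? material.setParameter(name: "StartTime", value: .float(clock.elapsed))
                entity.components[ModelComponent.self]?.materials[0] = material
                entity.components.set(clock)
            }
        }
    }

    // Registration, e.g. at app start:
    // ShaderClockComponent.registerComponent()
    // ShaderClockSystem.registerSystem()
    // then entity.components.set(ShaderClockComponent()) on the target entity.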
1 reply · 0 boosts · 598 views · Feb ’25
GCControllerDidConnect notification not received in VisionOS 2.0
I am unable to get the visionOS 2.0 simulator to receive the GCControllerDidConnect notification, and thus am unable to set up support for a gamepad. However, it works in visionOS 1.2. For visionOS 2.0 I've tried adding:

- the .handlesGameControllerEvents(matching: .gamepad) modifier to the view
- Supports Controller User Interaction to Info.plist
- Supported game controller types -> Extended Gamepad to Info.plist

...but the notification still doesn't fire. It does fire when the code is run on the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled. Here is the example code. It's based on the Xcode project template; the only files updated were ImmersiveView.swift and Info.plist, as detailed above:

    import SwiftUI
    import GameController
    import RealityKit
    import RealityKitContent

    struct ImmersiveView: View {
        var body: some View {
            RealityView { content in
                // Add the initial RealityKit content
                if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                    content.add(immersiveContentEntity)
                }

                NotificationCenter.default.addObserver(
                    forName: NSNotification.Name.GCControllerDidConnect,
                    object: nil,
                    queue: nil) { _ in
                        print("Handling GCControllerDidConnect notification")
                }
            }
            .modify {
                if #available(visionOS 2.0, *) {
                    $0.handlesGameControllerEvents(matching: .gamepad)
                } else {
                    $0
                }
            }
        }
    }

    extension View {
        func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
            return modifier(self)
        }
    }
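One thing worth checking (an illustrative sketch, not a confirmed fix): GCControllerDidConnect only fires for connections made after the observer is registered, so a controller the simulator attached earlier can still be found by querying GCController.controllers() directly:

    import GameController

    /// Illustrative: pick up controllers that were already connected before
    /// the GCControllerDidConnect observer was added.
    func checkForExistingControllers() {
        for controller in GCController.controllers() {
            print("Already-connected controller: \(controller.vendorName ?? "unknown")")
            // ...configure controller.extendedGamepad handlers here...
        }
    }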
2 replies · 1 boost · 937 views · Dec ’24
Creating and applying a shader to change an entity's rendering in RealityKit
Hello, I am looking to create a shader to update an entity's rendering. As a basic example, say I want to recolour an entity but leave its original textures showing through (example image omitted).

I understand that with visionOS I need to use Reality Composer Pro to create the shader, but I'm lost as to how to reference, in the node graph, the original colour that I'm trying to update. All my attempts appear to completely override the textures of the entity (and its sub-entities) that I want to affect. Also, the tutorials and examples I've looked at appear to create materials rather than add an effect on top of existing materials. Any hints or pointers?

Assuming this is possible, I've been trying to load the material in code and apply it to an entity. But do I need to do this to all child entities, or just the topmost?

    do {
        let entity = MyAssets.createModelEntity(.plane) // Loads from bundle and performs config
        let material = try await ShaderGraphMaterial(named: "/Root/TestMaterial", from: "Test", in: realityKitContentBundle)
        entity.applyToChildren {
            $0.components[ModelComponent.self]?.materials = [material]
        }
        root.addChild(entity)
    } catch {
        fatalError(error.localizedDescription)
    }
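For reference, applyToChildren above is not a RealityKit API but a custom helper; a sketch of the kind of recursive walk it presumably performs (my reconstruction, not the original helper) is:

    import RealityKit

    extension Entity {
        /// Applies the closure to this entity and every descendant in the hierarchy.
        func applyToChildren(_ body: (Entity) -> Void) {
            body(self)
            for child in children {
                child.applyToChildren(body)
            }
        }
    }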
3 replies · 0 boosts · 928 views · Oct ’24
EntityAction implementation example
Does anyone have experience of creating their own EntityActions? Say, for example, I wanted one that faded up the opacity of an entity, then once it had completed, set another property on one of the entity's components. I understand that I could use the FromToByAction to control the opacity (and have this working), but I am interested to learn how to create my own dedicated EntityAction, and am finding the documentation hard to fathom.

I got as far as creating a struct conforming to the EntityAction protocol:

    struct FadeUpAction: EntityAction {
        var animatedValueType: (any AnimatableData.Type)? { Float.self }
    }

Subscribing to update events on this:

    FadeUpAction.subscribe(to: .updated) { event in
        guard let animationState = event.animationState else { return }
        // My animation state is always nil, so I never get here!
        let newValue = ... // Some calc
        animationState.storeAnimatedValue(newValue)
    }

And setting it up as an animation on an entity:

    let action = FadeUpAction()
    if let animation = try? AnimationResource.makeActionAnimation(
        for: action,
        duration: 2.0,
        bindTarget: .opacity
    ) {
        entity.playAnimation(animation)
    }

...but I haven't been able to understand how to extract the current time delta or set the value in the event handler. Any pointers?
2 replies · 0 boosts · 590 views · Oct ’24