Reply to Mac Catalyst SwiftUI – . focused() not working
Hi @Vision Pro Engineer, and thanks for the reply! I don't necessarily need this to work on color – that was just an example to make it easily reproducible. My use case is basically: I have a view with a sidebar on the left that contains items which should be keyboard-navigable, e.g. list cells that can gain focus by moving up and down and then listen for key presses. For example, I could press backspace to delete a cell, or return to rename it (turn the name label into a text field). To the right of the sidebar is a RealityKit scene view with custom camera controls. The idea is that if the user moves focus to the right, or clicks/taps the scene view, it becomes focused and accepts keyboard input via onKeyPress. But this currently does not seem possible with what SwiftUI offers on Mac Catalyst? So should I rather roll my own focus management? I'm using a mix of UIKit and SwiftUI and currently retrieve key presses on a container UIViewController via pressesBegan.
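For reference, my pressesBegan-based fallback looks roughly like this (a minimal sketch; the FocusTarget enum and the delete/rename handlers are hypothetical names, not the actual project code):

```swift
import UIKit

class ContainerViewController: UIViewController {

    // Hypothetical: tracks which pane currently "owns" keyboard input.
    enum FocusTarget { case sidebar, sceneView }
    var focusTarget: FocusTarget = .sidebar

    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        guard focusTarget == .sidebar, let key = presses.first?.key else {
            super.pressesBegan(presses, with: event)
            return
        }
        switch key.keyCode {
        case .keyboardDeleteOrBackspace:
            deleteSelectedCell()          // hypothetical handler
        case .keyboardReturnOrEnter:
            beginRenamingSelectedCell()   // hypothetical handler
        default:
            super.pressesBegan(presses, with: event)
        }
    }

    func deleteSelectedCell() { /* ... */ }
    func beginRenamingSelectedCell() { /* ... */ }
}
```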
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jan ’25
Reply to Loading USDZ with particle system crashes on Intel Macs
Fixed in macOS 15.4 Beta 2. 🎉
Mar ’25
Reply to Mac Catalyst SwiftUI – . focused() not working
Hello @Vision Pro Engineer, apologies for the late reply! I filed a bug with the following ID: FB17160401. Thank you!
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Apr ’25
Reply to RealityRenderer's Perspective Camera's FOV
Hi, you should be able to set the FOV yourself using https://developer.apple.com/documentation/realitykit/perspectivecameracomponent/fieldofviewindegrees Not sure how far you can push it, though; I would just give it a try. 60 degrees appears to be the default.
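Something along these lines (a sketch; the camera entity setup is assumed, and how extreme a value the renderer tolerates is untested):

```swift
import RealityKit

// Assumes you already have (or create) a camera entity in your scene.
let cameraEntity = Entity()
var camera = PerspectiveCameraComponent()
camera.fieldOfViewInDegrees = 90 // default appears to be 60
cameraEntity.components.set(camera)
```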
Topic: Spatial Computing SubTopic: ARKit Tags:
May ’25
Reply to Is migrating from ARView to RealityView recommended?
I would be curious about this as well. With iOS 26 etc. we now at least have post-processing support on RealityView, but there are still a couple of things missing from ARView, e.g. https://developer.apple.com/documentation/realitykit/arview/debugoptions-swift.struct Also, using RealityViewEnvironment seems to offer less flexibility than the UIKit API. @Environment(\.realityKitScene) var scene: RealityKit.Scene? https://developer.apple.com/documentation/swiftui/environmentvalues/realitykitscene is only available on visionOS, which is also inconvenient. However, new features like hover effects only seem to work with RealityView (FB15080805) and do nothing on ARView. Overall it's a bit of an awkward state right now, where you have to pick one and each option brings its own disadvantages. I hope we can continue using ARView for the time being.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jun ’25
Reply to SwiftUI Slider will cause app to crash on macOS Tahoe RC
Using a UIViewRepresentable like this works, but the Liquid Glass animations for the slider look very wonky and choppy. On slider release there isn't even an animation at all.

```swift
import SwiftUI
import UIKit

struct ContentView: View {
    @State private var sliderValue: Double = 0.48

    var body: some View {
        VStack {
            VStack {
                SafeSlider(value: $sliderValue, range: 0...1.18, step: 0.01)
                Text("Value: \(sliderValue, specifier: "%.2f")")
            }
            .frame(maxWidth: 128)
        }
        .padding()
    }
}

struct SafeSlider<Value: BinaryFloatingPoint>: View where Value.Stride: BinaryFloatingPoint {
    @Binding var value: Value
    var range: ClosedRange<Value> = 0...1
    var step: Value = 0
    var onEditingChanged: (Bool) -> Void = { _ in }

    var body: some View {
        #if targetEnvironment(macCatalyst)
        if #available(macOS 26.0, *) {
            SliderWrapper(value: $value, range: range, step: step, onEditingChanged: onEditingChanged)
        } else {
            defaultSlider
        }
        #else
        defaultSlider
        #endif
    }

    private var defaultSlider: some View {
        Slider(value: $value, in: range, step: Value.Stride(step), onEditingChanged: onEditingChanged)
    }
}

// MARK: - UIKit wrapper for Catalyst
#if targetEnvironment(macCatalyst)
struct SliderWrapper<Value: BinaryFloatingPoint>: UIViewRepresentable where Value.Stride: BinaryFloatingPoint {
    @Binding var value: Value
    var range: ClosedRange<Value>
    var step: Value
    var onEditingChanged: (Bool) -> Void

    func makeUIView(context: Context) -> UISlider {
        let slider = UISlider()
        slider.minimumValue = Float(range.lowerBound)
        slider.maximumValue = Float(range.upperBound)
        slider.value = Float(value)
        slider.addTarget(context.coordinator, action: #selector(Coordinator.valueChanged(_:)), for: .valueChanged)
        slider.addTarget(context.coordinator, action: #selector(Coordinator.dragStarted(_:)), for: .touchDown)
        slider.addTarget(context.coordinator, action: #selector(Coordinator.dragEnded(_:)), for: [.touchUpInside, .touchUpOutside, .touchCancel])
        return slider
    }

    func updateUIView(_ uiView: UISlider, context: Context) {
        uiView.minimumValue = Float(range.lowerBound)
        uiView.maximumValue = Float(range.upperBound)
        uiView.value = Float(value)
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(value: $value, step: step, onEditingChanged: onEditingChanged)
    }

    class Coordinator: NSObject {
        var value: Binding<Value>
        var step: Value
        var onEditingChanged: (Bool) -> Void

        init(value: Binding<Value>, step: Value, onEditingChanged: @escaping (Bool) -> Void) {
            self.value = value
            self.step = step
            self.onEditingChanged = onEditingChanged
        }

        @objc func valueChanged(_ sender: UISlider) {
            var newValue = Value(sender.value)
            if step != 0 {
                // Snap to the nearest step increment.
                newValue = (newValue / step).rounded() * step
                sender.value = Float(newValue)
            }
            value.wrappedValue = newValue
        }

        @objc func dragStarted(_ sender: UISlider) { onEditingChanged(true) }
        @objc func dragEnded(_ sender: UISlider) { onEditingChanged(false) }
    }
}
#endif
```
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Sep ’25
Reply to App Clips don't work
Any news here? Some links now seem to resolve for me; others still show "App Clip unavailable". That is really frustrating.
Topic: UI Frameworks SubTopic: General Tags:
Sep ’25
Reply to AppClip Not Launching from Physical QR Codes
Same problem here – really frustrating and annoying, since I can't find anything on my side that would cause this. And it worked perfectly fine just days ago.
Sep ’25
Reply to App Clip Unavailable
Did anyone find a solution for this? Struggling with the same problem and no reply from Apple so far.
Topic: App & System Services SubTopic: General Tags:
Sep ’25
Reply to Shared/GroupImmersive Space – Query Local Device Transform
I just discovered https://developer.apple.com/documentation/swiftui/environmentvalues/immersivespacedisplacement and will try it to see if it gives me what I want.
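If it helps anyone, reading the value should be as simple as this (a sketch, assuming a view inside the immersive space; whether the inverse is the right correction for your setup needs verifying):

```swift
import SwiftUI
import RealityKit

struct ImmersiveContentView: View {
    // Pose that visionOS applies to the immersive space's origin,
    // e.g. to align participants during SharePlay.
    @Environment(\.immersiveSpaceDisplacement) private var displacement

    var body: some View {
        RealityView { _ in
            // Applying the inverse should map a device-anchor transform
            // back into the displaced scene's coordinate space.
            let correction = displacement.inverse
            _ = correction
        }
    }
}
```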
Topic: Spatial Computing SubTopic: General Tags:
Sep ’25
Reply to Shared/GroupImmersive Space – Query Local Device Transform
Okay, I just wasted over 24 hours solving this, so hopefully this will save someone some pain: on visionOS 26, if you have an ImmersiveSpace and a regular WindowGroup in your app and want to use SharePlay with the ImmersiveSpace, you should set groupActivityAssociation on the view within the ImmersiveSpace:

```swift
ImmersiveSpace(id: "immersiveSpaceID", for: AppScene.self) { $scene in
    let view = makeMyView()
    if #available(visionOS 26.0, *) {
        view
            // If you don't set this you might be in for a bad time.
            .groupActivityAssociation(.primary("immersiveSpaceID"))
    } else {
        view
    }
}
```

If you don't do this and query the device anchor during your SharePlay session, you will get a transform that is offset – supposedly based on the shared displacement. 💀 Once groupActivityAssociation is set, everything works as expected. Feels like a bug to me, but at the very least it's undocumented behavior. https://developer.apple.com/documentation/SwiftUI/View/groupActivityAssociation(_:) In my particular case I have a window that I hide but don't dismiss while the ImmersiveSpace is active, and apparently by default that window gets the primary association.
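For context, "querying the device anchor" means something along these lines (a minimal ARKit sketch; session lifecycle management and error handling omitted):

```swift
import ARKit
import QuartzCore

// Returns the device's current transform relative to the app's origin,
// or nil if tracking data isn't available yet.
func currentDeviceTransform() async throws -> simd_float4x4? {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    try await session.run([worldTracking])
    guard let anchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
        return nil
    }
    return anchor.originFromAnchorTransform
}
```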
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Reply to ManipulationComponent Not Translating using indirect input
I can confirm this bug still exists in the newest visionOS beta.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Reply to Manipulation stops working when changing rooms
Yes, this bug made me refrain from using ManipulationComponent so far in a production app. Surprised it made it into the release version.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Reply to visionOS – Starting GroupActivity FaceTime Call dismisses Immersive Space
Thanks for the quick response @Vision Pro Engineer! I've filed FB20701196 with a minimal reproduction project. I created a simplified test app based on the structure of Apple's "Building a guessing game for visionOS" sample, and it has the exact same issue in TestFlight: the immersive space transitions to the background state when FaceTime appears, but works fine in local builds. Since a clean implementation following the sample code's structure hits the same issue in TestFlight, it seems like a platform bug rather than something wrong with my code. This is blocking our SharePlay launch, so I'm really hoping for a fix soon. A public TestFlight link is attached to the radar (awaiting review), and there are also screen recordings. Thank you!
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Reply to visionOS – Starting GroupActivity FaceTime Call dismisses Immersive Space
@Vision Pro Engineer I did some more digging, and it seems clear that the ImmersiveSpace gets backgrounded as soon as the FaceTime call is initiated from the SharePlay UI. Again, that only happens in the TestFlight environment. As a workaround, maybe I can try dismissing the immersive space manually first and then restarting it once the .activate() call has finished. Not great, but I'm not sure there is another way currently. Here's a full video of what's happening: https://jumpshare.com/s/dOcSWPmHIPaiVBM3rE8v
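The workaround I have in mind would look roughly like this (a sketch; MySharedActivity and the space ID are placeholders, and whether reopening after activation sticks in TestFlight is exactly what I'd be testing):

```swift
import SwiftUI
import GroupActivities

// Placeholder activity type, just for the sketch.
struct MySharedActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.type = .generic
        return meta
    }
}

struct StartSessionButton: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Button("Start SharePlay") {
            Task {
                // Dismiss first, so the FaceTime hand-off can't background
                // a live immersive space...
                await dismissImmersiveSpace()
                // ...then reopen once activation has finished.
                _ = try? await MySharedActivity().activate()
                _ = await openImmersiveSpace(id: "immersiveSpaceID")
            }
        }
    }
}
```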
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25