
Reply to Mac Catalyst SwiftUI – .focused() not working
Anyone? This is one of the last bits of polish I'd like to add to my Catalyst app, but I just don't see what's going wrong. Based on these docs: https://developer.apple.com/documentation/swiftui/view/focusable(_:interactions:)

"The focus interactions allowed for custom views changed in macOS 14—previously, custom views could only become focused with keyboard navigation enabled system-wide. Clients built using older SDKs will continue to see the older focus behavior, while custom views in clients built using macOS 14 or later will always be focusable unless the client requests otherwise by specifying a restricted set of focus interactions."

It reads to me like this should be possible on macOS 14 or later? Essentially I want behavior similar to Reality Composer Pro: when the user clicks the scene view, keyboard controls are enabled for moving around in the scene, whereas when focus changes to, e.g., elements in the sidebar, keyboard navigation cycles between them.

What does work in my example above is using the arrow keys right away to focus the view. Although then I only get the focus ring; the isFocused FocusState still does not update. Very confusing. This sample also didn't help much: https://developer.apple.com/documentation/swiftui/focus-cookbook-sample

Is this just somewhat unfinished behavior specific to Mac Catalyst?
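For reference, here is a minimal sketch of the kind of setup I mean, using the iOS 17 / macOS 14-era focus APIs. The view and names are illustrative, not from my real project:

```swift
import SwiftUI

// Illustrative sketch: a custom view that should become focused on
// click/tap and then receive key presses while it has focus.
struct SceneFocusDemo: View {
    @FocusState private var isSceneFocused: Bool

    var body: some View {
        Rectangle()
            .fill(isSceneFocused ? .blue : .gray)
            .frame(width: 300, height: 200)
            .focusable(true, interactions: .activate) // opt in to click-to-focus
            .focused($isSceneFocused)
            .onTapGesture { isSceneFocused = true }   // request focus on click
            .onKeyPress(phases: .down) { press in
                // Only reached while the view has key focus
                print("Key pressed: \(press.characters)")
                return .handled
            }
    }
}
```

On macOS this behaves as described in the docs; it's the Mac Catalyst behavior that is in question.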
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jan ’25
Reply to Mac Catalyst SwiftUI – .focused() not working
Hi @Vision Pro Engineer and thanks for the reply! I don't necessarily need this to work on color – that was just an example to make it easily reproducible. My use case is basically: I have a view with a sidebar on the left that contains items which should be keyboard-navigable, e.g. list cells that can receive focus by moving up and down and then listen for key presses. For example, I could press backspace to delete them or return to make them renamable (turning the name label into a text field). To the right of the sidebar is a RealityKit scene view with custom camera controls. The idea is that when the user moves focus to the right, or clicks/taps the scene view, it becomes focused and then accepts keyboard input via onKeyPress. But this currently doesn't seem possible with what SwiftUI offers on Mac Catalyst? So should I rather roll my own focus management? I'm using a mix of UIKit and SwiftUI and currently retrieve key presses on a container UIViewController via pressesBegan.
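For context, this is roughly what my UIKit fallback looks like – a sketch, with illustrative names, of a container view controller that becomes first responder and handles key presses directly via pressesBegan:

```swift
import UIKit

// Sketch of the UIKit fallback mentioned above: a container view
// controller that receives hardware key presses directly,
// bypassing SwiftUI focus. Names are illustrative.
class ContainerViewController: UIViewController {
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        for press in presses {
            guard let key = press.key else { continue }
            switch key.keyCode {
            case .keyboardDeleteOrBackspace:
                print("Delete pressed – e.g. remove the selected sidebar item")
            case .keyboardReturnOrEnter:
                print("Return pressed – e.g. begin renaming")
            default:
                super.pressesBegan(presses, with: event)
                return
            }
        }
    }
}
```

This works, but it has no notion of which part of the UI is "focused", which is exactly what I was hoping SwiftUI focus would provide.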
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jan ’25
Reply to Is migrating from ARView to RealityView recommended?
I would be curious about this as well. With iOS 26 etc. we now at least have post-processing support on RealityView, but a couple of things are still missing compared to ARView, e.g. https://developer.apple.com/documentation/realitykit/arview/debugoptions-swift.struct

Also, RealityViewEnvironment seems to offer less flexibility than the UIKit API. @Environment(\.realityKitScene) var scene: RealityKit.Scene? https://developer.apple.com/documentation/swiftui/environmentvalues/realitykitscene is only available on visionOS, which is also inconvenient. However, new features like hover effects only seem to work with RealityView (FB15080805) and do nothing on ARView.

Overall it's a bit of an awkward state right now where you have to pick one, and each option brings its own disadvantages. I hope we can continue using ARView for the time being.
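To illustrate the platform restriction mentioned above, accessing the scene through the environment has to be fenced off to visionOS, roughly like this (view name and body are illustrative):

```swift
import SwiftUI
import RealityKit

// Illustrative sketch: the realityKitScene environment key only
// exists on visionOS, so any cross-platform view has to guard it.
struct SceneInspector: View {
    #if os(visionOS)
    @Environment(\.realityKitScene) var scene: RealityKit.Scene?
    #endif

    var body: some View {
        Text("Inspecting RealityKit scene")
        #if os(visionOS)
            .onAppear {
                if let scene {
                    print("Scene has \(scene.anchors.count) anchors")
                }
            }
        #endif
    }
}
```

With ARView you simply read arView.scene on any platform, which is part of the flexibility gap described above.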
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jun ’25
Reply to SwiftUI Slider will cause app to crash on macOS Tahoe RC
Using a UIViewRepresentable like this works, but the Liquid Glass animations for the slider look very wonky and choppy. On slider release there isn't even an animation at all.

```swift
import SwiftUI
import UIKit

struct ContentView: View {
    @State private var sliderValue: Double = 0.48

    var body: some View {
        VStack {
            VStack {
                SafeSlider(value: $sliderValue, range: 0...1.18, step: 0.01)
                Text("Value: \(sliderValue, specifier: "%.2f")")
            }
            .frame(maxWidth: 128)
        }
        .padding()
    }
}

struct SafeSlider<Value: BinaryFloatingPoint>: View where Value.Stride: BinaryFloatingPoint {
    @Binding var value: Value
    var range: ClosedRange<Value> = 0...1
    var step: Value = 0
    var onEditingChanged: (Bool) -> Void = { _ in }

    var body: some View {
        #if targetEnvironment(macCatalyst)
        if #available(macOS 26.0, *) {
            SliderWrapper(value: $value, range: range, step: step, onEditingChanged: onEditingChanged)
        } else {
            defaultSlider
        }
        #else
        defaultSlider
        #endif
    }

    private var defaultSlider: some View {
        Slider(value: $value, in: range, step: Value.Stride(step), onEditingChanged: onEditingChanged)
    }
}

// MARK: - UIKit wrapper for Catalyst
#if targetEnvironment(macCatalyst)
struct SliderWrapper<Value: BinaryFloatingPoint>: UIViewRepresentable where Value.Stride: BinaryFloatingPoint {
    @Binding var value: Value
    var range: ClosedRange<Value>
    var step: Value
    var onEditingChanged: (Bool) -> Void

    func makeUIView(context: Context) -> UISlider {
        let slider = UISlider()
        slider.minimumValue = Float(range.lowerBound)
        slider.maximumValue = Float(range.upperBound)
        slider.value = Float(value)
        slider.addTarget(context.coordinator, action: #selector(Coordinator.valueChanged(_:)), for: .valueChanged)
        slider.addTarget(context.coordinator, action: #selector(Coordinator.dragStarted(_:)), for: .touchDown)
        slider.addTarget(context.coordinator, action: #selector(Coordinator.dragEnded(_:)), for: [.touchUpInside, .touchUpOutside, .touchCancel])
        return slider
    }

    func updateUIView(_ uiView: UISlider, context: Context) {
        uiView.minimumValue = Float(range.lowerBound)
        uiView.maximumValue = Float(range.upperBound)
        uiView.value = Float(value)
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(value: $value, step: step, onEditingChanged: onEditingChanged)
    }

    class Coordinator: NSObject {
        var value: Binding<Value>
        var step: Value
        var onEditingChanged: (Bool) -> Void

        init(value: Binding<Value>, step: Value, onEditingChanged: @escaping (Bool) -> Void) {
            self.value = value
            self.step = step
            self.onEditingChanged = onEditingChanged
        }

        @objc func valueChanged(_ sender: UISlider) {
            var newValue = Value(sender.value)
            if step != 0 {
                // Snap to the nearest step and push the snapped value back to the control
                newValue = (newValue / step).rounded() * step
                sender.value = Float(newValue)
            }
            value.wrappedValue = newValue
        }

        @objc func dragStarted(_ sender: UISlider) { onEditingChanged(true) }
        @objc func dragEnded(_ sender: UISlider) { onEditingChanged(false) }
    }
}
#endif
```
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Sep ’25
Reply to Shared/GroupImmersive Space – Query Local Device Transform
Okay, I just wasted over 24h solving this, so hopefully this will save someone some pain: on visionOS 26, if you have an ImmersiveSpace and a regular WindowGroup in your app and want to use SharePlay with the ImmersiveSpace, you should set groupActivityAssociation on the view within the ImmersiveSpace:

```swift
ImmersiveSpace(id: "immersiveSpaceID", for: AppScene.self) { $scene in
    let view = makeMyView()
    if #available(visionOS 26.0, *) {
        view
            .groupActivityAssociation(.primary("immersiveSpaceID")) // if you don't set this you might be in for a bad time
    } else {
        view
    }
}
```

If you don't do this and query the device anchor during your SharePlay session, you will get a transform that is offset – supposedly based on the shared displacement. 💀 Once groupActivityAssociation is set, everything works just as expected. Feels like a bug to me; at the very least it's undocumented behavior. https://developer.apple.com/documentation/SwiftUI/View/groupActivityAssociation(_:)

In my particular case I have a window that I hide but don't dismiss while the ImmersiveSpace is active, and apparently by default that window gets the primary association.
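For anyone wondering what "query the device anchor" refers to: below is a hedged sketch of the kind of query that returned the offset transform during the SharePlay session. It assumes an ARKitSession on visionOS; the function name is illustrative:

```swift
import ARKit
import QuartzCore

// Sketch (illustrative names): querying the device anchor via
// world tracking. During SharePlay, without the
// groupActivityAssociation fix above, this transform came back
// offset by the shared displacement.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func currentDeviceTransform() async throws -> simd_float4x4? {
    if worldTracking.state != .running {
        try await session.run([worldTracking])
    }
    guard let anchor = worldTracking.queryDeviceAnchor(
        atTimestamp: CACurrentMediaTime()
    ) else { return nil }
    return anchor.originFromAnchorTransform
}
```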
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Reply to visionOS – Starting GroupActivity FaceTime Call dismisses Immersive Space
Thanks for the quick response @Vision Pro Engineer! I've filed FB20701196 with a minimal reproduction project. I created a simplified test app based on the structure of Apple's "Building a guessing game for visionOS" sample, and it has the exact same issue in TestFlight: the immersive space transitions to the background state when the FaceTime call appears, but works fine in local builds. My FB includes a public TestFlight link (attached to the radar but awaiting review) as well as screen recordings. Since a clean implementation following the sample code structure hits this same issue in TestFlight, it seems like a platform bug rather than something wrong with my code. This is blocking our SharePlay launch, so I'm really hoping for a fix soon. Thank you!
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25