In my SwiftUI iOS app, I need to detect which key (and modifier flags – Command, Option, Shift, Control) a user presses, but I don't want to pre-register them using .keyboardShortcut(_:modifiers:).
My use case is that keyboard shortcuts are user-configurable, so I need to capture the actual key + modifier combination dynamically at runtime and perform the appropriate action based on the user’s settings.
Questions:
What is the recommended way to detect arbitrary key + modifier combinations in SwiftUI on iOS?
Is there a SwiftUI-native solution for this, or should I rely on UIPressesEvent and wrap it with UIViewControllerRepresentable?
If UIKit bridging is necessary, what is the cleanest pattern for integrating this with SwiftUI views (e.g., Buttons)?
Any official guidance or best practices would be greatly appreciated!
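For context, here is a minimal sketch of the UIKit bridge I'm considering (KeyCaptureView and KeyCaptureController are my own names, not an existing API), in case that helps frame the question:

import SwiftUI
import UIKit

// A hidden view controller that becomes first responder and reports hardware
// key presses plus their modifier flags back to SwiftUI.
struct KeyCaptureView: UIViewControllerRepresentable {
    var onKeyPress: (UIKey) -> Void

    func makeUIViewController(context: Context) -> KeyCaptureController {
        let controller = KeyCaptureController()
        controller.onKeyPress = onKeyPress
        return controller
    }

    func updateUIViewController(_ uiViewController: KeyCaptureController, context: Context) {
        uiViewController.onKeyPress = onKeyPress
    }
}

final class KeyCaptureController: UIViewController {
    var onKeyPress: ((UIKey) -> Void)?

    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        guard let key = presses.first?.key else {
            super.pressesBegan(presses, with: event)
            return
        }
        // key.charactersIgnoringModifiers is the base key;
        // key.modifierFlags carries Command/Option/Shift/Control.
        onKeyPress?(key)
    }
}

I'd embed it behind my content with .background(KeyCaptureView { key in ... }), but keeping first-responder status alongside other controls is the part I'm least sure about.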
I'm writing an iOS app that shares content with buddies. So my app will run in one language but the shared content will use the locale configured for the buddy.
I found this Apple documentation, which suggests that passing locale: Locale(identifier:) is the solution.
Apple Documentation
But I can't get it to work.
Here's sample code.
struct LocalizationDemoView: View {
    @State var isEnglish = true

    var body: some View {
        var myLocale: String { isEnglish ? "en" : "de" }
        VStack {
            Toggle("Switch language", isOn: $isEnglish).frame(maxWidth: 200)
            HStack {
                Text("\(myLocale): ")
                Text(String(localized: "Hello world!", locale: Locale(identifier: myLocale), comment: "To share"))
            }
        }
    }
}
And here's the excerpt from the string catalog:
{
  "sourceLanguage" : "en",
  "strings" : {
    "Hello world!" : {
      "comment" : "To share",
      "localizations" : {
        "de" : {
          "stringUnit" : {
            "state" : "translated",
            "value" : "🇩🇪 Moin Welt!"
          }
        },
        "en" : {
          "stringUnit" : {
            "state" : "translated",
            "value" : "🇬🇧 Hello world!"
          }
        }
      }
    }
  }
}
Has Apple reduced support for string catalogs or is my code wrong?
Xcode 16.4, compiled on macOS 15.6.1; device running iOS 18.6.2.
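For reference, the workaround I'm experimenting with loads the language-specific .lproj bundle and passes it explicitly (sharedString is my own helper, not an API), but I'd prefer the documented locale: parameter to work:

// Hypothetical helper: resolve a catalog string for an explicit language
// by loading that language's .lproj bundle directly.
func sharedString(_ key: String.LocalizationValue, language: String) -> String {
    guard let path = Bundle.main.path(forResource: language, ofType: "lproj"),
          let bundle = Bundle(path: path) else {
        return String(localized: key)
    }
    return String(localized: key, bundle: bundle)
}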
For a number of complex views in my app, I need to embed scrollable objects (like List, Form, or TextEditor) in a ScrollView. To do that, I need to limit the height of the embedded object.
What I would like to do is set that limit to the actual height of the content displayed in the List, Form, TextEditor, etc. (Note that this height calculation should be dynamic, so that content changes are properly displayed.)
I attempted the following:
@State var listHeight: CGFloat = .zero

List {
    // ... content here
}
.onScrollGeometryChange(for: CGFloat.self, of: { geometry in
    return geometry.contentSize.height
}, action: { oldValue, newValue in
    if oldValue != newValue {
        listHeight = newValue
    }
})
.frame(height: listHeight)
.scrollDisabled(true)
This does not work because geometry.contentSize.height is always 0, so it appears that .onScrollGeometryChange does not interact with the internal scrolling mechanism of List. (Note, however, that .scrollDisabled does work.)
Does anyone have a suggestion on how I might get this to work?
Problem description:
When using our custom keyboard in various system apps, a gray area is always displayed at the top of the keyboard. Even after restarting the system, restarting the custom keyboard, and repeatedly dismissing and re-presenting it in multiple system apps, the gray bar still appears. The same problem occurs with other third-party keyboard apps.
Reproduction steps:
Set the current input method to the custom keyboard ("CustomKeyboard").
In the Notes app or any other system app, bring up the custom keyboard, or switch to it.
Observe that a gray area is always displayed above the custom keyboard.
Request:
I would like to understand whether the current behavior is a bug, or whether custom keyboards are expected to adapt their style accordingly. However, I have not found a way to modify the style or color of the gray bar in question, and I do not have a suitable adaptation at this time. I would appreciate further assistance.
Topic:
UI Frameworks
SubTopic:
UIKit
In iOS 26, with the introduction of the new prominent-style buttons such as the system Done button, how can we apply a tint color to these buttons globally, at the app level?
We are only able to set it per button using barButtonItem.tintColor and need a way to apply it globally.
We’ve tried:
UIBarButtonItem.appearance().tintColor
UIBarButtonItem.appearance(whenContainedInInstancesOf: [UINavigationController.self]).tintColor
but nothing worked.
sample code:
let doneButton = UIBarButtonItem(barButtonSystemItem: .done, target: nil, action: nil)
doneButton.tintColor = .systemPink
Is there a new recommended way to globally style UIBarButtonItem with the prominent style in iOS 26?
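One fallback we haven't fully tested is setting the tint once at the window level, since tintColor normally cascades down the view hierarchy to bar items; we don't know whether the new prominent style honors it in iOS 26. A rough sketch (RootViewController is a placeholder name):

import UIKit

final class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        window.tintColor = .systemPink // cascades to bar button items unless overridden
        window.rootViewController = UINavigationController(rootViewController: RootViewController())
        window.makeKeyAndVisible()
        self.window = window
    }
}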
In SwiftUI for macOS, when implementing a DragGesture inside a ScrollView, how can I implement auto-scrolling when the mouse is not actively moving?
In AppKit, this would normally be done with a periodic event so that auto-scrolling continues to take place even if the user isn't actively moving the mouse. This is essential behaviour when implementing something like a drag-to-select gesture.
NSView.autoscroll(with: NSEvent) -> Bool
Is there anything in SwiftUI or ScrollView to accomplish this behaviour?
If you create a NavigationSplitView, the sidebar is automatically adorned with a sidebar material effect. This affects the view's background as well as any controls in the view.
What is the correct way to disable this behaviour so that I can use a NavigationSplitView without the material effects being applied?
The best I've come up with so far is to explicitly set the background on the sidebar but I'm curious if that's the correct way or I'm just getting lucky.
struct ContentView: View {
    var body: some View {
        NavigationSplitView {
            // This works, but is it correct?
            SidebarView()
                .background(.windowBackground)
        } detail: {
            DetailView()
        }
    }
}
At WWDC the scroll edge effect was presented as a variable blur combined with a soft gradient.
Since beta 4, the bottom scroll edge effect has no blur; the top bar sometimes has one and sometimes does not. Even in my own app I have two views with totally different results. This was not changed back in beta 5, so I assume this is not a bug but an intended change.
It's hard to design for such a moving target. The missing bottom blur in particular looks bad in a lot of places in my app, and I'm debating not shipping the new design if this doesn't get resolved.
Is there any guidance on how this effect is supposed to look now? At this point it seems random, and there's no way to adjust how it looks.
Phone: Top blur, bottom no blur
Settings: Top blur, bottom no blur
Notes: No blur at all
Music: No blur at all
My own app: No top blur
My own app: Top blur
I implemented a subclass of NSCollectionViewItem:
class ViewItem: NSCollectionViewItem {
    override func loadView() {
        self.view = NSView()
    }
}
and registered it with my NSCollectionView subclass:
class PictureFrameThemeListView: NSCollectionView {
    init(viewModel: PictureFrameViewModel) {
        super.init(frame: .zero)
        self.register(ViewItem.self, forItemWithIdentifier: .item)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
However, when makeItem(withIdentifier:for:) is called, the following error is reported:
FAULT: NSInternalInconsistencyException: -[NSNib _initWithNibNamed:bundle:options:] could not load the nib 'Item' in bundle NSBundle
What confuses me is that I registered a subclass of NSCollectionViewItem, so why does NSCollectionView try to load an NSNib?
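For completeness, the dequeue call looks roughly like this (a sketch, assuming the same .item identifier extension used above). If I read the documentation correctly, NSCollectionView only falls back to loading a nib named after the identifier when nothing is registered for that exact identifier, which is why I expected my class registration to be used:

func collectionView(_ collectionView: NSCollectionView,
                    itemForRepresentedObjectAt indexPath: IndexPath) -> NSCollectionViewItem {
    // This identifier must exactly match the one passed to register(_:forItemWithIdentifier:).
    return collectionView.makeItem(withIdentifier: .item, for: indexPath)
}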
Hello,
It seems impossible to control the hit area of a TextField, and it's already too small by default (as detected with Accessibility Inspector).
I tried to force it with the frame(minHeight:) modifier; the component's height changes, but the hit area stays the same size.
Any tips to fix this issue?
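The workaround I'm experimenting with (a sketch; layout values are just examples) makes the whole padded frame hit-testable and forwards taps to the field through focus:

import SwiftUI

struct BigHitAreaTextField: View {
    @Binding var text: String
    @FocusState private var isFocused: Bool

    var body: some View {
        TextField("Name", text: $text)
            .focused($isFocused)
            .padding(.vertical, 12)      // enlarge the visible area
            .frame(minHeight: 44)        // minimum recommended tap target
            .contentShape(Rectangle())   // make the whole frame hit-testable
            .onTapGesture { isFocused = true }
    }
}

But I haven't verified how this behaves with VoiceOver, hence the question.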
I have an Apple Watch Series 6 (Model: M00H3LL/A) that was working perfectly before I updated it to the watchOS 26.0 beta on 21st August 2025. The update process completed smoothly while the watch was on the charger, but immediately after the update the display became extremely dim.
It’s not completely black, but so faint that it’s very difficult to see. I’ve already tried setting the brightness to maximum and also enabled maximum brightness under Accessibility settings, but the issue remains. I also tried force restarting the watch (holding the Digital Crown and side button together), but that did not resolve the issue.
I know this is not a hardware issue, since the problem appeared immediately after the update, and at times the display briefly returns to normal brightness for just a few milliseconds (like a flicker).
My current watchOS version is 26.0 (23R5350a). Could you please advise on how I can fix this, or confirm whether this is a known issue with the beta version?
Thank you.
Topic:
UI Frameworks
SubTopic:
General
This method on UIEvent gets you more touch positions, and I think it's useful for a drawing app, to respond with greater precision to the position of the Pencil stylus.
Is there a similar thing in macOS, for mouse or tablet events? I found this property mouseCoalescingEnabled, but the docs there don't describe how to get the extra events.
I am building a centralized event handling system for UIKit controls and gesture recognizers. My current approach registers events using static methods inside a handler class, like this:
internal class TWOSInternalCommonEventKerneliOS {
    internal static func RegisterTouchUpInside(_ pWidget: UIControl) -> Void {
        pWidget.addTarget(
            TWOSInternalCommonEventKerneliOS.self,
            action: #selector(TWOSInternalCommonEventKerneliOS.WidgetTouchUpInsideListener(_:)),
            for: .touchUpInside
        )
    }

    @objc
    internal static func WidgetTouchUpInsideListener(_ pWidget: UIView) -> Void {
        print("WidgetTouchUpInside")
    }
}
This works in my testing because the methods are marked @objc and static, but I couldn’t find Apple documentation explicitly confirming whether using ClassName.self (instead of an object instance) is officially supported.
Questions:
Is this approach (passing ClassName.self as the target) recommended or officially supported by UIKit?
If not, what is the safer alternative to achieve a similar pattern, where event registration can remain in static methods but still follow UIKit conventions?
Would using a shared singleton instance as the target (e.g., TWOSInternalCommonEventKerneliOS.shared) be the correct approach, or is there a better pattern?
Looking for official guidance to avoid undefined behavior in production.
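For comparison, the singleton-based variant I'm weighing looks roughly like this (a sketch with my own placeholder names; the target is a long-lived instance rather than the metatype, which matches the documented target-action pattern):

import UIKit

internal final class EventKernelSingleton {
    internal static let shared = EventKernelSingleton()
    private init() {}

    internal static func registerTouchUpInside(_ widget: UIControl) {
        // The control keeps an unretained reference to the target,
        // so the shared instance must outlive the control.
        widget.addTarget(shared,
                         action: #selector(EventKernelSingleton.widgetTouchUpInside(_:)),
                         for: .touchUpInside)
    }

    @objc
    private func widgetTouchUpInside(_ sender: UIControl) {
        print("WidgetTouchUpInside")
    }
}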
Hi all,
Sharing a reproducible UIKit issue I’m seeing across multiple iPadOS 26 betas, with a tiny sample attached and a short video.
Short video
https://youtu.be/QekYNnHsfYk
Tiny project
https://github.com/yoasha/ListSwipeOvershootReproSwift
Summary
In a UISplitViewController (.doubleColumn), a UICollectionView using list layout shows a large leading-swipe overshoot when the split view is expanded (isCollapsed == false). The cell content translates roughly 3–4× the action width.
Repros with appearance = .plain and .grouped
Does not repro with .insetGrouped
Independent of trailing provider (issue persists when trailing provider is nil)
Collapsed split (compact width) behaves correctly
Environment
Devices: iPad Air (3rd gen), iPadOS 26.0 (23A5326a) → Repro
Simulators: iPad Pro 11-inch (M4), iPadOS 26.0 beta 6 → Repro
Also tested on device: iPadOS 26 beta 5, 6, 7, 8
Xcode: 26.0 beta 6 (17A5305f)
Steps to reproduce
Launch the sample; ensure the split is expanded (isCollapsed == false).
In the secondary list, set Appearance = Plain (also repros with Grouped).
Perform a leading swipe (LTR: swipe right) on any row.
Actual: content shifts ~3–4× the action width (overshoot).
Expected: content translates exactly the action width.
Switch Appearance = InsetGrouped and repeat the leading (swipe right) gesture → correct (no overshoot).
Feedback Assistant
FB ID: FB19785883 (full report + attachments filed; this forum thread mirrors the repro for wider visibility)
If anyone from Apple needs additional traces or a sysdiagnose, I can attach promptly. Thanks!
Minimal code (core of the sample):
// Secondary column VC (snippet)
var cfg = UICollectionLayoutListConfiguration(appearance: .plain) // also .grouped / .insetGrouped
cfg.showsSeparators = true
cfg.headerMode = .none
cfg.leadingSwipeActionsConfigurationProvider = { _ in
    let read = UIContextualAction(style: .normal, title: "Read") { _, _, done in done(true) }
    read.backgroundColor = .systemBlue
    let s = UISwipeActionsConfiguration(actions: [read])
    s.performsFirstActionWithFullSwipe = false
    return s
}
// Trailing provider can be nil and the bug still repros for leading swipe:
cfg.trailingSwipeActionsConfigurationProvider = nil
let layout = UICollectionViewCompositionalLayout.list(using: cfg)
let collectionView = UICollectionView(frame: view.bounds, collectionViewLayout: layout)
// … standard data source with UICollectionViewListCell + UIListContentConfiguration
// Split setup (snippet)
let split = UISplitViewController(style: .doubleColumn)
split.preferredDisplayMode = .oneBesideSecondary
split.viewControllers = [
    UINavigationController(rootViewController: PrimaryTableViewController()),
    UINavigationController(rootViewController: SecondaryListViewController())
]
I am developing an app for the iPadOS 26 beta. I would like to switch the "Multitasking & Gestures" setting to "Full Screen Apps" from within my app's code. Is there any public API available to do this?
By the way, my app is a Cordova app.
Thank you
Topic:
UI Frameworks
SubTopic:
General
In the current beta of iPadOS 26.0 (23A5297m), the camera does not function properly when the following conditions are met:
The device is set to Windowed Apps mode in the Settings app.
Multiple application windows are present on one screen.
Using AVCaptureVideoPreviewLayer.
On the other hand, UIImagePicker works properly.
Is this a specification of iPadOS 26 that cannot be avoided? Or is there an official solution or workaround available?
Normally, the camera feed should appear within the square frame of the lower window in the attached image, but it does not.
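For context, the preview setup involved is roughly the following (a simplified sketch with my own names, not the exact code from the app; error handling and permission checks omitted):

import UIKit
import AVFoundation

final class CameraPreviewView: UIView {
    private let session = AVCaptureSession()

    // Back the view with a preview layer so it resizes with the window.
    override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }

    private var previewLayer: AVCaptureVideoPreviewLayer {
        layer as! AVCaptureVideoPreviewLayer
    }

    func start() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)
        previewLayer.session = session
        previewLayer.videoGravity = .resizeAspectFill
        // startRunning blocks, so keep it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            self.session.startRunning()
        }
    }
}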
Topic:
UI Frameworks
SubTopic:
General
I’ve been testing out PaperKit from beta 1 up until 3, then took a break and on my return at beta 7 I find that the .drawing property of the PaperMarkup data model has been removed, i.e. a PKDrawing can no longer be added / modified on PaperKit. Also, the shape recognition feature seems to have been removed.
I see this as a tremendous drawback. I filed feedback already: FB19893338
Please bring it back.
Is it possible to add an ornament below a tab bar like this? Or does the whole side bar have to be an ornament to build this?
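The rough shape I have in mind is something like this (an untested sketch; I'm unsure whether an extra scene-anchored ornament composes with the system tab bar ornament or whether the whole bar would need to be custom):

import SwiftUI

struct ContentView: View {
    var body: some View {
        TabView {
            Text("Library")
                .tabItem { Label("Library", systemImage: "books.vertical") }
            Text("Settings")
                .tabItem { Label("Settings", systemImage: "gear") }
        }
        .ornament(attachmentAnchor: .scene(.bottom)) {
            // Extra controls I'd like to hang below the tab bar.
            HStack {
                Button("Play") {}
                Button("Shuffle") {}
            }
            .padding()
            .glassBackgroundEffect()
        }
    }
}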
I have a standard Mac project created using Xcode templates with Storyboard. I manually removed the window from the Storyboard file and use ViewController.h as the base class for other ViewControllers. I create new windows and ViewControllers programmatically, implementing the UI entirely through code.
However, I've discovered that on all macOS versions prior to macOS 26 beta, the application would trigger automatic termination due to App Nap under certain circumstances. The specific steps are:
Close the application window (the application doesn't exit immediately at this point)
Click on other windows or the desktop, causing the application to lose focus (though this description might not be entirely accurate - the app might have already lost focus when the window closed, but the top menu bar still shows the current application immediately after window closure)
The application automatically terminates
In previous versions, I worked around this issue by manually executing [[NSApplication sharedApplication] run]; to create an additional main event loop, keeping the application active (although I understand this isn't an ideal approach).
However, in macOS 26 beta, I've found that calling [[NSApplication sharedApplication] run]; causes all NSButton controls I created in the window to become unresponsive to click events, though they still respond to mouse enter events.
Through recent research, I've learned that the best practice for preventing App Nap automatic termination is to use [[NSProcessInfo processInfo] disableAutomaticTermination:@"User interaction required"]; to increment the automatic termination reference count.
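As a reference for what I plan to switch to, here is a minimal Swift sketch of that reference-counted approach (the reason string and the wrapper type are my own; only disableAutomaticTermination/enableAutomaticTermination are the actual API):

import Foundation

// Keeps the app out of automatic termination while user-visible work is in flight.
final class TerminationGuard {
    private let reason: String

    init(_ reason: String) {
        self.reason = reason
        ProcessInfo.processInfo.disableAutomaticTermination(reason)
    }

    deinit {
        // Balance the earlier call so the reference count returns to zero.
        ProcessInfo.processInfo.enableAutomaticTermination(reason)
    }
}

// Usage: keep the guard alive for as long as the app must stay running.
// let napGuard = TerminationGuard("User interaction required")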
My questions are:
Why did calling [[NSApplication sharedApplication] run]; work properly on macOS versions prior to 15.6.1?
Why does calling [[NSApplication sharedApplication] run]; in macOS 26 beta cause buttons to become unresponsive to click events?
I have a few view controllers in a large UIKit application that previously started showing content right below the bottom of the top navigation toolbar.
When testing the same code on iOS 26, these same views have their content extend under the navigation bar and toolbar. I was able to fix it with:
if #available(iOS 26, *) {
    self.edgesForExtendedLayout = [.bottom]
}
when running on iOS 26. I also fixed one or two places where the main view was anchored to self.view.topAnchor instead of self.view.safeAreaLayoutGuide.topAnchor.
Although this seems to work, I wonder if this was an intended change in iOS 26 or just a temporary bug in the beta that will be resolved.
Were changes made to the safe area and edgesForExtendedLayout logic in iOS 26? If so, is there a place I can see what the specific changes were, so I know my code is handling it properly?
Thanks!
Topic:
UI Frameworks
SubTopic:
UIKit