I'm looking for a way to make the following possible in Xcode (26.0 or later):
I select one or more lines of code. I then enter a hotkey (or select a menu item) that results in adding the line:
#ifdef SOME_MACRO
before the selection and adding the line:
#endif
after the selection.
Example:
Start with the following lines of code:
BOOL x = NO;
int y = 4;
NSString *str = @"Hello";
If I then highlight the int y = 4; line and use the proper hotkey or menu, the result would be:
BOOL x = NO;
#ifdef SOME_MACRO
int y = 4;
#endif
NSString *str = @"Hello";
Is something like this possible in Xcode?
I looked at code snippets, but they don't seem to support wrapping existing code.
I looked at the Xcode Settings under Editor and Shortcuts and didn't see a way to add such a custom shortcut.
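The closest approach I've come up with so far is an Xcode Source Editor Extension: its commands show up in the Editor menu and can be given a key binding in Xcode's Key Bindings settings. Below is a rough, untested sketch of such a command (the class would still need to be declared in the extension target's Info.plist, and SOME_MACRO is hard-coded here):

import Foundation
import XcodeKit

// Sketch only: wraps each selected range of lines in #ifdef SOME_MACRO / #endif.
class WrapInIfdefCommand: NSObject, XCSourceEditorCommand {
    func perform(with invocation: XCSourceEditorCommandInvocation,
                 completionHandler: @escaping (Error?) -> Void) {
        let buffer = invocation.buffer
        // Process selections back to front so the inserted lines don't shift later indices.
        for case let selection as XCSourceTextRange in buffer.selections.reversed() {
            let firstLine = selection.start.line
            // A selection that ends at column 0 of a later line doesn't really include that line.
            var lastLine = selection.end.line
            if selection.end.column == 0 && lastLine > firstLine {
                lastLine -= 1
            }
            buffer.lines.insert("#endif", at: lastLine + 1)
            buffer.lines.insert("#ifdef SOME_MACRO", at: firstLine)
        }
        completionHandler(nil)
    }
}

I'd still prefer something built into Xcode itself, if it exists.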
I have an app for macOS that is built using Mac Catalyst. I need to perform some background processing. I'm using BGProcessingTaskRequest to schedule the request. I have also integrated CKSyncEngine so I need that to be able to perform its normal background processing.
On iOS, when the user leaves the app, I can see a log message that the request was scheduled and a bit later I see log messages coming from the actual background task code.
On macOS I ran the app from Xcode. I then quit the app (Cmd-q). I can see the log message that the request was scheduled. But the actual task is never run. In my test, I ran my app on a MacBook Pro running macOS 26.0. When I quit the app, I checked the log file in the app sandbox and saw the message that the task was scheduled. About 20 minutes later I closed the lid on the MacBook Pro for the night. I did not power down, it just went to sleep. Roughly 10 hours later I opened the lid on the MacBook Pro, logged in, and checked the log file. It had not been updated since quitting the app. I should also mention that the laptop was not plugged in at all during this period.
My question is, does a Mac Catalyst app support background processing after the user quits the app? If so, how is it enabled?
The documentation for BGProcessingTaskRequest and BGProcessingTask show they are supported under Mac Catalyst, but I couldn't find any documentation in the Background Tasks section that mentioned anything specific to setup for Mac Catalyst.
Running the Settings app and going to General -> Login Items & Extensions, I do not see my app under the App Background Activity section. Does it need to be listed there? If so, what steps are needed to get it there?
If this is all documented somewhere, I'd appreciate a link since I was not able to find anything specific to making this work under Mac Catalyst.
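For reference, the scheduling follows the standard BGTaskScheduler pattern; here's a rough sketch (the identifier and logging are placeholders, not my exact code):

import BackgroundTasks

// Placeholder identifier; the real one must be listed under BGTaskSchedulerPermittedIdentifiers in Info.plist.
let processingTaskID = "com.example.myapp.processing"

func registerBackgroundTask() {
    // Called during app launch, before any request is submitted.
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: processingTaskID, using: nil) { task in
        guard let task = task as? BGProcessingTask else { return }
        task.expirationHandler = {
            // Stop work if the system cuts the task off.
        }
        // ... perform the background work, then:
        task.setTaskCompleted(success: true)
    }
}

func scheduleBackgroundProcessing() {
    // Called when the user leaves (or quits) the app. The "task scheduled" log message
    // mentioned above is written right after submit succeeds.
    let request = BGProcessingTaskRequest(identifier: processingTaskID)
    request.requiresNetworkConnectivity = true
    request.requiresExternalPower = false
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Failed to schedule background processing: \(error)")
    }
}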
Topic: App & System Services / SubTopic: Processes & Concurrency
Tags: CloudKit, macOS, Mac Catalyst, Background Tasks
I have an iPad app that supports multiple scenes.
I discovered some issues with my app's user interface that I would like to tweak based on whether the user has set multitasking (in iPadOS 26) to "Full Screen Apps" or "Windowed Apps".
Is there any API or way to determine the current iPadOS 26 multitasking setting?
I've looked at UIDevice.current.isMultitaskingSupported and UIApplication.shared.supportsMultipleScenes. Both always return true regardless of the user's multitasking choice.
I also looked at UIWindowScene's isFullScreen, which was always false. I tried UIWindowScene's windowingBehaviors, but that was always nil.
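For reference, these are roughly the checks I tried from a connected UIWindowScene (simplified here), none of which reflect the Full Screen Apps vs. Windowed Apps setting:

func logMultitaskingInfo(for scene: UIWindowScene) {
    // Both report true regardless of the user's multitasking choice.
    print("isMultitaskingSupported:", UIDevice.current.isMultitaskingSupported)
    print("supportsMultipleScenes:", UIApplication.shared.supportsMultipleScenes)

    // Always false in my testing.
    print("isFullScreen:", scene.isFullScreen)

    // Always nil in my testing.
    print("windowingBehaviors:", String(describing: scene.windowingBehaviors))
}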
Under iPadOS 26.0 and 26.1, if a view controller is presented with a presentation style of fullScreen or pageSheet, and the view controller is set up with a UISearchController that has obscuresBackgroundDuringPresentation set to true, then cancelling the search also dismisses the view controller when it should not.
To replicate, create a new iOS project based on Swift/Storyboard using Xcode 26.0 or Xcode 26.1. Update ViewController.swift with the following code:
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        view.backgroundColor = .systemBackground
        title = "Root"

        navigationItem.rightBarButtonItems = [
            UIBarButtonItem(title: "Full", primaryAction: .init(handler: { _ in
                self.showModal(with: .fullScreen)
            })),
            UIBarButtonItem(title: "Page", primaryAction: .init(handler: { _ in
                self.showModal(with: .pageSheet)
            })),
            UIBarButtonItem(title: "Form", primaryAction: .init(handler: { _ in
                self.showModal(with: .formSheet)
            })),
        ]
    }

    private func showModal(with style: UIModalPresentationStyle) {
        let vc = ModalViewController()
        let nc = UINavigationController(rootViewController: vc)
        // This triggers the double dismiss bug when set to either pageSheet or fullScreen.
        // If set to formSheet then it works fine.
        // Bug is only on iPad with iPadOS 26.0 or 26.1 beta 2.
        // Works fine on iPhone (any iOS) and iPadOS 18 as well as macOS 26.0 (not tested with other versions of macOS).
        nc.modalPresentationStyle = style
        self.present(nc, animated: true)
    }
}
Then add a new file named ModalViewController.swift with the following code:
import UIKit

class ModalViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        title = "Modal"
        view.backgroundColor = .systemBackground
        setupSearch()
    }

    private func setupSearch() {
        let sc = UISearchController(searchResultsController: UIViewController())
        sc.delegate = self // Just for debugging - being set or not does not affect the bug
        sc.obscuresBackgroundDuringPresentation = true // Critical to reproducing the bug

        navigationItem.searchController = sc
        navigationItem.preferredSearchBarPlacement = .stacked
    }

    // When the search is cancelled by tapping on the grayed out area below the search bar,
    // this is called twice when it should only be called once. This happens only if the
    // view controller is presented with a fullScreen or pageSheet presentation style.
    // The end result is that the first call properly dismisses the search controller.
    // The second call results in this view controller being dismissed when it should not be.
    override func dismiss(animated flag: Bool, completion: (() -> Void)? = nil) {
        print("dismiss ViewController")
        // Set breakpoint on the following line
        super.dismiss(animated: flag, completion: completion)
    }
}

extension ModalViewController: UISearchControllerDelegate {
    func willDismissSearchController(_ searchController: UISearchController) {
        print("willDismissSearchController")
    }

    func didDismissSearchController(_ searchController: UISearchController) {
        print("didDismissSearchController")
    }
}
Build and run the app on a simulated or real iPad running iPadOS 26.0 or 26.1 (beta 2). A root window appears with 3 buttons in the navbar. Each button displays the same view controller but with a different modalPresentationStyle.
Tap the Form button. This displays a modal view controller with formSheet style. Tap on the search field. Then tap on the grayed out area of the view controller to cancel the search. This all works just fine. Dismiss the modal (drag it down).
Now tap either the Page or Full button. These display the same modal view controller with pageSheet or fullScreen style respectively. Tap on the search field. Then tap on the grayed out area of the view controller to cancel the search. This time, not only is the search cancelled, but the view controller is also dismissed. This is because the view controller’s dismiss(animated:completion:) method is being called twice.
See ViewController.swift for the code that presents the modal. See ModalViewController.swift for the code that sets up the search controller. Both contain lots of comments.
Besides the use of the fullScreen or pageSheet presentation style, reproducing the bug also requires the search controller to have its obscuresBackgroundDuringPresentation property set to true. It's the tap on that obscured background to cancel the search that results in the double call to dismiss. With the breakpoint set in the overridden dismiss(animated:completion:) function, you can see the two stack traces that lead to the call to dismiss. When presented as a formSheet, the second call to dismiss is not made.
This issue does not affect iPadOS 18 or any version of iOS on iPhone, nor does it affect the app running under Mac Catalyst on macOS 26.0 (untested with macOS 15 or 26.1).
In short, it is expected that cancelling the search in a presented view controller should not also result in the view controller being dismissed.
Tested with Xcode 26.1 beta 2 and Xcode 26.0. Tested with iPadOS 26.1 beta 2 (real and simulated) and iPadOS 26.0 (simulated).
A version of this post was submitted as FB20569327
My UIKit/Mac Catalyst app supports a user opening multiple windows (multiple scenes). One of these is a special scene that shows content that I want to appear in front of all other app windows/scenes, even while the user is interacting with one of the app's other scenes. I do not need this special scene to stay in front of the windows of other apps, just in front of the windows of my own app.
While I'm not 100% sure, it seems that AppKit supports this through the NSWindow level property. I can't find any equivalent feature in UIKit/Mac Catalyst. UIWindow windowLevel is not the same thing since that only affects the order of windows within a given scene. I need an entire scene (and its windows) to stay in front of my app's other scenes (and their windows).
I don't see anything relevant in UIWindow, UIScene, UIWindowScene, UISceneSession, UIScene.ActivationRequestOptions, or UIWindowScene.ActivationRequestOptions.
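The only workaround I can think of is the documented technique of loading a small AppKit bundle from the Catalyst app and adjusting the NSWindow level there. A rough, untested sketch of the AppKit side (matching windows by title is just a stand-in; a real implementation would need a reliable way to associate an NSWindow with a UIWindowScene):

// Compiled into a macOS bundle target that the Mac Catalyst app loads at runtime.
import AppKit

@objc(WindowLevelHelper)
public class WindowLevelHelper: NSObject {
    @objc public func floatWindows(withTitle title: String) {
        for window in NSApplication.shared.windows where window.title == title {
            // .floating keeps the window above normal-level windows.
            window.level = .floating
        }
    }
}

Even if that works, a floating level also affects ordering relative to other apps' windows, which is more than I actually want.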
There is a serious usability issue with PHPickerViewController in a UIKit app running on macOS 26 via Mac Catalyst when the Mac Catalyst interface is set to "Scaled to Match iPad". Mouse clicks and other pointer interactions do not register at the correct position, which means you have to click in the wrong place to select a photo or to close the picker. This basically makes it unusable.
To demonstrate, use Xcode 26 on macOS 26 to create a new iOS app project based on Swift/Storyboard. Then update ViewController.swift with the following code:
import UIKit
import PhotosUI

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        var cfg = UIButton.Configuration.plain()
        cfg.title = "Photo Picker"
        let button = UIButton(configuration: cfg, primaryAction: UIAction(handler: { _ in
            self.showPicker()
        }))
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)

        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerYAnchor),
        ])
    }

    private func showPicker() {
        var config = PHPickerConfiguration()
        config.selectionLimit = 10
        config.selection = .ordered
        let vc = PHPickerViewController(configuration: config)
        vc.delegate = self
        self.present(vc, animated: true)
    }
}

extension ViewController: PHPickerViewControllerDelegate {
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        print("Picked \(results.count) photos")
        dismiss(animated: true)
    }
}
Then go to the "Supported Destinations" section of the project target. Add a "Mac (Mac Catalyst)" destination. Then under the "Deployment Information" section, make sure the "Mac Catalyst Interface" setting is "Scaled to Match iPad".
Then build and run the app on a Mac (using the Mac Catalyst destination) with macOS 26.0.1. Make sure the Mac has a dozen or so pictures in the Photo Library to fully demonstrate the issue. When the app is run, a simple screen appears with one button in the middle. Click the button to bring up the PHPickerViewController. Now try to interact with the picker interface. Note that all pointer interactions are in the wrong place on the screen. This makes it nearly impossible to choose the correct photos and close the picker.
Quit the app. Select the project and go to the General tab. In the "Deployment Info" change the “Mac Catalyst Interface” setting to “Optimize for Mac” and run the app again. Now the photo picker works just fine.
If you run the app on a Mac running macOS 15 then the photo picker works just fine with either “Mac Catalyst Interface” setting.
The problem only happens under macOS 26.0 (I do not have macOS 26.1 beta to test) when the “Mac Catalyst Interface” setting is set to “Scaled to Match iPad”. This is critical for my app. I cannot use “Optimize for Mac”. There are far too many issues with that setting (I use UIStepper and UIPickerView to start). So it is critical to the usability of my app under macOS 26 that this issue be resolved.
It is expected that PHPickerViewController responds correctly to pointer events on macOS 26 when running a Mac Catalyst app set to “Scaled to Match iPad”.
A version of this has been filed as FB20503207
Is anyone else using one of the "Dark" editor themes (such as "Default (Dark)" or "Classic (Dark)") in Xcode? And is anyone doing this with Xcode 26 on macOS 26?
Here's the result while using the "Default (Dark)" theme while my Mac is in "light" mode:
Note that the black background of the editor goes all the way to the far left edge of the Xcode window. The large gray area in the project tree is the black background bleeding through the sidebar.
This is really distracting. Is there a way to fix this (besides not using a dark theme - I've been using dark themes for over 30 years)? This appears to be a poor design decision in macOS 26 to have split views show the background of the secondary column behind the primary column. iPadOS 26 has the same issue (see https://developer.apple.com/forums/thread/800073).
The UIResponderStandardEditActions protocol includes pasteAndMatchStyle:. UITextView conforms to UIResponderStandardEditActions. But I can't find a way to get that menu item to appear; I only get the standard "Paste" menu. I've tried overriding pasteAndMatchStyle: in a subclass of UITextView. I've overridden canPerformAction:withSender:, but it never gets called with the pasteAndMatchStyle: selector. I've implemented the textView:editMenuForTextInRange:suggestedActions: delegate method; it's called, but the suggested actions do not include a "Paste and Match Style" action or key command.
I came up with an ugly hack that involved overriding buildMenuWithBuilder: and adding my own key command after the paste command. But this shouldn't be necessary considering it's supposed to be a standard edit action.
So what's the trick to make the "Paste and Match Style" edit menu appear properly in a UITextView? I'm testing with iOS 17, 18, and 26.
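For reference, the hack I mentioned looks roughly like this (the menu placement and key equivalent here are illustrative, not anything documented):

// In my UITextView subclass.
override func buildMenu(with builder: any UIMenuBuilder) {
    super.buildMenu(with: builder)
    guard builder.menu(for: .standardEdit) != nil else { return }

    // Append a "Paste and Match Style" key command after the standard edit (Cut/Copy/Paste) menu.
    let command = UIKeyCommand(title: "Paste and Match Style",
                               action: #selector(UIResponderStandardEditActions.pasteAndMatchStyle(_:)),
                               input: "V",
                               modifierFlags: [.command, .alternate, .shift])
    builder.insertChild(UIMenu(options: .displayInline, children: [command]),
                        atEndOfMenu: .standardEdit)
}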
For difficult reasons I won’t get into, I ended up manually downloading the latest iOS 26 simulator runtime. I now have a file named 78756498-8AB4-4E5A-986C-7AA435758657.aar copied to my Mac.
How do I get this archive installed so Xcode 26 recognizes it as a proper simulator runtime component?
All searching I’ve done for manually installing simulators references dmg files and older versions of Xcode. There’s no mention of aar files.
When I tried the command:
sudo xcrun simctl runtime add ./78756498-8AB4-4E5A-986C-7AA435758657.aar
I get the result:
An error was encountered processing the command (domain=NSPOSIXErrorDomain, code=22):
Error while creating AEA backend
Invalid argument
I tried to use Archive Utility to open the file but that just says it is unable to expand the file.
I even tried renaming the file with a dmg extension and then tried mounting the file and I get the same “AEA backend” error.
My Mac doesn’t have sufficient Internet access to let me download and install this normally through Xcode. I need to find a way to get this file installed manually.
I’m creating a UITabBarController with a simple array of UITab instances. I’m setting the mode to .tabSidebar. How do I prevent editing? I don’t want the Edit button to appear at all. I’ve tried setting the tab controller’s customizableViewControllers property to both nil and an empty array, but neither prevents the Edit button from appearing. I scanned the various delegates and I don’t see any relevant methods.
So far I’ve tested this on a simulated iPad running iPadOS 26 using Xcode 26 RC.
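For context, the setup is essentially just this (simplified; FirstViewController and SecondViewController are stand-ins). Neither of the commented-out lines prevents the Edit button from appearing:

let tabs = [
    UITab(title: "First", image: UIImage(systemName: "1.circle"), identifier: "first") { _ in
        UINavigationController(rootViewController: FirstViewController())
    },
    UITab(title: "Second", image: UIImage(systemName: "2.circle"), identifier: "second") { _ in
        UINavigationController(rootViewController: SecondViewController())
    },
]

let tabController = UITabBarController(tabs: tabs)
tabController.mode = .tabSidebar

// Neither of these has any effect on the Edit button:
// tabController.customizableViewControllers = nil
// tabController.customizableViewControllers = []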
I have a triple-column UISplitViewController set up in "tile" mode. Each of the three columns has a table view controller. Under iPadOS 26, the section headers and row selection in the middle table extend all the way to the left edge of the screen, behind the primary column. It looks terrible. The documentation for "Adopting Liquid Glass" makes it sound like you opt into this behavior by using UIBackgroundExtensionView, but I get it automatically in a UISplitViewController. How do I turn this off?
I created a simpler sample using a double-column split view with two table view controllers. Here's a screenshot of the result:
Note how the section headers and the row selection appear all the way to the left edge of the screen. I don't want that effect. How do you turn off this effect in a UISplitViewController?
Here is the code used to setup the split view and the app's main window:
func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
    guard let winScene = (scene as? UIWindowScene) else { return }

    let primary = PrimaryViewController(style: .plain)
    let primaryNC = UINavigationController(rootViewController: primary)
    let detail = DetailViewController(style: .plain)
    let detailNC = UINavigationController(rootViewController: detail)

    let sv = UISplitViewController(style: .doubleColumn)
    sv.preferredDisplayMode = .oneBesideSecondary
    sv.preferredSplitBehavior = .tile
    sv.primaryBackgroundStyle = .none
    sv.displayModeButtonVisibility = .automatic
    sv.setViewController(primaryNC, for: .primary)
    sv.setViewController(detailNC, for: .secondary)

    let win = UIWindow(windowScene: winScene)
    win.rootViewController = sv
    win.makeKeyAndVisible()
    window = win
}
The PrimaryViewController and DetailViewController are simple UITableViewController subclasses that only add a few rows and section headers as needed.
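For completeness, a minimal stand-in for those two controllers is something like this (not my exact code):

class PrimaryViewController: UITableViewController {
    override func numberOfSections(in tableView: UITableView) -> Int { 2 }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int { 5 }

    override func tableView(_ tableView: UITableView, titleForHeaderInSection section: Int) -> String? {
        "Section \(section + 1)"
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "cell")
            ?? UITableViewCell(style: .default, reuseIdentifier: "cell")
        var content = cell.defaultContentConfiguration()
        content.text = "Row \(indexPath.row + 1)"
        cell.contentConfiguration = content
        return cell
    }
}

// DetailViewController is the same idea with its own rows and section headers.
class DetailViewController: PrimaryViewController {}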
This is really odd. If you set up a UISearchController with a preferredSearchBarPlacement of .stacked and give the search bar scope buttons, then when the view controller is initially displayed, the currently hidden scope buttons block touch events from reaching the main view just below the search bar. But once the search has been activated and dismissed, the freshly hidden scope buttons no longer cause an issue.
This is easily demonstrated by putting a UITableViewController in a UINavigationController. Set up the table view to show a few simple rows. Then set up a search controller using the following code:
func setupSearch() {
    // Set up a stacked search bar with scope buttons.
    // Before the search is ever activated, the hidden scope buttons block any touches in the main view controller
    // in the area just below the search bar.
    // Once the search is activated and dismissed, the problem goes away. It seems that displaying and hiding the
    // scope buttons at least once fixes the issue that exists beforehand.
    // This issue only exists in iOS/iPadOS 26, not iOS/iPadOS 18 or earlier.
    let search = UISearchController(searchResultsController: UIViewController())
    search.hidesNavigationBarDuringPresentation = true
    search.obscuresBackgroundDuringPresentation = true
    search.scopeBarActivation = .onSearchActivation // Ensure buttons appear immediately

    let searchBar = search.searchBar
    searchBar.scopeButtonTitles = [ "One", "Two", "Three" ]

    self.navigationItem.searchController = search
    self.navigationItem.hidesSearchBarWhenScrolling = false // Issue appears even if this is true
    self.navigationItem.preferredSearchBarPlacement = .stacked
}
When first shown, before the search has ever been activated, any attempt to tap on the upper two-thirds of the first row in the table view (which is just below the search bar) fails. Tapping on the lower third of the first row works fine. If you then activate the search (now the scope buttons appear) and then dismiss the search (now the scope buttons are hidden again), there is no issue tapping anywhere on the first row of the table. But if you restart the app, the problem starts over again.
This problem happens on any iPhone or iPad, real or simulated, running iOS/iPadOS 26 RC. This is a regression from iOS 18 or earlier.
When using a UISearchController set up with a UISearchBar that has scope buttons, where the search controller's scopeBarActivation property is set to .onSearchActivation, the navigation item's preferredSearchBarPlacement property is set to .integrated or .integratedButton, and the search bar/button appears in the navigation bar, the scope buttons never appear. But space is made for where they should appear.
Some relevant code in a UIViewController shown as the root view controller of a UINavigationController:
private func setupSearch() {
    let sc = UISearchController(searchResultsController: UIViewController())
    sc.delegate = self
    sc.obscuresBackgroundDuringPresentation = true

    // Setup search bar with scope buttons
    let bar = sc.searchBar
    bar.scopeButtonTitles = [ "One", "Two", "Three", "Four" ]
    bar.selectedScopeButtonIndex = 0
    bar.delegate = self

    // Apply the search controller to the nav bar
    navigationItem.searchController = sc

    // BUG - Under iOS/iPadOS 26 RC, using .onSearchActivation results in the scope buttons never appearing at all
    // when using integrated placement in the nav bar.
    // Ensure the scope buttons appear immediately upon activating the search controller
    sc.scopeBarActivation = .onSearchActivation

    // This works but doesn't show the scope buttons until the user starts typing - that's too late for my needs
    //sc.scopeBarActivation = .automatic

    if #available(iOS 26.0, *) {
        // Under iOS 26 put the search icon in the nav bar - same issue for .integrated and .integratedButton
        navigationItem.preferredSearchBarPlacement = .integrated // .integratedButton
        // My toolbar is full so I need the search in the navigation bar
        navigationItem.searchBarPlacementAllowsToolbarIntegration = false // Ensure it's in the nav bar
    } else {
        // Under iOS 18 put the search bar in the nav bar below the title
        navigationItem.preferredSearchBarPlacement = .stacked
    }
}
I need the search bar in the navigation bar since the toolbar is full. And I need the scope buttons to appear immediately upon search activation.
This problem happens on any real or simulated iPhone or iPad running iOS/iPadOS 26 RC.
All of these issues appear when the search controller is set on the view controller's navigationItem and the search controller's searchBar has its scopeButtonTitles set.
So far the following issues are affecting my app on iOS/iPadOS 26 as of beta 7:
1. When the scopeBarActivation of UISearchController is set to .onSearchActivation, the preferredSearchBarPlacement of the navigationItem is set to .integratedButton, and the searchBarPlacementAllowsToolbarIntegration is set to false (forcing the search icon to appear in the nav bar), on both iPhones and iPads, the scope buttons never appear. They don't appear when the search is activated. They don't appear when any text is entered into the search bar. FB19771313
2. I attempted to work around that issue by setting the scopeBarActivation to .manual. I then show the scope bar in the didPresentSearchController delegate method and hide the scope bar in willDismissSearchController. On an iPhone this works, though the display is a bit clunky. On an iPad, the scope bar does appear via the code in didPresentSearchController, but when any scope bar button is tapped, the search controller is dismissed. This happens when the app is horizontally regular. When the app on the iPad is horizontally compact, the buttons work but the search bar's text is not correctly aligned within the search bar. Quite the mess really. I still need to post a bug report for this issue. But if issue 1 above is fixed then I don't need this workaround.
3. When the scopeBarActivation of UISearchController is set to .onSearchActivation, the preferredSearchBarPlacement of the navigationItem is set to .stacked, and the hidesSearchBarWhenScrolling property of the navigationItem is set to false (always show the search bar), and this is all used in a UITableViewController, then upon initial display of the view controller on an iPhone or iPad, you are unable to tap on the first row of the table view except on the very bottom of the row. The currently hidden scope bar is stealing the touches. If you activate and then cancel the search (making the scope bar appear and then disappear), then you are able to tap on the first row as expected. The initially hidden scope bar also bleeds through the first row of the table. It's faint but you can tell it's not quite right. Again, this is resolved by activating and then canceling the search once. FB17888632
4. When the scopeBarActivation of UISearchController is set to .onSearchActivation, the preferredSearchBarPlacement of the navigationItem is set to .integrated or .integratedButton, and the toolbar is shown, then on iPhones (where the search bar/icon appears in the toolbar) the scope buttons appear (at the top of the screen) the first time the search is activated. But if you cancel the search and then activate it again, the search bar never appears a second (or later) time. On an iPad the search bar/icon appears in the nav bar and you end up with the same issue as #1 above. FB17890125
Issues 3 and 4 were reported against beta 1 and still haven't been fixed. But if issue 1 is resolved on iPhone, iPad, and Mac (via Mac Catalyst), then I personally won't be affected by issues 2, 3, or 4 any more (but of course all 4 issues need to be fixed). And by resolved, I mean that the scope bar appears and disappears when it is supposed to each and every time the search is activated and cancelled (not just the first time). The scope bar doesn't interfere with touch events upon initial display of the view controller. And there are no visual glitches no matter what the horizontal size class is on an iPad.
I really hope the UIKit team can get these resolved before iOS/iPadOS 26 GM.
Before I file a bug report I wanted to verify that I'm not missing something.
If I set up a view controller in a navigation controller and add a view constrained to the view controller's view's layoutMarginsGuide (leadingAnchor or trailingAnchor), in several cases the view does not line up with buttons added to the navigation bar. Under iOS 18 everything lines up as expected.
To demonstrate, create a new iOS project based on Swift/Storyboard. Set up the storyboard to show a UINavigationController with one UIViewController. Then in ViewController.swift (the one embedded in the navigation controller), use the following code:
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        view.backgroundColor = .yellow
        title = "Layout Margins"

        let leftCancel = UIBarButtonItem(systemItem: .cancel)
        navigationItem.leftBarButtonItem = leftCancel
        let rightCancel = UIBarButtonItem(systemItem: .cancel)
        navigationItem.rightBarButtonItem = rightCancel

        let leftView = UIView()
        leftView.backgroundColor = .blue
        leftView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(leftView)

        let rightView = UIView()
        rightView.backgroundColor = .red
        rightView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(rightView)

        NSLayoutConstraint.activate([
            leftView.widthAnchor.constraint(equalToConstant: 80),
            leftView.heightAnchor.constraint(equalToConstant: 80),
            leftView.leadingAnchor.constraint(equalTo: self.view.layoutMarginsGuide.leadingAnchor),
            leftView.topAnchor.constraint(equalTo: self.view.layoutMarginsGuide.topAnchor),

            rightView.widthAnchor.constraint(equalToConstant: 80),
            rightView.heightAnchor.constraint(equalToConstant: 80),
            rightView.trailingAnchor.constraint(equalTo: self.view.layoutMarginsGuide.trailingAnchor),
            rightView.topAnchor.constraint(equalTo: self.view.layoutMarginsGuide.topAnchor),
        ])
    }
}
This adds a "Cancel" button to both ends of the navigation bar and it adds two little square views lined up with the leading and trailing layout margins.
Here are the results:
iPad running iPadOS 26 beta 3 (note the misalignment). This is really jarring when trying to align another glass button below the cancel button:
iPad running iPadOS 18.5 (aligned just fine):
iPhone in portrait running iOS 26 beta (aligned just fine):
iPhone in landscape running iOS 26 beta (no alignment at all):
iPhone in portrait running iOS 18.5 (aligned just fine):
iPhone in landscape running iOS 18.5 (aligned just fine):
Under iOS 26 on an iPhone (simulator at least) in portrait, the cancel buttons line up with the colored squares. That's good. In landscape, the colored squares have much larger margins as expected (due to the larger safe areas caused by the notch), but the cancel buttons in the navigation bar do not use the same margins. This one is debatable: under iOS 18 the cancel buttons use larger margins to match the larger safe area, and I can see why iOS 26 changed this, since the navigation bar doesn't interfere with the notch. But it's inconsistent.
Under iOS 26 on an iPad (simulator at least), it's wrong in any orientation. Despite the lack of any notch or need for a larger safe area, the colored squares are indented just a bit more than the buttons in the navigation bar. I see no reason for this. Under iOS 18 everything lines up as expected.
My real question at this point: are the mismatched margins on an iPad under iPadOS 26, between the buttons in the navigation bar and other views added to the view controller, a likely bug, or am I missing something?