Is there any way to get Xcode to render a macOS settings view hierarchy as if it were in a Settings window? I have a TabView at the top of my Settings hierarchy, but in Xcode's preview it renders the tab as a simple text-only button instead of the icon-style tab it has when running.
I'm working on a SwiftUI client app for my web service. A logged-in user can belong to one or more Organizations, and in the app's settings UI, the user can change which organization is the current selectedOrg. The selected org is displayed using a Picker view in the settings UI. The Model is made available to the UI primarily through @EnvironmentObject.
When the user changes the selectedOrg, the app has to fetch some data (a set of Machine records) and set the result in the model. I see at least two ways to do this.
A) Use @State in the SwiftUI view, initialized with the current selectedOrg from the model, and use .onChange(of:) on that state to call a method that sets the new org and starts the relevant data fetch.
B) Bind to the selectedOrg in the model, and use willSet/didSet inside the model to initiate the network request.
The latter option feels better, because it pulls a lot of the logic out of the views and into the model/business logic.
What are your thoughts on this?
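For concreteness, here's a rough sketch of what I mean by option B. Org is simplified to a String, and the fetch is an injected closure; the real model would use async networking and @Published properties.

```swift
import Foundation

// Sketch of option B: the model owns the side effect.
// Org is simplified to a String; the fetch is injected so the
// pattern is visible without the real networking code.
final class Model {
    private let fetchMachines: (String) -> [String]
    private(set) var machines: [String] = []

    var selectedOrg: String {
        didSet {
            // Runs no matter which view (or code path) changed the org.
            guard selectedOrg != oldValue else { return }
            machines = fetchMachines(selectedOrg)
        }
    }

    init(selectedOrg: String, fetchMachines: @escaping (String) -> [String]) {
        self.fetchMachines = fetchMachines
        self.selectedOrg = selectedOrg
    }
}
```

The view then binds its Picker selection straight to model.selectedOrg and never has to know that a fetch happens.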
I have a multi-timezone clock displayed in a very thin window I like to put up on top of my menu bar or in the empty space to the left or right of the notch (when my laptop is not the primary display). There's no issue moving it there, but every time the screens reconfigure, macOS moves my window down out of that area.
Is there any way to exempt an NSWindow (or a SwiftUI window) from this repositioning behavior?
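The closest thing I've found to try (untested against the screen-reconfiguration case, so treat it as a sketch) is overriding constrainFrameRect(_:to:), which AppKit calls whenever it wants to push a window back into the usable screen area:

```swift
import AppKit

// A window that opts out of AppKit's automatic frame constraining,
// so it can sit over the menu bar / notch area.
final class UnconstrainedWindow: NSWindow {
    override func constrainFrameRect(_ frameRect: NSRect, to screen: NSScreen?) -> NSRect {
        // Return the proposed frame unchanged instead of letting
        // AppKit clamp it to the screen's visible frame.
        return frameRect
    }
}
```

For a SwiftUI window I'd still need to get at the underlying NSWindow (e.g. via an NSViewRepresentable) to apply equivalent behavior.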
I have an old Objective-C iOS app project I've been updating to Swift. It used to use Carthage, but now uses only SPM for dependencies. One such dependency is TrueTime (which was updated to support SPM by that author).
It has two products: TrueTime, a Swift library, and CTrueTime, a header-only C library with some packed, byte-aligned structs used when parsing NTP responses.
I’m adding unit tests to my project, and when I import the app module with @testable import MyApp, the build fails on that line with
@testable import MyApp
^
<unknown>:0: error: missing required module 'CTrueTime'
The app itself builds and runs just fine.
If I don’t import the app, then I have to expressly add code I want to test to the test target, which is not the right way to do these things. The other dependencies I have all work fine, but none have a C library product.
I get a warning about implicitly including a bridging header, but I don’t think that has anything to do with this.
Update: I also just tried adding TrueTime to a modern, pure SwiftUI iOS app, and I get the same issue.
Is there something more I need to add to my test target config?
(I've also posted this to stack overflow https://stackoverflow.com/questions/76284838/when-importing-app-with-testable-get-missing-required-module-ctruetime-usi)
I have yet to see iOS call my background task. I can get it to call it with the debug support (_simulateLaunchForTaskWithIdentifier), but not on its own.
Does being foreground suppress the task call?
Does being connected to the debugger (but in the background) suppress the call?
Does an Xcode-installed app get its background task called?
What is the Xcode scheme Run Info option "Launch due to a background fetch event"?
Xcode 15b1 and iOS 17b1.
I get that iOS decides if it will call your task, but I let my app run overnight and it never once called a task that I schedule for about 20 min in the future.
If I trigger it via the debugging options here it works fine, but never has the OS called me back.
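For reference, my setup is essentially the standard pattern (identifier changed to a placeholder here):

```swift
import BackgroundTasks

// Standard app-refresh setup. "com.example.refresh" is a placeholder;
// it must also be listed under BGTaskSchedulerPermittedIdentifiers in
// Info.plist, and register must run before the app finishes launching.
func registerBackgroundTask() {
    _ = BGTaskScheduler.shared.register(
        forTaskWithIdentifier: "com.example.refresh",
        using: nil
    ) { task in
        scheduleNextRefresh()  // re-schedule before doing the work
        // ... actual refresh work ...
        task.setTaskCompleted(success: true)
    }
}

func scheduleNextRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: "com.example.refresh")
    // "No earlier than", not "at": the system can defer far beyond this.
    request.earliestBeginDate = Date(timeIntervalSinceNow: 20 * 60)
    try? BGTaskScheduler.shared.submit(request)
}
```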
I'm trying to write a simple image converter app. I want it to open any image file, and export that image as another type. I started with the multiplatform document app template in Xcode 15b2, and suppressed the New command with
.commands {
    CommandGroup(replacing: CommandGroupPlacement.newItem) {}
}
I also changed the document readable types:
static var readableContentTypes: [UTType] { [.image] }
So far, so good. Then I tried modifying the document type declaration to be public.image, and removed the Imported UTI (reasoning that it’s a standard type and shouldn’t need to be redundantly declared, and how can I possibly list all the possible image filename extensions?), but I’m not at all sure I did that right.
And somewhere in all that, I noticed that my File menu no longer has an Open command. Does SwiftUI look at the declared document type or imported UTIs to decide what the File menu should look like?
UPDATE: Ah, suppressing the New menu also suppressed the Open/Open Recent menu items. That's frustrating.
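The workaround I'm considering (a sketch; I haven't verified it, and it won't restore Open Recent) is to replace the whole group but add back my own Open command that forwards to NSDocumentController:

```swift
// Sketch of the command modifier, macOS-only: forwards Open to
// NSDocumentController instead of relying on the suppressed group.
.commands {
    CommandGroup(replacing: CommandGroupPlacement.newItem) {
        Button("Open…") {
            NSDocumentController.shared.openDocument(nil)
        }
        .keyboardShortcut("o", modifiers: .command)
    }
}
```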
I started building a new app with Xcode 15, and I uploaded a few builds to TestFlight with it. Now I'm trying with 15b4, and I'm getting:
Invalid Toolchain. Your app was built with an unsupported SDK or version of Xcode. If you plan to submit this build to the App Store, make sure you are using the versions listed in https://help.apple.com/xcode/mac/current/#/devf16aefe3b or later. (ID: 0c544ce8-53ee-4122-b4a9-467a3d766349)
Is this a bug? They've always allowed TestFlight builds with pre-release toolchains, haven't they? How else can you ensure your app is ready for the new OS version if you can't test changes to it?
I have a regular app and an app with LSUIElement=YES. Both have Swift app lifecycle main entry points. Both have an app group ".com.company.app". Both can read and write prefs and see each other's values.
But I can't for the life of me get changes from one app to notify the other. At first I tried NotificationCenter, but then learned (thanks to this thread) that you have to use KVO or Combine. I tried both. Both get the initial value, but never see subsequent changes.
Combine seems to just wrap KVO, looking at the stack trace.
I'm subscribing to updates like this:
let defaults = UserDefaults(suiteName: "<TEAMID>.com.company.app")!
defaults
    .publisher(for: \.enabled)
    .handleEvents(receiveOutput: { enabled in
        print("Enabled is now: \(enabled)")
    })
    .sink { _ in }
    .store(in: &subs)
…
extension UserDefaults {
    @objc var enabled: Bool {
        get { bool(forKey: "Enabled") }
        set { set(newValue, forKey: "Enabled") }
    }
}
I'm writing to prefs like this:
let defaults = UserDefaults(suiteName: "<TEAMID>.com.company.app")!
defaults.set(enabled, forKey: "Enabled")
Am I missing something?
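If there's no way to make KVO work across processes here, the fallback I'm considering is an explicit ping via DistributedNotificationCenter (a sketch; the notification name is arbitrary, and since sandboxed apps can't attach a payload, it's purely a "go re-read defaults" signal):

```swift
import Foundation

let prefsChanged = Notification.Name("com.company.app.prefsChanged")

// Writer side: update the shared default, then ping the other app.
func setEnabled(_ enabled: Bool) {
    let defaults = UserDefaults(suiteName: "<TEAMID>.com.company.app")!
    defaults.set(enabled, forKey: "Enabled")
    DistributedNotificationCenter.default().postNotificationName(
        prefsChanged, object: nil, userInfo: nil, deliverImmediately: true)
}

// Reader side: on the ping, re-read the value from the suite.
let observer = DistributedNotificationCenter.default().addObserver(
    forName: prefsChanged, object: nil, queue: .main
) { _ in
    let defaults = UserDefaults(suiteName: "<TEAMID>.com.company.app")!
    print("Enabled is now: \(defaults.bool(forKey: "Enabled"))")
}
```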
I would like for my app to have a menu bar under certain circumstances, but not most of the time. Is this possible?
Is there a way to play a specific rectangular region of interest of a video in an arbitrarily-sized view?
Let's say I have a 1080p video but I'm only interested in a sub-region of the full frame. Is there a way to specify a source rect to be displayed in an arbitrary view (SwiftUI view, ideally), and have it play that in real time, without having to pre-render the cropped region?
Update: I may have found a solution here: img DOT ly/blog/trim-and-crop-video-in-swift/ (Apple won't allow that URL for some dumb reason)
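Based on that article, the idea seems to be attaching an AVMutableVideoComposition to the player item, which crops at playback time without pre-rendering. My untested sketch of it (frame rate assumed to be 30 fps; cropRect is in source pixels):

```swift
import AVFoundation

// Crop during playback by translating the frame so cropRect's origin
// lands at (0, 0) of a render area sized to the crop.
func makeCroppedItem(asset: AVAsset, track: AVAssetTrack, cropRect: CGRect) -> AVPlayerItem {
    let composition = AVMutableVideoComposition()
    composition.renderSize = cropRect.size
    composition.frameDuration = CMTime(value: 1, timescale: 30)  // assumed

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let layer = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layer.setTransform(CGAffineTransform(translationX: -cropRect.minX,
                                         y: -cropRect.minY), at: .zero)
    instruction.layerInstructions = [layer]
    composition.instructions = [instruction]

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    return item
}
```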
I spent some time cleaning up my TCC data. During that I learned that some TCC info is cached in “Mac OS X Detached Code Signature” files. Is there a way to dump their contents suitable for human consumption? The codesign tool doesn’t seem to do it (or I can’t figure out how to invoke it).
I’ve got some code that creates an AVPlayerItem from a URL, then creates an AVQueuePlayer from it. If I check the player item's status after that, it's still unknown.
According to the docs, it'll remain unknown until it is associated with an AVPlayer, and then it "immediately begins enqueuing the item’s media and preparing it for playback." But checking the status right after that, I still get unknown, which tells me it’s not quite immediate.
Is there any way to test if the player item will work immediately after creation? In this case, the problem is that my app doesn't have permission, due to it being a bookmark saved in a sandboxed app.
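For now I've switched from checking the status synchronously to observing it, which at least surfaces the permission failure as an error (sketch):

```swift
import AVFoundation
import Combine

var subs = Set<AnyCancellable>()

// "unknown" just means the item hasn't been evaluated yet, so observe
// status instead of reading it right after attaching it to a player.
func observeStatus(of item: AVPlayerItem) {
    item.publisher(for: \.status)
        .sink { status in
            switch status {
            case .readyToPlay:
                print("ready to play")
            case .failed:
                // The sandbox/bookmark permission problem shows up here.
                print("failed: \(String(describing: item.error))")
            default:
                break
            }
        }
        .store(in: &subs)
}
```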
My macOS SwiftUI app has a single Window scene. .onAppear() is called when it is created, but if it is closed, .onDisappear() is not called.
How can I detect when the window is closed? Note, I need to distinguish between closed and inactive or background.
scenePhase is not updated for the window or any of its subviews when the window is deactivated. controlActiveState is, however.
Note that my app delegate implements:
func applicationShouldTerminateAfterLastWindowClosed(_ sender: NSApplication) -> Bool {
    return false
}
I've filed FB13497581 on this topic. Ever since macOS 14 (or perhaps one of the releases that came after), my machine (an M1 MacBook Pro) frequently locks up the UI. Sometimes it's so bad the mouse won't move. It only lasts a few seconds, but at its worst it happens every few seconds. It usually manifests as no text appearing as I type (which sucks, since I write code all day).
I'm fairly certain configd is to blame. When this occurs, configd is using nearly 100% CPU, and spams logs with thousands of messages per second.
Killing it does nothing. Rebooting might fix it for a bit, but it eventually comes back.
I'm wondering if there’s some on-disk state I can blow away to start SystemConfiguration from scratch. I hate to do that in case someone from Apple asks me for files to analyze, but I feel like my FB is just going to languish. Meanwhile, this is rendering my machine unusable.
Or if anyone knows how to interpret the log messages.