The code below is a test that triggers UI updates 30 times per second. I'm trying to keep most of the work off the main thread and only push to main once I have the string (which is cached). Why is updating SwiftUI 30 times per second so expensive? This code causes 10% CPU on my M4 Mac, but comment out the following line:
Text(model.timeString)
and it's 0% CPU. The reason I think I have too much work on main is this capture from Instruments, but I'm no Instruments expert.
import SwiftUI
import UniformTypeIdentifiers

@main
struct RapidUIUpdateTestApp: App {
    var body: some Scene {
        DocumentGroup(newDocument: RapidUIUpdateTestDocument()) { file in
            ContentView(document: file.$document)
        }
    }
}

struct ContentView: View {
    @Binding var document: RapidUIUpdateTestDocument
    @State private var model = PlayerModel()

    var body: some View {
        VStack(spacing: 16) {
            Text(model.timeString) // only this changes
                .font(.system(size: 44, weight: .semibold, design: .monospaced))
                .transaction { $0.animation = nil } // no implicit animations

            HStack {
                Button(model.running ? "Pause" : "Play") {
                    model.running ? model.pause() : model.start()
                }
                Button("Reset") { model.seek(0) }
                Stepper("FPS: \(Int(model.fps))", value: $model.fps, in: 10...120, step: 1)
                    .onChange(of: model.fps) { _, _ in model.applyFPS() }
            }
        }
        .padding()
        .onAppear { model.start() }
        .onDisappear { model.stop() }
    }
}
@Observable
final class PlayerModel {
    // Publish ONE value to minimize invalidations
    var timeString: String = "0.000 s"
    var fps: Double = 30
    var running = false

    // Not read by any view, so keep it out of observation tracking too.
    @ObservationIgnored private var formatter: NumberFormatter = {
        let f = NumberFormatter()
        f.minimumFractionDigits = 3
        f.maximumFractionDigits = 3
        return f
    }()

    @ObservationIgnored private let q = DispatchQueue(label: "tc.timer", qos: .userInteractive)
    @ObservationIgnored private var timer: DispatchSourceTimer?
    @ObservationIgnored private var startHost: UInt64 = 0
    @ObservationIgnored private var pausedAt: Double = 0
    @ObservationIgnored private var lastFrame: Int = -1

    // Cache the timebase once.
    private static let secsPerTick: Double = {
        var info = mach_timebase_info_data_t()
        mach_timebase_info(&info)
        return Double(info.numer) / Double(info.denom) / 1_000_000_000.0
    }()
    func start() {
        // Resuming from pause: re-anchor the host clock so time elapsed
        // while paused isn't counted, then keep the existing timer running.
        guard timer == nil else {
            startHost = mach_absolute_time()
            running = true
            return
        }
        // Use the adjustable fps property (previously hardcoded to 30) so
        // the Stepper actually takes effect; 33_333_333 ns ≈ 30 fps.
        let periodNs = UInt64(1_000_000_000 / fps)
        running = true
        startHost = mach_absolute_time()
        let t = DispatchSource.makeTimerSource(queue: q)
        // ~30 fps, with leeway to let the kernel coalesce wakeups
        t.schedule(
            deadline: .now(),
            repeating: .nanoseconds(Int(periodNs)),
            leeway: .milliseconds(30) // allow coalescing
        )
        t.setEventHandler { [weak self] in self?.tick() }
        timer = t
        t.resume()
    }
    func pause() {
        guard running else { return }
        pausedAt = now()
        running = false
    }

    func stop() {
        timer?.cancel()
        timer = nil
        running = false
        pausedAt = 0
        lastFrame = -1
    }

    func seek(_ seconds: Double) {
        pausedAt = max(0, seconds)
        startHost = mach_absolute_time()
        lastFrame = -1 // force next UI update
    }

    func applyFPS() {
        lastFrame = -1 // next tick will refresh the string
        // Reschedule the live timer at the new rate.
        timer?.schedule(
            deadline: .now(),
            repeating: .nanoseconds(Int(1_000_000_000 / fps)),
            leeway: .milliseconds(30)
        )
    }
    // MARK: - Tick on background queue

    private func tick() {
        let s = now()
        // Dedup on frame number: while paused (or if the timer outruns the
        // display rate) the frame doesn't change, so skip the main-thread hop.
        let frame = Int(s * fps)
        guard frame != lastFrame else { return }
        lastFrame = frame
        let str = formatter.string(from: s as NSNumber) ?? String(format: "%.3f", s)
        let display = "\(str) s"
        DispatchQueue.main.async { [weak self] in
            self?.timeString = display
        }
    }

    private func now() -> Double {
        guard running else { return pausedAt }
        let delta = mach_absolute_time() &- startHost
        return pausedAt + Double(delta) * Self.secsPerTick
    }
}
nonisolated struct RapidUIUpdateTestDocument: FileDocument {
    var text: String

    init(text: String = "Hello, world!") {
        self.text = text
    }

    static let readableContentTypes = [
        UTType(importedAs: "com.example.plain-text")
    ]

    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let string = String(data: data, encoding: .utf8)
        else {
            throw CocoaError(.fileReadCorruptFile)
        }
        text = string
    }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        let data = text.data(using: .utf8)!
        return .init(regularFileWithContents: data)
    }
}
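For comparison, here's a minimal sketch of the same readout driven by SwiftUI's own TimelineView instead of a background timer pushing a string. I don't know whether it's actually cheaper, but it confines the per-frame work to one small subtree, so it's easy to profile against the version above (TimelineClockView and its 30 fps cadence are my own stand-ins, not part of the app):

import SwiftUI

struct TimelineClockView: View {
    let start = Date()

    var body: some View {
        // SwiftUI re-evaluates only this closure ~30 times per second.
        TimelineView(.periodic(from: .now, by: 1.0 / 30.0)) { context in
            Text(context.date.timeIntervalSince(start),
                 format: .number.precision(.fractionLength(3)))
                .font(.system(size: 44, weight: .semibold, design: .monospaced))
        }
    }
}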
I have an app that only crashes once it's been notarised. I read a few posts that essentially said that, before trying to identify issues by reviewing the crash report, I should ensure signing and notarisation have happened correctly.
I've worked through the document "Resolving common notarization issues":
spctl -vvv --assess --type exec: gives no errors and correctly returns my Developer ID.
codesign -dvv: returns a timestamp.
My app uses the hardened runtime.
My app shows up in Xcode as a macOS Archive (i.e. not a Generic Xcode Archive).
Here is the crash report.
Translated Report (Full Report Below)
Process: Scene Finder [44479]
Path: /Users/USER/Downloads/Scene Finder.app/Contents/MacOS/Scene Finder
Identifier:
Version: 0.9 (20250206.1)
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2025-02-11 13:09:03.7786 +1000
OS Version: macOS 15.3 (24D60)
Report Version: 12
Anonymous UUID: EE8B1269-0A8A-3AB6-516B-C752E8A18B5A
Sleep/Wake UUID: 436CD7CF-7B13-4A9C-9425-7EF94CC007A9
Time Awake Since Boot: 98000 seconds
Time Since Wake: 9524 seconds
System Integrity Protection: enabled
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 6 Abort trap: 6
Terminating Process: Scene Finder [44479]
I'm getting hundreds of the message below in Xcode. I've narrowed it down to the point where I instantiate the following:
AVAudioUnitComponentManager.shared()
Message send exceeds rate-limit threshold and will be dropped. { reporterID=231700600717315, rateLimit=32hz }
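For reference, a minimal sketch of the kind of call that triggers it for me; the component query below is illustrative, not my actual code:

import AVFoundation
import AudioToolbox

// Just touching the shared component manager is enough to start the
// rate-limited message storm.
let manager = AVAudioUnitComponentManager.shared()

// Illustrative query: count the installed effect components.
let desc = AudioComponentDescription(
    componentType: kAudioUnitType_Effect,
    componentSubType: 0,
    componentManufacturer: 0,
    componentFlags: 0,
    componentFlagsMask: 0
)
print("Found \(manager.components(matching: desc).count) effect components")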
I'm having an issue with my SwiftUI macOS application where it continually consumes more memory over time, and after a couple of hours it grinds to a halt. I've watched a few videos on how to use Xcode Memory Graph and Instruments to identify the source of a leak (I assume it is a leak). These videos all use very obvious issues as examples, but mine seems more elusive, and I don't know how to identify which part of my code is causing it.
After running Instruments I see the following, but the leaked objects are not always consistent:
Xcode Memory Graph shows NSSet as the culprit, listed under CoreFoundation (not my app). I really am a beginner here, and because it isn't pointing me to somewhere in my own code that I can investigate, I'm stuck.
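For context, here's a hypothetical sketch (not my actual code) of the kind of retain-cycle pattern those videos use as examples, in case my issue is a less obvious variant of it: a repeating timer whose closure captures self strongly, so the object and the data it accumulates never deallocate:

import Foundation

final class FeedLoader {
    var items: [String] = []
    private var timer: Timer?

    func start() {
        // The run loop retains the timer, the timer retains this closure,
        // and the closure retains self: nothing here ever deallocates,
        // and `items` grows without bound.
        timer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { _ in
            self.items.append("tick")
        }
    }

    func stop() {
        timer?.invalidate() // the only way to break the cycle
        timer = nil
    }
}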
I'm seeing unexpected results when examining the results of a sound classification test. While I appear to get an accurate startTime for each observation, the duration is always the same as the value I put into windowDuration.
I'm guessing I'm misunderstanding the purpose of duration in the classification results.
The link here says:
The time range's CMTime values are the number of audio frames at the analyzer's sample rate. Use these time indices to determine where, in time, the result corresponds to the original audio.
My understanding of this statement is that it should give me the startTime AND the duration of that detection event. For example, if I attempt to detect a crowd sound and that sound lasts for 1.8 seconds, then I should see 1.8 seconds in the duration.
Below is some code showing what I'm seeing.
Initialisation of request.windowDuration to 1 second. If I change this to any other value, that value is reported back as the duration of the event, even if the event is only half a second long.
Any help with either a code issue or understanding the results better would be appreciated. Thanks.
let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
request.overlapFactor = 0.8
request.windowDuration = CMTimeMakeWithSeconds(1, preferredTimescale: 600) // 1-second window
My code to get the values out of the SNResult:
func request(_ request: SNRequest, didProduce result: SNResult) {
    guard let analysisResult = result as? SNClassificationResult,
          let predominantSound = analysisResult.classifications.first?.identifier,
          soundsToDetect.contains(predominantSound) else { return }
    let startTime = analysisResult.timeRange.start.seconds
    let duration = analysisResult.timeRange.duration.seconds
    let confidence = analysisResult.classifications.first?.confidence ?? 0.0
    let detectedSound = ClassificationObject(id: UUID(),
                                             name: predominantSound,
                                             startTime: startTime,
                                             duration: duration,
                                             confidence: confidence)
    self.detectedSounds.append(detectedSound)
}
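If duration really is just the window length, the workaround I'm considering is to merge consecutive windows that report the same sound into one event; a sketch using my ClassificationObject type (gapTolerance is a made-up knob for bridging small gaps between windows):

func mergeDetections(_ detections: [ClassificationObject],
                     gapTolerance: Double = 0.25) -> [(name: String, start: Double, end: Double)] {
    var merged: [(name: String, start: Double, end: Double)] = []
    for d in detections.sorted(by: { $0.startTime < $1.startTime }) {
        if var last = merged.last, last.name == d.name,
           d.startTime <= last.end + gapTolerance {
            // The same sound is continuing: extend the previous event.
            last.end = max(last.end, d.startTime + d.duration)
            merged[merged.count - 1] = last
        } else {
            // A new event starts here.
            merged.append((name: d.name, start: d.startTime, end: d.startTime + d.duration))
        }
    }
    return merged
}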