
Any solution yet to not being able to turn off debugging via WiFi?
The debugger in Xcode 16.x is super slow, and it turns out it's only this way when Xcode is connected to my iPhone via WiFi. If I disable WiFi on my iPhone, everything is fine. But that's not a solution. An engineer posted this supposed solution: https://developer.apple.com/documentation/xcode-release-notes/xcode-15-release-notes. Forgive me, but that's not a solution either, especially since we used to be able to turn off "Connect via WiFi." I've seen so many posts here and everywhere else with no one stating any clear answer. Does anyone know why this has been removed? And is anyone at Apple aware of it? I've posted in Feedback Assistant, as many others have. What gives?
Replies: 0 · Boosts: 1 · Views: 229 · Created: Jan ’25
Latest version of Xcode has weird Preview issues
I believe my machine just updated to Xcode 16.3 (16E140), and it definitely installed the latest iOS 18.4 simulator. However, my preview will now sometimes fail with the error Failed to launch app "Picker.app" in reasonable time. If I add a space in my code, or hit refresh on the Preview, it will run on the second or third attempt. Sometimes, in between refreshes, the preview crashes, and then it works again. Is anyone else experiencing this? Any ideas? Thanks.
Replies: 1 · Boosts: 1 · Views: 163 · Created: Apr ’25
Is there a global Alert View in SwiftUI?
I am writing a SwiftUI-based app, and errors can occur anywhere. I've got a function that logs the error, but it would be nice to be able to present an alert message no matter where I am, and then gracefully exit the app. Sure, I could write the alert into every view, but that seems ridiculously unnecessary. Am I missing something?
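
One common pattern, shown here as a minimal sketch rather than a built-in SwiftUI facility (AppErrorCenter and report(_:) are made-up names), is a shared observable error store with a single .alert attached at the root of the hierarchy:

import SwiftUI

// Hypothetical shared error store; any code, UI or not, can call report(_:).
@MainActor
final class AppErrorCenter: ObservableObject {
    static let shared = AppErrorCenter()
    @Published var message: String?   // non-nil presents the alert

    func report(_ error: Error) {
        message = error.localizedDescription
    }
}

@main
struct MyApp: App {
    @StateObject private var errors = AppErrorCenter.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                // One alert for the entire view hierarchy.
                .alert(
                    "Error",
                    isPresented: Binding(
                        get: { errors.message != nil },
                        set: { if !$0 { errors.message = nil } }
                    )
                ) {
                    // Note: programmatic exit is discouraged on iOS;
                    // included only because the post asks for it.
                    Button("Quit", role: .destructive) { exit(1) }
                    Button("Continue", role: .cancel) {}
                } message: {
                    Text(errors.message ?? "")
                }
        }
    }
}

struct ContentView: View {
    var body: some View {
        Button("Trigger error") {
            AppErrorCenter.shared.report(CocoaError(.fileNoSuchFile))
        }
    }
}

With this shape, non-view code (networking, persistence, etc.) can raise the alert through AppErrorCenter.shared without touching the view layer at all.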
Replies: 2 · Boosts: 0 · Views: 114 · Created: Apr ’25
Is there anywhere to get precompiled WhisperKit models for Swift?
If I try to dynamically load WhisperKit's models, as below, the download never occurs. There's no error or anything. And at the same time I can still reach the huggingface.co hosting site without any headaches, so it's not a blocking issue.

let config = WhisperKitConfig(
    model: "openai_whisper-large-v3",
    modelRepo: "argmaxinc/whisperkit-coreml"
)

So I have to default to the tiny model, as seen below. I have tried so many ways, using ChatGPT and others, to build the models on my Mac, but hit too many failures, because I have never dealt with builds like that before. Are there any hosting sites that have the models (small, medium, large) already built, where I can download them and just bundle them into my project? I've wasted quite a large amount of time trying to get this done.

import Foundation
import WhisperKit

@MainActor
class WhisperLoader: ObservableObject {
    var pipe: WhisperKit?

    init() {
        Task {
            await self.initializeWhisper()
        }
    }

    private func initializeWhisper() async {
        do {
            Logging.shared.logLevel = .debug
            Logging.shared.loggingCallback = { message in
                print("[WhisperKit] \(message)")
            }

            let pipe = try await WhisperKit() // defaults to "tiny"
            self.pipe = pipe
            print("initialized. Model state: \(pipe.modelState)")

            guard let audioURL = Bundle.main.url(forResource: "44pf", withExtension: "wav") else {
                fatalError("not in bundle")
            }

            let result = try await pipe.transcribe(audioPath: audioURL.path)
            print("result: \(result)")
        } catch {
            print("Error: \(error)")
        }
    }
}
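
For what it's worth, the argmaxinc/whisperkit-coreml repo on huggingface.co already hosts the models compiled for Core ML, so if you can fetch a model folder by other means (e.g. a git-lfs clone of that repo) and add it to your app bundle, you should be able to point WhisperKit at the local folder and skip the download entirely. A minimal sketch, assuming your WhisperKit version's WhisperKitConfig accepts modelFolder and download parameters (check its source; the folder name below is a placeholder):

import Foundation
import WhisperKit

// Load a model folder bundled with the app instead of downloading.
// "openai_whisper-large-v3" is assumed to be a folder reference in the
// app bundle containing the compiled .mlmodelc files.
func makeBundledWhisper() async throws -> WhisperKit {
    guard let modelFolder = Bundle.main.url(
        forResource: "openai_whisper-large-v3", withExtension: nil
    )?.path else {
        throw CocoaError(.fileNoSuchFile)
    }
    let config = WhisperKitConfig(
        model: "openai_whisper-large-v3",
        modelFolder: modelFolder,   // assumed parameter: local path to use
        download: false             // assumed parameter: disable remote fetch
    )
    return try await WhisperKit(config)
}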
Replies: 0 · Boosts: 0 · Views: 99 · Created: Jun ’25
Xcode can't find prepareCustomLanguageModel
I have Xcode 16 and have set the minimum deployment target to iOS 17.5, and I am using import Speech. Nevertheless, Xcode can't find it. At ChatGPT's urging I tried going back to Xcode 15.3, but that won't run on Sequoia. Am I misunderstanding something? Here's how I am trying to use it:

if templateItems.isEmpty {
    templateItems = dbControl?.getAllItems(templateName: templateName) ?? []
    items = templateItems.compactMap { $0.itemName?.components(separatedBy: " ") }.flatMap { $0 }
    let phrases = extractContextualWords(from: templateItems)

    Task {
        do {
            // 1. Get your items and extract words
            templateItems = dbControl?.getAllItems(templateName: templateName) ?? []
            let phrases = extractContextualWords(from: templateItems)

            // 2. Build the custom model and export it
            let modelURL = try await buildCustomLanguageModel(from: phrases)

            // 3. Prepare the model (STATIC method)
            try await SFSpeechRecognizer.prepareCustomLanguageModel(at: modelURL)

            // ✅ Ready to use in recognition request
            print("✅ Model prepared at: \(modelURL)")

            // Save modelURL to use in Step 5 (speech recognition)
            // e.g., self.savedModelURL = modelURL
        } catch {
            print("❌ Error preparing model: \(error)")
        }
    }
}
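
For reference, in the shipping Speech framework (iOS 17 and later) the preparation method is declared on SFSpeechLanguageModel, not SFSpeechRecognizer, and it takes the exported asset URL plus a client identifier and a configuration; that mismatch would explain why Xcode can't resolve the call above. A minimal sketch of that shape (the identifier and file names are placeholders):

import Speech

// Output locations where the system writes the prepared model.
let cachesDir = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
let lmConfiguration = SFSpeechLanguageModel.Configuration(
    languageModel: cachesDir.appendingPathComponent("LM"),
    vocabulary: cachesDir.appendingPathComponent("Vocab")
)

func prepare(modelURL: URL) async throws {
    // modelURL is the .bin asset exported from SFCustomLanguageModelData.
    try await SFSpeechLanguageModel.prepareCustomLanguageModel(
        for: modelURL,
        clientIdentifier: "com.example.myapp",  // placeholder identifier
        configuration: lmConfiguration
    )
    // Afterwards, attach it to a recognition request:
    // request.customizedLanguageModel = lmConfiguration
}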
Replies: 1 · Boosts: 0 · Views: 118 · Created: Jul ’25