
Xcode debugger seems slow
It just feels as if my debugger is running super slow when I step over each line. Each line is doing string comparison and splitting text into words, really nothing fancy. Every time I hit F6, the Variables View (local variables) takes 4 seconds or more to refresh, but I don't know if that's the cause or a symptom. Just curious if anyone can shed any light on this.

Specs:
MacBook Pro 2019, 2.6 GHz 6-Core Intel Core i7, 16 GB 2667 MHz DDR4
macOS Sequoia 15.1.1 (24B91)
iPhone running the app: 13 Pro, iOS 18.1.1
Xcode 16.2 (16C5032a)
Replies: 1 · Boosts: 1 · Views: 351 · Jan ’25
Latest version of Xcode has weird Preview issues
I believe my machine just updated to Xcode 16.3 (16E140); it definitely just installed the latest iOS 18.4 simulator. However, now my preview will sometimes give me the error: Failed to launch app ”Picker.app” in reasonable time. If I add a space in my code, or hit refresh on the Preview, it will run on the second or third attempt. Sometimes in between the refreshes the preview will crash, and then it will work again. Is anyone else experiencing this? Any ideas? Thanks.
Replies: 1 · Boosts: 1 · Views: 163 · Apr ’25
Xcode can't find prepareCustomLanguageModel
I have Xcode 16, have set the minimum deployment target to iOS 17.5, and am using import Speech. Nevertheless, Xcode can't find it. At ChatGPT's urging I tried going back to Xcode 15.3, but that won't work with Sequoia. Am I misunderstanding something? Here's how I am trying to use it:

if templateItems.isEmpty {
    templateItems = dbControl?.getAllItems(templateName: templateName) ?? []
    items = templateItems.compactMap { $0.itemName?.components(separatedBy: " ") }.flatMap { $0 }
    let phrases = extractContextualWords(from: templateItems)
    Task {
        do {
            // 1. Get your items and extract words
            templateItems = dbControl?.getAllItems(templateName: templateName) ?? []
            let phrases = extractContextualWords(from: templateItems)
            // 2. Build the custom model and export it
            let modelURL = try await buildCustomLanguageModel(from: phrases)
            // 3. Prepare the model (STATIC method)
            try await SFSpeechRecognizer.prepareCustomLanguageModel(at: modelURL)
            // ✅ Ready to use in recognition request
            print("✅ Model prepared at: \(modelURL)")
            // Save modelURL to use in Step 5 (speech recognition)
            // e.g., self.savedModelURL = modelURL
        } catch {
            print("❌ Error preparing model: \(error)")
        }
    }
}
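
For anyone else hitting this: as far as I can tell, the custom-language-model preparation API that shipped in iOS 17 is a class method on SFSpeechLanguageModel, not on SFSpeechRecognizer, which would explain why Xcode can't resolve SFSpeechRecognizer.prepareCustomLanguageModel. Below is a rough sketch of that shape; the exact signatures are from memory and worth checking against the SFSpeechLanguageModel documentation, and buildCustomLanguageModel(from:) plus the client identifier string are placeholders.

import Speech

// Assumes `modelURL` points at an exported custom language model file
// (e.g. generated with SFCustomLanguageModelData) and an iOS 17+ deployment target.
func prepareModel(at modelURL: URL) async throws -> SFSpeechLanguageModel.Configuration {
    let configuration = SFSpeechLanguageModel.Configuration(languageModel: modelURL)

    // Class method on SFSpeechLanguageModel (iOS 17+), not SFSpeechRecognizer.
    try await SFSpeechLanguageModel.prepareCustomLanguageModel(
        for: modelURL,
        clientIdentifier: "com.example.myapp",   // placeholder identifier
        configuration: configuration
    )
    return configuration   // later assigned to the recognition request's customizedLanguageModel
}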
Replies: 1 · Boosts: 0 · Views: 118 · Jul ’25
Can I create an .fsh file from an image?
I wanted to use a PNG image to create a pattern for an SKSpriteNode. Supposedly, pattern images are not supported via UIColor in SpriteKit, so I am supposed to use an .fsh file for shading. The thing is, can I create such a file from an image? Everywhere I've looked only shows mathematical methods for creating those files. I hope this is something that is possible.
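
Not a definitive answer, but one approach that may help: an .fsh file is just fragment-shader source, so instead of turning the image into a shader you can hand the PNG to the shader as a texture uniform and tile it. A rough sketch, where "pattern.png", "pattern.fsh", and the uniform name u_pattern are all made-up names:

import SpriteKit

// pattern.fsh could contain a single line such as:
//   void main() { gl_FragColor = texture2D(u_pattern, fract(v_tex_coord * 4.0)); }
// which samples the supplied texture and repeats it 4x across the node.

let node = SKSpriteNode(color: .white, size: CGSize(width: 200, height: 200))
let shader = SKShader(fileNamed: "pattern.fsh")
shader.uniforms = [
    SKUniform(name: "u_pattern", texture: SKTexture(imageNamed: "pattern.png"))
]
node.shader = shader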
Replies: 0 · Boosts: 0 · Views: 636 · Mar ’21
Need help on SQLite wrapper for Swift
I am using the SQLite wrapper for Swift (SQLite.swift). I got it from the link below and installed it, but was hoping there would be better documentation or tutorials out there for it. I am new enough at Swift and its syntax that whatever can make this easier for me would be a big help. https://git.pado.name/reviewspur/ios/tree/fd2486cf91e422e2df8d048ffd2d40ea89527685/Carthage/Checkouts/SQLite.swift/Documentation#building-type-safe-sql
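
For anyone else starting out with it, here is a minimal sketch of the type-safe API that documentation describes; the table and column names are made up, and the exact syntax can differ between SQLite.swift versions.

import SQLite

func demoDatabase() throws {
    // Open (or create) a database file in the app's Documents directory.
    let path = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
    let db = try Connection("\(path)/items.sqlite3")

    // Describe a table with typed column expressions.
    let items = Table("items")
    let id = Expression<Int64>("id")
    let name = Expression<String>("name")

    // Create the table if it doesn't exist yet.
    try db.run(items.create(ifNotExists: true) { t in
        t.column(id, primaryKey: .autoincrement)
        t.column(name)
    })

    // Insert a row and read everything back.
    try db.run(items.insert(name <- "First item"))
    for row in try db.prepare(items) {
        print(row[id], row[name])
    }
}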
Replies: 0 · Boosts: 0 · Views: 395 · May ’21
What happens if I execute an SKAction run sequence on a node that is already executing another sequence?
I have an SKSpriteNode with an SKAction being run on it:

theGem!.run(premAction, completion: { theGem!.run(repeatAction) })

I can't seem to find out the proper steps to run another action, such as:

theGem.run(endsequence, completion: { theGem.removeAllActions(); theGem.run(stopAction) })

Should I stop the previous action first? Is there a way to turn the repeat part off so that the first SKAction ends smoothly?
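
For what it's worth, a common pattern is to run the repeating action under a key so that just the repeat can be removed while a closing action plays out. A short sketch reusing the names from the post (the key string "gemLoop" is made up):

import SpriteKit

// Start the looping animation under a key so it can be cancelled on its own.
theGem.run(premAction) {
    theGem.run(repeatAction, withKey: "gemLoop")
}

// Later, stop only the keyed repeat and let the ending animation run cleanly.
func finishGem() {
    theGem.removeAction(forKey: "gemLoop")
    theGem.run(stopAction)
}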
Replies: 0 · Boosts: 0 · Views: 483 · Aug ’21
How to keep two SKPhysicsBodies from passing through each other?
In my app, I have greyBars and one border bar. The border bar keeps the greyBars from falling off the screen. Only one greyBar is used at a time. When it is completely filled with colored gems, a new bar is created and brought up from the bottom of the screen with the code below. The result I want is that the new bar, the one with all white diamonds, pushes up the old bar while the new bar remains on the bottom. You can see from the screenshot that somehow the new bar ended up on top, even though it was coming from the bottom. I slowed down the duration of the greyBar's movement to see what was going wrong. While the two greyBars do clash with each other, the new one (the one on the bottom) ends up pushing THROUGH the top bar. My assumption was that the new greyBar would just push up the old bar(s) and remain on the bottom. Is there some "solidity" type property that I am missing?

myGreyBar[0].physicsBody?.categoryBitMask = bodyMasks.greyBarMask.rawValue
myGreyBar[0].physicsBody?.contactTestBitMask = bodyMasks.blankMask.rawValue
myGreyBar[0].physicsBody?.collisionBitMask = bodyMasks.greyBarMask.rawValue
myGreyBar[0].isHidden = false
myGV.gameScene?.addChild(myGreyBar[0])
let moveAction = SKAction.move(to: CGPoint(x: (myGV.safeSceneRect.width/2) - (size.width/2),
                                           y: (myGemBase?.size.height)! + (myGV.border?.size.height)! + 200),
                               duration: 10.0)
myGreyBar[0].run(moveAction, completion: {
    myGreyBar[0].physicsBody?.collisionBitMask = bodyMasks.borderMask.rawValue | bodyMasks.greyBarMask.rawValue
})
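
Not certain this is the cause, but one thing that often matters here: moving a node with SKAction.move(to:) positions it directly each frame, overriding the physics simulation, so the body can tunnel through other bodies instead of pushing them. A sketch of the usual alternatives, reusing the mask names from the post (the velocity value is arbitrary):

// Option 1: let the physics engine do the moving so collisions get resolved.
myGreyBar[0].physicsBody?.collisionBitMask =
    bodyMasks.borderMask.rawValue | bodyMasks.greyBarMask.rawValue
myGreyBar[0].physicsBody?.velocity = CGVector(dx: 0, dy: 40)   // push the new bar upward

// Option 2: if fast or thin bodies still slip past each other,
// enable swept collision checks on them.
myGreyBar[0].physicsBody?.usesPreciseCollisionDetection = true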
Replies: 0 · Boosts: 0 · Views: 428 · Sep ’21
How can I change the bounce of SKSpriteNode dynamically?
I have a very simple app, all SKSpriteNodes: myBall, myBlue, and myRed. Only myBall moves, affected by gravity, and bounces off of different objects (myRed and myBlue). What I can't figure out is how to make myBall bounce harder or softer depending on which body it hits. I have been playing with the density of all the objects, but it doesn't seem to make any difference. Is there some property I am unaware of, or are there other methods?
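
In case it helps, the property that controls bounciness on an SKPhysicsBody is restitution (0 = no bounce, 1 = fully elastic), and it can be set per body. A minimal sketch using the node names from the post, with arbitrary values:

import SpriteKit

// Each surface gets its own restitution, so the ball rebounds differently off each.
myRed.physicsBody?.restitution = 0.9    // springy: the ball bounces hard off myRed
myBlue.physicsBody?.restitution = 0.2   // soft: the ball barely bounces off myBlue
myBall.physicsBody?.restitution = 0.5   // the ball's own value contributes as well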
Replies: 0 · Boosts: 0 · Views: 377 · Oct ’21
How can I apply safeAreaLayouts to multiple views?
When my app starts up I have my ViewController, which automatically creates my MainScreen (also a view controller). Right after self.addChild(mainController) I call a function which sets my constraints:

func setConstraints(vc: UIViewController) {
    vc.view.translatesAutoresizingMaskIntoConstraints = false
    var constraints = [NSLayoutConstraint]()
    constraints.append(vc.view.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor))
    constraints.append(vc.view.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor))
    constraints.append(vc.view.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor))
    constraints.append(vc.view.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor))
    NSLayoutConstraint.activate(constraints)
}

All is fine up to this point; the MainScreen is bound by the top and bottom safe areas. At some point from MainScreen I create another UIViewController:

countController.modalPresentationStyle = .fullScreen
self.present(countController, animated: true, completion: {})

Yet, no matter how hard I try to apply the constraints to the new controller, I crash with the following message:

Unable to activate constraint with anchors <NSLayoutXAxisAnchor...> because they have no common ancestor. Does the constraint or its anchors reference items in different view hierarchies? That's illegal.

I am too new to figure out where my error is.
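
One possible reading of that crash: a presented (rather than child) view controller's root view is not in the presenting controller's view hierarchy, so its anchors can't be constrained to the presenter's safe area. A sketch of the usual workaround, pinning the presented controller's own content to its own view.safeAreaLayoutGuide in viewDidLoad (contentView here is a hypothetical container view):

import UIKit

class CountController: UIViewController {
    private let contentView = UIView()   // hypothetical container for this screen's UI

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(contentView)
        contentView.translatesAutoresizingMaskIntoConstraints = false

        // Constrain to THIS controller's own safe area; both items share `view` as an ancestor.
        let guide = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            contentView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
            contentView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
            contentView.topAnchor.constraint(equalTo: guide.topAnchor),
            contentView.bottomAnchor.constraint(equalTo: guide.bottomAnchor)
        ])
    }
}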
Replies: 0 · Boosts: 0 · Views: 375 · Nov ’21
How to reset speechRecognizer
I'm building a very simple voice-to-text app, which I got from an online demo. What I can't seem to find is how to reset the response back to nil. This demo just keeps transcribing from the very beginning until it finally stalls. While I don't know if the stall is related to my question, I still need to find out how to code "OK, got the first 100 words. Reset response text to nil. Continue."

func startSpeechRecognition() {
    let node = audioEngine.inputNode
    let recordingFormat = node.outputFormat(forBus: 0)
    node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, _) in
        self.request.append(buffer)
    }
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch {
        alertView(message: "audioEngine start error")
    }
    guard let myRecognition = SFSpeechRecognizer() else {
        self.alertView(message: "Recognition is not on your phone")
        return
    }
    if !myRecognition.isAvailable {
        self.alertView(message: "recognition is not available right now")
    }
    task = speechRecognizer?.recognitionTask(with: request, resultHandler: { (response, error) in
        guard let response = response else {
            if error != nil {
                self.alertView(message: error!.localizedDescription.debugDescription)
            } else {
                self.alertView(message: "Unknown error in creating task")
            }
            return
        }
        let message = response.bestTranscription.formattedString
        self.label.text = message
    })
}
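
Not a complete answer, but the pattern I've usually seen for "resetting": an in-flight request/task can't really be cleared, so you cancel the current task, create a fresh SFSpeechAudioBufferRecognitionRequest, and start a new recognitionTask, which begins a new bestTranscription from empty. A rough sketch using the names from the post, assuming request and task are var properties on the same class as startSpeechRecognition():

import Speech

func restartRecognition() {
    // Tear down the current task and request.
    task?.cancel()
    task = nil
    request.endAudio()

    // A brand-new request means the next transcription starts from scratch.
    // (The installed tap keeps appending buffers via self.request, so no new tap is needed.)
    request = SFSpeechAudioBufferRecognitionRequest()

    task = speechRecognizer?.recognitionTask(with: request) { response, error in
        guard let response = response else { return }
        self.label.text = response.bestTranscription.formattedString
    }
}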
Replies: 0 · Boosts: 1 · Views: 860 · Dec ’21
Looking for in depth tutorial on SFSpeechRecognizer
It's a great tool from Apple, but I want to delve deeper into its engine than the documentation seems to go. For instance, I can't figure out how to clear the bestTranscription object in speechRecognizer, as it always contains the entire transcription. There are other things I would like to work with as well. Has anyone worked with this heavily enough to recommend proper books or paid tutorials? Many thanks.
Replies: 0 · Boosts: 0 · Views: 581 · Dec ’21
How to convert node.outputFormat to settings for AVAudioFile
I am trying to go from the installTap straight to AVAudioFile(forWriting:). I call:

let recordingFormat = node.outputFormat(forBus: 0)

and I get back:

<AVAudioFormat 0x60000278f750: 1 ch, 48000 Hz, Float32>

But AVAudioFile has a settings parameter of [String : Any], and I am curious how to turn those values into the settings needed to record in the required format. Hopefully these are the values I need?
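
If it helps anyone who lands here: AVAudioFormat exposes a settings property that is exactly the [String: Any] dictionary AVAudioFile(forWriting:settings:) expects, so the tap format can be passed straight through. A minimal sketch, where the output file name is made up and audioEngine comes from the post:

import AVFoundation

func startWritingTapToFile(audioEngine: AVAudioEngine) throws {
    let node = audioEngine.inputNode
    let recordingFormat = node.outputFormat(forBus: 0)

    // AVAudioFormat.settings is the [String: Any] dictionary AVAudioFile wants.
    let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("capture.caf")
    let audioFile = try AVAudioFile(forWriting: outputURL, settings: recordingFormat.settings)

    node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { buffer, _ in
        try? audioFile.write(from: buffer)   // append each tapped buffer to the file
    }

    audioEngine.prepare()
    try audioEngine.start()
}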
Replies: 0 · Boosts: 0 · Views: 456 · Dec ’21