Post

Replies

Boosts

Views

Activity

Any solution yet to not being able to turn off debugging via Wi-Fi?
The debugger in Xcode 16.x is super slow, and it turns out it's only this way when Xcode is connected to my iPhone via Wi-Fi. If I disable Wi-Fi on my iPhone, everything is fine. But that's not a solution. An engineer posted this supposed solution: https://developer.apple.com/documentation/xcode-release-notes/xcode-15-release-notes. Forgive me, but that's not a solution, especially since we used to be able to shut off "Connect via Wi-Fi." I've seen so many posts here and everywhere else with no one stating any clear answer. Does anyone know why this has been removed? And is anyone at Apple aware of it? I've filed it in Feedback Assistant, as many others have. What gives?
0
1
227
Jan ’25
Is there anywhere to get precompiled WhisperKit models for Swift?
If I try to dynamically load WhisperKit's models, as below, the download never occurs. No error or anything. And at the same time I can still get to the huggingface.co hosting site without any headaches, so it's not a blocking issue.

let config = WhisperKitConfig(
    model: "openai_whisper-large-v3",
    modelRepo: "argmaxinc/whisperkit-coreml"
)

So I have to default to the tiny model, as seen below. I have tried so many ways, using ChatGPT and others, to build the models on my Mac, but hit too many failures, because I have never dealt with builds like that before. Are there any hosting sites that have the models (small, medium, large) already built, where I can download them and just bundle them into my project? I've wasted quite a large amount of time trying to get this done.

import Foundation
import WhisperKit

@MainActor
class WhisperLoader: ObservableObject {
    var pipe: WhisperKit?

    init() {
        Task {
            await self.initializeWhisper()
        }
    }

    private func initializeWhisper() async {
        do {
            Logging.shared.logLevel = .debug
            Logging.shared.loggingCallback = { message in
                print("[WhisperKit] \(message)")
            }

            let pipe = try await WhisperKit() // defaults to "tiny"
            self.pipe = pipe
            print("initialized. Model state: \(pipe.modelState)")

            guard let audioURL = Bundle.main.url(forResource: "44pf", withExtension: "wav") else {
                fatalError("not in bundle")
            }

            let result = try await pipe.transcribe(audioPath: audioURL.path)
            print("result: \(result)")
        } catch {
            print("Error: \(error)")
        }
    }
}
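One possible workaround, if the compiled model folder can be obtained some other way, is to point WhisperKit at a local folder instead of the remote repo. This is only a sketch under the assumption that the WhisperKit version in use exposes a modelFolder initializer parameter (check the version's API first); the bundled folder name is hypothetical:

```swift
import WhisperKit

func loadLocalWhisper() async throws -> WhisperKit {
    // Assumption: a prebuilt "openai_whisper-large-v3" model folder has been
    // added to the app bundle as a folder reference (name is hypothetical).
    guard let resourcePath = Bundle.main.resourcePath else {
        fatalError("no resource path")
    }
    let modelFolder = resourcePath + "/openai_whisper-large-v3"

    // Assumption: this initializer variant accepts a local model folder,
    // skipping the network download entirely.
    return try await WhisperKit(modelFolder: modelFolder)
}
```

Bundling the folder also keeps first launch fast, since nothing has to be fetched from huggingface.co at runtime.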
0
0
98
Jun ’25
How can I execute animations for multiple SKSpriteNodes at once?
I already know how to run multiple animations on the same SKSpriteNode at once:

createPaths()
let myIcon1 = MyIcon(wIcon: "screen", iSize: iSize)
let move = SKAction.follow(iconPath[0], asOffset: false, orientToPath: false, duration: 1)
let shrink = SKAction.resize(byWidth: -iSize.width/2, height: -iSize.width/2, duration: 1)
let blur = SKAction.fadeAlpha(to: 0.6, duration: 1)
let group = SKAction.group([shrink, move, blur])
myIcon1.run(group)

But I have two more icons I would like to animate at the same time. Granted, with just 3 icons total I can't see any lag if I do something like this:

myIcon1.run(group1)
myIcon2.run(group2)
myIcon3.run(group3)

But surely there is a proper way to do this?
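Worth noting: run(_:) returns immediately, so back-to-back calls like the three above already start on the same frame; SpriteKit advances every node's actions together. A minimal sketch (the nodes and action groups are hypothetical stand-ins for the three icons) that just tidies the call sites by pairing each node with its group:

```swift
import SpriteKit

// Hypothetical nodes and action groups standing in for the three icons.
let icons: [SKNode] = [SKSpriteNode(), SKSpriteNode(), SKSpriteNode()]
let groups: [SKAction] = [
    SKAction.fadeAlpha(to: 0.6, duration: 1),
    SKAction.scale(to: 0.5, duration: 1),
    SKAction.fadeAlpha(to: 0.6, duration: 1)
]

// run(_:) is non-blocking, so all three animations begin concurrently.
for (icon, group) in zip(icons, groups) {
    icon.run(group)
}
```

This scales to any number of nodes without adding per-icon lines.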
1
0
577
Feb ’21
Is there a way to translate touches to screen coordinates
As you can see in the last two lines of the code below, I specify a specific SKSpriteNode to get the correct (or is it adjusted?) touch coordinates. The last line is just left in there to compare while I am debugging. I was curious whether there is already a method in Swift that translates any coordinates handed to it into physical screen coordinates? It would just be easier than having to first find out: is this item that I am tracking owned by the main GameScene, or by an SKSpriteNode that has been placed somewhere other than 0,0 on the GameScene?

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    super.touchesEnded(touches, with: event)
    var delta = CGPoint(x: 0, y: 0)
    guard touches.first != nil else { return }
    if let touch = touches.first,
       let node = myGV.currentGem,
       node.isMoving == true {
        let touchLocation = touch.location(in: myGV.guessBar!)
        let touchLocation2 = touch.location(in: self)
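SpriteKit does ship conversions in both directions: SKNode's convert(_:from:) and convert(_:to:) move a point between any two node coordinate spaces, and SKScene's convertPoint(toView:) maps a scene point into the hosting SKView's coordinates. A sketch with a hypothetical nested node, so the conversion works no matter which parent owns it:

```swift
import SpriteKit

// Hypothetical scene with a node nested inside an offset container.
let scene = SKScene(size: CGSize(width: 400, height: 300))
let container = SKNode()
container.position = CGPoint(x: 100, y: 50)
scene.addChild(container)

let gem = SKSpriteNode()
gem.position = CGPoint(x: 10, y: 20)
container.addChild(gem)

// Convert the gem's position into the scene's coordinate space,
// regardless of where it sits in the node tree.
let inScene = gem.parent!.convert(gem.position, to: scene)

// Once the scene is presented in an SKView, this maps a scene point
// into the view's coordinate system:
// let inView = scene.convertPoint(toView: inScene)
```

So instead of special-casing which node owns the item, converting through the scene gives one consistent coordinate space.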
1
0
820
Jun ’21
How can I return nil in Swift?
I have a subclass of SKSpriteNode called MyGem. There are multiple instances of this class at runtime. They are all in an array of MyGems. At a specific point I would like to find out which MyGem is twinkling. The problem I am running into is if no MyGem is twinkling. What do I return? I can't return a nil:

'nil' is incompatible with return type 'MyGem'

So what do I return? I thought of returning the index number of the MyGem in the array, and then passing -1 if none were twinkling. But that seems kludgy.

func getHighGem() -> MyGem {
    for gem in myGems {
        if gem.twinkling == true {
            return gem
        }
    }
    return nil // this line causes the IDE error
}
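The usual fix is to make the return type optional: MyGem? can carry either a gem or nil, and the caller unwraps it. A pure-Swift sketch with a simplified stand-in Gem type (the real code would use the MyGem subclass and its array):

```swift
// Simplified stand-in for the MyGem subclass.
struct Gem {
    let name: String
    var twinkling: Bool
}

let myGems = [
    Gem(name: "ruby", twinkling: false),
    Gem(name: "opal", twinkling: true)
]

// Optional return type: nil means "no gem is twinkling".
func getHighGem() -> Gem? {
    // first(where:) already returns an optional, so no manual loop is needed.
    return myGems.first(where: { $0.twinkling })
}

if let gem = getHighGem() {
    print(gem.name)  // prints "opal"
} else {
    print("no gem twinkling")
}
```

This avoids the -1 sentinel entirely: the type system forces every caller to handle the "none found" case.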
1
0
3.7k
Jun ’21
Can I get a subclass's property from a function call that returns said subclass?
While debugging I would like to get a specific subclass's property. So far I can do this:

if let myPeg = getTargetSlot(node: node, location: touchLocation) {
    print(myPeg.index)
}

Is it possible to write it in one line? Something like this?

print("\(getTargetSlot(node: node, location: touchLocation).index)")

Because if there is a way, I cannot figure out the syntax. Thanks
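If getTargetSlot returns an optional, optional chaining collapses the lookup into one line: the whole expression evaluates to nil when no slot is found. A pure-Swift sketch with hypothetical Slot and getTargetSlot stand-ins (the real function takes node and location parameters):

```swift
struct Slot {
    let index: Int
}

// Hypothetical stand-in for the real getTargetSlot(node:location:).
func getTargetSlot(found: Bool) -> Slot? {
    return found ? Slot(index: 7) : nil
}

// Optional chaining: the expression's type is Int? (nil if no slot matched).
print(getTargetSlot(found: true)?.index as Any)   // Optional(7)

// Or supply a debug default with the nil-coalescing operator:
print(getTargetSlot(found: false)?.index ?? -1)   // -1
```

For debugging output, the `?? -1` form is often the most readable one-liner, since it never prints "Optional(...)".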
1
0
442
Aug ’21
Why am I getting touchesCancelled instead of touchesEnded?
I am using debug labels, so I know when I get any touchesBegan, touchesMoved, touchesEnded, and touchesCancelled calls. I get everything I expect: touchesBegan, touchesMoved, and touchesEnded. However, it would appear that if I barely, just barely, move an SKSpriteNode, I get a touchesCancelled instead of touchesEnded, even though the call right before is touchesMoved. Is this as intended? Is there some threshold you have to reach to get an ended instead of a cancelled? And if so, can I see and change it? Thanks. More than glad to post my code, though not sure how that would change anything.
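Whatever the trigger, a common defensive pattern is to treat a cancellation like an end by routing both callbacks through one handler, so a touch sequence the system cancels still gets the same cleanup. A UIKit sketch (the finishTouch helper is hypothetical):

```swift
import UIKit

class TouchView: UIView {
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        finishTouch(touches)
    }

    // The system may cancel a touch sequence instead of ending it;
    // funnel both paths through the same cleanup code.
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesCancelled(touches, with: event)
        finishTouch(touches)
    }

    // Hypothetical shared handler: snap the node back, commit the move, etc.
    private func finishTouch(_ touches: Set<UITouch>) {
    }
}
```

With this in place, the ended-vs-cancelled distinction stops mattering for game state, whatever threshold the system applies.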
1
0
672
Aug ’21
What is the proper term for creating a world in Swift?
I know it's uncool to ask vague questions here, but what do they call it when you create a world and follow it with a camera in Swift? Like an RPG? Like Doom? I want to try and learn that now. And more importantly can it be done without using the Xcode scene builder? Can it be done all via code? Thanks, as always. Without the forum I would never have gotten much farther than "Hello World!"
1
0
625
Sep ’21
Looking for a proper Speech to Text tutorial
So sorry if I shouldn't be asking this here, but I am trying to find a current-ish tutorial on how to make an app that converts speech to text in real time, transcribing speech to text as you're speaking. I've found a few on YouTube, but they are quite old, or just transcribe from a recorded file, etc. If anyone is aware of a good tutorial, paid or not, I would so appreciate any link. Thank you
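For reference, Apple's own Speech framework covers the live case: SFSpeechAudioBufferRecognitionRequest accepts microphone buffers from AVAudioEngine and reports partial results as you speak. A minimal sketch of the core wiring (authorization handling, error paths, and Info.plist usage keys are omitted):

```swift
import Speech
import AVFoundation

final class LiveTranscriber {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?

    func start() throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true  // live, word-by-word updates
        self.request = request

        // Feed microphone buffers into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }

        recognizer?.recognitionTask(with: request) { result, error in
            if let result = result {
                print(result.bestTranscription.formattedString)
            }
        }

        audioEngine.prepare()
        try audioEngine.start()
    }
}
```

Before any of this runs, the app must call SFSpeechRecognizer.requestAuthorization and get the user's permission for both speech recognition and microphone access.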
1
0
756
Nov ’21