
Do I need to dismiss UIDocumentPickerViewController even when the end user chooses a file?
From a tutorial I pulled the extension below to let the end user select a file from the iPhone's storage system. Everything works fine. However, I noticed that in the delegate, dismiss is only called when the user cancels. While the view does disappear when a file is selected, I am not sure the view is being properly dismissed internally. Since I do call present, am I responsible for dismissing the picker when the user chooses a file as well?

```swift
let supportedTypes: [UTType] = [UTType.item]
let pickerViewController = UIDocumentPickerViewController(forOpeningContentTypes: supportedTypes)
pickerViewController.delegate = self
pickerViewController.allowsMultipleSelection = false
present(pickerViewController, animated: true, completion: nil)

extension uploadFile: UIDocumentPickerDelegate {
    func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
        for url in urls {
            guard url.startAccessingSecurityScopedResource() else {
                print("error")
                return
            }
            // ...save chosen file url here in an existing structure...
            url.stopAccessingSecurityScopedResource()
        }
    }

    func documentPickerWasCancelled(_ controller: UIDocumentPickerViewController) {
        controller.dismiss(animated: true, completion: nil)
    }
}
```
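For reference, a hedged sketch of what symmetric cleanup could look like, using the same delegate as above. In practice the picker appears to dismiss itself after a selection, and an extra dismiss call on an already-dismissed controller is harmless, so this mostly documents intent:

```swift
// A sketch, not a confirmed requirement: dismiss in the selection callback
// too, so both delegate paths are symmetric.
func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
    controller.dismiss(animated: true, completion: nil)
    for url in urls {
        guard url.startAccessingSecurityScopedResource() else { return }
        defer { url.stopAccessingSecurityScopedResource() }   // always balanced, even on early exit
        // ...save the chosen file url...
    }
}
```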
0 replies · 0 boosts · 667 views · Apr ’22
Am unable to add an AVAudioMixerNode to downsize my recording
I'm at the beginning of a voice recording app. I store incoming voice data in a buffer array and write 50 buffers at a time to a file. That code works fine (Sample One). However, I would like the recorded files to be smaller, so in Sample Two I try to add an AVAudioMixerNode to reduce the sample rate. That version gives me two errors. The first error I get is when I call audioEngine.attach(downMixer); the debugger gives me nine of these errors: throwing -10878. The second error is a crash when I try to write to audioFile. Of course they might all be related, so I am looking to attach the mixer successfully first. I do need help, as I am just trying to piece this together from tutorials, and when it comes to audio I know less than anything else.

Sample One

```swift
// these two lines are in the init of the class that contains this function
node = audioEngine.inputNode
recordingFormat = node.inputFormat(forBus: 0)

func startRecording() {
    audioBuffs = []
    x = -1
    node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
        x += 1
        audioBuffs.append(buffer)
        if x >= 50 {
            audioFile = makeFile(format: recordingFormat, index: fileCount)
            mainView?.setLabelText(tag: 3, text: "fileIndex = \(fileCount)")
            fileCount += 1
            for i in 0...49 {
                do {
                    try audioFile!.write(from: audioBuffs[i])
                } catch {
                    mainView?.setLabelText(tag: 4, text: "write error")
                    stopRecording()
                }
            }
            // ...cleanup buffer code...
        }
    })
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        print("oh catch \(error)")
    }
}
```

Sample Two

```swift
// these two lines are in the init of the class that contains this function
node = audioEngine.inputNode
recordingFormat = node.inputFormat(forBus: 0)

func startRecording() {
    audioBuffs = []
    x = -1

    // new code
    let format16KHzMono = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                        sampleRate: 11025.0,
                                        channels: 1,
                                        interleaved: true)
    let downMixer = AVAudioMixerNode()
    audioEngine.attach(downMixer)

    // installTap on the mixer rather than the node
    downMixer.installTap(onBus: 0, bufferSize: 8192, format: format16KHzMono, block: { [self] (buffer, _) in
        x += 1
        audioBuffs.append(buffer)
        if x >= 50 {
            // use a different format in creating the audioFile
            audioFile = makeFile(format: format16KHzMono!, index: fileCount)
            mainView?.setLabelText(tag: 3, text: "fileIndex = \(fileCount)")
            fileCount += 1
            for i in 0...49 {
                do {
                    try audioFile!.write(from: audioBuffs[i])
                } catch {
                    stopRecording()
                }
            }
            // ...cleanup buffers...
        }
    })

    let format = node.inputFormat(forBus: 0)
    // new code
    audioEngine.connect(node, to: downMixer, format: format)                            // use default input format
    audioEngine.connect(downMixer, to: audioEngine.outputNode, format: format16KHzMono) // use new audio format
    downMixer.outputVolume = 0.0

    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        print("oh catch \(error)")
    }
}
```
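In case it helps frame the problem, here is a hedged sketch of an alternative that avoids putting an integer format into the engine graph at all. -10878 is kAudioUnitErr_FormatNotSupported, and engine connections generally want the float format the hardware provides, so one common workaround is to tap the input in its native format and downsample each buffer with AVAudioConverter before writing. The names below are illustrative placeholders, not a drop-in fix for the code above:

```swift
import AVFoundation

// Sketch, assuming the engine graph stays in its native float format and
// the rate/width reduction happens per tapped buffer instead.
let inputFormat = audioEngine.inputNode.inputFormat(forBus: 0)
guard let targetFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                       sampleRate: 11025.0,
                                       channels: 1,
                                       interleaved: true),
      let converter = AVAudioConverter(from: inputFormat, to: targetFormat) else {
    fatalError("could not build converter")
}

func downsample(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    // Size the output for the sample-rate ratio, with a little headroom.
    let ratio = targetFormat.sampleRate / inputFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio) + 64
    guard let outBuffer = AVAudioPCMBuffer(pcmFormat: targetFormat, frameCapacity: capacity) else {
        return nil
    }
    var consumed = false
    var conversionError: NSError?
    converter.convert(to: outBuffer, error: &conversionError) { _, outStatus in
        if consumed {                      // hand the converter exactly one input buffer
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return buffer
    }
    return conversionError == nil ? outBuffer : nil
}
```

The tap would then stay on the input node in recordingFormat, and each tapped buffer would pass through downsample(_:) before being appended and written to a file created with targetFormat.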
0 replies · 0 boosts · 1.2k views · Aug ’22
Questions about Xcode 13.4.1 and supported iOS versions
I already posted about Xcode 13.4.1 not supporting my iPhone's iOS 15.6. But the answer raised even more questions:

If the latest version of Xcode (13.4.1) won't support iOS 15.6, why should I think an earlier version of Xcode would?
What is the real solution to getting Xcode to run apps on that iOS? GitHub does not have device support files past 15.5.
Does Xcode automatically update its supported iOS files behind the scenes?
Is there a planned date for Xcode to support iOS 15.6?

Thank you
0 replies · 0 boosts · 640 views · Aug ’22
I can't get UITextView to scroll to its bottom before UIViewController appears
I have a UITextView that I include in all my UIViewControllers; it carries data over from the entire app's instance. If I put the two lines that scroll to the bottom in viewDidAppear, the text scrolls to the bottom, but you see it happen, so it's not pleasant visually. However, if I put the same two lines in viewWillAppear (as shown below), then for some reason the UITextView starts at the top of the text. Am I somehow doing this incorrectly?

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    myStatusWin.text = db.status
    let range = NSMakeRange(myStatusWin.text.count - 1, 0)
    myStatusWin.scrollRangeToVisible(range)
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
}
```
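A hedged sketch of one likely explanation, not a verified fix: at viewWillAppear time the text view has not been laid out yet, so its contentSize is stale and the layout pass that follows can reset the scroll position. Forcing a layout pass before scrolling often lets the view land at the bottom before anything is drawn:

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    myStatusWin.text = db.status
    myStatusWin.layoutIfNeeded()   // force contentSize to reflect the new text
    let range = NSMakeRange(myStatusWin.text.count - 1, 0)
    myStatusWin.scrollRangeToVisible(range)
}
```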
0 replies · 0 boosts · 561 views · Sep ’22
Why can't I get more info on why SFSpeechRecognizer won't read my recorded audio files?
Updated info below.

Full disclosure: I have this question over on StackOverflow as well, but I am at a standstill until I find a way to move forward, debug, etc.

I am trying to recognize prerecorded speech in Swift. Essentially it either detects no speech, detects blank speech, or works on the one prerecorded file where I screamed a few words. I can't tell where the headache lies and can't figure out if there's a more detailed way to debug this; I can't find any properties that give more detailed info. Someone on SO did recommend I go through Apple's demo, here. That works just fine, and my code is very similar to it. So the main question remains whether something about the way I save my audio files, or something else, is leading to my headaches. If anyone has any insight into this I would very much appreciate any hints. My question over on StackOverflow

Updated info, and new code

It appears that I was calling SFSpeechURLRecognitionRequest too often, before the first request had completed. Perhaps I need to create a new instance of SFSpeechRecognizer? Unsure. Regardless, I quickly (and sloppily) adjusted the code to only run once the previous request returned its results. The results were much better, except one audio file still came up with no results. Not an error, just no text. This file is the second half of a recording I split in two, so the format and volume are the same as a file that works. So I still need a better way to debug this, to find out what is going wrong with that file.

The code where I grab the file and attempt to read it:

```swift
func findAudioFiles() {
    let fm = FileManager.default
    print("\(urlPath)")
    do {
        let items = try fm.contentsOfDirectory(atPath: documentsPath)
        let filteredInterestArray1 = items.filter({ $0.hasSuffix(".m4a") })
        let filteredInterestArray2 = filteredInterestArray1.filter({ $0.contains("SS-X-") })
        let sortedItems = filteredInterestArray2.sorted()
        for item in sortedItems {
            audioFiles.append(item)
        }
        NotificationCenter.default.post(name: Notification.Name("goAndRead"), object: nil, userInfo: myDic)
    } catch {
        print("\(error)")
    }
}

@objc func goAndRead() {
    audioIndex += 1
    if audioIndex != audioFiles.count {
        let fileURL = NSURL.fileURL(withPath: documentsPath + "/" + audioFiles[audioIndex], isDirectory: false)
        transcribeAudio(url: fileURL, item: audioFiles[audioIndex])
    }
}

func requestTranscribePermissions() {
    SFSpeechRecognizer.requestAuthorization { [unowned self] authStatus in
        DispatchQueue.main.async {
            if authStatus == .authorized {
                print("Good to go!")
            } else {
                print("Transcription permission was declined.")
            }
        }
    }
}

func transcribeAudio(url: URL, item: String) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else { return }
    let request = SFSpeechURLRecognitionRequest(url: url)
    if !recognizer.supportsOnDeviceRecognition { print("offline not available"); return }
    if !recognizer.isAvailable { print("not available"); return }
    request.requiresOnDeviceRecognition = true
    request.shouldReportPartialResults = true
    recognizer.recognitionTask(with: request) { (result, error) in
        guard let result = result else {
            print("\(item) : There was an error: \(error.debugDescription)")
            return
        }
        if result.isFinal {
            print("\(item) : \(result.bestTranscription.formattedString)")
            NotificationCenter.default.post(name: Notification.Name("goAndRead"), object: nil, userInfo: self.myDic)
        }
    }
}
```
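For squeezing more detail out of a silent failure, a hedged debugging sketch: cast the error to NSError to see its domain and code, and log partial transcriptions and segment counts so you can tell whether the recognizer heard anything at all before the final result came back empty. Same recognizer and request setup as above, just a noisier handler:

```swift
// Sketch of a more verbose result handler for diagnosis only.
recognizer.recognitionTask(with: request) { result, error in
    if let nsError = error as NSError? {
        print("\(item): domain=\(nsError.domain) code=\(nsError.code) userInfo=\(nsError.userInfo)")
        return
    }
    guard let result = result else { return }
    // Partials reveal whether anything is being heard mid-file.
    print("\(item) partial: \(result.bestTranscription.formattedString)")
    if result.isFinal {
        print("\(item) final, segments: \(result.bestTranscription.segments.count)")
    }
}
```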
0 replies · 0 boosts · 926 views · Dec ’22
Am lost using URLSessionDelegate within SwiftUI
Trying to create an app in SwiftUI that uses HTTP for GETs, POSTs, forms, et al. I cannot assign URLSessionDataDelegate, URLSessionTaskDelegate, or URLSessionDelegate to a SwiftUI view. So my assumption is that I need to create a UIViewControllerRepresentable for a regular UIViewController and assign the delegates to that UIViewController. I just need to know if this is the correct path to proceed with. Or is there some sort of tie-in that I cannot find using Google? Thank you
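For what it's worth, a hedged sketch of the usual alternative: URLSession delegates only require an NSObject subclass, not a view controller, so a plain class can own the session and a SwiftUI view can own that class, with no UIViewControllerRepresentable involved. NetworkClient and its names are illustrative placeholders:

```swift
import SwiftUI

// A delegate-driven client that a SwiftUI view can hold directly.
final class NetworkClient: NSObject, ObservableObject, URLSessionDataDelegate {
    @Published var status = "idle"
    private var received = Data()
    private lazy var session = URLSession(configuration: .default,
                                          delegate: self,
                                          delegateQueue: nil)

    func get(_ url: URL) {
        received.removeAll()
        session.dataTask(with: url).resume()   // no completion handler, so delegate methods fire
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        received.append(data)                  // called off the main thread
    }

    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        let summary = error == nil ? "received \(received.count) bytes" : "failed: \(error!)"
        DispatchQueue.main.async { self.status = summary }
    }
}

struct FetchView: View {
    @StateObject private var client = NetworkClient()

    var body: some View {
        Text(client.status)
            .onAppear { client.get(URL(string: "https://example.com")!) }
    }
}
```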
0 replies · 0 boosts · 454 views · Jun ’23
How can I keep a class declaration within a SwiftUI view persistent through the view's updates?
I have a class, MyURL, from a UIKit app that I wrote, that handles all my URL needs: uploads, downloads, GETs, POSTs, etc. I would now like to use that class from within a SwiftUI view. So the SwiftUI view would create the object and call MyURL methods, and I would pass in binding vars that would let my SwiftUI view update things such as statuses: percentage done, etc. My issue is: how can I properly declare that class, MyURL, within the SwiftUI view, so that it doesn't get redeclared every time the view updates? Should I declare the MyURL object in a view above it and pass it into the view that calls it? I would just like to do it the proper way. I wish I was better with terminology. Thank you

```swift
struct ContentView: View {
    var myURL = MyURL()   // <-- the declaration in question
    @State var proceed = false

    var body: some View {
        Button {
            myURL.pressed(proceed: $proceed)
        } label: {
            Text(proceed ? "pressed" : "not pressed")
        }
    }
}

class MyURL: NSObject, URLSessionDataDelegate, URLSessionTaskDelegate, URLSessionDelegate, URLSessionDownloadDelegate {

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didFinishDownloadingTo location: URL) {
    }

    func pressed(proceed: Binding<Bool>) {
        print("\(proceed)")
        proceed.wrappedValue.toggle()
    }

    // Error received
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        if let err = error {
            DispatchQueue.main.async { [self] in
                // code
            }
        }
    }

    // Response received
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive response: URLResponse, completionHandler: (URLSession.ResponseDisposition) -> Void) {
        completionHandler(URLSession.ResponseDisposition.allow)
        if let httpResponse = response as? HTTPURLResponse {
            xFile?.httpResponse = httpResponse.statusCode
            DispatchQueue.main.async { [self] in
                if httpResponse.statusCode != 200 {
                    // code
                }
            }
        }
    }

    // Data received
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        if xFile?.httpResponse == 200 {
            DispatchQueue.main.async { [self] in
                // code
            }
        }
    }

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didWriteData bytesWritten: Int64, totalBytesWritten: Int64, totalBytesExpectedToWrite: Int64) {
        let num: Float = Float(totalBytesWritten * 100)
        let den: Float = Float(totalBytesExpectedToWrite * 100)
        let percentDownloaded: Int = Int(num / den * 100)
        DispatchQueue.main.async { [self] in
            // code
        }
    }
}
```
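A hedged sketch of the pattern usually reached for here, assuming MyURL can adopt ObservableObject: @StateObject creates the object once for the view's lifetime and keeps it across body re-evaluations, and @Published properties can replace the hand-passed bindings for statuses like percentage done:

```swift
import SwiftUI

// Sketch: MyURL as an ObservableObject owned by the view via @StateObject.
class MyURL: NSObject, ObservableObject {
    @Published var proceed = false
    @Published var percentDownloaded = 0   // delegate callbacks can publish progress here

    func pressed() {
        proceed.toggle()
    }
}

struct ContentView: View {
    @StateObject private var myURL = MyURL()   // created once, survives view updates

    var body: some View {
        Button {
            myURL.pressed()
        } label: {
            Text(myURL.proceed ? "pressed" : "not pressed")
        }
    }
}
```

If several views need the same instance, declaring it in a parent view (or injecting it with .environmentObject) and passing it down also works; @StateObject simply pins the lifetime to whichever view owns it.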
0 replies · 0 boosts · 442 views · Jun ’23
How can I dismiss the keyboard when a TextField loses focus?
Am trying to figure out how to dismiss the iOS keyboard when my TextFields do not have focus. Obviously, when clicking a button I can call a dismiss-keyboard function. But what I wanted to learn was how to dismiss the keyboard if the user hits return OR clicks away from the TextFields. Before I could figure out the "user hit return" part, I got stuck on the user clicking away from the TextFields. While I can add an onTapGesture to the text fields, I wonder if I can do something like detecting a tap gesture on the entire screen, so that if the user taps on any blank space I could call the dismiss-keyboard function.

```swift
import SwiftUI

struct ContentView: View {
    @State var textField1 = ""
    @State var textField2 = ""
    @State var hasFocus = "No text field has focus"
    @FocusState var leftTyping: Bool
    @FocusState var rightTyping: Bool

    var body: some View {
        VStack {
            Text(hasFocus)
                .font(.largeTitle)
            HStack {
                TextField("left", text: $textField1)
                    .focused($leftTyping)
                    .onChange(of: leftTyping) {
                        if leftTyping == false, rightTyping == false {
                            hideKeyboard()
                            hasFocus = "No text field has focus"
                        } else if leftTyping {
                            hasFocus = "focus on left field"
                        }
                    }
                TextField("right", text: $textField2)
                    .focused($rightTyping)
                    .onChange(of: rightTyping) {
                        if leftTyping == false, rightTyping == false {
                            hideKeyboard()
                            hasFocus = "No text field has focus"
                        } else if rightTyping {
                            hasFocus = "focus on right field"
                        }
                    }
            }
            Button("steal focus") {
                hideKeyboard()
                hasFocus = "No text field has focus"
            }
            .buttonStyle(.borderedProminent)
            .tint(.brown)
            .font(.largeTitle)
            .padding(10)
            .foregroundStyle(.white)
        }
        .padding()
    }

    func hideKeyboard() {
        UIApplication.shared.sendAction(#selector(UIResponder.resignFirstResponder), to: nil, from: nil, for: nil)
    }
}

#Preview {
    ContentView()
}
```

Is there a way to do this?
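A hedged sketch of one approach, boiled down to a single field and untested against the exact layout above: making the container hit-testable with .contentShape lets a tap on blank space clear the @FocusState, which retires the keyboard on its own, and .onSubmit covers the return key:

```swift
import SwiftUI

struct TapAwayView: View {
    @State private var text = ""
    @FocusState private var isFocused: Bool

    var body: some View {
        VStack {
            TextField("type here", text: $text)
                .focused($isFocused)
                .onSubmit { isFocused = false }   // return key dismisses
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .contentShape(Rectangle())                // make the blank space tappable
        .onTapGesture { isFocused = false }       // tap anywhere else dismisses
    }
}
```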
0 replies · 0 boosts · 849 views · Dec ’23
Where can I find a tutorial on using GeometryReader within a View's background (closure) modifier?
Someone showed me the code below when I was trying to figure out which views were being intersected by one continuous DragGesture. It works, but I would like to find a tutorial where I can learn what's going on here. From what I can tell, the background modifier takes a closure that calls a function, which has a GeometryReader that returns a view. I've Googled ".background as a closure in SwiftUI" and still can't find any form of tutorial that discusses what is going on here. Not looking for the answers, looking to learn. Thank you

```swift
struct ContentView: View {
    @State private var dragLocation = CGPoint.zero
    @State private var dragInfo = " "

    private func dragDetector(for name: String) -> some View {
        GeometryReader { proxy in
            let frame = proxy.frame(in: .global)
            let isDragLocationInsideFrame = frame.contains(dragLocation)
            let isDragLocationInsideCircle = isDragLocationInsideFrame
                && Circle().path(in: frame).contains(dragLocation)
            Color.clear
                .onChange(of: isDragLocationInsideCircle) { oldVal, newVal in
                    if dragLocation != .zero {
                        dragInfo = "\(newVal ? "entering" : "leaving") \(name)..."
                    }
                }
        }
    }

    var body: some View {
        ZStack {
            Color(white: 0.2)
            VStack(spacing: 50) {
                Text(dragInfo)
                    .foregroundStyle(.white)
                HStack {
                    Circle()
                        .fill(.red)
                        .frame(width: 100, height: 100)
                        .background { dragDetector(for: "red") }
                    Circle()
                        .fill(.white)
                        .frame(width: 100, height: 100)
                        .background { dragDetector(for: "white") }
                    Circle()
                        .fill(.blue)
                        .frame(width: 100, height: 100)
                        .background { dragDetector(for: "blue") }
                }
            }
        }
        .gesture(
            DragGesture(coordinateSpace: .global)
                .onChanged { val in
                    dragLocation = val.location
                }
                .onEnded { val in
                    dragLocation = .zero
                    dragInfo = " "
                }
        )
    }
}
```
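A stripped-down sketch of the same mechanics may help anchor the search terms (the relevant concepts are "ViewBuilder trailing closures" and "GeometryReader"): .background takes a @ViewBuilder closure, so any expression returning a view works there, and a GeometryReader placed in it reads the frame of the view it backs:

```swift
import SwiftUI

// Minimal illustration: .background's trailing closure builds a view, and a
// GeometryReader placed there reads the size of the view it sits behind.
struct SizeReporter: View {
    @State private var size = CGSize.zero

    var body: some View {
        Text("I am \(Int(size.width)) x \(Int(size.height)) points")
            .background {                     // trailing-closure ViewBuilder, same shape as dragDetector(for:)
                GeometryReader { proxy in     // proxy describes the Text's bounds
                    Color.clear               // invisible view that exists only to carry the reading
                        .onAppear { size = proxy.size }
                }
            }
    }
}
```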
0 replies · 0 boosts · 417 views · Jan ’24
Is there a trick to viewing an EnvironmentObject in the Xcode debugger? It currently shows up as an invalid expression.
I am trying to watch an @EnvironmentObject in the Xcode debugger, and it comes up as an Invalid Expression in the variables view pane. If I want to view it, I need to declare a local variable and assign it the value of the passed-in @EnvironmentObject. While not impossible, it's a lot of work. Or maybe I am doing something incorrectly and this is a symptom of that? Would like to resolve this.

Enclosed: example code and screenshot, where choice is the passed/unviewable EnvironmentObject and ch is the local variable I use to view it in the debugger.

Project "TestingApp"

File: TestingApp

```swift
import SwiftUI

@main
struct TestingApp: App {
    @StateObject private var choice = Choices()

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environmentObject(choice)
        }
    }
}
```

File: Choices

```swift
import Foundation

@MainActor
class Choices: ObservableObject {
    @Published var aChoice = 1
    @Published var bChoice = 2
}
```

File: ContentView

```swift
import SwiftUI

struct ContentView: View {
    @EnvironmentObject private var choice: Choices

    var body: some View {
        VStack {
            let ch = choice
            Text("\(choice.aChoice)")
                .font(.largeTitle)
                .padding(.bottom)
            Text("2")
                .font(.largeTitle)
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
```
0 replies · 0 boosts · 602 views · Mar ’24
How can I enable HTTP exceptions in Xcode 15?
Before Xcode 15, when I could access the Info.plist file, I was able to add exceptions to the App Transport Security Settings so I could connect with my home server, which has no HTTPS, just HTTP. But in Xcode 15 I have no idea how to do this, nor can I buy a clue with Google. Please help! Thanks

P.S. I should probably add that one site mentioned that going to the Targets section of your project allows easy access to Info.plist. Yet for some strange reason there is no item under Targets, which is odd, as I can debug my project.
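For reference, a hedged sketch of the ATS keys themselves, which should apply whether they are entered through the target's Info tab or pasted into a raw Info.plist; "myserver.local" is a placeholder for the actual host:

```xml
<!-- Sketch of an ATS exception for a plain-HTTP host; the domain is a placeholder. -->
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>myserver.local</key>
        <dict>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>
```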
0 replies · 0 boosts · 832 views · Jun ’24
Strange error in com.apple.speech.localspeechrecognition that doesn't affect output?
While running speech recognition (SFSpeechRecognizer) in Swift, I get the error below. However, the app successfully transcribes the audio file, so I am not sure how worried I should be. As well, I would like to know: when that error occurred, did that mean the app went to the internet to transcribe that file? (Yes, requiresOnDeviceRecognition is set to false.) I would like to know what that error means, and how much I need to worry about it.

```
Received an error while accessing com.apple.speech.localspeechrecognition service: Error Domain=kAFAssistantErrorDomain Code=1101 "(null)"
```
0 replies · 0 boosts · 774 views · Oct ’24