Do I have to build in support for user scrolling through a UITextView object?
I am trying to add a UITextView within my app to output data to. Naturally the data will eventually be bigger than the size of the UITextView, and the view is a set size, so I would like the user to be able to scroll through its content. However, I cannot scroll through the content in the app. Am I supposed to build the scrolling behavior myself? It seems weird that I would have to do that, but I cannot seem to find the answer to this on the web. I've also noticed that no vertical scroll indicator shows up when the text is larger than the size of the object, which makes me wonder if I am missing a property or two.

func createStatusField() -> UITextView {
    let myStatus = UITextView(frame: CGRect(x: 50, y: 50, width: 100, height: 300))
    myStatus.autocorrectionType = .no
    myStatus.text = "hello there"
    myStatus.backgroundColor = .secondarySystemBackground
    myStatus.textColor = .secondaryLabel
    myStatus.font = UIFont.preferredFont(forTextStyle: .body)
    myStatus.layer.zPosition = 1
    myStatus.isScrollEnabled = true
    myStatus.showsVerticalScrollIndicator = true
    return myStatus
}
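For reference, here is a minimal sketch (the controller name and placement are made up, not from my project) of the setup I would expect to scroll on its own: UITextView is a UIScrollView subclass, so scrolling and the indicator should appear without extra code once the view is in the hierarchy, isScrollEnabled is true (the default), and the text overflows the frame.

import UIKit

class StatusViewController: UIViewController {
    let statusView = UITextView(frame: CGRect(x: 50, y: 50, width: 100, height: 300))

    override func viewDidLoad() {
        super.viewDidLoad()
        statusView.isEditable = false                                      // a read-only status log
        statusView.isScrollEnabled = true                                  // true by default for UITextView
        statusView.text = String(repeating: "status line\n", count: 100)   // enough text to overflow the frame
        view.addSubview(statusView)
    }
}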
Replies: 3 · Boosts: 0 · Views: 716 · Activity: Aug ’22
UITextView will not scroll to bottom under specific circumstances
Within my UIViewController I have a UITextView that I use to dump current status and info into. Every time I add text to the UITextView I would like it to scroll to the bottom, so I've created this function, which I call from the UIViewController whenever I have new data.

func updateStat(status: String, tView: UITextView) {
    db.status = db.status + status + "\n"
    tView.text = db.status
    let range = NSMakeRange(tView.text.count - 1, 0)
    tView.scrollRangeToVisible(range)
    tView.flashScrollIndicators()
}

The only thing that does not work is tView.scrollRangeToVisible. However, if from the UIViewController I call:

updateStat(status: "...new data...", tView: mySession)
let range = NSMakeRange(mySession.text.count - 1, 0)
mySession.scrollRangeToVisible(range)

then the UITextView's scrollRangeToVisible does work. I'm curious whether anyone knows why this works when called within the UIViewController but not when called from the function. P.S. I have also tried the updateStat function as an extension on UIViewController, but that doesn't work either.
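For comparison, a pattern I have seen suggested for this kind of thing (just an assumption on my part that layout timing is the difference between the two call sites) is to defer the scroll until the pending text change has been laid out:

func updateStat(status: String, tView: UITextView) {
    db.status = db.status + status + "\n"
    tView.text = db.status
    // Defer the scroll until the new text has been laid out.
    DispatchQueue.main.async {
        tView.layoutIfNeeded()
        let range = NSMakeRange(tView.text.count - 1, 0)
        tView.scrollRangeToVisible(range)
        tView.flashScrollIndicators()
    }
}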
Replies: 1 · Boosts: 0 · Views: 751 · Activity: Aug ’22
Unable to properly set or read user defined setting
I've had this issue before... Under User-Defined build settings I have created DEBUG_LEVEL_1, and within my code I have:

#if DEBUG_LEVEL_1
self.status = printSimDir()
#endif

However, the printSimDir function is never called, so obviously I am setting something incorrectly here.
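For context, my assumption (unconfirmed) is that a plain User-Defined setting is not visible to Swift's #if; the flag also has to be listed under Active Compilation Conditions (SWIFT_ACTIVE_COMPILATION_CONDITIONS) for the configuration being built. A sketch of what that would look like:

// Build Settings > Swift Compiler - Custom Flags > Active Compilation Conditions (Debug):
//     DEBUG DEBUG_LEVEL_1
// With the flag listed there, the block below is compiled in for Debug builds.
#if DEBUG_LEVEL_1
self.status = printSimDir()
#endif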
Replies: 0 · Boosts: 0 · Views: 527 · Activity: Aug ’22
How to "properly" fetch data from CoreData in descending order?
Within my code to fetch data from Core Data I have the following line:

let itemNoSort = NSSortDescriptor(key: "itemNo", ascending: false)

What I am not sure of, however, is whether the above is the same as saying descending: true. I can't seem to find it in the documentation.
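For what it's worth, here is a small sketch of how I read it ("Item" and the context parameter are hypothetical stand-ins): ascending: false sorts from the largest itemNo down, i.e. descending order, so there is no separate descending flag to look for.

import CoreData

// "Item" is a hypothetical entity name; the sort descriptor is the one from the question.
func fetchItemsNewestFirst(context: NSManagedObjectContext) throws -> [Item] {
    let request: NSFetchRequest<Item> = Item.fetchRequest()
    // ascending: false returns results from the highest itemNo to the lowest (descending).
    request.sortDescriptors = [NSSortDescriptor(key: "itemNo", ascending: false)]
    return try context.fetch(request)
}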
Replies: 2 · Boosts: 0 · Views: 922 · Activity: Sep ’22
I can't get UITextView to scroll to its bottom before UIViewController appears
I have a UITextView that I include in all my UIViewControllers; it carries over data from the entire app's instance. If I put the two lines that scroll to the bottom in viewDidAppear, the text scrolls to the bottom, but you see it happen, so it's not pleasant visually. However, if I put the same two lines in viewWillAppear (as shown below), then for some reason the UITextView starts at the top of the text. Am I somehow doing this incorrectly?

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    myStatusWin.text = db.status
    let range = NSMakeRange(myStatusWin.text.count - 1, 0)
    myStatusWin.scrollRangeToVisible(range)
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
}
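A sketch of an alternative timing (my assumption being that the text view has not been laid out yet in viewWillAppear, so the scroll has nothing to act on): perform the scroll once layout has happened but before the view is on screen, for example in viewDidLayoutSubviews, guarded so it only runs once.

private var didScrollToBottom = false

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    guard !didScrollToBottom, !myStatusWin.text.isEmpty else { return }
    didScrollToBottom = true
    // Layout is complete here, so the target range actually exists to scroll to.
    let bottom = NSMakeRange(myStatusWin.text.count - 1, 0)
    myStatusWin.scrollRangeToVisible(bottom)
}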
Replies: 0 · Boosts: 0 · Views: 570 · Activity: Sep ’22
Looking for tutorial to write apps that receive push notifications (i.e. Messaging)
I am looking for a proper tutorial on how to write, let's say, a messaging app; in other words, the kind of app where the user doesn't have to be running it to get messages. I would like to build that type of structured app. I realize that push notifications appear to be the way to go, but at this point I still can't find a decent tutorial that seems to cover all the bases. Thank you.
Replies: 1 · Boosts: 0 · Views: 646 · Activity: Nov ’22
This forum needs a delete option (question deleted)
Sorry, my question was idiotic, and due to my bad typing skills (not the first time). So I am erasing it, since I can't delete it. Again, sorry.
Replies: 0 · Boosts: 0 · Views: 493 · Activity: Dec ’22
Why can't I get more info on why SFSpeechRecognizer won't read my recorded audio files?
Updated info below.

Full disclosure: I do have this question over on StackOverflow, but I am at a standstill until I find a way to move forward, debug, etc. I am trying to recognize prerecorded speech in Swift. Essentially it either detects no speech, detects blank speech, or works on the one prerecorded file where I screamed a few words. I can't tell where the headache lies and can't figure out if there's a more detailed way to debug this; I can't find any properties that give more detailed info. Someone on SO did recommend I go through Apple's demo, here. That works just fine, and my code is very similar to it. So the question remains whether something about the way I save my audio files, or something else, is leading to my headaches. If anyone has any insight into this I would very much appreciate any hints. My question over on StackOverflow.

Updated info below, and new code.

Updated info

It appears that I was calling SFSpeechURLRecognitionRequest too often, and before the first request had completed. Perhaps I need to create a new instance of SFSpeechRecognizer? Unsure. Regardless, I quickly/sloppily adjusted the code to only run once the previous request returned its results. The results were much better, except one audio file still came up with no results. Not an error, just no text. This file is the same as the previous file, in that I took an audio recording and split it in two, so the formats and volumes are the same. So I still need a better way to debug this, to find out what is going wrong with that file.

The code where I grab the files and attempt to read them:

func findAudioFiles() {
    let fm = FileManager.default
    var aFiles: URL
    print("\(urlPath)")
    do {
        let items = try fm.contentsOfDirectory(atPath: documentsPath)
        let filteredInterestArray1 = items.filter({ $0.hasSuffix(".m4a") })
        let filteredInterestArray2 = filteredInterestArray1.filter({ $0.contains("SS-X-") })
        let sortedItems = filteredInterestArray2.sorted()
        for item in sortedItems {
            audioFiles.append(item)
        }
        NotificationCenter.default.post(name: Notification.Name("goAndRead"), object: nil, userInfo: myDic)
    } catch {
        print("\(error)")
    }
}

@objc func goAndRead() {
    audioIndex += 1
    if audioIndex != audioFiles.count {
        let fileURL = NSURL.fileURL(withPath: documentsPath + "/" + audioFiles[audioIndex], isDirectory: false)
        transcribeAudio(url: fileURL, item: audioFiles[audioIndex])
    }
}

func requestTranscribePermissions() {
    SFSpeechRecognizer.requestAuthorization { [unowned self] authStatus in
        DispatchQueue.main.async {
            if authStatus == .authorized {
                print("Good to go!")
            } else {
                print("Transcription permission was declined.")
            }
        }
    }
}

func transcribeAudio(url: URL, item: String) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else { return }
    let request = SFSpeechURLRecognitionRequest(url: url)
    if !recognizer.supportsOnDeviceRecognition { print("offline not available"); return }
    if !recognizer.isAvailable { print("not available"); return }
    request.requiresOnDeviceRecognition = true
    request.shouldReportPartialResults = true
    recognizer.recognitionTask(with: request) { (result, error) in
        guard let result = result else {
            print("\(item) : There was an error: \(error.debugDescription)")
            return
        }
        if result.isFinal {
            print("\(item) : \(result.bestTranscription.formattedString)")
            NotificationCenter.default.post(name: Notification.Name("goAndRead"), object: nil, userInfo: self.myDic)
        }
    }
}
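One thing that might surface more detail (a standalone sketch, names are mine): bridging the callback's error to NSError exposes the domain, code, and userInfo, which usually says more than error.debugDescription alone.

import Speech

// Sketch of a debugging helper: report the full NSError details for a recognition request.
func debugTranscribe(url: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else { return }
    let request = SFSpeechURLRecognitionRequest(url: url)
    recognizer.recognitionTask(with: request) { result, error in
        if let nsError = error as NSError? {
            // Domain, code, and userInfo usually carry more detail than debugDescription.
            print("error domain=\(nsError.domain) code=\(nsError.code) userInfo=\(nsError.userInfo)")
        }
        if let result = result, result.isFinal {
            print("final transcription: \(result.bestTranscription.formattedString)")
        }
    }
}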
Replies: 0 · Boosts: 0 · Views: 944 · Activity: Dec ’22
How can I open an audio file into a buffer so that I can read pieces of said buffer?
I would like to open an audio file on my iOS device and remove long silences. I already have the code for calculating volumes, so I am not pasting that here. What I am unsure of is this: while I believe I have the proper code to read the file (below), I am unsure how to read it in proper pieces so I can later get the volume of each piece. I realize that this might be a matter of calculating the size of frames and whatnot, but I am totally green when it comes to audio. I would seriously appreciate any guidance.

guard let input = try? AVAudioFile(forReading: url) else { return nil }
guard let buffer = AVAudioPCMBuffer(pcmFormat: input.processingFormat, frameCapacity: AVAudioFrameCount(input.length)) else { return nil }
do {
    try input.read(into: buffer)
} catch {
    return nil
}
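A sketch of the chunked-read pattern I am describing (the chunk size and names are arbitrary choices of mine): read the file repeatedly into a smaller buffer until framePosition reaches the end of the file, and measure each chunk as it comes in.

import AVFoundation

// Read an audio file in fixed-size chunks; chunkFrames is an arbitrary size here.
func readInChunks(url: URL, chunkFrames: AVAudioFrameCount = 4096) throws {
    let input = try AVAudioFile(forReading: url)
    guard let chunk = AVAudioPCMBuffer(pcmFormat: input.processingFormat, frameCapacity: chunkFrames) else { return }
    while input.framePosition < input.length {
        try input.read(into: chunk, frameCount: chunkFrames)   // reads fewer frames near the end of the file
        if chunk.frameLength == 0 { break }
        // chunk.frameLength frames are now available for volume analysis of this piece.
    }
}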
Replies: 2 · Boosts: 0 · Views: 932 · Activity: Dec ’22
Can I determine the time length of an AVAudioPCMBuffer's individual frame?
I am looping through an audio file; below is my very simple code. I am looping through 400 frames each time, but I picked 400 as a random number. I would prefer to read by time instead, let's say a quarter of a second. So I was wondering: how can I determine the time length of each frame in the audio file? I am assuming that determining this might differ based on audio format. I know almost nothing about audio.

var myAudioBuffer = AVAudioPCMBuffer(pcmFormat: input.processingFormat, frameCapacity: 400)!
guard var buffer = AVAudioPCMBuffer(pcmFormat: input.processingFormat, frameCapacity: AVAudioFrameCount(input.length)) else { return nil }

while (input.framePosition < input.length - 1) {
    let fcIndex = (input.length - input.framePosition > 400) ? 400 : input.length - input.framePosition
    try? input.read(into: myAudioBuffer, frameCount: AVAudioFrameCount(fcIndex))
    let volUme = getVolume(from: myAudioBuffer, bufferSize: myAudioBuffer.frameLength)
    ...manipulation code
}
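A sketch of the time-to-frames relationship as I understand it (using the input file from the code above): the format's sampleRate is the number of frames per second, so a duration converts to a frame count by multiplication, and a single frame lasts 1 / sampleRate seconds.

// sampleRate is frames per second, so duration (seconds) * sampleRate = number of frames.
let sampleRate = input.processingFormat.sampleRate                // e.g. 44100.0 for 44.1 kHz audio
let framesPerQuarterSecond = AVAudioFrameCount(sampleRate * 0.25)
let secondsPerFrame = 1.0 / sampleRate                            // time length of a single frame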
Replies: 1 · Boosts: 0 · Views: 1.6k · Activity: Dec ’22
Does AVAudioPCMBuffer have a "partial copy" method?
I have this long, kludgy bit of code that works. I've outlined it below so as not to be confusing, as I have no error within my code. I just need to know if there's a method that already exists to copy a specific part of an AVAudioPCMBuffer to a new AVAudioPCMBuffer. So if I have an AVAudioPCMBuffer of 10,000 frames, and I just want frames 500 through 3,000 copied into a new buffer (formatting and all) without altering the old buffer... is there a method to do this?

My code detects silent moments in an audio recording:
- I currently read an audio file into an AVAudioPCMBuffer (audBuffOne).
- I loop through the buffer and detect the starts and ends of silence.
- I record their positions in an array. This array holds the frame position where I detect voice starting (A) and the frame position where the voice ends (B), with some padding of course.
... new loop ...
- I loop through my array to go through each A and B frame position.
- Using the sample size from the audio file's formatting info, I create a new AVAudioPCMBuffer (audBuffTwo) large enough to hold from A to B and having the same formatting as audBuffOne.
- I go back to audBuffOne, set the framePosition on the audio file to A, and read into audBuffTwo for the proper length to reach frame B.
- Save audBuffTwo to a new file.
... keep looping
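In case it is useful for comparison, here is a sketch of copying a frame range by hand (the function name is mine, and it assumes a deinterleaved Float32 PCM buffer; interleaved or integer formats would need different handling):

import AVFoundation

// Sketch: copy frames [start, start + count) into a new buffer of the same format,
// leaving the source buffer untouched. Assumes deinterleaved Float32 data.
func copyFrames(from source: AVAudioPCMBuffer, start: AVAudioFramePosition, count: AVAudioFrameCount) -> AVAudioPCMBuffer? {
    guard let dest = AVAudioPCMBuffer(pcmFormat: source.format, frameCapacity: count),
          let srcChannels = source.floatChannelData,
          let dstChannels = dest.floatChannelData else { return nil }
    for channel in 0..<Int(source.format.channelCount) {
        // Copy `count` samples per channel, starting at the requested frame offset.
        memcpy(dstChannels[channel], srcChannels[channel] + Int(start), Int(count) * MemoryLayout<Float>.size)
    }
    dest.frameLength = count
    return dest
}

With the numbers above, frames 500 through 3,000 of audBuffOne would be something like copyFrames(from: audBuffOne, start: 500, count: 2500).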
Replies: 1 · Boosts: 1 · Views: 786 · Activity: Jan ’23
Why do I get a "Publishing changes from within view updates is not allowed" when moving my @Bindings to @Published in an @ObservableObject?
I am going through a SwiftUI course, so the code is not my own. When I migrated my @Bindings into @Published items in an @ObservableObject I started getting the following warning: Publishing changes from within view updates is not allowed, this will cause undefined behavior. The warning occurs in the ScannerView, which is integrated with the main view, BarcodeScannerView. It occurs when an error arises and scannerView.alertItem is set to a value. However, it does not occur when I am setting the value of scannerView.scannedCode, and as far as I can tell, they both come from the same place and are the same kind of action. There are tons of posts like mine, but I have yet to find an answer. Any thoughts or comments would be very appreciated.

BarcodeScannerView

import SwiftUI

struct BarcodeScannerView: View {
    @StateObject var viewModel = BarcodeScannerViewModel()

    var body: some View {
        NavigationStack {
            VStack {
                ScannerView(scannedCode: $viewModel.scannedCode,
                            typeScanned: $viewModel.typeScanned,
                            alertItem: $viewModel.alertItem)
                    .frame(maxWidth: .infinity, maxHeight: 300)

                Spacer().frame(height: 60)

                BarcodeView(statusText: viewModel.typeScanned)

                TextView(statusText: viewModel.statusText, statusTextColor: viewModel.statusTextColor)
            }
            .navigationTitle("Barcode Scanner")
            .alert(item: $viewModel.alertItem) { alertItem in
                Alert(title: Text(alertItem.title),
                      message: Text(alertItem.message),
                      dismissButton: alertItem.dismissButton)
            }
        }
    }
}

BarcodeScannerViewModel

import SwiftUI

final class BarcodeScannerViewModel: ObservableObject {
    @Published var scannedCode = ""
    @Published var typeScanned = "Scanned Barcode"
    @Published var alertItem: AlertItem?

    var statusText: String {
        return scannedCode.isEmpty ? "Not Yet scanned" : scannedCode
    }

    var statusTextColor: Color {
        scannedCode.isEmpty ? .red : .green
    }
}

ScannerView

import SwiftUI

struct ScannerView: UIViewControllerRepresentable {
    typealias UIViewControllerType = ScannerVC

    @Binding var scannedCode: String
    @Binding var typeScanned: String
    @Binding var alertItem: AlertItem?

    func makeCoordinator() -> Coordinator {
        Coordinator(scannerView: self)
    }

    func makeUIViewController(context: Context) -> ScannerVC {
        ScannerVC(scannerDelegate: context.coordinator)
    }

    func updateUIViewController(_ uiViewController: ScannerVC, context: Context) { }

    final class Coordinator: NSObject, ScannerVCDelegate {
        private let scannerView: ScannerView

        init(scannerView: ScannerView) {
            self.scannerView = scannerView
        }

        func didFind(barcode: String, typeScanned: String) {
            scannerView.scannedCode = barcode
            scannerView.typeScanned = typeScanned
            print(barcode)
        }

        func didSurface(error: CameraError) {
            switch error {
            case .invalidDeviceinput:
                scannerView.alertItem = AlertContext.invalidDeviceInput
            case .invalidScannedValue:
                scannerView.alertItem = AlertContext.invalidScannedValue
            case .invalidPreviewLayer:
                scannerView.alertItem = AlertContext.invalidPreviewLayer
            case .invalidStringObject:
                scannerView.alertItem = AlertContext.invalidStringObject
            }
        }
    }
}
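For comparison, the workaround I keep seeing suggested for this warning (not necessarily the root cause here) is to defer the published write until the current view update has finished, for example by dispatching it asynchronously inside the coordinator:

func didSurface(error: CameraError) {
    // Defer the @Published write so it does not land inside the current view update pass.
    DispatchQueue.main.async {
        switch error {
        case .invalidDeviceinput:
            self.scannerView.alertItem = AlertContext.invalidDeviceInput
        case .invalidScannedValue:
            self.scannerView.alertItem = AlertContext.invalidScannedValue
        case .invalidPreviewLayer:
            self.scannerView.alertItem = AlertContext.invalidPreviewLayer
        case .invalidStringObject:
            self.scannerView.alertItem = AlertContext.invalidStringObject
        }
    }
}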
Replies: 4 · Boosts: 0 · Views: 5.1k · Activity: Mar ’23
iPhone won't connect to Xcode over WiFi
Hello, I am just starting to learn Xcode, and I can test the first chapter's app on my iPhone if it's connected via USB-C. The book walks me through the part where I allow Xcode to connect to the iPhone via WiFi, by just checking "Connect via Network." Yet Xcode cannot find my phone unless it's connected via USB. When I go to Devices, that checkbox stays checked, unless I unplug the phone, in which case the box doesn't even appear. And yes, they are both on the same WiFi; it's the only one I have and it's up and running. Any thoughts?
Replies: 6 · Boosts: 0 · Views: 21k · Activity: Jun ’23
Am lost using URLSessionDelegate within SwiftUI
I am trying to create an app in SwiftUI that uses HTTP for "gets", "posts", "forms", et al. I cannot assign URLSessionDataDelegate, URLSessionTaskDelegate, or URLSessionDelegate to a SwiftUI view. So my assumption is that I need to create a UIViewControllerRepresentable for a regular UIViewController and assign the delegates to that UIViewController. I just need to know if this is the correct path to proceed with, or is there some sort of tie-in that I cannot find using Google? Thank you.
Replies: 0 · Boosts: 0 · Views: 468 · Activity: Jun ’23
How can I keep a class declaration within a SwiftUI view persistent through the view's updates?
I have a class, MyURL, from a UIKit app that I wrote that handles all my URL needs: uploads, downloads, GETs, POSTs, etc. I would now like to use that class from within a SwiftUI view, so the SwiftUI view would call MyURL methods, and I would pass binding vars that would cause my SwiftUI view to update things such as status, percentage done, etc. My issue is: how can I properly declare that class, myURL, from within the SwiftUI view, so that it doesn't get redeclared every time the view updates? Should I declare the MyURL class in a view above it and pass it into the view that calls the MyURL class? I would just like to do it the proper way. I wish I was better with terminology. Thank you.

struct ContentView: View {
    var myURL = MyURL()
    @State var proceed = false

    var body: some View {
        Button {
            myURL.pressed(proceed: $proceed)
        } label: {
            Text(proceed ? "pressed" : "not pressed")
        }
    }
}

class MyURL: NSObject, URLSessionDataDelegate, URLSessionTaskDelegate, URLSessionDelegate, URLSessionDownloadDelegate {

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didFinishDownloadingTo location: URL) {
    }

    func pressed(proceed: Binding<Bool>) {
        print("\(proceed)")
        proceed.wrappedValue.toggle()
    }

    // Error received
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        if let err = error {
            DispatchQueue.main.async { [self] in
                //code
            }
        }
    }

    // Response received
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive response: URLResponse, completionHandler: (URLSession.ResponseDisposition) -> Void) {
        completionHandler(URLSession.ResponseDisposition.allow)
        if let httpResponse = response as? HTTPURLResponse {
            xFile?.httpResponse = httpResponse.statusCode
            DispatchQueue.main.async { [self] in
                if httpResponse.statusCode != 200 {
                    //code
                }
            }
        }
    }

    // Data received
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        if xFile?.httpResponse == 200 {
            DispatchQueue.main.async { [self] in
                //code
            }
        }
    }

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didWriteData bytesWritten: Int64, totalBytesWritten: Int64, totalBytesExpectedToWrite: Int64) {
        let num: Float = Float(totalBytesWritten * 100)
        let den: Float = Float(totalBytesExpectedToWrite * 100)
        let percentDownloaded: Int = Int(num / den * 100)
        DispatchQueue.main.async { [self] in
            //code
        }
    }
}
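A sketch of the ownership pattern I am asking about (assuming MyURL can be made an ObservableObject; this is a stripped-down stand-in, not the real class): marking it @StateObject in the view that owns it means SwiftUI creates it once and keeps the same instance across view updates, and @Published properties can drive status text, percentage done, etc.

import SwiftUI

// Stripped-down stand-in for MyURL; the real class keeps its URLSession delegate methods.
final class MyURL: NSObject, ObservableObject {
    @Published var percentDownloaded = 0      // published so the view can show progress

    func pressed(proceed: Binding<Bool>) {
        proceed.wrappedValue.toggle()
    }
}

struct ContentView: View {
    @StateObject private var myURL = MyURL()  // created once by SwiftUI; survives view updates
    @State private var proceed = false

    var body: some View {
        Button {
            myURL.pressed(proceed: $proceed)
        } label: {
            Text(proceed ? "pressed" : "not pressed")
        }
    }
}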
Replies: 0 · Boosts: 0 · Views: 459 · Activity: Jun ’23