Post

Replies

Boosts

Views

Activity

Do child view controllers inherit the frame of their parents?
Below is my code. I have the ViewController, which takes up the entire screen (sets the background color, makes sure the status bar is visible, etc.). It then calls up the MainController, which is constrained to the safe-area layout guide. That controller has a button that brings up a third view controller when tapped. Everything works: the ViewController covers the entire screen, the MainController rests within the safe areas of the iPhone X, and the third view controller comes up the same size and position as the MainController. It's that last part I want to be sure of: is that the way it is supposed to come up? Can I count on that, or must I set its frame myself to be sure?

ViewController

class ViewController: UIViewController {
    var mainController = MainController()

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        self.addChild(mainController)
        self.view.addSubview(mainController.view)
        setConstraints(vc: mainController, pc: view)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = bgColor
    }

    override var prefersStatusBarHidden: Bool { return false }
    override var preferredStatusBarStyle: UIStatusBarStyle { return .darkContent }
}

func setConstraints(vc: UIViewController, pc: UIView) {
    vc.view.translatesAutoresizingMaskIntoConstraints = false
    var constraints = [NSLayoutConstraint]()
    constraints.append(vc.view.leadingAnchor.constraint(equalTo: pc.safeAreaLayoutGuide.leadingAnchor))
    constraints.append(vc.view.trailingAnchor.constraint(equalTo: pc.safeAreaLayoutGuide.trailingAnchor))
    constraints.append(vc.view.bottomAnchor.constraint(equalTo: pc.safeAreaLayoutGuide.bottomAnchor))
    constraints.append(vc.view.topAnchor.constraint(equalTo: pc.safeAreaLayoutGuide.topAnchor))
    NSLayoutConstraint.activate(constraints)
}

MainController

class MainController: UIViewController {
    private lazy var countController = CountController()
    var invButton: MyButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .black
        // ...button code...
    }

    override var prefersStatusBarHidden: Bool { return false }

    @objc func buttonAction(sender: UIButton!) {
        guard let theButton = sender as? MyButton else { return }
        self.addChild(countController)
        self.view.addSubview(countController.view)
    }
}

ThirdViewController

class CountController: UIViewController {
    var backButton: MyButton!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .gray
    }
}
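For reference, a child view controller's view keeps whatever frame it had when it was added; nothing is inherited from the parent automatically. A minimal sketch of the full containment sequence, pinning the child's view explicitly so the size is guaranteed by constraints rather than assumed (class names are illustrative):

```swift
import UIKit

// Hypothetical sketch: embed a child controller and pin its view explicitly,
// so its size comes from constraints rather than from whatever frame it had.
final class ParentViewController: UIViewController {
    private let child = UIViewController()

    override func viewDidLoad() {
        super.viewDidLoad()
        addChild(child)                         // 1. establish the parent-child relationship
        view.addSubview(child.view)             // 2. put the child's view in the hierarchy
        child.view.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([           // 3. pin it; nothing is inherited otherwise
            child.view.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            child.view.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
            child.view.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
            child.view.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor),
        ])
        child.didMove(toParent: self)           // 4. tell the child the transition finished
    }
}
```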
2
0
1.4k
Nov ’21
How to properly destroy a child UIViewController?
I have my mainController (parent) and my menuController (child). I add the menuController with:

addChild(child)
view.addSubview(child.view)
child.didMove(toParent: self)

The child dismisses itself with:

self.dismiss(animated: true, completion: nil)

The question I have is: how do I clean up the child within the parent? Surely I have to do something?
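For a child added with the containment calls above, the teardown is the mirror image of the setup; a sketch, assuming the child was embedded with addChild rather than presented modally (in which case dismiss(animated:) alone is the right call):

```swift
import UIKit

// Sketch: removing an embedded child controller. The three calls mirror the
// addChild / addSubview / didMove(toParent:) sequence used when embedding it.
extension UIViewController {
    func removeEmbeddedChild(_ child: UIViewController) {
        child.willMove(toParent: nil)     // tell the child it is about to leave
        child.view.removeFromSuperview()  // take its view out of the hierarchy
        child.removeFromParent()          // sever the containment relationship
        // If nothing else retains `child`, it deallocates here; releasing the
        // parent's stored reference (e.g. setting it to nil) completes cleanup.
    }
}
```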
1
1
5.2k
Dec ’21
Why is superview returning nil?
At the very bottom is my code, where MainController creates a child called setController. The child is created when I tap the button. However, within setController's code I get back nil when I try to print the superview:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    print(self.view.superview?.description ?? "count->no parent")
}

I am adding the second controller's view as a subview, but obviously I am missing something. Have I misunderstood how the UIView hierarchy works?

class MainController: UIViewController {
    private lazy var setController = SetController()
    var invButton: MyButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .black
        invButton = makeButton(vControl: self, btype: ButtType.inv, action: #selector(self.buttonAction(sender:)))
        invButton.frame.origin.x = self.view.frame.width * 0.1
        invButton.frame.origin.y = self.view.frame.height * 0.1
        invButton.setTitle("Settings", for: .normal)
    }

    override var prefersStatusBarHidden: Bool { return false }

    @objc func buttonAction(sender: UIButton!) {
        guard let theButton = sender as? MyButton else { return }
        UIView.transition(with: self.view, duration: 0.5, options: .transitionCurlDown, animations: { [self] in
            self.addChild(setController)
            self.view.addSubview(setController.view)
            setController.didMove(toParent: self)
        }, completion: nil)
    }
}
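One possible explanation (an assumption about the timing): viewWillAppear fires when the view is *about to* enter the hierarchy, which can be before the superview link is usable; by viewDidAppear or didMove(toParent:) the view is fully attached. A sketch that checks the superview at points where it is guaranteed to be set (class name is illustrative):

```swift
import UIKit

// Sketch: check the superview after the view is actually in the hierarchy.
// viewWillAppear runs too early for this; these two callbacks run after.
final class ChildController: UIViewController {
    override func didMove(toParent parent: UIViewController?) {
        super.didMove(toParent: parent)
        // The parent calls child.didMove(toParent:) after addSubview,
        // so the superview is available here.
        print(view.superview?.description ?? "still no superview")
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        print(view.superview?.description ?? "still no superview")
    }
}
```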
1
0
1.8k
Dec ’21
Is this the proper order of commands to dismiss a UIView?
I've learned the hard way that the specific commands to add a child UIViewController must be in a certain order, especially if I am bringing in the child using animation. So I'd like to confirm that the order I use to delete a child is correct. Yes, what I have below works. But that doesn't mean it's correct. Thank you.

UIView.transition(with: parent, duration: 0.5, options: .transitionCurlUp, animations: { [self] in
    self.willMove(toParent: nil)
    self.removeFromParent()
    self.view.removeFromSuperview()
    self.dismiss(animated: false, completion: nil)
}, completion: nil)
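For comparison, the conventional removal order for an embedded child puts removeFromSuperview before removeFromParent, and a dismiss call is only needed when the controller was *presented* modally rather than embedded; a sketch (hedged, since the other order may also appear to work):

```swift
import UIKit

// Sketch of the conventional removal order for an *embedded* child:
// notify first, detach the view, then sever the containment link.
func removeChild(_ child: UIViewController) {
    child.willMove(toParent: nil)      // 1. counterpart of didMove(toParent:)
    child.view.removeFromSuperview()   // 2. counterpart of addSubview
    child.removeFromParent()           // 3. counterpart of addChild
    // No dismiss(animated:) here: dismiss applies to modally presented
    // controllers, not to children added with addChild.
}
```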
1
0
335
Dec ’21
Why doesn't a dismissed child UIView reappear properly after its first appearance?
Thanks to people on this board I am able to successfully call up a child UIViewController via animation. This is the buttonAction from the main UIViewController, which calls up setController:

@objc func buttonAction(sender: UIButton!) {
    guard let theButton = sender as? MyButton else { return }
    UIView.transition(with: self.view, duration: 0.5, options: .transitionCurlDown, animations: { [self] in
        self.addChild(setController)
        self.view.addSubview(setController.view)
    }, completion: { [self] _ in
        setController.didMove(toParent: self)
        setController.doLayout()
    })
}

The doLayout method lives within the child:

func doLayout() {
    guard let parent = cView!.view.superview else { return }
    // make sure the view honors the safe-area layout guides
    setConstraints(vc: self, pc: parent)
}

A button within the child, setController, dismisses it:

@objc func buttonAction(sender: UIButton!) {
    self.willMove(toParent: nil)
    self.removeFromParent()
    self.view.removeFromSuperview()
    self.dismiss(animated: false, completion: nil)
}

Everything works great the first time I call up the child view. It curls down while covering the first/parent view, etc. (Figure 1). But after I dismiss the child view and call it again, the child view scrolls down without really covering the main view; it's like a mishmash (Figure 2). Only after all is said and done does the child view cover everything. So I am curious whether I am dismissing something incorrectly.
3
0
791
Dec ’21
Need help understanding this syntax
Copied the code below from a tutorial, and I mostly understand what is going on, but I'd like to be able to fully read it. What I do get is that handler can be nil, but here it's the code to run when the UIAlertAction completes. What I am unsure about is what "(_) in" means. I have also sometimes seen it written as "[self] in". Thank you.

controller.addAction(UIAlertAction(title: "OK", style: .default, handler: { (_) in
    controller.dismiss(animated: true, completion: nil)
}))
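For reference, both pieces are ordinary Swift closure syntax: the parenthesized list before `in` declares the closure's parameters (`_` discards one), while square brackets form a capture list (`[self]` captures self explicitly; `[weak self]` would capture it weakly). A standalone sketch without UIKit:

```swift
// The part before `in` declares the closure's parameters; `_` means
// "there is a parameter here, but I ignore it".
let ignoreInput: (Int) -> String = { (_) in
    return "called"
}

// A capture list in square brackets controls how outer values are captured.
final class Greeter {
    let name = "world"
    func makeClosure() -> () -> String {
        // `[self]` captures self explicitly (and strongly), so `name` can be
        // used inside without an implicit-capture warning.
        return { [self] in "hello \(self.name)" }
    }
}

print(ignoreInput(7))              // the Int argument is simply discarded
print(Greeter().makeClosure()())   // runs the closure with its captured self
```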
1
0
447
Dec ’21
How to reset speechRecognizer
Building a very simple voice-to-text app, which I got from an online demo. What I can't seem to find is how to reset the response back to nil. This demo just keeps transcribing from the very beginning until it finally stalls. While I don't know if the stall is related to my question, I still need to find out how to code "OK, got the first 100 words. Reset the response text to nil. Continue."

func startSpeechRecognition() {
    let node = audioEngine.inputNode
    let recordingFormat = node.outputFormat(forBus: 0)
    node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat, block: { (buffer, _) in
        self.request.append(buffer)
    })
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        alertView(message: "audioEngine start error")
    }
    guard let myRecognition = SFSpeechRecognizer() else {
        self.alertView(message: "Recognition is not on your phone")
        return
    }
    if !myRecognition.isAvailable {
        self.alertView(message: "recognition is not available right now")
    }
    task = speechRecognizer?.recognitionTask(with: request, resultHandler: { (response, error) in
        guard let response = response else {
            if error != nil {
                self.alertView(message: error!.localizedDescription.debugDescription)
            } else {
                self.alertView(message: "Unknown error in creating task")
            }
            return
        }
        let message = response.bestTranscription.formattedString
        self.label.text = message
    })
}
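The framework offers no call to truncate a running transcription; the usual workaround (an assumption based on its request/task model) is to end the current request and start a fresh one, so bestTranscription begins again from empty. A sketch, with names mirroring the ones in the question:

```swift
import Speech
import AVFoundation

// Sketch: restart recognition with a fresh request so the transcription
// starts over from empty instead of accumulating forever.
final class Transcriber {
    let audioEngine = AVAudioEngine()
    let speechRecognizer = SFSpeechRecognizer()
    var request = SFSpeechAudioBufferRecognitionRequest()
    var task: SFSpeechRecognitionTask?

    func restartRecognition() {
        task?.cancel()                                    // stop the old task
        request.endAudio()                                // close out the old request
        request = SFSpeechAudioBufferRecognitionRequest() // fresh request => empty transcription
        task = speechRecognizer?.recognitionTask(with: request) { response, _ in
            if let response = response {
                // bestTranscription now reflects only audio appended
                // to the *new* request.
                print(response.bestTranscription.formattedString)
            }
        }
    }
}
```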
0
1
873
Dec ’21
Looking for in depth tutorial on SFSpeechRecognizer
It's a great tool from Apple, but I want to delve more into its engine as I need to, and the documentation doesn't seem to go there. For instance, I can't figure out how to clear the bestTranscription object in speechRecognizer, as it always contains the entire transcription. There are other things I would like to work with as well. Has anyone worked with this heavily enough to recommend proper books or paid tutorials? Many thanks
0
0
590
Dec ’21
Does Swift 5.x offer any callback functions in AVAudioRecorder or must I use a callback timer?
Am using the demo code below to flesh out an audio recording app in Swift 5.x. I would like to monitor certain aspects of the AVAudioRecorder as it is recording, such as frequency, power, volume, etc., but in real time. I found an example in Swift 3 where the user sets up a callback timer for 0.5 sec. I was wondering if this is still the case, or whether in the latest version there might be a callback function in AVAudioEngine that gets called at a regular frequency?

do {
    audioRecorder = try AVAudioRecorder(url: audioFilename!, settings: settings)
    audioRecorder.delegate = self
    audioRecorder.record()
    recordButton.setTitle("Tap to Stop", for: .normal)
} catch {
    finishRecording(success: false)
}
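As far as I know, AVAudioRecorder still exposes only polling-style metering (isMeteringEnabled plus updateMeters() driven by a timer), while a push-style callback is what an AVAudioEngine tap provides: it hands you buffers as they arrive. A sketch of both patterns, with illustrative names:

```swift
import AVFoundation

// Sketch 1: polling AVAudioRecorder's meters with a timer, the same pattern
// as the Swift 3 example, just in current syntax.
func startMetering(recorder: AVAudioRecorder) -> Timer {
    recorder.isMeteringEnabled = true
    return Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
        recorder.updateMeters()
        let power = recorder.averagePower(forChannel: 0)  // in dBFS; 0 is full scale
        print("average power: \(power) dB")
    }
}

// Sketch 2: a push-style alternative; the engine calls this block whenever
// a buffer is ready, so no timer is needed.
func startTapMetering(engine: AVAudioEngine) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let samples = buffer.floatChannelData?[0], buffer.frameLength > 0 else { return }
        let n = Int(buffer.frameLength)
        let sumOfSquares = (0..<n).reduce(Float(0)) { $0 + samples[$1] * samples[$1] }
        let rms = (sumOfSquares / Float(n)).squareRoot()  // crude level estimate
        print("RMS level: \(rms)")
    }
    engine.prepare()
    try engine.start()
}
```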
1
0
888
Dec ’21
Is .installTap the equivalent of a C callback function?
Expanding a speech-to-text demo, and while it works, I am still trying to learn Swift. Is .installTap the Swift version of a C callback function? From what I interpret here, every time the buffer becomes full, the code between the braces runs, and the code below it also runs. It almost feels like a callback combined with a GOTO line from BASIC. Yes, it works, but I'd like to confirm that I am getting the flow of the code correctly.

func startSpeechRecognition() {
    let node = audioEngine.inputNode
    let recordingFormat = node.outputFormat(forBus: 0)
    node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, _) in
        self.request.append(buffer)
    }
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        // ...
    }
    guard let myRecognition = SFSpeechRecognizer() else {
        // ...
        return
    }
    if !myRecognition.isAvailable {
        // ...
    }
    task = speechRecognizer?.recognitionTask(with: request, resultHandler: { (response, error) in
        guard let response = response else {
            if error != nil {
                print("\(String(describing: error.debugDescription))")
            } else {
                print("problem in response")
            }
            return
        }
        let message = response.bestTranscription.formattedString
        print("\(message)")
    })
}
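The mental model is close: installTap stores the closure and invokes it repeatedly as audio arrives, much as C code stores and later invokes a function pointer; the surrounding function returns immediately and normal flow continues, so there is no GOTO-style jump back. A UIKit-free sketch of the same shape, with illustrative names:

```swift
// Sketch: a closure stored and called later behaves like a C callback.
// `installTap`-style APIs register the block; they do not run it inline.
final class FakeAudioSource {
    private var handler: ((Int) -> Void)?

    func installTap(_ block: @escaping (Int) -> Void) {
        handler = block            // registration only; nothing runs yet
    }

    func deliverBuffers(_ buffers: [Int]) {
        for b in buffers { handler?(b) }  // "audio arrives": invoke once per buffer
    }
}

let source = FakeAudioSource()
source.installTap { buffer in print("got buffer \(buffer)") }
print("installTap returned; the code below it keeps running")  // normal flow, no jump
source.deliverBuffers([1, 2, 3])                               // callback fires three times
```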
5
0
1.5k
Dec ’21
How to convert node.outputFormat to settings for AVAudioFile
Am trying to go from the installTap straight to AVAudioFile(forWriting:). I call:

let recordingFormat = node.outputFormat(forBus: 0)

and I get back:

<AVAudioFormat 0x60000278f750: 1 ch, 48000 Hz, Float32>

But AVAudioFile has a settings parameter of type [String : Any], and am curious how to turn those values into the required recording format. Hopefully these are the values I need?
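AVAudioFormat already carries a dictionary form of itself: its settings property returns the [String : Any] that AVAudioFile(forWriting:settings:) expects, so no manual translation should be needed. A sketch (the file name and .caf container are illustrative choices):

```swift
import AVFoundation

// Sketch: reuse the tap format's own `settings` dictionary when creating
// the output file, so the two formats cannot drift apart.
func makeOutputFile(engine: AVAudioEngine) throws -> AVAudioFile {
    let node = engine.inputNode
    let recordingFormat = node.outputFormat(forBus: 0)   // e.g. 1 ch, 48000 Hz, Float32
    let url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("capture.caf")           // illustrative name
    // recordingFormat.settings is the [String : Any] AVAudioFile wants:
    // sample rate, channel count, bit depth, format ID, and so on.
    return try AVAudioFile(forWriting: url, settings: recordingFormat.settings)
}
```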
0
0
462
Dec ’21
My writing the installTap buffer to an AVAudioFile seems to fail data-wise, or at the end
I am trying to save the buffer from my installTap to a file. I do it in chunks of 10 so I'll get a bigger file. When I try to play the written file (from the simulator's directory), QuickTime says that it's not compatible. I have examined the bad m4a file and a working one. There are a lot of zeros at the beginning of the bad file, followed by a lot of data; however, both files appear to have the same header. A lot of people mention that I have to nil the audio file, but audioFile = nil is not valid syntax here, nor can I find a close method on AVAudioFile. Here's the complete code, edited into one working file:

import UIKit
import AVFoundation

class ViewController: UIViewController {
    let audioEngine = AVAudioEngine()
    var audioFile = AVAudioFile()
    var x = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        record()
    }

    func makeFile(format: AVAudioFormat) {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first
        do {
            _ = try FileManager.default.contentsOfDirectory(at: paths!, includingPropertiesForKeys: nil)
        } catch { print("error") }
        let destinationPath = paths!.appendingPathComponent("audioT.m4a")
        print("\(destinationPath)")
        do {
            audioFile = try AVAudioFile(forWriting: destinationPath, settings: format.settings)
            print("file created")
        } catch { print("error creating file") }
    }

    func record() {
        let node = audioEngine.inputNode
        let recordingFormat = node.inputFormat(forBus: 0)
        makeFile(format: recordingFormat)
        node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
            do {
                try audioFile.write(from: buffer)
                print("buffer filled")
                x += 1
                print("wrote \(x)")
                if x > 9 { endThis() }
            } catch { return }
        })
        audioEngine.prepare()
        do {
            try audioEngine.start()
        } catch let error {
            print("oh catch")
        }
    }

    func endThis() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
    }
}
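The reason audioFile = nil does not compile is that the property is declared as a non-optional AVAudioFile. AVAudioFile has no close method; it finalizes the file when the object deallocates, so the usual fix (an assumption about why the m4a is broken) is to make the property optional and nil it out when recording ends:

```swift
import AVFoundation

// Sketch: declare the file as optional so the last strong reference can be
// dropped; AVAudioFile finishes writing out the file when it deallocates.
final class Recorder {
    let audioEngine = AVAudioEngine()
    var audioFile: AVAudioFile?   // optional, so `= nil` is legal

    func endRecording() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        audioFile = nil           // releases the file; writing is finalized here
    }
}
```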
1
0
2.1k
Jan ’22
How can I decipher an M4A file?
I have a Swift app that records audio in chunks of multiple files; each M4A file is approx one minute long. I would like to go through those files and detect silence, or the lowest level. While I am able to read a file into a buffer, my problem is deciphering it. Even with Google, all that comes up is "audio players" instead of sites that describe the header and the data. Where can I find what to look for? Or should I be reading it into a WAV file instead? But even then I cannot seem to find a tool, or a site, that tells me how to decipher what I am reading. Obviously it exists, since Siri knows when you've stopped speaking. Just trying to find the key.
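Rather than parsing the M4A container by hand, AVAudioFile can decode it into PCM buffers, after which silence detection is just a level computation over samples. A sketch (the threshold and window size are illustrative guesses, not tuned values):

```swift
import AVFoundation

// Sketch: decode an M4A into PCM with AVAudioFile and flag windows whose
// RMS level falls below an (illustrative) silence threshold.
func silentWindows(in url: URL, threshold: Float = 0.01) throws -> [AVAudioFramePosition] {
    let file = try AVAudioFile(forReading: url)         // decodes the container for us
    let format = file.processingFormat                  // uncompressed PCM
    let windowFrames: AVAudioFrameCount = 4096          // illustrative window size
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: windowFrames)
    else { return [] }

    var silentStarts: [AVAudioFramePosition] = []
    while file.framePosition < file.length {
        let start = file.framePosition
        try file.read(into: buffer, frameCount: windowFrames)
        guard let samples = buffer.floatChannelData?[0], buffer.frameLength > 0 else { break }
        let n = Int(buffer.frameLength)
        let sumOfSquares = (0..<n).reduce(Float(0)) { $0 + samples[$1] * samples[$1] }
        let rms = (sumOfSquares / Float(n)).squareRoot()
        if rms < threshold { silentStarts.append(start) }  // candidate silence window
    }
    return silentStarts
}
```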
0
0
680
Jan ’22
Is there a link to print out the full documentation for AVAudioEngine?
I’m trying to do something really complex with audio streams, i.e. process the stream live, edit it, and then save it in snippets, all while the user is still speaking. I’m a book person, and reading hardcopy documentation is much easier for me.
0
0
556
Dec ’21
Is there a way to find gaps, or silences in audio files?
Hello, am starting to work with/learn the AVAudioEngine. Currently am at the point where I would like to be able to read an audio file of a speech and determine if there are any moments of silence in it. Does this framework provide any such properties, such as power level, decibels, etc., that I can use to find long enough moments of silence?
1
0
1.4k
Jan ’22