I'm trying to go straight from installTap to AVAudioFile(forWriting:settings:).
I call:
let recordingFormat = node.outputFormat(forBus: 0)
and I get back:
<AVAudioFormat 0x60000278f750: 1 ch, 48000 Hz, Float32>
But AVAudioFile takes a settings parameter of type [String : Any], and I'm unsure how to translate those format values into the dictionary needed to record in the required format.
Hopefully these are the values I need?
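A sketch of two ways to build that [String : Any] (the file URL here is a placeholder, and the explicit dictionary assumes the printed "1 ch, 48000 Hz, Float32" format). AVAudioFormat already exposes a matching `settings` dictionary, which is usually the simplest route:

```swift
import AVFoundation

let node = AVAudioEngine().inputNode
let recordingFormat = node.outputFormat(forBus: 0)
let url = FileManager.default.temporaryDirectory.appendingPathComponent("tap.caf")

// Option 1: reuse the format's own settings dictionary.
let file = try? AVAudioFile(forWriting: url, settings: recordingFormat.settings)

// Option 2: spell the values out by hand to match "1 ch, 48000 Hz, Float32".
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 48_000.0,
    AVNumberOfChannelsKey: 1,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsFloatKey: true
]
```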
Hello,
I'm starting to work with and learn AVAudioEngine.
Currently I'm at the point where I would like to be able to read an audio file of a speech and determine whether there are any moments of silence in it.
Does this framework provide any properties, such as power level, decibels, etc., that I can use to find sufficiently long moments of silence?
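One common approach (a sketch, not an AVAudioEngine built-in: the framework hands you raw PCM buffers, and the metering math is up to you) is to compute each buffer's RMS level in decibels and treat a long run of quiet buffers as silence. This assumes mono Float32 PCM, as in a typical input tap:

```swift
import AVFoundation

// Returns the average power of a buffer in dBFS (0 dB = full scale).
// A stretch of buffers below some threshold (e.g. -50 dB) can be
// counted as a moment of silence.
func averagePower(of buffer: AVAudioPCMBuffer) -> Float {
    guard let samples = buffer.floatChannelData?[0] else { return -160.0 }
    let n = Int(buffer.frameLength)
    guard n > 0 else { return -160.0 }
    var sumOfSquares: Float = 0
    for i in 0..<n { sumOfSquares += samples[i] * samples[i] }
    let rms = sqrt(sumOfSquares / Float(n))
    return 20 * log10(max(rms, .leastNonzeroMagnitude))
}
```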
I have a Swift app that records audio in chunks across multiple files; each M4A file is approximately 1 minute long. I would like to go through those files and detect silence, or the lowest level.
While I am able to read a file into a buffer, my problem is deciphering it. Even with Google, all I find is "audio players" instead of sites that describe the header and the data.
Where can I find what to look for? Or should I be converting to a WAV file first? Even then, I cannot seem to find a tool, or a site, that explains how to decipher what I am reading.
Obviously it's possible, since Siri knows when you've stopped speaking. Just trying to find the key.
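For what it's worth, a sketch of reading an M4A without parsing any headers by hand (the path is a placeholder): AVAudioFile decodes the container for you, so reading into the file's processingFormat yields plain PCM samples you can scan for low-level stretches.

```swift
import AVFoundation

let url = URL(fileURLWithPath: "/path/to/chunk.m4a")  // placeholder path
do {
    let file = try AVAudioFile(forReading: url)
    let format = file.processingFormat               // decompressed PCM
    let frames = AVAudioFrameCount(file.length)
    if let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) {
        try file.read(into: buffer)
        // buffer.floatChannelData?[0] now holds raw Float32 samples,
        // ready for an RMS/decibel scan -- no header deciphering needed.
    }
} catch {
    print("read failed: \(error)")
}
```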
I'm working on a recording app from scratch and it just has the basics. In my Info.plist I do set Privacy - Microphone Usage Description.
Still, I always want to check the privacy permission for the microphone, because I know people can hit "No" by accident.
However, whatever I try, the app keeps running without waiting for the iOS permission alert to pop up and complete.
let mediaType = AVMediaType.audio
let mediaAuthorizationStatus = AVCaptureDevice.authorizationStatus(for: mediaType)
switch mediaAuthorizationStatus {
case .denied:
    print(".denied")
case .authorized:
    print("authorized")
case .restricted:
    print("restricted")
case .notDetermined:
    print("huh?")
    let myQue = DispatchQueue(label: "get perm")
    myQue.sync {
        AVCaptureDevice.requestAccess(for: .audio, completionHandler: { (granted: Bool) in
            if granted {
            } else {
            }
        })
    }
default:
    print("not a clue")
}
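One thing worth noting about the snippet above: requestAccess(for:) returns immediately and runs its completion handler later, so dispatching it on a sync queue does not block until the alert is answered. A sketch of deferring the work into the completion handler instead (`startRecording` is a placeholder for the app's own method):

```swift
import AVFoundation

func checkMicPermission(then startRecording: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        startRecording()
    case .notDetermined:
        // The alert is shown here; nothing runs until the user answers.
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            DispatchQueue.main.async {
                if granted { startRecording() }
            }
        }
    default:
        print("microphone access denied or restricted")
    }
}
```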
I'm working on a recording app. I started from scratch and basically jump right into recording. I made sure to add the Privacy - Microphone Usage Description string.
What strikes me as odd is that the app launches straight into recording. No alert comes up the first time asking the user for permission, which I thought was the norm.
Have I misunderstood something?
override func viewDidLoad() {
    super.viewDidLoad()
    record3()
}

func record3() {
    print("recording")
    let node = audioEngine.inputNode
    let recordingFormat = node.inputFormat(forBus: 0)
    var silencish = 0
    var wordsish = 0
    makeFile(format: recordingFormat)
    node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
        do {
            try audioFile!.write(from: buffer)
            x += 1
            if x > 300 {
                print("it's over sergio")
                endThis()
            }
        } catch { return }
    })
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error {
        print("oh catch \(error)")
    }
}
Xcode 13.3
Whenever I hit Enter in Xcode, it starts the new line with an indentation. It doesn't matter whether I am creating a new line or moving lines down; it always starts the new line, or the moved line, with an indentation.
This happens when I have:
Prefer Indent Using set to Tabs
regardless of whether I have
Syntax-Aware Indenting: Return checked or unchecked
Any thoughts?
Update: now it somehow auto-indents every line no matter whether I set it to tabs or spaces. It's as if I broke it. Very annoying, please help!
I'm trying to add a file uploader to my iPhone app in Swift, and need help because I'm unsure how to save the data returned from the UIDocumentPickerViewController.
My whole document-picking code works, and ends with this line of code:
let data = try Data.init(contentsOf: url)
The thing is, I don't do my uploading until the user taps another button, so I need to save that data. But I'm unsure how to declare a variable to hold it, then release the original data, and finally free the copy.
I thought this would work:
var dataToSend : AnyObject?
but it doesn't.
Yes, I still have casting issues to learn in Swift.
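A sketch of one way to hold the bytes (class and method names here are placeholders): Data is a Swift value type, so storing it in an optional property keeps it alive until you nil it out; there is no separate "release the original, then free the copy" step as in manual memory management.

```swift
import UIKit

class UploadViewController: UIViewController {
    // Data, not AnyObject -- no casting needed.
    var dataToSend: Data?

    func documentPicked(url: URL) {
        do {
            dataToSend = try Data(contentsOf: url)  // keep for the later upload
        } catch {
            print("could not read file: \(error)")
        }
    }

    func uploadButtonTapped() {
        guard let payload = dataToSend else { return }
        // ...hand `payload` to the upload code here...
        dataToSend = nil  // done with it; the memory is released
    }
}
```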
I realize I could declare a global variable, such as:
var mySpecialSubClass : MySpecialSubClass?
...and then check whether it is defined.
I don't know of any reason to do it one way or another, but I was wondering: is there a function or method available to search for an existing instance of MySpecialSubClass?
From a tutorial I pulled the extension below to allow the end user to select a file from the iPhone's storage.
Everything works fine.
However, I noticed that in the delegate, dismiss is only called when the user chooses Cancel. While the view does disappear when a file is selected, I am not sure the view is being properly dismissed internally.
Since I do call present, am I responsible for dismissing the picker when the user chooses a file as well?
let supportedTypes: [UTType] = [UTType.item]
let pickerViewController = UIDocumentPickerViewController(forOpeningContentTypes: supportedTypes)
pickerViewController.delegate = self
pickerViewController.allowsMultipleSelection = false
present(pickerViewController, animated: true, completion: nil)

extension uploadFile: UIDocumentPickerDelegate {
    func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
        for url in urls {
            guard url.startAccessingSecurityScopedResource() else {
                print("error")
                return
            }
            defer { url.stopAccessingSecurityScopedResource() }
            // ...save the chosen file URL here in an existing structure
        }
    }

    func documentPickerWasCancelled(_ controller: UIDocumentPickerViewController) {
        controller.dismiss(animated: true, completion: nil)
    }
}
Below is a quick snippet of where I record audio. I would like to get a sampling of the background audio so that later I can filter out background noise. I figure 10 to 15 seconds should be a good amount of time.
Although I assume it can change depending on the iOS device, the format returned from .inputFormat is:
<AVAudioFormat 0x600003e8d9a0: 1 ch, 48000 Hz, Float32>
Based on the format info, is it possible to make bufferSize for .installTap just the right size for whatever time I wish to record?
I realize I could create a timer for 10 seconds, stop recording, paste the files together, etc. But if I can avoid all that extra coding, it would be nice.
let node = audioEngine.inputNode
let recordingFormat = node.inputFormat(forBus: 0)
makeFile(format: recordingFormat)
node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
    audioMetering(buffer: buffer)
    print("\(self.averagePowerForChannel0) \(self.averagePowerForChannel1)")
    if self.averagePowerForChannel0 < -50 && self.averagePowerForChannel1 < -50 {
        ...
    } else {
        ...
    }
    do {
        ... write audio file
    } catch { return }
})
audioEngine.prepare()
do {
    try audioEngine.start()
} catch let error {
    print("oh catch \(error)")
}
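On the buffer-size arithmetic: at the reported 48 kHz, a 10-second capture is simply sampleRate * seconds frames. One caveat worth hedging, though: the installTap documentation describes the requested bufferSize as advisory, and in practice the implementation may deliver much smaller buffers, so the tap block may still need to accumulate frames until the target count is reached.

```swift
import AVFoundation

let sampleRate = 48_000.0   // from the printed AVAudioFormat
let seconds = 10.0
let framesWanted = AVAudioFrameCount(sampleRate * seconds)  // 480,000 frames
// Pass framesWanted as bufferSize, but still count frames received in
// the tap block, since the engine may choose a smaller buffer size.
```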
I'm at the beginning of a voice-recording app. I store incoming voice data into a buffer array and write 50 buffers at a time to a file. That code works fine (Sample One).
However, I would like the recorded files to be smaller, so in Sample Two I try to add an AVAudioMixerNode to reduce the sample rate. But that code gives me two errors.
The first error comes when I call audioEngine.attach(downMixer). The debugger gives me nine of these errors:
throwing -10878
The second error is a crash when I try to write to audioFile. Of course they might all be related, so I am looking to get the mixer included successfully first.
But I do need help, as I am just trying to piece this together from tutorials, and when it comes to audio, I know less than anything else.
Sample One
//these two lines are in the init of the class that contains this function...
node = audioEngine.inputNode
recordingFormat = node.inputFormat(forBus: 0)

func startRecording() {
    audioBuffs = []
    x = -1
    node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat, block: { [self] (buffer, _) in
        x += 1
        audioBuffs.append(buffer)
        if x >= 50 {
            audioFile = makeFile(format: recordingFormat, index: fileCount)
            mainView?.setLabelText(tag: 3, text: "fileIndex = \(fileCount)")
            fileCount += 1
            for i in 0...49 {
                do {
                    try audioFile!.write(from: audioBuffs[i])
                } catch {
                    mainView?.setLabelText(tag: 4, text: "write error")
                    stopRecording()
                }
            }
            ...cleanup buffer code
        }
    })
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error { print("oh catch \(error)") }
}
Sample Two
//these two lines are in the init of the class that contains this function
node = audioEngine.inputNode
recordingFormat = node.inputFormat(forBus: 0)

func startRecording() {
    audioBuffs = []
    x = -1
    // new code
    let format16KHzMono = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 11025.0, channels: 1, interleaved: true)
    let downMixer = AVAudioMixerNode()
    audioEngine.attach(downMixer)
    // installTap on the mixer rather than the node
    downMixer.installTap(onBus: 0, bufferSize: 8192, format: format16KHzMono, block: { [self] (buffer, _) in
        x += 1
        audioBuffs.append(buffer)
        if x >= 50 {
            // use a different format in creating the audioFile
            audioFile = makeFile(format: format16KHzMono!, index: fileCount)
            mainView?.setLabelText(tag: 3, text: "fileIndex = \(fileCount)")
            fileCount += 1
            for i in 0...49 {
                do {
                    try audioFile!.write(from: audioBuffs[i])
                } catch {
                    stopRecording()
                }
            }
            ...cleanup buffers...
        }
    })
    let format = node.inputFormat(forBus: 0)
    // new code
    audioEngine.connect(node, to: downMixer, format: format)  // use default input format
    audioEngine.connect(downMixer, to: audioEngine.outputNode, format: format16KHzMono)  // use new audio format
    downMixer.outputVolume = 0.0
    audioEngine.prepare()
    do {
        try audioEngine.start()
    } catch let error { print("oh catch \(error)") }
}
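For context on that error code, -10878 is kAudioUnitErr_FormatNotSupported, and input-side taps generally want the node's own floating-point format rather than interleaved Int16. A sketch of an alternative (an assumption about the fix, not a verified diagnosis of this exact crash): tap the input in its native format and convert each buffer with AVAudioConverter before writing.

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let nativeFormat = input.inputFormat(forBus: 0)
let target = AVAudioFormat(commonFormat: .pcmFormatInt16,
                           sampleRate: 11_025, channels: 1, interleaved: true)!
let converter = AVAudioConverter(from: nativeFormat, to: target)!

input.installTap(onBus: 0, bufferSize: 8192, format: nativeFormat) { buffer, _ in
    // Size the output for the sample-rate ratio, with a little headroom.
    let capacity = AVAudioFrameCount(Double(buffer.frameLength)
                                     * target.sampleRate / nativeFormat.sampleRate) + 1
    guard let out = AVAudioPCMBuffer(pcmFormat: target, frameCapacity: capacity) else { return }
    var consumed = false
    var err: NSError?
    converter.convert(to: out, error: &err) { _, status in
        // Feed the tap buffer exactly once per convert call.
        if consumed { status.pointee = .noDataNow; return nil }
        consumed = true
        status.pointee = .haveData
        return buffer
    }
    // write `out` to the AVAudioFile here
}
```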
So I've found out from other posts that Xcode 13.4.1 won't debug apps on iPhones running iOS 15.6.
The solution everyone seems to agree on is to go back to Xcode 13.3.1.
While I am downloading the xip file for that version, I first want to check how to install the older version. I don't need to mess things up any worse than they are now.
I already posted about Xcode 13.4.1 not supporting iOS 15.6 on iPhone, but the answer raised even more questions.
If the latest version of Xcode (13.4.1) won't support iOS 15.6, why should I think an earlier version of Xcode would?
What is the real solution to getting Xcode to run apps on that iOS version? GitHub does not have device-support files past 15.5.
Does Xcode automatically update its supported iOS files behind the scenes?
Is there a planned date for Xcode to support iOS 15.6?
Thank you
Within my UIViewController I have a UITextView into which I dump current status and info. Obviously, every time I add text to the UITextView I would like it to scroll to the bottom.
So I've created this function, which I call from the UIViewController whenever I have new data.
func updateStat(status: String, tView: UITextView) {
    db.status = db.status + status + "\n"
    tView.text = db.status
    let range = NSMakeRange(tView.text.count - 1, 0)
    tView.scrollRangeToVisible(range)
    tView.flashScrollIndicators()
}
The only thing that does not work is tView.scrollRangeToVisible. However, if from the UIViewController I call:
updateStat(status: "...new data...", tView: mySession)
let range = NSMakeRange(mySession.text.count - 1, 0)
mySession.scrollRangeToVisible(range)
then the UITextView's scrollRangeToVisible does work.
I'm curious whether anyone knows why this works when called within the UIViewController, but not when called from a function.
P.S. I have also tried the updateStat function as an extension to UIViewController, but that doesn't work either.
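One common workaround worth trying (an assumption about the cause, not a confirmed diagnosis): the text view's layout may not have caught up with the text change yet, so deferring the scroll by one run-loop turn on the main queue often makes scrollRangeToVisible take effect. A sketch, with the status stored directly on the text view for brevity:

```swift
import UIKit

func updateStat(status: String, tView: UITextView) {
    tView.text += status + "\n"
    // Defer the scroll until after the pending layout pass.
    DispatchQueue.main.async {
        let range = NSRange(location: tView.text.count - 1, length: 0)
        tView.scrollRangeToVisible(range)
        tView.flashScrollIndicators()
    }
}
```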
I've had this issue before...
Under User-Defined build settings I have created DEBUG_LEVEL_1.
And within my code I have:
#if DEBUG_LEVEL_1
self.status = printSimDir()
#endif
However, the printSimDir function is never called.
So obviously I am setting something incorrectly here.
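A build-settings sketch that may be relevant here (an assumption about the project setup, not a confirmed diagnosis): Swift's #if sees only flags passed to the compiler, i.e. those listed under Active Compilation Conditions, so a User-Defined setting on its own has no effect unless it is surfaced there.

```
// Build Settings sketch -- SWIFT_ACTIVE_COMPILATION_CONDITIONS is the
// standard Xcode setting name ("Active Compilation Conditions" in the UI):
SWIFT_ACTIVE_COMPILATION_CONDITIONS = $(inherited) DEBUG_LEVEL_1
```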