I have my mainController (parent) and my menuController (child).
I add the menuController as a child with:
addChild(child)
view.addSubview(child.view)
child.didMove(toParent: self)
The child dismisses itself with:
self.dismiss(animated: true, completion: nil)
My question is: how do I clean up the child within the parent? Surely I have to do something?
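Since the child was added with view controller containment rather than presented modally, dismiss(animated:completion:) likely isn't what does the cleanup here. Below is a minimal sketch of the standard containment teardown, assuming the child was added exactly as above; it would run in the child in place of the dismiss call:

    // Sketch: standard removal sequence for a child added via containment.
    // Runs on the child view controller; dismiss is not needed because the
    // child was never modally presented.
    func removeFromParentController() {
        willMove(toParent: nil)       // notify UIKit the child is leaving
        view.removeFromSuperview()    // detach the child's view
        removeFromParent()            // end the parent-child relationship
    }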
At the very bottom is my code, where MainController instantiates a child view controller called setController. The child's view is added when I click the button.
However, within setController's code I get back nil when I try to print the superview:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    print(self.view.superview?.description ?? "count->no parent")
}
I am adding the second controller's view as a subview, but obviously I am missing something. Have I misunderstood how the UIView hierarchy works?
class MainController: UIViewController {
    private lazy var setController = SetController()
    var invButton: MyButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .black
        invButton = makeButton(vControl: self, btype: ButtType.inv, action: #selector(self.buttonAction(sender:)))
        invButton.frame.origin.x = self.view.frame.width * 0.1
        invButton.frame.origin.y = self.view.frame.height * 0.1
        invButton.setTitle("Settings", for: .normal)
    }

    override var prefersStatusBarHidden: Bool {
        return false
    }

    @objc func buttonAction(sender: UIButton!) {
        guard sender is MyButton else { return }
        UIView.transition(with: self.view, duration: 0.5, options: .transitionCurlDown, animations: { [self] in
            self.addChild(setController)
            self.view.addSubview(setController.view)
            setController.didMove(toParent: self)
        }, completion: nil)
    }
}
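For what it's worth: viewWillAppear fires before UIKit has actually attached the view to its new superview, so nil at that point doesn't necessarily mean the hierarchy is wrong. A minimal check, assuming the setup above, placed in SetController:

    // Sketch: by viewDidAppear the view has been attached to the parent's
    // view, so the superview should print as non-nil here.
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        print(view.superview?.description ?? "still no parent")
    }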
I've learned the hard way that the specific calls to add a child view controller must be in a certain order, especially if I am bringing in the child using animation.
So I'd like to be sure that the order I use to remove a child view controller is also correct. Yes, what I have below works. But that doesn't mean it's correct.
Thank you
UIView.transition(with: parent, duration: 0.5, options: .transitionCurlUp, animations: { [self] in
    self.willMove(toParent: nil)
    self.removeFromParent()
    self.view.removeFromSuperview()
    self.dismiss(animated: false, completion: nil)
}, completion: nil)
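For comparison, the conventional ordering is willMove(toParent: nil), then detach the view, then removeFromParent, and the dismiss call shouldn't be needed at all for containment (it applies to modal presentation). A sketch that keeps that ordering and defers the final step to the transition's completion, which is an assumption on my part rather than the one documented way:

    // Sketch: conventional teardown ordering, finished in the completion
    // handler so the curl animation runs on a still-attached view.
    UIView.transition(with: parent, duration: 0.5, options: .transitionCurlUp, animations: { [self] in
        self.willMove(toParent: nil)
        self.view.removeFromSuperview()
    }, completion: { [self] _ in
        self.removeFromParent()
    })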
Copied the code below from a tutorial, and I mostly understand what is going on. But I'd like to be able to read it in full.
What I do get is that:
handler can be nil, but here it's the code to run when the UIAlertAction is selected.
But I am unsure what (_) in is. I have also sometimes seen it written as [self] in.
Thank you
controller.addAction(UIAlertAction(title: "OK", style: .default, handler: { (_) in
    controller.dismiss(animated: true, completion: nil)
}))
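To spell out the two pieces of syntax being asked about: (_) in is the closure's parameter list, where the underscore receives and discards the UIAlertAction argument the handler is given; [self] is a capture list, a separate construct that controls how the closure captures surrounding variables. A few equivalent spellings, as an illustration (assumed to sit inside a view controller method):

    // The handler closure receives the tapped UIAlertAction as its one argument.
    let a = UIAlertAction(title: "OK", style: .default) { action in
        print(action.title ?? "")          // argument named and used
    }
    let b = UIAlertAction(title: "OK", style: .default) { _ in
        // argument discarded with an underscore
    }
    let c = UIAlertAction(title: "OK", style: .default) { [weak self] _ in
        self?.dismiss(animated: true)      // capture list plus discarded argument
    }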
I was under the impression that offline speech-to-text had no time limit, since the app wouldn't be using Apple's servers in real time.
Yet when I run speechRecognizer.recognitionTask, it quits after one minute.
Did I misread something?
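One relevant detail: the one-minute cap is documented for server-based requests, and on-device recognition has to be asked for explicitly; it is not automatic. A sketch, assuming iOS 13+ and a buffer-based request:

    import Speech

    // Sketch: request on-device recognition so the documented one-minute
    // server limit should not apply.
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")) else { return }
    let request = SFSpeechAudioBufferRecognitionRequest()
    if recognizer.supportsOnDeviceRecognition {
        request.requiresOnDeviceRecognition = true   // keep recognition local
    }
    let task = recognizer.recognitionTask(with: request) { result, error in
        if let result = result {
            print(result.bestTranscription.formattedString)
        }
    }
    // Keep a reference to `task` to cancel or finish it later.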
Am using the demo code below to flesh out an audio recording app in Swift 5.x.
I would like to monitor certain aspects of the AVAudioRecorder as it is recording, such as frequency, power, volume, etc., in real time.
I found an example in Swift 3 where the user sets up a callback timer for 0.5 sec. I was wondering if this is still the way to do it, or whether, in the latest version of Swift, there might be a callback function in AVAudioEngine that gets called at a regular frequency?
do {
    audioRecorder = try AVAudioRecorder(url: audioFilename!, settings: settings)
    audioRecorder.delegate = self
    audioRecorder.record()
    recordButton.setTitle("Tap to Stop", for: .normal)
} catch {
    finishRecording(success: false)
}
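As far as I know there is still no periodic callback on AVAudioRecorder itself; the usual pattern remains a timer polling the recorder's built-in metering API. A sketch, assuming the audioRecorder created above:

    // Sketch: poll AVAudioRecorder's metering with a repeating timer.
    audioRecorder.isMeteringEnabled = true

    let meterTimer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        audioRecorder.updateMeters()                          // refresh the level readings
        let avg = audioRecorder.averagePower(forChannel: 0)   // dBFS: 0 is max, ~ -160 is silence
        let peak = audioRecorder.peakPower(forChannel: 0)
        print("avg: \(avg) dB, peak: \(peak) dB")
    }
    // Invalidate meterTimer when recording stops.

For frequency-style analysis there is no metering equivalent; the closer fit is AVAudioEngine's installTap(onBus:bufferSize:format:block:), which delivers raw PCM buffers at a regular cadence and is effectively the built-in callback being asked about.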
I am trying to save the buffer from my installTap to a file. I do it in chunks of 10 so I'll get a bigger file. When I try to play the written file (from the simulator's directory), QuickTime says that it's not compatible.
I have examined the bad m4a file and a working one. There are a lot of zeros at the beginning of the bad file, followed by a lot of data. However, both files appear to have the same header.
A lot of people mention that I have to nil the AudioFile, but:
audioFile = nil
is not valid syntax, nor can I find a close method on AVAudioFile.
Here's the complete code, edited into one working file:
import UIKit
import AVFoundation

class ViewController: UIViewController {
    let audioEngine = AVAudioEngine()
    var audioFile = AVAudioFile()
    var x = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        record()
        // Do any additional setup after loading the view.
    }

    func makeFile(format: AVAudioFormat) {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first
        do {
            _ = try FileManager.default.contentsOfDirectory(at: paths!, includingPropertiesForKeys: nil)
        } catch { print("error") }
        let destinationPath = paths!.appendingPathComponent("audioT.m4a")
        print("\(destinationPath)")
        do {
            audioFile = try AVAudioFile(forWriting: destinationPath,
                                        settings: format.settings)
            print("file created")
        } catch { print("error creating file") }
    }

    func record() {
        let node = audioEngine.inputNode
        let recordingFormat = node.inputFormat(forBus: 0)
        makeFile(format: recordingFormat)
        node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat) { [self] (buffer, _) in
            do {
                try audioFile.write(from: buffer)
                print("buffer filled")
                x += 1
                print("wrote \(x)")
                if x > 9 {
                    endThis()
                }
            } catch { return }
        }
        audioEngine.prepare()
        do {
            try audioEngine.start()
        } catch {
            print("oh catch")
        }
    }

    func endThis() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
    }
}
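The reason audioFile = nil doesn't compile is that the property is declared non-optional (var audioFile = AVAudioFile()). A commonly suggested fix, which I have not verified against this exact code, is to make the property optional so the AVAudioFile can actually be released; AVAudioFile finalizes the file when the object is deallocated, which is what writes a playable m4a header. A sketch of that change:

    // Sketch: declare the file as optional so it can be released when done.
    var audioFile: AVAudioFile?

    func endThis() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        audioFile = nil   // releases the AVAudioFile, closing/finalizing the file
    }

The write in the tap then becomes try audioFile?.write(from: buffer).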
Hello,
Am starting to work with and learn AVAudioEngine.
Currently am at the point where I would like to be able to read an audio file of a speech and determine if there are any moments of silence in it.
Does this framework provide any properties, such as power level, decibels, etc., that I can use to find long enough moments of silence?
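As far as I know, AVAudioEngine and AVAudioFile don't expose a "silence" property directly; the usual approach is to read PCM buffers from the file and compute a level yourself. A sketch, where the chunk size and threshold are illustrative guesses to be tuned by experiment:

    import AVFoundation
    import Accelerate

    // Sketch: scan a file for near-silent stretches by computing RMS per chunk.
    func findSilence(in url: URL, threshold: Float = 0.01) throws {
        let file = try AVAudioFile(forReading: url)
        let chunk: AVAudioFrameCount = 4096
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: chunk) else { return }
        while file.framePosition < file.length {
            try file.read(into: buffer, frameCount: chunk)
            guard let data = buffer.floatChannelData?[0] else { break }
            var rms: Float = 0
            vDSP_rmsqv(data, 1, &rms, vDSP_Length(buffer.frameLength))  // root-mean-square level
            if rms < threshold {
                print("near-silence around frame \(file.framePosition)")
            }
        }
    }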
I have an iOS app with multiple subclasses of UIViewController. There are many types of UIAlertController I might need to use, based on user interaction, internet connection, and catching any other fatal errors.
So I wrote the extension for UIViewController below, which works just fine. And I can call it from any of my UIViewControllers as simply as:
myErrors(error: .e1, title: "Internet Error", msg: "Unable to connect to Internet\nTry Again?")
While this works, I do not know if it's proper to add an extension to UIViewController. Is this considered bad practice? Is there another way I should be pursuing this?
extension UIViewController {
    func myErrors(error: MyErrors, title: String, msg: String) {
        var message = ""
        switch error {
        case .e1:
            message = String(format: "Database Error %03d%@\n", error.rawValue, msg)
        case .e2:
            message = String(format: "Internet Error %03d%@\n", error.rawValue, msg)
        case .e3:
            message = String(format: "User Error %03d%@\n", error.rawValue, msg)
        }
        let alert = UIAlertController(title: title, message: message, preferredStyle: .alert)
        switch error {
        case .e1:
            alert.addAction(UIAlertAction(title: "No", style: .default, handler: { _ in
                // ..log error
                // ...proceed to code based on No ...
            }))
            alert.addAction(UIAlertAction(title: "Yes", style: .default, handler: { _ in
                // ..log error
                // ...code based on Yes ...
            }))
        case .e2:
            // No user option available in this alert, just OK
            // ... do all logging of errors
            // proceed
            break
        case .e3:
            // Add specific actions to this error
            // ... do all logging of errors
            // proceed
            break
        }
        self.present(alert, animated: true, completion: nil)
    }
}
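Extensions on UIViewController are a common, accepted way to share helpers like this. For completeness, the code above assumes an Int-backed enum along these lines (my guess at the declaration, since it isn't shown):

    // Hypothetical declaration matching the %03d format and .rawValue use above.
    enum MyErrors: Int {
        case e1 = 1   // database errors
        case e2       // internet errors
        case e3       // user errors
    }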
Stupid question, could a moderator please delete this embarrassing post on my part?
I was looking up removing the Main storyboard, and am a little confused. Everywhere in my app I'm supposed to remove "Main." But I can't find where ViewController.swift is called.
So am trying to figure that part out: who calls whom? Because if I add a background color to ViewController, and a label to the Main storyboard, both show up.
I'm just trying to figure out how both are called.
Closing because again I asked a stupid question
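For anyone landing here: with a storyboard in place, UIKit itself instantiates ViewController from Main.storyboard at launch, which is why nothing in the code appears to call it. Once the storyboard is removed, the window has to be created by hand. A sketch of the usual scene-based setup, assuming iOS 13+ and that the "Main" references have already been removed from Info.plist and the target settings:

    import UIKit

    // Sketch: programmatic window setup once Main.storyboard is gone.
    class SceneDelegate: UIResponder, UIWindowSceneDelegate {
        var window: UIWindow?

        func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
                   options connectionOptions: UIScene.ConnectionOptions) {
            guard let windowScene = scene as? UIWindowScene else { return }
            let window = UIWindow(windowScene: windowScene)
            window.rootViewController = ViewController()  // now *we* call ViewController
            window.makeKeyAndVisible()
            self.window = window
        }
    }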
Below is my very simple code, where I programmatically create another UIViewController. My only problem is that it pops up as a card that can be swiped away. I want this to be its own UIViewController that takes the entire screen, and cannot just be swiped away.
Is that possible?
import UIKit

class ViewController: UIViewController {
    lazy var mainMenu = MainMenuCtrl()

    private let myView: UIView = {
        let myView = UIView()
        myView.translatesAutoresizingMaskIntoConstraints = false
        myView.backgroundColor = .gray
        return myView
    }()

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        view.backgroundColor = bgColor   // bgColor is defined elsewhere in the project
        view.addSubview(myView)
        addConstraints()
        present(mainMenu, animated: true)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override var prefersStatusBarHidden: Bool {
        return false
    }

    override var preferredStatusBarStyle: UIStatusBarStyle {
        return .darkContent
    }

    func addConstraints() {
        var constraints = [NSLayoutConstraint]()
        constraints.append(myView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor, constant: 5))
        constraints.append(myView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor, constant: -5))
        constraints.append(myView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -5))
        constraints.append(myView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 5))
        NSLayoutConstraint.activate(constraints)
    }
}
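The swipeable card is the default .pageSheet presentation on iOS 13+. A sketch of forcing a full-screen, non-dismissable presentation, set before the present call:

    // Sketch: cover the full screen and block interactive dismissal (iOS 13+).
    mainMenu.modalPresentationStyle = .fullScreen   // no card; takes the whole screen
    mainMenu.isModalInPresentation = true           // also blocks swipe-to-dismiss for sheets
    present(mainMenu, animated: true)

One side note: presenting from viewDidAppear means the code runs every time the view appears, so viewDidLoad or an explicit one-shot flag may be a safer home for it.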
Can I have multiple extensions for the same class? Obviously I have already tried it and it works. But that doesn't make it right?
So I want to make sure this doesn't blow up in my face at a later date.
p.s. I keep typing "blow up" as two separate words, but the site changes the b word to "****". What's up with that?
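For the record, multiple extensions of one type are supported, idiomatic Swift, commonly used to group related functionality (for example, one extension per protocol conformance). A trivial illustration with hypothetical names:

    struct Invoice {
        var total: Double
    }

    extension Invoice {
        func formattedTotal() -> String { String(format: "%.2f", total) }
    }

    extension Invoice: CustomStringConvertible {
        var description: String { "Invoice(\(formattedTotal()))" }
    }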
Updated info at the bottom.
I went and changed one of the entities in the Core Data model of my app. For all my entities I have Codegen set to "Manual".
So I deleted all four files (for two entities), cleaned the build folder, and regenerated the Core Data files with Editor -> Create NSManagedObject Subclass.
Now every time I run the app I get a fatalError in the following code in the AppDelegate:
lazy var persistentContainer: NSPersistentContainer = {
    let container = NSPersistentContainer(name: "Invoice_Gen")
    container.loadPersistentStores(completionHandler: { (storeDescription, error) in
        if let error = error as NSError? {
            fatalError("Unresolved error \(error), \(error.userInfo)")
        }
    })
    return container
}()
The error code being
[error] error: addPersistentStoreWithType:configuration:URL:options:error: returned error NSCocoaErrorDomain (134140)
Even if I remove the files for the Core Data entities, and comment out anything related to them code-wise, I will still get this crash.
If someone has any idea of whether I have to delete something else, or am missing something, I would so appreciate it. This one has me more stumped than anything before it.
The change I made was to turn one of the entities' attributes from String to Int.
When I changed it back, everything works. From my research on Google there is something about a mapping model, but I cannot find it at all.
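For reference, 134140 is NSMigrationMissingMappingModelError: Core Data found a store created with the old model and could not find or infer a mapping model to migrate it. Changing an attribute's type (String to Int) is beyond what lightweight migration can infer, so the proper route is a new model version (Editor -> Add Model Version) plus a custom mapping model. During development, the blunt alternative is simply destroying the old store. A sketch of that development-only option, assuming data loss is acceptable:

    // Sketch (development only): wipe the incompatible store so it is
    // rebuilt with the new model. All existing data is lost.
    lazy var persistentContainer: NSPersistentContainer = {
        let container = NSPersistentContainer(name: "Invoice_Gen")
        if let storeURL = container.persistentStoreDescriptions.first?.url {
            try? container.persistentStoreCoordinator.destroyPersistentStore(
                at: storeURL, ofType: NSSQLiteStoreType, options: nil)
        }
        container.loadPersistentStores { _, error in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
        }
        return container
    }()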
I am using the code below to create my own debug log for my app. On the simulator, I have no problem viewing that log: I simply print out the documents directory in the debugger, then open it in the Finder.
However, I do not know how to access the created log on my iPhone itself. Even if I go to Window -> Devices and Simulators, when I look at my app's container it's empty. I would like to be able to access the file from any actual device in the future.
Am I using the wrong directory? I even used allDomainsMask in place of userDomainMask below, but to no avail.
debugFileURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("myApp.log")

if let handle = try? FileHandle(forWritingTo: dbClass.debugFileURL!) {
    handle.seekToEndOfFile()                 // move pointer to the end
    handle.write(text.data(using: .utf8)!)   // append content
    handle.closeFile()                       // close the file
} else {
    try! text.write(to: dbClass.debugFileURL!, atomically: false, encoding: .utf8)
}
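The Documents directory is the right place; the catch is getting at it from a device. Two things that generally help, both worth verifying: the container download in Devices and Simulators only works for development-signed builds, and adding the Info.plist keys UIFileSharingEnabled ("Application supports iTunes file sharing") plus LSSupportsOpeningDocumentsInPlace exposes Documents in the Files app on the phone itself. While at it, a sketch of the same append logic without the force unwraps:

    // Sketch: append text to the log, creating the file on first use.
    func appendToLog(_ text: String) {
        let url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("myApp.log")
        guard let data = text.data(using: .utf8) else { return }
        if let handle = try? FileHandle(forWritingTo: url) {
            handle.seekToEndOfFile()
            handle.write(data)
            handle.closeFile()
        } else {
            try? data.write(to: url)   // first write creates the file
        }
    }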
Am trying to distinguish the differences in volume between background noise and someone speaking, in Swift.
Previously, I had come across a tutorial which had me looking at the power levels in each channel. It came out as the code listed in Sample One, which I call within the installTap closure. It was OK, but the variance between background noise and the intended voice to record wasn't that great. Sure, it could have been the math used to calculate it, but since I have no experience with audio data, it was like reading another language.
Then I came across another demo. Its code was much simpler, and the difference in values between background noise and speaking voice was much greater, therefore much more detectable. It's listed here as Sample Two, which I also call within the installTap closure.
My issue here is wanting to understand what is happening in the code. In all my experiences with other languages, voice was something I never dealt with before, so this is way over my head.
Not looking for someone to explain this to me line by line. But if someone could let me know where I can find decent documentation so I can better grasp what is going on, I would appreciate it.
Thank you
Sample One
func audioMetering(buffer: AVAudioPCMBuffer) {
    // buffer.frameLength = 1024
    let inNumberFrames = vDSP_Length(buffer.frameLength)
    if buffer.format.channelCount > 0 {
        let samples = buffer.floatChannelData![0]
        var avgValue: Float32 = 0
        // Mean of magnitudes (average absolute sample value) via Accelerate
        vDSP_meamgv(samples, 1, &avgValue, inNumberFrames)
        var v: Float = -100
        if avgValue != 0 {
            v = 20.0 * log10f(avgValue)   // convert linear amplitude to decibels
        }
        // Low-pass smoothing of the running power value
        self.averagePowerForChannel0 = (self.LEVEL_LOWPASS_TRIG * v) + ((1 - self.LEVEL_LOWPASS_TRIG) * self.averagePowerForChannel0)
        self.averagePowerForChannel1 = self.averagePowerForChannel0
    }
    if buffer.format.channelCount > 1 {
        let samples = buffer.floatChannelData![1]
        var avgValue: Float32 = 0
        vDSP_meamgv(samples, 1, &avgValue, inNumberFrames)
        var v: Float = -100
        if avgValue != 0 {
            v = 20.0 * log10f(avgValue)
        }
        self.averagePowerForChannel1 = (self.LEVEL_LOWPASS_TRIG * v) + ((1 - self.LEVEL_LOWPASS_TRIG) * self.averagePowerForChannel1)
    }
}
Sample Two
private func getVolume(from buffer: AVAudioPCMBuffer, bufferSize: Int) -> Float {
    guard let channelData = buffer.floatChannelData?[0] else {
        return 0
    }
    let channelDataArray = Array(UnsafeBufferPointer(start: channelData, count: bufferSize))

    // Envelope follower: track the signal's amplitude contour with a fast
    // attack constant (rising signal) and a slow decay constant (falling).
    var outEnvelope = [Float]()
    var envelopeState: Float = 0
    let envConstantAtk: Float = 0.16
    let envConstantDec: Float = 0.003

    for sample in channelDataArray {
        let rectified = abs(sample)   // rectification: magnitude only
        if envelopeState < rectified {
            envelopeState += envConstantAtk * (rectified - envelopeState)
        } else {
            envelopeState += envConstantDec * (rectified - envelopeState)
        }
        outEnvelope.append(envelopeState)
    }

    // The 0.015 threshold acts as a noise gate, ignoring low-level
    // noise entering from the microphone
    if let maxVolume = outEnvelope.max(),
       maxVolume > Float(0.015) {
        return maxVolume
    } else {
        return 0.0
    }
}
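If it helps orient the reading: Sample One computes an average magnitude per buffer with vDSP_meamgv, converts it to decibels, and smooths it; Sample Two is a classic attack/decay envelope follower with a noise gate, which is why it separates speech from background more sharply. The decibel conversion is the one formula worth internalizing; a minimal restatement of Sample One's math, assuming Accelerate:

    import AVFoundation
    import Accelerate

    // Minimal restatement: average magnitude -> dBFS.
    func decibels(from buffer: AVAudioPCMBuffer) -> Float {
        guard let samples = buffer.floatChannelData?[0] else { return -100 }
        var meanMagnitude: Float = 0
        vDSP_meamgv(samples, 1, &meanMagnitude, vDSP_Length(buffer.frameLength))
        // 20 * log10(amplitude): 1.0 -> 0 dBFS (full scale); quieter is negative.
        return meanMagnitude > 0 ? 20 * log10f(meanMagnitude) : -100
    }

For documentation, Apple's Accelerate (vDSP) reference covers vDSP_meamgv and friends, and general searches for "envelope follower" and "dBFS" cover the signal-processing side.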