Is anyone aware of any bugs associated with .modalTransitionStyle = .partialCurl?
I posted the question below on StackOverflow, and when someone pointed out .partialCurl, I tried different modalTransitionStyles and the code doesn't crash with the others. Is anyone aware of "known" issues with .partialCurl from Apple?

Original post on StackOverflow: Within my root UIViewController I call up a submenu, a second UIViewController, with the following code:

    // within the root UIViewController
    let myInvMenu = InvMenuCtrl()
    myInvMenu.modalPresentationStyle = .fullScreen
    myInvMenu.modalTransitionStyle = .partialCurl
    present(myInvMenu, animated: false)

Within the new screen I have a back button; I want to dismiss the submenu and return to the original UIViewController:

    dismiss(animated: false)

In this post I have the animation set to false, because that works fine. But if I set it to true, I crash on the dismissal. From the docs below, I assumed that I didn't have to handle anything myself, but obviously I have misunderstood something, and I'd appreciate it if someone could tell me where:

"The presenting view controller is responsible for dismissing the view controller it presented. If you call this method on the presented view controller itself, UIKit asks the presenting view controller to handle the dismissal."
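For reference, a minimal sketch of what the quoted documentation describes: calling dismiss on the presented controller is forwarded to the presenter automatically, but the routing can also be made explicit. This is standard UIKit API, not code from the post:

    // inside InvMenuCtrl's back-button action
    presentingViewController?.dismiss(animated: true)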
Replies: 2 · Boosts: 0 · Views: 444 · Mar ’22

What is the difference between func (to view: UIView) and not using the 'to'?
I have a function in my UIView extension, and I can use it either way.

This way:

    func anchorSizePercent(to view: UIView, sFactor: CGSize) { ... }

    myHeader.anchorSizePercent(to: myView, sFactor: CGSize(width: 0.8, height: 0.2))

As well, if I declare it without the to, I can call it such as:

    func anchorSizePercent(view: UIView, sFactor: CGSize) { ... }

    myHeader.anchorSizePercent(view: myView, sFactor: CGSize(width: 0.8, height: 0.2))

May I ask what the difference is?
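For context, the difference is purely the external argument label: with two names, the first is the label callers write and the second is the name used inside the function body; with one name, it serves as both. A tiny illustration, not from the post:

    // 'to' is the external label, 'view' is the internal parameter name.
    func move(to view: UIView) { print(view) }

    // A single name acts as both the label and the internal name.
    func move(view: UIView) { print(view) }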
Replies: 3 · Boosts: 0 · Views: 482 · Mar ’22

Getting strange message in debugger since Xcode last updated
Xcode updated yesterday, and I am now at Xcode Version 13.3 (13E113). Since then, whenever I debug my app in the simulator, I get the message below. I am curious whether I should be concerned. If it makes a difference: yes, I have an IBM keyboard attached via USB. Also, this does not happen when I debug the app on the iPhone itself.

    2022-03-15 14:50:45.745237-0400 Hello World[48883:648134] [HardwareKeyboard] -[UIApplication getKeyboardDevicePropertiesForSenderID:shouldUpdate:usingSyntheticEvent:], failed to fetch device property for senderID (778835616971358211) use primary keyboard info instead.
Replies: 3 · Boosts: 1 · Views: 7k · Mar ’22

How can I create a second UIViewController programmatically that is independent?
Closing because again I asked a stupid question. Below is my very simple code, where I programmatically create another UIViewController. My only problem is that it pops up as a sheet that can be swiped away. I want this to be its own UIViewController that takes the entire screen and cannot just be swiped away. Is that possible?

    import UIKit

    class ViewController: UIViewController {

        lazy var mainMenu = MainMenuCtrl()

        private let myView: UIView = {
            let myView = UIView()
            myView.translatesAutoresizingMaskIntoConstraints = false
            myView.backgroundColor = .gray
            return myView
        }()

        override func viewDidLayoutSubviews() {
            super.viewDidLayoutSubviews()
        }

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            view.backgroundColor = bgColor   // bgColor is defined elsewhere in the project
            view.addSubview(myView)
            addConstraints()
            present(mainMenu, animated: true)
        }

        override func viewDidLoad() {
            super.viewDidLoad()
        }

        override var prefersStatusBarHidden: Bool { return false }

        override var preferredStatusBarStyle: UIStatusBarStyle { return .darkContent }

        func addConstraints() {
            var constraints = [NSLayoutConstraint]()
            constraints.append(myView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor, constant: 5))
            constraints.append(myView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor, constant: -5))
            constraints.append(myView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -5))
            constraints.append(myView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 5))
            NSLayoutConstraint.activate(constraints)
        }
    }
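For anyone landing here: since iOS 13 the default modal presentation is a card-style sheet that can be pulled down to dismiss. A minimal sketch of opting out, using the same names as the code above (this is standard UIKit, not a quote from the thread):

    mainMenu.modalPresentationStyle = .fullScreen   // full screen, no swipe-to-dismiss
    present(mainMenu, animated: true)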
Replies: 1 · Boosts: 0 · Views: 389 · Mar ’22

Is it Main.storyboard or is it ViewController?
Stupid question, could a moderator please delete this embarrassing post on my part? I was looking up removing the Main storyboard, and I am a little confused. Everywhere in my app I'm supposed to remove "Main," but I can't find where ViewController.swift is called, so I am trying to figure that part out: who calls whom? Because if I add a background color to ViewController and a label to the Main storyboard, both show up. I'm just trying to figure out how both are called.
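For context: the storyboard named in the Info.plist is loaded at launch, and its initial scene instantiates ViewController because that class is set on the scene in Interface Builder, which is why changes in both places show up. If the storyboard is removed entirely, the root has to be created by hand; a minimal sketch of that setup (standard UIKit scene boilerplate, not from the post):

    // In SceneDelegate.swift, after removing the storyboard references
    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        window = UIWindow(windowScene: windowScene)
        window?.rootViewController = ViewController()
        window?.makeKeyAndVisible()
    }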
Replies: 1 · Boosts: 0 · Views: 399 · Mar ’22

Why won't my app ask permission to use the microphone?
Working on a recording app. I started from scratch and basically jump right into recording. I made sure to add the Privacy - Microphone Usage Description string. What strikes me as odd is that the app launches straight into recording. No alert comes up the first time asking the user for permission, which I thought was the norm. Have I misunderstood something?

    override func viewDidLoad() {
        super.viewDidLoad()
        record3()
    }

    func record3() {
        print("recording")
        let node = audioEngine.inputNode
        let recordingFormat = node.inputFormat(forBus: 0)
        var silencish = 0
        var wordsish = 0
        makeFile(format: recordingFormat)

        node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat) { [self] (buffer, _) in
            do {
                try audioFile!.write(from: buffer)
                x += 1
                if x > 300 {
                    print("it's over sergio")
                    endThis()
                }
            } catch {
                return
            }
        }

        audioEngine.prepare()
        do {
            try audioEngine.start()
        } catch let error {
            print("oh catch \(error)")
        }
    }
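For reference, one way to force the question deliberately is to ask AVAudioSession up front instead of relying on the implicit prompt. A minimal sketch using standard AVFoundation API (the comments are mine, not from the post):

    import AVFoundation

    AVAudioSession.sharedInstance().requestRecordPermission { granted in
        DispatchQueue.main.async {
            // Only start the engine once the user has actually answered.
            if granted {
                // safe to call record3() here
            } else {
                // explain to the user why the microphone is needed
            }
        }
    }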
Replies: 0 · Boosts: 0 · Views: 494 · Feb ’22

How can I get my app to wait for a permission request to complete?
Am working on a recording app from scratch and it just has the basics. Within my Info.plist I do set Privacy - Microphone Usage Description. Still, I always want to check the "privacy permission" on the microphone, because I know people can hit "No" by accident. However, whatever I try, the app keeps running without waiting for the iOS permission alert to pop up and complete.

    let mediaType = AVMediaType.audio
    let mediaAuthorizationStatus = AVCaptureDevice.authorizationStatus(for: mediaType)

    switch mediaAuthorizationStatus {
    case .denied:
        print(".denied")
    case .authorized:
        print("authorized")
    case .restricted:
        print("restricted")
    case .notDetermined:
        print("huh?")
        let myQue = DispatchQueue(label: "get perm")
        myQue.sync {
            AVCaptureDevice.requestAccess(for: .audio, completionHandler: { (granted: Bool) in
                if granted {
                } else {
                }
            })
        }
    default:
        print("not a clue")
    }
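For what it's worth, requestAccess(for:) returns immediately and delivers its answer asynchronously, so wrapping it in a queue does not make the app wait; the usual pattern is to continue the flow from inside the completion handler. A minimal sketch, where startRecording and showDeniedMessage are hypothetical placeholders:

    AVCaptureDevice.requestAccess(for: .audio) { granted in
        DispatchQueue.main.async {
            if granted {
                startRecording()        // hypothetical: continue the app flow here
            } else {
                showDeniedMessage()     // hypothetical: handle the refusal here
            }
        }
    }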
Replies: 3 · Boosts: 0 · Views: 1.6k · Feb ’22

Is it proper to add an extension to the UIViewController class, or is there a preferred way of adding another method to an established Swift class?
I have an iOS app with multiple subclasses of UIViewController. There are many types of UIAlertController I might need to use, based on user interaction, internet connection, and catching any other fatal errors. So I wrote the extension for UIViewController below, which works just fine, and I can call it from any of my UIViewControllers as simply as:

    myErrors(error: .e1, title: "Internet Error", msg: "Unable to connect to Internet\nTry Again?")

While this works, I do not know if it's proper to add an extension to UIViewController. Is this considered bad practice? Is there another way I should be pursuing this?

    extension UIViewController {
        func myErrors(error: MyErrors, title: String, msg: String) {
            var message = ""
            switch error {
            case .e1:
                message = String(format: "Database Error %03d%@\n", error.rawValue, msg)
            case .e2:
                message = String(format: "Internet Error %03d%@\n", error.rawValue, msg)
            case .e3:
                message = String(format: "User Error %03d%@\n", error.rawValue, msg)
            }

            let alert = UIAlertController(title: title, message: message, preferredStyle: .alert)

            switch error {
            case .e1:
                alert.addAction(UIAlertAction(title: "No", style: .default) { _ in
                    // ..log error
                    // ...proceed to code based on No ....
                })
                alert.addAction(UIAlertAction(title: "Yes", style: .default) { _ in
                    // ..log error
                    // ...code based on Yes ....
                })
            case .e2:
                // No user option available in this alert, just OK
                // ... do all logging of errors
                // proceed
                break
            case .e3:
                // Add specific actions to this error
                // ... do all logging of errors
                // proceed
                break
            }

            self.present(alert, animated: true, completion: nil)
        }
    }
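For comparison, a common alternative is to scope the helper behind a protocol, so only controllers that opt in get the method rather than every UIViewController in the process. A sketch, not code from the thread:

    import UIKit

    protocol ErrorPresenting where Self: UIViewController {}

    extension ErrorPresenting {
        func presentError(title: String, message: String) {
            let alert = UIAlertController(title: title, message: message, preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "OK", style: .default))
            present(alert, animated: true)
        }
    }

    // Opt a controller in by conformance:
    // extension MyViewController: ErrorPresenting {}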
Replies: 1 · Boosts: 0 · Views: 382 · Jan ’22

Can I create a class instance using DispatchQueue.global().async and have its methods run asynchronously?
I created the playground below to answer my question: "If I created a class instance using DispatchQueue.global().async, would that class remain in its own asynchronous queue? Even if the main app called one of that class's methods, would that method run asynchronously compared to the main app?" With the sleep line I discovered that the answer is "no." But I am curious whether there is a legit way to do this, or, even if there is, whether it is considered bad programming?

    import UIKit

    class MyFunx: NSObject {
        var opsCount = 0

        override init() {
            super.init()
        }

        func addValues(a: Int, b: Int) {
            let c = a + b
            opsCount += 1
            sleep(1)
        }
    }

    var firstVar = 0
    var secondVar = 0
    var myFunx: MyFunx?

    while secondVar < 100 {
        print("starting")
        if myFunx == nil {
            print("making myFunx")
            DispatchQueue.global().async {
                myFunx = MyFunx()
            }
        } else {
            myFunx!.addValues(a: firstVar, b: secondVar)
        }
        firstVar += 1
        secondVar += 1
    }

    print("myFunx = \(myFunx)")
    print("\(myFunx?.opsCount)")
    print("exiting")
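For reference, the queue an object happens to be created on has no lasting effect; what matters is the queue each call runs on. One legit shape for "this object does its work asynchronously" is to give the instance its own serial queue and hop onto it inside its methods. A sketch, with names that are mine:

    final class Worker {
        private let queue = DispatchQueue(label: "Worker.serial")
        private(set) var opsCount = 0

        func addValues(a: Int, b: Int) {
            queue.async {
                _ = a + b
                self.opsCount += 1   // touched only on `queue`, so it stays consistent
            }
        }
    }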
Replies: 3 · Boosts: 0 · Views: 766 · Jan ’22

How can I decipher an M4A file?
I have a Swift app that records audio in chunks across multiple files; each M4A file is approximately 1 minute long. I would like to go through those files and detect silence, or the lowest level. While I am able to read a file into a buffer, my problem is deciphering it. Even with Google, all that comes up is "audio players" instead of sites that describe the header and the data. Where can I find what to look for? Or should I even be reading it into a WAV file? But even then I cannot seem to find a tool, or a site, that tells me how to decipher what I am reading. Obviously it exists, since Siri knows when you've stopped speaking. Just trying to find the key.
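For what it's worth, one route that avoids parsing the M4A container by hand is to let AVAudioFile decode it to PCM and measure the level (e.g., RMS) of the samples. A minimal sketch under the assumption of a mono file; the function name and threshold are mine:

    import AVFoundation

    func isMostlySilent(url: URL, threshold: Float = 0.01) throws -> Bool {
        let file = try AVAudioFile(forReading: url)        // decodes M4A to PCM
        let format = file.processingFormat                 // deinterleaved Float32
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(file.length)),
              let samples = buffer.floatChannelData?[0] else { return false }
        try file.read(into: buffer)

        // Root-mean-square of the samples as a rough loudness measure.
        var sum: Float = 0
        for i in 0..<Int(buffer.frameLength) {
            sum += samples[i] * samples[i]
        }
        let rms = buffer.frameLength > 0 ? sqrt(sum / Float(buffer.frameLength)) : 0
        return rms < threshold
    }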
Replies: 0 · Boosts: 0 · Views: 641 · Jan ’22

Writing the installTap buffer to an AVAudioFile seems to fail, data-wise or at the end
I am trying to save the buffer from my installTap to a file. I do it in chunks of 10 so I'll get a bigger file. When I try to play the written file (from the simulator's directory), QuickTime says that it's not compatible. I have examined the bad M4A file and a working one. There are a lot of zeros at the beginning of the bad file, followed by a lot of data; however, both files appear to have the same header. A lot of people mention that I have to nil the AVAudioFile, but audioFile = nil is not valid syntax here, nor can I find a close method on AVAudioFile. Here's the complete code, edited into one working file:

    import UIKit
    import AVFoundation

    class ViewController: UIViewController {

        let audioEngine = AVAudioEngine()
        var audioFile = AVAudioFile()
        var x = 0

        override func viewDidLoad() {
            super.viewDidLoad()
            record()
        }

        func makeFile(format: AVAudioFormat) {
            let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first
            do {
                _ = try FileManager.default.contentsOfDirectory(at: paths!, includingPropertiesForKeys: nil)
            } catch {
                print("error")
            }
            let destinationPath = paths!.appendingPathComponent("audioT.m4a")
            print("\(destinationPath)")
            do {
                audioFile = try AVAudioFile(forWriting: destinationPath, settings: format.settings)
                print("file created")
            } catch {
                print("error creating file")
            }
        }

        func record() {
            let node = audioEngine.inputNode
            let recordingFormat = node.inputFormat(forBus: 0)
            makeFile(format: recordingFormat)

            node.installTap(onBus: 0, bufferSize: 8192, format: recordingFormat) { [self] (buffer, _) in
                do {
                    try audioFile.write(from: buffer)
                    print("buffer filled")
                    x += 1
                    print("wrote \(x)")
                    if x > 9 {
                        endThis()
                    }
                } catch {
                    return
                }
            }

            audioEngine.prepare()
            do {
                try audioEngine.start()
            } catch let error {
                print("oh catch \(error)")
            }
        }

        func endThis() {
            audioEngine.stop()
            audioEngine.inputNode.removeTap(onBus: 0)
        }
    }
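For reference, the "nil the file" advice assumes the property is declared as an Optional: at the time of this post AVAudioFile had no public close(), and the header is finalized when the instance is deallocated. A sketch of that shape, as a variation on the code above rather than a quote from it:

    var audioFile: AVAudioFile?   // Optional, so the reference can be dropped

    func endThis() {
        audioEngine.inputNode.removeTap(onBus: 0)
        audioEngine.stop()
        audioFile = nil           // releasing the file finalizes its header on disk
    }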
Replies: 1 · Boosts: 0 · Views: 2.1k · Jan ’22

How to convert node.outputFormat to settings for AVAudioFile
Am trying to go from the installTap straight to AVAudioFile(forWriting:). I call:

    let recordingFormat = node.outputFormat(forBus: 0)

and I get back:

    <AVAudioFormat 0x60000278f750:  1 ch,  48000 Hz, Float32>

But AVAudioFile takes a settings parameter of type [String : Any], and I am curious how to place those values into the settings so the file records in the required format. Hopefully these are the values I need?
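For what it's worth, AVAudioFormat exposes exactly that dictionary through its settings property, so the format can be handed straight to the file. A minimal sketch, where destinationURL is a placeholder:

    let recordingFormat = node.outputFormat(forBus: 0)
    let audioFile = try AVAudioFile(forWriting: destinationURL,
                                    settings: recordingFormat.settings)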
Replies: 0 · Boosts: 0 · Views: 449 · Dec ’21

Is .installTap the equivalent of a C callback function?
Expanding a speech-to-text demo, and while it works, I am still trying to learn Swift. Is .installTap the Swift version of a C callback function? From what I interpret here, every time the buffer becomes full, the code between the braces runs, while the code below it also runs. It almost feels like a callback combined with a GOTO line from BASIC. Yes, it works, but I'd like to confirm that I am getting the flow of the code right.

    func startSpeechRecognition() {
        let node = audioEngine.inputNode
        let recordingFormat = node.outputFormat(forBus: 0)

        node.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, _) in
            self.request.append(buffer)
        }

        audioEngine.prepare()
        do {
            try audioEngine.start()
        } catch let error {
            ...
        }

        guard let myRecognition = SFSpeechRecognizer() else {
            ...
            return
        }

        if !myRecognition.isAvailable {
            ...
        }

        task = speechRecognizer?.recognitionTask(with: request, resultHandler: { (response, error) in
            guard let response = response else {
                if error != nil {
                    print("\(String(describing: error.debugDescription))")
                } else {
                    print("problem in response")
                }
                return
            }
            let message = response.bestTranscription.formattedString
            print("\(message)")
        })
    }
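For what it's worth, the trailing closure is indeed Swift's analogue of a C callback: installTap stores the block, the audio engine invokes it on a background thread each time a buffer fills, and the code after installTap continues immediately. The same pattern in miniature, with names that are mine:

    // Store a closure now, invoke it later: the shape of installTap.
    final class Tapper {
        private var handler: ((Int) -> Void)?

        func installTap(_ block: @escaping (Int) -> Void) {
            handler = block              // nothing runs yet
        }

        func bufferFilled(_ count: Int) {
            handler?(count)              // the "engine" fires the callback
        }
    }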
Replies: 5 · Boosts: 0 · Views: 1.4k · Dec ’21