
How to obtain an AVAudioFormat for a canonical format?
I receive a buffer from [AVSpeechSynthesizer convertToBuffer:fromBuffer:] and want to schedule it on an AVAudioPlayerNode. The player node's output format needs to be something the next node can handle, and as far as I understand most nodes can handle a canonical format. The format provided by AVSpeechSynthesizer is not something that AVAudioMixerNode supports. So the following:

```objc
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
playerNode = [[AVAudioPlayerNode alloc] init];
AVAudioFormat *format = [[AVAudioFormat alloc] initWithSettings:utterance.voice.audioFileSettings];
[engine attachNode:self.playerNode];
[engine connect:self.playerNode to:engine.mainMixerNode format:format];
```

throws an exception:

```
Thread 1: "[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 \"(null)\""
```

I am looking for a way to obtain the canonical format for the platform so that I can use AVAudioConverter to convert the buffer. Since different platforms have different canonical formats, I imagine there should be some library way of doing this; otherwise every developer would have to redefine it for each platform the code runs on (macOS, iOS, etc.) and keep it updated when it changes. I could not find any constant or function that produces such a format, ASBD, or settings dictionary. The smartest way I could think of, which does not work:

```objc
AudioStreamBasicDescription toDesc;
FillOutASBDForLPCM(toDesc, [AVAudioSession sharedInstance].sampleRate,
                   2, 16, 16, kAudioFormatFlagIsFloat, kAudioFormatFlagsNativeEndian);
AVAudioFormat *toFormat = [[AVAudioFormat alloc] initWithStreamDescription:&toDesc];
```

Even the provided example for iPhone, in the documentation linked above, uses kAudioFormatFlagsAudioUnitCanonical and AudioUnitSampleType, which are deprecated. So what is the correct way to do this?
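For what it's worth, AVAudioFormat's standard-format initializer appears to be the closest thing to a library-provided canonical format: it is documented to produce deinterleaved, native-endian Float32 PCM, which is what AVAudioEngine's nodes expect by default. A minimal sketch (in Swift, and only a sketch, not a confirmed answer):

```swift
import AVFoundation

// Derive a "standard" (canonical) format from a source format, keeping
// its sample rate and channel count and changing only the sample layout
// to deinterleaved, native-endian Float32 PCM.
func canonicalFormat(matching source: AVAudioFormat) -> AVAudioFormat? {
    AVAudioFormat(standardFormatWithSampleRate: source.sampleRate,
                  channels: source.channelCount)
}

// Usage, assuming `sourceFormat` was built from
// utterance.voice.audioFileSettings as in the snippet above:
// let converter = AVAudioConverter(from: sourceFormat,
//                                  to: canonicalFormat(matching: sourceFormat)!)
```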
Replies: 4 · Boosts: 0 · Views: 2.6k · Feb ’24
What happens to secrets stored by apps when iCloud Keychain is turned off?
In this FAQ it says:

> What happens when I turn off iCloud Keychain on a device?
> When you turn off iCloud Keychain for a device, you're asked to keep or delete the passwords and credit card information that you saved. If you choose to keep the information, it isn't deleted or updated when you make changes on other devices. If you don't choose to keep the information on at least one device, your Keychain data will be deleted from your device and the iCloud servers.

So I understand that data stored by the user, such as passwords, is removed. What about data stored by an app installed on an iOS device? Is it removed as well? Can we assume that data stored in the keychain will not be removed by means external to the app, or should we plan our app to handle such a situation?
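Whatever the policy turns out to be, it seems safest to treat a missing keychain item as a recoverable state rather than an invariant. A minimal sketch of a defensive read (the service and account names are hypothetical placeholders):

```swift
import Foundation
import Security

// Read a stored secret, treating errSecItemNotFound as "needs
// re-provisioning" instead of an unrecoverable error.
func loadToken() -> Data? {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.myapp",   // hypothetical
        kSecAttrAccount as String: "authToken",           // hypothetical
        kSecReturnData as String: true,
        kSecMatchLimit as String: kSecMatchLimitOne
    ]
    var result: CFTypeRef?
    let status = SecItemCopyMatching(query as CFDictionary, &result)
    switch status {
    case errSecSuccess:
        return result as? Data
    case errSecItemNotFound:
        // The item may have been removed outside the app; fall back to
        // re-authenticating or re-creating the secret.
        return nil
    default:
        return nil
    }
}
```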
Replies: 0 · Boosts: 0 · Views: 954 · Feb ’22
How to fit AVPlayer inside a container and align it to the top
I have the following simple code to present a video on the screen. This, or something similar, can be found in any tutorial about using AVPlayer from SwiftUI.

```swift
import SwiftUI
import AVFoundation

struct PlayerView: UIViewRepresentable {
    func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
    }

    func makeUIView(context: Context) -> UIView {
        return PlayerUIView(frame: .zero)
    }
}

class PlayerUIView: UIView {
    private let playerLayer = AVPlayerLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        let url = Bundle.main.url(forResource: "my_video", withExtension: "mp4")!
        let player = AVPlayer(url: url)
        player.play()
        playerLayer.player = player
        layer.addSublayer(playerLayer)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
    }
}
```

If I put PlayerView inside a VStack, the view takes as much vertical space as possible based on the sizes of the other views in the stack and then fills it horizontally with the video. So if it is the only view, it always puts the video in the center of the screen and leaves margins below and above it, even if I put a Spacer after it, and even if I put it inside a GeometryReader. What I want is for the view to "hug" the presented video as closely as possible and then behave like any other SwiftUI view. But I do not understand how SwiftUI does the layout in this case, when an AVPlayer inside a UIViewRepresentable is involved.
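A sketch of one possible workaround, assuming the video's aspect ratio is known ahead of time (16:9 here): giving the representable an explicit aspect ratio lets SwiftUI compute an ideal height from the proposed width, so the view hugs the video instead of greedily filling the stack.

```swift
import SwiftUI

// Reuses the PlayerView defined in the snippet above.
struct ContentView: View {
    var body: some View {
        VStack {
            PlayerView()
                // Assumed 16:9 content; .fit keeps the whole frame visible.
                .aspectRatio(16.0 / 9.0, contentMode: .fit)
            Spacer() // now actually pushes the player to the top
        }
    }
}
```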
Replies: 0 · Boosts: 0 · Views: 1.1k · Apr ’23
Text with standard font style appears smaller on iPadOS than on iOS
I used standard font styles in an iOS app, for example .font(.headline). I hoped that developing this way would make adapting the app to other platforms relatively easy. However, when I tried running it on iPadOS, the text did not increase in size to occupy the more abundant space offered by the larger screen; it actually shrank. Overall, the app does not look great "automatically". Why does this happen? What is the best practice for cross-platform development with SwiftUI with regard to font sizes? I want to make as few human interface design decisions on my own as possible and just let SwiftUI take care of everything. (But I also want the result to be something Apple would consider great-looking.)
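A sketch of one way to take slightly more control while still respecting Dynamic Type: @ScaledMetric keeps a size tracking the user's text-size setting, and the horizontal size class (used here as a rough iPad signal, which is an assumption, not a platform check) bumps the base value on regular-width layouts. The 1.3 factor is arbitrary.

```swift
import SwiftUI

struct HeadlineText: View {
    @Environment(\.horizontalSizeClass) private var sizeClass
    // Scales with Dynamic Type relative to the .headline style.
    @ScaledMetric(relativeTo: .headline) private var baseSize: CGFloat = 17

    var body: some View {
        Text("Welcome")
            .font(.system(size: sizeClass == .regular ? baseSize * 1.3 : baseSize))
    }
}
```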
Replies: 0 · Boosts: 0 · Views: 513 · Oct ’24
How to do a content transition when one image is a system symbol and the other is in the asset catalogue?
I have a save button:

```swift
Button(action: { saveTapped() }) {
    Label("Save", systemImage: wasSaved ? "checkmark" : "square.and.arrow.down")
}
.contentTransition(.symbolEffect(.replace))
.disabled(wasSaved)
```

Now I have created a new custom icon and added it to the asset catalogue, "custom.square.and.arrow.down.badge.checkmark", for when the wasSaved state is true. The issue I am facing is that to use it I need the Label initializer that takes image:, while to use the non-custom symbol I need the one that takes systemImage:. So how is it possible to transition between the two?
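One way around the two initializers (a sketch, reusing the saveTapped and wasSaved names from the snippet above) is to build the Label from an explicit icon view, so each branch can construct its Image however it likes:

```swift
Button(action: { saveTapped() }) {
    Label {
        Text("Save")
    } icon: {
        if wasSaved {
            // Custom symbol from the asset catalogue. Assumes it was
            // exported as a symbol image so the replace effect can animate it.
            Image("custom.square.and.arrow.down.badge.checkmark")
        } else {
            Image(systemName: "square.and.arrow.down")
        }
    }
}
.contentTransition(.symbolEffect(.replace))
.disabled(wasSaved)
```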
Replies: 1 · Boosts: 0 · Views: 526 · Dec ’24
How to connect a @Parameter of a Tip to my app's state?
I have an observable object which is a model for a view. I also have a Tip (from TipKit) with a @Parameter and a rule. The source of truth for the state lives in the observable object, but the Tip needs to be updated when that state changes. So how do I bind the two? What I was thinking of was to sink updates from objectWillChange and check whether the Tip's parameter needs to be updated, or to add didSet if it's a @Published property. But I am not sure this is the best practice. Schematic example:

```swift
import Combine
import SwiftUI
import TipKit

class MyModel: ObservableObject {
    struct TapSubmitTip: Tip {
        @Parameter
        static var isSubmitButtonEnabled: Bool = false

        var title: Text {
            Text("Tap \"Submit\" to confirm your selection.")
        }

        var rules: [Rule] {
            #Rule(Self.$isSubmitButtonEnabled) { $0 == true }
        }
    }

    let tapSubmitTip = TapSubmitTip()
    var objectWillChangeCancellable: AnyCancellable!

    // Used by the view to enable or disable the button.
    var isSubmitButtonEnabled: Bool {
        // Some logic to determine if the button should be enabled.
    }

    init() {
        objectWillChangeCancellable = objectWillChange.sink { [weak self] _ in
            guard let self else { return }
            if isSubmitButtonEnabled {
                TapSubmitTip.isSubmitButtonEnabled = true
            }
        }
    }

    // ... Call objectWillChange or update published properties as needed. ...
}
```
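If the relevant state can be funneled through one @Published property, the didSet variant mentioned above looks simpler than sinking objectWillChange. A sketch, meant to live inside MyModel from the snippet above (`selection` is a hypothetical name for the property the button state derives from):

```swift
// Inside MyModel, replacing the objectWillChange sink:
@Published var selection: String? {
    didSet {
        // Mirror model state into the Tip parameter so its #Rule
        // re-evaluates whenever the selection changes.
        TapSubmitTip.isSubmitButtonEnabled = (selection != nil)
    }
}
```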
Replies: 0 · Boosts: 0 · Views: 412 · Dec ’24