Subscription in iOS subscription list showing productId not localized name
Hi - I have an auto-renew subscription, and for some reason when the user looks at their subscriptions in the iOS subscription list, this one shows the productId as its name, e.g. com.company.app.subscriptionName, rather than the localized string, e.g. "Foo Monthly Subscription". I have other subscriptions and they all work as expected, showing the localized name. As far as I can tell, the one showing the productId is configured the same as the others. Any idea why this is happening and how to fix it? Thanks.
0 replies · 0 boosts · 883 views · May ’21

WeatherKit REST API production rate limiting
The documentation lists the number of calls per month you can make at the different pricing tiers here: https://developer.apple.com/weatherkit/get-started/. But will there also be per-second/minute/hour rate limits on calls to the API, as with other weather service APIs? For example, maybe you get 1,000,000 calls per month but there is some limit like 5 calls per second or 10,000 calls per hour imposed on the APIs. Also, what happens if you go over your allowed call limit for the month: will the API just return an error like HTTP 402, or will you get a warning for a period of time? Would you be able to buy extra calls for the month without having to upgrade to a new tier, if, say, you are experiencing a temporary increase in traffic?
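
In the meantime I'm planning to handle any 4xx response defensively. A minimal sketch of a WeatherKit REST call doing that; JWT creation/signing is omitted, WeatherKitError is my own type, and which status code Apple actually returns when you exceed the quota is exactly what I'm asking:

```swift
import Foundation

enum WeatherKitError: Error {
    case httpError(statusCode: Int)
}

// Fetches current weather via the WeatherKit REST API. `jwt` is assumed
// to be a valid, signed developer token.
func fetchCurrentWeather(latitude: Double, longitude: Double, jwt: String) async throws -> Data {
    let url = URL(string: "https://weatherkit.apple.com/api/v1/weather/en/\(latitude)/\(longitude)?dataSets=currentWeather")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(jwt)", forHTTPHeaderField: "Authorization")

    let (data, response) = try await URLSession.shared.data(for: request)
    if let http = response as? HTTPURLResponse, http.statusCode >= 400 {
        // 401, 402, or 429 all seem plausible for an over-quota response;
        // until it is documented, treat any error status as a signal to back off.
        throw WeatherKitError.httpError(statusCode: http.statusCode)
    }
    return data
}
```
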
0 replies · 2 boosts · 1.3k views · Jun ’22

Convert stereo to mono dynamically using AVAudioEngine
I am using AVAudioEngine. I have a stereo wav file that I want to use with an AVAudioEnvironmentNode, but that node requires a mono input source, otherwise it will not spatialize the sound. Is there some way I can convert the stereo input to mono dynamically, maybe using an AVAudioMixerNode or something else? It looks like I could use AVAudioConverter to convert the original buffer, but if possible I want to just add a node to the AVAudioEngine graph that converts to mono, rather than having to convert the input buffer explicitly. So something like: WAV -> AVAudioPlayerNode -> stereo-to-mono node -> AVAudioEnvironmentNode
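
One idea I'm considering, in case it helps anyone: insert an AVAudioMixerNode between the player and the environment node and make the mixer-to-environment connection mono, letting the mixer do the downmix. A sketch, assuming stereoFileURL points at the wav file; I haven't confirmed this is the intended approach:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let downmixer = AVAudioMixerNode()          // mixer nodes can change channel count
let environment = AVAudioEnvironmentNode()

engine.attach(player)
engine.attach(downmixer)
engine.attach(environment)

do {
    // stereoFileURL is assumed to point at the stereo wav file.
    let file = try AVAudioFile(forReading: stereoFileURL)

    // Player -> mixer in the file's native stereo format.
    engine.connect(player, to: downmixer, format: file.processingFormat)

    // Mixer -> environment with a one-channel format at the same sample rate;
    // the mixer performs the stereo-to-mono downmix on this connection.
    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                                   channels: 1)!
    engine.connect(downmixer, to: environment, format: monoFormat)

    // Environment -> main mixer so the spatialized audio reaches the output.
    engine.connect(environment, to: engine.mainMixerNode,
                   format: environment.outputFormat(forBus: 0))

    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
} catch {
    print("Engine setup failed: \(error)")
}
```
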
0 replies · 0 boosts · 1.6k views · Dec ’22

When to use: AVAudioSession.RouteSharingPolicy.longFormAudio
The docs for this property say: "Apps that play long-form audio, such as music or audio books, can use this policy to play to the same output as the built-in Music and Podcast apps. Long-form audio apps should also use the Media Player framework to add support for remote control events and to provide Now Playing information." That doesn't really say what the benefit of adding this option is, or what the drawbacks are. Does somebody have more information on when you should use this policy and the potential issues it might cause? We are working on a music app that will play music in the background, so it sounds like we should adopt this policy, but I can't find any information about why you should. Thanks
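
For reference, adopting the policy is a one-argument change when configuring the audio session; a minimal sketch of what we're planning:

```swift
import AVFoundation

do {
    // The policy argument is the only difference from a standard
    // .playback configuration.
    try AVAudioSession.sharedInstance().setCategory(.playback,
                                                    mode: .default,
                                                    policy: .longFormAudio)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}
```
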
0 replies · 0 boosts · 808 views · Mar ’23

Information on AudioPlaybackIntent
Is there any information on what AudioPlaybackIntent does? The documentation doesn't give any details: https://developer.apple.com/documentation/appintents/audioplaybackintent. I have an audio app and I want to add a play/pause button to our widget for iOS 17. I'm assuming I need to use an AudioPlaybackIntent, but there is no documentation on what it does. Thanks.
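
For anyone searching later, here is the shape I'm experimenting with: an intent conforming to AudioPlaybackIntent wired to an interactive widget button. AudioManager is a hypothetical stand-in for the app's playback controller, and the assumption that conforming to AudioPlaybackIntent (rather than plain AppIntent) lets the system run the action as audio playback without opening the app is exactly what I'm hoping someone can confirm:

```swift
import AppIntents

struct PlayPauseIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Play/Pause"

    func perform() async throws -> some IntentResult {
        // AudioManager is a hypothetical singleton standing in for the
        // app's real playback controller.
        AudioManager.shared.togglePlayback()
        return .result()
    }
}
```

And in the widget view:

```swift
import SwiftUI
import WidgetKit

struct PlayerWidgetView: View {
    var body: some View {
        // iOS 17 interactive widgets accept an intent-backed Button.
        Button(intent: PlayPauseIntent()) {
            Image(systemName: "playpause.fill")
        }
    }
}
```
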
0 replies · 0 boosts · 868 views · Sep ’23

Crashes launching app from Camera control button
I'm using the LockedCameraCaptureExtension to launch my camera app from the Camera Control button. Some of our users report our app crashing a few seconds after being launched from the camera button. The call stack is something inside BoardServices: [BSServicesConfiguration activateXPCService]. I'm not sure how to investigate or fix this; is anyone else having the same issue? Basically, when my LockedCameraCaptureExtension loads, I just call openApplication on the LockedCameraCaptureSession to launch into my app:

```swift
@main
struct captureExtension: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            Button(action: {
                Task { await openCamera(session: session) }
            }, label: {
                Text("Open Camera")
            })
            .buttonStyle(PlainButtonStyle())
            .task {
                await openCamera(session: session)
            }
        }
    }

    private func openCamera(session: LockedCameraCaptureSession) async {
        try? await session.openApplication(for: NSUserActivity(activityType: "com.mycompany.camera"))
    }
}
```
0 replies · 0 boosts · 81 views · Sep ’25

White flash using manageSubscriptionsSheet in SwiftUI
In SwiftUI I am using the manageSubscriptionsSheet modifier to open the iOS subscription-management screen. When it is presented, it immediately flashes a white view and then animates the subscription screen up from the bottom, which looks pretty bad. The view I am calling manageSubscriptionsSheet on is itself presented in a sheet, so maybe presenting the subscriptions view on top of that is causing the visual glitch. Is there any way to avoid the white flash when opening the subscription screen?
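
For context, a stripped-down version of the setup, with the modifier applied to content that is itself presented in a sheet (view and property names are made up for illustration):

```swift
import SwiftUI
import StoreKit

struct SettingsSheet: View {
    @State private var showManageSubscriptions = false

    var body: some View {
        Button("Manage Subscription") {
            showManageSubscriptions = true
        }
        // Presenting the system subscription UI from a view that is
        // already inside a sheet is where the white flash appears.
        .manageSubscriptionsSheet(isPresented: $showManageSubscriptions)
    }
}
```
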
0 replies · 0 boosts · 137 views · Oct ’25

String(localized:locale) ignores locale parameter
I'm trying to load a localized string for a specific Locale, for example: String(localized: "someKey", locale: Locale(identifier: "fr")). However, the locale I pass in is being ignored: no matter what I set it to, the string comes back using Locale.current rather than the locale parameter. Am I doing something wrong? Is there some way to force a specific locale?
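
The workaround I've found (with the caveat that I'm not sure whether the locale parameter is even supposed to select the strings table): load the language-specific .lproj bundle and pass it via the bundle parameter instead:

```swift
import Foundation

// Looks up a key in the .lproj bundle for a specific language, falling
// back to the default lookup if that bundle is missing. The function name
// and fallback behavior are illustrative, not an Apple API.
func localizedString(_ key: String.LocalizationValue, language: String) -> String {
    guard let path = Bundle.main.path(forResource: language, ofType: "lproj"),
          let languageBundle = Bundle(path: path) else {
        return String(localized: key)
    }
    return String(localized: key, bundle: languageBundle)
}

let french = localizedString("someKey", language: "fr")
```
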
1 reply · 0 boosts · 799 views · Jan ’24

Elapsed scene time used in custom shader uniform
If you create a custom shader you get access to a collection of uniform values. One of them is the uniforms::time() parameter, which is defined as "the number of seconds that have elapsed since RealityKit began rendering the current scene" in this doc: https://developer.apple.com/metal/Metal-RealityKit-APIs.pdf. Is there some way to get this value from Swift code? I want to animate a value in my shader based on the time, so I need the starting time value in order to interpolate the animation offset from that point. If I create a System, its update() function receives a SceneUpdateContext instance, which has a deltaTime property but not the elapsedTime property I would expect to map to the shader's time() value.
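
The best I've come up with so far is accumulating deltaTime in a System to keep my own clock; a sketch, with the caveat that this clock starts when the System first updates, which may not exactly match the shader's time() origin:

```swift
import Foundation
import RealityKit

// Accumulates per-frame deltas so Swift code has an elapsed-time value
// roughly comparable to the shader's uniforms::time().
class ElapsedTimeSystem: System {
    static var elapsedTime: TimeInterval = 0

    required init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        Self.elapsedTime += context.deltaTime
    }
}

// Register once, early in the app's lifecycle:
// ElapsedTimeSystem.registerSystem()
```
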
1 reply · 0 boosts · 889 views · Aug ’24

Loading a DNG into CIRAWFilter and using HDR
I have DNG files that I want to open and show as EDR content in my app. It seems like the DNG files should have enough per-pixel information to show more colors than Display P3, but whenever I load the images using CIRAWFilter and then inspect the outputImage color space, it is always "DisplayP3", never something like "ITU-R BT.2100 PQ", and there doesn't seem to be any way to make it load with a different color space for displaying EDR images. Does this make sense for DNG files? It seems like it should be possible. If I open the same file using CIImage with the expandToHDR option, e.g. CIImage(contentsOf: rawURL, options: [.expandToHDR: true]), then it does have the desired EDR color space, but then I don't get any of the properties available via the CIRAWFilter class to manipulate the data. Basically I just want to be able to open the DNG file via CIRAWFilter and then display it in my SwiftUI app as an EDR image by adding the allowedDynamicRange(.high) modifier: Image("my-dng-image").allowedDynamicRange(.high). Or do DNG files (plain RAW, not ProRAW) not contain enough information to be displayed as EDR images? It seems like they should.
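
To make the comparison concrete, here's a sketch of the two loading paths (the file path is made up):

```swift
import CoreImage

let rawURL = URL(fileURLWithPath: "/path/to/photo.dng")  // assumed path

// Path 1: CIRAWFilter exposes the RAW adjustment properties, but the
// output color space comes back as Display P3.
if let rawFilter = CIRAWFilter(imageURL: rawURL),
   let rawImage = rawFilter.outputImage {
    print(rawImage.colorSpace as Any)    // "DisplayP3"
}

// Path 2: expandToHDR yields an EDR color space, but none of the
// CIRAWFilter adjustment properties.
if let hdrImage = CIImage(contentsOf: rawURL, options: [.expandToHDR: true]) {
    print(hdrImage.colorSpace as Any)    // e.g. "ITU-R BT.2100 PQ"
}
```
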
1 reply · 0 boosts · 129 views · Mar ’25