I want to send my app silent push notifications while it is running in the background, but I've been unable to get it to work.
I understand how to send the notification by setting the content-available flag, and I've implemented the "didReceiveRemoteNotification" function in my app delegate. It works, but only if the app is open and running in the foreground. As soon as I put it in the background, didReceiveRemoteNotification doesn't get called again unless I reopen the app.
I feel certain that I should be able to receive these notifications when my app is in the background. I have both "Remote notifications" and "Background processing" checked in Signing & Capabilities.
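For reference, here is a stripped-down version of what I have. It's a simplified sketch: the real handler does more than print, and the payload shown in the comment is the minimal one I'm sending.

import UIKit

// Simplified sketch of my setup. The payload I send is just:
//   { "aps": { "content-available": 1 } }
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Fires while the app is in the foreground, but never while it is backgrounded.
        print("Silent push received: \(userInfo)")
        completionHandler(.newData)
    }
}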
Thanks,
Frank
I'm working on a BLE app with a hardware partner using samples of their device, which I am able to erase and re-flash when needed.
When I try to use the device it initially works fine. I can connect to it, make requests, and get back responses.
However, after a short time and a few successful runs, I start getting the error "Encryption is insufficient". The error is seen in the "didUpdateValueFor" function. I can still connect to the device, and enumerate all of its services and characteristics, but I can't read any data.
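For reference, this is roughly where I'm seeing the error (simplified; "BLEManager" is just a placeholder name for my delegate object):

import CoreBluetooth

final class BLEManager: NSObject, CBPeripheralDelegate {
    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        if let error = error {
            // After a few successful runs, this starts reporting
            // "Encryption is insufficient" (CBATTError.insufficientEncryption).
            print("Read failed for \(characteristic.uuid): \(error)")
            return
        }
        // Normally characteristic.value would be parsed here.
    }
}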
If at this point I re-flash the device and also go into the iOS Settings app and "forget" it, then I can use the device normally again, at least for a while.
What causes this? Is there a way I can fix it in my app?
Thanks,
Frank
I'm not getting any values from the floorsAscended or floorsDescended properties of CMPedometerData. I tested it by walking up and down a flight of stairs twice while monitoring for pedometer updates for 60 seconds. I also queried for the pedometer data separately at the end of the time period in case there were any updates I missed.
I'm using an iPhone 13 Pro for my tests. I did check that CMPedometer.isFloorCountingAvailable() is true, and I am getting other kinds of pedometer data such as distance, pace, and steps.
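Here is the gist of my monitoring code (trimmed down; error handling, the separate query, and the 60-second timer are omitted):

import Foundation
import CoreMotion

let pedometer = CMPedometer()

func startMonitoring() {
    guard CMPedometer.isFloorCountingAvailable() else { return }
    pedometer.startUpdates(from: Date()) { data, error in
        guard let data = data, error == nil else { return }
        // steps, distance, and pace come through fine, but
        // floorsAscended / floorsDescended are always nil for me.
        print("floors up: \(String(describing: data.floorsAscended)), " +
              "floors down: \(String(describing: data.floorsDescended))")
    }
}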
Is there something else I need to do in order to enable floor counting?
Thanks,
Frank
I added this file, as Apple requested, because my app uses UserDefaults, but I still get a complaint when I upload.
This is the message:
ITMS-91053: Missing API declaration - Your app’s code in the “Production” file references one or more APIs that require reasons, including the following API categories: NSPrivacyAccessedAPICategoryUserDefaults.
And this is my file, what's wrong?
<dict>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>CA92.1</string>
            </array>
        </dict>
    </array>
</dict>
I have a new app that needs to be submitted for review this week. When I tried to submit it, I was told I could not do so because "Under the Digital Services Act, you must provide and verify information regarding your account".
I am working on behalf of a large corporate customer. They are telling me that they cannot do anything without consulting their legal team, which is going to take time. In the meantime, they asked me if I could omit the European region from the app's distribution list. I tried this, but it did not work.
I manage about 20 apps for different customers and I have never seen this requirement appear on any other account. Is it new? Does it only apply to certain kinds of accounts, or to new apps, or new accounts publishing their first app?
If this is a European Union requirement, why is it needed if I don't distribute to EU countries?
Thanks,
Frank
I'm assisting a customer with an iOS app. He has a personal (non-company) Apple Developer account. I know that this kind of account didn't use to support collaborators, but I'm not sure what the current status is.
He was able to add me to his account and give me permissions (Developer, App Manager). However, when I run Xcode, his account does not appear in the list of Teams under my Apple ID, which is preventing me from working.
Is this a bug or a temporary problem, or is the fact that his account is a personal account what's preventing me from doing this?
Hi,
I have a universal app that uses the camera along with a preview layer (AVCaptureVideoPreviewLayer) to display a preview of the live camera view. It works fine on an iPhone, and fine on an iPad as long as my app is full screen. However, if I open another app in Split View mode, my app loses its live camera view and freezes on the last frame captured.
I've tried stopping the AVCaptureSession before the transition and restarting it afterward, and even tearing the whole thing down and building a new one, but neither works. My app is checking permissions and the permissions are fine, and all of the objects created seem valid.
Is there some rule that disallows apps from accessing the camera in split-screen mode?
Frank
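P.S. Here is roughly what I tried around the transition. It's a simplified sketch; I'm assuming viewWillTransition is the right hook, and all of the session/preview setup lives elsewhere.

import AVFoundation
import UIKit

final class CameraViewController: UIViewController {
    let session = AVCaptureSession()   // configured elsewhere with the camera input
    var previewLayer: AVCaptureVideoPreviewLayer?

    override func viewWillTransition(to size: CGSize,
                                     with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        // Stop before the Split View transition and restart afterward.
        session.stopRunning()
        coordinator.animate(alongsideTransition: nil) { _ in
            self.session.startRunning()   // the preview still freezes on the last frame
        }
    }
}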
How do I get a background image to fill the screen in my launch screen on an iPhone X? I have aligned the image to the top and bottom layout guides, but this leaves large white gaps.
I'm confused about the SF Pro fonts. Can these be used in our apps?
I tried pasting characters from SF Pro into a label, but was unable to get them to display properly. "SF Pro" doesn't appear in the list of available fonts in Xcode.
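Here is the kind of thing I tried (simplified; the character below is just a stand-in for the real pasted glyph, and I'm assuming the font name lookup is the part that fails):

import UIKit

let label = UILabel()
label.text = "\u{100000}"   // stand-in for a glyph pasted from SF Pro
// "SF Pro" never shows up as an available font name, so this lookup fails for me.
label.font = UIFont(name: "SF Pro", size: 17) ?? .systemFont(ofSize: 17)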
If these are not intended to be used by app developers, then what is their purpose?
Are "SF Symbols" different that SF Pro? What about the list of icons that appears in the "Symbols Library" in Xcode? There are so many different sources of symbols and icons, it is very confusing.
If any of these sources is OK to use in an iOS app, is it also OK to export them for use in the event that business needs require me to create an alternate version of my app for some hypothentical non-iOS platform?
Thanks,
Frank
I installed a custom font (Font Awesome) into my app. I triple-checked that I did everything right: the font files are included in the bundle (they appear in the "Copy Bundle Resources" build phase) and the names of the fonts appear in the Info.plist file under "Fonts provided by application".
In Interface builder, I select a Label, set the font to "Custom", then I click the Family list to select the font I want.
Once or twice, I was actually able to see the Font Awesome fonts in this list and select one. However, they no longer appear there when I create new labels in new views. I do not understand why. I've been limping along by copying a label from one of the views where it worked and pasting it into the new view, but this is tiresome.
I know the fonts are installed correctly because I can see them when I run the app.
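For what it's worth, here is a quick runtime check (just a debugging snippet, not something in the shipping app) that shows the fonts really are loaded:

import UIKit

// Prints every font family the app can see whose name contains "awesome".
for family in UIFont.familyNames where family.localizedCaseInsensitiveContains("awesome") {
    print(family, UIFont.fontNames(forFamilyName: family))
}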
Why are the fonts not showing up on the font list in Interface Builder?
I've been using Xcode and Objective-C for years, and rely heavily on the built-in documentation, which is for the most part good. There's an issue that's always bothered me.
Today I was looking up a function called class_getName. I found its documentation page, and was also able to navigate to the page showing the overview of the runtime system, all of which is fine. When I typed "class_getName" into my program, the compiler immediately flagged it as an unknown symbol.
I knew I probably had to include a header, but none of the documentation pages I looked at mentioned the name of the header file. I ended up searching the internet for examples until I hit one that showed <objc/runtime.h> being imported.
Why isn't this very small, simple, and extremely useful bit of information included in the documentation?
Frank
Hi,
It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.
For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wifi network where the only other thing on the network is the device that is serving the RTSP stream).
Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:
1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.
2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.
3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.
4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).
5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.
What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.
Any ideas, tips, or advice would be greatly appreciated.
Thanks,
Frank
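P.S. For idea 2, this is the rough shape I have in mind. It's purely a hypothetical sketch; I haven't verified that AVFoundation will ever route requests through a custom URLProtocol, and the class and method bodies below are placeholders.

import Foundation

// Hypothetical sketch: a URLProtocol subclass that claims rtsp:// URLs and
// would hand depacketized data back to the URL loading system.
final class RTSPURLProtocol: URLProtocol {

    override class func canInit(with request: URLRequest) -> Bool {
        return request.url?.scheme == "rtsp"
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        return request
    }

    override func startLoading() {
        // Here I would pull packets from the RTSP session (via ffmpeg),
        // wrap them in a container AVFoundation understands, and feed them to
        // client?.urlProtocol(self, didLoad: data). This is the part I'm not
        // sure is even possible.
    }

    override func stopLoading() {
        // Tear down the RTSP session.
    }
}

// Registration would look like:
// URLProtocol.registerClass(RTSPURLProtocol.self)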
I have chronic problems with the connection between Xcode and my phone.
I plug the phone in and Xcode says "Waiting for phone to unlock", but the phone is already unlocked. I try locking and unlocking it, but nothing happens.
If I can get past this problem by disconnecting and reconnecting the phone a couple of times, it gets into the "downloading symbols" phase and never gets out of it.
Finally, even though I have checked the "Connect via network" option, it never works, and I can never connect unless I plug in the phone with a USB cord (yes, the phone and the computer are on the same Wifi network).
Some days this is just an annoyance, but some days (like today) I really need to test something on my phone for a customer who's waiting for it, and I cannot. The thing I'm trying to test involves sending text messages so I can't use the simulator or even an iPad. What can I do to debug this problem?
Hi,
I'm having two problems using the scheduleBuffer function of AVAudioPlayerNode.
Background: my app generates audio programmatically, which is why I am using this function. I also need low latency. Therefore, I'm using a strategy of scheduling a small number of buffers, and using the completion handler to keep the process moving forward by scheduling one more buffer for each one that completes.
I'm seeing two problems with this approach:
One, the total memory consumed by my app grows steadily while the audio is playing, which suggests that the audio buffers are never being deallocated or some other runaway process is underway. (The Leaks tool doesn't detect any leaks, however.)
Two, audio playback sometimes stops, particularly on slower devices. By "stops", what I mean is that at some point I schedule a buffer and the completion block for that buffer is never called. When this happens, I can't even clear the problem by stopping the player.
Now, regarding the first issue, I suspected that if my completion block recursively scheduled another buffer with another completion block, I would probably end up blowing out the stack with an infinite recursion. To get around this, instead of directly scheduling the buffer in the completion block, I set it up to enqueue the schedule in a dispatch queue. However, this doesn't seem to solve the problem.
Any advice would be appreciated. Thanks.
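P.S. Here is the shape of my scheduling loop (heavily simplified; the engine setup and the actual audio synthesis are omitted, and the names are placeholders):

import Foundation
import AVFoundation

final class ToneGenerator {
    private let engine = AVAudioEngine()        // attach/connect/start omitted here
    private let player = AVAudioPlayerNode()
    private let queue = DispatchQueue(label: "audio.scheduling")

    func scheduleNext() {
        let buffer = makeNextBuffer()
        player.scheduleBuffer(buffer) { [weak self] in
            // Hop off the completion callback and schedule one more buffer
            // for each one that finishes.
            self?.queue.async { self?.scheduleNext() }
        }
    }

    private func makeNextBuffer() -> AVAudioPCMBuffer {
        // Placeholder: the real code synthesizes audio into the buffer.
        let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4_096)!
        buffer.frameLength = buffer.frameCapacity
        return buffer
    }
}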
Hi,
I have an app that was approved and is "Pending Developer Release". While we were waiting to release the app, someone found a bug that we'd like to fix.
Unfortunately, I can neither upload a new version of the app, nor create a new version in iTunes Connect (the option to add a new iOS version is disabled).
Is there any way to revoke the approved version without releasing it to the App Store? I don't mind having to wait for Apple to review the updated version again.
Frank