I have chronic problems with the connection between Xcode and my phone.
I plug the phone in and Xcode says "Waiting for phone to unlock", but the phone is already unlocked. I try locking and unlocking it, but nothing happens.
If I manage to get past this problem by disconnecting and reconnecting the phone a couple of times, it gets into the "downloading symbols" phase and never gets out of it.
Finally, even though I have checked the "Connect via network" option, it never works, and I can never connect unless I plug in the phone with a USB cord (yes, the phone and the computer are on the same Wifi network).
Some days this is just an annoyance, but some days (like today) I really need to test something on my phone for a customer who's waiting for it, and I cannot. The thing I'm trying to test involves sending text messages so I can't use the simulator or even an iPad. What can I do to debug this problem?
Hi,

It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wifi network where the only other thing on the network is the device that is serving the RTSP stream).

Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.

3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.

4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.

What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, or advice would be greatly appreciated.

Thanks,
Frank
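To make idea (2) concrete, here is the kind of wiring I have in mind, sketched with AVAssetResourceLoaderDelegate (the interception point AVURLAsset exposes for URLs it can't load itself; NSURLProtocol is generally reported not to be consulted by AVFoundation). The class, scheme, and URL below are made up, and whether this path can actually be fed raw RTSP packets is exactly my open question:

import AVFoundation

final class RTSPResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Hypothetical: describe the stream via loadingRequest.contentInformationRequest,
        // push bytes from the ffmpeg-backed RTSP client with
        // loadingRequest.dataRequest?.respond(with:), then call loadingRequest.finishLoading().
        return true  // true = we'll satisfy this request asynchronously
    }
}

let loaderDelegate = RTSPResourceLoader()
// A made-up custom scheme forces AVFoundation to route loading through the delegate.
let asset = AVURLAsset(url: URL(string: "rtsp-custom://camera/stream")!)
asset.resourceLoader.setDelegate(loaderDelegate, queue: DispatchQueue(label: "rtsp.loader"))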
Hi,

I have an app that was approved and is "Pending developer release". While we were waiting to release the app, someone found a bug that we'd like to fix.

Unfortunately, I can neither upload a new version of the app, nor create a new version in iTunes Connect (the option to add a new iOS version is disabled).

Is there any way to revoke the approved version without releasing it to the app store? I don't mind having to wait for Apple to review the updated version again.

Frank
I've seen examples of Swift code with #if DEBUG / #endif, and have even used it myself.

I know exactly how preprocessor macros work in C, C++, and Objective-C. In those languages, I know I can say:

#ifdef DEBUG
... some code ...
#endif

And I know that if the DEBUG flag is false at build time, the code will not only not run, but it won't even be compiled. This is important to me because the code inside that block contains some sensitive information that must not end up in my compiled code for non-debug builds.

In Swift, I'm really not sure what happens. I know the code doesn't execute, but I'm also told that Swift doesn't have a preprocessor. So, what exactly is going on here?

Specifically, can I hide sensitive information inside an #if DEBUG block in Swift and be assured that it won't get compiled or in any way be present in the executable when the DEBUG flag is false? Or is #if DEBUG evaluated at runtime in Swift?

Thanks,
Frank
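For reference, here is the Swift construct I'm asking about, as I'd use it (a minimal sketch; the DEBUG condition comes from the target's Active Compilation Conditions build setting, and the values are made-up placeholders):

#if DEBUG
// Active only when DEBUG is defined under "Active Compilation Conditions"
// (SWIFT_ACTIVE_COMPILATION_CONDITIONS) for the build configuration.
let apiEndpoint = "https://staging.example.com"  // placeholder
#else
let apiEndpoint = "https://api.example.com"      // placeholder
#endif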
Xcode won't remember my Git credentials if I close and re-open it. It remembers them while Xcode is running, but always asks me again whenever I shut it down and reopen it.
My Git repository is hosted with Beanstalk, which is not one of the ones listed in the drop-down list in Xcode.
I have saved my credentials with Git on the command line and I'm able to do Git command line operations without re-entering them, but Xcode doesn't seem to recognize this.
I'm really getting tired of retyping my password all the time. What else can I do?
I'm confused about the SF Pro fonts. Can these be used in our apps?
I tried pasting characters from SF Pro into a label, but was unable to get them to display properly. "SF Pro" doesn't appear in the list of available fonts in Xcode.
If these are not intended to be used by app developers, then what is their purpose?
Are "SF Symbols" different that SF Pro? What about the list of icons that appears in the "Symbols Library" in Xcode? There are so many different sources of symbols and icons, it is very confusing.
If any of these sources is OK to use in an iOS app, is it also OK to export them for use in the event that business needs require me to create an alternate version of my app for some hypothentical non-iOS platform?
Thanks,
Frank
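(For context on what I mean by the Xcode symbols library: the standard way to use one of those glyphs in code is by name, as in the sketch below, where "folder" is just an example symbol name.)

import UIKit

// SF Symbols are addressed by name rather than selected as a font (iOS 13+).
let icon = UIImage(systemName: "folder")
let imageView = UIImageView(image: icon)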
Is there any way an account owner can authorize a different user to accept updated license agreements?
I manage multiple apps on behalf of customers, most of whom fail to accept these agreements until the day they need me to publish a new app and I have to hunt them down and ask them to do it. If it were one or two customers it wouldn't be a big deal, but I have nearly 40 of them, and this happens several times per year. It's a major hassle.
I have a new app that needs to be submitted for review this week. When I tried to submit it, I was told I could not do so because "Under the Digital Services Act, you must provide and verify information regarding your account".
I am working on behalf of a large corporate customer. They are telling me that they cannot do anything without consulting their legal team, which is going to take time. In the meantime, they asked me if I could omit the European region from the app's distribution list. I tried this, but it did not work.
I manage about 20 apps for different customers and I have never seen this requirement appear on any other account. Is it new? Does it only apply to certain kinds of accounts, or to new apps, or new accounts publishing their first app?
If this is a European Union requirement, why is it needed if I don't distribute to EU countries?
Thanks,
Frank
I have an extremely straightforward situation where an @IBOutlet property in a ViewController is connected to a view in a XIB file. I've been working with iOS apps for more than ten years, and have done this about a million times.
For some reason, the property becomes nil at some point after the view is loaded. I can check with the debugger to see that it is not nil at viewDidLoad, and there is nothing in my code that sets it to anything else.
I added a custom setter and getter to the variable so that I could stop in the debugger when it gets set, and the setter only gets called once, with a non-nil value.
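Concretely, the observer looks roughly like this (nameLabel and MyViewController are stand-ins for my actual outlet and class; a property observer is the Swift equivalent of the custom setter):

import UIKit

class MyViewController: UIViewController {  // stand-in for my real ViewController
    @IBOutlet weak var nameLabel: UILabel! {
        didSet {
            // A breakpoint on this line fires every time anything assigns the outlet.
            print("nameLabel set to \(String(describing: nameLabel))")
        }
    }
}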
I suspect that somehow, a different copy of my ViewController is getting instantiated, but when it does, there are no calls to any of the usual methods like viewDidLoad. In fact there is not even a call to the init method. I don't understand how this is possible.
I was added to a team yesterday as a Developer. I can see this team when I log in to App Store Connect, but it does not appear on the Teams list in Xcode. How do I get Xcode to refresh this list?
I'm writing an app in which the user is expected to initiate location tracking, let the app track for a period of time (a few minutes to a couple of hours) and then discontinue tracking. We want the user to be able to switch apps or let their device lock while tracking without losing any location updates.
My understanding is that this can be done with the "While in use" location permission and does not require "Always". We don't want to have to ask our users for the "Always" permission.
I'm configuring the location manager this way:
locationManager.delegate = self
locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
locationManager.allowsBackgroundLocationUpdates = true
locationManager.showsBackgroundLocationIndicator = true
locationManager.distanceFilter = kCLDistanceFilterNone
locationManager.activityType = .otherNavigation
locationManager.pausesLocationUpdatesAutomatically = false
(The user is expected to be walking around in an outdoor location, stopping occasionally to take notes and pictures).
I've tested this using both an iPhone and an iPad that relies on an external GPS device. It works. I can lock the device and see a continuous stream of location updates in the debugger for hours. I've also tested it while walking outdoors.
However, my customer keeps reporting that the app stops tracking his location whenever it goes into the background. He says that it will track his location fine while in the foreground, but when he backgrounds it, it stops getting location updates. Then when it comes into the foreground again, it resumes. When we plot the locations on a map, we see a straight line between the place where the app went into the background and where it woke up again. We know for sure that the app is just transitioning to and from the background and that it is not being terminated and restarted.
I can't reproduce this result on my devices and can't figure out what I'm doing wrong. The customer says he has another app on his device (which is also an iPad with an external GPS) and that the other app does track him when it is in the background.
My app does process all of the locations received in the didUpdateLocations method and not just the last one, so it's not that I'm getting the updates and ignoring them. I'm also not receiving any calls to 'locationManagerDidPauseLocationUpdates', 'didFinishDeferredUpdatesWithError', or 'didFailWithError'.
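For reference, my delegate callback is shaped roughly like this (record(_:) is a stand-in for my actual processing):

import CoreLocation

class TrackingDelegate: NSObject, CLLocationManagerDelegate {
    func record(_ location: CLLocation) {
        // Stand-in for the app's real processing/persistence of each fix.
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Background delivery can batch several fixes into one callback,
        // so handle every element, not just locations.last.
        for location in locations {
            record(location)
        }
    }
}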
The only explanation I can think of at the moment is that something changed in iOS. I know that the other app my customer is using is fairly old and built against an old version of the iOS SDK.
Thanks for your help.
Hi,

I'm having two problems using the scheduleBuffer function of AVAudioPlayerNode.

Background: my app generates audio programmatically, which is why I am using this function. I also need low latency. Therefore, I'm using a strategy of scheduling a small number of buffers, and using the completion handler to keep the process moving forward by scheduling one more buffer for each one that completes.

I'm seeing two problems with this approach:

One, the total memory consumed by my app grows steadily while the audio is playing, which suggests that the audio buffers are never being deallocated or some other runaway process is underway. (The Leaks tool doesn't detect any leaks, however.)

Two, audio playback sometimes stops, particularly on slower devices. By "stops", what I mean is that at some point I schedule a buffer and the completion block for that buffer is never called. When this happens, I can't even clear the problem by stopping the player.

Now, regarding the first issue, I suspected that if my completion block recursively scheduled another buffer with another completion block, I would probably end up blowing out the stack with an infinite recursion. To get around this, instead of directly scheduling the buffer in the completion block, I set it up to enqueue the schedule in a dispatch queue. However, this doesn't seem to solve the problem.

Any advice would be appreciated. Thanks.
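Here is a minimal sketch of the scheduling loop I described (playerNode and makeBuffer() are stand-ins for my player node and audio generator):

import AVFoundation

let schedulingQueue = DispatchQueue(label: "audio.scheduling")

func scheduleNext(on playerNode: AVAudioPlayerNode, makeBuffer: @escaping () -> AVAudioPCMBuffer) {
    let buffer = makeBuffer()  // stand-in for the app's programmatic audio generator
    playerNode.scheduleBuffer(buffer) {
        // Bounce to a serial queue rather than scheduling directly inside the
        // completion handler, to avoid unbounded recursion.
        schedulingQueue.async {
            scheduleNext(on: playerNode, makeBuffer: makeBuffer)
        }
    }
}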
Hi,

I'm having a really weird problem. Brand new project, using a UINavigationController. I want to set the background color of my navigation bar to match the background color of my root view controller.

I thought that this was the right way to do it:

[[UINavigationBar appearance] setTranslucent:NO];
[[UINavigationBar appearance] setBarTintColor:[UIColor colorWithRed:0x2d/255.0 green:0x55/255.0 blue:0x97/255.0 alpha:1.0]];

However, this generates a navigation bar with a color value of #234185 instead of the expected #2D5597. What is the explanation for this, and how do I fix it?

Thanks,
Frank
Hi,

How do I check to see if a Character is in a Character Set?

In Objective-C, I could write:

[[NSCharacterSet lowercaseLetterCharacterSet] characterIsMember:c]

Where "c" is a variable of type char.

In Swift, I tried to write:

CharacterSet.lowercaseLetters.contains(c)

Where "c" is of type Character, but I received an error saying that the function was expecting a parameter of type "Unicode.Scalar".

Thanks,
Frank
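For context on the type mismatch: CharacterSet.contains takes a Unicode.Scalar, and a Character can be composed of several scalars, so a per-Character test has to go through the scalars explicitly. A minimal sketch:

import Foundation

let c: Character = "a"

// A Character may contain multiple Unicode scalars; test every one of them.
let isLowercase = c.unicodeScalars.allSatisfy {
    CharacterSet.lowercaseLetters.contains($0)
}
print(isLowercase)  // true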
Hi,

I have a universal app that uses the camera along with a preview layer (AVCaptureVideoPreviewLayer) to display a preview of the live camera view.

It works fine on an iPhone, and fine on an iPad as long as my app is full screen. However, if I open another app in split-view mode, my app loses its live camera view and freezes on the last frame captured.

I've tried stopping the AVCaptureSession before the transition and restarting it afterward, and even tearing the whole thing down and building a new one, but neither works. My app is checking permissions and the permissions are fine, and all of the objects created seem valid.

Is there some rule that disallows apps from accessing the camera in split screen mode?

Frank
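In case it helps diagnose this, here is a minimal sketch of observing capture-session interruptions and logging the reason code (the session constant is a stand-in for my real AVCaptureSession):

import AVFoundation

let session = AVCaptureSession()  // stand-in; observe your real session

NotificationCenter.default.addObserver(
    forName: .AVCaptureSessionWasInterrupted,
    object: session,
    queue: .main
) { notification in
    // The userInfo reason says why the camera was taken away, e.g.
    // .videoDeviceNotAvailableWithMultipleForegroundApps in split view.
    if let value = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
       let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
        print("Capture session interrupted: \(reason)")
    }
}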