Will Apple continue to support Objective-C, or will we wake up one day to find that Swift is the only viable language?

It's a serious question. Careers depend on it. I don't accept the "No comment" approach that Apple usually takes. It's cruel.

I'm willing to put the time into learning Swift if I have to. I'm not going to do it if I don't. I want to know.

Frank
Hi,

I've been unable to upload a build for the last several hours. Xcode spins for a while on "Fetching list of teams from the Developer Portal..." and then reports "The request timed out".

This is not a new app or a new account; I've uploaded about 70 previous builds.

Frank
Hi,

I have an app that was approved and is "Pending developer release".

While we were waiting to release the app, someone found a bug that we'd like to fix. Unfortunately, I can neither upload a new version of the app nor create a new version in iTunes Connect (the option to add a new iOS version is disabled).

Is there any way to revoke the approved version without releasing it to the App Store? I don't mind having to wait for Apple to review the updated version again.

Frank
Hi,

I'm having two problems using the scheduleBuffer function of AVAudioPlayerNode.

Background: my app generates audio programmatically, which is why I am using this function. I also need low latency. Therefore, I'm using a strategy of scheduling a small number of buffers, and using the completion handler to keep the process moving forward by scheduling one more buffer for each one that completes.

I'm seeing two problems with this approach:

1. The total memory consumed by my app grows steadily while the audio is playing, which suggests that the audio buffers are never being deallocated or some other runaway process is underway. (The Leaks tool doesn't detect any leaks, however.)

2. Audio playback sometimes stops, particularly on slower devices. By "stops", I mean that at some point I schedule a buffer and the completion block for that buffer is never called. When this happens, I can't even clear the problem by stopping the player.

Regarding the first issue, I suspected that if my completion block recursively scheduled another buffer with another completion block, I would probably end up blowing out the stack with infinite recursion. To get around this, instead of scheduling the next buffer directly in the completion block, I enqueue the scheduling call on a dispatch queue. However, this doesn't seem to solve the problem.

Any advice would be appreciated. Thanks.
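To make the setup concrete, here is a stripped-down sketch of the scheduling strategy described above (the class name, queue label, and buffer size are illustrative only; the real app fills the buffers with generated samples):

    import AVFoundation

    final class ToneStreamer {
        private let engine = AVAudioEngine()
        private let player = AVAudioPlayerNode()
        // Serial queue so rescheduling never recurses on the completion stack.
        private let scheduleQueue = DispatchQueue(label: "audio.schedule")
        private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                                           channels: 1)!
        private var running = false

        func start() throws {
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: format)
            try engine.start()
            running = true
            for _ in 0..<3 { scheduleNext() }  // keep a few buffers in flight
            player.play()
        }

        func stop() {
            running = false
            player.stop()
        }

        private func scheduleNext() {
            let buffer = makeBuffer()
            player.scheduleBuffer(buffer) { [weak self] in
                // Hop off AVFoundation's callback thread before scheduling more.
                self?.scheduleQueue.async {
                    guard let self = self, self.running else { return }
                    self.scheduleNext()
                }
            }
        }

        private func makeBuffer() -> AVAudioPCMBuffer {
            let frames: AVAudioFrameCount = 4_096
            let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                          frameCapacity: frames)!
            buffer.frameLength = frames
            // ...fill buffer.floatChannelData with generated audio here...
            return buffer
        }
    }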
I've been using Xcode and Objective-C for years, and rely heavily on the built-in documentation, which is for the most part good.

There's an issue that's always bothered me. Today I was looking up a function called class_getName. I found its documentation page, and was also able to navigate to the page showing the overview of the runtime system, all of which is fine.

When I typed "class_getName" into my program, the compiler immediately flagged it as an unknown symbol.

I knew I probably had to include a header, but none of the documentation pages I looked at mentioned the name of the header file. I ended up searching the internet for examples until I hit one that showed <objc/runtime.h> being imported.

Why isn't this very small, simple, and extremely useful bit of information included in the documentation?

Frank
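For what it's worth, the same wrinkle exists on the Swift side, where the function only resolves after an explicit module import (a minimal sketch; the class used is just an example):

    import ObjectiveC  // exposes the runtime functions, including class_getName

    let name = String(cString: class_getName(NSObject.self))
    print(name)  // "NSObject"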
Hi,

It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment with no internet access (it's on a private Wifi network where the only other thing on the network is the device serving the RTSP stream).

Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't give me any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, and so on.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file (see the sketch at the end of this post). I'm not sure this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data read from a file. I'm not even sure AVFoundation would recognize my custom protocol.

3. If a protocol doesn't work, I've considered implementing my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream the media players can read. This sounds like a terribly convoluted solution at best, and very difficult at worst.

4. Going back to idea (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this approach might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and performs well. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC. What's most disappointing is that nobody seems able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, or advice would be greatly appreciated.

Thanks,
Frank
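Following up on idea (2): from what I can tell, AVFoundation's supported hook for supplying your own bytes is AVAssetResourceLoaderDelegate rather than NSURLProtocol — you give AVURLAsset a custom (non-reserved) URL scheme and the delegate feeds it data. A minimal sketch with hypothetical names ("myrtsp", RTSPResourceLoader) follows; the open question is the same one raised above, namely that the bytes handed back still have to be in a container format AVFoundation can parse, which raw RTSP packets are not:

    import AVFoundation

    final class RTSPResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource
                            loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            // Feed data in as it arrives, e.g. from an ffmpeg-based reader:
            //   loadingRequest.dataRequest?.respond(with: data)
            //   loadingRequest.finishLoading()
            return true  // true = "this request will be satisfied asynchronously"
        }
    }

    let loader = RTSPResourceLoader()
    // A made-up scheme; schemes AVFoundation doesn't handle go to the delegate.
    let asset = AVURLAsset(url: URL(string: "myrtsp://camera.local/stream")!)
    asset.resourceLoader.setDelegate(loader,
                                     queue: DispatchQueue(label: "rtsp.loader"))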
I have chronic problems with the connection between Xcode and my phone.
I plug the phone in and Xcode says "Waiting for phone to unlock", but the phone is already unlocked. I try locking and unlocking it, but nothing happens.
If I manage to get past this problem by disconnecting and reconnecting the phone a couple of times, it gets into the "downloading symbols" phase and never gets out of it.
Finally, even though I have checked the "Connect via network" option, it never works, and I can never connect unless I plug in the phone with a USB cord (yes, the phone and the computer are on the same Wifi network).
Some days this is just an annoyance, but some days (like today) I really need to test something on my phone for a customer who's waiting for it, and I cannot. The thing I'm trying to test involves sending text messages so I can't use the simulator or even an iPad. What can I do to debug this problem?
Hi,

I have a universal app that uses the camera along with a preview layer (AVCaptureVideoPreviewLayer) to display a preview of the live camera view.

It works fine on an iPhone, and fine on an iPad as long as my app is full screen. However, if I open another app in split-view mode, my app loses its live camera view and freezes on the last frame captured.

I've tried stopping the AVCaptureSession before the transition and restarting it afterward, and even tearing the whole thing down and building a new one, but neither works. My app is checking permissions and the permissions are fine, and all of the objects created seem valid.

Is there some rule that disallows apps from accessing the camera in split-screen mode?

Frank
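One way to confirm what's happening is to observe the session's interruption notification; the reason code should distinguish Split View multitasking from other causes (a minimal sketch; "session" stands for the app's existing AVCaptureSession):

    import AVFoundation

    // Keep the returned token so the observer can be removed later.
    let token = NotificationCenter.default.addObserver(
        forName: .AVCaptureSessionWasInterrupted,
        object: session,
        queue: .main
    ) { note in
        if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
           let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
            // .videoDeviceNotAvailableWithMultipleForegroundApps is delivered
            // when another app shares the screen in Split View.
            print("Capture interrupted: \(reason)")
        }
    }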
How do I get a background image to fill the screen in my launch screen on an iPhone X?I have aligned the image to the top and bottom layout guides, but this leaves large white gaps.
I'm confused about the SF Pro fonts. Can these be used in our apps?
I tried pasting characters from SF Pro into a label, but was unable to get them to display properly. "SF Pro" doesn't appear in the list of available fonts in Xcode.
If these are not intended to be used by app developers, then what is their purpose?
Are "SF Symbols" different that SF Pro? What about the list of icons that appears in the "Symbols Library" in Xcode? There are so many different sources of symbols and icons, it is very confusing.
If any of these sources is OK to use in an iOS app, is it also OK to export them for use in the event that business needs require me to create an alternate version of my app for some hypothentical non-iOS platform?
Thanks,
Frank
I installed a custom font (Font Awesome) into my app. I triple-checked that I did everything right: the font files are included in the bundle (they appear in the "Copy Bundle Resources" build phase) and the names of the fonts appear in the Info.plist file under "Fonts provided by application".
In Interface Builder, I select a Label, set the font to "Custom", then I click the Family list to select the font I want.
Once or twice, I was actually able to see the Font Awesome fonts in this list and select one. However, they no longer appear there when I create new labels in new views. I do not understand why. I've been limping along by copying a label from one of the views where it worked and pasting it into the new view, but this is tiresome.
I know the fonts are installed correctly because I can see them when I run the app.
Why are the fonts not showing up in the font list in Interface Builder?
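For anyone hitting the same thing, one way to confirm at runtime that the fonts really are registered (independent of what Interface Builder shows) is to dump every font family the process knows about:

    import UIKit

    // Print every registered font family and its face names; the Font Awesome
    // entries should appear here if the bundle and Info.plist are set up right.
    for family in UIFont.familyNames.sorted() {
        print(family, UIFont.fontNames(forFamilyName: family))
    }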
Hi,

Usually when you create a new class or other file in Xcode, the default location of the new file is a folder with the same name as your project.

Somehow, I've ended up with a project where the default location for new files is the folder above this (the one with the .xcodeproj file in it). I can certainly change the folder whenever I make a new file, if I remember to do it, but often I forget, and end up with half my files in one place and half in the other.

What controls this default, and how can I change it back to the folder I want?

Thanks.
Hi,

How do I check to see if a Character is in a CharacterSet?

In Objective-C, I could write:

    [[NSCharacterSet lowercaseLetterCharacterSet] characterIsMember:c]

where "c" is a variable of type char.

In Swift, I tried to write:

    CharacterSet.lowercaseLetters.contains(c)

where "c" is of type Character, but I received an error saying that the function was expecting a parameter of type "Unicode.Scalar".

Thanks,
Frank
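For reference, a version that does compile goes through the character's Unicode scalars (a sketch of a workaround; whether this is the intended approach is part of my question):

    import Foundation

    let c: Character = "a"

    // A Character can contain several Unicode scalars, so test all of them.
    let isLowercase = c.unicodeScalars.allSatisfy {
        CharacterSet.lowercaseLetters.contains($0)
    }
    print(isLowercase)  // true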
I've always understood it to be the case that apps could permit users to log in or use a code to access paid content, as long as the app did not direct the user to an external web site to make a purchase. For example, apps like Netflix or Kindle.

My app isn't even doing anything close to this. I have an app in which digital content is unlocked via in-app purchase. Occasionally I give away promo codes that allow a user to unlock one free piece of content. These codes can only be used once, and are distributed in very limited numbers on cards, in person, at events like trade shows.

My app has been in the store for about four years using this method. This week I attempted to publish a small bug-fix update, and was flagged by Apple and told that my use of promo codes in this manner violates the rules.

What happened here? Did Apple misunderstand what my app is doing? Did the rules change recently and I was unaware of the change? Has this always been against the rules, but either Apple didn't enforce it before or they never reviewed my app thoroughly enough to find it?

Thanks,
Frank
Hi,

I have an app that includes a very simple one-button interface for recording a video. Our customer wants to be able to switch between the front and rear cameras while the video is being recorded, and without any interruption in the video stream. I notice that even the iOS built-in camera app doesn't do this, but I've heard that some third-party apps do.

Is this possible in a practical way? If so, how would I go about it?

Thanks,
Frank
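From what I've read so far, the usual recipe is to record through AVAssetWriter (so the recording isn't tied to a single capture output) and swap the session's camera input inside a configuration block. Below is a rough sketch of the swap only; "session" and "currentInput" are assumed to already exist, and it doesn't deal with the brief gap in frames during the switch:

    import AVFoundation

    // Swap the session's camera input to the opposite position mid-recording.
    func switchCamera(session: AVCaptureSession,
                      currentInput: AVCaptureDeviceInput) -> AVCaptureDeviceInput? {
        let newPosition: AVCaptureDevice.Position =
            (currentInput.device.position == .back) ? .front : .back

        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: newPosition),
              let newInput = try? AVCaptureDeviceInput(device: device) else {
            return nil
        }

        // Group the changes so the session applies them atomically.
        session.beginConfiguration()
        session.removeInput(currentInput)
        if session.canAddInput(newInput) {
            session.addInput(newInput)
        }
        session.commitConfiguration()
        return newInput
    }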