Post

Replies

Boosts

Views

Activity

Questions from a dev with plans to move to a different region
Hello, I'm a Canadian living in Hong Kong who has attempted to enroll in the developer program. After submitting my Canadian passport as identification, I was told it's not valid ID for the region (Hong Kong) and that I should submit something else. I have several concerns I'd like answered so I can figure out how to proceed.

I'm very likely to move back to Canada in the near future and would like to know whether there are any issues associated with switching between App Store regions, such as having to pay for enrollment twice, legal or regulatory problems, or anything else.

Another concern is my app somehow being subjected to censorship or to some regulation that escapes my common sense, since Hong Kong is part of China. My apps are not political, but this is still a point of concern given the recent crackdowns on China's tech and education sectors--it's not entirely unthinkable that something could strike down my app out of the blue, through no fault of my own. My app is intended for a global audience, but I fear having my developer account tied to Hong Kong (China) will come back to haunt me in some way. Am I fretting over nothing?

There's also the payment issue: can payouts from my app be sent to a bank account of my choosing, or does it have to be in the region my account is tied to?

Thank you in advance for your time.
0
0
740
Jan ’22
Ultra-large texture atlases in SpriteKit
I'm looking to use high-quality, large images as animated sprites via SpriteKit. Right now the default maximum size of a texture atlas is 4096×4096, which is at least an order of magnitude below what I need. There's an option to set custom maximum sizes, but the tiny default and the age of the SpriteKit framework are giving me second thoughts, even though I'd very much like to stick with an Apple framework and not have to rely on another party like Unity. Before I invest my time and energy into SpriteKit, I'd like to know whether the decade-old framework, running on modern hardware (A11 and newer), can support massive texture atlases while maintaining decent performance (~30 FPS). If not, is it possible to implement something similar in less antiquated frameworks, such as RealityKit (non-AR)? Thanks in advance for your time.
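For context, this is how I'm currently estimating the hardware ceiling before committing. It's only a sketch based on my reading of the Metal feature set tables (I'm assuming the 16384-pixel limit for Apple GPU family 3 and later is correct); it isn't a SpriteKit API.

import Metal

// Sketch: infer the largest 2D texture dimension the GPU should support.
// Assumption: Apple GPU family 3 and later (A9 and newer, so all A11+ devices)
// support 16384×16384 textures; older families top out around 8192.
func maxTextureDimension() -> Int {
    guard let device = MTLCreateSystemDefaultDevice() else { return 4096 }
    return device.supportsFamily(.apple3) ? 16384 : 8192
}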
0
0
893
Apr ’22
Are user-to-user token transactions, with the ability to cash-out, allowed?
I’m trying to build a simple app that will allow users to create items in-app and buy/sell access to them. Users can buy tokens from the App Store that can be used to fund these purchases (e.g. 1 token = 100 coins), and can also be used to support the development of these items (e.g. give a creator 1 coin a week). Creators can then cash out the coins once a certain amount is reached, via Stripe or some other external payment provider. Is this allowed under the current rules of the App Store? Is it possible to implement using CloudKit? If this isn’t allowed, is it possible to allow users to share content on the app’s “store” for other users to browse and download, and can that be done using just CloudKit? This is my first-ever app and I don’t know left from right when it comes to back-ends. I will greatly appreciate any pointers.
0
0
681
May ’23
Is there a way to save SKTextureAtlas(dictionary) sprite sheets to disk to read later?
My app uses SpriteKit and requires SKTextureAtlas for performance. However, it uses user-generated content, which means that atlases aren’t initialized from bundled images; instead, all sprite sheets have to be recreated, leading to long loading times. Is it possible to save the sprite sheets made from user-generated content to disk so that SKTextureAtlas can load them instead of recreating all sprite sheets on every initialization? Is there any alternative solution to this problem? For example, is there a way to dump a bunch of images into memory to use as a texture pool and keep them there until deallocated?
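The workaround I'm currently prototyping (I have no idea whether it's the intended approach) caches the user-generated source images on disk, rebuilds the atlas from a dictionary at launch, and preloads it once. The on-disk layout below is just my own convention; SKTextureAtlas(dictionary:) and preload(completionHandler:) are the documented calls.

import SpriteKit
import UIKit

// Sketch: rebuild an atlas from user-generated images cached on disk,
// then preload it so textures are resident before the scene needs them.
func loadCachedAtlas(named atlasName: String,
                     completion: @escaping (SKTextureAtlas) -> Void) {
    let cachesDir = FileManager.default.urls(for: .cachesDirectory,
                                             in: .userDomainMask)[0]
        .appendingPathComponent(atlasName, isDirectory: true)   // my own layout, not an API requirement

    var images: [String: UIImage] = [:]
    let files = (try? FileManager.default.contentsOfDirectory(
        at: cachesDir, includingPropertiesForKeys: nil)) ?? []
    for url in files {
        if let image = UIImage(contentsOfFile: url.path) {
            images[url.deletingPathExtension().lastPathComponent] = image
        }
    }

    let atlas = SKTextureAtlas(dictionary: images)
    atlas.preload { completion(atlas) }   // textures uploaded before first use
}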
0
0
747
May ’23
CloudKit for storage of gated user content and account profiles?
I'm a new developer and I can't decide whether CloudKit or Firebase is better suited to my app's needs. The general vibe I'm getting from multiple sources is that CloudKit is purely for storage, but that seems off to me, so I'm hoping to find some hints or answers with regard to my specific use cases. My app allows users to create multimedia objects that are stored somewhere and displayed in the app's marketplace, with access granted to those who decide to spend tokens on them. These multimedia objects can be up to a few gigabytes each. Ownership of these objects is tracked via a user account database. Am I correct in assuming that my app's CloudKit public database can be used to store these multimedia objects, check the validity of user requests, and distribute them? Can it also be used to manage user accounts and overall token counts? Additionally, is it possible for the app's administrators to pipe data out of the public database for analysis and into payment processor APIs? Any help would be greatly appreciated.
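To make the storage part concrete, this is the kind of record I have in mind. The "MultimediaObject" record type and the "title"/"payload" field names are just names I made up for illustration; it doesn't address token accounting or server-side validation.

import CloudKit

// Sketch: store a large file as a CKAsset on a record in the public database.
func uploadObject(fileURL: URL, title: String,
                  completion: @escaping (Result<CKRecord, Error>) -> Void) {
    let record = CKRecord(recordType: "MultimediaObject")   // hypothetical record type
    record["title"] = title as CKRecordValue
    record["payload"] = CKAsset(fileURL: fileURL)           // CKAsset handles large binaries

    CKContainer.default().publicCloudDatabase.save(record) { saved, error in
        if let error = error {
            completion(.failure(error))
        } else if let saved = saved {
            completion(.success(saved))
        }
    }
}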
0
0
591
Jun ’23
Is the ability to make in-app currency purchases of creator content a violation of App Store rules?
Hello, I'm making an app where users can create a multimedia object, and other users can use in-app currency (purchased via in-app purchase) to gain access to that object. They can also choose to subscribe to a creator to encourage and support their content creation. Tokens can be converted to cash and sent to creators. It is unclear to me whether this violates App Store rules.

After reading through the App Store Review Guidelines and searching through the forums (this thread was close to what I was looking for), I have yet to arrive at a clear answer. The Guidelines state that "tipping" content creators is acceptable, but that isn't exactly what I'm after for a creator marketplace. The Business section doesn't contain anything else that seems relevant, which makes it seem as though only voluntary tipping of content creators is accepted. The commerce engineer in the thread linked above discourages using in-app currency, but that doesn't seem to work for my use case (the thread's creator wants to use the IAP mechanism). Furthermore, IAPs cannot be created programmatically (i.e. by users), and the process is error-prone.

It must be stressed that I'm not trying to deny Apple its 15-30% share, since users must buy in-app currency using Apple's IAP (this is not a multi-platform app). Cash in the app's economy has only one entry point, and that is Apple's IAP.

I asked a similar question recently but received no response, probably because I didn't phrase it well enough and didn't attach the correct tags. Creating the infrastructure for a creator marketplace app is a lot of grueling work, and I would very much like to know whether my app will be rejected for it before I embark on this quest. Any help would be greatly appreciated.

tl;dr - Is the ability to make in-app currency purchases of creator content a violation of App Store rules?
1
0
841
Jun ’23
How do I detect whether an iPad's camera location is landscape or portrait?
The 10th-gen iPad differs from its predecessors by having a camera located along the top of its landscape edge. This is a headache for me, since my app needs to know the rough camera location for AR purposes given the device's orientation. I can find out whether the device is a tablet or not, but I can't find out whether it's a 10th-gen iPad. Are there any direct or indirect ways to find out whether a device's camera is placed for portrait or landscape use?
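The indirect route I'm considering is checking the hardware model identifier. This is only a sketch: the "iPad13,18" / "iPad13,19" values are my guess for the 10th-gen iPad and would need verifying, and the list would have to be maintained as new landscape-camera models ship.

import UIKit

// Sketch: read the machine identifier via uname() and compare it against
// identifiers assumed to have the landscape-edge camera.
// "iPad13,18" / "iPad13,19" are placeholders to verify, not confirmed values.
func cameraIsOnLandscapeEdge() -> Bool {
    var systemInfo = utsname()
    uname(&systemInfo)
    let machine = withUnsafePointer(to: &systemInfo.machine) {
        $0.withMemoryRebound(to: CChar.self, capacity: 1) { String(cString: $0) }
    }
    let landscapeCameraModels: Set<String> = ["iPad13,18", "iPad13,19"]
    return landscapeCameraModels.contains(machine)
}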
0
0
746
Aug ’23
Navigating between sticker browsers and messages extensions and the main app
Happy new year! My app can generate stickers, so I'm trying to connect two separate Messages extensions: a sticker browser and an app extension. The former is a repository and the latter is where stickers are made. It would be easy to create these two extensions and have them stand separately, like Apple's Memoji, but I'm trying to find a way to streamline the user experience so that users can navigate from the browser to the extension and back seamlessly. As far as I can tell, there's no indication that this is possible, but also nothing to indicate that it isn't. Similarly, are there ways to navigate from the main app to a Messages extension, or the other way around? From what I've read, there's no known way to do the former, and there was once a way to do the latter that no longer works. tl;dr - Is it possible for users to press a button to go from an MSStickerBrowserViewController to an MSMessagesAppViewController (both belonging to the same app group) and back? Or go from the main app to either and back?
0
0
584
Jan ’24
Getting a list of words recognized by Speech
Is there a way to extract the list of words recognized by the Speech framework? I'm trying to filter out words that won't appear in the transcription output, but to do that I'll need a list of words that can appear. SFSpeechLanguageModel.Configuration can be initialized with a vocabulary, but there doesn't seem to be a way to read it back, and while there are ways to create custom vocabularies, I have yet to find a way to retrieve one. I added the Natural Language tag in case that framework might contribute to a solution.
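My current fallback is to filter the transcription on the client against a word list I maintain myself. A minimal sketch of what I mean (allowedWords is my own placeholder, not part of the Speech API):

import Speech

// Sketch: keep only transcription segments whose text appears in a
// developer-maintained allow-list.
func filterTranscription(_ result: SFSpeechRecognitionResult,
                         allowedWords: Set<String>) -> [String] {
    result.bestTranscription.segments
        .map { $0.substring.lowercased() }
        .filter { allowedWords.contains($0) }
}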
0
0
875
Feb ’24
AVAudioEngine: Is there a way to play audio at full volume while having an active input tap?
My project uses an AVAudioEngine with a very simple setup: a Speech recognizer running on a tap on the engine's input, with separate AVAudioPlayerNodes handling playback.

try session.setCategory(.playAndRecord, mode: .default, options: [])
try session.setActive(true, options: .notifyOthersOnDeactivation)
try session.setAllowHapticsAndSystemSoundsDuringRecording(true)

filePlayerNode ---> engine.mainMixerNode
bufferPlayerNode --> engine.mainMixerNode
engine.mainMixerNode --> engine.outputNode
// bufferPlayer.scheduleBuffer() is called on its own queue

The input works fine, since the buffers can be collected into a file that plays back correctly, and the recognizer also works fine. But when I try to play the live audio by sending the buffer to the bufferPlayer on this or another device, the buffer audio plays at a very low volume, sometimes with severe distortion. If I lower the sample rate via AVAudioConverter, the distortion gets worse.

I've tried experimenting with the AVAudioSession category options, having separate AVAudioEngines, and much, much more, yet I still haven't figured this out. It's gotten to the point where I've fixed almost all the arcane and minor issues in my audio system, yet I still can't play back my voice properly.

The ability to both play and record simultaneously is a basic feature of phones; when on speaker mode, a phone doesn't need to behave like a walkie-talkie. In my mind, it's inconceivable that the relatively new AVAudioEngine doesn't have an implementation for this, since the main issue (feedback loops) can be dealt with via a simple primitive circuit. Live video chat apps like FaceTime wouldn't be possible without this, yet to my surprise I found no answers online (what I did find were articles explaining how to write a file while playback is occurring).

Is there truly no way to do this with AVAudioEngine? Am I missing something fundamental? Any pointers would be greatly appreciated.
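One configuration I'm planning to try next, in case the default receiver routing and the lack of echo cancellation in the .default mode are the culprits (this is a guess, not a confirmed fix):

import AVFoundation

// Sketch: configure the shared session for simultaneous playback and capture.
// .voiceChat enables the system echo canceller; .defaultToSpeaker routes
// playback to the loudspeaker instead of the quiet receiver.
func configureSessionForFullDuplex() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.defaultToSpeaker, .allowBluetoothA2DP])
    try session.setActive(true, options: .notifyOthersOnDeactivation)
}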
1
0
1.2k
Mar ’24
[iOS 17.4, Xcode 15.3] Previously working NWConnection for peer-to-peer connection now permanently stuck on "Preparing"
After updating my devices to iOS/iPadOS 17.4 and Xcode to 15.3, Network framework peer-to-peer connections have stopped working entirely. The system was working fine before, and the code has not been changed. On the client side (NWBrowser) the server (NWListener) can be seen, but upon attempting to establish a connection, the client-side NWConnection.State gets permanently stuck at .preparing. NWConnection.stateUpdateHandler never enters any other state. It doesn't seem as though it's taking a long time to prepare; it's just stuck. This occurs across multiple connection modes (wired, common Wi-Fi, separate Wi-Fi).

Additional information:
- I didn't participate in the 17.4 beta or RC.
- The code in "Creating a custom peer-to-peer protocol" works; that sample forms the basis of my code.
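For anyone trying to reproduce, this is roughly how I'm logging state transitions to see whether the connection ever reaches .waiting with an underlying error (for example, a local-network permission or path problem); in my case nothing beyond .preparing is ever printed. It assumes an existing NWConnection named connection.

import Network

// Sketch: log every state transition, including the NWError carried by
// .waiting and .failed, to surface why a connection stalls.
func startWithLogging(_ connection: NWConnection) {
    connection.stateUpdateHandler = { state in
        switch state {
        case .setup:              print("setup")
        case .preparing:          print("preparing")
        case .waiting(let error): print("waiting: \(error)")   // often a permission or path issue
        case .ready:              print("ready")
        case .failed(let error):  print("failed: \(error)")
        case .cancelled:          print("cancelled")
        @unknown default:         print("unknown state")
        }
    }
    connection.start(queue: .main)
}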
2
0
1.5k
Mar ’24
When using UIView.drawHierarchy, older CPU performs far better than newer CPU
Hi all,

My app uses SpriteKit views which are rendered into images for various uses. For some reason, the same code performs worse on a newer CPU than on an older one: my A13 Bionic flies through the task at high resolution and 60 FPS with CPU usage below 60%, while the A15 Bionic chokes and sputters at a lower resolution and 30 FPS. Because of how counterintuitive this is, it took me a while to isolate the call directly responsible; with UIView.drawHierarchy commented out, both devices returned to their baseline performance.

guard let sceneView = skScene.view else { return }
let size = CGSize(width: outputResolution, height: outputResolution)
return UIGraphicsImageRenderer(size: size).image { context in
    let rect = CGRect(origin: .zero, size: size)
    sceneView.drawHierarchy(in: rect, afterScreenUpdates: false)
}

Does anyone know why this is the case, and how to fix it? I tried using UIView.snapshotView, which is supposedly much faster, but it only returns blank images. Am I using it wrong, or does it simply not work in this context?

sceneView.snapshotView(afterScreenUpdates: false)?.draw(rect)

Any hints or pointers would be greatly appreciated.
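The alternative I'm planning to test next is letting SpriteKit produce the snapshot itself via SKView.texture(from:), bypassing UIKit's drawHierarchy path entirely. I'm not sure it fits my pipeline, so treat this as a sketch rather than a known fix for the A15 regression.

import SpriteKit
import UIKit

// Sketch: capture the scene through SpriteKit's own renderer instead of
// UIView.drawHierarchy. texture(from:) renders the node tree on the GPU.
func snapshotImage(of scene: SKScene, in view: SKView) -> UIImage? {
    guard let texture = view.texture(from: scene) else { return nil }
    return UIImage(cgImage: texture.cgImage())
}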
0
0
668
Jun ’24
Are default fonts under UIFont.familyNames licensed for use by iOS developers?
This is a follow-up to my previous question: How to attribute/credit Apple Fonts added to app?

In that previous post, I misremembered what I did and said I found the fonts via macOS's Font Book, when in fact I came across UIFont.familyNames. Since these fonts are included via UIKit, the legal implications should be different. I've looked at the various license agreements that govern iOS app development but haven't found anything mentioning fonts. Since they ship as part of UIKit, it's reasonable to assume that developers are allowed to include these fonts--but in what ways? Am I allowed to let users create, say, documents with these fonts? Am I only allowed to display them?

There are 84 fonts, and judging by their Font Book entries, there is a wide range of licenses and restrictions. It seems unnecessarily harsh to expect every iOS developer to verify each one and figure out which they can legally keep if they want to offer their users access to all of them (for, say, a text-editing app). There must be some overarching rule that supersedes or encapsulates them, but that rule isn't clear to me after hours of research. I'm not a lawyer, and I don't think Apple expects every app developer to consult their lawyers on whether they can use system fonts. I'm about to send an email to Apple's legal team (I will post their response here if allowed), but in the meantime I want to hear what other devs think about this.

In Xcode, entering UIFont.familyNames returns the following:

["Academy Engraved LET", "Al Nile", "American Typewriter", "Apple Color Emoji", "Apple SD Gothic Neo", "Apple Symbols", "Arial", "Arial Hebrew", "Arial Rounded MT Bold", "Avenir", "Avenir Next", "Avenir Next Condensed", "Baskerville", "Bodoni 72", "Bodoni 72 Oldstyle", "Bodoni 72 Smallcaps", "Bodoni Ornaments", "Bradley Hand", "Chalkboard SE", "Chalkduster", "Charter", "Cochin", "Copperplate", "Courier New", "Damascus", "Devanagari Sangam MN", "Didot", "DIN Alternate", "DIN Condensed", "Euphemia UCAS", "Farah", "Futura", "Galvji", "Geeza Pro", "Georgia", "Gill Sans", "Grantha Sangam MN", "Helvetica", "Helvetica Neue", "Hiragino Maru Gothic ProN", "Hiragino Mincho ProN", "Hiragino Sans", "Hoefler Text", "Impact", "Kailasa", "Kefa", "Khmer Sangam MN", "Kohinoor Bangla", "Kohinoor Devanagari", "Kohinoor Gujarati", "Kohinoor Telugu", "Lao Sangam MN", "Malayalam Sangam MN", "Marker Felt", "Menlo", "Mishafi", "Mukta Mahee", "Myanmar Sangam MN", "Noteworthy", "Noto Nastaliq Urdu", "Noto Sans Kannada", "Noto Sans Myanmar", "Noto Sans Oriya", "Optima", "Palatino", "Papyrus", "Party LET", "PingFang HK", "PingFang SC", "PingFang TC", "Rockwell", "Savoye LET", "Sinhala Sangam MN", "Snell Roundhand", "STIX Two Math", "STIX Two Text", "Symbol", "Tamil Sangam MN", "Thonburi", "Times New Roman", "Trebuchet MS", "Verdana", "Zapf Dingbats", "Zapfino"]
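For reference, this is how I'm enumerating the individual face names within each family (both calls are documented UIKit APIs), in case the licensing answer ends up differing per face rather than per family:

import UIKit

// List every built-in font family along with the faces it contains.
func listSystemFonts() {
    for family in UIFont.familyNames.sorted() {
        print(family)
        for name in UIFont.fontNames(forFamilyName: family) {
            print("  \(name)")
        }
    }
}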
1
0
851
Jan ’25
How does homomorphic encryption usage affect privacy labels?
If I encrypt user data with Apple's newly released homomorphic encryption package and send it to servers I control for analysis, how would that affect the app's privacy label? For example, if my app collected usage data plus identifiers and then sent them off for collection and analysis, would I be allowed to say that we don't collect information linked to the user? Does it also automatically exclude the relevant fields from the "Data Used to Track You" section? Is it possible to make even things that were once considered inextricably tied to a user's identity (e.g. purchases in an in-app marketplace) count as not linked, according to Apple's rules? And how would I prove to Apple that the relevant information is indeed homomorphically encrypted?
0
0
796
Jul ’24