Overview

Keeping PiP alive during third-party video recording (camera capture)
I’m building a teleprompter-style app that relies on Picture in Picture (PiP). PiP starts correctly on device, and everything works until another app (e.g. TikTok or Instagram) starts actively recording video. When camera capture begins in the foreground app, iOS terminates my PiP session. Some teleprompter apps appear to keep PiP active while the user records in other apps, so I’m trying to understand the recommended architectural pattern for this scenario. Is there a documented approach or best practice for keeping PiP stable during third-party camera capture? I’m looking specifically for guidance on the correct AVKit / AVAudioSession configuration for this use case.
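For concreteness, this is the kind of AVKit / AVAudioSession setup I am asking about (a minimal sketch, not my exact code; `playerLayer` is a placeholder for whatever AVPlayerLayer drives the PiP content, and the .mixWithOthers option is just one candidate I am unsure about):

```swift
import AVKit

// Sketch: a .mixWithOthers playback session, so another app's audio/capture
// activity is less likely to interrupt this one, plus a standard
// AVPictureInPictureController driven by a player layer.
func makePiPController(for playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .moviePlayback, options: [.mixWithOthers])
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }

    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }

    let controller = AVPictureInPictureController(playerLayer: playerLayer)
    controller?.canStartPictureInPictureAutomaticallyFromInline = true
    return controller
}
```

Even with a configuration along these lines, PiP is torn down as soon as the other app starts capturing.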
Replies: 0 · Boosts: 0 · Views: 1 · Created: 10m
Query regarding hardware usage for Swift Student Challenge entry
Hello everyone, I am a participant in this year's Swift Student Challenge. My project requires the front-facing camera to detect the user's body movements (supporting a connection from an iPhone to a Mac or another device capable of running Xcode/Swift Playgrounds). I would like to ask whether this is permitted under the competition rules. Please note that this feature functions entirely offline and does not require an internet connection.
Replies: 1 · Boosts: 0 · Views: 9 · Created: 1h
Providing client with IPA for internal distribution
Hey folks, I work as a software development consultant. We develop enterprise applications for our clients, and the apps we create are usually for internal use. We've run into a bit of a conundrum with a client who doesn't have their own Apple enterprise account (and neither do we, as we don't meet the criteria), but who wants to distribute an application we've built for them via their own MDM software. We are not entirely sure how to provide them with a distribution-ready .ipa file that isn't Ad Hoc and will be recognized as a secure app. We've looked into generating a Developer ID provisioning profile and an accompanying certificate; however, we're running into the problem that our app's platform (iOS) doesn't match the platform required by a Developer ID profile (macOS). I've also come across the idea of re-signing an .ipa, but again, the client doesn't have an Apple Developer account and expects the working .ipa to be included in the service rendered. Any suggestions, advice, or documentation on the subject would be greatly appreciated. Thanks, Ale
Replies: 0 · Boosts: 0 · Views: 8 · Created: 1h
Internal inconsistency in menus - menu warnings...
I get warnings like this on each project I build while debugging:

```
Internal inconsistency in menus - menu
<NSMenu: 0x8b4b49ec0>
    Title: Help
    Supermenu: 0x8b4b49f80 (Main Menu), autoenable: YES
    Previous menu: 0x0 (None)
    Next menu: 0x0 (None)
    Items: (
        "<NSMenuItem: 0x8b5771720 Metal4C Help, ke='Command-?'>"
    )
believes it has
<NSMenu: 0x8b4b49f80>
    Title: Main Menu
    Supermenu: 0x0 (None), autoenable: YES
    Previous menu: 0x0 (None)
    Next menu: 0x0 (None)
    Items: ( )
as a supermenu, but the supermenu does not seem to have any item with that submenu
```

What am I doing wrong? I get these warnings even if I create a default app with no code.
Replies: 0 · Boosts: 0 · Views: 9 · Created: 3h
Finder tag colors and folder icons become gray for iCloud Drive items (URLResourceValues / xattr / QLThumbnailGenerator)
Hi, I’m working on a macOS app that includes a file browser component, and I’m trying to match Finder’s behavior for color tags and folder icons. For local files/folders everything works fine. The label keys return the expected values via:

```objc
NSColor *labelColor = nil;
[fileURL getResourceValue:&labelColor forKey:NSURLLabelColorKey error:nil];

NSNumber *labelKey = nil;
[fileURL getResourceValue:&labelKey forKey:NSURLLabelNumberKey error:nil];
```

And QLThumbnailGenerator returns the expected colored folder icon (including the emoji/symbol overlay, if set) via:

```objc
QLThumbnailGenerationRequest *request =
    [[QLThumbnailGenerationRequest alloc] initWithFileAtURL:fileURL
                                                       size:iconSize
                                                      scale:scaleFactor
                                        representationTypes:QLThumbnailGenerationRequestRepresentationTypeIcon];
request.iconMode = YES;

[[QLThumbnailGenerator sharedGenerator] generateBestRepresentationForRequest:request
                                                           completionHandler:^(QLThumbnailRepresentation * _Nullable thumbnail,
                                                                               NSError * _Nullable error) {
    if (thumbnail != nil && error == nil) {
        NSImage *thumbnailImage = [thumbnail NSImage];
        // ...
    }
}];
```

However, for items on iCloud Drive (whether currently downloaded locally or only stored in the cloud), the same code always produces gray colors, while Finder shows everything correctly:

• NSURLLabelNumberKey always returns 1 (gray) for items with color tags, and 0 for non-tagged items.
• Folder icons returned via QLThumbnailGenerator are gray, with no emoji/symbol overlays.
• Reading tag data from the xattr gives values like “Green\1” (the tag name matches, but the numeric value is still gray).

Also, if I move a correctly tagged local item into iCloud Drive, it immediately becomes gray in my app (Finder still shows the correct colors).

Question: what is the supported way to retrieve Finder tag colors and the correct folder icon appearance (color + overlays) for items in iCloud Drive, so that the result matches Finder?

I am on macOS Tahoe 26.2/26.3, Xcode 26.2 (17C52). If you need any additional details, please let me know. Thanks!
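For completeness, this is how the xattr reading mentioned above can look (a sketch; the helper names are mine, and the name-to-color mapping would have to happen on my side since the label number is unreliable here):

```swift
import Foundation

// Sketch: the user-tags xattr is a binary plist array of strings shaped like
// "Green\n2" (tag name, newline, label number). On iCloud Drive items the
// number I actually see back is stale, as described above.
func rawUserTags(for url: URL) -> [String] {
    let attr = "com.apple.metadata:_kMDItemUserTags"
    let length = getxattr(url.path, attr, nil, 0, 0, 0)
    guard length > 0 else { return [] }

    var data = Data(count: length)
    let read = data.withUnsafeMutableBytes { buffer in
        getxattr(url.path, attr, buffer.baseAddress, length, 0, 0)
    }
    guard read == length,
          let plist = try? PropertyListSerialization.propertyList(from: data, options: [], format: nil)
    else { return [] }
    return plist as? [String] ?? []
}

// The documented resource key returns the tag names (but not their colors):
func tagNames(for url: URL) -> [String] {
    (try? url.resourceValues(forKeys: [.tagNamesKey]))?.tagNames ?? []
}
```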
Replies: 0 · Boosts: 0 · Views: 24 · Created: 3h
AVP, Developer Strap v2, Ethernet connection and Xcode
Hi, I wonder if there's something that can be configured to force Xcode (and preferably MVD too) to use the Ethernet connection between a Mac mini and Apple Vision Pro, over a USB hub rather than a direct USB connection. If I connect the AVP to the Mac directly via USB, the bridge gets created and both MVD and Xcode default to it, which is great because of the higher speed and lower latency. My problem is that I work with an external camera, so I can have either the camera or the Mac connection, but not both.

I tried to solve that by plugging in a small active USB hub, so the strap and the camera are connected to it, plus it has an Ethernet adapter, which is plugged into the Mac's port. I tried internet sharing on the Mac: the AVP has internet access and I can ping the AVP from the Mac, but Xcode and MVD still use Wi-Fi. I tried manually configuring the bridge without internet sharing: same effect. I tried making the bridge the highest-priority connection: nothing changed. I tried forcing routing to the AVP's IP over the bridge: nothing (and it seems my routing entry went missing after some time and was replaced by "use wifi interface").

So, is there something more I can do to make at least Xcode go over the cable? Debugging over Wi-Fi often takes forever.
Replies: 0 · Boosts: 0 · Views: 11 · Created: 5h
Developer Control Question
Hi everyone, I’m new to the Apple Developer side of things and I want to make sure I’m handling this correctly. I hired a freelance iOS developer to build my app. I have my own Apple Developer account (Individual), and I want the app published under my account. The developer says he needs me to create a Certificate Signing Request (CSR); then he will generate the distribution certificate on his end and send me the .cer file back. From there, he would use that for signing and submitting the app. My questions:

1. Is this the correct and modern way to handle app signing?
2. Should I instead just add him to my App Store Connect account with Developer access and let him manage signing through Xcode? The dev claims this won’t work, which I haven’t been able to prove or disprove. ChatGPT says it does; other AI tools say it doesn’t.
3. Is there any risk in sharing certificates like this?
4. What is considered best practice in 2026 for working with freelancers while keeping full control of the account?

My goal is simple:

• The app stays fully under my Apple Developer account.
• I retain long-term control.
• The developer can build and submit without needing my Apple ID or password.

I’m not trying to make things complicated; I just want to do this the right way and avoid issues later. Thanks in advance.
Replies: 0 · Boosts: 0 · Views: 15 · Created: 5h
App Store Events
We have run over 15 App Store Events, but I have only ever seen data for one of them. We have several large apps, and based on our numbers it is unlikely that any event drove fewer than 5 installs. We see an overall conversion-rate uplift, but no direct data in the event data section. Does anyone have any ideas as to what could be causing this?
Replies: 0 · Boosts: 0 · Views: 13 · Created: 5h
Smooth appearance switching
Hello, fellow developers, I need your help. Do you know how to animate an appearance change, like a smooth transition from dark to light and vice versa? My code:

```swift
@main
struct The_Library_of_BabelonApp: App {
    @AppStorage("selectedAppearance") private var selectedAppearance = 0
    @StateObject private var router = AppRouter()

    var scheme: ColorScheme? {
        if selectedAppearance == 1 { return .light }
        if selectedAppearance == 2 { return .dark }
        return nil
    }

    var body: some Scene {
        WindowGroup {
            RootView()
                .preferredColorScheme(scheme)
                .environmentObject(router)
                // this doesn't work correctly
                .animation(.smooth(duration: 2), value: selectedAppearance)
        }
    }
}
```

And my appearance switcher looks like this:

```swift
struct SettingsView: View {
    @AppStorage("selectedAppearance") private var selectedAppearance = 0

    var body: some View {
        List {
            Section(header: Text("Appearance")) {
                HStack(spacing: 20) {
                    ThemePreview(title: "Light", imageName: "lightTheme", tag: 1, selection: $selectedAppearance)
                    ThemePreview(title: "Dark", imageName: "darkTheme", tag: 2, selection: $selectedAppearance)
                    ThemePreview(title: "System", imageName: "systemMode", tag: 0, selection: $selectedAppearance)
                }
                .padding(.vertical, 10)
                .frame(maxWidth: .infinity)
            }
        }
    }
}

struct ThemePreview: View {
    let title: String
    let imageName: String
    let tag: Int
    @Binding var selection: Int

    var body: some View {
        Button {
            selection = tag
        } label: {
            VStack {
                Image(imageName)
                    .resizable()
                    .aspectRatio(contentMode: .fill)
                    .frame(width: 120, height: 80)
                    .clipShape(RoundedRectangle(cornerRadius: 12))
                    .overlay(
                        RoundedRectangle(cornerRadius: 12)
                            .stroke(selection == tag ? Color.blue : Color.clear, lineWidth: 3)
                    )
                Text(title)
                    .font(.caption)
                    .foregroundColor(selection == tag ? .blue : .primary)
            }
        }
        .buttonStyle(.plain)
    }
}
```

I guess my code works, but the animation behaves in an unexpected way: it animates my whole Section rather than the appearance change itself. Thank you in advance.
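Would attaching the animation at the point of change be the right direction? A sketch of what I mean (not a confirmed fix; the system may still apply preferredColorScheme without a cross-fade):

```swift
// Inside ThemePreview: animate where the value changes, instead of
// attaching .animation(_:value:) at the root of the scene.
Button {
    withAnimation(.smooth(duration: 2)) {
        selection = tag
    }
} label: {
    // ... same label as above ...
}
.buttonStyle(.plain)
```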
Replies: 1 · Boosts: 0 · Views: 12 · Created: 5h
Unusually long “Waiting for Review” times this week (App Store + TestFlight delays?)
Hi everyone, I’m currently experiencing unusually long review waiting times and wanted to ask if others see the same behavior this week. My situation:

• The App Store update has been in “Waiting for Review” significantly longer than usual
• A newly submitted build also seems stuck
• TestFlight processing is slower than I normally see
• An expedited review request and contact attempts didn’t change the status so far

What confuses me is that I still see other apps receiving updates, so I’m unsure whether this is a broader review delay or something submission-specific. I’m not trying to escalate anything, just looking to understand if this is currently affecting more developers. Would really appreciate hearing about your recent experiences. Thanks, and good luck to everyone waiting 🙂
Replies: 1 · Boosts: 1 · Views: 38 · Created: 6h
visionOS Development: Seeking Advice on Key Strategic Crossroads
I am a developer working on a space journal application for visionOS. During development I encountered several crucial strategic and technical decisions, and I would like to hear from those who have gone through similar situations. Here are simplified versions of my questions:

• Resource allocation: which problem should I address first?
• Design direction: in terms of interaction and UI design, how should I balance “immersion” and “usability”?
• Market selection: is it easier for a business to survive its early stages as B2B or B2C?
• Cost estimation: how can I reasonably present the development costs of this project to my investors?

To avoid relying solely on intuition in my decisions, I created a short questionnaire, hoping to gather more structured opinions from colleagues. If you are also exploring visionOS, I sincerely hope you can take a few minutes to fill it out. The results are extremely important to me, and I would be happy to share the final summary of the findings with you.
Replies: 0 · Boosts: 0 · Views: 15 · Created: 8h
ControlWidgetToggle image design
I need help designing the image of a ControlWidgetToggle.

• Do I understand correctly that I can only use an SF Symbol as the image, not a custom image (unless it’s set up as a custom SF Symbol)?
• Is there any way to influence the size of the image? I tried multiple SwiftUI modifiers (.imageScale, .font, .resizable, .controlSize); none of them seem to work, and my image remains too tiny.
• The image sizes of the on and off states are different, which seems to be enforced by the system. Is there any way to make both states use the same size?
• The on state tints the image. Is there a way to set the tint color? .tint and .foregroundStyle seem to be ignored.

Thank you for your help.
Replies: 1 · Boosts: 0 · Views: 24 · Created: 8h
External Keyboard DatePicker Issues
I am currently getting my app ready for full external keyboard support. While testing, I found an issue with the native DatePicker: whenever I enter the DatePicker with an external keyboard, it jumps straight to the time picker and I am not able to move away from it. Arrow keys don't work; Tab and Control+Tab only move me to the toolbar and back. This is what the pickers look like:

```swift
private var datePicker: some View {
    DatePicker(
        "",
        selection: date,
        in: minDate...,
        displayedComponents: [.date]
    )
    .fixedSize()
    .accessibilityIdentifier("\(datePickerLabel).DatePicker")
}

private var timePicker: some View {
    DatePicker(
        "",
        selection: date,
        in: minDate...,
        displayedComponents: [.hourAndMinute]
    )
    .fixedSize()
    .accessibilityIdentifier("\(datePickerLabel).TimePicker")
}

private var datePickerLabelView: some View {
    Text(datePickerLabel.localizedString)
        .accessibilityIdentifier(datePickerLabel)
}
```

And we use them like this in the view:

```swift
HStack {
    datePickerLabelView
    Spacer()
    datePicker
    timePicker
}
```

Does anyone know how to fix this behavior? Is it our fault or is it the system? The issue comes up in both iOS 18 and 26.
Replies: 0 · Boosts: 1 · Views: 23 · Created: 9h
macOS: Is ARKit-equivalent face tracking possible with an external camera?
Hello, I am an individual developer working on a macOS application using SwiftUI and RealityKit. I would like to understand the feasibility of face-related tracking on macOS when using an external USB camera, compared to iOS/iPadOS. Specifically:

• Does macOS provide an ARKit Face Tracking–equivalent API (e.g., real-time facial expressions, gaze direction, depth)?
• If not, is it common to rely on Vision / AVFoundation as alternatives for facial expression coefficients, gaze estimation, and depth approximation? (A sketch of the Vision-based approach I have in mind follows below.)
• In an environment without dedicated sensors such as TrueDepth, is it correct to assume that accurate depth data and high-fidelity blend shape extraction are realistically difficult?

Any clarification on official limitations, recommended alternatives, or relevant documentation would be greatly appreciated. Thank you.
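For the second bullet, this is the kind of Vision pipeline I mean (a minimal sketch; `cgImage` is assumed to be a frame already captured from the external camera, e.g. via AVCaptureSession):

```swift
import Vision

// Sketch: Vision face landmark detection on a single frame. Vision gives
// face rectangles and 2D landmarks, but not ARKit-style blend shapes or
// true depth.
func detectFaceLandmarks(in cgImage: CGImage) {
    let request = VNDetectFaceLandmarksRequest { request, error in
        guard error == nil,
              let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // Bounding box is in normalized image coordinates.
            print("Face at \(face.boundingBox)")
            if let landmarks = face.landmarks, let pupil = landmarks.leftPupil {
                print("Left pupil points: \(pupil.normalizedPoints)")
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```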
Replies: 0 · Boosts: 0 · Views: 11 · Created: 9h
Basics - Dice Demo, calculate total score
I've worked through Apple's dice demo for SwiftUI; so far so good. I've got a single Die view with a button to "roll" the die. This works perfectly using the code below:

```swift
struct DieView: View {
    init(dieType: DieType) {
        self.dieValue = Int.random(in: 1...dieType.rawValue)
        self.dieType = dieType
    }

    @State private var dieValue: Int
    @State private var dieType: DieType

    var body: some View {
        VStack {
            if self.dieType == DieType.D6 {
                Image(systemName: "die.face.\(dieValue)")
                    .resizable()
                    .frame(width: 100, height: 100)
                    .padding()
            } else { // self.dieType == DieType.D12
                Text("\(self.dieValue)")
                    .font(.largeTitle)
            }
            Button("Roll") {
                withAnimation {
                    dieValue = Int.random(in: 1...dieType.rawValue)
                }
            }
            .buttonStyle(.bordered)
        }
        Spacer()
    }
}
```

Now I want to do a DiceSetView with an arbitrary number of dice. I've got the UI working with the following:

```swift
struct DiceSetView: View {
    @State private var totalScore: Int = 0

    var body: some View {
        ScrollView(.horizontal) {
            HStack {
                DieView(dieType: DieType.D6)
                DieView(dieType: DieType.D6)
                DieView(dieType: DieType.D6)
            }
        }
        HStack {
            Button("Roll All") {}
                .buttonStyle(.bordered)
            Text("Score \(totalScore)")
                .font(.callout)
        }
        Spacer()
    }
}
```

Where I'm struggling is how to get the total of all the dice in a set, and how to roll all the dice in a set on a button click. I can't iterate through the dice and just "click" the buttons in the child views from their parent, and I can't think how it should be structured to achieve this (I'm new to this style of programming!). Can anyone point me in the right direction? I realise that I'm probably missing something fundamentally conceptual here...
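Is lifting the state up to the parent, along these lines, the right direction? A sketch (assuming the DieType enum from above, whose rawValue is the face count): the parent owns all dice values, so it can both sum them and reroll them, while each DieView reads and writes a Binding.

```swift
import SwiftUI

struct DiceSetView: View {
    // Parent owns all the dice state.
    @State private var dice: [Int] = [1, 1, 1]
    private let dieType = DieType.D6

    private var totalScore: Int { dice.reduce(0, +) }

    var body: some View {
        ScrollView(.horizontal) {
            HStack {
                ForEach(dice.indices, id: \.self) { index in
                    DieView(dieType: dieType, dieValue: $dice[index])
                }
            }
        }
        HStack {
            Button("Roll All") {
                withAnimation {
                    for index in dice.indices {
                        dice[index] = Int.random(in: 1...dieType.rawValue)
                    }
                }
            }
            .buttonStyle(.bordered)
            Text("Score \(totalScore)")
                .font(.callout)
        }
        Spacer()
    }
}

// DieView now receives a Binding instead of owning its value.
struct DieView: View {
    let dieType: DieType
    @Binding var dieValue: Int

    var body: some View {
        VStack {
            if dieType == .D6 {
                Image(systemName: "die.face.\(dieValue)")
                    .resizable()
                    .frame(width: 100, height: 100)
                    .padding()
            } else {
                Text("\(dieValue)")
                    .font(.largeTitle)
            }
            Button("Roll") {
                withAnimation {
                    dieValue = Int.random(in: 1...dieType.rawValue)
                }
            }
            .buttonStyle(.bordered)
        }
    }
}
```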
Topic: UI Frameworks SubTopic: SwiftUI
Replies: 0 · Boosts: 0 · Views: 15 · Created: 9h
Place an app on a family device
I had a free developer account and now have a paid account, so I can now install an unlimited number of apps on my personal devices. I want to install more than 3 apps on a family member's device. That device has been registered under Certificates, Identifiers & Profiles. However, when I try to install apps with Xcode on this family device, I still get the message that I have reached the limit of 3 apps (the free-account limit). Under Signing & Capabilities I use the paid account as the team. I have deleted all the apps on that device that were installed with the free account, restarted the device, and restarted Xcode, but the problem still exists.
Replies: 0 · Boosts: 0 · Views: 22 · Created: 9h
Cannot delete "Other Installed Platforms"
I recently found out that my system storage is at about 200 GB. I checked my Xcode and found these "other installed platforms" (screenshot1). I can delete them there, but as soon as I restart my computer they show up again. I found the folder /System/Library/AssetsV2/com_apple_MobileAsset_iOSSimulatorRuntime, which uses 75 GB, and I'm pretty sure these are the simulator runtimes shown in Xcode. This folder doesn't change a bit when I delete the simulators in Xcode, and I can't delete the folder itself because of permissions... anyone have an idea? Btw: 200 GB of system data out of 500 GB seems pretty miserable. Kind regards
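A sketch of the simctl commands that manage these runtimes (assuming the folder above corresponds to downloaded simulator runtimes; deleting through simctl rather than the file system is the supported route, though I haven't confirmed it frees the space in every case):

```sh
# List downloaded simulator runtimes with their identifiers
xcrun simctl runtime list

# Delete a specific runtime by the identifier shown in the list
xcrun simctl runtime delete <identifier>
```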
Replies: 1 · Boosts: 0 · Views: 18 · Created: 9h