Post

Replies

Boosts

Views

Activity

Swift Charts - weak scrolling performance
Hello there! I wanted to try the native scrolling mechanism for a Swift Charts graph and experiment a bit to see whether the scenario we are trying to achieve is possible, but it seems that Swift Charts scrolling performance is very poor.

The graph was created as follows:

- The X-axis is based on a date range; the Y-axis is based on integer values roughly between 0 and 320.
- The graph is scrollable horizontally only (X-axis).
- The time range (X-axis) of the scrollable content was set to one year back from now, so the user can scroll up to one year into the past (.chartXScale).
- The X-axis shows 3 hours of data per screen width (.chartXVisibleDomain).
- The data points for the graph are generated once, when the screen is about to appear, so that the Charts engine can use them (no lazy loading implemented yet).
- The line data points (LineMark views) consist of 2880 points spaced every 5 minutes, which simulates two days of a continuous data stream that we want to present. The rest of the graph displays no data at all.

The performance result:

- During the initial loading phase, the graph is frozen for about 10-15 seconds until the data appears.
- Scrolling is very laggy; CPU usage sits at 100%, which is unacceptable for end users.
- If we show no data at all on the graph (so no LineMark views are created), the result is similar: scrolling the empty graph is also very laggy.

Below I am sharing test code:

```swift
@main
struct ChartsTestApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
            Spacer()
        }
    }
}

struct LineDataPoint: Identifiable, Equatable {
    var id: Int
    let date: Date
    let value: Int
}

actor TestData {
    func generate(startDate: Date) async -> [LineDataPoint] {
        var values: [LineDataPoint] = []
        for i in 0..<(1440 * 2) {
            values.append(
                LineDataPoint(
                    id: i,
                    date: startDate.addingTimeInterval(
                        TimeInterval(60 * 5 * i) // Every 5 minutes
                    ),
                    value: Int.random(in: 1...100)
                )
            )
        }
        return values
    }
}

struct ContentView: View {
    var startDate: Date {
        return endDate.addingTimeInterval(-3600 * 24 * 30 * 12) // one year into the past from now
    }

    let endDate = Date()

    @State var dataPoints: [LineDataPoint] = []

    var body: some View {
        Chart {
            ForEach(dataPoints) { item in
                LineMark(
                    x: .value("Date", item.date),
                    y: .value("Value", item.value),
                    series: .value("Series", "Test")
                )
            }
        }
        .frame(height: 200)
        .chartScrollableAxes(.horizontal)
        .chartYAxis(.hidden)
        .chartXScale(domain: startDate...endDate) // one year possibility to scroll back
        .chartXVisibleDomain(length: 3600 * 3) // 3 hours visible on screen
        .onAppear {
            Task {
                dataPoints = await TestData().generate(startDate: startDate)
            }
        }
    }
}
```

I would be grateful for any insights or suggestions on how to improve this, or whether improvements are planned. Currently we use a UIKit UICollectionView where we split the graph into smaller chunks and present the SwiftUI Chart content in the cells, so we rely on the scrolling offered there. I wonder if it is possible to use native SwiftUI for this scenario, so that later on we could also implement some kind of lazy loading of the data as the user scrolls into the past.
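Not an official fix, but one mitigation worth trying is to keep the full data set in memory while only feeding the Chart the marks that fall near the currently visible window, using the chartScrollPosition binding. A minimal sketch of that idea, assuming the same LineDataPoint model as above (the buffer size and windowing strategy are arbitrary choices, not Apple guidance):

```swift
import SwiftUI
import Charts

struct WindowedChartView: View {
    let allPoints: [LineDataPoint]              // full data set, e.g. from TestData
    let visibleLength: TimeInterval = 3600 * 3  // 3 hours per screen width

    // Leading edge of the visible X domain; start scrolled to "now",
    // set to another date to start elsewhere.
    @State private var scrollPosition = Date()

    // Only the points inside the visible window plus a buffer on each side.
    private var visiblePoints: [LineDataPoint] {
        let buffer = visibleLength // one extra window on each side
        let start = scrollPosition.addingTimeInterval(-buffer)
        let end = scrollPosition.addingTimeInterval(visibleLength + buffer)
        return allPoints.filter { $0.date >= start && $0.date <= end }
    }

    var body: some View {
        Chart(visiblePoints) { item in
            LineMark(
                x: .value("Date", item.date),
                y: .value("Value", item.value),
                series: .value("Series", "Test")
            )
        }
        .frame(height: 200)
        .chartScrollableAxes(.horizontal)
        .chartScrollPosition(x: $scrollPosition)
        .chartXVisibleDomain(length: visibleLength)
    }
}
```

Whether this meaningfully helps depends on how much of the cost is in the marks versus the scrollable axis itself (the post notes that even an empty chart is laggy), so treat it as a partial mitigation rather than a solution.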
4
2
1.2k
1w
Can't Create/Update App Clips
Hey there,

We've been using the App Store Connect API to manage (create/update) Advanced App Clip Experiences. Everything has worked fine and we've been able to successfully manage hundreds of App Clips, but all of a sudden, starting on December 15th, the API started returning the following error:

```
"id"     => "1e15b36b-5347-4af0-9bab-7f6626ffec65"
"status" => "409"
"code"   => "ENTITY_ERROR.INCLUDED.INVALID_ID"
"title"  => "The provided entity id is invalid"
"detail" => "The provided included entity id 'EN' has invalid format"
"source" => ["pointer" => "/included/0/id"]
```

It does seem to be an API bug, considering it has always worked fine and we didn't change anything on our side; the /included/0/id value has always been EN and never changed. Moreover, EN still seems to be a valid value according to the API docs, and there are no changes reported for that field in the API release notes.

Here's the request ID: 1e15b36b-5347-4af0-9bab-7f6626ffec65

I've tried using different values through trial and error (en, EN, EN-US, ...) and none of them worked.
3
4
174
16h
DOMContentLoaded not working in Safari App Extension
I am trying to run JavaScript only after the page has loaded, and according to https://developer.apple.com/documentation/safariservices/safari_app_extensions/injecting_a_script_into_a_webpage I should use DOMContentLoaded. However, it does not seem to work. This is my content.js file:

```javascript
function runOnStart() {
    document.addEventListener('DOMContentLoaded', function(e) {
        document.body.style.background = "rgb(20, 20, 20)";
        // Note: there is no document.html; the root element is document.documentElement
        document.documentElement.style.background = "rgb(20, 20, 20)";

        var divElements = document.body.getElementsByTagName('div');
        for (var i = 0; i < divElements.length; i++) {
            let elem = divElements[i];
            elem.style.background = "rgba(255, 255, 255, 0.05)";
        }
    });
}

runOnStart();
```

If I take the code outside of the event listener, it runs fine, but a lot of the elements haven't loaded in yet, so it doesn't work as it should. The function is definitely running, but the event listener simply doesn't fire. I appreciate any help you can give!
12
3
11k
1w
ExtensionKit and iOS 26
It looks like ExtensionKit (and ExtensionFoundation) is fully available on iOS 26, but there is no mention of this at WWDC. From my testing, as of beta 1, ExtensionKit allows an app from one developer team to launch an extension provided by another developer team. Before we start building on this, can someone from Apple confirm that this is intentional behavior and not just a beta 1 artifact?
3
4
260
2w
iOS 26 regression: `DeviceActivityEvent`: `eventDidReachThreshold` called immediately (instead of waiting till threshold is reached)
Hello! I am experiencing some strange bugs around DeviceActivityEvents.

When creating a DeviceActivityEvent, we can assign a threshold and applicationTokens. The idea is that after the user has spent the threshold amount of time in those apps, eventDidReachThreshold is called. includesPastActivity is set to false.

On iOS 26, however, it quite often happens (fairly reliably after updating to a new beta seed) that eventDidReachThreshold is called immediately (after a couple of seconds) instead of waiting for the threshold to be met.

Is anyone else seeing similar issues on iOS 26? The only workaround I have found is to ask users to re-grant Screen Time permissions. That only holds for about two weeks, though, or at most until the next iOS 26 beta update is installed.

Feedback filed under: FB18061981, FB18927456
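For context, here is a minimal sketch of the kind of setup being described, assuming the iOS 17.4+ initializer that takes includesPastActivity; the names, threshold, and schedule are illustrative placeholders, not the poster's actual code:

```swift
import DeviceActivity
import FamilyControls
import ManagedSettings

// Illustrative names; any stable strings work.
let activityName = DeviceActivityName("dailyLimit")
let eventName = DeviceActivityEvent.Name("usageThreshold")

func startMonitoring(tokens: Set<ApplicationToken>) throws {
    // Monitor all day, every day.
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )

    // eventDidReachThreshold (in the DeviceActivityMonitor extension) should fire
    // only after 30 minutes of usage across the selected apps.
    let event = DeviceActivityEvent(
        applications: tokens,
        categories: [],
        webDomains: [],
        threshold: DateComponents(minute: 30),
        includesPastActivity: false
    )

    try DeviceActivityCenter().startMonitoring(
        activityName,
        during: schedule,
        events: [eventName: event]
    )
}
```

The report, in these terms, is that on iOS 26 the callback fires within seconds of startMonitoring rather than after the configured threshold.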
16
4
985
2w
Cannot re-upload an Asset Pack that's been archived?
Hello,

I'm trying to upload an asset pack that has the same identifier as an asset pack that I've archived. I understand this isn't likely a common scenario, but I'd expect uploading an archived asset pack to un-archive it, reverting to the next newest available version of that asset pack. Furthermore, the archive alert does not make it clear that you won't be able to reuse the asset pack ID:

Archive Asset Pack?
Are you sure you want to archive "[NAME]" asset pack? All versions of this asset will no longer be accessible by your app.

Errors:

Failed to create a new background asset pack version for '[NAME]'. (-19243)
operation not allowed (409)
Cannot create version for an archived background asset. (ID: 2cc6499a-83fa-4bbb-bc1f-0bb67d2a873d)

httpBody:

```json
{
  "errors": [
    {
      "id": "2cc6499a-83fa-4bbb-bc1f-0bb67d2a873d",
      "status": "409",
      "code": "STATE_ERROR.ARCHIVED_BACKGROUND_ASSET",
      "title": "operation not allowed",
      "detail": "Cannot create version for an archived background asset."
    }
  ]
}
```
3
0
184
5d
Apple’s age rating deadline: will apps be blocked after 31 Jan 2026?
Apple sent a final reminder asking developers to complete the updated age rating questions in App Store Connect:

"Final reminder: Answer the updated age ratings questions. We're reaching out because you have not provided responses to the updated age ratings questions in the App Information section of your app in App Store Connect. If you don't answer these questions by January 31, 2026, you won't be able to submit app updates in App Store Connect."

The email says that if the age rating questions are not answered by 31 January 2026, you will not be able to submit app updates. What is not clear is what actually happens after that date. Many of us are in the middle of development and may not be ready to submit a new build before the deadline. The email does not explain which of these it means:

A) You can still submit updates after 31 January 2026, as long as you complete the age rating questionnaire before submitting, or
B) The app becomes locked and cannot be updated at all once the deadline passes.

This is not stated explicitly in the email, which makes it confusing. It would be helpful if Apple could clearly confirm what developers should expect after 31 January 2026, especially for apps that are still under active development. Does anyone know?
4
3
461
5d
Menu presentation in UIHostingController issues
Looking to see if anyone has experienced this issue and is aware of any workarounds.

My app is migrating towards SwiftUI views but still uses UIKit for primary navigation, so in a lot of places it uses UIHostingController to push SwiftUI views onto a UINavigationController stack. With iOS 26, I notice that SwiftUI's Menu view really struggles to present when contained in a UIHostingController. An error is logged to the console on presentation, and depending on the UI, the Menu either won't present inside of its container or will jump around the screen. The bug seems to be rooted in a private class, _UIReparentingView, and I am curious whether anyone has found a workaround for this issue.

The error reported is:

Adding '_UIReparentingView' as a subview of UIHostingController.view is not supported and may result in a broken view hierarchy. Add your view above UIHostingController.view in a common superview or insert it into your SwiftUI content in a UIViewRepresentable instead.

The simplest way to see this issue is to create a new storyboard-based project, present a UIHostingController from the ViewController with a SwiftUI view that has a Menu, and then simply tap to open the Menu (see the sketch below).

Thanks for any input!
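For anyone trying to reproduce this, here is a minimal sketch of the setup the post describes; the action name and the demo view are placeholders (the @IBAction is assumed to be wired to a button in the storyboard), not the original poster's code:

```swift
import UIKit
import SwiftUI

// A SwiftUI view whose Menu triggers the reported presentation issue
// when hosted in a UIHostingController on iOS 26.
struct MenuDemoView: View {
    var body: some View {
        Menu("Show Menu") {
            Button("First action") { }
            Button("Second action") { }
        }
        .padding()
    }
}

final class ViewController: UIViewController {
    @IBAction func showMenuScreen(_ sender: Any) {
        // Presenting (or pushing) the hosting controller, then tapping the Menu,
        // reproduces the '_UIReparentingView' console error described above.
        let hosting = UIHostingController(rootView: MenuDemoView())
        present(hosting, animated: true)
    }
}
```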
7
3
629
2w
Stopping and Resuming Background Location Activity with CLLocationUpdates and CLBackgroundActivitySession
Hello,

This is my first post in the forums, and I'm still learning my way with iOS development and Swift. My apologies if the formatting is not correct or if I'm making any mistakes.

I'm currently trying to implement an iOS app where the device needs to share its location with my server via an API call. The use case is as follows: the server expects location updates to determine whether a device is inside or outside a geofence. If the device is stationary, no locations need to be sent. If the device begins moving, regardless of whether the app is in the foreground, in the background, or terminated, the app should resume posting locations to the server.

I've decided to use the CLLocationUpdate.liveUpdates() stream together with CLBackgroundActivitySession. However, I have not been able to achieve this behavior successfully. My app either keeps the blue background-activity indicator active regardless of whether the phone is stationary, or kills the indicator (and the background capability) and does not restore it when the device moves again. Below is my latest code snippet (with it, the indicator disappears and does not come back).

```swift
// This method is called in didFinishLaunchingWithOptions
func startLocationUpdates(precise: Bool) {
    // Show the location permission pop-up
    requestAuthorization()

    // Stop any previous sessions
    stopLocationUpdates()

    Task {
        do {
            // If we have the right authorization, we will launch the updates in the
            // background using CLBackgroundActivitySession
            if self.manager.authorizationStatus == .authorizedAlways {
                self.backgroundActivity = true
            } else {
                self.backgroundActivity = false
                self.backgroundSession?.invalidate()
            }

            // We will start collecting live location updates
            for try await update in CLLocationUpdate.liveUpdates() {
                // Handle deprecation
                let stationary = if #available(iOS 18.0, *) {
                    update.stationary
                } else {
                    update.isStationary
                }

                // If the update is identified as stationary, we will skip this update
                // and turn off background location updates
                if stationary {
                    self.backgroundSession?.invalidate()
                    continue
                }

                // If background activity is enabled, we restore the background activity session
                if backgroundActivity == true {
                    self.backgroundSession = CLBackgroundActivitySession()
                }

                guard let location = update.location else { continue }

                // Do POST with location to server
            }
        } catch {
            print("Could not start location updates")
        }
    }
}
```

I'm not sure why the code does not work as expected, and I believe I may be misunderstanding how these APIs work. My understanding is that the liveUpdates stream is capable of emitting values even after the app has gone to the background or been terminated, which is why I'm trying to stop and resume the background activity using the stationary (or isStationary) attribute from each update.

Is the behavior I'm trying to achieve possible? If so, am I using the right APIs, and is my implementation correct? If not, what would be the recommended approach?

Regards
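Not an authoritative answer, but one structural difference from Apple's sample code is worth noting: CLBackgroundActivitySession is generally created once, before the update loop, and kept alive while background updates are wanted; invalidating it and then recreating it while the app is already in the background may not restore the background capability. A minimal sketch of that alternative structure (the type and property names here are assumptions, not the poster's code):

```swift
import CoreLocation

final class LocationStreamer {
    // Keep a strong reference; invalidating or releasing the session
    // ends the app's claim to background location activity.
    private var backgroundSession: CLBackgroundActivitySession?

    func start() {
        // Created up front (while the app is in use), not from inside the loop.
        backgroundSession = CLBackgroundActivitySession()

        Task {
            do {
                for try await update in CLLocationUpdate.liveUpdates() {
                    // liveUpdates is designed to pause automatically while the device
                    // is stationary and resume when it moves again, so the session can
                    // simply stay alive instead of being invalidated per update.
                    guard let location = update.location else { continue }
                    await post(location)
                }
            } catch {
                print("Location updates failed: \(error)")
            }
        }
    }

    func stop() {
        backgroundSession?.invalidate()
        backgroundSession = nil
    }

    private func post(_ location: CLLocation) async {
        // POST to server (omitted)
    }
}
```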
2
1
71
5d
Attributes inspector in Xcode 26
It has been two years since I last wrote a SwiftUI app, and I wanted to start again in Xcode 26. I can no longer see the attributes inspector when I select an element in the canvas. This was an Xcode feature that was very helpful, as I am still a novice. Has this feature been deprecated in Xcode 26? If not, please help explain how I can find and use it.
3
3
304
2w
XCTFail immediately aborts the test in Xcode 26 — no retry on failure
Hi, I'm seeing an unexpected change in how XCTFail behaves in UI tests after updating Xcode. I use the following helper method:

```swift
func waitForExistance(file: StaticString, line: UInt) -> Self {
    if !(element.exists || element.waitForExistence(timeout: Configuration.current.predicateTimeout)) {
        XCTFail("couldn't find element: \(element) after \(Configuration.current.predicateTimeout) seconds", file: file, line: line)
        return self
    } else {
        return self
    }
}
```

In Xcode 16.4 this worked as expected: when an element wasn't found, XCTFail was triggered, but the test continued running, allowing my retry logic to execute. After updating to Xcode 26.1 / 26.2, the test now immediately aborts after XCTFail, without executing the next retry. The logs show:

```
t = 113.22s Tear Down
t = 113.22s Terminate com.viessmann.care:81789
*** Assertion failure in -[UITests.Tests _caughtUnhandledDeveloperExceptionPermittingControlFlowInterruptions:caughtInterruptionException:whileExecutingBlock:], XCTestCase+IssueHandling.m:273
Test Case '-[UITests.Tests test_case]' failed (114.323 seconds).
Flushing outgoing messages to the IDE with timeout 600.00s
Received confirmation that IDE processed remaining outgoing messages
```

It looks like XCTFail in Xcode 26 is now treated as an unhandled developer exception, which stops test execution immediately, even when it's called inside a helper method. This was not the case in earlier versions.

My questions:

- Is this a regression in XCTest, or an intentional change in how XCTFail behaves in newer Xcode versions?
- Should failures now be reported differently (e.g., using record(.init(type: .assertionFailure, …))) if I want to continue the test instead of aborting it?

I would like to restore the previous behavior, where the failure is logged without terminating the entire test, so that my retry mechanism can still run. Has anyone else run into this after upgrading? If it helps, I can also share the workarounds that still work for me with Xcode 16.4 (e.g., replacing XCTFail with a non-terminating issue record). Thanks in advance!
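For reference, a failure can be reported by recording an XCTIssue instead of calling XCTFail; whether this restores the old control flow in Xcode 26 is unverified, so treat it as a sketch (the helper name is a placeholder):

```swift
import XCTest

extension XCTestCase {
    /// Records a failure while asking XCTest to keep running the test,
    /// so retry logic like the helper above can still execute.
    func reportNonFatalFailure(_ message: String) {
        // Keep executing the test after this recorded failure.
        continueAfterFailure = true
        record(XCTIssue(type: .assertionFailure, compactDescription: message))
    }
}
```

In the helper, this would be called in place of XCTFail, e.g. `reportNonFatalFailure("couldn't find element: \(element) ...")`.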
3
3
196
6d
In visionOS 26.2 RC, pushWindow + dismissWindow is broken
I recently added pushWindow to my app, and I discovered that in visionOS 26.2 RC (23N301), pushWindow followed by dismissWindow no longer works as expected. Specifically, if the user moves the pushed window, then when the pushed window is later dismissed, the parent window's position isn't aligned with the pushed window's new position; its original position is restored instead.

Curiously, the bug only happens when the app is launched from the visionOS home view, not when it is launched from Xcode. It also doesn't happen in the visionOS 26.2 simulator. Another interesting detail: while the parent window is hidden, if the user long-presses the Digital Crown and then dismisses the pushed window, the parent window's position seems to be immune to the Digital Crown scene reorientation; it is restored to its original real-world position.

Demo video: https://youtu.be/zR3t2ON3Wz0

I've submitted feedback as FB21287011 with a sample app and detailed repro steps. Has anyone else encountered this issue and figured out a workaround? It would be nice if I could get pushWindow to work correctly in my app. Thanks, everybody! 😀
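For anyone unfamiliar with the flow being described, here is a minimal sketch of the push/dismiss pattern involved; the window ids and views are placeholders, not the poster's sample app:

```swift
import SwiftUI

@main
struct PushWindowDemoApp: App {
    var body: some Scene {
        WindowGroup(id: "main") {
            MainView()
        }
        WindowGroup(id: "helper") {
            HelperView()
        }
    }
}

struct MainView: View {
    @Environment(\.pushWindow) private var pushWindow

    var body: some View {
        // Pushing hides this (parent) window and opens the "helper" window in its place.
        Button("Push helper window") {
            pushWindow(id: "helper")
        }
    }
}

struct HelperView: View {
    @Environment(\.dismissWindow) private var dismissWindow

    var body: some View {
        // Dismissing the pushed window should bring the parent window back aligned with
        // the pushed window's current position; the report is that on visionOS 26.2 RC
        // it reappears at its original position instead when launched from the home view.
        Button("Dismiss") {
            dismissWindow(id: "helper")
        }
    }
}
```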
2
2
285
1w
Declared Age Range: How to support age verification on iOS < 26?
Hello, we are getting in touch because we need some guidance from Apple regarding age verification for minors in our app. Our app supports iOS 17 and above. The Declared Age Range API is available only starting with iOS 26, but we must comply with legal requirements (e.g., Texas SB 2420) and ensure that minor users cannot access certain sections of the app, regardless of the operating system version.

Our question: what is the correct, Apple-approved approach for handling age verification and restricting access for minor users on iOS versions prior to 26, given that the Declared Age Range API is not available on those systems? We want to ensure that our implementation aligns with the regulations, the App Store Review Guidelines, and platform expectations.
2
3
648
6d
Facing 2 issues: uploading for external testers and downloading the app for internal testers, although the app works perfectly when built directly to my phone
Although the application functions correctly when deployed locally to a physical iPhone using Flutter (direct debug/development build via Xcode), we are encountering the following issues only within App Store Connect / TestFlight:

1. Internal testing issue: After completing all required steps in App Store Connect and adding internal testers, testers are unable to install the app. TestFlight displays the error: "Could not install app. The requested app is not available or doesn't exist."

2. External testing issue: When attempting to add a processed build to an external testing group (public link), App Store Connect returns the following message: "There was an error processing your request. Please try again later." The build has successfully uploaded and processed, and is marked as Ready to Submit in TestFlight.

These issues do not occur during direct device deployment and appear to be isolated to TestFlight / App Store Connect. We would appreciate guidance on how to solve this issue, which is preventing TestFlight distribution.

Note: We have tried multiple builds and bumped the version 4 times, and we have signed all agreements.
1
3
84
1w
Flutter library that makes a call every "x" minutes while the app is in the background
Hi everyone, could you help us? We implemented a Flutter library that makes a call every x minutes while the app is in the background, but when I generate a build via TestFlight for testing, it doesn't work. Can you help us understand why? Below is a more detailed technical description.

Apple Developer Technical Support Request

Subject: BGTaskScheduler / Background Tasks Not Executing in TestFlight - Flutter App with workmanager Plugin

## Issue Summary

Background tasks scheduled using BGTaskScheduler are not executing when the app is distributed via TestFlight. The same implementation works correctly when running the app locally via USB/Xcode debugging. We are developing a Flutter application that needs to perform periodic API calls when the app is in the background. We have followed all documentation and implemented the required configurations, but background tasks are not being executed in the TestFlight build.

## App Information

| Field | Value |
| --- | --- |
| App Version | 3.1.15 (Build 311) |
| iOS Minimum Deployment Target | iOS 15.0 |
| Framework | Flutter |
| Flutter SDK Version | ^3.7.2 |

## Technical Environment

### Flutter Dependencies (Background Task Related)

| Package | Version | Purpose |
| --- | --- | --- |
| workmanager | ^0.9.0+3 | Main background task scheduler (uses BGTaskScheduler on iOS 13+) |
| flutter_background_service | ^5.0.5 | Background service management |
| flutter_background_service_android | ^6.2.4 | Android-specific background service |
| flutter_local_notifications | ^19.4.2 | Local notifications for background alerts |
| timezone | ^0.10.0 | Timezone support for scheduling |

### Other Relevant Flutter Dependencies

| Package | Version |
| --- | --- |
| firebase_core | 4.0.0 |
| firebase_messaging | (via native Podfile) |
| sfmc (Salesforce Marketing Cloud) | ^9.0.0 |
| geolocator | ^14.0.0 |
| permission_handler | ^12.0.0+1 |

## Info.plist Configuration

We have added the following configurations to Info.plist.

### UIBackgroundModes

```xml
<key>UIBackgroundModes</key>
<array>
    <string>location</string>
    <string>remote-notification</string>
    <string>processing</string>
</array>
```

### BGTaskSchedulerPermittedIdentifiers

```xml
<key>BGTaskSchedulerPermittedIdentifiers</key>
<array>
    <string>br.com.unidas.apprac.ios.workmanager.carrinho_api_task</string>
    <string>br.com.unidas.apprac.ios.workmanager</string>
    <string>be.tramckrijter.workmanager.BackgroundTask</string>
</array>
```

**Note:** We included multiple identifier formats as recommended by the `workmanager` Flutter plugin documentation:

1. `{bundleId}.ios.workmanager.{taskName}` - Custom task identifier
2. `{bundleId}.ios.workmanager` - Default workmanager identifier
3. `be.tramckrijter.workmanager.BackgroundTask` - Plugin's default identifier (as per plugin documentation)

## AppDelegate.swift Configuration

We have configured `AppDelegate.swift` with the following background processing setup:

```swift
// In application(_:didFinishLaunchingWithOptions:)
// Configuration to enable background processing via WorkManager.
// The "processing" mode in UIBackgroundModes allows WorkManager to use BGTaskScheduler (iOS 13+).
// This is required to execute scheduled tasks in the background (e.g., API calls).
// Note: the user still needs Background App Refresh enabled in iOS Settings.
if UIApplication.shared.backgroundRefreshStatus == .available {
    // Allows the iOS system to schedule background tasks with the minimum interval
    UIApplication.shared.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
}
```

## WorkManager Implementation (Dart/Flutter)

### Initialization

```dart
/// Initializes WorkManager
static Future<void> initialize() async {
  await Workmanager().initialize(callbackDispatcher, isInDebugMode: false);
  print('WorkManagerService: WorkManager initialized');
}
```

### Task Registration

/// Schedules API execution after a specific delay

## Observed Behavior

### Works (Debug/USB Connection)

- When running the app via Xcode/USB debugging
- Background tasks are scheduled and executed as expected
- API calls are made successfully when the app is backgrounded

### Does NOT Work (TestFlight)

- When the app is distributed via TestFlight
- Background tasks appear to be scheduled (no errors in code)
- Tasks are **never executed** when the app is in the background
- We have tested with:
  - Background App Refresh enabled in iOS Settings
  - App used frequently
  - Device connected to WiFi and charging
  - Waited for extended periods (hours)

## Key Questions

1. **Are there any additional configurations required for `BGTaskScheduler` to work in TestFlight/Production builds that are not required for debug builds?**
2. **Is the identifier format correct?** We are using `br.com.unidas.apprac.ios.workmanager.carrinho_api_task` - should it match exactly with the task name registered in code?
3. **Are there any known issues with Flutter's `workmanager` plugin and iOS BGTaskScheduler in production environments?**
4. **Is there any way to verify through logs or system diagnostics whether the background tasks are being rejected by the system?**
5. **Could there be any conflict between our other background modes (`location`, `remote-notification`) and `processing`?**
6. **Does the Salesforce Marketing Cloud SDK (SFMC) interfere with BGTaskScheduler operations?**

## Additional Context

- We have verified that Background App Refresh is enabled for our app in iOS Settings
- The app has proper entitlements for push notifications and location services
- Firebase, SFMC (Salesforce Marketing Cloud), and other SDKs are properly configured
- The issue is **only** present in TestFlight builds, not in debug/USB-connected builds

## References

- [Apple Documentation - BGTaskScheduler](https://developer.apple.com/documentation/backgroundtasks/bgtaskscheduler)
- [Apple Documentation - Choosing Background Strategies](https://developer.apple.com/documentation/backgroundtasks/choosing_background_strategies_for_your_app)

Thank you
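For comparison, this is roughly what the workmanager plugin wraps natively on iOS; a minimal sketch of registering and submitting a BGProcessingTask directly, using one of the identifiers above (the delay, network flag, and handler body are placeholder assumptions):

```swift
import BackgroundTasks

let taskIdentifier = "br.com.unidas.apprac.ios.workmanager.carrinho_api_task"

// Call once, before the end of application(_:didFinishLaunchingWithOptions:).
func registerBackgroundTask() {
    BGTaskScheduler.shared.register(forTaskWithIdentifier: taskIdentifier, using: nil) { task in
        handle(task: task as! BGProcessingTask)
    }
}

// Call when the app moves to the background to request a future run.
func scheduleBackgroundTask() {
    let request = BGProcessingTaskRequest(identifier: taskIdentifier)
    request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60) // only a hint to the system
    request.requiresNetworkConnectivity = true
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Could not schedule background task: \(error)")
    }
}

private func handle(task: BGProcessingTask) {
    // Re-schedule the next run, then do the work (e.g., the API call).
    scheduleBackgroundTask()
    task.expirationHandler = {
        // Clean up if the system ends the task early.
    }
    // ... perform the API call, then:
    task.setTaskCompleted(success: true)
}
```

Even with this in place, BGTaskScheduler treats earliestBeginDate as a hint: the system decides when, and whether, to launch processing tasks, so exact "every x minutes" execution is not guaranteed in release or TestFlight builds.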
2
0
58
3w
My app doesn't respond on iPhone Air iOS 26.1.
My app doesn't respond on iPhone Air with iOS 26.1. After startup, the app shows the main view: a tab bar controller containing 4 navigation controllers. However, when a second-level view controller is pushed onto any of the navigation controllers, the UI freezes and becomes unresponsive. The iPhone simulator running iOS 26.1 exhibits the same problem, and the debug profile shows CPU usage at 100%. Other devices and simulators do not have this problem.
6
3
319
1w
Issue with multiple touches with "Defer System Gestures" enabled on iOS
I'm developing a rhythm game for iOS which has four buttons spanning the width of the screen in portrait. I noticed that my testers were having some missed inputs on the buttons on the left and right, due to the fact that iOS, by default, tries to ignore accidental touches on the edges of the screen. So I enabled "Defer System Gestures" on the left and right edges, but then quickly started to notice a new, very specific issue.

Description of the issue

If finger #1 is touching and holding anywhere in the middle of the screen, and finger #2 touches the far right or left edge of the screen just below the vertical position of finger #1, those touches are inconsistently not recognized. If finger #1 is not present, this issue does not occur. If finger #2 is above or well below finger #1, this issue also does not occur. A dead zone is created on the right and left edges of the screen just below the vertical position of the first touch.

Here is a rough representative example of where touches #1 and #2 need to be for this issue to manifest, in case the text above is not clear:

```
|            |
|            |
|            |
|            |
|   1        |
|           2|
|            |
```

It just so happens that this issue is causing major usability problems with my game, as it results in what the user sees as sporadic and inconsistent response when the game calls for two notes to be played at the same time.

Steps to recreate the issue

Here are the steps if you want to recreate the problem yourself using the "Create New Gesture" pane in Assistive Touch. (Note that this problem is not specific to the Settings app; it is an issue across the system. However, this pane defers system gestures and shows where touches are being read, so it is a great place to demonstrate.)

1. Go to Settings > Accessibility > Touch > Assistive Touch > Create New Gesture...
2. With one finger, touch the middle of the screen and hold it through step 3.
3. With a second finger, tap 4 times along the right (or left) edge of the screen in the following places: (a) well above the vertical position of the first touch, (b) just above the vertical position of the first touch, (c) just below the vertical position of the first touch, and (d) well below the vertical position of the first touch.
4. Notice how, more than half the time, touch (c) does not register.

I have found that this problem is more replicable when the first touch is on the lower half of the screen, but I have been able to replicate it when the finger is higher as well, just not as consistently.
Here are the four positions described in the steps above:

Position a: both touches register

```
|            |
|            |
|            |
|           2|
|   1        |
|            |
|            |
```

Position b: both touches usually register

```
|            |
|            |
|            |
|            |
|   1       2|
|            |
|            |
```

Position c: only touch 1 registers

```
|            |
|            |
|            |
|            |
|   1        |
|           2|
|            |
```

Position d: both touches register

```
|            |
|            |
|            |
|            |
|   1        |
|            |
|           2|
```

Is there anything I can do to resolve this behavior? My app requires gesture deferment to be on for the expected experience by the user, and this bug is causing other issues for my testers that kind of need to be resolved before I can confidently release the game.
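For context, deferring the edge gestures in a UIKit game view controller looks roughly like the sketch below. It only reproduces the configuration described above; it does not fix the dead-zone behavior, which appears to be a system-level interaction worth reporting via Feedback Assistant.

```swift
import UIKit

final class GameViewController: UIViewController {
    // Ask the system to defer its own gestures on the left and right edges,
    // so the game's edge buttons receive touches immediately.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        [.left, .right]
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Tell UIKit that the preference above may have changed.
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
    }
}
```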
2
2
1.3k
1w
CoreML regression between macOS 26.0.1 and macOS 26.1 Beta causing scrambled tensor outputs
We've encountered what appears to be a CoreML regression between macOS 26.0.1 and macOS 26.1 Beta. In macOS 26.0.1, CoreML models run and produce correct results. However, in macOS 26.1 Beta, the same models produce scrambled or corrupted outputs, suggesting that tensor memory is being read or written incorrectly. The behavior is consistent with a low-level stride or pointer arithmetic issue, for example using 16-bit strides on 32-bit data, or other mismatches in tensor layout handling.

Reproduction:

1. Install ON1 Photo RAW 2026 or ON1 Resize 2026 on macOS 26.0.1.
2. Use the newest Highest Quality resize model, which is Stable Diffusion-based and runs through CoreML.
3. Observe correct, high-quality results.
4. Upgrade to macOS 26.1 Beta and run the same operation again. The output becomes visually scrambled or corrupted.

We are also seeing similar issues with another Stable Diffusion UNet model that previously worked correctly on macOS 26.0.1. This suggests the regression may affect multiple diffusion-style architectures, likely due to a change in CoreML's tensor stride, layout computation, or memory alignment between these versions.

Notes:

- The affected models are exported using standard CoreML conversion pipelines.
- No custom operators or third-party CoreML runtime layers are used.
- The issue reproduces consistently across multiple machines.

It would be helpful to know whether there were changes to CoreML's tensor layout, precision handling, or MLCompute backend between macOS 26.0.1 and 26.1 Beta, or whether this is a known regression in the current beta.
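One way to narrow down whether the regression is backend-specific is to run the same model on different compute units and compare the outputs. A minimal diagnostic sketch, assuming a compiled model at a placeholder path and a single multi-array input named "input" (both are assumptions, not details from the report):

```swift
import CoreML

// Hypothetical path and feature name, for illustration only.
let modelURL = URL(fileURLWithPath: "/path/to/Model.mlmodelc")

func runModel(computeUnits: MLComputeUnits, input: MLMultiArray) throws -> MLFeatureProvider {
    let config = MLModelConfiguration()
    config.computeUnits = computeUnits // .cpuOnly, .cpuAndGPU, .cpuAndNeuralEngine, or .all
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    let provider = try MLDictionaryFeatureProvider(dictionary: ["input": input])
    return try model.prediction(from: provider)
}

// Comparing the .cpuOnly output (usually the reference) against .all on macOS 26.1 Beta
// would indicate whether the corruption comes from a specific backend.
```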
5
3
1.6k
4w
I’m desperate…
Hello everyone,

I'll be honest with you, with the kind of honesty that comes when you've run out of places to turn. My name is Donovan, I'm a French student, and I'm writing here because at this point I truly don't know what else to do.

Two years ago, I started a small project to learn mobile development. Nothing ambitious at first, just a personal exercise, a way to grow. But after countless late nights, sacrificed weekends, and lines of code no one will ever see, that small project became a real application. I finished it. Refined it. Carried it like something that genuinely mattered.

For the past two months, I've been fighting with App Review, always for the same reason: Guideline 4.3(b) - Design - Spam. Each time, I respond. Each time, I explain. And each time, the door closes with the same cold, impersonal message: "We encourage you to reconsider your app concept and submit a new app that provides a unique experience not already found on the App Store."

Unique. Such an easy word to use, especially when no one seems willing to look closely at what's actually in front of them. My application is a dating app, yes. I know there are many on the App Store. But I implemented a feature that no other dating app currently provides, and more importantly, the app is 100% free: no paywalls, no mandatory subscription. As far as I know, there is no completely free dating application on the App Store. I even added features that were never planned, just to avoid being dismissed as "spam." But nothing has changed.

Two years of work. Two years of progressing 4-5 hours per week, between classes, exams, and everything else life throws your way. And now it feels like all of that can be wiped away by a single generic sentence.

So here I am, turning to you, the developer community, the only people who truly understand what it means to run into an invisible wall. I need your help: your advice, your experiences, your strategies. How can I get Apple to finally accept my app? How can I avoid throwing away two years of work because of a vague, unexplained guideline?

Thank you in advance to anyone who takes the time to reply.
2
0
159
1w