We tried to fetch recorded PPG data using SensorKit with the following code, but the didFetchResult delegate method is never called.
import SensorKit

let ppgReader = SRSensorReader(sensor: .photoplethysmogram)
let request = SRFetchRequest()

// Query a 24-hour window that ends 25 hours ago, since SensorKit holds
// recorded data for 24 hours before making it available to apps.
let nowDate = Date()
let toDate = nowDate.addingTimeInterval(-25 * 60 * 60)
let fromDate = toDate.addingTimeInterval(-24 * 60 * 60)
request.from = SRAbsoluteTime.fromCFAbsoluteTime(_cf: fromDate.timeIntervalSinceReferenceDate)
request.to = SRAbsoluteTime.fromCFAbsoluteTime(_cf: toDate.timeIntervalSinceReferenceDate)

ppgReader.delegate = self
ppgReader.fetch(request)
The didCompleteFetch delegate method is called successfully:
func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest)
But didFetchResult is never called:
func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool
Any ideas why? (I have been wearing the watch for a couple of days and have confirmed it has data for the time period I am querying.)
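For completeness, this is roughly how we request authorization and start recording before fetching (a sketch; we may be missing a step such as scoping the fetch to a specific SRDevice via fetchDevices()):

import SensorKit

// Sketch: authorization plus recording setup that runs before the fetch above,
// and the delegate callback (returning true keeps results coming).
final class PPGFetcher: NSObject, SRSensorReaderDelegate {
    let reader = SRSensorReader(sensor: .photoplethysmogram)

    func start() {
        SRSensorReader.requestAuthorization(sensors: [.photoplethysmogram]) { [weak self] error in
            guard error == nil, let self = self else {
                print("SensorKit authorization failed: \(String(describing: error))")
                return
            }
            self.reader.delegate = self
            // Recording must be started; samples become fetchable only after
            // SensorKit's 24-hour holding period.
            self.reader.startRecording()
        }
    }

    func sensorReader(_ reader: SRSensorReader,
                      fetching fetchRequest: SRFetchRequest,
                      didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        print("Fetched PPG result at \(result.timestamp)")
        return true
    }
}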
One thing I noticed: when Apple granted us the entitlement, it uses uppercase for ECG and PPG, whereas the documentation uses lowercase in the plist: https://developer.apple.com/documentation/sensorkit/srsensor/photoplethysmogram
Does it matter?
I am getting this error when I try to show a Device Activity Report view with DeviceActivityReport(appsContext, filter: filter):
Attempt to map database failed: permission was denied. This attempt will not be retried.
I have requested authorization this way: AuthorizationCenter.shared.requestAuthorization(for: .individual).
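For reference, here's roughly how I request authorization and present the report; appsContext and the filter values below are my own names, so treat this as a sketch rather than the exact code:

import SwiftUI
import DeviceActivity
import FamilyControls

struct ReportView: View {
    // Context name must match the one handled by the report extension.
    let appsContext = DeviceActivityReport.Context("Apps Activity")

    // Today's usage on this device, for the current user.
    let filter = DeviceActivityFilter(
        segment: .daily(during: Calendar.current.dateInterval(of: .day, for: .now)!),
        users: .all,
        devices: .init([.iPhone])
    )

    var body: some View {
        DeviceActivityReport(appsContext, filter: filter)
            .task {
                do {
                    try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
                } catch {
                    print("Screen Time authorization failed: \(error)")
                }
            }
    }
}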
Hey there! So, I'm trying to see what I'm able to do with the Device Activity Report Extension, and I have a few questions about the following quote:
To protect the user’s privacy, your extension runs in a sandbox. This sandbox prevents your extension from making network requests or moving sensitive content outside the extension’s address space.
In particular, what constitutes the address space for this extension?
Can I save data to a UserDefaults object that only the extension can access? (Apps like Opal allow the user to label apps as "distracting" and "non-distracting", and I'm wondering how they do that!)
From what I've read, I believe it cannot write to a shared app group or model (and I just want to confirm this)
It also seems that there's nothing preventing it from reading data from the main app, so I'm just wondering if it's able to read data from an app group or model with no problem.
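For context, the kind of read I'm hoping is allowed looks like this (a sketch; the app group ID and key are made up, and I understand writes from the extension may not persist given the sandbox):

import Foundation

// Inside the Device Activity Report extension: read labels the main app wrote
// into a shared app group container. (Hypothetical group ID and key.)
let sharedDefaults = UserDefaults(suiteName: "group.com.example.screentime")
let distractingBundleIDs = sharedDefaults?.stringArray(forKey: "distractingApps") ?? []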
Thanks in advance!
Topic:
App & System Services
SubTopic:
General
Tags:
Family Controls
Device Activity
Screen Time
Privacy
Hi DTS / Apple engineers,
We're attempting to extend our Screen Time app target to Mac Catalyst. On iOS, FamilyControls works as expected (AuthorizationCenter + FamilyActivityPicker, then ManagedSettings shields + DeviceActivity monitoring/reporting).
On Mac Catalyst:
The project builds with FamilyControls/DeviceActivity/ManagedSettings capabilities enabled.
But attempting to request FamilyControls authorization (or present FamilyActivityPicker) fails at runtime. We see errors similar to:
Failed to get service proxy: The connection to service named com.apple.FamilyControlsAgent was invalidated: failed at lookup with error 159 - Sandbox restriction.
And our app stays authorizationStatus == .notDetermined, with the request failing.
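For reference, the call that fails is essentially the same one we use on iOS; this is a sketch with extra error logging added so the failure surfaces:

import FamilyControls

// Same request we make on iOS, with logging to capture the Catalyst failure.
func requestScreenTimeAuthorization() async {
    do {
        try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
        print("Authorization status: \(AuthorizationCenter.shared.authorizationStatus)")
    } catch {
        // On Catalyst, this is where the FamilyControlsAgent / sandbox error appears.
        print("FamilyControls authorization failed: \(error)")
    }
}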
We saw an Apple engineer suggestion to “disable App Sandbox”, but Mac Catalyst apps appear to always be sandboxed, so we can’t disable it.
Questions:
Is FamilyControls authorization supported on Mac Catalyst today? If so, what entitlement/capability is required specifically for Catalyst/macOS?
If FamilyControls auth cannot succeed on Catalyst, does that mean ManagedSettings shields and DeviceActivity monitoring/reporting are effectively unusable on Catalyst (since they depend on that authorization)?
Is there an Apple‑recommended approach for a Catalyst “portal” app that mirrors an iOS child device’s restrictions, or is local enforcement on Catalyst intentionally unsupported?
Any guidance (and any official docs that clarify current platform support) would be hugely appreciated.
Topic:
App & System Services
SubTopic:
General
Tags:
Mac Catalyst
Family Controls
Device Activity
Managed Settings
Since the iOS 26.2 update, we have been experiencing anomalous behavior with the DeviceActivityMonitor extension when utilizing the ScreenTime API. Specifically, we are receiving the eventDidReachThreshold event within a few minutes of initiating monitoring, despite configuring a high usage limit.
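For context, this is roughly how we start monitoring; the names and the three-hour threshold are illustrative, but the threshold event still fires within minutes of startMonitoring:

import DeviceActivity
import FamilyControls

// Sketch: all-day repeating schedule with a deliberately high (3-hour) usage threshold.
func startDailyMonitoring(for selection: FamilyActivitySelection) throws {
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )
    let event = DeviceActivityEvent(
        applications: selection.applicationTokens,
        threshold: DateComponents(hour: 3)
    )
    try DeviceActivityCenter().startMonitoring(
        DeviceActivityName("daily"),
        during: schedule,
        events: [DeviceActivityEvent.Name("usageLimit"): event]
    )
}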
Turning Screen Time off, restarting the device, and turning Screen Time back on does not help.
Any ideas?
Thanks
Filed Feedback Assistant: FB21560904
The Problem
The Family Activity Picker shows only the child's app categories on the guardian's/parent's device.
The application names from the child's device are not showing on the guardian's/parent's device.
The authorization is done on the child's device via
try await AuthorizationCenter.shared.requestAuthorization(for: .child)
Usage of the family activity picker on the guardian's/parent's device
struct ContentView: View {
    @State private var isPresented = true
    @StateObject private var familyControlsHelper = FamilyControlsHelper.shared

    var onClose: () -> Void

    var body: some View {
        ZStack {
            Color.black.opacity(0.1).ignoresSafeArea()
        }
        .familyActivityPicker(
            isPresented: $isPresented,
            selection: $familyControlsHelper.familyActivitySelection
        )
        .onChange(of: isPresented) { _ in
            if !isPresented {
                onClose()
            }
        }
    }
}
IMPORTANT
Both devices are real (not simulators), and the app has been granted the Family Controls (Distribution) entitlement.
Question
Is this the expected behavior, or should the child's apps appear on the guardian's device?
Thanks.
Reposting this in case it got missed the first time around:
https://developer.apple.com/forums/thread/775900
A question came up when we were comparing data from WeatherKit to other sources: WeatherKit visibility was well beyond the boundaries we had seen historically, even from Dark Sky. That raises two questions:
Is visibility actually in meters, as the docs say?
Is this visibility at ground level, 500 ft, or some other height?
We were seeing visibility numbers of up to 40 miles (after converting the value the API sent to miles), whereas all of our other sources usually report within 10 miles.
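For reference, the conversion we apply, assuming the value really is in meters as documented (a sketch using Foundation's Measurement API):

import Foundation

// Convert WeatherKit's visibility (documented as meters) to miles for comparison.
let visibilityMeters = 64_374.0   // example value from an API response
let visibilityMiles = Measurement(value: visibilityMeters, unit: UnitLength.meters)
    .converted(to: .miles)
    .value                        // ~40 miles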
Issue Description:
When an alarm is set for a time earlier than the current system time, and the system time is manually adjusted back to before the alarm time, the alarm does not ring when the scheduled time is reached.
Steps to Reproduce:
Current time is 23:34
Set an alarm for 23:30 (earlier than the current time)
Manually change the system time to 23:28
Wait until the time reaches 23:30
The alarm does not ring
Device:
iPhone 14 Pro / iOS 26.2
Topic:
App & System Services
SubTopic:
General
My CoreSpotlight extension seems to exceed the 6 MB memory limit. What’s the best way to debug this?
I've tried attaching the debugger on the Simulator, but the extension never seems to be launched when I trigger the reindex from Developer settings. Is this supposed to work?
On device, I am able to attach the debugger. However, I can neither transfer the debug session to Instruments nor display the memory graph, so I have no idea how the memory is being used.
Any recommendations on how to move forward? Is there a way to temporarily disable the memory limit? Even with LLDB attached, the extension is killed.
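In the meantime, I'm logging the extension's footprint myself at a few points in the indexing code. A minimal sketch using task_info (as far as I know, phys_footprint is the figure the memory limit is enforced against):

import Foundation

// Returns the current physical memory footprint of the process, in bytes.
func currentFootprintBytes() -> UInt64? {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<natural_t>.size
    )
    let result = withUnsafeMutablePointer(to: &info) { infoPtr in
        infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), $0, &count)
        }
    }
    return result == KERN_SUCCESS ? info.phys_footprint : nil
}

// Example: log before and after building a batch of searchable items.
if let bytes = currentFootprintBytes() {
    print("Footprint: \(bytes / 1024) KB")
}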
I'm developing a proximity tool on macOS Tahoe 26.2 (M4 MacBook Pro) to detect when my iPhone leaves the immediate vicinity of my MacBook.
Does NearbyInteraction on macOS support persistent background sessions for detecting peer absence (didInvalidate/timeout), or is CoreBluetooth still required as the keep-alive trigger?
We have created an app that uses App Intents to plug into Siri. However, launching the app sometimes presents a menu that lets the user choose between our app and Contacts. Why? How can I tell Siri not to ask about Contacts?
Are there plans to provide APIs or methods for determining the status of the Call Blocking extension and the Message Filter extension in future releases? This is not currently available.
We've been using the WeatherKit API for a few years now. Everything has been pretty stable. We'll periodically get 404 errors, but they usually disappear within a couple days.
Starting March 5th, we've again been getting 404 errors that slowly ramped up through March 20th and have continued since. We have had no code changes on our end, so something seems to have changed or broken on the server side.
Here are some example API calls that are giving us a 404 error now
https://weatherkit.apple.com/api/v1/weather/en/35.9981205/-78.8920444?dataSets=forecastDaily&dailyStart=2025-03-21T05:00:00Z&timezone=America/New_York&countryCode=US
https://weatherkit.apple.com/api/v1/weather/en/41.4789363/-81.7404134?dataSets=forecastDaily&dailyStart=2025-03-21T04:56:00Z&timezone=America/New_York&countryCode=US
Does anyone have any insights or information on this?
Also, if Apple is listening, an error more meaningful than 404 would be much appreciated.
Hi,
I would like to reset the system window picker privacy alert used with ScreenCaptureKit. I can reset the ScreenCapture permission with tccutil reset ScreenCapture, but that does not reset the system window picker alert. I also tried deleting the application's directory from its container, which does not help: the system window picker alert keeps using the old approval I gave and never prompts again. How can I start with fresh ScreenCaptureKit settings for an app during testing?
Thanks
I’m configuring App Clip launch behavior and would appreciate some clarification.
In my setup, the App Clip launch URL is the same as the deep link used to open the full app. Both are configured in the Apple App Site Association (AASA) file.
Observed behavior:
Scanning a QR code with this URL correctly launches the App Clip.
Tapping the same URL when it’s shared (for example, via Messages) launches the full app via the deep link instead of the App Clip experience.
I’m reviewing the documentation here:
https://developer.apple.com/documentation/appclip/configuring-the-launch-experience-of-your-app-clip#Choose-App-Clip-experiences-you-want-to-support
The table mentions that an App Clip can be invoked via “A shared link to an App Clip in the Messages app.” However, when I tap the shared link in Messages, the deep link experience is triggered instead of the App Clip.
My questions are:
Is this behavior expected when the App Clip URL and the app’s deep link URL are the same?
Does launching an App Clip from a shared Messages link require a distinct URL or additional configuration beyond what’s in the AASA file?
Are there specific constraints or priorities between universal links for the full app and App Clip invocation in this scenario?
Any clarification or guidance would be greatly appreciated.
Thank you.
We integrated the DeviceCheck framework into our app about a year ago to prevent fraudulent calls to our app service.
Recently, we received a few cases related to this function over the Christmas Eve period.
Based on the logs we have, both of the following functions returned errors. Unfortunately we did not log the exact errors, and we cannot reproduce the issue now.
DCAppAttestService.shared.attestKey()
DCAppAttestService.shared.generateAssertion()
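For reference, here's roughly how we call them, now with error logging added so future occurrences capture the exact error domain and code (a sketch; key generation and server-side verification are omitted):

import DeviceCheck
import CryptoKit

func attestAndAssert(keyId: String, requestBody: Data) async {
    let service = DCAppAttestService.shared
    guard service.isSupported else { return }

    let clientDataHash = Data(SHA256.hash(data: requestBody))
    do {
        let attestation = try await service.attestKey(keyId, clientDataHash: clientDataHash)
        let assertion = try await service.generateAssertion(keyId, clientDataHash: clientDataHash)
        // Send attestation/assertion to our server for verification...
        _ = (attestation, assertion)
    } catch {
        let nsError = error as NSError
        print("App Attest failed: domain=\(nsError.domain) code=\(nsError.code) \(nsError.localizedDescription)")
    }
}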
Another finding: some users reporting this issue recently upgraded their devices from iOS 18 to iOS 26.
So we suspect it is due to either the OS upgrade or a degradation in Apple's App Attest service.
Has anyone encountered similar issues before, or have any idea about the root cause? Thanks!
Dear Apple,
while implementing the Declared Age Range API in my app, I've noticed a mistake in the documentation: the isEligibleForAgeFeatures property is marked 26.0+ in the documentation but 26.2+ in Xcode, which ultimately makes it impossible to use on OS versions below 26.2.
Moreover, I'm thoroughly confused by this quote from documentation:
This flag returns true on iOS and iPadOS based on a person’s eligibility and always returns false on macOS.
It leads me to two questions:
Is it possible to use the Declared Age Range API for macOS apps? Will it be possible to use it in the future?
Will there be any changes regarding this matter in the meantime (especially after Jan 1st)?
If yes, when should we expect these changes?
If no, why does this API declare macOS 26+ support alongside iOS/iPadOS if it simply doesn't work on macOS right now?
As of now, my iOS app works flawlessly with this API (on iOS 26.2), while my macOS app returns isEligibleForAgeFeatures = false and the requestAgeRange request always throws AgeRangeService.Error.notAvailable.
Also, does this mean one should not rely on the isEligibleForAgeFeatures boolean when implementing the Declared Age Range API for apps targeting iOS below 26.2 (i.e. 26.0 and 26.1)? Or is implementing the API for iOS 26.2+ only a sufficient way to go? In that case, shouldn't the whole API be marked as 26.2+?
The minimum iOS version in my app is 16.0 and the minimum macOS version is 13.0, so a significant part of my users is left out of these updates anyway, but the main goal here is legal compliance.
Hi everyone,
I’ve removed my App Clip completely:
Deleted all Advanced App Clip Experiences
Removed the App Clip target from my build
Removed App Clip references from my apple-app-site-association file
Deleted the meta tag from my website:
But when I scan the QR code, the App Clip card still appears with:
"This App Clip is not currently available in your country or region."
Does anyone know why this is still showing and how to fully remove it? We need our website to open when this QR code is scanned.
Thanks!
I've discovered a bug in the Phone app on iOS related to how long verdicts are displayed.
When a call is identified by a third-party Caller ID app, long verdicts display correctly during the call (they auto-scroll) and in the call log (with an ellipsis at the end). However, on the call details screen, the text is strangely truncated - showing only the beginning of the string and the last word.
For testing, I used this verdict: "Musclemen grow on trees. They can tense their muscles and look good in a mirror. So what? I'm interested in practical strength that's going to help me run, jump, twist, punch."
I'll attach screenshots demonstrating the problem:
I'm building a voice-to-text keyboard extension that needs to open the main app briefly for audio recording (since keyboard extensions can't record audio), then return the user to their original app.
The flow I'm trying to achieve:
User is in WhatsApp (or Messages, Slack, etc.)
User taps "Voice" button in my keyboard
My main app opens via deep link (myapp://keyboard/dictation)
App starts recording
App automatically returns user to WhatsApp
I cannot find a way to detect which app the keyboard is running inside, or which app opened my main app via the deep link.
UIInputViewController.textDocumentProxy - No host app information available
UIApplication.OpenURLOptionsKey.sourceApplication in application(_:open:options:) - When opened from a keyboard extension, does this return the host app bundle ID or the keyboard extension bundle ID?
Private APIs (for research only, not production):
_hostBundleID on UIInputViewController - blocked/returns nil on iOS 18
KVC approaches - all blocked
Hardcoded app support - Works but requires maintaining a list of popular apps and showing multiple buttons instead of a single "Voice" button
My questions:
When a keyboard extension triggers a URL open (via a SwiftUI Link or UIApplication.shared.open), what does sourceApplication contain: the host app or the keyboard extension? (See the logging sketch after these questions.)
Is there any supported way for a main app to know which app it was launched from, specifically when that launch originated from a keyboard extension?
How do apps like "Wispr Flow" achieve seamless return-to-app with a single voice button? They seem to auto-return to whatever app the user was in.
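For what it's worth, here's the minimal logging I'm using to check the first question; the URL scheme and delegate are from my own app, so treat it as a sketch:

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // Log where a myapp://keyboard/dictation open came from.
    func application(_ app: UIApplication,
                     open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        let source = options[.sourceApplication] as? String
        print("Opened via \(url); sourceApplication = \(source ?? "nil")")
        return true
    }
}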
Environment:
iOS 18.0+
Xcode 16
SwiftUI keyboard using KeyboardKit
Any guidance on the recommended approach would be greatly appreciated. I understand there may be privacy reasons for limiting host app detection, but the UX of requiring users to manually swipe back (or tap app-specific buttons) is significantly worse than automatic return.