Prioritize user privacy and data security in your app. Discuss best practices for data handling, user consent, and security measures to protect user information.

Posts under General subtopic


Crashing in sandbox-exec (FB16964888)
Why are we doing this nonsense? We want to be able to run builds in a sandbox so that they can only see the paths they are intended to depend on, which improves reproducibility. For builds with a very large number of dependencies, a very large number of paths gets added to the sandbox, and that breaks things inside libsandbox. Either it hits a sandbox length limit (sandbox-exec: pattern serialization length 66460 exceeds maximum (65535), Nix issue #4119, worked around in Nix PR 12570), or it hits an assert (this report; also Nix issue #2311).

The other options for sandboxing on macOS are not viable. We acknowledge that sandbox-exec and sandbox_init_with_parameters are deprecated. App Sandbox is inapplicable because we aren't an app; our use case is closer to a browser, and all the browsers use libsandbox internally. We could possibly use a System Extension or a particularly diabolical use of Virtualization.framework, but the former API requires notarization, which is close to a no-go for our use case as open source software: it is nearly impossible to develop the software on one's own computer, it would require us to ship a binary blob (and have the build processes to produce one in infrastructure completely dissimilar to what we use today), and it would take a bunch of engineering time. Today we can pretend that code signing and notarization don't exist and that we are writing an old-school Unix daemon, because we are one. The Virtualization.framework route is absolutely diabolical and hard to implement. See this saga about the bug we are facing: Nix issue #4119, Nix issue #2311, etc.

What is going wrong: I can't attach the file fail.sb as it is too large (you can view the failing test case at Lix's gerrit, CL 2870). To reproduce, run:

    $ sandbox-exec -D _GLOBAL_TMP_DIR=/tmp -f fail.sb /bin/sh
    Assertion failed: (diff <= INSTR_JUMP_NE_MAX_LENGTH), function push_jne_instr, file serialize.c, line 240.
    zsh: abort      sandbox-exec -D _GLOBAL_TMP_DIR=/tmp -f fail.sb /bin/sh

Or see the attached stacktrace: stacktrace.txt

Credits: full credit to Jade Lovelace (Lix) for writing the above text and filing the bug. This is submitted under FB16964888.
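For anyone scripting the reproduction, here is a minimal, hypothetical harness in Swift that simply shells out to sandbox-exec the same way the command above does. It assumes a fail.sb profile in the current directory and adds nothing beyond what the command line already shows.

```swift
import Foundation

// Hypothetical reproduction harness: run sandbox-exec against a large profile.
// fail.sb is the generated profile referenced above; it is not included here.
let process = Process()
process.executableURL = URL(fileURLWithPath: "/usr/bin/sandbox-exec")
process.arguments = ["-D", "_GLOBAL_TMP_DIR=/tmp", "-f", "fail.sb", "/bin/sh"]
do {
    try process.run()
    process.waitUntilExit()
    print("sandbox-exec exited with status \(process.terminationStatus)")
} catch {
    print("failed to launch sandbox-exec:", error)
}
```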
Replies: 1 · Boosts: 0 · Views: 171 · Activity: Mar ’25
Unsandboxed app can't modify other app
I work for Brave, a browser with ~80M users. We want to introduce a new system for automatic updates called Omaha 4 (O4). It is the same system that powers automatic updates in Chrome. O4 runs as a separate application on users' systems. For Chrome, this works as follows: an app called GoogleUpdater.app regularly checks for updates in the background. When a new version is found, GoogleUpdater.app installs it into Chrome's installation directory, /Applications/Google Chrome.app.

But consider what this means: a separate application, GoogleUpdater.app, is able to modify Google Chrome.app. This is especially surprising because, for example, the built-in Terminal.app is not able to modify Google Chrome.app. Here's how you can check this for yourself (the same probe in code follows below):

1. (Re-)install Chrome with its DMG installer.
2. Run the following command in Terminal: mkdir /Applications/Google\ Chrome.app/test. This works.
3. Undo the command: rm -rf /Applications/Google\ Chrome.app/test
4. Start Chrome and close it again.
5. mkdir /Applications/Google\ Chrome.app/test now fails with "Operation not permitted".

(These steps assume that Terminal does not have Full Disk Access and that System Integrity Protection is enabled.)

In other words, once Chrome has been started at least once, another application (Terminal in this case) is no longer allowed to modify it. But at the same time, GoogleUpdater.app is able to modify Chrome: it regularly applies updates to the browser, and each update begins with an mkdir call similar to the one shown above.

How is this possible? What is it in macOS that lets GoogleUpdater.app modify Chrome, but not another app such as Terminal? Note that Terminal is not sandboxed. I've checked that it's not related to code signing or notarization issues. In our case, the main application (Brave) and the updater (BraveUpdater) are signed and notarized with the same certificate and have equivalent requirements, entitlements, and provisioning profiles as Chrome and GoogleUpdater.

The error that shows up in the Console for the disallowed mkdir call is:

    kernel (Sandbox) System Policy: mkdir(8917) deny(1) file-write-create /Applications/Google Chrome.app/foo

(It's a similar error when BraveUpdater tries to install a new version into /Applications/Brave Browser.app.) The error goes away when I disable System Integrity Protection, but of course we cannot ask users to do that. Any help would be greatly appreciated.
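A minimal sketch of the same probe in code, using the paths from the steps above; it only mirrors the mkdir/rm experiment and makes no claim about why the updater is exempt.

```swift
import Foundation

// Probe: can this (non-Full-Disk-Access) process create a directory inside
// another app's bundle? Mirrors the Terminal experiment described above.
let probe = URL(fileURLWithPath: "/Applications/Google Chrome.app/test")
do {
    try FileManager.default.createDirectory(at: probe, withIntermediateDirectories: false)
    print("write allowed")
    try FileManager.default.removeItem(at: probe)
} catch {
    // After Chrome has launched once, this fails with "Operation not permitted".
    print("write denied:", error)
}
```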
Replies: 4 · Boosts: 0 · Views: 243 · Activity: May ’25
iOS 18 - Intermittent keychain issue
Hi, we're encountering an intermittent issue where certain users are unexpectedly logged out of our app and are unable to log in again. We believe we've narrowed the issue down to the keychain for the following reasons:

- We use a keychain item to determine whether the member is logged in or not. Failure to retrieve the value leads the app to believe the member is logged out.
- API error logs on the server show 3 missing values in fields that are each populated from items stored in the keychain.

Additional notes:

- The issue is hard to reproduce and seems to affect only a subset of users.
- In some cases, uninstalling and reinstalling the app temporarily resolves the problem, but the issue recurs after a period of time.
- The behavior appears to have coincided with the release of iOS 18.
- We're using the kSecAttrAccessibleWhenUnlocked accessibility attribute. Given that our app doesn't perform background operations, we wouldn't expect this to be an issue. We're also considering changing this to kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly to see if that resolves the issue.
- We're using the keychain-swift library to interact with the keychain.
- We are currently adding extensive logging around our keychain implementation to confirm our findings, but are looking for any additional input.

Questions:

- Has anyone encountered similar keychain behavior on iOS 18?
- Are there known changes or stability issues with the keychain in iOS 18 that might lead to such intermittent "item not found" errors?
- Any recommended workarounds or troubleshooting steps that could help isolate the problem further?

Thanks for any help you can provide.
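For reference, a minimal sketch of what switching the accessibility class looks like with the raw Security APIs. The service and account names are placeholders, and this only illustrates the attribute change being considered; it says nothing about what keychain-swift does internally.

```swift
import Foundation
import Security

// Store a session token so it is readable after the first unlock following a
// reboot and never migrates to another device (the class discussed above).
func storeToken(_ token: Data) -> OSStatus {
    let base: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.app.session",   // placeholder
        kSecAttrAccount as String: "authToken",                  // placeholder
    ]
    SecItemDelete(base as CFDictionary)  // replace any existing item

    var attributes = base
    attributes[kSecValueData as String] = token
    attributes[kSecAttrAccessible as String] = kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly
    return SecItemAdd(attributes as CFDictionary, nil)
}
```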
Replies: 1 · Boosts: 0 · Views: 594 · Activity: Feb ’25
API: SecPKCS12Import; error code: -25264; error message: MAC verification failed during PKCS12 import (wrong password?)
Problem statement: the prerequisite is to generate a PKCS#12 file using OpenSSL 3.x or above. (Note: I have created a sample cert but am unable to upload it to this thread. Let me know if there is a different way I can upload it.)

When trying to import a p12 certificate (generated using OpenSSL 3.x) with SecPKCS12Import on macOS (tried on Ventura, Sonoma, and Sequoia), it fails with error code -25264 and the error message "MAC verification failed during PKCS12 import (wrong password?)". I have tried importing in multiple ways:

- the Security framework API (SecPKCS12Import)
- the CLI (security import <cert_name> -k ~/Library/Keychains/login.keychain -P "<password>")
- drag and drop into the Keychain Access application

All of them fail to import the p12 cert.

RCA: the issue seems to be a difference in MAC algorithm. Modern p12 files generated by OpenSSL 3.x use SHA-256 for the MAC, which does not appear to be supported by Apple's Security framework; the keychain seems to expect the MAC algorithm to be SHA-1.

Workaround: the current workaround is to convert the modern p12 cert to the legacy format (using the OpenSSL legacy provider, which relies on OpenSSL 1.1.x-era, insecure algorithms) that the SecPKCS12Import API understands. I have created sample code using references from another similar thread (https://developer.apple.com/forums/thread/723242) from 2023. The steps to compile and execute the sample are mentioned in the same file. Please find attached the sample code, "pkcs12_modern_to_legacy_converter.cpp", and a sample certificate that reproduces the issue, "modern_certificate.p12", whose password is "export".

Questions:

- Is there a fix for this issue? If yes, please guide me through it; if not, is it expected to be fixed in a future release?
- Is there a different way to import the p12 cert that is not affected by this issue?
- This issue also poses a security concern, since it forces the use of outdated cryptographic algorithms. Kindly share your thoughts.

Attachment: pkcs12_modern_to_legacy_converter.cpp
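For completeness, here is a minimal sketch of the failing call path in Swift; the file name and password ("export") are just the sample values from this report, and the snippet demonstrates the API usage rather than any fix.

```swift
import Foundation
import Security

// Import a .p12 and return the status; -25264 (errSecPkcs12VerifyFailure) is
// the "MAC verification failed" error described above.
func importP12(at url: URL, password: String) -> OSStatus {
    guard let data = try? Data(contentsOf: url) else { return errSecIO }
    let options: [String: Any] = [kSecImportExportPassphrase as String: password]
    var items: CFArray?
    return SecPKCS12Import(data as CFData, options as CFDictionary, &items)
}

let status = importP12(at: URL(fileURLWithPath: "modern_certificate.p12"),
                       password: "export")
print("SecPKCS12Import returned \(status)")
```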
Replies: 11 · Boosts: 0 · Views: 342 · Activity: Apr ’25
Secure Enclave Cryptokit
I am using the CryptoKit SecureEnclave enum to generate Secure Enclave keys, and I've got a couple of questions:

- What is the lifetime of these keys? When I don't store them somewhere, how does the Secure Enclave know they are gone?
- Do backups impact these keys? I.e. can I lose access to a key when I restore a backup?
- Do these keys count toward the total storage capacity of the Secure Enclave? If I recall correctly, the Secure Enclave has a limited storage capacity. Do the SecureEnclave key instances count toward this capacity?
- What is the dataRepresentation and how can I use it? I'd like to store the Secure Enclave key somewhere (preferably not in the keychain, due to its limitations). Is it "okay" to store this elsewhere, for instance in a file or in UserDefaults?
- Can the dataRepresentation be used in other apps? If I had the capability of extracting the dataRepresentation as an attacker, could I then rebuild that key in my malicious app, given that the key can be rebuilt with the Secure Enclave on the same device, or are there measures in place to prevent this (sandbox, bundle ID, etc.)?
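Not an authoritative answer, but a small sketch of the dataRepresentation round trip being asked about, using CryptoKit as documented; where the blob is persisted (file, defaults, keychain) is left open, as in the question.

```swift
import CryptoKit
import Foundation

do {
    guard SecureEnclave.isAvailable else { fatalError("no Secure Enclave on this device") }

    // Create a key whose private portion never leaves the Secure Enclave.
    let key = try SecureEnclave.P256.Signing.PrivateKey()

    // dataRepresentation is an opaque, encrypted handle bound to this device's
    // Secure Enclave; persisting it is what keeps the key usable across launches.
    let blob = key.dataRepresentation

    // Later (or after relaunch): rebuild the key from the stored blob and use it.
    let restored = try SecureEnclave.P256.Signing.PrivateKey(dataRepresentation: blob)
    let signature = try restored.signature(for: Data("hello".utf8))
    print("signature bytes:", signature.rawRepresentation.count)
} catch {
    print("Secure Enclave error:", error)
}
```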
Replies: 3 · Boosts: 0 · Views: 238 · Activity: Jun ’25
implement entitlement "com.apple.security.files.user-selected.read-only" in sandbox profile
First, I do not publish my application to the App Store, but I need to customize a sandbox environment. It seems that sandbox-exec cannot configure entitlements, so I have used some other APIs, such as sandbox_compile_entitlements and sandbox_apply_container. When it comes to the entitlement "com.apple.security.files.user-selected.read-only", I am unsure how to correctly write a sandbox profile to implement it. Can anyone help me?
Replies: 1 · Boosts: 0 · Views: 157 · Activity: May ’25
Unit tests and persistent tokens
I'd like to implement unit tests that exercise keys made available via a persistent token interface. However, when attempting to list available tokens by passing kSecAttrAccessGroupToken as the kSecAttrAccessGroup to SecItemCopyMatching from a unit test, -34018 is returned. It succeeds without the kSecAttrAccessGroup, which makes sense given that the unit test binary does not have the com.apple.token keychain access group. The Xcode UI indicates "Capabilities are not supported" for the unit test binary when attempting to add a Keychain Sharing capability to enable use of persistent tokens. This feels like a dead end, but it raises the question: is there any way to implement unit tests that exercise a persistent token interface? It seems like the only path may be to write unit tests that drive an independent app, which in turn handles the interactions with the persistent token.
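For context, a sketch of the query in question: run from a process entitled for the token access group it lists token-backed keys, while from a plain unit-test bundle it is expected to fail with -34018 (errSecMissingEntitlement), exactly as described.

```swift
import Foundation
import Security

// List key items exposed through persistent tokens (CryptoTokenKit).
let query: [String: Any] = [
    kSecClass as String: kSecClassKey,
    kSecAttrAccessGroup as String: kSecAttrAccessGroupToken,
    kSecReturnAttributes as String: true,
    kSecMatchLimit as String: kSecMatchLimitAll,
]
var result: CFTypeRef?
let status = SecItemCopyMatching(query as CFDictionary, &result)
print("SecItemCopyMatching:", status)   // -34018 without the com.apple.token group
```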
Replies: 1 · Boosts: 0 · Views: 478 · Activity: Feb ’25
Clarification on Team ID Behavior After App Transfer
Hi everyone, I’d like to clarify something regarding the behavior of Team IDs after an app transfer between Apple Developer accounts. I have an app update that enforces a forced update for all users. My plan is to release this update under the current developer account, and then proceed with transferring the app to a different developer account shortly afterward.

My concern is: once the transfer is complete, will users who download the same app version (released before the transfer) be logged out due to a change in Team ID? Specifically, does the transferred app continue to use the original Team ID (used to sign the last submitted build), or does the Team ID change immediately upon transfer, affecting keychain access? Any insights or confirmation on this would be greatly appreciated. Thanks!
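If it helps with testing builds before and after the transfer, one way to observe the effective prefix at runtime is the well-known "bundle seed ID" probe: add or fetch a throwaway keychain item and read back its access group, whose prefix is the app ID prefix the running binary was signed with. The item names below are placeholders.

```swift
import Foundation
import Security

// Returns the access group of a throwaway generic-password item,
// e.g. "ABCDE12345.com.example.app"; the prefix is the app ID prefix.
func currentAccessGroup() -> String? {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "teamIDProbe",   // placeholder
        kSecAttrAccount as String: "probe",         // placeholder
        kSecReturnAttributes as String: true,
    ]
    var result: CFTypeRef?
    var status = SecItemCopyMatching(query as CFDictionary, &result)
    if status == errSecItemNotFound {
        status = SecItemAdd(query as CFDictionary, &result)
    }
    guard status == errSecSuccess, let attrs = result as? [String: Any] else { return nil }
    return attrs[kSecAttrAccessGroup as String] as? String
}
```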
Replies: 4 · Boosts: 0 · Views: 140 · Activity: Jun ’25
Integrating CryptoTokenKit with productsign
Hi all, I'm using a CryptoTokenKit (CTK) extension to perform code signing without having the private key stored on my laptop. The extension currently only supports the rsaSignatureDigestPKCS1v15SHA256 algorithm:

    func tokenSession(_ session: TKTokenSession, supports operation: TKTokenOperation,
                      keyObjectID: TKToken.ObjectID, algorithm: TKTokenKeyAlgorithm) -> Bool {
        return algorithm.isAlgorithm(SecKeyAlgorithm.rsaSignatureDigestPKCS1v15SHA256)
    }

This setup works perfectly with codesign, and signing completes without any issues. However, when I try to use productsign, the system correctly detects and delegates signing to my CTK extension, but it seems to always request rsaSignatureDigestPKCS1v15SHA1 instead:

    productsign --timestamp --sign <identity> unsigned.pkg signed.pkg
    productsign: using timestamp authority for signature
    productsign: signing product with identity "Developer ID Installer: <org> (<team>)" from keychain (null)
    ...
    Error Domain=NSOSStatusErrorDomain Code=-50 "algid:sign:RSA:digest-PKCS1v15:SHA1: algorithm not supported by the key"
    ...
    productsign: error: Failed to sign the product.

From what I understand, older versions of macOS used SHA-1 for code signing, but codesign has since moved to SHA-256 (at least when legacy compatibility isn't a concern). Oddly, productsign still seems to default to SHA-1, even in 2025. Is there a known way to force productsign to use SHA-256 instead of SHA-1 for the signature digest algorithm? Or is there some flag or configuration I'm missing? Thanks in advance!
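One thing worth trying, assuming the token's key is allowed to produce PKCS#1 v1.5 SHA-1 signatures at all, is to advertise both digest algorithms from the extension so productsign's SHA-1 request isn't rejected outright. This is a sketch of that accommodation, not a statement that productsign can be configured to use SHA-256.

```swift
import CryptoTokenKit
import Security

// Advertise both SHA-256 (used by codesign) and SHA-1 (requested by productsign).
func tokenSession(_ session: TKTokenSession, supports operation: TKTokenOperation,
                  keyObjectID: TKToken.ObjectID, algorithm: TKTokenKeyAlgorithm) -> Bool {
    return algorithm.isAlgorithm(.rsaSignatureDigestPKCS1v15SHA256)
        || algorithm.isAlgorithm(.rsaSignatureDigestPKCS1v15SHA1)
}
```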
Replies: 7 · Boosts: 0 · Views: 558 · Activity: Jun ’25
Whether non-Apple Store mac apps can use passkey?
Our desktop app for macOS will be released through two channels:

- the App Store
- a DMG package on our official website for users to download and install

When we debug passkey support, we find that the App Store package (bundle) can bring up the passkey prompt normally, but the non-App Store package cannot bring up the passkey interface. I need your help. Thank you.
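For reference, a minimal sketch of the call that normally brings up the passkey sheet, assuming the relying party has a matching webcredentials associated domain; the identifier and challenge are placeholders, and delegate/presentation wiring is omitted. If the same code shows the sheet only in the App Store build, the difference likely lies in the signing and associated-domains configuration rather than in the code itself.

```swift
import AuthenticationServices
import Foundation

// Request a passkey assertion for a placeholder relying party.
let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
    relyingPartyIdentifier: "example.com")                    // placeholder
let request = provider.createCredentialAssertionRequest(
    challenge: Data("server-challenge".utf8))                 // placeholder
let controller = ASAuthorizationController(authorizationRequests: [request])
// controller.delegate / presentationContextProvider omitted in this sketch.
controller.performRequests()
```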
Replies: 2 · Boosts: 0 · Views: 735 · Activity: Apr ’25
Using provision profile to access assessments triggers a keychain popup
Hello! I do know Apple does not support Electron, but I do not think this is an Electron-related issue; rather, it's something I am doing wrong. I'd be curious to find out why the keychain login prompt is happening after my app has been signed with the bundle ID, entitlements, and provisioning profile. Before using the provisioning profile I did not have this issue, but it is needed for the assessments feature.

I'm trying to ship an Electron / macOS desktop app that must run inside Automatic Assessment Configuration. The build signs and notarizes successfully, and assessment mode itself starts on Apple-arm64 machines, but every single launch shows the system dialog that asks to allow access to the "login" keychain. The dialog appears on totally fresh user accounts, so it's not tied to anything I store there. It has happened ever since I added the provisioning profile to electron-builder in order to finally test assessments.

entitlements.inherit.plist keys:

    <key>com.apple.security.cs.allow-jit</key>
    <true/>
    <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
    <true/>

entitlements.plist keys:

    <key>com.apple.security.cs.allow-jit</key>
    <true/>
    <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
    <true/>
    <key>com.apple.developer.automatic-assessment-configuration</key>
    <true/>

I'm honestly not sure whether the keychain prompt is expected, but I have tried a lot of entitlement combinations to get rid of it. electron-builder is doing the signing, and we manually use the notary tool to notarize, but that's probably irrelevant.

    mac: {
      notarize: false,
      target: 'dir',
      entitlements: 'buildResources/entitlements.mac.plist',
      provisioningProfile: 'buildResources/xyu.provisionprofile',
      entitlementsInherit: 'buildResources/entitlements.mac.inherit.plist',
    }

Any lead is welcome!
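As a way to isolate the prompt, here is a minimal sketch (assumed names only) of starting an assessment session directly with the AutomaticAssessmentConfiguration framework from a tiny native target signed with the same profile and entitlements. If this bare-bones version also triggers the login-keychain dialog, the prompt comes from the profile/signing setup rather than from the Electron layer.

```swift
import AutomaticAssessmentConfiguration
import Foundation

// Start assessment mode and report whether it began.
final class AssessmentDriver: NSObject, AEAssessmentSessionDelegate {
    private let session = AEAssessmentSession(configuration: AEAssessmentConfiguration())

    func start() {
        session.delegate = self   // keep this object alive while the session runs
        session.begin()
    }

    func assessmentSessionDidBegin(_ session: AEAssessmentSession) {
        print("assessment mode active")
    }

    func assessmentSession(_ session: AEAssessmentSession, failedToBeginWithError error: Error) {
        print("failed to begin assessment:", error)
    }
}
```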
Replies: 2 · Boosts: 0 · Views: 106 · Activity: Jun ’25
Multiple views in SFAuthorizationPluginView
Hi there, I'm trying to use SFAuthorizationPluginView to show some fields in the login screen, have the user click the arrow, then continue to show more fields as a second step of authentication. How can I accomplish this?

1. Register multiple SecurityAgentPlugins, each with their own mechanism and nib?
2. Somehow get macOS to call my SFAuthorizationPluginView::view() again and return a new view?
3. Manually remove text boxes and put in new ones when the button is pressed?

I don't believe option 1 works; when I tried it, the second mechanism ended up calling the first mechanism's view's view(). Cheers, -Ken
Replies: 2 · Boosts: 0 · Views: 120 · Activity: May ’25
Proper Approach to Programmatically Determine SIP State
Hello, I have encountered several challenges related to System Integrity Protection (SIP) state detection and code signing requirements. I would like to seek clarification and guidance on the proper approach to programmatically determine the SIP state. Here are the issues I’ve encountered:

- XPC code signing check APIs: APIs like setCodeSigningRequirement and setConnectionCodeSigningRequirement do not work when SIP is disabled, and that's OK given what SIP is.
- LaunchCodeRequirement API: when using Process.launchRequirement, the LaunchCodeRequirement API no longer functions when SIP is disabled. The IsSIPProtected requirement behaves in a way that is not clearly documented; it appears to only apply to pre-installed Apple apps.
- Legacy APIs: older APIs like SecCodeCheckValidity are likely to be non-functional as well, though I haven’t had the chance to validate this yet.
- Private API concerns: to mitigate those limitations, I would prefer my app not even try to connect to untrusted XPC services or launch untrusted processes when SIP is disabled. The only way to determine the SIP state I could find is the low-level C function csr_get_active_config. However, this function is not declared in any publicly available header file, indicating that it is a private API. Since private APIs cannot be used in App Store-distributed apps and are best avoided for Developer ID-signed apps, this does not seem like a viable solution.

Given these limitations, what is the recommended and proper approach to programmatically determine the SIP state in a macOS application? Any insights or guidance would be greatly appreciated. Thank you!
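Not an endorsed API, but one pragmatic workaround, offered as a sketch under the assumption that shelling out is acceptable for your distribution model, is to run csrutil status and parse its output rather than calling the private csr_get_active_config:

```swift
import Foundation

// Best-effort SIP check by parsing `csrutil status` output.
// Returns nil if the tool can't be run or the output isn't recognized.
func sipAppearsEnabled() -> Bool? {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/csrutil")
    process.arguments = ["status"]
    let pipe = Pipe()
    process.standardOutput = pipe
    do {
        try process.run()
        process.waitUntilExit()
    } catch {
        return nil
    }
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    guard let output = String(data: data, encoding: .utf8)?.lowercased() else { return nil }
    if output.contains("disabled") { return false }
    if output.contains("enabled") { return true }
    return nil
}
```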
Replies: 2 · Boosts: 0 · Views: 151 · Activity: May ’25
Is it possible to launch a GUI application that is not killable by the logged in user
I'm trying to develop a GUI app on macOS that takes control of the screen so that the user must perform certain actions before regaining control of the desktop. I don't want the user to be able to kill the process (for example via an "assassin" shell script that looks for the process and terminates it with kill). Based on this post, it is not possible to create an unkillable process on macOS. I'm wondering, however, if it's possible to run the GUI process as root (or with other escalated privileges) such that the logged-in user cannot kill it. So it's killable, but killing it requires privileges above what the logged-in user has (assuming they are not root); I'm not worried about a root user being able to kill it. Such an app would run in a managed context. I've played around with Service Background Tasks, but so far haven't found what I'm looking for. I'm hoping someone (especially from Apple) might be able to tell me if this goal is even achievable with macOS Sequoia (and beyond).
Replies: 8 · Boosts: 0 · Views: 157 · Activity: May ’25
Gatekeeper refuses to start application from downloaded DMG
Hello, I have an application which uses a helper [1] to download [2] files. When the downloaded file is a DMG and the user mounts the image to run the application from this DMG, it doesn't pass Gatekeeper: it presents "Application XYZ.app can't be opened." The same file downloaded via Safari shows a different dialog, "XYZ.app is an app downloaded from the internet. Are you sure you want to open it?"

In the system log I see this line:

    exec of /Volumes/SampleApp/SampleApp.app/Contents/MacOS/SampleApp denied since it was quarantined by Download\x20Helper and created without user consent, qtn-flags was 0x00000187

The application is running sandboxed and hardened, and the main application has the com.apple.security.files.downloads.read-write entitlement. Everything is signed with Developer ID and passes all checks [3]. I tried checking the responsible process [4] of the helper, then trivial stuff like Downloads folder access in System Settings > Privacy & Security > Files & Folders. Everything seems to be fine.

For what it's worth, the value of the quarantine attribute is the following:

    com.apple.quarantine: 0087;6723b80e;My App;

The Safari-downloaded one has:

    com.apple.quarantine: 0083;6723b9fa;Safari;02162070-2561-42BE-B30B-19A0E94FE7CA

I also tried a few more ways and got 0081 with Edge and 0082 with a sample app with a similar setup. Not sure if that has any meaning.

What could I be doing wrong that makes Gatekeeper refuse outright to run the application from the DMG instead of showing the dialog like in the other cases?

[1] The executable is in an application bundle located at Contents/Helpers/DownloadHelper.app inside the main application bundle.
[2] Nothing fancy, curl + regular POSIX file operations.
[3] codesign, syspolicy_check, spctl.
[4] launchctl procinfo pid
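One thing that might be worth checking, offered as a hedged sketch rather than a known fix: when the helper stamps the downloaded file itself with the standard web-download quarantine properties (via URLResourceValues.quarantineProperties), the resulting quarantine record may be closer to what Safari writes, so Gatekeeper shows the ordinary "downloaded from the internet" dialog instead of refusing outright. Whether this clears the "created without user consent" flag in your case is something to verify. The agent name below is a placeholder.

```swift
import Foundation
import CoreServices

// Stamp a downloaded file with standard web-download quarantine metadata.
func applyQuarantine(to fileURL: URL, downloadedFrom source: URL) throws {
    var url = fileURL
    var values = URLResourceValues()
    values.quarantineProperties = [
        kLSQuarantineTypeKey as String: kLSQuarantineTypeWebDownload,
        kLSQuarantineAgentNameKey as String: "Download Helper",   // placeholder agent name
        kLSQuarantineDataURLKey as String: source,
    ]
    try url.setResourceValues(values)
}
```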
Replies: 13 · Boosts: 0 · Views: 1.4k · Activity: Feb ’25
Launch Constraint, SIP and legacy launchd plist
I have 2 basic questions related to Launch Constraints:

[Q1] Are Launch Constraints supposed to work when SIP is disabled? From what I'm observing, when SIP is disabled, Launch Constraints (e.g. a parent-process launch constraint) are not enforced. I can understand that, but it's a bit confusing considering that the stack diagram in the WWDC 2023 session places the 'Environment Constraints' block under SIP, not above it. Also, the documentation only mentions SIP for the 'is-sip-protected' fact.

[Q2] Is the SpawnConstraint key in legacy launchd plist files (i.e. inside /Library/Launch(Agents|Daemons)) officially supported? From what I'm seeing, it seems to work when SIP is enabled, but the WWDC session and the documentation don't really cover this case.
Replies: 11 · Boosts: 0 · Views: 182 · Activity: Jun ’25
Apple Events won't trigger Privacy & Security alerts due to Sandboxing
I created an app in Xcode using AppleScriptObjC that is supposed to communicate with Finder and Adobe Illustrator. It has been working for the last 8 years, until now: I have updated it for Sonoma and it no longer triggers the alerts for the user to approve the communication. It sends the Apple events, but instead of the alert dialog I get this error in Console:

    "Sandboxed application with pid 15728 attempted to lookup App: "Finder"/"finder"/"com.apple.finder" 654/0x0:0x1d01d MACSstill-hintable sess=100017 but was denied due to sandboxing."

The Illustrator error is predictably similar. I added this to the app.entitlements file:

    <key>com.apple.security.automation.apple-events</key>
    <array>
        <string>com.apple.finder</string>
        <string>com.adobe.illustrator</string>
    </array>

I added this to Info.plist:

    <key>NSAppleEventsUsageDescription</key>
    <string>This app requires access to Finder and Adobe Illustrator for automation.</string>

I built the app, signed with the correct Developer ID Application certificate. I've also packaged it into a signed DMG and installed it, with the same result as running it from Xcode. I tried stripping it down to just the lines of code that communicate with Finder and Illustrator, and built it with a different bundle identifier, with the same result. What am I missing?
Replies: 3 · Boosts: 0 · Views: 606 · Activity: Jan ’25
Transfer an application between accounts with an existing App Group
Due to business requirements, we need to transfer our app Gem Space for iOS from our current Apple Developer account to a new account. We have a major concern regarding our users and the data associated with the app. The user data is currently stored using an App Group with an identifier such as "group.com.app.sharedData".

According to some information we've found, it might be possible to complete the transfer by removing the App Group from the old account and creating a new one with the same identifier in the new account. However, other sources suggest that App Group containers are owned by the specific team, and data stored in the container may become inaccessible after the app is transferred to a different team. This raises concerns about the possibility of users losing access to their data after updating the app from the new account.

Could you please clarify the expected behavior of App Groups in this case? Do we need to perform any kind of data migration, and if so, could you please provide detailed guidance on how to do it safely and without impacting user data access?
Replies: 2 · Boosts: 0 · Views: 83 · Activity: Apr ’25