Hardware


Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.

Posts under Hardware subtopic


Temporarily disable macOS capture of USB RFID reader(s)
Hello. I am attempting to wrap the C library libnfc as a Swift library. This is not for use on macOS; it's mainly for use on Linux (Raspberry Pi). I have a USB reader and my code appears to work so far, but the code/test/debug cycle is suboptimal when I run the code on the Pi. Since I use a Mac for day-to-day coding, I'd prefer to use Xcode and my Mac for development.

macOS appears to capture the NFC hardware for its own frameworks, and attempting to open a connection to the USB device gives an Unable to claim USB interface (Permission denied) error. ioreg shows that the hardware is claimed by an Apple framework:

"UsbExclusiveOwner" = "pid 10946, com.apple.ifdbun"

Is there a way to temporarily override that and use the hardware myself? I've tried Googling, but most of the replies are out of date, and Claude's advice, launchctl unload /System/Library/LaunchDaemons/com.apple.ifdreader.plist, doesn't appear to work. I'm wary of disabling SIP; is there a simple way to get access to the hardware myself? Thanks.
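For context, the failing call is the libnfc open. A minimal sketch of the Linux-side flow, assuming a hypothetical `Clibnfc` system-library target that wraps libnfc:

```swift
import Clibnfc  // hypothetical module wrapping libnfc via a modulemap

var context: OpaquePointer?
nfc_init(&context)                       // allocate a libnfc context
guard let ctx = context else { fatalError("nfc_init failed") }

// nfc_open(ctx, nil) opens the first available reader. On macOS this is
// the call that fails with "Unable to claim USB interface" while another
// process (here, the system smart-card stack) holds the device exclusively.
guard let device = nfc_open(ctx, nil) else {
    nfc_exit(ctx)
    fatalError("no reader available (claimed by another process?)")
}
nfc_initiator_init(device)               // configure the reader as initiator
// ... poll for tags here ...
nfc_close(device)
nfc_exit(ctx)
```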
2 replies · 0 boosts · 176 views · Jan ’26
Does Apple Home support Matter commissioning using concatenated / multi-device QR codes?
Hi, I’m developing a Matter commissioning flow and would like to clarify Apple Home’s support for concatenated (multi-device) QR codes. In my implementation, I generate a single QR code that contains multiple Matter onboarding payloads (concatenated payloads), intended to commission multiple devices in one scan, similar to a multi-pack / multi-accessory flow.

What I’ve tested:
- Standard single-device Matter QR codes work as expected in the Apple Home app.
- A concatenated QR code (multiple Matter payloads combined into one QR) is not recognized / commissioned by Apple Home.

My questions:
1. Does Apple Home officially support commissioning via concatenated or multi-device Matter QR codes?
2. If yes, is there a specific payload format or delimiter that Apple Home expects?
3. If not, is this a known limitation or something planned for future iOS/Home releases?
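For what it's worth, a hedged sketch of how a commissioner might split such a code client-side. The single "MT:" prefix with "*"-joined payloads reflects one reading of the Matter spec's concatenation rules and should be verified against the current specification; MTRSetupPayload parses one onboarding payload at a time:

```swift
import Matter

// Placeholder string, not a real payload: one "MT:" prefix, payloads
// joined by "*" (assumed concatenation format; verify against the spec).
let scanned = "MT:AAAA...*BBBB..."

let parts = scanned.dropFirst("MT:".count).split(separator: "*")
for part in parts {
    // Re-attach the prefix and parse each payload individually.
    if let payload = try? MTRSetupPayload(onboardingPayload: "MT:" + part) {
        print("parsed payload:", payload)
    }
}
```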
1 reply · 0 boosts · 90 views · 4w
Accelerometer sampling rate limits for third-party iOS apps
Hello everyone, I am developing an iOS application that relies on accelerometer data for precise motion and reaction-time measurements. Based on practical testing, it appears that third-party iOS applications receive accelerometer data at a maximum rate of approximately 100 Hz, regardless of hardware capabilities or requested update intervals.

I would like to ask for clarification on the following points:
1. Is there an officially supported way for third-party iOS apps to access accelerometer data at sampling rates higher than ~100 Hz?
2. If the hardware supports higher sampling rates, is this limitation intentionally enforced at the iOS level for third-party applications?
3. Are there any public APIs, entitlements, or documented approaches that allow access to higher-frequency sensor data, or is this restricted to system/internal components only?

Thank you in advance for any clarification.
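For reference, a minimal CoreMotion sketch that requests a higher rate and measures what is actually delivered (the 200 Hz request is illustrative):

```swift
import CoreMotion

let manager = CMMotionManager()
// Ask for 200 Hz; CoreMotion treats the interval as a request only
// and clamps it to whatever the system actually delivers.
manager.accelerometerUpdateInterval = 1.0 / 200.0

var lastTimestamp: TimeInterval?
manager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let data = data else { return }
    if let last = lastTimestamp {
        // Effective rate derived from consecutive sample timestamps.
        print(String(format: "effective rate: %.1f Hz",
                     1.0 / (data.timestamp - last)))
    }
    lastTimestamp = data.timestamp
}
```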
1 reply · 0 boosts · 64 views · 3w
How to add a Matter device to my own fabric using the MatterSupport SDK
I want to add a Matter device to my own fabric, not to HomeKit via the Home app. I implemented a demo that adds a MatterSupport extension, and that succeeds, but when I use MTRDeviceController to commission the device it goes wrong. Below is the log:

Couldn't read values in CFPrefsPlistSource<0x1062ec100> (Domain: group.wxx.MatterTest, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
<<5 [E:46634i S:0 M:188511265] (U) Msg Retransmission to 0:0000000000000000 failure (max retries:4)
PASESession timed out while waiting for a response from the peer. Expected message type was 33
controller(_:commissioningSessionEstablishmentDone:) error = nil
Error on commissioning step 'AttestationVerification': 'src/controller/CHIPDeviceController.cpp:1288: CHIP Error 0x000000AC: Internal error'
Failed verifying attestation information. Now checking DAC chain revoked status.
Failed in verifying 'Attestation Information' command received from the device: err 101. Look at the AttestationVerificationResult enum to understand the errors
Error on commissioning step 'AttestationRevocationCheck': 'src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error'
Failed to send Solitary ack for MessageCounter:265529558 on exchange 46643i: src/messaging/ExchangeContext.cpp:99: CHIP Error 0x00000002: Connection aborted
Creating NSError from src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error (context: (null))
controller(_:commissioningComplete:nodeID:metrics:) error = Optional(Error Domain=MTRErrorDomain Code=1 "General error: 172" UserInfo={NSLocalizedDescription=General error: 172, errorCode=172})

Is there any suggestion for this issue?
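If the blocking step is AttestationVerification (for example, a development device whose DAC chain is not a production Matter certificate), a hedged sketch of one possible development-only workaround follows; it uses Matter.framework's attestation delegate to continue commissioning despite the failed check. Verify these names against the current SDK:

```swift
import Matter

// Development-only escape hatch: accept a device whose DAC chain fails
// production attestation verification.
final class DevAttestationDelegate: NSObject, MTRDeviceAttestationDelegate {
    func deviceAttestationFailed(for controller: MTRDeviceController,
                                 opaqueDeviceHandle: UnsafeMutableRawPointer,
                                 error: Error) {
        // Continue commissioning despite the failed attestation check.
        try? controller.continueCommissioningDevice(opaqueDeviceHandle,
                                                    ignoreAttestationFailure: true)
    }
}

let params = MTRCommissioningParameters()
params.deviceAttestationDelegate = DevAttestationDelegate()
// ...then pass `params` to commissionNode(withID:commissioningParams:)
```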
0 replies · 0 boosts · 57 views · 3w
MFi enrollment process
I followed the instructions on the page https://mfi.apple.com/en/help/login-help/How-to-Register-Your-Existing-Apple-ID.html to apply for the MFi Program. According to step 7 of the guide, "You have now created and registered your Apple Account. You will be automatically directed to the MFi Portal to begin the enrollment process," so I should have been taken to the enrollment process after logging in.

However, instead of the enrollment page, a pop-up message appears stating: "The Apple Account you signed in with does not have permission to view this page. If you believe your company is currently enrolled in the MFi Program, please contact your company’s Account Administrator to request access to the MFi Portal. If your company is not currently enrolled in the MFi Program, please click here to learn about the program and start the enrollment process."

This has created an endless loop: I cannot proceed to the enrollment process as instructed, and the pop-up only redirects me to information that leads back to the same login and permission issue. Could you please provide guidance on how to resolve this and successfully access the MFi Program enrollment process?
1 reply · 0 boosts · 91 views · 2w
Requesting guidance on long-running background BLE control triggered by server-side events
Hello Apple Forums, We are developing an iOS application that connects to a custom BLE accessory and sends control commands to it.

Our system architecture is as follows:
- A separate hardware device collects data and sends it to our backend server via Wi-Fi.
- The backend evaluates state changes and determines when the BLE accessory should update its display.
- The iOS app acts purely as a BLE command executor for this accessory.

Our goal is to:
- Maintain a BLE connection with the accessory while the app is in the background.
- Receive state-change events from our backend server.
- Upon receiving such events, send a BLE command to the accessory to update its state.

We understand that iOS does not allow arbitrary background execution. We would like to confirm whether there is any supported mechanism, entitlement, or program that allows:
- Long-running background execution for BLE control, or
- Server-originated events (other than APNs) to trigger background BLE actions.

If this is not supported, we would appreciate confirmation that APNs (silent push) is the only supported way to trigger such background BLE actions, or guidance on any recommended alternative architectures. Thank you for your guidance.
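For reference, a minimal sketch of the commonly supported baseline: the bluetooth-central background mode plus Core Bluetooth state restoration keeps an established connection usable while backgrounded, though it does not grant arbitrary background execution. The restore identifier below is illustrative:

```swift
import CoreBluetooth

final class BLEController: NSObject, CBCentralManagerDelegate {
    private var restoredPeripherals: [CBPeripheral] = []

    // Opting into state restoration lets iOS relaunch the app to hand
    // back peripherals it kept connected on the app's behalf.
    lazy var central: CBCentralManager = CBCentralManager(
        delegate: self,
        queue: nil,
        options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.ble"])

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Connect or scan once central.state == .poweredOn.
    }

    func centralManager(_ central: CBCentralManager,
                        willRestoreState dict: [String: Any]) {
        // Peripherals the system preserved across app termination.
        restoredPeripherals = dict[CBCentralManagerRestoredStatePeripheralsKey]
            as? [CBPeripheral] ?? []
    }
}
```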
0 replies · 0 boosts · 70 views · 2w
Mac Studio: Continuity Camera unavailable after reboot unless USB camera is connected
Summary
On Mac Studio systems (no built-in camera), macOS does not initialize camera services after a normal reboot if no physical camera is present. As a result, Continuity Camera does not appear anywhere in the system.

Observed behavior
- System Information → Camera reports “No video capture devices were found.”
- Continuity Camera (iPhone) is completely absent from camera lists.
- Plugging in any USB UVC webcam immediately initializes camera services and causes both the USB camera and the iPhone (Continuity Camera) to appear.
- The USB camera can then be unplugged, and Continuity Camera continues working until the next reboot.

Reproduction steps
1. Use a Mac Studio (no built-in camera) on recent macOS.
2. Ensure no USB webcam or external camera is connected.
3. Reboot the Mac normally.
4. After login, open System Information → Camera.

Expected
Camera services should initialize even when no physical camera is present, allowing Continuity Camera to be available as the primary camera.

Actual
No camera devices are present unless a physical USB camera is connected at least once after boot.

This reproduces 100% of the time on Mac Studio and appears to be a camera-service bootstrap issue where Continuity Camera cannot be the first camera device. The issue has been filed via Feedback Assistant.
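For anyone reproducing this, a small diagnostic sketch that enumerates video devices the way an app would (assumes the macOS 14 SDK for the .external and .continuityCamera device types):

```swift
import AVFoundation

// Enumerate UVC webcams and Continuity Camera devices.
let session = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external, .continuityCamera],
    mediaType: .video,
    position: .unspecified)

if session.devices.isEmpty {
    print("no video capture devices found")   // the post-reboot symptom
} else {
    for device in session.devices {
        print(device.localizedName, "-", device.deviceType.rawValue)
    }
}
```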
1 reply · 0 boosts · 92 views · 2w
hide/show scene in Home View API
Which HomeKit API serves the Home app's scene (HMActionSet) functionality “Remove from Home View” and “Add to Home View”? There must be a public API for it, since at least one third-party application shows/hides scenes according to how they are set up in Home; nevertheless, whatever I try, I can't find the API. Thanks!
2 replies · 0 boosts · 78 views · 2w
Matter Operating Device issue
My team has developed an app with a Matter commissioner feature, using the Matter framework and the MatterSupport extension. Our app supports iOS and Android. However, we ran into a problem: the control certificate generated by the iOS app cannot control the device from the Android side, and the control certificate generated by the Android app cannot control the device from the iOS side. The Matter library used on Android is compiled from connectedhomeip. Does anyone have the same problem? How can we solve this? Thank you
3 replies · 0 boosts · 182 views · 2w
MacBook Pro M5 can’t recognize two external monitors with the same EDID binary serial (only one works at a time)
My MacBook Pro M5 running macOS Tahoe 26.3 beta fails to detect two identical ASUS ROG Swift OLED PG32UCDM monitors simultaneously. Only one display is recognized at a time.

One potential root cause: both monitors report identical binary EDID serial numbers (0x01010101), and the MacBook Pro M5 appears to use this value exclusively for display identity rather than combining it with other, more detailed information (e.g., port or alphanumeric serial number). I've verified that the monitors' EDID binary serial numbers are in fact identical; their alphanumeric serial numbers, however, are not.

NOTE: This behavior is specific to the MacBook Pro M5. When connecting both monitors via USB-C to a Mac mini M4 Pro running the same macOS Tahoe 26.3 beta, the monitors work fine: the OS detects both and assigns them different names (PG32UCDM (1) and PG32UCDM (2)).

NOTE: I could be wrong about this root cause, and I don't have a way to disprove it, though the fact that the monitors work fine on a Mac mini is suspicious.

What I have tried:
- Connecting the two monitors using different monitor ports (one on DisplayPort, another on HDMI, etc.) and different MacBook ports (one on HDMI, another on USB-C, etc.).
- Bumping the monitors down to "1920x1080 (low resolution)" at 30 Hz to rule out bandwidth issues.
- Connecting one, or both, monitors to a CalDigit TS5 Plus dock.
- Using BetterDisplay to import a manually edited EDID for the screen, with a different binary EDID value, manufacturer name, etc.

None of these alternate configurations yields the device recognizing both screens.

I've also verified that if I plug in my Apple Studio Display as one of the monitors, the MacBook recognizes both one of the PG32UCDM monitors and the Studio Display at the same time. The issue seems to occur only when both monitors plugged into it are the same PG32UCDM model. When I have both monitors plugged into my MacBook, each time I disconnect the cable of whichever monitor is currently recognized, it immediately recognizes the other monitor. Plugging the cable for the disconnected monitor back in has no effect.

I'm at a loss. Has anyone run into this issue and found a successful workaround that is not one of the approaches I've described above?
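As a diagnostic, the per-display identity values macOS exposes can be dumped with CoreGraphics; if both panels report the same vendor/model/serial tuple, an identity collision on the EDID serial is at least consistent with the symptom (a sketch, not a fix):

```swift
import CoreGraphics

// Print the identity tuple macOS derives for each online display.
var ids = [CGDirectDisplayID](repeating: 0, count: 16)
var count: UInt32 = 0
_ = CGGetOnlineDisplayList(UInt32(ids.count), &ids, &count)

for id in ids.prefix(Int(count)) {
    print(String(format: "display %u  vendor 0x%04X  model 0x%04X  serial 0x%08X",
                 id,
                 CGDisplayVendorNumber(id),
                 CGDisplayModelNumber(id),
                 CGDisplaySerialNumber(id)))
}
```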
14 replies · 0 boosts · 419 views · 1w
readValueWithCompletionHandler: limitations?
Are there some undocumented (or documented but overlooked by me) prerequisites to the readValueWithCompletionHandler: method? I ask because occasionally I get the Read/Write operation failed error in the callback, even in cases where direct, non-deferred reading of the value works properly. It happens very consistently with some accessories and characteristics, not randomly; thus, it is not likely a temporary quirk in the communication with the device. Probably I am overlooking something of importance, but it does not make sense to me.

My code (is it right, or can you see anything wrong in it?):

// in an HMCharacteristic category
if ([self.properties containsObject:HMCharacteristicPropertyReadable]) {
    id val = self.value,
       ident = [NSString stringWithFormat:@" [%@] %@ (%@)",
                self.uniqueIdentifier, self.localizedDescription,
                self.service.accessory.name];
    NSLog(@"nondeferred '%@'%@", val, ident);
    if (self.service.accessory.reachable) {
        [self readValueWithCompletionHandler:^(NSError * _Nullable error) {
            if (error) NSLog(@"deferred ERROR %@ -> %@", ident, error);
            else NSLog(@"deferred '%@'%@", self.value, ident);
        }];
    }
}

For most accessories/characteristics this works properly, but for some of them I consistently get results like:

nondeferred '70.5' [64998F70-9C11-502F-B8B4-E99DC5C3171B] Current Relative Humidity (Vlhkoměr TH)
deferred '70.5' ERROR [64998F70-9C11-502F-B8B4-E99DC5C3171B] Current Relative Humidity (Vlhkoměr TH) -> Error Domain=HMErrorDomain Code=74 "Read/Write operation failed." UserInfo={NSLocalizedDescription=Read/Write operation failed.}

Am I doing something wrong in my code, or is this normal with some devices? If the latter, is there perhaps a way to know beforehand that I should not use readValueWithCompletionHandler: (since it is bound to fail anyway) and should simply use self.value non-deferred instead? For some time it seemed to me that it happens with bridged accessories, but further testing disproved that hypothesis. Thanks!
3 replies · 0 boosts · 153 views · 1w
UAC 2.0 Channel Count and Channel Names
I am developing a standard UAC 2.0 device and encountered an issue where the channel names do not update according to the iChannelNames field in the Class-Specific AS Interface Descriptor when switching between different channel counts. For example:

- AS1 (6 channels) is configured with the following channel names: ADAT 1, ADAT 2, ADAT 3, ADAT 4, HP L, HP R
- AS2 (4 channels) is configured with: ADAT 1, ADAT 2, HP L, HP R

However, when switching from AS1 (6 channels) to AS2 (4 channels), the channel names displayed in Audio MIDI Setup do not reflect the change as expected. The actual result is: ADAT 1, ADAT 2, ADAT 3, ADAT 4. The system simply hides the last two channels; the names of the remaining channels are not updated.

Initial Topology
My original topology was as follows:
[topology diagram]

Later, I discovered that macOS uses the iChannelNames field from the Input Terminal to display channel names. Therefore, I modified the USB device descriptors and updated the topology to the following:
[updated topology diagram]

To distinguish the channel names for different channel counts, each Input Terminal is assigned a unique iChannelNames value. This method worked perfectly on macOS 15. However, after updating to macOS 26, this topology no longer displays the correct channel names.

Question
On macOS 26, what is the correct method to ensure that the channel names update dynamically when switching between different audio channel configurations?
0 replies · 0 boosts · 39 views · 1w
How can I obtain the documentation for the specific implementation of WAC?
Hi everyone, We are currently exploring ways to implement a frictionless Wi-Fi setup for our hardware devices without requiring a dedicated third-party application. We are interested in leveraging Apple's WAC (Wireless Accessory Configuration) to sync Wi-Fi credentials directly from iOS devices. However, we have struggled to find comprehensive technical documentation or specifications regarding the WAC service. Could anyone point us to the official source for these materials?

Additionally, we have a couple of technical questions:
1. We are testing WAC provisioning and found that the Home app can discover our device and successfully get it online. However, it always ends with a "Failed to add accessory" message. Does WAC support imply that a device should be addable via the Home app? If not, why is the Home app able to discover and start the setup for a non-HomeKit WAC device?
2. Our device is already Apple AirPlay certified. Does implementing WAC require additional standalone certification, or is it covered under the existing MFi/AirPlay certification umbrella?

Any insights or guidance would be greatly appreciated. Thank you!
1 reply · 0 boosts · 26 views · 1d
macOS: Is ARKit-equivalent face tracking possible with an external camera?
Hello, I am an individual developer working on a macOS application using SwiftUI and RealityKit. I would like to understand the feasibility of face-related tracking on macOS when using an external USB camera, compared to iOS/iPadOS. Specifically:

• Does macOS provide an ARKit Face Tracking–equivalent API (e.g., real-time facial expressions, gaze direction, depth)?
• If not, is it common to rely on Vision / AVFoundation as alternatives for facial expression coefficients, gaze estimation, and depth approximation?
• In an environment without dedicated sensors such as TrueDepth, is it correct to assume that accurate depth data and high-fidelity blend shape extraction are realistically difficult?

Any clarification on official limitations, recommended alternatives, or relevant documentation would be greatly appreciated. Thank you.
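For the Vision alternative specifically, a minimal sketch of 2D face-landmark detection on macOS; "frame.png" is a placeholder for a frame captured from the external camera, and Vision yields landmark constellations and face rectangles, not depth or ARKit-style blend shapes:

```swift
import Vision

// Run face-landmark detection on a single still frame.
let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "frame.png"))
let request = VNDetectFaceLandmarksRequest { request, _ in
    guard let faces = request.results as? [VNFaceObservation] else { return }
    for face in faces {
        // 2D landmarks only; no depth and no expression coefficients.
        print("face at \(face.boundingBox);",
              "landmark points:", face.landmarks?.allPoints?.pointCount ?? 0)
    }
}
try? handler.perform([request])
```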
0 replies · 0 boosts · 9 views · 3h