Hardware

Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.

Posts under Hardware subtopic. Each entry below lists the post, followed by its replies, boosts, views, and most recent activity.

CHHapticAdvancedPatternPlayer not working with GCController
Hello everyone, I want to send haptics to a PS4 controller. Both CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer work fine on the iPhone itself. With the PS4 controller, CHHapticPatternPlayer works, but CHHapticAdvancedPatternPlayer fails with an error. I want to use CHHapticAdvancedPatternPlayer for its additional settings, and I haven't found any information on how to fix this:

CHHapticEngine.mm:624 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation could not be completed. (com.apple.CoreHaptics error -4811)'
The engine stopped because a system error occurred.
AVHapticClient.mm:1228 -[AVHapticClient getSyncDelegateForMethod:errorHandler:]_block_invoke: ERROR: Sync XPC call for 'loadAndPrepareHapticSequenceFromEvents:reply:' (client ID 0x21) failed: Couldn't communicate with a helper application.
Failed to create or play the pattern: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics" UserInfo={NSDebugDescription=connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics}

My haptics class:

import Foundation
import CoreHaptics
import GameController

protocol HapticsControllerDelegate: AnyObject {
    func didConnectController()
    func didDisconnectController()
    func enginePlayerStart(value: Bool)
}

final class HapticsControllerManager {
    static let shared = HapticsControllerManager()

    private var isSetup = false
    private var hapticEngine: CHHapticEngine?
    private var hapticPlayer: CHHapticAdvancedPatternPlayer?

    weak var delegate: HapticsControllerDelegate? {
        didSet {
            if delegate != nil {
                startObserving()
            }
        }
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }

    private func startObserving() {
        guard !isSetup else { return }
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
        isSetup = true
    }

    @objc private func controllerDidConnect(notification: Notification) {
        delegate?.didConnectController()
        self.createAndStartHapticEngine()
    }

    @objc private func controllerDidDisconnect(notification: Notification) {
        delegate?.didDisconnectController()
        hapticEngine = nil
        hapticPlayer = nil
    }

    private func createAndStartHapticEngine() {
        guard let controller = GCController.controllers().first else {
            print("No controller connected")
            return
        }
        guard controller.haptics != nil else {
            print("Haptics not supported on this controller")
            return
        }
        hapticEngine = createEngine(for: controller, locality: .default)
        hapticEngine?.playsHapticsOnly = true
        do {
            try hapticEngine?.start()
        } catch {
            print("Failed to start the haptic engine: \(error)")
        }
    }

    private func createEngine(for controller: GCController, locality: GCHapticsLocality) -> CHHapticEngine? {
        guard let engine = controller.haptics?.createEngine(withLocality: locality) else {
            print("Failed to create engine.")
            return nil
        }
        print("Successfully created engine.")
        engine.stoppedHandler = { reason in
            print("The engine stopped because \(reason.message)")
        }
        engine.resetHandler = {
            print("The engine reset --> Restarting now!")
            do {
                try engine.start()
            } catch {
                print("Failed to restart the engine: \(error)")
            }
        }
        return engine
    }

    func startHapticFeedback(haptics: [CHHapticEvent]) {
        do {
            let pattern = try CHHapticPattern(events: haptics, parameters: [])
            hapticPlayer = try hapticEngine?.makeAdvancedPlayer(with: pattern)
            hapticPlayer?.loopEnabled = true
            try hapticPlayer?.start(atTime: 0)
            self.delegate?.enginePlayerStart(value: true)
        } catch {
            self.delegate?.enginePlayerStart(value: false)
            print("Failed to create or play the pattern: \(error)")
        }
    }

    func stopHapticFeedback() {
        do {
            try hapticPlayer?.stop(atTime: 0)
            self.delegate?.enginePlayerStart(value: false)
        } catch {
            self.delegate?.enginePlayerStart(value: true)
            print("Failed to stop haptic playback: \(error)")
        }
    }
}

extension CHHapticEngine.StoppedReason {
    var message: String {
        switch self {
        case .audioSessionInterrupt: return "the audio session was interrupted."
        case .applicationSuspended: return "the application was suspended."
        case .idleTimeout: return "an idle timeout occurred."
        case .systemError: return "a system error occurred."
        case .notifyWhenFinished: return "playback finished."
        case .engineDestroyed: return "the engine was destroyed."
        case .gameControllerDisconnect: return "the game controller disconnected."
        @unknown default: return "an unknown error occurred."
        }
    }
}

Custom haptic events:

static func changeVibrationPower(power: HapricPower) -> [CHHapticEvent] {
    let continuousEvent = CHHapticEvent(eventType: .hapticContinuous, parameters: [
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0),
        CHHapticEventParameter(parameterID: .hapticIntensity, value: power.value)
    ], relativeTime: 0, duration: 0.5)
    return [continuousEvent]
}
Replies: 1 · Boosts: 0 · Views: 405 · Activity: Feb ’25
Unable to connect to any HID device using Core HID
Hello, I am currently working on a USB HID-class device and I wanted to test communication between various OSes and the device. I was able to communicate with the device over standard USB on other OSes such as Windows and Linux, through their integrated kernel modules and generic HID drivers. As a last test, I wanted to try macOS as well. This is my code, running in a Swift-based command line utility:

import Foundation
import CoreHID

let matchingCriteria = HIDDeviceManager.DeviceMatchingCriteria(vendorID: 0x1234, productID: 0x0006) // This is the VID/PID combination that the device is actually listed under
let manager = HIDDeviceManager()

for try await notification in await manager.monitorNotifications(matchingCriteria: [matchingCriteria]) {
    switch notification {
    case .deviceMatched(let deviceReference):
        print("Device Matched!")
        guard let client = HIDDeviceClient(deviceReference: deviceReference) else {
            fatalError("Unable to create client. Exiting.") // crash on purpose
        }
        let report = try await client.dispatchGetReportRequest(type: .input)
        print("Get report data: [\(report.map { String(format: "%02x", $0) }.joined(separator: " "))]")
    case .deviceRemoved(_):
        print("A device was removed.")
    default:
        continue
    }
}

The client.dispatchGetReportRequest(...) line always fails, and if I turn the try expression into a force-unwrapped one (try!) then the code, unsurprisingly, crashes. The line raises a CoreHID.HIDDeviceError.unknown() exception with a seemingly meaningless IOReturn code (the last time I tried, I got an IOReturn code with the value -536870211). My first instinct is to blame my own custom USB device for not working properly, but the code doesn't cooperate with ANY USB device currently connected: not a keyboard (with permissions granted), not a controller, nothing. I did make sure to enable USB device access in the entitlements (when I tried to run this code in a simple Cocoa app) as well. ...What am I doing wrong here? What does the IOReturn code mean? Thanks in advance to anybody willing to help out!
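On the question of what the IOReturn code means, here is a small, hedged sketch, assuming Darwin's mach_error_string() knows the code in question; it decodes many kern_return_t/IOReturn-style values, but coverage of this particular CoreHID error is not guaranteed. The value -536870211 is simply copied from the report above.

import Foundation
import Darwin

// Hedged sketch: print both the hex form of the code and whatever string Darwin has for it.
let code: mach_error_t = -536870211
let hex = String(format: "0x%08X", UInt32(bitPattern: code))
if let cString = mach_error_string(code) {
    print("IOReturn \(hex): \(String(cString: cString))")
} else {
    print("IOReturn \(hex): no description available")
}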
Replies: 1 · Boosts: 0 · Views: 442 · Activity: Feb ’25
iPhone XS Max battery issue
I have an iPhone XS Max and the battery health has degraded to 69%. I've noticed that whenever I put it on charge it just restarts, and it keeps doing that until I start using it or keep the screen on while it charges. Is it my charger, or is it because the battery health has degraded to 69%?
Replies: 1 · Boosts: 0 · Views: 280 · Activity: Feb ’25
OneDrive
I'm having an issue with OneDrive that is affecting our company iPads. Users were able to drag and drop any folder or files over, and now they can't. They are on the latest updates for OneDrive and iOS. Can someone look into this? I also reached out to Microsoft, and they said that nothing has changed on their end.
Replies: 1 · Boosts: 0 · Views: 388 · Activity: Feb ’25
CarPlay
When are you guys going to fix the CarPlay issues with this new update? I use this for work and it’s really an issue. Nothing is working and it takes up entirely too much space.
Replies: 1 · Boosts: 0 · Views: 355 · Activity: Feb ’25
iPhone 13 Pro Max camera issue
Ever since the last update I have had issues with my camera app. Sometimes when I open the app the forward-facing cameras don't work and it's just a black screen. I also get a warning that I may not have genuine iPhone parts installed. I have to reboot the phone every time just to get the app to function again. It's annoying; please fix this. I never had any issues with the camera or its app until after the update.
Replies: 1 · Boosts: 0 · Views: 195 · Activity: Mar ’25
Bluetooth
Hello, I have a question. Our app is a CGM (real-time continuous glucose monitoring) app, and it keeps a Bluetooth connection to the CGM device while running in the background. We are now running into a problem. While the app is in the background, Bluetooth is powered on and the CGM device is connected. At some point we observe the Bluetooth state suddenly change from poweredOn to resetting; Bluetooth then recovers and the state returns to poweredOn. At that point the problem appears: the previously connected CGM device can no longer be found by scanning. My questions: Under what circumstances does the Bluetooth state change to resetting? And after the state returns to poweredOn, why can the previously connected CGM device no longer be discovered by scanning, and what should I do to recover and find that device again?
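Not an authoritative answer to the question above, but a minimal sketch of a common recovery pattern, assuming the app saved the peripheral's identifier from the earlier connection (savedPeripheralID and the 181F CGM service UUID below are placeholders): when the central reports .resetting, existing CBPeripheral references become invalid; once it returns to .poweredOn, try retrievePeripherals(withIdentifiers:) or retrieveConnectedPeripherals(withServices:) before falling back to a scan, since a device that is still connected at the system level may not be advertising and therefore never shows up in scan results.

import CoreBluetooth

final class CGMReconnector: NSObject, CBCentralManagerDelegate {
    // Placeholder: identifier of the CGM peripheral saved from a previous connection.
    var savedPeripheralID: UUID?
    // Placeholder service UUID; replace with the service your CGM actually exposes.
    let cgmService = CBUUID(string: "181F")

    private lazy var central = CBCentralManager(delegate: self, queue: nil)
    private var peripheral: CBPeripheral?

    func start() {
        // Touch the lazy central manager so CoreBluetooth starts delivering state updates.
        _ = central
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        switch central.state {
        case .resetting:
            // The system Bluetooth stack was torn down; existing peripheral objects
            // are no longer valid and must be re-obtained after recovery.
            peripheral = nil
        case .poweredOn:
            reconnect()
        default:
            break
        }
    }

    private func reconnect() {
        // 1. Ask for the known peripheral by identifier (works even if it isn't advertising).
        if let id = savedPeripheralID,
           let known = central.retrievePeripherals(withIdentifiers: [id]).first {
            peripheral = known
            central.connect(known, options: nil)
            return
        }
        // 2. Check peripherals the system already has connected for this service.
        if let connected = central.retrieveConnectedPeripherals(withServices: [cgmService]).first {
            peripheral = connected
            central.connect(connected, options: nil)
            return
        }
        // 3. Fall back to scanning; a connected device may not advertise, so this can come up empty.
        central.scanForPeripherals(withServices: [cgmService], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        self.peripheral = peripheral
        central.stopScan()
        central.connect(peripheral, options: nil)
    }
}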
Replies: 1 · Boosts: 0 · Views: 98 · Activity: Mar ’25
I sent a proposal for Apple Silicon.app
Today, I submitted the following proposal to Apple through the Feedback Assistant app. I'm not confident in how I phrased it, so I'd appreciate any thoughts or feedback from fellow developers.

Proposal: "Apple Silicon.app" for macOS with Apple Silicon – Enhancing Performance and Swap Memory Control

This suggestion has been machine translated into English, so there may be some discrepancies. If you need the actual text, please feel free to reply to this or ask at the email address below. <mail address> If the author of this proposal is to be credited, I would appreciate being listed under the nickname "DiamondGotCat," where possible.

Summary:
- Currently, Apple Silicon-equipped Macs have many system-level features locked down or restricted.
- This proposal suggests a new application that enables certain advanced controls for power users.
- Tentatively named "Apple Silicon.app", the name may be subject to change if a more suitable alternative arises.
- I propose this application be added as a pre-installed utility on compatible systems: macOS (M-series), iPadOS (A-series and M-series), and iOS (iPhones with Apple-designed A5 and newer chips, provided the latest OS is available for them).

Overview:
This proposal introduces "Apple Silicon.app", a new system-level utility designed to offer power users greater flexibility and control over Apple Silicon behavior, as part of a broader feature update for Apple devices. I propose that Apple Silicon.app be automatically installed as a pre-installed application on Apple Silicon devices (Mac, iPad, iPhone, and select Vision devices) that support upcoming major system updates.

Suggested Features of Apple Silicon.app:

1. Performance Core Control
To describe this functionality, the following terminology will be used:
- P-cores: Performance cores
- E-cores: Efficiency cores
I understand that Apple Silicon emphasizes energy efficiency, but I believe there are users, myself included, who prioritize maximum performance regardless of power usage. Therefore, I propose that the app offer a drop-down menu with the following six modes for performance core usage:
A. Automatic (Recommended) – Default macOS behavior; automatically switches between P/E cores based on workload.
B. Performance Priority – Prioritizes P-cores for high-demand tasks, restricts E-cores. Ideal for developers, video editors.
C. Power Saving – Uses E-cores only whenever possible; limits P-core usage. Great for battery saving.
D. P-Core Exclusive Mode – User-defined processes always run on P-cores. Suitable for benchmarks or low-latency tasks.
E. E-Core Exclusive Mode – Prioritizes background tasks and thermal efficiency.
F. Manual Assignment (Advanced) – Users can manually assign P/E cores per application in a dedicated settings screen.
Additionally, I propose the following optional checkbox settings:
- Thermal Safety Mode: Automatically switches from P- to E-cores when system heat exceeds a threshold.
- Restore Core Settings on Wake: Remembers P/E settings after sleep/wake.
- Power Source Adaptive Mode: Switches to power-saving on battery, and performance mode when plugged in.

2. Swap Memory Configuration
The app should also enable user-level control over swap memory (i.e., using part of the SSD as virtual memory). Currently, macOS manages swap space automatically with no user customization available. I propose the ability to manually configure the swap system with the following options:
- Enable Manual Configuration: Checkbox to switch from automatic to manual control.
- Swap Size: Adjustable in GB units, allowing users to allocate the desired swap capacity.

3. Other Settings
At this point, these are the core features I propose. If additional useful features exist that align with this concept, I welcome further suggestions or expansion. As users, at least speaking for myself, we look forward to such customization options becoming available.
Replies: 1 · Boosts: 0 · Views: 84 · Activity: Mar ’25
iPad Bluetooth Keyboard Defaults to ANSI When Connected After App Launch (JIS Layout Issue)
I'm developing an iPad app and encountered a strange issue with external Bluetooth keyboards. Issue: I have a Bluetooth keyboard set to JIS layout in Settings > General > Keyboard > Hardware Keyboard > Keyboard Type. If I connect the keyboard before launching the app, everything works fine, and the input follows the JIS layout. However, if I launch the app first and then turn on the Bluetooth keyboard, the input behaves as if the keyboard is in ANSI layout, even though the settings still show JIS. It seems like iPadOS defaults to ANSI if no external keyboard is connected when the app starts, and later connections do not update the layout properly. Has anyone encountered a similar issue, and is there a programmatic way to ensure that the correct keyboard layout is applied after the keyboard is connected? Any help or insights would be greatly appreciated!
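A hedged sketch of one thing worth trying, assuming the GameController framework's GCKeyboardDidConnect notification (iPadOS 14 and later) is acceptable for detecting a keyboard that appears after launch: cycle first responder and reload input views when the keyboard connects. Whether this actually makes the system re-read the JIS hardware-keyboard setting is an assumption to verify, not guaranteed behavior; textField is a placeholder for whatever view currently takes keyboard input.

import UIKit
import GameController

final class KeyboardLayoutObserver {
    private var token: NSObjectProtocol?

    func startObserving(textField: UITextField) {
        token = NotificationCenter.default.addObserver(
            forName: .GCKeyboardDidConnect,
            object: nil,
            queue: .main
        ) { [weak textField] _ in
            guard let textField else { return }
            // A hardware keyboard appeared after launch. Cycling first responder and
            // reloading input views nudges UIKit to rebuild its input state; whether
            // this re-applies the JIS hardware-keyboard setting needs to be tested.
            let wasEditing = textField.isFirstResponder
            textField.resignFirstResponder()
            textField.reloadInputViews()
            if wasEditing { textField.becomeFirstResponder() }
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}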
Replies: 0 · Boosts: 0 · Views: 72 · Activity: Mar ’25
About USB accessory certification
I have a question about Apple certification. We are planning a card reader that works as a HID (human interface device) for iPads that support USB-C; the iPad will receive data over the HID protocol. In this case, do I have to obtain a certification (for example, MFi) as for an Apple USB accessory?
Replies: 3 · Boosts: 0 · Views: 112 · Activity: Apr ’25
DockKit ADK 1.0 compatible Nordic SDK
While compiling the nRF5340 target of the DockKit ADK 1.0 by following the README.md guide, I selected the latest Nordic SDK, because the README.md does not specify an SDK version. But it seems that the ADK and the SDK are not compatible with each other. For example, the ADK calls nrfx_gpiote_channel_alloc() with one argument in PAL\NCS\HAPPlatformExperience.c, but the definition of this function takes two arguments. I also found that in some older versions of the Nordic SDK this function takes only one argument. Could you please confirm which version of the Nordic SDK developers should use?
Replies: 1 · Boosts: 0 · Views: 100 · Activity: Apr ’25
BLE timeout issue when connecting two devices on iOS 18 (but not iOS 16)
Hi,

We’re developing a BLE peripheral device and encountered a connection issue when connecting two devices (Device A and Device B) simultaneously to an iOS device.

Problem: On iOS 18, we are experiencing occasional BLE timeouts and disconnections when both devices are connected at the same time. On iOS 16, we did not encounter this issue under the same conditions.

What we’ve tried: Adjusted the connection interval from 30 ms to 15 ms. This seems to have improved stability somewhat. However, we still observe intermittent timeout/disconnection issues.

Questions:
- Are there any known changes in BLE connection handling or timing constraints in iOS 18?
- Are there recommended connection parameter settings (interval, latency, timeout, etc.) for multi-device BLE connections in iOS?
- Is there a way to debug or log more details about the disconnection reasons on the iOS side?

Any guidance or suggestions would be greatly appreciated.
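On the third question above (logging more detail about disconnections), a minimal sketch of the CoreBluetooth delegate callbacks that carry the error on the iOS side; the CBError cases shown are standard ones, and how much detail iOS exposes beyond them (for example, for a link-layer supervision timeout) is limited.

import CoreBluetooth

final class DisconnectLogger: NSObject, CBCentralManagerDelegate {
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        print("Central state: \(central.state.rawValue)")
    }

    // Called when an established connection drops; the error describes why, as far as iOS reports it.
    func centralManager(_ central: CBCentralManager,
                        didDisconnectPeripheral peripheral: CBPeripheral,
                        error: Error?) {
        guard let error else {
            print("\(peripheral.identifier) disconnected without an error")
            return
        }
        if let cbError = error as? CBError {
            switch cbError.code {
            case .connectionTimeout:
                print("\(peripheral.identifier): connection timed out")
            case .peripheralDisconnected:
                print("\(peripheral.identifier): the peripheral ended the connection")
            case .connectionFailed:
                print("\(peripheral.identifier): connection failed")
            default:
                print("\(peripheral.identifier): CBError \(cbError.code.rawValue): \(cbError.localizedDescription)")
            }
        } else {
            let nsError = error as NSError
            print("\(peripheral.identifier): \(nsError.domain) code \(nsError.code): \(nsError.localizedDescription)")
        }
    }

    // Also worth logging: failures before the link is established.
    func centralManager(_ central: CBCentralManager,
                        didFailToConnect peripheral: CBPeripheral,
                        error: Error?) {
        print("\(peripheral.identifier): failed to connect: \(String(describing: error))")
    }
}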
Replies: 0 · Boosts: 0 · Views: 109 · Activity: Apr ’25
NSLocalizedDescription = \"Peer removed pairing information\";
After hardware and mobile phone hid mode pairing, the first connection is successful, after a while disconnect and reconnect,APP monitoring Bluetooth error NSLocalizedDescription = "Peer removed pairing information"; Failed to connect Hardware engineers detect the pairing information and find that the local pairing information of the iPhone has changed, which is a non-mandatory phenomenon
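A hedged sketch of how an app might at least recognize this condition, assuming CoreBluetooth surfaces it as CBError.peerRemovedPairingInformation (the onPairingLost hook is a placeholder): the app cannot clear or recreate the bond itself, so the usual recovery is asking the user to forget the accessory in Settings > Bluetooth, reset pairing on the accessory, and pair again.

import CoreBluetooth

final class PairingLossHandler: NSObject, CBCentralManagerDelegate {
    // Placeholder hook the UI layer can use to prompt the user to re-pair.
    var onPairingLost: ((CBPeripheral) -> Void)?

    func centralManagerDidUpdateState(_ central: CBCentralManager) { }

    func centralManager(_ central: CBCentralManager,
                        didDisconnectPeripheral peripheral: CBPeripheral,
                        error: Error?) {
        handle(error, for: peripheral)
    }

    func centralManager(_ central: CBCentralManager,
                        didFailToConnect peripheral: CBPeripheral,
                        error: Error?) {
        handle(error, for: peripheral)
    }

    private func handle(_ error: Error?, for peripheral: CBPeripheral) {
        guard let cbError = error as? CBError else { return }
        if cbError.code == .peerRemovedPairingInformation {
            // The bond on one side no longer matches the other. iOS apps cannot delete
            // the stale bond programmatically, so surface this to the user and ask them
            // to forget the device in Settings > Bluetooth and pair again.
            onPairingLost?(peripheral)
        }
    }
}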
Replies: 1 · Boosts: 0 · Views: 140 · Activity: Apr ’25
Matter device data pipeline
I'm a device manufacturer and am planning to get my device Matter certified in the future. If I want to get my device's data into my cloud for analytics, what is the best way to do that? My research suggests that the latest approach recommended by Apple is to develop a custom mobile app using the HomeKit SDK, subscribe to the device's state, and send it to my cloud. If I go that route, will it work even though the device was onboarded via the Home app and a HomeKit hub is also present? I want to make sure both paths stay active, device to hub to Home app, and device to custom app to my cloud, both within the Matter ecosystem. The HomeKit SDK and the Matter support mentioned here, https://developer.apple.com/apple-home/matter, are these two the same thing?
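A hedged sketch of the "custom app with the HomeKit SDK" route described above, assuming the accessory is already paired into the home: enable notifications on characteristics that support them and forward updates from the accessory delegate to a cloud endpoint. The endpoint URL and payload shape are placeholders, not anything Apple specifies, and the app needs the HomeKit entitlement and usage description.

import Foundation
import HomeKit

final class TelemetryForwarder: NSObject, HMHomeManagerDelegate, HMAccessoryDelegate {
    private let homeManager = HMHomeManager()
    // Placeholder analytics endpoint.
    private let endpoint = URL(string: "https://example.com/telemetry")!

    override init() {
        super.init()
        homeManager.delegate = self
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        for home in manager.homes {
            for accessory in home.accessories {
                accessory.delegate = self
                for service in accessory.services {
                    for characteristic in service.characteristics where characteristic.properties.contains(HMCharacteristicPropertySupportsEventNotification) {
                        // Ask HomeKit to push value changes for this characteristic.
                        characteristic.enableNotification(true) { error in
                            if let error { print("enableNotification failed: \(error)") }
                        }
                    }
                }
            }
        }
    }

    // Called whenever a subscribed characteristic changes value.
    func accessory(_ accessory: HMAccessory, service: HMService,
                   didUpdateValueFor characteristic: HMCharacteristic) {
        let payload: [String: Any] = [
            "accessory": accessory.uniqueIdentifier.uuidString,
            "characteristic": characteristic.characteristicType,
            "value": String(describing: characteristic.value),
            "timestamp": Date().timeIntervalSince1970
        ]
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONSerialization.data(withJSONObject: payload)
        URLSession.shared.dataTask(with: request).resume()
    }
}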
Replies: 1 · Boosts: 0 · Views: 109 · Activity: May ’25
Mic Button in Microsoft Web Chat Control Unresponsive in WKWebView After App Returns from Background (iOS)
We are integrating the Microsoft Web Chat Control inside a WKWebView in our iOS application. The microphone button (used for speech input) works as expected when the app is active. However, we are facing an issue when the app is sent to the background and then brought back to the foreground. Issue Details: When the app returns from background to foreground: 🔹 The mic button becomes unresponsive (taps are not recognized). 🔹 No permission prompt or speech functionality is triggered. 🔹 Other elements of the Web Chat control continue to work fine. This issue seems isolated to iOS and WKWebView usage. We have verified that microphone permissions are granted and there are no system-level blocks. Environment: Platform: iOS Web Container: WKWebView Microsoft Web Chat Version: Devanshu Kinariwala add version here iOS Version: iOS 18.3.1 Devices: iPhone 13, iPhone 14 Pro error in browser console [Error] A MediaStreamTrack ended due to a capture failure (x3) [Error] WebSocket connection to 'wss://directline.botframework.com/v3/directline/conversations/JQ1k0phVogeJ30ZQddBvAQ-in/stream?watermark=-&t=eyJhbGciOiJSUzI1NiIsImtpZCI6ImlmOEs0aFg4R1hXVnZkS3pwdFRFWFJveURTUSIsIng1dCI6ImlmOEs0aFg4R1hXVnZkS3pwdFRFWFJveURTUSIsInR5cCI6IkpXVCJ9.eyJib3QiOiJhaWFhcy1xYS1jb252YWktYm90Iiwic2l0ZSI6InRWcW14cDBQZU9vIiwiY29udiI6IkpRMWswcGhWb2dlSjMwWlFkZEJ2QVEtaW4iLCJuYmYiOjE3NDI5NzE1MTgsImV4cCI6MTc0Mjk3MTU3OCwiaXNzIjoiaHR0cHM6Ly9kaXJlY3RsaW5lLmJvdGZyYW1ld29yay5jb20vIiwiYXVkIjoiaHR0cHM6Ly9kaXJlY3RsaW5lLmJvdGZyYW1ld29yay5jb20vIn0.Mx3MMVP3t9Ex36UW-YARskZLny0iORxc6-B0ewvNp0S-ivUjvOS43kZc0J5HoOgYRkoGaKemo00_JSkzryAbKKoSwqMjahf0VotqTZsJjoIgtyNJFfAYyGVriBHMV_6FfH_YEezDMD5puY6R89eM-atQOw-CfoClwrxn8jgVL5Kn19WdDZvmQwFIArklA7as8bboKcWv4PveEKptM9xCokttaGzv-S5pdbNETMoJzIhLcJDHmEVJ6oJ0TFs5XS7RGMSQlM_gs95TySzVjVL7XV6qEOt_A10lRzmx0PxPIUw_nqllEIbWFy5H7AfsxbKRtM1nLe4lRm1KS7_xw9dSlw' failed: The operation couldn’t be completed. Software caused connection abort [Error] WebSocket connection to 'wss://eastus2.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed&Ocp-Apim-Subscription-Key=4JuKqIMwLMhgAxfORVDEYfiuTL6Hrbnj3isAeGfs7aks4AOltun6JQQJ99AKACHYHv6XJ3w3AAAYACOGJFYj&X-ConnectionId=A93B097C62F14C55B30A851798609F73' failed: The operation couldn’t be completed. Software caused connection abort [Error] WebSocket connection to 'wss://eastus2.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed&Ocp-Apim-Subscription-Key=4JuKqIMwLMhgAxfORVDEYfiuTL6Hrbnj3isAeGfs7aks4AOltun6JQQJ99AKACHYHv6XJ3w3AAAYACOGJFYj&X-ConnectionId=E99DF3A6CE734E0294A5FB5296D725CC' failed: The operation couldn’t be completed. Software caused connection abort [Error] WebSocket connection to 'wss://eastus2.stt.speech.microsoft.com/speech/recognition/conversation/cognitiveservices/v1?language=en-US&format=detailed&Ocp-Apim-Subscription-Key=4JuKqIMwLMhgAxfORVDEYfiuTL6Hrbnj3isAeGfs7aks4AOltun6JQQJ99AKACHYHv6XJ3w3AAAYACOGJFYj&X-ConnectionId=8B3370005E7A4946BEA174E804F64FF7' failed: The operation couldn’t be completed. 
Software caused connection abort

Swift code used to check microphone permission:

// Method to check microphone permission
func checkMicrophonePermission(completion: @escaping (Bool) -> Void) {
    let authorizationStatus = AVCaptureDevice.authorizationStatus(for: .audio)
    switch authorizationStatus {
    case .authorized:
        // Permission granted
        completion(true)
    case .notDetermined:
        // Request permission
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            DispatchQueue.main.async {
                completion(granted)
            }
        }
    case .denied, .restricted:
        // Permission denied or restricted
        completion(false)
    @unknown default:
        // Handle future cases
        completion(false)
    }
}
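A hedged sketch of one mitigation worth trying for the "MediaStreamTrack ended due to a capture failure" symptom: re-assert the audio session when the app returns to the foreground so WKWebView's capture pipeline has an active session again. This rests on an assumption about the failure mode, not on a documented fix, and the category and options shown are illustrative only.

import UIKit
import AVFoundation

final class WebChatAudioSessionFixer {
    private var observer: NSObjectProtocol?

    func start() {
        observer = NotificationCenter.default.addObserver(
            forName: UIApplication.willEnterForegroundNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Re-assert an audio session suitable for speech capture; the category
            // and options here are illustrative and may need tuning for the app.
            let session = AVAudioSession.sharedInstance()
            do {
                try session.setCategory(.playAndRecord,
                                        mode: .voiceChat,
                                        options: [.defaultToSpeaker, .allowBluetooth])
                try session.setActive(true)
            } catch {
                print("Failed to reactivate audio session: \(error)")
            }
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}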
Replies: 0 · Boosts: 0 · Views: 87 · Activity: May ’25