We have developed an accessory that supports Find My. When setting it up with the Find My app, it occasionally gets stuck at the final "Setting Up" screen and the app just stays there. We would like to know what could cause this and how to resolve it.
Thanks a lot.
Hardware
Delve into the physical components of Apple devices, including processors, memory, storage, and their interaction with the software.
We are currently planning to develop a third‑party hardware accessory that supports Wi‑Fi Aware using AccessorySetupKit on iOS, based on the official documentation:
https://developer.apple.com/documentation/accessorysetupkit/
Before finalizing our hardware and firmware design, we would like to better understand the real‑world behavior and user experience of Wi‑Fi Aware in actual third‑party accessories.
Specifically, we would like to ask:
Existing Third‑Party Hardware
Are there any commercially available third‑party accessories (not Apple products) that already support Wi‑Fi Aware via AccessorySetupKit?
If so, are there any public examples, reference designs, or recommended products we can purchase to observe the real onboarding, discovery, and pairing experience?
Reference or Evaluation Hardware
Does Apple provide any reference hardware, evaluation kits, or recommended vendor solutions (for example, based on common Wi‑Fi chipsets) that are known to work well with Wi‑Fi Aware on iOS?
Are there specific Wi‑Fi chipset vendors that have validated interoperability with AccessorySetupKit?
Practical Behavior and Limitations
In real usage, what are the typical discovery latency, reliability, and background/foreground behavior developers should expect?
Are there known limitations or best practices when designing hardware that relies on Wi‑Fi Aware for initial accessory discovery and setup?
Our goal is to evaluate the feasibility and user experience of Wi‑Fi Aware for third‑party accessories by testing against existing implementations or recommended hardware, before investing heavily in custom hardware development.
Any guidance, examples, or pointers to existing accessories or partners would be greatly appreciated.
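For reference, here is a minimal, hedged sketch of the AccessorySetupKit discovery/picker flow in question. The accessory name, image, and the "MyAccessory-" SSID prefix are placeholders, and it uses SSID-prefix discovery as a stand-in; the Wi-Fi Aware-specific descriptor configuration should be checked against the current AccessorySetupKit documentation.
import AccessorySetupKit
import UIKit

// Hedged sketch: SSID-prefix discovery as a stand-in; names and image are placeholders.
let session = ASAccessorySession()

let descriptor = ASDiscoveryDescriptor()
descriptor.ssidPrefix = "MyAccessory-"

let item = ASPickerDisplayItem(
    name: "My Accessory",
    productImage: UIImage(systemName: "sensor.fill") ?? UIImage(),
    descriptor: descriptor
)

session.activate(on: .main) { event in
    switch event.eventType {
    case .activated:
        // Present the system accessory picker once the session is ready.
        session.showPicker(for: [item]) { error in
            if let error { print("Picker failed: \(error)") }
        }
    case .accessoryAdded:
        print("Accessory added: \(String(describing: event.accessory))")
    default:
        break
    }
}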
I want to add a Matter device to my own fabric, not to the HomeKit home in the Home app.
I implemented a demo that adds a Matter support extension, and that part succeeds, but when I use MTRDeviceController to commission the device it fails. Below is the log:
Couldn't read values in CFPrefsPlistSource<0x1062ec100> (Domain: group.wxx.MatterTest, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
<<5 [E:46634i S:0 M:188511265] (U) Msg Retransmission to 0:0000000000000000 failure (max retries:4)
PASESession timed out while waiting for a response from the peer. Expected message type was 33
controller(_:commissioningSessionEstablishmentDone:) error = nil
Error on commissioning step 'AttestationVerification': 'src/controller/CHIPDeviceController.cpp:1288: CHIP Error 0x000000AC: Internal error'
Failed verifying attestation information. Now checking DAC chain revoked status.
Failed in verifying 'Attestation Information' command received from the device: err 101. Look at AttestationVerificationResult enum to understand the errors
Error on commissioning step 'AttestationRevocationCheck': 'src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error'
Failed to send Solitary ack for MessageCounter:265529558 on exchange 46643i:src/messaging/ExchangeContext.cpp:99: CHIP Error 0x00000002: Connection aborted
Creating NSError from src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error (context: (null))
controller(_:commissioningComplete:nodeID:metrics:) error = Optional(Error Domain=MTRErrorDomain Code=1 "General error: 172" UserInfo={NSLocalizedDescription=General error: 172, errorCode=172})
Does anyone have any suggestions for resolving this issue?
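Not a confirmed diagnosis, but the log shows the failure happening in the AttestationVerification step, so one thing to try with development hardware is supplying a device attestation delegate on the commissioning parameters and deciding there whether to continue. A rough sketch only, assuming an existing controller and node ID; the exact delegate callback spellings should be taken from the MTRDeviceAttestationDelegate header.
import Matter

// Rough sketch only. Assumes `controller` and `nodeID` already exist and that
// `attestationDelegate` is an object conforming to MTRDeviceAttestationDelegate
// (its callbacks can call continueCommissioningDevice(_:ignoreAttestationFailure:)
// for known development devices whose attestation chain is not trusted).
func commission(nodeID: NSNumber,
                with controller: MTRDeviceController,
                attestationDelegate: MTRDeviceAttestationDelegate) throws {
    let params = MTRCommissioningParameters()
    params.deviceAttestationDelegate = attestationDelegate
    try controller.commissionNode(withID: nodeID, commissioningParams: params)
}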
Topic:
App & System Services
SubTopic:
Hardware
Hello Apple Forums,
We are developing an iOS application that connects to a custom BLE accessory and sends control commands to it.
Our system architecture is as follows:
A separate hardware device collects data and sends it to our backend server via Wi-Fi.
The backend evaluates state changes and determines when the BLE accessory should update its display.
The iOS app acts purely as a BLE command executor for this accessory.
Our goal is to:
Maintain a BLE connection with the accessory while the app is in the background.
Receive state-change events from our backend server.
Upon receiving such events, send a BLE command to the accessory to update its state.
We understand that iOS does not allow arbitrary background execution. We would like to confirm whether there is any supported mechanism, entitlement, or program that allows:
Long-running background execution for BLE control, or
Server-originated events (other than APNs) to trigger background BLE actions.
If this is not supported, we would appreciate confirmation that APNs (silent push) is the only supported way to trigger such background BLE actions, or guidance on any recommended alternative architectures.
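For reference, this is the shape of the APNs-triggered path we are asking about, as a hedged sketch rather than a confirmed design: a silent push briefly wakes the app, and the handler writes to a characteristic on an already-connected peripheral. `connectedPeripheral`, `commandCharacteristic`, and the `command` payload key are placeholders, and the app would need the bluetooth-central and remote-notification background modes.
import UIKit
import CoreBluetooth

class AppDelegate: UIResponder, UIApplicationDelegate {
    var central: CBCentralManager!
    var connectedPeripheral: CBPeripheral?        // placeholder: set after connecting
    var commandCharacteristic: CBCharacteristic?  // placeholder: set after service discovery

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // A restore identifier lets iOS relaunch the app for Bluetooth events;
        // a real delegate implementing centralManager(_:willRestoreState:) is required.
        central = CBCentralManager(delegate: nil, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "accessory-central"])
        return true
    }

    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // A silent push ("content-available": 1) provides a short window of background time.
        guard let peripheral = connectedPeripheral,
              let characteristic = commandCharacteristic,
              let command = userInfo["command"] as? String else {
            completionHandler(.noData)
            return
        }
        peripheral.writeValue(Data(command.utf8), for: characteristic, type: .withResponse)
        completionHandler(.newData)
    }
}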
Thank you for your guidance.
I am creating a barcode reader using the AVFoundation framework for iOS and iPadOS. The read result goes into payloadStringValue, but I want to check the control characters contained in the symbol, so I have been looking at the raw data shown in the description property (inherited from NSObjectProtocol) of VNBarcodeObservation. However, I noticed that if the raw data is longer than 26 bytes, part of it is omitted from the description. My question is: is it possible to configure this so that all of the raw data is written out without omission? If so, could you please tell me how to set this up? Also, if you know of any other way to extract the raw barcode data, I would appreciate it if you could let me know.
Thank you.
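On that last point about the barcode raw data, a hedged alternative to parsing the description string (which truncates long Data previews) is to read the raw bytes from the observation's barcodeDescriptor where one is available. This is only a sketch; descriptor availability varies by symbology.
import Vision
import CoreImage

// Hedged sketch: for QR codes the descriptor is a CIQRCodeDescriptor whose
// errorCorrectedPayload contains the raw codewords, including control characters.
func rawPayload(from observation: VNBarcodeObservation) -> Data? {
    if let qr = observation.barcodeDescriptor as? CIQRCodeDescriptor {
        return qr.errorCorrectedPayload
    }
    // Fall back to the decoded string when no descriptor is exposed.
    return observation.payloadStringValue.flatMap { Data($0.utf8) }
}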
I am using NFC. When the phone is near the NFC reader, it sometimes reports the error below:
2024-07-15 15:43:03.608427+0800 TestNFC[16022:1038141] [xpc.exceptions] <NSXPCConnection: 0x282ba90e0> connection to service with pid 58 named com.apple.nfcd.service.corenfc: Exception caught during decoding of received selector didDetectExternalReaderWithNotification:, dropping incoming message.
Exception: Exception while decoding argument 0 (#2 of invocation):
Exception: decodeObjectForKey: class "NFFieldNotification" not loaded or does not exist
my code:
#import <CoreNFC/CoreNFC.h>
@interface ViewController ()<NFCTagReaderSessionDelegate>
@property (strong, nonatomic) NFCTagReaderSession *session;
@end
@implementation ViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    if (@available(iOS 13.0, *)) {
        // Create the NFC session and set this object as the NFCTagReaderSessionDelegate.
        if (NFCTagReaderSession.readingAvailable) {
            self.session = [[NFCTagReaderSession alloc]
                initWithPollingOption:NFCPollingISO14443 delegate:self queue:nil];
            // Prompt shown in the system NFC sheet.
            self.session.alertMessage = @"Ready to scan. Hold the card near the phone.";
            // Start the NFC session.
            [self.session beginSession];
        }
    } else {
        // NFC tag reading requires iOS 13 or later.
    }
}

#pragma mark - NFCTagReaderSessionDelegate

- (void)tagReaderSessionDidBecomeActive:(NFCTagReaderSession *)session API_AVAILABLE(ios(13.0)) {
    NSLog(@"tagReaderSessionDidBecomeActive");
}

// Invalidation callback — also called when the session ends after a successful read.
- (void)tagReaderSession:(NFCTagReaderSession *)session didInvalidateWithError:(NSError *)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"readerSession:didInvalidateWithError: (%@)", [error localizedDescription]);
}

- (void)tagReaderSession:(NFCTagReaderSession *)session didDetectTags:(NSArray<__kindof id<NFCTag>> *)tags API_AVAILABLE(ios(13.0)) {
    // Tag handling not implemented yet.
}
I have a 4-input, 4-output hardware device and an 8-input, 8-output virtual device, which I combine into an aggregate device. I am using the SimplyCoreAudio library to get the channel count. The code is as follows:
aggregationDevice!.channels(scope: .input)  // => 12
aggregationDevice!.channels(scope: .output) // => 12
When the program's MicrophoneMode is set to standard, the channel count is correct. However, when I set the MicrophoneMode to voiceIsolation, the channel count is incorrect:
aggregationDevice!.channels(scope: .input)  // => 4
aggregationDevice!.channels(scope: .output) // => 12
Below is the code for creating the aggregate device:
func createAggregateDevice(mainDevice: AudioDevice,
secondDevice: AudioDevice?,
named name: String,
uid: String) -> AudioDevice?
{
guard let mainDeviceUID = mainDevice.uid else { return nil }
var deviceList: [[String: Any]] = [
[
kAudioSubDeviceUIDKey: mainDeviceUID,
kAudioSubDeviceDriftCompensationKey:1
]
]
// make sure same device isn't added twice
if let secondDeviceUID = secondDevice?.uid, secondDeviceUID != mainDeviceUID {
deviceList.append([
kAudioSubDeviceUIDKey: secondDeviceUID,
kAudioSubDeviceDriftCompensationKey:1,
kAudioSubDeviceInputChannelsKey:8
])
}
let desc: [String: Any] = [
kAudioAggregateDeviceNameKey: name,
kAudioAggregateDeviceUIDKey: uid,
kAudioAggregateDeviceSubDeviceListKey: deviceList,
kAudioAggregateDeviceMainSubDeviceKey: mainDeviceUID,
kAudioAggregateDeviceIsPrivateKey:false,
]
var deviceID: AudioDeviceID = 0
let error = AudioHardwareCreateAggregateDevice(desc as CFDictionary, &deviceID)
guard error == noErr else {
return nil
}
return AudioDevice.lookup(by: deviceID)
}
I hope someone can tell me the reason. Thank you!
We just updated our ATS installation to the latest version, 8.3.0, and tried to run the iAP2 Session Test via a BPA100 Bluetooth Analyzer, and we are hitting the EXC_BAD_INSTRUCTION crash below. The same test still works on ATS version 6. Please advise.
Process: ATS [1782]
Path: /private/var/folders/*/ATS.app/Contents/MacOS/ATS
Identifier: com.apple.ATSMacApp
Version: 8.3.0 (1826)
Build Info: ATSMacApp-1826000000000000~2 (1A613)
Code Type: X86-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2025-01-27 11:05:21.1334 -0800
OS Version: macOS 15.2 (24C101)
Report Version: 12
Bridge OS Version: 9.2 (22P2093)
Anonymous UUID: 098E2BB5-CB98-CA1C-CEFE-188AF6EFE8CF
Time Awake Since Boot: 9700 seconds
System Integrity Protection: enabled
Crashed Thread: 2 com.apple.ATSMacApp.FrontlineFrameworkInterface
Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 4 Illegal instruction: 4
Terminating Process: exc handler [1782]
Topic:
App & System Services
SubTopic:
Hardware
Tags:
Developer Tools
External Accessory
Testing
Core Bluetooth
Hello everyone,
I want to send haptics to a PS4 controller.
CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer both work well on the iPhone itself.
With the PS4 controller, everything works if I use CHHapticPatternPlayer, but if I use CHHapticAdvancedPatternPlayer I get an error. I want to use CHHapticAdvancedPatternPlayer for its additional settings, and I haven't found any information on how to fix this:
CHHapticEngine.mm:624 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn't be completed. (com.apple.CoreHaptics error -4811.)'
The engine stopped because a system error occurred.
AVHapticClient.mm:1228 -[AVHapticClient getSyncDelegateForMethod:errorHandler:]_block_invoke: ERROR: Sync XPC call for 'loadAndPrepareHapticSequenceFromEvents:reply:' (client ID 0x21) failed: Couldn't communicate with a helper application.
Failed to create or play the pattern: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics" UserInfo={NSDebugDescription=connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics}
My Haptic class -
import Foundation
import CoreHaptics
import GameController
protocol HapticsControllerDelegate: AnyObject {
func didConnectController()
func didDisconnectController()
func enginePlayerStart(value: Bool)
}
final class HapticsControllerManager {
static let shared = HapticsControllerManager()
private var isSetup = false
private var hapticEngine: CHHapticEngine?
private var hapticPlayer: CHHapticAdvancedPatternPlayer?
weak var delegate: HapticsControllerDelegate? {
didSet {
if delegate != nil {
startObserving()
}
}
}
deinit {
NotificationCenter.default.removeObserver(self)
}
private func startObserving() {
guard !isSetup else { return }
NotificationCenter.default.addObserver(
self,
selector: #selector(controllerDidConnect),
name: .GCControllerDidConnect,
object: nil
)
NotificationCenter.default.addObserver(
self,
selector: #selector(controllerDidDisconnect),
name: .GCControllerDidDisconnect,
object: nil
)
isSetup = true
}
@objc private func controllerDidConnect(notification: Notification) {
delegate?.didConnectController()
self.createAndStartHapticEngine()
}
@objc private func controllerDidDisconnect(notification: Notification) {
delegate?.didDisconnectController()
hapticEngine = nil
hapticPlayer = nil
}
private func createAndStartHapticEngine() {
guard let controller = GCController.controllers().first else {
print("No controller connected")
return
}
guard controller.haptics != nil else {
print("Haptics not supported on this controller")
return
}
hapticEngine = createEngine(for: controller, locality: .default)
hapticEngine?.playsHapticsOnly = true
do {
try hapticEngine?.start()
} catch {
print("Не удалось запустить движок тактильной обратной связи: \(error)")
}
}
private func createEngine(for controller: GCController, locality: GCHapticsLocality) -> CHHapticEngine? {
guard let engine = controller.haptics?.createEngine(withLocality: locality) else {
print("Failed to create engine.")
return nil
}
print("Successfully created engine.")
engine.stoppedHandler = { reason in
print("The engine stopped because \(reason.message)")
}
engine.resetHandler = {
print("The engine reset --> Restarting now!")
do {
try engine.start()
} catch {
print("Failed to restart the engine: \(error)")
}
}
return engine
}
func startHapticFeedback(haptics: [CHHapticEvent]) {
do {
let pattern = try CHHapticPattern(events: haptics, parameters: [])
hapticPlayer = try hapticEngine?.makeAdvancedPlayer(with: pattern)
hapticPlayer?.loopEnabled = true
try hapticPlayer?.start(atTime: 0)
self.delegate?.enginePlayerStart(value: true)
} catch {
self.delegate?.enginePlayerStart(value: false)
print("Не удалось создать или воспроизвести паттерн: \(error)")
}
}
func stopHapticFeedback() {
do {
try hapticPlayer?.stop(atTime: 0)
self.delegate?.enginePlayerStart(value: false)
} catch {
self.delegate?.enginePlayerStart(value: true)
print("Не удалось остановить воспроизведение вибрации: \(error)")
}
}
}
extension CHHapticEngine.StoppedReason {
var message: String {
switch self {
case .audioSessionInterrupt:
return "the audio session was interrupted."
case .applicationSuspended:
return "the application was suspended."
case .idleTimeout:
return "an idle timeout occurred."
case .systemError:
return "a system error occurred."
case .notifyWhenFinished:
return "playback finished."
case .engineDestroyed:
return "the engine was destroyed."
case .gameControllerDisconnect:
return "the game controller disconnected."
@unknown default:
return "an unknown error occurred."
}
}
}
custom haptic events -
static func changeVibrationPower(power: HapricPower) -> [CHHapticEvent] {
let continuousEvent = CHHapticEvent(eventType: .hapticContinuous, parameters: [
CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0),
CHHapticEventParameter(parameterID: .hapticIntensity, value: power.value)
], relativeTime: 0, duration: 0.5)
return [continuousEvent]
}
Hello,
I am currently working on a USB HID-class device and I wanted to test communications between various OSes and the device.
I was able to communicate through standard USB with the device on other OSes such as Windows and Linux, through their integrated kernel modules and generic HID drivers. As a last test, I wanted to test macOS as well.
This is my code, running in a Swift-based command line utility:
import Foundation
import CoreHID
let matchingCriteria = HIDDeviceManager.DeviceMatchingCriteria(vendorID: 0x1234, productID: 0x0006) // This is the VID/PID combination that the device is actually listed under
let manager = HIDDeviceManager()
for try await notification in await manager.monitorNotifications(matchingCriteria: [matchingCriteria]) {
switch notification {
case .deviceMatched(let deviceReference):
print("Device Matched!")
guard let client = HIDDeviceClient(deviceReference: deviceReference) else {
fatalError("Unable to create client. Exiting.") // crash on purpose
}
let report = try await client.dispatchGetReportRequest(type: .input)
print("Get report data: [\(report.map { String(format: "%02x", $0) }.joined(separator: " "))]")
case .deviceRemoved(_):
print("A device was removed.")
default:
continue
}
}
The client.dispatchGetReportRequest(...) line always fails, and if I turn the try expression into a forced try (try!), the code, unsurprisingly, crashes.
The line raises a CoreHID.HIDDeviceError.unknown() error with a seemingly meaningless IOReturn code (last time I tried, I got an IOReturn value of -536870211).
My first instinct was to blame my own custom USB device for not working properly, but the code doesn't cooperate with ANY USB HID device currently connected: not a keyboard (with permissions granted), not a controller, nothing.
I did make sure to enable USB device access in the entitlements (when I tried to run this code in a simple Cocoa app) as well.
...What am I doing wrong here? What does the IOReturn code mean?
Thanks in advance for anybody willing to help out!
Topic:
App & System Services
SubTopic:
Hardware
Battery health dropped from 98 to 89 within 2 months on my iPhone 15 Pro, and the cycle count is just 314.
Is a software update causing this?
I have an iPhone XS Max and the battery health has degraded to 69.
I noticed that whenever I put it on charge it just restarts, and it keeps doing that until I start using it or keep the screen on while it charges.
Please, is it my charger, or is it because the battery health has degraded to 69?
Topic:
App & System Services
SubTopic:
Hardware
The battery level value is inaccurate: the API returns the battery percentage only in multiples of 5. For example, the actual percentage is 42 but the API returns 40. Please fix this issue if possible; devices running iOS versions below 17 appear to work fine.
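For context, this is presumably the reading in question; a minimal sketch assuming the value comes from UIDevice.batteryLevel with battery monitoring enabled:
import UIKit

UIDevice.current.isBatteryMonitoringEnabled = true
let percent = Int((UIDevice.current.batteryLevel * 100).rounded())
// Per the behaviour described above, this prints 40 even when the actual level is 42.
print("Reported battery level: \(percent)%")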
I'm having an issue with OneDrive that is affecting our company iPads. Users were able to drag and drop any folder or file, and now they can't. They are on the latest updates for OneDrive and iOS. Can someone look at this? I also reached out to Microsoft, and they said nothing has changed on their end.
When are you guys going to fix the CarPlay issues with this new update? I use this for work and it’s really an issue. Nothing is working and it takes up entirely too much space.
Our company is developing an MFi headset with a button that we would like to use for initiating PTT.
We can detect the button press and initiate PTT successfully, even when the app is not in the foreground, using the ExternalAccessory framework.
But I wonder if this is a coincidence, or a scenario that should reliably work with Push to Talk?
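For context, here is a hedged sketch of the ExternalAccessory side only: opening a session to the headset's protocol and watching the input stream for the button frame. The protocol string "com.example.headset" and the 0x01 frame value are placeholders, and the handoff into Push to Talk (for example PTChannelManager.requestBeginTransmitting(channelUUID:)) is left as a comment because it depends on the app's channel setup.
import ExternalAccessory

final class HeadsetButtonListener: NSObject, StreamDelegate {
    private var session: EASession?

    func start() {
        // Placeholder protocol string; use the accessory's declared MFi protocol.
        guard let accessory = EAAccessoryManager.shared().connectedAccessories
            .first(where: { $0.protocolStrings.contains("com.example.headset") }) else { return }
        session = EASession(accessory: accessory, forProtocol: "com.example.headset")
        session?.inputStream?.delegate = self
        session?.inputStream?.schedule(in: .current, forMode: .default)
        session?.inputStream?.open()
    }

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        guard eventCode == .hasBytesAvailable, let input = aStream as? InputStream else { return }
        var byte: UInt8 = 0
        while input.hasBytesAvailable && input.read(&byte, maxLength: 1) == 1 {
            if byte == 0x01 {   // placeholder "PTT button pressed" frame
                // Begin transmitting on the joined PTT channel here, e.g. via
                // PTChannelManager.requestBeginTransmitting(channelUUID:).
            }
        }
    }
}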
After updating to iOS 18.4, the 3D scanning function, including the AR function in Apple Clips, cannot be used. Does anyone else have the same problem?
Topic:
App & System Services
SubTopic:
Hardware
Every time after I download an app, this window opens and never closes. How do I close it?
Topic:
App & System Services
SubTopic:
Hardware
My audio and MIDI sequencer application consumes about 600% CPU with 10 different instruments during playback, and approximately 100% while idle.
What is the maximum amount of CPU an application can consume? Are there any limits, and can they be modified?
I am asking because when I add more instruments, real-time behaviour degrades at around 700% CPU.
I have the following hardware:
MacBook Pro
14-inch, Nov 2024
Apple M4 Pro
24 GB
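As a hedged aside on reading those figures: Activity Monitor-style CPU percentages are per core, so 600% is roughly six fully busy cores, and as far as I know the practical ceiling scales with the machine's core count rather than a fixed per-app percentage. A quick way to check the available core count:
import Foundation

// Activity Monitor-style CPU figures are per core: 600% ≈ six fully busy cores.
let cores = ProcessInfo.processInfo.activeProcessorCount
print("Active cores: \(cores), theoretical ceiling ≈ \(cores * 100)% CPU")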
Hello,
Since updating to iOS 18.3.1, the rear camera on my iPhone 13 Pro Max has not been functioning properly. The Camera app displays a black screen and becomes unresponsive.
I analyzed the crash logs and found that the issue is related to the cameracaptured process, which handles image and video capture on iOS. Here are the key details from the crash log:
📌 Memory Error: "Address size fault"
📌 Impacted Thread: com.apple.coremedia.capturesession.workerQueue
The "Address size fault" error suggests a memory access issue, likely causing the cameracaptured process to crash. This could be due to a bug in the video capture thread management introduced in the update.
What do you think?
name":"cameracaptured","timestamp":"2025-03-12 10:37:31.00 +0100","app_version":"1.0","slice_uuid":"cc45251e-92fc-329d-a3e9-d1c8c019e59e","build_version":"587.82.13","platform":2,"share_with_app_devs":0,"is_first_party":1,"bug_type":"309","os_version":"iPhone OS 18.3.2 (22D82)","roots_installed":0,"incident_id":"E97F5B3A-345F-42A6-97E8-28D175C8C5A9","name":"cameracaptured"}
{
"uptime" : 820,
"procRole" : "Unspecified",
"version" : 2,
"userID" : 501,
"deployVersion" : 210,
"modelCode" : "iPhone14,3",
"coalitionID" : 75,
"osVersion" : {
"isEmbedded" : true,
"train" : "iPhone OS 18.3.2",
"releaseType" : "User",
"build" : "22D82"
},
"captureTime" : "2025-03-12 10:37:30.1093 +0100",
"codeSigningMonitor" : 2,
"incident" : "E97F5B3A-345F-42A6-97E8-28D175C8C5A9",
"pid" : 68,
"translated" : false,
"cpuType" : "ARM-64",
"roots_installed" : 0,
"bug_type" : "309",
"procLaunch" : "2025-03-12 10:04:03.7137 +0100",
"procStartAbsTime" : 225890551,
"procExitAbsTime" : 19918403953,
"procName" : "cameracaptured",
"procPath" : "/usr/libexec/cameracaptured",
"bundleInfo" : {"CFBundleVersion":"587.82.13","CFBundleShortVersionString":"1.0"},
"parentProc" : "launchd",
"parentPid" : 1,
"coalitionName" : "com.apple.cameracaptured",
"crashReporterKey" : "137125638e43c62173057ae3dc983089b1f083cf",
"appleIntelligenceStatus" : {"state":"unavailable","reasons":["siriAssetIsNotReady","selectedLanguageIneligible","selectedLanguageDoesNotMatchSelectedSiriLanguage","notOptedIn","deviceNotCapable","selectedSiriLanguageIneligible","countryLocationIneligible","unableToFetchAvailability","assetIsNotReady"]},
"wasUnlockedSinceBoot" : 1,
"isLocked" : 0,
"throttleTimeout" : 5,
"codeSigningID" : "com.apple.cameracaptured",
"codeSigningTeamID" : "",
"codeSigningFlags" : 570434305,
"codeSigningValidationCategory" : 1,
"codeSigningTrustLevel" : 7,
"instructionByteStream" : {"beforePC":"BgCA0hUnFpTgAxOqIaSGUiFLu3KJJBaU4AMTqqfYDZTozSGQAFEC+Q==","atPC":"IAAg1KiDW/jJkB+QKd1B+SkBQPk/AQjrAQEAVP17Uqn0T1Gp9ldQqQ=="},
"bootSessionUUID" : "33672FC1-99EC-48FC-8BCD-2B96DF170CC3",
"basebandVersion" : "4.20.03",
"exception" : {"codes":"0x0000000000000001, 0x00000001a93909f0","rawCodes":[1,7134054896],"type":"EXC_BREAKPOINT","signal":"SIGTRAP"},
"termination" : {"flags":0,"code":5,"namespace":"SIGNAL","indicator":"Trace/BPT trap: 5","byProc":"exc handler","byPid":68},
"os_fault" : {"process":"cameracaptured"},
"faultingThread" : 4,
"threads" : [{"id":1699,"threadState":{"x":[{"value":268451845},{"value":21592279046},{"value":8589934592},{"value":28600187224064},{"value":0},{"value":28600187224064},{"value":2},{"value":4294967295},{"value":18446744073709550527},{"value":2},{"value":0},{"value":0},{"value":0},{"value":6659},{"value":0},{"value":0},{"value":18446744073709551569},{"value":6677212688,"symbolLocation":56,"symbol":"clock_gettime"},{"value":0},{"value":4294967295},{"value":2},{"value":28600187224064},{"value":0},{"value":28600187224064},{"value":6126594600},{"value":8589934592},{"value":21592279046},{"value":21592279046},{"value":4412409862}],"flavor":"ARM_THREAD_STATE64","lr":{"value":7911718552},"cpsr":{"value":4096},"fp":{"value":6126594448},"sp":{"value":6126594368},"esr":{"value":1442840704,"description":" Address size fault"},"pc":{"value":7911704456},pc":{"value":7911704456},"far":{"value":0}},"queue":"com.apple.main-thread","frames":[{"imageOffset":6024,"symbol":"mach_msg2_trap","symbolLocation":8,"imageIndex":10},{"imageOffset":20120,"symbol":"mach_msg2_internal","symbolLocation":80,"imageIndex":10},{"imageOffset":19888,"symbol":"mach_msg_overwrite","symbolLocation":424,"imageIndex":10},{"imageOffset":19452,"symbol":"mach_msg","symbolL
Topic:
App & System Services
SubTopic:
Hardware