I want to track the activity of an iPhone user using the Core Motion framework. Please guide me through it.
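For reference, a minimal sketch of the usual starting point (the class name is illustrative; Core Motion requires the NSMotionUsageDescription key in Info.plist):
#import <CoreMotion/CoreMotion.h>

// Sketch: live motion-activity updates via CMMotionActivityManager.
@interface ActivityTracker : NSObject
@property (strong, nonatomic) CMMotionActivityManager *activityManager;
@end

@implementation ActivityTracker

- (void)startTracking {
    if (![CMMotionActivityManager isActivityAvailable]) {
        NSLog(@"Motion activity is not available on this device.");
        return;
    }
    self.activityManager = [[CMMotionActivityManager alloc] init];
    [self.activityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue]
                                          withHandler:^(CMMotionActivity *activity) {
        if (activity.walking)    NSLog(@"User is walking");
        if (activity.running)    NSLog(@"User is running");
        if (activity.automotive) NSLog(@"User is driving");
        if (activity.stationary) NSLog(@"User is stationary");
    }];
}

- (void)stopTracking {
    [self.activityManager stopActivityUpdates];
}

@end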
Hello
I've noticed that this product, heavily promoted on the ASC forums for many years, is no longer available from the Apple App Store.
Can anyone tell me the reason why the product is no longer supported?
Friends have asked me if it is 'safe' to use.
Is it?
Note to moderator: If I'm asking in the wrong places, please redirect my question. Thank you.
Dove in and upgraded two Macs today to beta 1. Unfortunately, it appears L2TP VPN is broken, or something changed in the way it works. I can no longer get a connection to any VPN concentrator I used previously. I tested with the Cisco AnyConnect SSL VPN client and can connect to the same concentrators (as they're configured to accept L2TP or SSL clients).
I also tested from my phone running iOS 16 beta and it still works for the L2TP connections.
On the Mac where L2TP VPN is not working, ppp.log shows this:
Fri Jun 10 19:18:52 2022 : L2TP connecting to server 'IP removed' (IP removed)...
Fri Jun 10 19:18:52 2022 : IPSec connection started
Fri Jun 10 19:18:52 2022 : IPSec phase 1 client started
Fri Jun 10 19:19:02 2022 : IPSec connection failed
When a Mac on 12.4 connects successfully, the log shows:
Fri Jun 10 19:12:33 2022 : L2TP connecting to server 'IP removed' (IP removed)...
Fri Jun 10 19:12:33 2022 : IPSec connection started
Fri Jun 10 19:12:33 2022 : IPSec phase 1 client started
Fri Jun 10 19:12:33 2022 : IPSec phase 1 server replied
Fri Jun 10 19:12:34 2022 : IPSec phase 2 started
Fri Jun 10 19:12:34 2022 : IPSec phase 2 established
Fri Jun 10 19:12:34 2022 : IPSec connection established
(and then a ton more lines of the entire process ending with client getting an IP that I won't bother posting)
VPN wasn't high on my list of apps I was concerned about breaking with the beta. But now that it is broken and I need it for work, I've kind of screwed myself.
Anyway, if anyone knows a way to fix this please let me know.
I am creating a barcode reader using the AVFoundation framework for iOS and iPadOS. The read result goes into payloadStringValue, but I want to check the control characters contained in the symbol, so I am using the raw data in the description property (from NSObjectProtocol, inherited by VNBarcodeObservation). However, I noticed that if the raw data is longer than 26 bytes, some of it is omitted from the description. So my question is: is it possible to configure this so that all of the raw data is written out in the description without any omission? If so, could you please tell me how to set this up? Also, if you know of any other way to extract the raw barcode data, I would appreciate it if you could let me know.
Thank you.
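For what it's worth, a sketch of one alternative, assuming iOS 17 or later where VNBarcodeObservation exposes payloadData: rather than printing the observation's description (NSData's description elides long payloads), the bytes can be dumped manually.
#import <Vision/Vision.h>

// Sketch: dump the raw barcode bytes instead of relying on -description.
VNDetectBarcodesRequest *request = [[VNDetectBarcodesRequest alloc] initWithCompletionHandler:
    ^(VNRequest *req, NSError *error) {
        for (VNBarcodeObservation *observation in req.results) {
            NSLog(@"payloadStringValue: %@", observation.payloadStringValue);
            if (@available(iOS 17.0, *)) {
                NSData *raw = observation.payloadData;
                NSMutableString *hex = [NSMutableString string];
                const uint8_t *bytes = raw.bytes;
                for (NSUInteger i = 0; i < raw.length; i++) {
                    // Every byte is printed, including control characters.
                    [hex appendFormat:@"%02X ", bytes[i]];
                }
                NSLog(@"raw payload (%lu bytes): %@", (unsigned long)raw.length, hex);
            }
        }
    }];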
This is a regression since iOS 13. Is there no-one at Apple interested in fixing this?
FB9856371
Will native UVC support come to the iPhone as well?
Using external cameras with the iPad is greatly beneficial, but for the iPhone it could make it a production powerhouse!
So, have there been discussions about bringing UVC support to the iPhone as well? And if so, what were your conclusions?
Since 17.4 Dev Beta 2, I have been having Bluetooth issues.
I had hoped it would have cleared up but even in 17.4.1 it continues.
AirPods and an Echo Auto are the only two audio devices I have.
The audio will become choppy, rubber-band, or sound robotic, and sometimes completely disconnect.
While driving it will occur on both audio devices.
Sometimes I'm stopped at a red light and the issue occurs.
The phone is less than 3 feet from the device at all times.
I have read forums and removed and re-added the devices, but that did not help.
I really do not want to have to reset my phone since my 2FA apps do not recover in a restore.
Anyone have any suggestions?
I am using NFC. When the phone is near the NFC reader, it sometimes throws the error below:
2024-07-15 15:43:03.608427+0800 TestNFC[16022:1038141] [xpc.exceptions] <NSXPCConnection: 0x282ba90e0> connection to service with pid 58 named com.apple.nfcd.service.corenfc: Exception caught during decoding of received selector didDetectExternalReaderWithNotification:, dropping incoming message.
Exception: Exception while decoding argument 0 (#2 of invocation):
Exception: decodeObjectForKey: class "NFFieldNotification" not loaded or does not exist
My code:
#import <CoreNFC/CoreNFC.h>

@interface ViewController () <NFCTagReaderSessionDelegate>
@property (strong, nonatomic) NFCTagReaderSession *session;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    if (@available(iOS 13.0, *)) {
        // Create the NFC session with self as the NFCTagReaderSessionDelegate.
        if (NFCTagReaderSession.readingAvailable) {
            self.session = [[NFCTagReaderSession alloc]
                initWithPollingOption:NFCPollingISO14443 delegate:self queue:nil];
            // Prompt shown in the system NFC sheet.
            self.session.alertMessage = @"Ready to scan. Hold the card near the phone.";
            // Start NFC scanning.
            [self.session beginSession];
        }
    }
}

#pragma mark - NFCTagReaderSessionDelegate

- (void)tagReaderSessionDidBecomeActive:(NFCTagReaderSession *)session API_AVAILABLE(ios(13.0)) {
    NSLog(@"tagReaderSessionDidBecomeActive");
}

// Invalidation callback. This is also called after a successful read.
- (void)tagReaderSession:(NFCTagReaderSession *)session didInvalidateWithError:(NSError *)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"readerSession:didInvalidateWithError: (%@)", [error localizedDescription]);
}

- (void)tagReaderSession:(NFCTagReaderSession *)session didDetectTags:(NSArray<__kindof id<NFCTag>> *)tags API_AVAILABLE(ios(13.0)) {
}

@end
Is it mandatory to use Bluetooth Classic to connect game controllers that support Apple's MFi and Arcade games, or can such controllers be developed using only Bluetooth Low Energy (BLE)?
On iOS 18.x, when we try to create an EASession we get nil, but on iOS 17.x everything works.
We have an app which uses a USB cable to connect to external accessories.
The scenario: on a fresh install, connecting to the accessory works fine; the EASession is created and the streams are opened.
When we unplug the USB cable, we close the streams, remove any reference to the session and accessory, and remove the accessory delegate.
When we plug it in again, creating the EASession returns nil. Only after restarting the iPhone can we create a new EASession with the appropriate protocol and accessory. Every subsequent attempt without restarting the iPhone fails.
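For context, the connect path is essentially the following (a minimal sketch, not our exact code; the protocol string is a placeholder):
#import <ExternalAccessory/ExternalAccessory.h>

// Sketch: find the accessory that declares our protocol and open an EASession.
- (EASession *)openSessionForProtocol:(NSString *)protocolString {
    for (EAAccessory *accessory in [EAAccessoryManager sharedAccessoryManager].connectedAccessories) {
        if ([accessory.protocolStrings containsObject:protocolString]) {
            EASession *session = [[EASession alloc] initWithAccessory:accessory
                                                          forProtocol:protocolString];
            // On iOS 18.x this returns nil on the second plug-in; on 17.x it works.
            if (session == nil) { return nil; }
            [session.inputStream scheduleInRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
            [session.inputStream open];
            [session.outputStream scheduleInRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
            [session.outputStream open];
            return session;
        }
    }
    return nil;
}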
The logs from the accessory are as follows:
00:05:51.811000 : onUSBDeviceFound(pDevice=0xffc818)) iPhone USB device already in the device list w/id=1 -> update status now
00:05:51.830000 : setConnectionStatus(status=connected) [devId=1] state updated -> forward
Capabilities indicate HostMode possibility => role switch is triggered:
00:05:52.848000 : updateDIPODeviceConnections() iPhoneUSB w/caps=5 (=CarPlay or HostMode), deviceTag=2 in Device mode -> request role switch
The role switch seems to be successful:
00:05:54.914000 : setSwitching('stable') changed
00:05:54.915000 : updateDIPODeviceConnections() iPhoneUSB w/caps=2, id=1, deviceTag=2 and native transport -> request app launch and call connectUSB
00:05:54.967000 : ConnectiAP2(05ac:12a8, s/n='00008101000160921E90801E', writeFD='/dev/ffs/ep3', readFD='/dev/ffs/ep4', hostMode){3}
Native transport should become available but does not (the following line is not present in the failed case; it is taken from a successful case):
00:05:24.983000 : OnDBusPropChanged_NativeTransport(): deviceId=2, started=1, iAP2iOSAppIdentifier=1, sinkEndpoint=3, sourceEndpoint=4, TransactionID=1
The EAP Start event is not received (trace line from a successful try):
00:05:25.057000 : EAPSessionStart(ctx=0x74e0b800){2} called
Is there any breaking change in iOS 18 regarding EASession?
What is also strange is that it works on a fresh install or after restarting the iPhone, but not on the second attempt.
Since the release of iOS 18.1, users of our app, which relies on strobing the device torch, have been experiencing issues.
We have narrowed this down to devices with an adaptive True Tone flash and have submitted a radar: FB15787160.
The issue seems to be caused by ambient light levels. If run in a dark room, the torch strobes exactly as effectively as in previous iOS versions. If run in a bright room, outdoors, or near a window, the strobe will run for ~1 s, then the torch gets stuck on for half a second or so (less frequently it gets stuck off), then it strobes again for ~1 s, and this behaviour repeats indefinitely.
If we go to a darker environment and background and then foreground the app (this is required), the issue is resolved until we move to an area with higher ambient light levels again. We have done a lot of debugging, and also discovered that turning off "Auto-Brightness" under Settings -> Accessibility -> Display & Text Size resolves the issue.
We have also viewed logs from Console.app at the time the issue occurs, and the ambient light readings are quite sporadic: they transition from ~100 lux to ~8,000 lux at the point the issue starts (seemingly because the rear sensor is affected by the torch). With "Auto-Brightness" turned off, these readings stay at lower levels.
This is rendering the primary use case of our app essentially useless; it would be great to get to the bottom of it! We can't even really detect it in-app, as I believe SensorKit is restricted to research applications and requires a review process with Apple before access.
Edit: It's worth noting this is also affecting other apps with strobe functionality in the exact same way
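For anyone trying to reproduce: the strobe boils down to toggling the torch in a loop along these lines (a minimal sketch, not our production code; the timer interval is illustrative):
#import <AVFoundation/AVFoundation.h>

// Sketch: toggle the torch on a timer to produce a strobe.
- (void)strobeTorchWithDevice:(AVCaptureDevice *)device {
    if (![device hasTorch]) { return; }
    [NSTimer scheduledTimerWithTimeInterval:0.05 repeats:YES block:^(NSTimer *timer) {
        NSError *error = nil;
        if ([device lockForConfiguration:&error]) {
            if (device.torchMode == AVCaptureTorchModeOn) {
                [device setTorchMode:AVCaptureTorchModeOff];
            } else {
                // On iOS 18.1, devices with an adaptive True Tone flash appear
                // to hold the torch on here when ambient light readings spike.
                [device setTorchModeOnWithLevel:AVCaptureMaxAvailableTorchLevel error:&error];
            }
            [device unlockForConfiguration];
        }
    }];
}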
I am working on a Bluetooth Low Energy (BLE) project using the nRF52840 Development Kit (DK), which has been reconfigured to simulate an nRF52805 chip. The firmware is based on Nordic Semiconductor's ble_app_hids_keyboard example, with modifications to implement a BLE HID Gamepad. I am using the S113 SoftDevice and have successfully tested the functionality with Android devices. The gamepad is recognized as a HID device, and it works as expected on Android, verified using the hardwareTester website.
However, when I connect the gamepad to an iPhone via BLE, the same hardwareTester website does not respond as it does on Android, indicating that the iPhone does not recognize the device as a gamepad. The BLE connection is established successfully, but it seems iOS does not interpret the HID report descriptor or the BLE HID service correctly. I suspect there might be compatibility issues with the HID descriptor or the GATT attributes for iOS-specific BLE HID requirements.
I would appreciate some help.
I never imagined that an Apple product could do such a thing. I've updated to the latest version, 15.3. What should I do next time? I've had to restart it three times; the last one finally helped.
Here is the link: https://youtu.be/-aqjzVKMZGA
New projects in Xcode do not include an Info.plist. Where do I put the IDs for Supported External Accessory Protocols?
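For reference, the relevant key is UISupportedExternalAccessoryProtocols; in new projects it can be added on the target's Info tab, and Xcode writes it into the generated Info.plist at build time. The raw plist fragment looks like this (the protocol string is a placeholder):
<key>UISupportedExternalAccessoryProtocols</key>
<array>
    <string>com.example.myprotocol</string>
</array>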
I have a C++/Objective-C command-line application, running on macOS 15.1.1 (24B91), that communicates with a Bluetooth LE peripheral. The application is built with Apple Clang 16.0.0 and CMake as the build system, using Boost.Asio.
I'm able to establish an L2CAP channel, and after the channel is established, the peripheral sends a first (quite small) SDU on that channel to the application. The PSM is 0x80 and was chosen by the peripheral's BLE stack. The application receives the PSM via a GATT notification.
I can see the SDU being sent in a single LL PDU with Wireshark. I can also see the SDU being received in Apple's PacketLogger. But I miss the corresponding call to a stream event handler. For all other GATT-related events, the corresponding delegates/callbacks are called.
The code that creates a dispatch queue and passes it to the CBCentralManager looks like this:
dispatch_queue = dispatch_queue_create("de.torrox.ble_event_queue", NULL);
manager = [[CBCentralManager alloc] initWithDelegate:self queue:dispatch_queue options:nil];
When the L2CAP channel is established, the didOpenL2CAPChannel callback gets called from a thread within the dispatch_queue (has been verified with lldb):
- (void)peripheral:(CBPeripheral *)peripheral
    didOpenL2CAPChannel:(CBL2CAPChannel *)channel
                  error:(NSError *)error
{
    [channel inputStream].delegate = self;
    [channel outputStream].delegate = self;
    [[channel inputStream] scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [[channel outputStream] scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    [[channel inputStream] open];
    [[channel outputStream] open];
    ...
    // a reference to the channel is stored in the outside channel object
    [channel retain];
    ...
}
Yet, not a single stream event is generated:
- (void)stream:(NSStream *)stream
    handleEvent:(NSStreamEvent)event_code
{
    Log( @"stream:handleEvent %@, %lu", stream, event_code );
    ...
}
When I add functionality to poll the input stream, the stream reports the expected L2CAP input, but no event is generated.
The main thread of execution is usually blocking on a boost::asio::io_context::run() call. The design is for the stream callback stream:handleEvent to post callback invocations on that io_context, thus waking the main thread and getting the callbacks invoked on the main thread.
All asynchronous GATT delegate calls are working as expected. The only missing events are the events from the L2CAP streams. The same code worked in an older project on an older version of macOS and an older version of Boost.
How can I find out why the stream delegates are not called?
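A note on run loops that may be relevant: -scheduleInRunLoop:forMode: ties stream events to the run loop of a particular thread, and that run loop must actually be run. Dispatch-queue worker threads never run theirs, and a main thread blocked in boost::asio::io_context::run() won't run its run loop either, so events scheduled from within didOpenL2CAPChannel may have no run loop to deliver them. A minimal sketch of hosting the streams on a dedicated run-loop thread (illustrative names, and an assumption rather than a confirmed fix):
// Sketch (assumption, not a verified fix): a thread whose run loop is kept
// running, on which the L2CAP streams can be scheduled and opened.
@interface StreamHostThread : NSThread
@end

@implementation StreamHostThread
- (void)main {
    // Add a port so the run loop has a source and does not exit immediately.
    [[NSRunLoop currentRunLoop] addPort:[NSMachPort port] forMode:NSDefaultRunLoopMode];
    while (!self.isCancelled) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode
                                 beforeDate:[NSDate distantFuture]];
    }
}
@end

// The streams would then be scheduled and opened from this thread, e.g. via
// -performSelector:onThread:withObject:waitUntilDone:, instead of from the
// CoreBluetooth dispatch-queue thread inside didOpenL2CAPChannel.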
I upgraded my iPhone 13 Pro Max last week, but I am unable to connect my Sony WH-1000XM5 headphones via Bluetooth. I'm not sure why everything else connects just fine except my headphones. Please send help.
Six months ago I wrote FB14122473, detailing how the built-in CDC (or FTDI) VCP serial port driver is limited to 3 Mbps or less. Thing is, there are some FTDI devices that can do 12 Mbps (maybe more), and I have devices I need to communicate with at 4 Mbps. I had to use the FTDI SDK to be able to communicate with these.
I was hoping this post might help draw attention to that bug report.
Hello!
In our app we found a crash in QuickLook after users updated their iPhones to iOS 18.
Crash report:
SIGABRT 0x0000000000000000
Crashed: Thread
0 libsystem_kernel.dylib __abort_with_payload + 8
1 libsystem_kernel.dylib abort_with_payload_wrapper_internal + 104
2 libsystem_kernel.dylib abort_with_payload_wrapper_internal + 0
3 libobjc.A.dylib _objc_fatalv(unsigned long long, unsigned long long, char const*, char*) + 116
4 libobjc.A.dylib _objc_fatalv(unsigned long long, unsigned long long, char const*, char*) + 0
5 libobjc.A.dylib weak_register_no_lock + 396
6 libobjc.A.dylib objc_initWeak + 440
7 UIKitCore -[UIViewController viewDidMoveToWindow:shouldAppearOrDisappear:] + 1412
8 UIKitCore -[_UIRemoteViewController viewDidMoveToWindow:shouldAppearOrDisappear:] + 396
9 UIKitCore -[UIView(Internal) _didMoveFromWindow:toWindow:] + 1180
10 UIKitCore -[_UISizeTrackingView _didMoveFromWindow:toWindow:] + 112
11 UIKitCore -[UIView(Internal) _didMoveFromWindow:toWindow:] + 712
12 UIKitCore __45-[UIView(Hierarchy) _postMovedFromSuperview:]_block_invoke + 128
13 CoreAutoLayout -[NSISEngine withBehaviors:performModifications:] + 84
14 UIKitCore -[UIView _postMovedFromSuperview:] + 512
15 UIKitCore __UIViewWasRemovedFromSuperview + 136
16 UIKitCore -[UIView(Hierarchy) removeFromSuperview] + 248
17 QuickLook -[QLToolbarController setAccessoryView:animated:] + 576
18 QuickLook __55-[QLPreviewController _presentLoadedPreviewCollection:]_block_invoke + 120
19 QuickLookUICore QLRunInMainThread + 60
20 QuickLook -[QLPreviewController _presentLoadedPreviewCollection:] + 116
21 QuickLook __48-[QLPreviewController _presentPreviewCollection]_block_invoke_2 + 68
22 QuickLook (Missing)
23 libswift_Concurrency.dylib swift::runJobInEstablishedExecutorContext(swift::Job*) + 252
After research, we found that the crash occurs in QuickLook when a user tries to open a file with the unknown MIME type application/octet-stream.
QuickLook converts such files to a known type (for example, application/pdf), but crashes the first time the file is opened.
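For context, the files are shown through a standard QLPreviewController data source along these lines (a minimal sketch, not our exact code; the file URL is a placeholder for the downloaded application/octet-stream document):
#import <QuickLook/QuickLook.h>

// Sketch: minimal data source for presenting one downloaded file.
@interface PreviewSource : NSObject <QLPreviewControllerDataSource>
@property (strong, nonatomic) NSURL *fileURL;
@end

@implementation PreviewSource

- (NSInteger)numberOfPreviewItemsInPreviewController:(QLPreviewController *)controller {
    return 1;
}

- (id<QLPreviewItem>)previewController:(QLPreviewController *)controller previewItemAtIndex:(NSInteger)index {
    return self.fileURL; // NSURL conforms to QLPreviewItem
}

@end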
Any help would be greatly appreciated!
At present, I am using the AVFoundation external-device API to connect my iPad to a DSLR camera for data collection. On my end, I am using AVCaptureVideoDataOutput to obtain the raw data for processing and rendering. However, the pixel buffer returned from the system layer is incomplete; only a portion cropped from the middle is returned. Using the Mac API, it works normally. I would like to ask how to obtain the complete pixel buffer of the image on iPad.
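For reference, a sketch of the kind of setup involved (iPadOS 17+; whether the preset and format selection explain the cropping is exactly the open question):
#import <AVFoundation/AVFoundation.h>

// Sketch: capture from an external camera with a video data output.
// AVCaptureSessionPresetInputPriority defers resolution to the device's
// activeFormat instead of a preset that may crop or scale.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetInputPriority;

AVCaptureDeviceDiscoverySession *discovery = [AVCaptureDeviceDiscoverySession
    discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeExternal]
                          mediaType:AVMediaTypeVideo
                           position:AVCaptureDevicePositionUnspecified];
AVCaptureDevice *camera = discovery.devices.firstObject;

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output]; // sample buffers arrive via the output's delegate
}
[session startRunning];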