Video required for watchOS app?
I submitted an iOS app with a watchOS companion app. The app has been 'Metadata Rejected'. Here is the full message:

Guideline 2.1 - Information Needed

We have started the review of your app, but we are not able to continue because we need access to a video that demonstrates the current version of your app in use on a physical watchOS device. Please only include footage in your demo video of your app running on a physical watchOS device, and not on a simulator. It is acceptable to use a screen recorder to capture footage of your app in use.

Next Steps

To help us proceed with the review of your app, please provide us with a link to a demo video in the App Review Information section of App Store Connect and reply to this message in Resolution Center.

To provide a link to a demo video:
- Log in to App Store Connect
- Click on "My Apps"
- Select your app
- Click on the app version on the left side of the screen
- Scroll down to "App Review Information"
- Provide demo video access details in the "Notes" section
- Once you've completed all changes, click the "Save" button at the top of the Version Information page.

Please note that in the event that your app may only be reviewed by means of a demo video, you will be required to provide an updated demo video with every resubmission. Since your App Store Connect status is Metadata Rejected, we do NOT require a new binary. To revise the metadata, visit App Store Connect to select your app and revise the desired metadata values. Once you've completed all changes, reply to this message in Resolution Center and we will continue the review.

I have 3 questions:
- Is it a systematic requirement for Watch apps? I did not see it in the guidelines. Or is it for some reason specific to my app or to the reviewer?
- How can I record video on the Apple Watch? Should I film the watch while in operation and post this video? Or is there a direct way to record the video from the watch to the iPhone (using system tools, not third party)?
- I understand it is not a video for publication on the App Store, but a video for the reviewer. So should I include the video in the screenshots section, or put it on some web site and give the reviewer a link to it?
7 replies · 0 boosts · 3.7k views · Sep ’21
Capturing the image read by QRCode
In this app I scan QR codes. Reading works perfectly from the camera. Now I am struggling to get the image that was processed by the built-in QR code reader. I have found many hints on SO, but cannot make it work. Here is the code I have now. It is a bit long; I had to split it in 2 parts. I looked at:

    // https://stackoverflow.com/questions/56088575/how-to-get-image-of-qr-code-after-scanning-in-swift-ios
    // https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput

    import UIKit
    import AVFoundation

    class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

        fileprivate var captureSession: AVCaptureSession!        // used for QR code reading
        fileprivate var previewLayer: AVCaptureVideoPreviewLayer!

        // To get the image of the QR code
        private var photoOutputQR: AVCapturePhotoOutput!
        private var isCapturing = false

        override func viewDidLoad() {
            super.viewDidLoad()

            var accessGranted = false
            // switch AVCaptureDevice.authorizationStatus(for: .video) { ... }
            // HERE TEST FOR ACCESS RIGHT. WORKS OK; but is .video enough?
            if !accessGranted { return }

            captureSession = AVCaptureSession()

            photoOutputQR = AVCapturePhotoOutput()   // IS IT THE RIGHT PLACE AND THE RIGHT THING TO DO?
            captureSession.addOutput(photoOutputQR)  // Goal is to capture an image of the QR code once acquisition is done

            guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
            let videoInput: AVCaptureDeviceInput

            do {
                videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
            } catch { return }

            if captureSession.canAddInput(videoInput) {
                captureSession.addInput(videoInput)
            } else {
                failed()
                return
            }

            let metadataOutput = AVCaptureMetadataOutput()

            if captureSession.canAddOutput(metadataOutput) {
                captureSession.addOutput(metadataOutput)  // So I have 2 outputs in captureSession. IS IT RIGHT?
                metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
                metadataOutput.metadataObjectTypes = [.qr]  // For QR code video acquisition
            } else {
                failed()
                return
            }

            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            previewLayer.frame = view.layer.bounds
            previewLayer.frame.origin.y += 40
            previewLayer.frame.size.height -= 40
            previewLayer.videoGravity = .resizeAspectFill
            view.layer.addSublayer(previewLayer)

            captureSession.startRunning()
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            if captureSession?.isRunning == false {
                captureSession.startRunning()
            }
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            if captureSession?.isRunning == true {
                captureSession.stopRunning()
            }
        }

        // MARK: - Scan results

        func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {

            captureSession.stopRunning()

            if let metadataObject = metadataObjects.first {
                guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
                guard let stringValue = readableObject.stringValue else { return }
                AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
                found(code: stringValue)
            }

            // Get image - IS IT THE RIGHT PLACE TO DO IT?
            // https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
            print("Do I get here ?", isCapturing)
            let photoSettings = AVCapturePhotoSettings()
            let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first!
            print("previewPixelType", previewPixelType)
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            photoSettings.previewPhotoFormat = previewFormat
            if !isCapturing {
                isCapturing = true
                photoOutputQR.capturePhoto(with: photoSettings, delegate: self)
            }
            dismiss(animated: true)
        }
    }

    extension ScannerViewController: AVCapturePhotoCaptureDelegate {

        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
            isCapturing = false
            print("photo", photo, photo.fileDataRepresentation())
            guard let imageData = photo.fileDataRepresentation() else {
                print("Error while generating image from photo capture data.")
                return
            }
        }
    }

I get the following print on the console. Clearly the photo is not loaded properly:

    Do I get here ? false
    previewPixelType 875704422
    photo <AVCapturePhoto: 0x281973a20 pts:nan 1/1 settings:uid:3 photo:{0x0} time:nan-nan> nil
    Error while generating image from photo capture data.
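An editorial aside, offered as a hedged guess rather than the thread's resolution: the snippet stops the capture session at the top of the metadata callback and dismisses immediately afterwards, so capturePhoto(with:delegate:) runs against a session that is no longer delivering frames, which would explain the pts:nan / photo:{0x0} result. A minimal sketch of the reordering, keeping the setup above unchanged (found(code:), isCapturing and photoOutputQR are from the code above):

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
        guard !isCapturing else { return }   // metadata callbacks can fire several times per scan
        if let readableObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
           let stringValue = readableObject.stringValue {
            AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
            found(code: stringValue)
        }
        // Keep the session running: the photo output needs live frames to capture from.
        isCapturing = true
        photoOutputQR.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        isCapturing = false
        captureSession.stopRunning()              // stop only once the photo has been delivered
        if let imageData = photo.fileDataRepresentation() {
            let capturedImage = UIImage(data: imageData)
            print("captured image:", capturedImage as Any)
        }
        dismiss(animated: true)                   // dismiss after the capture completes
    }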
4 replies · 0 boosts · 6.1k views · Jul ’22
Line numbering in Code Block
The Code Block doesn't number lines anymore. The workaround is to ask for a Numbered List in addition to the Code Block.

Just applying Code Block:

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        commonInit()
    }

Code Block AND Numbered List (or the other order):

    1. required init?(coder aDecoder: NSCoder) {
    2.     super.init(coder: aDecoder)
    3.     commonInit()
    4. }

Is there another way to get numbering directly?
4 replies · 0 boosts · 922 views · Aug ’21
Excluding activity types for UIActivityViewController: some are still present
I try to exclude some activities from the UIActivityViewController. It works as expected when the exclusion uses one of the predefined activity types, as with:

    UIActivity.ActivityType.message,
    UIActivity.ActivityType.airDrop

but not when the activity type is created with an initializer, as with:

    UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
    UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share")

So, with the following code:

    let excludedActivityTypes = [
        UIActivity.ActivityType.message,
        UIActivity.ActivityType.airDrop,
        UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
        UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share")
    ]
    let activityVC = UIActivityViewController(activityItems: [modifiedPdfURL], applicationActivities: nil)
    activityVC.excludedActivityTypes = excludedActivityTypes

message and airDrop do not show, but WhatsApp and IFTTT still show. I have checked with

    activityVC.completionWithItemsHandler = { (activity, success, modifiedItems, error) in
        print("activity: \(activity), success: \(success), items: \(modifiedItems), error: \(error)")
    }

that the WhatsApp and IFTTT raw values are effectively the ones listed above. When selecting WhatsApp, the print above gives:

    activity: Optional(__C.UIActivityType(_rawValue: net.whatsapp.WhatsApp.ShareExtension)), success: false, items: nil, error: nil
4 replies · 0 boosts · 5.2k views · Jul ’22
NSKeyedUnarchiver validateAllowedClass error
I get this error in Xcode 14 / iOS 16 on device, which I did not get with previous versions:

    [general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.objects', even though it was not explicitly included in the client allowed classes set: '{(
        "'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
        "'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
    )}'. This will be disallowed in the future.

The only places where I reference NSDictionary.self or NSString.self or NSNumber.self for allowed classes are:

    @objc class MyClass: NSObject, NSSecureCoding {
        required init(coder decoder: NSCoder) {
            let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self], forKey: myKey) as? [SomeClass] ?? []
        }

and in another class:

    class Util {
        // in a class func:
        let data = try Data(contentsOf: fileURL)
        guard let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSDictionary.self, from: data) else { return nil }

I do not understand what NS.objects refers to. I tried to add NSObject.self, as in

    let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self, NSObject.self], forKey: myKey) as? [SomeClass] ?? []

but that gave even more warnings:

    [Foundation] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:]: NSSecureCoding allowed classes list contains [NSObject class], which bypasses security by allowing any Objective-C class to be implicitly decoded. Consider reducing the scope of allowed classes during decoding by listing only the classes you expect to decode, or a more specific base class than NSObject. This will become an error in the future. Allowed class list: {(
        "'NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]",
        "'NSArray' (0x205da1240) [/System/Library/Frameworks/CoreFoundation.framework]",
        "'NSObject' (0x205d8cb98) [/usr/lib]",
        "'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
        "'MyClass.self' (0x1002f11a0) [/private/var/containers/Bundle/Application/5517240E-FB23-468D-80FA-B7E37D30936A/MyApp.app]",
        "'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"

Another warning refers to NS.keys:

    2022-09-16 16:19:10.911977+0200 MyApp[4439:1970094] [general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.keys', even though it was not explicitly included in the client allowed classes set: '{(
        "'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]"
    )}'. This will be disallowed in the future.

I do not understand what NS.keys refers to. What additional class types should I add?
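An editorial note, hedged and not from the original thread: NS.keys and NS.objects are the internal keys NSKeyedArchiver itself uses when encoding collections. An NSDictionary archives its keys under NS.keys and its values under NS.objects, and an NSArray archives its elements under NS.objects. The warnings therefore mean the contents of a decoded collection were not in the allowed set, so the fix is to list the element classes alongside the container. A minimal sketch for the Util case above, assuming the archived dictionary holds string keys and string or number values:

    import Foundation

    func readDictionary(from fileURL: URL) throws -> NSDictionary? {
        let data = try Data(contentsOf: fileURL)
        // Allow the container class AND the classes of everything it contains:
        // the dictionary itself, its NSString keys, and its NSString/NSNumber values.
        return try NSKeyedUnarchiver.unarchivedObject(
            ofClasses: [NSDictionary.self, NSString.self, NSNumber.self],
            from: data) as? NSDictionary
    }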
4 replies · 2 boosts · 4.7k views · Sep ’22
Design pattern for scheduling simulation steps
I'm running a simulation (SwiftUI app) which has 100 steps, and I need the steps to be executed in order, one per second. A first try was to dispatch with a delay to schedule each step:

    for step in 0..<100 {
        DispatchQueue.main.asyncAfter(deadline: .now() + Double(step) * 1.0) {
            // simulation code
        }
    }

Very poor results, as 100 scheduled work items are too much load for the system. So I split it into 2 stages:

    for bigStep in 0..<10 {
        DispatchQueue.main.asyncAfter(deadline: .now() + Double(bigStep) * 10.0) {
            for step in 0..<10 {
                DispatchQueue.main.asyncAfter(deadline: .now() + Double(step) * 1.0) {
                    // simulation code
                }
            }
        }
    }

It works much better, as now at most 20 work items are pending (in fact I create more levels to limit it to a max of 8 concurrent ones). In addition, it allows interrupting the simulation before the end.

My questions: is this the appropriate pattern? Would a timer be better? Other options?
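Since the post asks whether a timer would do better: a single repeating Timer schedules no work in advance and can be invalidated at any point, which also covers the interruption requirement. A minimal sketch, hedged — the Simulation wrapper and step(_:) names are illustrative, not from the original post:

    import Foundation

    final class Simulation {
        private var timer: Timer?
        private var currentStep = 0

        func start(totalSteps: Int = 100) {
            timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] timer in
                guard let self = self else { return }
                self.step(self.currentStep)          // simulation code for this step
                self.currentStep += 1
                if self.currentStep == totalSteps {
                    timer.invalidate()               // done: stop firing
                }
            }
        }

        func cancel() {                              // interrupt before the end
            timer?.invalidate()
            timer = nil
        }

        private func step(_ n: Int) {
            // simulation code
        }
    }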
4 replies · 0 boosts · 1.2k views · May ’23
Several tags are inaccessible on the forum
I notice that several (most) tags are not accessible now (Nov 2, 19:30 GMT). For instance:

- https://developer.apple.com/forums/tags/ios
- https://developer.apple.com/forums/tags/uikit
- https://developer.apple.com/forums/tags/macos
- https://developer.apple.com/forums/tags/appkit
- https://developer.apple.com/forums/tags/xcode
- https://developer.apple.com/forums/tags/interface-builder

all fail with the same error: "The page you’re looking for can’t be found". But others work:

- https://developer.apple.com/forums/
- https://developer.apple.com/forums/tags/swift

Is it only me?
4 replies · 0 boosts · 1k views · Nov ’23
Developing a driver to read HFS disks on macOS Sonoma and newer
The capability to read and write HFS disks on the Mac was removed a long time ago. The capability to simply read them was also removed, since Catalina I think. That is surprising and sometimes frustrating.

I still use a 90's MacBook for a few tasks and need from time to time to transfer files to a newer Mac or read some old files stored on 3.5" disks. The solution I use is to read the disk on an old Mac running MacOS 10.6 (I'm lucky enough to have kept one) and transfer via USB stick or AirDrop… As there is of course no USB port on the MacBook (and I no longer have a working 56k modem to transfer by mail), the only option besides 3.5" disks is using the MacBook's PCMCIA port to write to an SD card to be read on the Sonoma Mac. But reading 3.5" disks directly would be great. Hence my questions for the forum:

- How hard would it be to write such a driver for READING only HFS on macOS Sonoma?
- There is some software like FuseHFS. Did anyone try it? Did anyone have a look at the source code (said to be open source)?
- Does anyone know why Apple removed this capability (I thought it was a tiny piece of code compared to the GB of present macOS)?

Thanks for any insights on the matter.
4 replies · 0 boosts · 722 views · Mar ’25
No new Xcode beta since 16.2 release on Dec 11
Not a question, but a surprise. Did I miss something? Apparently there has been no new beta release (16.3) since the release of 16.2 on Dec 11. Two months without betas is really unusual (in fact, it never happened before; usually beta n+1 is released even before the final release of version n). So does that mean 16.3 will be a major update? Wait and see.
4 replies · 4 boosts · 693 views · Feb ’25
How to solve this NSKeyedArchiver warning
I get several warnings in the log:

    *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSNumber' (0x204cdbeb8) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.objects', even though it was not explicitly included in the client allowed classes set: '{(
        "'NSArray' (0x204cd5598) [/System/Library/Frameworks/CoreFoundation.framework]"
    )}'. This will be disallowed in the future.

I am not sure how to understand it: I have removed every NSNumber.self from the allowed lists for decoding, to no avail; I still get the avalanche of warnings.

What is the key NS.objects about? What may the allowed classes set '{( "'NSArray' …' be referring to? An inclusion of NSArray.self in a list for decode? The type of a property in a class?
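A hedged reading of this particular warning, not from the original thread: NS.objects is the internal key under which NSKeyedArchiver stores an NSArray's elements (and an NSDictionary's values), so the message says an archived NSArray containing NSNumber elements was decoded while only NSArray.self was allowed. If so, the fix is the opposite of removing NSNumber.self: allow the element class together with the container. A minimal sketch, with illustrative names:

    import Foundation

    // Hypothetical example: decoding an archive known to contain an NSArray of NSNumber.
    func readNumbers(from data: Data) throws -> [NSNumber] {
        // Allow both the container (NSArray) and its element class (NSNumber).
        let object = try NSKeyedUnarchiver.unarchivedObject(
            ofClasses: [NSArray.self, NSNumber.self],
            from: data)
        return object as? [NSNumber] ?? []
    }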
4 replies · 0 boosts · 259 views · 1w
Setting the highlight colour of a selected cell in NSTableView
That's a question for a Mac app (Cocoa). I want to change the standard highlighting. I thought to use tableView.selectionHighlightStyle, but there are only 2 values, .none and .regular, and I cannot find how to define a custom one. So I tried a workaround.

Set tableView.selectionHighlightStyle to .none and keep track of previousSelection:

    func tableView(_ tableView: NSTableView, viewFor tableColumn: NSTableColumn?, row: Int) -> NSView? {
        tableView.selectionHighlightStyle = .none
        // … keep track of previousSelection …
    }

Then, in tableViewSelectionDidChange, reset the previous selection and set the new selection to a custom color:

    func tableViewSelectionDidChange(_ notification: Notification) {
        if previousSelection >= 0 {
            let cellView = theTableView.rowView(atRow: previousSelection, makeIfNecessary: false)
            cellView?.layer?.backgroundColor = .clear
        }
        let cellView = theTableView.rowView(atRow: row, makeIfNecessary: false)
        cellView?.layer?.backgroundColor = CGColor(red: 0, green: 0, blue: 1, alpha: 0.4)
        previousSelection = row
    }

The result is disappointing: even though tableView.selectionHighlightStyle is set to .none, the standard highlight still overlays the cellView's layer. Is there a way to directly change the color used for selection?
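For reference, a commonly suggested route (hedged; it is not what the post above tried) is to subclass NSTableRowView and override drawSelection(in:), returning the custom row view from the delegate. The table then draws the custom highlight itself, with selectionHighlightStyle left at .regular. A minimal sketch:

    import Cocoa

    final class CustomRowView: NSTableRowView {
        override func drawSelection(in dirtyRect: NSRect) {
            guard selectionHighlightStyle != .none else { return }
            // Custom highlight colour for the selected row.
            NSColor(red: 0, green: 0, blue: 1, alpha: 0.4).setFill()
            bounds.fill()
        }
    }

    // In the table view's delegate:
    func tableView(_ tableView: NSTableView, rowViewForRow row: Int) -> NSTableRowView? {
        return CustomRowView()
    }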
4 replies · 0 boosts · 120 views · 5d
Update to Catalina, not to Big Sur
One of my production Macs is still on Mojave. I want to update to Catalina, not to Big Sur (waiting for the dust to settle). Automatic update from System Preferences only proposes Big Sur, as well as some patch updates for macOS 10.14.6. What is the safe and sure way to update from Mojave to the latest Catalina?
3 replies · 0 boosts · 6.1k views · Mar ’21
Change in Watch 7 in rounded corners of WKInterfaceLabel or WKInterfaceGroup
When testing in the simulator with Xcode 13, I noticed a subtle difference in the display of a WKInterfaceLabel between Watch Series 6 and Series 7. The WKInterfaceLabel is in a WKInterfaceGroup. On Series 7, the text "Fast Driving" is clipped by the rounded corners at top and bottom. On Series 6, the rounded clipping is much smaller (noticeable on the leading F and D). I could not find which parameter changed in IB, nor how to change this corner radius value. Nor why such a change?
3 replies · 0 boosts · 2.0k views · Nov ’21