I submitted an iOS app with a watchOS companion app. The app has been "Metadata Rejected". Here is the full message:

Guideline 2.1 - Information Needed

We have started the review of your app, but we are not able to continue because we need access to a video that demonstrates the current version of your app in use on a physical watchOS device. Please only include footage in your demo video of your app running on a physical watchOS device, and not on a simulator. It is acceptable to use a screen recorder to capture footage of your app in use.

Next Steps

To help us proceed with the review of your app, please provide us with a link to a demo video in the App Review Information section of App Store Connect and reply to this message in Resolution Center.

To provide a link to a demo video:
- Log in to App Store Connect
- Click on "My Apps"
- Select your app
- Click on the app version on the left side of the screen
- Scroll down to "App Review Information"
- Provide demo video access details in the "Notes" section
- Once you've completed all changes, click the "Save" button at the top of the Version Information page.

Please note that in the event that your app may only be reviewed by means of a demo video, you will be required to provide an updated demo video with every resubmission.

Since your App Store Connect status is Metadata Rejected, we do NOT require a new binary. To revise the metadata, visit App Store Connect to select your app and revise the desired metadata values. Once you've completed all changes, reply to this message in Resolution Center and we will continue the review.

I have 3 questions:
- Is this a systematic requirement for Watch apps? I did not see it in the guidelines. Or is it for some reason specific to my app or to the reviewer?
- How can I record a video on the Apple Watch? Should I film the watch in operation and post that video? Or is there a direct way to record video from the watch to the iPhone (using system tools, not third party)?
- I understand this is not a video for publication on the App Store, but a video for the reviewer. So should I include the video in the screen captures section, or put it on some web site and give the reviewer a link to it?
In this app I read QR codes.
Reading works perfectly from the camera.
Now I am struggling to get the image that was processed by the built-in QR code reader.
I have found many hints on SO, but cannot make it work.
Here is the code I have now.
It is a bit long, so I had to split it into 2 parts.
I looked at:
// https://stackoverflow.com/questions/56088575/how-to-get-image-of-qr-code-after-scanning-in-swift-ios
// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
import UIKit
import AVFoundation
import AudioToolbox // for AudioServicesPlaySystemSound

class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    fileprivate var captureSession: AVCaptureSession! // used for QR code reading
    fileprivate var previewLayer: AVCaptureVideoPreviewLayer!

    // To get the image of the QR code
    private var photoOutputQR: AVCapturePhotoOutput!
    private var isCapturing = false

    override func viewDidLoad() {
        super.viewDidLoad()

        var accessGranted = false
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        // HERE I TEST FOR ACCESS RIGHTS. WORKS OK.
        // But is .video enough?
        case .authorized:
            accessGranted = true
        default:
            break // access request elided for brevity
        }

        if !accessGranted { return }
        captureSession = AVCaptureSession()

        photoOutputQR = AVCapturePhotoOutput() // IS THIS THE RIGHT PLACE AND THE RIGHT THING TO DO?
        captureSession.addOutput(photoOutputQR) // Goal is to capture an image of the QR code once acquisition is done

        guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
        let videoInput: AVCaptureDeviceInput

        do {
            videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
        } catch { return }

        if captureSession.canAddInput(videoInput) {
            captureSession.addInput(videoInput)
        } else {
            failed()
            return
        }

        let metadataOutput = AVCaptureMetadataOutput()

        if captureSession.canAddOutput(metadataOutput) {
            captureSession.addOutput(metadataOutput) // So I have 2 outputs in captureSession. IS THAT RIGHT?

            metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            metadataOutput.metadataObjectTypes = [.qr] // for QR code video acquisition
        } else {
            failed()
            return
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.layer.bounds
        previewLayer.frame.origin.y += 40
        previewLayer.frame.size.height -= 40
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        if captureSession?.isRunning == false {
            captureSession.startRunning()
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        if captureSession?.isRunning == true {
            captureSession.stopRunning()
        }
    }

    // Minimal stand-ins for helpers not shown in this excerpt:
    func failed() { /* report the error to the user */ }
    func found(code: String) { print("QR code:", code) }

    // MARK: - scan results

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {

        captureSession.stopRunning()

        if let metadataObject = metadataObjects.first {
            guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
            guard let stringValue = readableObject.stringValue else { return }
            AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
            found(code: stringValue)
        }

        // Get the image - IS THIS THE RIGHT PLACE TO DO IT?
        // https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
        print("Do I get here ?", isCapturing)
        let photoSettings = AVCapturePhotoSettings()
        let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first!
        print("previewPixelType", previewPixelType)
        let previewFormat: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                            kCVPixelBufferWidthKey as String: 160,
                                            kCVPixelBufferHeightKey as String: 160]
        photoSettings.previewPhotoFormat = previewFormat
        if !isCapturing {
            isCapturing = true
            photoOutputQR.capturePhoto(with: photoSettings, delegate: self)
        }
        dismiss(animated: true)
    }
}

extension ScannerViewController: AVCapturePhotoCaptureDelegate {

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        isCapturing = false
        print("photo", photo, photo.fileDataRepresentation() as Any)
        guard let imageData = photo.fileDataRepresentation() else {
            print("Error while generating image from photo capture data.")
            return
        }
        _ = imageData // conversion to UIImage elided in this excerpt
    }
}
I get the following output in the console. Clearly the photo is not loaded properly:
Do I get here ? false
previewPixelType 875704422
photo <AVCapturePhoto: 0x281973a20 pts:nan 1/1 settings:uid:3 photo:{0x0} time:nan-nan> nil
Error while generating image from photo capture data.
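While searching, I also saw suggestions to grab the current video frame with an AVCaptureVideoDataOutput instead of triggering a separate photo capture. An untested sketch of that idea (FrameGrabber and the conversion details are my own, not from the linked answers):

import AVFoundation
import CoreImage
import UIKit

// Untested sketch: keep the most recent camera frame so it can be turned
// into a UIImage at the moment a QR code is recognized.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()
    private(set) var lastImage: UIImage?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Converting every frame is wasteful; a real implementation would
        // store the pixel buffer and convert only once a code is found.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        lastImage = UIImage(cgImage: cgImage)
    }
}

The idea would be to add an AVCaptureVideoDataOutput to the same captureSession, set the grabber as its sample buffer delegate, and read lastImage from metadataOutput(_:didOutput:from:). Would that be a better approach than a second AVCapturePhotoOutput?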
The Code Block doesn't number lines anymore.
The workaround is to ask for a Numbered List in addition to the Code Block.
Just applying Code Block:
required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
commonInit()
}
Code Block AND Numbered List (or the other order):
1. required init?(coder aDecoder: NSCoder) {
2. super.init(coder: aDecoder)
3. commonInit()
4. }
Is there another way to get numbering directly?
I am trying to exclude some activities from a UIActivityViewController.
It works as expected when the exclusion uses a predefined activity type, as with:

    UIActivity.ActivityType.message,
    UIActivity.ActivityType.airDrop

but not when the activity type is created with an initializer, as with:

    UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
    UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share"),

So, with the following code:

let excludedActivityTypes = [
    UIActivity.ActivityType.message,
    UIActivity.ActivityType.airDrop,
    UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
    UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share")
]
let activityVC = UIActivityViewController(activityItems: [modifiedPdfURL], applicationActivities: nil)
activityVC.excludedActivityTypes = excludedActivityTypes
message and airDrop do not show, but WhatsApp and IFTTT still show.
Using

activityVC.completionWithItemsHandler = { (activity, success, modifiedItems, error) in
    print("activity: \(activity), success: \(success), items: \(modifiedItems), error: \(error)")
}

I have verified that the WhatsApp and IFTTT services are indeed the ones whose identifiers I listed above.
When selecting WhatsApp, the print above gives:
activity: Optional(__C.UIActivityType(_rawValue: net.whatsapp.WhatsApp.ShareExtension)), success: false, items: nil, error: nil
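For what it's worth, the raw value reported by the handler compares equal to the type I pass in excludedActivityTypes, so the identifier itself seems correct. A minimal check (untested sketch; whatsAppType is my own name):

let whatsAppType = UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension")
activityVC.completionWithItemsHandler = { activity, success, _, _ in
    // UIActivity.ActivityType is RawRepresentable, so == compares raw values.
    if let activity = activity {
        print("matches excluded type:", activity == whatsAppType)
    }
}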
I get this error in Xcode 14 / iOS 16 on device, which I did not get with previous versions.
[general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.objects', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
"'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
)}'. This will be disallowed in the future.
The only places where I reference NSDictionary.self or NSString.self or NSNumber.self for allowed classes are:
@objc class MyClass: NSObject, NSSecureCoding {
    required init(coder decoder: NSCoder) {
        let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self], forKey: myKey) as? [SomeClass] ?? []
    }
}
and in another class
class Util {
    // in a class func:
        let data = try Data(contentsOf: fileURL)
        guard let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSDictionary.self, from: data) else { return nil }
}
I do not understand what NS.objects refers to.
I tried to add NSObject.self in
let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self, NSObject.self], forKey: myKey) as? [SomeClass] ?? []
but that gave even more warnings:
[Foundation] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:]: NSSecureCoding allowed classes list contains [NSObject class], which bypasses security by allowing any Objective-C class to be implicitly decoded. Consider reducing the scope of allowed classes during decoding by listing only the classes you expect to decode, or a more specific base class than NSObject. This will become an error in the future. Allowed class list: {(
"'NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]",
"'NSArray' (0x205da1240) [/System/Library/Frameworks/CoreFoundation.framework]",
"'NSObject' (0x205d8cb98) [/usr/lib]",
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
"'MyClass.self' (0x1002f11a0) [/private/var/containers/Bundle/Application/5517240E-FB23-468D-80FA-B7E37D30936A/MyApp.app]",
"'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
Another warning refers to NS.keys:
2022-09-16 16:19:10.911977+0200 MyApp[4439:1970094] [general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.keys', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]"
)}'. This will be disallowed in the future.
I do not understand what NS.keys refers to.
What additional class types should I add ?
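In case it helps, here is what I am trying now, based on my (unconfirmed) understanding that NS.keys and NS.objects are the internal keys under which NSDictionary and NSArray encode their keys and elements, so the content classes have to be allowed explicitly alongside the containers:

// Sketch: allow the container classes AND the classes of their contents
// (data is the Data read from the archive, as in the Util excerpt above).
let allowed: [AnyClass] = [NSDictionary.self, NSArray.self, NSString.self, NSNumber.self, MyClass.self]
let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClasses: allowed, from: data)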
I'm running a simulation (SwiftUI app) which has 100 steps.
I need each step to be executed in order, one step per second.
A first try was to dispatch each step with a delay:
for step in 0..<100 {
    DispatchQueue.main.asyncAfter(deadline: .now() + Double(step) * 1.0) {
        // simulation code
    }
}
The results were very poor, as 100 blocks scheduled at once are too much load for the system.
So I split the scheduling into 2 stages:
for bigStep in 0..<10 {
    DispatchQueue.main.asyncAfter(deadline: .now() + Double(bigStep) * 10.0) {
        for step in 0..<10 {
            DispatchQueue.main.asyncAfter(deadline: .now() + Double(step) * 1.0) {
                // simulation code
            }
        }
    }
}
It works much better, as now at most 20 blocks are scheduled at any time (in fact I create more levels to limit it to a max of 8).
In addition, this allows interrupting the simulation before the end.
My questions:
- Is this the appropriate pattern?
- Would a timer be better?
- Other options?
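For reference, the Timer variant I have in mind would look something like this (untested sketch):

import Foundation

// Untested sketch: drive the 100 steps from a single repeating Timer
// instead of scheduling all the dispatch blocks in advance.
var step = 0
let timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { timer in
    guard step < 100 else {
        timer.invalidate() // all steps done
        return
    }
    // simulation code for `step` goes here
    step += 1
}

One apparent advantage: calling timer.invalidate() early cancels everything at once, whereas blocks already scheduled with asyncAfter cannot be cancelled (unless each is wrapped in a DispatchWorkItem).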
I notice that several (most) tags are not accessible right now (Nov 2, 19:30 GMT). For instance:
https://developer.apple.com/forums/tags/ios
https://developer.apple.com/forums/tags/uikit
https://developer.apple.com/forums/tags/macos
https://developer.apple.com/forums/tags/appkit
https://developer.apple.com/forums/tags/xcode
https://developer.apple.com/forums/tags/interface-builder
all with the same error: "The page you're looking for can't be found".
But others are accessible:
https://developer.apple.com/forums/
https://developer.apple.com/forums/tags/swift
Is it only me?
The capability to read and write HFS disks on the Mac was removed a long time ago. The capability to simply read them was also removed, since Catalina I think.
That is surprising and sometimes frustrating. I still use a 90's MacBook for a few tasks and need from time to time to transfer files to a newer Mac or read some old files stored on 3.5" disks.
The solution I use is to read the disk on an old Mac running MacOS 10.6 (I'm lucky enough to have kept one) and transfer via USB stick or AirDrop…
As there is of course no USB port on the MacBook (and I no longer have a working 56k modem to transfer by mail), the only option besides the 3.5" disk is to use the MacBook's PCMCIA port to write to an SD card that can be read on macOS Sonoma. But reading 3.5" disks directly would be great.
Hence my questions for the forum:
- How hard would it be to write a driver for READING only HFS on macOS Sonoma?
- There is some software like FuseHFS. Has anyone tried it? Has anyone looked at the source code (said to be open source)?
- Does anyone know why Apple removed this capability (I thought it was a tiny piece of code compared to the GBs of today's macOS)?
Thanks for any insights on the matter.
Not a question, but a surprise.
Did I miss something? Apparently there has been no new beta release (16.3) since the release of 16.2 on Dec 11. Two months without betas is really unusual (in fact, it has never happened; usually beta n+1 is released even before the final release of version n).
So does that mean 16.3 will be a major update? Wait and see.
I get several warnings in the log:
*** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving
safe plist type ''NSNumber' (0x204cdbeb8)
[/System/Library/Frameworks/Foundation.framework]' for key 'NS.objects',
even though it was not explicitly included in the client allowed classes set: '{(
"'NSArray' (0x204cd5598) [/System/Library/Frameworks/CoreFoundation.framework]"
)}'. This will be disallowed in the future.
I am not sure how to understand it. I have removed every NSNumber.self from the allowed-classes lists for decoding, to no avail; I still get the avalanche of warnings.
What is the key NS.objects about?
What may the allowed classes set '{( "'NSArray' … )}' be referring to? An inclusion of NSArray.self in a list for decoding? The type of a property in a class?
That's a question for a Mac app (Cocoa).
I want to change the standard highlighting. I thought to use tableView.selectionHighlightStyle, but there are only 2 values, .none and .regular, and I cannot find how to define a custom one.
So I tried a workaround: set tableView.selectionHighlightStyle to .none and keep track of the previous selection in the delegate:

func tableView(_ tableView: NSTableView, viewFor tableColumn: NSTableColumn?, row: Int) -> NSView? {
    tableView.selectionHighlightStyle = .none
    // ... keep track of previousSelection ...
}

Then, in tableViewSelectionDidChange, reset the previous selection and set a custom color for the new one:

func tableViewSelectionDidChange(_ notification: Notification) {
    if previousSelection >= 0 {
        let cellView = theTableView.rowView(atRow: previousSelection, makeIfNecessary: false)
        cellView?.layer?.backgroundColor = .clear
    }
    let row = theTableView.selectedRow
    let cellView = theTableView.rowView(atRow: row, makeIfNecessary: false)
    cellView?.layer?.backgroundColor = CGColor(red: 0, green: 0, blue: 1, alpha: 0.4)
    previousSelection = row
}

The result is disappointing: even though tableView.selectionHighlightStyle is set to .none, the standard highlight still overlays the cellView's layer.
Is there a way to directly change the color used for selection?
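For reference, the approach I have seen mentioned elsewhere is to subclass NSTableRowView and override drawSelection(in:), rather than touching the layer. A minimal sketch (class name and color are my own choices; selectionHighlightStyle must stay .regular for drawSelection to be called):

import AppKit

// Sketch: draw a custom selection color instead of the standard highlight.
final class CustomRowView: NSTableRowView {
    override func drawSelection(in dirtyRect: NSRect) {
        guard selectionHighlightStyle != .none else { return }
        NSColor(calibratedRed: 0, green: 0, blue: 1, alpha: 0.4).setFill()
        bounds.fill()
    }
}

// In the NSTableViewDelegate, return the custom row view:
func tableView(_ tableView: NSTableView, rowViewForRow row: Int) -> NSTableRowView? {
    return CustomRowView()
}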
Do you use third-party frameworks, like Alamofire?
One of my production Macs is still on Mojave. I want to update to Catalina, not Big Sur (waiting for the dust to settle).
Automatic update from System Preferences only proposes Big Sur, as well as some patch updates for macOS 10.14.6.
What is the safe and sure way to update from Mojave to the latest Catalina?
I have created a new entry in the capability list (Background Modes).
No mode is selected, so it seems to have no effect.
However, I would like to remove this entry to clean up the screen, but I cannot find any "-" button, contextual menu, or XML to edit…
Is it even possible to remove it?
When testing in the simulator with Xcode 13, I noted a subtle difference in the display of a WKInterfaceLabel between Watch Series 6 and Series 7.
The WKInterfaceLabel is in a WKInterfaceGroup.
On Series 7, the text "Fast Driving" is clipped with rounded corners at top and bottom.
On Series 6, the rounded clipping is much smaller (noticeable on the leading F and D).
I could not find which parameter has changed in IB, nor how to change this corner radius value. Nor why such a change?
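The only related knob I know of is the group's corner radius, settable in IB (the Radius attribute) or in code. A sketch of how it could be forced to a fixed value (controller and outlet names are my own):

import WatchKit

class MyInterfaceController: WKInterfaceController {
    // Hypothetical outlet to the WKInterfaceGroup containing the label.
    @IBOutlet weak var group: WKInterfaceGroup!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // setCornerRadius(_:) controls the group's rounded-corner clipping;
        // 0 should remove it entirely, overriding the per-device default.
        group.setCornerRadius(0)
    }
}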