I have a function that flashes a view a given number of times and executes a closure at the end.
The following code, using UIView.setAnimationRepeatCount, works fine: the animation runs the requested number of times and the afterEnd closure executes at the end.
func flashIt(repeated: Int, cycleTime: Double, delayed: Double = 5.0, afterEnd: (() -> Void)? = nil) {
    if repeated < 0 { return }
    let initAlpha = self.alpha
    UIView.animate(withDuration: cycleTime,
                   delay: delayed,
                   options: [.allowUserInteraction, .curveEaseInOut, .autoreverse, .repeat],
                   animations: {
                       UIView.setAnimationRepeatCount(Float(repeated))
                       self.alpha = 0.1 // Not 0.0, to allow user interaction
                   },
                   completion: { (done: Bool) in
                       self.alpha = initAlpha
                       afterEnd?()
                   })
}
To address the deprecation of UIView.setAnimationRepeatCount, I am now trying this (inspired by https://stackoverflow.com/questions/47496584/uiviewpropertyanimator-reverse-animation):
func flashIt(repeated: Int, cycleTime: Double, delayed: Double = 5.0, afterEnd: (() -> Void)? = nil) {
    if repeated < 0 { return }
    let initAlpha = self.alpha
    let animator = UIViewPropertyAnimator(duration: cycleTime, curve: .linear) {
        self.alpha = 0.1
    }
    animator.addCompletion { _ in
        let reverseAnimator = UIViewPropertyAnimator(duration: cycleTime, curve: .linear) {
            self.alpha = initAlpha
        }
        reverseAnimator.addCompletion { [self] _ in
            flashIt(repeated: repeated - 1, cycleTime: cycleTime, delayed: 0) // without delay here
        }
        reverseAnimator.startAnimation()
        if repeated <= 1 { // 1 and not 0, otherwise an extra loop…
            afterEnd?() // Not called
            return
        }
    }
    animator.startAnimation(afterDelay: delayed)
}
The flash works, but the afterEnd closure is never called.
I have tried calling
if repeated <= 1 { // 1 and not 0, otherwise an extra loop…
    afterEnd?() // Not called
    return
}
in other places, to no avail.
What am I missing?
Is there a better and simpler way to get a repeat count with UIViewPropertyAnimator?
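For reference, one thing visible in the code above is that the recursive call never forwards afterEnd, so by the time the repeated <= 1 check runs, afterEnd is nil at that level. Below is an untested sketch that forwards the closure through the recursion and fires it only when no flashes remain; it assumes flashIt lives in a UIView extension (as the use of self.alpha suggests) and treats repeated == 0 as "no flash, just call afterEnd":
import UIKit

extension UIView {
    func flashIt(repeated: Int, cycleTime: Double, delayed: Double = 5.0, afterEnd: (() -> Void)? = nil) {
        // Stop when no flashes remain and fire the callback exactly once here.
        guard repeated > 0 else { afterEnd?(); return }
        let initAlpha = self.alpha
        let animator = UIViewPropertyAnimator(duration: cycleTime, curve: .linear) {
            self.alpha = 0.1
        }
        animator.addCompletion { _ in
            let reverseAnimator = UIViewPropertyAnimator(duration: cycleTime, curve: .linear) {
                self.alpha = initAlpha
            }
            reverseAnimator.addCompletion { _ in
                // Forward afterEnd so the deepest call can still invoke it.
                self.flashIt(repeated: repeated - 1, cycleTime: cycleTime, delayed: 0, afterEnd: afterEnd)
            }
            reverseAnimator.startAnimation()
        }
        animator.startAnimation(afterDelay: delayed)
    }
}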
I get a surprising hang when adapting constraints.
Xcode 13.2.1
iOS 15.2 on simulator
Here is the setup in the storyboard:
A subclass of UIView (PopoverView) that draws a popover-like frame, with a label and a button inside.
PopoverView can place its arrow on any side: for instance on the right, so that the popover appears to the left of an object, or on the left, so that it appears to the right.
A button tap is used to loop through the arrow positions.
I have defined constraints (and their IBOutlets), notably for the centerX position of popoverView relative to the centerX of the label.
I adapt the value of the constraints in viewWillLayoutSubviews to position the popover correctly relative to the label.
Now, the problem.
To position the popover to the right of the label, I set its centerX constraint as follows in viewWillLayoutSubviews():
popoverCenterToLabelCenterConstraint.constant = (popoverLabel.frame.width + label.frame.width) / 2
That works fine.
To position the popover to the left of the label, I set the constant to the opposite value:
popoverCenterToLabelCenterConstraint.constant = -(popoverLabel.frame.width + label.frame.width) / 2
I also tried:
popoverCenterToLabelCenterConstraint.constant = -popoverLabel.frame.width / 2 - label.frame.width / 2
The app does not crash, but it hangs, apparently in an infinite loop.
The same happens if I divide by 2.0 instead of 2: label.frame.width / 2.0
The width of the label at this point in the code is 92.33333333333333.
So I replaced the computed value with a literal:
popoverCenterToLabelCenterConstraint.constant = -popoverLabel.frame.width / 2 - 46.2
and it works fine. Same with 46.1.
So the problem is not due to the position of popoverView.
If I replace the division by 2 with a division by 2.01:
popoverCenterToLabelCenterConstraint.constant = -popoverLabel.frame.width / 2 - label.frame.width / 2.01
it works!
Conclusion: it is the exact division by 2 that causes the hang.
What could be the reason for this strange behaviour?
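For what it is worth, one way to check whether the hang is a layout feedback loop (viewWillLayoutSubviews changes a constant, which schedules another layout pass, which changes the constant again, and so on) is to update the constant only when it genuinely differs from its current value. A minimal sketch, assuming popoverCenterToLabelCenterConstraint, popoverLabel and label are the outlets mentioned above, and using an arbitrary 0.5 pt tolerance:
override func viewWillLayoutSubviews() {
    super.viewWillLayoutSubviews()

    // Only touch the constant when it actually changes, so repeated
    // layout passes can converge instead of looping forever.
    let newConstant = -(popoverLabel.frame.width + label.frame.width) / 2
    if abs(popoverCenterToLabelCenterConstraint.constant - newConstant) > 0.5 {
        popoverCenterToLabelCenterConstraint.constant = newConstant
    }
}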
This is a follow-up to a previous thread.
https://developer.apple.com/forums/thread/707130
I did some tests on the iOS 16 simulator with Xcode 14 beta.
Recognition is very poor, and the recognition rate for some single letters (an L or an I, for instance) is literally zero. The same code worked better (about 50% success for the same single letters) on iOS 15.2.
Did something change in iOS 16? I filed a bug report: Jun 7, 2022 at 3:28 PM – FB10066541
With this code, the button is at the center of the view and the message is logged only when tapping on the button:
1. struct ContentView: View {
2.     var body: some View {
3.
4.         ZStack {
5.             Button(action: {
6.                 print("Touched")
7.             }) {
8.                 Image(systemName: "square.split.diagonal.2x2")
9.                     .font(.system(size: 24))
10.                     .foregroundColor(.black)
11.                 // .position(x: 12, y: 12)
12.             }
13.
14.         }
15.         .frame(width: 300, height: 300)
16.         .background(Color.yellow)
17.     }
18. }
But if I uncomment line 11, the message is logged on a tap anywhere in the view.
How is it that position() changes the behaviour?
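One explanation that seems consistent with this behaviour: .position wraps its content in a view that accepts the whole proposed size, so when it is applied to the button's label, the label, and therefore the button's hit area, grows to fill the ZStack. A sketch of a variant that keeps the small hit area by positioning the Button itself rather than its label (assuming the goal is to pin the icon near the top-left corner):
import SwiftUI

struct ContentView: View {
    var body: some View {
        ZStack {
            Button(action: {
                print("Touched")
            }) {
                Image(systemName: "square.split.diagonal.2x2")
                    .font(.system(size: 24))
                    .foregroundColor(.black)
            }
            // Positioning the whole Button keeps its hit area the size of the image.
            .position(x: 12, y: 12)
        }
        .frame(width: 300, height: 300)
        .background(Color.yellow)
    }
}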
I have encountered the following problem with a List.
The setup is as follows:
@State private var allItems: [SomeItem]
@State private var selected: SomeItem?
// in the body
List(allItems, selection: $selected) { theItem in
    …
}
where SomeItem is a struct.
When some properties of an item in allItems change, the values I read in theItem are not updated, just as if the old content were cached.
I changed allItems to a computed var and everything works fine.
I read this SO thread https://stackoverflow.com/questions/74083515/swiftui-list-item-not-updated-if-model-is-wrapped-in-state
but it does not give a full explanation.
So my questions:
Is there effectively an issue here?
When is it safe to use a @State var as the content of a List?
Is it OK (it seems so) if the properties of the items do not change?
It also seems OK if allItems itself is modified (appended to, reduced) without changing the properties of its elements.
How should I handle the case where I need both to change allItems (append, for instance) and to change the properties of some items, given that a computed var cannot be modified?
I hope the question is clear.
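For context, here is a minimal, self-contained sketch of the setup described above; SomeItem and its name property are placeholders, the struct is assumed to be Identifiable and Hashable, and the selection is keyed by the element's id (rather than by SomeItem? as in the snippet above):
import SwiftUI

struct SomeItem: Identifiable, Hashable {
    let id = UUID()
    var name: String
}

struct ItemListView: View {
    @State private var allItems: [SomeItem] = [SomeItem(name: "First"), SomeItem(name: "Second")]
    @State private var selected: SomeItem.ID?

    var body: some View {
        List(allItems, selection: $selected) { theItem in
            // After mutating an element of allItems elsewhere, this is the
            // value that appears stale in the scenario described above.
            Text(theItem.name)
        }
    }
}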
I get this error in the console in a SwiftUI Mac app with Xcode 15.0.1 and macOS 14.1.2:
FAULT: <NSRemoteView: 0x11f03bc00 com.apple.TextInputUI.xpc.CursorUIViewService TUICursorUIViewService> determined it was necessary to configure <TUINSWindow: 0x11f1b2390> to support remote view vibrancy
I found reports of similar errors that seem to be harmless messages: https://developer.apple.com/forums/thread/72133
I have no idea what can cause such a message, even less how to correct it.
Here is the setup.
I have a UITableViewController (controller 1), which I reach from another controller (0).
As I need to add a searchView on top of the tableView, I add it as a subview of the superview.
This is done in viewWillAppear:
self.view.superview?.addSubview(searchView) // AboveTableView
searchView.isHidden = true
print("willAppear", self.view.superview, self.tableView.superview)
It works perfectly well; I get:
willAppear Optional(<UIViewControllerWrapperView: 0x103d2eef0; frame = (0 0; 393 852); autoresize = W+H; layer = <CALayer:
From this VC, I segue to another VC (2) with a show segue and come back with an unwind segue to return to (1).
The problem is that now the superviews are nil.
@IBAction func unwindToTableViewController(_ sender: UIStoryboardSegue) {
    print("unwind", self.view.superview, self.tableView.superview)
}
gives
unwind nil nil
If I step back from this controller (1) to the controller (0) from which I segued, and then segue again to the table view controller (1), everything works as expected.
In addition, if I go from (1) to (2) by instantiating (2), everything works fine when I unwind to (1).
I know a solution would be to refactor the code, replacing the UITableViewController with a UIViewController embedding a table view. But I would like to understand why my present design does not work.
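As a way to narrow it down (a debugging sketch, not a fix), it may help to compare what the superviews look like in the unwind action, which fires before the return transition has put controller (1) back on screen, with what they look like once the transition has finished:
@IBAction func unwindToTableViewController(_ sender: UIStoryboardSegue) {
    // Called before the return transition: the view may not be reinstalled yet.
    print("unwind", self.view.superview as Any, self.tableView.superview as Any)
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // By now the container has finished the transition back to (1).
    print("didAppear", self.view.superview as Any, self.tableView.superview as Any)
}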
To change the background of a TextField, I simply apply:
TextField("0.0", value: $p, format: .number)
.textFieldStyle(PlainTextFieldStyle())
.background(.red)
That only works with the plain style, but it works.
Trying something similar on a Button changes the container view's background.
The only solution I've found is to overlay a Rectangle.
Why is that?
A SwiftUI bug?
A current limitation of SwiftUI?
A rationale for it?
Is there a better solution I've not found?
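For comparison, the pattern I have usually seen (a sketch, not necessarily the intended way) is to apply the background to the label inside the Button, together with a plain button style, rather than to the Button itself:
Button(action: { print("Tapped") }) {
    Text("Tap me")
        .padding(8)
        .background(Color.red) // colours only the label area
        .cornerRadius(6)
}
.buttonStyle(PlainButtonStyle()) // plain style, mirroring PlainTextFieldStyle above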
I feel a bit dumb now.
I once succeeded in changing the language of an app on the Watch simulator, so it is possible.
But I am not able to repeat it (foolish me, I did not take note of how I did it).
I just remember it was simply through some language settings selection, maybe after rebooting the Mac, but not by changing anything in code.
Does someone know how to do it?
I have an @objc func used for a notification.
kTag is an Int constant, fieldBeingEdited is an Int variable.
The following code fails to compile with the error Command CompileSwift failed with a nonzero exit code if I capture self (I edited the code down to a minimal case):
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { [self] () -> Void in
        switch fieldBeingEdited {
        case kTag: break
        default: break
        }
    }
}
If I explicitly use self, it compiles, even with self captured:
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { [self] () -> Void in
        switch fieldBeingEdited {   // <<-- no need for self here
        case self.kTag: break       // <<-- self here
        default: break
        }
    }
}
This compiles as well:
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { () -> Void in
        switch self.fieldBeingEdited {   // <<-- explicit self
        case self.kTag: break            // <<-- explicit self
        default: break
        }
    }
}
Is it a compiler bug, or am I missing something?
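In case it is useful, a possible workaround (only a sketch, assuming kTag and fieldBeingEdited are instance members as described above) is to capture the two values explicitly in the capture list, so the switch works on plain local constants and no implicit self lookup is involved:
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { [kTag = self.kTag, fieldBeingEdited = self.fieldBeingEdited] in
        switch fieldBeingEdited {
        case kTag: break     // local constants captured by value, no implicit self
        default: break
        }
    }
}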
I submitted an iOS app with a watchOS companion app. The app has been 'Metadata Rejected'. Here is the full message:

Guideline 2.1 - Information Needed
We have started the review of your app, but we are not able to continue because we need access to a video that demonstrates the current version of your app in use on a physical watchOS device.
Please only include footage in your demo video of your app running on a physical watchOS device, and not on a simulator. It is acceptable to use a screen recorder to capture footage of your app in use.
Next Steps
To help us proceed with the review of your app, please provide us with a link to a demo video in the App Review Information section of App Store Connect and reply to this message in Resolution Center.
To provide a link to a demo video:
- Log in to App Store Connect
- Click on "My Apps"
- Select your app
- Click on the app version on the left side of the screen
- Scroll down to "App Review Information"
- Provide demo video access details in the "Notes" section
- Once you've completed all changes, click the "Save" button at the top of the Version Information page.
Please note that in the event that your app may only be reviewed by means of a demo video, you will be required to provide an updated demo video with every resubmission.
Since your App Store Connect status is Metadata Rejected, we do NOT require a new binary. To revise the metadata, visit App Store Connect to select your app and revise the desired metadata values. Once you've completed all changes, reply to this message in Resolution Center and we will continue the review.

I have 3 questions:
- Is this a systematic requirement for Watch apps? I did not see it in the guidelines. Or is it for some reason specific to my app or to the reviewer?
- How can I record a video on the Apple Watch? Should I film the watch while in operation and post that video? Or is there a direct way to record the video from the watch to the iPhone (using system tools, not a third-party one)?
- I understand it is not a video for publication on the App Store, but a video for the reviewer. So should I include the video in the screen captures section, or put it on some web site and give the reviewer a link to it?
In this app I read QR codes.
Reading works perfectly from the camera.
Now I am struggling to get the image that was processed by the built-in QR code reader.
I have found many hints on SO, but cannot make it work.
Here is the code I have now.
It is a bit long; I had to split it into 2 parts.
I looked at:
// https://stackoverflow.com/questions/56088575/how-to-get-image-of-qr-code-after-scanning-in-swift-ios
// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
import UIKit
import AVFoundation
class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
		fileprivate var captureSession: AVCaptureSession! // use for QRCode reading
		fileprivate var previewLayer: AVCaptureVideoPreviewLayer!
		
		// To get the image of the QRCode
		private var photoOutputQR: AVCapturePhotoOutput!
		private var isCapturing = false
		
		override func viewDidLoad() {
				super.viewDidLoad()
				var accessGranted = false
				// switch AVCaptureDevice.authorizationStatus(for: .video) { … }
				// HERE TEST FOR ACCESS RIGHT. WORKS OK;
				// But is .video enough ?
				
				if !accessGranted {	return }
				captureSession = AVCaptureSession()
				
				photoOutputQR = AVCapturePhotoOutput() // IS IT THE RIGHT PLACE AND THE RIGHT THING TO DO ?
				captureSession.addOutput(photoOutputQR)	 // Goal is to capture an image of QRCode once acquisition is done
				guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
				let videoInput: AVCaptureDeviceInput
				
				do {
						videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
				} catch {	return }
				
				if (captureSession.canAddInput(videoInput)) {
						captureSession.addInput(videoInput)
				} else {
						failed()
						return
				}
				
				let metadataOutput = AVCaptureMetadataOutput()
				
				if (captureSession.canAddOutput(metadataOutput)) {
						captureSession.addOutput(metadataOutput) // SO I have 2 output in captureSession. IS IT RIGHT ?
						
						metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
						metadataOutput.metadataObjectTypes = [.qr]	// For QRCode video acquisition
						
				} else {
						failed()
						return
				}
				
				previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
				previewLayer.frame = view.layer.bounds
				previewLayer.frame.origin.y += 40
				previewLayer.frame.size.height -= 40
				previewLayer.videoGravity = .resizeAspectFill
				view.layer.addSublayer(previewLayer)
				captureSession.startRunning()
		}
		
		override func viewWillAppear(_ animated: Bool) {
				
				super.viewWillAppear(animated)
				if (captureSession?.isRunning == false) {
						captureSession.startRunning()
				}
		}
		
		override func viewWillDisappear(_ animated: Bool) {
				
				super.viewWillDisappear(animated)
				if (captureSession?.isRunning == true) {
						captureSession.stopRunning()
				}
		}
		
		// MARK: - scan Results
		
		func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
				
				captureSession.stopRunning()
				
				if let metadataObject = metadataObjects.first {
						guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
						guard let stringValue = readableObject.stringValue else { return }
						AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
						found(code: stringValue)
				}
				// Get image - IS IT THE RIGHT PLACE TO DO IT ?
				// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
				print("Do I get here ?", isCapturing)
				let photoSettings = AVCapturePhotoSettings()
				let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first!
				print("previewPixelType", previewPixelType)
				let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
														 kCVPixelBufferWidthKey as String: 160,
														 kCVPixelBufferHeightKey as String: 160]
				photoSettings.previewPhotoFormat = previewFormat
				if !isCapturing {
						isCapturing = true
						photoOutputQR.capturePhoto(with: photoSettings, delegate: self)
				}
				dismiss(animated: true)
		}
		
}
extension ScannerViewController: AVCapturePhotoCaptureDelegate {
	
		func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
			
				isCapturing = false
				print("photo", photo, photo.fileDataRepresentation())
				guard let imageData = photo.fileDataRepresentation() else {
						print("Error while generating image from photo capture data.");
						return
				}
		 }
}
I get the following output in the console.
Clearly the photo is not loaded properly:
Do I get here ? false
previewPixelType 875704422
photo <AVCapturePhoto: 0x281973a20 pts:nan 1/1 settings:uid:3 photo:{0x0} time:nan-nan> nil
Error while generating image from photo capture data.
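One detail that might explain the empty photo: metadataOutput(_:didOutput:from:) stops the session on its first line, and capturePhoto(with:delegate:) is only requested afterwards, so there may be no frames left to deliver. Below is a sketch of a possible reordering (everything else unchanged, and with the previewPhotoFormat settings left out for brevity) that requests the photo first and stops the session only once the capture callback has fired:
func metadataOutput(_ output: AVCaptureMetadataOutput,
                    didOutput metadataObjects: [AVMetadataObject],
                    from connection: AVCaptureConnection) {

    // The session keeps running here, so guard against repeated callbacks.
    guard !isCapturing,
          let readableObject = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
          let stringValue = readableObject.stringValue else { return }
    isCapturing = true

    AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
    found(code: stringValue)

    // Ask for the still image while the session is still delivering frames.
    photoOutputQR.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
}

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {

    // Stop and dismiss only once the photo data has been delivered.
    captureSession.stopRunning()
    isCapturing = false

    if let imageData = photo.fileDataRepresentation(), let image = UIImage(data: imageData) {
        print("Captured QR frame:", image.size)
    }
    dismiss(animated: true)
}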
The Code Block doesn't number lines anymore.
The workaround is to ask for a Numbered List in addition to the Code Block.
Just applying Code Block:
required init?(coder aDecoder: NSCoder) {
    super.init(coder: aDecoder)
    commonInit()
}
Code Block AND Numbered List (in either order):
1. required init?(coder aDecoder: NSCoder) {
2.     super.init(coder: aDecoder)
3.     commonInit()
4. }
Is there another way to get the numbering directly?
I am trying to exclude some activities from a UIActivityViewController.
It works as expected when the exclusion uses a predefined activity type, such as:
UIActivity.ActivityType.message,
UIActivity.ActivityType.airDrop
but not when the activity type is created with an initializer, such as:
UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share"),
So, with the following code:
let excludedActivityTypes = [
    UIActivity.ActivityType.message,
    UIActivity.ActivityType.airDrop,
    UIActivity.ActivityType(rawValue: "net.whatsapp.WhatsApp.ShareExtension"),
    UIActivity.ActivityType(rawValue: "com.ifttt.ifttt.share")
]
let activityVC = UIActivityViewController(activityItems: [modifiedPdfURL], applicationActivities: nil)
activityVC.excludedActivityTypes = excludedActivityTypes
message and airDrop do not show, but WhatsApp and IFTTT still do.
I have checked with
activityVC.completionWithItemsHandler = { (activity, success, modifiedItems, error) in
    print("activity: \(activity), success: \(success), items: \(modifiedItems), error: \(error)")
}
that the WhatsApp and IFTTT activity types are indeed the ones listed above.
When selecting WhatsApp, the print above gives:
activity: Optional(__C.UIActivityType(_rawValue: net.whatsapp.WhatsApp.ShareExtension)), success: false, items: nil, error: nil
I get this error with Xcode 14 / iOS 16 on device, which I did not get with previous versions:
[general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.objects', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
"'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
)}'. This will be disallowed in the future.
The only places where I reference NSDictionary.self, NSString.self, or NSNumber.self as allowed classes are:
@objc class MyClass: NSObject, NSSecureCoding {
    required init(coder decoder: NSCoder) {
        let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self], forKey: myKey) as? [SomeClass] ?? []
    }
and in another class
class Util {
    // in a class func:
    let data = try Data(contentsOf: fileURL)
    guard let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSDictionary.self, from: data) else { return nil }
I do not understand what NS.objects refers to.
I tried adding NSObject.self:
let myObject = decoder.decodeObject(of: [MyClass.self, NSNumber.self, NSArray.self, NSDictionary.self, NSString.self, NSObject.self], forKey: myKey) as? [SomeClass] ?? []
but that gave even more warnings:
[Foundation] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:]: NSSecureCoding allowed classes list contains [NSObject class], which bypasses security by allowing any Objective-C class to be implicitly decoded. Consider reducing the scope of allowed classes during decoding by listing only the classes you expect to decode, or a more specific base class than NSObject. This will become an error in the future. Allowed class list: {(
"'NSNumber' (0x205da88f8) [/System/Library/Frameworks/Foundation.framework]",
"'NSArray' (0x205da1240) [/System/Library/Frameworks/CoreFoundation.framework]",
"'NSObject' (0x205d8cb98) [/usr/lib]",
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]",
"'MyClass.self' (0x1002f11a0) [/private/var/containers/Bundle/Application/5517240E-FB23-468D-80FA-B7E37D30936A/MyApp.app]",
"'NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]"
Another warning refers to NS.keys:
2022-09-16 16:19:10.911977+0200 MyApp[4439:1970094] [general] *** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSString' (0x205da8948) [/System/Library/Frameworks/Foundation.framework]' for key 'NS.keys', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x205da1178) [/System/Library/Frameworks/CoreFoundation.framework]"
)}'. This will be disallowed in the future.
I do not understand what NS.keys refers to either.
What additional class types should I add?
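From what I understand (please correct me if this is wrong), NS.keys and NS.objects are the internal keys NSKeyedArchiver uses for the keys and values of an archived NSDictionary (and NS.objects alone for an NSArray), so the warnings are about the element classes of the collections, not about a property named that way. For the Util side, a sketch that lists those element classes explicitly with unarchivedObject(ofClasses:from:), assuming the archive is a dictionary of strings and numbers:
// Inside a throwing class func, as in the original excerpt.
let data = try Data(contentsOf: fileURL)
let unarchived = try NSKeyedUnarchiver.unarchivedObject(
    ofClasses: [NSDictionary.self, NSString.self, NSNumber.self], // the dictionary plus its keys and values
    from: data
) as? NSDictionary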