Post

Replies

Boosts

Views

Created

Localization and tap gesture
In this app I have a view with 3 text fields or labels, each with a tap gesture recognizer. They work OK. I have localized the app for 2 more languages, and now, when I switch language (inside the app), I get the following error:

2018-02-16 00:19:34.951810+0100 Autonomie[1635:1007703] [Warning] WARNING: A Gesture recognizer (<UITapGestureRecognizer: 0x1757fd90; state = Possible; view = <UITextView 0x17b86600>; target= <(action=autonomieVersionViewTapped:, target=<Autonomie.StartViewController 0x1757f890>)>>) was setup in a storyboard/xib to be added to more than one view (-><UITextView: 0x17ba8000; frame = (60.5 50; 198 25); text = 'version 0.5 © AlphaNums 2...'; clipsToBounds = YES; hidden = YES; opaque = NO; autoresize = RM+BM; gestureRecognizers = <NSArray: 0x175e25d0>; layer = <CALayer: 0x175da320>; contentOffset: {0, 0}; contentSize: {198, 31}>) at a time, this was never allowed, and is now enforced. Beginning with iOS 9.0 it will be put in the first view it is loaded into.

<UIButtonLabel: 0x176d3da0; frame = (160 18; 0 0); text = 'Calcular el rango …'; opaque = NO; userInteractionEnabled = NO; layer = <_UILabelLayer: 0x176d01c0>>
<UIButtonLabel: 0x1765cc40; frame = (160 18; 0 0); text = 'Calculer autonomie …'; opaque = NO; userInteractionEnabled = NO; layer = <_UILabelLayer: 0x1765cda0>>

And the gestures do not work. If I switch back to the original language, everything is OK.
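A minimal sketch of a possible workaround (the outlet name and the surrounding view controller body are my assumptions, not the original code): create the tap recognizer in code instead of in the storyboard, so each localized view gets its own recognizer instance rather than sharing the single one defined in the xib:

import UIKit

class StartViewController: UIViewController {

    @IBOutlet fileprivate weak var versionTextView: UITextView!   // hypothetical outlet name

    override func viewDidLoad() {
        super.viewDidLoad()
        // A UITapGestureRecognizer can be attached to only one view, so create it here
        // rather than wiring a single storyboard recognizer to several views.
        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(autonomieVersionViewTapped(_:)))
        versionTextView.addGestureRecognizer(tap)
        versionTextView.isUserInteractionEnabled = true
    }

    @objc func autonomieVersionViewTapped(_ sender: UITapGestureRecognizer) {
        // handle the tap
    }
}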
2
0
3.5k
Feb ’18
What is MGIsDeviceOneOfType
Just installed Xcode 10. I receive the following warning:

2018-06-05 13:25:47.686865+0200 simpleTest[14581:1428592] libMobileGestalt MobileGestalt.c:875: MGIsDeviceOneOfType is not supported on this platform.

I cannot find what MGIsDeviceOneOfType is about.
Edited: is it related to Mapping and Geographical Information System (“MGIS”)?
18
0
46k
Jun ’18
NSLayoutConstraint should not be declared weak in Xcode 10ß2 - Swift 4.2
When compiling this macOS app in Xcode 10 beta 2 (Swift 4.2), I get a warning that did not show in Xcode 9.4 each time I create an NSLayoutConstraint programmatically:

Instance will be immediately deallocated because property 'myConstraint' is 'weak'

    fileprivate weak var myConstraint : NSLayoutConstraint!
    myConstraint = NSLayoutConstraint(item: aButton, attribute: .top, relatedBy: .equal, toItem: aView, attribute: .bottom, multiplier: 1.0, constant: 30)

However, there is no such warning for an IBOutlet:

    @IBOutlet fileprivate weak var anotherConstraint : NSLayoutConstraint!

Is it just a new warning for an error that existed before? Or did something change in NSLayoutConstraint? Should I treat IBOutlet and programmatically created constraints differently with respect to weak?
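A minimal sketch of the distinction as I understand it (the view controller and the aButton / aView names are hypothetical): a constraint created in code has no other owner until it is activated, so a weak property lets it be deallocated immediately, whereas an IBOutlet constraint loaded from the nib is retained by the view hierarchy it belongs to:

import AppKit

class MyViewController: NSViewController {   // hypothetical class

    // Strong reference: nothing else retains the constraint before it is activated.
    fileprivate var myConstraint: NSLayoutConstraint!

    // Weak is tolerated here: the nib-loaded constraint is owned by the views it constrains.
    @IBOutlet fileprivate weak var anotherConstraint: NSLayoutConstraint!

    func installConstraint(aButton: NSButton, aView: NSView) {
        myConstraint = NSLayoutConstraint(item: aButton, attribute: .top,
                                          relatedBy: .equal,
                                          toItem: aView, attribute: .bottom,
                                          multiplier: 1.0, constant: 30)
        myConstraint.isActive = true   // once active, the view hierarchy also holds it
    }
}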
2
0
5.4k
Jun ’18
Using and expressing "case let" statements
After several years with Swift, I still find it hard to use if case let or while case let, even worse with the optional pattern if case let x?. So, I would like to find an expression to "speak" case let more naturally. Presently, to be sure of what I do, I have to mentally replace the if case with the full switch statement, with a single case and a default; pretty tedious. I thought of canMatch or canMatchUnwrap, so the following would read:

    if case let x = y {                     // if canMatch x with y
    if case let x? = someOptional {         // if canMatchUnwrap x with someOptional
    while case let next? = node.next {      // while canMatchUnwrap next with node.next

Am I the only one with such a problem? Have you found a better way?
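A minimal sketch of the mental translation (the enum and values are hypothetical, just for illustration), writing an if case let next to the single-case switch it abbreviates:

enum Status {                      // hypothetical enum
    case loading(Int)
    case done
}

let y = Status.loading(42)

// "if canMatch .loading(percent) with y"
if case let .loading(percent) = y {
    print("loading \(percent)%")
}

// ...is shorthand for the single-case switch:
switch y {
case let .loading(percent):
    print("loading \(percent)%")
default:
    break
}

// "if canMatchUnwrap x with someOptional" — the optional pattern x?
let someOptional: Int? = 7
if case let x? = someOptional {
    print(x)                       // x is a non-optional Int here
}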
5
0
7.7k
Nov ’18
Unable to install test app on Apple Watch
I repeat this older thread, as it does not show on the forum with my added post!
https://forums.developer.apple.com/message/298938#298938

I replied to KMT's answer (I read the referenced thread):
---------------
Looks like that triggered the name change dialog. Can you just right click on the watch name without that happening? Try it on the phone, on the left, to see what I'm looking for. Otherwise, this sounds like what others went thru (routine pairing v o o d o o) in this SO thread - seen it? [filter does not like v o o …]
https://stackoverflow.com/questions/30792520/in-xcode-i-see-no-paired-apple-watch-even-though-the-watch-is-paired-and-the-w
-------------------------

So, my question now: I ran into the same problem. I just got the Apple Watch (Series 4, watchOS 5.1.1) and use Xcode 10.1 on macOS 10.13.6. I can build the app + watchOS app for the simulator without problem, but not for the device. The watch is paired with the iPhone.

When I connect the iPhone with USB, I get the message:
The iPhone "iPhone XS" can not be used because it requires iTunes 12.9 or later. Do you want to download the latest version of iTunes now?
I ignore the message (as I think iTunes 12.9 requires Mojave, doesn't it?). Could that be the problem?

In Xcode, I select the scheme "iPhone XS of XXX + Apple Watch of XXX". I run, it compiles OK, but at the end Xcode shows "Finished running myApp on Apple Watch of XXX" in the status bar, plus an alert:
App installation failed. The host is not paired with the device.

I have checked in Xcode Devices:
- on the left, I see iPhone XS of XXX, but not the Watch
- on the right of the Devices window, under PAIRED WATCHES, I see Apple Watch of XXX
I also checked that the Apple Watch is listed with a UDID in the Devices list.

The scheme was in Debug mode, so I changed to Release mode and got a message in the scheme editor that paired devices are not available for debug. I compiled again, but then got the message:
Could not launch 'Autonomie' on iPhone XS XXX. There was an error preparing Apple Watch de XXX for development. Try reattaching the device to which Apple Watch de XXX is paired. The device rejected the pairing attempt.

So I unpaired the device, reattached it, trusted the host, and ran again, just to get the same message once more: The host is not paired with the device. I do not see which devices I should pair, and where.
11
0
17k
Dec ’18
Most wanted Xcode features…
I propose that we collectively build a list of most wanted Xcode features: those pain points that make our life with Xcode more difficult or less fun. The goal would not be a scientifically correct ranking, but simply to:
- make visible many ideas that have probably already been reported in improvement requests
- explain briefly why each would be a great evolution and what pain point it would solve (functional improvements, not bug corrections)
- if possible, discuss the feasibility of each idea.

I would agree to update this original post to include new inputs. To avoid a thread where new posts get narrower and narrower, it would be great to post each new idea as an answer to this original post, and to discuss an idea as a reply to that idea's post. The rule of the game would be tolerance: no arguing indefinitely over one idea, and concise descriptions of the wanted feature. The ultimate goal would be to positively influence the Xcode development team to consider the most wanted proposals. At least, have them explain that a wanted feature is on the way (let's dream), why an idea is not so great, or why it would be too complex to implement…
_________________________________
Here is a first wanted feature, inspired by a thread in the IB forum: https://forums.developer.apple.com/thread/72495
Wanted feature: storyboard zoom when editing a macOS app.
Pain point: navigating a macOS storyboard with more than a few windows is extremely painful, which limits the use of storyboards for macOS apps. The IB post was seen more than 7,500 times, so this seems a widely shared concern.
Feasibility: done for iOS, so it should be possible for macOS.
_________________________________
Wanted feature: automatic, one-button generation of icons, logos, screenshots and launch images for all project-related devices/screens as mandated by App Store Connect, kept current with each new release.
Pain point: the number of individually required image-based assets for metadata submission became a burden long ago and still requires significant labor.
Feasibility: Xcode already does some of this with launch storyboards for new Swift projects; it just needs to be ubiquitous for all similar assets. It's just scripting and image resizing, which should not be a problem on macOS. Yes, I know there are third-party tools that help, but they seem not to keep up with new devices/screen sizes; if they can do it, so can Xcode.
_________________________________
Wanted feature: a more explicit help message when an action cannot be completed in IB.
Pain point: when Xcode cannot complete an action (e.g., create an IBOutlet for an IB object), option-cleaning the build folder often solves the problem. Numerous issues reported on the forum were solved with this simple action, yet messages from the IDE never mention this solution.
Feasibility: just add (when appropriate) a suggestion in the message.
_________________________________
Wanted feature: when a build fails, propose an automatic correction whenever possible.
Pain point: as an example, if some asset contains Finder information, the build fails with the message "resource fork, Finder information, or similar detritus not allowed. Command /usr/bin/codesign failed with exit code 1". One has to go to Terminal, use the xattr command and laboriously clean the offending files.
Feasibility: as Xcode has already detected the files, it could offer to do the cleaning itself; if it can be done with a terminal command, so can Xcode.
_________________________________
Wanted feature: when an API is deprecated, provide information on how to replace it.
Pain point: most often, when an API is deprecated and we get a compiler warning, we get no hint on how to replace it. We then have to post on the forum (a lot of posts relate to this search for information) or search the web for hints, losing time and never being sure we made the best decision.
Feasibility: those who deprecated the API surely know how to handle this. It should be (relatively) easy to provide the information in the documentation and in the deprecation message.
_________________________________
Wanted feature: when a string is missing for an Info.plist key, give a warning.
Pain point: if a required key is missing, the app will crash (or be rejected at App Store verification, which is the lesser harm). I had a case where I called an image picker intending to use only photos, not video, but if the user tapped on video, the app crashed.
Feasibility: when UIImagePickerControllerDelegate is used, the compiler could check that Info.plist defines the required keys.
_________________________________
All comments welcomed.
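Not the compile-time check wished for above, but a minimal runtime sketch of the same idea (the helper function and the chosen key names are my assumptions, to be adjusted to the actual capture use case): verify the usage-description keys before presenting the picker, and restrict it to photos so the crashing video path cannot be reached:

import UIKit

// Hypothetical helper: guard the presentation of UIImagePickerController at runtime.
func presentPhotoPicker(from viewController: UIViewController) {
    // Keys assumed to be needed for camera capture; adapt to the features actually used.
    let requiredKeys = ["NSCameraUsageDescription", "NSMicrophoneUsageDescription"]
    let missing = requiredKeys.filter { Bundle.main.object(forInfoDictionaryKey: $0) == nil }
    guard missing.isEmpty else {
        print("Missing Info.plist keys:", missing)
        return
    }

    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.mediaTypes = ["public.image"]   // photos only, no video
    viewController.present(picker, animated: true)
}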
14
1
3.2k
Dec ’19
Video required for WatchOS app ?
I submitted an iOS app with a watchOS companion app. The app has been 'Metadata Rejected'. Here is the full message:

Guideline 2.1 - Information Needed
We have started the review of your app, but we are not able to continue because we need access to a video that demonstrates the current version of your app in use on a physical watchOS device. Please only include footage in your demo video of your app running on a physical watchOS device, and not on a simulator. It is acceptable to use a screen recorder to capture footage of your app in use.
Next Steps
To help us proceed with the review of your app, please provide us with a link to a demo video in the App Review Information section of App Store Connect and reply to this message in Resolution Center.
To provide a link to a demo video:
- Log in to App Store Connect
- Click on "My Apps"
- Select your app
- Click on the app version on the left side of the screen
- Scroll down to "App Review Information"
- Provide demo video access details in the "Notes" section
- Once you've completed all changes, click the "Save" button at the top of the Version Information page.
Please note that in the event that your app may only be reviewed by means of a demo video, you will be required to provide an updated demo video with every resubmission. Since your App Store Connect status is Metadata Rejected, we do NOT require a new binary. To revise the metadata, visit App Store Connect to select your app and revise the desired metadata values. Once you've completed all changes, reply to this message in Resolution Center and we will continue the review.

I have 3 questions:
- Is this a systematic requirement for Watch apps? I did not see it in the guidelines. Or is it for some reason specific to my app or to the reviewer?
- How can I record video on the Apple Watch? Should I film the watch while it is in operation and post that video, or is there a direct way to record the video from the watch to the iPhone (using system tools, not third party)?
- I understand it is not a video for publication on the App Store, but a video for the reviewer. So should I include the video in the screenshots section, or put it on some website and give the reviewer a link to it?
7
0
3.7k
Mar ’20
Is the new forum better ?
The new forum has been here for a week now. I waited to get accustomed to it so that my comments are not just due to the change factor. I think it is now possible to make some assessment, and I would appreciate others' opinions to see if it is just me…

There are some good points:
- the editor is much less whimsical
- the possibility to tag a topic is a good thing
- search has improved (a little).

But I find some points really frustrating:
- The editor is better, but has a live Preview. What a strange concept in the world of the inventor of WYSIWYG! Did we return to the 90's editors with their control characters?
- Tags are good, but IMHO it would be useful to have major and minor tags. That would help focus when we list threads by tag, by making it possible to select only the major one.
- All posts are now reviewed to avoid spam. Fair enough. But why can it take hours (in one case, already 5 hours and still in review) for a very simple answer post (no URL, no bizarre word) to be reviewed? A bit frustrating, and it does not allow for any fast-moving discussion.
- Many features seem to be inspired by SO, but without some critical capabilities, such as the inclusion of images.
- The concept of upvotes and downvotes has been introduced as well. Was it really needed? Won't it lead to some of the regrettable side effects we see on SO, where some do not dare propose an answer or even ask a question for fear of a negative vote? That's particularly the case for anyone not very fluent in English who may not express his or her opinion properly. I found the previous developer forum a more welcoming place than SO; I hope that will not change.

There are also capabilities missing from the previous forum (unless they are so well hidden I could not find them):
- because of increased spacing, there are now just 15 instead of 25 posts per page, and you have to scroll a lot more to get to the end of the shorter list. That does slow down screening.
- how can one see the threads one contributed to? It seems we can only filter those we authored.
- it now seems quasi impossible to edit a post, and of course to delete it.

So in the end, instead of the wow effect I was expecting, I have just a mixed feeling.
16
0
3.0k
Jun ’20
Height of safe area margin in iPhone 12 Pro Max
I have an image that should fill the entire screen, even behind the notch. So I set the top margin to the safe area to -44. That works OK on all iPhones… except on the iPhone 12 Pro Max simulator. There I get a few pixels uncovered at the top; I had to change the value to -48 to get everything OK. So the question is: has the "notch area" which defines the safe area increased by 4 pixels on the iPhone 12 Pro Max? If so, is it a hardware change?
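A minimal sketch of how I could avoid hardcoding -44 or -48 (the view controller and constraint outlet names are hypothetical): read the device's actual top inset at runtime, since it differs between models:

import UIKit

class FullScreenImageViewController: UIViewController {   // hypothetical class

    // Hypothetical outlet: the image view's top constraint, pinned to the safe area top.
    @IBOutlet weak var imageTopConstraint: NSLayoutConstraint!

    override func viewSafeAreaInsetsDidChange() {
        super.viewSafeAreaInsetsDidChange()
        // Pull the image up by exactly the device's top inset, whatever the model reports.
        imageTopConstraint.constant = -view.safeAreaInsets.top
    }
}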
1
0
4k
Nov ’20
Update to Catalina, not to Big Sur
One of my production Macs is still on Mojave. I want to update to Catalina and not to Big Sur (waiting for the dust to settle). The automatic update from System Preferences only proposes Big Sur, as well as some patch updates for macOS 10.14.6. What is the safe and sure way to update from Mojave to the latest Catalina?
3
0
6.1k
Nov ’20
Capturing the image read by QRCode
In this app I read QR codes. Reading works perfectly from the camera. Now I am struggling to get the image that was processed by the built-in QR code reader. I have found many hints on SO, but cannot make it work. Here is the code I have now (it is a bit long; I had to split it in 2 parts). I looked at:
// https://stackoverflow.com/questions/56088575/how-to-get-image-of-qr-code-after-scanning-in-swift-ios
// https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput

import UIKit
import AVFoundation

class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    fileprivate var captureSession: AVCaptureSession!      // used for QRCode reading
    fileprivate var previewLayer: AVCaptureVideoPreviewLayer!

    // To get the image of the QRCode
    private var photoOutputQR: AVCapturePhotoOutput!
    private var isCapturing = false

    override func viewDidLoad() {
        super.viewDidLoad()
        var accessGranted = false
        // switch AVCaptureDevice.authorizationStatus(for: .video) {
        //     // HERE TEST FOR ACCESS RIGHT. WORKS OK; but is .video enough ?
        // }

        if !accessGranted { return }
        captureSession = AVCaptureSession()

        photoOutputQR = AVCapturePhotoOutput()      // IS IT THE RIGHT PLACE AND THE RIGHT THING TO DO ?
        captureSession.addOutput(photoOutputQR)     // Goal is to capture an image of the QRCode once acquisition is done

        guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
        let videoInput: AVCaptureDeviceInput

        do {
            videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
        } catch { return }

        if captureSession.canAddInput(videoInput) {
            captureSession.addInput(videoInput)
        } else {
            failed()
            return
        }

        let metadataOutput = AVCaptureMetadataOutput()

        if captureSession.canAddOutput(metadataOutput) {
            captureSession.addOutput(metadataOutput)    // SO I have 2 outputs in captureSession. IS IT RIGHT ?

            metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            metadataOutput.metadataObjectTypes = [.qr]  // For QRCode video acquisition

        } else {
            failed()
            return
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.layer.bounds
        previewLayer.frame.origin.y += 40
        previewLayer.frame.size.height -= 40
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        if captureSession?.isRunning == false {
            captureSession.startRunning()
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        if captureSession?.isRunning == true {
            captureSession.stopRunning()
        }
    }

    // MARK: - scan results

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {

        captureSession.stopRunning()

        if let metadataObject = metadataObjects.first {
            guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
            guard let stringValue = readableObject.stringValue else { return }
            AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
            found(code: stringValue)
        }
        // Get image - IS IT THE RIGHT PLACE TO DO IT ?
        // https://stackoverflow.com/questions/37869963/how-to-use-avcapturephotooutput
        print("Do I get here ?", isCapturing)
        let photoSettings = AVCapturePhotoSettings()
        let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first!
        print("previewPixelType", previewPixelType)
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        photoSettings.previewPhotoFormat = previewFormat
        if !isCapturing {
            isCapturing = true
            photoOutputQR.capturePhoto(with: photoSettings, delegate: self)
        }
        dismiss(animated: true)
    }

}

extension ScannerViewController: AVCapturePhotoCaptureDelegate {

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        isCapturing = false
        print("photo", photo, photo.fileDataRepresentation())
        guard let imageData = photo.fileDataRepresentation() else {
            print("Error while generating image from photo capture data.")
            return
        }
    }
}

I get the following print on the console; clearly the photo is not loaded properly:

Do I get here ? false
previewPixelType 875704422
photo <AVCapturePhoto: 0x281973a20 pts:nan 1/1 settings:uid:3 photo:{0x0} time:nan-nan> nil
Error while generating image from photo capture data.
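Two things I would check, shown as a sketch rather than a confirmed fix: the photo output is added to the session before any input exists and without a canAddOutput check, and capturePhoto is requested after captureSession.stopRunning(). A reordered setup, with the same names as above, could look like this:

// Inside viewDidLoad, after captureSession.addInput(videoInput):
photoOutputQR = AVCapturePhotoOutput()
if captureSession.canAddOutput(photoOutputQR) {
    captureSession.addOutput(photoOutputQR)   // the output now gets a live video connection
} else {
    failed()
    return
}

// In metadataOutput(_:didOutput:from:), request the capture while the session is
// still running, and stop the session in the photo delegate callback instead.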
4
0
6.1k
Dec ’20
Cannot activate Xcode Edit>Substitution menu
When editing a Swift file in Xcode 12.2, I wanted to change the quotes autocompletion. For this I looked at the Edit > Substitutions menu and selected Smart Quotes. From that point on, the whole Substitutions menu was disabled, with no way to enable it again. I opened another file: same issue. I opened an older version of Xcode: the same. I opened Xcode 12.2 on another Mac: same thing. How can I re-enable the menu?

While typing this message, I tried again and the menu is once again activated! I did nothing (at least intentionally). Was it some kind of timeout, or time required to complete a task? But it did not return to normal on the other Mac. And now, 10 minutes later, it is disabled once again… That's the type of surprise effect I really dislike in an app…
1
0
701
Dec ’20
Passing data from an App VC to its widget
Learning Widgets, I am trying a simple pattern: enter text in a textField in the app, and have the Widget updated when the textField changes. The app is UIKit. The Widget has no "Include Configuration Intent". It compiles and works. The Widget is updated every 5 minutes as requested at line 23 of the final snippet. But the message is never updated (line 48 of the final snippet) with globalToPass (as expected from line 24): it always shows "Hello".

What I tried:
- create a singleton to hold the data to pass:

class Util {

    class var shared: Util {
        struct Singleton {
            static let instance = Util()
        }
        return Singleton.instance
    }

    var globalToPass = "Hello"
}

- share the file between the 2 targets, App and WidgetExtension
- in the VC, update the singleton when the textField changes and ask the widget to reload its timelines:

@IBAction func updateMessage(_ sender: UITextField) {

    Util.shared.globalToPass = valueToPassLabel.text ?? "--"
    WidgetCenter.shared.reloadTimelines(ofKind: "WidgetForTest")
    WidgetCenter.shared.reloadAllTimelines()
}

Problem: the Widget never updates its message field. Probably I need to hold the message in a @State var, but I could not get it to work. Here is the full widget code at this time:

import WidgetKit
import SwiftUI

struct LoadStatusProvider: TimelineProvider {

    func placeholder(in context: Context) -> SimpleEntry {

        SimpleEntry(date: Date(), loadEntry: 0, message: Util.shared.globalToPass)
    }

    func getSnapshot(in context: Context, completion: @escaping (SimpleEntry) -> ()) {

        let entry = SimpleEntry(date: Date(), loadEntry: 0, message: Util.shared.globalToPass)
        completion(entry)
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<Entry>) -> ()) {
        var entries: [SimpleEntry] = []
        // Generate a timeline consisting of five entries an hour apart, starting from the current date.
        let currentDate = Date()
        for minuteOffset in 0 ..< 2 {
            let entryDate = Calendar.current.date(byAdding: .minute, value: 5*minuteOffset, to: currentDate)!
            let entry = SimpleEntry(date: entryDate, loadEntry: minuteOffset, message: Util.shared.globalToPass)
            entries.append(entry)
        }
        let timeline = Timeline(entries: entries, policy: .atEnd)
        completion(timeline)
    }
}

struct SimpleEntry: TimelineEntry {
    let date: Date
    let loadEntry: Int
    let message: String
}

struct WidgetForTestNoIntentEntryView: View {
    var entry: LoadStatusProvider.Entry

    var body: some View {
        let formatter = DateFormatter()
        formatter.timeStyle = .medium
        let dateString = formatter.string(from: entry.date)
        return
            VStack {
                Text(String(entry.message))
                HStack {
                    Text("Started")
                    Text(entry.date, style: .time)
                }
                HStack {
                    Text("Now")
                    Text(dateString)
                }
                HStack {
                    Text("Loaded")
                    Text(String(entry.loadEntry))
                }
            }
    }
}

@main
struct WidgetForTestNoIntent: Widget {
    let kind: String = "WidgetForTestNoIntent"

    var body: some WidgetConfiguration {
        StaticConfiguration(kind: kind, provider: LoadStatusProvider()) { entry in
            WidgetForTestNoIntentEntryView(entry: entry)
        }
        .configurationDisplayName("My Widget")
        .description("This is an example widget.")
    }
}

struct WidgetForTestNoIntent_Previews: PreviewProvider {
    static var previews: some View {

        WidgetForTestNoIntentEntryView(entry: SimpleEntry(date: Date(), loadEntry: 0, message: "-"))
            .previewContext(WidgetPreviewContext(family: .systemSmall))
    }
}

I have not defined an extension for IntentHandler.
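A minimal sketch of an alternative channel (the App Group identifier "group.com.example.widgetTest" is hypothetical and would need to be enabled on both targets): the app and the widget extension run as separate processes, so an in-memory singleton is not shared between them; shared UserDefaults is one common way to pass the value. Note also that reloadTimelines(ofKind:) above is called with "WidgetForTest" while the widget's kind string is "WidgetForTestNoIntent".

import WidgetKit
import Foundation

// Shared by both targets; the suite name is a hypothetical App Group identifier.
enum SharedStore {
    static let defaults = UserDefaults(suiteName: "group.com.example.widgetTest")

    static var globalToPass: String {
        get { defaults?.string(forKey: "globalToPass") ?? "Hello" }
        set { defaults?.set(newValue, forKey: "globalToPass") }
    }
}

// In the app's view controller action:
//     SharedStore.globalToPass = valueToPassLabel.text ?? "--"
//     WidgetCenter.shared.reloadTimelines(ofKind: "WidgetForTestNoIntent")
//
// In the provider, read SharedStore.globalToPass instead of Util.shared.globalToPass.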
2
0
2.3k
Dec ’20
List with 2 cells side by side
From the Landmarks sample project, I tried a variation to get two cells side by side in the List. This is the original, with one cell per row as in the tutorial:

struct LandmarkList: View {

    var body: some View {
        List(landmarks) { landmark in
            LandmarkRow(landmark: landmark)
        }
    }
}

Surprisingly, the following variation works well to display 2 cells side by side:

struct LandmarkList: View {

    var body: some View {
        List(0 ..< (landmarks.count+1)/2) { item in
            LandmarkRow(landmark: landmarks[2*item])
            if 2*item + 1 < landmarks.count {
                LandmarkRow(landmark: landmarks[(2*item)+1])
            }
        }
    }
}

I then tried to go on and add NavigationLinks… It works when adding a navigation link to the first cell only:

struct LandmarkList: View {

    var body: some View {
        NavigationView {
            List(0 ..< (landmarks.count+1)/2) { item in
                NavigationLink(destination: LandmarkDetail()) {
                    LandmarkRow(landmark: landmarks[2*item])
                }
                if 2*item + 1 < landmarks.count {
                    LandmarkRow(landmark: landmarks[(2*item)+1])
                }
            }
            .navigationTitle("Landmarks")
        }
    }
}

But adding a NavigationLink to the second cell makes it fail:

struct LandmarkList: View {

    var body: some View {
        NavigationView {
            List(0 ..< (landmarks.count+1)/2) { item in
                NavigationLink(destination: LandmarkDetail()) {
                    LandmarkRow(landmark: landmarks[2*item])
                }
                if 2*item + 1 < landmarks.count {
                    NavigationLink(destination: LandmarkDetail()) {
                        LandmarkRow(landmark: landmarks[(2*item)+1])
                    }
                }
            }
            .navigationTitle("Landmarks")
        }
    }
}

The error is on line 3: The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions. Is this normal List behaviour? Or did I miss something? Which expression could I break?
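A minimal sketch of one way to break up the expression (PairedRow is a hypothetical helper; landmarks, LandmarkRow and LandmarkDetail come from the sample project): move the pair of cells into its own view so the List closure stays small for the type-checker:

// Hypothetical helper view holding the two side-by-side cells for one row index.
struct PairedRow: View {
    let item: Int

    var body: some View {
        NavigationLink(destination: LandmarkDetail()) {
            LandmarkRow(landmark: landmarks[2*item])
        }
        if 2*item + 1 < landmarks.count {
            NavigationLink(destination: LandmarkDetail()) {
                LandmarkRow(landmark: landmarks[(2*item)+1])
            }
        }
    }
}

struct LandmarkList: View {

    var body: some View {
        NavigationView {
            List(0 ..< (landmarks.count+1)/2) { item in
                PairedRow(item: item)   // the heavy expression now lives in its own body
            }
            .navigationTitle("Landmarks")
        }
    }
}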
2
0
730
Dec ’20