Post

Replies

Boosts

Views

Created

When observing a notification that may be posted "on a thread other than the one used to register the observer," how should I ensure thread-safe UI work?
I observe when an AVPlayer finishes playing in order to present an alert at the end time.

NotificationCenter.default.addObserver(
    self,
    selector: #selector(presentAlert),
    name: .AVPlayerItemDidPlayToEndTime,
    object: nil
)

I've had multiple user reports of the alert appearing where it's not intended, such as in the middle of the video after replaying, and on other views. I'm unable to reproduce this myself, but my guess is that it's a threading issue, since the AVPlayerItemDidPlayToEndTime documentation says "the system may post this notification on a thread other than the one used to register the observer." How, then, do I make sure the alert is presented on the main thread? Should I dispatch to the main queue from within my presentAlert function, or add the above observer with addObserver(forName:object:queue:using:) instead, passing in the main operation queue?
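Either route works; a minimal, self-contained sketch of the addObserver(forName:object:queue:using:) option, which guarantees main-queue delivery without any manual dispatch (a stand-in notification name is used here, since AVFoundation isn't needed to demonstrate the threading behavior):

```swift
import Foundation

// Sketch: passing OperationQueue.main means the block always runs on the
// main queue, no matter which thread posts the notification. The name
// passed in is a stand-in for .AVPlayerItemDidPlayToEndTime.
final class PlaybackEndObserver {
    private(set) var finishCount = 0
    private var token: NSObjectProtocol?

    init(center: NotificationCenter = .default, name: Notification.Name) {
        token = center.addObserver(forName: name, object: nil, queue: .main) { [weak self] _ in
            // Already on the main thread; safe to present a UIAlertController here.
            self?.finishCount += 1
        }
    }

    deinit {
        if let token = token { NotificationCenter.default.removeObserver(token) }
    }
}
```

The selector-based variant works too if presentAlert wraps its UI work in DispatchQueue.main.async; the block-based API just moves that responsibility to the registration site. Separately, the spurious mid-video and other-view alerts may also come from observing with object: nil, which matches this notification from every player item in the process; passing the specific AVPlayerItem as the object narrows delivery to just that item.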
1
0
1.3k
Oct ’21
How do you apply a diffable data source UI snapshot only after awaiting (with async/await) data fetched from the network?
I'm new to async/await, and am currently migrating my completion handler code to Swift 5.5's concurrency features. After generating an async alternative in Xcode for my function func fetchMatchRecords(completion: @escaping ([Match]) -> Void), it becomes func fetchMatchRecords() async -> [Match]. I'm not sure how it would be used in the context of UIKit and diffable data sources. In viewDidLoad, previously it would be

MatchHistoryController.shared.fetchMatchRecords { matches in
    DispatchQueue.main.async {
        self.dataSource.apply(self.initialSnapshot(), animatingDifferences: false)
    }
}

But I'm not sure how it would be used now:

Task {
    await MatchHistoryController.shared.fetchMatchRecords()
}
self.dataSource.apply(self.initialSnapshot(), animatingDifferences: false)

How would I make sure that the snapshot is applied only after awaiting a successful fetch result? Here's the definition of initialSnapshot() that I used:

func initialSnapshot() -> NSDiffableDataSourceSnapshot<Section, Match> {
    var snapshot = NSDiffableDataSourceSnapshot<Section, Match>()
    snapshot.appendSections([.main])
    snapshot.appendItems(MatchHistoryController.shared.matches)
    return snapshot
}
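Awaiting inside the Task is what sequences the two steps: everything written after the await runs only once the fetch has returned. A minimal sketch of that ordering, with a stand-in MatchHistoryController simulating the network call (the dataSource.apply line is shown in comments, since it needs a live view controller):

```swift
import Foundation

struct Match: Hashable { let id: Int }

final class MatchHistoryController {
    static let shared = MatchHistoryController()
    private(set) var matches: [Match] = []

    // Stand-in for the Xcode-generated async method.
    func fetchMatchRecords() async -> [Match] {
        try? await Task.sleep(nanoseconds: 50_000_000) // simulated network latency
        matches = [Match(id: 1), Match(id: 2)]
        return matches
    }
}

// In viewDidLoad, the apply(_:animatingDifferences:) call goes *inside*
// the Task, after the await — not after the Task block:
//
// Task {
//     await MatchHistoryController.shared.fetchMatchRecords()
//     dataSource.apply(initialSnapshot(), animatingDifferences: false)
// }
```

Because viewDidLoad is main-actor-isolated, a Task created there inherits that context, so the snapshot should be applied on the main thread without an explicit DispatchQueue.main.async.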
1
0
2.2k
Sep ’21
Migrating from value semantics model stored on iCloud to Core Data model
I have an app that currently depends on fetching the model through CloudKit, and is composed of value types. I'm considering adding Core Data support so that record modifications are robust regardless of network conditions. Core Data resources seem to always assume a model layer with reference semantics, so I'm not sure where to begin. Should I keep my top-level model type a struct? Can I? If I move my model to reference semantics, how might I bridge from past model instances that are fetched through CloudKit and then decoded? Thank you in advance.
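One common pattern is to keep the value-type model as the app-facing layer and treat managed objects purely as a persistence detail, bridging at the boundary. A minimal sketch of that bridging, with MatchEntity as a hypothetical stand-in for an NSManagedObject subclass (all names here are illustrative, not from the app):

```swift
import Foundation

// The existing value-type model, decoded from CloudKit records.
struct Match: Equatable {
    let id: UUID
    var score: Int
}

// Stand-in for an NSManagedObject subclass; in real Core Data its
// properties would be @NSManaged and it would live in a managed object context.
final class MatchEntity {
    var id: UUID
    var score: Int
    init(id: UUID, score: Int) { self.id = id; self.score = score }
}

extension Match {
    // Read side: hydrate a value from persistence.
    init(entity: MatchEntity) {
        self.init(id: entity.id, score: entity.score)
    }
    // Write side: push a (possibly CloudKit-fetched) value into persistence.
    func apply(to entity: MatchEntity) {
        entity.score = score
    }
}
```

With this split, the top-level model can stay a struct; only the persistence layer sees reference semantics, and past CloudKit-decoded instances bridge by being applied onto entities fetched or created by their id.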
0
0
569
Sep ’21
Constrain view's top anchor to just at the edge of sensor housing
What might be a good way to constrain a view's top anchor so that it sits just at the edge of a device's Face ID sensor housing, if it has one? This view is a product photo that would be clipped too much if it ignored the top safe area inset, but positioning it relative to the top safe area margin isn't ideal either, because that leaves a slight gap between the sensor housing and the view (the view is a photo of pants cropped at the waist).
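There's no public API that exposes the sensor-housing frame itself; the closest public signal is the top safe-area inset. One hedged approach, a sketch only: pin the photo to the safe-area top and pull it up by a small tuned constant to close the gap. The housingGap constant is an assumption you'd calibrate per device class, not a system-provided value:

```swift
import UIKit

// Sketch: photoView is the product image; housingGap is a guessed,
// hand-tuned constant closing the gap between the safe-area margin
// and the housing edge.
func pinPhotoNearSensorHousing(_ photoView: UIView, in view: UIView, housingGap: CGFloat = 8) {
    photoView.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        // A negative constant lifts the top edge above the safe-area margin,
        // toward the housing. On devices without a housing the safe area is
        // the status bar, so you may want to skip the adjustment there
        // (e.g. by checking view.safeAreaInsets.top).
        photoView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: -housingGap),
        photoView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
        photoView.trailingAnchor.constraint(equalTo: view.trailingAnchor)
    ])
}
```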
0
0
1.1k
Aug ’21
What is a robust approach to deleting a CKRecord associated with an IndexPath in a table view or collection view?
When synchronizing model objects, local CKRecords, and CKRecords in CloudKit during swipe-to-delete, how can I make this as robust as possible? (Error handling omitted for the sake of the example.)

override func tableView(_ tableView: UITableView, commit editingStyle: UITableViewCell.EditingStyle, forRowAt indexPath: IndexPath) {
    if editingStyle == .delete {
        let record = self.records[indexPath.row]
        privateDatabase.delete(withRecordID: record.recordID) { recordID, error in
            self.records.remove(at: indexPath.row)
        }
    }
}

Since indexPath could change due to other changes in the table view / collection view during the time it takes to delete the record from CloudKit, how could this be improved upon?
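The usual fix is to capture the record's identity at swipe time and resolve the row again only after the server round-trip, since the index path captured in the closure can be stale by then. A minimal sketch with a plain Record type standing in for CKRecord, so the ordering logic is testable without CloudKit:

```swift
import Foundation

// Sketch: Record stands in for CKRecord; recordID stands in for CKRecord.ID.
struct Record: Equatable { let recordID: String }

final class RecordStore {
    var records: [Record]
    init(_ records: [Record]) { self.records = records }

    // Simulates the async CloudKit delete. The completion receives the
    // deleted record's ID — never an index path captured at swipe time.
    func delete(recordID: String, completion: @escaping (String?) -> Void) {
        DispatchQueue.global().async { completion(recordID) }
    }

    // Look up the record's *current* row only after the server confirms.
    func handleDeletion(of recordID: String) -> Int? {
        guard let row = records.firstIndex(where: { $0.recordID == recordID }) else { return nil }
        records.remove(at: row)
        return row  // feed this row to tableView.deleteRows(at:)
    }
}
```

In the real completion handler you'd hop to the main queue first, then call the handleDeletion equivalent and pass the returned row to tableView.deleteRows(at:). With diffable data sources you can sidestep index paths entirely by reapplying a snapshot without the deleted identifier.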
1
0
656
Jul ’21
When making an AVFoundation video copy, how do you only add particular ranges of the original video for which there exist trajectories
I've been looking through Apple's sample code Building a Feature-Rich App for Sports Analysis - https://developer.apple.com/documentation/vision/building_a_feature-rich_app_for_sports_analysis and its associated WWDC video to learn to reason about AVFoundation and VNDetectTrajectoriesRequest - https://developer.apple.com/documentation/vision/vndetecttrajectoriesrequest. My goal is to allow the user to import videos (this part I have working: the user sees a UIDocumentBrowserViewController - https://developer.apple.com/documentation/uikit/uidocumentbrowserviewcontroller, picks a video file, and then a copy is made), but I only want segments of the original video copied where trajectories are detected from a ball moving. I've tried as best I can to grasp the two parts, at the very least finding where the video copy is made and where the trajectory request is made. The full video copy happens in CameraViewController.swift (I'm starting with just imported video for now and not reading live from the device's video camera), line 160:

func startReadingAsset(_ asset: AVAsset) {
    videoRenderView = VideoRenderView(frame: view.bounds)
    setupVideoOutputView(videoRenderView)

    let displayLink = CADisplayLink(target: self, selector: #selector(handleDisplayLink(_:)))
    displayLink.preferredFramesPerSecond = 0
    displayLink.isPaused = true
    displayLink.add(to: RunLoop.current, forMode: .default)

    guard let track = asset.tracks(withMediaType: .video).first else {
        AppError.display(AppError.videoReadingError(reason: "No video tracks found in AVAsset."), inViewController: self)
        return
    }

    let playerItem = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: playerItem)
    let settings = [
        String(kCVPixelBufferPixelFormatTypeKey): kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    ]
    let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
    playerItem.add(output)
    player.actionAtItemEnd = .pause
    player.play()

    self.displayLink = displayLink
    self.playerItemOutput = output
    self.videoRenderView.player = player

    let affineTransform = track.preferredTransform.inverted()
    let angleInDegrees = atan2(affineTransform.b, affineTransform.a) * CGFloat(180) / CGFloat.pi
    var orientation: UInt32 = 1
    switch angleInDegrees {
    case 0:
        orientation = 1 // Recording button is on the right
    case 180, -180:
        orientation = 3 // abs(180) degree rotation recording button is on the right
    case 90:
        orientation = 8 // 90 degree CW rotation recording button is on the top
    case -90:
        orientation = 6 // 90 degree CCW rotation recording button is on the bottom
    default:
        orientation = 1
    }
    videoFileBufferOrientation = CGImagePropertyOrientation(rawValue: orientation)!
    videoFileFrameDuration = track.minFrameDuration
    displayLink.isPaused = false
}

@objc private func handleDisplayLink(_ displayLink: CADisplayLink) {
    guard let output = playerItemOutput else { return }
    videoFileReadingQueue.async {
        let nextTimeStamp = displayLink.timestamp + displayLink.duration
        let itemTime = output.itemTime(forHostTime: nextTimeStamp)
        guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return }
        guard let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) else { return }

        // Create sample buffer from pixel buffer
        var sampleBuffer: CMSampleBuffer?
        var formatDescription: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(allocator: nil, imageBuffer: pixelBuffer, formatDescriptionOut: &formatDescription)
        let duration = self.videoFileFrameDuration
        var timingInfo = CMSampleTimingInfo(duration: duration, presentationTimeStamp: itemTime, decodeTimeStamp: itemTime)
        CMSampleBufferCreateForImageBuffer(allocator: nil, imageBuffer: pixelBuffer, dataReady: true, makeDataReadyCallback: nil, refcon: nil, formatDescription: formatDescription!, sampleTiming: &timingInfo, sampleBufferOut: &sampleBuffer)

        if let sampleBuffer = sampleBuffer {
            self.outputDelegate?.cameraViewController(self, didReceiveBuffer: sampleBuffer, orientation: self.videoFileBufferOrientation)
            DispatchQueue.main.async {
                let stateMachine = self.gameManager.stateMachine
                if stateMachine.currentState is GameManager.SetupCameraState {
                    // Once we received first buffer we are ready to proceed to the next state
                    stateMachine.enter(GameManager.DetectingBoardState.self)
                }
            }
        }
    }
}

Line 139, self.outputDelegate?.cameraViewController(self, didReceiveBuffer: sampleBuffer, orientation: self.videoFileBufferOrientation), is where the video sample buffer is passed to the Vision framework subsystem for analyzing trajectories, the second part. This delegate callback is implemented in GameViewController.swift on line 335:

// Perform the trajectory request in a separate dispatch queue.
trajectoryQueue.async {
    do {
        try visionHandler.perform([self.detectTrajectoryRequest])
        if let results = self.detectTrajectoryRequest.results {
            DispatchQueue.main.async {
                self.processTrajectoryObservations(controller, results)
            }
        }
    } catch {
        AppError.display(error, inViewController: self)
    }
}

Trajectories found are drawn over the video in self.processTrajectoryObservations(controller, results). Where I'm stuck now is modifying this so that instead of drawing the trajectories, the new video only copies the parts of the original video where trajectories were detected in the frame.
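One way to reframe the problem: each trajectory observation carries a time range (timeRange on VNObservation, availability permitting), so rather than drawing in processTrajectoryObservations you can collect those ranges, merge the overlapping ones, and then build the copy by calling AVMutableComposition's insertTimeRange(_:of:at:) once per merged range. The merging step is plain logic; a sketch with seconds as Double standing in for CMTimeRange (the AVFoundation calls are indicated in comments):

```swift
// Stand-in for CMTimeRange; in the real code these come from each
// trajectory observation's timeRange.
struct TimeRange { var start: Double; var end: Double }

// Merge ranges that overlap or nearly touch, so brief detection gaps
// don't fragment the exported video. `tolerance` is a tuning assumption.
func mergeRanges(_ ranges: [TimeRange], tolerance: Double = 0.5) -> [TimeRange] {
    let sorted = ranges.sorted { $0.start < $1.start }
    var merged: [TimeRange] = []
    for range in sorted {
        if var last = merged.last, range.start <= last.end + tolerance {
            last.end = max(last.end, range.end)
            merged[merged.count - 1] = last
        } else {
            merged.append(range)
        }
    }
    return merged
}

// Each merged range then maps to one insertTimeRange(_:of:at:) call on an
// AVMutableComposition track, appending at the composition's running end
// time, before exporting with AVAssetExportSession.
```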
0
0
1.1k
Jan ’21
"The symbol could not be imported. The template version number must be present in the SVG file. Make sure that the version number text has not been converted to outlines."
How do I resolve this issue when trying to re-import a custom SF Symbol into Apple's SF Symbols app? Is there an exact export configuration I'm missing in Sketch or Figma?
Replies
3
Boosts
0
Views
4.9k
Activity
Sep ’21
SwiftUI scroll view page indicator color
In a SwiftUI scroll view with the page style, is it possible to change the page indicator color?
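There's no direct SwiftUI modifier for the indicator color (as of the iOS 15 era); if the paging comes from TabView with the page style, a common workaround is the UIKit appearance proxy, since that style is backed by UIPageControl. A sketch — note the proxy applies to every UIPageControl in the app, not just this view:

```swift
import SwiftUI

struct PagedView: View {
    init() {
        // App-wide appearance proxy; set once (e.g. in the App initializer).
        UIPageControl.appearance().currentPageIndicatorTintColor = .label
        UIPageControl.appearance().pageIndicatorTintColor = .secondaryLabel
    }

    var body: some View {
        TabView {
            Text("Page 1")
            Text("Page 2")
        }
        .tabViewStyle(PageTabViewStyle(indexDisplayMode: .always))
    }
}
```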
Replies
2
Boosts
0
Views
4.9k
Activity
Aug ’21
In UIKit, how do you present a confirmation dialog when the user is swiping to delete a table view cell?
Is there a UIKit equivalent to SwiftUI's confirmationDialog(_:isPresented:titleVisibility:actions:)?
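Not as a single call; the usual UIKit composition is a trailing swipe action whose handler presents a UIAlertController in the .actionSheet style and completes the contextual action based on the user's choice. A sketch:

```swift
import UIKit

// Sketch: a table view controller whose swipe-to-delete asks for
// confirmation before removing the row.
final class ItemsViewController: UITableViewController {
    var items = ["A", "B", "C"]

    override func tableView(_ tableView: UITableView,
                            trailingSwipeActionsConfigurationForRowAt indexPath: IndexPath) -> UISwipeActionsConfiguration? {
        let delete = UIContextualAction(style: .destructive, title: "Delete") { [weak self] _, _, completion in
            guard let self = self else { return completion(false) }
            let alert = UIAlertController(title: "Delete this item?", message: nil, preferredStyle: .actionSheet)
            alert.addAction(UIAlertAction(title: "Delete", style: .destructive) { _ in
                self.items.remove(at: indexPath.row)
                tableView.deleteRows(at: [indexPath], with: .automatic)
                completion(true)    // collapses the swipe UI
            })
            alert.addAction(UIAlertAction(title: "Cancel", style: .cancel) { _ in
                completion(false)   // row snaps back, nothing deleted
            })
            // On iPad, an action sheet needs a popover anchor:
            alert.popoverPresentationController?.sourceView = tableView.cellForRow(at: indexPath)
            self.present(alert, animated: true)
        }
        return UISwipeActionsConfiguration(actions: [delete])
    }
}
```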
Topic: UI Frameworks, SubTopic: UIKit
Replies
3
Boosts
0
Views
1.1k
Activity
Jul ’21
What is the simplest approach to execute a Swift statement only after an asynchronous operation finishes?
For example, Operation A both fetches model data over the network and updates a UICollectionView backed by it. Operation B filters model data. What is a good approach to executing B only after A is finished?
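A completion handler is the simplest pre-async/await tool: B runs inside A's completion, so it cannot start before A finishes. A minimal sketch with plain data standing in for the network fetch and the collection view update:

```swift
import Foundation

// Operation A: fetch model data asynchronously, then hand it back.
func fetchModel(completion: @escaping ([Int]) -> Void) {
    DispatchQueue.global().async {
        let fetched = [3, 1, 2]   // stand-in for the network result
        completion(fetched)       // in UI code, hop to DispatchQueue.main before updating views
    }
}

// Operation B: the filter, as an ordinary function.
func filterModel(_ model: [Int]) -> [Int] {
    model.filter { $0 > 1 }
}
```

At the call site, B is invoked only inside A's completion: fetchModel { model in let filtered = filterModel(model) … }. With Swift 5.5's concurrency, the same sequencing reads linearly: await the async fetch, then call the filter on the next line.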
Replies
2
Boosts
0
Views
611
Activity
Jul ’21
What are best practices for storing API keys / access tokens?
A quick web search shows that storing them in a plist is not recommended. What are the best practices here?
Replies
1
Boosts
1
Views
2.3k
Activity
Jul ’21
Code signing failed on latest non-beta Xcode in macOS Monterey beta 1
"Code signing 'WatchDeuce Extension.appex' failed." "View distribution logs for more information." Does anyone have any suggestions for a solution or workaround? I've filed this as FB9171462 with the logs attached.
Replies
10
Boosts
0
Views
4.0k
Activity
Jun ’21
Picker driven by enum
How do you create a picker where the user's selection corresponds to different values of an enumerated type?
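The standard shape: make the enum CaseIterable (so its cases drive ForEach) and Identifiable (so rows have stable identity), then tag each row with its case. The enum part is plain Swift; the SwiftUI body is sketched in comments so the block stands alone (Flavor is an illustrative name):

```swift
// Conformances that drive the picker: CaseIterable supplies the rows,
// Identifiable gives ForEach stable identity.
enum Flavor: String, CaseIterable, Identifiable {
    case chocolate, vanilla, strawberry
    var id: String { rawValue }
}

// SwiftUI usage (sketch):
//
// @State private var selection: Flavor = .chocolate
//
// Picker("Flavor", selection: $selection) {
//     ForEach(Flavor.allCases) { flavor in
//         Text(flavor.rawValue.capitalized).tag(flavor)
//     }
// }
```

The .tag(flavor) on each row is what binds the row back to the @State selection of the enum type.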
Replies
2
Boosts
1
Views
11k
Activity
Dec ’19