Post · Replies · Boosts · Views · Activity

What is a good way to reset a UIPanGestureRecognizer's view to its original frame once the pan gesture is done?
Say you have a pinch gesture recognizer and a pan gesture recognizer on an image view:

    @IBAction func pinchPiece(_ pinchGestureRecognizer: UIPinchGestureRecognizer) {
        guard pinchGestureRecognizer.state == .began || pinchGestureRecognizer.state == .changed,
              let piece = pinchGestureRecognizer.view else {
            // After pinch releases, zoom back out.
            if pinchGestureRecognizer.state == .ended {
                UIView.animate(withDuration: 0.3, animations: {
                    pinchGestureRecognizer.view?.transform = CGAffineTransform.identity
                })
            }
            return
        }
        adjustAnchor(for: pinchGestureRecognizer)

        let scale = pinchGestureRecognizer.scale
        piece.transform = piece.transform.scaledBy(x: scale, y: scale)
        pinchGestureRecognizer.scale = 1 // Clear scale so that it is the right delta next time.
    }

    @IBAction func panPiece(_ panGestureRecognizer: UIPanGestureRecognizer) {
        guard panGestureRecognizer.state == .began || panGestureRecognizer.state == .changed,
              let piece = panGestureRecognizer.view else { return }

        let translation = panGestureRecognizer.translation(in: piece.superview)
        piece.center = CGPoint(x: piece.center.x + translation.x, y: piece.center.y + translation.y)
        panGestureRecognizer.setTranslation(.zero, in: piece.superview)
    }

    public func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                                  shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        true
    }

The pinch gesture's view resets to its original state after the gesture ends, which happens in its else clause. What would be a good way to do the same for the pan gesture recognizer? Ideally I'd like the gesture recognizers to live in an extension of UIImageView, which also means I can't add a stored property to the extension for tracking the initial state of the image view.
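One direction I've been considering (a sketch, not a confirmed answer): move the view with its transform instead of its center, so undoing the pan is the same transform = .identity reset the pinch handler already performs, and no stored property is needed:

    @IBAction func panPiece(_ panGestureRecognizer: UIPanGestureRecognizer) {
        guard let piece = panGestureRecognizer.view else { return }
        switch panGestureRecognizer.state {
        case .began, .changed:
            let translation = panGestureRecognizer.translation(in: piece.superview)
            // Appending the translation after the current transform keeps the
            // movement in superview coordinates even while the pinch's scale
            // is applied.
            piece.transform = piece.transform.concatenating(
                CGAffineTransform(translationX: translation.x, y: translation.y))
            panGestureRecognizer.setTranslation(.zero, in: piece.superview)
        case .ended, .cancelled:
            // Same reset as the pinch handler: the identity transform is the
            // original frame, so nothing extra has to be remembered.
            UIView.animate(withDuration: 0.3) { piece.transform = .identity }
        default:
            break
        }
    }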
5 · 0 · 1.2k · Dec ’21
General guidelines for improving body pose action classifier performance
I just got an app feature working where the user imports a video file, each frame is fed to a custom action classifier, and only the frames where a certain action is classified are exported. However, testing with a one-hour 4K video at 60 FPS is taking an unreasonably long time: it's been processing for 7 hours now on a MacBook Pro with M1 Max running the Mac Catalyst app. Are there any techniques or general guidance that would help with improving performance? As much as possible I'd like to preserve the input video quality, especially the frame rate. A one-hour video is expected, as it's of a tennis session (which could be anywhere from 10 minutes to a couple of hours). I made the body pose action classifier with Create ML.
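To make the question concrete, here's the kind of downsampling I've been considering (a sketch only; videoURL, the stride of 4, and where the Create ML classifier slots in are all placeholders). The idea is to run the expensive per-frame Vision work on every Nth frame and reuse the result for the neighbors, so the export can still keep all 60 FPS:

    import AVFoundation
    import Vision

    // Sketch: detect poses on a downsampled frame stream instead of all
    // ~216,000 frames of a one-hour 60 FPS video. Placeholder names throughout.
    func framesWithPoses(videoURL: URL, strideN: Int = 4) throws -> [Int] {
        let asset = AVURLAsset(url: videoURL)
        guard let track = asset.tracks(withMediaType: .video).first else { return [] }

        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(
            track: track,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_420YpCbCr8BiPlanarFullRange])
        reader.add(output)
        guard reader.startReading() else { return [] }

        var hits: [Int] = []
        var frameIndex = 0
        while let sample = output.copyNextSampleBuffer() {
            defer { frameIndex += 1 }
            // Only pay for Vision on every Nth frame; in-between frames
            // would reuse the most recent classification.
            guard frameIndex % strideN == 0,
                  let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }

            let request = VNDetectHumanBodyPoseRequest()
            try VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
            if let poses = request.results, !poses.isEmpty {
                // The Create ML action classifier would consume a window of
                // these poses here; elided because it's specific to my model.
                hits.append(frameIndex)
            }
        }
        return hits
    }

Would something along these lines be reasonable, assuming the classifier tolerates a lower pose sampling rate than the source frame rate?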
2 · 0 · 1.3k · Jan ’22
Why does Accelerate appear so out of place in terms of naming style?
Reading a solution given in a book to adding the elements of an input array of doubles, an example is given with Accelerate:

    func challenge52c(numbers: [Double]) -> Double {
        var result: Double = 0.0
        vDSP_sveD(numbers, 1, &result, vDSP_Length(numbers.count))
        return result
    }

I can understand why Accelerate APIs don't adhere to the Swift API design guidelines, but why is it that they don't seem to use the Cocoa guidelines either? Are there other conventions or precedents that I'm missing?
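For comparison, and assuming I'm reading the newer SDKs right, the Swift overlay added in iOS 13/macOS 10.15 wraps the same routine under a conventional name, which makes the old spelling stand out even more:

    import Accelerate

    // vDSP.sum is the Swift-overlay spelling of the same reduction;
    // vDSP_sveD is the older C-style name ("sve" = sum of vector elements,
    // "D" = double precision).
    func challenge52c(numbers: [Double]) -> Double {
        vDSP.sum(numbers)
    }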
2 · 0 · 916 · Apr ’22
When decoding a Codable struct from JSON, how do you initialize a property not present in the JSON?
Say that in this example here, this struct

    struct Reminder: Identifiable {
        var id: String = UUID().uuidString
        var title: String
        var dueDate: Date
        var notes: String? = nil
        var isComplete: Bool = false
    }

is instead decoded from JSON array values (rather than constructed like in the linked example). If each JSON value were to be missing an "id", how would id then be initialized? When trying this myself I got an error:

    keyNotFound(CodingKeys(stringValue: "id", intValue: nil), Swift.DecodingError.Context(codingPath: [_JSONKey(stringValue: "Index 0", intValue: 0)], debugDescription: "No value associated with key CodingKeys(stringValue: \"id\", intValue: nil) (\"id\").", underlyingError: nil))
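My understanding is that the synthesized Decodable conformance treats id as required even though it has a default value, which would explain the keyNotFound error. Here is the workaround I've been trying (a sketch, not from the linked sample): a custom init(from:) that uses decodeIfPresent and falls back to a fresh UUID:

    import Foundation

    struct Reminder: Identifiable, Codable {
        var id: String = UUID().uuidString
        var title: String
        var dueDate: Date
        var notes: String? = nil
        var isComplete: Bool = false

        init(from decoder: Decoder) throws {
            // CodingKeys is still synthesized by the compiler even with a
            // hand-written init(from:).
            let container = try decoder.container(keyedBy: CodingKeys.self)
            id = try container.decodeIfPresent(String.self, forKey: .id) ?? UUID().uuidString
            title = try container.decode(String.self, forKey: .title)
            dueDate = try container.decode(Date.self, forKey: .dueDate)
            notes = try container.decodeIfPresent(String.self, forKey: .notes)
            isComplete = try container.decodeIfPresent(Bool.self, forKey: .isComplete) ?? false
        }
    }

Is there a less boilerplate-heavy way to get this behavior?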
2 · 0 · 2.8k · May ’22
How do you configure collection view list cells to look inset with rounded corners?
In the Health app, it appears that cells, not sections, are styled this way. (Screenshot attached in the original post.) The closest I know of to getting this appearance is setting the section to be inset grouped:

    let listConfiguration = UICollectionLayoutListConfiguration(appearance: .insetGrouped)
    let listLayout = UICollectionViewCompositionalLayout.list(using: listConfiguration)
    collectionView.collectionViewLayout = listLayout

but I'm not sure of a good approach to giving each cell this appearance like in the screenshot above. I'm assuming the list-style collection view shown is two sections with three total cells, rather than three inset grouped sections.
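One possible direction (a sketch; I haven't confirmed this matches the Health app's styling, and the inset and radius values are guesses): give each cell its own rounded background through UIBackgroundConfiguration in the cell registration:

    import UIKit

    // Each cell gets an individually rounded, inset background, so cells in
    // the same section still read as separate rounded rectangles.
    let registration = UICollectionView.CellRegistration<UICollectionViewListCell, String> { cell, indexPath, item in
        var content = cell.defaultContentConfiguration()
        content.text = item
        cell.contentConfiguration = content

        var background = UIBackgroundConfiguration.listGroupedCell()
        background.cornerRadius = 10
        background.backgroundInsets = NSDirectionalEdgeInsets(top: 4, leading: 16,
                                                              bottom: 4, trailing: 16)
        cell.backgroundConfiguration = background
    }

    // Cells would then be dequeued in the data source with
    // collectionView.dequeueConfiguredReusableCell(using: registration, for: indexPath, item: item)

Is there a more standard way to achieve this look?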
2 · 0 · 2.8k · May ’22
Invalid `Podfile` file: cannot load such file -- cocoapods-catalyst-support
I encountered this error

    2023-01-24T22:14:41.500565325Z Installing ri documentation for cocoapods-catalyst-support-0.2.1
    2023-01-24T22:14:41.500827390Z Done installing documentation for colored2, concurrent-ruby, i18n, tzinfo, zeitwerk, activesupport, nap, fuzzy_match, httpclient, algoliasearch, ffi, ethon, typhoeus, netrc, public_suffix, addressable, cocoapods-core, claide, cocoapods-deintegrate, cocoapods-downloader, cocoapods-plugins, cocoapods-search, cocoapods-trunk, cocoapods-try, molinillo, atomos, nanaimo, rexml, xcodeproj, escape, fourflusher, gh_inspector, ruby-macho, cocoapods, cocoapods-catalyst-support after 50 seconds
    2023-01-24T22:14:41.500997230Z 35 gems installed
    2023-01-24T22:14:42.023353910Z [in /Volumes/workspace/repository]
    2023-01-24T22:14:42.023798292Z
    2023-01-24T22:14:42.024448317Z [!] Invalid `Podfile` file: cannot load such file -- cocoapods-catalyst-support.
    2023-01-24T22:14:42.024714192Z
    2023-01-24T22:14:42.024976712Z #  from /Volumes/workspace/repository/Podfile:1
    2023-01-24T22:14:42.025200239Z #  -------------------------------------------
    2023-01-24T22:14:42.025463448Z >  require 'cocoapods-catalyst-support'
    2023-01-24T22:14:42.025663811Z #
    2023-01-24T22:14:42.025900158Z #  -------------------------------------------

from my post-clone script, which is

    #!/bin/sh

    # ci_post_clone.sh

    export GEM_HOME="$HOME/.gem"
    gem install bundler
    brew install cocoapods
    gem install cocoapods-catalyst-support

    # Install dependencies managed with CocoaPods.
    pod install
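My unconfirmed suspicion is that the brew-installed CocoaPods doesn't look in the GEM_HOME where the plugin gem went. A variant like this, keeping everything in one gem environment, is what I plan to try next (an assumption, not a verified fix):

    #!/bin/sh

    # ci_post_clone.sh (hypothetical variant, not yet verified)

    export GEM_HOME="$HOME/.gem"
    # Make sure the gem-installed pod binary is found first.
    export PATH="$GEM_HOME/bin:$PATH"

    gem install bundler
    # Install CocoaPods with gem rather than brew, so it shares GEM_HOME
    # with the cocoapods-catalyst-support plugin.
    gem install cocoapods
    gem install cocoapods-catalyst-support

    # Install dependencies managed with CocoaPods.
    pod install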
2 · 0 · 4.9k · Sep ’23
With the Vision framework, is it possible to get the time ranges or frames for which the video contains trajectories?
As far as I can tell from Identifying Trajectories in Video (https://developer.apple.com/documentation/vision/identifying_trajectories_in_video), trajectory detection lets you use characteristics of the detected trajectories for, say, drawing over the video as it plays. However, is it possible to mark the time ranges during which the video has detected trajectories, or perhaps access the frames for which there are trajectories?
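My best guess so far (a sketch; I haven't verified this behaves the way I hope for trajectories) is that each VNTrajectoryObservation inherits VNObservation's timeRange, so collecting those ranges as the request runs might be enough:

    import CoreMedia
    import Vision

    // Collect the time range of each trajectory observation; presumably the
    // union of these ranges marks when the video contains trajectories.
    var trajectoryRanges: [CMTimeRange] = []

    let request = VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero,
                                              trajectoryLength: 6) { request, _ in
        for observation in request.results as? [VNTrajectoryObservation] ?? [] {
            trajectoryRanges.append(observation.timeRange)
        }
    }

    // Then, reusing the same request object for each sample buffer read
    // from the video (the request is stateful across frames):
    // try VNImageRequestHandler(cmSampleBuffer: sampleBuffer,
    //                           orientation: .up,
    //                           options: [:]).perform([request])

Is that the intended way to recover when trajectories occur, or is there API I'm overlooking?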
1 · 0 · 758 · Feb ’21