"Code signing 'WatchDeuce Extension.appex' failed."
"View distribution logs for more information."
Does anyone have any suggestions for a solution or workaround? I've filed this as FB9171462 with the logs attached.
Say you have a pinch gesture recognizer and a pan gesture recognizer on an image view:
@IBAction func pinchPiece(_ pinchGestureRecognizer: UIPinchGestureRecognizer) {
    guard pinchGestureRecognizer.state == .began || pinchGestureRecognizer.state == .changed,
          let piece = pinchGestureRecognizer.view else {
        // After pinch releases, zoom back out.
        if pinchGestureRecognizer.state == .ended {
            UIView.animate(withDuration: 0.3, animations: {
                pinchGestureRecognizer.view?.transform = CGAffineTransform.identity
            })
        }
        return
    }
    adjustAnchor(for: pinchGestureRecognizer)
    let scale = pinchGestureRecognizer.scale
    piece.transform = piece.transform.scaledBy(x: scale, y: scale)
    pinchGestureRecognizer.scale = 1 // Clear scale so that it is the right delta next time.
}

@IBAction func panPiece(_ panGestureRecognizer: UIPanGestureRecognizer) {
    guard panGestureRecognizer.state == .began || panGestureRecognizer.state == .changed,
          let piece = panGestureRecognizer.view else {
        return
    }
    let translation = panGestureRecognizer.translation(in: piece.superview)
    piece.center = CGPoint(x: piece.center.x + translation.x, y: piece.center.y + translation.y)
    panGestureRecognizer.setTranslation(.zero, in: piece.superview)
}

public func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                              shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    true
}
The pinch gesture's view resets to its original state after the gesture ends, which happens in the guard's else clause. What would be a good way to do the same for the pan gesture recognizer? Ideally I'd like the gesture recognizers to live in an extension of UIImageView, which also means I can't add a stored property to the extension for tracking the image view's initial state.
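One direction I've been sketching (not from the original sample, and the handler name is my own): drive the pan with the view's transform instead of its center, so that both gestures can be undone by animating back to `.identity` without storing any per-view state in the extension.

```swift
import UIKit

extension UIImageView {
    // Hypothetical pan handler: applies the translation to the view's
    // transform rather than its center, so the original position can be
    // restored with the same identity reset the pinch handler uses.
    @objc func panPieceResettable(_ panGestureRecognizer: UIPanGestureRecognizer) {
        guard let piece = panGestureRecognizer.view else { return }
        switch panGestureRecognizer.state {
        case .began, .changed:
            let translation = panGestureRecognizer.translation(in: piece.superview)
            piece.transform = piece.transform.translatedBy(x: translation.x, y: translation.y)
            panGestureRecognizer.setTranslation(.zero, in: piece.superview)
        case .ended, .cancelled:
            // Same reset as the pinch handler: animate back to identity.
            UIView.animate(withDuration: 0.3) {
                piece.transform = .identity
            }
        default:
            break
        }
    }
}
```

Caveat: `translatedBy(x:y:)` applies the translation in the transform's own (possibly scaled) coordinate space, so this would need adjusting if the pinch scale should not affect pan distance.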
Is there a UIKit equivalent to SwiftUI's confirmationDialog(_:isPresented:titleVisibility:actions:)?
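For context, the closest counterpart I'm aware of is UIAlertController with the .actionSheet style — a minimal sketch (titles and the function name are placeholders):

```swift
import UIKit

// Sketch: UIAlertController presented with the .actionSheet style,
// which is the nearest UIKit analogue I know of to confirmationDialog.
func presentConfirmation(from viewController: UIViewController) {
    let alert = UIAlertController(title: "Delete item?",
                                  message: nil,
                                  preferredStyle: .actionSheet)
    alert.addAction(UIAlertAction(title: "Delete", style: .destructive) { _ in
        // Handle deletion here.
    })
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
    viewController.present(alert, animated: true)
}
```

I'm unsure whether this matches confirmationDialog's title-visibility behavior exactly, which is part of the question.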
How do I resolve this issue when trying to re-import a custom SF Symbol into Apple's SF Symbols app? Is there an exact export configuration I'm missing in Sketch or Figma?
How do you create a picker where the user's selection corresponds to different values of an enumerated type?
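To illustrate what I mean, a SwiftUI sketch (the enum and its cases are placeholders of my own):

```swift
import SwiftUI

enum Flavor: String, CaseIterable, Identifiable {
    case chocolate, vanilla, strawberry
    var id: Self { self }
}

struct FlavorPicker: View {
    @State private var selectedFlavor: Flavor = .chocolate

    var body: some View {
        // The selection binding is typed as Flavor, and the tag on each
        // row supplies the enum value that row represents.
        Picker("Flavor", selection: $selectedFlavor) {
            ForEach(Flavor.allCases) { flavor in
                Text(flavor.rawValue.capitalized).tag(flavor)
            }
        }
    }
}
```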
For example,
Operation A both fetches model data over the network and updates a UICollectionView backed by it.
Operation B filters model data.
What is a good approach to executing B only after A is finished?
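A sketch of the kind of setup I mean, using OperationQueue dependencies (the operation bodies here are placeholders):

```swift
import Foundation

let queue = OperationQueue()

// Placeholder for Operation A: fetch model data and update the collection view.
let fetchOperation = BlockOperation {
    print("Fetching model data…")
}

// Placeholder for Operation B: filter the fetched model data.
let filterOperation = BlockOperation {
    print("Filtering model data…")
}

// B will not start until A has finished, even if both are enqueued together.
filterOperation.addDependency(fetchOperation)
queue.addOperations([fetchOperation, filterOperation], waitUntilFinished: true)
```

Though I'm not sure dependencies are still the recommended approach versus async/await, which is part of the question.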
In a SwiftUI scroll view with the page style, is it possible to change the page indicator color?
I just got an app feature working where the user imports a video file, each frame is fed to a custom action classifier, and then only frames with a certain action classified are exported.
However, I'm finding that testing a one hour 4K video at 60 FPS is taking an unreasonably long time - it's been processing for 7 hours now on a MacBook Pro with M1 Max running the Mac Catalyst app. Are there any techniques or general guidance that would help with improving performance? As much as possible I'd like to preserve the input video quality, especially frame rate. One hour length for the video is expected, as it's of a tennis session (could be anywhere from 10 minutes to a couple hours). I made the body pose action classifier with Create ML.
Reading a solution given in a book to adding the elements of an input array of doubles, an example is given with Accelerate as
func challenge52c(numbers: [Double]) -> Double {
    var result: Double = 0.0
    vDSP_sveD(numbers, 1, &result, vDSP_Length(numbers.count))
    return result
}
While I can understand why Accelerate APIs don't adhere to the Swift API design guidelines, why do they not seem to follow the Cocoa guidelines either? Are there other conventions or precedents that I'm missing?
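For comparison, the newer Swift overlay in Accelerate does read more like the Swift guidelines — a small sketch, assuming macOS 10.15 / iOS 13 or later:

```swift
import Accelerate

// vDSP.sum is the Swift-overlay counterpart of vDSP_sveD: the same
// vector reduction, but with a Swift-style name and no inout result
// pointer or explicit length argument.
func sum(of numbers: [Double]) -> Double {
    vDSP.sum(numbers)
}
```

The C-style names like vDSP_sveD predate both sets of guidelines, which may be part of the answer I'm looking for.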
#import <Cocoa/Cocoa.h>
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        // Setup code that might create autoreleased objects goes here.
    }
    return NSApplicationMain(argc, argv);
}
Say that in this example here, this struct
struct Reminder: Identifiable {
    var id: String = UUID().uuidString
    var title: String
    var dueDate: Date
    var notes: String? = nil
    var isComplete: Bool = false
}
is instead decoded from JSON array values (rather than constructed as in the linked example). If a JSON value is missing an "id", how would id then be initialized? When I tried this myself, I got this error:
keyNotFound(CodingKeys(stringValue: "id", intValue: nil), Swift.DecodingError.Context(codingPath: [_JSONKey(stringValue: "Index 0", intValue: 0)], debugDescription: "No value associated with key CodingKeys(stringValue: \"id\", intValue: nil) (\"id\").", underlyingError: nil))
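One approach I sketched (not from the linked example): a custom init(from:) that decodes id with decodeIfPresent and falls back to a fresh UUID when the key is absent.

```swift
import Foundation

struct Reminder: Identifiable, Codable {
    var id: String = UUID().uuidString
    var title: String
    var dueDate: Date
    var notes: String? = nil
    var isComplete: Bool = false

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Fall back to a generated id when the JSON omits the key,
        // instead of throwing keyNotFound.
        id = try container.decodeIfPresent(String.self, forKey: .id) ?? UUID().uuidString
        title = try container.decode(String.self, forKey: .title)
        dueDate = try container.decode(Date.self, forKey: .dueDate)
        notes = try container.decodeIfPresent(String.self, forKey: .notes)
        isComplete = try container.decodeIfPresent(Bool.self, forKey: .isComplete) ?? false
    }
}
```

The compiler still synthesizes CodingKeys and encode(to:) here; only the decoding path is hand-written.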
In the Health app, it appears that cells and not sections are styled in this way:
The closest I know of to achieving this appearance is setting the section's appearance to inset grouped:
let listConfiguration = UICollectionLayoutListConfiguration(appearance: .insetGrouped)
let listLayout = UICollectionViewCompositionalLayout.list(using: listConfiguration)
collectionView.collectionViewLayout = listLayout
but I'm not sure of a good approach to giving each cell this appearance like in the screenshot above. I'm assuming the list style collection view shown is two sections with three total cells, rather than three inset grouped sections.
I encountered this error
2023-01-24T22:14:41.500565325Z Installing ri documentation for cocoapods-catalyst-support-0.2.1
2023-01-24T22:14:41.500827390Z Done installing documentation for colored2, concurrent-ruby, i18n, tzinfo, zeitwerk, activesupport, nap, fuzzy_match, httpclient, algoliasearch, ffi, ethon, typhoeus, netrc, public_suffix, addressable, cocoapods-core, claide, cocoapods-deintegrate, cocoapods-downloader, cocoapods-plugins, cocoapods-search, cocoapods-trunk, cocoapods-try, molinillo, atomos, nanaimo, rexml, xcodeproj, escape, fourflusher, gh_inspector, ruby-macho, cocoapods, cocoapods-catalyst-support after 50 seconds
2023-01-24T22:14:41.500997230Z 35 gems installed
2023-01-24T22:14:42.023353910Z [in /Volumes/workspace/repository]
2023-01-24T22:14:42.023798292Z
2023-01-24T22:14:42.024448317Z [!] Invalid `Podfile` file: cannot load such file -- cocoapods-catalyst-support.
2023-01-24T22:14:42.024714192Z
2023-01-24T22:14:42.024976712Z # from /Volumes/workspace/repository/Podfile:1
2023-01-24T22:14:42.025200239Z # -------------------------------------------
2023-01-24T22:14:42.025463448Z > require 'cocoapods-catalyst-support'
2023-01-24T22:14:42.025663811Z #
2023-01-24T22:14:42.025900158Z # -------------------------------------------
from my post-clone script, which is:
#!/bin/sh
# ci_post_clone.sh
export GEM_HOME="$HOME/.gem"
gem install bundler
brew install cocoapods
gem install cocoapods-catalyst-support
# Install dependencies managed with CocoaPods.
pod install
I created a distinct icon set for visionOS, and specified its name in build settings. This is a Designed for iPad app. When I run it in the simulator, only the existing app icon shows. Is this supported for existing iPad apps, or am I missing something? There are no warnings in the asset catalog for this.
As far as I can tell from https://developer.apple.com/documentation/vision/identifying_trajectories_in_video, trajectory detection lets you use characteristics of the detected trajectories for, say, drawing over the video as it plays. However, is it possible to mark which time ranges of the video have detected trajectories, or perhaps access the frames in which there are trajectories?
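A sketch of what I've been trying, assuming the observation's timeRange property is the right thing to collect (the request parameters and handler wiring here are placeholders):

```swift
import Vision
import CoreMedia

// Sketch: collect the time ranges of detected trajectories while
// stepping through the video's frames. The completion handler reads
// each observation's timeRange, a CMTimeRange covering the trajectory.
var trajectoryRanges: [CMTimeRange] = []

let request = VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero,
                                          trajectoryLength: 10) { request, error in
    guard let observations = request.results as? [VNTrajectoryObservation] else { return }
    for observation in observations {
        trajectoryRanges.append(observation.timeRange)
    }
}

// Each sample buffer read from the video would then be fed to this
// request via a VNImageRequestHandler as frames are decoded.
```

From the collected ranges it should then be possible to map back to the frames they cover, which is essentially what I'm asking about.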