altool seemed to be working correctly, but then failed after the upload finished:
$ xcrun altool --upload-package ../exports/OurApp.ipa --type ios --apple-id <apple-id> --bundle-id <bundle-id> --bundle-version 291 --bundle-short-version-string "3.1.5" --apiKey <api-key> --apiIssuer <issuer> --show-progress
Getting list of providers...
Beginning delivery...
Analyzing package…
Requesting app information…
Requesting asset description upload id…
Sending analysis to the App Store…
Waiting for response…
Requesting upload instructions from the App Store…
Preparing to upload package to the App Store…
Uploading package to the App Store…
**status code 401, auth issue.
*** Error: *** status code 401, auth issue.
*** Error: Error uploading '../exports/OurApp.ipa'.
*** Error: Unable to authenticate. (-19209)
Note that --list-providers seems to work just fine with the same credentials:
$ xcrun altool --list-providers --apiKey <key> --apiIssuer <issuer>
Getting list of providers...
ProviderName ProviderShortname PublicID WWDRTeamID
------------ ----------------- ------------------------------------ ----------
Whatnot Inc. <redacted> <redacted> <redacted>
@eskimo?
My app has a utility window that can live anywhere on screen, including over the menu bar, or in the top region of a screen with a notch when the main screen is not the built-in display.
If I drag the window into these areas, it sits there just fine. I have drag handlers in a subclass of NSWindow that call -setFrame: (yeah, this is Obj-C code).
If the screen gets reconfigured, my code tries to remember where the window was, and calls -setFrame: to put it back there. But in this case, macOS moves my window down out of the menu bar/notch area at the top of the screen.
Is there any way to prevent this behavior?
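In case it helps, the hook I'd experiment with (a sketch, not confirmed as the sanctioned fix) is NSWindow's constrainFrameRect(_:to:), which is what AppKit calls to keep a window below the menu bar area. Shown in Swift for brevity; the same override works from Obj-C as -constrainFrameRect:toScreen::

```swift
import AppKit

// Hedged sketch: when macOS repositions a window after a screen
// reconfiguration, the frame passes through constrainFrameRect(_:to:).
// Returning the requested rect unchanged for this one utility window
// should let setFrame(_:display:) restore a saved position that
// overlaps the menu bar / notch region.
final class UtilityWindow: NSWindow {
    override func constrainFrameRect(_ frameRect: NSRect, to screen: NSScreen?) -> NSRect {
        // Skip the default constraining entirely for this window.
        // (Call super instead if you only want to bypass it conditionally.)
        return frameRect
    }
}
```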
I just got an M1 MacBook Pro. I'm trying to build one of my macOS projects for Apple Silicon, but it only gives me "My Mac (Rosetta)" as a run destination (or "Any Mac (Intel)"), even though I've set the build archs to universal.
Googling is no help.
In this talk, at 12:00, the speaker refers to code associated with this talk. But I can't find it anywhere.
Is it available?
I'm unable to find any word on when Apple will start requiring new submissions to be made with Xcode 13. We've already moved over to it, but we still need to know.
Has anyone else run into this and found a workaround? Xcode 13b1, 3, and 4 are unable to prepare my device for development after I updated it today to 14.7.1. Xcode 12.5.1 is able to.
After churning for several minutes, it always ends up saying
Failed to prepare device for development.
I've rebooted the device and my Mac a few times each now.
There's at least one other report of this issue:
https://stackoverflow.com/questions/68549513/xcode-doesn-t-support-phone-s-ios-14-7-1
I recently created a new app record in AppStoreConnect. The app is still in development and testing, so we use this for TestFlight builds. The app has a complete app icon internally, and shows up in the Simulator and devices correctly.
But in AppStoreConnect, it shows the generic app icon placeholder.
I contacted Apple support about this, and they told me I needed to include the app icon in the app. I told them I already do, and showed them the screenshots. They basically said sorry, not my problem, try the dev forums.
So here I am.
I just downloaded and tried Xcode 13b2 on our iOS project. I've been using b1 with a surprising amount of success, switching to 12.5.1 for builds.
But now 13b2 seems to have an issue. In our AVCapture code, you get handed an AVCapturePhoto, and it has methods that return CF_RETURNS_NOT_RETAINED/Unmanaged. If you try to .takeUnretainedValue() on these, the compiler complains that CGImage has no such method. It appears to be ignoring the Obj-C directive.
I'm also unable to view the generated Swift file for that Obj-C header. I get "Couldn't generate Swift Representation" "Error (from SourceKit): Could not load the stdlib module".
Anyone else run into this? I filed FB9211460 about it.
I just created a new macOS app project in Xcode 13 beta. SwiftUI, unit tests, no Core Data.
I just noticed that it does not add the traditional "Products/MyApp.app" group and file, and I can't figure out how to add those via the UI.
At least one solution online edits the pbxproj file directly; I'd rather not do that.
Is there any sanctioned way to add this? Is not having it a bug?
As in the locked question here - https://developer.apple.com/forums/thread/674534, I'm constantly running into this error:
Compiling failed: 'main' attribute cannot be used in a module that contains top-level code
Once it starts, it's not clear how to stop it. No changes were made to my AppDelegate (it's a mostly-UIKit app that I'm adding some SwiftUI views to).
It compiles just fine with a regular build, but the SwiftUI preview can't build it. This is proving to be a real hindrance.
I can sometimes clear the condition by cleaning build and test results, and relaunching Xcode. But not always.
I filed FB9104575 and included the diagnostics.
I've got the following code that updates a @Published var messages: OrderedSet&lt;Message&gt; property:
```swift
public func add(messages inMsgs: [IncomingMessage]) {
    for msg in inMsgs {
        let msg = Message(fromIncoming: msg, user: user)
        self.messages.append(msg)
    }
    self.messages.sort()
}
```
In my SwiftUI view, however, .onChange(of: self.stream.messages) gets called three times each time a single message is added.
I tried operating on a local copy of self.messages, and then just setting self.messages = local, but that didn't change anything.
Maybe the issue is on the SwiftUI side?
In any case, how are published updates to a property coalesced?
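My current mental model (an assumption, not documented behavior) is that @Published publishes once per mutating operation on the wrapped value, so each append and the sort would each fire. A sketch with a plain willSet observer standing in for objectWillChange, showing per-mutation firing versus a single batched assignment (though, as noted above, batching didn't change the count in my real app):

```swift
// Hedged sketch: a willSet observer as a stand-in for @Published's
// objectWillChange. Every mutating call on the property fires it once.
final class MessageStore {
    private(set) var changeCount = 0
    var messages: [Int] {
        willSet { changeCount += 1 }   // analogous to objectWillChange.send()
    }
    init(messages: [Int]) { self.messages = messages }

    // One notification per append, plus one for the sort: N + 1 in total.
    func addPerElement(_ incoming: [Int]) {
        for m in incoming { messages.append(m) }
        messages.sort()
    }

    // A single notification: mutate a local copy, assign once.
    func addBatched(_ incoming: [Int]) {
        var local = messages
        local.append(contentsOf: incoming)
        local.sort()
        messages = local
    }
}
```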
As of 11.3, DocumentGroup defaults to showing the open panel (from the release notes: "DocumentGroup apps now show an Open panel on launch, even when iCloud isn’t in use. (66446310)."). Apparently the previous behavior, where it didn't, was considered a bug.
Thing is, I don't like this behavior and don't want it, especially while I'm working on my app. I want to automatically create a new document. Is there any way to set that?
In my macOS SwiftUI app I have a list of "layers" on the left. Clicking on a layer focuses it on the right for acting upon. Each layer row has a little eye icon that's used to toggle visibility of that layer in the view to the right.
I'd like to be able to click on that eye button without selecting the layer or activating it. Is that possible?
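One thing I'd try (a sketch, not a confirmed answer; Layer and its properties are made-up names) is giving the eye icon a borderless button style, which on macOS typically lets the button take the click directly instead of the click selecting the row:

```swift
import SwiftUI

// Hypothetical model type for illustration only.
final class Layer: ObservableObject {
    let name: String
    @Published var isVisible = true
    init(name: String) { self.name = name }
}

struct LayerRow: View {
    @ObservedObject var layer: Layer

    var body: some View {
        HStack {
            Text(layer.name)
            Spacer()
            Button(action: { layer.isVisible.toggle() }) {
                Image(systemName: layer.isVisible ? "eye" : "eye.slash")
            }
            // BorderlessButtonStyle usually routes the click to the button
            // without changing the List's row selection on macOS.
            .buttonStyle(BorderlessButtonStyle())
        }
    }
}
```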
I've got a macOS SwiftUI app that displays an image, currently using Image. I need to display information about the pixel under the mouse pointer (its position, color, etc.) in some text fields at the bottom of the window.
I can't find an appropriate event handler to attach to Image. Traditionally I would have used mouseEntered, mouseExited, and mouseMoved. These are available for an NSHostingView, but that's for wrapping native views.
I found onHover(), which takes the place of mouseEntered and mouseExited, but it neither works (the perform closure is never called for me) nor provides movement or position information.
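The workaround I've been considering (a sketch under the assumption that there's no native SwiftUI mouse-moved modifier; all type and property names here are invented) is an NSViewRepresentable overlay whose NSView installs an NSTrackingArea and reports local coordinates:

```swift
import SwiftUI
import AppKit

// Hedged sketch: an invisible AppKit view overlaid on the Image that
// forwards mouse-moved events, since SwiftUI itself doesn't surface them.
struct MouseTrackingOverlay: NSViewRepresentable {
    var onMove: (CGPoint) -> Void

    func makeNSView(context: Context) -> TrackingView {
        let view = TrackingView()
        view.onMove = onMove
        return view
    }

    func updateNSView(_ nsView: TrackingView, context: Context) {
        nsView.onMove = onMove
    }

    final class TrackingView: NSView {
        var onMove: ((CGPoint) -> Void)?

        override func updateTrackingAreas() {
            super.updateTrackingAreas()
            for area in trackingAreas { removeTrackingArea(area) }
            addTrackingArea(NSTrackingArea(
                rect: bounds,
                options: [.mouseMoved, .mouseEnteredAndExited, .activeInKeyWindow],
                owner: self,
                userInfo: nil))
        }

        override func mouseMoved(with event: NSEvent) {
            // Convert from window coordinates to this view's coordinates.
            onMove?(convert(event.locationInWindow, from: nil))
        }
    }
}
```

Used as `Image(...).overlay(MouseTrackingOverlay { point in ... })`, this would supply the position; the pixel color would still have to be read from the underlying bitmap, not from SwiftUI.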
Is there documentation describing the semantics of a Metal CIKernel function?
I have image data where each pixel is a signed 16-bit integer. I need to convert that into any number of color values, starting with a simple shift from signed to unsigned (e.g. the data in one image ranges from about -8,000 to +20,000, and I want to simply add 8,000 to each pixel's value).
I've got a basic filter working, but it treats the pixel values as floating point, I think. I've tried using both sample_t and sample_h types in my kernel, and simple arithmetic:
```metal
extern "C" coreimage::sample_h
heightShader(coreimage::sample_h inS, coreimage::destination inDest)
{
    coreimage::sample_h r = inS + 0.1;
    return r;
}
```
This has an effect, but I don't really know what's in inS. Is it a vector of four float16? What are the minimum and maximum values? They seem to be clamped to 1.0 (and perhaps -1.0). Well, I’ve told CI that my input image is CIFormat.L16, which is 16-bit luminance, so I imagine it's interpreting the bits as unsigned? Anyway, where is this documented, if anywhere (the correspondence between input image pixel format and the actual values that get passed to a filter kernel)?
Is there a type that lets me work on the integer values? This document - https://developer.apple.com/metal/MetalCIKLReference6.pdf implies that I can only work with floating-point values. But it doesn't tell me how they're mapped.
Any help would be appreciated. Thanks.
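For what it's worth, my working assumption (not something I've found documented) is that CI normalizes unsigned integer formats, so an L16 value v reaches the kernel as v / 65535.0. Under that assumption, the signed-to-unsigned shift can be done on the CPU before handing the buffer to CI; a sketch, where the +8,000 offset just matches the data range mentioned above:

```swift
// Hedged sketch: shift signed Int16 height samples into unsigned range
// before building the CIImage, on the assumption that CIFormat.L16 data
// arrives in the kernel normalized to [0, 1] as v / 65535.
func shiftToUnsigned(_ samples: [Int16], offset: Int32 = 8_000) -> [UInt16] {
    samples.map { s in
        let shifted = Int32(s) + offset
        // Clamp so out-of-range samples saturate instead of wrapping.
        return UInt16(clamping: shifted)
    }
}

// The normalized value a kernel would presumably see for one sample.
func normalized(_ v: UInt16) -> Float {
    Float(v) / 65_535.0
}
```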