We're able to successfully send test notifications using app-store-server-library-python in the SANDBOX environment, but the moment we switch to PRODUCTION (for testing purposes), the call fails with a 401 and never seems to reach our server at all. This suggests something is wrong with signing the production-environment headers, although I've seen posts from others suggesting the issue isn't specific to the App Store server library code.
It's very clear that the libraries are marked Beta – however, most questions about the v2 API are answered with suggestions to use the library. Just FWIW – there's some contradictory advice there.
Anyway, the main point is that we're currently blocked on making sure that our v2 API hooks are working properly, since we can't send production test notifications.
Any idea why the signed requests would work for sending sandbox test notifications, but not in the production environment? We've triple-checked the URLs, etc. – as far as we know, the private key should be the same regardless of environment.
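For reference, here's a minimal sketch of how we're constructing the client, following the library's documented usage – the key path, key ID, issuer ID, and bundle ID below are placeholders, not our real values:

# Minimal sketch of the production-environment call, assuming the documented
# app-store-server-library-python API. All credential values are placeholders.
from appstoreserverlibrary.api_client import AppStoreServerAPIClient, APIException
from appstoreserverlibrary.models.Environment import Environment

with open("SubscriptionKey_ABC123XYZ.p8", "rb") as f:
    signing_key = f.read()

client = AppStoreServerAPIClient(
    signing_key,
    "ABC123XYZ",        # key ID (placeholder)
    "issuer-uuid",      # issuer ID (placeholder)
    "com.example.app",  # bundle ID (placeholder)
    Environment.PRODUCTION,  # the same setup works when this is Environment.SANDBOX
)

try:
    response = client.request_test_notification()
    print(response)
except APIException as e:
    print(e)  # with PRODUCTION we get a 401 here; SANDBOX succeeds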
Thanks!
(P.S. If anyone has been able to send v2 test notifications with the PRODUCTION environment, please let us know!)
Hello,
Question re: iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do:
encoder.setTexture(context.sourceDepthTexture, index: 1)
and then in the kernel:
texture2d<float, access::read> depthIn [[texture(1)]]
...
outTexture.write(depthIn.read(gid), gid);
But I consistently see all black rendered to the view. The postProcess shader itself works, so that's not the issue – it just doesn't seem to be receiving actual depth information.
(If I set a breakpoint at the encoder setTexture step, I can preview the scene's color texture, but the context's depthTexture looks all NaN / blank.)
I've looked at all the WWDC samples, but they all use ARView for the depth sample code, and ARView has a different set of configuration options than RealityView. So far I haven't found anywhere to explicitly tell RealityView to include the depth information, so I'm not sure if I'm missing something there.
It appears that there is indeed a depth texture being passed, but it looks blank.
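(For anyone else poking at this: raw scene depth is non-linear and usually very close to zero, so writing it straight to the output can look all black even when the texture is populated. A minimal debug kernel along these lines – the scale factor is just an illustrative guess – makes real depth visible. In our case the texture already looks blank/NaN in the GPU debugger, so it doesn't appear to be only a visualization issue.)

#include <metal_stdlib>
using namespace metal;

// Debug kernel: remap raw depth into a visible grayscale range.
kernel void depthDebug(texture2d<float, access::write> outTexture [[texture(0)]],
                       texture2d<float, access::read>  depthIn    [[texture(1)]],
                       uint2 gid [[thread_position_in_grid]])
{
    if (gid.x >= outTexture.get_width() || gid.y >= outTexture.get_height()) { return; }
    float d = depthIn.read(gid).r;
    float v = saturate(d * 20.0);  // exaggerate small depth values (illustrative factor)
    outTexture.write(float4(v, v, v, 1.0), gid);
}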
Is there a working example somewhere that we can reference?
I have a visionOS app that uses a Game Center leaderboard. On 'appear' of my content view, I check authentication, and if the user is auth'd, I display a 'leaderboard' button. This works in development / TestFlight – the little floating 'Game Center logged-in' pop-up shows up when the app runs, and the button appears.
Tapping the button opens the leaderboard, and I can tap over to 'global' and see everyone's scores.
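For context, the auth check on appear is essentially the standard pattern – simplified sketch here, not the exact app code:

import GameKit

// Simplified sketch of the check described above; the callback name is illustrative.
func authenticateGameCenter(onAuthenticated: @escaping () -> Void) {
    GKLocalPlayer.local.authenticateHandler = { _, error in
        if let error = error {
            // In the App Store build this path seems to be hit silently.
            print("Game Center auth failed: \(error.localizedDescription)")
            return
        }
        if GKLocalPlayer.local.isAuthenticated {
            onAuthenticated() // show the 'leaderboard' button
        }
    }
}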
However, the app launched today on the App Store, and I'm having some issues.
The leaderboard button isn't showing up, which means the auth call is probably failing silently – not sure why. If I log out of Game Center on the device and re-open the app, I'm able to auth and see my leaderboard button.
Once I've done that, the leaderboard seems busted. In App Store Connect I can see the values I expect (high scores from my TestFlight beta), but in the production app the leaderboard is just empty, and there's no 'global' tab to tap on.
Wondering if this is a propagation issue – does it take time for Game Center entitlements to roll out once the app is live?
I'm going to distribute promo codes to get a few more people testing the App Store version, to see what happens for them.
Any help appreciated!
Currently there's only an option to link your OpenAI account to use ChatGPT within Xcode on the macOS 26 beta. Apparently with other services like Claude you can specify which model you'd like to use, but there's no such option for ChatGPT.
Is this coming, or is this working as designed?
Baffled that the new ExtractBits shader graph node only supports String input. Is this a bug? I'm trying to extract an integer from a float value, but I have no idea how to pass it into Extract Bits – Convert nodes don't support number-to-string.
I'm trying to create an offer code for one of my app's subscriptions, and the button just isn't there.
https://developer.apple.com/help/app-store-connect/manage-subscriptions/set-up-offer-codes
Wondering if this is because the app was just launched today, or if there's something I'm missing.