Hello,
On macOS 26 (Tahoe), when building a macOS app that includes GameKit code, calling GKLocalPlayer.local.authenticateHandler shows the "Sign In to Game Center" alert (e.g. didShowFullscreenSignIn), even if the app does not have the Game Center capability enabled or the related entitlement (com.apple.developer.game-center).
This alert only appears when the user is not signed in to Game Center in system settings.
However, when testing the same code path in an iOS app built on macOS 26 (Tahoe), the alert does not appear unless the proper capability and entitlement are included.
This behavior differs from macOS 15 (Sequoia) with Xcode 15.x. Before the update, Game Center features did not work at all in a macOS app that lacked the capability and entitlements.
Steps to Reproduce
Create a new macOS app target (App Sandbox enabled, no Game Center capability).
Add minimal GameKit code (a fuller sketch follows the steps below):
GKLocalPlayer.local.authenticateHandler = { _, _ in }
Build the macOS app and run it on macOS 26 (Tahoe).
Ensure Game Center is signed out in System Settings.
Observe: “Sign In to Game Center” alert appears automatically.
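For completeness, a slightly fuller version of the minimal GameKit code above might look like this; a sketch assuming a SwiftUI app target, with the type name and print output purely illustrative:
import SwiftUI
import GameKit

// Minimal repro sketch: no Game Center capability or entitlement on this target.
@main
struct GameKitReproApp: App {   // hypothetical app name
    var body: some Scene {
        WindowGroup {
            Text("GameKit authentication repro")
                .onAppear {
                    GKLocalPlayer.local.authenticateHandler = { viewController, error in
                        // On macOS 26 the system "Sign In to Game Center" alert appears
                        // when the user is signed out, even without the entitlement.
                        print("authenticated:", GKLocalPlayer.local.isAuthenticated,
                              "error:", error?.localizedDescription ?? "none")
                    }
                }
        }
    }
}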
Expected Behavior
When the Game Center capability and entitlement are not present, authenticateHandler should fail silently and no sign-in alert should appear.
Actual Behavior
In the macOS app, the Game Center sign-in UI appears even without any Game Center capability or entitlement.
In the iOS app, this alert does not appear.
Build configuration: both apps were built under the same conditions (macOS 26 + Xcode 26).
Question
Could you please confirm whether this behavior is an intentional change in macOS 26 or a bug in the GameKit authentication flow that affects only macOS apps?
Thank you.
We want to integrate Game Center in a game app. Users can create multiple characters in the game; suppose a user has created two characters, Character 1 and Character 2. Our question:
After the user links Character 1 to Game Center, its data is reported to Game Center. If the player then unlinks Character 1 from Game Center and links Character 2 instead, does Character 1's data remain in Game Center, or is it removed?
subj
And how, in this case, are the beautiful system dials with smoke effects and other particles made?
Topic: Graphics & Games, SubTopic: Metal
Hello,
I recently watched the WWDC2025 session titled “Combine Metal 4 machine learning and graphics” (https://developer.apple.com/videos/play/wwdc2025/262/ ), and I’m very excited about the new Metal 4 features that integrate machine learning with graphics—such as neural ambient occlusion, shader-based ML inference, and the use of MTLTensor and MTL4MachineLearningCommandEncoder.
While the session includes helpful code snippets and a compelling debug demo (e.g., the neural ambient occlusion example), the implementation details are not fully shown, and I haven’t been able to find a complete, runnable sample project that demonstrates end-to-end integration of ML and rendering in Metal 4.
Would Apple be able to provide a full, working example—such as an Xcode project—that shows how to:
Export a model to an .mlpackage,
Convert it to an .mtlpackage,
Use MTL4MachineLearningCommandEncoder alongside render passes,
Or embed small neural networks directly in shaders using Shader ML?
Having such a sample would greatly help developers like me adopt these powerful new capabilities correctly and efficiently.
Thank you very much for your time and support!
Best regards,
Hello, I am new to the Apple ecosystem and to Apple development, though I have some coding experience with C#. I would like to develop my game for iPhone 16 and up (because of the ability to run AI models), but I am having a hard time figuring out what to use. There are a lot of resources for SceneKit, but its documentation page says it is deprecated, so I looked at the RealityKit docs and tutorials, and they focus strictly on developing for visionOS. I am confused, because there are no tutorials that show how to develop an iOS game with RealityKit that does not focus on visionOS. I just want to develop for iPhone 16 and up, but I can't find resources aimed at that.
Topic: Graphics & Games, SubTopic: RealityKit
Hey there, I'm currently planning to use RealityKit in a new multiplatform app I'm building. Unfortunately, I noticed that watchOS is not supported by RealityKit, while SceneKit is being deprecated. However, I'd like to maintain the same codebase across platforms. What are my options?
Reproduce
Same SIM card with 4G, same testing location, connected to the same server. Debugging the game application with Xcode and using the network profiler to view retransmission and average round-trip data:
iPhone 17, 4G off, Wi-Fi on: all of the above indicators are acceptable.
iPhone 17, 4G on, Wi-Fi off: heavy retransmission and a very high average round trip.
iPhone 14-16, 4G on, Wi-Fi off: all of the above indicators are acceptable.
App
Unity3D project
.NET Framework 4.0
C# Socket
Other
Many developers on Chinese forums have reported this issue.
Hi all,
I've developed some code that enables an arcball camera interaction with my scene, implemented using components and systems. The implementation feels a bit messy: I have gesture code on my RealityView, and then a bunch of other code that consumes those gesture inputs in my component and system.
Is there a demo app, or some example code, that shows a nice way to encapsulate these things into a single item for custom cameras, something like Apple's .realityViewCameraControls(.orbit)?
If not, can anyone recommend an approach to take?
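For reference, here is a rough sketch of one way the camera state could live entirely in a component, with a system deriving the transform from it; the gesture handlers on the RealityView would then only write to the component. All names and parameter choices are illustrative assumptions, not an existing API:
import RealityKit
import simd

// Holds the orbit state the gestures write into.
struct ArcballCameraComponent: Component {
    var target: SIMD3<Float> = .zero    // point the camera orbits around
    var radius: Float = 2.0
    var azimuth: Float = 0              // radians, driven by drag gestures
    var elevation: Float = 0.3          // radians
}

// Converts the component state into the camera entity's transform each frame.
struct ArcballCameraSystem: System {
    static let query = EntityQuery(where: .has(ArcballCameraComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let arcball = entity.components[ArcballCameraComponent.self] else { continue }
            // Spherical coordinates around the target point.
            let x = arcball.radius * cos(arcball.elevation) * sin(arcball.azimuth)
            let y = arcball.radius * sin(arcball.elevation)
            let z = arcball.radius * cos(arcball.elevation) * cos(arcball.azimuth)
            let position = arcball.target + SIMD3<Float>(x, y, z)
            entity.look(at: arcball.target, from: position, relativeTo: nil)
        }
    }
}

// Registration (e.g. at app launch):
// ArcballCameraComponent.registerComponent()
// ArcballCameraSystem.registerSystem()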
My iOS app generates PDF files.
Every time my users open the generated PDF files, the AutoFill popup appears, but my PDF files are not meant to be interactive.
I'm here to ask if there's a way to mark my PDF files as "not a form", for example in the metadata or anywhere else?
Hello Apple team,
I'm working on an iOS AR app using SwiftUI and RealityKit,
and I was wondering if the Cinematic API can be used with a RealityKit scene. I’d like to achieve a shallow depth of field while keeping the 3D asset in focus, and vice versa.
Thanks!
What is the current best practice for instancing meshes in RealityKit?
I see both MeshInstanceComponent and MeshInstanceCollection.
My intent is to bind a transform to a circle agent (a GameplayKit agent) and feed that result into the instancing.
Hello, I am quite new to using the Metal API, and I was wondering whether it is common (or even possible), when you know that you will never need to create another pipeline with the same shaders, to safely release the library that was used to reference those shaders once the pipeline has been created. I ask because this is possible in other APIs, but Apple never mentions (as far as I have found) whether it is safe or unsafe to do.
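For illustration, this is the pattern in question sketched in Swift; whether dropping the library reference after pipeline creation is officially guaranteed is exactly the open question, so treat this as a sketch rather than sanctioned practice (the function names are placeholders):
import Metal

// Sketch of the pattern being asked about: build the pipeline state, then drop
// the strong reference to the MTLLibrary that supplied the shader functions.
func makePipeline(device: MTLDevice) throws -> MTLRenderPipelineState {
    var library: MTLLibrary? = device.makeDefaultLibrary()

    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library?.makeFunction(name: "vertex_main")      // placeholder name
    descriptor.fragmentFunction = library?.makeFunction(name: "fragment_main")  // placeholder name
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm

    let pipeline = try device.makeRenderPipelineState(descriptor: descriptor)

    // The question: once `pipeline` exists, is it safe to let the library deallocate?
    library = nil
    return pipeline
}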
Topic: Graphics & Games, SubTopic: Metal
Attempting to bring up the access point yields the following error log:
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Could not create endpoint for service name: com.apple.GameOverlayUI.dashboard-service
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Could not create endpoint for service name: com.apple.GameOverlayUI.dashboard-service
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
The same code (which is a single line setting 'active' to true) works on physical devices and on the simulator in iOS 18.6
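For reference, the line in question is presumably the standard access point toggle; a sketch, since the surrounding code is not shown in the post:
import GameKit

// Presumed call behind the errors above: activating the Game Center access point.
GKAccessPoint.shared.isActive = true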
I haven't been able to find any mention of this issue online. Any suggestions or help greatly appreciated.
I play Roblox, and ever since I got this update the graphics quality, stability, Internet ping, and a lot of other things have been drastically worse.
I have been tasked with creating content for the Apple Vision Pro. Just the 3D content and animation, not the programming end of things.
I can't seem to get any kind of mesh deformation animation to import into Reality Composer Pro. By that I mean bones/skin, or even point cache.
I work on PC, and my main software is 3DS Max, but I'm borrowing an iMac for this job, and was instructed to use RCP on it for testing before handing things off to the programmer. My files open and play fine in other USD programs, like Omniverse, or USD View, just not Reality Composer Pro.
I've seen the dinosaur demo in AVP, so I know mesh deformation is possible. If there are other essential tools that might make this possible, I have not been made aware of them.
I am experimenting with bouncing things off of Blender, in case that exports better, but I'm not really having luck there either, though my results are different.
Thanks.
Topic: Graphics & Games, SubTopic: RealityKit
After updating to macOS 26.1 I've encountered an issue where Roblox tends to freeze quite often, for 10-60 seconds at a time. This is really annoying, as I play the game a lot. My theory is that it's something like a driver issue with Metal. I have reinstalled macOS, reinstalled the game, and lowered the performance settings manually, but nothing is working.
I'm wondering if you could help, when it will be fixed, and whether others are having the same issue.
Many thanks, William.
I have published a number of games that use SpriteKit for everything important. Since the release of macOS Tahoe, I've had a lot of end user reports saying that sound effects have stopped working in many (but not all) of my titles.
I'm not doing anything unusual here – typical code is:
sndGameOver = [SKAction playSoundFileNamed:@"Audio/GameOver.wav" waitForCompletion:YES];
Then at the appropriate time:
[self runAction:sndGameOver];
Has anyone else encountered this? The code still works fine on previous operating systems, and appears to be fine on iOS too. Has something changed in macOS Tahoe?
I'm at a bit of a loss. There's nothing obviously different between the titles that do work and the titles that don't.
Suggestions welcomed!
Thanks
I'm updating our app to support Metal 4, but the Metal 4 types don't seem to be recognized when targeting the simulator. Is it known whether Metal 4 will be supported there in the near future, or am I setting up the app wrong?
Hi,
I'm using the latest iPad Pro (13-inch) and I can see that Metal offers an rgb10a2unorm texture format for rendering, but when I render a grey ramp and measure the actual luminance, I get a pattern that I would expect from an 8-bit texture (see below). Before I start ripping apart all my code, is there anything else I need to do to convince iOS to render my texture in 10 bits?
I already tried setting the pixelFormat of my CAMetalLayer to rgb10a2unorm, but that didn't change anything.
I rewrote my graphics pipeline to make better use of load and store actions for the clear and don't-care cases. All my tests pass, and in the Metal debugger all the draw calls succeed.
But when I present drawables (before [commandBuffer commit]) I only get a pink screen. I've tried everything I can think of, such as making sure the pixel formats of the back buffer and my render targets match, but it's still pink.
Could you point me in the right direction so I can fix this, or help explain why it's pink? That would be really helpful.
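Not an answer to the specific cause, but as a point of comparison, a typical load/store setup for a pass that renders straight into the drawable looks roughly like this; a sketch assuming a single color attachment, where an accidental don't-care store action on an attachment that is later presented is one common way to lose the rendered image:
import Metal
import QuartzCore

// Sketch: typical load/store setup when rendering directly into a CAMetalLayer drawable.
// `layer`, `commandQueue`, and the draw calls are assumed to exist elsewhere.
func draw(layer: CAMetalLayer, commandQueue: MTLCommandQueue) {
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = drawable.texture
    pass.colorAttachments[0].loadAction = .clear        // contents are fully overwritten each frame
    pass.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
    pass.colorAttachments[0].storeAction = .store       // must be .store for a texture that is presented

    if let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) {
        // ... draw calls ...
        encoder.endEncoding()
    }
    commandBuffer.present(drawable)
    commandBuffer.commit()
}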
Thank you,
Brian Hapgood
Topic: Graphics & Games, SubTopic: Metal