Hi all,
Wondering how I would go about creating a plugin/class to support a new (physical/hardware) device with the Game Controller framework?
Between GCVirtualController on iOS and the "KeyboardAndMouseSupport.bundle" I see inside GameController.framework on my Mac, it looks like the framework must be designed to support this, but I can't find any documentation.
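For reference, here is a minimal sketch of the virtual-controller setup I mentioned (GCVirtualController is the documented, on-screen case); I'm hoping something analogous exists for publishing a real hardware device to the framework:
import GameController

let configuration = GCVirtualController.Configuration()
configuration.elements = [GCInputLeftThumbstick, GCInputButtonA, GCInputButtonB]

let virtualController = GCVirtualController(configuration: configuration)
virtualController.connect { error in
    if let error {
        print("Virtual controller failed to connect: \(error)")
    }
}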
Thanks!
On a MacBook Pro M3 14" I can profile the Metal app's performance by running it, then clicking the M (Metal) icon and choosing Profile after Replay.
On a Mac Studio M2 Ultra I cannot: the profiler starts and then crashes. I have tried everything, including reinstalling the OS, Xcode, and the Metal SDK, you name it.
The app uses the Metal 4 API. The content of the replayer error-info report is shown at the end.
Any ideas what is going on here and/or what else I can do to root-cause this and fix it?
FWIW, it was worse on 26.1 (Xcode just reported that Metal 4 profiling was not available). On 26.2 Xcode attempts to profile and invariably crashes.
=== Error summary: ===
1x DYErrorDomain (512) - guest app crashed (512)
1x com.apple.gputools.MTLReplayer (100) - Abort trap: 6
=== First Error ===
Domain: DYErrorDomain
Error code: 512
Description: guest app crashed (512)
GTErrorKeyPID: 26913
GTErrorKeyProcessName: GPUToolsReplayService
GTErrorKeyCrashDate: 2026-01-09 19:22:52 +0000
=== Underlying Error #1 ===
Domain: com.apple.gputools.MTLReplayer
Error code: 100
Description: Abort trap: 6
Call stack:
0 GPUToolsReplay 0x0000000249c25850 MakeNSError + 284
1 GPUToolsReplay 0x0000000249c26428 HandleCrashSignal + 252
2 libsystem_platform.dylib 0x00000001856c7744 _sigtramp + 56
3 libsystem_pthread.dylib 0x00000001856bd888 pthread_kill + 296
4 libsystem_c.dylib 0x00000001855c2850 abort + 124
5 libsystem_c.dylib 0x00000001855c1a84 err + 0
6 IOGPU 0x00000001a9ea60a8 -[IOGPUMetal4CommandQueue _commit:count:commitFeedback:].cold.1 + 0
7 IOGPU 0x00000001a9ea0df8 __77-[IOGPUMetal4CommandQueue commitFillArgs:count:args:argsSize:commitFeedback:]_block_invoke + 0
8 IOGPU 0x00000001a9ea1004 -[IOGPUMetal4CommandQueue _commit:count:commitFeedback:] + 148
9 AGXMetalG14X 0x00000001158a2c98 -[AGXG14XFamilyCommandQueue_mtlnext noMergeCommit:count:options:commitFeedback:error:] + 116
10 AGXMetalG14X 0x0000000115a45c14 +[AGXG14XFamilyRenderContext_mtlnext mergeRenderEncoders:count:options:commitFeedback:queue:error:] + 4740
11 AGXMetalG14X 0x00000001158a2b34 -[AGXG14XFamilyCommandQueue_mtlnext commit:count:options:] + 96
12 GPUToolsReplay 0x0000000249bf0644 GTMTLReplayController_defaultDispatchFunction_noPinning + 2744
13 GPUToolsReplay 0x0000000249befb10 GTMTLReplayController_defaultDispatchFunction + 1368
14 GPUToolsReplay 0x0000000249b7a61c _ZL16DispatchFunctionP21GTMTLReplayControllerPK11GTTraceFuncRb + 476
15 GPUToolsReplay 0x0000000249b8603c ___ZN35GTUSCSamplingStreamingManagerHelper19StreamFrameTimeDataEv_block_invoke + 456
16 Foundation 0x0000000186f6c878 __NSBLOCKOPERATION_IS_CALLING_OUT_TO_A_BLOCK__ + 24
17 Foundation 0x0000000186f6c740 -[NSBlockOperation main] + 96
18 Foundation 0x0000000186f6c6d8 __NSOPERATION_IS_INVOKING_MAIN__ + 16
19 Foundation 0x0000000186f6c308 -[NSOperation start] + 640
20 Foundation 0x0000000186f6c080 __NSOPERATIONQUEUE_IS_STARTING_AN_OPERATION__ + 16
21 Foundation 0x0000000186f6bf70 __NSOQSchedule_f + 164
22 libdispatch.dylib 0x00000001855104d0 _dispatch_block_async_invoke2 + 148
23 libdispatch.dylib 0x000000018551aad4 _dispatch_client_callout + 16
24 libdispatch.dylib 0x00000001855056e4 _dispatch_continuation_pop + 596
25 libdispatch.dylib 0x0000000185504d58 _dispatch_async_redirect_invoke + 580
26 libdispatch.dylib 0x0000000185512fc8 _dispatch_root_queue_drain + 364
27 libdispatch.dylib 0x0000000185513784 _dispatch_worker_thread2 + 180
28 libsystem_pthread.dylib 0x00000001856b9e10 _pthread_wqthread + 232
29 libsystem_pthread.dylib 0x00000001856b8b9c start_wqthread + 8
Replayer breadcrumbs:
[
]
GTErrorKeyProcessSignal: SIGABRT
=== Setup ===
Capture device: star.localdomain (Mac14,14) - macOS 26.2 (25C56) - 0BA10D1D-D340-5F2E-934B-536675AF9BA1
Metal version: 370.64.2
Supported graphics APIs:
Metal device: Apple M2 Ultra
Supported GPU families: Apple1 Apple2 Apple3 Apple4 Apple5 Apple6 Apple7 Apple8 Mac1 Mac2 Common1 Common2 Common3 Metal3 Metal4
Replay device: star (Mac14,14) - macOS 26.2 (25C56) - 0BA10D1D-D340-5F2E-934B-536675AF9BA1
Metal version: 370.64.2
Supported graphics APIs:
Metal device: Apple M2 Ultra
Supported GPU families: Apple1 Apple2 Apple3 Apple4 Apple5 Apple6 Apple7 Apple8 Mac1 Mac2 Common1 Common2 Common3 Metal3 Metal4
Host: Mac14,14 - macOS 26.2 (25C56)
Tool: Xcode (17C52)
Known SDKs:
Topic:
Graphics & Games
SubTopic:
Metal
My goal is to print a debug message from a shader. I'm following the guide, which says to set the -fmetal-enable-logging Metal compiler flag and the following environment variables:
MTL_LOG_LEVEL=MTLLogLevelDebug
MTL_LOG_BUFFER_SIZE=2048
MTL_LOG_TO_STDERR=1
However, there's an issue with the guide: it only covers Xcode project setup, whereas I'm working on a Swift Package. It has a Metal-only target that's included in the main target like this:
targets: [
    // A separate target for shaders.
    .target(
        name: "MetalShaders",
        resources: [
            .process("Metal")
        ],
        plugins: [
            // https://github.com/schwa/MetalCompilerPlugin
            .plugin(name: "MetalCompilerPlugin", package: "MetalCompilerPlugin")
        ]
    ),
    // Main target
    .target(
        name: "MegApp",
        dependencies: ["MetalShaders"]
    ),
    .testTarget(
        name: "MegAppTests",
        dependencies: [
            "MegApp",
            "MetalShaders",
        ]
    )
]
So to apply the compiler flag I use MetalCompilerPlugin, which emits debug.metallib; it also allows defining a DEBUG macro for shaders. This code compiles:
#ifdef DEBUG
logger.log_error("Hello There!");
os_log_default.log_debug("Hello thread: %d", gid);
// this proves that the code executes
result.flag = true;
#endif
The environment is set via the .xctestplan and validated with ProcessInfo. However, nothing is printed to the Xcode console nor to the Console app.
In an attempt to fix it I'm trying to set up an MTLLogState; however, makeLogState(descriptor:) fails with this error:
if #available(iOS 18.0, *) {
    let logDescriptor = MTLLogStateDescriptor()
    logDescriptor.level = .debug
    logDescriptor.bufferSize = 2048
    // Error Domain=MTLLogStateErrorDomain Code=2 "Cannot create residency set for MTLLogState: (null)" UserInfo={NSLocalizedDescription=Cannot create residency set for MTLLogState: (null)}
    let logState = try! device.makeLogState(descriptor: logDescriptor)
    commandBufferDescriptor.logState = logState
}
Some LLMs suggested that this is connected with the Simulator, and indeed I was running the tests on the Simulator. However, the tests don't want to run on my iPhone... I found a workaround by running them on My Mac (Mac Catalyst). Surprisingly, shader logging works there, even without an MTLLogState. But the Simulator behaviour seems like a bug...
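For completeness, here is the route I'm experimenting with (a sketch only; device and commandBufferDescriptor are assumed to exist, and this obviously depends on makeLogState succeeding). Attaching a log handler to the MTLLogState should deliver shader messages directly to the app, independent of the stderr/Console routing:
if #available(iOS 18.0, macOS 15.0, *) {
    let logDescriptor = MTLLogStateDescriptor()
    logDescriptor.level = .debug
    logDescriptor.bufferSize = 2048
    do {
        let logState = try device.makeLogState(descriptor: logDescriptor)
        // Receive shader log messages in-process instead of relying on stderr.
        logState.addLogHandler { _, _, _, message in
            print("[shader log] \(message)")
        }
        commandBufferDescriptor.logState = logState
    } catch {
        print("makeLogState failed: \(error)")
    }
}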
Issue
When an Entity with a ViewAttachmentComponent is:
disabled using isEnabled = false
removed using removeFromParent()
and then enabled or added back again, the attached SwiftUI view is rendered correctly, but tap interactions stop working.
Specifically:
Button actions inside the attached view do not fire
TapGesture closures on child views do not respond
Expected Behavior
Tap interactions inside the attached view should continue to work after the Entity is re-enabled or re-added.
Actual Behavior
After being disabled or removed once, all tap interactions stop responding.
Comparison
When displaying the same SwiftUI view using RealityViewAttachments, this issue does not occur.
Removing and re-displaying the attachment still allows taps to work correctly.
Reproduction
Attached sample code reproduces the issue:
A RealityView with an Entity that has a ViewAttachmentComponent
The attached SwiftUI view contains a Toggle
The toggle updates isEnabled on the Entity
After toggling off and on, tap interactions stop responding
Environment
Xcode 26
visionOS 26
Question
Is this expected behavior of ViewAttachmentComponent, or a bug?
Is there a recommended way to temporarily hide or disable an Entity with ViewAttachmentComponent without breaking tap interactions?
import SwiftUI
import RealityKit

struct GestureTestView: View {
    @State var sampleEnabled = true
    @State var sampleEntity: Entity?

    var body: some View {
        RealityView { contents, attachments in
            // After deleting and re-displaying it, taps no longer respond.
            let sample = Entity(components: ViewAttachmentComponent(rootView: SampleView()))
            // Executed successfully
            //let sample = attachments.entity(for: "SampleView")!
            contents.add(sample)
            sample.position = [0, 1.2, -1]
            sampleEntity = sample

            let toggleButton = Entity(components: ViewAttachmentComponent(rootView: ToggleButtonView(isOn: $sampleEnabled)))
            contents.add(toggleButton)
            toggleButton.position = [0, 1, -1]
        } update: { _, _ in
            // run update closure
            print(sampleEnabled)
            // update sample entity enable
            sampleEntity?.isEnabled = sampleEnabled
        } attachments: {
            Attachment(id: "SampleView") {
                SampleView()
            }
        }
    }
}

struct ToggleButtonView: View {
    @Binding var isOn: Bool

    var body: some View {
        VStack {
            Toggle(isOn: $isOn) {
                Text("Toggle")
            }
        }
        .padding()
        .glassBackgroundEffect()
    }
}

struct SampleView: View {
    var body: some View {
        VStack {
            Button {
                print("Hello, World!")
            } label: {
                Text("Hello, World!")
                    .padding()
            }
        }
        .padding()
        .glassBackgroundEffect()
    }
}

#Preview(immersionStyle: .mixed) {
    GestureTestView()
}
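One idea I haven't verified (an assumption on my part, not a confirmed workaround): instead of toggling isEnabled, fade the entity out with an OpacityComponent so the attachment's view hierarchy stays active. In the sample above, the update closure would do something like:
// Hide via opacity instead of isEnabled (untested whether this preserves taps).
sampleEntity?.components.set(OpacityComponent(opacity: sampleEnabled ? 1 : 0))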
I'm developing a turn-based game. When I present the GKTurnBasedMatchmakerViewController, players can opt in for automatch instead of selecting a specific friend as their opponent.
How exactly does the matching work if a player doesn't specify anything explicitly?
Does Game Center send push notifications in a round-robin fashion to all friends, with the first one to accept then matched as the opponent? Is this documented somewhere?
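For reference, this is roughly how I present the matchmaker (a minimal sketch; self is assumed to be the presenting view controller and the GKTurnBasedMatchmakerViewControllerDelegate):
import GameKit

let request = GKMatchRequest()
request.minPlayers = 2
request.maxPlayers = 2

// Players who don't pick a specific friend here fall into automatch.
let matchmakerVC = GKTurnBasedMatchmakerViewController(matchRequest: request)
matchmakerVC.turnBasedMatchmakerDelegate = self
present(matchmakerVC, animated: true)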
I noticed that MTLPixelFormat has these cases:
case r32Float = 55
case rg32Float = 105
case rgba32Float = 125
But there is no case rgb32Float. What's the reason for such an omission?
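Not an answer to the "why" (presumably 96-bit texel formats just aren't broadly supported by GPU hardware), but the workaround I'd expect is to pad the data to four channels and use rgba32Float, ignoring or zero-filling alpha. A sketch, assuming an existing MTLDevice named device:
import Metal

let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba32Float,   // no rgb32Float, so use the 4-channel format
    width: 512,
    height: 512,
    mipmapped: false
)
descriptor.usage = [.shaderRead, .shaderWrite]
let texture = device.makeTexture(descriptor: descriptor)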
I'm developing a game that supports GameKit turn-based matches. What I don't understand is this:
Is tapping on the Game Center push notification the only way for the GKTurnBasedEventListener to trigger? What if someone misses the push message (swiping it away by accident or something like that) but still wants to join? Is there some inbox somewhere where the pending matches can be seen or fetched?
Also, it was mentioned in a very old WWDC video (from 2013; I think that's the latest one with information about turn-based matches) that the notification also includes a badge for the app icon. However, I do not understand how to implement that. Is there any documentation for it?
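As a fallback I'm considering fetching the pending matches explicitly rather than relying on the push notification (a sketch, assuming the local player is already authenticated):
import GameKit

GKTurnBasedMatch.loadMatches { matches, error in
    guard let matches else {
        print("Failed to load matches: \(String(describing: error))")
        return
    }
    // Matches where it's currently the local player's turn.
    let waitingOnMe = matches.filter {
        $0.currentParticipant?.player?.gamePlayerID == GKLocalPlayer.local.gamePlayerID
    }
    print("Matches waiting on the local player: \(waitingOnMe.count)")
}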
Hi,
I would like clarification on whether the new hover-effects feature introduced in visionOS 26 supports pinch gestures through the PSVR2 controllers.
In your sample application, I was not able to confirm that this works. Only pinch-clicking with my hands worked. Pulling the trigger on the controller while looking at a 3D object did not activate the hover effect's spatial event in the sample application. (The object does show the highlight, though.)
This is inconsistent with the hover-effect behavior with PSVR2 controllers on SwiftUI views, where the trigger press does count as a button click.
The sample I used was this one:
https://developer.apple.com/documentation/compositorservices/rendering_hover_effects_in_metal_immersive_apps
In my game 854159268 (com.1791entertainment.qugame), in my quMostRecent3 leaderboard, the top 2 entries have 'vanished'. They were there yesterday. I know these players have played today, as I see their scores on other leaderboards.
Any ideas how to get these back?
These 2 players (me and my tester) are both testing via TestFlight; not sure if that changes things.
Topic:
Graphics & Games
SubTopic:
GameKit
I was wondering if there's a way on macOS to have my application hide a HID device, such as a game controller, and instead have the receiving game/application see my app's virtual controller. Is this possible via DriverKit or some other form of kernel-level coding?
On Windows we have a tool known as HidHide that hides a game controller from all other applications. Is it possible to implement such behavior in an app, or is that system-level?
Hey, I'm using the CIDepthBlurEffect Core Image filter in my app. It seems to work OK, but I get these errors in the console when calling it:
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_scan
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_diffuse
CoreImage Metal library does not contain function for name: sparserendering_xhlrb_copy_back
CoreImage Metal library does not contain function for name: plain_or_sRGB_copy
Am I missing some sort of import to get these Metal functions? I am using my own custom shaders, but I assume you'd be able to use them alongside the built-in ones.
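For context, this is roughly how I'm driving the filter (a sketch; the CIImage inputs are assumed to come from a portrait photo and its auxiliary disparity data), and the console messages appear when it runs:
import CoreImage
import CoreImage.CIFilterBuiltins

func depthBlurred(photo: CIImage, disparity: CIImage) -> CIImage? {
    let filter = CIFilter.depthBlurEffect()
    filter.inputImage = photo
    filter.disparityImage = disparity
    filter.aperture = 4.0   // simulated f-stop
    return filter.outputImage
}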
I'm updating an existing distributed game to add turn-based matches. When the Matchmaker ViewController Info Button next to a game is pressed, the results vary:
iOS 15.x - Button under avatar says "Accept Invite" or "View Game" (depending on if invite has already been accepted)
iOS 18.x - Button always says "App Store" - I assume that means it would lead one to the App Store to install the game.
Both devices (iPad 15.x and iPhone 18.x) have the same version of the game installed. The results are the same when running in the simulator.
When the game is released, I assume this button will work properly, no?
Topic:
Graphics & Games
SubTopic:
GameKit
Was wondering if anyone from Apple could provide some clarification. The gaming studio Epic Games is wondering whether it could distribute the award-winning game Fortnite on macOS again without any retaliation.
I know Fortnite being back on macOS would benefit thousands of macOS devs. Hoping to get a clarification so Epic could start bringing Fortnite back.
Turn-based games: 2 players.
When an opponent declines a game in the Game Center matchmaker view controller, that player sees that they quit, but no message is sent to the listener about that fact. For the person who started the match, their matchmaker VC shows it's their turn again. Why doesn't Game Center end the match?
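A sketch of the kind of check I'd expect to need on the initiator's side as a workaround (an assumption, not confirmed guidance): when the turn comes back to me, detect that every other participant has declined and end the match myself.
import GameKit

func handleTurn(for match: GKTurnBasedMatch) {
    let others = match.participants.filter {
        $0.player?.gamePlayerID != GKLocalPlayer.local.gamePlayerID
    }
    guard !others.isEmpty, others.allSatisfy({ $0.status == .declined }) else { return }

    // Outcomes must be set for every participant before ending the match.
    for participant in match.participants {
        if participant.matchOutcome == .none {
            participant.matchOutcome = participant.status == .declined ? .quit : .won
        }
    }
    match.endMatchInTurn(withMatch: match.matchData ?? Data()) { error in
        if let error {
            print("Failed to end match: \(error)")
        }
    }
}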
Topic:
Graphics & Games
SubTopic:
General
I've tried out a ParticleEmitter in Reality Composer Pro to produce a burst of particles that don't move (i.e. speed close to zero).
When viewing from different angles, it clearly looks like the particles are rendered exactly in the wrong order, that is, front first and back last. In other words, back particles obscure front particles.
I would prefer it the correct way around.
I've only tried this interactively in Reality Composer Pro, not programmatically, but I assume I would get the same result.
My Reality Composer Pro "File" (zipped):
https://gert-rieger-edv.de/Posts/Post-1/RealityParticles.zip
Screenshot:
Click on the ParticleEmitter object, then on its Play button, then select the Particles tab and click on "Burst" a few times to get a few random particles.
Mac Studio 2025
Apple M4 Max
macOS 15.7.2 (24G325)
Reality Composer Pro
Version 2.0 (494.60.2)
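Programmatically, the direction I'd try (a sketch; the sortOrder property and case names are an assumption on my part that I haven't verified against the current RealityKit docs) is to ask the emitter to depth-sort its particles:
import RealityKit

var emitter = ParticleEmitterComponent()
// Pick whichever ordering gives back-to-front drawing, so near particles
// are composited over far ones.
emitter.mainEmitter.sortOrder = .decreasingDepth
particleEntity.components.set(emitter)   // particleEntity: an existing Entity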
Hi, I'm trying to set the displayScale environment value for a RealityView, so it renders at 2x instead of 3x on the iPhone, but it seems to have no effect.
.environment(\.displayScale, 2.0)
Is this expected behavior, or a bug?
The reason I want it to render at 2x and not at the default 3x is for game optimization and performance.
I do not understand how offline leaderboard submission is supposed to work in GameKit:
While the documentation briefly states that offline submission is supported, how is that even possible when you first have to fetch a leaderboard object in order to then call its submitScore function? How can I get the leaderboard object in the first place when offline?
Can anyone enlighten me how this works? Or maybe point me to some relevant documentation?
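For what it's worth, there is also a class-level submission API that takes leaderboard IDs directly and therefore doesn't require loading a GKLeaderboard object first (a sketch; whether GameKit queues this while offline is exactly what I'm asking about):
import GameKit

GKLeaderboard.submitScore(
    1234,                                   // example score
    context: 0,
    player: GKLocalPlayer.local,
    leaderboardIDs: ["my.leaderboard.id"]   // placeholder ID
) { error in
    if let error {
        print("Score submission failed: \(error)")
    }
}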
After authenticating the user I'm loading my Game Center leaderboards like this:
let leaderboards = try await GKLeaderboard.loadLeaderboards(IDs: [leaderboardID])
This is working fine, but there are times when this just returns an empty array. When I encounter this situation, the array remains empty for several hours when retrying, but then at some point it suddenly starts working again.
Is this a known issue? Or am I maybe hitting some kind of quota (as I do this quite often while developing my game)?
Edit: My leaderboards are grouped in sets if that makes any difference here.
Problem Summary
After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.
Environment Details
Operating System: iOS 26.1 & iOS 26.2
Framework: RealityKit
Xcode Version: 16.2 (16C5032a)
Expected vs. Actual Behavior
Expected: Particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
Actual: Particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.
Steps to Reproduce
Create or open an AR application with RealityKit that uses particle components
Attach a ParticleEmitterComponent to an entity via a custom system
Run the application on iOS 26.1 or iOS 26.2
Observe that particles render at an offset position away from the entity
Minimal Code Example
Here's the setup from my test case:
Custom Component & System:
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
AR Setup:
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"
let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
Questions for the Community
Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?
Additional Information
I've already submitted this issue via Feedback Assistant (FB21346746).