Hi all, I'm having a variety of issues with GameKit matchmaking. In the simulator, the matchmaking UI pops up and I can tap Quick Match, but it immediately fails with "Failed to find Players". This happens with both a real Apple ID and a sandbox account.
If I use real devices, the app at least discovers a match, but then none of the delegate methods for the match are ever called, and the logs are filled with "socket not connected" and various other errors.
My questions are:
Should matchmaking via Quick Match work in the simulator? I have seen tutorial videos of it working, but I can't seem to get it to work.
How do people debug issues with Game Center / GameKit to find out why it's not able to connect?
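For the second question, the most useful baseline I have so far is a delegate that logs every callback and error, to see whether things fail at connection time or during data exchange. A sketch, assuming a class that can act as the GKMatchDelegate:

import GameKit

// Logs every GKMatchDelegate callback to narrow down where the connection breaks.
final class LoggingMatchDelegate: NSObject, GKMatchDelegate {
    func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
        print("\(player.alias) changed state: \(state.rawValue); still expecting \(match.expectedPlayerCount) player(s)")
    }

    func match(_ match: GKMatch, didFailWithError error: Error?) {
        print("match failed: \(error?.localizedDescription ?? "unknown error")")
    }

    func match(_ match: GKMatch, didReceive data: Data, fromRemotePlayer player: GKPlayer) {
        print("received \(data.count) bytes from \(player.alias)")
    }
}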
Many thanks in advance
I am trying the simplest use of attachments in RealityKit and get the error: Contextual closure type '@MainActor @Sendable (inout RealityViewCameraContent) async -> Void' expects 1 argument, but 2 were used in closure body.
I also get: Cannot find 'Attachment' in scope.
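From what I can tell (my assumption, not confirmed), the two-parameter content closure only exists on the attachments-aware RealityView initializer, which appears to be visionOS-only; the error naming RealityViewCameraContent suggests an iOS/macOS target, which would also explain Attachment being out of scope. The visionOS form I'm comparing against, with a hypothetical "label" id:

import SwiftUI
import RealityKit

struct AttachmentExample: View {
    var body: some View {
        // The `content, attachments` closure is only valid on the
        // initializer that also takes the `attachments:` view builder.
        RealityView { content, attachments in
            if let label = attachments.entity(for: "label") {
                content.add(label)
            }
        } attachments: {
            Attachment(id: "label") {
                Text("Hello")
            }
        }
    }
}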
In my SceneKit game I'm able to connect two players with GKMatchmakerViewController. Now I want to support the scenario where one of them disconnects and wants to reconnect. I tried to implement this with the following code:
nonisolated public func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
    Task { @MainActor in
        switch state {
        case .connected:
            break
        case .disconnected, .unknown:
            let matchRequest = GKMatchRequest()
            matchRequest.recipients = [player]
            do {
                try await GKMatchmaker.shared().addPlayers(to: match, matchRequest: matchRequest)
            } catch {
            }
        @unknown default:
            break
        }
    }
}

nonisolated public func player(_ player: GKPlayer, didAccept invite: GKInvite) {
    guard let viewController = GKMatchmakerViewController(invite: invite) else {
        return
    }
    viewController.matchmakerDelegate = self
    present(viewController)
}
But after presenting the view controller created with GKMatchmakerViewController(invite:), nothing else happens. I would expect matchmakerViewController(_:didFind:) to be called. Otherwise, how would I get an instance of GKMatch?
Here is the code I use to reproduce the issue, followed by the reproduction steps.
Code
Run the attached project on an iPad and a Mac simultaneously.
On both devices, tap the ship to connect to Game Center.
Create an automatched match by tapping the rightmost icon on both devices.
When the two devices are matched, close the dialog on the iPad and tap the ship to disconnect from Game Center.
Wait until the Mac detects the disconnect and automatically sends an invitation to join again.
When the notification arrives on the iPad, tap it, then tap the ship to connect to Game Center again. The iPad receives the player(_:didAccept:) call, but nothing else, so there's no way to get a GKMatch instance again.
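One idea I haven't verified yet: GKMatchmaker can apparently resolve an invite into a GKMatch directly, without going through the matchmaker view controller. A sketch of what I mean, assuming self conforms to GKMatchDelegate:

import GameKit

nonisolated public func player(_ player: GKPlayer, didAccept invite: GKInvite) {
    Task { @MainActor in
        do {
            // Turn the invite into a match directly instead of presenting
            // a GKMatchmakerViewController for it.
            let match = try await GKMatchmaker.shared().match(for: invite)
            match.delegate = self
            // Continue with the restored match here.
        } catch {
            print("Failed to create match from invite: \(error)")
        }
    }
}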
Does anyone know why the following call fails?
CGPDFOperatorTableSetCallback(operatorTable, "ID", &callback);
The PDF specification seems to indicate that ID is an operator.
By the way, what is the proper topic/subtopic for questions about Quartz? I wasn't sure which topic on the new forums to post this under.
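For reference, here is a minimal Swift sketch of the registration pattern that does work for ordinary operators (using BI here). Since ID is the token that introduces the raw binary data of an inline image (BI ... ID <data> EI), my unconfirmed guess is that the scanner consumes that data internally rather than dispatching ID like a normal operator:

import CoreGraphics

// The callback is a C function pointer, so the closure cannot capture
// context; pass state through the scanner's `info` pointer instead.
let operatorTable = CGPDFOperatorTableCreate()!
CGPDFOperatorTableSetCallback(operatorTable, "BI") { scanner, info in
    print("BI encountered")
}

// Typical scanning loop for one page, assuming `page: CGPDFPage`:
// let stream = CGPDFContentStreamCreateWithPage(page)
// let scanner = CGPDFScannerCreate(stream, operatorTable, nil)
// CGPDFScannerScan(scanner)
// CGPDFScannerRelease(scanner)
// CGPDFContentStreamRelease(stream)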
I recently needed to develop an application that obtains the window list, which requires Screen Recording permission. Apple's official documentation mentions using the two functions CGPreflightScreenCaptureAccess and CGRequestScreenCaptureAccess to request the permission, and states that they have been available since macOS 10.15. However, when I used these two functions on a device running macOS 10.15.7, I encountered the errors shown in the attached screenshot. I used the nm tool to inspect the symbols in CoreGraphics.framework and found that these two functions were not present. Could you help me understand why this is happening?
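As a stopgap, I'm considering probing for the symbol at runtime instead of linking it directly, so the binary still loads on systems where it's absent. A sketch of that idea (my own workaround, not an official recommendation):

import Darwin

typealias PreflightFn = @convention(c) () -> Bool

// Look the function up dynamically rather than referencing it at link time.
let handle = dlopen(nil, RTLD_NOW)
if let sym = dlsym(handle, "CGPreflightScreenCaptureAccess") {
    let preflight = unsafeBitCast(sym, to: PreflightFn.self)
    print("Screen capture preflight:", preflight())
} else {
    print("CGPreflightScreenCaptureAccess is not available on this system")
}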
Topic: Graphics & Games, SubTopic: General
Hi,
I’m testing Unity’s Spaceship HDRP demo on iPhone 17 Pro Max and iPad Pro M4 (iOS 26.1).
Everything renders correctly, and my custom MetalFX Spatial plugin initializes successfully — it briefly reports active scaling (e.g. 1434×660 → 2868×1320 at 50% scaling), then reverts to native rendering a few frames later.
Setup:
Xcode 16.1 (targeting iOS 18)
Unity 2022.3.62f3 (HDRP)
Metal backend
Dynamic Resolution enabled in HDRP assets and cameras
Relevant Xcode console excerpt:
[MetalFXPlugin] MetalFX_Enable(True) called.
[SpaceshipOptions] MetalFX enabled with HDRP dynamic resolution integration.
[SpaceshipOptions] Disabled TAA for MetalFX Spatial.
[SpaceshipOptions] Created runtime RenderTexture: 1434x660
[MetalFX] Spatial scaler created (1434x660 → 2868x1320).
[MetalFX] Processed frame with scaler.
[MetalFXPlugin] Sent RenderTexture (1434x660) to MetalFX. Output target 2868x1320.
[SpaceshipOptions] MetalFX target set: 1434x660
[SpaceshipOptions] Camera targetTexture cleared after MetalFX handoff.
It looks like HDRP clears the camera’s target texture right after MetalFX submits the frame, which causes it to revert to native rendering.
Is there a recommended way to persist or rebind the MetalFX output texture when using HDRP on iOS?
Unity doesn’t appear to support MetalFX in the Editor either.
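For context, the native side of my plugin follows the usual MetalFX pattern. The key detail, as I read the MetalFX docs, is that the output texture must be a persistent private-storage render target that is reused every frame rather than reallocated; the sizes below are taken from the log above:

import Metal
import MetalFX

// One-time setup.
func makeSpatialScaler(device: MTLDevice) -> MTLFXSpatialScaler? {
    let desc = MTLFXSpatialScalerDescriptor()
    desc.inputWidth = 1434
    desc.inputHeight = 660
    desc.outputWidth = 2868
    desc.outputHeight = 1320
    desc.colorTextureFormat = .bgra8Unorm
    desc.outputTextureFormat = .bgra8Unorm
    desc.colorProcessingMode = .perceptual
    return desc.makeSpatialScaler(device: device)
}

// Per frame: rebind the same persistent output texture and encode.
// `persistentOutput` must use .renderTarget usage and .private storage.
func encodeUpscale(scaler: MTLFXSpatialScaler,
                   source: MTLTexture,
                   persistentOutput: MTLTexture,
                   commandBuffer: MTLCommandBuffer) {
    scaler.colorTexture = source
    scaler.outputTexture = persistentOutput
    scaler.encode(commandBuffer: commandBuffer)
}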
Thanks!
Hi Apple,
In visionOS, for real-time streaming of large 3D scenes, I plan to create Metal buffers and textures on multiple threads and then use a compute shader on the main thread to copy the Metal resources into RealityKit, minimizing main-thread usage. Given that most of RealityKit's default APIs must run on the main actor (main thread), they are not ideal for streaming data. Is this approach the best way to handle streaming data and real-time rendering?
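For what it's worth, the pattern I'm considering looks roughly like this; a sketch assuming TextureResource.DrawableQueue is the intended API for streaming texture updates, and that baseResource is already bound to a material:

import RealityKit
import Metal

// One-time setup: attach a drawable queue to an existing texture resource.
func attachStreamingQueue(to baseResource: TextureResource) throws -> TextureResource.DrawableQueue {
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm,
        width: 1024,
        height: 1024,
        usage: [.shaderRead],
        mipmapsMode: .none)
    let queue = try TextureResource.DrawableQueue(descriptor)
    baseResource.replace(withDrawables: queue)
    return queue
}

// Per frame, from a background thread: blit new data and present.
func upload(from source: MTLTexture,
            queue: TextureResource.DrawableQueue,
            commandQueue: MTLCommandQueue) throws {
    let drawable = try queue.nextDrawable()
    let commandBuffer = commandQueue.makeCommandBuffer()!
    let blit = commandBuffer.makeBlitCommandEncoder()!
    blit.copy(from: source, to: drawable.texture)
    blit.endEncoding()
    commandBuffer.commit()
    drawable.present() // hands the new frame to RealityKit
}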
Thank you very much.
How can I paste a string into the findNavigator of a TextEditor?
Topic: Graphics & Games, SubTopic: General
Hi,
How do I enable multitouch on an ARView?
The touch functions (touchesBegan, touchesMoved, ...) only seem to receive one touch at a time. To handle multiple simultaneous touches with ARView, I have to either:
Use SwiftUI .simultaneousGesture on top of an ARView representable
Position a UIView on top of ARView to capture touches and do hit testing by passing a reference to ARView
Expected behavior:
ARView should capture all touches via touchesBegan/Moved/Ended/Cancelled.
Here is what I tried, on iOS 26.1 and macOS 26.1:
ARView Multitouch
The setup below is a minimal ARView presented by SwiftUI, with touch events handled inside ARView. Multitouch doesn't work with this setup.
Note that multitouch doesn't work either if the ARView is presented from a UIViewController instead of SwiftUI.
import RealityKit
import SwiftUI

struct ARViewMultiTouchView: View {
    var body: some View {
        ZStack {
            ARViewMultiTouchRepresentable()
                .ignoresSafeArea()
        }
    }
}

#Preview {
    ARViewMultiTouchView()
}

// MARK: Representable ARView

struct ARViewMultiTouchRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARViewMultiTouch(frame: .zero)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

// MARK: ARView

class ARViewMultiTouch: ARView {
    required init(frame: CGRect) {
        super.init(frame: frame)

        /// Enable multi-touch
        isMultipleTouchEnabled = true

        cameraMode = .nonAR
        automaticallyConfigureSession = false
        environment.background = .color(.gray)

        /// Disable gesture recognizers to not conflict with touch events
        /// But it doesn't fix the issue
        gestureRecognizers?.forEach { $0.isEnabled = false }
    }

    required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            /// # Problem
            /// This should print for every new touch, up to 5 simultaneously on an iPhone (multi-touch)
            /// But it only fires for one touch at a time (single-touch)
            print("Touch began at: \(touch.location(in: self))")
        }
    }
}
Multitouch with an Overlay
This setup works, but it doesn't seem right. There must be a way to make ARView handle multi-touch directly, right?
import SwiftUI
import RealityKit

struct MultiTouchOverlayView: View {
    var body: some View {
        ZStack {
            MultiTouchOverlayRepresentable()
                .ignoresSafeArea()
            Text("Multi touch with overlay view")
                .font(.system(size: 24, weight: .medium))
                .foregroundStyle(.white)
                .offset(CGSize(width: 0, height: -150))
        }
    }
}

#Preview {
    MultiTouchOverlayView()
}

// MARK: Representable Container

struct MultiTouchOverlayRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        /// The view that SwiftUI will present
        let container = UIView()

        /// ARView
        let arView = ARView(frame: container.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        arView.cameraMode = .nonAR
        arView.automaticallyConfigureSession = false
        arView.environment.background = .color(.gray)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        /// The view that will capture touches
        let touchOverlay = TouchOverlayView(frame: container.bounds)
        touchOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        touchOverlay.backgroundColor = .clear

        /// Pass an arView reference to the overlay for hit testing
        touchOverlay.arView = arView

        /// Add views to the container.
        /// ARView goes in first, at the bottom.
        container.addSubview(arView)
        /// TouchOverlay goes in last, on top.
        container.addSubview(touchOverlay)

        return container
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

// MARK: Touch Overlay View

/// A UIView to handle multi-touch on top of ARView
class TouchOverlayView: UIView {
    weak var arView: ARView?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        isUserInteractionEnabled = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Began --- (New: \(touches.count), Total: \(totalTouches))")

        for touch in touches {
            let location = touch.location(in: self)

            /// Hit testing.
            /// ARView and Touch View must be of the same size
            if let arView = arView {
                let entity = arView.entity(at: location)
                if let entity = entity {
                    print("Touched entity: \(entity.name)")
                } else {
                    print("Touched: none")
                }
            }
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Cancelled --- (Cancelled: \(touches.count), Total: \(totalTouches))")
    }
}
I am trying to implement a CharacterControllerComponent using the following documentation:
https://developer.apple.com/documentation/realitykit/charactercontrollercomponent
I have written sample code, but PhysicsSimulationEvents.WillSimulate is never fired and nothing happens.
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    let gravity: SIMD3<Float> = [0, -50, 0]
    let jumpSpeed: Float = 10

    enum PlayerInput {
        case none, jump
    }

    @State private var testCharacter: Entity = Entity()
    @State private var myPlayerInput = PlayerInput.none

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                testCharacter = immersiveContentEntity.findEntity(named: "Capsule")!
                testCharacter.components.set(CharacterControllerComponent())

                let _ = content.subscribe(to: PhysicsSimulationEvents.WillSimulate.self, on: testCharacter) { event in
                    print("subscribe run")
                    let deltaTime: Float = Float(event.deltaTime)
                    var velocity: SIMD3<Float> = .zero
                    var isOnGround: Bool = false

                    // RealityKit automatically adds `CharacterControllerStateComponent` after moving the character for the first time.
                    if let ccState = testCharacter.components[CharacterControllerStateComponent.self] {
                        velocity = ccState.velocity
                        isOnGround = ccState.isOnGround
                    }

                    if !isOnGround {
                        // Gravity is a force, so you need to accumulate it for each frame.
                        velocity += gravity * deltaTime
                    } else if myPlayerInput == .jump {
                        // Set the character's velocity directly to launch it in the air when the player jumps.
                        velocity.y = jumpSpeed
                    }

                    testCharacter.moveCharacter(by: velocity * deltaTime, deltaTime: deltaTime, relativeTo: nil) { event in
                        print("playerEntity collided with \(event.hitEntity.name)")
                    }
                }
            }
        }
    }
}
The scene is loaded from Reality Composer Pro. It is simple: just a capsule on a pedestal.
Do I need separate code to run testCharacter from this state?
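One thing I still need to verify: content.subscribe returns an EventSubscription, and I'm discarding it with let _. If the subscription is cancelled when the returned token is released (my assumption), that alone would explain the callback never firing. Retaining it would look like:

// Keep the subscription alive for the lifetime of the view.
@State private var simulationSubscription: EventSubscription?

// Inside the RealityView make closure, store the token instead of discarding it:
// simulationSubscription = content.subscribe(
//     to: PhysicsSimulationEvents.WillSimulate.self,
//     on: testCharacter
// ) { event in
//     ...
// }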
Hello, I’m the developer of the “StepSquad” app. Our app uses the Game Center achievements feature, but we’ve been encountering a problem: the “Global Players” metric always shows 0%, even though friends have already earned these achievements. Initially, I thought it might be because the app was newly launched, but it’s now been over two months since release and it still shows 0%. If anyone has any insight into this issue, please leave a comment.
Hello,
I'm working on a game that features online multiplayer. The game is developed using Unity and the Apple Unity plugins.
The "isUnderAge" property restricts the online multiplayer feature. Everything works as expected on all platforms (Mac, iPhone, iPad, Apple TV, and Apple Vision Pro) except on Macs equipped with an Intel chip.
Using the same iCloud and Game Center accounts, with no restrictions enabled, "isUnderAge" returns false as expected, but on a Mac with an Intel chip it returns true.
Is there any restriction or compatibility issue with those chips? Is there a workaround?
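For reference, this is the minimal native check I use to compare platforms (assuming direct GameKit access outside the Unity plugins):

import GameKit

GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // isUnderage is only meaningful once authentication has completed.
    print("authenticated:", GKLocalPlayer.local.isAuthenticated)
    print("isUnderage:", GKLocalPlayer.local.isUnderage)
    if let error { print("auth error:", error.localizedDescription) }
}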
Thanks
Topic: Graphics & Games, SubTopic: GameKit
The code is pretty simple:
kernel void naive(
    constant RunParams *param [[ buffer(0) ]],
    const device float *A [[ buffer(1) ]], // [N, K]
    device float *output [[ buffer(2) ]],
    uint2 gid [[ thread_position_in_grid ]]) {
    float val = 0.0f;
    uint a_ptr = gid.x * param->K;
    for (uint i = 0; i < param->K; i++, a_ptr++) {
        val += A[a_ptr];
    }
    output[gid.x] = val;
}
When uint a_ptr = gid.x * param->K, the code gets 150 GFLOPS.
When uint a_ptr = gid.y * param->K, the code gets 860 GFLOPS.
param->K = 256; threads per threadgroup: [16, 16].
I'd like to understand why the performance is so different, and how I can profile/diagnose this to guide further optimization.
Topic: Graphics & Games, SubTopic: Metal
Hi Apple & devs,
I'm trying to test various Windows .exe files using the Game Porting Toolkit (GPTK), but I’m hitting a wall: no matter what .exe I try, the command returns instantly with no output — no error, no logs, nothing.
Here's what I'm doing:
I'm using macOS Sequoia 15.5 on an M1 MacBook Pro.
I installed GPTK 2.1 through Homebrew from gcenx:
brew install gcenx/wine/game-porting-toolkit
When I run any .exe using GPTK's wine64, for example the Steam installer:
user@JMacBook-Pro / % WINEPREFIX=~/wine_prefix /usr/local/bin/gameportingtoolkit 'C:\SteamSetup.exe' --verbose
user@JMacBook-Pro / %
It exits immediately, with no return code, no output, and no errors.
No output, no crash, no logs. The same happens with simple test apps, and with WINEDEBUG=+all set (still nothing).
Even running wine64 directly does the same thing.
I’ve tried:
Removing and reinstalling GPTK
Creating a fresh WINEPREFIX
Checking /tmp and ~/Library/Logs for logs — nothing
Has anyone else experienced this or have any idea how to debug it?
Is there ANY Apple support for this??
Thanks in advance.
What evidence exists that it's safe to call nextDrawable() on CAMetalLayer off the main thread? I have seen developers claiming that it's OK, but the official docs are silent on the topic. Attempting to do so with Strict Concurrency Checking set to Complete complains that CAMetalLayer is not @Sendable.
I want to call it off the main thread since there doesn't seem to be any way to prevent it from blocking the UI for up to a second. I have read hints and allegations that this won't happen if you avoid asking for too many drawables, but that doesn't seem to be true 100% of the time in my experience.
Supposing it is allowed, I wonder how races are handled, such as when the layer's size is changed on the main thread, or when the layer is removed from the layer hierarchy.
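Not an answer to the safety question, but for the stall itself two layer properties seem relevant. This is the main-thread mitigation I'd try first (my reading of the docs, not a guarantee):

import QuartzCore

func configure(layer: CAMetalLayer) {
    // 3 (the default) gives the most headroom before nextDrawable() blocks;
    // 2 lowers latency but makes blocking more likely.
    layer.maximumDrawableCount = 3
    // true (the default) caps the wait at roughly one second and returns
    // nil; false blocks indefinitely until a drawable frees up.
    layer.allowsNextDrawableTimeout = true
}

// Also: acquire the drawable as late as possible; encode everything you
// can into the command buffer first, and only then call nextDrawable().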
Topic: Graphics & Games, SubTopic: Metal
Hi,
When analyzing our game in Instruments, I've always been confused by the two items "Drawable Present" and "Drawable Presented" in the GPU track. The timing of "Drawable Present" seems to correspond to when the CPU calls present on the command buffer, rather than to when the actual encoding is completed on the GPU. Also, what does "Drawable Presented" specifically mean? In our case, when a CPU stall occurs, the vsync interval appears to change in the next frame, and a surface that has already been rendered is not displayed. Why is this happening?
Hi everyone,
I'm using the Vision framework’s ImageAestheticsScoresObservation class (https://developer.apple.com/documentation/vision/imageaestheticsscoresobservation).
I noticed that the returned overallScore sometimes has negative values. Could someone confirm whether the expected range of the score is -1.0 to 1.0?
The documentation doesn’t explicitly state the possible score range, so I’d appreciate any clarification or insights.
Thanks in advance!
How can one match the walls and floor of a given CapturedRoom?
The z and y transform.eulerAngles of a floor are always 0!
And the polygons seem to have a different orientation than the walls.
So how do I figure out the rotation and match it to the walls?
Hi!
Using ARView in UIKit or through a UIViewRepresentable in SwiftUI, we can do:
arView.debugOptions = [.showPhysics, .showStatistics]
What is the equivalent in RealityView?
Topic: Graphics & Games, SubTopic: RealityKit
Hi,
I can't see RealityKit statistics on Xcode Canvas using:
arView.debugOptions = [.showStatistics]
The statistics only show on a physical device, not in the Xcode live canvas with #Preview. Testing in Xcode 26.0.1 (17A400) on Tahoe 26.0.1 (25A362).
Use case: I'm using RealityKit as a non-AR 3D engine, and the Xcode canvas is useful for fast iteration.
Is this expected behavior? How can I see FPS on the Xcode canvas? SKView, for example, shows all debug options both on the Xcode canvas and on physical devices.
Topic: Graphics & Games, SubTopic: RealityKit