Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic


ARView ignores multi-touch events
Hi,

How do I enable multi-touch on ARView? Touch functions (touchesBegan, touchesMoved, ...) seem to only handle one touch at a time. To handle multiple simultaneous touches with ARView, I currently have to either:

- Use SwiftUI .simultaneousGesture on top of an ARView representable, or
- Position a UIView on top of ARView to capture touches and do hit testing by passing a reference to ARView.

Expected behavior: ARView should capture all touches via touchesBegan/Moved/Ended/Cancelled.

Here is what I tried, on iOS 26.1 and macOS 26.1.

ARView Multitouch

The setup below is a minimal ARView presented by SwiftUI, with touch events handled inside ARView. Multi-touch doesn't work with this setup. Note that multi-touch doesn't work either if the ARView is presented from a UIViewController instead of SwiftUI.

import RealityKit
import SwiftUI

struct ARViewMultiTouchView: View {
    var body: some View {
        ZStack {
            ARViewMultiTouchRepresentable()
                .ignoresSafeArea()
        }
    }
}

#Preview {
    ARViewMultiTouchView()
}

// MARK: Representable ARView

struct ARViewMultiTouchRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARViewMultiTouch(frame: .zero)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

// MARK: ARView

class ARViewMultiTouch: ARView {
    required init(frame: CGRect) {
        super.init(frame: frame)

        /// Enable multi-touch
        isMultipleTouchEnabled = true

        cameraMode = .nonAR
        automaticallyConfigureSession = false
        environment.background = .color(.gray)

        /// Disable gesture recognizers so they don't conflict with touch events
        /// (this doesn't fix the issue)
        gestureRecognizers?.forEach { $0.isEnabled = false }
    }

    required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            /// # Problem
            /// This should print for every new touch, up to 5 simultaneously
            /// on an iPhone (multi-touch), but it only fires for one touch
            /// at a time (single-touch).
            print("Touch began at: \(touch.location(in: self))")
        }
    }
}

Multitouch with an Overlay

This setup works, but it doesn't seem right. There must be a way to make ARView handle multi-touch directly, right?
import SwiftUI
import RealityKit

struct MultiTouchOverlayView: View {
    var body: some View {
        ZStack {
            MultiTouchOverlayRepresentable()
                .ignoresSafeArea()
            Text("Multi touch with overlay view")
                .font(.system(size: 24, weight: .medium))
                .foregroundStyle(.white)
                .offset(CGSize(width: 0, height: -150))
        }
    }
}

#Preview {
    MultiTouchOverlayView()
}

// MARK: Representable Container

struct MultiTouchOverlayRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        /// The view that SwiftUI will present
        let container = UIView()

        /// ARView
        let arView = ARView(frame: container.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        arView.cameraMode = .nonAR
        arView.automaticallyConfigureSession = false
        arView.environment.background = .color(.gray)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        /// The view that will capture touches
        let touchOverlay = TouchOverlayView(frame: container.bounds)
        touchOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        touchOverlay.backgroundColor = .clear

        /// Pass an arView reference to the overlay for hit testing
        touchOverlay.arView = arView

        /// Add views to the container.
        /// ARView goes in first, at the bottom.
        container.addSubview(arView)
        /// TouchOverlay goes in last, on top.
        container.addSubview(touchOverlay)

        return container
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

// MARK: Touch Overlay View

/// A UIView to handle multi-touch on top of ARView
class TouchOverlayView: UIView {
    weak var arView: ARView?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        isUserInteractionEnabled = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Began --- (New: \(touches.count), Total: \(totalTouches))")

        for touch in touches {
            let location = touch.location(in: self)

            /// Hit testing.
            /// ARView and the touch view must be the same size.
            if let arView = arView {
                let entity = arView.entity(at: location)
                if let entity = entity {
                    print("Touched entity: \(entity.name)")
                } else {
                    print("Touched: none")
                }
            }
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Cancelled --- (Cancelled: \(touches.count), Total: \(totalTouches))")
    }
}
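For reference, the first workaround mentioned above (a SwiftUI gesture layered over the representable) can be sketched with SpatialEventGesture, which reports a collection of concurrent touches. This is an illustrative sketch under the assumption that SpatialEventGesture fits your deployment target (recent iOS releases); it is not a confirmed fix for ARView itself.

import SwiftUI
import RealityKit

/// Illustrative sketch: SpatialEventGesture delivers concurrent touch
/// events above the ARView representable, without subclassing ARView.
struct ARViewSpatialTouchView: View {
    var body: some View {
        ARViewMultiTouchRepresentable()
            .ignoresSafeArea()
            .gesture(
                SpatialEventGesture()
                    .onChanged { events in
                        for event in events {
                            print("Touch \(event.id) at \(event.location)")
                        }
                    }
            )
    }
}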
Replies: 1 · Boosts: 0 · Views: 262 · Created: Nov ’25
MPSMatrixRandom SEGFAULTs when run in an async context
The following minimal snippet SEGFAULTs with SDK 26.0 and 26.1. It won't crash if I remove async from the enclosing function signature, but that's impractical in a real project.

import Metal
import MetalPerformanceShaders

let SEED = UInt64(0x0)
typealias T = Float16

/* Why run in an async context? Because of a global GPU object, async
   makeMTLFunction, and async makeMTLComputePipelineState. Nevertheless,
   the bug can be triggered without using a global:
   @MainActor let myGPU = MyGPU()
*/

@main
struct CMDLine {
    static func main() async {
        let ptr = UnsafeMutablePointer<T>.allocate(capacity: 0)
        async let future: Void = randomFillOnGPU(ptr, count: 0)
        print("Main thread is playing around")
        await future
        print("Successfully reached the end.")
    }

    static func randomFillOnGPU(_ buf: UnsafeMutablePointer<T>, count destbufcount: Int) async {
        // let (device, queue) = await (myGPU.device, myGPU.commandqueue)
        let myGPU = MyGPU()
        let (device, queue) = (myGPU.device, myGPU.commandqueue)

        // Init MTLBuffer, async let makeFunction, makeComputePipelineState, etc.
        let tempDataType = MPSDataType.uInt32
        let randfiller = MPSMatrixRandomMTGP32(device: device,
                                               destinationDataType: tempDataType,
                                               seed: Int(bitPattern: UInt(SEED)))
        print("randomFillOnGPU: successfully created MPSMatrixRandom.")

        // try await computePipelineState
        // ^ Crashes before this could return,
        //   or in this minimal case, after randomFillOnGPU() returns.

        // make encoder, set pso, dispatch, commit...
    }
}

actor MyGPU {
    let device: MTLDevice
    let commandqueue: MTLCommandQueue

    init() {
        guard let dev: MTLDevice = MPSGetPreferredDevice(.skipRemovable),
              let cq = dev.makeCommandQueue(),
              dev.supportsFamily(.apple6) || dev.supportsFamily(.mac2)
        else {
            print("Unable to get Metal Device! Exiting")
            exit(EX_UNAVAILABLE)
        }
        print("Selected device: \(String(format: "%llX", dev.registryID))")
        self.device = dev
        self.commandqueue = cq
        print("myGPU: initialization complete.")
    }
}

See FB20916929. Apparently the Objective-C autorelease pool releases the wrong address during a context switch (across suspension points). I wonder why such an obvious case hasn't been caught before.
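A speculative workaround, given the autorelease theory above: construct the MPS object inside a synchronous helper with its own explicit autoreleasepool, so no suspension point can occur while the pool is live. The helper name is illustrative and this is unverified.

import Metal
import MetalPerformanceShaders

/// Speculative workaround (unverified): keep MPS object creation in a
/// synchronous frame with an explicit autorelease pool so no await can
/// interleave with pool teardown.
func makeRandomFiller(device: MTLDevice, seed: UInt64) -> MPSMatrixRandomMTGP32 {
    autoreleasepool {
        MPSMatrixRandomMTGP32(device: device,
                              destinationDataType: .uInt32,
                              seed: Int(bitPattern: UInt(seed)))
    }
}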
Replies: 0 · Boosts: 0 · Views: 73 · Created: Nov ’25
ScreenCaptureKit recording output is corrupted when captureMicrophone is true
Hello everyone, I'm working on a screen recording app using ScreenCaptureKit and I've hit a strange issue. My app records the screen to an .mp4 file, and everything works perfectly as long as .captureMicrophone is false; in that case I get a valid, playable .mp4 file. However, as soon as I enable the microphone by setting streamConfig.captureMicrophone = true, the recording seems to work, but the final .mp4 file is corrupted and cannot be played by QuickTime or any other player. This happens whether capturesAudio (app audio) is on or off.

I've already added the "Privacy - Microphone Usage Description" (NSMicrophoneUsageDescription) to my Info.plist, so I don't think it's a permissions problem.

I have my logic split into a ScreenRecorder class that manages state and a CaptureEngine that handles the SCStream. Here is how I'm configuring my SCStream:

ScreenRecorder.swift

// This is my main SCStreamConfiguration
private var streamConfiguration: SCStreamConfiguration {
    let streamConfig = SCStreamConfiguration()

    // ... other HDR/preset config ...

    // These are the problem properties
    streamConfig.capturesAudio = isAudioCaptureEnabled
    streamConfig.captureMicrophone = isMicCaptureEnabled // breaks it if true

    streamConfig.excludesCurrentProcessAudio = false
    streamConfig.showsCursor = false

    if let region = selectedRegion, let display = currentDisplay {
        // My region/frame logic (works fine)
        let regionWidth = Int(region.frame.width)
        let regionHeight = Int(region.frame.height)
        streamConfig.width = regionWidth * scaleFactor
        streamConfig.height = regionHeight * scaleFactor
        // ... (sourceRect logic) ...
    }

    streamConfig.pixelFormat = kCVPixelFormatType_32BGRA
    streamConfig.colorSpaceName = CGColorSpace.sRGB
    streamConfig.minimumFrameInterval = CMTime(value: 1, timescale: 60)

    return streamConfig
}

And here is how I'm setting up the SCRecordingOutput that writes the file:

ScreenRecorder.swift

private func initRecordingOutput(for region: ScreenPickerManager.SelectedRegion) throws {
    let screeRecordingOutputURL = try RecordingWorkspace.createScreenRecordingVideoFile(
        in: workspaceURL,
        sessionIndex: sessionIndex
    )

    let recordingConfiguration = SCRecordingOutputConfiguration()
    recordingConfiguration.outputURL = screeRecordingOutputURL
    recordingConfiguration.outputFileType = .mp4
    recordingConfiguration.videoCodecType = .hevc

    let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
    self.recordingOutput = recordingOutput
}

Finally, my CaptureEngine adds these to the SCStream:

CaptureEngine.swift

class CaptureEngine: NSObject, @unchecked Sendable {
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    // ... (dispatch queues) ...
    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput

        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

            // Add outputs for raw buffers (not used for file recording)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

            // Add the file recording output
            try stream?.addRecordingOutput(recordingOutput)

            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    // ... (stopCapture, etc.) ...
}

When .captureMicrophone is false, I get a perfect .mp4 video that plays everywhere; when it's true, the resulting video is corrupted and doesn't play at all.
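One hedged diagnostic (an assumption, not a known fix): switch the container to QuickTime to see whether the MP4 muxing of the extra microphone track is what corrupts the file. If the .mov plays with captureMicrophone = true, the problem is narrowed to the MP4 path.

// Speculative diagnostic: try a QuickTime container instead of MP4.
recordingConfiguration.outputFileType = .mov
recordingConfiguration.videoCodecType = .hevc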
Replies: 0 · Boosts: 0 · Views: 265 · Created: Nov ’25
GKMatch rule-based matching. Can't match more than 3 people.
Matchmaking rules: https://developer.apple.com/documentation/gamekit/matchmaking-rules?language=objc
App Store Connect API rules: https://developer.apple.com/documentation/appstoreconnectapi/rules?language=objc

Environment:
Unity 6000.2.2f1
Xcode 16.1
iOS 26
3 iPhones

App Store Connect API rules:

"type": "gameCenterMatchmakingRuleSets",
"id": "f6a88caf-85db-42bf-xxxxxxxxxxxxxxxxxxxx",
"attributes": {
    "referenceName": "co.mygame.RuleSets.GvERandom34",
    "ruleLanguageVersion": 1,
    "minPlayers": 3,
    "maxPlayers": 4
},

"type": "gameCenterMatchmakingRules",
"id": "6afa68ce-4d2c-496f-xxxxxxxxxxxxxxxxxxxx",
"attributes": {
    "referenceName": "GameVersion",
    "description": "Check Game Version. GvERandom34",
    "type": "COMPATIBLE",
    "expression": "requests[0].properties.gameVersion == requests[1].properties.gameVersion",
    "weight": null
},

"type": "gameCenterMatchmakingQueues",
"id": "7fb645ef-4eca-4510-xxxxxxxxxxxxxxxxxxxx",
"attributes": {
    "referenceName": "co.mygame.que.GvERandom34",
    "classicMatchmakingBundleIds": []
},

Objective-C execution code, called with queueName = "co.mygame.que.GvERandom34", keyStr = "gameVersion " and valueStr = "1.0":

- (void)MatchQueueParamStr1Start:(NSString*)queueName keyStr:(NSString*)keyStr valueStr:(NSString*)valueStr
{
    if (@available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *) == NO) {
        DBGLOG(@"MatchQueueParamStr1Start Not support.");
        return;
    }

    self->_matchMakingFlag = YES;
    self->_matchFinishFlag = NO;
    self->_myMatch = nil;

    GKMatchRequest *req = [[GKMatchRequest alloc] init];
    if (@available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *)) {
        req.queueName = queueName;
        req.properties = @{keyStr: valueStr};
    }

    [[GKMatchmaker sharedMatchmaker] findMatchForRequest:req withCompletionHandler:^(GKMatch *match, NSError *error) {
        if (error) {
            [self SetupErrorInfo:error descriptionText:@"findMatchForRequest"];
        } else if (match) {
            self->_myMatch = match;
            self->_myMatch.delegate = self;
        }
        self->_matchMakingFlag = NO;
        self->_matchFinishFlag = YES;
    }];
}

I'm trying to match three devices. Matching doesn't work; it times out after 5 minutes. What's the problem?
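One detail that stands out (an observation, not a confirmed cause): keyStr is "gameVersion " with a trailing space, while the rule expression reads properties.gameVersion without one, and those keys must match exactly. A minimal Swift sketch of the request with the exact key:

import GameKit

// The COMPATIBLE rule reads properties.gameVersion, so the request key
// must match it exactly; "gameVersion " (trailing space) would not.
let request = GKMatchRequest()
if #available(iOS 17.2, tvOS 17.2, macOS 14.2, visionOS 1.1, *) {
    request.queueName = "co.mygame.que.GvERandom34"
    request.properties = ["gameVersion": "1.0"]
}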
Replies: 1 · Boosts: 0 · Views: 201 · Created: Nov ’25
Request: Allow Game Mode to be enabled locally for non-game App Store categories
Hi Apple team,

Game Mode was introduced in iOS 18. To activate Game Mode, an app must include specific key-value pairs in its *.plist and be categorized as a "Game" on the App Store.

My app (https://apps.apple.com/us/app/voidlink/id6747717070) works primarily as a self-hosted game-streaming (PC -> iPhone/iPad) client. Game Mode provides clear benefits in terms of latency and frame-rate stability, but it can currently only be activated when running via Xcode or TestFlight.

I am an individual iOS developer based in China, where an additional government license is required for apps to be listed under the "Game" category on the App Store. Obtaining such a license is very difficult for independent developers, so my app has been categorized under "Utilities" instead. (If I moved the app to the Game category, it would disappear from the Chinese App Store immediately.)

Expectation / Suggestion: Please consider making Game Mode available as a local, user-controllable option on iOS 18/26+, such as through a system "App Pool" where users can choose which apps to enable Game Mode for, regardless of App Store category. This would greatly benefit use cases like streaming clients, benchmarking tools, and remote-play utilities, without requiring developers to reclassify their apps as "Games" on the App Store.
Replies: 3 · Boosts: 1 · Views: 618 · Created: Oct ’25
GKLocalPlayer.isUnderAge always returns true on Macs with Intel chips
Hello, I'm working on a game that features online multiplayer. The game is developed using Unity and the Apple Unity plugins. The "isUnderAge" property restricts the online multiplayer feature. Everything works as expected on all platforms (Mac, iPhone, iPad, Apple TV, and Vision Pro) except on Macs equipped with an Intel chip. Using the same iCloud and Game Center account, with no restrictions enabled, "isUnderAge" returns false as expected, but on a Mac with an Intel chip it returns true. Is there any restriction or compatibility issue with those chips? Is there a workaround? Thanks
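For reference, a minimal sketch of where the flag is usually read. Note the Swift property is spelled isUnderage, and reading it before the authenticate handler has fired can return a stale default; this is a hedged observation, not an explanation of the Intel-only behavior.

import GameKit

// Read the flag only after authentication completes.
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    guard error == nil else { return }
    print("isUnderage:", GKLocalPlayer.local.isUnderage)
}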
Replies: 9 · Boosts: 0 · Views: 881 · Created: Oct ’25
X11 applications run with XQuartz not working properly in macOS Tahoe
Hello,

XQuartz is an open-source effort to develop a version of the X.Org X Window System (https://www.xquartz.org/), widely used to bring graphical support to applications running on remote servers (usually via SSH). Since macOS Tahoe, XQuartz fails to refresh properly on window resize (more info here: https://github.com/XQuartz/XQuartz/issues/438#issuecomment-3371409500), leading to severe usability issues. The XQuartz developers are already aware of the issue, but I'm wondering if there's anything we can do at the OS level to resolve it and restore the behavior from before macOS Tahoe.

Thanks,
KiM
Replies: 4 · Boosts: 3 · Views: 828 · Created: Oct ’25
Multiple App Icons
Hi, I have a Unity game. I need multiple app icons so the game can be recognized in different countries. In other words, is it possible to have an iOS app whose app icon changes based on the device locale/language? On Android this is possible using the Unity Localization package ("com.unity.localization").
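As far as I know, iOS has no supported way for the primary icon to switch automatically by locale, but UIKit's alternate-icon API can switch icons at runtime, e.g. on first launch. A sketch, assuming an alternate icon named "AppIcon-CN" (hypothetical) is declared under CFBundleIcons > CFBundleAlternateIcons in Info.plist:

import UIKit

/// Sketch: switch to a locale-specific alternate icon at runtime.
/// "AppIcon-CN" is a hypothetical name declared in Info.plist under
/// CFBundleIcons > CFBundleAlternateIcons.
func applyLocalizedIconIfNeeded() {
    guard UIApplication.shared.supportsAlternateIcons else { return }
    let iconName = Locale.current.identifier.hasPrefix("zh") ? "AppIcon-CN" : nil
    UIApplication.shared.setAlternateIconName(iconName) { error in
        if let error = error {
            print("Icon switch failed: \(error)")
        }
    }
}

Note that the system shows a confirmation alert to the user when the icon changes.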
Replies: 0 · Boosts: 0 · Views: 219 · Created: Oct ’25
Metal recommendedMaxWorkingSetSize vs actual RAM on iPhone (LLM load fails)
Context

I'm deploying large language models on iPhone using llama.cpp. A new iPhone Air (12 GB RAM) reports a Metal MTLDevice.recommendedMaxWorkingSetSize of 8,192 MB, and my attempt to load Llama-2-13B Q4_K (~7.32 GB of weights) fails during model initialization.

Environment

Device: iPhone Air (12 GB RAM)
iOS: 26
Xcode: 26.0.1
Build: llama.cpp with the Metal backend enabled
App runs on device (not Simulator)

What I'm seeing

MTLCreateSystemDefaultDevice().recommendedMaxWorkingSetSize == 8192 MiB
Loading Llama-2-13B Q4_K (7.32 GB) fails to complete. Logs indicate memory pressure / allocation issues consistent with the 8 GB working-set guidance.
Smaller models (e.g., 7B/8B with similar quantization) load and run (8B Q4_K decodes at around 9 tokens/second).

Questions

Is 8,192 MB an expected recommendedMaxWorkingSetSize on a 12 GB iPhone?
What values should I expect on other 2025 devices, including iPhone 17 (8 GB RAM) and iPhone 17 Pro (12 GB RAM)?
Is the limit strictly enforced for Metal allocations (heaps/buffers), or advisory for best performance/eviction behavior?
Can a process practically exceed it for long-lived buffers without immediate Jetsam risk?
Any guidance for LLM scenarios near the limit?
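For anyone comparing devices, a small sketch that prints the advisory ceiling next to the process's actual Metal allocations (both are existing MTLDevice properties):

import Metal

// Compare Metal's advisory working-set ceiling with current allocations.
if let device = MTLCreateSystemDefaultDevice() {
    print("recommendedMaxWorkingSetSize: \(device.recommendedMaxWorkingSetSize / (1024 * 1024)) MiB")
    print("currentAllocatedSize: \(device.currentAllocatedSize / (1024 * 1024)) MiB")
    // iPhones report unified memory: CPU and GPU share the same RAM pool.
    print("hasUnifiedMemory: \(device.hasUnifiedMemory)")
}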
Replies: 0 · Boosts: 0 · Views: 405 · Created: Oct ’25
ARView [.showStatistics] doesn't work on Xcode Canvas
Hi, I can't see RealityKit statistics on the Xcode canvas using:

arView.debugOptions = [.showStatistics]

The statistics only show on a physical device, not on the Xcode live canvas with #Preview. Testing in Xcode 26.0.1 (17A400) on Tahoe 26.0.1 (25A362).

Use case: I'm using RealityKit as a non-AR 3D engine. The Xcode canvas is useful for fast iteration. Is this expected behavior? How can I see FPS on the Xcode canvas? SKView, for example, shows all debug options on both the Xcode canvas and physical devices.
Replies: 0 · Boosts: 0 · Views: 372 · Created: Oct ’25
Value of type 'SCRecordingOutput' has no member 'delegate'
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit and also the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse-movement events only) need to be perfectly synced in order to add effects in post editing.

I started off by calling await stream?.startCapture() and, after that, starting my mouse-tracking function:

try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
let captureStartTime = Date()
mouseTracker?.startTracking(with: captureStartTime)

But every time I tested, there was a clear inconsistency in sync between the recorded video and the recorded mouse positions. What I want is to know when exactly the recording "actually" started so that I can start the mouse capture at that same moment, so I tried using the delegates, but I haven't been able to set them up properly.

import Foundation
import AVFAudio
import ScreenCaptureKit
import OSLog
import Combine

class CaptureEngine: NSObject, @unchecked Sendable {
    private let logger = Logger()
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    private var recordingOutput: SCRecordingOutput?

    private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
    private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
    private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        // Create the stream output delegate.
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput

        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

            self.recordingOutput = recordingOutput
            recordingOutput.delegate = self // <- this line does not compile

            try stream?.addRecordingOutput(recordingOutput)

            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    func stopCapture() async throws {
        do {
            try await stream?.stopCapture()
        } catch {
            logger.error("Failed to stop capture: \(error.localizedDescription)")
            throw error
        }
    }

    func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
        do {
            try await stream?.updateConfiguration(configuration)
            try await stream?.updateContentFilter(filter)
        } catch {
            logger.error("Failed to update the stream session: \(String(describing: error))")
        }
    }

    func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
        try self.stream?.removeRecordingOutput(recordingOutput)
    }
}

// MARK: - SCRecordingOutputDelegate

extension CaptureEngine: SCRecordingOutputDelegate {
    func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
        let startTime = Date()
        logger.info("Recording output did start recording \(startTime)")
    }

    func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
        logger.info("Recording output did finish recording")
    }

    func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
logger.error("Recording output failed with error: \(error.localizedDescription)") } } private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate { private let logger = Logger() override init() { super.init() } func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) { guard sampleBuffer.isValid else { return } switch outputType { case .screen: break case .audio: break case .microphone: break @unknown default: logger.error("Encountered unknown stream output type:") } } func stream(_ stream: SCStream, didStopWithError error: Error) { logger.error("Stream stopped with error: \(error.localizedDescription)") } } I am getting error Value of type 'SCRecordingOutput' has no member 'delegate' Even though I am targeting macOs 15+ ( macOs 26 actually ) and macOs only. What is the best way to achieving the desired result? Is there any other / better way to do it?
Replies: 1 · Boosts: 0 · Views: 201 · Created: Oct ’25
PDFKit Page Rerender
I'm experiencing an issue with PDFKit where page.removeAnnotation(annotation) successfully removes the annotation from the page's data structure, but the PDFView no longer updates automatically to reflect the change visually.

Issue details:

The annotation is removed (verified by checking page.annotations.count)
The PDFView display doesn't refresh to show the removal
This code was working correctly before and suddenly stopped working
No code changes were made on my end
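A hedged workaround sketch: PDFView has an annotationsChanged(on:) method for telling the view that a page's annotations were mutated, and calling it explicitly after removal may restore the old refresh behavior. The names pdfView, page, and annotation are the ones from the description above; this is a guess at a workaround, not an explanation of the regression.

import PDFKit

// After mutating annotations, explicitly notify the view.
page.removeAnnotation(annotation)
pdfView.annotationsChanged(on: page)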
Replies: 3 · Boosts: 0 · Views: 715 · Created: Oct ’25
Can't remove annotations from PDFView
Hi everyone, I've run into an issue where, on iOS 26, the removeAnnotation method doesn't remove annotations. This code worked on previous versions (iOS 18, 17) but suddenly stopped working on iOS 26. Has anyone faced this issue?

guard let document = await pdfView.document else { return }

for pageIndex in 0..<document.pageCount {
    guard let page = document.page(at: pageIndex) else { continue }

    let annotations = page.annotations
    for annotation in annotations {
        page.removeAnnotation(annotation)
    }
}
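A hedged diagnostic to distinguish a model failure from a redraw failure: if the count drops but the page still shows the annotations, this is likely the same refresh regression discussed in the "PDFKit Page Rerender" thread above.

// Check whether removal succeeds in the data model.
let before = page.annotations.count
for annotation in page.annotations {
    page.removeAnnotation(annotation)
}
print("annotations: \(before) -> \(page.annotations.count)")
pdfView.annotationsChanged(on: page) // then ask the view to redraw the page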
Replies: 1 · Boosts: 0 · Views: 200 · Created: Oct ’25
iOS 26.1 beta: third-party Broadcast Extension sessions don't work after 3 s, then picker freezes
It's a Broadcast Extension issue: on iOS 26.1 beta the extension never launches. After you tap "Start Broadcast" in the system picker, the countdown disappears after 3 s and no broadcast starts, so every live-streaming app (and every other non-system app that uses a Broadcast Extension) fails to go live; only the native Photos screen recording still works. Is this a known regression, or is a new entitlement required?
Replies: 0 · Boosts: 4 · Views: 168 · Created: Oct ’25
RealityKit - Full 3D experience
I have a question, I guess more for the Apple team: why are there no totally 3D experiences for the Vision Pro lineup? I know Apple has given us tools to bring Unity 3D games to iPhone, and I guess you can also build them in RealityKit. But why, at this moment, are 3D games limited to just iPad and iPhone, and why can't you bring that to Vision Pro? To explain: when I say a totally 3D game, I mean games like Gorn. The Vision Pro is definitely powerful enough, but it feels limited to tabletop games and AR games. Is this something Apple is thinking about implementing?
Replies: 0 · Boosts: 0 · Views: 519 · Created: Oct ’25
[Bug] iPadOS 26: 4-Finger Fast Tap/Swipe Gesture Not Detected (Multitouch Issue)
Hi everyone, I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap or swipe), the system often fails to recognize the input. This especially affects multi-touch-heavy apps, such as rhythm games with difficult levels.

Steps to reproduce:

Place four fingers on the screen.
Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
Observe whether the gesture is ignored or detected inconsistently.

Expected behavior: Four-finger multi-touch gestures should be recognized regardless of gesture speed, as in previous iPadOS versions.

Actual behavior: Gestures fail to be detected when executed quickly; the same gestures still work when performed more slowly, and fast taps miss notes in rhythm games.

You can check out my posts on Twitter/X: https://x.com/kokona_fwa/status/1978131164104728949?s=61 and Facebook: https://m.facebook.com/groups/idipad/permalink/24438964899058806/?
Replies: 1 · Boosts: 1 · Views: 820 · Created: Oct ’25
Game Center SignIn alert appears on macOS 26 (Tahoe) without capability or entitlement
Hello,

On macOS 26 (Tahoe), when building a macOS app that includes GameKit code, calling GKLocalPlayer.local.authenticateHandler shows the "Sign In to Game Center" alert (e.g. didShowFullscreenSignIn), even if the app does not have the Game Center capability enabled or any related entitlement (com.apple.developer.game-center). This alert only appears when the user is not signed in to Game Center in System Settings.

However, when testing the same code path in an iOS app built with macOS 26 (Tahoe), the alert does not appear unless the proper capability and entitlement are included.

This behavior is different from macOS 15 (Sequoia) + Xcode 15.x. Prior to the update, Game Center features did not work at all in a macOS app without the capability and entitlements.

Steps to reproduce:

1. Create a new macOS app target (App Sandbox enabled, no Game Center capability).
2. Add minimal GameKit code: GKLocalPlayer.local.authenticateHandler = { _, _ in }
3. Build the macOS app and run on macOS 26 (Tahoe).
4. Ensure Game Center is signed out in System Settings.
5. Observe: the "Sign In to Game Center" alert appears automatically.

Expected behavior: When the Game Center capability and entitlement are not present, authenticateHandler should fail silently, and no sign-in alert should appear.

Actual behavior: In the macOS app, the Game Center sign-in UI appears even without any Game Center capability or entitlement. In the iOS app, the alert does not appear.

Build configuration: built under the same conditions (macOS 26 + Xcode 26).

Question: Could you please confirm whether this behavior is an intentional change in macOS 26, or a bug affecting only macOS apps in the GameKit authentication flow?

Thank you.
Replies: 2 · Boosts: 0 · Views: 368 · Created: Oct ’25
Question about Game Center's data retention logic
We want to integrate Game Center into our game app. Users can create multiple characters in the game. Suppose a user creates two characters, Character 1 and Character 2: After the user binds Character 1 to Game Center, its data is reported to Game Center. If the player then unbinds Character 1 from Game Center and binds Character 2 instead, is Character 1's data retained in Game Center, or is it removed?
Replies: 0 · Boosts: 0 · Views: 257 · Created: Oct ’25