Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

Posts under the Graphics & Games topic

Metal and Swift Concurrency
Hi, introducing Swift Concurrency to my Metal app has been a bit challenging, because Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to run Metal-bound tasks for maximum performance. To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best way to bridge async tasks with Metal work? Thanks!
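A minimal sketch of the continuation bridge described above, for reference (the helper name is illustrative; it awaits GPU completion via a completed-handler instead of blocking a cooperative-pool thread with waitUntilCompleted):

import Metal

extension MTLCommandBuffer {
    // Commits the buffer and suspends the calling task until the GPU finishes,
    // so no thread in Swift Concurrency's cooperative pool is ever blocked.
    func commitAndAwaitCompletion() async {
        await withCheckedContinuation { continuation in
            addCompletedHandler { _ in continuation.resume() }  // fires on GPU completion
            commit()
        }
    }
}

// Usage from an async context: encode work, then
//     await commandBuffer.commitAndAwaitCompletion()
// CPU-side encoding can still be dispatched to a dedicated queue if desired.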
Replies: 5 · Boosts: 0 · Views: 1.1k · Activity: Apr ’25
PSVR2 controllers don't report anything in snapshot
I typically read an extended gamepad capture() and get all the state. But PSVR2 controllers seem to report nothing, so the stick and other buttons don't do anything in a built app. They register as left/right controllers. This is on visionOS 26, Xcode 26, etc. They work correctly in the main icon view, although they don't honor inverted vertical and horizontal scrolling. Both of the default scrolls just feel wrong: when I move left I want to scroll left, not right, and the same for up/down.
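For reference, a minimal sketch of the snapshot-based polling pattern mentioned above (GameController framework; the same code works for any extended-gamepad device):

import GameController

func logGamepadState(_ controller: GCController) {
    // capture() returns an immutable snapshot of the controller's current state.
    let snapshot = controller.capture()
    guard let pad = snapshot.extendedGamepad else { return }
    print("left stick: \(pad.leftThumbstick.xAxis.value), \(pad.leftThumbstick.yAxis.value)")
    print("button A pressed: \(pad.buttonA.isPressed)")
}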
Replies: 5 · Boosts: 1 · Views: 820 · Activity: Sep ’25
Vulkan project in Xcode is opening two windows instead of one
I'm a newbie at Vulkan and Xcode. I have my project on GitHub: https://github.com/flocela/OrangeSpider/ Whenever I run, two windows open instead of only one. I added testing, which means I have an OrangeSpider.xctestplan in the OrangeSpider/TestsOrangeSpider/ folder. This is my first time adding testing to an Xcode project, so I think this may be where the problem is. I also get this error message:

ViewBridge to RemoteViewService Terminated: Error Domain=com.apple.ViewBridge Code=18 "(null)" UserInfo={com.apple.ViewBridge.error.hint=this process disconnected remote view controller -- benign unless unexpected, com.apple.ViewBridge.error.description=NSViewBridgeErrorCanceled}
Replies: 4 · Boosts: 0 · Views: 177 · Activity: Jul ’25
Float8 and Float16 "Reserved_Name__Do_not_use"
I am developing a macOS terminal app, running on an M4 Pro, and using Metal. I am not able to use float8 or float16; both report Variable has incomplete type 'float16' (aka '__Reserved_Name__Do_not_use_float16'). Based on the system I should be able to use these. Either it is because the project is also compiling for Intel, where they are not allowed, or something else; either way I have not been able to figure out how to get past this. Is there a compiler setting I need to set to make this work? If so, which one, and what setting do I need? I only want to run this on M-series processors on the latest version of the OS, so I am not interested in an Intel version or backward compatibility.
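A minimal sketch, assuming the goal is the 8- and 16-lane float vectors from the simd library: on the Swift side the generic SIMD8<Float> / SIMD16<Float> spellings are always available and sidestep the unprefixed C aliases, which collapse to the reserved names when the compiler target doesn't enable them.

import simd

// SIMD8<Float> / SIMD16<Float> correspond to simd_float8 / simd_float16 in C.
let a = SIMD8<Float>(1, 2, 3, 4, 5, 6, 7, 8)
let doubled = a * 2                        // lane-wise multiply
var wide = SIMD16<Float>(repeating: 0)
wide.lowHalf = a                           // .lowHalf / .highHalf are 8 lanes each
print(doubled[7], wide[3])                 // prints: 16.0 4.0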
Topic: Graphics & Games · SubTopic: Metal
Replies: 4 · Boosts: 0 · Views: 220 · Activity: Aug ’25
GKMatch.chooseBestHostingPlayer(_:) always returns nil player
I'm building a game with a client-server architecture. Using GKMatch.chooseBestHostingPlayer(_:) rarely works. When I started testing it today, it worked once at the very beginning, and since then it always succeeds on one client and returns nil on the other client. I'm testing with a Mac and an iPhone. Sometimes it fails on the Mac, sometimes on the iPhone. On the device that it succeeds on, the provided host can be the device itself or the other one. I created FB9583628 in August 2021, but after the Feedback Assistant team replied that they are not able to reproduce it, the feedback never went forward.

import SceneKit
import GameKit

#if os(macOS)
typealias ViewController = NSViewController
#else
typealias ViewController = UIViewController
#endif

class GameViewController: ViewController, GKMatchmakerViewControllerDelegate, GKMatchDelegate {
    var match: GKMatch?
    var matchStarted = false

    override func viewDidLoad() {
        super.viewDidLoad()
        GKLocalPlayer.local.authenticateHandler = authenticate
    }

    private func authenticate(_ viewController: ViewController?, _ error: Error?) {
        #if os(macOS)
        if let viewController = viewController {
            presentAsSheet(viewController)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            GKDialogController.shared().present(viewController)
        }
        #else
        if let viewController = viewController {
            present(viewController, animated: true)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            present(viewController, animated: true)
        }
        #endif
    }

    private func defaultMatchRequest() -> GKMatchRequest {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 2
        request.defaultNumberOfPlayers = 2
        request.inviteMessage = "Ciao!"
        return request
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        print("cancelled")
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        print(error)
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        self.match = match
        match.delegate = self
        startMatch()
    }

    func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
        print("\(player.gamePlayerID) changed state to \(String(describing: state))")
        startMatch()
    }

    func startMatch() {
        let match = match!
        if matchStarted || match.expectedPlayerCount > 0 { return }
        print("starting match with local player \(GKLocalPlayer.local.gamePlayerID) and remote players \(match.players.map({ $0.gamePlayerID }))")
        match.chooseBestHostingPlayer { host in
            print("host is \(String(describing: host?.gamePlayerID))")
        }
    }
}
Replies: 4 · Boosts: 0 · Views: 451 · Activity: Apr ’25
Xcode compile gets stuck when adding new source files or functions to existing files
Hello, we are working on an iOS game project, and as it has progressed, the project has grown larger and larger. Because we are using other game dependencies and libraries, "larger and larger" refers to the whole project; the source files integrated and compiled by Xcode are not many. Now we seem to have hit a bottleneck: when I add new files, or add functions to existing files to implement a new feature, the Xcode compile gets stuck. It sits at "Indexing | Initializing datastore" forever and cannot produce a final build. macOS 15.1, Xcode 16.2. Can you provide any solutions to this problem? Also submitted Feedback ID #FB18432749.
Replies: 4 · Boosts: 0 · Views: 710 · Activity: Aug ’25
Background GPU Access availability
I would love to use Background GPU Access to do some video processing in the background. However, the documentation for BGContinuedProcessingTaskRequest.Resources.gpu clearly states: "Not all devices support background GPU use. For more information, see Performing long-running tasks on iOS and iPadOS." Is there a list available of currently released devices that do (or don't) support background GPU usage? That would help us understand what part of our user base can use this feature (and what hardware we need to test on as developers). For example, it seems that it isn't supported on an iPad Pro M1 with the current iOS 26 beta. The simulators also seem not to support the background GPU resource. So it would be great to understand what hardware is capable of using this feature!
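As a point of reference, a hedged sketch of the runtime check this implies, assuming a requiredResources property on the request and a supportedResources property on BGTaskScheduler as described in the beta documentation (verify both names against the current SDK before relying on them):

import BackgroundTasks

// Hypothetical sketch: opt into background GPU only where the device supports it.
func submitProcessingTask() throws {
    let request = BGContinuedProcessingTaskRequest(
        identifier: "com.example.video-processing",  // illustrative identifier
        title: "Exporting video",
        subtitle: "Processing frames"
    )
    if BGTaskScheduler.shared.supportedResources.contains(.gpu) {  // assumed API
        request.requiredResources = .gpu                           // assumed API
    }
    try BGTaskScheduler.shared.submit(request)
}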
Replies: 4 · Boosts: 0 · Views: 1k · Activity: Jul ’25
RealityKit animation with bindTarget: .opacity doesn't work
I want to fade objects in and out, and while setting an entity's OpacityComponent works, animating it doesn't seem to do anything. In the following code the second sphere should fade out, but it keeps its initial opacity. On the other hand, the animation that changes its transform works. What am I doing wrong?

class ViewController: NSViewController {
    override func loadView() {
        let arView = ARView(frame: NSScreen.main!.frame)

        let anchor = AnchorEntity(.world(transform: matrix_identity_float4x4))
        arView.scene.addAnchor(anchor)

        let sphere = ModelEntity(mesh: .generateSphere(radius: 0.5))
        anchor.addChild(sphere)
        sphere.components.set(OpacityComponent(opacity: 0.1))

        let sphere2 = ModelEntity(mesh: .generateSphere(radius: 0.5))
        sphere2.position = .init(x: 0.2, y: 0, z: 0)
        anchor.addChild(sphere2)
        sphere2.components.set(OpacityComponent(opacity: 0.1))

        sphere.playAnimation(try! AnimationResource.makeActionAnimation(for: FromToByAction(to: 0, timing: .linear), duration: 1, bindTarget: .opacity))
        sphere.playAnimation(try! AnimationResource.makeActionAnimation(for: FromToByAction(to: Transform(translation: SIMD3(x: 0.1, y: 0, z: 0)), timing: .linear), duration: 1, bindTarget: .transform))

        view = arView
    }
}
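A hedged variant that may be worth trying, assuming FromToByAnimation accepts the .opacity bind target just like the action animation above (a sketch, not a confirmed fix):

import RealityKit

// Hypothetical alternative: a from-to animation bound to .opacity,
// wrapped into a playable resource with AnimationResource.generate(with:).
func fadeOut(_ entity: Entity) throws {
    let fade = FromToByAnimation<Float>(from: 1.0,
                                        to: 0.0,
                                        duration: 1.0,
                                        timing: .linear,
                                        bindTarget: .opacity)
    entity.playAnimation(try AnimationResource.generate(with: fade))
}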
Replies: 4 · Boosts: 0 · Views: 385 · Activity: 1w
RealityKit Instanced Rendering on visionOS
Hello, I've been trying to leverage instanced rendering in RealityKit on visionOS but have not had success. RealityKit states this is supported:

https://developer.apple.com/documentation/realitykit/validating-usd-files
https://developer.apple.com/videos/play/wwdc2021/10075/?time=1373
https://developer.apple.com/videos/play/wwdc2023/10099/?time=772
RealityKit Trace metrics

Validating that instancing is working: to test, I made a base visionOS app with an immersive space and replaced the entity with my test usdz file. I've been using the RealityKit Trace profiling template in Xcode Instruments with the immersive space open and the volume closed, which gives consistent draw-call results. If I have a single sphere mesh with one material I get one draw call, but the number of draw calls grows linearly with mesh count no matter how my entity is configured.

What I've tried:
- Create a test scene in Blender and export with instancing enabled
- Create a test scene in Reality Composer Pro using references
- Author usda files by hand based on the OpenUSD spec
- Programmatically create a MeshResource with Contents at runtime (see the sketch below)

References:
https://openusd.org/release/api/_usd__page__scenegraph_instancing.html
https://developer.apple.com/documentation/realitykit/meshresource
https://developer.apple.com/documentation/realitykit/meshresource/instance

Thank you
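For the last item on that list, a minimal sketch of the runtime approach: building a MeshResource whose Contents carries several instances of one model (the Instance initializer and collection construction are assumed from the MeshResource.Instance documentation linked above, so treat this as illustrative):

import RealityKit

// Hypothetical sketch: one sphere model, ten instances referencing it by id.
func makeInstancedMesh() throws -> MeshResource {
    let sphere = MeshResource.generateSphere(radius: 0.1)
    var contents = sphere.contents

    // Instances reference models by name, so reuse the generated model's id.
    guard let modelName = contents.models.map(\.id).first else {
        fatalError("generated mesh has no model")
    }
    let instances = (0..<10).map { i in
        MeshResource.Instance(id: "instance-\(i)",
                              model: modelName,
                              at: Transform(translation: [Float(i) * 0.25, 0, 0]).matrix)
    }
    contents.instances = MeshInstanceCollection(instances)
    return try MeshResource.generate(from: contents)
}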
Replies: 4 · Boosts: 4 · Views: 1.5k · Activity: Oct ’25
macOS Tahoe Beta 4 disabled __asm keyword for Metal
Hi developers, I maintain a shipping app that uses string concatenation to construct a Metal shader and compile it on-device. Beta 4 seems to have disabled the __asm keyword, resulting in compilation failures. The error is:

v2/GEMMKernel.cpp:229: error: program_source:23:9: error: illegal string literal in 'asm'
__asm("air.simdgroup_async_copy_1d.p3i8.p1i8");

The relevant code is available at https://github.com/liuliu/ccv/blob/unstable/lib/nnc/mfa/v2/GEMMHeaders.cpp#L30 although any __asm will trip this. Please give us guidance on whether this is a regression or something that will be enforced in the 26 release. Personally, I would consider this a bug, given it won't impact anything in already-"compiled" shaders. Thanks for your patience reading this!
Topic: Graphics & Games · SubTopic: Metal
Replies: 4 · Boosts: 6 · Views: 940 · Activity: Jul ’25
MetalFX FrameInterpolator assertion: `Color texture width mismatch from descriptor` even when all texture sizes match
I am integrating MetalFX FrameInterpolator into a custom Unity RenderGraph-based render pipeline (C++ native plugin + C# render passes), and I am hitting the following assertion at runtime:

/MetalFXDebugError.h:29: failed assertion `Color texture width mismatch from descriptor'

What makes this confusing is that all input/output textures have the correct width and height, and they exactly match the values specified in the MTLFXFrameInterpolatorDescriptor.

Setup:
- Input resolution: 1024 x 512
- Output resolution: 2048 x 1024
- MTLFXTemporalScaler is created first and then passed into MTLFXFrameInterpolator
- The TemporalScaler and FrameInterpolator descriptors use the same input/output sizes and formats
- All Metal textures have no parentTexture, are 2D textures, and match the descriptor sizes exactly (verified via logging)

Texture bindings at encode time:

frameInterpolator.colorTexture     = mtlTexColor;     // 1024 x 512
frameInterpolator.prevColorTexture = mtlTexPrevColor; // 1024 x 512
frameInterpolator.motionTexture    = mtlTexMotion;    // 1024 x 512
frameInterpolator.depthTexture     = mtlTexDepth;     // 1024 x 512
frameInterpolator.uiTexture        = mtlTexUI;        // 2048 x 1024
frameInterpolator.outputTexture    = mtlTexOutput;    // 2048 x 1024

All widths/heights are logged and match:

Color     : 1024 x 512  (input)
PrevColor : 1024 x 512  (input)
Motion    : 1024 x 512  (input)
Depth     : 1024 x 512  (input)
UI        : 2048 x 1024 (output)
Output    : 2048 x 1024 (output)

The TemporalScaler works correctly on its own. The assertion only occurs when using FrameInterpolator.

Important detail about colorTexture: originally, colorTexture was copied from BuiltinRenderTextureType.CurrentActive. After reading that this might violate MetalFX semantics, I changed the pipeline so that colorTexture now comes from a dedicated private RenderGraph texture. It is not the backbuffer, not a drawable, and not used as a final output, and it is created before UI rendering. Despite this, the assertion still occurs.

Questions:
- Can uiTexture for MTLFXFrameInterpolator legally come from a texture copied from BuiltinRenderTextureType.CurrentActive?
- More generally, are there additional hidden constraints on colorTexture / prevColorTexture (such as Metal usage, storageMode, aliasing, or hazard tracking) that could cause this assertion even when sizes match?
- Does FrameInterpolator require colorTexture and prevColorTexture to be created in a very specific way (e.g. non-aliased, ShaderRead usage, identical Metal resource properties)?

Any clarification on the exact semantic requirements for colorTexture, prevColorTexture, or uiTexture in MetalFX FrameInterpolator would be greatly appreciated.
Replies: 4 · Boosts: 0 · Views: 630 · Activity: Jan ’26
Metal 4 & Acceleration Structures
I have really enjoyed looking through the code and videos related to Metal 4. Currently, my interest is in updating a ReSTIR project to take advantage of more robust ways to refit acceleration structures and more powerful ways to access resources. I am working in Swift and have encountered a couple of puzzles:
- What is the 'accepted' way to create a MTL4BufferRange to store indices and vertices?
- How do I properly rewrite Swift code to build and compact an acceleration structure?

I do realize that this is all in beta and will happily look through code samples this fall. If other guidance is available earlier, that would be fabulous! Thank you
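On the first question, a heavily hedged sketch of one plausible shape, assuming a bufferAddress-plus-length layout for MTL4BufferRange; every name below should be verified against the current beta headers, as none of this is confirmed guidance:

import Metal

// Hypothetical sketch: wrapping vertex/index buffers as MTL4BufferRange values
// for a triangle-geometry descriptor (types/initializers assumed from the beta SDK).
func makeTriangleGeometry(vertexBuffer: MTLBuffer,
                          indexBuffer: MTLBuffer) -> MTL4AccelerationStructureTriangleGeometryDescriptor {
    let geometry = MTL4AccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = MTL4BufferRange(bufferAddress: vertexBuffer.gpuAddress,
                                            length: UInt64(vertexBuffer.length))
    geometry.indexBuffer = MTL4BufferRange(bufferAddress: indexBuffer.gpuAddress,
                                           length: UInt64(indexBuffer.length))
    geometry.indexType = .uint32
    geometry.triangleCount = indexBuffer.length / (3 * MemoryLayout<UInt32>.stride)
    return geometry
}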
Replies: 4 · Boosts: 0 · Views: 670 · Activity: Sep ’25
visionOS + Unity PolySpatial: Is 15,970 MeshFilters the True Upper Limit for Industrial Scenes?
Breaking through PolySpatial's ~8k object limit: seeking alternative approaches for large-scale digital twins.

Confirmed: PolySpatial doubles the MeshFilter count, with a hard limit at ~8k active objects (15.9k total).

Project context & research goals: I'm developing an industrial digital twin application for Apple Vision Pro using Unity's PolySpatial framework (RealityKit rendering in Unbounded_Volume mode). The scene contains complex factory environments with:
- Production line equipment (many fragmented grid objects that need to be merged)
- Dynamic product racks (state-switchable assets)
- Animated worker avatars

To optimize performance, I'm systematically testing visionOS's rendering capacity limits, and through controlled stress tests I've identified a critical threshold.

Key finding: when the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects x 2 due to PolySpatial cloning), the application crashes consistently. This suggests that PolySpatial's mirroring mechanism effectively doubles GameObject overhead, and that an apparent hard limit exists around ~8k active mesh objects in practice.

Objectives for this discussion:
- Verify whether others have encountered similar limits with PolySpatial/RealityKit
- Understand whether this is a memory constraint (per-app allocation), a render pipeline limit (Metal draw calls), or Unity-specific PolySpatial behavior
- Explore optimization strategies beyond brute-force object reduction

Why this matters: industrial metaverse applications require rendering thousands of interactive objects. Confirming these limits will help our team design safer content guidelines, prioritize GPU instancing/LOD investments, and potentially contribute back to PolySpatial's optimization.

I'd appreciate insights from engineers who have pushed similar large-scale scenes in visionOS, worked around PolySpatial's cloning overhead, or discovered alternative capacity limits (vertices/draw calls).
Replies: 4 · Boosts: 0 · Views: 798 · Activity: Oct ’25
Feature Request: Support .reality File Export in Reality Composer Pro for Mac
I am an AR developer working on Apple Silicon Macs. Currently, Reality Composer Pro does not allow exporting .reality files, and Reality Composer (classic) is not available for Apple Silicon. This creates a gap in the workflow for ARKit/RealityKit developers who need interactive .reality files for use in Xcode projects. Having the ability to export .reality files directly from Reality Composer Pro on Mac would greatly streamline development and enable a fully native workflow on modern Macs. Alternatively, bringing Reality Composer (classic) to Apple Silicon would also resolve this issue. I have submitted this as a feature request via Feedback Assistant (FB17900386). I encourage others with similar needs to reply or submit feedback as well. Thank you!
Replies: 4 · Boosts: 1 · Views: 281 · Activity: Jul ’25
SpriteKit scene used as SCNView.overlaySKScene crashes due to SKShapeNode
I recently published my first game on the App Store. It uses SceneKit with a SpriteKit overlay. All crashes Xcode has downloaded for it so far are related to SpriteKit/SceneKit internals. The most common crash is caused by SKCShapeNode::_NEW_copyRenderPathData. What could cause such a crash? crash.crash While developing this game (and the BoardGameKit framework that appears in the crash log) over the years, I experienced many crashes presumably caused by the SpriteKit overlay (I opened a post, "SceneKit app randomly crashes with EXC_BAD_ACCESS in jet_context::set_fragment_texture", about such a crash in September 2024), and other people on the internet also mention that they experience crashes when using SpriteKit as a SceneKit overlay. Should I use a separate SKView and lay it on top of the SCNView rather than setting SCNView.overlaySKScene? That seemed to solve the crashes for someone on Stack Overflow, but is it also encouraged by Apple? I know SceneKit is deprecated, but according to Apple, critical bugs will still be fixed. Could this be considered a critical bug?
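For reference, a minimal sketch of the separate-view workaround raised above: a transparent SKView layered over the SCNView instead of setting SCNView.overlaySKScene (whether Apple endorses this is exactly the open question):

import SceneKit
import SpriteKit
import UIKit

final class LayeredGameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = SCNView(frame: view.bounds)
        scnView.scene = SCNScene()
        view.addSubview(scnView)

        // A transparent SKView on top takes the place of overlaySKScene.
        let skView = SKView(frame: view.bounds)
        skView.allowsTransparency = true
        skView.isUserInteractionEnabled = false  // let touches reach the SCNView
        let overlay = SKScene(size: view.bounds.size)
        overlay.backgroundColor = .clear
        skView.presentScene(overlay)
        view.addSubview(skView)
    }
}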
Replies: 4 · Boosts: 0 · Views: 752 · Activity: Jan ’26
Are there complete code examples available for “Combine Metal 4 machine learning and graphics”?
Hello, I recently watched the WWDC2025 session titled "Combine Metal 4 machine learning and graphics" (https://developer.apple.com/videos/play/wwdc2025/262/), and I'm very excited about the new Metal 4 features that integrate machine learning with graphics, such as neural ambient occlusion, shader-based ML inference, and the use of MTLTensor and MTL4MachineLearningCommandEncoder. While the session includes helpful code snippets and a compelling debug demo (e.g., the neural ambient occlusion example), the implementation details are not fully shown, and I haven't been able to find a complete, runnable sample project that demonstrates end-to-end integration of ML and rendering in Metal 4. Would Apple be able to provide a full, working example, such as an Xcode project, that shows how to:
- Export a model to an .mlpackage
- Convert it to an .mtlpackage
- Use MTL4MachineLearningCommandEncoder alongside render passes
- Embed small neural networks directly in shaders using Shader ML

Having such a sample would greatly help developers like me adopt these powerful new capabilities correctly and efficiently. Thank you very much for your time and support! Best regards,
Replies: 4 · Boosts: 2 · Views: 973 · Activity: Nov ’25
SpriteKit framerate drop on iOS 26.3
Hello, I have noticed a performance drop in SpriteKit-based projects running on iOS 26.3. The issue seems very similar to the one reported on iOS 26.0 and later solved in iOS 26.2 beta 3: https://developer.apple.com/forums/thread/800952?answerId=870617022#870617022 With 26.3, it seems a regression occurred. Below is the same SpriteKit scene used to test framerate on different devices:

import SpriteKit
import SwiftUI

class BareboneScene: SKScene {
    override func didMove(to view: SKView) {
        size = view.bounds.size
        anchorPoint = CGPoint(x: 0.5, y: 0.5)
        backgroundColor = .darkGray

        let roundedSquare = SKShapeNode(rectOf: CGSize(width: 150, height: 75), cornerRadius: 12)
        roundedSquare.fillColor = .systemRed
        roundedSquare.strokeColor = .black
        roundedSquare.lineWidth = 3
        addChild(roundedSquare)

        let action = SKAction.rotate(byAngle: .pi, duration: 1)
        roundedSquare.run(.repeatForever(action))
    }
}

struct BareboneSceneView: View {
    var body: some View {
        SpriteView(
            scene: BareboneScene(),
            debugOptions: [.showsFPS]
        )
        .ignoresSafeArea()
    }
}

#Preview {
    BareboneSceneView()
}

Running this very basic SpriteKit scene on my device, I cannot reach 60 FPS; instead, it stalls around 40. I see the same in my App Store published SpriteKit games. As you can see in the discussion of the linked forum topic, the regression has been noticed by others as well. Can you please look into this, and let us know if there is an outstanding action to resolve it already? It's very problematic for published games to suddenly drop to 40 FPS on even the most modern iOS devices.
Replies: 4 · Boosts: 3 · Views: 1.5k · Activity: 6d