Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Post · Replies · Boosts · Views · Activity
Optimizing HZB Mip-Chain Generation and Bindless Argument Tables in a Custom Metal Engine
Hi everyone, I’ve been developing a custom, end-to-end 3D rendering engine called Crescent from scratch using C++20 and metal-cpp (targeting macOS and visionOS). My primary goal is to build a zero-bottleneck, GPU-driven pipeline that maximizes the potential of Apple Silicon’s unified memory and TBDR architecture. While the fundamental systems are stable, I am looking for architectural feedback from Metal framework engineers regarding specific synchronization and latency challenges.

Current Core Implementations:
- GPU-Driven Instance Culling: high-performance occlusion culling using a Hierarchical Z-Buffer (HZB) approach via compute shaders.
- Clustered Forward Shading: support for high dynamic-light counts through view-space clustering.
- Temporal Stability: custom TAA with history rejection and a motion-blur resolve.
- Asset Infrastructure: robust GUID-based scene serialization and a JSON-driven ECS hierarchy.

The Architectural Challenge: I am currently seeing slight synchronization overhead when generating the HZB mip chain. On Apple Silicon, I am evaluating the cost of encoder transitions versus cache-friendly barriers.

const bool canBuildHzb = m_depthTexture && m_hzbInitPipeline && m_hzbDownsamplePipeline && !m_hzbMipViews.empty();
if (canBuildHzb) {
    MTL::ComputeCommandEncoder* hzbInit = commandBuffer->computeCommandEncoder();
    hzbInit->setComputePipelineState(m_hzbInitPipeline);
    hzbInit->setTexture(m_depthTexture, 0);
    hzbInit->setTexture(m_hzbMipViews[0], 1);
    if (m_pointClampSampler) {
        hzbInit->setSamplerState(m_pointClampSampler, 0);
    } else if (m_linearClampSampler) {
        hzbInit->setSamplerState(m_linearClampSampler, 0);
    }
    const uint32_t hzbWidth = m_hzbMipViews[0]->width();
    const uint32_t hzbHeight = m_hzbMipViews[0]->height();
    const uint32_t threads = 8;
    MTL::Size tgSize = MTL::Size(threads, threads, 1);
    MTL::Size gridSize = MTL::Size((hzbWidth + threads - 1) / threads * threads,
                                   (hzbHeight + threads - 1) / threads * threads, 1);
    hzbInit->dispatchThreads(gridSize, tgSize);
    hzbInit->endEncoding();

    for (size_t mip = 1; mip < m_hzbMipViews.size(); ++mip) {
        MTL::Texture* src = m_hzbMipViews[mip - 1];
        MTL::Texture* dst = m_hzbMipViews[mip];
        if (!src || !dst) { continue; }
        MTL::ComputeCommandEncoder* downEncoder = commandBuffer->computeCommandEncoder();
        downEncoder->setComputePipelineState(m_hzbDownsamplePipeline);
        downEncoder->setTexture(src, 0);
        downEncoder->setTexture(dst, 1);
        const uint32_t mipWidth = dst->width();
        const uint32_t mipHeight = dst->height();
        MTL::Size downGrid = MTL::Size((mipWidth + threads - 1) / threads * threads,
                                       (mipHeight + threads - 1) / threads * threads, 1);
        downEncoder->dispatchThreads(downGrid, tgSize);
        downEncoder->endEncoding();
    }

    if (m_instanceCullHzbPipeline) {
        dispatchInstanceCulling(m_instanceCullHzbPipeline, true);
    }
}

My Questions:
1. Encoder Synchronization: would you recommend moving this loop into a single ComputeCommandEncoder with memory barriers between dispatches to maintain L2 cache residency, or is the overhead of separate encoders negligible for depth downsampling on TBDR?
2. visionOS Bindless Latency: for stereo rendering on visionOS, what are the best practices for managing MTL4ArgumentTable updates at 90 Hz and above? I want to ensure that updating bindless resources for each eye doesn't introduce unnecessary CPU-to-GPU latency.
3. Memory Management: are there specific hints for memoryless textures that could be applied to intermediate HZB levels to save bandwidth during this process?

I’ve attached a screenshot of a scene rendered with the engine (PBR, SSR, and IBL).
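For question 1, a minimal sketch of the single-encoder variant, written against Metal's Swift API for brevity (the metal-cpp calls map one-to-one); function and parameter names are illustrative, not from the engine above:

import Metal

// One encoder for the whole mip chain; a textures-scope memory barrier
// between dispatches replaces the per-mip encoder breaks.
func downsampleHzb(commandBuffer: MTLCommandBuffer,
                   pipeline: MTLComputePipelineState,
                   mipViews: [MTLTexture]) {
    guard mipViews.count > 1,
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    encoder.setComputePipelineState(pipeline)
    let tg = MTLSize(width: 8, height: 8, depth: 1)

    for mip in 1..<mipViews.count {
        encoder.setTexture(mipViews[mip - 1], index: 0)
        encoder.setTexture(mipViews[mip], index: 1)
        let grid = MTLSize(width: mipViews[mip].width,
                           height: mipViews[mip].height,
                           depth: 1)
        encoder.dispatchThreads(grid, threadsPerThreadgroup: tg)
        if mip + 1 < mipViews.count {
            // Make this mip's writes visible to the next dispatch's reads
            // without ending the encoder.
            encoder.memoryBarrier(scope: .textures)
        }
    }
    encoder.endEncoding()
}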
Replies: 0 · Boosts: 0 · Views: 43 · Activity: 7h
receivedTurnEventForMatch giving stale data
In my turn-based game, I receive the GKListener event receivedTurnEventForMatch and decode match.matchData. Occasionally, the matchData is clearly stale, left over from the previous turn. If I bring up the MatchMaker view controller and select that same match, the data is not stale, so it's not a matter of not calling endTurn. I have tried both loadMatchWithID and loadMatchesWithCompletionHandler after receiving receivedTurnEventForMatch, but the data is still stale. Advice?
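For reference, a minimal sketch of the baseline re-fetch pattern (which the post above reports still returns stale data); GKTurnBasedMatch.load(withID:) is the documented reload call, and the listener placement is assumed:

import GameKit

// Inside a GKLocalPlayerListener / GKTurnBasedEventListener conformer:
func player(_ player: GKPlayer, receivedTurnEventFor match: GKTurnBasedMatch, didBecomeActive: Bool) {
    GKTurnBasedMatch.load(withID: match.matchID) { fresh, error in
        guard let fresh, let data = fresh.matchData else { return }
        // Decode `data` here instead of the matchData carried by the event.
        _ = data
    }
}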
Replies: 2 · Boosts: 0 · Views: 414 · Activity: 1d
SpriteKit scene used as SCNView.overlaySKScene crashes due to SKShapeNode
I recently published my first game on the App Store. It uses SceneKit with a SpriteKit overlay. All crashes Xcode has downloaded for it so far are related to SpriteKit/SceneKit internals. The most common crash occurs in SKCShapeNode::_NEW_copyRenderPathData. What could cause such a crash? crash.crash

While developing this game (and the BoardGameKit framework that appears in the crash log) over the years, I experienced many crashes presumably caused by the SpriteKit overlay (I opened a post, "SceneKit app randomly crashes with EXC_BAD_ACCESS in jet_context::set_fragment_texture", about such a crash in September 2024), and other people on the internet also mention crashes when using SpriteKit as a SceneKit overlay.

Should I use a separate SKView laid on top of the SCNView rather than setting SCNView.overlaySKScene? That seemed to solve the crashes for someone on Stack Overflow, but is it also encouraged by Apple? I know SceneKit is deprecated, but according to Apple, critical bugs will still be fixed. Could this be considered a critical bug?
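A minimal sketch of the separate-SKView workaround discussed above, assuming an iOS app; the view and scene names are illustrative:

import SceneKit
import SpriteKit

let scnView = SCNView(frame: CGRect(x: 0, y: 0, width: 400, height: 300))

let skView = SKView(frame: scnView.bounds)
skView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
skView.allowsTransparency = true
skView.isUserInteractionEnabled = false   // let touches reach SceneKit below

let hud = SKScene(size: skView.bounds.size)
hud.backgroundColor = .clear
skView.presentScene(hud)

scnView.addSubview(skView)   // SpriteKit renders in its own view, not via overlaySKScene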
Replies: 4 · Boosts: 0 · Views: 470 · Activity: 2d
Many CGEventCreateKeyboardEvent calls in quick succession causing function and dictation buttons to be pressed?
Hi! I hope everyone reading is doing well. I am working on a reinforcement learning agent that involves sending key presses to a window, which I've been doing by posting virtual key codes with CGEventCreateKeyboardEvent per the docs. There is no event source when I send the keyboard events. However, when many keyboard events happen in quick succession (<250 ms, with the keys 'q', 'w', 'e', 'r', 'f', 'd', 's', space, and the arrow keys), the enable-dictation popup or the function-button emoji popup appears for seemingly no reason. I have verified that I am using the correct key codes for these presses, so I am wondering what else could cause this. It is as if I were pressing F5 or Fn. It does not happen when 'a' is the only key being pressed in quick succession. One thing I have not been able to find easily is the key code for dictation or for the Fn key. Do these codes overlap somehow? Thank you all for the help! Hunter
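A minimal sketch of posting a key event with an explicit event source, on the unverified guess that a nil source contributes to the spurious Fn/dictation behavior; the key code 0x0C is 'q' on ANSI layouts:

import CoreGraphics

// An explicit source so posted events carry consistent state.
let source = CGEventSource(stateID: .hidSystemState)

func press(_ keyCode: CGKeyCode) {
    let down = CGEvent(keyboardEventSource: source, virtualKey: keyCode, keyDown: true)
    let up = CGEvent(keyboardEventSource: source, virtualKey: keyCode, keyDown: false)
    down?.post(tap: .cghidEventTap)
    up?.post(tap: .cghidEventTap)
}

press(0x0C) // 'q'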
Replies: 2 · Boosts: 0 · Views: 369 · Activity: 3d
Xcode Metal Trace
The code is downloaded from Apple's official Metal 4 sample (https://developer.apple.com/documentation/metal/drawing-a-triangle-with-metal-4?language=objc). With Metal GPU trace enabled in the macOS scheme, tracing a frame in Xcode may segfault the app in a 'GTTrace' function when the trace button is clicked. When replaying a .gputrace file, Xcode may crash, throw an internal error, or report an XPC error. The sample code using the old Metal renderer traces without any problem and everything works fine.

Test environment:
Xcode Version 26.2 (17C52)
macOS 26.2 (25C56)
M1 Pro 16 GB (A2442)
Replies: 2 · Boosts: 0 · Views: 402 · Activity: 4d
Krazy Krownz multiplayer option not working for Game Center!
When trying to play with friends, Krazy Krownz doesn't let me select multiplayer even though my Apple Game Center account is connected and my friends' accounts are connected as well. I even tried sending an invite from Game Center to friends, but Krazy Krownz doesn't show up on the list of available multiplayer games. I've signed out and back in, and the same issue remains. I've tried to contact the game developer, but the website doesn't work.
Replies: 1 · Boosts: 0 · Views: 215 · Activity: 4d
MatchMaker VC not showing existing matches after upgrade.
I updated my app to include turn-based matches and beta-tested through TestFlight; all was well between iOS 18.x and 26.2 devices. One beta tester upgraded to 26.2 during beta testing, and now when the MatchMaker VC is opened, it does not show existing matches. Worse, he can create new matches and play his turn, but the new match won't even show up in the MMVC after the opponent takes a turn. My app has been reviewed and is ready for release, but I'd like to know how to solve this before I release. He has tried re-installing the app, including an updated TestFlight build that is the same as the about-to-be-released reviewed version.
Replies: 4 · Boosts: 0 · Views: 465 · Activity: 6d
- (BOOL) contentsAreFlipped needs to be true for .nib layouts
I have an odd bug: if I use initWithFrame as the init routine for an NSView subclass that uses layers, I don't see this bug. But if I embed this view in a storyboard with a .nib file and use initWithCoder, I need to return YES from - (BOOL)contentsAreFlipped in the NSView subclass. If I don't, the CALayer actually renders from (0,0) in the view upwards and off the window. The frame sizes for the NSView and the CALayer are good when I inspect them in updateLayer. Obviously I have a fix, but I would like to understand why.
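For comparison, a minimal sketch of the documented flipping override on NSView, isFlipped, in a layer-backed subclass; the class name and layer content are illustrative, and whether this interacts with the contentsAreFlipped behavior described above is an open question:

import Cocoa

final class FlippedLayerView: NSView {
    // Documented override: top-left origin, so geometry agrees in
    // both the initWithFrame and initWithCoder paths.
    override var isFlipped: Bool { true }

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        wantsLayer = true
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        wantsLayer = true   // nib path: ensure layer backing here too
    }

    override var wantsUpdateLayer: Bool { true }

    override func updateLayer() {
        layer?.backgroundColor = NSColor.systemBlue.cgColor
    }
}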
Replies: 0 · Boosts: 0 · Views: 203 · Activity: 6d
GameKit Leaderboards not working in iOS 26.2
Leaderboards were working fine in iOS 26.1 but seem to be broken in 26.2 and in the 26.3 developer beta. Players cannot submit scores, nor can they view scores on Apple's default leaderboards. Custom leaderboards that rely on pulling information through GameKit APIs also fail. Is there a workaround or patch for this?
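For isolating where the failure happens, a minimal sketch of the modern submission call so the underlying error can be logged; the leaderboard ID is hypothetical:

import GameKit

GKLeaderboard.submitScore(1234, context: 0, player: GKLocalPlayer.local,
                          leaderboardIDs: ["com.example.highscores"]) { error in
    if let error {
        // Surfaces the underlying GKError instead of a silent failure.
        print("Score submission failed: \(error)")
    }
}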
Replies: 2 · Boosts: 0 · Views: 271 · Activity: 1w
Metal 4: Proper usage of requestResidency() with unique per-frame textures at 120fps
Hello, I have some confusion regarding ResidencySet, specifically about the requestResidency() function: how often should we call it? I have a captureOutput(_:didOutput:from:) method that is triggered at 60 or 120 fps. Inside this method, I am calling the following code every frame:

computeResidencySet.removeAllAllocations()
computeResidencySet.addAllocation(TextureA)
computeResidencySet.addAllocation(TextureB)
computeResidencySet.addAllocation(TextureC)
computeResidencySet.commit()
computeResidencySet.requestResidency() // Should we call it every frame?

Please keep in mind that TextureA, TextureB, and TextureC are unique for each call (new instances are provided on every frame).
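One pattern worth trying (a sketch under assumptions, not confirmed guidance): attach the residency set to the command queue once, then only swap allocations and commit() per frame, reserving requestResidency() for immediate needs outside command-buffer scheduling:

import Metal

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!

// Created once; default descriptor settings for this sketch.
let residencySet = try! device.makeResidencySet(descriptor: MTLResidencySetDescriptor())
queue.addResidencySet(residencySet)   // one-time attachment to the queue

func encodeFrame(textures: [MTLTexture]) {
    residencySet.removeAllAllocations()
    for tex in textures {
        residencySet.addAllocation(tex)   // MTLTexture is an MTLAllocation
    }
    residencySet.commit()                 // no per-frame requestResidency() here
    // ... encode compute work and commit a command buffer on `queue` ...
}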
Replies: 1 · Boosts: 0 · Views: 464 · Activity: 1w
ReplayKit Issue on iOS 26
When previewing a gameplay recording, the buttons to exit or save are unclickable behind the clock and Wi-Fi/5G status bar at the top, which means you have to quit the game in order to continue. Tested on multiple devices. Does anyone have a solution to this? At the moment we have disabled recording altogether for iOS 26 users.
Replies: 2 · Boosts: 3 · Views: 822 · Activity: 1w
MetalFX FrameInterpolator assertion: `Color texture width mismatch from descriptor` even when all texture sizes match
I am integrating the MetalFX FrameInterpolator into a custom Unity RenderGraph–based render pipeline (C++ native plugin + C# render passes), and I am hitting the following assertion at runtime:

/MetalFXDebugError.h:29: failed assertion `Color texture width mismatch from descriptor'

What makes this confusing is that all input/output textures have the correct width and height, and they exactly match the values specified in the MTLFXFrameInterpolatorDescriptor.

Setup:
- Input resolution: 1024 x 512
- Output resolution: 2048 x 1024
- MTLFXTemporalScaler is created first and then passed into MTLFXFrameInterpolator
- The TemporalScaler and FrameInterpolator descriptors use the same input/output sizes and formats
- All Metal textures have no parentTexture, are 2D textures, and match the descriptor sizes exactly (verified via logging)

Texture bindings at encode time:

frameInterpolator.colorTexture = mtlTexColor;         // 1024 x 512
frameInterpolator.prevColorTexture = mtlTexPrevColor; // 1024 x 512
frameInterpolator.motionTexture = mtlTexMotion;       // 1024 x 512
frameInterpolator.depthTexture = mtlTexDepth;         // 1024 x 512
frameInterpolator.uiTexture = mtlTexUI;               // 2048 x 1024
frameInterpolator.outputTexture = mtlTexOutput;       // 2048 x 1024

All widths/heights are logged and match:

Color     : 1024 x 512 (input)
PrevColor : 1024 x 512 (input)
Motion    : 1024 x 512 (input)
Depth     : 1024 x 512 (input)
UI        : 2048 x 1024 (output)
Output    : 2048 x 1024 (output)

The TemporalScaler works correctly on its own. The assertion only occurs when using the FrameInterpolator.

Important detail about colorTexture: originally, colorTexture was copied from BuiltinRenderTextureType.CurrentActive. After reading that this might violate MetalFX semantics, I changed the pipeline so that colorTexture now comes from a dedicated private RenderGraph texture; it is not the backbuffer, not a drawable, not used as a final output, and is created before UI rendering. Despite this, the assertion still occurs.

Questions:
1. Can uiTexture for MTLFXFrameInterpolator legally come from a texture copied from BuiltinRenderTextureType.CurrentActive?
2. More generally, are there additional hidden constraints on colorTexture/prevColorTexture (such as Metal usage, storageMode, aliasing, or hazard tracking) that could cause this assertion even when sizes match?
3. Does the FrameInterpolator require colorTexture and prevColorTexture to be created in a very specific way (e.g. non-aliased, ShaderRead usage, identical Metal resource properties)?

Any clarification on the exact semantic requirements for colorTexture, prevColorTexture, or uiTexture in the MetalFX FrameInterpolator would be greatly appreciated.
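Since the sizes already match, one small diagnostic that might help is dumping the other Metal texture properties the interpolator could be validating; this helper is hypothetical and uses plain Metal, not MetalFX API:

import Metal

// Log the properties that commonly differ even when width/height agree.
func dumpTexture(_ label: String, _ tex: MTLTexture) {
    print("\(label): \(tex.width)x\(tex.height)",
          "type=\(tex.textureType.rawValue)",
          "format=\(tex.pixelFormat.rawValue)",
          "usage=\(tex.usage.rawValue)",
          "storage=\(tex.storageMode.rawValue)",
          "mips=\(tex.mipmapLevelCount)",
          "samples=\(tex.sampleCount)",
          "isView=\(tex.parent != nil)")
}

// Call for each bound texture right before encoding the interpolation pass.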
Replies: 4 · Boosts: 0 · Views: 523 · Activity: 1w
macOS SwiftUI app with external 4K camera & sensors for Hospital Avatar: ARKit, MLX, and Thermal feasibility?
We are developing a standalone AI avatar application for hospital reception kiosks using a Mac mini (M2/M4). The app runs on SwiftUI + RealityKit, displays on a 75-inch monitor, and uses a USB-connected 4K camera and external sensors (LiDAR/mmWave). We have several technical concerns regarding the transition from iPadOS to macOS. Could you please provide insights on the following?

1. ARKit/Vision framework on macOS with an external camera: On iPadOS, ARKit provides robust face tracking. On macOS with an external USB 4K camera, can we achieve real-time face tracking (expression/gaze/depth) with the Vision framework or ARKit comparable to iPadOS performance? Are there any specific limitations on accessing the Neural Engine via the Vision framework for real-time 4K video analysis on macOS?

2. Accessing external hardware (LiDAR/sensors) in the sandbox: We plan to connect external LiDAR and mmWave sensors (e.g., Akara) via USB/Bluetooth. Is it feasible to communicate with these custom drivers/devices within the App Sandbox environment? Would DriverKit be required, or can we use standard serial-communication APIs?

3. On-device LLM (MLX) and thermals: We intend to run a local LLM (e.g., Llama 3 using the MLX framework) for offline conversation, alongside 3D rendering. With the M2/M4 Mac mini fan design, is there a risk of thermal throttling during 10+ hours of continuous operation (simultaneous Core ML + 3D rendering)? Is the Mac Studio recommended over the Mac mini for this thermal profile?

4. Long-running speech APIs: Are there any known issues (memory leaks, API limits) when using the Speech framework and AVSpeechSynthesizer continuously for over 10 hours daily?

5. 3D display output: Are there any macOS constraints on rendering a SwiftUI window in a specific 3D format (e.g., side-by-side) and outputting it via HDMI to a 3D digital-signage display (fixed refresh rate/resolution)?

Thank you for your assistance.
Replies: 0 · Boosts: 0 · Views: 215 · Activity: 1w
Clarifying when Game Center activity events fire relative to authentication
Hello, in our game we enforce an age gate before showing Game Center sign-in. Only after the user passes the age gate do we call GKLocalPlayer.localPlayer.authenticateHandler. The reason I'm asking is that we want to reliably detect whether the game was launched from a Game Center activity in the Games app (iOS 26+). If the user prefers to enter via activities, we don't want to miss that event during a cold start.

Our current proposal is:
1. Register a GKLocalPlayerListener early in didFinishLaunchingWithOptions: so the app is ready to catch events.
2. Queue any incoming events in our dispatcher.
3. Process those events only after the user passes the age gate and authentication succeeds.

My questions are:
1. Does player:wantsToPlayGameActivity:completionHandler: ever fire before authentication, or only after the local player is authenticated?
2. If it only fires after authentication, is our "register early but gate processing" approach the correct way to ensure we don't miss activity launches?
3. Is there a recommended pattern to distinguish an "activity launch" from a "normal launch" in this age-gate scenario?

We want to respect Apple's age-gate requirements, but also ensure activity launches are not lost if the user prefers that entry point. Sorry if this is a stupid question; I just want to be sure we're following the right pattern. Thanks for any clarification or best-practice guidance!
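A minimal sketch of the proposed "register early, gate processing" dispatcher; the queueing is plain Swift, and only a turn-based callback is shown since the exact wantsToPlayGameActivity signature is not reproduced here (an activity callback would be wrapped the same way):

import GameKit

final class EventGate: NSObject, GKLocalPlayerListener {
    private var pending: [() -> Void] = []
    private var gateOpen = false            // flips true after age gate + auth

    /// Call once the age gate has passed and authentication succeeded.
    func openGate() {
        gateOpen = true
        pending.forEach { $0() }
        pending.removeAll()
    }

    private func enqueue(_ work: @escaping () -> Void) {
        if gateOpen { work() } else { pending.append(work) }
    }

    // Example event; an activity-launch callback would be wrapped identically.
    func player(_ player: GKPlayer, receivedTurnEventFor match: GKTurnBasedMatch, didBecomeActive: Bool) {
        enqueue {
            // route the event into the game here
        }
    }
}

// In didFinishLaunchingWithOptions:
//   GKLocalPlayer.local.register(gate)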
Replies: 1 · Boosts: 0 · Views: 207 · Activity: 1w
GameKit Turn Based Matches Push Notifications
I'm developing a game that supports GameKit turn-based matches. What I don't understand is this: is tapping the Game Center push notification the only way for the GKTurnBasedEventListener to trigger? What if someone misses the push message (swiping it away by accident or something like that) but still wants to join? Is there some inbox somewhere where the pending matches can be seen or fetched? Also, it was mentioned in a very old WWDC video (from 2013, I think that's the latest with information about turn-based matches) that the notification also includes a badge for the icon. However, I do not understand how to implement that. Is there any documentation for that?
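On the "inbox" question, a minimal sketch of fetching pending matches directly, which does not depend on the push notification at all; the filtering criteria are illustrative:

import GameKit

GKTurnBasedMatch.loadMatches { matches, error in
    // Every match the local player participates in, fetched on demand.
    let waitingOnMe = (matches ?? []).filter {
        $0.status == .open &&
        $0.currentParticipant?.player?.gamePlayerID == GKLocalPlayer.local.gamePlayerID
    }
    print("Matches waiting on my turn: \(waitingOnMe.count)")
}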
Replies: 3 · Boosts: 0 · Views: 559 · Activity: 1w
How does the automatch feature work in GameKit?
I'm developing a turn-based game. When I present the GKTurnBasedMatchmakerViewController, players can opt in to automatch instead of selecting a specific friend as an opponent. How exactly does the matching work if a player doesn't specify anything explicitly? Does Game Center send push notifications in round-robin fashion to all friends, with the first one to accept matched as the opponent? Is this documented somewhere?
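For context, a minimal sketch of how an automatch request is expressed in code; leaving the recipients unset places the request in Game Center's automatch pool, and to my understanding the open seat is filled server-side from that pool rather than by pushing invitations to friends:

import GameKit

let request = GKMatchRequest()
request.minPlayers = 2
request.maxPlayers = 2
// No recipients set: the open seat is filled from the automatch pool.

GKTurnBasedMatch.find(for: request) { match, error in
    if let match {
        print("Joined or created match: \(match.matchID)")
    }
}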
Replies: 3 · Boosts: 0 · Views: 346 · Activity: 1w
ARView ignores multi-touch events
Hi, how do I enable multitouch on ARView? The touch functions (touchesBegan, touchesMoved, ...) seem to only handle one touch at a time. In order to handle multiple simultaneous touches with ARView, I have to either:
- use SwiftUI .simultaneousGesture on top of an ARView representable, or
- position a UIView on top of the ARView to capture touches and do hit testing by passing a reference to the ARView.

Expected behavior: ARView should capture all touches via touchesBegan/Moved/Ended/Cancelled.

Here is what I tried, on iOS 26.1 and macOS 26.1.

ARView multitouch: the setup below is a minimal ARView presented by SwiftUI, with touch events handled inside the ARView. Multitouch doesn't work with this setup. Note that multitouch wouldn't work either if the ARView were presented with a UIViewController instead of SwiftUI.

import RealityKit
import SwiftUI

struct ARViewMultiTouchView: View {
    var body: some View {
        ZStack {
            ARViewMultiTouchRepresentable()
                .ignoresSafeArea()
        }
    }
}

#Preview {
    ARViewMultiTouchView()
}

// MARK: Representable ARView

struct ARViewMultiTouchRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARViewMultiTouch(frame: .zero)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

// MARK: ARView

class ARViewMultiTouch: ARView {
    required init(frame: CGRect) {
        super.init(frame: frame)

        /// Enable multi-touch
        isMultipleTouchEnabled = true

        cameraMode = .nonAR
        automaticallyConfigureSession = false
        environment.background = .color(.gray)

        /// Disable gesture recognizers so they don't conflict with touch events.
        /// But it doesn't fix the issue.
        gestureRecognizers?.forEach { $0.isEnabled = false }
    }

    required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            /// # Problem
            /// This should print for every new touch, up to 5 simultaneously on an iPhone (multi-touch),
            /// but it only fires for one touch at a time (single-touch).
            print("Touch began at: \(touch.location(in: self))")
        }
    }
}

Multitouch with an overlay: this setup works, but it doesn't seem right. There must be a solution to make ARView handle multitouch directly, right?
import SwiftUI
import RealityKit

struct MultiTouchOverlayView: View {
    var body: some View {
        ZStack {
            MultiTouchOverlayRepresentable()
                .ignoresSafeArea()
            Text("Multi touch with overlay view")
                .font(.system(size: 24, weight: .medium))
                .foregroundStyle(.white)
                .offset(CGSize(width: 0, height: -150))
        }
    }
}

#Preview {
    MultiTouchOverlayView()
}

// MARK: Representable Container

struct MultiTouchOverlayRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        /// The view that SwiftUI will present
        let container = UIView()

        /// ARView
        let arView = ARView(frame: container.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        arView.cameraMode = .nonAR
        arView.automaticallyConfigureSession = false
        arView.environment.background = .color(.gray)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        /// The view that will capture touches
        let touchOverlay = TouchOverlayView(frame: container.bounds)
        touchOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        touchOverlay.backgroundColor = .clear

        /// Pass an arView reference to the overlay for hit testing
        touchOverlay.arView = arView

        /// Add views to the container.
        /// ARView goes in first, at the bottom.
        container.addSubview(arView)
        /// TouchOverlay goes in last, on top.
        container.addSubview(touchOverlay)

        return container
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

// MARK: Touch Overlay View

/// A UIView to handle multi-touch on top of an ARView
class TouchOverlayView: UIView {
    weak var arView: ARView?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        isUserInteractionEnabled = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Began --- (New: \(touches.count), Total: \(totalTouches))")

        for touch in touches {
            let location = touch.location(in: self)
            /// Hit testing.
            /// ARView and the touch view must be the same size.
            if let arView = arView {
                let entity = arView.entity(at: location)
                if let entity = entity {
                    print("Touched entity: \(entity.name)")
                } else {
                    print("Touched: none")
                }
            }
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Cancelled --- (Cancelled: \(touches.count), Total: \(totalTouches))")
    }
}
Replies: 2 · Boosts: 0 · Views: 422 · Activity: 1w
Multiply exr lightmap in Reality Composer Pro Shader Graph
I’m trying to use EXR lightmaps to overlay baked lighting on top of a base texture in the RCP Shader Graph. When I multiply an EXR image set to Image(float) with an 8-bit base texture, the output becomes Image(float). I can’t connect that to the BaseColor input on the UnlitSurface node, since it only accepts Color3f. I expected to be able to use a Convert node between the Multiply node and the BaseColor input, but when I do that, the result becomes black and white instead of the expected outcome: the EXR multiplied with the base texture using a baseline value of 1, where values below 1 in the EXR would darken the base texture and values above 1 would brighten it. Is there any documentation on how to properly overlay a 32-bit EXR lightmap in the RCP Shader Graph, or is the black-and-white output from the Convert node a bug?
Replies: 7 · Boosts: 0 · Views: 1k · Activity: 2w