Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Metal and Swift Concurrency
Hi, introducing Swift Concurrency to my Metal app has been a bit challenging, since Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to schedule Metal-bound tasks for maximum performance. To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or OperationQueues. Would this be the best way to bridge async tasks with Metal work? Thanks!
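A minimal sketch of the continuation-bridging idea the poster describes, under the assumption that encoding happens on a plain GCD queue; resuming from the command buffer's completion handler avoids parking a cooperative-pool thread the way waitUntilCompleted would:

import Metal

func renderFrameAsync(queue: MTLCommandQueue, renderQueue: DispatchQueue) async {
    await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
        // Encode on a plain GCD queue rather than the cooperative pool.
        renderQueue.async {
            guard let commandBuffer = queue.makeCommandBuffer() else {
                continuation.resume()
                return
            }
            // ... encode render passes here ...
            // Resume when the GPU finishes instead of blocking with
            // waitUntilCompleted(), so no thread is held hostage.
            commandBuffer.addCompletedHandler { _ in
                continuation.resume()
            }
            commandBuffer.commit()
        }
    }
}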
Replies: 5 · Boosts: 0 · Views: 1.1k · Activity: Apr ’25

How to use MTKTextureLoader to load PNG data
I am trying to load some PNG data with MTKTextureLoader newTextureWithData, but the result is wrong in the alpha areas. Here is the code. I have an image URL; after it downloads successfully, I try to use either the raw data or UIImagePNGRepresentation(image), and both show the same wrong result.

UIImage *tempImg = [UIImage imageWithData:data];
CGImageRef cgRef = tempImg.CGImage;
MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];
NSDictionary *options = @{MTKTextureLoaderOptionSRGB: @(NO),
                          MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                          MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)};
id<MTLTexture> temp1 = [loader newTextureWithData:data options:options error:nil];
NSData *tempData = UIImagePNGRepresentation(tempImg);
id<MTLTexture> temp2 = [loader newTextureWithData:tempData options:options error:nil];
id<MTLTexture> temp3 = [loader newTextureWithCGImage:cgRef options:options error:nil];
}] resume]; // end of the enclosing dataTask completion handler
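One hedged guess, not a confirmed diagnosis: images that round-trip through UIImage are typically premultiplied-alpha, so a pipeline that blends with .sourceAlpha darkens the translucent edges twice. A minimal Swift sketch of a premultiplied-alpha blend setup to compare against:

import Metal

// Assumed setup for illustration; with premultiplied sources, RGB is already
// scaled by alpha, so the source blend factor is .one, not .sourceAlpha.
let desc = MTLRenderPipelineDescriptor()
desc.colorAttachments[0].pixelFormat = .bgra8Unorm
desc.colorAttachments[0].isBlendingEnabled = true
desc.colorAttachments[0].rgbBlendOperation = .add
desc.colorAttachments[0].alphaBlendOperation = .add
desc.colorAttachments[0].sourceRGBBlendFactor = .one
desc.colorAttachments[0].sourceAlphaBlendFactor = .one
desc.colorAttachments[0].destinationRGBBlendFactor = .oneMinusSourceAlpha
desc.colorAttachments[0].destinationAlphaBlendFactor = .oneMinusSourceAlpha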
Replies: 5 · Boosts: 0 · Views: 624 · Activity: May ’25

App keeps crashing due to Game Center account??
I play a game called Sonic Forces: Speed Battle that's available in the App Store. I completed a quest outside of the app on a site called TapResearch for some rewards, as I've done before without problems, but after this one time I can no longer get back into the game without it crashing immediately. I tried deleting and reinstalling, but nothing changed. I even tried signing into a different account, but that didn't work either. Then I made a new Game Center account to see if that would work, and it did, though all my progress has been reset. Does anyone know how to fix this?
Replies: 1 · Boosts: 0 · Views: 531 · Activity: Jan ’25

Swift game sometimes runs on efficiency cores then snaps back to performance cores
I've been working on a Swift game which is not yet launched or available for preview. The game has an idle CPU while the user is thinking, and sustained max CPU and GPU load on as many cores as possible when they make a move. Rarely, due to OS activity or something else outside of my control (for example, pulling down the OS curtain even briefly and then dismissing it), the game or some of its threads are moved to efficiency cores, which results in major stuttering. This persists precisely until the game is idle again, at which point the game is moved back to performance cores; but if the player keeps making moves, the stuttering simply won't go away, so I assume the computation is locked onto the efficiency cores. The issue does not reproduce in Mac Catalyst on Intel. How do I tell Swift to avoid efficiency cores? By the way, Swift and SceneKit have amazing performance, especially when compared to the alternatives.
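There is no public API to pin threads to performance cores, but quality-of-service is the main hint the scheduler honors. A minimal sketch; computeMove() is a hypothetical stand-in for the real per-move work:

import Foundation

// Hypothetical stand-in for the per-move search work.
func computeMove() { /* ... */ }

// .userInteractive QoS marks the work as latency-critical, which makes the
// scheduler far less likely to place it on efficiency cores.
let moveQueue = DispatchQueue(label: "move-search",
                              qos: .userInteractive,
                              attributes: .concurrent)

func startMoveComputation() {
    for _ in 0..<ProcessInfo.processInfo.activeProcessorCount {
        moveQueue.async {
            computeMove()
        }
    }
}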
Replies: 0 · Boosts: 0 · Views: 110 · Activity: Mar ’25

How to properly pass a Metal layer from SwiftUI MTKView to C++ for use with metal-cpp?
Hello! I'm currently porting a video game console emulator to iOS and I'm trying to make the renderer (tested on macOS) work on iOS as well. The emulator core is written in C++ and uses metal-cpp for rendering, whereas the iOS frontend is written in Swift with SwiftUI. I have an Objective-C++ bridging header for bridging the Swift and C++ sides. On the Swift side, I create an MTKView. Inside the MTKView delegate, I run the emulator for 1 video frame and pass it the view's backing layer for it to render the final output image with. The emulator runs and returns, but when it returns I get a crash in Swift land (callstack attached below), inside objc_release, which indicates I'm doing something wrong with memory management.

My bridging interface (ios_driver.h):

#pragma once
#include <Foundation/Foundation.h>
#include <QuartzCore/QuartzCore.h>

void iosCreateEmulator();
void iosRunFrame(CAMetalLayer* layer);

Bridge implementation (ios_driver.mm):

#import <Foundation/Foundation.h>
extern "C" {
#include "ios_driver.h"
}
<...>
#define IOS_EXPORT extern "C" __attribute__((visibility("default")))

std::unique_ptr<Emulator> emulator = nullptr;

IOS_EXPORT void iosCreateEmulator() { ... }

// Runs 1 video frame of the emulator
IOS_EXPORT void iosRunFrame(CAMetalLayer* layer) {
  void* layerBridged = (__bridge void*)layer;
  // Pass the CAMetalLayer to the emulator
  emulator->getRenderer()->setMTKLayer(layerBridged);
  // Runs the emulator for 1 frame and renders the output image using our layer
  emulator->runFrame();
}

My MTKView delegate:

class Renderer: NSObject, MTKViewDelegate {
  var parent: ContentView
  var device: MTLDevice!

  init(_ parent: ContentView) {
    self.parent = parent
    if let device = MTLCreateSystemDefaultDevice() {
      self.device = device
    }
    super.init()
  }

  func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

  func draw(in view: MTKView) {
    let metalLayer = view.layer as! CAMetalLayer
    // Run the emulator for 1 frame & display the output image
    iosRunFrame(metalLayer)
  }
}

Finally, the emulator's render function that interacts with the layer:

void RendererMTL::setMTKLayer(void* layer) {
  metalLayer = (CA::MetalLayer*)layer;
}

void RendererMTL::display() {
  CA::MetalDrawable* drawable = metalLayer->nextDrawable();
  if (!drawable) {
    return;
  }
  MTL::Texture* texture = drawable->texture();
  <rest of rendering follows here using the drawable & its texture>
}

This is the Swift callstack at the time of the crash: [callstack screenshot attached]

To my understanding, I shouldn't be violating ARC rules, as my bridging header uses CAMetalLayer* instead of void* and Swift automatically accounts for ARC when passing Core Foundation objects to Objective-C. However, I don't have any other idea as to what might be causing this. I've been trying to debug this code for a couple of days without much success.

If you need more info, the emulator code is also on GitHub:
Metal renderer: https://github.com/wheremyfoodat/Panda3DS/blob/ios/src/core/renderer_mtl/renderer_mtl.cpp#L58-L68
Bridge implementation: https://github.com/wheremyfoodat/Panda3DS/blob/ios/src/ios_driver.mm
Bridging header: https://github.com/wheremyfoodat/Panda3DS/blob/ios/include/ios_driver.h

Any help is more than appreciated. Thank you for your time in advance.
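One hedged thing to try, not a confirmed fix: a crash inside objc_release right after the frame returns often points at autorelease lifetime issues (for example, the drawable returned by nextDrawable()). Wrapping each frame in an explicit autorelease pool on the Swift side at least bounds those lifetimes to a well-defined point:

func draw(in view: MTKView) {
    // Drain autoreleased Objective-C objects created during the frame
    // (such as the CAMetalDrawable) at the end of each draw call.
    autoreleasepool {
        guard let metalLayer = view.layer as? CAMetalLayer else { return }
        iosRunFrame(metalLayer)
    }
}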
Replies: 0 · Boosts: 0 · Views: 487 · Activity: Mar ’25

macOS Catalina 10.15.7: CoreGraphics.framework symbols not found
I recently needed to develop an application that obtains the window list, which requires Screen Recording permission. Apple's official documentation mentions using the two functions CGPreflightScreenCaptureAccess and CGRequestScreenCaptureAccess to request the permission, and states they have been available since version 10.15. However, when I used these two functions on a device running macOS 10.15.7, I encountered the errors shown in the attached screenshot. I used the nm tool to inspect the symbols in CoreGraphics.framework and found that these two functions were not present. Could you help me understand why this is happening?
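A hedged workaround sketch: when a symbol may be absent at runtime despite its availability annotation, it can be probed with dlsym before calling, so the app degrades gracefully on builds that lack it:

import Darwin

typealias PreflightFn = @convention(c) () -> Bool

// Look the symbol up by name in the process's loaded images instead of
// linking it directly; an absent symbol yields nil rather than a crash.
if let handle = dlopen(nil, RTLD_NOW),
   let sym = dlsym(handle, "CGPreflightScreenCaptureAccess") {
    let preflight = unsafeBitCast(sym, to: PreflightFn.self)
    print("Screen capture preflight:", preflight())
} else {
    print("CGPreflightScreenCaptureAccess is not present on this OS build")
}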
Replies: 0 · Boosts: 0 · Views: 87 · Activity: May ’25

CMake unable to generate the Xcode file described in this tutorial
In the Creating A 3D Application With Hydra Rendering tutorial on the Apple Developer website, on the last step I execute this command:

cmake -S ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/ -B ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/

and I keep getting an error:

CMake Error at CMakeLists.txt:5 (include): include could not find requested file: /Users/macuser/USDInstall/bin/pxrConfig.cmake

I've followed the instructions in the README.md file included in the project files at least five times, and I've also tried moving the pxrConfig.cmake file around and copying it into different folders before re-running the command, but I was still unsuccessful in generating the file needed to compile and render the HydraPlayer renderer. How do I get CMake to generate the Xcode project for the HydraPlayer renderer?
Replies: 1 · Boosts: 0 · Views: 148 · Activity: May ’25

The App Store purchase button disappears when another window approaches
Since macOS 15.3.2, we have observed that when another window is moved near the App Store's install button, the button disappears. We have attached a related video in our Feedback submission: https://feedbackassistant.apple.com/feedback/20444423. Our application overlays a transparent watermark window on top of the system window, which causes the install button in the App Store to be hidden when a user attempts to install an application. Could you advise on how to avoid this issue?
Replies: 0 · Boosts: 0 · Views: 148 · Activity: Nov ’25

Can Game Mode be activated when a child (Java) process's window is fullscreened?
Imagine a native macOS app that acts as a "launcher" for a Java game.** For example, the "launcher" app might use the Swift Process API or a similar method to run the java command-line tool (let's assume the user has installed Java themselves) to run the game, as sketched below. I have seen How to Enable Game Mode. If the native launcher app's Info.plist has the following keys set:

LSApplicationCategoryType set to public.app-category.games
LSSupportsGameMode set to true (for macOS 26+)
GCSupportsGameMode set to true

then the launcher itself can cause Game Mode to activate if the launcher is fullscreened. However, if the launcher opens a Java process that opens a window, and then the Java window is fullscreened, Game Mode doesn't seem to activate. In this case activating Game Mode for the launcher itself is unnecessary, but you'd expect Game Mode to activate when the actual game in the Java window is fullscreened. Is there a way to get Game Mode to activate in the latter case?

** The concrete case I'm thinking of is a third-party Minecraft Java Edition launcher, but the issue can also be demonstrated in a sample project (FB13786152). It seems like the official Minecraft launcher is able to do this, though it's not clear how. (Is its bundle identifier hardcoded in the OS to allow for this? Changing a sample app's bundle identifier to be the same as the official Minecraft launcher gets the behavior I want, but obviously this is not a practical solution.)
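A minimal sketch of the launcher pattern described above; the Java location and jar path are assumptions for illustration:

import Foundation

let java = Process()
java.executableURL = URL(fileURLWithPath: "/usr/bin/java") // assumed location
java.arguments = ["-jar", "/path/to/game.jar"]             // hypothetical game jar
do {
    try java.run()       // the game opens its own window as a child process
    java.waitUntilExit()
} catch {
    print("Failed to launch Java: \(error)")
}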
Replies: 2 · Boosts: 0 · Views: 184 · Activity: Jun ’25

Metal triangle strips with uniform opacity
I have a drawing app that I have been working on for the past few years when I have free time. I recently rebuilt the app in Metal to build out other brushes and improve performance; I need to render tens of thousands of lines in realtime. I'm running into an issue trying to create a uniform opacity per path. I have a solution, but I do not love it, as this is a realtime app and the solution could have some bottlenecks. If I just generate a triangle strip from touch points and do my best to smooth, resample, and handle miters, I will always get some overlaps (see attached screenshot). To create a uniform opacity I render to an offscreen texture with blending disabled. I then pre-multiply the color and draw that texture to a composite texture with blending on (I do this per path). This works, but it gets tricky when you introduce a textured brush: the edges of the texture in the fragment shader cut out the line (screenshot attached). My solution is to discard fragments below a threshold:

fragment float4 fragment_line(VertexOut in [[stage_in]],
                              texture2d<float> texture [[ texture(0) ]]) {
  constexpr sampler s(coord::normalized, address::mirrored_repeat, filter::linear);
  float2 texCoord = in.texCoord;
  float4 texColor = texture.sample(s, texCoord);
  if (texColor.a < 0.01) discard_fragment(); // may be slow (from what I read)
  return in.color * texColor;
}

Better, but still not perfect. Question: I'm looking for better ways to create a uniform opacity per path. I tried .max blending, but that prevents blending of other paths. Any tips or ideas are much appreciated.
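One alternative worth hedging: a stencil test can enforce a single write per pixel per path in one pass, which avoids the overlap double-blend without the offscreen round-trip. A minimal Swift setup sketch under the assumption that the render pass has a stencil attachment; encoder wiring is elided:

import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

// Pass only where the stencil still holds the reference value, and bump it
// on write, so the first fragment per pixel wins and overlaps are dropped.
let stencil = MTLStencilDescriptor()
stencil.stencilCompareFunction = .equal
stencil.stencilFailureOperation = .keep
stencil.depthStencilPassOperation = .incrementClamp

let dsDesc = MTLDepthStencilDescriptor()
dsDesc.frontFaceStencil = stencil
dsDesc.backFaceStencil = stencil
let dsState = device.makeDepthStencilState(descriptor: dsDesc)
// Per path: encoder.setStencilReferenceValue(0), draw the strip, then clear
// the stencil attachment (or advance the reference) before the next path.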
Replies: 1 · Boosts: 0 · Views: 94 · Activity: Mar ’25

How to customize shader code for visionOS?
Hello experts, I'm trying to implement a material with custom shader code, but I saw that visionOS doesn't allow you to inject custom Metal functions or use CustomMaterial like iOS/macOS, nor can you directly write Metal Shading Language (.metal) and use it through ShaderGraphMaterial. So my question is: if I want to implement my own shader code, how should I do it?
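To my knowledge the supported route on visionOS is a shader graph authored in Reality Composer Pro and loaded by name. A minimal sketch; the material path, scene file name, and content bundle are assumptions:

import RealityKit
import RealityKitContent // generated Reality Composer Pro package (assumed)

// Loads a shader-graph material authored in Reality Composer Pro from the
// app's content bundle; the names below are placeholders.
func loadCustomMaterial() async throws -> ShaderGraphMaterial {
    try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                  from: "Scene.usda",
                                  in: realityKitContentBundle)
}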
Replies: 1 · Boosts: 0 · Views: 461 · Activity: Jul ’25

CGEvent Not Working
I am trying to simulate a paste command and it seems to not want to paste. It worked at one point with the same code, and now it is causing issues. My code looks like this:

func simulatePaste() {
    guard let source = CGEventSource(stateID: .hidSystemState) else {
        print("Failed to create event source")
        return
    }
    let keyDown = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: true)
    let keyUp = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: false)
    keyDown?.flags = .maskCommand
    keyUp?.flags = .maskCommand
    keyDown?.post(tap: .cgAnnotatedSessionEventTap)
    keyUp?.post(tap: .cgAnnotatedSessionEventTap)
    print("Simulated Cmd + V")
}

I know that there are some issues around permissions, so in my Info.plist I have this:

<string>NSApplication</string>
<key>NSAppleEventsUsageDescription</key>
<string>This app requires permission to send keyboard input for pasting from the clipboard.</string>

I have also disabled the sandbox. It does ask me if I want to give the app permissions, but after approving it, it still doesn't paste.
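A hedged note: posting CGEvents is gated by the Accessibility permission (System Settings > Privacy & Security > Accessibility) rather than Apple Events, so a quick runtime check can confirm which permission is actually in effect:

import ApplicationServices

// Returns whether this process is trusted for Accessibility control,
// prompting the user to grant it if not yet trusted.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility trusted:", trusted)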
Replies: 1 · Boosts: 0 · Views: 434 · Activity: Feb ’25

authenticateHandler events not being received on iOS 18
I work on a team that provides an SDK for another game to handle various tasks like authentication. They are experiencing a case where devices using iOS 17 fail to authenticate with Game Center, receiving the message "The requested operation could not be completed because local player has not been authenticated." We imagine this is because they still have some setup to finish regarding Game Center itself, and we're working with them to take care of that. However, on iOS 18, their app ends up waiting indefinitely for Game Center authentication messages that it never receives. That's where we're puzzled. We expect them to have the same outcome regardless of OS version.

We initiate Game Center authentication by setting an authenticateHandler after some initial application setup. The handler has code to account for UI, errors, and successful authentication. On iOS 17, it's clear that it's getting called as expected, because they receive an indication that the player isn't authenticated. But on iOS 18, it looks like the same handler code isn't being called at all. Are there differences in how iOS 18 interacts with the authenticateHandler that we somehow aren't accounting for? Or is there potentially something else that we're doing incorrectly that is manifesting only on iOS 18?

Here's a simplified version of our login function code (in Obj-C++). There is no OS-specific code, and the job that owns this function does stay in scope until after authentication is complete.

void beginLogin() {
    // Snip: Check if the user is already logged in.
    // Snip: Prevent multiple concurrent calls to this function.

    auto authenticateHandler = ^(UIViewController* gcViewController, NSError* error) {
        if (gcViewController != nil) {
            // Snip: Display the UI
        } else if (error != nil) {
            // Snip: Handle the error.
        } else {
            if ([[GKLocalPlayer localPlayer] isAuthenticated]) {
                // Snip: Handle successful authentication.
            } else {
                // Snip: Handle other case.
            }
        }
    };

    [[GKLocalPlayer localPlayer] setAuthenticateHandler: authenticateHandler];
}
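For isolating the difference, a minimal Swift repro of the same flow may help rule out the Obj-C++ wrapper; this is a sketch only, with UI presentation elided:

import GameKit

func beginLoginRepro() {
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        if viewController != nil {
            // Snip: present the sign-in UI
        } else if let error {
            print("Game Center auth error: \(error)")
        } else {
            print("Authenticated:", GKLocalPlayer.local.isAuthenticated)
        }
    }
}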
Replies: 2 · Boosts: 0 · Views: 152 · Activity: May ’25

Sparse Texture Writes
Hey, I've been struggling with this for some days now. I am trying to write to a sparse texture in a compute shader. I'm performing the following steps:

1. Set up a sparse heap and create a texture from it
2. Map the whole area of the sparse texture using updateTextureMapping(..)
3. Overwrite every value with the value "4" in a compute shader
4. Blit the texture to a shared buffer
5. Assert that the values in the buffer are "4"

I have a minimal example (which is still pretty long, unfortunately). It works perfectly when removing the line heapDesc.type = .sparse. What am I missing? I could not find any information that writes to sparse textures are unsupported. Any help would be greatly appreciated.

import Metal

func sparseTexture64x64Demo() throws {
    // ── Metal objects
    guard let device = MTLCreateSystemDefaultDevice() else {
        throw NSError(domain: "SparseNotSupported", code: -1)
    }
    let queue = device.makeCommandQueue()!
    let lib = device.makeDefaultLibrary()!
    let pipeline = try device.makeComputePipelineState(function: lib.makeFunction(name: "addOne")!)

    // ── Texture descriptor
    let width = 64, height = 64
    let format: MTLPixelFormat = .r32Uint // 4 B per texel
    let desc = MTLTextureDescriptor()
    desc.textureType = .type2D
    desc.pixelFormat = format
    desc.width = width
    desc.height = height
    desc.storageMode = .private
    desc.usage = [.shaderWrite, .shaderRead]

    // ── Sparse heap
    let bytesPerTile = device.sparseTileSizeInBytes
    let meta = device.heapTextureSizeAndAlign(descriptor: desc)
    let heapBytes = ((bytesPerTile + meta.size + bytesPerTile - 1) / bytesPerTile) * bytesPerTile
    let heapDesc = MTLHeapDescriptor()
    heapDesc.type = .sparse
    heapDesc.storageMode = .private
    heapDesc.size = heapBytes
    let heap = device.makeHeap(descriptor: heapDesc)!
    let tex = heap.makeTexture(descriptor: desc)!

    // ── CPU buffers
    let bytesPerPixel = MemoryLayout<UInt32>.stride
    let rowStride = width * bytesPerPixel
    let totalBytes = rowStride * height
    let dstBuf = device.makeBuffer(length: totalBytes, options: .storageModeShared)!

    let cb = queue.makeCommandBuffer()!
    let fence = device.makeFence()!

    // 2. Map the sparse tile, then signal the fence
    let rse = cb.makeResourceStateCommandEncoder()!
    rse.updateTextureMapping(
        tex,
        mode: .map,
        region: MTLRegionMake2D(0, 0, width, height),
        mipLevel: 0,
        slice: 0)
    rse.update(fence) // ← capture all work so far
    rse.endEncoding()

    let ce = cb.makeComputeCommandEncoder()!
    ce.waitForFence(fence)
    ce.setComputePipelineState(pipeline)
    ce.setTexture(tex, index: 0)
    let threadsPerTG = MTLSize(width: 8, height: 8, depth: 1)
    let tgCount = MTLSize(width: (width + 7) / 8, height: (height + 7) / 8, depth: 1)
    ce.dispatchThreadgroups(tgCount, threadsPerThreadgroup: threadsPerTG)
    ce.updateFence(fence)
    ce.endEncoding()

    // Blit texture into shared buffer
    let blit = cb.makeBlitCommandEncoder()!
    blit.waitForFence(fence)
    blit.copy(
        from: tex,
        sourceSlice: 0,
        sourceLevel: 0,
        sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0),
        sourceSize: MTLSize(width: width, height: height, depth: 1),
        to: dstBuf,
        destinationOffset: 0,
        destinationBytesPerRow: rowStride,
        destinationBytesPerImage: totalBytes)
    blit.endEncoding()

    cb.commit()
    cb.waitUntilCompleted()
    assert(cb.error == nil, "GPU error: \(String(describing: cb.error))")

    // ── Verify a few texels
    let out = dstBuf.contents().bindMemory(to: UInt32.self, capacity: width * height)
    print("first three texels:", out[0], out[1], out[width]) // 0 1 64
    assert(out[0] == 4 && out[1] == 4 && out[width] == 4)
}

Metal shader:

#include <metal_stdlib>
using namespace metal;

kernel void addOne(texture2d<uint, access::write> tex [[texture(0)]],
                   uint2 gid [[thread_position_in_grid]]) {
    tex.write(4, gid);
}
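One hedged thing to check first: sparse heaps are only available on newer Apple GPU families, and creating one on unsupported hardware can fail in non-obvious ways. A quick gate; the exact required family here is an assumption, so verify against the Metal feature set tables:

import Metal

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }
// Sparse textures are tied to newer Apple GPU generations; .apple6 (A13) is
// the commonly cited baseline, but confirm against the feature tables.
if device.supportsFamily(.apple6) {
    print("Sparse textures should be available on this GPU")
} else {
    print("Sparse textures unsupported here; heapDesc.type = .sparse will not work")
}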
Replies: 1 · Boosts: 0 · Views: 120 · Activity: May ’25

Unable to find broadcast extension
Issue Summary: In our Flutter application, we use Tencent's TRTC API for voice and video communication. While the broadcast functionality operates correctly on Android, it fails to respond on iOS devices. Attempting to initiate a broadcast results in no action, and long-pressing the broadcast button does not reveal the broadcast extension.

Steps to Reproduce:
1. Add Broadcast Upload Extension: In Xcode, navigate to File > New > Target, select Broadcast Upload Extension, and add it to the project.
2. Build the Project: Attempt to build the project and encounter the error "Cycle inside Runner; building could produce unreliable results."
3. Resolve Build Cycle Error: Go to the project's Build Phases, locate the Embed App Extensions phase, and move Embed App Extensions just below Copy Bundle Resources, ensuring the Copy only when installing option is selected. Rebuild the project; the cycle error is resolved.
4. Test Broadcast Functionality: Install the app on an iOS device and tap the broadcast button; observe no response. Long-press the broadcast button in the menu that scrolls down from the top right (Control Center); the broadcast extension is not listed.
5. Isolate the Issue: Create a new Flutter project and repeat the above steps to add the broadcast upload extension. The issue persists: broadcast functionality remains unresponsive on iOS.
Replies: 1 · Boosts: 0 · Views: 526 · Activity: Feb ’25

Game Center leaderboards not posting scores
My app is live, but the leaderboards still aren't updating. The app was built with Unreal Engine 5 using Blueprints. I have the leaderboard stat info entered into the node for "write integer to leaderboard" and a node for "show platform specific leaderboard". The leaderboards are shown as live in App Store Connect. When I run the app, the Game Center login functions and the leaderboard interface launches as expected, but it just lists a group of friends to invite. There are no scores listed, and it says the number of players is 0, even though I have scored on two different devices and accounts. I have the Game Center entitlement added in Xcode. Not sure where else to look.
Replies: 0 · Boosts: 0 · Views: 599 · Activity: 3w

Improving person segmentation and occlusion quality in RealityKit
I’m building an app that uses RealityKit and specifically ARConfiguration.FrameSemantics.personSegmentationWithDepth. The goal is to insert an AR object into the scene behind a person, and an additional AR object in front of the person, while being as photorealistic as possible. Through testing, I’ve noticed that many times the edges of the person segmentation mask are not well matched to the actual person, and parts of the person are transparent, with the AR object bleeding through. It’s sort of like a “bad green screen” effect, which I’d expect to see a little bit, but not to this extent. I’ve been testing on iPhone 16, iPhone 14 Pro, iPad Pro 12.9-inch (6th generation), and iPhone 12 Pro, with similar results across all devices. I’m wondering what else I can do to improve this, whether through code changes, platform (like different iPhone models), or environment (like lighting, distance, etc.). Attaching some example screen grabs and a minimum reproducible code sample. Appreciate any insights!

import ARKit
import SwiftUI
import RealityKit

struct RealityViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.renderOptions.insert(.disableMotionBlur)
        arView.renderOptions.insert(.disableDepthOfField)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }

        arView.session.run(configuration)
        arView.session.delegate = context.coordinator
        context.coordinator.arView = arView
        return arView
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: RealityViewContainer
        weak var arView: ARView?
        var floorAnchor: ARPlaneAnchor?

        init(_ parent: RealityViewContainer) {
            self.parent = parent
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            if let arView, floorAnchor == nil {
                for anchor in anchors {
                    if let horizontalPlaneAnchor = anchor as? ARPlaneAnchor,
                       horizontalPlaneAnchor.alignment == .horizontal,
                       horizontalPlaneAnchor.transform.columns.3.y < arView.cameraTransform.translation.y { // filter out ceiling
                        floorAnchor = horizontalPlaneAnchor
                        let backgroundEntity = BackgroundEntity()
                        let anchorEntity = AnchorEntity(anchor: horizontalPlaneAnchor)
                        anchorEntity.addChild(backgroundEntity)
                        let foregroundEntity = ForegroundEntity()
                        backgroundEntity.addChild(foregroundEntity)
                        arView.scene.addAnchor(anchorEntity)
                        arView.installGestures([.rotation, .translation], for: backgroundEntity)
                        break // Stop after adding the first horizontal plane (floor)
                    }
                }
            }
        }
    }
}
Replies: 1 · Boosts: 0 · Views: 105 · Activity: May ’25