Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics
Posts under Graphics & Games topic
iOS 26 Games app: Wired Switch Pro controllers (and GameSir X5 Lite) not working correctly
Hi,

Since iOS 26 introduced the new Games app, I’ve noticed a problem when using a Nintendo Switch Pro Controller in wired USB-C mode, and also with third-party controllers that emulate it (like the GameSir X5 Lite). In the Games app interface, only the L/R buttons respond; the D-Pad and analog sticks don’t work at all. Once inside actual games, the controller works fine — the issue only affects the Games app UI.

What I’ve tested so far:

- Xbox / PlayStation controllers → work fine, wired and Bluetooth, including inside the Games app.
- Switch Pro Controller (Bluetooth) → works fine, including in the Games app.
- Switch Pro Controller (wired) → same issue as the X5 Lite: D-Pad and sticks don’t work in the Games app.

This makes it hard to use the new Games app launcher with these controllers, even though they work perfectly once a game is launched.

My question: is this an iOS bug (Apple needs to add proper support for wired Switch Pro controllers in the Games app), or something that Nintendo / GameSir would need to address?

Thanks in advance to anyone who can confirm this or provide more info.
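One way to narrow down where the input is being dropped: log the raw GameController-framework values from a small test app with the same controller attached in wired mode. If the D-Pad and sticks deliver values here, the controller driver is fine and the problem is likely in the Games app UI layer. A minimal diagnostic sketch (not a fix, purely for triage):

```swift
import GameController

// Hedged diagnostic: print raw D-Pad and stick values from any connected
// extended gamepad. If these fire for a wired Switch Pro Controller, the
// input is reaching the framework and the Games app UI is ignoring it.
func logRawControllerInput() {
    for controller in GCController.controllers() {
        guard let pad = controller.extendedGamepad else { continue }
        print("Controller: \(controller.vendorName ?? "unknown")")

        pad.dpad.valueChangedHandler = { _, x, y in
            print("D-Pad x=\(x) y=\(y)")
        }
        pad.leftThumbstick.valueChangedHandler = { _, x, y in
            print("Left stick x=\(x) y=\(y)")
        }
    }
}
```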
2 replies · 0 boosts · 467 views · Sep ’25

Float64 (Double Precision) Support on MPS with PyTorch on Apple Silicon?
Hi everyone,

This project uses PyTorch on an Apple Silicon Mac (M1/M2/etc.), and the goal is to use the MPS backend for GPU acceleration. However, the workflow depends on Float64 (double-precision) floating-point numbers for certain computations, and the following error comes up:

"Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead"

It seems that the MPS backend doesn't currently support Float64 for direct GPU computation.

Questions for the community:

- Are there any known workarounds or best practices for handling Float64-dependent operations when using the MPS backend with PyTorch?
- For those working with high-precision tasks on Apple Silicon, what strategies are being used to balance performance with the need for Float64?
- Offloading to the CPU is an option; are there specific techniques or libraries within the Apple ecosystem that could streamline this while keeping performance reasonable?

Any insights, tips, or experiences would be appreciated.

Thanks in advance,
Jonaid
MacBook Pro M3 Max
2 replies · 1 boost · 446 views · Oct ’25

How to use Apple Unity Plugins
When running my game in the Unity Editor on the Windows platform I get an error:

DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
Apple.GameKit.DefaultNSErrorHandler.Init () (at ./Library/PackageCache/com.apple.unityplugin.gamekit@0abcad546f73/Source/DefaultHandlers.cs:35)

This is because the GameKitWrapper dynamically linked library is not available on the Windows platform. Besides, the "Apple Build Settings" are declared under UNITY_EDITOR_OSX and are also not available on the Windows platform. Has anyone managed to solve this?
1 reply · 1 boost · 495 views · Jul ’25

Slow compilation
Hi,

I am working with a large project. We compile each material to its own .metallib. They all include many common files full of inline functions. Finally we link it all together at the end with a single big pathtrace kernel. Everything works as expected, however the compile times have gotten completely out of hand and it takes multiple minutes to compile at runtime (to native code). I gather that I can do this offline by using metal-tt; however, I am wondering if there is a way to reduce the compile times in such a scenario, and how to investigate what the root cause of the problem is. I suspect it could have to do with the fact that every material's metallib contains duplicates of all the inline functions. Any ideas on how to profile and debug this?

Thanks,
Rasmus
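Not a fix for the very first compile, but one way to stop paying the runtime AIR-to-native cost on every launch is to harvest the compiled pipelines into an MTLBinaryArchive and reuse it from disk afterwards. A minimal sketch, assuming a compute pipeline; the "pathtraceKernel" function name and archive location are illustrative:

```swift
import Foundation
import Metal

// Hedged sketch: reuse a serialized MTLBinaryArchive so a pipeline is
// compiled from AIR to native code at most once per device/OS combination.
func makePipeline(device: MTLDevice,
                  library: MTLLibrary,
                  archiveURL: URL) throws -> MTLComputePipelineState {
    let archiveDescriptor = MTLBinaryArchiveDescriptor()
    // Point the descriptor at a previously serialized archive, if any.
    if FileManager.default.fileExists(atPath: archiveURL.path) {
        archiveDescriptor.url = archiveURL
    }
    let archive = try device.makeBinaryArchive(descriptor: archiveDescriptor)

    let pipelineDescriptor = MTLComputePipelineDescriptor()
    pipelineDescriptor.computeFunction = library.makeFunction(name: "pathtraceKernel") // illustrative name
    // Metal consults the archive before recompiling from AIR.
    pipelineDescriptor.binaryArchives = [archive]

    let pipeline = try device.makeComputePipelineState(descriptor: pipelineDescriptor,
                                                       options: [],
                                                       reflection: nil)

    // Record this pipeline into the archive and persist it for next launch.
    try archive.addComputePipelineFunctions(descriptor: pipelineDescriptor)
    try archive.serialize(to: archiveURL)
    return pipeline
}
```

For finding the root cause, wrapping each pipeline-state creation in os_signpost intervals at least shows which materials dominate the total compile time.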
0 replies · 1 boost · 120 views · Mar ’25

Portals do not occlude CollisionComponent and InputTargetComponent
Hello,

If you add a ModelEntity to a world inside a portal, the drawing of the model is occluded properly at the portal bounds. However, the invisible shapes of the InputTargetComponent and CollisionComponent are not occluded. They can cross the portal, and if you have gestures on your ModelEntity you can trigger them in areas outside the portal bounds. This happens even if the ModelEntity has no PortalCrossingComponent.
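For anyone trying to reproduce this, here is a minimal sketch of the setup described above, assuming visionOS; meshes and sizes are illustrative. A possible interim workaround is to remove or disable the InputTargetComponent yourself whenever the entity moves outside the portal bounds.

```swift
import RealityKit

// Hedged repro sketch: a tappable model inside a portal world. The model's
// *drawing* clips to the portal plane, but (per the report above) its
// collision/input shape still responds outside the portal bounds.
func makePortalScene() -> Entity {
    let root = Entity()

    // The world that is only visible through the portal.
    let world = Entity()
    world.components.set(WorldComponent())
    root.addChild(world)

    // A tappable model inside that world.
    let model = ModelEntity(mesh: .generateBox(size: 0.3),
                            materials: [SimpleMaterial()])
    model.components.set(InputTargetComponent())
    model.components.set(CollisionComponent(shapes: [.generateBox(size: [0.3, 0.3, 0.3])]))
    world.addChild(model)

    // The portal surface through which the world is rendered.
    let portal = ModelEntity(mesh: .generatePlane(width: 0.5, height: 0.5))
    portal.components.set(PortalComponent(target: world))
    root.addChild(portal)

    return root
}
```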
0 replies · 1 boost · 79 views · Mar ’25

Feature Request: Support .reality File Export in Reality Composer Pro for Mac
I am an AR developer working on Apple Silicon Macs. Currently, Reality Composer Pro does not allow exporting .reality files, and Reality Composer (classic) is not available for Apple Silicon. This creates a gap in the workflow for ARKit/RealityKit developers who need interactive .reality files for use in Xcode projects. Having the ability to export .reality files directly from Reality Composer Pro on Mac would greatly streamline development and enable a fully native workflow on modern Macs. Alternatively, bringing Reality Composer (classic) to Apple Silicon would also resolve this issue. I have submitted this as a feature request via Feedback Assistant (FB17900386). I encourage others with similar needs to reply or submit feedback as well. Thank you!
4 replies · 1 boost · 235 views · Jul ’25

RealityView content scale factor
Hi, following the recent deprecation of SceneKit, I'm trying to move a couple of my SceneKit projects to RealityKit. One thing I can't seem to find is how to change the content scale factor when using a RealityView in SwiftUI. It was really easy to do in SceneKit with just an SCNView property, and it seems to also be possible when using ARView, but I can't find a way to do it with a RealityView. Maybe it's a SwiftUI limitation?
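In case it helps while there is no obvious RealityView API for this: since ARView is a UIView, wrapping it in a UIViewRepresentable restores access to contentScaleFactor. A minimal iOS-flavored sketch (the .nonAR camera mode stands in for a SceneKit-style non-AR scene; this is a workaround, not a RealityView answer):

```swift
import SwiftUI
import RealityKit

// Hedged sketch: host an ARView in SwiftUI so the UIView-level
// contentScaleFactor can be set, much as it could be on SCNView.
struct ScaledRealityView: UIViewRepresentable {
    var scaleFactor: CGFloat

    func makeUIView(context: Context) -> ARView {
        let view = ARView(frame: .zero,
                          cameraMode: .nonAR,
                          automaticallyConfigureSession: false)
        view.contentScaleFactor = scaleFactor
        return view
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        uiView.contentScaleFactor = scaleFactor
    }
}
```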
3 replies · 1 boost · 173 views · Jul ’25

New GameSave API fails, "Couldn’t communicate with a helper application."
I've been playing with the new GameSave API and cannot get it to work. I followed the 3-step instructions from the Developer video. Step 2, "Next, login to your Apple developer account and include this entitlement in the provisioning profile for your game." seems to be unnecessary, as Xcode sets this for you when you do step 1, "First add the iCloud entitlement to your game."

Running the app on my device and tapping "Load" starts the sync, then fails with the error "Couldn’t communicate with a helper application." I have no idea how to troubleshoot this. Every other time I've used CloudKit it has Just Worked™. Halp‽

Here is my example app:

```swift
import Foundation
import SwiftUI
import GameSave

@main
struct GameSaveTestApp: App {
    var body: some Scene {
        WindowGroup { GameView() }
    }
}

struct GameView: View {
    @State private var loader = GameLoader()

    var body: some View {
        List {
            Button("Load") { loader.load() }
            Button("Finish sync") { Task { try? await loader.finish() } }
        }
    }
}

@Observable class GameLoader {
    var directory: GameSaveSyncedDirectory?

    func stateChanged() {
        let newState = withObservationTracking {
            directory?.state
        } onChange: {
            Task { @MainActor [weak self] in self?.stateChanged() }
        }
        print("State changed to \(newState?.description ?? "nil")")
        switch newState {
        case .error(let error):
            print("ERROR: \(error.localizedDescription)")
        default:
            _ = 0 // NOOP
        }
    }

    func load() {
        print("Opening gamesave directory")
        directory = GameSaveSyncedDirectory.openDirectory()
        stateChanged()
    }

    func finish() async throws {
        print("finishing syncing")
        await directory?.finishSyncing()
    }
}
```
7 replies · 1 boost · 458 views · Sep ’25

Subdivision shows in RealityComposerPro but not when loaded in Simulator
Hello, I am trying to use the subdivision mesh rendering option. I can see it working in RealityComposerPro, but not when loading the asset and displaying it in the Simulator. I'm using this code:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct AirspaceView: View {
    // MARK: - VIEW BODY
    var body: some View {
        RealityView { content in
            if let a = try? await Entity(named: "Models/Test/Test.usdc", in: realityKitContentBundle) {
                content.add(a)
            }
        }
    }
}
```

Any ideas why?
2 replies · 1 boost · 589 views · Feb ’25

Moving from SceneKit - fog missing
I am rewriting an unfinished SceneKit project in RealityKit (non-AR). As far as I can see, RealityKit is missing basic fog functionality? Fog was simple and easy to implement in SceneKit (fogStartDistance / fogEndDistance / fogDensityExponent / fogColor). Are there any plans to implement something like this in RealityKit? Are there any simple workarounds?
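I have not found SceneKit-style fog properties in RealityKit either. One workaround on the ARView (non-AR) route is a depth-based blend in a postProcess render callback. A minimal sketch; the kernel, fog color, and falloff are illustrative and untuned, and RealityKit's depth buffer is reversed-Z (1 = near, 0 = far):

```swift
import Metal
import RealityKit

// Hedged sketch: blend each pixel toward a fog color based on scene depth
// in ARView's post-process hook. Tune the falloff to taste.
let fogKernelSource = """
#include <metal_stdlib>
using namespace metal;

kernel void fog(texture2d<float, access::read>  color [[texture(0)]],
                depth2d<float, access::read>    depth [[texture(1)]],
                texture2d<float, access::write> out   [[texture(2)]],
                uint2 gid [[thread_position_in_grid]]) {
    if (gid.x >= out.get_width() || gid.y >= out.get_height()) { return; }
    float4 c = color.read(gid);
    float  d = depth.read(gid);                     // reversed-Z: 0 = far
    float  fogAmount = saturate(pow(1.0 - d, 8.0)); // crude falloff, tune me
    float4 fogColor  = float4(0.75, 0.8, 0.85, 1.0);
    out.write(mix(c, fogColor, fogAmount), gid);
}
"""

func installFog(on arView: ARView) throws {
    let device = MTLCreateSystemDefaultDevice()!
    let library = try device.makeLibrary(source: fogKernelSource, options: nil)
    let pipeline = try device.makeComputePipelineState(function: library.makeFunction(name: "fog")!)

    arView.renderCallbacks.postProcess = { context in
        let encoder = context.commandBuffer.makeComputeCommandEncoder()!
        encoder.setComputePipelineState(pipeline)
        encoder.setTexture(context.sourceColorTexture, index: 0)
        encoder.setTexture(context.sourceDepthTexture, index: 1)
        encoder.setTexture(context.targetColorTexture, index: 2)
        let w = context.targetColorTexture.width
        let h = context.targetColorTexture.height
        encoder.dispatchThreadgroups(MTLSize(width: (w + 7) / 8, height: (h + 7) / 8, depth: 1),
                                     threadsPerThreadgroup: MTLSize(width: 8, height: 8, depth: 1))
        encoder.endEncoding()
    }
}
```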
3 replies · 1 boost · 593 views · Aug ’25

Handling Z-Up Blender USDZ Models in RealityKit (visionOS) for Transform Updates
Hello everyone,

I'm working on a visionOS application using RealityKit and am encountering a common coordinate-system challenge when integrating 3D models created in Blender.

My goal is to display and dynamically update the Transform (position, rotation, scale) of models created in Blender within RealityKit. The issue arises because Blender's default coordinate system is Z-up, and while exporting to USD/USDZ, I don't have a reliable "Y-up" export option that correctly reorients the model and its transform data for RealityKit's Y-up convention. This means I'm essentially exporting models with their "up" direction along the Z-axis. When I load these Z-up exported models into RealityKit, they are often oriented incorrectly. To then programmatically update their Transform (e.g., move them, rotate them based on game logic, or apply physics), I need to ensure that the Transform values I set align with RealityKit's Y-up system, even though the original model data was authored in a Z-up context.

My questions are:

1. What is the recommended transformation process (e.g., using simd_quatf or simd_float4x4) to convert a Transform that was conceptually defined in a Z-up coordinate system to RealityKit's Y-up coordinate system? Specifically, when I have a Transform (or its translation, rotation, scale components) from a Z-up context, how should I apply it to a RealityKit Entity so it appears and behaves correctly in a Y-up world?
2. Are there any existing convenience APIs or helper functions within RealityKit, simd, or other Apple frameworks that simplify this Z-up to Y-up Transform conversion? Or is manually applying a transformation quaternion (e.g., simd_quatf(angle: -.pi / 2, axis: [1, 0, 0])) the standard approach?

Any guidance, code examples, or best practices from those who have faced similar challenges would be greatly appreciated! Thank you.
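Not an official API, but the change of basis works out to a fixed -90° rotation about X, which maps a Z-up point (x, y, z) to the Y-up point (x, z, -y). A minimal sketch of both approaches (a rotated container entity so game logic can keep authoring in Z-up, or converting Transform values directly); helper names are illustrative:

```swift
import RealityKit
import simd

/// Rotation taking Blender's Z-up frame onto RealityKit's Y-up frame:
/// (x, y, z) -> (x, z, -y).
let zUpToYUp = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))

/// Approach 1: parent the Z-up asset under a rotated container, then keep
/// writing Z-up transforms to the inner entity unchanged.
func makeYUpContainer(for zUpModel: Entity) -> Entity {
    let container = Entity()
    container.orientation = zUpToYUp
    container.addChild(zUpModel)
    return container
}

/// Approach 2: convert a Transform authored in Z-up to Y-up directly.
/// Rotate the translation, conjugate the rotation, and swap the Y/Z scale
/// axes (assumes axis-aligned scale).
func convertToYUp(_ t: Transform) -> Transform {
    Transform(scale: SIMD3(t.scale.x, t.scale.z, t.scale.y),
              rotation: zUpToYUp * t.rotation * zUpToYUp.inverse,
              translation: zUpToYUp.act(t.translation)) // (x, z, -y)
}
```

With the container approach there is nothing further to convert; the inner entity behaves as if it still lived in a Z-up world.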
1 reply · 1 boost · 415 views · Jul ’25

Multiply exr lightmap in Reality Composer Pro Shader Graph
I’m trying to use EXR lightmaps to overlay baked lighting on top of a base texture in the RCP Shader Graph. When I multiply an EXR image set to Image(float) with an 8-bit base texture, the output becomes Image(float). I can’t connect that to the BaseColor input on the UnlitSurface node, since it only accepts Color3f. I expected to be able to use a Convert node between the Multiply node and the BaseColor input, but when I do that, the result becomes black and white instead of the expected outcome: the EXR multiplied with the base texture using a baseline value of 1, where values below 1 in the EXR would darken the base texture and values above 1 would brighten it. Is there any documentation on how to properly overlay a 32-bit EXR lightmap in the RCP Shader Graph, or is the black-and-white output from the Convert node a bug?
7 replies · 0 boosts · 1k views · 2w

virtual game controller + SwiftUI warning
Hi, I've just moved my SpriteKit-based game from UIView to SwiftUI + SpriteView and I'm getting this message:

Adding 'GCControllerView' as a subview of UIHostingController.view is not supported and may result in a broken view hierarchy. Add your view above UIHostingController.view in a common superview or insert it into your SwiftUI content in a UIViewRepresentable instead.

Here's how I'm doing this:

```swift
struct ContentView: View {
    @State var alreadyStarted = false
    let initialScene = GKScene(fileNamed: "StartScene")!.rootNode as! SKScene

    var body: some View {
        ZStack {
            SpriteView(scene: initialScene,
                       transition: .crossFade(withDuration: 1),
                       isPaused: false,
                       preferredFramesPerSecond: 60)
                .edgesIgnoringSafeArea(.all)
                .onAppear {
                    if !self.alreadyStarted {
                        self.alreadyStarted.toggle()
                        initialScene.scaleMode = .aspectFit
                    }
                }
            VirtualControllerView()
                .onAppear {
                    let virtualController = BTTSUtilities.shared.makeVirtualController()
                    BTTSSharedData.shared.virtualGameController = virtualController
                    BTTSSharedData.shared.virtualGameController?.connect()
                }
                .onDisappear {
                    BTTSSharedData.shared.virtualGameController?.disconnect()
                }
        }
    }
}

struct VirtualControllerView: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let result = PassthroughView()
        return result
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

class PassthroughView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        for subview in subviews.reversed() {
            let convertedPoint = convert(point, to: subview)
            if let hitView = subview.hitTest(convertedPoint, with: event) {
                return hitView
            }
        }
        return nil
    }
}
```
1 reply · 0 boosts · 388 views · Sep ’25

Metal 4: When is it OK to dealloc an MTLBuffer's memory
I have something like this drawing in an MTKView (see at bottom). I am finding it difficult to figure out when the Swift-land resources used in making the MTLBuffer(s) can be released. Below, for example, is it OK if args goes out of scope (or is otherwise deallocated) at point 1, 2, or 3? Or perhaps even earlier, as soon as argsBuffer has been created?

I have been reading through various articles such as:

- Setting resource storage modes
- Choosing a resource storage mode for Apple GPUs
- Copying data to a private resource

but it's a lot to absorb and I haven't really been able to find an authoritative description of the required lifetime of the resources in CPU land.

I should mention that this is Metal 4 code. In previous versions of Metal, the MTLCommandBuffer had the ability to add a completion handler to be called by the GPU after it has finished running the commands in the buffer, but in Metal 4 there is no such thing (if it were even needed for the purpose I am interested in).

Any advice and/or pointers to the definitive literature will be appreciated.

```swift
guard let argsBuffer = device.makeBuffer(bytes: &args, ...

argumentTable.setAddress(argsBuffer.gpuAddress, ...
encoder.setArgumentTable(argumentTable, stages: .vertex)

// encode drawing
renderEncoder.draw...
...
encoder.endEncoding()   // 1

commandBuffer.endCommandBuffer()   // 2

commandQueue.waitForDrawable(drawable)
commandQueue.commit([commandBuffer])   // 3

commandQueue.signalDrawable(drawable)
drawable.present()
```
2 replies · 0 boosts · 179 views · 3w

Looking for some clarification
I was wondering if anyone from Apple could provide some clarification. The gaming studio Epic Games is wondering if it could distribute the award-winning game Fortnite back on macOS without any retaliation. I know Fortnite being back on macOS would benefit thousands of macOS devs. Hoping to get clarification so Epic could start bringing Fortnite back.
0 replies · 1 boost · 605 views · Dec ’25

Race conditions when changing CAMetalLayer.drawableSize?
Is the pseudocode below thread-safe? Imagine that the Main thread sets the CAMetalLayer's drawableSize to a new size while the rendering thread is in the middle of rendering into an existing MTLDrawable which still has the old size.

- Is the change of metalLayer.drawableSize thread-safe in the sense that I can present an old MTLDrawable which has a different resolution than the current value of metalLayer.drawableSize? I assume that setting the drawableSize property informs Metal that the next MTLDrawable offered by the CAMetalLayer should have the new size, right?
- Is it valid to assume that "metalLayer.drawableSize = newSize" and "metalLayer.nextDrawable()" are internally synchronized, so it cannot happen that metalLayer.nextDrawable() would produce e.g. an MTLDrawable with the old width but the new height (or a completely invalid resolution due to a race condition)?

```swift
func onWindowResized(newSize: CGSize) {
    // Called on the Main thread
    metalLayer.drawableSize = newSize
}

func onVsync(drawable: MTLDrawable) {
    // Called on a background rendering thread
    renderer.renderInto(drawable: drawable)
}
```
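I don't know of documentation that guarantees that internal synchronization, so one defensive pattern is to route both the size update and nextDrawable() through the same lock, which makes the question moot. A minimal sketch (assumes mutating drawableSize from the render thread is acceptable in your setup; names are illustrative):

```swift
import Foundation
import QuartzCore

// Hedged sketch: serialize drawableSize writes and nextDrawable() behind one
// lock so a resize can never interleave with drawable acquisition.
final class DrawableSource {
    private let layer: CAMetalLayer
    private let lock = NSLock()
    private var pendingSize: CGSize?

    init(layer: CAMetalLayer) { self.layer = layer }

    // Main thread: just record the requested size; don't touch the layer.
    func windowResized(to newSize: CGSize) {
        lock.lock(); pendingSize = newSize; lock.unlock()
    }

    // Render thread: apply any pending size, then acquire the drawable,
    // all under the same lock, so size and drawable always agree.
    func nextDrawable() -> CAMetalDrawable? {
        lock.lock(); defer { lock.unlock() }
        if let size = pendingSize {
            layer.drawableSize = size
            pendingSize = nil
        }
        return layer.nextDrawable()
    }
}
```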
1 reply · 1 boost · 468 views · Dec ’25