Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

Posts under Graphics & Games topic


iPad - Can I prevent Multitasking on my app?
I have a game built in Unreal Engine 5.6 which uses tilt motion controls to rotate an object. I've restricted the app to run only in portrait on iPhone, and everything works fine. On iPad, however, I've had a few issues relating to multitasking that I can't seem to solve.

Forcing the app to portrait only still allows it to run in landscape, but shows black bars on either side of the game, and the axes for the motion controls are incorrect: X becomes Y and Y becomes X, and there's no way for my app to know which orientation it is because the container is still technically portrait. Allowing my game to run in all orientations makes the whole app more presentable: there are no black bars, the game is still functional, and I'm able to map the controls correctly because the game knows it's landscape rather than portrait. The problem with allowing landscape is that if multitasking is enabled on the iPad, you can resize the app to be portrait, and then I run into the same problem again where the game thinks it's in portrait mode and all of the axes are wrong.

I tried getting the true orientation of the device rather than the scene, but the game is intended to be played flat, so instead of returning the orientation of the OS, the device orientation is FaceUp, which doesn't help. I need to either disable multitasking or find a way of getting the orientation of the OS (not the scene or the device). Since I haven't found how to get the OS orientation, I've been trying to disable multitasking. I've set Requires Fullscreen to true and UIApplicationSupportsMultipleScreens to false in my Info.plist, but my iPad still seems to allow the window to be resized in landscape. Opening the iOS workspace of my project, Requires Fullscreen is ticked, but under it there is "Supports Multiple Windows", and the arrow button next to it takes me to my Info.plist values with no indication of how I can change it.

I'm using Unreal Engine 5.6 and Xcode 16.0. Xcode is old, I know, but this version of Unreal Engine doesn't seem to support anything newer.
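Two things may be worth checking here. First, the Info.plist key behind Xcode's "Supports Multiple Windows" row is UIApplicationSupportsMultipleScenes; if the plist actually contains UIApplicationSupportsMultipleScreens, that setting will have no effect. Second, the window scene's reported orientation can differ from what the engine sees. Below is a minimal native sketch (assuming you can bridge a small helper into the Unreal project; the function name is illustrative) that asks UIKit for the scene's orientation rather than the device's:

import UIKit

/// Hypothetical helper to bridge into the engine; must run on the main thread.
/// Unlike UIDevice.current.orientation (which reports .faceUp for a device
/// lying flat), this returns the interface orientation of the window scene.
func sceneInterfaceOrientation() -> UIInterfaceOrientation {
    let windowScene = UIApplication.shared.connectedScenes
        .compactMap { $0 as? UIWindowScene }
        .first
    return windowScene?.interfaceOrientation ?? .unknown
}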
Replies: 0 · Boosts: 0 · Views: 218 · Created: 3w

Game Center leaderboards not posting scores
My app is live, but the leaderboards still aren't updating. The app was built with Unreal Engine 5 using Blueprints. I have the leaderboard stat info entered into the Write Integer to Leaderboard node, and a node for Show Platform-Specific Leaderboard. The leaderboards are shown as live in App Store Connect. When I run the app, the Game Center login works and the leaderboard interface launches as expected, but it just lists a group of friends to invite. There are no scores listed, and it says the number of players is 0, even though I have scored on two different devices and accounts. I have the Game Center entitlement added in Xcode. Not sure where else to look.
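If the Blueprint path is hard to debug, a minimal native sketch for sanity-checking submission outside Unreal can help separate plugin issues from configuration issues ("my_leaderboard_id" is a placeholder for the ID configured in App Store Connect):

import GameKit

/// Submit a test score directly via GameKit; an error here (or continued
/// silence in Game Center afterwards) narrows down where the pipeline breaks.
func submitTestScore() {
    guard GKLocalPlayer.local.isAuthenticated else { return }
    GKLeaderboard.submitScore(42,
                              context: 0,
                              player: GKLocalPlayer.local,
                              leaderboardIDs: ["my_leaderboard_id"]) { error in
        if let error { print("Score submission failed: \(error)") }
    }
}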
Replies: 0 · Boosts: 0 · Views: 599 · Created: 3w

Metal: Intersection results unstable when reusing Instance Acceleration Structures
Hi all, I'm encountering an issue with Metal raytracing on my M5 MacBook Pro regarding Instance Acceleration Structures (IAS): intersection tests suddenly stop working after a certain point in the sampling loop.

Situation
I implemented an offline GPU path tracer that runs the same kernel multiple times per pixel (sampleCount) using metal::raytracing. Intersection tests are performed using an IAS. Since this is an offline path tracer, the geometry inside the IAS never changes across samples (no transforms or updates). As sampleCount increases, there comes a point where the number of intersections drops to zero and remains zero for all subsequent samples. Here's a code sketch:

let sampleCount: UInt16 = 1024
for sampleIndex: UInt16 in 0..<sampleCount {
    // ...
    do {
        let commandBuffer = commandQueue.makeCommandBuffer()
        // Dispatch the intersection kernel.
        await commandBuffer.completed()
    }
    do {
        let commandBuffer = commandQueue.makeCommandBuffer()
        // Use the intersection test results from the previous command buffer.
        await commandBuffer.completed()
    }
    // ...
}

kernel void intersectAlongRay(
    const metal::uint32_t threadIndex [[thread_position_in_grid]],
    // ...
    const metal::raytracing::instance_acceleration_structure accelerationStructure [[buffer(2)]],
    // ...
) {
    // ...
    const auto result = intersector.intersect(ray, accelerationStructure);
    switch (result.type) {
        case metal::raytracing::intersection_type::triangle: {
            // Write intersection result to device buffers.
            break;
        }
        default:
            break;
    }
}

Observations
- Encoding both the intersection kernel and the subsequent result usage in the same command buffer does not resolve the problem.
- Switching from the IAS to a Primitive Acceleration Structure (PAS) fixes the problem.
- Rebuilding the IAS for each sample also resolves the issue.
- Intersections produce inconsistent results even though the IAS and rays are identical — Image 1 shows a hit, while Image 2 shows a miss.

Questions
- Am I misusing the IAS in some way?
- Could this be a Metal bug?

Any guidance or confirmation would be greatly appreciated.
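One pattern worth double-checking, offered as a hedged sketch rather than a confirmed diagnosis: an instance acceleration structure only references its primitive acceleration structures indirectly, so Metal requires them to be made resident on the encoder explicitly, and forgetting this can produce exactly this kind of intermittent miss. Assuming primitiveStructures holds the PASes your instances point at:

import Metal

func encodeIntersection(commandBuffer: MTLCommandBuffer,
                        pipeline: MTLComputePipelineState,
                        instanceStructure: MTLAccelerationStructure,
                        primitiveStructures: [MTLAccelerationStructure],
                        gridSize: MTLSize,
                        threadgroupSize: MTLSize) {
    let encoder = commandBuffer.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(pipeline)
    // Bind the IAS at the index the kernel expects ([[buffer(2)]] above).
    encoder.setAccelerationStructure(instanceStructure, bufferIndex: 2)
    // Mark each primitive structure resident; the IAS does not do this for you.
    for pas in primitiveStructures {
        encoder.useResource(pas, usage: .read)
    }
    encoder.dispatchThreads(gridSize, threadsPerThreadgroup: threadgroupSize)
    encoder.endEncoding()
}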
Replies: 1 · Boosts: 0 · Views: 266 · Created: 4w

Game Porting Toolkit installation fails: CMake compatibility error in game-porting-toolkit-compiler
I'm encountering a build failure when trying to install the Game Porting Toolkit via Homebrew. The installation fails during the game-porting-toolkit-compiler dependency build phase with a CMake compatibility error.

Error Message:

CMake Error at CMakeLists.txt:3 (cmake_minimum_required):
  Compatibility with CMake < 3.5 has been removed from CMake.
  Update the VERSION argument <min> value. Or, use the <min>...<max> syntax
  to tell CMake that the project requires at least <min> but has been updated
  to work with policies introduced by <max> or earlier. Or, add
  -DCMAKE_POLICY_VERSION_MINIMUM=3.5 to try configuring anyway.
-- Configuring incomplete, errors occurred!

Environment:
macOS: 15.6.1 (Sequoia)
Homebrew: 5.0.1
CMake: 3.20.2
Architecture: x86_64 (via Rosetta)
Formula: apple/apple/game-porting-toolkit-compiler v0.1
Source: crossover-sources-22.1.1.tar.gz

Steps to Reproduce:
1. Install x86_64 Homebrew for Rosetta compatibility
2. Run: arch -x86_64 /usr/local/bin/brew install apple/apple/game-porting-toolkit
3. Build fails during dependency installation

Root Cause:
The LLVM/Clang sources included in crossover-sources-22.1.1.tar.gz contain a CMakeLists.txt that declares a minimum CMake version lower than 3.5. CMake 4.0 and later have removed backward compatibility with such old minimum-version declarations.

Potential Solutions:
- Update the Homebrew formula to patch the CMakeLists.txt with cmake_minimum_required(VERSION 3.5) or higher
- Update to newer CrossOver sources with updated CMake requirements
- Add the -DCMAKE_POLICY_VERSION_MINIMUM=3.5 flag to the CMake build command in the formula

Is this a known issue? Are there plans to update the formula or the source package to resolve this compatibility problem? Any guidance on a workaround would be appreciated.

Full log available at: /Users/kentarovadney/Library/Logs/Homebrew/game-porting-toolkit-compiler/02.cmake.log

Thanks for any assistance!
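If patching the formula isn't practical, one possible workaround to try first, hedged: recent CMake releases also read CMAKE_POLICY_VERSION_MINIMUM from the environment, though Homebrew may scrub the variable from its build environment, in which case editing the formula remains necessary:

# Hedged workaround sketch; only effective if the variable survives into the build
export CMAKE_POLICY_VERSION_MINIMUM=3.5
arch -x86_64 /usr/local/bin/brew install apple/apple/game-porting-toolkit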
Replies: 1 · Boosts: 0 · Views: 736 · Created: Nov ’25

Request low-latency streaming for iOS/iPadOS
Just found out this key is available for visionOS: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.developer.low-latency-streaming It seems to keep video streaming from being interrupted by AWDL; our community needs it badly for self-hosted game streaming (PC to iPhone / iPad). Related apps: Moonlight / VoidLink / SteamLink. Can we expect this on iOS/iPadOS 26, or even iOS/iPadOS 18?
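For reference, this is how the entitlement is declared in an entitlements file on visionOS today, per the page linked above; whether iOS/iPadOS would honor the same key is exactly what's being requested:

<key>com.apple.developer.low-latency-streaming</key>
<true/>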
Replies: 1 · Boosts: 3 · Views: 272 · Created: Nov ’25

Request to restore full ICC profile support (LUT-based display profiles) in macOS ColorSync
Dear Apple Color Management Team, I’m a professional visual creator working on color-critical photo and graphic projects using macOS (currently 26.1 Tahoe). In recent macOS releases, LUT-based ICC display profiles (such as XYZ LUT + Matrix types generated by DisplayCAL or professional spectrophotometers) can no longer be installed or activated via ColorSync. This limitation significantly affects professional workflows in photography, graphic design, prepress, and video color grading — fields that rely on precise display profiling. The current workaround (converting LUT profiles to simple shaper/matrix ICC v2) results in less accurate tone response and color reproduction, particularly in the dark range and wide-gamut displays. I kindly request Apple to restore or re-enable the ability to install and use ICC v2/v4 LUT-based display profiles under ColorSync, as was possible on macOS Monterey and Ventura. This would allow professionals to continue using trusted calibration tools such as DisplayCAL, X-Rite i1Profiler, and Calibrite Profiler to achieve accurate color management. macOS is widely used in professional creative industries, and restoring this feature would be a huge help for countless photographers, designers, and colorists. Thank you for your attention and commitment to professional users. Best regards, Richárd Deutsch Professional Photographer https://riccio.hu/ MacBook Pro (M4 Pro, macOS 26.1)
Replies: 6 · Boosts: 0 · Views: 1.2k · Created: Nov ’25

MetalFX for Unity 2022.3.62f3?
Hi, I’m testing Unity’s Spaceship HDRP demo on iPhone 17 Pro Max and iPad Pro M4 (iOS 26.1). Everything renders correctly, and my custom MetalFX Spatial plugin initializes successfully — it briefly reports active scaling (e.g. 1434×660 → 2868×1320 at 50% scaling), then reverts to native rendering a few frames later.

Setup:
Xcode 16.1 (targeting iOS 18)
Unity 2022.3.62f3 (HDRP)
Metal backend
Dynamic Resolution enabled in HDRP assets and cameras

Relevant Xcode console excerpt:

[MetalFXPlugin] MetalFX_Enable(True) called.
[SpaceshipOptions] MetalFX enabled with HDRP dynamic resolution integration.
[SpaceshipOptions] Disabled TAA for MetalFX Spatial.
[SpaceshipOptions] Created runtime RenderTexture: 1434x660
[MetalFX] Spatial scaler created (1434x660 → 2868x1320).
[MetalFX] Processed frame with scaler.
[MetalFXPlugin] Sent RenderTexture (1434x660) to MetalFX. Output target 2868x1320.
[SpaceshipOptions] MetalFX target set: 1434x660
[SpaceshipOptions] Camera targetTexture cleared after MetalFX handoff.

It looks like HDRP clears the camera’s target texture right after MetalFX submits the frame, which causes it to revert to native rendering. Is there a recommended way to persist or rebind the MetalFX output texture when using HDRP on iOS? Unity doesn’t appear to support MetalFX in the Editor either.

Thanks!
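For anyone comparing notes on the native side, the spatial-scaler setup such a plugin wraps looks roughly like the sketch below, using the resolutions from the log; the pixel formats and processing mode are assumptions, and the failure described above is on the Unity/HDRP side (the cleared targetTexture), not in the scaler itself:

import Metal
import MetalFX

/// Sketch of MetalFX spatial-scaler creation; returns nil if the device
/// doesn't support MetalFX.
func makeSpatialScaler(device: MTLDevice) -> MTLFXSpatialScaler? {
    let desc = MTLFXSpatialScalerDescriptor()
    desc.inputWidth = 1434
    desc.inputHeight = 660
    desc.outputWidth = 2868
    desc.outputHeight = 1320
    desc.colorTextureFormat = .bgra8Unorm   // assumed; match your render target
    desc.outputTextureFormat = .bgra8Unorm
    desc.colorProcessingMode = .perceptual
    return desc.makeSpatialScaler(device: device)
}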
Replies: 0 · Boosts: 0 · Views: 124 · Created: Nov ’25

The App Store purchase button disappears when another window approaches
Since macOS 15.3.2, we have observed that when another window is moved near the App Store's install button, the button disappears. We have attached a related video in the Feedback submission here: https://feedbackassistant.apple.com/feedback/20444423 Our application overlays a transparent watermark window on top of the system window, which causes the install button in the App Store to be hidden when a user attempts to install an application. Could you advise on how to avoid this issue?
Replies: 0 · Boosts: 0 · Views: 149 · Created: Nov ’25

Multiply exr lightmap in Reality Composer Pro Shader Graph
I’m trying to use EXR lightmaps to overlay baked lighting on top of a base texture in the RCP Shader Graph. When I multiply an EXR image set to Image(float) with an 8-bit base texture, the output becomes Image(float). I can’t connect that to the BaseColor input on the UnlitSurface node, since it only accepts Color3f. I expected to be able to use a Convert node between the Multiply node and the BaseColor input, but when I do that, the result becomes black and white instead of the expected outcome: the EXR multiplied with the base texture using a baseline value of 1, where values below 1 in the EXR would darken the base texture and values above 1 would brighten it. Is there any documentation on how to properly overlay a 32-bit EXR lightmap in the RCP Shader Graph, or is the black-and-white output from the Convert node a bug?
Replies: 6 · Boosts: 0 · Views: 599 · Created: Nov ’25

RealityView postProcess effect depth texture
Hello, a question re: iOS RealityView postProcess. I've got a working postProcess kernel and I'd like to add some depth-based effects to it. Theoretically I should be able to just do:

encoder.setTexture(context.sourceDepthTexture, index: 1)

and then in the kernel:

texture2d<float, access::read> depthIn [[texture(1)]]
...
outTexture.write(depthIn.read(gid), gid);

And I consistently see all black rendered to the view. The postProcess shader works, so that's not the issue; it just seems not to be receiving actual depth information. (If I set a breakpoint at the encoder setTexture step, I can preview the color texture of the scene, but the context's depthTexture looks all NaN / blank.)

I've looked at all the WWDC samples, but they use ARView for all the depth sample code, which has a different set of configuration options than RealityView. So far I haven't seen anywhere to explicitly tell RealityView "include the depth information", so I'm not sure if I'm missing something there. It appears that there is indeed a depth texture being passed, but it looks blank. Is there a working example somewhere that we can reference?
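One low-effort check, offered as a sketch: raw scene depth is often non-linear and bunched near 0 or 1, so a frame that genuinely contains depth can still look uniformly black. The kernel below (binding index 1 and the depth2d type are assumptions based on the snippet above) exaggerates the range to distinguish "subtle but present" from "truly empty":

#include <metal_stdlib>
using namespace metal;

kernel void debugDepth(texture2d<half, access::write> outTexture [[texture(0)]],
                       depth2d<float, access::read>   depthIn    [[texture(1)]],
                       uint2 gid [[thread_position_in_grid]])
{
    if (gid.x >= outTexture.get_width() || gid.y >= outTexture.get_height()) {
        return;
    }
    // Scale the raw depth so small variations become visible before judging
    // the texture to be all black / NaN.
    float d = depthIn.read(gid);
    half v = half(saturate(d * 20.0f));
    outTexture.write(half4(v, v, v, 1.0h), gid);
}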
Replies: 2 · Boosts: 0 · Views: 516 · Created: Nov ’25

Help Request! How to Render Models with SubMeshes Using Metal 4?
Hi, I'm a beginner with Metal 4 and Model I/O 🥺. I can render simple models with just one mesh, but when I try to render models with submeshes, nothing shows up on screen. Can anyone help me figure out how to properly render models with multiple submeshes? I think I'm not iterating through them correctly, or maybe I'm missing some buffer setup. Here's what I have so far: https://www.icloud.com.cn/iclouddrive/0a6x_NLwlWy-herPocExZ8g3Q#LoadModel
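For reference, the usual draw loop over MetalKit submeshes looks like the sketch below. It uses the classic MTLRenderCommandEncoder API (the Metal 4 encoder is analogous), and mesh is assumed to be an MTKMesh obtained via MTKMesh.newMeshes(asset:device:):

import MetalKit

/// An MTKMesh has vertex buffers shared by all submeshes, and each submesh
/// contributes its own index buffer and draw call. Skipping either loop is a
/// common reason multi-submesh models render nothing.
func draw(mesh: MTKMesh, encoder: MTLRenderCommandEncoder) {
    for (index, vertexBuffer) in mesh.vertexBuffers.enumerated() {
        encoder.setVertexBuffer(vertexBuffer.buffer,
                                offset: vertexBuffer.offset,
                                index: index)
    }
    for submesh in mesh.submeshes {
        encoder.drawIndexedPrimitives(type: submesh.primitiveType,
                                      indexCount: submesh.indexCount,
                                      indexType: submesh.indexType,
                                      indexBuffer: submesh.indexBuffer.buffer,
                                      indexBufferOffset: submesh.indexBuffer.offset)
    }
}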
Replies: 1 · Boosts: 0 · Views: 199 · Created: Nov ’25

Crash occurring when authenticating user for Game Center
I am using the latest version of the Game Center plugin for Unity and have noticed that my game will crash on launch when trying to authenticate. I've tried this in an empty project with just the plugin, and it still crashes with this exception:

GfxDevice: creating device client; threaded=1; jobified=0
Initializing Metal device caps: Apple A14 GPU
Initialize engine version: 2022.3.62f2 (7670c08855a9)
GameKitException: Code=-7 Domain=GKErrorDomain Description=The operation couldn’t be completed. (GKErrorDomain error -7.) (UnsupportedOperationForOSVersion)
  at Apple.GameKit.DefaultNSErrorHandler.ThrowNSError (System.IntPtr nsErrorPtr) [0x00000] in <00000000000000000000000000000000>:0
Rethrow as TypeInitializationException: The type initializer for 'Apple.GameKit.GKGameActivity' threw an exception.

The area in the native code that triggers the crash is this, inside the GKLocalPlayer_SetAuthenticateHandler function:

_onAuthenticate!(tid, _mostRecentAuthenticatePlayer!.passRetainedUnsafeMutablePointer());

I am using Unity 2022.3.62f2 and macOS 15.6 with iOS 18.6.2, which, based on the min specs for the plugin, should be within spec. I have also included this output because I thought it might help too:

terminating due to uncaught exception of type Il2CppExceptionWrapper
Could not import Swift modules for translation unit: failed to get module "GameKitWrapper" from AST context:
error: 'GKErrorCodeExtension.h' file not found in file included from :1:
error: could not build Objective-C module 'GameKitWrapper'
warning: Ignoring missing VFS file: /Users/james/Library/Developer/Xcode/DerivedData/GameKitWrapper-dzawbtxqdxdviiakfxmfunexppqv/Build/Intermediates.noindex/GameKitWrapper.build/Release-iphoneos/GameKitWrapper-bc72bd3638f4d2956cac9b00e84c1a7d-VFS-iphoneos/all-product-headers.yaml This is the likely root cause for any subsequent compiler errors.
warning: Ignoring missing VFS file: /Users/bill/Library/Developer/Xcode/DerivedData/GameKitWrapper-dzawbtxqdxdviiakfxmfunexppqv/Build/Intermediates.noindex/GameKitWrapper.build/Release-iphoneos/GameKitWrapper iOS.build/unextended-module-overlay.yaml This is the likely root cause for any subsequent compiler errors.
warning: TypeSystemSwiftTypeRef::GetNumChildren: had to engage SwiftASTContext fallback for type $syyXCD

I've also attached the script that I am using for authentication; this script runs on the first scene: GameCenterManager.cs
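The UnsupportedOperationForOSVersion error appears to come from the type initializer of GKGameActivity, a GameKit type newer than the iOS 18.6.2 device in question. As a hedged isolation step (assuming the plugin's Task-based Authenticate API; this catches the managed exception but cannot stop a native abort):

using System.Threading.Tasks;
using Apple.GameKit;
using UnityEngine;

public static class SafeGameCenterAuth
{
    // Wrap authentication so a GameKitException surfaces as a log entry
    // instead of an unhandled crash while the root cause is investigated.
    public static async Task<GKLocalPlayer> AuthenticateAsync()
    {
        try
        {
            return await GKLocalPlayer.Authenticate();
        }
        catch (GameKitException e)
        {
            Debug.LogError($"Game Center authentication failed: {e}");
            return null;
        }
    }
}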
Replies: 1 · Boosts: 0 · Views: 227 · Created: Nov ’25

Cannot load .mtlpackage to MTLLibrary
After watching the WWDC 2025 session "Combine Metal 4 machine learning and graphics", I decided to give it a shot and integrate the new MTL4MachineLearningCommandEncoder into my existing render pipeline. After a lot of trial and error, I managed to set up the pipeline and get the app to compile. However, I am now stuck on creating an MTLLibrary from an .mtlpackage. Here is the code I have to create the MTLLibrary, following the WWDC session https://developer.apple.com/videos/play/wwdc2025/262/?time=550:

let coreMLFilePath = bundle.path(forResource: "my_model", ofType: "mtlpackage")!
let coreMLURL = URL(string: coreMLFilePath)!
do {
    try metalDevice.makeLibrary(URL: coreMLURL)
} catch {
    print("error: \(error)")
}

With the above code, I am getting:

error: Error Domain=MTLLibraryErrorDomain Code=1 "Invalid metal package" UserInfo={NSLocalizedDescription=Invalid metal package}

What is the correct way to create an MTLLibrary from an .mtlpackage? Do I see this error because the .mtlpackage I am using is incorrect? How should I go about debugging this? I'd really appreciate some help, as I have been stuck on this for some time now. Thanks in advance!
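One hedged observation: URL(string:) on a bare filesystem path yields a URL with no file:// scheme, which Metal may reject before it ever inspects the package contents. A sketch of the file-URL route ("my_model" is the name from the post; whether this clears the "Invalid metal package" error depends on the package itself):

import Foundation
import Metal

/// Build a proper file URL for the package and load it as a library.
func loadMTLPackageLibrary(device: MTLDevice, bundle: Bundle) throws -> MTLLibrary {
    guard let url = bundle.url(forResource: "my_model", withExtension: "mtlpackage") else {
        throw URLError(.fileDoesNotExist)
    }
    return try device.makeLibrary(URL: url)
}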
Replies: 0 · Boosts: 0 · Views: 168 · Created: Nov ’25

ARView ignores multi-touch events
Hi, how do I enable multitouch on ARView? The touch functions (touchesBegan, touchesMoved, ...) seem to only handle one touch at a time. In order to handle multiple touches at a time with ARView, I have to either:
- Use SwiftUI .simultaneousGesture on top of an ARView representable
- Position a UIView on top of ARView to capture touches and do hit testing by passing a reference to ARView

Expected behavior: ARView should capture all touches via touchesBegan/Moved/Ended/Cancelled.

Here is what I tried, on iOS 26.1 and macOS 26.1:

ARView Multitouch

The setup below is a minimal ARView presented by SwiftUI, with touch events handled inside ARView. Multitouch doesn't work with this setup. Note that multitouch wouldn't work either if the ARView were presented with a UIViewController instead of SwiftUI.

import RealityKit
import SwiftUI

struct ARViewMultiTouchView: View {
    var body: some View {
        ZStack {
            ARViewMultiTouchRepresentable()
                .ignoresSafeArea()
        }
    }
}

#Preview {
    ARViewMultiTouchView()
}

// MARK: Representable ARView

struct ARViewMultiTouchRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARViewMultiTouch(frame: .zero)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

// MARK: ARView

class ARViewMultiTouch: ARView {
    required init(frame: CGRect) {
        super.init(frame: frame)

        /// Enable multi-touch
        isMultipleTouchEnabled = true

        cameraMode = .nonAR
        automaticallyConfigureSession = false
        environment.background = .color(.gray)

        /// Disable gesture recognizers to not conflict with touch events
        /// But it doesn't fix the issue
        gestureRecognizers?.forEach { $0.isEnabled = false }
    }

    required dynamic init?(coder decoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            /// # Problem
            /// This should print for every new touch, up to 5 simultaneously on an iPhone (multi-touch)
            /// But it only fires for one touch at a time (single-touch)
            print("Touch began at: \(touch.location(in: self))")
        }
    }
}

Multitouch with an Overlay

This setup works, but it doesn't seem right. There must be a way to make ARView handle multitouch directly, right?
import SwiftUI
import RealityKit

struct MultiTouchOverlayView: View {
    var body: some View {
        ZStack {
            MultiTouchOverlayRepresentable()
                .ignoresSafeArea()
            Text("Multi touch with overlay view")
                .font(.system(size: 24, weight: .medium))
                .foregroundStyle(.white)
                .offset(CGSize(width: 0, height: -150))
        }
    }
}

#Preview {
    MultiTouchOverlayView()
}

// MARK: Representable Container

struct MultiTouchOverlayRepresentable: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        /// The view that SwiftUI will present
        let container = UIView()

        /// ARView
        let arView = ARView(frame: container.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        arView.cameraMode = .nonAR
        arView.automaticallyConfigureSession = false
        arView.environment.background = .color(.gray)

        let anchor = AnchorEntity()
        arView.scene.addAnchor(anchor)

        let boxWidth: Float = 0.4
        let boxMaterial = SimpleMaterial(color: .red, isMetallic: false)
        let box = ModelEntity(mesh: .generateBox(size: boxWidth), materials: [boxMaterial])
        box.name = "Box"
        box.components.set(CollisionComponent(shapes: [.generateBox(width: boxWidth, height: boxWidth, depth: boxWidth)]))
        anchor.addChild(box)

        /// The view that will capture touches
        let touchOverlay = TouchOverlayView(frame: container.bounds)
        touchOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        touchOverlay.backgroundColor = .clear

        /// Pass an arView reference to the overlay for hit testing
        touchOverlay.arView = arView

        /// Add views to the container.
        /// ARView goes in first, at the bottom.
        container.addSubview(arView)
        /// TouchOverlay goes in last, on top.
        container.addSubview(touchOverlay)

        return container
    }

    func updateUIView(_ uiView: UIView, context: Context) { }
}

// MARK: Touch Overlay View

/// A UIView to handle multi-touch on top of ARView
class TouchOverlayView: UIView {
    weak var arView: ARView?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true
        isUserInteractionEnabled = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Began --- (New: \(touches.count), Total: \(totalTouches))")

        for touch in touches {
            let location = touch.location(in: self)
            /// Hit testing.
            /// ARView and Touch View must be of the same size
            if let arView = arView {
                let entity = arView.entity(at: location)
                if let entity = entity {
                    print("Touched entity: \(entity.name)")
                } else {
                    print("Touched: none")
                }
            }
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        let totalTouches = event?.allTouches?.count ?? touches.count
        print("--- Touches Cancelled --- (Cancelled: \(touches.count), Total: \(totalTouches))")
    }
}
Replies: 1 · Boosts: 0 · Views: 260 · Created: Nov ’25

MPSMatrixRandom SEGFAULTs when run in an async context
The following minimal snippet segfaults with SDK 26.0 and 26.1. It won't crash if I remove async from the enclosing function signature, but that's impractical in a real project.

import Metal
import MetalPerformanceShaders

let SEED = UInt64(0x0)
typealias T = Float16

/* Why run in an async context? Because of a global GPU object, async makeMTLFunction,
   and async makeMTLComputePipelineState. Nevertheless, the bug can be triggered
   without using the global.
@MainActor let myGPU = MyGPU()
*/

@main
struct CMDLine {
    static func main() async {
        let ptr = UnsafeMutablePointer<T>.allocate(capacity: 0)
        async let future: Void = randomFillOnGPU(ptr, count: 0)
        print("Main thread is playing around")
        await future
        print("Successfully reached the end.")
    }

    static func randomFillOnGPU(_ buf: UnsafeMutablePointer<T>, count destbufcount: Int) async {
        // let (device, queue) = await (myGPU.device, myGPU.commandqueue)
        let myGPU = MyGPU()
        let (device, queue) = (myGPU.device, myGPU.commandqueue)
        // Init MTLBuffer, async let makeFunction, makeComputePipelineState, etc.
        let tempDataType = MPSDataType.uInt32
        let randfiller = MPSMatrixRandomMTGP32(device: device,
                                               destinationDataType: tempDataType,
                                               seed: Int(bitPattern: UInt(SEED)))
        print("randomFillOnGPU: successfully created MPSMatrixRandom.")
        // try await computePipelineState
        // ^ Crashes before this could return,
        //   or in this minimal case, after randomFillOnGPU() returns.
        // make encoder, set pso, dispatch, commit...
    }
}

actor MyGPU {
    let device: MTLDevice
    let commandqueue: MTLCommandQueue

    init() {
        guard let dev: MTLDevice = MPSGetPreferredDevice(.skipRemovable),
              let cq = dev.makeCommandQueue(),
              dev.supportsFamily(.apple6) || dev.supportsFamily(.mac2)
        else {
            print("Unable to get Metal Device! Exiting")
            exit(EX_UNAVAILABLE)
        }
        print("Selected device: \(String(format: "%llX", dev.registryID))")
        self.device = dev
        self.commandqueue = cq
        print("myGPU: initialization complete.")
    }
}

See FB20916929. Apparently the Objective-C autorelease pool is releasing the wrong address during a context switch (across suspension points). I wonder why such an obvious case has not been caught before.
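Until the underlying issue is fixed, a hedged workaround sketch based on the observation above: keep the MPS object's entire lifetime inside a synchronous function with its own autorelease pool, so no suspension point occurs while it is alive. Names here are illustrative:

import Metal
import MetalPerformanceShaders

/// Create, use, and release the MPS kernel within one synchronous frame;
/// call this from async code and only await after it returns.
func fillRandomSynchronously(device: MTLDevice, queue: MTLCommandQueue) {
    autoreleasepool {
        let randfiller = MPSMatrixRandomMTGP32(device: device,
                                               destinationDataType: .uInt32,
                                               seed: 0)
        // Encode, commit, and waitUntilCompleted() here rather than
        // suspending (await) while randfiller is alive.
        _ = randfiller
    }
}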
Replies: 0 · Boosts: 0 · Views: 72 · Created: Nov ’25