Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.

Posts under Graphics & Games topic

Memory leak when no draw calls issued to encoder
I noticed that when the render command encoder issues no draw calls, the app's memory usage seems to grow without bound. I'm using a super simple MTKView-based setup with the following delegate (code at end). If I add the simplest of draw calls, e.g. a single vertex, the app's memory usage is normal, around 100 MB. I am attaching a couple of screenshots, one from Xcode and one from Instruments. What's going on here? Is this an illegal program? If so, why does it not crash, as it would if the encoder or command buffer weren't ended? Or is there some race condition at play due to the lack of draws?

class Renderer: NSObject, MTKViewDelegate {
    var device: MTLDevice
    var commandQueue: MTL4CommandQueue
    var commandBuffer: MTL4CommandBuffer
    var allocator: MTL4CommandAllocator

    override init() {
        guard let d = MTLCreateSystemDefaultDevice(),
              let queue = d.makeMTL4CommandQueue(),
              let cmdBuffer = d.makeCommandBuffer(),
              let alloc = d.makeCommandAllocator()
        else {
            fatalError("unable to create metal 4 objects")
        }
        self.device = d
        self.commandQueue = queue
        self.commandBuffer = cmdBuffer
        self.allocator = alloc
        super.init()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable else { return }
        commandBuffer.beginCommandBuffer(allocator: allocator)
        guard let descriptor = view.currentMTL4RenderPassDescriptor,
              let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
        else {
            fatalError("unable to create encoder")
        }
        encoder.endEncoding()
        commandBuffer.endCommandBuffer()
        commandQueue.waitForDrawable(drawable)
        commandQueue.commit([commandBuffer])
        commandQueue.signalDrawable(drawable)
        drawable.present()
    }
}
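Until the behavior is clarified, one defensive option is to skip the whole encode/present cycle on frames with nothing to render, so a zero-draw render pass is never created or committed. A minimal sketch, where hasWorkToEncode is a hypothetical app-maintained flag, not an API:

import MetalKit

final class GatedRenderer: NSObject, MTKViewDelegate {
    // Hypothetical flag the app would set whenever there is geometry to encode.
    var hasWorkToEncode = false

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        // Skip the entire encode/present cycle on empty frames so an
        // empty render pass is never built or committed.
        guard hasWorkToEncode, let drawable = view.currentDrawable else { return }
        // ... begin the command buffer, encode real draws, commit, present ...
        _ = drawable
    }
}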
Replies: 3 · Boosts: 0 · Views: 374 · Activity: 2w
SpriteKit scene used as SCNView.overlaySKScene crashes due to SKShapeNode
I recently published my first game on the App Store. It uses SceneKit with a SpriteKit overlay, and all crashes Xcode has downloaded for it so far are related to SpriteKit/SceneKit internals. The most common crash is caused by SKCShapeNode::_NEW_copyRenderPathData. What could cause such a crash? crash.crash

While developing this game (and the BoardGameKit framework that appears in the crash log) over the years, I experienced many crashes presumably caused by the SpriteKit overlay (I opened a post, "SceneKit app randomly crashes with EXC_BAD_ACCESS in jet_context::set_fragment_texture", about such a crash in September 2024), and other people on the internet also mention crashes when using SpriteKit as a SceneKit overlay.

Should I use a separate SKView and lay it on top of the SCNView rather than setting SCNView.overlaySKScene? That seemed to solve the crashes for someone on Stack Overflow, but is it also encouraged by Apple? I know SceneKit is deprecated, but according to Apple, critical bugs will still be fixed. Could this be considered a critical bug?
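For reference, the separate-view workaround being asked about would look roughly like this; a minimal sketch, assuming a UIKit view controller that owns both views (not an Apple-endorsed pattern):

import SceneKit
import SpriteKit
import UIKit

final class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = SCNView(frame: view.bounds)
        scnView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        scnView.scene = SCNScene()
        view.addSubview(scnView)

        // A separate SKView on top, instead of SCNView.overlaySKScene.
        let skView = SKView(frame: view.bounds)
        skView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        skView.allowsTransparency = true          // let the SceneKit view show through
        skView.isUserInteractionEnabled = false   // forward touches to the SCNView
        let overlay = SKScene(size: view.bounds.size)
        overlay.backgroundColor = .clear
        skView.presentScene(overlay)
        view.addSubview(skView)
    }
}

The tradeoff is a second render loop and compositing layer, but it decouples SpriteKit's rendering from SceneKit's, which is presumably why it avoided the crashes in the Stack Overflow report.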
Replies: 4 · Boosts: 0 · Views: 540 · Activity: 3d
SceneKit Transparent Material Self-Overlapping Issue (Front Face Overlapping)
Description:
I'm developing an AR effect using SceneKit and applying a transparent material to a face mesh. However, I'm facing an issue where the front faces of the mesh overlap each other, causing incorrect rendering.

Problem:
The front faces of the mesh overlap with each other when transparency is applied. This causes areas like the cheeks to be visible through the nose, even though they should be occluded.

Expected Behavior:
The material should behave as if it were opaque to itself—that is, overlapping front faces should be occluded properly, while still allowing transparency for background elements.

Actual Behavior:
The mesh renders its own front faces incorrectly, making parts of the face visible through others when they should be blocked.

What I Have Tried:

testMaterial.writesToDepthBuffer = true
testMaterial.readsFromDepthBuffer = true

Question:
👉 How can I prevent SceneKit's transparent material from rendering overlapping front faces?
👉 Is there a way to force SceneKit to treat its own mesh as opaque for itself while still being transparent to the background?
👉 Does SceneKit support a proper depth pre-pass or an equivalent to Unity's ZWrite shaders to solve this issue?

Attached screenshots demonstrate the problem visually. Any help would be greatly appreciated! 🚀
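SceneKit has no built-in depth pre-pass, but one commonly suggested workaround is to do the pre-pass manually: render an invisible clone of the mesh that writes only depth, then let the transparent material depth-test against it. A sketch, assuming faceNode holds the face mesh:

import SceneKit

func addDepthPrePass(to faceNode: SCNNode) {
    // Pass 1: an invisible clone that only fills the depth buffer.
    let depthOnly = SCNMaterial()
    depthOnly.colorBufferWriteMask = []      // write no color at all
    depthOnly.writesToDepthBuffer = true

    let prePass = SCNNode(geometry: faceNode.geometry?.copy() as? SCNGeometry)
    prePass.geometry?.firstMaterial = depthOnly
    prePass.renderingOrder = -1              // draw before the visible mesh
    faceNode.addChildNode(prePass)

    // Pass 2: the transparent material now depth-tests against the pre-pass,
    // so nearer front faces (the nose) occlude farther ones (the cheeks).
    faceNode.geometry?.firstMaterial?.readsFromDepthBuffer = true
    faceNode.geometry?.firstMaterial?.writesToDepthBuffer = false
}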
Replies: 0 · Boosts: 2 · Views: 526 · Activity: Feb ’25
Issues building Unity plug-in project: Cannot locate native library Apple.Core/Apple.GameKit for iOS
I'm having issues getting a well-built package from the Apple Unity Plug-in project. When building my game project in Unity, the following error is printed to the console:

[Apple.Core.AppleNativeLibraryUtility] Cannot locate a Debug or Release Apple.Core native library for iOS. Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Debug on iOS.

As far as I can tell the build did compile cleanly, but I might be missing something. If anyone can see what I'm doing wrong or has any insight, it would be greatly appreciated.

Setup is the following:
- macOS Tahoe 26 Beta
- Xcode-beta Version 26.0 beta 3 (17A5276g)
- Unity Plug-in branch: 2025-beta1
- Unity game project version: 2022.3.60f
- M1 MacBook Pro

The built packages have been imported into the game project through the Unity Package Manager using the tarball option, pointing to the built packages from the Unity Plug-in project. The Unity Plug-in project has been built using the build.py file with the following:

python3 build.py -m iOS iPhoneSimulator -p Core GameKit CoreHaptics GameController -k all

The output is available in the attached file. build-output.txt

Here's an image of the NativeLibraries~ folder inside the built Apple.Core package.
Replies: 6 · Boosts: 1 · Views: 1.2k · Activity: Oct ’25
Can't remove annotations from PDFView
Hi everyone, I faced an issue where, on iOS 26, the removeAnnotation method doesn't remove annotations. This code worked on previous versions (iOS 18 and 17) but suddenly stopped working on iOS 26. Has anyone faced this issue?

guard let document = await pdfView.document else { return }
for pageIndex in 0..<document.pageCount {
    guard let page = document.page(at: pageIndex) else { continue }
    let annotations = page.annotations
    for annotation in annotations {
        page.removeAnnotation(annotation)
    }
}
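If the annotations are being removed from the model but the view keeps drawing them, it may be worth notifying the view explicitly after the removal loop. A guess at a workaround, using the real PDFView.annotationsChanged(on:) API:

import PDFKit

@MainActor
func removeAllAnnotations(from pdfView: PDFView) {
    guard let document = pdfView.document else { return }
    for pageIndex in 0..<document.pageCount {
        guard let page = document.page(at: pageIndex) else { continue }
        let annotations = page.annotations   // snapshot before mutating
        for annotation in annotations {
            page.removeAnnotation(annotation)
        }
        // Speculative workaround: tell the view this page's annotations
        // changed so it redraws without them.
        pdfView.annotationsChanged(on: page)
    }
}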
Replies: 1 · Boosts: 1 · Views: 259 · Activity: Oct ’25
visionOS + Unity PolySpatial: Is 15,970 MeshFilters the True Upper Limit for Industrial Scenes?
Breaking Through PolySpatial's ~8k Object Limit – Seeking Alternative Approaches for Large-Scale Digital Twins

Confirmed: PolySpatial's mirroring doubles the MeshFilter count – hard limit at ~8k active objects (15.9k total).

Project Context & Research Goals
I'm developing an industrial digital twin application for Apple Vision Pro using Unity's PolySpatial framework (RealityKit rendering in Unbounded_Volume mode). The scene contains complex factory environments with:
- Production line equipment (many fragmented grid objects that need to be merged)
- Dynamic product racks (state-switchable assets)
- Animated worker avatars

To optimize performance, I'm systematically testing visionOS's rendering capacity limits. Through controlled stress tests, I've identified a critical threshold.

Key Finding
When the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects × 2 due to PolySpatial cloning), the application crashes consistently. This suggests:
- PolySpatial's mirroring mechanism effectively doubles GameObject overhead
- An apparent hard limit exists around ~8k active mesh objects in practice

Objectives for This Discussion
- Verify whether others have encountered similar limits with PolySpatial/RealityKit
- Understand whether this is a memory constraint (per-app allocation), a render pipeline limit (Metal draw calls), or Unity-specific PolySpatial behavior
- Explore optimization strategies beyond brute-force object reduction

Why This Matters
Industrial metaverse applications require rendering thousands of interactive objects. Confirming these limits will help our team:
- Design safer content guidelines
- Prioritize GPU instancing/LOD investments
- Potentially contribute back to PolySpatial's optimization

I'd appreciate insights from engineers who've:
- Pushed similar large-scale scenes in visionOS
- Worked around PolySpatial's cloning overhead
- Discovered alternative capacity limits (vertices/draw calls)
Replies: 4 · Boosts: 0 · Views: 727 · Activity: Oct ’25
How to use MetalPerformancePrimitives
I am trying to learn the new Metal Performance Primitives APIs. I have added the MetalPerformancePrimitives framework and included the header in my shader code as per the documentation:

#include <MetalPerformancePrimitives/MetalPerformancePrimitives.h>

Unfortunately, Xcode complains that the header cannot be found. How do I include it properly? I am using Xcode 26 on Tahoe. The MetalPerformancePrimitives framework is present on my machine and I can inspect the headers in the filesystem.
Replies: 3 · Boosts: 1 · Views: 741 · Activity: Oct ’25
Game Center Dashboard frequently not updating when new achievement unlocked
I am currently working on a game that involves earning achievements, which I am using the Apple Unity Plug-ins to display. I have found that when opening the Game Center dashboard, the last achievement earned is occasionally not displayed until the game is closed and reopened. I am using GKAccessPoint.Shared.Trigger to display the achievements screen, which occasionally seems to open a cached version of the dashboard. It seems to happen consistently when earning multiple achievements within one minute, but this is not always the case. Does anybody have experience with something like this?
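To isolate where the caching happens, it might help to compare against the native call the plugin presumably wraps; a minimal Swift sketch:

import GameKit

// Open the Game Center dashboard directly on the achievements page.
// If this also shows stale data right after an unlock, the caching is
// in GameKit itself rather than in the Unity plug-in layer.
func showAchievements() {
    GKAccessPoint.shared.trigger(state: .achievements) {
        // Completion handler; nothing else to do here.
    }
}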
Replies: 1 · Boosts: 1 · Views: 1.3k · Activity: Feb ’25
How to use Unity Apple GameKit Plugin For Rule-based matchmaking?
Hello,

I'm using:
- Unity 6 LTS
- Unity Apple GameKit + Core plugins
- Turn-based matchmaking interface w/ 2 players max
- App Store Connect API for rule-based matchmaking

I have already:
- enabled Game Center in App Store Connect (I think)
- authenticated players and matched them via friend request

I am stuck on:
- using queues to match players automatically

I'm working on a rule-based matchmaking system which aims to place two players against each other in a GKTurnBasedMatch. I have a simple Unity project that correctly authenticates a user and proceeds to send a matchmaking request. The matchmaking script uses the Unity plugin's GKTurnBasedMatchmakerViewController.Request(...) function with a GKMatchRequest.Init() request configured with a QueueName equal to the App Store Connect API queue I created.

The queue I created is also linked to a ruleset with a very basic rule that checks whether the properties contain a key called 'preference' holding a string value for the side the player wants to play in this match. If, during matchmaking, the preferences of the two players differ, the match is made and both players should join it; each player gets to play the side they have chosen. My rule expression just checks that the preferences are not equal:

requests[0].properties.faction_preference != requests[1].properties.faction_preference

When I launch the game on two physical iPads and begin the matchmaking request, each player is immediately presented with two options: invite a friend, or start game.

The Problem: Inviting a friend works to get two players into a game, but the queue seems not to matter, and clicking start game just puts the current player into its own match (no one joins).

The Question: How do I get queue-based matchmaking to work in Unity for a turn-based match with only two players, who are able to select the enemy side they want to play, dictated by a rule that compares play-side preferences?

Resources I've used:
- Apple Unity GameKit Plugin: https://github.com/apple/unityplugins
- Matchmaking: https://developer.apple.com/documentation/gamekit/matchmaking-rules
- Multiplayer rulesets: https://developer.apple.com/documentation/gamekit/finding-players-using-matchmaking-rules
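For what it's worth, the native-side configuration the plugin call should map to looks roughly like this; the queue name is a hypothetical placeholder, and the property key follows the rule expression above:

import GameKit

func makeMatchmaker(preferredFaction: String) -> GKTurnBasedMatchmakerViewController {
    let request = GKMatchRequest()
    request.minPlayers = 2
    request.maxPlayers = 2
    // Must match the queue created in App Store Connect (hypothetical name).
    request.queueName = "my-faction-queue"
    // Compared by the rule: requests[0].properties.faction_preference
    //                    != requests[1].properties.faction_preference
    request.properties = ["faction_preference": preferredFaction]
    return GKTurnBasedMatchmakerViewController(matchRequest: request)
}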
Replies: 1 · Boosts: 0 · Views: 1.1k · Activity: Sep ’25
Game Center Access Point does not appear on iOS 26 (Simulator)
Attempting to bring up the access point yields the following error log:

[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Could not create endpoint for service name: com.apple.GameOverlayUI.dashboard-service
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Could not create endpoint for service name: com.apple.GameOverlayUI.dashboard-service
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy
[GameCenterOverlayService] Failed to create GameOverlayUI Dashboard Remote Proxy

The same code (which is a single line setting 'active' to true) works on physical devices and on the simulator in iOS 18.6. I haven't been able to find any mention of this issue online. Any suggestions or help greatly appreciated.
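For context, the single line in question is presumably the standard activation, something like:

import GameKit

func activateAccessPoint() {
    GKAccessPoint.shared.location = .topLeading  // placement is an assumption
    GKAccessPoint.shared.isActive = true         // the one line that fails on the iOS 26 simulator
}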
Replies: 1 · Boosts: 0 · Views: 484 · Activity: Oct ’25
ARMeshAnchor Data with RealityView
I want to use SwiftUI and RealityView to get AR scene understanding data (ARMeshAnchor) on iOS devices with LiDAR. The only way we can do that is by using ARSession (unless there is another way). However, in previous iOS 18 builds there was this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:session:arconfiguration:), which worked with a SpatialTrackingSession and a custom ARSession together. In the latest iOS and Xcode this function has been removed from the RealityKit framework, but it is still in the documentation.

I also wanted to get ARFaceAnchor data, which I still cannot get without ARSession. The closest I can get is by using:

let target = AnchoringComponent.Target.face
let anchoringComponent = AnchoringComponent(target, trackingMode: .predicted)
entity = Entity()
entity!.components.set(anchoringComponent)

But I still can't find a way to get the current frame (ARFrame) or the anchors ([ARAnchor]) in the view.

Alternatively, if I use this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:) and start the ARSession separately, the session's didUpdate and didAdd callbacks only run for a few frames before getting interrupted. And if I completely remove the SpatialTrackingSession configuration and just run the ARSession, there is still a valid tracked entity for the AnchoringComponent.Target.face component, provided the ARSession's configuration is an ARWorldTrackingConfiguration with face tracking enabled, and I still get updated facial data each frame. But the ARSession didUpdate and didAdd functions don't get called past the first few frames.

Interestingly, if I switch the RealityViewCameraContent.RealityViewCamera to .virtual, I get ARMeshAnchor and ARFaceAnchor data but no camera feed (as expected), with or without the SpatialTrackingSession configuration.

My overarching question is: what is the proper way to access ARMeshAnchors and other ARAnchors created by the system, and track them live, while also using SwiftUI?

GitHub repo with a sample project can be found here: https://github.com/bpate75/RealityViewTesting
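For reference, the standalone-ARSession path for mesh anchors is well defined on LiDAR devices; the open question above is how to run it alongside RealityView/SpatialTrackingSession without interruption. A minimal sketch of that baseline:

import ARKit

final class MeshAnchorReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh  // LiDAR required
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // Forward mesh anchors to the SwiftUI/RealityView layer from here.
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        print("added \(meshAnchors.count) mesh anchors")
    }
}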
Replies: 1 · Boosts: 1 · Views: 472 · Activity: Feb ’25
MTLBinaryArchive Size
I'm trying to use MTLBinaryArchive. I collected a binary archive from one device and used metal-tt to translate it for all supported iPhone devices, ranging from iPhone 7 Plus to iPhone 16. However, this binary archive is quite large, around 1.5 GB uncompressed and about 500 MB compressed in the IPA, and I'm wondering how to address the size.

I watched the WWDC 2022 video, which mentioned that the operating system or the app installation process would handle compatibility. Does this compatibility cover different GPU chips? I tried installing an IPA with a binary archive collected only from an iPhone 12 on an iPhone 13, but the archive didn't take effect.

I also saw that Apple supports App Thinning. However, it seems that resources in the Asset Catalog cannot be accessed via URL, and creating an MTLBinaryArchive requires a URL. Is it possible for an MTLBinaryArchive to be distributed through App Thinning?

The WWDC 2022 video also mentioned using the -Os optimization flag to reduce size. Is there an estimate of how much reduction that achieves? Are there any methods to solve the binary archive size issue without impacting performance?
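On the Asset Catalog point specifically: one conceivable (untested) bridge is to ship the archive as a data asset, then write it out to a temporary file so MTLBinaryArchive gets the file URL it requires. Whether App Thinning would actually slice such an asset per GPU family is exactly the open question; a sketch:

import Metal
import UIKit

// Hypothetical bridge: load the archive bytes from the asset catalog and
// materialize them as a file, since MTLBinaryArchiveDescriptor needs a URL.
func loadBinaryArchive(named assetName: String, device: MTLDevice) throws -> MTLBinaryArchive? {
    guard let asset = NSDataAsset(name: assetName) else { return nil }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(assetName)
        .appendingPathExtension("metallib")
    try asset.data.write(to: url)

    let descriptor = MTLBinaryArchiveDescriptor()
    descriptor.url = url
    return try device.makeBinaryArchive(descriptor: descriptor)
}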
Replies: 0 · Boosts: 1 · Views: 105 · Activity: Mar ’25
PSVR2 controllers don't report anything in snapshot
I typically read an extended gamepad capture() and get all state, but PSVR2 controllers seem to report nothing, so the stick and other buttons don't do anything in a built app. They do register as left/right controllers. This is on visionOS 26, Xcode 26, etc.

They work correctly in the main icon view, although they don't honor inverted vertical and horizontal scrolling. Both of the default scrolls just feel wrong: when I move left I want to scroll left, not right, and the same for up/down.
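For reference, the polling pattern described ("read an extended gamepad capture()") is roughly:

import GameController

// Poll a snapshot of all extended-gamepad state, e.g. once per frame.
// On PSVR2 controllers this reportedly returns no useful values.
func pollControllers() {
    for controller in GCController.controllers() {
        guard let gamepad = controller.extendedGamepad else { continue }
        let snapshot = gamepad.capture()
        print(snapshot.leftThumbstick.xAxis.value, snapshot.buttonA.isPressed)
    }
}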
Replies: 5 · Boosts: 1 · Views: 707 · Activity: Sep ’25
iPhone limited to 60hz frame rate
Just wondering if anyone knows what it will take to hit greater than 60 Hz when targeting iPhone. If I set the preferredFramesPerSecond of an MTKView to 120, it works on the iPad, but on iPhone it never goes over 60 Hz, even with a simple hello-triangle sample app. Is this a limitation of targeting iPhone?
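On iPhone there is a documented extra step beyond hardware support: ProMotion iPhones (13 Pro and later) only exceed 60 Hz if the app opts in via the CADisableMinimumFrameDurationOnPhone Info.plist key. A sketch of the opt-in, assuming a ProMotion device:

import MetalKit
import UIKit

// Also add to Info.plist:  CADisableMinimumFrameDurationOnPhone = YES
// Without that key, iPhone apps stay capped at 60 Hz no matter what
// preferredFramesPerSecond requests; non-ProMotion iPhones are 60 Hz regardless.
func configureHighRefresh(_ view: MTKView) {
    view.preferredFramesPerSecond = UIScreen.main.maximumFramesPerSecond
}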
Replies: 2 · Boosts: 0 · Views: 198 · Activity: Sep ’25
Game Center challenges and activities issues for released game
We have a released game on the App Store with Game Center challenges. The challenges were working well in testing of the final version, before the Game Center challenges went live. After release, the activity that should take the player to the challenge in game produces the printout below, and our GKGameActivityListener is no longer informed of the activity, as it was before the challenges went live:

"Invalid game activity definition. Failed to kick activity notification to GameKit. Error: Error Domain=GKErrorDomain Code=17 "(null)""

Also, Game Center reports that there are no challenges in GKChallengeDefinition.all, even though there are ongoing challenges. The leaderboard results do get reported to Game Center, though, and show up as challenge entries in the Games app. At the moment we are trying to figure out whether this is a problem on our end or a Game Center server issue.
Replies: 4 · Boosts: 1 · Views: 644 · Activity: Sep ’25
[Bug] iPadOS 26: 4-Finger Fast Tap/Swipe Gesture Not Detected (Multitouch Issue)
Hi everyone, I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap and swipe), the system often fails to recognize the input. This especially affects apps that depend on fast multi-touch input, such as rhythm games with difficult levels.

Steps to Reproduce:
1. Place four fingers on the screen.
2. Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
3. Observe whether the gesture is ignored or detected inconsistently.

Expected Behavior: 4-finger multitouch gestures should be recognized regardless of gesture speed, just like previous iPadOS versions.

Actual Behavior: Gestures fail to be detected when executed quickly; the same gestures performed slowly still work, and rhythm games miss notes.

You can check out my posts on Twitter/X and Facebook:
Twitter/X: https://x.com/kokona_fwa/status/1978131164104728949?s=61
Facebook: https://m.facebook.com/groups/idipad/permalink/24438964899058806/?
Replies: 1 · Boosts: 1 · Views: 1k · Activity: Oct ’25