Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics
Posts under Graphics & Games topic

Post · Replies · Boosts · Views · Activity

ScreenCaptureKit recording output is corrupted when captureMicrophone is true
Hello everyone, I'm working on a screen recording app using ScreenCaptureKit and I've hit a strange issue. My app records the screen to an .mp4 file, and everything works perfectly as long as .captureMicrophone is false: in that case I get a valid, playable .mp4 file. However, as soon as I try to enable the microphone by setting streamConfig.captureMicrophone = true, the recording seems to work, but the final .mp4 file is corrupted and cannot be played by QuickTime or any other player. This happens whether capturesAudio (app audio) is on or off. I've already added the "Privacy - Microphone Usage Description" (NSMicrophoneUsageDescription) to my Info.plist, so I don't think it's a permissions problem.

I have my logic split into a ScreenRecorder class that manages state and a CaptureEngine that handles the SCStream. Here is how I'm configuring my SCStream:

ScreenRecorder.swift

// This is my main SCStreamConfiguration
private var streamConfiguration: SCStreamConfiguration {
    let streamConfig = SCStreamConfiguration()
    // ... other HDR/preset config ...

    // These are the problem properties
    streamConfig.capturesAudio = isAudioCaptureEnabled
    streamConfig.captureMicrophone = isMicCaptureEnabled // breaks it if true
    streamConfig.excludesCurrentProcessAudio = false
    streamConfig.showsCursor = false

    if let region = selectedRegion, let display = currentDisplay {
        // My region/frame logic (works fine)
        let regionWidth = Int(region.frame.width)
        let regionHeight = Int(region.frame.height)
        streamConfig.width = regionWidth * scaleFactor
        streamConfig.height = regionHeight * scaleFactor
        // ... (sourceRect logic) ...
    }

    streamConfig.pixelFormat = kCVPixelFormatType_32BGRA
    streamConfig.colorSpaceName = CGColorSpace.sRGB
    streamConfig.minimumFrameInterval = CMTime(value: 1, timescale: 60)
    return streamConfig
}

And here is how I'm setting up the SCRecordingOutput that writes the file:

ScreenRecorder.swift

private func initRecordingOutput(for region: ScreenPickerManager.SelectedRegion) throws {
    let screenRecordingOutputURL = try RecordingWorkspace.createScreenRecordingVideoFile(
        in: workspaceURL,
        sessionIndex: sessionIndex
    )
    let recordingConfiguration = SCRecordingOutputConfiguration()
    recordingConfiguration.outputURL = screenRecordingOutputURL
    recordingConfiguration.outputFileType = .mp4
    recordingConfiguration.videoCodecType = .hevc
    let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
    self.recordingOutput = recordingOutput
}

Finally, my CaptureEngine adds these to the SCStream:

CaptureEngine.swift

class CaptureEngine: NSObject, @unchecked Sendable {
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    // ... (dispatch queues) ...
    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

            // Add outputs for raw buffers (not used for file recording)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

            // Add the file recording output
            try stream?.addRecordingOutput(recordingOutput)

            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }
    // ... (stopCapture, etc.) ...
}

To sum up: when .captureMicrophone is false I get a perfect .mp4 that plays everywhere, but when it's true the resulting video is corrupted and doesn't play at all.
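A possible next diagnostic step (not the poster's code, and not a confirmed fix): strip the pipeline down to the SCRecordingOutput alone, with no raw .screen/.audio/.microphone stream outputs attached, and check whether the microphone-enabled recording is still corrupted. The helper name and the filter/URL/delegate parameters are placeholders:

import ScreenCaptureKit
import CoreMedia

// Hypothetical minimal recording: file output only, no sample-buffer outputs.
func startMinimalRecording(filter: SCContentFilter,
                           outputURL: URL,
                           delegate: any SCRecordingOutputDelegate) async throws -> (SCStream, SCRecordingOutput) {
    let config = SCStreamConfiguration()
    config.capturesAudio = true
    config.captureMicrophone = true                 // the property under suspicion
    config.minimumFrameInterval = CMTime(value: 1, timescale: 60)

    let recordingConfig = SCRecordingOutputConfiguration()
    recordingConfig.outputURL = outputURL
    recordingConfig.outputFileType = .mp4
    recordingConfig.videoCodecType = .hevc

    let recordingOutput = SCRecordingOutput(configuration: recordingConfig, delegate: delegate)
    let stream = SCStream(filter: filter, configuration: config, delegate: nil)
    try stream.addRecordingOutput(recordingOutput)  // no addStreamOutput(_:type:...) calls at all
    try await stream.startCapture()
    return (stream, recordingOutput)                // keep both alive until stopCapture()
}

If this produces a playable file, the problem is more likely in the interaction between the raw microphone stream output and the recording output than in the configuration itself.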
0 replies · 0 boosts · 312 views · Nov ’25
A question about Game Center's data retention logic
We want to integrate Game Center into a game app. Users can create multiple characters in the game. Suppose a user creates two characters, Character 1 and Character 2. After the user binds Character 1 to Game Center, its data is reported to Game Center. The player then wants to unbind Character 1 from Game Center and, after unbinding, bind Character 2 instead. At that point, is Character 1's data retained in Game Center, or is it removed?
0 replies · 0 boosts · 301 views · Oct ’25
CMake unable to generate the Xcode file described in this tutorial
In the Creating A 3D Application With Hydra Rendering tutorial on the Apple Developer website, on the last step, I execute this command:

cmake -S ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/ -B ~/Users/macuser/CreatingA3DApplicationWithHydraRendering/

and keep getting an error:

CMake Error at CMakeLists.txt:5 (include): include could not find requested file: /Users/macuser/USDInstall/bin/pxrConfig.cmake

I've followed the instructions in the README.md file included with the project files at least 5 times, and I've also tried moving the pxrConfig.cmake file around and copying it into different folders before rerunning the command, but I was still unable to generate the project needed to compile and render the HydraPlayer renderer. How do I get CMake to generate the Xcode project for the HydraPlayer renderer?
1 reply · 0 boosts · 163 views · May ’25
How to use MTKTextureLoader to load png data
I am trying to load PNG data with MTKTextureLoader newTextureWithData, but the result is wrong in the alpha areas. Here is the code. I have an image URL; after the download succeeds, I try to use the raw data as well as UIImagePNGRepresentation(image), and both come out wrong.

UIImage *tempImg = [UIImage imageWithData:data];
CGImageRef cgRef = tempImg.CGImage;
MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];

id<MTLTexture> temp1 = [loader newTextureWithData:data
                                          options:@{MTKTextureLoaderOptionSRGB: @(NO),
                                                    MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                                                    MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)}
                                            error:nil];

NSData *tempData = UIImagePNGRepresentation(tempImg);
id<MTLTexture> temp2 = [loader newTextureWithData:tempData
                                          options:@{MTKTextureLoaderOptionSRGB: @(NO),
                                                    MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                                                    MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)}
                                            error:nil];

id<MTLTexture> temp3 = [loader newTextureWithCGImage:cgRef
                                             options:@{MTKTextureLoaderOptionSRGB: @(NO),
                                                       MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                                                       MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)}
                                               error:nil];
}] resume];
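For reference, a Swift sketch (not a diagnosis of this particular bug) of one way to narrow it down: inspect the alpha layout of the decoded image before handing it to MTKTextureLoader, since premultiplied versus straight alpha is a common suspect when only the transparent areas look wrong. The function name is made up; the loader call mirrors the CGImage path in the post:

import MetalKit
import UIKit

func loadDiagnosticTexture(from data: Data, device: MTLDevice) throws -> MTLTexture? {
    guard let image = UIImage(data: data), let cgImage = image.cgImage else { return nil }

    // Report what ImageIO actually decoded; compare against what the shader expects.
    print("alphaInfo:", cgImage.alphaInfo.rawValue, "bitsPerPixel:", cgImage.bitsPerPixel)

    let loader = MTKTextureLoader(device: device)
    return try loader.newTexture(cgImage: cgImage, options: [
        .SRGB: false,
        .textureUsage: NSNumber(value: MTLTextureUsage.shaderRead.rawValue)
    ])
}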
5 replies · 0 boosts · 663 views · May ’25
Building Game Porting Toolkit on Sequoia
Like many folks here, I've recently attempted to build Apple's Game Porting Toolkit on my machine and ran into compiler errors, but instead of going the usual route of downloading the prebuilt package (kindly provided by GCenX), I decided to see if I could force it to build (since it was obviously buildable at some point). Below is the list of things I had to do to make it work.

Disclaimer: There are several dirty hacks I had to attempt to force the system to do what I needed. Use at your own risk. Don't forget to run all brew commands from a Rosetta prompt:

arch -x86_64 zsh

Install openssl

This one is easy. Just run:

brew tap rbenv/tap
brew install rbenv/tap/openssl@1.1

Install Command Line Tools 15.1

This specific version is required since newer versions come with a linker that is not compatible with the custom compiler (game-porting-toolkit-compiler) that GPTK is using. However, by default the 15.1 tools won't install on Sequoia since the installer complains that the macOS version is too new. Obviously, Apple has their reasons to not allow this, but all we need is a compiler, which should be mostly indifferent to the OS version. To trick the installer, we need to change the OS requirement of the installer package. You can do it in four easy steps:

1. Copy Command Line Tools.pkg from the mounted Command_Line_Tools_for_Xcode_15.1.dmg to some other directory.
2. Expand the installer package: pkgutil --expand "Command Line Tools.pkg" CLT
   You might be prompted to install Command Line Tools when you call pkgutil; just install any version.
3. Go to the newly created CLT folder and edit the Distribution file (it may appear as an executable, but it's just XML). You want to change allowed-os-versions to something greater than 15. Removing this section altogether might also work.
4. When done, re-wrap the package: pkgutil --flatten CLT "Command Line Tools 2.pkg"

Congratulations, now you should be able to install the 15.1 tools on your OS! If you had to install newer Command Line Tools for pkgutil, delete them before installing 15.1:

sudo rm -rf /Library/Developer/CommandLineTools

The next step is to make Homebrew accept the outdated 15.1 tools, as by default it'll complain that they're outdated or corrupted. To shut it up, open /usr/local/Homebrew/Library/Homebrew/extend/os/Mac/diagnostic.rb and remove references to check_if_supported_sdk_available from a couple of fatal build check collections.

Note: by default, Homebrew will auto-update on any invocation, which will overwrite any changes you've made to its internals. To disable this behavior, before running any brew commands in the terminal, run:

export HOMEBREW_NO_AUTO_UPDATE=1

After these manipulations, Homebrew might still complain about outdated Command Line Tools, but it won't be a fatal error anymore.

Finally, we need to downgrade MinGW to 11.0.1, since the latest version spits out compiler errors when compiling Wine. Unfortunately, Homebrew does a bad job of tracking MinGW versions, so there is no automatic way to do it. Instead, you have to manually download and install the old MinGW 11.0.1 formula from the Homebrew git repository.
I used the commit from Sep 16, 2023: https://github.com/Homebrew/homebrew-core/blob/b95f4f9491394af667943bd92b081046ba3406f2/Formula/m/mingw-w64.rb

Download the file above, save it in your current working directory, and then run:

brew install ./mingw-w64.rb

If you had a newer version of MinGW already installed from previous build attempts, you can unlink it before installing the one above:

brew unlink mingw-w64

With Command Line Tools 15.1 and MinGW 11.0.1 you should now be able to build GPTK without errors:

brew -v install apple/apple/game-porting-toolkit

In the end, the steps above worked for me, although more things could break in the future. I'm leaving the instructions here just to show that it's still possible to build GPTK manually instead of relying on third parties, but with all the hoops I had to jump through, I can't really recommend it.
1 reply · 0 boosts · 736 views · Mar ’25
Hover effects w/ Compositor Services w/ PSVR2 controllers
Hi, I would like clarification on whether the new hover effects feature introduced in visionOS 26 supports pinch gestures through the PSVR2 controllers. In your sample application, I was not able to confirm that this works; only pinch-clicking with my hands worked. Pulling the trigger on the controller whilst looking at a 3D object did not activate the hover effect spatial event in the sample application (the object does show the highlight, though). This is inconsistent with hover effect behavior with PSVR2 controllers on SwiftUI views, where the trigger press does count as a button click. The sample I used was this one: https://developer.apple.com/documentation/compositorservices/rendering_hover_effects_in_metal_immersive_apps
0 replies · 0 boosts · 410 views · 4w
Improving person segmentation and occlusion quality in RealityKit
I’m building an app that uses RealityKit and specifically ARConfiguration.FrameSemantics.personSegmentationWithDepth. The goal is to insert an AR object into the scene behind a person, and an additional AR object in front of the person, while being as photo-realistic as possible. Through testing, I’ve noticed that many times the edges of the person segmentation mask are not well matched to the actual person, and parts of the person are transparent, with the AR object bleeding through. It’s sort of like a “bad green screen” effect, which I’d expect to see a little bit, but not to this extent. I’ve been testing on iPhone 16, iPhone 14 Pro, iPad Pro 12.9-inch (6th generation), and iPhone 12 Pro, with similar results across all devices. I’m wondering what else I can do to improve this: either code changes, platform (like different iPhone models), or environment (like lighting, distance, etc.). Attaching some example screen grabs and a minimum reproducible code sample. Appreciate any insights!

import ARKit
import SwiftUI
import RealityKit

struct RealityViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.renderOptions.insert(.disableMotionBlur)
        arView.renderOptions.insert(.disableDepthOfField)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        arView.session.run(configuration)

        arView.session.delegate = context.coordinator
        context.coordinator.arView = arView
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: RealityViewContainer
        weak var arView: ARView?
        var floorAnchor: ARPlaneAnchor?

        init(_ parent: RealityViewContainer) {
            self.parent = parent
        }

        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            if let arView, floorAnchor == nil {
                for anchor in anchors {
                    if let horizontalPlaneAnchor = anchor as? ARPlaneAnchor,
                       horizontalPlaneAnchor.alignment == .horizontal,
                       horizontalPlaneAnchor.transform.columns.3.y < arView.cameraTransform.translation.y { // filter out ceiling
                        floorAnchor = horizontalPlaneAnchor

                        let backgroundEntity = BackgroundEntity()
                        let anchorEntity = AnchorEntity(anchor: horizontalPlaneAnchor)
                        anchorEntity.addChild(backgroundEntity)

                        let foregroundEntity = ForegroundEntity()
                        backgroundEntity.addChild(foregroundEntity)

                        arView.scene.addAnchor(anchorEntity)
                        arView.installGestures([.rotation, .translation], for: backgroundEntity)
                        break // Stop after adding the first horizontal plane (floor)
                    }
                }
            }
        }
    }
}
1 reply · 0 boosts · 112 views · May ’25
We are getting a blank image after capturing and compressing the picture.
We used the method below to resize the image while compressing it. Is this method correct, or do we need to correct something in the method or in the CGBitmapContextCreate call?

-(UIImage *)resizeImage:(UIImage *)anImage width:(int)width height:(int)height
{
    CGImageRef imageRef = [anImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;

    CGContextRef bitmap = CGBitmapContextCreate(NULL, width, height,
                                                CGImageGetBitsPerComponent(imageRef),
                                                4 * width,
                                                CGImageGetColorSpace(imageRef),
                                                alphaInfo);
    CGContextDrawImage(bitmap, CGRectMake(0, 0, width, height), imageRef);

    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap);
    CGImageRelease(ref);
    return result;
}
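Not the poster's code and not a confirmed diagnosis, but one thing worth knowing: CGBitmapContextCreate can return NULL for unsupported parameter combinations (for example, certain alphaInfo values with a given color space and bytes-per-row), and drawing into a NULL context silently produces nothing, which would look like a blank image. A hedged Swift alternative that avoids hand-picking bitmap parameters is UIGraphicsImageRenderer:

import UIKit

// Resize by drawing into a renderer-managed context; the renderer chooses
// a supported bitmap configuration for the current device.
func resized(_ image: UIImage, to size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}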
0 replies · 0 boosts · 354 views · Dec ’25
ScreenCaptureKit
I have code that captures a window and displays a cropped image. The problem is twofold. ScreenCaptureKit doesn't seem to let me modify the configuration, stop, and recapture in window mode so that only a portion of the screen is captured, which forces me to crop the frames myself and display the cropped image via a published variable. This all works fine, but it seems to stop after some time. I'm on an M1 with 16 GB of RAM; the program is taking less than 100 MB of memory with roughly 40-70% CPU. In debug mode it prints "captured success", and sometimes the frame isn't valid, so I'm guarding against that. Any ideas on how to improve my strategy?
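A hedged idea rather than a confirmed answer: SCStream exposes updateConfiguration(_:), and SCStreamConfiguration has a sourceRect, so it may be possible to change the captured region on a running stream instead of cropping each frame by hand. A minimal sketch, assuming a stream that is already capturing and a crop rectangle expressed in the source's coordinate space:

import ScreenCaptureKit
import CoreMedia

// Re-crop a running stream without stopping and recreating it.
func updateCrop(on stream: SCStream, to cropRect: CGRect, outputSize: CGSize) async throws {
    let config = SCStreamConfiguration()
    config.sourceRect = cropRect                 // portion of the source to capture
    config.width = Int(outputSize.width)
    config.height = Int(outputSize.height)
    config.minimumFrameInterval = CMTime(value: 1, timescale: 60)
    try await stream.updateConfiguration(config)
}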
0 replies · 0 boosts · 135 views · Jun ’25
Xcode Metal Trace
The code is downloaded from Apple's official Metal 4 sample (https://developer.apple.com/documentation/metal/drawing-a-triangle-with-metal-4?language=objc). Enable Metal GPU trace in the macOS scheme and trace a frame in Xcode. Xcode may show a segmentation fault in the app from some 'GTTrace' function when clicking the trace button. When replaying a .gputrace file, Xcode may crash, throw an internal error, or throw an XPC error. The example code using the old metal-renderer can be traced without any problem and everything works fine.

Test Environment:
Xcode Version 26.2 (17C52)
macOS 26.2 (25C56)
M1 Pro 16GB
A2442
2 replies · 0 boosts · 403 views · 4d
Metal and Swift Concurrency
Hi, introducing Swift Concurrency to my Metal app has been a bit challenging, as Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueue is still the preferred way for Metal-bound tasks in order to gain maximum performance. To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best way to bridge async tasks with Metal work? Thanks!
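To make the continuation idea concrete, here is a minimal sketch (an illustration of the approach described above, not an official recommendation): the command buffer's completion handler resumes a checked continuation, so no cooperative-pool thread ever sits in waitUntilCompleted. The GPUError type and the run(on:encode:) helper are made up for the example:

import Metal

enum GPUError: Error { case commandBufferUnavailable }

// Suspends the calling task until the GPU finishes the command buffer,
// without blocking a thread from the cooperative pool.
func run(on queue: MTLCommandQueue, encode: (MTLCommandBuffer) -> Void) async throws {
    guard let commandBuffer = queue.makeCommandBuffer() else {
        throw GPUError.commandBufferUnavailable
    }
    encode(commandBuffer)
    await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
        // Resumes on a Metal-owned completion thread, not the cooperative pool.
        commandBuffer.addCompletedHandler { _ in
            continuation.resume()
        }
        commandBuffer.commit()
    }
}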
5 replies · 0 boosts · 1.1k views · Apr ’25
How does the automatch feature work in Game Kit?
I'm developing a turn-based game. When I present the GKTurnBasedMatchmakerViewController, players can opt in to automatch instead of selecting a specific friend as an opponent. How exactly does the matching work if a player doesn't specify anything explicitly? Does Game Center send push notifications in a round-robin fashion to all friends, with the first one to accept matched as the opponent? Is this documented somewhere?
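For reference, a hedged sketch of the programmatic equivalent of choosing automatch in the matchmaker UI: submit a GKMatchRequest with no invited recipients and let Game Center fill the open slots. To my understanding, automatch draws from the general matchmaking pool rather than notifying friends one by one, but I can't point to documentation that spells out the exact mechanism. The helper name is made up:

import GameKit

// Assumes the local player is already authenticated with Game Center.
func startAutomatch() async throws -> GKTurnBasedMatch {
    let request = GKMatchRequest()
    request.minPlayers = 2
    request.maxPlayers = 2
    // No recipients are set, so the open slot is filled by automatch.
    return try await GKTurnBasedMatch.find(for: request)
}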
3 replies · 0 boosts · 347 views · 1w
Reality Composer Pro Transparent Textures
Hey everyone, I am currently developing an app for visionOS and using Reality Composer Pro to create scenes that I put into my app. I have a humanoid model with hair strands, and each strand of hair has an opacity map. However, some reflections are still visible even though the opacity is zero. There is also some weird culling among hair strands (in the left circle) and weird reflections in the hair cards (in the right circle). Here are my settings for the materials. Since all the hair strands are interconnected with each other, it is hard to decide the drawing order in Xcode, so I am wondering if there's an easier way to handle transparent objects. Please let me know if you know anything helpful, much appreciated!
0 replies · 0 boosts · 147 views · Apr ’25
'__abort_with_payload' from CompositorNonUI on visionOS 26.2 (device + simulator, Omniverse streaming)
I am developing a custom app for Apple Vision Pro using Compositor Services to stream content from NVIDIA Omniverse. The app is based on: https://github.com/NVIDIA-Omniverse/apple-configurator-sample

Environment:
Device: Apple Vision Pro
OS Version: visionOS 26.2
Xcode Version: 26.2

The issue: The application crashes hard (__abort_with_payload) in libsystem_kernel.dylib on Task 6 immediately after initialization. This appears to be a deliberate abort triggered by the compositor, not a typical crash. The issue occurs on both the physical device and the simulator.

Important detail: The console output shows a specific CLIENT BUG assertion. By checking the metadata of the warning, I found that it is related to "Library: CompositorNonUI".

Relevant console output before the abort:
Missed 'FrameLimiter' target of 90.0 Hz
running compositor services to get IPD, FOV, etc
fence tx observer 14f27 timed out after 0.600000
fence tx observer bc1b timed out after 0.600000
BUG IN CLIENT: For mixed reality experiences please use cp_drawable_compute_projection API
0 replies · 0 boosts · 78 views · 2w
App Crash in GameController when accessing GCKeyboard.coalesced on iPad
We are developing a hybrid iOS app where Angular content is rendered inside a WKWebView, hosted by a native Swift application. We use the GameController framework to detect whether an external Bluetooth keyboard is connected to an iPad. The following code is executed when the app enters the foreground and also when requested by the web layer:

func keyboardStatusHandler() {
    let isKeyboardConnected = GCKeyboard.coalesced != nil
    if !isKeyboardConnected {
        // sent status to Angular
    } else {
        // sent status to Angular
    }
}

Crash details

We are seeing intermittent crashes on iPad with the following stack trace:

Crashed: GCDeviceSession.HID
0  libobjc.A.dylib          0x7db8    objc_retain_x8 + 16
1  libsystem_blocks.dylib   0xfb8     void HelperBase<ExtendedInline>::copyCapture<(HelperBase<ExtendedInline>::BlockCaptureKind)3>(unsigned int) + 48
2  libsystem_blocks.dylib   0xbc4     HelperBase<GenericInline>::copyBlock(Block_layout*, Block_layout*) + 108
3  libsystem_blocks.dylib   0xc94     _call_copy_helpers_excp + 60
4  libsystem_blocks.dylib   0xef8     _Block_copy + 412
5  libdispatch.dylib        0x1a70    _dispatch_Block_copy + 32
6  libdispatch.dylib        0x792c    dispatch_async + 56
7  libdispatch.dylib        0x792c    dispatch_channel_async + 56
8  GameController           0xea6dc   -[GCKeyboardInput _handleKeyboardEvent:] + 324
9  GameController           0x22508   __53-[_GCKeyboardEventHIDAdapter initWithSource:service:]_block_invoke + 376
10 GameController           0x11d30   -[_GCHIDEventSubject publishHIDEvent:] + 268
11 GameController           0xb79cc   __40-[_GCHIDEventUIKitClient initWithQueue:]_block_invoke_3 + 44
12 libdispatch.dylib        0x1b584   _dispatch_client_callout + 16
13 libdispatch.dylib        0x12088   _dispatch_async_and_wait_invoke_and_complete_recurse + 272
14 libdispatch.dylib        0x8448    _dispatch_async_and_wait_f + 108
15 GameController           0xb7984   __40-[_GCHIDEventUIKitClient initWithQueue:]_block_invoke_2 + 132
16 GameController           0xb746c   __48-[__GCHIDEventUIKitClient _initWithApplication:]_block_invoke + 256
17 UIKitCore                0x11fd394 __61-[UIEventFetcher _setHIDGameControllerEventObserver:onQueue:]_block_invoke_3 + 40
18 libdispatch.dylib        0x1aac    _dispatch_call_block_and_release + 32
19 libdispatch.dylib        0x1b584   _dispatch_client_callout + 16
20 libdispatch.dylib        0xa2d0    _dispatch_lane_serial_drain + 740
21 libdispatch.dylib        0xadac    _dispatch_lane_invoke + 388
22 libdispatch.dylib        0x151dc   _dispatch_root_queue_drain_deferred_wlh + 292
23 libdispatch.dylib        0x14a60   _dispatch_workloop_worker_thread + 540
24 libsystem_pthread.dylib  0xa0c     _pthread_wqthread + 292
25 libsystem_pthread.dylib  0xaac     start_wqthread + 8

Observed scenarios

- Crash occurs when the app transitions from background to foreground
- Crash also occurs when the Angular layer requests keyboard status, triggering the same code path

Questions

- Has anyone encountered crashes related to GCKeyboard.coalesced or GCKeyboardInput like this?
- Are there known issues with the GameController framework when querying keyboard state during app lifecycle transitions?
- Is there a recommended or safer way to detect external keyboard connection status on iPad (especially when using WKWebView)?

Any insights, known platform issues, or suggested workarounds would be greatly appreciated. Thanks!
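Not a confirmed workaround for the crash, but a hedged alternative worth trying: instead of polling GCKeyboard.coalesced on every foreground transition, observe the connect/disconnect notifications that GameController posts and cache the state. The class and callback names are made up:

import GameController

final class KeyboardStatusObserver {
    private(set) var isKeyboardConnected = GCKeyboard.coalesced != nil
    private var observers: [NSObjectProtocol] = []

    init(onChange: @escaping (Bool) -> Void) {
        let center = NotificationCenter.default
        observers.append(center.addObserver(forName: .GCKeyboardDidConnect, object: nil, queue: .main) { [weak self] _ in
            self?.isKeyboardConnected = true
            onChange(true)   // e.g. forward the status to the Angular layer
        })
        observers.append(center.addObserver(forName: .GCKeyboardDidDisconnect, object: nil, queue: .main) { [weak self] _ in
            self?.isKeyboardConnected = false
            onChange(false)
        })
    }

    deinit {
        observers.forEach { NotificationCenter.default.removeObserver($0) }
    }
}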
7 replies · 0 boosts · 411 views · Dec ’25