
IAP in App Extension
How should an App Extension (in this case an Audio Unit Extension) determine whether an IAP has been purchased in the containing app? (And related: can an IAP be purchased from within the extension?) On macOS, could I share the receipt file with the extension? And on iOS, could I write some data to shared UserDefaults in an app group? Is there any official guidance on this? thanks!
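To make the shared-defaults idea concrete, here's the kind of thing I have in mind (just a sketch; the group identifier and key are made up):

```swift
import Foundation

// Hypothetical app group ID and key -- stand-ins for real values.
let suiteName = "group.com.example.myapp"
let purchaseKey = "iapPurchased"

// In the containing app, after verifying the purchase:
UserDefaults(suiteName: suiteName)?.set(true, forKey: purchaseKey)

// In the extension, when gating the purchased feature:
let isPurchased = UserDefaults(suiteName: suiteName)?.bool(forKey: purchaseKey) ?? false
```

(Both targets would need the same App Group capability for the suite to be shared.)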
1 reply · 1 boost · 1.4k views · Aug ’22
testing multichannel AudioUnit output with AVAudioEngine
I'm extending an AudioUnit to generate multi-channel output, and I'm trying to write a unit test using AVAudioEngine. My test installs a tap on the AVAudioNode's output bus and ensures the output is not silence. This works for stereo. I've currently got:

```objc
auto avEngine = [[AVAudioEngine alloc] init];
[avEngine attachNode:avAudioUnit];
auto format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100.
                                                             channels:channelCount];
[avEngine connect:avAudioUnit to:avEngine.mainMixerNode format:format];
```

where avAudioUnit is my AU. It seems I need to do more than simply set the channel count on the format when connecting, because after this code, [avAudioUnit outputFormatForBus:0].channelCount is still 2. Printing the graph yields:

```
AVAudioEngineGraph 0x600001e0a200: initialized = 1, running = 1, number of nodes = 3

******** output chain ********

node 0x600000c09a80 {'auou' 'ahal' 'appl'}, 'I'
  inputs = 1
    (bus0, en1) <- (bus0) 0x600000c09e00, {'aumx' 'mcmx' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]

node 0x600000c09e00 {'aumx' 'mcmx' 'appl'}, 'I'
  inputs = 1
    (bus0, en1) <- (bus0) 0x600000c14300, {'augn' 'brnz' 'brnz'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
  outputs = 1
    (bus0, en1) -> (bus0) 0x600000c09a80, {'auou' 'ahal' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]

node 0x600000c14300 {'augn' 'brnz' 'brnz'}, 'I'
  outputs = 1
    (bus0, en1) -> (bus0) 0x600000c09e00, {'aumx' 'mcmx' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
```

So AVAudioEngine silently ignores whatever channel count I pass to it. If I do:

```objc
auto numHardwareOutputChannels = [avEngine.outputNode outputFormatForBus:0].channelCount;
NSLog(@"hardware output channels %d\n", numHardwareOutputChannels);
```

I get 30, because I have an audio interface connected. So I would think AVAudioEngine would support this. I've also tried setting the format explicitly on the connection between the mainMixerNode and the outputNode, to no avail.
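For reference, that last attempt looks roughly like this (shown in Swift for brevity; a sketch where engine and channelCount stand in for my actual setup):

```swift
import AVFoundation

let engine = AVAudioEngine()
let channelCount: AVAudioChannelCount = 4 // hypothetical target channel count

// Force a multichannel format on the mixer -> output connection.
let outFormat = AVAudioFormat(standardFormatWithSampleRate: 44100,
                              channels: channelCount)!
engine.connect(engine.mainMixerNode, to: engine.outputNode, format: outFormat)
```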
0 replies · 2 boosts · 1.6k views · Jun ’22
SwiftUI full screen animation uses less energy than Metal Game template
I've got a full-screen animation: a bunch of circles filled with gradients, with plenty of (careless) overdraw, plus real-time audio processing driving the animation, plus the overhead of SwiftUI's dependency analysis. That app uses less energy (on iPhone 13) than the Xcode "Metal Game" template, which renders a rotating textured cube (a trivial GPU workload). Why is that? How can I investigate further? Does CoreAnimation have access to a compositor fast path that a Metal app cannot access? Maybe another data point: when I do the same circles animation using SwiftUI's Canvas, the energy use is "Very High" and GPU utilization is also quite high. Eventually the phone's thermal state goes "Serious" and I get a message on the device that "Charging will resume when iPhone returns to normal temperature".
0 replies · 5 boosts · 1.1k views · May ’24
black screen after switching to UIKit app lifecycle
I had to switch from the SwiftUI app lifecycle to the UIKit lifecycle due to this issue: https://developer.apple.com/forums/thread/742580 When I switch to UIKit I get a black screen on startup. It's the inverse of this issue: https://openradar.appspot.com/FB9692750 For development, I can work around this by deleting and reinstalling the app, but I can't ship an app that results in a black screen for users when they update. Anyone know of a work-around? I've filed FB13462315
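For context, this is roughly the scene setup I'm using after the switch (a sketch; ContentView stands in for my real root view), and the black screen appears even with the window created and made key here:

```swift
import UIKit
import SwiftUI

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIHostingController(rootView: ContentView())
        self.window = window
        window.makeKeyAndVisible()
    }
}

// Hypothetical placeholder so the sketch is self-contained.
struct ContentView: View {
    var body: some View { Text("Hello") }
}
```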
7 replies · 2 boosts · 1.4k views · Dec ’23
rendering thousands of small meshes
I have on the order of 50k small meshes (~64 vertices each), all with different connectivity, some subset of which change each frame (generated by a compute kernel). Can I render those in a performant way with Metal? I'm assuming 50k separate draw calls would be too slow. I have a few ideas:

- encode those draw calls on the GPU (see the sketch after this list)
- lay out the meshes linearly in blocks with some maximum size and use a single draw call, wasting vertex shader threads on the blocks that aren't full
- use another kernel to combine the little meshes into a big mesh

thanks!
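To make the first idea concrete, here's a sketch of creating an indirect command buffer that a compute kernel could fill with one draw per mesh (sizes and bind counts are placeholders):

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!

// Describe an ICB holding plain (non-indexed) draws; .drawIndexed would be the
// choice if the small meshes are indexed.
let icbDescriptor = MTLIndirectCommandBufferDescriptor()
icbDescriptor.commandTypes = .draw
icbDescriptor.inheritBuffers = false
icbDescriptor.maxVertexBufferBindCount = 2
icbDescriptor.maxFragmentBufferBindCount = 0

// One slot per mesh; the compute kernel encodes or skips each slot per frame.
let icb = device.makeIndirectCommandBuffer(descriptor: icbDescriptor,
                                           maxCommandCount: 50_000,
                                           options: [])!

// Then, in the render pass:
// renderEncoder.executeCommandsInBuffer(icb, range: 0..<50_000)
```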
2 replies · 1 boost · 1.8k views · Sep ’21
what does kIOAccelCommandBufferCallbackErrorInvalidResource mean?
I'm getting the following error on Intel Iris integrated graphics. Code works well on newer Mac GPUs as well as Apple GPUs.

```
Execution of the command buffer was aborted due to an error during execution. Invalid Resource (00000009:kIOAccelCommandBufferCallbackErrorInvalidResource)
```

The error is for a compute command, not a draw command. The constant isn't in the documentation. All buffers and textures seem to be created successfully. I've also checked that the GPU supports the required threadgroup size for the compute pipeline. thanks!
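One thing I plan to try is opting into enhanced command buffer errors to get per-encoder detail (a sketch, assuming macOS 11+; device/queue setup abbreviated):

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!

// Ask Metal to track encoder execution status for richer error reporting.
let descriptor = MTLCommandBufferDescriptor()
descriptor.errorOptions = .encoderExecutionStatus
let commandBuffer = queue.makeCommandBuffer(descriptor: descriptor)!

commandBuffer.addCompletedHandler { cb in
    if let error = cb.error as NSError? {
        print("command buffer error:", error)
        // Per-encoder info, when available:
        if let infos = error.userInfo[MTLCommandBufferEncoderInfoErrorKey]
            as? [MTLCommandBufferEncoderInfo] {
            for info in infos {
                print(info.label, info.debugSignposts)
            }
        }
    }
}
```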
3 replies · 0 boosts · 1.6k views · Apr ’22
is the MPSDynamicScene example correctly computing the motion vector texture?
I'm trying to implement de-noising of AO in my app, using the MPSDynamicScene example as a guide: https://developer.apple.com/documentation/metalperformanceshaders/animating_and_denoising_a_raytraced_scene

In that example, motion vectors are computed in UV coordinates, resulting in very small values:

```metal
// Compute motion vectors
if (uniforms.frameIndex > 0) {
    // Map current pixel location to 0..1
    float2 uv = in.position.xy / float2(uniforms.width, uniforms.height);

    // Unproject the position from the previous frame then transform it from
    // NDC space to 0..1
    float2 prevUV = in.prevPosition.xy / in.prevPosition.w * float2(0.5f, -0.5f) + 0.5f;

    // Next, remove the jittering which was applied for antialiasing from both
    // sets of coordinates
    uv -= uniforms.jitter;
    prevUV -= prevUniforms.jitter;

    // Then the motion vector is simply the difference between the two
    motionVector = uv - prevUV;
}
```

Yet the header documentation for MPSSVGF seems to indicate the offsets should be expressed in texels:

> The motion vector texture must be at least a two channel texture representing how many texels each texel in the source image(s) have moved since the previous frame. The remaining channels will be ignored if present. This texture may be nil, in which case the motion vector is assumed to be zero, which is suitable for static images.

Is this a mistake in the example code? I'm asking because doing something similar in my own app leaves AO trails, which would indicate the motion vector texture values are too small in magnitude. I don't really see trails in the example, even when I speed up the animation, but that could be due to the example being monochrome.

Update: if I multiply the UV offsets by the size of the texture, I get a bad result, which seems to indicate the header is misleading and the values are in fact in UV coordinates. So perhaps the trails I'm seeing in my app have some other cause. I also wonder who is actually using this API other than me? I would think most game engines are doing their own thing. Perhaps some of Apple's own code uses it.
0 replies · 1 boost · 780 views · Aug ’23
Can an SDF be rendered using RealityKit?
I'm trying to ray-march an SDF inside a RealityKit surface shader. For the SDF primitive to correctly render with other primitives, the depth of the fragment needs to be set according to the ray-surface intersection point. Is there a way to do that within a RealityKit surface shader? It seems the only values I can set are within surface::surface_properties. If not, can an SDF still be rendered in RealityKit using ray-marching?
1 reply · 1 boost · 760 views · Sep ’24
document-based sample code doesn't work... workaround?
I just tried the "Building a document-based app with SwiftUI" sample code for iOS 18: https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui

I can create a document and then close it. But once I open it back up, I can't navigate back to the document browser. It also struggles to open documents: I tap multiple times and nothing happens. This happens on both simulator and device. I will file a bug, but does anyone know of a workaround? I can't use a document browser that is this broken.
1 reply · 1 boost · 430 views · Nov ’24
Xcode UIKit Document App template crashes under Swift 6
I'm trying to switch to UIKit's document lifecycle due to serious bugs in SwiftUI's version. However, I've noticed that the template project from Xcode isn't compatible with Swift 6 (I already migrated my app to Swift 6). To reproduce:

1. File -> New -> Project
2. Select "Document App" under iOS
3. Set "Interface: UIKit"
4. In Build Settings, change Swift Language Version to Swift 6
5. Run app
6. Tap "Create Document"

Observe: crash in _dispatch_assert_queue_fail.

Does anyone know of a workaround other than downgrading to Swift 5?
0 replies · 1 boost · 82 views · Apr ’25
all data in managed buffer copied
I'm modifying <1 MB of a 256 MB managed buffer (calling didModifyRange), but according to Metal System Trace, the GPU copies the whole buffer (SDMA0 channel, "Page On 268435456 bytes"), taking 13 ms. I'm making lots of small modifications (~4k) per frame. I also tried coalescing them into a single call to didModifyRange (~66 MB), and still the entire buffer is copied. If I instead call didModifyRange for just the first byte, the copied data is small. So I'm wondering why didModifyRange doesn't seem to be efficient for many small updates to a big buffer.
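For what it's worth, the workaround I'm considering is a shared-storage buffer, so small CPU writes are visible to the GPU without didModifyRange or the page-on copy (a sketch; the trade-off is that on a discrete GPU the shader then reads this memory over the bus):

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!
let buffer = device.makeBuffer(length: 256 * 1024 * 1024,
                               options: .storageModeShared)!

// A small update at some offset -- no didModifyRange needed for shared storage:
let values: [Float] = [1, 2, 3, 4]
values.withUnsafeBytes { src in
    buffer.contents().advanced(by: 4096)
        .copyMemory(from: src.baseAddress!, byteCount: src.count)
}
```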
1 reply · 0 boosts · 865 views · Aug ’21
acceleration structure doesn't render in gpu trace
I've got a scene which renders as I expect, but in the acceleration structure inspector, the kraken primitive doesn't render. In the list on the left, the structure is there. As expected, there is just one bounding-box primitive, since a lot happens in the intersection function (I'm doing it this way because I've already built my own octree and it takes too long to rebuild BVHs for dynamic geometry). This is based on the SimplePathTracer example.

The signatures of sphereIntersectionFunction and octreeIntersectionFunction aren't that different:

```metal
[[intersection(bounding_box, triangle_data, instancing)]]
BoundingBoxIntersection sphereIntersectionFunction(
    // Ray parameters passed to the ray intersector below
    float3 origin [[origin]],
    float3 direction [[direction]],
    float minDistance [[min_distance]],
    float maxDistance [[max_distance]],
    // Information about the primitive.
    unsigned int primitiveIndex [[primitive_id]],
    unsigned int geometryIndex [[geometry_intersection_function_table_offset]],
    // Custom resources bound to the intersection function table.
    device void *resources [[buffer(0), function_constant(useResourcesBuffer)]]
#if SUPPORTS_METAL_3
    , const device Sphere* perPrimitiveData [[primitive_data]]
#endif
    , ray_data IntersectionPayload& payload [[payload]])
```

vs.

```metal
[[intersection(bounding_box, triangle_data, instancing)]]
BoundingBoxIntersection octreeIntersectionFunction(
    // Ray parameters passed to the ray intersector below
    float3 origin [[origin]],
    float3 direction [[direction]],
    float minDistance [[min_distance]],
    float maxDistance [[max_distance]],
    // Information about the primitive.
    unsigned int primitiveIndex [[primitive_id]],
    unsigned int geometryIndex [[geometry_intersection_function_table_offset]],
    // Custom resources bound to the intersection function table.
    device void *resources [[buffer(0)]],
    const device BlockInfo* perPrimitiveData [[primitive_data]],
    ray_data IntersectionPayload& payload [[payload]])
```

Note: I'm running Xcode 15.0 beta 5 (15A5209g), since even the unmodified SimplePathTracer example project hangs the acceleration structure viewer on Xcode 14.

Update: replacing the octreeIntersectionFunction's body with just a hard-coded sphere does render. Perhaps the viewer imposes a time (or instruction count) limit on intersection functions so as to not hang the GPU?
6 replies · 0 boosts · 858 views · Aug ’23
macOS app rejected for Thumbnail extension continuing to run
I received a rejection for "Your app spawns processes that continue running after the user has quit the app." The process in question is the app's Thumbnail extension. When I remove all of my own code from the thumbnail extension, it still continues to run after I exit my app. This is the entirety of the extension's code, which now renders blank thumbnails:

```swift
import QuickLookThumbnailing

class ThumbnailProvider: QLThumbnailProvider {

    override init() { }

    override func provideThumbnail(for request: QLFileThumbnailRequest,
                                   _ handler: @escaping (QLThumbnailReply?, Error?) -> Void) {
        let reply = QLThumbnailReply(contextSize: request.maximumSize) { (context: CGContext) -> Bool in
            return true
        }
        handler(reply, nil)
    }
}
```

Presumably Thumbnail extensions continue to run so that Finder (among others) can generate thumbnails as necessary. AFAIK, I have no direct control over the extension's lifecycle. Is this just App Review's mistake? The "Next Steps" are clueless: "You can resolve this by leaving this option unchecked by default, providing the user the option to turn it on." The app uses its own thumbnail extension to render thumbnails for document templates, which may be an uncommon thing.
1 reply · 0 boosts · 1.2k views · Aug ’23
SwiftUI inspector full height
Adding an inspector and toolbar to Xcode's app template, I have:

```swift
struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .padding()
        .toolbar {
            Text("test")
        }
        .inspector(isPresented: .constant(true)) {
            Text("this is a test")
        }
    }
}
```

In the preview canvas, this renders as I would expect, with the inspector occupying the window's full height. However, when running the app, the inspector doesn't extend to full height. Am I missing something?

(The relevant WWDC video is wwdc2023-10161; I couldn't add that as a tag.)
2 replies · 0 boosts · 1.5k views · Oct ’23