I'm modifying <1 MB of a 256 MB managed buffer (calling didModifyRange), but according to Metal System Trace, the GPU copies the whole buffer (SDMA0 channel, "Page On 268435456 bytes"), taking 13 ms.
I'm making lots of small modifications (~4k) per frame. I also tried coalescing them into a single call to didModifyRange (~66 MB), and still the entire buffer is copied. When I instead call didModifyRange for just the first byte, the copied data is small.
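For reference, the coalescing looks roughly like this (a minimal sketch; dirtyRanges and buffer are illustrative names for however the per-frame writes are tracked):

var lo = Int.max
var hi = 0
for r in dirtyRanges {            // byte ranges touched by CPU writes this frame
    lo = min(lo, r.lowerBound)
    hi = max(hi, r.upperBound)
}
if lo < hi {
    buffer.didModifyRange(lo..<hi)   // one call covering all dirty bytes
}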
So I'm wondering why didModifyRange doesn't seem to be efficient for many small updates to a big buffer?
How should an App Extension (in this case an Audio Unit Extension) determine if an IAP has been purchased in the containing app? (and related: can an IAP be purchased from within the extension?)
On macOS, I suppose I could share the receipt file with the extension, and on iOS I could write some data to shared UserDefaults in an app group.
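Something like this is what I have in mind for the app-group approach (group ID and key are illustrative):

let shared = UserDefaults(suiteName: "group.com.example.myapp")

// In the containing app, after verifying the purchase:
shared?.set(true, forKey: "proUnlocked")

// In the Audio Unit extension:
let purchased = shared?.bool(forKey: "proUnlocked") ?? false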
Is there any official guidance on this?
thanks!
I have an app (currently for sale on the Mac App Store) which is a programming environment for audio processing (DSP node graph). I would like it to be able to export apps that are ready to be uploaded to the App Store or Mac App Store (including Audio Unit extensions).
Can I code sign from within my Mac App Store app? (It seems I can use Process to invoke codesign; otherwise perhaps I could add the source for codesign to my app, since it appears to be open source.)
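Roughly what I mean (paths, identity, and exportedApp are illustrative; I haven't checked whether the sandbox allows this):

let proc = Process()
proc.executableURL = URL(fileURLWithPath: "/usr/bin/codesign")
proc.arguments = ["--force", "--timestamp",
                  "--sign", "Developer ID Application: Example",
                  exportedApp.path]   // URL of the app the user exported
try proc.run()
proc.waitUntilExit()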
Is this whole process too hard for a solo developer to take on?
What resources should I look at?
thanks!
I'm getting this segfault stack trace from TestFlight. Any idea about how I should approach it?
Thread 0 Crashed:
0 SwiftUI 0x000000018dd18e14 specialized static Array<A>.== infix(_:_:) + 0 (<compiler-generated>:0)
1 SwiftUI 0x000000018e2ab404 static StrokeStyle.== infix(_:_:) + 100 (<compiler-generated>:0)
2 SwiftUI 0x000000018e2ab468 protocol witness for static Equatable.== infix(_:_:) in conformance StrokeStyle + 60 (<compiler-generated>:0)
3 AttributeGraph 0x00000001b0e4feec AGDispatchEquatable + 24 (Misc.swift:160)
4 AttributeGraph 0x00000001b0e4fd68 AG::LayoutDescriptor::Compare::operator()(unsigned char const*, unsigned char const*, unsigned char const*, unsigned long, unsigned int) + 1632 (ag-value.cc:579)
5 AttributeGraph 0x00000001b0e4f674 AG::LayoutDescriptor::compare(unsigned char const*, unsigned char const*, unsigned char const*, unsigned long, unsigned int) + 96 (ag-value.cc:723)
6 AttributeGraph 0x00000001b0e4efb0 AGGraphSetOutputValue + 268 (AGGraph.mm:784)
7 SwiftUI 0x000000018e8c1eb4 closure #1 in StatefulRule.value.setter + 72 (<compiler-generated>:0)
8 SwiftUI 0x000000018defc57c partial apply for closure #1 in StatefulRule.value.setter + 20 (<compiler-generated>:0)
9 libswiftCore.dylib 0x0000000182a779e4 withUnsafePointer<A, B>(to:_:) + 28 (LifetimeManager.swift:128)
10 SwiftUI 0x000000018def9e84 closure #1 in closure #1 in UnwrapConditional.updateValue() + 360 (ConditionalMetadata.swift:286)
11 SwiftUI 0x000000018defc55c partial apply for closure #1 in closure #1 in UnwrapConditional.updateValue() + 36 (<compiler-generated>:0)
12 SwiftUI 0x000000018def8374 ConditionalTypeDescriptor.project(at:baseIndex:_:) + 192 (ConditionalMetadata.swift:203)
13 SwiftUI 0x000000018def8458 ConditionalTypeDescriptor.project(at:baseIndex:_:) + 420 (ConditionalMetadata.swift:212)
14 SwiftUI 0x000000018def9cf0 closure #1 in UnwrapConditional.updateValue() + 136 (ConditionalMetadata.swift:283)
15 SwiftUI 0x000000018defc530 partial apply for closure #1 in UnwrapConditional.updateValue() + 28 (<compiler-generated>:0)
16 libswiftCore.dylib 0x0000000182a779e4 withUnsafePointer<A, B>(to:_:) + 28 (LifetimeManager.swift:128)
17 SwiftUI 0x000000018def9c1c UnwrapConditional.updateValue() + 260 (ConditionalMetadata.swift:282)
18 SwiftUI 0x000000018e3b3de8 partial apply for implicit closure #1 in closure #1 in closure #1 in Attribute.init<A>(_:) + 32 (<compiler-generated>:0)
19 AttributeGraph 0x00000001b0e52854 AG::Graph::UpdateStack::update() + 512 (ag-graph-update.cc:578)
20 AttributeGraph 0x00000001b0e49504 AG::Graph::update_attribute(AG::data::ptr<AG::Node>, unsigned int) + 424 (ag-graph-update.cc:719)
.... (intermediate frames elided; a truncated frame here referenced UIApplication.m:3679)
137 UIKitCore 0x000000018b841cf0 UIApplicationMain + 340 (UIApplication.m:5266)
138 SwiftUI 0x000000018e1f2ff8 closure #1 in KitRendererCommon(_:) + 176 (UIKitApp.swift:37)
139 SwiftUI 0x000000018e1f2e3c runApp<A>(_:) + 152 (UIKitApp.swift:14)
140 SwiftUI 0x000000018de6fda0 static App.main() + 128 (App.swift:114)
141 Sculptura 0x0000000100b61f50 static SculpturaApp.$main() + 24 (SculpturaApp.swift:17)
142 Sculptura 0x0000000100b61f50 main + 36 (SculpturaApp.swift:0)
143 dyld 0x00000001abaafd44 start + 2104 (dyldMain.cpp:1269)
High up in the trace there's some ZStack layout stuff. Maybe I should just try simplifying my view hierarchy?
(I wonder if this would be more stable if AttributeGraph wasn't written in C++)
I received a rejection for "Your app spawns processes that continue running after the user has quit the app."
The process in question is the app's Thumbnail extension.
When I remove all of my own code from the thumbnail extension, it still continues to run after I exit my app. This is the entirety of the extension's code, which now renders blank thumbnails:
import QuickLookThumbnailing

class ThumbnailProvider: QLThumbnailProvider {
    override init() { }

    override func provideThumbnail(for request: QLFileThumbnailRequest,
                                   _ handler: @escaping (QLThumbnailReply?, Error?) -> Void) {
        let reply = QLThumbnailReply(contextSize: request.maximumSize) { (context: CGContext) -> Bool in
            return true
        }
        handler(reply, nil)
    }
}
Presumably Thumbnail extensions continue to run so that Finder (among others) can generate thumbnails as necessary. AFAIK, I have no direct control over the extension's lifecycle.
Is this just App Review's mistake? The "Next Steps" are clueless:
"You can resolve this by leaving this option unchecked by default, providing the user the option to turn it on."
The app uses its own thumbnail extension to render thumbnails for document templates, which may be an uncommon thing.
Is this an uncaught C++ exception that could have originated from my code? or something else? (this report is from a tester)
(also, why can't crash reporter tell you info about what exception wasn't caught?)
(Per the instructions here, you'll need to rename the attached .txt to .ips to view the crash report.)
thanks!
AudulusAU-2024-02-14-020421.txt
I'm recreating the sleep timer from the Podcasts app. How can I display an icon for the picker instead of the current selection?
This doesn't work:
Picker("Sleep Timer", systemImage: "moon.zzz.fill", selection: $sleepTimerDuration) {
    Text("Off").tag(0)
    Text("5 Minutes").tag(5)
    Text("10 Minutes").tag(10)
    Text("15 Minutes").tag(15)
    Text("30 Minutes").tag(30)
    Text("45 Minutes").tag(45)
    Text("1 Hour").tag(60)
}
Do I need to drop down to UIKit for this?
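The closest workaround I've found so far is wrapping the Picker in a Menu with a custom label (a sketch; I'm not sure it matches the Podcasts look exactly):

Menu {
    Picker("Sleep Timer", selection: $sleepTimerDuration) {
        Text("Off").tag(0)
        Text("5 Minutes").tag(5)
        Text("10 Minutes").tag(10)
        Text("15 Minutes").tag(15)
        Text("30 Minutes").tag(30)
        Text("45 Minutes").tag(45)
        Text("1 Hour").tag(60)
    }
} label: {
    Image(systemName: "moon.zzz.fill")   // icon shown instead of the selection
}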
In my Metal-based app, I ray-march a 3D texture. I'd like to use RealityKit instead of my own code. I see there is a LowLevelTexture (beta) where I could specify a 3D texture. However on the Metal side, there doesn't seem to be any way to access a 3D texture (realitykit::texture::textures::custom returns a texture2d).
Any work-arounds? Could I even do something icky like cast the texture2d to a texture3d in MSL? (is that even possible?) Could I encode the 3d texture into an argument buffer and get that in somehow?
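On the CPU side, this is roughly the setup I'm imagining (untested against the beta; I'm assuming LowLevelTexture.Descriptor exposes textureType/depth the way MTLTextureDescriptor does):

var desc = LowLevelTexture.Descriptor()
desc.textureType = .type3D          // assumption: 3D is accepted here
desc.pixelFormat = .r16Float
desc.width = 128
desc.height = 128
desc.depth = 128
desc.textureUsage = [.shaderRead, .shaderWrite]

let lowLevel = try LowLevelTexture(descriptor: desc)
let resource = try TextureResource(from: lowLevel)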
I'm trying to ray-march an SDF inside a RealityKit surface shader. For the SDF primitive to correctly render with other primitives, the depth of the fragment needs to be set according to the ray-surface intersection point. Is there a way to do that within a RealityKit surface shader? It seems the only values I can set are within surface::surface_properties.
If not, can an SDF still be rendered in RealityKit using ray-marching?
"Specifically, your App Description and screenshot references paid features but does not inform users that a purchase is required to access this content."
My App Description (Pro 3D art app) doesn't mention that the entire app is a subscription. I didn't think I needed to because Final Cut Pro and Logic Pro don't do that either. Anyone had experience with this? Is there a double-standard or did App Review just make a mistake?
I suppose I can add some language at the end of the App Description, like "All features unlocked with subscription."
I'm wondering if it's possible to do Parallax Occlusion Mapping in RealityKit? Does RK's metal shader API provide enough?
I think it would need to be able to discard fragments and thus can't be run as a deferred pass. Not sure though!
I just tried the "Building a document-based app with SwiftUI" sample code for iOS 18.
https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui
I can create a document and then close it. But once I open it back up, I can't navigate back to the documents browser. It also struggles to open documents: I tap multiple times and nothing happens. This happens on both the simulator and a device.
Will file a bug, but anyone know of a work-around? I can't use a document browser that is this broken.
I want to turn off my ray-tracing conditionally. There's is_null_acceleration_structure but when I don't bind an acceleration structure (or pass nil to setFragmentAccelerationStructure), I get the following API validation error:
-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5782: failed assertion `Draw Errors Validation
Fragment Function(vol_deferred_lighting): missing instanceAccelerationStructure binding at index 6 for accelerationStructure[0].
I can turn off API validation and it works, but it seems like I should be able to use nil for the acceleration structure w/o triggering a validation error. Seems like a bug, right?
I suppose I can work around this by creating a separate pipeline with the ray-tracing disabled via a function constant instead of using is_null_acceleration_structure.
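The host side of that workaround would look something like this (a sketch; the constant index is illustrative, and library stands in for my shader library):

let constants = MTLFunctionConstantValues()
var raytracingEnabled = false
constants.setConstantValue(&raytracingEnabled, type: .bool, index: 0)

// Specialized variant that never touches the acceleration structure,
// so nothing needs to be bound at index 6.
let fragmentFn = try library.makeFunction(name: "vol_deferred_lighting",
                                          constantValues: constants)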
(Can we get a ray-tracing tag for questions?)
I'm trying to improve my build time on macOS by not building for x86_64. I've got the following settings:
This gets Xcode to skip x86_64 for my app, but not for all the package dependencies.
I've updated most of the packages to swift-tools-version: 6.0 but FlatBuffers is still on 5.8 and .macOS(.v10_14). GPT claims:
If your deployment target is set to macOS 10.15 or earlier, Xcode may force x86_64 support for compatibility reasons.
But Xcode is building x86_64 for ALL my packages, even the ones that don't depend on FlatBuffers.
When I open a package that depends on FlatBuffers directly in Xcode, it builds arm64 only, so that may be a red herring.
Not sure what else to try.
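For reference, the manifest bump I've been applying to the packages I control looks like this (names illustrative):

// swift-tools-version: 6.0
import PackageDescription

let package = Package(
    name: "Example",
    platforms: [.macOS(.v11)],   // arm64-era deployment target
    targets: [.target(name: "Example")]
)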
I have on the order of 50k small meshes (~64 vertices), all different connectivity, some subset of which change each frame (generated by a compute kernel). Can I render those in a performant way with Metal?
I'm assuming 50k separate draw calls would be too slow. I have a few ideas:
encode those draw calls on the GPU with an indirect command buffer (rough sketch after this list)
or lay out the meshes linearly in blocks, with some maximum size, and use a single draw call, but wasting vertex shader threads on the blocks that aren't full
or use another kernel to combine the little meshes into a big mesh
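For the first idea, I'm picturing something like this (a sketch; counts and binding limits are illustrative):

let icbDesc = MTLIndirectCommandBufferDescriptor()
icbDesc.commandTypes = [.draw]
icbDesc.inheritPipelineState = true
icbDesc.maxVertexBufferBindCount = 2

guard let icb = device.makeIndirectCommandBuffer(descriptor: icbDesc,
                                                 maxCommandCount: 50_000,
                                                 options: []) else { fatalError() }

// A compute kernel writes one draw command per mesh into icb each frame, then:
renderEncoder.executeCommandsInBuffer(icb, range: 0..<50_000)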
thanks!