I just tried the "Building a document-based app with SwiftUI" sample code for iOS 18.
https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui
I can create a document and then close it. But once I open it back up, I can't navigate back to the document browser. It also struggles to open documents (I tap multiple times and nothing happens). This happens on both the simulator and a device.
I'll file a bug, but does anyone know of a workaround? I can't use a document browser that's this broken.
I want to conditionally turn off ray tracing. There's is_null_acceleration_structure, but when I don't bind an acceleration structure (or pass nil to setFragmentAccelerationStructure), I get the following API validation error:
-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5782: failed assertion `Draw Errors Validation
Fragment Function(vol_deferred_lighting): missing instanceAccelerationStructure binding at index 6 for accelerationStructure[0].
I can turn off API validation and it works, but it seems like I should be able to use nil for the acceleration structure without triggering a validation error. Seems like a bug, right?
I suppose I can work around this by creating a separate pipeline with ray tracing disabled via a function constant, instead of using is_null_acceleration_structure.
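Something along these lines (a sketch; the constant index, the ray_tracing_enabled name, and the assumption that the shader branches on a function constant are all mine):

import Metal

// Sketch: build two fragment variants from one source, keyed off a function
// constant the shader would declare as:
//   constant bool ray_tracing_enabled [[function_constant(0)]];
func makeFragmentFunction(library: MTLLibrary, rayTracing: Bool) throws -> MTLFunction {
    let values = MTLFunctionConstantValues()
    var enabled = rayTracing
    values.setConstantValue(&enabled, type: .bool, index: 0)
    return try library.makeFunction(name: "vol_deferred_lighting", constantValues: values)
}

At draw time I'd pick the rayTracing == false pipeline whenever there's no acceleration structure to bind, so the accelerationStructure binding at index 6 is never referenced.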
(Can we get a ray-tracing tag for questions?)
In my Swift app, Xcode has decided to put various important settings, like SWIFT_VERSION, under "User Defined."
Does anyone know why?
I'm trying to improve my build time on macOS by not building for x86_64. I've got the following settings:
This gets Xcode to stop building x86_64 for my app, but not for all of the package dependencies.
I've updated most of the packages to swift-tools-version: 6.0, but FlatBuffers is still on 5.8 and .macOS(.v10_14). GPT claims:
If your deployment target is set to macOS 10.15 or earlier, Xcode may force x86_64 support for compatibility reasons.
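And indeed, FlatBuffers' manifest is shaped roughly like this (my paraphrase of the relevant declarations, not the actual file):

// swift-tools-version: 5.8
import PackageDescription

let package = Package(
    name: "FlatBuffers",
    platforms: [
        .macOS(.v10_14)  // deployment target below 10.15
    ],
    products: [],
    targets: []
)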
But Xcode is building x86_64 for ALL my packages, even the ones that don't depend on FlatBuffers.
When I open a package that depends on FlatBuffers directly in Xcode, it builds arm64 only, so that may be a red herring.
Not sure what else to try.
I'm trying to switch to UIKit's document lifecycle due to serious bugs with SwiftUI's version.
However, I'm noticing that the template project from Xcode isn't compatible with Swift 6 (I already migrated my app to Swift 6). To reproduce:
File -> New -> Project
Select "Document App" under iOS
Set "Interface: UIKit"
In Build Settings, change Swift Language Version to Swift 6
Run app
Tap "Create Document"
Observe: crash in _dispatch_assert_queue_fail
Does anyone know of a workaround other than downgrading to Swift 5?
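The only direction I can think of (completely unverified, and I'm assuming creation still goes through the standard browser delegate callback) is forcing the creation work onto the main actor, something like:

import UIKit

// Unverified sketch: if the creation handler reaches main-actor state from a
// background queue, hopping explicitly onto the main actor might dodge the
// dispatch queue assertion. Document is a stand-in for the template's
// UIDocument subclass.
final class Document: UIDocument {
    override func contents(forType typeName: String) throws -> Any {
        Data()  // empty contents, just for the sketch
    }
}

func documentBrowser(_ controller: UIDocumentBrowserViewController,
                     didRequestDocumentCreationWithHandler importHandler: @escaping (URL?, UIDocumentBrowserViewController.ImportMode) -> Void) {
    Task { @MainActor in
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("Untitled.mydoc")  // hypothetical name
        let document = Document(fileURL: url)
        document.save(to: url, for: .forCreating) { success in
            importHandler(success ? url : nil, .move)
        }
    }
}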
I'm streaming mp3 audio data using URLSession/AudioFileStream/AVAudioConverter and getting occasional silent buffers and glitches (little bleeps and whoops, as opposed to clicks). The problems are present in an offline test too, so this isn't a matter of underruns.
Buffering the input coming from the URLSession (URLSessionDataTask) makes the glitches/silent buffers fairly infrequent, but they do still happen occasionally.
var bufferedData = Data()

func parseBytes(data: Data) {
    bufferedData.append(data)
    // XXX: this buffering reduces glitching
    // to rather infrequent. But why?
    if bufferedData.count > 32768 {
        bufferedData.withUnsafeBytes { (bytes: UnsafeRawBufferPointer) in
            guard let baseAddress = bytes.baseAddress else { return }
            let result = AudioFileStreamParseBytes(audioStream!,
                                                   UInt32(bufferedData.count),
                                                   baseAddress,
                                                   [])
            if result != noErr {
                print("❌ error parsing stream: \(result)")
            }
        }
        bufferedData = Data()
    }
}
No errors are returned by AudioFileStream or AVAudioConverter.
func handlePackets(data: Data,
                   packetDescriptions: [AudioStreamPacketDescription]) {
    guard let audioConverter else {
        return
    }
    var maxPacketSize: UInt32 = 0
    for packetDescription in packetDescriptions {
        maxPacketSize = max(maxPacketSize, packetDescription.mDataByteSize)
        if packetDescription.mDataByteSize == 0 {
            print("EMPTY PACKET")
        }
        if Int(packetDescription.mStartOffset) + Int(packetDescription.mDataByteSize) > data.count {
            print("❌ Invalid packet: offset \(packetDescription.mStartOffset) + size \(packetDescription.mDataByteSize) > data.count \(data.count)")
        }
    }
    let bufferIn = AVAudioCompressedBuffer(format: inFormat!, packetCapacity: AVAudioPacketCount(packetDescriptions.count), maximumPacketSize: Int(maxPacketSize))
    bufferIn.byteLength = UInt32(data.count)
    for i in 0 ..< packetDescriptions.count {
        bufferIn.packetDescriptions![i] = packetDescriptions[i]
    }
    bufferIn.packetCount = AVAudioPacketCount(packetDescriptions.count)
    _ = data.withUnsafeBytes { ptr in
        memcpy(bufferIn.data, ptr.baseAddress, data.count)
    }
    if verbose {
        print("handlePackets: \(data.count) bytes")
    }
    // Set up the input provider closure
    var inputProvided = false
    let inputBlock: AVAudioConverterInputBlock = { packetCount, statusPtr in
        if !inputProvided {
            inputProvided = true
            statusPtr.pointee = .haveData
            return bufferIn
        } else {
            statusPtr.pointee = .noDataNow
            return nil
        }
    }
    // Loop until the converter runs dry or is done
    while true {
        let bufferOut = AVAudioPCMBuffer(pcmFormat: outFormat, frameCapacity: 4096)!
        bufferOut.frameLength = 0
        var error: NSError?
        let status = audioConverter.convert(to: bufferOut, error: &error, withInputFrom: inputBlock)
        switch status {
        case .haveData:
            if verbose {
                print("✅ convert returned haveData: \(bufferOut.frameLength) frames")
            }
            if bufferOut.frameLength > 0 {
                if bufferOut.isSilent {
                    print("(haveData) SILENT BUFFER at frame \(totalFrames), pending: \(pendingFrames), inputPackets=\(bufferIn.packetCount), outputFrames=\(bufferOut.frameLength)")
                }
                outBuffers.append(bufferOut)
                totalFrames += Int(bufferOut.frameLength)
            }
        case .inputRanDry:
            if verbose {
                print("🔁 convert returned inputRanDry: \(bufferOut.frameLength) frames")
            }
            if bufferOut.frameLength > 0 {
                if bufferOut.isSilent {
                    print("(inputRanDry) SILENT BUFFER at frame \(totalFrames), pending: \(pendingFrames), inputPackets=\(bufferIn.packetCount), outputFrames=\(bufferOut.frameLength)")
                }
                outBuffers.append(bufferOut)
                totalFrames += Int(bufferOut.frameLength)
            }
            return  // wait for the next handlePackets call
        case .endOfStream:
            if verbose {
                print("✅ convert returned endOfStream")
            }
            return
        case .error:
            if verbose {
                print("❌ convert returned error")
            }
            if let error = error {
                print("error converting: \(error.localizedDescription)")
            }
            return
        @unknown default:
            fatalError()
        }
    }
}
The document-based SwiftUI example app (https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui) doesn't specify a launch image.
It would seem per the HIG that the "pinkJungle" background in the app would be a decent candidate for a launch image, since it will be in the background when the document browser comes up.
However, when I specify it as the UIImageName, it isn't aligned the same as the background image, and I'm having trouble figuring out how it should be aligned to match. The launch image seems to be scaled up a bit beyond scaledToFill.
I suppose a launch storyboard would make this more explicit, but I should still be able to do it without one.
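For reference, the in-app background seems to be drawn roughly like this (my paraphrase of the sample, not its exact code), which is what the launch image would need to line up with:

import SwiftUI

// Paraphrase of how the sample appears to draw its background; the static
// launch image would need to match this scaling and alignment.
struct JungleBackground: View {
    var body: some View {
        Image("pinkJungle")
            .resizable()
            .scaledToFill()
            .ignoresSafeArea()
    }
}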
This is the image when displayed as the launch image:
and this is how it's rendered in the background right before the document browser comes up: