I didn't check this specifically on earlier betas, but the AI is allowed to create new .swift files, yes? I'm only seeing proposals to create files (as opposed to code, which it can change automatically), and the CREATE FILE button to the right of any proposed file creation does nothing.
Next I'll mention running local models, but file creation does not seem to happen with either ChatGPT or any local model.
Also, I'm experimenting with LM Studio, and it reports that my client is timing out. So I guess Xcode is not waiting long enough for a response? The local models are slow, yes, but is there a setting for the AI timeout value?
I told the LLM, "Every minute, send me an update so I know you are still working and my client does not time out." That seems to have no visible effect: it hasn't timed out yet, but I never actually see that message.
I didn't run benchmarks before the update, but it seems at least 5x slower. Of course, all the LLM work happens on remote servers, so it's counterintuitive to me that this should be happening.
I updated macOS and Xcode to 26.1 RC at the same time, so I can't even say whether I suspect macOS or Xcode.
Before the update, the progress indicator for each piece of code might seem to get stuck at the very end, and toggling between the Navigators and the Coding Assistant in the Xcode UI seemed to refresh the UI and confirm coding was complete... but now progress races to 50%, then often gets stuck at 75%, well earlier than it used to get stuck. And it feels like something is legitimately processing, not just a UI glitch.
I'm wondering if this is somehow tied to the visual rendering of the code in the little white window? Cmd-Tab into Xcode seems laggy, and Xcode is pinning a CPU core. Why, when all the LLM work is remote?
MacBook Pro 2021, M1, 64 GB RAM. Went from 26.0.1 to 26.1 RC; didn't touch any of the betas until RC1.
Using an iPhone 12 Pro running iOS 26.0.1, with AirPods Pro 3. The Camera app does capture video with what seems to be "Studio Quality Recording" (SQR).
I am trying to replicate that SQR with my own Camera-like app, and while I can pull audio in from the APP3 mic, and my video capture app is recording 48,000 Hz, high-bitrate video, the audio still does not sound like SQR.
I'm seeing bluetoothA2DP, bluetoothLE, and bluetoothHFP as portType values, and I'm not sure whether SQR depends on one of those?
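For context, here's roughly how I'm configuring the session and listing the ports today (a minimal sketch; the category, mode, options, the 48 kHz preference, and the function name are my own guesses at what SQR might want, not anything confirmed):

```swift
import AVFoundation

// Sketch of my current route inspection/configuration.
// The category/mode/options below are assumptions, not a confirmed SQR recipe.
func configureAndLogAudioRoute() throws {
    let session = AVAudioSession.sharedInstance()

    // .allowBluetooth exposes the HFP mic path; .allowBluetoothA2DP is
    // normally output-only, so I'm unsure which path SQR actually uses.
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setPreferredSampleRate(48_000)
    try session.setActive(true)

    // Every available input and its portType (this is where I see
    // bluetoothA2DP / bluetoothLE / bluetoothHFP).
    for input in session.availableInputs ?? [] {
        print("available input:", input.portName, "type:", input.portType.rawValue)
    }

    // What the route actually resolved to after activation.
    for input in session.currentRoute.inputs {
        print("active input:", input.portName,
              "type:", input.portType.rawValue,
              "session sampleRate:", session.sampleRate)
    }
}
```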
Is there sample code demonstrating an SQR capture? Never mind video and the camera; even just audio?
Also, I don't understand what SQR is doing between the APP3 and the iPhone. What codec is that? What bitrate? If I capture video using Capture and inspect the audio stream, I see mono, 74.14 kbit/s, MPEG-4 AAC, 48,000 Hz. But I assume that has been recompressed and isn't really giving me any insight into the APP3's H2 transmission?
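For reference, this is how I'm inspecting the saved movie's audio track (a sketch; the function name and URL parameter are placeholders). It only reports the re-encoded track in the file, which is why I assume it tells me nothing about the Bluetooth link itself:

```swift
import AVFoundation

// Inspect the audio track of a recorded movie file.
// This describes the saved (presumably re-encoded) track only.
func logAudioFormat(of movieURL: URL) async throws {
    let asset = AVURLAsset(url: movieURL)
    guard let audioTrack = try await asset.loadTracks(withMediaType: .audio).first else {
        print("no audio track")
        return
    }

    let descriptions = try await audioTrack.load(.formatDescriptions)
    let bitRate = try await audioTrack.load(.estimatedDataRate)

    for description in descriptions {
        if let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(description)?.pointee {
            // For my captures this reports 48000 Hz, 1 channel,
            // mFormatID == kAudioFormatMPEG4AAC, roughly 74 kbit/s.
            print("formatID:", asbd.mFormatID,
                  "sampleRate:", asbd.mSampleRate,
                  "channels:", asbd.mChannelsPerFrame,
                  "estimated kbit/s:", bitRate / 1000)
        }
    }
}
```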