I have the following two views in SwiftUI. The first view, GestureTestView, has a drag gesture defined on its overlay view (call it the indicator view) and contains a subview, ContentTestView, that has a tap gesture attached to it. The problem is that the tap gesture in ContentTestView blocks the drag gesture on the indicator view. I have tried everything, including simultaneous gestures, but nothing seems to work because the gestures are on different views. It's easy to test by copying and pasting the code below and running it in an Xcode preview.
import SwiftUI

struct GestureTestView: View {
    @State var indicatorOffset: CGFloat = 10.0

    var body: some View {
        ContentTestView()
            .overlay(alignment: .leading) {
                Capsule()
                    .fill(Color.mint.gradient)
                    .frame(width: 8, height: 60)
                    .offset(x: indicatorOffset)
                    .gesture(
                        DragGesture(minimumDistance: 0)
                            .onChanged { value in
                                indicatorOffset = min(max(0, 10 + value.translation.width), 340)
                            }
                            .onEnded { value in
                            }
                    )
            }
    }
}

#Preview {
    GestureTestView()
}
struct ContentTestView: View {
    @State var isSelected = false

    var body: some View {
        HStack(spacing: 0) {
            ForEach(0..<8) { index in
                Rectangle()
                    .fill(index % 2 == 0 ? Color.blue : Color.red)
                    .frame(width: 40, height: 40)
            }
            .overlay {
                if isSelected {
                    RoundedRectangle(cornerRadius: 5)
                        .stroke(.yellow, lineWidth: 3.0)
                }
            }
        }
        .onTapGesture {
            isSelected.toggle()
        }
    }
}

#Preview {
    ContentTestView()
}
I have the following code in my ObservableObject class, and recently Xcode started flagging purple-coloured runtime issues with it (probably since iOS 18):
Issue 1: Performing I/O on the main thread can cause slow launches.
Issue 2: Interprocess communication on the main thread can cause non-deterministic delays.
Issue 3: Interprocess communication on the main thread can cause non-deterministic delays.
Here is the code:
@Published var cameraAuthorization: AVAuthorizationStatus
@Published var micAuthorization: AVAuthorizationStatus
@Published var photoLibAuthorization: PHAuthorizationStatus
@Published var locationAuthorization: CLAuthorizationStatus
var locationManager: CLLocationManager

override init() {
    // Issue 1: Performing I/O on the main thread can cause slow launches.
    cameraAuthorization = AVCaptureDevice.authorizationStatus(for: AVMediaType.video)
    micAuthorization = AVCaptureDevice.authorizationStatus(for: AVMediaType.audio)
    photoLibAuthorization = PHPhotoLibrary.authorizationStatus(for: .addOnly)

    // Issue 1: Performing I/O on the main thread can cause slow launches.
    locationManager = CLLocationManager()
    locationAuthorization = locationManager.authorizationStatus

    super.init()

    // Issue 2: Interprocess communication on the main thread can cause non-deterministic delays.
    locationManager.delegate = self
}
And also, in the handler for AVAudioSession.routeChangeNotification:
//Issue 3: Hangs - Interprocess communication on the main thread can cause non-deterministic delays.
let categoryPlayback = (AVAudioSession.sharedInstance().category == .playback)
I wonder how simply checking authorisation status can cause these issues. What is the fix here?
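One possible mitigation, sketched below under the assumption that the statuses can be read lazily off the main thread (refreshAuthorizationStatuses is a hypothetical helper, not part of the original class):

// Hypothetical sketch: read authorization statuses off the main thread,
// then hop back to the main actor to update the published properties.
func refreshAuthorizationStatuses() {
    Task.detached(priority: .userInitiated) { [weak self] in
        let camera = AVCaptureDevice.authorizationStatus(for: .video)
        let mic = AVCaptureDevice.authorizationStatus(for: .audio)
        let photos = PHPhotoLibrary.authorizationStatus(for: .addOnly)
        await MainActor.run {
            self?.cameraAuthorization = camera
            self?.micAuthorization = mic
            self?.photoLibAuthorization = photos
        }
    }
}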
I want to understand the utility of using AsyncStream now that iOS 17 has introduced the @Observable macro, where we can directly observe changes to the value of any variable in the model (and observation tracking can happen even outside a SwiftUI view). So if I am observing a continuous stream of values, such as the download progress of a file, using AsyncStream in a SwiftUI view, the same can be observed in that view using onChange(of:initial:) on the download progress (stored as a property in the model object). I am looking for the benefits, drawbacks, and limitations of both approaches.
Specifically, my question is with regard to Apple's AVCam sample code, where a few states are observed as follows. This is done in the CameraModel class, which is attached to a SwiftUI view.
// MARK: - Internal state observations

// Set up camera's state observations.
private func observeState() {
    Task {
        // Await new thumbnails that the media library generates when saving a file.
        for await thumbnail in mediaLibrary.thumbnails.compactMap({ $0 }) {
            self.thumbnail = thumbnail
        }
    }

    Task {
        // Await new capture activity values from the capture service.
        for await activity in await captureService.$captureActivity.values {
            if activity.willCapture {
                // Flash the screen to indicate capture is starting.
                flashScreen()
            } else {
                // Forward the activity to the UI.
                captureActivity = activity
            }
        }
    }

    Task {
        // Await updates to the capabilities that the capture service advertises.
        for await capabilities in await captureService.$captureCapabilities.values {
            isHDRVideoSupported = capabilities.isHDRSupported
            cameraState.isVideoHDRSupported = capabilities.isHDRSupported
        }
    }

    Task {
        // Await updates to a person's interaction with the Camera Control HUD.
        for await isShowingFullscreenControls in await captureService.$isShowingFullscreenControls.values {
            withAnimation {
                // Prefer showing a minimized UI when capture controls enter a fullscreen appearance.
                prefersMinimizedUI = isShowingFullscreenControls
            }
        }
    }
}
Looking at the CaptureCapabilities structure, it is small, with two Bool members. These changes could have been observed directly by a SwiftUI view. I wonder if there is a specific advantage or reason to use AsyncStream here and continuously iterate over changes in a for loop.
/// A structure that represents the capture capabilities of `CaptureService` in
/// its current configuration.
struct CaptureCapabilities {
    let isLivePhotoCaptureSupported: Bool
    let isHDRSupported: Bool

    init(isLivePhotoCaptureSupported: Bool = false,
         isHDRSupported: Bool = false) {
        self.isLivePhotoCaptureSupported = isLivePhotoCaptureSupported
        self.isHDRSupported = isHDRSupported
    }

    static let unknown = CaptureCapabilities()
}
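For comparison, here is a minimal sketch of the @Observable alternative I have in mind. ObservableCameraModel and CameraHUD are simplified stand-ins of my own, not the AVCam types:

import SwiftUI

// Simplified stand-in for the AVCam model; not the actual sample-code type.
@Observable
final class ObservableCameraModel {
    var captureCapabilities = CaptureCapabilities.unknown
}

struct CameraHUD: View {
    @State private var model = ObservableCameraModel()

    var body: some View {
        Text(model.captureCapabilities.isHDRSupported ? "HDR available" : "HDR unavailable")
            // With @Observable, the view re-renders whenever the property changes;
            // onChange(of:initial:) covers side effects outside rendering.
            .onChange(of: model.captureCapabilities.isHDRSupported, initial: true) { _, supported in
                print("HDR support changed: \(supported)")
            }
    }
}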
I see in most of the old sample codes from Apple that, when using AVAssetWriter to append audio, video, and metadata samples in a real-time camera recording setup, calls to .append(sampleBuffer) are either synchronised using an NSLock, or all the samples are sent to the asset writer on the same dispatch queue, thereby preventing concurrent writes. However, I can't find any documentation saying that calls to assetWriterInput.append(sampleBuffer) for different media samples, such as audio and video, must not be made concurrently. Is it not valid for these methods to be executed in parallel, for instance:
`videoSamplesAssetWriterInput.append(videoSampleBuffer)` from DispatchQueue 1
`audioSamplesAssetWriterInput.append(audioSampleBuffer)` from DispatchQueue 2
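For context, here is a minimal sketch of the serialized pattern those older samples use, assuming the writer inputs are already configured (SerializedWriter is a name of my own for illustration):

import AVFoundation

// Sketch of the serialization pattern from the older samples: both inputs are
// appended on one queue, so appends never run concurrently.
final class SerializedWriter {
    private let queue = DispatchQueue(label: "asset.writer.queue")
    let videoInput: AVAssetWriterInput
    let audioInput: AVAssetWriterInput

    init(videoInput: AVAssetWriterInput, audioInput: AVAssetWriterInput) {
        self.videoInput = videoInput
        self.audioInput = audioInput
    }

    func appendVideo(_ buffer: CMSampleBuffer) {
        queue.async {
            if self.videoInput.isReadyForMoreMediaData {
                self.videoInput.append(buffer)
            }
        }
    }

    func appendAudio(_ buffer: CMSampleBuffer) {
        queue.async {
            if self.audioInput.isReadyForMoreMediaData {
                self.audioInput.append(buffer)
            }
        }
    }
}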
The AVCam sample code by Apple fails to build with the Swift 6 language setting due to failed concurrency checks (the only modification to make in that code is adding @preconcurrency to import AVFoundation).
Here is a minimal reproducible sample for one of the errors:
import Foundation

final class Recorder {
    var writer = Writer()
    var isRecording = false

    func startRecording() {
        Task { [writer] in
            await writer.startRecording()
            print("started recording")
        }
    }

    func stopRecording() {
        Task { [writer] in
            await writer.stopRecording()
            print("stopped recording")
        }
    }

    func observeValues() {
        Task {
            for await value in await writer.$isRecording.values {
                isRecording = value
            }
        }
    }
}

actor Writer {
    @Published private(set) public var isRecording = false

    func startRecording() {
        isRecording = true
    }

    func stopRecording() {
        isRecording = false
    }
}
The function observeValues gives an error:
Non-sendable type 'Published<Bool>.Publisher' in implicitly asynchronous access to actor-isolated property '$isRecording' cannot cross actor boundary
I tried everything to fix it, all in vain. Can someone please point out whether the architecture of the AVCam sample code is flawed, or whether there is an easy fix?
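One workaround I can imagine (a sketch only, not how AVCam does it) is to have the actor vend an AsyncStream<Bool> instead of exposing the non-Sendable Published publisher across the actor boundary:

actor Writer {
    private(set) var isRecording = false
    private var observers: [AsyncStream<Bool>.Continuation] = []

    func startRecording() {
        isRecording = true
        broadcast()
    }

    func stopRecording() {
        isRecording = false
        broadcast()
    }

    // Vends a stream that yields the current state, then every change.
    func recordingStates() -> AsyncStream<Bool> {
        let (stream, continuation) = AsyncStream<Bool>.makeStream()
        observers.append(continuation)
        continuation.yield(isRecording)
        return stream
    }

    private func broadcast() {
        for observer in observers {
            observer.yield(isRecording)
        }
    }
}

The caller's loop then becomes for await value in await writer.recordingStates() { ... }, with no Publisher crossing the actor boundary.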
I imported a few files into my Xcode project from other projects using drag and drop. Even though the files were copied into the new project and there are no soft links pointing to the other projects' locations, the issue is that whenever I do a git commit and push, Xcode keeps showing all the projects to commit to rather than just the current project. There seems to be no setting to permanently remove the git dependency on the other projects. Is there any way to remove references to them?
The documentation for AVCaptureDeviceRotationCoordinator says it is designed to work with any CALayer, but it seems like it works only with AVCaptureVideoPreviewLayer. Can someone confirm whether it is possible to make it work with other layers, such as CAMetalLayer or AVSampleBufferDisplayLayer?
I am looking to move from a paid app to freemium without upsetting my existing users. I saw the WWDC 2022 session where new fields introduced in iOS 16 are used to extract the original application version the user purchased. While my app supports iOS 14 and above, I am willing to sacrifice iOS 14 and go iOS 15 and above, as StoreKit 2 requires iOS 15 at a minimum. The code below, however, is only valid for iOS 16. I need to know the best way out for iOS 15 devices if I am using StoreKit 2. If it is not possible in StoreKit 2, then how much work is involved in the original StoreKit API (because in that case I could handle iOS 14 as well)?
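The iOS 16-only approach from the session is, presumably, along these lines (a sketch using AppTransaction; the "2.0" paid-version cutoff is a hypothetical example):

import StoreKit

// Sketch of the iOS 16-only approach: read the original purchase version
// from AppTransaction. The "2.0" cutoff below is a hypothetical example.
@available(iOS 16.0, *)
func isLegacyPaidUser() async -> Bool {
    guard let verification = try? await AppTransaction.shared,
          case .verified(let appTransaction) = verification else {
        return false
    }
    // Users whose original version predates the freemium switch keep full access.
    return appTransaction.originalAppVersion.compare("2.0", options: .numeric) == .orderedAscending
}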
I am trying to use the AVCaptureDevice.RotationCoordinator API to observe angles for preview and capture, and it seems there is an issue with the API when it is used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) and cameras are switched.
Here is my setup. The function below is defined in an actor class called CameraManager that sets up the rotation coordinator.
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device, let displayLayer = displayLayer else { return }

    cancellables.removeAll()
    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    guard let coordinator = rotationCoordinator else { return }

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}
This works the very first time, but when I switch cameras and call this function again, it throws a runtime error saying the view's layer was modified from a non-main thread. This happens at the very line where the rotation coordinator is being recreated. It's not clear why initialising the rotation coordinator should modify CALayer properties right in its init method.
Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace:
(
0 UIKitCore 0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1 UIKitCore 0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2 QuartzCore 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3 QuartzCore 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4 QuartzCore 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5 QuartzCore 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6 AVFCapture 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7 MyApp.debug.dylib 0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8 MyApp.debug.dylib 0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9 MyApp.debug.dylib 0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10 MyApp.debug.dylib 0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11 MyApp.debug.dylib 0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12 MyApp.debug.dylib 0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
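If the coordinator's initializer really does touch the preview layer, one workaround I can sketch (makeRotationCoordinator is a hypothetical helper of mine, not an API) is to construct the coordinator on the main actor and hand it back to the capture actor:

// Hypothetical workaround: construct the coordinator on the main actor so any
// layer access inside its initializer happens on the main thread.
@MainActor
func makeRotationCoordinator(for device: AVCaptureDevice,
                             previewLayer: CALayer) -> AVCaptureDevice.RotationCoordinator {
    AVCaptureDevice.RotationCoordinator(device: device, previewLayer: previewLayer)
}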
I see a SwiftUI body being called repeatedly in an infinite loop in the presence of Environment variables like horizontalSizeClass or verticalSizeClass. This happens after the device is rotated from portrait to landscape and then back to portrait. The deinit method of TestPlayerVM is called repeatedly. A minimal reproducible sample is pasted below.
The infinite loop is not seen if I remove the size class environment references, OR if I skip the addPlayerObservers call in the TestPlayerVM initialiser.
import AVKit
import Combine

struct InfiniteLoopView: View {
    @Environment(\.verticalSizeClass) var verticalSizeClass
    @Environment(\.horizontalSizeClass) var horizontalSizeClass
    @State private var openPlayer = false
    @State var playerURL: URL = URL(fileURLWithPath: Bundle.main.path(forResource: "Test_Video", ofType: ".mov")!)

    var body: some View {
        PlayerView(playerURL: playerURL)
            .ignoresSafeArea()
    }
}

struct PlayerView: View {
    @Environment(\.dismiss) var dismiss
    var playerURL: URL
    @State var playerVM = TestPlayerVM()

    var body: some View {
        VideoPlayer(player: playerVM.player)
            .ignoresSafeArea()
            .background {
                Color.black
            }
            .task {
                let playerItem = AVPlayerItem(url: playerURL)
                playerVM.playerItem = playerItem
            }
    }
}

@Observable
class TestPlayerVM {
    private(set) public var player: AVPlayer = AVPlayer()

    var playerItem: AVPlayerItem? {
        didSet {
            player.replaceCurrentItem(with: playerItem)
        }
    }

    private var cancellable = Set<AnyCancellable>()

    init() {
        addPlayerObservers()
    }

    deinit {
        print("Deinit Video player manager")
        removeAllObservers()
    }

    private func removeAllObservers() {
        cancellable.removeAll()
    }

    private func addPlayerObservers() {
        player.publisher(for: \.timeControlStatus, options: [.initial, .new])
            .receive(on: DispatchQueue.main)
            .sink { timeControlStatus in
                print("Player time control status \(timeControlStatus)")
            }
            .store(in: &cancellable)
    }
}
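Since the loop disappears when addPlayerObservers is skipped in the initialiser, one mitigation I can sketch (names mirror the sample above; startObserving() is hypothetical) is to defer the Combine subscription until the view's .task runs:

@Observable
class DeferredPlayerVM {
    private(set) public var player = AVPlayer()
    private var cancellable = Set<AnyCancellable>()

    // Called from the view's .task instead of from init(), so constructing the
    // model during body evaluation does no observation-related work.
    func startObserving() {
        guard cancellable.isEmpty else { return }
        player.publisher(for: \.timeControlStatus, options: [.initial, .new])
            .receive(on: DispatchQueue.main)
            .sink { status in
                print("Player time control status \(status)")
            }
            .store(in: &cancellable)
    }
}

The view would then call playerVM.startObserving() from its .task before assigning the player item.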
I just downloaded Xcode 26, and the build fails despite the Metal Toolchain 26.0 being downloaded. What am I missing?
cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
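The diagnostic itself names the component download command; presumably running it from a terminal is the intended fix:

xcodebuild -downloadComponent MetalToolchain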
I am wondering if the new AVCam sample code was tested before release. It hangs on startup on an iPhone 14 Pro running the iOS 26 beta, with the following logs in the console:
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
App is being debugged, do not track this hang
Hang detected: 8.04s (debugger attached, not reporting)
I have this build error with Xcode 26 beta 2:
var asset: AVURLAsset?

func loadAsset() async {
    let assetURL = URL.documentsDirectory
        .appendingPathComponent("sample.mov")
    asset = AVURLAsset(url: assetURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

    /* Error: Type of expression is ambiguous without a type annotation */
    if let result = try? await asset?.load(.tracks, .isPlayable, .isComposable) {
    }
}
Is there an issue with try? in the new Swift compiler?
Error: Type of expression is ambiguous without a type annotation
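One workaround to try (a sketch, assuming the ambiguity stems from the optional chaining) is to bind the asset to a local constant before calling load(_:):

func loadAsset() async {
    let assetURL = URL.documentsDirectory
        .appendingPathComponent("sample.mov")
    let loadedAsset = AVURLAsset(url: assetURL,
                                 options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    asset = loadedAsset

    // Calling load(_:) on a non-optional value avoids the chained expression
    // that the compiler reports as ambiguous.
    if let (tracks, isPlayable, isComposable) = try? await loadedAsset.load(.tracks, .isPlayable, .isComposable) {
        print("tracks: \(tracks.count), playable: \(isPlayable), composable: \(isComposable)")
    }
}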
Xcode 26 beta 2 is taking more than 20 seconds to start an AVCaptureSession when AVCaptureVideoDataOutput and AVCaptureAudioDataOutput are added. The problem occurs only during debugging and is clearly seen with the Cinematic Capture sample code by Apple. I am using an iPhone 14 Pro running iOS 26 beta 2 for reference.
I have a discrete scrubber implementation (range 0-100) using ScrollView in SwiftUI that fails at the end points. For instance, scrolling it all the way to the bottom shows a value of 87 instead of 100. If, instead, I scroll down by tapping the + button incrementally until it reaches the end, it shows the correct value of 100 when it gets there. But then tapping the minus button doesn't scroll the scrubber back until the minus button has been tapped three times.
I understand this has only to do with the .viewAligned scroll target behaviour, but I don't understand what exactly the issue is, or whether it's a bug in SwiftUI.
import SwiftUI
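// Note: ScrubberConfig isn't included in the post; below is an assumed minimal
// definition so the sample compiles (field names and types inferred from usage).
struct ScrubberConfig {
    let count: Int
    let steps: Int
    let spacing: CGFloat
}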
struct VerticalScrubber: View {
    var config: ScrubberConfig
    @Binding var value: CGFloat
    @State private var scrollPosition: Int?

    var body: some View {
        GeometryReader { geometry in
            let verticalPadding = geometry.size.height / 2 - 8

            ZStack(alignment: .trailing) {
                ScrollView(.vertical, showsIndicators: false) {
                    VStack(spacing: config.spacing) {
                        ForEach(0...(config.steps * config.count), id: \.self) { index in
                            horizontalTickMark(for: index)
                                .id(index)
                        }
                    }
                    .frame(width: 80)
                    .scrollTargetLayout()
                    .safeAreaPadding(.vertical, verticalPadding)
                }
                .scrollTargetBehavior(.viewAligned)
                .scrollPosition(id: $scrollPosition, anchor: .top)

                Capsule()
                    .frame(width: 32, height: 3)
                    .foregroundColor(.accentColor)
                    .shadow(color: .accentColor.opacity(0.3), radius: 3, x: 0, y: 1)
            }
            .frame(width: 100)
            .onAppear {
                DispatchQueue.main.async {
                    scrollPosition = Int(value * CGFloat(config.steps))
                }
            }
            .onChange(of: value) { oldValue, newValue in
                let newIndex = Int(newValue * CGFloat(config.steps))
                print("New index \(newIndex)")
                if scrollPosition != newIndex {
                    withAnimation {
                        scrollPosition = newIndex
                        print("\(String(describing: scrollPosition))")
                    }
                }
            }
            .onChange(of: scrollPosition) { oldIndex, newIndex in
                guard let pos = newIndex else { return }
                let newValue = CGFloat(pos) / CGFloat(config.steps)
                if abs(value - newValue) > 0.001 {
                    value = newValue
                }
            }
        }
    }

    private func horizontalTickMark(for index: Int) -> some View {
        let isMajorTick = index % config.steps == 0
        let tickValue = index / config.steps

        return HStack(spacing: 8) {
            Rectangle()
                .fill(isMajorTick ? Color.accentColor : Color.gray.opacity(0.5))
                .frame(width: isMajorTick ? 24 : 12, height: isMajorTick ? 2 : 1)
            if isMajorTick {
                Text("\(tickValue * 5)")
                    .font(.system(size: 12, weight: .medium))
                    .foregroundColor(.primary)
                    .fixedSize()
            }
        }
        .frame(maxWidth: .infinity, alignment: .trailing)
        .padding(.trailing, 8)
    }
}
#Preview("Vertical Scrubber") {
struct VerticalScrubberPreview: View {
@State private var value: CGFloat = 0
private let config = ScrubberConfig(count: 20, steps: 5, spacing: 8)
var body: some View {
VStack {
Text("Vertical Scrubber (0–100 in steps of 5)")
.font(.title2)
.padding()
HStack(spacing: 30) {
VerticalScrubber(config: config, value: $value)
.frame(width: 120, height: 300)
.background(Color(.systemBackground))
.border(Color.gray.opacity(0.3))
VStack {
Text("Current Value:")
.font(.headline)
Text("\(value * 5, specifier: "%.0f")")
.font(.system(size: 36, weight: .bold))
.padding()
HStack {
Button("−5") {
let newValue = max(0, value - 1)
if value != newValue {
value = newValue
UISelectionFeedbackGenerator().selectionChanged()
}
print("Value \(newValue), \(value)")
}
.disabled(value <= 0)
Button("+5") {
let newValue = min(CGFloat(config.count), value + 1)
if value != newValue {
value = newValue
UISelectionFeedbackGenerator().selectionChanged()
}
print("Value \(newValue), \(value)")
}
.disabled(value >= CGFloat(config.count))
}
.buttonStyle(.bordered)
}
}
Spacer()
}
.padding()
}
}
return VerticalScrubberPreview()
}