Performance Concerns and Dynamic Control of Parallel Image Uploads Using Swift TaskGroup
I'm currently developing an iOS app with image-upload functionality. To improve upload speed, I'm considering implementing parallel uploads using Swift's TaskGroup. However, I'm concerned that in limited-bandwidth environments, parallelization might introduce overhead and contention and ultimately slow uploads down instead of speeding them up. Specifically, I'm curious about the following:

- Is this concern valid? Does parallelizing uploads become counterproductive in low-bandwidth conditions due to overhead and network contention?
- If so, I'm considering dynamically adjusting the concurrency level based on network conditions. Does anyone have experience or best practices with such an approach?

Any insights or advice would be greatly appreciated. Thank you!
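For reference, a minimal sketch of the kind of bounded-concurrency TaskGroup I have in mind (`uploadImage(_:)` and `maxConcurrent` are illustrative placeholders, not real API):

```swift
import Foundation

// Illustrative placeholder for the real upload call.
func uploadImage(_ url: URL) async throws {
    // Real networking (e.g. a URLSession upload task) would go here.
}

// Upload all images, keeping at most `maxConcurrent` uploads in flight.
func uploadAll(_ urls: [URL], maxConcurrent: Int = 4) async throws {
    try await withThrowingTaskGroup(of: Void.self) { group in
        var iterator = urls.makeIterator()
        // Seed the group with the first `maxConcurrent` uploads.
        for _ in 0..<maxConcurrent {
            guard let url = iterator.next() else { break }
            group.addTask { try await uploadImage(url) }
        }
        // As each upload finishes, start the next one.
        while try await group.next() != nil {
            if let url = iterator.next() {
                group.addTask { try await uploadImage(url) }
            }
        }
    }
}
```

The idea would be to tune `maxConcurrent` (or recompute it between batches) based on observed network conditions.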
Replies: 1 · Boosts: 0 · Views: 168 · Jun ’25
existential any error in MLModel class
Problem: I set SWIFT_UPCOMING_FEATURE_EXISTENTIAL_ANY to true under Build Settings > Swift Compiler - Upcoming Features to adopt the existential any proposal. The following errors then appear in the MLModel class, but this is an auto-generated file, so I don't know how to deal with them:

- Use of protocol 'MLFeatureProvider' as a type must be written 'any MLFeatureProvider'
- Use of protocol 'Error' as a type must be written 'any Error'

Environment: Xcode 16.0, Xcode 16.1 Beta 2

What I tried:

- Deleted the DerivedData cache and regenerated the MLModel class files
- Used DepthAnythingV2SmallF16P6.mlpackage to verify whether the problem is with my own mlmodel
- Repeated the above after enabling Swift 6 in Xcode
- Used coremlc from the command line to generate the MLModel class files with Swift 6 specified
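For context, a minimal example of the diagnostic outside Core ML (`Shape` is an illustrative protocol, not from my project):

```swift
// With SWIFT_UPCOMING_FEATURE_EXISTENTIAL_ANY enabled, bare protocol types
// used as values must be spelled with `any`.
protocol Shape {
    func area() -> Double
}

// Error under the upcoming feature:
// "Use of protocol 'Shape' as a type must be written 'any Shape'"
// let shapes: [Shape] = []

// Accepted spelling:
let shapes: [any Shape] = []
```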
Replies: 2 · Boosts: 2 · Views: 703 · Dec ’24
AVFoundation's ultra-wide-angle camera behaves differently than the default Apple camera
Thanks for reading. I am creating a custom camera using AVFoundation. When I maximize the wide angle with the ultra-wide-angle camera (i.e., when videoZoomFactor is minimized), the field of view is narrower than in the default Apple camera. Looking at the metadata in the album, the focal length is 13 mm for the default Apple camera, while it is 16 mm for mine. Below is an excerpt of the code.

Camera settings:

```swift
if let captureDevice = AVCaptureDevice.default(
  .builtInTripleCamera,
  for: .video,
  position: .back
) {
  self.captureDevice = captureDevice
} else if let captureDevice = AVCaptureDevice.default(
  .builtInDualWideCamera,
  for: .video,
  position: .back
) {
  self.captureDevice = captureDevice
} else if let captureDevice = AVCaptureDevice.default(
  .builtInWideAngleCamera,
  for: .video,
  position: .back
) {
  self.captureDevice = captureDevice
}

do {
  let input = try AVCaptureDeviceInput(device: captureDevice)
  let videoDataOutput = AVCaptureVideoDataOutput()
  // Omitted
  photoOutput = AVCapturePhotoOutput()
  guard let photoOutput = photoOutput else { return }
  photoOutput.isHighResolutionCaptureEnabled = true
  session.sessionPreset = .photo
  // Omitted
} catch {
}

for connection in session.connections {
  connection.preferredVideoStabilizationMode = .cinematicExtended
}
```

Zoom function:

```swift
func zoom(zoomFactor: CGFloat, ramping: Bool = false) {
  do {
    try captureDevice?.lockForConfiguration()
    self.zoomFactor = zoomFactor
    if ramping {
      captureDevice?.ramp(toVideoZoomFactor: zoomFactor, withRate: 10.0)
    } else {
      captureDevice?.videoZoomFactor = zoomFactor
    }
    captureDevice?.unlockForConfiguration()
  } catch {
    errorReportingService.reportError(error: error)
  }
}
```

Test devices: iPhone 11, iPhone 12 mini. Thanks for reading this far.

I want to make it as wide-angle as the default Apple camera! The default camera app allows a wider angle than the one I created, so I believe there must be a way.
Replies: 3 · Boosts: 0 · Views: 1.8k · Feb ’23
kCGImagePropertyExifSourceImageNumberOfCompositeImage
I want to store the number of images composited with HDR in kCGImagePropertyExifSourceImageNumberOfCompositeImage, but I don't know how to get the composite count. Could you please tell me how to do that?
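For reference, a sketch of reading the composite-image EXIF keys from encoded photo data (the function name is illustrative; `data` is assumed to come from something like AVCapturePhoto's fileDataRepresentation()):

```swift
import Foundation
import ImageIO

// Sketch: read the composite-image EXIF keys from encoded photo data.
// Returns nil for either value if the key is absent.
func compositeImageInfo(from data: Data) -> (compositeImage: Int?, sourceCount: Int?) {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any] else {
        return (nil, nil)
    }
    return (exif[kCGImagePropertyExifCompositeImage] as? Int,
            exif[kCGImagePropertyExifSourceImageNumberOfCompositeImage] as? Int)
}
```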
Replies: 0 · Boosts: 0 · Views: 640 · Jun ’22
CompositeImage is always 2 for images acquired using AVCapturePhotoOutput
I use AVCapturePhotoOutput to take still pictures and print the metadata with the following code:

```swift
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
  guard let data = photo.fileDataRepresentation() else { return }
  let ciImage = CIImage(data: data)
  print(ciImage?.properties)
}
```

The properties contain CompositeImage = 2. However, CompositeImage is always 2 even when HDR is turned off as shown below:

```swift
device.automaticallyAdjustsVideoHDREnabled = false
device.isVideoHDREnabled = false
```

The expected value is 1. What could be the cause of this?
Replies: 0 · Boosts: 0 · Views: 1.8k · Jun ’22
Testing asynchronous code with XCTestExpectation
When using XCTestExpectation, is there a difference between writing XCTAssert inside the sink closure and after wait? My example:

```swift
let exp = expectation(description: "test")
repository.delete(id: "2")
  .sink { [unowned self] in
    print("in sink closure")
    XCTAssertEqual(repository.object.count, 2)
    XCTAssertFalse(repository.object.contains { $0.id == "2" })
    exp.fulfill()
  }
  .store(in: &cancellables)
wait(for: [exp], timeout: 1.0)
print("after wait")
XCTAssertEqual(repository.object.count, 2)
XCTAssertFalse(repository.object.contains { $0.id == "2" })
```

The XCTAsserts in the sink succeed, but the XCTAsserts after wait fail, so I assumed the code after wait was executing without waiting for the expectation. However, the console prints in this order:

in sink closure
after wait

Why do the XCTAsserts after wait fail?
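For comparison, here is a stripped-down sketch of the same pattern with a synchronous publisher, where the assertion after wait does pass (Just stands in for repository.delete(id:); the test and variable names are illustrative):

```swift
import XCTest
import Combine

final class ExpectationOrderTests: XCTestCase {
    var cancellables = Set<AnyCancellable>()

    func testWaitRunsAfterSink() {
        let exp = expectation(description: "sink fired")
        var sinkRan = false
        Just(())  // stands in for repository.delete(id: "2")
            .sink { _ in
                sinkRan = true
                exp.fulfill()
            }
            .store(in: &cancellables)
        // wait(for:) blocks until the expectation is fulfilled or times out,
        // so the assertion below runs only after the sink closure has executed.
        wait(for: [exp], timeout: 1.0)
        XCTAssertTrue(sinkRan)
    }
}
```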
Replies: 1 · Boosts: 0 · Views: 654 · Apr ’22
Message isn't displayed when crashing via assertNoFailure
When crashing via assertNoFailure, I cannot see the message. Sample code:

View:

```swift
import SwiftUI

struct ContentView: View {
  @ObservedObject private var viewModel = ContentViewModel()
  var body: some View {
    Button(action: { viewModel.crash() }, label: { Text("button") })
  }
}
```

ViewModel:

```swift
import Foundation
import Combine

class ContentViewModel: ObservableObject {
  var cancellables = Set<AnyCancellable>()

  private enum AddOneError: Swift.Error {
    case error
  }

  private func addOnePublisher(_ n: Int) -> AnyPublisher<Int, AddOneError> {
    Deferred {
      Future { (promise) in
        guard n < 10 else {
          promise(.failure(AddOneError.error))
          return
        }
        promise(.success(n + 1))
      }
    }
    .eraseToAnyPublisher()
  }

  func crash() {
    addOnePublisher(10)
      .assertNoFailure("I want to display this message.")
      .sink { (completion) in
        switch completion {
        case .finished:
          break
        case .failure(let error):
          print(error)
        }
      } receiveValue: { (n) in
        print(n)
      }
      .store(in: &cancellables)
  }
}
```

When it crashes, if I use

```swift
.catch { error -> AnyPublisher<Int, Never> in
  fatalError("I can see this message")
}
```

instead of assertNoFailure, I can see the log: Fatal error: I can see this message. How can I see the prefix passed to assertNoFailure? And if it isn't shown, when is it used?
Replies: 1 · Boosts: 0 · Views: 850 · Dec ’21
Add a new Package Product to already added Package
How can I add a new package product from a package that has already been added? For example, I have been using Firebase itself, and now I want to add FirebaseCrashlytics as well. Do I have to delete the package and then add it again?
Replies: 0 · Boosts: 0 · Views: 518 · Dec ’21
Want to disable the simulator's external display
Environment: Xcode 13.1, simulator iPhone 12 (iOS 15.0). When I run the app, the simulator acts as an external display and shows a black screen. I want to undo this, but I don't know how. I/O > External Displays > Disabled is already selected.
Replies: 4 · Boosts: 0 · Views: 2.7k · Nov ’21
Removing the green (+) icon displayed by onInsert
I'm using .onInsert, and I want to remove the green (+) icon that is displayed by default. Is that possible?
Replies: 4 · Boosts: 0 · Views: 1.8k · Nov ’21
Bug in VStack alignment inside a Button on iOS 15
When using a VStack inside a Button on iOS 15, the alignment is buggy.

```swift
import SwiftUI

struct ContentView: View {
  var body: some View {
    Button(action: {}, label: {
      VStack(alignment: .leading, spacing: 10) {
        Text("Hello, hello, hello")
        Text("This is a test. This is a test. This is a test. This is a test. This is a test. This is a test. This is a test. This is a test. This is a test.")
      }
    })
  }
}
```

(Screenshots: iOS 14 vs. iOS 15)
Replies: 1 · Boosts: 0 · Views: 1.9k · Nov ’21