Dive into the world of programming languages used for app development.

Posts under Programming Languages topic

SwiftUI: a short way to use modifier methods
These helper methods let you apply custom modifier functions in the standard, chainable SwiftUI style.

```swift
extension View {
    @inline(__always)
    func modify(_ block: (_ view: Self) -> some View) -> some View {
        block(self)
    }

    @inline(__always)
    func modify<V: View, T>(_ block: (_ view: Self, _ data: T) -> V, with data: T) -> V {
        block(self, data)
    }
}
```

Discussion

Suppose you have modifier methods:

```swift
func addBorder(view: some View) -> some View {
    view.padding().border(Color.red, width: borderWidth)
}

func highlight(view: some View, color: Color) -> some View {
    view.border(Color.red, width: borderWidth).overlay {
        color.opacity(0.3)
    }
}
```

Ordinary approach

Your code might look like this:

```swift
var body: some View {
    let image = Image(systemName: "globe")
    let borderedImage = addBorder(view: image)
    let highlightedImage = highlight(view: borderedImage, color: .red)

    let text = Text("Some Text")
    let borderedText = addBorder(view: text)
    let highlightedText = highlight(view: borderedText, color: .yellow)

    VStack {
        highlightedImage
        highlightedText
    }
}
```

This doesn't read like typical SwiftUI code.

Better approach

The helper methods modify(_:) and modify(_:with:) described above let you write the same thing in the usual concise SwiftUI style:

```swift
var body: some View {
    VStack {
        Image(systemName: "globe")
            .modify(addBorder)
            .modify(highlight, with: .red)
        Text("Some Text")
            .modify(addBorder)
            .modify(highlight, with: .yellow)
    }
}
```
0 replies · 0 boosts · 402 views · Dec ’24
Animatable AnyInsettableShape
The system provides the AnyShape type eraser, which animates correctly, but it doesn't provide an AnyInsettableShape. Here is my implementation of AnyInsettableShape (and the AnyAnimatableData type needed to support animation). Let me know if there is a simpler solution.

```swift
struct AnyInsettableShape: InsettableShape {
    private let _path: (CGRect) -> Path
    private let _inset: (CGFloat) -> AnyInsettableShape
    private let _getAnimatableData: () -> AnyAnimatableData
    private let _setAnimatableData: (_ data: AnyAnimatableData) -> AnyInsettableShape

    init<S>(_ shape: S) where S : InsettableShape {
        _path = { shape.path(in: $0) }
        _inset = { AnyInsettableShape(shape.inset(by: $0)) }
        _getAnimatableData = { AnyAnimatableData(shape.animatableData) }
        _setAnimatableData = { data in
            guard let otherData = data.rawValue as? S.AnimatableData else {
                assertionFailure(); return AnyInsettableShape(shape)
            }
            var shape = shape
            shape.animatableData = otherData
            return AnyInsettableShape(shape)
        }
    }

    var animatableData: AnyAnimatableData {
        get { _getAnimatableData() }
        set { self = _setAnimatableData(newValue) }
    }

    func path(in rect: CGRect) -> Path {
        _path(rect)
    }

    func inset(by amount: CGFloat) -> some InsettableShape {
        _inset(amount)
    }
}

struct AnyAnimatableData : VectorArithmetic {
    init<T : VectorArithmetic>(_ value: T) {
        self.init(optional: value)
    }

    private init<T : VectorArithmetic>(optional value: T?) {
        rawValue = value
        _scaleBy = { factor in (value != nil) ? AnyAnimatableData(value!.scaled(by: factor)) : .zero }
        _add = { other in AnyAnimatableData(value! + (other.rawValue as! T)) }
        _subtract = { other in AnyAnimatableData(value! - (other.rawValue as! T)) }
        _equal = { other in value! == (other.rawValue as! T) }
        _magnitudeSquared = { (value != nil) ? value!.magnitudeSquared : .zero }
        _zero = { AnyAnimatableData(T.zero) }
    }

    fileprivate let rawValue: (any VectorArithmetic)?

    private let _scaleBy: (_: Double) -> AnyAnimatableData
    private let _add: (_ other: AnyAnimatableData) -> AnyAnimatableData
    private let _subtract: (_ other: AnyAnimatableData) -> AnyAnimatableData
    private let _equal: (_ other: AnyAnimatableData) -> Bool
    private let _magnitudeSquared: () -> Double
    private let _zero: () -> AnyAnimatableData

    mutating func scale(by rhs: Double) { self = _scaleBy(rhs) }

    var magnitudeSquared: Double { _magnitudeSquared() }

    static let zero = AnyAnimatableData(optional: nil as Double?)

    @inline(__always)
    private var isZero: Bool { rawValue == nil }

    static func + (lhs: AnyAnimatableData, rhs: AnyAnimatableData) -> AnyAnimatableData {
        guard let (lhs, rhs) = fillZeroTypes(lhs, rhs) else { return .zero }
        return lhs._add(rhs)
    }

    static func - (lhs: AnyAnimatableData, rhs: AnyAnimatableData) -> AnyAnimatableData {
        guard let (lhs, rhs) = fillZeroTypes(lhs, rhs) else { return .zero }
        return lhs._subtract(rhs)
    }

    static func == (lhs: AnyAnimatableData, rhs: AnyAnimatableData) -> Bool {
        guard let (lhs, rhs) = fillZeroTypes(lhs, rhs) else { return true }
        return lhs._equal(rhs)
    }

    @inline(__always)
    private static func fillZeroTypes(_ lhs: AnyAnimatableData, _ rhs: AnyAnimatableData) -> (AnyAnimatableData, AnyAnimatableData)? {
        switch (!lhs.isZero, !rhs.isZero) {
        case (true, true): (lhs, rhs)
        case (true, false): (lhs, lhs._zero())
        case (false, true): (rhs._zero(), rhs)
        case (false, false): nil
        }
    }
}
```
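Editor's note: the following is a hypothetical usage sketch, not part of the original post. It only illustrates why the eraser is useful (storing "any insettable shape" in a single property and using InsettableShape-only modifiers such as strokeBorder); it does not exercise the animatable-data plumbing. The view and property names are placeholders.

```swift
import SwiftUI

// Hypothetical example: a reusable view that accepts any insettable shape.
struct BadgeView: View {
    var shape = AnyInsettableShape(Capsule())   // callers may pass any InsettableShape

    var body: some View {
        Text("42")
            .padding(12)
            .background { shape.fill(Color.yellow) }
            .overlay { shape.strokeBorder(Color.orange, lineWidth: 2) }
    }
}
```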
0 replies · 0 boosts · 421 views · Dec ’24
Playing Timed Sound Effects in Background
Hi, I'm relatively new to iOS development and would appreciate some feedback on a strategy to achieve the behavior described below.

My question: what is the best strategy for playing sound effects with precise timing while the app is in the background? Is this even possible?

Context: I created a basic countdown timer app (targeting iOS 17, Swift/SwiftUI). Countdown sessions can last 30-60 minutes. When the timer is started it progresses through a series of sub-intervals and plays a short sound for each one. I used AVAudioPlayer and everything works fine while the app is in the foreground. I'm considering switching to AVAudioEngine because precise timing is very important, and the AIs tell me it would have better precision. I'm already setting "App plays audio or streams audio/video using AirPlay" in my Info.plist, and have configured:

```swift
AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: .mixWithOthers)
```

Curiously, when testing on my iPhone 13 mini, sounds sometimes still play when the app is in the background, but not always.

What I've considered:
- Background Tasks: would they make any sense for this use case? It seems not, since the allowed time is short and limited by the system.
- Pre-scheduling all sounds: not sure this would even work, and it seems like a lot of memory would be needed (there could be hundreds of intervals).
- ActivityKit alerts: works, but with a ~50 ms delay, which is too long for my purposes.
- Pre-rendering all SFX into one large audio file: seems like a lot of work and processing time, and probably not worth it.

I hope there's a better solution. I'd really appreciate any feedback.
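Editor's note: one direction that may be worth prototyping is sketched below. It is a minimal sketch under the assumption that the audio background mode keeps the engine alive, not a verified solution for long background sessions. The idea is to keep an AVAudioEngine running and schedule each effect on an AVAudioPlayerNode at a sample-accurate time instead of firing timers; the class and method names are placeholders.

```swift
import AVFoundation

// Minimal sketch (untested): schedule a short sound at a sample-accurate offset
// from "now" on an AVAudioPlayerNode. `buffer` is assumed to hold the effect.
final class IntervalSoundPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let buffer: AVAudioPCMBuffer

    init(buffer: AVAudioPCMBuffer) throws {
        self.buffer = buffer
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
        try engine.start()
        player.play()
    }

    /// Schedules the effect `seconds` from now, on the player's own timeline.
    func scheduleEffect(in seconds: TimeInterval) {
        guard let nodeTime = player.lastRenderTime,
              let playerTime = player.playerTime(forNodeTime: nodeTime) else { return }
        let sampleRate = playerTime.sampleRate
        let when = AVAudioTime(
            sampleTime: playerTime.sampleTime + AVAudioFramePosition(seconds * sampleRate),
            atRate: sampleRate)
        player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
    }
}
```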
1 reply · 0 boosts · 1.1k views · Dec ’24
AVAssetExportSession in Xcode 16 - "exportAsynchronouslyWithCompletionHandler: more than once."
I’m experiencing a crash at runtime when trying to extract audio from a video. The issue occurs on both iOS 18 and earlier versions, and is caused by the following exception:

```
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException',
reason: '*** -[AVAssetExportSession exportAsynchronouslyWithCompletionHandler:]
Cannot call exportAsynchronouslyWithCompletionHandler: more than once.
(0x1851435ec 0x1826dd244 0x1970c09c0 0x214d8f358 0x214d95899 0x190a1c8b9 0x214d8efd9 0x30204cef5 0x302053ab9 0x190a5ae39)
libc++abi: terminating due to uncaught exception of type NSException
```

My previous code worked fine, but it's crashing with Swift 6. Does anyone know a solution for this?

Previous code:

```swift
func extractAudioFromVideo(from videoURL: URL,
                           exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil,
                           completion: @escaping (Swift.Result<URL, Error>) -> Void) {
    let asset = AVAsset(url: videoURL)

    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }

    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    exportSession.outputFileType = .m4a
    exportSession.outputURL = outputURL

    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler = exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }

    // Periodically check the progress of the export session
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        audioExportProgressPublisher.send(exportSession.progress)
    }

    // Export the audio track asynchronously
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
        case .cancelled:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
        default:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
        }
        // Invalidate the timer when the export session completes or is cancelled
        timer.invalidate()
    }
}
```

New code:

```swift
func extractAudioFromVideo(from videoURL: URL,
                           exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil,
                           completion: @escaping (Swift.Result<URL, Error>) -> Void) {
    let asset = AVAsset(url: videoURL)

    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }

    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    exportSession.outputFileType = .m4a
    exportSession.outputURL = outputURL

    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }

    let task = Task {
        if #available(iOS 18.0, *) {
            // Handle export for iOS 18 and later
            let states = exportSession.states(updateInterval: 0.1)
            for await state in states {
                switch state {
                case .pending, .waiting:
                    break
                case .exporting(progress: let progress):
                    print("Exporting: \(progress.fractionCompleted)")
                    if progress.isFinished {
                        completion(.success(outputURL))
                    } else if progress.isCancelled {
                        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
                    } else {
                        audioExportProgressPublisher.send(Float(progress.fractionCompleted))
                    }
                }
            }
            try await exportSession.export(to: outputURL, as: .m4a) // Only call export once
        } else {
            // Handle export for iOS versions below 18
            let publishTimer = Timer.publish(every: 0.1, on: .main, in: .common)
                .autoconnect()
                .sink { [weak exportSession] _ in
                    guard let exportSession = exportSession else { return }
                    audioExportProgressPublisher.send(exportSession.progress)
                }

            // Only call export once
            await exportSession.export()

            // Handle the export session's status
            switch exportSession.status {
            case .completed:
                completion(.success(outputURL))
            case .failed:
                completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
            case .cancelled:
                completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
            default:
                completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
            }

            // Invalidate the timer when the export session completes or is cancelled
            publishTimer.cancel()
        }
    }
    task.cancel()
}
```
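Editor's note: for reference, here is a hedged sketch (my reading of the iOS 18 API, not a confirmed fix for the post above) of how to keep the export to a single call: start export(to:as:) as a child task and observe states(updateInterval:) alongside it, rather than iterating the states before exporting. The helper name and error handling are simplified placeholders.

```swift
import AVFoundation

// Hypothetical sketch: run the export exactly once and observe progress concurrently.
@available(iOS 18.0, *)
func extractAudio(from asset: AVAsset, to outputURL: URL) async throws {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetAppleM4A) else {
        throw CocoaError(.fileWriteUnknown)   // placeholder error
    }

    // Start the export once, as a child task.
    async let exportTask: Void = session.export(to: outputURL, as: .m4a)

    // Observe state updates while the export runs.
    for await state in session.states(updateInterval: 0.1) {
        if case .exporting(let progress) = state {
            print("Exporting: \(progress.fractionCompleted)")
            if progress.isFinished { break }   // stop observing once done
        }
    }

    // Propagate any export error.
    try await exportTask
}
```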
0 replies · 0 boosts · 425 views · Dec ’24
App Settings Not Appearing with Xcode 16.2
I recently encountered an issue with Xcode 16.2 while attempting to integrate Settings.bundle into a new app. I added Settings.bundle as a new file (using the provided template), but when I ran the app (the standard simple "Hello World" project), the expected three default controls (Name, Enabled, Slider) did not appear in the app's settings. To troubleshoot, I downgraded my system to macOS Sonoma 14.7.2 and Xcode 15.4 (on a 2023 Mac Mini, M2). After this downgrade, everything worked as expected. With a new project, adding Settings.bundle, and running the app, the settings entry for the app appeared, including the three default fields. This behavior suggests a potential issue or incompatibility with Xcode 16.2.
0 replies · 1 boost · 433 views · Dec ’24
Is it possible to reply to a push notification from AirPods using voice?
We want to add the following to our iOS app. AirPods already announce incoming push notifications, which is working. Now we want the user to be able to say "Reply to this" and have the reply sent for that notification, but Siri says this is not supported in our app. Essentially, we need the "Listen and respond to messages with AirPods" feature. Do we need to add any integration inside the app for this, or does it work directly through the Siri settings? Is it possible in a non-messaging app? Is it possible without syncing contacts?
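Editor's note: as far as I understand (a hedged sketch, not a confirmed answer to the post above), the announce-and-reply flow is tied to communication notifications: the app needs the Communication Notifications capability, the push has to be re-presented as an INSendMessageIntent (typically in a Notification Service Extension), and handling the actual reply additionally involves the SiriKit message intents. A rough sketch of the notification part, with placeholder identifiers:

```swift
import Intents
import UserNotifications

// In a Notification Service Extension: wrap the incoming push in an
// INSendMessageIntent so the system treats it as a communication notification.
class NotificationService: UNNotificationServiceExtension {
    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        let content = request.content
        let sender = INPerson(personHandle: INPersonHandle(value: "sender-id", type: .unknown),
                              nameComponents: nil,
                              displayName: "Sender Name",          // placeholder
                              image: nil,
                              contactIdentifier: nil,
                              customIdentifier: nil)
        let intent = INSendMessageIntent(recipients: nil,
                                         outgoingMessageType: .outgoingMessageText,
                                         content: content.body,
                                         speakableGroupName: nil,
                                         conversationIdentifier: "conversation-id",   // placeholder
                                         serviceName: nil,
                                         sender: sender,
                                         attachments: nil)
        let interaction = INInteraction(intent: intent, response: nil)
        interaction.direction = .incoming
        interaction.donate(completion: nil)
        do {
            contentHandler(try content.updating(from: intent))
        } catch {
            contentHandler(content)   // fall back to the original content
        }
    }
}
```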
0 replies · 0 boosts · 479 views · Dec ’24
Model instance invalidated, backing data could no longer be found in the store
I have recently been getting the following error, seemingly at random, when an event handler of a SwiftUI view accesses a relationship of a SwiftData model the view holds a reference to. I haven't yet found a reliable way of reproducing it:

```
SwiftData/BackingData.swift:866: Fatal error: This model instance was invalidated because its backing data could no longer be found the store. PersistentIdentifier(id: SwiftData.PersistentIdentifier.ID(url: COREDATA_ID_URL), implementation: SwiftData.PersistentIdentifierImplementation)
```

What could cause this error? Can you suggest a workaround?
2 replies · 0 boosts · 906 views · Dec ’24
SwiftData Query filter().first crashes in iOS 18.2
Hi there, I have an iOS app that is already published and manages its data purely with SwiftData. I use simple code like the following throughout my views:

```swift
@Query var items: [Item]
...
if let firstItem = items.first(where: { ... }) {
    ...
}
```

On iOS 18.2 I started hitting a crash in the Query, with errors coming from _items.wrappedValue. When I split first(where:) into the ordinary two-step form, it runs fine:

```swift
let filteredItems = items.filter(...)
if let firstItem = filteredItems.first {
    ...
}
```

Is this a bug in SwiftData on iOS 18.2, or did I miss some step required by the SwiftData macros?
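Editor's note: a hedged alternative worth trying (my own sketch, not part of the original post, assuming a hypothetical Item model with a name property): let SwiftData do the filtering inside the query itself, so the view never walks the fetched array.

```swift
import SwiftUI
import SwiftData

@Model
final class Item {                      // simplified placeholder model
    var name: String
    init(name: String) { self.name = name }
}

struct FirstMatchView: View {
    // The predicate is evaluated by the store, so the view only sees matches.
    @Query(filter: #Predicate<Item> { $0.name == "target" })
    private var matchingItems: [Item]

    var body: some View {
        Text(matchingItems.first?.name ?? "No match")
    }
}
```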
1 reply · 0 boosts · 376 views · Dec ’24
Xcode: Duplicate Symbols Error
Hello, I have a problem with Xcode, using C++. When I create a new project and put my program in one file, it builds and runs correctly. But when I add a second file to the same project and click Build, the build fails and the log reports duplicate symbols.
1 reply · 0 boosts · 620 views · Dec ’24
Open Share Extension
Hello, everyone! Please help me find an answer. I have two applications: App-1 with a share extension and App-2 without one. From the second app I can open the share extension via UIActivityViewController. But I need the extension to open in the second application immediately when a button is pressed, not through UIActivityViewController. Can I do this?
3 replies · 0 boosts · 1.6k views · Dec ’24
Strange crash when using .values from @Published publisher
Given the code below, built with the Swift 6 language mode in Xcode 16.2:

- Running on iOS 18+: the app crashes due to _dispatch_assert_queue_fail.
- Running on iOS 17 and below: there is a warning: "data race detected: @MainActor function at Swift6Playground/PublishedValuesView.swift:12 was not called on the main thread".

Could anyone please help explain what's wrong here?

```swift
import SwiftUI
import Combine

@MainActor
class PublishedValuesViewModel: ObservableObject {
    @Published var count = 0
    @Published var content: String = "NA"
    private var cancellables: Set<AnyCancellable> = []

    func start() async {
        let publisher = $count
            .map { String(describing: $0) }
            .removeDuplicates()
        for await value in publisher.values {
            content = value
        }
    }
}

struct PublishedValuesView: View {
    @ObservedObject var viewModel: PublishedValuesViewModel

    var body: some View {
        Text("Published Values: \(viewModel.content)")
            .task {
                await viewModel.start()
            }
    }
}
```
2 replies · 0 boosts · 553 views · Dec ’24
In-app purchase enters billing retry after free trial
I have a VPN application published in the App Store. It uses IKEv2 for a personal VPN. There are two in-app purchases: 'Monthly', and 'Yearly' with a 3-day free trial. We have noticed something strange with the yearly subscription that has the free trial: the share of cancellations attributed to billing issues is very high, around 70-80%, because subscriptions end up in the billing retry state. Other apps we have seen keep billing issues under 10%. From our research, if the user doesn't cancel and Apple is unable to charge them, the subscription goes into billing retry. If users don't like the app they can easily cancel the subscription or free trial, but they are not doing so, yet Apple is still unable to charge them after the trial ends. Am I missing something on the developer end?
3 replies · 0 boosts · 1.1k views · Dec ’24
Inquiry about WebRTC camera access when using Chrome browser and WKWebView API on iPadOS
We are Java application developers and we have a question regarding camera access via WebRTC on iPadOS. Specifically, on iPadOS 17.1, we are encountering an issue when trying to access the camera via the WKWebView API in the Chrome browser, where an error occurs and the camera capture fails. Our investigation suggests that device access through the navigator.mediaDevices property via the WKWebView API may not work in Chrome. However, it works as expected in the Safari browser, leading us to wonder if this is a Chrome-specific limitation, or if it's due to an iPadOS setting or specification. At this point, we are unsure if this issue is related to the WKWebView and WebRTC specifications on iPadOS 17.1, or if there are specific limitations in Chrome. We would appreciate any insights or solutions regarding camera access in iPadOS 17.1 with WKWebView and WebRTC, especially in relation to Chrome.
1 reply · 0 boosts · 527 views · Dec ’24
Problem with deleting SDP service (macOS, IOBluetooth)
Hello everyone. This concerns macOS and the IOBluetooth framework. My goal is to create a temporary SDP service. According to the documentation, a temporary service is created by default (Persistent = NO) and is deleted after the application closes. The documentation also mentions the IOBluetoothRemoveServiceWithRecordHandle function for forced removal of the service; that function is deprecated and currently unavailable, and I guess it has been replaced by the IOBluetoothSDPServiceRecord removeServiceRecord method. The problem is that the service is not deleted, either by removeServiceRecord or even after the app closes. That is, if you create several services and try to delete them, they remain alive; only turning Bluetooth off and on in the OS helps. I tested all versions of macOS starting with Monterey, with the same behavior.

```objc
for (int i = 0; i < 10; i++) {
    service = [IOBluetoothSDPServiceRecord publishedServiceRecordWithDictionary:dictionary];
    if (!service) {
        NSLog(@"Failed to create service");
    } else {
        [service getRFCOMMChannelID:&channelID];
        [service getServiceRecordHandle:&serverHandle];
        NSLog(@"A new service has been created handle=%u, channelID=%hhu", serverHandle, channelID);
        if (service.removeServiceRecord != kIOReturnSuccess) {
            NSLog(@"Failed to delete service");
        }
        //service.release;
        service = nil;
    }
}
```

Can someone confirm this behavior? And is there a solution? A minimal test example is available at the link.
1 reply · 1 boost · 662 views · Dec ’24
SwiftUI Entry Point - Proper Location In Project
Hi. In C#, one can define associated functions as follows. Notice that "Declarations DE" is a reference to a function in another C# project file; this lets the compiler know that there are other references in the project. Likewise, "Form_Load" is the entry point of the code, similar to "main" in C. Any calls to related functions can be made in this section, to the functions that have been previously defined above. So I set out trying to find similar information about SwiftUI, and found several sources, but they only offer partial answers to my questions. The YouTube video "Extracting functions and subviews in SwiftUI | Bootcamp #20" goes into some of the details but still leaves me hanging. Likewise, "SOLVED: Swift Functions In Swift UI" on the Hacking with Swift forums has further information, but nothing concrete of the kind I am looking for. Now in the SwiftUI project, I tried this. The most confusing thing for me is: where is "main"? I found several examples that call functions from the structure shown above, but with no explanation as to why. One web example on Stack Overflow called the function from position 1; that did not work. Position 2 worked to call the function at position 3, but really, why? All this activity brings up a lot of questions for me, such as:

- Does SwiftUI need function call-outs similar to C#, called out even before running "main"? I seem to recall Borland Delphi being this way as well.
- How does SwiftUI make references to other classes (places where other functions are stored in separate files)?
- Does SwiftUI actually make use of "main" in the normal sense, i.e. similar to C, C#, Rust and so on?

I did notice that once a SwiftUI function is called, it passes data very much like other languages, at least in the examples I found. Note that I looked at the official SwiftUI documentation but did not come across information that answers the above.
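Editor's note: for context, a minimal sketch of a SwiftUI entry point (the type and view names are placeholders). The @main attribute marks the App type whose body declares the first scene, so there is no hand-written main() as in C or C#; functions called from a view's body are ordinary methods on that view and may live in other files.

```swift
import SwiftUI

@main
struct MyApp: App {                 // placeholder name; @main stands in for a hand-written main()
    var body: some Scene {
        WindowGroup {
            ContentView()           // the first view shown; it can be defined in another file
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text(greeting())            // calling an ordinary method from the view body
    }

    func greeting() -> String {
        "Hello, world!"
    }
}
```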
6 replies · 0 boosts · 521 views · Dec ’24
Instance methods don't work on iOS 18.2
I am using CHCSVParser in Objective-C, specifically the CSVString instance method that its NSArray category adds. When I installed a fresh build on iOS 18.1.1, the correct value (@"Application,"abc,def"") was returned on both the first and second runs, but when I installed a fresh build on iOS 18.2, the correct value was returned on the first run and @"" was returned on the second. Even when I debug with step-into, I can step into CSVString on the first run, but not on the second and later runs. It's as if the instance method is not being generated. There are in-app purchases between the first and second runs, but the view controllers that are called are the same. Did anything change between iOS 18.1.1 and iOS 18.2?

Code:

```objc
NSArray *application = [NSArray arrayWithObjects:KEY_APPLICATION, @"abc,def", nil];
NSString *applistring = [application CSVString];
NSString *appliStr = [application CSVString];
```

Debug window, 18.1.1, first run:

```
application   __NSArrayI *             @"2 elements"             0x0000000302118c00
  [0]         __NSCFConstantString *   @"Application"            0x00000001005deab8
  [1]         __NSCFConstantString *   @"abc,def"                0x00000001005deb78
applistring   __NSCFString *           @"Application,"abc,def""  0x0000000302f7d050
appliStr      __NSCFString *           @"Application,"abc,def""  0x0000000302f706f0
```

18.1.1, second run:

```
application   __NSArrayI *             @"2 elements"             0x00000003021b5200
  [0]         __NSCFConstantString *   @"Application"            0x00000001005deab8
  [1]         __NSCFConstantString *   @"abc,def"                0x00000001005deb78
applistring   __NSCFString *           @"Application,"abc,def""  0x0000000302ff6dc0
appliStr      __NSCFString *           @"Application,"abc,def""  0x0000000302fa4d20
```

18.2, first run:

```
application   __NSArrayI *             @"2 elements"             0x00000003019d7e80
  [0]         __NSCFConstantString *   @"Application"            0x00000001041c6ab8
  [1]         __NSCFConstantString *   @"abc,def"                0x00000001041c6b78
applistring   __NSCFString *           @"Application,"abc,def""  0x000000030179e430
appliStr      __NSCFString *           @"Application,"abc,def""  0x000000030179e5e0
```

18.2, second run:

```
application   __NSArrayI *             @"2 elements"             0x00000003019679a0
  [0]         __NSCFConstantString *   @"Application"            0x00000001041c6ab8
  [1]         __NSCFConstantString *   @"abc,def"                0x00000001041c6b78
applistring   __NSCFConstantString *   @""                       0x00000001efa04768
appliStr      __NSCFConstantString *   @""                       0x00000001efa04768
```
0 replies · 0 boosts · 478 views · Dec ’24
traitCollectionDidChange on iOS 18
iOS 18.2 / iPhone 16 Pro / Xcode 16.2. traitCollectionDidChange has been deprecated since iOS 17. However, on iOS 18 I have confirmed that the method is still called when the app moves to the background or comes back to the foreground. I couldn't confirm this on iOS 17, so why does it happen only on iOS 18?
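Editor's note: for reference, a minimal sketch of the iOS 17+ replacement, registering only for the traits of interest (the class name is a placeholder); this does not explain the behavior difference above, but new code is expected to use the registration API instead of the deprecated override.

```swift
import UIKit

// Minimal sketch of the iOS 17+ trait-change registration API.
class MyViewController: UIViewController {          // placeholder class name
    override func viewDidLoad() {
        super.viewDidLoad()
        registerForTraitChanges([UITraitUserInterfaceStyle.self]) {
            (self: Self, previousTraitCollection: UITraitCollection) in
            // React to light/dark mode changes here.
            print("User interface style changed")
        }
    }
}
```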
0 replies · 0 boosts · 602 views · Dec ’24
Can't use Link in .systemLarge widget
I just added a .systemLarge widget to my app, but I can't get Links to work. I want the user to be able to tap one of the four rows in my widget - like the EmojiRangers example - but I can't get it to work. I watched a Developer video from WWDC20: https://developer.apple.com/videos/play/wwdc2020/10036?time=223 The guy, Izzy, 'simply' embeds an HStack in a Link, and hey presto! It all works. But that doesn't happen for me. There's clearly some code in the background that runs. I already have .widgetURL working for .systemSmall and .systemMedium widgets, and I don't need to use Links on those two types. Those work by sending a URL to .onOpenURL { incomingURL in ... All good there, no issues. I've wrapped each row in the large widget in a Link with the URL of something like myappurlscheme://widgetTapped/widgetId (it's the same url as that used in the small and medium widgets). I build & run. I tap a row. It doesn't act as though a row is tappable (it doesn't go slightly transparent), and just opens the app without hitting .onOpenURL or anything else. Nothing in my scene delegate is triggered. Is there a specific delegate method that gets called? Do I need to set up some awful intents? I'm not using any sort of NavigationStack here; that model doesn't fit my app. Any ideas? Thanks.
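Editor's note: for comparison, a minimal sketch (my own, with a hypothetical URL scheme and placeholder row data, not the poster's code) of the Link-per-row pattern in a large widget. Each row gets its own destination URL, and the URL should arrive through the same onOpenURL / scene-delegate path that widgetURL uses.

```swift
import SwiftUI
import WidgetKit

// Minimal sketch: one Link per row in a .systemLarge widget.
struct LargeWidgetRowsView: View {
    let rowIDs = ["1", "2", "3", "4"]               // placeholder row identifiers

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            ForEach(rowIDs, id: \.self) { id in
                Link(destination: URL(string: "myappurlscheme://widgetTapped/\(id)")!) {
                    HStack {
                        Text("Row \(id)")
                        Spacer()
                    }
                }
            }
        }
        .containerBackground(.fill.tertiary, for: .widget)   // required on iOS 17+
    }
}
```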
1 reply · 0 boosts · 513 views · Dec ’24