Post · Replies · Boosts · Views · Activity

Need clarifications regarding some properties of NEPacketTunnelNetworkSettings
Hello, I am working on a NEPacketTunnelProvider and I am not sure I correctly understand NEPacketTunnelNetworkSettings. For example, the docs for ipv4Settings say: "This property contains the IPv4 routes specifying what IPv4 traffic to route to the tunnel, as well as the IPv4 address and netmask to assign to the TUN interface." This reads as if the tunnel should not carry all traffic unless I set all the possible routes here. Currently I have this:

```swift
let ipv4Settings: NEIPv4Settings = NEIPv4Settings(
    addresses: ["192.169.89.1"],
    subnetMasks: ["255.255.255.255"]
)
```

which seems to work pretty well, both on WiFi and cellular. In the past I tried various other addresses, even manually including all the IPv4 routes, but I never noticed any effect on the tunnel.

Then there is the includedRoutes property: "The routes that specify what IPv4 network traffic will be routed to the TUN interface." So is this basically another way to set the address, like in the NEIPv4Settings initializer? It seems to work best when I don't set it at all. I tried setting all the routes, but that didn't change anything. The only difference is when I set includedRoutes to NEIPv4Route.default(); then some apps stop working while the tunnel is active. This is strange, because even setting all the available routes plus the default one doesn't fix this "issue". What is the relation between these properties? Is it best not to set includedRoutes if the tunnel works fine?

And lastly, what about dnsSettings? This looks like another optional property. Does it make sense to manually point DNS to, say, 1.1.1.1?
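For context, this is a minimal sketch of how I understand these pieces fitting together in startTunnel. The remote address, tunnel address and DNS server here are placeholder values for illustration only, not my real configuration:

```swift
import NetworkExtension

class PacketTunnelProvider: NEPacketTunnelProvider {
    override func startTunnel(options: [String: NSObject]?, completionHandler: @escaping (Error?) -> Void) {
        // "10.0.0.8", "192.169.89.1" and "1.1.1.1" are placeholders for illustration.
        let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "10.0.0.8")

        let ipv4 = NEIPv4Settings(addresses: ["192.169.89.1"], subnetMasks: ["255.255.255.255"])
        // Only traffic matching includedRoutes (minus excludedRoutes) is delivered to the TUN interface.
        ipv4.includedRoutes = [NEIPv4Route.default()]
        settings.ipv4Settings = ipv4

        // Optional: resolvers used for DNS queries from traffic that goes through the tunnel.
        settings.dnsSettings = NEDNSSettings(servers: ["1.1.1.1"])

        setTunnelNetworkSettings(settings) { error in
            completionHandler(error)
        }
    }
}
```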
3
0
643
Apr ’21
Trying to implement Siri Shortcuts with Intents.. Need some clarification
Hello, I am working on a small app to easily create time-relative reminders, meaning I can quickly create a reminder that will remind me about something, say, 45 minutes from now. I want to add configurable shortcuts so users can use this app via Siri and the Shortcuts app. I have created the Intents.intentdefinition file and its (so far single) shortcut correctly shows up in the Shortcuts app with its parameters. But I am unsure how I should handle it now. Some tutorials and docs mention the delegate methods in SceneDelegate, while others point to an Intents Extension that is supposed to handle the shortcut:

```swift
override func restoreUserActivityState(_ activity: NSUserActivity) {
}
```

The shortcut I want does not need more user interaction than providing the two input parameters, so the simplest way to handle it would be nice. Ideally the work could happen in the background without opening my app, because after the reminder is added there is nothing to do in my app. Is that possible?
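This is the kind of Intents Extension handler I have been experimenting with. The CreateReminderIntent names below are just stand-ins for whatever Xcode generates from my intent definition, so treat them as placeholders:

```swift
import Intents

// Principal class of the Intents Extension target.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        // CreateReminderIntent is the hypothetical class generated from Intents.intentdefinition.
        if intent is CreateReminderIntent {
            return CreateReminderIntentHandler()
        }
        return self
    }
}

class CreateReminderIntentHandler: NSObject, CreateReminderIntentHandling {
    // The generated protocol also declares resolve methods for each parameter; omitted here for brevity.
    func handle(intent: CreateReminderIntent,
                completion: @escaping (CreateReminderIntentResponse) -> Void) {
        // Create the reminder here, entirely in the background, then report success.
        completion(CreateReminderIntentResponse(code: .success, userActivity: nil))
    }
}
```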
0
0
621
Apr ’21
How to re-enable disabled VPN configuration?
Hello, I noticed an issue with my VPN configuration created with NETunnelProviderManager. When the user has multiple apps with VPN configurations and another app's configuration is active, I cannot activate my network extension. I am getting this error: NEVPNError.Code.configurationDisabled (in Objective-C it's NEVPNErrorConfigurationDisabled): "An error code indicating the VPN configuration associated with the VPN manager isn't enabled." So when this happens, I need to open Settings -> VPN and select my app's profile. How can I do this programmatically? Other apps are able to re-enable their VPN profile if another one was selected.
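For reference, what I am currently trying is roughly this (a sketch, assuming the configuration was already saved once by my app):

```swift
import NetworkExtension

// A sketch: re-enable this app's configuration before starting the tunnel.
NETunnelProviderManager.loadAllFromPreferences { managers, error in
    guard error == nil, let manager = managers?.first else { return }

    manager.isEnabled = true  // marking it enabled should make it the selected configuration
    manager.saveToPreferences { saveError in
        if let saveError = saveError {
            print("Failed to re-enable configuration: \(saveError)")
            return
        }
        do {
            try manager.connection.startVPNTunnel()
        } catch {
            print("Failed to start tunnel: \(error)")
        }
    }
}
```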
7
0
3.1k
Aug ’22
Issues with saving `AVAudioFile` - playable, no sound
Hello, I am trying to use AVAudioFile to save an audio buffer to a .wav file. The buffer is of type [Float]. Currently I am able to successfully create the .wav files and even play them, but they are silent - I cannot hear any sound.

```swift
private func saveAudioFile(using buffer: [Float]) {
    let fileUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        .appendingPathComponent("\(UUID().uuidString).wav")

    let fileSettings = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 15600,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]

    guard let file = try? AVAudioFile(forWriting: fileUrl, settings: fileSettings, commonFormat: .pcmFormatInt16, interleaved: true) else {
        print("Cannot create AudioFile")
        return
    }

    guard let bufferFormat = AVAudioFormat(settings: fileSettings) else {
        print("Cannot create buffer format")
        return
    }

    guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: bufferFormat, frameCapacity: AVAudioFrameCount(buffer.count)) else {
        print("Cannot create output buffer")
        return
    }

    for i in 0..<buffer.count {
        outputBuffer.int16ChannelData!.pointee[i] = Int16(buffer[i])
    }
    outputBuffer.frameLength = AVAudioFrameCount(buffer.count)

    do {
        try file.write(from: outputBuffer)
    } catch {
        print(error.localizedDescription)
        print("Write to file failed")
    }
}
```

Where should I be looking first for the problem? Is it a format issue? I am getting the data from the microphone with AVAudioEngine. Its format is created like this:

```swift
let outputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: Double(15600), channels: 1, interleaved: true)!
```

And here is the installTap implementation with the buffer callback:

```swift
input.installTap(onBus: 0, bufferSize: AVAudioFrameCount(sampleRate * 2), format: inputFormat) { (incomingBuffer, time) in
    DispatchQueue.global(qos: .background).async {
        let pcmBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: AVAudioFrameCount(outputFormat.sampleRate * 2.0))
        var error: NSError? = nil
        let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            outStatus.pointee = AVAudioConverterInputStatus.haveData
            return incomingBuffer
        }
        formatConverter.convert(to: pcmBuffer!, error: &error, withInputFrom: inputBlock)
        if error != nil {
            print(error!.localizedDescription)
        } else if let pcmBuffer = pcmBuffer, let channelData = pcmBuffer.int16ChannelData {
            let channelDataPointer = channelData.pointee
            self.buffer = stride(from: 0, to: self.windowLengthSamples, by: 1).map {
                Float(channelDataPointer[$0]) / 32768.0
            }
            onBufferUpdated(self.buffer)
        }
    }
}
```

The onBufferUpdated block provides the [Float] for the saveAudioFile method above. I have tried some experiments with different output formats, but that ended up with unplayable audio files.
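To make the data flow explicit, this is the value-range round trip between the tap and saveAudioFile as I understand it (a sketch for illustration only; the scaling back up to the Int16 range is an assumption on my part, my current code passes the floats to Int16(_:) directly):

```swift
// Tap side: Int16 microphone samples normalized into the [-1.0, 1.0] range.
let int16Samples: [Int16] = [0, 1024, -2048, 32767]  // example values
let normalized: [Float] = int16Samples.map { Float($0) / 32768.0 }

// Write side (assumption): mapping normalized floats back into the Int16 range.
// Int16(_:) applied directly to a value in (-1, 1) truncates to 0.
let restored: [Int16] = normalized.map { Int16((max(-1.0, min(1.0, $0)) * 32767.0).rounded()) }
```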
1
0
1.9k
Jan ’22
Trouble understanding the API limits for internal use
Hello, with the new App Store Connect API I wanted to prototype a couple of ideas, but when I went to get an API key, this dialog made me question whether they are even allowed: So this sounds like we are not supposed to create apps or web apps that would help other developers with App Store Connect tasks? For example, if I wanted to create a web app that lets people manage their TestFlight, would that be against the rules? Because it would presumably involve them getting an API key which my web app would use to talk to the ASC API. On the other hand, there are services like RevenueCat, Bitrise and similar that presumably "access ASC on behalf of their users"? I would really appreciate it if someone could explain this to me.
0
0
631
Jul ’22
How to shield categories with the ManagedSettingsStore?
Hello, I am not quite sure how the shielding of entire categories of apps is supposed to work. The FamilyActivitySelection contains tokens for apps, websites and categories, but the shield property of ManagedSettingsStore only has the applications and webDomains attributes where I can configure the tokens from the family activity selection:

```swift
shield.applications = selection.applicationTokens
shield.webDomains = selection.webDomainTokens
```

I would expect there to be a categories property that takes a Set<ActivityCategoryToken> and shields the apps in those categories based on it.
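The closest thing I can find is the pair of category policies on the shield. This is a sketch of what I am currently trying, assuming I am reading the ActivityCategoryPolicy API correctly:

```swift
import ManagedSettings
import FamilyControls

let store = ManagedSettingsStore()
let selection = FamilyActivitySelection()  // in practice filled in via FamilyActivityPicker

// Individual app / website tokens from the selection.
store.shield.applications = selection.applicationTokens
store.shield.webDomains = selection.webDomainTokens

// Category tokens seem to go through the category policies rather than a plain Set.
store.shield.applicationCategories = .specific(selection.categoryTokens)
store.shield.webDomainCategories = .specific(selection.categoryTokens)
```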
1
0
1.4k
Jul ’22
`NEPacketTunnelProvider` configuration rarely gets duplicated
Hello, with iOS 16 (multiple betas) I noticed that our VPN configuration created with NEPacketTunnelProvider appears twice in Settings -> General -> VPN & Device Management. I thought this shouldn't be possible (even if I wanted it to happen), because on iOS an app can provide just one configuration? All the basic configuration for our VPN is static: providerBundleIdentifier and serverAddress are constants in the source code. The only thing that changes is onDemandRules. When I inspected the configuration details in Settings, the two entries were identical.
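For context, the configuration is saved with the usual load-then-save pattern, roughly like this (a simplified sketch with placeholder values, not the exact production code):

```swift
import NetworkExtension

// Simplified sketch of how the configuration gets (re)saved.
NETunnelProviderManager.loadAllFromPreferences { managers, error in
    guard error == nil else { return }

    // Reuse the existing configuration if there is one, so only a single entry should ever exist.
    let manager = managers?.first ?? NETunnelProviderManager()

    let proto = NETunnelProviderProtocol()
    proto.providerBundleIdentifier = "com.example.app.tunnel"  // placeholder bundle id
    proto.serverAddress = "vpn.example.com"                    // placeholder address
    manager.protocolConfiguration = proto
    manager.onDemandRules = [NEOnDemandRuleConnect()]          // the only part that varies
    manager.isEnabled = true

    manager.saveToPreferences { saveError in
        if let saveError = saveError {
            print("Save failed: \(saveError)")
        }
    }
}
```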
2
0
718
Sep ’22
Cannot submit in-app with new build: Unable to get the in-app approved
Hello, I have added a new non-consumable in-app purchase to my existing app. Initially I messed up and sent it to review by itself (since in-app purchases and regular submissions seem to be handled by different teams). It got rejected, and the review notes said the review couldn't be done because they couldn't test it in the app. Makes sense. So I submitted a new build to App Review and then submitted the in-app purchase again. Same rejection reason: the build got approved, the in-app got rejected. After my newest build was approved, I added a note in the review notes that the in-app purchase is available in the approved build. But after almost 48 hours "In Review" I got another rejection: "We have returned your in-app purchase products to you as the required binary was not submitted. When you are ready to submit the binary, please resubmit the in-app purchase products with the binary." I have no idea how to submit "in-app purchase products with the binary". When preparing a new version for review, there isn't any option to add the in-app purchase to the submission. Please help :/
1
0
1k
Feb ’23
Apps getting stuck in "In Review"?
Hello, is it just me or do apps lately seem to spend quite a long time in the "In Review" state? One of my work apps has been "In Review" since April 20th, which is already 9 days. I recently submitted a minor update for my hobby app and it has been "In Review" for over 24 hours, while in the past it took just a couple of hours or even less to be approved. I understand that "Waiting for Review" can take time if there is a long queue of apps, but "In Review" this long? Surely the reviewer is not testing my app for 24 hours straight? Did they forget? Are they waiting for someone else to take a look? It just seems quite strange.
0
0
667
Apr ’23
Proper way to import screen recordings with SwiftUI PhotosPicker?
Hello, I am building a contact form that allows attaching screenshots and screen recordings. The PhotosPicker part is relatively straightforward, but I am not sure how to properly import the selected items. The binding is of type [PhotosPickerItem], which (at least for my current implementation) requires knowing first whether the item is an image or a video. I have this not-so-pretty code to detect whether the item is a video:

```swift
let isVideo = item.supportedContentTypes.first(where: { $0.conforms(to: .video) }) != nil || item.supportedContentTypes.contains(.mpeg4Movie)
```

which for screen recordings seems to work only because I ask about .mpeg4Movie. And then I have this struct:

```swift
struct ScreenRecording: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .mpeg4Movie) { video in
            SentTransferredFile(video.url)
        } importing: { received in
            let copy = URL.temporaryDirectory.appending(path: "\(UUID().uuidString).mp4")
            try FileManager.default.copyItem(at: received.file, to: copy)
            return Self.init(url: copy)
        }
    }
}
```

Notice that here I have just the .mpeg4Movie content type. I couldn't get it to work with more generic ones like .movie, and I am afraid this implementation could soon break if screen recordings change their video format/codec. And finally my logic to load the item:

```swift
if isVideo {
    if let movie = try? await item.loadTransferable(type: ScreenRecording.self) {
        viewModel.addVideoAttachment(movie)
    }
} else {
    if let data = try? await item.loadTransferable(type: Data.self) {
        if let uiImage = UIImage(data: data) {
            viewModel.addScreenshotAttachment(uiImage)
        }
    }
}
```

I would like to make this more "future proof" and less error prone, particularly the screen recordings part. I don't even need the UIImage, since I am saving the attachments as files; I just need to know whether the attachment is a screenshot or a video and get its URL. Thanks!
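For the detection part, the more generic check I have been considering looks like this (just a sketch; I am assuming that conformance to the generic .movie type covers whatever codec future screen recordings use):

```swift
import PhotosUI
import UniformTypeIdentifiers

// A sketch of a codec-agnostic check: treat the item as a video if any of its
// content types conforms to the generic `movie` type (covers .mpeg4Movie, .quickTimeMovie, ...).
func isVideo(_ item: PhotosPickerItem) -> Bool {
    item.supportedContentTypes.contains { $0.conforms(to: .movie) }
}
```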
0
0
901
Aug ’23
Stability issues with ManagedSettings on iOS 16.6? Lots of new crashes
Hello, I am curious whether someone else has also noticed this. We have started getting reports about our app not working correctly (mostly related to the Device Activity monitor that "runs" in the background, https://developer.apple.com/documentation/deviceactivity/deviceactivitymonitor). Upon checking the Xcode Organizer I can see a somewhat significant number of crashes that mostly appear to happen on iOS 16.6 - it is possible that our users simply update iOS quickly, but it still looks suspicious. The crashes are related to ManagedSettings calls outside our own code. We haven't changed this code in a while, so this, coupled with the fact that these crashes happen "deep" in the ManagedSettings framework, leads me to believe there is some other issue.
3
0
1.1k
Sep ’23
Correct Collection View "stretchy header" implementation?
Hello, I have the following subclass of UICollectionViewCompositionalLayout to get the stretchy header effect as shown below. It works quite well, but I don't really have experience with creating custom layouts, so I thought I'd ask whether my implementation has some important flaws. I once ran into a persistent layout-loop crash with this; I am not sure what exactly I changed, but it stopped happening. However, since I am using this layout on an important screen, I would like to make sure there isn't obvious potential for the layout-loop crash happening in the App Store version. I am particularly unsure about the shouldInvalidateLayout implementation. Originally I was returning true all the time, but I decided to change it and only force invalidation for negative content offset, which is when my header is supposed to stretch. Here is the full code:

```swift
final class StretchyCompositionalLayout: UICollectionViewCompositionalLayout {
    override func layoutAttributesForElements(in rect: CGRect) -> [UICollectionViewLayoutAttributes]? {
        var attrs = super.layoutAttributesForElements(in: rect) ?? []
        guard let collectionView = collectionView else { return attrs }

        let contentOffset = collectionView.contentOffset.y
        guard contentOffset < 0 else { return attrs }

        var newAttributes: UICollectionViewLayoutAttributes?
        attrs.forEach({ attribute in
            if attribute.indexPath.section == 0 && attribute.indexPath.item == 0 {
                let startFrame = attribute.frame
                newAttributes = attribute.copy() as? UICollectionViewLayoutAttributes
                let newFrame: CGRect = .init(x: 0, y: contentOffset, width: startFrame.width, height: startFrame.height - contentOffset)
                newAttributes?.frame = newFrame
            }
        })

        if let new = newAttributes {
            attrs.removeAll { attr in
                return attr.indexPath.section == 0 && attr.indexPath.item == 0
            }
            attrs.insert(new, at: 0)
        }

        return attrs
    }

    override func layoutAttributesForItem(at indexPath: IndexPath) -> UICollectionViewLayoutAttributes? {
        guard let attributes = super.layoutAttributesForItem(at: indexPath) else { return nil }

        let contentOffset = collectionView?.contentOffset.y ?? 1
        guard contentOffset < 0 else { return attributes }

        if indexPath.section == 0 && indexPath.item == 0 {
            let attributes = attributes.copy() as? UICollectionViewLayoutAttributes ?? attributes
            let startFrame = attributes.frame
            let newFrame: CGRect = .init(x: 0, y: contentOffset, width: startFrame.width, height: startFrame.height - contentOffset)
            attributes.frame = newFrame
            return attributes
        } else {
            return super.layoutAttributesForItem(at: indexPath)
        }
    }

    override func shouldInvalidateLayout(forBoundsChange newBounds: CGRect) -> Bool {
        let contentOffset = collectionView?.contentOffset.y ?? 1
        // There is a visual glitch when 0 is used in this condition
        if contentOffset < 1 {
            return true
        } else {
            return super.shouldInvalidateLayout(forBoundsChange: newBounds)
        }
    }
}
```

Any feedback welcome!
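In case it matters, the layout is created and attached in the usual way, roughly like this (a simplified sketch; the single estimated-height section here is just a placeholder, not my real section definition):

```swift
import UIKit

// Hypothetical, simplified section just to show how the subclass is used.
let item = NSCollectionLayoutItem(layoutSize: .init(widthDimension: .fractionalWidth(1.0), heightDimension: .estimated(300)))
let group = NSCollectionLayoutGroup.vertical(layoutSize: .init(widthDimension: .fractionalWidth(1.0), heightDimension: .estimated(300)), subitems: [item])
let section = NSCollectionLayoutSection(group: group)

let layout = StretchyCompositionalLayout(section: section)
let collectionView = UICollectionView(frame: .zero, collectionViewLayout: layout)
```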
0
0
534
Jul ’24