Hello, I am working on a NEPacketTunnelProvider and I am not exactly sure if I correctly understand the NEPacketTunnelNetworkSettings.
For example the ipv4Settings according to docs:
This property contains the IPv4 routes specifying what IPv4 traffic to route to the tunnel, as well as the IPv4 address and netmask to assign to the TUN interface.
So it seems that unless I set all possible routes here, the tunnel won't handle all traffic?
Currently I have this:
let ipv4Settings = NEIPv4Settings(
    addresses: ["192.169.89.1"],
    subnetMasks: ["255.255.255.255"]
)
Which seems to work pretty well, both on WiFi and cellular. In the past I tried various other addresses, and even manually included all the IPv4 routes, but I never noticed any effect on the tunnel.
Then there is the includedRoutes property.
The routes that specify what IPv4 network traffic will be routed to the TUN interface.
So is this basically another way to set the address, like in the NEIPv4Settings initializer?
This seems to work best when I don't set anything. I tried setting all the routes, but that didn't change anything. The only difference is when I set includedRoutes to NEIPv4Route.default(); then some apps stop working while the tunnel is active.
This is strange, because even setting all the available routes plus the default one doesn't fix this "issue".
What is the relation between these properties? Is it best not to set includedRoutes if the tunnel works fine?
And lastly, what about dnsSettings? It looks like another optional property. Does it make sense to manually point DNS to, say, 1.1.1.1?
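For reference, this is how I imagine a "route everything" setup would look, based on my reading of the docs; the remote address is a placeholder and I am not sure about the DNS part:

let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "203.0.113.1") // placeholder
let ipv4 = NEIPv4Settings(addresses: ["192.169.89.1"], subnetMasks: ["255.255.255.255"])
ipv4.includedRoutes = [NEIPv4Route.default()] // send all IPv4 traffic to the TUN interface
settings.ipv4Settings = ipv4
let dns = NEDNSSettings(servers: ["1.1.1.1"])
dns.matchDomains = [""] // an empty string makes these servers resolve all domains
settings.dnsSettings = dns
setTunnelNetworkSettings(settings) { error in
    // handle the error, then start reading from packetFlow
}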
Hello,
I am working on a small app to easily create time-relative reminders, meaning I can quickly create a reminder that will remind me about something 45 minutes from now.
I want to add configurable shortcuts, so users can use this app via Siri and the Shortcuts app.
I have created the Intents.intentdefinition file, and its (so far only) shortcut correctly displays in the Shortcuts app with the parameters.
But I am unsure how I should handle it now. Some tutorials and docs mention the delegate methods in SceneDelegate, while others point to an Intents extension which is supposed to handle the shortcut?
override func restoreUserActivityState(_ activity: NSUserActivity) {
}
The shortcut I want does not need more user interaction than providing the two input parameters.
So the simplest way to handle that would be nice.
Ideally, the work would happen in the background without opening my app, because after the reminder is added there is nothing left to do in the app. Is that possible?
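From what I gathered, an Intents extension might allow exactly that; here is a minimal sketch of what I mean, assuming the class generated from my intent definition is called CreateReminderIntent (names come from my setup, so treat them as placeholders):

import Intents

class IntentHandler: INExtension, CreateReminderIntentHandling {
    // (plus the generated resolve methods for the two parameters)
    func handle(intent: CreateReminderIntent,
                completion: @escaping (CreateReminderIntentResponse) -> Void) {
        // Create the reminder here, then report success without opening the app.
        completion(CreateReminderIntentResponse(code: .success, userActivity: nil))
    }
}

Is this the right direction, or should it go through SceneDelegate after all?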
Hello,
I am working on a NetworkExtension that uses NEPacketTunnelProvider.
Is there any option to somehow see all the traffic that gets sent to my extension?
Some apps have issues working over my tunnel, so I want to check the connections and see what is happening.
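So far the only idea I have is logging inside the provider's read loop; a rough sketch (note this consumes the packets, so in practice it would have to be merged into the existing forwarding code):

func logPackets() {
    packetFlow.readPackets { packets, protocols in
        for (packet, proto) in zip(packets, protocols) {
            NSLog("packet: \(packet.count) bytes, protocol family \(proto)")
        }
        self.logPackets() // keep reading
    }
}

Is there a better tool for this, or is manual logging the way to go?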
Hello,
in my use case I want to use the matchDomains property, so my VPN (NEPacketTunnelProvider) handles traffic for just some apps and not everything that happens on the device.
But setting matchDomains to anything other than [""] doesn't seem to work properly.
It works for websites in Safari but in my testing not for other apps. Let's use Instagram as an example:
let proxySettings = NEProxySettings()
proxySettings.matchDomains = ["instagram.com"]
With these settings, using the Instagram app doesn't send traffic to my VPN. However, if I set something like "theverge.com" as the domain, that traffic does get sent to my app.
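For completeness, the rest of the proxy configuration looks roughly like this (the local proxy address and port are placeholders):

proxySettings.httpEnabled = true
proxySettings.httpServer = NEProxyServer(address: "127.0.0.1", port: 8080)
proxySettings.httpsEnabled = true
proxySettings.httpsServer = NEProxyServer(address: "127.0.0.1", port: 8080)
settings.proxySettings = proxySettings // settings is my NEPacketTunnelNetworkSettings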
According to the docs, matchDomains uses suffix matching:
If the destination host name of a HTTP connection shares a suffix with one of these strings then the proxy settings will be used for the HTTP connection. Otherwise the proxy settings will not be used.
I also tried wildcards like *.instagram.com, without much luck.
How would I go about this? Are there any internal limits on how many domains I can match like this?
Thanks
Hello,
I am using apply(_:completionHandler:) on NEHotspotConfigurationManager.shared to prompt the user to join a particular WiFi network.
In my testing it works all the time, but via Sentry I see a lot of errors from the NEHotspotConfigurationErrorDomain domain.
In particular I am getting: NEHotspotConfigurationErrorInternal and NEHotspotConfigurationErrorUnknown.
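The call itself is straightforward, roughly this (SSID and passphrase are placeholders):

let configuration = NEHotspotConfiguration(ssid: "switch_XXXXXX", passphrase: "password", isWEP: false)
configuration.joinOnce = true
NEHotspotConfigurationManager.shared.apply(configuration) { error in
    if let error = error {
        // These are the internal/unknown errors that end up in Sentry.
        NSLog("hotspot apply failed: \(error.localizedDescription)")
    }
}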
It doesn't appear to be connected to a particular iOS version or device.
The WiFi network should be pretty much the same too - this is an app that connects to the Nintendo Switch game console.
From some anecdotal reports it looks like connecting via the Camera app (by scanning a QR code) does work for the WiFi connection.
Is it possible to debug this further?
Hello,
I am trying to use AVAudioFile to save an audio buffer to a .wav file. The buffer is of type [Float].
Currently I am able to successfully create the .wav files and even play them, but they are blank - I cannot hear any sound.
private func saveAudioFile(using buffer: [Float]) {
    let fileUrl = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask).first!
        .appendingPathComponent("\(UUID().uuidString).wav")
    let fileSettings = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVSampleRateKey: 15600,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    guard let file = try? AVAudioFile(forWriting: fileUrl, settings: fileSettings, commonFormat: .pcmFormatInt16, interleaved: true) else {
        print("Cannot create AudioFile")
        return
    }
    guard let bufferFormat = AVAudioFormat(settings: fileSettings) else {
        print("Cannot create buffer format")
        return
    }
    guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: bufferFormat, frameCapacity: AVAudioFrameCount(buffer.count)) else {
        print("Cannot create output buffer")
        return
    }
    // Copy the samples into the Int16 buffer.
    for i in 0..<buffer.count {
        outputBuffer.int16ChannelData!.pointee[i] = Int16(buffer[i])
    }
    outputBuffer.frameLength = AVAudioFrameCount(buffer.count)
    do {
        try file.write(from: outputBuffer)
    } catch {
        print(error.localizedDescription)
        print("Write to file failed")
    }
}
Where should I be looking first for the problem? Is it a format issue?
I am getting the data from the microphone with the AVAudioEngine.
Its format is created like this:
let outputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: Double(15600), channels: 1, interleaved: true)!
And here is the installTap implementation with the buffer callback:
input.installTap(onBus: 0, bufferSize: AVAudioFrameCount(sampleRate * 2), format: inputFormat) { (incomingBuffer, time) in
    DispatchQueue.global(qos: .background).async {
        let pcmBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: AVAudioFrameCount(outputFormat.sampleRate * 2.0))
        var error: NSError? = nil
        let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            outStatus.pointee = AVAudioConverterInputStatus.haveData
            return incomingBuffer
        }
        formatConverter.convert(to: pcmBuffer!, error: &error, withInputFrom: inputBlock)
        if error != nil {
            print(error!.localizedDescription)
        } else if let pcmBuffer = pcmBuffer, let channelData = pcmBuffer.int16ChannelData {
            // Normalize the Int16 samples to -1.0...1.0 floats.
            let channelDataPointer = channelData.pointee
            self.buffer = stride(from: 0, to: self.windowLengthSamples, by: 1).map { Float(channelDataPointer[$0]) / 32768.0 }
            onBufferUpdated(self.buffer)
        }
    }
}
The onBufferUpdated block is what provides the [Float] for the saveAudioFile method above.
I have tried some experiments with different output formats, but those ended up as unplayable audio files.
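One thing I started to suspect while writing this: the tap divides every sample by 32768.0, so saveAudioFile receives floats in the -1.0...1.0 range, and Int16(buffer[i]) then truncates almost every sample to zero. Would scaling back up be the right fix? A sketch of what I mean:

// Scale the normalized floats back to the Int16 range before writing.
for i in 0..<buffer.count {
    let clamped = max(-1.0, min(1.0, buffer[i]))
    outputBuffer.int16ChannelData!.pointee[i] = Int16(clamped * Float(Int16.max))
}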
Hello,
I am unable to use UIApplication.shared.open to open a URL to my app's iCloud Drive folder, which I use to export files.
I am using this method to get the URL:
url(forUbiquityContainerIdentifier: nil)?.appendingPathComponent("Documents")
But when testing this URL with UIApplication.shared.canOpenURL I am getting back false.
-canOpenURL: failed for URL: "file:///private/var/mobile/Library/Mobile%20Documents/iCloud~MyApp/Documents/" - error: "The operation couldn’t be completed. (OSStatus error -10814.)"
Since my app's local Documents folder can be opened by prefixing the URL with shareddocuments://, I tried the same for the iCloud Drive URL, but that didn't work either.
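For reference, this is roughly what works for the local folder, with just the scheme swapped in:

let documentsUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
if var components = URLComponents(url: documentsUrl, resolvingAgainstBaseURL: false) {
    components.scheme = "shareddocuments" // opens the folder in the Files app
    if let url = components.url {
        UIApplication.shared.open(url)
    }
}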
Is this even possible?
Hello,
I have successfully implemented an NEPacketTunnelProvider network extension in an iOS app, and it works fine most of the time.
By working fine I mean it starts, stops (it is configured to disconnect on sleep) and handles network traffic as expected.
However, I have a few reports that sometimes it doesn't start correctly; it hangs on "Connecting..." in Settings -> VPN.
As far as I can tell, it stays stuck even after waiting for minutes.
Re-installing either the VPN provider extension or the entire app fixes this problem.
What could be causing such random and very rare issues? It doesn't seem to be tied to a single iOS version, for example.
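The only pattern I know of that produces a permanent "Connecting..." is a startTunnel path that never calls its completion handler, so that is what I am double-checking first (setupTunnel here stands in for my actual setup code):

override func startTunnel(options: [String: NSObject]?, completionHandler: @escaping (Error?) -> Void) {
    setupTunnel { result in
        switch result {
        case .success:
            completionHandler(nil)
        case .failure(let error):
            completionHandler(error) // every path must call the handler
        }
    }
}

Could there be a system-side cause that no amount of provider code would fix?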
Is there a way to get all instances of ManagedSettingsStore that my app has previously configured?
I managed to get into a situation where I removed the identifier in my database but forgot to clear the store, and now the app is shielding a couple of apps and I am unable to stop it.
What is the proper workflow for this?
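Right now the only workaround I can think of is clearing the stores whose names I can still reconstruct; a sketch (the store name is hypothetical, and I am not certain about the Name initializer):

import ManagedSettings

let store = ManagedSettingsStore(named: ManagedSettingsStore.Name("shield-profile-1"))
store.clearAllSettings() // removes the shields configured in this store

But that obviously doesn't help for names I no longer have.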
Hello,
with the new App Store Connect API I wanted to prototype a couple of ideas, but when I wanted to get an API key, this dialog made me question whether that is allowed:
So it sounds like we are not supposed to create apps or web apps that would help other developers with App Store Connect tasks?
For example, if I wanted to create a web app that lets people manage their TestFlight, would that be against the rules? Because it would presumably involve them providing an API key which my web app would use to talk to the ASC API.
On the other hand, there are services like RevenueCat, Bitrise and similar that presumably "access ASC on behalf of their users"?
I would really appreciate if someone can explain this to me.
Hello,
I am not quite sure how the shielding of entire categories of apps is supposed to work. The FamilyActivitySelection contains tokens for apps, websites and categories.
But the shield property of ManagedSettingsStore only has the applications and webDomains attributes where I can configure the tokens from the family activity selection.
shield.applications = selection.applicationTokens
shield.webDomains = selection.webDomainTokens
I would expect there to be a categories property that accepts a Set<ActivityCategoryToken> and, based on it, shields the apps in those categories.
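While writing this I noticed shield also has applicationCategories and webDomainCategories; is something like this the intended way? I am guessing at the exact signature here:

shield.applicationCategories = .specific(selection.categoryTokens)
shield.webDomainCategories = .specific(selection.categoryTokens)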
Hello,
I noticed an issue with my VPN configuration created with NETunnelProviderManager. When the user has multiple apps with VPN configurations and another app's configuration is the active one, I cannot activate my network extension.
I am getting this error: NEVPNError.Code.configurationDisabled
For ObjC it's NEVPNErrorConfigurationDisabled
An error code indicating the VPN configuration associated with the VPN manager isn’t enabled.
So when this happens, I need to open Settings -> VPN and select my app's profile.
How can I do this programmatically? Other apps are able to re-enable their VPN profile if another one was selected.
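Is it something like this? I am just guessing at the flow here:

NETunnelProviderManager.loadAllFromPreferences { managers, error in
    guard let manager = managers?.first else { return }
    manager.isEnabled = true // re-select this profile as the active one
    manager.saveToPreferences { error in
        guard error == nil else { return }
        try? manager.connection.startVPNTunnel()
    }
}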
Hello,
with iOS 16 (multiple betas), I noticed that our VPN configuration created with NEPacketTunnelProvider appears twice in Settings -> General -> VPN & Device Management.
I thought this shouldn't be possible (even if I wanted it to happen), because on iOS an app can provide just one configuration?
All the basic configuration for our VPN is static; providerBundleIdentifier and serverAddress are constants in the source code.
The only thing that gets changed is onDemandRules.
When I inspected the configuration details in Settings, they were identical.
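For completeness, the configuration is saved roughly like this; I load the existing managers first precisely so a duplicate should never be created (identifiers are placeholders):

NETunnelProviderManager.loadAllFromPreferences { managers, error in
    // Reuse the existing manager, if any, instead of creating a new one.
    let manager = managers?.first ?? NETunnelProviderManager()
    let proto = NETunnelProviderProtocol()
    proto.providerBundleIdentifier = "com.example.app.tunnel" // placeholder
    proto.serverAddress = "vpn.example.com"                   // placeholder
    manager.protocolConfiguration = proto
    manager.onDemandRules = onDemandRules // built elsewhere
    manager.saveToPreferences { _ in }
}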
Hello,
I am trying to play around with the Live Text API according to these docs - https://developer.apple.com/documentation/visionkit/enabling_live_text_interactions_with_images?changes=latest_minor
But it always fails with [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
I am running this on a UIImage instance that I got from VNDocumentCameraViewController.
This is my current implementation that I run after the scanned image is displayed:
private func setupLiveText() {
    guard let image = imageView.image else {
        return
    }
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)
    Task {
        let configuration = ImageAnalyzer.Configuration([.text])
        let analyzer = ImageAnalyzer()
        do {
            let analysis = try await analyzer.analyze(image, configuration: configuration)
            DispatchQueue.main.async {
                interaction.analysis = analysis
            }
        } catch {
            print(error.localizedDescription)
        }
    }
}
The analyze call itself does not fail and returns a non-nil analysis object, but setting it on the interaction does nothing.
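One thing I was not sure about: do the interaction types need to be set explicitly? E.g. something like this (guessing here):

interaction.preferredInteractionTypes = .textSelection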
I am testing this on iPhone SE 2020 which has the A13 chip. This feature requires A12 and up.
Hello,
I am working on an app that scans documents and recognizes the text with help of Vision framework. This works great.
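For context, the text part is a standard VNRecognizeTextRequest, roughly like this:

import Vision

func recognizeText(in cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate
    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}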
I would also like to "recognize" or detect individual images that are part of the document. Does Vision have any support for this, or should I be looking into training my own ML model?
Below is an example document - I would like to extract the text (already done) and also the image of the building.