I have created a custom file type for my app that conforms to XML. Now I am trying to find a way for users to share the file online and import it from there into the app.
I have added a UTExportedTypeDeclarations and a CFBundleDocumentTypes entry for my custom type to Info.plist, and also added LSSupportsOpeningDocumentsInPlace and UIFileSharingEnabled.
What works so far is that I can share a file via UIActivityViewController and send it by e-mail, and when I tap the file in the e-mail, the app opens and imports the file.
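For reference, the import side is handled in the app delegate, roughly like this (a sketch; importProjectData: stands in for my actual import code):

- (BOOL)application:(UIApplication *)app
            openURL:(NSURL *)url
            options:(NSDictionary<UIApplicationOpenURLOptionsKey, id> *)options
{
    // Files opened "in place" stay at their original location and need security-scoped access.
    BOOL openInPlace = [options[UIApplicationOpenURLOptionsOpenInPlaceKey] boolValue];
    if (openInPlace && ![url startAccessingSecurityScopedResource]) {
        return NO;
    }
    NSData *data = [NSData dataWithContentsOfURL:url];
    if (openInPlace) {
        [url stopAccessingSecurityScopedResource];
    }
    return (data != nil) && [self importProjectData:data]; // my own import routine
}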
But what I haven't managed to get working so far is importing a file from a website. Safari automatically detects that it is an XML file, ignores my custom file extension, and displays the file inline in Safari. As XML.
The only way to get the file to open in the app is to long-press a link to the file and then tap "Download Linked File". This sends the file to the Downloads folder and appends ".xml" to the downloaded file. If I then also add "public.xml" to LSItemContentTypes, the file opens in the app. But this is of course not very user-friendly, and it would open any XML file in my app, which is not what I want.
Is there any other way on iOS for users to share app files online and import them directly into an app? Or am I missing a setting somewhere?
Here is an example of how my CFBundleDocumentTypes is set up:
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeName</key>
        <string>My own project file</string>
        <key>LSHandlerRank</key>
        <string>Owner</string>
        <key>LSItemContentTypes</key>
        <array>
            <string>com.mycompany.myfiletype</string>
            <string>public.xml</string>
        </array>
    </dict>
</array>
And the UTExportedTypeDeclarations entry:
<key>UTExportedTypeDeclarations</key>
<array>
    <dict>
        <key>UTTypeConformsTo</key>
        <array>
            <string>public.xml</string>
        </array>
        <key>UTTypeDescription</key>
        <string>My own project file</string>
        <key>UTTypeIconFiles</key>
        <array/>
        <key>UTTypeIdentifier</key>
        <string>com.mycompany.myfiletype</string>
        <key>UTTypeTagSpecification</key>
        <dict>
            <key>public.filename-extension</key>
            <array>
                <string>myfiletype</string>
            </array>
        </dict>
    </dict>
</array>
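And the two Boolean keys mentioned above, both set to true:

<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
<key>UIFileSharingEnabled</key>
<true/>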
While implementing the background behaviour of an app, I learned that there is a difference between the following app states:

- being suspended
- being terminated by the system
- being terminated by the user (force quit)

I can simulate app suspension by pushing the home button and moving the app to the background. I can force quit the app by double-tapping the home button and flicking the app out of the screen in the app switcher.

But how can I reliably simulate the app being terminated by the system? I need this to test a few background features that seem to behave differently when the app is terminated by the system versus while it is still suspended (and that do not do anything while the app is force quit).

Thanks!
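The closest approximation I could think of (an assumption on my part, not an officially documented way to trigger a system kill) would be to SIGKILL the app myself shortly after it is backgrounded, since that also skips applicationWillTerminate::

#import <signal.h>
#import <unistd.h>

// DEBUG-only: approximate a system termination a few seconds after backgrounding.
- (void)applicationDidEnterBackground:(UIApplication *)application
{
#ifdef DEBUG
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(2 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        kill(getpid(), SIGKILL); // no termination callbacks fire, as with a system kill
    });
#endif
}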
I tried to update my current game controller code to the new iOS 17.0 GCControllerLiveInput. But the touchpads of the PS4 and PS5 controllers are not listed as elements of the GCControllerLiveInput.
Were they moved somewhere else? They are not listed as a GCMouse either, and I didn't find anything about this in the documentation or the header files...
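For reference, this is roughly how I enumerate the elements (a sketch; iOS 17+):

GCControllerLiveInput *liveInput = controller.input;
for (id<GCPhysicalInputElement> element in liveInput.elements) {
    NSLog(@"element: %@ (aliases: %@)", element.localizedName, element.aliases);
}
// Buttons, thumbsticks, d-pad, etc. all show up here, but no touchpad elements.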
I am trying to map the 3D skeleton joint positions of an ARBodyAnchor to the real body on the camera image.
I know I could simply use the "detectedBody" of the ARFrame, which already delivers the normalized 2D position of each joint, but what I am mostly interested in is the z-axis (the distance of each joint from the camera).
I am starting an ARBodyTrackingConfiguration, setting the world alignment to ARWorldAlignmentCamera (in which case the camera transform is an identity matrix), and multiplying each joint transform in model space (via modelTransformForJointName:) with the transform of the ARBodyAnchor. I then tried many different ways to get the joints to line up with the image, for example by multiplying the transforms with the projectionMatrix of the ARCamera. But whatever I do, it never lines up correctly.
For example, there doesn't really seem to be a scale factor in the projectionMatrix or the ARBodyAnchor transform: no matter the distance of the camera to the detected body, the scale of the body is always the same.
Which means I am missing something important, and I haven't figured out what. So does anyone have an example of how to align the body to the camera image (or of how to get the distance to each joint in any other way)?
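To make the question concrete, here is a sketch of the mapping I would expect to work in plain world space (default gravity alignment instead of ARWorldAlignmentCamera, the head joint as an example, and viewportSize being the size of the view the camera image is rendered into):

// Joint transform in world space:
simd_float4x4 jointModel = [bodyAnchor.skeleton modelTransformForJointName:ARSkeletonJointNameHead];
simd_float4x4 jointWorld = simd_mul(bodyAnchor.transform, jointModel);
simd_float3 jointPosition = jointWorld.columns[3].xyz;

// Distance of the joint to the camera (the z-information I am after):
simd_float3 cameraPosition = frame.camera.transform.columns[3].xyz;
float distance = simd_distance(jointPosition, cameraPosition);

// Expected 2D position of the joint on the rendered camera image:
CGPoint imagePoint = [frame.camera projectPoint:jointPosition
                                    orientation:UIInterfaceOrientationPortrait
                                   viewportSize:viewportSize];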
Thanks!
I connect two AVAudioNodes by using AVAudioEngine's
- (void)connectMIDI:(AVAudioNode *)sourceNode to:(AVAudioNode *)destinationNode format:(AVAudioFormat * __nullable)format eventListBlock:(AUMIDIEventListBlock __nullable)tapBlock
and add an AUMIDIEventListBlock tap to it to capture the MIDI events.
Both AUAudioUnits of the AVAudioNodes involved in this connection are set to use MIDI 1.0 UMP events:
[[avAudioUnit AUAudioUnit] setHostMIDIProtocol:kMIDIProtocol_1_0];
But all the MIDI voice channel events received are automatically converted to UMP MIDI 2.0 format. Is there something else I need to set so that the tap receives MIDI 1.0 UMPs?
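Reduced to a sketch, the connection looks like this (engine and nodes are set up elsewhere):

[engine connectMIDI:sourceNode
                 to:destinationNode
             format:nil
     eventListBlock:^OSStatus(AUEventSampleTime sampleTime, uint8_t cable,
                              const struct MIDIEventList *eventList) {
         // I would expect kMIDIProtocol_1_0 here, but eventList->protocol
         // always reports kMIDIProtocol_2_0.
         return noErr;
     }];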
(Note: my app can handle MIDI 2.0, so it is not really a problem. This question is mainly to find out whether I forgot to set the protocol somewhere...)
Thanks!!
Is there a way to destroy a MIDIUMPMutableEndpoint again?
In my app, the user has a setting to enable and disable MIDI 2.0. If MIDI 2.0 should not be supported (or if the iOS version is below 18), the app creates a virtual destination and a virtual source. If MIDI 2.0 should be enabled, it instead creates a MIDIUMPMutableEndpoint, which itself creates the virtual destination and source automatically.
So here is my problem: I haven't found any way to destroy the MIDIUMPMutableEndpoint again. There is a method to disable it (setEnabled:NO), but that doesn't destroy or hide the virtual destination and source. So when the user turns MIDI 2.0 support off, I end up with two virtual destinations and sources and cannot get rid of the 2.0 ones.
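The teardown attempt is just this (a sketch; umpEndpoint is my property holding the endpoint):

NSError *error = nil;
// Disables the endpoint, but the virtual destination and source remain visible:
[self.umpEndpoint setEnabled:NO error:&error];
// Dropping the reference does not remove them either:
self.umpEndpoint = nil;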
What is the correct way to get rid of the MIDIUMPMutableEndpoint once it is created?