
How to set content scroll view when using SwiftUI ScrollView in UIKit view controller?
I have a UIViewController that initially does not display any scrollable content, but later on I add a child view controller that does scroll: a UIHostingController whose rootView is a GeometryReader containing a ScrollView. The problem is that when you scroll, the UINavigationBar remains transparent, I’m sure because it couldn’t find a scroll view in the view hierarchy. There is an API to specify which scroll view to use, but it takes a UIScrollView:

viewController.setContentScrollView(scrollView, for: .bottom)

How can I tell it about my SwiftUI scroll view?
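There is no public bridge from SwiftUI’s ScrollView to UIScrollView, so one workaround people reach for is walking the hosting controller’s view hierarchy. A sketch, assuming SwiftUI backs ScrollView with a UIScrollView (a private implementation detail that could change in any OS release); firstScrollView() is a helper defined here, not an existing API:

import SwiftUI
import UIKit

extension UIView {
    // Hypothetical helper: depth-first search for the first UIScrollView
    // in the hierarchy. Relies on SwiftUI's private backing views, so it
    // is fragile and should be guarded accordingly.
    func firstScrollView() -> UIScrollView? {
        if let scrollView = self as? UIScrollView { return scrollView }
        for subview in subviews {
            if let found = subview.firstScrollView() { return found }
        }
        return nil
    }
}

// After embedding the hosting controller as a child:
// if let scrollView = hostingController.view.firstScrollView() {
//     setContentScrollView(scrollView, for: .bottom)
// }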
1 reply · 0 boosts · 2.6k views · Jun ’21
Is it ok to get fullSizeImageURL by requesting content editing input to share photos?
Is it ok to call requestContentEditingInput for a lot of PHAssets to get URLs for their full size images? It seems odd because I would not be using the content editing input to actually modify these images. Is that ok, or are there implications to be aware of?

Use case: I want to allow the user to share multiple PHAssets via UIActivityViewController. I can download and share an array of UIImage, which works, but I found that if you tap Copy, the app freezes for about 1 second per photo (10 seconds if you shared 10 photos). Profiling the app, it looks like iOS is spending that time creating a PNG for each image. It's also probably not a good idea to store huge images in memory like that. So I figured I'd try sharing an array of URLs to the images. Seemingly the only way to get a URL for a photo is to request a content editing input for the asset and access its fullSizeImageURL property. Is this a good idea, and is this the right approach to share PHAssets?
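For reference, a minimal sketch of the approach being asked about, assuming network access should be allowed so iCloud-only originals can be fetched:

import Photos

// Sketch: request a content editing input solely to read fullSizeImageURL.
func fullSizeImageURL(for asset: PHAsset, completion: @escaping (URL?) -> Void) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true // download the original if it's only in iCloud
    asset.requestContentEditingInput(with: options) { input, _ in
        completion(input?.fullSizeImageURL)
    }
}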
1 reply · 0 boosts · 1.7k views · Aug ’22
Trait collection changes in iOS 17 and code compatibility
I believe when trait collections were first introduced, the values were initially unknown, so you could put code that accessed those values in traitCollectionDidChange, because it always changed from unknown to known values. An iOS update changed this behavior to provide an estimated initial value, so traitCollectionDidChange would only get called if the value changed from its initial value. This required us to configure for the trait collection in viewDidLoad, for example, to handle its initial value, and to handle changes in traitCollectionDidChange.

In iOS 17, it’s stated that if you access traits before the view is added to the hierarchy, the values won’t be up to date. It’s recommended to use viewIsAppearing instead of viewDidLoad and viewWillAppear. traitCollectionDidChange is still invoked but is deprecated, replaced with a new registration API that informs you when a value changes.

My question is: will code written using the previous approach still work when compiled with the iOS 17 SDK? Meaning, does the system still provide an estimated initial value and inform you if it changed upon getting added to the view hierarchy? Or is this a breaking change in behavior that will require us to rewrite our logic, moving code that accesses the traitCollection from viewDidLoad to viewIsAppearing (and being really careful in doing so, because that function is called every time the view appears, not just once)? Are there any scenarios where code written for iOS 16 would stop working once compiled for iOS 17 if you access trait values in viewDidLoad and handle changes in traitCollectionDidChange?

I’m trying to understand if I can keep my existing code and use the new approach going forward, or if I need to revisit existing code that utilizes trait collections. Thanks!
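For context, a minimal sketch of the iOS 17 approach described above, using the user interface style trait as a stand-in for whichever traits the existing code actually reads:

import UIKit

class ProfileViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // New in iOS 17: register for changes to specific traits instead of
        // overriding the deprecated traitCollectionDidChange(_:).
        registerForTraitChanges([UITraitUserInterfaceStyle.self]) { (self: Self, _: UITraitCollection) in
            self.applyTraitDependentStyling()
        }
    }

    override func viewIsAppearing(_ animated: Bool) {
        super.viewIsAppearing(animated)
        // Traits are up to date here. Note this runs on every appearance,
        // so guard any one-time work accordingly.
        applyTraitDependentStyling()
    }

    private func applyTraitDependentStyling() {
        // Read self.traitCollection and update the UI.
    }
}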
1 reply · 0 boosts · 2.8k views · Jun ’23
EnumerableEntityQuery references AppEntity type in the singular form when it should be plural
I’m implementing App Shortcuts in my iOS app to allow you to add and find plants. In an attempt to get a “Find Plants” shortcut, I created a query that conforms to EnumerableEntityQuery and set that as the defaultQuery in my PlantAppEntity. I have the typeDisplayRepresentation set to TypeDisplayRepresentation(name: "Plant", numericFormat: "\(placeholder: .int) plants"). I added a Localizable.stringsdict to the app target, added Plant and %lld plants as the header comment shows, then clicked Localize, so now English is selected in the Localization section.

But when I run the app, then open Shortcuts and tap my app, there’s a Find Plant shortcut where I expected it to be titled Find Plants. When I tap the info button, it shows “plant” instead of “plants” in every parameter description. When you add that action to a shortcut, the placeholder is All Plant, unlike similar shortcuts from Reminders and Contacts that say “All Reminders” and “All Contacts”.

The action is working properly in that it returns an array of plants; the only issue is it uses the singular form of plant in places it should be plural. Have I done something wrong, am I missing anything, or is this a bug? (FB12908309)
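For context, a minimal reconstruction of the setup described above (property names and the query stub are abbreviated guesses, not the actual project code):

import AppIntents

struct PlantAppEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(
        name: "Plant",
        numericFormat: "\(placeholder: .int) plants"
    )
    static var defaultQuery = PlantQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

// Stub query conforming to EnumerableEntityQuery, as in the post.
struct PlantQuery: EnumerableEntityQuery {
    func allEntities() async throws -> [PlantAppEntity] { [] }
    func entities(for identifiers: [PlantAppEntity.ID]) async throws -> [PlantAppEntity] { [] }
}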
1 reply · 0 boosts · 956 views · Aug ’23
Build XCFramework from source that has dependencies on Swift Packages
I’m looking into building a closed source XCFramework from a local Swift package that has dependencies on other packages, which can later be distributed via Swift Package Manager. In initial discussions, we thought xcodebuild does not support linking the dependencies externally; it always includes them statically in the built framework. It’s my understanding this is because we’re asking xcodebuild to build a framework from a local Swift package. Is there another way this can be achieved?

To explain in more detail: I have built a closed source SDK for other developers to integrate in their apps, currently distributed as an XCFramework. The interesting thing about the SDK is that it has dependencies on other libraries, which need to be resolved when adding this SDK as a dependency to an app. The SDK’s dependencies should not be baked into our XCFramework. CocoaPods has worked well for that, but we want to use SPM instead.

The current project setup is an iOS framework Xcode project and an app Xcode workspace. The framework project is included in the app workspace and is in the same repo as the app, which allows me to modify the framework source code and then run the app to test it. The framework project can also be opened independently and built to verify it doesn’t have any errors, but to verify it’s working I run it with the app. To distribute a new release I use xcodebuild to create an XCFramework and then deploy that. For this to work with CocoaPods I had to add a Podfile to the app directory as well as the framework directory so both have the dependencies available. This means I have an xcworkspace for the framework and not just an xcodeproj. I specify the framework workspace file in the xcodebuild command.

To switch to a setup that utilizes Swift Package Manager, I created a Package.swift in the iOS framework project’s directory that specifies its dependencies, removed the CocoaPods integration (including deleting the workspace file), removed the framework project from the app’s workspace, added the package as a local package to the app project, and added the framework directory via + > Add Files to “App”, which adds the package to the top of the sidebar, making its source code available to edit within the app workspace. Everything is working when I run the app. Xcode properly resolves the dependencies for the local package and I can run the app to develop it.

Now, to create an XCFramework, I run the following command in the framework directory (which contains the Package.swift):

xcodebuild archive -workspace . -scheme FrameworkName -configuration Release -destination 'generic/platform=iOS' -archivePath './build/FrameworkName.framework-iphoneos.xcarchive' SKIP_INSTALL=NO BUILD_LIBRARIES_FOR_DISTRIBUTION=YES ENABLE_USER_SCRIPT_SANDBOXING=NO

This succeeds; however, the dependencies have been linked statically and thus included in our SDK. We need to include only the code from our framework and link to external dependencies, like our current CocoaPods setup does. I'm wondering what options there are to achieve this, even if I need to change the project setup locally, for example to continue using a framework project/workspace instead of a local Swift package. It seems I just need xcodebuild to be able to create an XCFramework, which can then be distributed with its own Package.swift file that specifies its dependencies.

If it's not possible to link the dependencies externally, could you help me understand the implications of including them statically? I don't know what problems could arise as a result of that, or what other concerns this would bring. Thanks!
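For illustration, a sketch of the kind of Package.swift described above; FrameworkName matches the xcodebuild command, while the dependency URL and product name are placeholders, not the real SDK's dependencies:

// swift-tools-version: 5.9

import PackageDescription

let package = Package(
    name: "FrameworkName",
    platforms: [.iOS(.v14)],
    products: [
        .library(name: "FrameworkName", targets: ["FrameworkName"])
    ],
    dependencies: [
        // Placeholder for the external packages the SDK depends on.
        .package(url: "https://github.com/example/SomeDependency.git", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "FrameworkName",
            dependencies: [
                .product(name: "SomeDependency", package: "SomeDependency")
            ]
        )
    ]
)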
1 reply · 1 boost · 3.5k views · Jun ’24
Distribute XCFramework that has dependencies on Swift Packages with Example project
I've created a closed source iOS SDK from a local Swift package, which has dependencies on other Swift packages, and successfully created a binary XCFramework following the solution from my previous post. Now I'm proceeding with the process to distribute this SDK.

I believe I want to upload the XCFramework to a public repo alongside a Package.swift file and an Example app project that uses the XCFramework. So each time I go to create a new release I’ll create a new XCFramework replacing the current one, verify it's working properly in the example app, then commit, tag, and push to the repo.

My question is: how do I set this up as a Swift package that includes an example app that uses the local XCFramework (not a remote URL to a zip of the framework) and properly resolves dependencies?

So far I created a directory containing MyFramework.xcframework and Package.swift containing:

// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            targets: ["MyFramework"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/example/example.git", from: "1.0.0")
    ],
    targets: [
        .binaryTarget(
            name: "MyFramework",
            path: "MyFramework.xcframework"
        )
    ]
)

I then created an Example iOS app project in that directory and now need to integrate the local XCFramework. I wondered if I could do that via File > Add Package Dependencies > Add Local, but when I navigate to that Package.swift and click Add Package it says "The selected package cannot be a direct ancestor of the project."

Do I need a different Package.swift for the Example app, and if so, how do I get that set up? I created a separate Package.swift (contents below) alongside the xcodeproj, but when I try to add that in Xcode I get the same error, despite the fact that this package is a sibling of the project, not an ancestor.

// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework-Example",
    platforms: [
        .iOS(.v14)
    ],
    dependencies: [
        .package(name: "MyFramework", path: "../")
    ],
    targets: [
        .target(
            name: "MyFramework-Example",
            dependencies: ["MyFramework"]
        )
    ]
)
1 reply · 0 boosts · 1.2k views · Jun ’24
What format for writeHEIFRepresentation preserves HDR?
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted this is how you save edits for Adaptive HDR:

SDR + HDR:

writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])

SDR + Gain:

writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])

This won't compile because the format argument is missing. What format should be used? In the WWDC 23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use. If relevant, I'm editing photos from the user's photo library, so the image was probably taken on iPhone, but perhaps not. Thanks!
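For what it's worth, here is the SDR + gain map variant with an explicit format argument so it compiles; .RGBA8 is borrowed from a later post on this page, and whether a wider format (RGBA16, RGBAh, etc.) is needed to fully preserve HDR is exactly the open question:

import CoreImage

// Sketch: the same call as above with a format supplied.
func writeAdaptiveHDR(sdrImage: CIImage, gainImage: CIImage, to url: URL) throws {
    guard let p3Space = CGColorSpace(name: CGColorSpace.displayP3) else { return }
    try CIContext().writeHEIFRepresentation(
        of: sdrImage,
        to: url,
        format: .RGBA8, // assumption; see the question above
        colorSpace: p3Space,
        options: [.hdrGainMapImage: gainImage]
    )
}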
1 reply · 0 boosts · 711 views · Oct ’24
Get View Full HDR state from Settings > Photos to properly set preferredImageDynamicRange in editing extension
I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. But you can turn off the option to view HDR photos in their full dynamic range in Settings > Photos. When you do that, open a photo, and tap the edit button, the photo does not appear in the full range, as expected; but when you select my app from More > Extensions, it unexpectedly does appear in the full dynamic range. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to get that setting in my PHContentEditingController.
1 reply · 0 boosts · 655 views · Oct ’24
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])

// edit them...

try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])

I also support editing the still photo in Live Photos. To do this, you create a PHLivePhotoEditingContext and set its frameProcessor block, which gives you a CIImage (I edit it when frame.type is .photo); then you create a PHContentEditingOutput and call saveLivePhoto. I’m not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don’t see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
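For context, a sketch of the Live Photo editing flow described above; applyEdits(_:) is a hypothetical stand-in for the app's actual adjustments, and as the post notes, nothing in this flow accepts a gain map:

import CoreImage
import Photos

// Hypothetical stand-in for the app's actual image adjustments.
func applyEdits(_ image: CIImage) -> CIImage { image }

func editLivePhoto(input: PHContentEditingInput,
                   completion: @escaping (PHContentEditingOutput?) -> Void) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else {
        completion(nil)
        return
    }
    context.frameProcessor = { frame, _ in
        // The processor hands over a plain CIImage per frame; there is no
        // obvious slot for a gain map here, which is the crux of the question.
        guard frame.type == .photo else { return frame.image }
        return applyEdits(frame.image)
    }
    let output = PHContentEditingOutput(contentEditingInput: input)
    context.saveLivePhoto(to: output) { success, _ in
        completion(success ? output : nil)
    }
}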
1 reply · 0 boosts · 772 views · Nov ’24
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}

I was able to replicate it on my device by taking a new Live Photo. I'm not sure what's wrong with that one specifically; not all Live Photos reproduce the issue. I've submitted FB15880825 with a sysdiagnose and a Photos diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
1 reply · 0 boosts · 579 views · Jun ’25
Failed to launch Photo Editing Extension from Mac Catalyst app
I have an iOS app that includes a Photo Editing Extension and is optimized for Mac Catalyst so you can edit photos in the Photos app on your Mac. This has worked really well, but now I am encountering an error alert when trying to open the photo editing extension:

RBSLaunchRequest error trying to launch plugin com.company.TestEditor.TestPhotoEditor (B7A616A7-25A8-4E02-8B32-5CAB37C8B4B2): ErrorDomain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7f08fafd0 {ErrorDomain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}}

Steps to reproduce:

1. Create a new iOS app project in Xcode
2. Create a new target and choose iOS > Photo Editing Extension
3. For both targets in the project, add Mac Catalyst as a supported destination
4. Run the app on My Mac (Mac Catalyst)
5. Open the Photos app, double click a photo, click Edit, click the more plugins button, and click TestPhotoEditor in the list

macOS 15.4.1 + Xcode 16.3
1 reply · 0 boosts · 181 views · May ’25
PHAssetChangeRequest revertAssetContentToOriginal without original asset content
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. But this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
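For reference, a minimal sketch of the revert call in question:

import Photos

// Sketch: revert an asset's content to the original, as described above.
func revertToOriginal(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.revertAssetContentToOriginal()
    }) { success, error in
        // Per the docs this should fail when the original content isn't on
        // device, but recent iOS versions appear to succeed regardless.
        print("Reverted: \(success), error: \(String(describing: error))")
    }
}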
1 reply · 0 boosts · 130 views · Jun ’25