I have been spending the last several weeks implementing NSPersistentCloudKitContainer in my app, and it is most of the way there. Unfortunately, I keep running into an issue where after several days of successful syncing between devices, each device begins to crash after about a minute of use, repeatedly.
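For context, the container setup is nothing exotic — roughly this shape (a simplified sketch; the model name is a placeholder):

import CoreData

final class PersistenceController {
    static let shared = PersistenceController()

    let container: NSPersistentCloudKitContainer

    init() {
        container = NSPersistentCloudKitContainer(name: "Model") // placeholder model name
        // Merge the changes the import job brings in from CloudKit into the view context.
        container.viewContext.automaticallyMergesChangesFromParent = true
        container.loadPersistentStores { _, error in
            if let error = error {
                fatalError("Failed to load persistent stores: \(error)")
            }
        }
    }
}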
The crash report points to a SQL thread working inside the CoreData and CloudKit frameworks — none of my code whatsoever. It is the typical “CPU: 48 seconds cpu time over 58 seconds (82% cpu average), exceeding limit of 80% cpu over 60 seconds” issue. If I run the devices hooked up to Xcode and debug, I can watch the thread spin up, and the log shows it chugging through the changed CKRecords it needs to import, just like normal. If I leave the devices hooked up to Xcode, they eventually make it through this huge job and become usable again.
Once one device is in this state, the problem also occurs on new devices trying to download from the cloud for the first time.
I’ve attached a screenshot of the stacktrace of that thread in Instruments. I haven’t had any luck finding other people mentioning the system killing their app during a sync, so I’m kind of at a loss for what to do. It seems like the issue is occurring in a job that NSPersistentCloudKitContainer is managing on my behalf, and I haven’t been able to find a way to configure a timeout or anything like it.
Has anyone experienced this? I’m not sure what to do if the chunks that NSPersistentCloudKitContainer breaks up the import into are too large for the device to work through before the system kills the app. I’d appreciate any ideas or insights. Please let me know if any other information would be helpful.
Thanks!
ApplicationMusicPlayer is available on the Mac! 🎉🎉🎉 Enormous thanks to @JoeKun and the team. I've already gotten my app up and running through Catalyst, and I've successfully played music! I also got some timeouts, but that was happening on my phone a lot that day too, so maybe my local CDN was just having a bad day.
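For reference, the playback I’m doing is nothing fancy — roughly this (a simplified sketch; the catalog search is just an illustrative way to get a Song):

import MusicKit

func playFirstMatch(for term: String) async throws {
    // Illustrative: grab one song from the catalog and play it.
    var request = MusicCatalogSearchRequest(term: term, types: [Song.self])
    request.limit = 1
    let response = try await request.response()
    guard let song = response.songs.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: response.songs, startingAt: song)
    try await player.play()
}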
I wanted to ask this question in a lab this week, but the timing didn't work out: Do you expect the experience to be the same using ApplicationMusicPlayer on a Catalyst vs a macOS target? I'm hoping to reuse much of my iPad app and go the Catalyst route, but I wanted to double check that the new support wasn't just for macOS.
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing. MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item’s title. MusicLibraryRequest lets you filter on the item’s ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.
With an MPMediaQuery, if I wanted to fetch every album by a given artist, I’d apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by “Aesop Rock” or “Aesop Rock & Blockhead,” I could use that.
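In code, that looks something like this (a sketch of the Media Player approach):

import MediaPlayer

// Fetch every album whose album artist contains "Aesop Rock" —
// this matches both "Aesop Rock" and "Aesop Rock & Blockhead".
let predicate = MPMediaPropertyPredicate(
    value: "Aesop Rock",
    forProperty: MPMediaItemPropertyAlbumArtist,
    comparisonType: .contains
)
let query = MPMediaQuery.albums()
query.addFilterPredicate(predicate)
let albums = query.collections ?? []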
In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There’s no filter for the name property, so if I don’t have their IDs, I’ve got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf:) function.
I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.
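Spelled out, the two-step flow I’m describing looks roughly like this (a sketch based on the calls above; the \.artists key path and the exact shape of the memberOf: argument are my reading of the beta and may be off):

import MusicKit

func albumsByArtist(named name: String) async throws -> MusicItemCollection<Album> {
    // Step 1: find the Artist objects, since there's no filter on the name property.
    var artistRequest = MusicLibraryRequest<Artist>()
    artistRequest.filter(text: name)
    let artistResponse = try await artistRequest.response()

    // Step 2: feed those artists into the album request.
    var albumRequest = MusicLibraryRequest<Album>()
    albumRequest.filter(matching: \.artists, memberOf: artistResponse.items) // assumed key path
    let albumResponse = try await albumRequest.response()
    return albumResponse.items
}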
I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I’m not saying we need every property from every filterable MPMediaItemProperty key, but I’d love to be able to do it on title, artistName, and other common metadata. That might look something like:
filter(matching: \.title, contains: "Abbey Road")
filter(matching: \.artistName, equalTo: "Between The Buried And Me")
I noticed that filter(text:) is case-insensitive, which is awesome, and something I’ve wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath-based filter API supported a case-sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users’ libraries are a harsh environment, and you might have an artist “Between The Buried And Me” and one called “Between the Buried and Me.” It would be great to get albums from both with something like:
filter(matching: \.artistName, equalTo: "Between The Buried And Me", caseSensitive: false)
I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.
My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you’ll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit’s player rather than Media Player’s.
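The Media Player half of that lookup is easy enough today (sketch below); it’s the hop from there to MusicKit’s library Album that has no shared identifier.

import MediaPlayer

// Illustrative: given an MPMediaItem, fetch the collection for its containing album
// via the album persistent ID.
func albumCollection(containing item: MPMediaItem) -> MPMediaItemCollection? {
    let predicate = MPMediaPropertyPredicate(
        value: item.albumPersistentID,
        forProperty: MPMediaItemPropertyAlbumPersistentID
    )
    let query = MPMediaQuery.albums()
    query.addFilterPredicate(predicate)
    return query.collections?.first
}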
I've submitted that last bit as FB10185789.
Thanks for bearing with my walls of text today. Keep up the great work!
I’m unable to use the .searchScopes modifier to add a segmented Picker to my search bar as of developer beta 6. It will not display regardless of whether I’m using a NavigationStack, NavigationSplitView, or NavigationView. Has anyone had any luck using this modifier?
This simple code demonstrates the problem:
struct ContentView: View {
    @State var searchText: String = ""
    @State var searchScope: String = "Scope 1"
    let data = Array(0..<20)

    var body: some View {
        NavigationStack {
            List {
                ForEach(data, id: \.self) { item in
                    Text("\(item)")
                }
            }
            .searchable(text: $searchText)
            .searchScopes($searchScope, scopes: {
                Text("Scope 1")
                Text("Scope 2")
            })
        }
    }
}
I've submitted this as FB11298015
Hey there Apple Music team! I'm excited to dig into the sessions coming up this week, and what I've seen so far from the developer documentation diffs looks great: audio quality, artist images, and a way to interface with a user's music library in MusicKit. Love it!
The thing at the very top of my WWDC wishlist this year was macOS/Mac Catalyst support for the ApplicationMusicPlayer class. I’ve just finished installing Ventura and Xcode 14, and sadly it looks like the support story is the same as on Big Sur: no API availability on macOS, and an available Mac Catalyst API that ultimately results in the same error from a feedback I submitted on Big Sur, FB9851840:
The connection to service named com.apple.Music.MPMusicPlayerApplicationControllerInternal was invalidated: failed at lookup with error 3 - No such process.
Is that the end of the story on Ventura, or is there a chance support might be added in a later beta? Is there any additional detail at all that can be shared? I field several requests a week asking if/when my app is coming to the Mac, and I would really love to be able to make that happen. If there is anything at all I can do to test and help overcome the engineering challenges alluded to in the past, I am ready, willing, and able!
In any case, thanks for the great work, and I'm looking forward to spending time with the new stuff this summer.