Post · Replies · Boosts · Views · Activity

Completion handler blocks are not supported in background sessions
When I try to implement the new Background Task options in the same way as shown in the WWDC video (on watchOS), like this:

```swift
let config = URLSessionConfiguration.background(withIdentifier: "SESSION_ID")
config.sessionSendsLaunchEvents = true
let session = URLSession(configuration: config)

let response = await withTaskCancellationHandler {
    try? await session.data(for: request)
} onCancel: {
    let task = session.downloadTask(with: request)
    task.resume()
}
```

I'm receiving the following error:

Terminating app due to uncaught exception 'NSGenericException', reason: 'Completion handler blocks are not supported in background sessions. Use a delegate instead.'

Did I forget something?
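For context, the async convenience method `session.data(for:)` is built on completion handlers internally, which is exactly what background sessions reject. A minimal delegate-based sketch of what the exception asks for (class name and error handling are illustrative, not from the original post):

```swift
import Foundation

// Sketch only: background sessions require a delegate; completion-handler
// APIs (including the async conveniences that wrap them) raise the
// NSGenericException quoted above.
final class BackgroundDownloader: NSObject, URLSessionDownloadDelegate {
    lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "SESSION_ID")
        config.sessionSendsLaunchEvents = true
        // Pass a delegate instead of awaiting a completion-handler API.
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func start(_ request: URLRequest) {
        session.downloadTask(with: request).resume()
    }

    // Delivered when the download finishes, possibly after a relaunch.
    func urlSession(_ session: URLSession,
                    downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Move the file out of its temporary location here.
    }

    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        if let error { print("Background task failed: \(error)") }
    }
}
```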
7
2
2.7k
Apr ’25
Can't add values to Relationship?
Hi there, I'm not sure if anything changed in Beta 7 or it's me being an idiot, but I can't seem to update my relationship property. Anyone else experienced this?

```swift
@Model
final class Goal {
    // More properties here
    @Relationship(deleteRule: .cascade, inverse: \Progress.goal)
    var progress: [Progress]?

    init(progress: [Progress]? = []) {
        self.progress = progress
    }

    func updateProgress(with value: Double) {
        // I've also tried this with the modelContext in the initialiser
        let context = ModelContext(DataStore.container)
        let newProgress = Progress(date: Date.now, value: value)
        newProgress.goal = self
        context.insert(newProgress)
        self.progress?.append(newProgress)
        // Other code
    }
}

@Model
final class Progress {
    var progressDate: Date?
    var value: Double?
    var goal: Goal?

    init(date: Date = Date.now, value: Double = 0.0, goal: Goal? = nil) {
        self.progressDate = date
        self.value = value
        self.goal = goal
    }
}
```

Every time I call the updateProgress method, e.g. goal.updateProgress(with: 33.2), I get a fatal error:

Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Unacceptable type of value for to-one relationship: property = "goal"; desired type = NSManagedObject; given type = NSManagedObject; value = <NSManagedObject: 0x2823e0c80> (entity: Goal; id: 0x8617307020b13c03 <x-coredata://7AE46844-B3C6-46A3-B350-FC65B7CCDF8F/Goal/p17>; data: { // Properties here })

I've tried many different ways but without any luck… In case this is a bug: FB13034172
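A commonly suggested workaround for this kind of crash, sketched under the assumption that the Goal is already inserted in a context: relate the new Progress through the context the Goal already belongs to, rather than creating a fresh ModelContext against the container (mixing objects from two contexts is a frequent cause of "unacceptable type of value for to-one relationship" errors):

```swift
func updateProgress(with value: Double) {
    let newProgress = Progress(date: .now, value: value)
    // modelContext is the context this Goal already lives in
    // (nil if the Goal was never inserted).
    modelContext?.insert(newProgress)
    newProgress.goal = self
}
```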
3
7
1.3k
Sep ’23
List of Genre IDs?
Hey all, I've been looking at adding a visual representation of the song's first genre in my app, and was therefore wondering if there's a list available somewhere of all the genres available in MusicKit and their IDs, e.g.

```swift
Genre(id: "21", name: "Rock", parent: Genre(id: "34", name: "Music"))
```

So my questions:

Is there a list of all IDs?
Are these IDs the same in every country?

The reason I want this is to keep my visuals safe for localization. For example: if I call my asset icon-electronic and then localize my app in Dutch, it would ask for icon-electronisch. I hope this makes sense, and I hope I don't have to browse and save a list of all genres by hand, haha
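Until such a list turns up, one way to keep asset names localization-safe is to key them off the genre ID rather than the localized name. A sketch (whether the IDs are stable across storefronts is exactly the open question here, and every asset name below is made up; only "21" = Rock comes from the example above):

```swift
// Hypothetical ID-to-asset mapping; extend as genres are encountered.
let genreAssets: [MusicItemID: String] = [
    "21": "icon-rock",
    "7": "icon-electronic",
]

func assetName(for song: Song) -> String {
    guard let genre = song.genres?.first,
          let asset = genreAssets[genre.id] else {
        return "icon-music" // neutral fallback for unmapped genres
    }
    return asset
}
```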
4
0
2.9k
Jun ’23
What would be the easiest way to have an example MusicItem for SwiftUI previews
Hey all, I was just wondering what would be the best/easiest way to have a sample of any MusicItem to use in SwiftUI previews. I know I could convert one into a string and then decode it (as it conforms to Codable), but maybe there is a way that would make it easier to set up? If not, I might file a feedback for this, as I think it would be great if all MusicItem structures had a .example or something like that, so for a view you'd have:

```swift
#Preview {
    SongTitlesView(song: Song.example)
}
```
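In the absence of a built-in .example, the string-decode approach mentioned above can be wrapped once in an extension. A sketch; the JSON payload is a placeholder you would capture by JSON-encoding a real fetched Song once and pasting the output in:

```swift
// Paste the output of JSONEncoder().encode(someFetchedSong) here.
let sampleSongJSON = #"{ … }"#

extension Song {
    // Force-unwrapping is tolerable here because this only runs in previews.
    static var example: Song {
        let json = sampleSongJSON.data(using: .utf8)!
        return try! JSONDecoder().decode(Song.self, from: json)
    }
}

#Preview {
    SongTitlesView(song: .example)
}
```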
0
0
867
Jun ’23
Microphone feedback noise and can I use the output to recognise?
I recently released my first ShazamKit app, but there is one thing that still bothers me. When I started, I followed the steps as documented by Apple here: https://developer.apple.com/documentation/shazamkit/shsession/matching_audio_using_the_built-in_microphone However, when I ran this on iPad I received a lot of high-pitched feedback noise with this configuration. I got it to work by commenting out the output node and format and only using the input. But now I want to be able to recognise the song that's playing from the device that has my app open, and was wondering if I need the output nodes for that, or if I can do something else to prevent the mic feedback from happening. In short:

What can I do to prevent feedback from happening?
Can I use the output of a device to recognise songs, or do I just need to make sure the microphone can run at the same time as music is playing?

Other than that, I really love the ShazamKit API and can highly recommend having a go with it! This is the code as documented in the above link (I just added the comments of what broke it for me):

```swift
func configureAudioEngine() {
    // Get the native audio format of the engine's input bus.
    let inputFormat = audioEngine.inputNode.inputFormat(forBus: 0)

    // THIS CREATES FEEDBACK ON IPAD PRO
    let outputFormat = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 1)

    // Create a mixer node to convert the input.
    audioEngine.attach(mixerNode)

    // Attach the mixer to the microphone input and the output of the audio engine.
    audioEngine.connect(audioEngine.inputNode, to: mixerNode, format: inputFormat)

    // THIS CREATES FEEDBACK ON IPAD PRO
    audioEngine.connect(mixerNode, to: audioEngine.outputNode, format: outputFormat)

    // Install a tap on the mixer node to capture the microphone audio.
    mixerNode.installTap(onBus: 0, bufferSize: 8192, format: outputFormat) { buffer, audioTime in
        // Add captured audio to the buffer used for making a match.
        self.addAudio(buffer: buffer, audioTime: audioTime)
    }
}
```
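For the feedback half of the question: ShazamKit only needs the captured buffers, so a sketch that taps the input node directly and never connects anything to outputNode leaves nothing to play back and feed into the mic (an assumption that the rest of the matching pipeline is unchanged):

```swift
func configureAudioEngine() {
    let inputNode = audioEngine.inputNode
    // Use the input node's own format; since no node is connected to
    // outputNode, the microphone signal is never rendered to the speaker.
    let format = inputNode.outputFormat(forBus: 0)
    inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, audioTime in
        self.addAudio(buffer: buffer, audioTime: audioTime)
    }
}
```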
3
0
2.7k
Feb ’23
Changing the images and tint of the icons provided by NavigationSplitView
Hey all, I'm currently implementing the new NavigationSplitView to improve my app's iPad experience, but I noticed that I can't seem to tint the icons provided to toggle the sidebar and the overflow menu. If I can't change the tint of these icons, I'm afraid I'll have to fall back to the old implementation. The same goes for the Customize toolbar view. (I would also love to be able to change that background for legibility.)
2
0
1.6k
Aug ’22
Is there any way to exclude curated and smart playlists?
The new MusicLibraryRequest<Playlist>() is great for getting all the user's playlists, but I have not found a way to exclude a specific type of playlist. For example, I wish to only get playlists that the user owns and can add content to (so this would exclude smart and curated playlists). I did find the Playlist.Kind enum, but I can't seem to apply this to the request, and it doesn't seem to have an option for smart playlists. So in short: is there any way I can exclude smart and curated playlists from a MusicLibraryRequest?
0
0
997
Jul ’22
Audio crashes when connected to AirPods
Hi there, Whenever I want to use the microphone for my ShazamKit app while connected to AirPods, my app crashes with an "Invalid input sample rate." message. I've tried multiple formats but keep getting this crash. Any pointers would be really helpful.

```swift
func configureAudioEngine() {
    do {
        try audioSession.setCategory(.playAndRecord, options: [.mixWithOthers, .defaultToSpeaker, .allowAirPlay, .allowBluetoothA2DP, .allowBluetooth])
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
    } catch {
        print(error.localizedDescription)
    }

    guard let engine = audioEngine else { return }
    let inputNode = engine.inputNode
    let inputNodeFormat = inputNode.inputFormat(forBus: 0)
    let audioFormat = AVAudioFormat(
        standardFormatWithSampleRate: inputNodeFormat.sampleRate,
        channels: 1
    )

    // Install a "tap" in the audio engine's input so that we can send
    // buffers from the microphone to the signature generator.
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: audioFormat) { buffer, audioTime in
        self.addAudio(buffer: buffer, audioTime: audioTime)
    }
}
```
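A guess at the cause: with AirPods as the input route, the hardware runs at a Bluetooth sample rate that a hand-constructed format may not match. Passing nil as the tap format tells AVAudioEngine to use the bus's own format, which sidesteps the mismatch entirely; a sketch:

```swift
let inputNode = engine.inputNode
// nil format = "whatever format this bus actually produces", so the tap
// follows the current route (AirPods vs. built-in mic) automatically.
inputNode.installTap(onBus: 0, bufferSize: 1024, format: nil) { buffer, audioTime in
    self.addAudio(buffer: buffer, audioTime: audioTime)
}
```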
2
1
2.4k
Jul ’22
What's the easiest way to check when user has downloaded your app?
I'm switching my business model to freemium, and now I need a clean way to keep giving access to all features for my existing user base. I know I should probably do something with receipt validation, but which receipt do I check, and is there a very clean and straightforward way of doing this with StoreKit 2, maybe? I would have loved to see something like an AppStore.originalPurchaseDate property in StoreKit 2, but would love to hear some thoughts and input!
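StoreKit 2 did later gain something very close to the wished-for property: AppTransaction (iOS 16+) exposes originalAppVersion and originalPurchaseDate for the original download. A sketch of the usual grandfathering check; "2.0" is a placeholder for whichever version first ships as freemium:

```swift
import StoreKit

@available(iOS 16.0, *)
func isGrandfathered() async throws -> Bool {
    // AppTransaction.shared is signed by the App Store and describes the
    // original download, surviving reinstalls and new devices.
    guard case .verified(let appTransaction) = try await AppTransaction.shared else {
        return false
    }
    // Users who first downloaded before the freemium release keep everything.
    return appTransaction.originalAppVersion.compare("2.0", options: .numeric) == .orderedAscending
}
```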
4
1
4.2k
Jun ’22
Request content in a specific locale?
Hi all, I recently decided to remove my localizations for now because I'm requesting content from a lot of different sources (many of which only provide English text). But with MusicKit I still receive localized content. So basically I'm wondering: is it possible to set a locale for a MusicCatalogResourceRequest?
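MusicCatalogResourceRequest itself doesn't appear to take a locale, but the underlying Apple Music API accepts an l query parameter (a language tag the storefront supports), which is reachable through MusicDataRequest. A sketch; SONG_ID is a placeholder, and whether en-US is available depends on the storefront:

```swift
let countryCode = try await MusicDataRequest.currentCountryCode
var components = URLComponents(string: "https://api.music.apple.com/v1/catalog/\(countryCode)/songs/SONG_ID")!
// Ask for English names/descriptions regardless of the device locale.
components.queryItems = [URLQueryItem(name: "l", value: "en-US")]
let request = MusicDataRequest(urlRequest: URLRequest(url: components.url!))
let response = try await request.response()
```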
3
1
1.7k
May ’22
Is the MusicItemID different per locale?
Hi there, I've created a little tool for myself to quickly find the MusicItemID of a song, which I then use in one of my own APIs to match the content. In this tool I'm using MusicCatalogSearchRequest, but I was wondering if anyone knows whether the IDs I get back from this request are unique to the locale? And if so, is there a way to get all the IDs of a song to write down? I don't want to match on a name, as that's more error-prone. E.g. my API would look something like this:

```json
{
  "id": ["1259176472", "510004981", "1302212469"],
  "text": "Cool content."
}
```
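Catalog IDs are storefront-specific, but if I'm reading the Apple Music API docs right, its filter[equivalents] parameter maps an ID from one storefront into another, which could generate the ID list automatically instead of by hand. A sketch using one of the IDs from the example above:

```swift
// Ask the Japanese storefront for its equivalent of a known song ID.
let url = URL(string: "https://api.music.apple.com/v1/catalog/jp/songs?filter[equivalents]=1259176472")!
let request = MusicDataRequest(urlRequest: URLRequest(url: url))
let response = try await request.response()
// The returned song's own id is the JP-storefront identifier.
```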
2
0
1.6k
Mar ’22
How do I access lyrics in MusicKit or Apple Music Catalog
Hi there, In the documentation of MusicKit's Song object I can find the hasLyrics property, with the following description: "A Boolean value that indicates whether the song has lyrics available in the catalog. If true, the song has lyrics available; otherwise, it doesn't." My only question now is: where can I find the lyrics if this returns true? What is "the catalog" in this case? I hope the lyrics are available in MusicKit or the Apple Music API, as this would save me a lot of headache connecting more APIs (lyrics is one of my most requested features).
5
2
4.6k
Feb ’22
Any way to check if song is in Library with MusicKit? ?relate=library not working for me
For my app that heavily uses both ShazamKit and MusicKit, I need to be able to check whether the matched song is in the user's library or not. I couldn't find an easy way to do this with MusicKit, so I turned to the Apple Music API, as they introduced the new parameter ?relate=library during this year's WWDC. When I first tried it, it worked as expected, and I was very happy to have gotten it to work. However, it stopped working lately, and I've now turned to MPMediaLibrary, which still works but is a lot slower. Does anyone have any idea why ?relate=library stopped working for me, or know a better way to check if the song exists in the user's library?

```swift
func checkInLibrary(from appleMusicID: String) async {
    do {
        let countryCode = try await MusicDataRequest.currentCountryCode
        let libURL = URL(string: "https://api.music.apple.com/v1/catalog/\(countryCode)/songs/\(appleMusicID)?relate=library")!
        let request = MusicDataRequest(urlRequest: URLRequest(url: libURL))
        let dataResponse = try await request.response()
        print(dataResponse.debugDescription)
    } catch {
        // I'm handling errors here
    }
}
```
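On iOS 16 and later, MusicLibraryRequest offers a pure-MusicKit alternative to both the ?relate=library call and MPMediaLibrary. A sketch, with the caveat that whether the \.id filter matches catalog IDs (as opposed to library-local IDs) is worth verifying before relying on it:

```swift
@available(iOS 16.0, *)
func isInLibrary(_ id: MusicItemID) async throws -> Bool {
    var request = MusicLibraryRequest<Song>()
    // Look for a library song whose ID equals the matched song's ID.
    request.filter(matching: \.id, equalTo: id)
    let response = try await request.response()
    return !response.items.isEmpty
}
```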
6
0
3.1k
Feb ’22
MusicSubscriptionOffer can't be presented without Apple Music App
Hey all, Loving the MusicKit API, but I have one question regarding .musicSubscriptionOffer(isPresented:options:onLoadCompletion:). When Apple Music is not installed on the device, it can't be presented. Is there any way to still present the musicSubscriptionOffer, or do I have to prompt the user to install Apple Music first? And if so, what's the best way to check if Apple Music is installed?
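For the "is Apple Music installed" half, one common heuristic (an assumption, not an official API) is checking whether the Music app's URL scheme can be opened; this requires declaring music under LSApplicationQueriesSchemes in Info.plist:

```swift
import UIKit

func isAppleMusicInstalled() -> Bool {
    // canOpenURL returns false both when the app is missing and when the
    // scheme isn't declared in LSApplicationQueriesSchemes.
    guard let url = URL(string: "music://") else { return false }
    return UIApplication.shared.canOpenURL(url)
}
```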
4
0
1.5k
Feb ’22
Is it possible to get a certainty of the match ShazamKit gives?
Hey there, I was wondering if it's possible to check how "certain" ShazamKit is about a match. For example, I'd only use the result when the SHMatch has an accuracy/certainty of at least 80%. I know there's a frequencySkew on SHMatchedMediaItem, but I'm not sure if that could be helpful.
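ShazamKit doesn't expose a confidence percentage, but frequencySkew (how far the matched audio deviates in frequency from the reference recording) can serve as a rough proxy. A sketch; the 0.02 cutoff below is an arbitrary assumption to tune, not a documented threshold:

```swift
import ShazamKit

func confidentItem(in match: SHMatch) -> SHMatchedMediaItem? {
    // Treat a near-zero skew as "probably a solid match".
    match.mediaItems.first { abs($0.frequencySkew) < 0.02 }
}
```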
3
0
1.3k
Jul ’22