ShazamKit


Get exact audio matching for any audio source using the Shazam catalog or a custom catalog in an app.

Posts under ShazamKit tag

14 Posts

Post

Replies

Boosts

Views

Activity

Trouble with AppleScript Permissions and ShazamKit Integration in macOS App
Hello fellow developers,

I am developing a macOS app called "Playlist Plunderer 2" that uses AppleScript to control the Music app to play songs, and ShazamKit to recognize those songs and update their metadata. Despite setting up the entitlements and plist files correctly, I'm unable to get the necessary AppleScript permissions, and my app does not appear under Automation in System Preferences. Additionally, ShazamKit fails to match songs, consistently returning error 201. Here are the specifics of my setup and what I've tried so far:

- Xcode version: 15.4, macOS 14.1.2
- Entitlements configured: includes permissions for Apple events, audio input, and scripting targets for the Music app.
- Capabilities: ShazamKit and ScriptingBridge frameworks integrated, set to "Do Not Embed."
- Info.plist adjustments: added "Privacy - Microphone Usage Description."
- Scripting: manual AppleScript commands outside of Xcode succeed, but the app's scripts do not trigger.

Entitlements file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.automation.apple-events</key>
    <true/>
    <key>com.apple.security.device.audio-input</key>
    <true/>
    <key>com.apple.security.files.user-selected.read-only</key>
    <true/>
    <key>com.apple.security.scripting-targets</key>
    <dict>
        <key>com.apple.Music</key>
        <array>
            <string>com.apple.Music.playback</string>
            <string>com.apple.Music.library.read-write</string>
        </array>
    </dict>
</dict>
</plist>
```

I am having issues controlling the Music app (iTunes) from the AppleScript within my Xcode project. The objective of the app is to rewrite the metadata of the songs inside a folder in my Music app; this folder is titled Playlist Plunderer.

The way I intend for the app to function is: the app will play the songs in the playlist, then use ShazamKit to recognize the song that's playing, and then copy the metadata results of that song to rewrite the metadata of the song in the Music playlist. I am still in the beginning stages, and I am very new to Xcode (this is my first project). I created an Apple Developer account, paid the $99, and it is active, and I added the bundle identifier from my app to the account. My development team is set (Shane Vincent), and the app is set to automatically manage signing. Under App Sandbox, Audio Input is checked; under Hardened Runtime > Resource Access, Audio Input is also checked. In Build Settings, the path to the Info.plist file is correct, and the Info.plist contains the "Privacy - Microphone Usage Description" key (NSMicrophoneUsageDescription) that I added, with a description that reads "This app needs to access the microphone to identify songs using ShazamKit." The app appears under System Preferences > Security & Privacy > Privacy > Microphone, but not under System Preferences > Security & Privacy > Privacy > Automation. It is being built on macOS 14.1.2 (23B92) with Xcode 15.4 (15F31d). Under Frameworks, Libraries, and Embedded Content, I have added two frameworks, ShazamKit.framework and ScriptingBridge.framework, both set to "Do Not Embed."

Current issue: AppleScript fails to authorize with the Music app, and the ShazamKit errors suggest an issue with song matching. Has anyone faced similar challenges, or can anyone offer guidance on how to ensure AppleScript and ShazamKit function correctly within a sandboxed macOS app? Any insights into troubleshooting or configuring entitlements more effectively would be greatly appreciated. Thanks for your help!
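Before chasing TCC prompts, it can help to confirm that the built product actually contains the expected entitlement keys. Below is a minimal sanity-check sketch (the helper name is mine, not an Apple API) that parses an entitlements plist with `PropertyListSerialization` and reports any missing keys. Note that this only inspects file contents; the Automation entry in System Preferences only appears after the app actually sends its first Apple event to the Music app.

```swift
import Foundation

// Hypothetical helper: parses an entitlements plist string and reports which
// of the keys needed for Apple-events automation and microphone access are
// missing. It cannot verify the runtime TCC/Automation prompt itself.
func missingEntitlements(in plistXML: String) -> [String] {
    let required = [
        "com.apple.security.automation.apple-events",
        "com.apple.security.device.audio-input",
    ]
    guard let data = plistXML.data(using: .utf8),
          let dict = try? PropertyListSerialization.propertyList(from: data, format: nil) as? [String: Any]
    else { return required }  // unparseable file: treat everything as missing
    return required.filter { dict[$0] as? Bool != true }
}
```

A quick way to use this is to run `codesign -d --entitlements - /path/to/App.app` and feed the XML output to the helper; if the apple-events key is missing from the signed product, the Automation prompt will never fire.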
Replies: 0 · Boosts: 0 · Views: 57 · Activity: 2d
Shazamkit - Exception 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'
I'm trying to expose my native ShazamKit code to the host React Native app. The implementation works fine in a separate Swift project, but it fails when I try to integrate it into a React Native app.

```
Exception 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'
was thrown while invoking exposed on target ShazamIOS with params ( 1682, 1683 )
callstack: (
0   CoreFoundation          0x00007ff80049b761 __exceptionPreprocess + 242
1   libobjc.A.dylib         0x00007ff800063904 objc_exception_throw + 48
2   CoreFoundation          0x00007ff80049b56b +[NSException raise:format:] + 0
3   AVFAudio                0x00007ff846197929 _Z19AVAE_RaiseExceptionP8NSStringz + 156
4   AVFAudio                0x00007ff8461f2e90 _ZN17AUGraphNodeBaseV318CreateRecordingTapEmjP13AVAudioFormatU13block_pointerFvP16AVAudioPCMBufferP11AVAudioTimeE + 766
5   AVFAudio                0x00007ff84625f703 -[AVAudioNode installTapOnBus:bufferSize:format:block:] + 1456
6   muse                    0x000000010a313dd0 $s4muse9ShazamIOSC6record33_35CC2309E4CA22278DC49D01D96C376ALLyyF + 496
7   muse                    0x000000010a313210 $s4muse9ShazamIOSC5startyyF + 288
8   muse                    0x000000010a312d03 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtF + 83
9   muse                    0x000000010a312e47 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtFTo + 103
10  CoreFoundation          0x00007ff8004a238c __invoking___ + 140
11  CoreFoundation          0x00007ff80049f6b3 -[NSInvocation invoke] + 302
12  CoreFoundation          0x00007ff80049f923 -[NSInvocation invokeWithTarget:] + 70
13  muse                    0x000000010a9210ef -[RCTModuleMethod invokeWithBridge:module:arguments:] + 2495
14  muse                    0x000000010a925cb4 _ZN8facebook5reactL11invokeInnerEP9RCTBridgeP13RCTModuleDatajRKN5folly7dynamicEiN12_GLOBAL__N_117SchedulingContextE + 2036
15  muse                    0x000000010a925305 _ZZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEiENK3$_0clEv + 133
16  muse                    0x000000010a925279 ___ZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEi_block_invoke + 25
17  libdispatch.dylib       0x000000010e577747 _dispatch_call_block_and_release + 12
18  libdispatch.dylib       0x000000010e5789f7 _dispatch_client_callout + 8
19  libdispatch.dylib       0x000000010e5808c9 _dispatch_lane_serial_drain + 1127
20  libdispatch.dylib       0x000000010e581665 _dispatch_lane_invoke + 441
21  libdispatch.dylib       0x000000010e58e76e _dispatch_root_queue_drain_deferred_wlh + 318
22  libdispatch.dylib       0x000000010e58db69 _dispatch_workloop_worker_thread + 590
23  libsystem_pthread.dylib 0x000000010da67b84 _pthread_wqthread + 327
24  libsystem_pthread.dylib 0x000000010da66acf start_wqthread + 15
)
RCTFatal
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&, int, (anonymous namespace)::SchedulingContext)
facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)::$_0::operator()() const
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)
```

This is my Swift file; the error happens in the record function.

```swift
import Foundation
import ShazamKit

@objc(ShazamIOS)
class ShazamIOS: NSObject {

    @Published var matching: Bool = false
    @Published var mediaItem: SHMatchedMediaItem?
    @Published var error: Error? {
        didSet { hasError = error != nil }
    }
    @Published var hasError: Bool = false

    private lazy var audioSession: AVAudioSession = .sharedInstance()
    private lazy var session: SHSession = .init()
    private lazy var audioEngine: AVAudioEngine = .init()
    private lazy var inputNode = self.audioEngine.inputNode
    private lazy var bus: AVAudioNodeBus = 0

    override init() {
        super.init()
        session.delegate = self
    }

    @objc func exposed(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
        start()
        resolve("ios code executed")
    }

    func start() {
        switch audioSession.recordPermission {
        case .granted:
            self.record()
        case .denied:
            DispatchQueue.main.async { self.error = ShazamError.recordDenied }
        case .undetermined:
            audioSession.requestRecordPermission { granted in
                DispatchQueue.main.async {
                    if granted {
                        self.record()
                    } else {
                        self.error = ShazamError.recordDenied
                    }
                }
            }
        @unknown default:
            DispatchQueue.main.async { self.error = ShazamError.unknown }
        }
    }

    private func record() {
        do {
            self.matching = true
            let format = self.inputNode.outputFormat(forBus: bus)
            self.inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] (buffer, time) in
                self?.session.matchStreamingBuffer(buffer, at: time)
            }
            self.audioEngine.prepare()
            try self.audioEngine.start()
        } catch {
            self.error = error
        }
    }

    func stop() {
        self.audioEngine.stop()
        self.inputNode.removeTap(onBus: bus)
        self.matching = false
    }

    @objc static func requiresMainQueueSetup() -> Bool {
        return true
    }
}

extension ShazamIOS: SHSessionDelegate {

    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async { [self] in
            if let mediaItem = match.mediaItems.first {
                self.mediaItem = mediaItem
                self.stop()
            }
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        DispatchQueue.main.async { [self] in
            self.error = error
            self.stop()
        }
    }
}
```

The Objective-C file:

```objc
#import <Foundation/Foundation.h>
#import "React/RCTBridgeModule.h"

@interface RCT_EXTERN_MODULE(ShazamIOS, NSObject);

RCT_EXTERN_METHOD(exposed:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)

@end
```

How I consume the exposed function in React Native:

```javascript
const {ShazamModule, ShazamIOS} = NativeModules;

const onPressIOSButton = () => {
  ShazamIOS.exposed()
    .then(result => console.log(result))
    .catch(e => console.log(e.message, e.code));
};
```
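For context on the exception itself: AVFAudio raises `IsFormatSampleRateAndChannelCountValid(format)` when a tap is installed with a format whose sample rate or channel count is zero, which typically happens in React Native hosts where the input node's format is queried before microphone permission is granted or before the audio session is configured. A guard before `installTap` avoids the crash; the check's logic is shown below as a pure-Swift sketch (the function name is mine, not an API):

```swift
// Mirrors the condition AVFAudio asserts (IsFormatSampleRateAndChannelCountValid):
// a tap format must have a positive sample rate and at least one channel.
// In the real module these values would come from
// inputNode.outputFormat(forBus: 0); when they are zero, bail out and
// activate/configure AVAudioSession (e.g. the .record category) first.
func isTapFormatUsable(sampleRate: Double, channelCount: Int) -> Bool {
    return sampleRate > 0 && channelCount > 0
}
```

In `record()`, checking `format.sampleRate > 0 && format.channelCount > 0` before calling `installTap(onBus:bufferSize:format:block:)` turns the uncatchable Objective-C exception into a recoverable error path.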
Replies: 2 · Boosts: 0 · Views: 342 · Activity: May ’24
Displaying Song Lyrics using the Shazam API
Hi Apple developers, Is it possible to display lyrics synchronized with a song in my app after the song has been identified? Just like the feature in the main Shazam app when you tap the middle-top icon (the one with a music note in it) after Shazam'ing a song. The app I'm developing is intended for the Deaf and hard-of-hearing, and therefore I would love to be able to show the song lyrics to make the app accessible. I would greatly appreciate your help, because I can't find this in the documentation. Many thanks in advance!
Replies: 0 · Boosts: 0 · Views: 311 · Activity: Mar ’24
Shazam Video
Hello, I have a song on Apple Music. When I search for this song on Shazam, I want it to appear with a video clip, like the example link below. Is there any way you can help with this?
Example: https://www.youtube.com/watch?v=St8smx2q1Ho
My music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790
Thanks.
Replies: 0 · Boosts: 0 · Views: 398 · Activity: Feb ’24
Trying to extract Song Array from Shazamkit
I am trying to extract the audio file URL from ShazamKit. It is deep inside the hierarchy SHMediaItem > songs > previewAssets > url. When I access the URL like this:

```swift
let url = firstItem.songs[0].previewAssets?[0].url
```

I am getting a warning (see the Variable Viewer). This is what I have done so far:

```swift
struct MediaItems: Codable {
    let title: String?
    let subtitle: String?
    let shazamId: String?
    let appleMusicId: String?
    let appleMusicUrL: URL?
    let artworkUrl: URL?
    let artist: String?
    let matchOffset: TimeInterval?
    let videoUrl: URL?
    let webUrl: URL?
    let genres: [String]
    let isrc: String?
    let songs: [Song]?
}

extension SwiftFlutterShazamKitPlugin: SHSessionDelegate {
    public func session(_ session: SHSession, didFind match: SHMatch) {
        let mediaItems = match.mediaItems
        if let firstItem = mediaItems.first {
            // extracting the url
            let url = firstItem.songs[0].previewAssets?[0].url
            let _shazamMedia = MediaItems(
                title: firstItem.title!,
                subtitle: firstItem.subtitle!,
                shazamId: firstItem.shazamID!,
                appleMusicId: firstItem.appleMusicID!,
                appleMusicUrL: firstItem.appleMusicURL!,
                artworkUrl: firstItem.artworkURL!,
                artist: firstItem.artist!,
                matchOffset: firstItem.matchOffset,
                videoUrl: firstItem.videoURL!,
                webUrl: firstItem.webURL!,
                genres: firstItem.genres,
                isrc: firstItem.isrc!,
                songs: firstItem.songs
            )
            do {
                let jsonData = try JSONEncoder().encode([_shazamMedia])
                let jsonString = String(data: jsonData, encoding: .utf8)!
                self.callbackChannel?.invokeMethod("matchFound", arguments: jsonString)
            } catch {
                callbackChannel?.invokeMethod("didHasError", arguments: "Error when trying to format data, please try again")
            }
        }
    }
}
```
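One likely source of trouble in code like the above is the force-unwraps (`!`): they crash whenever a matched item lacks a field, and subscripting `songs[0]` traps on an empty array. A safer pattern is optional chaining all the way down and encoding the optionals as-is. The sketch below is self-contained and uses plain Codable stand-ins for the ShazamKit types, so the type and field names here are illustrative, not the real SHMediaItem API:

```swift
import Foundation

// Stand-ins for the ShazamKit types, for illustration only.
struct PreviewAsset: Codable { let url: URL? }
struct Song: Codable { let previewAssets: [PreviewAsset]? }

struct MediaItems: Codable {
    let title: String?
    let previewURL: URL?
}

// Safely walk songs -> previewAssets -> url; returns nil instead of
// crashing when any level is missing or empty.
func firstPreviewURL(of songs: [Song]?) -> URL? {
    songs?.first?.previewAssets?.first?.url
}

let songs = [Song(previewAssets: [PreviewAsset(url: URL(string: "https://example.com/preview.m4a"))])]
let item = MediaItems(title: "Some Title", previewURL: firstPreviewURL(of: songs))
let json = String(data: try! JSONEncoder().encode(item), encoding: .utf8)!
```

With optional fields in the Codable struct, JSONEncoder simply omits (or emits null for) missing values, so the Flutter side can handle incomplete matches instead of the plugin crashing.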
Replies: 3 · Boosts: 0 · Views: 527 · Activity: Jan ’24
ShazamKit Cost
We're looking to integrate ShazamKit, but can't find any details of the associated costs. Is there a fee, or are there rate limits for matching? And is attribution to the matched song on Apple Music required? Thank you.
Replies: 0 · Boosts: 0 · Views: 485 · Activity: Oct ’23
Shazamkit's SHManagedSession() doesn't work on macOS 14 RC 23A339
SHManagedSession() fails with the following output:

```
AddInstanceForFactory: No factory registered for id <CFUUID 0x600000540340> F8BB1C28-BAE8-11D6-9C31-00039315CD46
HALC_ShellDevice.cpp:2609  HALC_ShellDevice::RebuildControlList: couldn't find the control object
Prepare call ignored, the caller does not have record permission
Error: The operation couldn't be completed. (com.apple.ShazamKit error 202.)
```
Replies: 1 · Boosts: 0 · Views: 545 · Activity: Sep ’23
SHLibrary.default.items always return empty list
I want to use SHLibrary.default.items to show the music I recognized with Shazam, but SHLibrary.default.items always returns an empty list. I did an experiment: I called SHLibrary.default.items as soon as I entered a page and it returned an empty list, but after using SHManagedSession to identify songs and then calling SHLibrary.default.items, it returned the result I wanted. Below is the test code:

```swift
private func bindEvent() {
    // Called when the view is created; items returns an empty list.
    if #available(iOS 17, *) {
        let items = SHLibrary.default.items
        print("-------->>>>>>>>\(items)")
    }
    self.addToMediaLibray.onTap { [weak self] in
        guard let `self` = self,
              let result = self.result,
              let appleMusicID = result.appleMusicID else {
            return
        }
        if #available(iOS 17, *) {
            // Called after music was recognized; items is not empty.
            let items = SHLibrary.default.items
            print("1111-------->>>>>>>>\(items)")
        }
    }
}
```

The attached file is part of the result log. My iOS version is iOS 17 (21A5326a); my Xcode version is 15.0 beta 8 (15A5229m).
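Based on the experiment above, SHLibrary seems to populate only after the process has interacted with ShazamKit at least once, so one pragmatic workaround is to poll until the list fills rather than reading it exactly once at view creation. Here is a generic pure-Swift retry sketch; the helper and the idea of wrapping `{ SHLibrary.default.items }` in it are my suggestion, not documented Apple behavior:

```swift
import Foundation

// Polls `fetch` up to `attempts` times, sleeping `delay` seconds between
// tries, until it returns a non-empty array; returns [] if it never does.
// In the app, `fetch` would be `{ SHLibrary.default.items }`.
func pollUntilNonEmpty<T>(attempts: Int, delay: TimeInterval, fetch: () -> [T]) -> [T] {
    for attempt in 0..<attempts {
        let items = fetch()
        if !items.isEmpty { return items }
        if attempt < attempts - 1 { Thread.sleep(forTimeInterval: delay) }
    }
    return []
}
```

In a real view you would run this off the main thread (or rewrite it with `Task.sleep` in an async context) so the UI is not blocked while waiting.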
Replies: 1 · Boosts: 0 · Views: 621 · Activity: Sep ’23
ShazamKit failing to get matches
I'm trying to get ShazamKit for Android to work. I have a catalog that I am downloading from an external service, caching in internal app storage, and reading from internal app storage. Doing so, I'm getting no matches. However, if I manually download the file from internal app storage to my computer, put it in the assets folder, and read it from there, I'm getting matches. So the issue must be in the reading of the file. See the comments in the code below.

Here's the code:

```kotlin
private const val BUFFER_SIZE = 3840

class ShazamService(private val app: Application) {

    private val coroutineScope = CoroutineScope(Dispatchers.IO + Job())
    private val repository = ShazamRepository(...)
    private val catalog = ShazamKit.createCustomCatalog()
    private val recorder by lazy { AudioRecording(app) }
    private var session: StreamingSession? = null

    suspend fun initialize(source: Source) {
        // This method does not work
        addCatalog(source)
        // This works when used
        // loadCustomCatalog()
        session = (ShazamKit.createStreamingSession(
            catalog,
            AudioSampleRateInHz.SAMPLE_RATE_48000,
            BUFFER_SIZE
        ) as ShazamKitResult.Success).data
        session?.recognitionResults()?.onEach { matchResult ->
            onMatch(matchResult)
        }?.flowOn(Dispatchers.Main)?.launchIn(coroutineScope)
    }

    // This works
    private suspend fun loadCustomCatalog() {
        val assetManager = app.assets
        val inputStream: InputStream = assetManager.open("catalog.shazamcatalog")
        catalog.addFromCatalog(inputStream)
    }

    // This does not work
    private suspend fun addCatalog(source: Source) {
        repository.loadFile(app.applicationContext, source)?.use { data ->
            val result = this.catalog.addFromCatalog(data)
            Timber.d("Catalog added: $result")
        }
    }

    fun start() {
        recorder.startRecording { data ->
            session?.matchStream(data, data.size, 0)
        }
    }

    fun stop() {
        recorder.stopRecording()
    }

    private fun onMatch(result: com.shazam.shazamkit.MatchResult) {
        Timber.d("Received MatchResult: $result")
    }

    fun destroy() {
        coroutineScope.cancel()
    }
}
```

Here's the repository responsible for providing the catalog file. Source contains an id and a url from which a catalog can be downloaded. It downloads the catalog, saves it as a file in internal app storage, and returns a FileInputStream.

```kotlin
class ShazamRepository(
    private val shazamClient: ShazamClient
) {

    suspend fun loadFile(context: Context, source: Source): FileInputStream? {
        val file = File(context.filesDir, source.id + ".shazamcatalog")
        val catalog = loadFile(file)
        if (catalog == null) {
            val response = shazamClient.getCatalog(source.url)
            if (response.isSuccessful) {
                response.body()?.let { data ->
                    saveResponseData(data, file)
                }
            }
        } else {
            return catalog
        }
        return loadFile(file)
    }

    private fun saveResponseData(data: ResponseBody, file: File) {
        data.byteStream().use { inputStream ->
            FileOutputStream(file).use { outputStream ->
                val buffer = ByteArray(4 * 1024)
                var read: Int
                while (inputStream.read(buffer).also { read = it } != -1) {
                    outputStream.write(buffer, 0, read)
                }
                outputStream.flush()
            }
            inputStream.close()
        }
    }

    private fun loadFile(file: File): FileInputStream? {
        return if (file.exists()) {
            FileInputStream(file)
        } else {
            return null
        }
    }
}
```

To summarise:
- The catalog is downloaded and saved correctly.
- If the catalog file is opened with assetManager.open, I'm getting matches.
- When using FileInputStream(file), no matches are received.

What could be wrong with the File-API approach? Why does it work when using the AssetManager but not when using it as a File?
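Since the same bytes match via the AssetManager but not via the cached file, a useful first step is to rule out a truncated or corrupted cache by comparing the stored file byte-for-byte against the known-good copy that works from assets. The debugging technique is sketched below in Swift for brevity (the question's code is Kotlin, where the equivalent is comparing `File.readBytes()` results); the helper name is mine:

```swift
import Foundation

// Compares two files byte-for-byte. A size or content mismatch means the
// download/caching step, not the matching step, is the culprit.
func filesMatch(_ a: URL, _ b: URL) throws -> Bool {
    let da = try Data(contentsOf: a)
    let db = try Data(contentsOf: b)
    return da == db
}
```

If the bytes are identical, the suspect shifts to how the stream is consumed, e.g. the FileInputStream being opened (and partially read) once in `loadFile` and then handed to `addFromCatalog` at a non-zero position, which an assets stream never experiences.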
Replies: 0 · Boosts: 0 · Views: 516 · Activity: Aug ’23
ShazamKit SDK crash on Android
We are using the ShazamKit SDK for Android, and our application sometimes crashes when performing audio recognition. We get the following logs:

```
Cause: null pointer dereference
backtrace:
  #00 pc 000000000000806c  /data/app/lib/arm64/libsigx.so (SHAZAM_SIGX::reset()) (BuildId: 40e0b3c4250b21f23f7c4ec7d7b88f954606d914)
  #01 pc 00000000000dc324  /data/app//oat/arm64/base.odex
  at libsigx.SHAZAM_SIGX::reset()(reset:0)
  at base.0xdc324(Native Method)
```
Replies: 1 · Boosts: 2 · Views: 1.3k · Activity: Oct ’23