ShazamKit


Get exact audio matching for any audio source using the Shazam catalog or a custom catalog in an app.

ShazamKit Documentation

Posts under ShazamKit tag

17 Posts
Post not yet marked as solved
0 Replies
147 Views
Hi Apple developers, Is it possible to display lyrics synchronized with a song in my app after the song has been identified? Just like the feature available in the general Shazam app when clicking on the middle-top icon (with music note in it) after Shazam'ing a song. The app I'm developing is intended for the Deaf and hard-of-hearing and therefore I would love to be able to show the song lyrics to make the app accessible. I would greatly appreciate your help because I can't find this in the documentation. Many thanks in advance!
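ShazamKit itself does not appear to expose lyrics on SHMediaItem, so one hedged approach is to take the appleMusicID from the match and look the lyrics up in a separate source. A minimal sketch, where fetchLyrics(appleMusicID:) is a hypothetical function you would implement against your own lyrics provider:

import ShazamKit

final class LyricsMatcher: NSObject, SHSessionDelegate {
    let session = SHSession()

    override init() {
        super.init()
        session.delegate = self
    }

    func session(_ session: SHSession, didFind match: SHMatch) {
        guard let item = match.mediaItems.first,
              let appleMusicID = item.appleMusicID else { return }
        // appleMusicID identifies the song in the Apple Music catalog;
        // fetchLyrics(appleMusicID:) is a hypothetical call to a lyrics backend you provide.
        fetchLyrics(appleMusicID: appleMusicID) { lyrics in
            // Show the lyrics in your UI here (e.g. large, high-contrast text for accessibility).
        }
    }

    // Hypothetical lyrics lookup against a third-party provider.
    func fetchLyrics(appleMusicID: String, completion: @escaping (String?) -> Void) {
        // Query your lyrics service here.
        completion(nil)
    }
}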
Posted. Last updated.
Post not yet marked as solved
0 Replies
219 Views
Hello, I have a song on Apple Music. When I search for this song on Shazam, I want it to appear with a clip like the example linked below. Is there any way you can help with this? Example: https://www.youtube.com/watch?v=St8smx2q1Ho My music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790 Thanks.
Posted by yasirb_. Last updated.
Post marked as solved
1 Reply
340 Views
I am trying to extract the audio file URL from ShazamKit. It is deep inside the hierarchy SHMediaItem > songs > previewAssets > url. When I access the URL like this:

let url = firstItem.songs[0].previewAssets?[0].url

I am getting a warning like this: here is the Variable Viewer. This is what I have done so far:

struct MediaItems: Codable {
    let title: String?
    let subtitle: String?
    let shazamId: String?
    let appleMusicId: String?
    let appleMusicUrL: URL?
    let artworkUrl: URL?
    let artist: String?
    let matchOffset: TimeInterval?
    let videoUrl: URL?
    let webUrl: URL?
    let genres: [String]
    let isrc: String?
    let songs: [Song]?
}

extension SwiftFlutterShazamKitPlugin: SHSessionDelegate {
    public func session(_ session: SHSession, didFind match: SHMatch) {
        let mediaItems = match.mediaItems
        if let firstItem = mediaItems.first {
            // extracting the url
            let url = firstItem.songs[0].previewAssets?[0].url
            let _shazamMedia = MediaItems(
                title: firstItem.title!,
                subtitle: firstItem.subtitle!,
                shazamId: firstItem.shazamID!,
                appleMusicId: firstItem.appleMusicID!,
                appleMusicUrL: firstItem.appleMusicURL!,
                artworkUrl: firstItem.artworkURL!,
                artist: firstItem.artist!,
                matchOffset: firstItem.matchOffset,
                videoUrl: firstItem.videoURL!,
                webUrl: firstItem.webURL!,
                genres: firstItem.genres,
                isrc: firstItem.isrc!,
                songs: firstItem.songs
            )
            do {
                let jsonData = try JSONEncoder().encode([_shazamMedia])
                let jsonString = String(data: jsonData, encoding: .utf8)!
                self.callbackChannel?.invokeMethod("matchFound", arguments: jsonString)
            } catch {
                callbackChannel?.invokeMethod("didHasError", arguments: "Error when trying to format data, please try again")
            }
        }
    }
}
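Since previewAssets and url are optional layers in MusicKit, one sketch of a safer extraction chains first? instead of indexing and force unwrapping; previewURL(for:) is just an illustrative helper name, and it assumes the songs property bridges to MusicKit Song values as in the code above:

import ShazamKit
import MusicKit

// A minimal sketch: pull the preview URL out of the matched item without force unwrapping.
// firstItem would be match.mediaItems.first from the delegate callback above.
func previewURL(for firstItem: SHMatchedMediaItem) -> URL? {
    // previewAssets and url are optional, so chain rather than index into them.
    return firstItem.songs.first?.previewAssets?.first?.url
}

The same pattern (nil-coalescing or optional binding instead of !) would also avoid crashes when title, artist, or the other fields are missing from a match.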
Posted. Last updated.
Post not yet marked as solved
0 Replies
365 Views
We're looking to integrate ShazamKit, but can't find any details of the associated costs. Are there fees or rate limits for matching? And is attribution to the matched song on Apple Music required? Thank you.
Posted by _Jay. Last updated.
Post not yet marked as solved
1 Reply
1.1k Views
We are using the ShazamKit SDK for Android and our application sometimes crashes when performing audio recognition. We get the following logs:

Cause: null pointer dereference
backtrace:
#00 pc 000000000000806c /data/app/lib/arm64/libsigx.so (SHAZAM_SIGX::reset()) (BuildId: 40e0b3c4250b21f23f7c4ec7d7b88f954606d914)
#01 pc 00000000000dc324 /data/app//oat/arm64/base.odex
at libsigx.SHAZAM_SIGX::reset()(reset:0)
at base.0xdc324(Native Method)
Posted by guerwan. Last updated.
Post not yet marked as solved
0 Replies
398 Views
ShazamKit's SHManagedSession() doesn't work on macOS 14 RC (23A339). Error output:

AddInstanceForFactory: No factory registered for id <CFUUID 0x600000540340> F8BB1C28-BAE8-11D6-9C31-00039315CD46
HALC_ShellDevice.cpp:2,609 HALC_ShellDevice::RebuildControlList: couldn't find the control object
Prepare call ignored, the caller does not have record permission
Error: The operation couldn't be completed. (com.apple.ShazamKit error 202.)
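Because the log complains that the caller lacks record permission, a plausible first check is that the app declares NSMicrophoneUsageDescription (and, for a sandboxed macOS app, the audio input entitlement) and has actually been granted microphone access before the managed session runs. A minimal sketch under those assumptions, using the macOS 14/iOS 17 SHManagedSession API; recognizeOnce() is just an illustrative name:

import AVFoundation
import ShazamKit

// Ask for microphone access first, then let SHManagedSession prepare and match.
func recognizeOnce() async {
    let granted = await AVCaptureDevice.requestAccess(for: .audio)
    guard granted else {
        print("Microphone access denied; ShazamKit cannot record")
        return
    }

    let managedSession = SHManagedSession()
    await managedSession.prepare()          // pre-warms the audio engine
    let result = await managedSession.result()

    switch result {
    case .match(let match):
        print("Matched:", match.mediaItems.first?.title ?? "unknown")
    case .noMatch:
        print("No match")
    case .error(let error, _):
        print("ShazamKit error:", error)
    }
}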
Posted by Ruizhe. Last updated.
Post not yet marked as solved
1 Reply
473 Views
I want to use SHLibrary.default.items to show the music I recognized with Shazam, but SHLibrary.default.items always returns an empty list. I did an experiment: I called SHLibrary.default.items as soon as I entered a page and it returned an empty list, but after using SHManagedSession to identify songs and then calling SHLibrary.default.items, it returned the result I wanted. Below is the test code:

private func bindEvent() {
    // Called when the view is created; items returns an empty list.
    if #available(iOS 17, *) {
        let items = SHLibrary.default.items
        print("-------->>>>>>>>\(items)")
    }
    self.addToMediaLibray.onTap { [weak self] in
        guard let `self` = self,
              let result = self.result,
              let appleMusicID = result.appleMusicID else {
            return
        }
        if #available(iOS 17, *) {
            // Called after music was recognized; items is not empty.
            let items = SHLibrary.default.items
            print("1111-------->>>>>>>>\(items)")
        }
    }
}

The attached file is part of the result log. My iOS version is iOS 17 (21A5326a) and my Xcode version is 15.0 beta 8 (15A5229m).
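For what it's worth, the test above only reads SHLibrary.default.items once, at view creation, when the synced library snapshot may still be empty. A sketch that re-reads it after a recognition has completed, matching the behaviour described in the post (iOS 17, SHManagedSession assumed; recognizeThenReadLibrary() is just an illustrative name):

import ShazamKit

@available(iOS 17.0, *)
func recognizeThenReadLibrary() async {
    let session = SHManagedSession()

    // Before any recognition the library snapshot may well be empty.
    print("Before match:", SHLibrary.default.items.count)

    let result = await session.result()
    if case .match(let match) = result {
        print("Matched:", match.mediaItems.first?.title ?? "unknown")
    }

    // Read the library again once the session has done its work.
    print("After match:", SHLibrary.default.items.count)
}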
Posted by tbfungeek. Last updated.
Post not yet marked as solved
2 Replies
955 Views
In the video, there is a demonstration of the shazam tool. Where do I download it from? What is the process of getting it on macOS?
Posted by greenpau. Last updated.
Post not yet marked as solved
3 Replies
738 Views
https://developer.apple.com/documentation/shazamkit/shazamkit_dance_finder_with_managed_session The song detection is successful; however, with the new APIs I can't get this demo working with SHLibrary, which is expected to display the RecentDanceRowView. I wonder if I missed any steps or if SHLibrary is not ready yet.
Posted by Ruizhe. Last updated.
Post not yet marked as solved
0 Replies
410 Views
Looking to fill in a budget line for an iOS application we are trying to build. By adding ShazamKit to our app, how much does it cost per stream? My research indicates that it is $0.00065. Is that correct?
Posted by utrema. Last updated.
Post not yet marked as solved
0 Replies
401 Views
I'm trying to get ShazamKit for Android to work. I have a catalog that I am downloading from an external service, caching in internal app storage and reading from internal app storage. Doing so I'm getting no matches. However, if I manually download the file from internal app storage to my computer, put it in the assets folder and read it from there, I'm getting matches. So the issue must be in the reading of the file. See comments in the code below. Here's the code:

private const val BUFFER_SIZE = 3840

class ShazamService(private val app: Application) {
    private val coroutineScope = CoroutineScope(Dispatchers.IO + Job())
    private val repository = ShazamRepository(...)
    private val catalog = ShazamKit.createCustomCatalog()
    private val recorder by lazy { AudioRecording(app) }
    private var session: StreamingSession? = null

    suspend fun initialize(source: Source) {
        // This method does not work
        addCatalog(source)
        // This works when used
        // loadCustomCatalog()
        session = (ShazamKit.createStreamingSession(
            catalog,
            AudioSampleRateInHz.SAMPLE_RATE_48000,
            BUFFER_SIZE
        ) as ShazamKitResult.Success).data
        session?.recognitionResults()?.onEach { matchResult ->
            onMatch(matchResult)
        }?.flowOn(Dispatchers.Main)?.launchIn(coroutineScope)
    }

    // This works
    private suspend fun loadCustomCatalog() {
        val assetManager = app.assets
        val inputStream: InputStream = assetManager.open("catalog.shazamcatalog")
        catalog.addFromCatalog(inputStream)
    }

    // This does not work
    private suspend fun addCatalog(source: Source) {
        repository.loadFile(app.applicationContext, source)?.use { data ->
            val result = this.catalog.addFromCatalog(data)
            Timber.d("Catalog added: $result")
        }
    }

    fun start() {
        recorder.startRecording { data ->
            session?.matchStream(data, data.size, 0)
        }
    }

    fun stop() {
        recorder.stopRecording()
    }

    private fun onMatch(result: com.shazam.shazamkit.MatchResult) {
        Timber.d("Received MatchResult: $result")
    }

    fun destroy() {
        coroutineScope.cancel()
    }
}

Here's the repository responsible for providing the catalog file. Source contains an id and a url from which a catalog can be downloaded. It downloads the catalog, saves it as a file in internal app storage, and returns a FileInputStream.

class ShazamRepository(
    private val shazamClient: ShazamClient
) {
    suspend fun loadFile(context: Context, source: Source): FileInputStream? {
        val file = File(context.filesDir, source.id + ".shazamcatalog")
        val catalog = loadFile(file)
        if (catalog == null) {
            val response = shazamClient.getCatalog(source.url)
            if (response.isSuccessful) {
                response.body()?.let { data ->
                    saveResponseData(data, file)
                }
            }
        } else {
            return catalog
        }
        return loadFile(file)
    }

    private fun saveResponseData(data: ResponseBody, file: File) {
        data.byteStream().use { inputStream ->
            FileOutputStream(file).use { outputStream ->
                val buffer = ByteArray(4 * 1024)
                var read: Int
                while (inputStream.read(buffer).also { read = it } != -1) {
                    outputStream.write(buffer, 0, read)
                }
                outputStream.flush()
            }
            inputStream.close()
        }
    }

    private fun loadFile(file: File): FileInputStream? {
        return if (file.exists()) {
            FileInputStream(file)
        } else {
            return null
        }
    }
}

To summarise: the catalog is downloaded and saved correctly. If the catalog file is opened with assetManager.open I'm getting matches. When using FileInputStream(file) no matches are received. What could be wrong with the File API approach? Why does it work when using the AssetManager but not when reading it as a File?
Posted by adpal. Last updated.
Post not yet marked as solved
0 Replies
388 Views
ShazamKit is integrated into my music app, but some users have reported issues with the music recognition function not working for them. I am wondering if ShazamKit's functionality could be limited in certain countries or regions. Any insights on this matter would be greatly appreciated.
Posted by Denis_M. Last updated.
Post not yet marked as solved
1 Reply
534 Views
I am trying to match microphone audio against a custom catalog that I created via ShazamKit. What is the code for extracting and displaying the "matchedMediaItem.artist" information on my iPhone screen after finding a song match with an entry of my custom-built catalog? I am a beginner.
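A minimal sketch of one way to do this, assuming a custom catalog already loaded into an SHCustomCatalog and a SwiftUI view; CatalogMatcher and ArtistView are just illustrative names, and the delegate publishes the matched artist so the view can show it:

import SwiftUI
import AVFAudio
import ShazamKit

final class CatalogMatcher: NSObject, ObservableObject, SHSessionDelegate {
    @Published var artist: String = ""
    private let session: SHSession

    init(catalog: SHCustomCatalog) {
        self.session = SHSession(catalog: catalog)
        super.init()
        session.delegate = self
    }

    // Feed microphone buffers here from your AVAudioEngine input tap.
    func match(buffer: AVAudioPCMBuffer, at time: AVAudioTime?) {
        session.matchStreamingBuffer(buffer, at: time)
    }

    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async {
            // artist is optional on SHMediaItem, so fall back to a placeholder.
            self.artist = match.mediaItems.first?.artist ?? "Unknown artist"
        }
    }
}

struct ArtistView: View {
    @ObservedObject var matcher: CatalogMatcher

    var body: some View {
        Text(matcher.artist)
    }
}

Setting up the microphone tap itself (AVAudioEngine, record permission) is a separate step; the sketch only covers getting the artist from the match onto the screen.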
Posted. Last updated.
Post not yet marked as solved
2 Replies
556 Views
Hi, I am using ShazamKit to detect songs from a live stream. I am using matchStreamingBuffer with a PCMBuffer. It looks like it works for the most part, but sometimes it throws an NSException. Here's the code calling the match:

engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: options.audioFormat) { buffer, time in
    do {
        self.session.matchStreamingBuffer(buffer, at: time)
    } catch {
    }
}

The exception:

Supplied audio format is not supported <CMAudioFormatDescription 0x2828a29e0 [0x20f7863a0]> {
    mediaType:'soun'
    mediaSubType:'lpcm'
    mediaSpecific: {
        ASBD: {
            mSampleRate: 44100.000000
            mFormatID: 'lpcm'
            mFormatFlags: 0x29
            mBytesPerPacket: 4
            mFramesPerPacket: 1
            mBytesPerFrame: 4
            mChannelsPerFrame: 2
            mBitsPerChannel: 32
        }
        cookie: {(null)}
        ACL: {Stereo (L R)}
        FormatList Array: {
            Index: 0
            ChannelLayoutTag: 0x650002
            ASBD: {
                mSampleRate: 44100.000000
                mFormatID: 'lpcm'
                mFormatFlags: 0x29
                mBytesPerPacket: 4
                mFramesPerPacket: 1
                mBytesPerFrame: 4
                mChannelsPerFrame: 2
                mBitsPerChannel: 32
            }
        }
    }
    extensions: {(null)}
}

This is the stack:

0 CoreFoundation 0xa248 __exceptionPreprocess
1 libobjc.A.dylib 0x17a68 objc_exception_throw
2 ShazamKit 0x159d0 -[SHMutableSignature appendBuffer:atTime:error:]
3 ShazamKit 0x6d7c -[SHSignatureGenerator appendBuffer:atTime:error:]
4 ShazamKit 0x3968 -[SHSessionDriverSignatureSlot appendBuffer:atTime:error:]
5 ShazamKit 0x10430 -[SHSignatureBuffer flow:time:]
6 ShazamKit 0x2490 -[SHStreamingSessionDriver flow:time:]
7 ShazamKit 0xf784 -[SHSession matchStreamingBuffer:atTime:]
8 MyApp 0x17f69c thunk for @escaping @callee_guaranteed (@guaranteed AVAudioPCMBuffer, @guaranteed AVAudioTime) -> () (<compiler-generated>)
9 AVFAudio 0x482ac AVAudioNodeTap::TapMessage::RealtimeMessenger_Perform()
10 AVFAudio 0x71c4 CADeprecated::RealtimeMessenger::_PerformPendingMessages()
11 AVFAudio 0x471e4 invocation function for block in CADeprecated::RealtimeMessenger::RealtimeMessenger(applesauce::dispatch::v1::queue)

I don't mind failing if the format is not good, but how can I avoid crashing?
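Since the exception fires inside matchStreamingBuffer when it rejects the tap's format, one defensive sketch is to convert each buffer with AVAudioConverter to a format ShazamKit is more likely to accept before handing it to the session. The 48 kHz mono float target used here is an assumption to verify against the documentation, and ConvertingMatcher is just an illustrative name:

import AVFAudio
import ShazamKit

final class ConvertingMatcher {
    let session = SHSession()
    // Assumed-safe target format; adjust if the docs list different supported formats.
    private let targetFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!
    private var converter: AVAudioConverter?

    func match(buffer: AVAudioPCMBuffer, at time: AVAudioTime?) {
        // Rebuild the converter whenever the incoming format changes.
        if converter == nil || converter?.inputFormat != buffer.format {
            converter = AVAudioConverter(from: buffer.format, to: targetFormat)
        }
        guard let converter else { return }

        let ratio = targetFormat.sampleRate / buffer.format.sampleRate
        let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio) + 1
        guard let converted = AVAudioPCMBuffer(pcmFormat: targetFormat, frameCapacity: capacity) else { return }

        var fed = false
        var error: NSError?
        let status = converter.convert(to: converted, error: &error) { _, outStatus in
            // Supply the captured buffer exactly once per conversion pass.
            if fed {
                outStatus.pointee = .noDataNow
                return nil
            }
            fed = true
            outStatus.pointee = .haveData
            return buffer
        }

        if status == .haveData && error == nil {
            // Pass nil for the time, since the converted buffer no longer lines up with the tap's timestamp.
            session.matchStreamingBuffer(converted, at: nil)
        }
    }
}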
Posted by wotson. Last updated.
Post not yet marked as solved
3 Replies
1.3k Views
Hi, I'm trying to convert a stream into a PCMBuffer and then use Shazam to match. Shazam always fails to match. I have a theory it "listens" to the playback at double speed or more. Starts from here:

...
let format = audioEngine.outputNode.inputFormat(forBus: 0)
guard let pcmBuffer = format.toPCMBuffer(frame: currentFrame) else {
    return
}
session.matchStreamingBuffer(pcmBuffer, at: nil)

Where toPCMBuffer is:

extension AVAudioFormat {
    func toPCMBuffer(frame: AudioFrame) -> AVAudioPCMBuffer? {
        guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: self, frameCapacity: UInt32(frame.dataWrap.size[0]) / streamDescription.pointee.mBytesPerFrame) else {
            return nil
        }
        pcmBuffer.frameLength = pcmBuffer.frameCapacity
        for i in 0 ..< min(Int(pcmBuffer.format.channelCount), frame.dataWrap.size.count) {
            frame.dataWrap.data[i]?.withMemoryRebound(to: Float.self, capacity: Int(pcmBuffer.frameCapacity)) { srcFloatsForChannel in
                pcmBuffer.floatChannelData?[i].assign(from: srcFloatsForChannel, count: Int(pcmBuffer.frameCapacity))
            }
        }
        return pcmBuffer
    }
}

AudioFrame is:

final class AudioFrame: MEFrame {
    var timebase = Timebase.defaultValue
    var duration: Int64 = 0
    var size: Int64 = 0
    var position: Int64 = 0
    var numberOfSamples = 0
    let dataWrap: ByteDataWrap

    public init(bufferSize: Int32, channels: Int32) {
        dataWrap = ObjectPool.share.object(class: ByteDataWrap.self, key: "AudioData_\(channels)") { ByteDataWrap() }
        if dataWrap.size[0] < bufferSize {
            dataWrap.size = Array(repeating: Int(bufferSize), count: Int(channels))
        }
    }
    ...
}

and MEFrame is:

extension MEFrame {
    public var seconds: TimeInterval { cmtime.seconds }
    public var cmtime: CMTime { timebase.cmtime(for: position) }
}
Posted by wotson. Last updated.
Post not yet marked as solved
0 Replies
566 Views
I am trying to identify songs using ShazamKit. Every time it runs, this is the output:

2023-04-26 04:39:28.947059-0400 shazoom[38003:3098314] [AQ] AudioQueueObject.cpp:2364 Error (-4) getting reporterIDs
Processing chunk 1 from 0.0 to 60.0: 4316113920
Chunk saved at: file:///Users/youssefhemimy/Downloads/beyoncee.mp3
No match found
2023-04-26 04:39:29.483659-0400 shazoom[38003:3098339] [AQ] AudioQueueObject.cpp:2364 Error (-4) getting reporterIDs
Processing chunk 2 from 60.0 to 120.0: 4317271680
Chunk saved at: file:///Users/youssefhemimy/Downloads/beyoncee.mp3
No match found
2023-04-26 04:39:29.965718-0400 shazoom[38003:3098343] [AQ] AudioQueueObject.cpp:2364 Error (-4) getting reporterIDs
Processing chunk 3 from 120.0 to 180.0: 4317324608
Chunk saved at: file:///Users/youssefhemimy/Downloads/beyoncee.mp3
No match found

Note: all the songs I have used have been identified by the Shazam app. I have tried different songs and re-exported them from Logic Pro X, but no luck. It also takes less than 2 seconds to generate this output, which is way faster than the app. I have tried to use:

var minimumQuerySignatureDuration: TimeInterval { get }

but no luck. Below is the full code:

import Foundation
import AVFoundation
import ShazamKit

class ShazamAnalyzer: NSObject, SHSessionDelegate {
    let session = SHSession()

    func analyzeAudioFile(url: URL) {
        let asset = AVAsset(url: url)
        let assetDuration = asset.duration.seconds
        let chunkDuration = 30.0
        let numberOfChunks = Int(ceil(assetDuration / chunkDuration))
        session.delegate = self
        for i in 0..<numberOfChunks {
            let startTime = Double(i) * chunkDuration
            let start = CMTime(seconds: startTime, preferredTimescale: asset.duration.timescale)
            let end = CMTime(seconds: min(startTime + chunkDuration, assetDuration), preferredTimescale: asset.duration.timescale)
            let range = CMTimeRange(start: start, end: end)
            if let signature = generateSignature(from: asset, for: range) {
                print("Processing chunk \(i + 1) from \(start.seconds) to \(end.seconds): \(signature.hash)")
                print("Chunk saved at: \(url.absoluteString)")
                session.match(signature)
            } else {
                print("Error generating signature for chunk \(i + 1)")
            }
        }
    }

    func generateSignature(from asset: AVAsset, for timeRange: CMTimeRange) -> SHSignature? {
        let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
        let signatureGenerator = SHSignatureGenerator()
        let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough)!
        exportSession.timeRange = timeRange
        let reader = try? AVAssetReader(asset: exportSession.asset)
        let audioOutput = AVAssetReaderAudioMixOutput(audioTracks: exportSession.asset.tracks(withMediaType: .audio), audioSettings: nil)
        audioOutput.alwaysCopiesSampleData = false
        reader?.add(audioOutput)
        reader?.startReading()
        while let sampleBuffer = audioOutput.copyNextSampleBuffer() {
            if let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) {
                let length = CMBlockBufferGetDataLength(blockBuffer)
                var data = Data(count: length)
                _ = data.withUnsafeMutableBytes { (bytes: UnsafeMutableRawBufferPointer) in
                    CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: length, destination: bytes.baseAddress!)
                }
                let audioBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat!, frameCapacity: AVAudioFrameCount(length) / audioFormat!.streamDescription.pointee.mBytesPerFrame)!
                guard let channelData = audioBuffer.floatChannelData else {
                    print("Error: Channel data is nil")
                    return nil
                }
                let channels = UnsafeBufferPointer(start: channelData, count: Int(audioFormat!.channelCount))
                let destinationBuffer = UnsafeMutableBufferPointer(start: channels[0], count: length)
                _ = data.copyBytes(to: destinationBuffer)
                do {
                    try signatureGenerator.append(audioBuffer, at: nil)
                } catch {
                    print("Error appending buffer to signature generator: \(error)")
                    return nil
                }
            }
        }
        return signatureGenerator.signature()
    }

    func session(_ session: SHSession, didFind match: SHMatch) {
        if let matchedItem = match.mediaItems.first {
            print("Match found: \(matchedItem.title ?? "Unknown Title") by \(matchedItem.artist ?? "Unknown Artist")")
        } else {
            print("Match found, but unable to retrieve media item details.")
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        print("No match found")
    }
}

let filePath = "/Users/youssefhemimy/Downloads/beyoncee.mp3" // Replace this with the actual path to your audio file
if let fileURL = URL(string: "file://" + filePath) {
    let analyzer = ShazamAnalyzer()
    analyzer.analyzeAudioFile(url: fileURL)
} else {
    print("Invalid file path")
}
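As a cross-check on the signature-generation step, the asset-based API that ShazamKit gained around iOS 16 can build a signature directly from the file, which removes the manual AVAssetReader/PCM copying above. A minimal sketch, treating the exact availability of SHSignatureGenerator.signature(from:) as something to confirm in the docs; FileMatcher is just an illustrative name and reuses the same delegate callbacks as the code in the post:

import AVFoundation
import ShazamKit

final class FileMatcher: NSObject, SHSessionDelegate {
    let session = SHSession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Generate the signature from the asset in one call, then match it as before.
    func matchFile(at url: URL) async {
        do {
            let signature = try await SHSignatureGenerator.signature(from: AVAsset(url: url))
            session.match(signature)   // results arrive via the delegate callbacks below
        } catch {
            print("Could not generate signature:", error)
        }
    }

    func session(_ session: SHSession, didFind match: SHMatch) {
        print("Match found:", match.mediaItems.first?.title ?? "Unknown Title")
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        print("No match found:", error?.localizedDescription ?? "no error")
    }
}

If that matches where the manual chunking does not, the problem most likely sits in the hand-rolled PCM copy (for example, the buffer's frameLength is never set in the original code), not in the catalog lookup itself.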
Posted by youssefh4. Last updated.
Post not yet marked as solved
0 Replies
556 Views
Hi, Is there an app limit or user limit for song matches with ShazamKit? I need to know whether I need to limit access in my app. If there is a limit, is it per user or across all calls made by the app? I am using matchStreamingBuffer continuously, so it is called every few seconds (with the same match).
Posted by wotson. Last updated.