ShazamKit


Get exact audio matching for any audio source using the Shazam catalog or a custom catalog in an app.

ShazamKit Documentation

Posts under ShazamKit tag

32 Posts
Post not yet marked as solved
0 Replies
361 Views
We are trying to implement ShazamKit music recognition in our app. Everything works great, but SHSession returns track information transliterated into English, while the web URL shows the correct language when opened. The device's region and language are set to Russian.
Posted by
Post not yet marked as solved
1 Reply
349 Views
Hello good morning lovely ShazamKit team! I was wondering if there's a way to evaluate the result (SHMatch) to be able to estimate if this is a very good match (the catalog is very certain that this is correct), or if this is at the lower end of the spectrum of certainty. Thanks a lot and have a nice day! – Frederik
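As far as I can tell, SHMatch does not expose a numeric confidence score, but each SHMatchedMediaItem carries matchOffset and frequencySkew, which can serve as rough proxies for match quality. A hedged sketch of inspecting them in the delegate callback:

```swift
import ShazamKit

// Sketch: inspect per-item properties as weak quality signals.
// There is no official "certainty" value; this is an approximation.
func session(_ session: SHSession, didFind match: SHMatch) {
    for item in match.mediaItems {
        // frequencySkew near 0 means the query audio's pitch closely
        // tracked the reference; larger values suggest a shakier match.
        print("title: \(item.title ?? "?"), offset: \(item.matchOffset) s, skew: \(item.frequencySkew)")
    }
}
```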
Posted by
Post not yet marked as solved
0 Replies
388 Views
Hello Team, hope you are doing well! I am developing a music recognition app using ShazamKit, and I get a MATCH_ATTEMPT_FAILED error every time. How can I fix this issue? Error log (signature byte dump abridged):

Error(exception=com.shazam.shazamkit.ShazamKitMatchException: MATCH_ATTEMPT_FAILED, querySignature=Signature(dataRepresentation=[-128, 37, -128, 37, 1, 0, 0, 0, 12, 0, 0, 0, -128, 37, -2, -54, -100, -90, 21, 112, -36, 1, 0, 0, 0, -100, 32, -108, …], durationInMs=1297, audioStartTimestamp=b.a.a.a.d@9c5d28e0))
Posted by
Post not yet marked as solved
1 Reply
486 Views
I'm learning how to work with ShazamKit on Xcode 13 beta 4. In this video, https://www.youtube.com/watch?v=KvyQvZYqGL0&list=PL5PR3UyfTWveeb0iCmauh7boDnpCmpMed, at 2:33 the presenter says that I need to enable the Shazam service if I want ShazamKit to work properly. Where do I need to go? How do I find this page?
Posted by
Post not yet marked as solved
1 Reply
608 Views
I'm trying to test ShazamKit on Android. In the following Kotlin snippet, I load apple-test-wav.wav (a 9-second, 48 kHz, 16-bit PCM WAV snippet of the Food Math example video) and attempt to match it against the FoodMath.shazamsignature custom catalog. The audio is not captured via a microphone; it's a clean sample taken directly from the video, and I can't get it to match. The output is always NoMatch. Am I doing something obviously wrong here?

override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)
    CoroutineScope(Dispatchers.IO).launch { searchCatalog() }
}

private suspend fun searchCatalog() {
    val file = context?.assets?.open("apple-test-wav.wav")
    if (file != null) {
        val bytes = file.readBytes()
        file.close()
        val signatureGenerator = (ShazamKit.createSignatureGenerator(AudioSampleRateInHz.SAMPLE_RATE_48000) as ShazamKitResult.Success).data
        signatureGenerator.append(bytes, bytes.size, System.currentTimeMillis())
        val signature = signatureGenerator.generateSignature()
        val inputStream: InputStream? = context?.assets?.open("FoodMath.shazamsignature")
        if (inputStream != null) {
            val catalog = ShazamKit.createCustomCatalog().apply { addFromCatalog(inputStream) }
            val session = (ShazamKit.createSession(catalog) as ShazamKitResult.Success).data
            val matchResult: MatchResult = session.match(signature)
            println(matchResult.toString())
        } else {
            println("no input stream")
        }
    }
}
Posted by
Post not yet marked as solved
1 Reply
675 Views
Hi, I want to implement ShazamKit in my project, but I have some problems. I use AVCaptureSession to take photos in my app, and I'm unable to use ShazamKit alongside it. I tried three different approaches:
1. Use an AVAudioEngine during my AVCaptureSession, but I didn't obtain any result from Shazam.
2. Use ShazamKit after stopping my AVCaptureSession, but this causes some problems and some crashes.
3. Use the buffer of my AVCaptureSession to capture audio directly, without using AVAudioEngine.
This is the code that I use with AVAudioEngine:

try! audioSession.setActive(true, options: .notifyOthersOnDeactivation)
let inputNode = self.audioEngine.inputNode
let recordingFormat = inputNode.outputFormat(forBus: 0)
let audioFormat = recordingFormat

inputNode.installTap(onBus: 0, bufferSize: 1024, format: audioFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
    try! self.signatureGenerator.append(buffer, at: nil)
    self.session.matchStreamingBuffer(buffer, at: nil)
}

self.audioEngine.prepare()
try! self.audioEngine.start()

So I have two questions:
1. Can I use a CMSampleBufferRef from the AVCaptureSession buffer in an SHSession? And if the answer is yes, how?
2. How can I prevent this error if I want to use an AVAudioSession after I have stopped my AVCaptureSession?

[aurioc] AURemoteIO.cpp:1117 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)
[avae] AVAEInternal.h:76 required condition is false: [AVAEGraphNode.mm:834:CreateRecordingTap: (IsFormatSampleRateAndChannelCountValid(format))]
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'

Thanks
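On the first question, one plausible approach is to attach an AVCaptureAudioDataOutput to the same capture session and convert each incoming CMSampleBuffer into an AVAudioPCMBuffer before handing it to SHSession.matchStreamingBuffer(_:at:). A rough, untested sketch (the helper name pcmBuffer(from:) is made up for illustration):

```swift
import AVFoundation
import ShazamKit

// Sketch: convert an audio CMSampleBuffer from an AVCaptureAudioDataOutput
// callback into an AVAudioPCMBuffer suitable for SHSession.
func pcmBuffer(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {
    guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription),
          let format = AVAudioFormat(streamDescription: asbd) else { return nil }

    let frameCount = AVAudioFrameCount(CMSampleBufferGetNumSamples(sampleBuffer))
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    buffer.frameLength = frameCount

    // Copy the sample buffer's PCM data into the buffer's AudioBufferList.
    let status = CMSampleBufferCopyPCMDataIntoAudioBufferList(
        sampleBuffer,
        at: 0,
        frameCount: Int32(frameCount),
        into: buffer.mutableAudioBufferList)
    return status == noErr ? buffer : nil
}
```

In the captureOutput(_:didOutput:from:) delegate method you would then call session.matchStreamingBuffer(buffer, at: nil) with the converted buffer. Whether SHSession accepts the capture session's native format is an assumption worth verifying; a resample to a supported PCM format may be needed.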
Posted by
Post not yet marked as solved
1 Reply
725 Views
I'm creating a simple song-matching application. My steps are:
1. Create a project in Xcode and set the bundle ID.
2. Select my team.
3. Create an identifier in my developer account, entering the bundle ID from Xcode.
4. Select my team's App ID.
5. Turn on ShazamKit under App Services.
And after all that, I still get a 202 error:
Error Domain=com.apple.ShazamKit Code=202 "Please check that you have enabled the ShazamKit App Service for this app identifier" UserInfo={NSDebugDescription=Please check that you have enabled the ShazamKit App Service for this app identifier}
What am I doing wrong? Could it have something to do with the code?
Posted by
Post not yet marked as solved
2 Replies
803 Views
In continuation of this question, I am now able to use ShazamKit in my app, but for every video I pass to it, the didNotFindMatchFor delegate method is always called:

func session(_ session: SHSession,
             didNotFindMatchFor signature: SHSignature,
             error: Error?) {
    DispatchQueue.main.async {
        print("match error: \(String(describing: error))")
    }
}

with the following output: match error: nil. This happens even with videos that are just screen-recorded clips of Ed Sheeran, for example, something that Shazam would easily recognize. Any idea why that happens? What should I investigate? (The full code is also updated on my GitHub.)
Posted by
Post not yet marked as solved
1 Reply
593 Views
The operation couldn’t be completed. (com.apple.ShazamKit error 300.) I get this error when I try to add multiple Shazam catalogs to a single SHCustomCatalog. The fact that the operation is called add(from:) and not 'load' suggests that it's meant to be used that way. Am I wrong? And yes, both work if I just add one at a time.

let catalog = SHCustomCatalog()
do {
    try catalog.add(from: url1)
    try catalog.add(from: url2)
    {...}
} catch {...}

Also, fun fact: I was working on something ShazamKit-like myself before dubDub, and it's a really fascinating topic. So awesome to see how well it works, great job! 👏
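One possible workaround, sketched under the assumption that the source files are individual .shazamsignature files (not merged .shazamcatalog files): load each signature's raw bytes yourself and register them one by one with addReferenceSignature(_:representing:). The URL list and the title derivation here are hypothetical.

```swift
import ShazamKit

// Sketch: build a single combined catalog from several raw signature
// files instead of calling add(from:) repeatedly with catalog files.
func makeCombinedCatalog(signatureURLs: [URL]) throws -> SHCustomCatalog {
    let catalog = SHCustomCatalog()
    for url in signatureURLs {
        let data = try Data(contentsOf: url)
        let signature = try SHSignature(dataRepresentation: data)
        // Pair each signature with a media item; here the file name
        // stands in for a real title.
        let mediaItem = SHMediaItem(properties: [
            .title: url.deletingPathExtension().lastPathComponent
        ])
        try catalog.addReferenceSignature(signature, representing: [mediaItem])
    }
    return catalog
}
```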
Post marked as solved
3 Replies
855 Views
I was trying to generate a signature from an audio file on disk. Loading the files works; however, generating signatures from them doesn't seem to. Here's the code I'm using to generate the signature:

func signature(from asset: AVAudioFile) -> SHSignature? {
    let format = asset.processingFormat
    let frameCount = AVAudioFrameCount(asset.length)

    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    let generator = SHSignatureGenerator()

    do {
        try asset.read(into: buffer)
        let sampleRate = asset.processingFormat.sampleRate
        try generator.append(buffer, at: AVAudioTime(sampleTime: .zero, atRate: sampleRate))
        return generator.signature()
    } catch {
        print(error)
        return nil
    }
}

No errors are thrown, but when I try to use the signatures in an SHSession, the audio is not recognized. Using the Shazam library works, however. The only examples I've found are for generating signatures from live audio input. Is there something I'm missing in the code above?
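One variation sometimes suggested for file-based signatures is to append the audio in small chunks rather than one file-sized buffer, mirroring how streaming input arrives. A hedged sketch under that assumption (the 44,100-frame chunk size is arbitrary, roughly one second at 44.1 kHz):

```swift
import AVFoundation
import ShazamKit

// Sketch: generate a signature by feeding the file to the generator
// in ~1 s chunks instead of a single large buffer.
func chunkedSignature(from file: AVAudioFile) throws -> SHSignature {
    let generator = SHSignatureGenerator()
    let format = file.processingFormat
    let chunkFrames: AVAudioFrameCount = 44_100

    while file.framePosition < file.length {
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: chunkFrames) else { break }
        // read(into:frameCount:) advances framePosition and sets frameLength.
        try file.read(into: buffer, frameCount: chunkFrames)
        if buffer.frameLength == 0 { break }
        try generator.append(buffer, at: nil)
    }
    return generator.signature()
}
```

Whether chunking alone resolves the recognition failure is uncertain; the file's processingFormat may also need converting to a sample rate ShazamKit supports.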
Posted by
Post marked as solved
5 Replies
1.7k Views
I want to create my own custom audio recognition with ShazamKit. When opening the sample project, I found the FoodMath.shazamsignature file. I believe there is a way to generate that file from my own audio collection. How do I create the .shazamsignature file? Thanks.
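A signature file is, as far as I understand it, just the raw bytes of an SHSignature. So one plausible recipe is: generate an SHSignature from your reference audio with SHSignatureGenerator, then write its dataRepresentation to disk. The output URL here is hypothetical.

```swift
import ShazamKit

// Sketch: persist a generated SHSignature as a .shazamsignature file
// by writing its raw data representation.
func writeSignatureFile(_ signature: SHSignature, to outputURL: URL) throws {
    try signature.dataRepresentation.write(to: outputURL)
}
```

For a whole collection, an alternative is to add each signature to an SHCustomCatalog via addReferenceSignature(_:representing:) and save the catalog with write(to:) as a single .shazamcatalog file instead.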
Posted by