Getting MatchError "MATCH_ATTEMPT_FAILED" every time I call matchStream in an Android Studio Java+Kotlin project. My project reads samples from the mic input using the AudioRecord class and sends them to ShazamKit via matchStream. I created a Kotlin class to handle the ShazamKit calls. The AudioRecord instance is configured for mono, 16-bit audio.
My Kotlin class:
class ShazamKitHelper {
    val shazamScope = CoroutineScope(Dispatchers.IO + SupervisorJob())

    lateinit var streamingSession: StreamingSession
    lateinit var signature: Signature
    lateinit var catalog: ShazamCatalog

    fun createStreamingSessionAsync(
        developerTokenProvider: DeveloperTokenProvider,
        readBufferSize: Int,
        sampleRate: AudioSampleRateInHz
    ): CompletableFuture<Unit> {
        return CompletableFuture.supplyAsync {
            runBlocking {
                runCatching {
                    shazamScope.launch {
                        createStreamingSession(developerTokenProvider, readBufferSize, sampleRate)
                    }.join()
                }.getOrThrow()
            }
        }
    }

    private suspend fun createStreamingSession(
        developerTokenProvider: DeveloperTokenProvider,
        readBufferSize: Int,
        sampleRateInHz: AudioSampleRateInHz
    ) {
        catalog = ShazamKit.createShazamCatalog(developerTokenProvider)
        // Note: this cast throws if ShazamKit returns a Failure instead of a Success.
        streamingSession = (ShazamKit.createStreamingSession(
            catalog,
            sampleRateInHz,
            readBufferSize
        ) as ShazamKitResult.Success).data
    }

    fun startMatching() {
        val audioData = sharedAudioData ?: return // Nothing to match yet
        CoroutineScope(Dispatchers.IO).launch {
            runCatching {
                streamingSession.matchStream(
                    audioData.data,
                    audioData.meaningfulLengthInBytes,
                    audioData.timestampInMs
                )
            }.onFailure { throwable ->
                Log.e("ShazamKitHelper", "Error during matchStream", throwable)
            }
        }
    }

    @JvmField
    var sharedAudioData: AudioData? = null

    data class AudioData(val data: ByteArray, val meaningfulLengthInBytes: Int, val timestampInMs: Long)

    fun startListeningForMatches() {
        CoroutineScope(Dispatchers.IO).launch {
            streamingSession.recognitionResults().collect { matchResult ->
                when (matchResult) {
                    is MatchResult.Match -> {
                        val match = matchResult.matchedMediaItems
                        println("Match found: ${match[0].title} by ${match[0].artist}")
                    }
                    is MatchResult.NoMatch -> {
                        println("No match found")
                    }
                    is MatchResult.Error -> {
                        val error = matchResult.exception
                        println("Match error: ${error.message}")
                    }
                }
            }
        }
    }
}
My Java code reads the samples on a background thread:
shazam_create_session();
while (audioRecord.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING) {
    if (shazam_session_created) {
        byte[] buffer = new byte[288000]; // max_shazam_seconds * sampleRate * 2
        audioRecord.read(buffer, 0, buffer.length, AudioRecord.READ_BLOCKING);
        helper.sharedAudioData = new ShazamKitHelper.AudioData(buffer, buffer.length, System.currentTimeMillis());
        helper.startMatching();
        if (!listener_called) {
            listener_called = true;
            helper.startListeningForMatches();
        }
    } else {
        SystemClock.sleep(100);
    }
}
private void shazam_create_session() {
    MyDeveloperTokenProvider provider = new MyDeveloperTokenProvider();
    AudioSampleRateInHz sample_rate = AudioSampleRateInHz.SAMPLE_RATE_48000;
    if (sampleRate == 44100) {
        sample_rate = AudioSampleRateInHz.SAMPLE_RATE_44100;
    }
    CompletableFuture<Unit> future = helper.createStreamingSessionAsync(provider, 288000, sample_rate);
    future.thenAccept(result -> {
        shazam_session_created = true;
    });
    future.exceptionally(throwable -> {
        Toast.makeText(mine, "Failure", Toast.LENGTH_SHORT).show();
        return null;
    });
}
I implemented the developer token provider in Java as follows:
public static class MyDeveloperTokenProvider implements DeveloperTokenProvider {
    DeveloperToken the_token = null;

    @NonNull
    @Override
    public DeveloperToken provideDeveloperToken() {
        if (the_token == null) {
            try {
                the_token = generateDeveloperToken();
            } catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
                throw new RuntimeException(e);
            }
        }
        return the_token;
    }

    @NonNull
    private DeveloperToken generateDeveloperToken() throws NoSuchAlgorithmException, InvalidKeySpecException {
        PKCS8EncodedKeySpec priPKCS8 = new PKCS8EncodedKeySpec(Decoders.BASE64.decode(p8));
        PrivateKey appleKey = KeyFactory.getInstance("EC").generatePrivate(priPKCS8);
        Instant now = Instant.now();
        Instant expiration = now.plus(Duration.ofDays(90));
        String jwt = Jwts.builder()
                .header().add("alg", "ES256").add("kid", keyId).and()
                .issuer(teamId)
                .issuedAt(Date.from(now))
                .expiration(Date.from(expiration))
                .signWith(appleKey)
                .compact();
        return new DeveloperToken(jwt);
    }
}
ShazamKit
Get exact audio matching for any audio source using the Shazam catalog or a custom catalog in an app.
Posts under the ShazamKit tag:
Hi guys,
I'm using ShazamKit in my iOS app and successfully capturing the currently playing track details when using the device's (iPhone) built-in mic.
When I test with AirPods, though, my app cannot both send the output through the AirPods and capture that same output with the AirPods mic for ShazamKit recognition.
I believe this must be possible, because the ShazamKit widget on iOS can do this.
Is it restricted in some way for third-party apps?
If not, I'd appreciate some guidance on how to achieve this in Swift code.
Thanks in advance.
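One thing worth trying is the audio session configuration. The sketch below is an assumption about the cause, not a confirmed fix; the calls are standard AVAudioSession API, and configureSessionForAirPodsCapture is a hypothetical helper name. It requests simultaneous playback and recording and explicitly allows the Bluetooth HFP route that carries the AirPods microphone:
import AVFAudio

// Hypothetical sketch: request simultaneous play + record and allow the
// Bluetooth HFP input route (the AirPods mic).
func configureSessionForAirPodsCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth, .allowBluetoothA2DP])
    try session.setActive(true)
}
Note that while the HFP input route is active, output typically falls back to HFP quality as well; the system Shazam integration may rely on routing that is not exposed to third-party apps.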
Hello,
Using ShazamKit, based on a Shazam catalog result, would it be possible to detect the FPS (speed) of recorded audio?
I'm thinking that a Shazam catalog created from an audio file could be used to compare the speed of a live audio recording against the reference.
Thank you!
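If "speed" here means a uniform speed-up or slow-down (e.g., a 24 to 25 fps pull-up), ShazamKit exposes a related signal: SHMatchedMediaItem.frequencySkew, the fractional difference between the matched audio's frequency and the catalog reference. A minimal sketch of using it as a speed estimate, assuming the speed change is uniform and not pitch-corrected (estimatedSpeedRatio is a hypothetical helper name):
import ShazamKit

// frequencySkew is 0 when the captured audio plays at reference speed;
// a skew of about 0.042 corresponds to roughly 4.2% fast, i.e. a
// 24 -> 25 fps pull-up, assuming no pitch correction was applied.
func estimatedSpeedRatio(of item: SHMatchedMediaItem) -> Double {
    1.0 + Double(item.frequencySkew)
}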
Hello fellow developers,
I am developing a macOS app called "Playlist Plunderer 2," aimed at using AppleScript to control the Music app to play songs and employing ShazamKit to recognize these songs and update their metadata. Despite setting up the entitlements and plist files correctly, I'm encountering issues with gaining the necessary AppleScript permissions, and my app is not appearing under 'Automation' in System Preferences. Additionally, ShazamKit fails to match songs, consistently returning error 201.
Here are the specifics of my setup and what I've tried so far:
Xcode Version: 15.4, macOS 14.1.2
Entitlements Configured: Includes permissions for Apple events, audio input, and scripting targets for the Music app.
Capabilities: ShazamKit and ScriptingBridge frameworks integrated, set to "Do Not Embed."
Info.plist Adjustments: Added "Privacy - Microphone Usage Description."
Scripting: Manual AppleScript commands outside of Xcode succeed, but the app's scripts do not trigger.
Entitlements File:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.automation.apple-events</key>
<true/>
<key>com.apple.security.device.audio-input</key>
<true/>
<key>com.apple.security.files.user-selected.read-only</key>
<true/>
<key>com.apple.security.scripting-targets</key>
<dict>
<key>com.apple.Music</key>
<array>
<string>com.apple.Music.playback</string>
<string>com.apple.Music.library.read-write</string>
</array>
</dict>
</dict>
</plist>
I am having issues controlling the Music app (iTunes) from the AppleScript within my Xcode project. The objective of the app is to rewrite the metadata of songs inside a folder in my Music app titled Playlist Plunderer. The app plays the songs in the playlist, uses ShazamKit to recognize the song that's playing, and then copies the metadata results onto the song in the playlist. I am still in the beginning stages and very new to Xcode (this is my first project). I created an Apple Developer account, paid the $99, and it is active; I added the app's bundle identifier to the account.
My development team is set (Shane Vincent), and the app is set to automatically manage signing. Under App Sandbox, Audio Input is checked; under Hardened Runtime > Resource Access, Audio Input is also checked.
In Build Settings the path to the Info.plist file is correct, and the Info.plist contains a "Privacy - Microphone Usage Description" entry (NSMicrophoneUsageDescription) with the description "This app needs to access the microphone to identify songs using ShazamKit."
The app appears under System Preferences > Security & Privacy > Privacy > Microphone, but not under System Preferences > Security & Privacy > Privacy > Automation.
It is being built on macOS 14.1.2 (23B92) with Xcode 15.4 (15F31d). Under Frameworks, Libraries, and Embedded Content I have added ShazamKit.framework and ScriptingBridge.framework, both set to Do Not Embed.
Current issue: AppleScript fails to authorize with the Music app, and ShazamKit errors suggest an issue with song matching.
Has anyone faced similar challenges or can offer guidance on how to ensure AppleScript and ShazamKit function correctly within a sandboxed macOS app? Any insights into troubleshooting or configuring entitlements more effectively would be greatly appreciated.
Thanks for your help!
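One detail worth checking (a suggestion, not a confirmed diagnosis): macOS only lists an app under Privacy > Automation after it actually sends an Apple event, and the com.apple.security.automation.apple-events entitlement should be paired with an NSAppleEventsUsageDescription string in Info.plist. A minimal Swift sketch that fires a single event at Music to trigger the consent prompt (requestMusicAutomationConsent is a hypothetical helper name):
import Foundation

// Sending any Apple event to Music should trigger the Automation consent
// prompt and make the app appear under Privacy > Automation. Requires the
// apple-events entitlement plus NSAppleEventsUsageDescription in Info.plist.
func requestMusicAutomationConsent() {
    var errorInfo: NSDictionary?
    let script = NSAppleScript(source: "tell application \"Music\" to get name")
    script?.executeAndReturnError(&errorInfo)
    if let errorInfo = errorInfo {
        print("Apple event failed: \(errorInfo)")
    }
}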
Shazam with the iOS 18 developer beta is not working. I've tried reinstalling the app, but when the music sample is sent to the cloud I don't receive any answer.
I'm trying to expose my native ShazamKit code to the host React Native app.
The implementation works fine in a separate Swift project, but it fails when I try to integrate it into a React Native app.
Exception 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)' was thrown while invoking exposed on target ShazamIOS with params (
1682,
1683
)
callstack: (
0 CoreFoundation 0x00007ff80049b761 __exceptionPreprocess + 242
1 libobjc.A.dylib 0x00007ff800063904 objc_exception_throw + 48
2 CoreFoundation 0x00007ff80049b56b +[NSException raise:format:] + 0
3 AVFAudio 0x00007ff846197929 _Z19AVAE_RaiseExceptionP8NSStringz + 156
4 AVFAudio 0x00007ff8461f2e90 _ZN17AUGraphNodeBaseV318CreateRecordingTapEmjP13AVAudioFormatU13block_pointerFvP16AVAudioPCMBufferP11AVAudioTimeE + 766
5 AVFAudio 0x00007ff84625f703 -[AVAudioNode installTapOnBus:bufferSize:format:block:] + 1456
6 muse 0x000000010a313dd0 $s4muse9ShazamIOSC6record33_35CC2309E4CA22278DC49D01D96C376ALLyyF + 496
7 muse 0x000000010a313210 $s4muse9ShazamIOSC5startyyF + 288
8 muse 0x000000010a312d03 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtF + 83
9 muse 0x000000010a312e47 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtFTo + 103
10 CoreFoundation 0x00007ff8004a238c __invoking___ + 140
11 CoreFoundation 0x00007ff80049f6b3 -[NSInvocation invoke] + 302
12 CoreFoundation 0x00007ff80049f923 -[NSInvocation invokeWithTarget:] + 70
13 muse 0x000000010a9210ef -[RCTModuleMethod invokeWithBridge:module:arguments:] + 2495
14 muse 0x000000010a925cb4 _ZN8facebook5reactL11invokeInnerEP9RCTBridgeP13RCTModuleDatajRKN5folly7dynamicEiN12_GLOBAL__N_117SchedulingContextE + 2036
15 muse 0x000000010a925305 _ZZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEiENK3$_0clEv + 133
16 muse 0x000000010a925279 ___ZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEi_block_invoke + 25
17 libdispatch.dylib 0x000000010e577747 _dispatch_call_block_and_release + 12
18 libdispatch.dylib 0x000000010e5789f7 _dispatch_client_callout + 8
19 libdispatch.dylib 0x000000010e5808c9 _dispatch_lane_serial_drain + 1127
20 libdispatch.dylib 0x000000010e581665 _dispatch_lane_invoke + 441
21 libdispatch.dylib 0x000000010e58e76e _dispatch_root_queue_drain_deferred_wlh + 318
22 libdispatch.dylib 0x000000010e58db69 _dispatch_workloop_worker_thread + 590
23 libsystem_pthread.dylib 0x000000010da67b84 _pthread_wqthread + 327
24 libsystem_pthread.dylib 0x000000010da66acf start_wqthread + 15
)
RCTFatal
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&, int, (anonymous namespace)::SchedulingContext)
facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)::$_0::operator()() const
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)
This is my Swift file; the error happens in the record function.
import Foundation
import AVFAudio
import ShazamKit

@objc(ShazamIOS)
class ShazamIOS: NSObject {

    @Published var matching: Bool = false
    @Published var mediaItem: SHMatchedMediaItem?
    @Published var error: Error? {
        didSet {
            hasError = error != nil
        }
    }
    @Published var hasError: Bool = false

    private lazy var audioSession: AVAudioSession = .sharedInstance()
    private lazy var session: SHSession = .init()
    private lazy var audioEngine: AVAudioEngine = .init()
    private lazy var inputNode = self.audioEngine.inputNode
    private lazy var bus: AVAudioNodeBus = 0

    override init() {
        super.init()
        session.delegate = self
    }

    @objc
    func exposed(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
        start()
        resolve("ios code executed")
    }

    func start() {
        switch audioSession.recordPermission {
        case .granted:
            self.record()
        case .denied:
            DispatchQueue.main.async {
                self.error = ShazamError.recordDenied
            }
        case .undetermined:
            audioSession.requestRecordPermission { granted in
                DispatchQueue.main.async {
                    if granted {
                        self.record()
                    } else {
                        self.error = ShazamError.recordDenied
                    }
                }
            }
        @unknown default:
            DispatchQueue.main.async {
                self.error = ShazamError.unknown
            }
        }
    }

    private func record() {
        do {
            self.matching = true
            let format = self.inputNode.outputFormat(forBus: bus)
            self.inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] (buffer, time) in
                self?.session.matchStreamingBuffer(buffer, at: time)
            }
            self.audioEngine.prepare()
            try self.audioEngine.start()
        } catch {
            self.error = error
        }
    }

    func stop() {
        self.audioEngine.stop()
        self.inputNode.removeTap(onBus: bus)
        self.matching = false
    }

    @objc
    static func requiresMainQueueSetup() -> Bool {
        return true
    }
}

extension ShazamIOS: SHSessionDelegate {
    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async { [self] in
            if let mediaItem = match.mediaItems.first {
                self.mediaItem = mediaItem
                self.stop()
            }
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        DispatchQueue.main.async { [self] in
            self.error = error
            self.stop()
        }
    }
}
Objective-C file:
#import <Foundation/Foundation.h>
#import "React/RCTBridgeModule.h"
@interface RCT_EXTERN_MODULE(ShazamIOS, NSObject);
RCT_EXTERN_METHOD(exposed:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
@end
How I consume the exposed function in RN:
const {ShazamModule, ShazamIOS} = NativeModules;
const onPressIOSButton = () => {
ShazamIOS.exposed().then(result => console.log(result)).catch(e => console.log(e.message, e.code));
};
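The "IsFormatSampleRateAndChannelCountValid(format)" exception is typically thrown by installTap(onBus:bufferSize:format:) when inputNode.outputFormat(forBus:) returns a format with 0 Hz and 0 channels, which happens when the audio session has not been configured and activated for recording at that point; in a React Native host, nothing else may have activated it. A sketch of guarding against that before installing the tap, assuming this is the cause here (validInputFormat is a hypothetical helper name):
import AVFAudio

// Sketch: activate the session for recording first, then verify the
// hardware format is valid before installing the tap. A 0 Hz / 0-channel
// format is what makes installTap throw the format exception.
func validInputFormat(for engine: AVAudioEngine) throws -> AVAudioFormat {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)

    let format = engine.inputNode.outputFormat(forBus: 0)
    guard format.sampleRate > 0, format.channelCount > 0 else {
        throw NSError(domain: "ShazamIOS", code: -1,
                      userInfo: [NSLocalizedDescriptionKey: "Input format not available yet"])
    }
    return format
}
record() could call this helper and pass the returned format to installTap instead of reading outputFormat(forBus:) directly.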
Hi Apple developers,
Is it possible to display lyrics synchronized with a song in my app after the song has been identified?
Just like the feature available in the general Shazam app when clicking on the middle-top icon (with music note in it) after Shazam'ing a song.
The app I'm developing is intended for the Deaf and hard-of-hearing and therefore I would love to be able to show the song lyrics to make the app accessible.
I would greatly appreciate your help because I can't find this in the documentation. Many thanks in advance!
Hello, I have a song on Apple Music. When I search for this song on Shazam, I want it to appear with a clip like the link I provided below. Is there any way you can help with this?
Example: https://www.youtube.com/watch?v=St8smx2q1Ho
My Music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790
Thanks.
I am trying to extract the audio file URL from ShazamKit; it is deep inside the hierarchy of SHMediaItem > songs > previewAssets > url.
When I access the URL like this:
let url = firstItem.songs[0].previewAssets?[0].url
I am getting a warning (screenshot of the Variable Viewer not reproduced here).
This is what I have done so far:
struct MediaItems: Codable {
    let title: String?
    let subtitle: String?
    let shazamId: String?
    let appleMusicId: String?
    let appleMusicUrL: URL?
    let artworkUrl: URL?
    let artist: String?
    let matchOffset: TimeInterval?
    let videoUrl: URL?
    let webUrl: URL?
    let genres: [String]
    let isrc: String?
    let songs: [Song]?
}
extension SwiftFlutterShazamKitPlugin: SHSessionDelegate {
    public func session(_ session: SHSession, didFind match: SHMatch) {
        let mediaItems = match.mediaItems
        if let firstItem = mediaItems.first {
            // Extracting the URL
            let url = firstItem.songs[0].previewAssets?[0].url
            // The struct fields are optionals, so no force-unwrapping is needed here.
            let _shazamMedia = MediaItems(
                title: firstItem.title,
                subtitle: firstItem.subtitle,
                shazamId: firstItem.shazamID,
                appleMusicId: firstItem.appleMusicID,
                appleMusicUrL: firstItem.appleMusicURL,
                artworkUrl: firstItem.artworkURL,
                artist: firstItem.artist,
                matchOffset: firstItem.matchOffset,
                videoUrl: firstItem.videoURL,
                webUrl: firstItem.webURL,
                genres: firstItem.genres,
                isrc: firstItem.isrc,
                songs: firstItem.songs
            )
            do {
                let jsonData = try JSONEncoder().encode([_shazamMedia])
                let jsonString = String(data: jsonData, encoding: .utf8)!
                self.callbackChannel?.invokeMethod("matchFound", arguments: jsonString)
            } catch {
                callbackChannel?.invokeMethod("didHasError", arguments: "Error when trying to format data, please try again")
            }
        }
    }
}
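Since every link in that chain can be nil or empty, optional chaining with first avoids both the hard-coded subscripts and any force unwraps. A sketch (previewURL(from:) is a hypothetical helper; it assumes SHMediaItem's songs property bridging to MusicKit's Song, whose previewAssets array and url are both optional):
import ShazamKit
import MusicKit

// Walk SHMediaItem > songs > previewAssets > url without force-unwrapping
// or fixed indices; an empty array or a missing preview simply yields nil.
func previewURL(from item: SHMatchedMediaItem) -> URL? {
    item.songs.first?.previewAssets?.first?.url
}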