Hello,
Using ShazamKit, based on a Shazam catalog result, would it be possible to detect the speed (playback rate) at which audio was recorded?
I'm thinking that a Shazam catalog created from an audio file could be used to compare the speed of live-recorded audio against the original.
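A sketch of what I'm hoping is possible: SHMatchedMediaItem has a frequencySkew property, which sounds like it reports exactly this kind of difference between the query audio and the catalog reference (SpeedChecker below is just a placeholder name of mine, not an API type):
import ShazamKit

// Placeholder delegate that reads the skew off a match.
final class SpeedChecker: NSObject, SHSessionDelegate {
    func session(_ session: SHSession, didFind match: SHMatch) {
        guard let item = match.mediaItems.first else { return }
        // My reading of the docs: frequencySkew should be 0 when the
        // recording plays at the catalog reference speed; a nonzero value
        // would indicate a frequency (i.e. speed/pitch) shift.
        print("frequencySkew: \(item.frequencySkew)")
    }
}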
Thank you!
ShazamKit
Get exact audio matching for any audio source using the Shazam catalog or a custom catalog in an app.
Posts under ShazamKit tag
Hello fellow developers,
I am developing a macOS app called "Playlist Plunderer 2," aimed at using AppleScript to control the Music app to play songs and employing ShazamKit to recognize these songs and update their metadata. Despite setting up the entitlements and plist files correctly, I'm encountering issues with gaining the necessary AppleScript permissions, and my app is not appearing under 'Automation' in System Preferences. Additionally, ShazamKit fails to match songs, consistently returning error 201.
Here are the specifics of my setup and what I've tried so far:
Xcode Version: 15.4, macOS 14.1.2
Entitlements Configured: Includes permissions for Apple events, audio input, and scripting targets for the Music app.
Capabilities: ShazamKit and ScriptingBridge frameworks integrated, set to "Do Not Embed."
Info.plist Adjustments: Added "Privacy - Microphone Usage Description."
Scripting: Manual AppleScript commands outside of Xcode succeed, but the app's scripts do not trigger.
Entitlements File:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.automation.apple-events</key>
    <true/>
    <key>com.apple.security.device.audio-input</key>
    <true/>
    <key>com.apple.security.files.user-selected.read-only</key>
    <true/>
    <key>com.apple.security.scripting-targets</key>
    <dict>
        <key>com.apple.Music</key>
        <array>
            <string>com.apple.Music.playback</string>
            <string>com.apple.Music.library.read-write</string>
        </array>
    </dict>
</dict>
</plist>
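For context, this is roughly how the app sends its commands to Music (a simplified sketch; the playlist name is mine). From what I've read, the Automation entry in System Preferences only appears after the app actually delivers an Apple event, and the Info.plist also needs an NSAppleEventsUsageDescription string alongside the entitlement above:
import Foundation

// Simplified sketch of how the app asks Music to play the playlist.
let source = """
tell application "Music"
    play playlist "Playlist Plunderer"
end tell
"""
var errorInfo: NSDictionary?
if let script = NSAppleScript(source: source) {
    script.executeAndReturnError(&errorInfo)
}
if let errorInfo {
    // Error -1743 here would mean Automation access was denied by the user or TCC.
    NSLog("AppleScript failed: %@", errorInfo)
}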
I am having issues controlling the Music app (iTunes) via AppleScript from within my Xcode project. The objective of the app is to rewrite the metadata of the songs inside a playlist folder in my Music app titled "Playlist Plunderer". The way I intend the app to function: it plays the songs in the playlist, uses ShazamKit to recognize the song that's playing, and then copies the recognized metadata over the song's metadata in the playlist. I am still in the beginning stages and very new to Xcode (this is my first project). I created an Apple Developer account, paid the $99, and it is active; I added my app's bundle identifier to the account.
My development team is set (Shane Vincent) and the app is set to automatically manage signing. Under App Sandbox, Audio Input is checked; under Hardened Runtime > Resource Access, Audio Input is also checked.
In Build Settings, the path to the Info.plist file is correct.
The Info.plist contains the "Privacy - Microphone Usage Description" key (NSMicrophoneUsageDescription) with a description that reads "This app needs to access the microphone to identify songs using ShazamKit."
The app appears under System Preferences > Security & Privacy > Privacy > Microphone, but not under System Preferences > Security & Privacy > Privacy > Automation.
It is being built on macOS 14.1.2 (23B92) with Xcode 15.4 (15F31d).
Under Frameworks, Libraries, and Embedded Content, I have added ShazamKit.framework and ScriptingBridge.framework, both set to "Do Not Embed".
Current Issue: AppleScript fails to authorize with the Music app, and ShazamKit errors suggest an issue with song matching.
Has anyone faced similar challenges or can offer guidance on how to ensure AppleScript and ShazamKit function correctly within a sandboxed macOS app? Any insights into troubleshooting or configuring entitlements more effectively would be greatly appreciated.
Thanks for your help!
Shazam is not working with the iOS 18 developer beta. I've tried reinstalling the app, but when the music sample is sent to the cloud I don't receive any answer.
I'm trying to expose my native ShazamKit code to the host React Native app.
The implementation works fine in a separate Swift project, but it fails when I try to integrate it into a React Native app.
Exception 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)' was thrown while invoking exposed on target ShazamIOS with params (
1682,
1683
)
callstack: (
0 CoreFoundation 0x00007ff80049b761 __exceptionPreprocess + 242
1 libobjc.A.dylib 0x00007ff800063904 objc_exception_throw + 48
2 CoreFoundation 0x00007ff80049b56b +[NSException raise:format:] + 0
3 AVFAudio 0x00007ff846197929 _Z19AVAE_RaiseExceptionP8NSStringz + 156
4 AVFAudio 0x00007ff8461f2e90 _ZN17AUGraphNodeBaseV318CreateRecordingTapEmjP13AVAudioFormatU13block_pointerFvP16AVAudioPCMBufferP11AVAudioTimeE + 766
5 AVFAudio 0x00007ff84625f703 -[AVAudioNode installTapOnBus:bufferSize:format:block:] + 1456
6 muse 0x000000010a313dd0 $s4muse9ShazamIOSC6record33_35CC2309E4CA22278DC49D01D96C376ALLyyF + 496
7 muse 0x000000010a313210 $s4muse9ShazamIOSC5startyyF + 288
8 muse 0x000000010a312d03 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtF + 83
9 muse 0x000000010a312e47 $s4muse9ShazamIOSC7exposed_6rejectyyypSgXE_ySSSg_AGs5Error_pSgtXEtFTo + 103
10 CoreFoundation 0x00007ff8004a238c __invoking___ + 140
11 CoreFoundation 0x00007ff80049f6b3 -[NSInvocation invoke] + 302
12 CoreFoundation 0x00007ff80049f923 -[NSInvocation invokeWithTarget:] + 70
13 muse 0x000000010a9210ef -[RCTModuleMethod invokeWithBridge:module:arguments:] + 2495
14 muse 0x000000010a925cb4 _ZN8facebook5reactL11invokeInnerEP9RCTBridgeP13RCTModuleDatajRKN5folly7dynamicEiN12_GLOBAL__N_117SchedulingContextE + 2036
15 muse 0x000000010a925305 _ZZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEiENK3$_0clEv + 133
16 muse 0x000000010a925279 ___ZN8facebook5react15RCTNativeModule6invokeEjON5folly7dynamicEi_block_invoke + 25
17 libdispatch.dylib 0x000000010e577747 _dispatch_call_block_and_release + 12
18 libdispatch.dylib 0x000000010e5789f7 _dispatch_client_callout + 8
19 libdispatch.dylib 0x000000010e5808c9 _dispatch_lane_serial_drain + 1127
20 libdispatch.dylib 0x000000010e581665 _dispatch_lane_invoke + 441
21 libdispatch.dylib 0x000000010e58e76e _dispatch_root_queue_drain_deferred_wlh + 318
22 libdispatch.dylib 0x000000010e58db69 _dispatch_workloop_worker_thread + 590
23 libsystem_pthread.dylib 0x000000010da67b84 _pthread_wqthread + 327
24 libsystem_pthread.dylib 0x000000010da66acf start_wqthread + 15
)
RCTFatal
facebook::react::invokeInner(RCTBridge*, RCTModuleData*, unsigned int, folly::dynamic const&, int, (anonymous namespace)::SchedulingContext)
facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)::$_0::operator()() const
invocation function for block in facebook::react::RCTNativeModule::invoke(unsigned int, folly::dynamic&&, int)
This is my Swift file; the error happens in the record function.
import Foundation
import AVFoundation // AVAudioSession / AVAudioEngine
import Combine      // @Published
import ShazamKit

@objc(ShazamIOS)
class ShazamIOS: NSObject {

    @Published var matching: Bool = false
    @Published var mediaItem: SHMatchedMediaItem?
    @Published var error: Error? {
        didSet {
            hasError = error != nil
        }
    }
    @Published var hasError: Bool = false

    private lazy var audioSession: AVAudioSession = .sharedInstance()
    private lazy var session: SHSession = .init()
    private lazy var audioEngine: AVAudioEngine = .init()
    private lazy var inputNode = self.audioEngine.inputNode
    private lazy var bus: AVAudioNodeBus = 0

    override init() {
        super.init()
        session.delegate = self
    }

    // RCTPromiseResolveBlock/RCTPromiseRejectBlock come in via the
    // project's bridging header.
    @objc
    func exposed(_ resolve: RCTPromiseResolveBlock, reject: RCTPromiseRejectBlock) {
        start()
        resolve("ios code executed")
    }

    func start() {
        switch audioSession.recordPermission {
        case .granted:
            self.record()
        case .denied:
            DispatchQueue.main.async {
                // ShazamError is a custom error type defined elsewhere in the project.
                self.error = ShazamError.recordDenied
            }
        case .undetermined:
            audioSession.requestRecordPermission { granted in
                DispatchQueue.main.async {
                    if granted {
                        self.record()
                    } else {
                        self.error = ShazamError.recordDenied
                    }
                }
            }
        @unknown default:
            DispatchQueue.main.async {
                self.error = ShazamError.unknown
            }
        }
    }

    private func record() {
        do {
            self.matching = true
            let format = self.inputNode.outputFormat(forBus: bus)
            self.inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] (buffer, time) in
                self?.session.matchStreamingBuffer(buffer, at: time)
            }
            self.audioEngine.prepare()
            try self.audioEngine.start()
        } catch {
            self.error = error
        }
    }

    func stop() {
        self.audioEngine.stop()
        self.inputNode.removeTap(onBus: bus)
        self.matching = false
    }

    @objc
    static func requiresMainQueueSetup() -> Bool {
        return true
    }
}

extension ShazamIOS: SHSessionDelegate {

    func session(_ session: SHSession, didFind match: SHMatch) {
        DispatchQueue.main.async { [self] in
            if let mediaItem = match.mediaItems.first {
                self.mediaItem = mediaItem
                self.stop()
            }
        }
    }

    func session(_ session: SHSession, didNotFindMatchFor signature: SHSignature, error: Error?) {
        DispatchQueue.main.async { [self] in
            self.error = error
            self.stop()
        }
    }
}
Objective-C file:
#import <Foundation/Foundation.h>
#import "React/RCTBridgeModule.h"
@interface RCT_EXTERN_MODULE(ShazamIOS, NSObject)
RCT_EXTERN_METHOD(exposed:(RCTPromiseResolveBlock)resolve reject:(RCTPromiseRejectBlock)reject)
@end
How I consume the exposed function in RN:
const {ShazamModule, ShazamIOS} = NativeModules;
const onPressIOSButton = () => {
ShazamIOS.exposed().then(result => console.log(result)).catch(e => console.log(e.message, e.code));
};
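One thing I'm experimenting with (not sure it's the right fix): activating the shared audio session before reading the input format. In the React Native host the session doesn't seem to be active, so outputFormat(forBus:) can come back as 0 Hz / 0 channels, which is exactly what IsFormatSampleRateAndChannelCountValid rejects:
// Variant of record() I'm trying: configure and activate the shared
// AVAudioSession first, and bail out if the format is still invalid
// instead of letting installTap throw.
private func record() {
    do {
        try audioSession.setCategory(.record)
        try audioSession.setActive(true)

        let format = inputNode.outputFormat(forBus: bus)
        guard format.sampleRate > 0, format.channelCount > 0 else {
            error = ShazamError.unknown // still no valid input format
            return
        }

        matching = true
        inputNode.installTap(onBus: bus, bufferSize: 8192, format: format) { [weak self] buffer, time in
            self?.session.matchStreamingBuffer(buffer, at: time)
        }
        audioEngine.prepare()
        try audioEngine.start()
    } catch {
        self.error = error
    }
}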
Hi Apple developers,
Is it possible to display lyrics synchronized with a song in my app after the song has been identified?
Just like the feature available in the Shazam app when tapping the middle-top icon (the one with the music note in it) after Shazam'ing a song.
The app I'm developing is intended for Deaf and hard-of-hearing users, so I would love to be able to show the song lyrics to make the app accessible.
I would greatly appreciate your help because I can't find this in the documentation. Many thanks in advance!
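For context, as far as I can tell SHMediaItem only exposes metadata (title, artist, artworkURL, appleMusicURL, and so on) with no lyrics property, so the only fallback I've found is handing the user off to Apple Music after a match; the lyrics themselves would have to come from somewhere else. A minimal sketch of that workaround:
import UIKit
import ShazamKit

// Workaround sketch: after a match, open the song in Apple Music,
// which can display lyrics itself. mediaItem is the SHMatchedMediaItem
// delivered by session(_:didFind:).
func openInAppleMusic(_ mediaItem: SHMediaItem) {
    guard let url = mediaItem.appleMusicURL else { return }
    UIApplication.shared.open(url)
}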
Hello, I have music on Apple Music. When I search for this music on Shazam, I want it to appear with a video clip, like the link I provided below. Is there any way you can help with this?
Example: https://www.youtube.com/watch?v=St8smx2q1Ho
My Music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790
Thanks.
I am trying to extract the audio file URL from ShazamKit; it is deep inside the hierarchy SHMediaItem > songs > previewAssets > url.
When I access the url like this:
let url = firstItem.songs[0].previewAssets?[0].url
I am getting a warning (shown in the attached Variable Viewer screenshot).
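For reference, the pattern I'm aiming for guards each optional hop instead of force-indexing (my sketch, assuming songs can be empty and previewAssets nil):
if let song = firstItem.songs.first,
   let previewURL = song.previewAssets?.first?.url {
    print("preview URL: \(previewURL)")
}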
This is what I have done so far:
struct MediaItems: Codable {
    let title: String?
    let subtitle: String?
    let shazamId: String?
    let appleMusicId: String?
    let appleMusicUrL: URL?
    let artworkUrl: URL?
    let artist: String?
    let matchOffset: TimeInterval?
    let videoUrl: URL?
    let webUrl: URL?
    let genres: [String]
    let isrc: String?
    let songs: [Song]?
}

extension SwiftFlutterShazamKitPlugin: SHSessionDelegate {
    public func session(_ session: SHSession, didFind match: SHMatch) {
        let mediaItems = match.mediaItems
        if let firstItem = mediaItems.first {
            // Extracting the preview URL, guarding each optional hop
            // instead of force-indexing (currently unused below).
            let url = firstItem.songs.first?.previewAssets?.first?.url
            // All of these properties are optional, so assign them
            // directly; force-unwrapping would crash on a nil title,
            // ISRC, video URL, etc.
            let _shazamMedia = MediaItems(
                title: firstItem.title,
                subtitle: firstItem.subtitle,
                shazamId: firstItem.shazamID,
                appleMusicId: firstItem.appleMusicID,
                appleMusicUrL: firstItem.appleMusicURL,
                artworkUrl: firstItem.artworkURL,
                artist: firstItem.artist,
                matchOffset: firstItem.matchOffset,
                videoUrl: firstItem.videoURL,
                webUrl: firstItem.webURL,
                genres: firstItem.genres,
                isrc: firstItem.isrc,
                songs: firstItem.songs
            )
            do {
                let jsonData = try JSONEncoder().encode([_shazamMedia])
                let jsonString = String(data: jsonData, encoding: .utf8)!
                self.callbackChannel?.invokeMethod("matchFound", arguments: jsonString)
            } catch {
                callbackChannel?.invokeMethod("didHasError", arguments: "Error when trying to format data, please try again")
            }
        }
    }
}