I'm trying to use AVCaptureSession and AVAssetWriter to convert video and audio from an iPhone's camera and microphone into a fragmented video file in Apple HLS format.
Below is part of the code.
It seems that the capture is successful, and I have confirmed that the data received in captureOutput() can be appended to videoWriterInput and audioWriterInput using append().
When executing audioWriterInput!.append(sampleBuffer), sampleBuffer has the following values, so it looks like the audio data is being passed to the AssetWriter:
sampleBuffer.duration : CMTime(value: 941, timescale: 44100, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)
sampleBuffer.totalSampleSize : 1882
However, the final output init.mp4 and *.m4s files do not contain audio. (The video plays without any problems.)
Could you point out any problems, or give me any hints as to why the audio is not included?
import AVFoundation
import UniformTypeIdentifiers
import VideoToolbox

/// Capture Session
let captureSession = AVCaptureSession()

/// Capture Inputs
var videoDevice: AVCaptureDevice?
var audioDevice: AVCaptureDevice?

/// Configure and Start the Capture Session
func startCapture() throws {
    // Begin configuration
    captureSession.beginConfiguration()

    // Set up the video input (locking the device before changing its frame duration)
    videoDevice = self.defaultCamera(cameraSide: cameraSide)
    try videoDevice!.lockForConfiguration()
    videoDevice!.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 30)
    videoDevice!.unlockForConfiguration()
    let videoInput = try AVCaptureDeviceInput(device: videoDevice!)
    captureSession.addInput(videoInput)

    // Set up the audio input
    audioDevice = AVCaptureDevice.default(for: .audio)
    let audioInput = try AVCaptureDeviceInput(device: audioDevice!)
    captureSession.addInput(audioInput)

    // Set up the video data output
    let videoDataOutput = AVCaptureVideoDataOutput()
    videoDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    videoDataOutput.alwaysDiscardsLateVideoFrames = true
    videoDataOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
    ]
    captureSession.addOutput(videoDataOutput)

    // Set up the audio data output
    let audioDataOutput = AVCaptureAudioDataOutput()
    audioDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    captureSession.addOutput(audioDataOutput)

    // Commit configuration and start capturing
    captureSession.commitConfiguration()
    captureSession.startRunning()
}
// Writer state (created lazily on the first sample buffer, so these must be vars)
private var assetWriter: AVAssetWriter?
private var startTimeOffset: CMTime = .zero
private var startTime: CMTime = .zero
private var audioWriterInput: AVAssetWriterInput?
private var videoWriterInput: AVAssetWriterInput?
private var pixelBufferAdaptor: AVAssetWriterInputPixelBufferAdaptor?
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if assetWriter == nil {
        // Create the AssetWriter
        assetWriter = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
        startTimeOffset = CMTime(value: 1, timescale: 1)

        // Set up the audio input
        let audioCompressionSettings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
            AVEncoderBitRateKey: 128_000
        ]
        audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioCompressionSettings)
        audioWriterInput!.expectsMediaDataInRealTime = true
        assetWriter!.add(audioWriterInput!)

        // Set up the video input
        let videoCompressionSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
            AVVideoCompressionPropertiesKey: [
                kVTCompressionPropertyKey_AverageBitRate: 1_024_000,
                kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        ]
        videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoCompressionSettings)
        videoWriterInput!.expectsMediaDataInRealTime = true
        pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: videoWriterInput!,
            sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
        assetWriter!.add(videoWriterInput!)

        // Configure the asset writer for writing data in fragmented MPEG-4 format
        assetWriter!.outputFileTypeProfile = AVFileTypeProfile.mpeg4AppleHLS
        assetWriter!.preferredOutputSegmentInterval = CMTime(seconds: 1.0, preferredTimescale: 1)
        assetWriter!.initialSegmentStartTime = startTimeOffset
        assetWriter!.delegate = self

        // Start the AssetWriter
        startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        assetWriter!.startWriting()
        assetWriter!.startSession(atSourceTime: startTime)
    }

    let isVideo = output is AVCaptureVideoDataOutput
    if isVideo {
        if videoWriterInput!.isReadyForMoreMediaData {
            videoWriterInput!.append(sampleBuffer)
        }
    } else {
        if audioWriterInput!.isReadyForMoreMediaData {
            audioWriterInput!.append(sampleBuffer)
        }
    }
}
func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
    // ...
}
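For completeness, the elided delegate body is typically implemented along these lines, writing the initialization segment and each media segment to disk. This is only a sketch; outputDirectory, segmentIndex, and the file names are hypothetical, not from the original post:

var segmentIndex = 0                                          // hypothetical counter for media segments
let outputDirectory = FileManager.default.temporaryDirectory  // hypothetical destination

func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
    let fileName: String
    switch segmentType {
    case .initialization:
        // The fMP4 initialization segment (track/codec setup, no media)
        fileName = "init.mp4"
    case .separable:
        // An fMP4 media segment
        segmentIndex += 1
        fileName = "segment\(segmentIndex).m4s"
    @unknown default:
        return
    }
    try? segmentData.write(to: outputDirectory.appendingPathComponent(fileName))
}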
I am using the API from the official documentation to decode now, but the callback is never triggered.
Hello Apple Community,
First, I'm asking this during the holidays, so happy holidays! I now have some "fun coding time" because I have some time off.
I'm new to this area and would greatly appreciate your expertise and guidance. Have mercy.
I'm attempting to develop a simple application on macOS to browse my playlists. However, I've encountered a few roadblocks that I'm struggling to navigate. I understand I need to implement two-factor authentication to access my playlists, for which an OAuth setup is required. This involves obtaining an Apple ID and a service ID and dealing with other complex elements.
One particular challenge I'm facing is with the redirect URI in the OAuth process. It seems that it needs to be a valid domain, and I'm unsure if using a local server address like https://localhost:5000 would work. My goal is to create a basic Flask application that can interact with Apple's web authentication system, but I'm uncertain about the feasibility of this approach, given the domain restrictions.
I would appreciate any advice or step-by-step guidance on the following points.
What would be the simplest way to create an application (Swift, Python, or JavaScript) that can authenticate and enable browsing through my playlists?
Any insights, tips, or examples you could share would be immensely helpful. I am eager to learn and look forward to your valuable suggestions. Anything step-by-step would be great, but I like to dream.
Thank you so much for your time and assistance.
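As a side note on the Swift option mentioned above: a native app can use MusicKit, which replaces the manual OAuth/redirect-URI setup with a single authorization prompt. A minimal sketch, assuming an App ID with the MusicKit app service enabled (MusicLibraryRequest requires macOS 14 / iOS 16 or later):

import MusicKit

// Hedged sketch: authorize, then list the user's library playlists.
func printLibraryPlaylists() async throws {
    guard await MusicAuthorization.request() == .authorized else { return }
    let request = MusicLibraryRequest<Playlist>()
    let response = try await request.response()
    for playlist in response.items {
        print(playlist.name)
    }
}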
Hello,
We have a Photo Vault app. We were hiding users' photos behind a semi-functional calculator, but after a rejection we thought "decoy functionality" meant we needed to remove this fake calculator feature. We removed it and tried many things to resolve the issue, but we couldn't understand what Apple wants us to change. We've been trying to contact Apple for more details, but they keep sending the same message every time.
Any help is appreciated. Here is the rejection message:
Your app uses public APIs in an unapproved manner, which does not comply with guideline 2.5.1 of the App Store Review Guidelines.
Specifically, we found that your app uses a decoy functionality to hide a user’s photos, which is not an appropriate use of the Photos API.
Since there is no accurate way of predicting how an API may be modified and what effects those modifications may have, Apple does not permit unapproved uses of public APIs in App Store apps.
Next Steps
Please revise your app to ensure that documented APIs are used in the manner prescribed by Apple.
It would be appropriate to remove any features in your app that use a decoy functionality to hide a user's photos from your app.
If there are no alternatives for providing the functionality your app requires, you can use Feedback Assistant to submit an enhancement request.
I'm trying to test the MusicMarathon and MusicAlbums tutorial/demo apps for MusicKit and half the endpoints do not work.
As an example the MusicMarathon call to MusicRecentlyPlayedContainerRequest() just returns a 401.
Everything I've done seems correct. I've got a fully authorized session and I have all the development keys successfully setup.
Also, it's not all APIs, as I can access the user's Library; just none of the recommendation and search endpoints seem to be working correctly.
I'm running iOS 17.2 and Xcode 15.1
I'm pretty certain this is easily repeatable by just running the demo applications from the MusicKit documentation.
Two days and I am frustrated. I've crossed my T's and dotted my I's.
Using
musickit
Error
Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
Offending Code
var request = MusicLibraryRequest<MusicKit.Playlist>()
request.sort(by: .lastPlayedDate, ascending: false)
let response = try await request.response()
Verified
Custom iOS Target Properties
Privacy - Media Library Usage Description
Correct Bundle Identifier
Checkbox App Services/MusicKit for App ID
Please help!
Two days of racking my brain; I just can't get past this error.
MusicKit does ask me to authorize
Other code works
let request = MusicRecentlyPlayedContainerRequest()
let response = try await request.response()
See Image
The error shown in the attached picture appears when I try to link the StartScene.sks file to the code.
Attached are pictures of the code.
Please help me🙏🏻
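It's hard to diagnose without the screenshots, but for reference, the usual way to hook an .sks file up in code looks like this (a sketch; "StartScene" and the skView outlet are assumptions):

import SpriteKit

// Hedged sketch: load the scene file and present it.
// fileNamed must match the .sks file's name exactly (without extension),
// and any custom class set in the .sks inspector must exist in your module.
if let scene = SKScene(fileNamed: "StartScene") {
    scene.scaleMode = .aspectFill
    skView.presentScene(scene)  // skView: an SKView in your view hierarchy
}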
Hey there, I'm trying to display all of the user's albums using the MediaPlayer library. Many albums return nil artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't. All downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this.
import MediaPlayer

// Query every album in the user's library and log its artwork (or nil)
let query = MPMediaQuery.albums()
let albums = query.collections ?? []
for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork as Any, artwork?.image(at: CGSize(width: 100, height: 100)) as Any)
}
Any help would be greatly appreciated. Thanks!
I have a music player that is able to save and restore AU parameters using the kAudioUnitProperty_ClassInfo property. For non-Apple AUs, this works fine. But for any of the Apple units, the class info can be set only the first time after the audio graph is built; subsequent sets of the property do not stick, even though the OSStatus code is 0 upon return. Previously this had worked fine, but at some point (I'm not sure when) the Apple-provided AUs changed their behavior, and this is now causing me problems.
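For reference, the restore path in question is typically a call like the following (a minimal sketch; audioUnit and savedState are assumed to come from the host's graph and its saved document):

import AudioToolbox

// Hedged sketch: push saved state back into an AU via kAudioUnitProperty_ClassInfo.
// A noErr (0) result only means the call was accepted; as described above,
// the Apple-provided AUs appear to ignore subsequent sets anyway.
func restoreClassInfo(_ savedState: CFPropertyList, into audioUnit: AudioUnit) -> OSStatus {
    var classInfo: CFPropertyList? = savedState
    return AudioUnitSetProperty(audioUnit,
                                kAudioUnitProperty_ClassInfo,
                                kAudioUnitScope_Global,
                                0,
                                &classInfo,
                                UInt32(MemoryLayout<CFPropertyList?>.size))
}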
Can anyone help shed light on this?
Thanks in advance for the help.
Jeff Frey
I notice that macOS Sonoma's System Settings has a "Screen & System Audio Recording" section. I'm a macOS app developer and want to request only the audio permission.
I've browsed the documentation and the WWDC code demos for a while, but I still have no idea how to request a "System Audio Recording Only" permission.
All the demos and docs I can find request "Screen Recording & System Audio".
When making a library sectioned request, some MusicLibraryRequestable types used result in a MusicKit.MusicLibraryRequestError being thrown.
When Playlist is used as the MusicLibrarySectionRequestable type, no MusicLibraryRequestable type other than Track can be used for the request. For other section types, Artist & Genre cannot be used.
Is there a way to work around this issue? The (seemingly) equivalent functionality in MediaPlayer (MPMediaQuery and MPMediaGrouping) was very consistent and reliable.
Full error info: MusicKit.MusicLibraryRequestError.invalidType, The operation couldn’t be completed. (MusicKit.MusicLibraryRequestError error 1.)
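For anyone trying to reproduce this, a sketch of the combinations described above (type names per MusicKit's sectioned-request API on iOS 16+; treat this as an illustration rather than verified code):

import MusicKit

// Hedged sketch of the working vs. failing combinations described above.
func demo() async throws {
    // Playlist sections with Track items: works.
    let working = MusicLibrarySectionedRequest<Playlist, Track>()
    _ = try await working.response()

    // Playlist sections with Album items: reportedly throws
    // MusicLibraryRequestError.invalidType.
    let failing = MusicLibrarySectionedRequest<Playlist, Album>()
    _ = try await failing.response()
}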
Device and OS: iPhone 13 Pro, iOS 17.2.1
Application Crashed: com.apple.main-thread
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x000000000000001e
Crashed: com.apple.main-thread
0 libobjc.A.dylib 0x2df58 object_isClass + 16
1 Foundation 0x1c9bc KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 76
2 Foundation 0x1bd60 NSKeyValueWillChangeWithPerThreadPendingNotifications + 300
3 AVFoundation 0x1380 -[AVPlayerAccessibility willChangeValueForKey:] + 72
4 AVFCore 0x13954 -[AVPlayer _noteNewPresentationSizeForPlayerItem:] + 48
5 AVFCore 0x1fbb0 __avplayeritem_fpItemNotificationCallback_block_invoke + 4336
6 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
7 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
8 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
9 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
10 CoreFoundation 0x3701c CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE + 16
11 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
12 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
13 GraphicsServices 0x34f8 GSEventRunModal + 164
14 UIKitCore 0x22c62c -[UIApplication _run] + 888
15 UIKitCore 0x22bc68 UIApplicationMain + 340
16 UIKitCore 0x4563d0 __swift_destroy_boxed_opaque_existential_1Tm + 12220
17 AajTak 0x84c4 main + 4333552836 (QuizLeaderboardViewModel.swift:4333552836)
com.livingMedia.AajTakiPhone_issue_4e4b5f148b75496175c3900a1405bd62_crash_session_3ff23a3e8e854c4ab68de2789fe76c5b_DNE_0_v2_stacktrace.txt
Hi Team,
We see an issue with this version of CoreMedia (AppleCoreMedia 1.0.0.21B101) requesting multiple qualities at all times for a stream; in the logs below, the player fetches the cinecanal_720p video rendition and the cinecanal_1080p audio rendition simultaneously. We don't see this issue on 1.0.0.21C62. We are unsure what is causing this.
[2024-01-05 16:53:51] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1145 2529 1090199 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1146 2396 1013356 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_0.fmp4 HTTP/1.0" 200 1139 24975 1013385 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1145 2603 998670 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_1.fmp4 HTTP/1.0" 200 1138 40534 998739 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1145 2677 835327 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_2.fmp4 HTTP/1.0" 200 1138 57656 835207 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1146 2458 986038 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_1.fmp4 HTTP/1.0" 200 1139 24700 986032 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1145 2751 1013257 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_3.fmp4 HTTP/1.0" 200 1138 55900 1013324 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1146 2520 1016693 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_2.fmp4 HTTP/1.0" 200 1139 25014 1016717 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1145 2825 917753 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_4.fmp4 HTTP/1.0" 200 1138 103745 917903 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1146 2582 958102 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_3.fmp4 HTTP/1.0" 200 1139 24782 958195 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1145 2899 931101 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_5.fmp4 HTTP/1.0" 200 1138 112113 931228 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1146 2644 935550 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_4.fmp4 HTTP/1.0" 200 1139 24824 937720 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1146 2706 895680 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_5.fmp4 HTTP/1.0" 200 1139 24843 895734 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=631&_HLS_part=0 HTTP/1.0" 200 1145 2529 907045 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
I’m using the new ApplicationMusicPlayer support on macOS 14 and playing items from my Apple Music library. I wanted to play this music from my app to an AirPlay destination, so I added an AVRoutePickerView. However, selecting any destination via this view doesn't make a difference to the playback; it continues to play on my Mac's speakers no matter which AirPlay destination I choose.
Also submitted as FB13521393.
I have downloaded the official Apple MusicKit SDK for Android and integrated the 2 AARs it contains into my app (musickitauth-release-1.1.2.aar and mediaplayback-release-1.1.1.aar). When I try to build my app, I get this error:
Manifest merger failed : android:exported needs to be explicitly specified for element <activity#com.apple.android.sdk.authentication.SDKUriHandlerActivity>. Apps targeting Android 12 and higher are required to specify an explicit value for android:exported when the corresponding component has an intent filter defined. See https://developer.android.com/guide/topics/manifest/activity-element#exported for details.
Which makes sense, since when I look into the AAR's AndroidManifest.xml, I see that this attribute is missing on SDKUriHandlerActivity. Can this be fixed?
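Until the SDK itself is updated, a common workaround is to re-declare the activity in your own AndroidManifest.xml so the manifest merger injects the missing attribute. A sketch using the standard merger technique (this is not something Apple documents for this SDK):

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools">
    <application>
        <!-- Merge android:exported into the SDK's activity declaration -->
        <activity
            android:name="com.apple.android.sdk.authentication.SDKUriHandlerActivity"
            android:exported="true"
            tools:node="merge" />
    </application>
</manifest>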
I'm writing an Android app that uses the Apple MusicKit SDK for Android. I am trying to understand how to handle the Apple Music user token once I get it from the authentication flow. I don't know when the token will expire; it is not a regular JWT token, so I cannot check the expiration date. And I don't want to run the auth flow on every app run, as that would be annoying for users. Any guidance on how to handle and invalidate Apple Music user tokens?
There is a method setPreferredInput in AVAudioSession that can be used to select a different input device. But is there a similar function like "setPreferredOutput", so that in my app I can select a specific audio output device to play audio?
I do not want the user to change it through system interfaces (such as the Control Center), but through logic inside the app.
Thanks!
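For comparison, input selection works as sketched below; on the output side, the closest programmatic control I'm aware of is overrideOutputAudioPort, which can only force the built-in speaker, so arbitrary output devices still have to be chosen by the user (e.g. via AVRoutePickerView):

import AVFoundation

let session = AVAudioSession.sharedInstance()
// Input side: a preferred input can be set programmatically.
if let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) {
    try? session.setPreferredInput(builtInMic)
}
// Output side: only a speaker override exists; there is no
// setPreferredOutput counterpart on AVAudioSession.
try? session.overrideOutputAudioPort(.speaker)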
Hello all,
I am completely new to iOS development, and I need to make an app that plays encrypted content using FairPlay. Right now I am using the sample client in the latest FPS Server SDK (HLSCatalog), and from what I understand, the default sources in Streams.plist are not protected. There's an entry with is_protected=YES where I can replace the playlist_url, but I was wondering if I could use the .m3u8 files in the FPS Test Content found at https://developer.apple.com/streaming/fps/? If so, I don't really know how.
In one of the .m3u8 files of the Test Content, I could find the "content_key_id_list" example value that's also mentioned in the client's README (skd://twelve), which is why I'm asking.
Thanks in advance!
I am creating a camera app where I would like music from another app (Apple Music, Spotify, etc.) to continue playing once the app is opened. Currently I am using .mixWithOthers to do this in my viewDidLoad.
let audioSession = AVAudioSession.sharedInstance()
do {
    // .mixWithOthers lets audio from other apps keep playing alongside this app's session
    try audioSession.setCategory(AVAudioSession.Category.playback, options: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print("error trying to record and play audio")
}
However, I'm running into an issue where the music only plays if you resume playback after you start recording a video. Otherwise, when you open the app, the music stops as soon as you see the preview. The interesting thing is that if you start playing music while recording, the music continues to play in the preview view once you stop. If you close the app (not force close) and reopen it, music playback continues as expected; however, once you force close the app, it returns to the original behavior. I've tried to research this and haven't been able to find anything. Any help is appreciated. Let me know if more details are needed.
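One avenue worth checking (my assumption, not something confirmed in the post): by default, AVCaptureSession reconfigures the shared audio session when it starts running, which can interrupt other apps' audio. Opting out and configuring the session manually looks roughly like this:

// Hedged sketch: stop AVCaptureSession from overriding the audio session.
// captureSession is the app's existing AVCaptureSession instance; a
// playAndRecord category is needed if the recording captures microphone audio.
captureSession.automaticallyConfiguresApplicationAudioSession = false
try audioSession.setCategory(.playAndRecord, options: [.mixWithOthers, .defaultToSpeaker])
try audioSession.setActive(true)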
Hi! I've been working on a project in python that pulls in a bunch of my personal apple music playback history and library, etc.
I can't find a single good/functional example of how to pull the Music User Token via the Android method or MusicKit JS (web). I've spent a lot of hours on this today, and no permutation of the existing examples/documentation has worked.
Any guidance would be much appreciated!! If you have a web app that pulls the music user token, I just need help understanding how to get to the token itself.
Thank you!