Post not yet marked as solved
I understand why there isn't an API to remove items from the library or a playlist, but I propose a way to do it safely: show a system alert to confirm the removal. This is exactly how deletion is permitted for the Photos library via third-party apps.
Hi, I am working on a music app that does some sound analysis. My end goal is to integrate the Apple Music API so users can search for songs. When a song is played I want to show a soundwave/spectrogram of it. For sound visualization I have AVFoundation, but the Music API returns only a song ID, which can be played only through Apple's MediaPlayer framework. Is there any API for doing sound analysis, or can I play Apple Music songs using AVFoundation?
Hi,
I have a use case where I need to add multiple songs to Apple Music playlists. I am using the MPMediaPlaylist.addItem(withProductID:) function to add a single song to a playlist. Is there a way to convert a song to an MPMediaItem so that I can use MPMediaPlaylist.add([MPMediaItem]) to add multiple songs at once?
Regards
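As far as I know there is no public API that turns a store product ID into an MPMediaItem directly. One workaround, sketched here as an untested assumption (the helper name addSongs is mine, not an API), is to chain addItem(withProductID:) calls so the songs are added one at a time, in order:

```swift
import MediaPlayer

// Hypothetical helper: adds store product IDs to a playlist sequentially by
// recursing through the array inside each completion handler.
func addSongs(withProductIDs ids: [String],
              to playlist: MPMediaPlaylist,
              completion: @escaping (Error?) -> Void) {
    guard let first = ids.first else {
        completion(nil) // every ID has been added
        return
    }
    playlist.addItem(withProductID: first) { error in
        if let error = error {
            completion(error) // stop on the first failure
        } else {
            addSongs(withProductIDs: Array(ids.dropFirst()),
                     to: playlist,
                     completion: completion)
        }
    }
}
```

The recursion keeps the additions ordered, which a plain loop over asynchronous addItem calls would not guarantee.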
I am trying to get fullSizeImageURL and then calculate the MD5 of an asset via its URL:
var assetUrl = syncUrl(asset: asset)
let assetMD5 = md5File(url: assetUrl, isCancelled: { false })
To get the URL I use requestContentEditingInput. The problem is that contentEditingInput can be nil. There were cases where my app worked on my new iPhone the first few times; after some tries the problem went away and contentEditingInput was always non-nil. I'm not sure, but I think that was because all assets were cached.
So my questions are:
Why can contentEditingInput be nil?
How can I get the asset URL if contentEditingInput is nil?
How can I clear the Photos asset cache to reproduce this issue, if it is cache-related?
Part of the code I use:
extension PHAsset {
    func getURL(completionHandler: @escaping ((_ responseURL: URL?) -> Void)) {
        let options = PHContentEditingInputRequestOptions()
        options.canHandleAdjustmentData = { (adjustmeta: PHAdjustmentData) -> Bool in
            return true
        }
        options.isNetworkAccessAllowed = true
        self.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput: PHContentEditingInput?, info: [AnyHashable: Any]) -> Void in
            // NOTE: contentEditingInput can be nil here
            completionHandler(contentEditingInput?.fullSizeImageURL)
        })
    }
}
The MD5 function:
func md5File(url: URL, isCancelled: @escaping () -> Bool) -> Data? {
    let bufferSize = 1024 * 1024
    do {
        // Open file for reading:
        let file = try FileHandle(forReadingFrom: url)
        defer {
            file.closeFile()
        }
        // Create and initialize MD5 context:
        var context = CC_MD5_CTX()
        CC_MD5_Init(&context)
        // Read up to `bufferSize` bytes, until EOF is reached, and update MD5 context:
        while autoreleasepool(invoking: {
            if isCancelled() { return false }
            let data = file.readData(ofLength: bufferSize)
            if data.count > 0 {
                data.withUnsafeBytes {
                    _ = CC_MD5_Update(&context, $0.baseAddress, numericCast(data.count))
                }
                return true // Continue
            } else {
                return false // End of file
            }
        }) { }
        if isCancelled() { return nil }
        // Compute the MD5 digest:
        var digest: [UInt8] = Array(repeating: 0, count: Int(CC_MD5_DIGEST_LENGTH))
        _ = CC_MD5_Final(&digest, &context)
        return Data(digest)
    } catch {
        print("Cannot open file:", error.localizedDescription)
        return nil
    }
}
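As an aside on the helper above: CC_MD5 has been deprecated since iOS 13 / macOS 10.15. If you can target those versions, CryptoKit's Insecure.MD5 supports the same incremental pattern with less boilerplate. A sketch under that assumption:

```swift
import CryptoKit
import Foundation

// Sketch: incremental MD5 over a file using CryptoKit (iOS 13+ / macOS 10.15+),
// keeping the same chunked FileHandle read as the CommonCrypto version.
func md5FileCryptoKit(url: URL) throws -> Data {
    let file = try FileHandle(forReadingFrom: url)
    defer { file.closeFile() }
    var hasher = Insecure.MD5()
    while autoreleasepool(invoking: {
        let chunk = file.readData(ofLength: 1024 * 1024)
        guard !chunk.isEmpty else { return false } // EOF
        hasher.update(data: chunk)
        return true // continue reading
    }) { }
    return Data(hasher.finalize())
}
```

This also removes the unsafe-pointer dance around CC_MD5_Update.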
I stumbled upon a really peculiar issue while running my application, which features both functionality to create image and video attachments from the user's photos and videos using the PHPicker API, and previews of said attachments using the QuickLook API.
Whenever I tap to open a photo (e.g. a JPEG) or a video (e.g. an MP4) using the QuickLook API, on an iPhone 13 iOS 15.4 simulator device, Xcode 13.3, on an M1 MacBook Pro host machine, I get a crash with the following call stack:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[__NSPlaceholderDictionary initWithObjects:forKeys:count:]: attempt to insert nil object from objects[2]'
terminating with uncaught exception of type NSException
*** First throw call stack:
(
0 CoreFoundation 0x000000011003bd44 __exceptionPreprocess + 242
1 libobjc.A.dylib 0x000000010cd1ea65 objc_exception_throw + 48
2 CoreFoundation 0x00000001100bcf47 _CFThrowFormattedException + 200
3 CoreFoundation 0x00000001100c7417 -[__NSPlaceholderDictionary initWithObjects:forKeys:count:].cold.5 + 0
4 CoreFoundation 0x00000001100aa827 -[__NSPlaceholderDictionary initWithObjects:forKeys:count:] + 243
5 CoreFoundation 0x000000011003a998 +[NSDictionary dictionaryWithObjects:forKeys:count:] + 49
6 QuickLook 0x0000000110f37636 __49+[QLItem(PreviewInfo) contentTypesToPreviewTypes]_block_invoke + 484
7 libdispatch.dylib 0x000000011b0f5a5b _dispatch_client_callout + 8
8 libdispatch.dylib 0x000000011b0f6f24 _dispatch_once_callout + 66
9 QuickLook 0x0000000110f37450 +[QLItem(PreviewInfo) contentTypesToPreviewTypes] + 46
10 QuickLook 0x0000000110f37d91 -[QLItem(PreviewInfo) _uncachedPreviewItemTypeForContentType:] + 117
11 QuickLook 0x0000000110f382fe -[QLItem(PreviewInfo) _previewItemTypeForType:] + 150
12 QuickLook 0x0000000110f38104 -[QLItem(PreviewInfo) _getPreviewItemType] + 61
13 QuickLook 0x0000000110fbf4d2 -[QLItem previewItemType] + 59
14 QuickLook 0x0000000110f6ed15 +[QLItemFetcherFactory fetcherForPreviewItem:] + 90
15 QuickLook 0x0000000110fbeb9b -[QLItem fetcher] + 44
16 QuickLook 0x0000000110f22880 -[QLPreviewController previewItemAtIndex:withCompletionHandler:] + 153
17 QuickLook 0x0000000110f587a8 __63-[QLPreviewItemStore previewItemAtIndex:withCompletionHandler:]_block_invoke + 262
18 QuickLook 0x0000000110f8c621 QLRunInMainThread + 51
19 QuickLook 0x0000000110f58673 -[QLPreviewItemStore previewItemAtIndex:withCompletionHandler:] + 181
20 QuickLook 0x0000000110f1c89f -[QLPreviewController internalCurrentPreviewItem] + 222
21 QuickLook 0x0000000110f9c140 -[QLPreviewController(Overlay) _actionButton] + 344
22 QuickLook 0x0000000110f9c81c -[QLPreviewController(Overlay) _toolBarButtonsWithTraitCollection:] + 1147
23 QuickLook 0x0000000110f9a2ff -[QLPreviewController(Overlay) _updateOverlayButtonsIfNeededWithTraitCollection:animated:updatedToolbarButtons:] + 129
24 QuickLook 0x0000000110f999a2 -[QLPreviewController(Overlay) updateOverlayAnimated:animatedButtons:forceRefresh:withTraitCollection:] + 152
25 QuickLook 0x0000000110f1c707 -[QLPreviewController _setCurrentPreviewItemIndex:updatePreview:animated:] + 230
26 QuickLook 0x0000000110f22416 -[QLPreviewController reloadData] + 418
27 <MYAPP> 0x0000000102a03293 __53-[MyPreviewController downloadAttachment]_block_invoke_2 + 611
28 libdispatch.dylib 0x000000011b0f4816 _dispatch_call_block_and_release + 12
29 libdispatch.dylib 0x000000011b0f5a5b _dispatch_client_callout + 8
30 libdispatch.dylib 0x000000011b104325 _dispatch_main_queue_drain + 1169
31 libdispatch.dylib 0x000000011b103e86 _dispatch_main_queue_callback_4CF + 31
32 CoreFoundation 0x000000010ffa8261 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 9
33 CoreFoundation 0x000000010ffa2a56 __CFRunLoopRun + 2761
34 CoreFoundation 0x000000010ffa1a90 CFRunLoopRunSpecific + 562
35 GraphicsServices 0x00000001209c8c8e GSEventRunModal + 139
36 UIKitCore 0x000000013499c90e -[UIApplication _run] + 928
37 UIKitCore 0x00000001349a1569 UIApplicationMain + 101
38 <MYAPP> 0x0000000102970669 main + 233
39 dyld 0x000000010c6c4f21 start_sim + 10
40 ??? 0x00000002041ad51e 0x0 + 8658801950
41 ??? 0x0000000000000003 0x0 + 3
I also tried selecting a photo or a video using the PHPicker APIs, and although I get back the file, if I query the UTType from the file extension using the [UTType typeWithFilenameExtension:] method, I get back a nil result if the extension passed is either "jpeg" or "mp4".
The above does not happen if I run the same simulator (iPhone 13 / iOS 15.4 / Xcode 13.3) on an Intel MacBook Pro host machine, or on a real device.
Any ideas?
We implemented the new PHPicker and have run into an issue we haven't been able to replicate on our own devices, but we see a lot of users running into it. The problem is an error that occurs after getting the PHPickerResults and trying to load the UIImages.
Because the user can select several images at once, what we do is get the results and iterate over each itemProvider object.
I'm following Apple's guidance and checking itemProvider canLoadObjectOfClass:UIImage.class before executing itemProvider loadObjectOfClass:UIImage.class.
However we are getting hundreds of reports of users where this last method returns an error.
Firstly, this is how we configure our PHPickerViewController:
PHPickerConfiguration *configuration = [[PHPickerConfiguration alloc] init];
configuration.selectionLimit = self.pictureSelectionLimit;
configuration.filter = PHPickerFilter.imagesFilter;
configuration.preferredAssetRepresentationMode = PHPickerConfigurationAssetRepresentationModeCurrent;

PHPickerViewController *pickerViewController = [[PHPickerViewController alloc] initWithConfiguration:configuration];
pickerViewController.delegate = self;
pickerViewController.modalPresentationStyle = UIModalPresentationFullScreen;
[viewController presentViewController:pickerViewController animated:YES completion:nil];
And this is what we do with the PHPickerResult. This method invokes a block with an array of NewPicture objects, each instantiated with the UIImage we should be getting.
NSMutableArray *picArray = [[NSMutableArray alloc] init];
NSArray *itemProviders = [self.results custom_map:^id _Nullable (PHPickerResult *_Nonnull current) {
    return current.itemProvider;
}];

dispatch_group_t dispatchGroup = dispatch_group_create();
for (NSItemProvider *itemProvider in itemProviders) {
    dispatch_group_enter(dispatchGroup);
    /**
     We cannot properly retrieve raw type images with the current authorization status.
     If the image is of type raw, we ignore it.
     */
    if ([itemProvider hasItemConformingToTypeIdentifier:@"public.camera-raw-image"]) {
        NSException *exception = [NSException exceptionWithName:@"ImageIsTypeRaw"
                                                         reason:[NSString stringWithFormat:@"Object is type raw. ItemProvider: %@", itemProvider.description]
                                                       userInfo:nil];
        // Log exception...
        dispatch_group_leave(dispatchGroup);
        continue;
    }
    if ([itemProvider canLoadObjectOfClass:UIImage.class]) {
        [itemProvider loadObjectOfClass:UIImage.class completionHandler:^(__kindof id<NSItemProviderReading> _Nullable object, NSError *_Nullable error) {
            if ([object isKindOfClass:UIImage.class]) {
                NewPicture *picture = [[NewPicture alloc] initWithImage:object];
                [picArray addObject:picture];
            }
            if (error) {
                NSException *exception = [NSException exceptionWithName:@"CouldNotLoadImage"
                                                                 reason:[NSString stringWithFormat:@"Object is nil. UserInfo: %@", error.userInfo]
                                                               userInfo:error.userInfo];
                // Log exception...
            }
            dispatch_group_leave(dispatchGroup);
        }];
    }
}
dispatch_group_notify(dispatchGroup, dispatch_get_main_queue(), ^{
    picturesBlock(picArray);
});
The most common error we see our users are getting is:
Object is nil. UserInfo: {
NSLocalizedDescription = "Cannot load representation of type public.jpeg";
NSUnderlyingError = "Error Domain=NSCocoaErrorDomain Code=260 \"The file \U201cversion=1&uuid=*&mode=current.jpeg\U201d couldn\U2019t be opened because there is no such file.\" UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/*/File%20Provider%20Storage/photospicker/version=1&uuid=*&mode=current.jpeg, NSFilePath=/private/var/mobile/Containers/Shared/AppGroup/*/File Provider Storage/photospicker/version=1&uuid=***&mode=current.jpeg, NSUnderlyingError=0x283822970 {Error Domain=NSPOSIXErrorDomain Code=2 \"No such file or directory\"}}";
}
I'm having a really hard time understanding why this sometimes fails. I'd really appreciate it if someone could give me a hand with this.
I'm attaching the stack trace:
stack_trace - https://developer.apple.com/forums/content/attachment/051f7018-05ff-4ad1-a626-29f248d0b497
In a WKWebView I am loading a page that has a button allowing a user to upload a photo for a profile. Clicking the button prompts the user to select from their photo library, camera, or a file, the same way Safari does.
Choose a file works as expected
Camera or Photo Library fail with a log:
[core] "Error returned from daemon: Error Domain=com.apple.accounts Code=7 "(null)""
2022-04-20 12:32:29.010947-0400
[PAAccessLogger] Failed to log access with error: access=<PATCCAccess 0x283e88990> accessor:<<PAApplication 0x28137a260 identifierType:auditToken identifier:{pid:810, version:2204}>> identifier:789F7A90-3A0B-4FC3-8E69-8863321CDE2F kind:intervalEnd timestampAdjustment:0 tccService:kTCCServicePhotos, error=Error Domain=PAErrorDomain Code=10 "Possibly incomplete access interval automatically ended by daemon"
I also do not get a request for camera or photo library access like I would expect. I DO have the camera and photo library permissions in my Info.plist; if they are omitted, I get a different, more descriptive failure prompting me to add them.
Mobile Safari works in all cases.
Test device is an iPhone X running iOS 15.4.1.
App is built with Xcode 13.
Hi all. I need help with Big Sur Beta 4: in betas 1 and 2, the third-party driver pack for the Feitian Technology ePass2003 worked fine and the device was recognized, until the update to Beta 4.
Is it possible to copy or replace some files in the system, such as in the /usr folder, so that this media is recognized again?
If there is a way, please help; I am not an advanced user.
I think it is a Java problem, with dylib files missing on Big Sur Beta 4.
TL;DR: a C function can't open an MP3 file in the app's documents directory. Am I missing some sort of permission?
I am trying to create an app to play music through the BASS audio library for C/C++, and while I have it playing music, I cannot get it to open local files.
To create a stream from a file to play in this library, you use the BASS_StreamCreateFile() function, to which you pass a URL for the file. Even though I can verify that the URL I'm passing is correct and the file is visible in the Files app, it throws error code 2, "Cannot open file".
However, when I use BASS_StreamCreateURL() and pass in a URL from the internet, it works perfectly, so I have to assume the problem has something to do with file permissions.
Here is the C function in which I am creating these streams:
int createStream(const char* url) {
    //HSTREAM stream = BASS_StreamCreateURL("https://vgmsite.com/soundtracks/legend-of-zelda-ocarina-of-time-original-sound-track/fticxozs/68%20-%20Gerudo%20Valley.mp3", 0, 0, NULL, 0);
    HSTREAM stream = BASS_StreamCreateFile(false, url, 0, 0, 0);
    if (stream == 0) {
        printf("Error at createStream, error code: %i\n", BASS_ErrorGetCode());
        return 0;
    } else {
        return stream;
    }
}
The commented-out line is the working stream created from a URL.
And here is the URL I am passing in:
guard let documentsURL = fileManager.urls(for: .documentDirectory, in: .userDomainMask).first else { return }
gerudoValleyURL = documentsURL.appendingPathComponent("GerudoValley.mp3")
stream = createStream(gerudoValleyURL.absoluteString)
I can confirm that the MP3 "GerudoValley.mp3" is in the app's documents directory in the Files app.
Is there anything I can do to allow this C function to open MP3s from the app's documents directory? The exact MP3 from that link is already there.
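One thing that may be worth ruling out (an assumption on my part, not a confirmed diagnosis): URL.absoluteString for a file URL yields a string with the file:// scheme and percent-encoding, while a C file API generally expects a plain POSIX path, which URL.path provides. A minimal Foundation-only sketch of the difference (the paths here are made up for illustration):

```swift
import Foundation

// Hypothetical documents URL for illustration; in the app this would come
// from FileManager.urls(for: .documentDirectory, in: .userDomainMask).
let documentsURL = URL(fileURLWithPath: "/var/mobile/Documents")
let songURL = documentsURL.appendingPathComponent("GerudoValley.mp3")

// absoluteString keeps the scheme, which fopen()-style C APIs cannot open:
print(songURL.absoluteString) // "file:///var/mobile/Documents/GerudoValley.mp3"

// path is the plain POSIX path such APIs expect:
print(songURL.path)           // "/var/mobile/Documents/GerudoValley.mp3"
```

So if BASS_StreamCreateFile() treats its argument as a filesystem path, passing gerudoValleyURL.path instead of gerudoValleyURL.absoluteString might be the difference.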
I'd appreciate some help diagnosing a crash in MPMediaLibrary that occurs on some iOS 15.1.x user devices. So far the issue can't be reproduced with test devices ranging from iOS 10 to iOS 15.2, whether via Xcode 12.4 or TestFlight.
The app needs access to the device's microphone and to audio files, which are selected via the MPMediaPickerController. The usual usage description keys (namely NSMicrophoneUsageDescription, NSAppleMusicUsageDescription and kTCCServiceMediaLibrary) are specified in the Info.plist.
The crash relates to library access and involves the iTunesCloud binary. The crash occurs when the app starts, probably for the first time after installation (not the best way to greet new users).
Here is a sample crash report:
iOS15_1_Crash_MPMediaLibrary_authorizationStatus_anon.txt
Here is a typical traceback:
Thread 0 Crashed:
0 libsystem_kernel.dylib 0x00000001b8964504 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x00000001b8964b9c mach_msg + 76 (mach_msg.c:119)
2 libdispatch.dylib 0x000000018165227c _dispatch_mach_send_and_wait_for_reply + 520 (mach.c:815)
3 libdispatch.dylib 0x000000018165262c dispatch_mach_send_with_result_and_wait_for_reply + 56 (mach.c:2019)
4 libxpc.dylib 0x00000001f2576b9c xpc_connection_send_message_with_reply_sync + 240 (connection.c:974)
5 TCC 0x00000001e961c0c0 tccd_send_message + 940 (TCC.c:490)
6 TCC 0x00000001e9621e08 __TCCAccessRequest_block_invoke.213 + 876 (TCC.c:591)
7 libdispatch.dylib 0x0000000181637660 _dispatch_client_callout + 20 (object.m:560)
8 libdispatch.dylib 0x00000001816468b4 _dispatch_lane_barrier_sync_invoke_and_complete + 56 (queue.c:1016)
9 libsystem_trace.dylib 0x000000019c147668 _os_activity_initiate_impl + 64 (activity.c:131)
10 TCC 0x00000001e961d4e0 TCCAccessRequest + 476 (TCC.c:1019)
11 TCC 0x00000001e961c73c TCCAccessPreflight + 300 (TCC.c:1651)
12 iTunesCloud 0x0000000198d58160 -[ICCloudServiceStatusMonitor authorizationStatusForScopes:] + 60 (ICCloudServiceStatusMonitor.m:689)
13 MediaPlayer 0x000000018a8c8ee4 +[MPMediaLibrary authorizationStatus] + 64 (MPMediaLibrary.m:778)
16 MyAppppp 0x00000001003e9880 AppDelegate.application(_:willFinishLaunchingWithOptions:) + 32 (AppDelegate.swift:23)
17 MyAppppp 0x00000001003e9880 @objc AppDelegate.application(_:willFinishLaunchingWithOptions:) + 136 (<compiler-generated>:20)
The app startup logic is basically:
func application(_ application: UIApplication, willFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool
{
    let audioSession = AVAudioSession.sharedInstance()
    if audioSession.recordPermission == .undetermined
    {
        audioSession.requestRecordPermission { granted in
            // Permission initialized
        }
    }
    if MPMediaLibrary.authorizationStatus() == .notDetermined // Sometimes crashes in iOS 15.1 on user devices
    {
        MPMediaLibrary.requestAuthorization { status in
            // Permission initialized
        }
    }
    if MPMediaLibrary.authorizationStatus() == .authorized
    {
        // The following code sometimes crashes in iOS 15.1 on user devices without the enclosing ".authorized" check
        let predicate = MPMediaPropertyPredicate(value: defaultSongId, forProperty: MPMediaItemPropertyPersistentID)
        let query = MPMediaQuery()
    }
    return true
}
Questions:
What exactly is the iTunesCloud binary doing? Is additional configuration needed?
Why does the crash happen only on some iOS 15.1 user devices?
Is it too early in the app lifecycle to call MPMediaLibrary.authorizationStatus?
Post-crash workaround: Are users able to grant access to the library via the device’s global privacy settings?
Thanks!
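On the last two questions: a common pattern (a sketch under the assumption that deferring the work is acceptable, not a confirmed fix for this crash) is to touch the library only from inside the requestAuthorization callback, rather than at launch:

```swift
import MediaPlayer

// Sketch: perform the MPMediaQuery only after authorization has been resolved.
// `defaultSongId` is assumed to be defined elsewhere, as in the code above.
func loadDefaultSong(withPersistentID defaultSongId: MPMediaEntityPersistentID) {
    MPMediaLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        DispatchQueue.main.async {
            let predicate = MPMediaPropertyPredicate(value: defaultSongId,
                                                     forProperty: MPMediaItemPropertyPersistentID)
            let query = MPMediaQuery()
            query.addFilterPredicate(predicate)
            _ = query.items // query runs only after access was granted
        }
    }
}
```

requestAuthorization invokes its handler immediately when the status is already determined, so this also covers users who granted access via the global privacy settings after a crash.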
Good day. I am currently experiencing issues gaining access to the iOS media library to save files. In my React Native app I use the react-native-permissions library (RNPermissions) to set up permissions. I have followed all the steps in the documentation but have had no success.
I will attach a screenshot of the error below.
My Podfile:
require_relative '../node_modules/@react-native-community/cli-platform-ios/native_modules'
require_relative '../mobile_sdk/SalesforceMobileSDK-iOS/mobilesdk_pods'

platform :ios, '13.0'
use_frameworks!

pre_install do |installer|
  installer.pod_targets.each do |pod|
    if pod.name.eql?('RNPermissions') || pod.name.start_with?('Permission-')
      def pod.build_type
        Pod::BuildType.static_library # I assume you use CocoaPods >= 1.9
      end
    end
  end
end

project 'myfinglobal.xcodeproj'

target 'myfinglobal' do
  source 'https://cdn.cocoapods.org/'
  config = use_native_modules!
  use_react_native!(:path => config["reactNativePath"])
  use_mobile_sdk!(:path => '../mobile_sdk/SalesforceMobileSDK-iOS')
  pod 'SalesforceReact', :path => '../node_modules/react-native-force'
  pod 'React', :path => '../node_modules/react-native/'
  pod 'React-cxxreact', :path => '../node_modules/react-native/ReactCommon/cxxreact'
  permissions_path = '../node_modules/react-native-permissions/ios'
  pod 'Permission-Camera', :path => "../node_modules/react-native-permissions/ios/Camera/Permission-Camera.podspec"
end

# To avoid Xcode 12 compilation errors in RNScreens and RNCMaskedView
pre_install do |installer|
  installer.pod_targets.each do |pod|
    if pod.name.eql?('RNScreens') || pod.name.eql?('RNCMaskedView')
      def pod.build_type
        Pod::BuildType.static_library
      end
    end
  end
end

# Comment out the following if you do not want the SDK to emit signpost events for instrumentation. Signposts are enabled for non-release versions of the app.
post_install do |installer|
  signposts_post_install(installer)
end
Hi, I want to show a shared photo album in my app. Using PHPhotoLibrary.shared() and PHAssetCollection I get all photos of the requested album, and I am notified when the album content changes. So far so good, but I want to access the comments on the photos and the post text, which can be added when adding photos to a shared album. Does anyone have a solution for this? Regards, Sven
Hi,
I use following code to add listener for kCMIODevicePropertyDeviceIsRunningSomewhere to get using camera notification.
opa.mSelector = kCMIODevicePropertyDeviceIsRunningSomewhere;
opa.mScope = kCMIOObjectPropertyScopeGlobal;
opa.mElement = kCMIOObjectPropertyElementMaster;
result = CMIOObjectAddPropertyListenerBlock(CMIOObjectID(camera), &opa, NULL, CameralistenerBlock);
This code worked as expected before.
But when I test it on macOS 12 on M1, the listener callback block is invoked by mistake when I open Music.app or iTunes.
When I run my application as two processes, the first process gets the listener callback when the second process invokes CMIOObjectAddPropertyListenerBlock.
Is this a bug in macOS 12, or has the spec changed?
Any suggestions?
Thank you very much!
Crashed: com.apple.MediaPlayer.MPNowPlayingInfoCenter/accessQueue
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000001dd4a68
The app crashes after updating the OS to iOS 15.1, at first launch only; after the crash it works fine. On earlier versions such as 15.0 it worked fine.
While debugging I found that, on first launch, the app gets stuck in the code below during the audio/video permission request.
if ([AVCaptureDevice respondsToSelector:@selector(requestAccessForMediaType:completionHandler:)]) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
            });
        } else {
        }
    }];
} else {
}
This issue blocks our ability to serve our product on iOS devices (since Web Audio APIs are not supported in any browser other than Safari on iOS) and in the Safari browser on desktop. One of our customers is currently heavily impacted by this limitation. Safari, built on WebKit, cannot provide access to raw audio data via AudioContext for HLS playback, although this works for MP4 files. This is supported by every other major browser except Safari, which is concerning: we will need to stop users from using our application in desktop Safari, and we simply cannot serve iPhone and iPad users at all, a blocker for us given that more than half of our users are on iOS devices. This feature arguably should already be in place in Safari, which is lagging behind other browsers here. The W3C specification already supports this, and all other major browsers have implemented and support HLS streams with AudioContext.
We'd like to reiterate the importance and urgency of this (https://bugs.webkit.org/show_bug.cgi?id=231656). It has been raised multiple times by other developers as well, so fixing it would help thousands of other web developers bring HLS-based applications to life on Safari and the iOS ecosystem.
Can we please get visibility into the plan and timeline for HLS support with AudioContext in Safari? Critical parts of our business and our customers' products depend on this support.
We're using new webkitAudioContext() in Safari 15.0 on MacBook and in iOS Safari on iPhone and iPad to create an AudioContext instance. We create a ScriptProcessorNode and attach it to the HLS/m3u8 source created using audioContext.createMediaElementSource(). The onaudioprocess callback gets called with the audio data, but no data is processed; instead we get all zeros.
If you also connect an AnalyserNode to the same audio source created using audioContext.createMediaElementSource(), analyser.getByteTimeDomainData(dataArray) populates no data either, just like onaudioprocess on the ScriptProcessorNode on the same source.
What has been tried:
We confirmed that the stream being used is the only stream in the tab and that createMediaElementSource() was only called once to get the stream.
We confirmed that if the stream source is MP4/MP3 it works with no issues and data is received in onaudioprocess, but when changing the source to HLS/m3u8 it does not work.
We also tried using MediaRecorder with HLS/m3u8 as the stream source but didn't get any events or data.
We also tried creating two AudioContexts, with the first AudioContext as the source, passing createMediaElementSource as the destination to the other AudioContext and then to the ScriptProcessorNode, but Safari does not allow more than one output.
Currently none of the scenarios we tried works, and this is a major blocker for us and for our customers.
Code sample used to create the ScriptProcessorNode:
const AudioContext = window.AudioContext || window.webkitAudioContext;
audioContext = new AudioContext();

// Create a MediaElementAudioSourceNode
// Feed the HTML video element 'VideoElement' into it
const audioSource = audioContext.createMediaElementSource(VideoElement);

const processor = audioContext.createScriptProcessor(2048, 1, 1);
processor.connect(audioContext.destination);
processor.onaudioprocess = (e) => {
  // Does not get called when connected to external microphone
  // Gets called when using internal MacBook microphone
  console.log('print audio buffer', e);
};
The exact same behavior is also observed on iOS Safari on iPhone and iPad.
We are asking for your help on this matter ASAP.
Thank you!
How can I play a video using AVPlayer? I have retrieved the file URL, i.e. file:///Users/admin/Library/Developer/CoreSimulator/Devices/718B08F8-D4DD-44E6-9DFA-0E81D5EDA78C/data/Containers/Shared/AppGroup/D82C51F4-E1B2-4390-9885-296A185ACF16/File%20Provider%20Storage/photospicker/version=1&uuid=BCC39930-E835-4BBE-A6F1-716B21CA10A0&mode=compatible.mov
How do I play a video from this URL?
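A minimal sketch of playback, assuming the URL above is stored in a variable named fileURL and the file is still readable by the app (picker-provided files can be temporary):

```swift
import AVKit

// Sketch: present the movie with AVPlayerViewController.
// `fileURL` and `presentingViewController` are assumptions standing in
// for variables that would exist in the real app.
func play(fileURL: URL, from presentingViewController: UIViewController) {
    let player = AVPlayer(url: fileURL)
    let controller = AVPlayerViewController()
    controller.player = player
    presentingViewController.present(controller, animated: true) {
        player.play()
    }
}
```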
Hello Apple World,
I am working on an app that adds Apple Music songs to an MPMediaPlaylist.
I was able to do this successfully in iOS 13.7.
However, today I started testing on iOS 14.0.1 and the same code causes my app to freeze completely.
print("PlaylistManager.addItem playlist \(playlist) identifier \(identifier) MPMediaLibrary.authorizationStatus() \(MPMediaLibrary.authorizationStatus().rawValue)")
playlist.addItem(withProductID: identifier) { (error) in
    if error != nil {
        print("An error occurred while adding an item to the playlist: \(error!.localizedDescription)")
    }
}
Output:
"PlaylistManager.addItem playlist <MPConcreteMediaPlaylist: 0x2819dc4d0> identifier 1531532609 MPMediaLibrary.authorizationStatus() 3"
Because I don't get any errors printed (the app just freezes up) and I don't see any crash logs in Devices and Simulators, I am not sure how to proceed.
Is anyone else running into this issue with MPMediaPlaylist.addItem in iOS 14?
Is there a general way to debug app freezes?
Any help will be very much appreciated;
Thank you in advance!
Marvin
When I use presentLimitedLibraryPicker, or when the user taps "Select Photos..." for the library, there is an abnormal condition: the search box is transparent. This only happens on iOS 14; it does not happen on iOS 15 beta. Where can I handle this?