Hello,
I develop an application called MiniMeters and am using ScreenCaptureKit to get the desktop audio on macOS. As of macOS 14.2, a few users began noticing that the volume of the incoming audio differed depending on the audio device connected.
One user's Apogee Symphony Desktop is trimmed -14dB, another user's UAD Apollo Twin is -14dB as well, my MOTU M4 is -6dB, and my MacBook Pro's internal speakers show 0dB.
The offset does not change when adjusting the output volume on the interface (as expected), nor when adjusting it digitally in the system.
I cannot find anything documenting this change. It also affects other applications that use ScreenCaptureKit, such as OBS. Does anyone have any idea what this offset correlates to, so I could potentially compensate for it?
Thanks.
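For context, a minimal sketch of the capture setup in question (names like AudioTap are illustrative and error handling is omitted):

import ScreenCaptureKit

// Receives the captured desktop audio; this is where the per-device
// -14/-6/0 dB offsets show up when inspecting sample amplitudes.
final class AudioTap: NSObject, SCStreamOutput {
    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                of type: SCStreamOutputType) {
        guard type == .audio else { return }
        // Measure levels from `sampleBuffer` here.
    }
}

let tap = AudioTap()

func startDesktopAudioCapture() async throws -> SCStream {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    let filter = SCContentFilter(display: content.displays[0], excludingWindows: [])

    let config = SCStreamConfiguration()
    config.capturesAudio = true    // audio capture still requires a display-based filter

    let stream = SCStream(filter: filter, configuration: config, delegate: nil)
    try stream.addStreamOutput(tap, type: .audio, sampleHandlerQueue: .main)
    try await stream.startCapture()
    return stream
}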
Hi,
In iOS 17.0 Apple introduced a favorite button for the music app. This favorite button is available everywhere including Control Center and CarPlay.
My question is whether there is a way to show/use this favorite button in my own app. I couldn't find it in the documentation.
Thank you!
Greetings everyone,
My app crashes when I open and close the camera screen. I have added a subview to the camera view that shows the main screen. The app does not crash every time; it works fine five or six times before it crashes.
I'm using the Quickpose.ai library, and the crash appears to come from inside the library, so I don't know where the problem is. I've included some code and my crash log below.
*** Terminating app due to uncaught exception 'NSGenericException', reason: '*** -[AVCaptureSession startRunning] startRunning may not be called between calls to beginConfiguration and commitConfiguration'
*** First throw call stack:
(0x1889f4870 0x180d13c00 0x1a4e30b44 0x10505cff0 0x1047ed7cc 0x1047ed84c 0x105824f50 0x105826b34 0x10582e98c 0x10582f728 0x10583c5f8 0x10583bc2c 0x1f2365964 0x1f2365a04)
libc++abi: terminating due to uncaught exception of type NSException
[Three screenshots attached, showing the camera code and the crash log.]
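For reference, the exception message itself states the rule being violated: startRunning may not be called between beginConfiguration and commitConfiguration. A minimal sketch of the ordering AVFoundation expects (your actual input/output setup goes in the marked section):

import AVFoundation

let session = AVCaptureSession()

session.beginConfiguration()
// ... add/remove inputs and outputs here ...
session.commitConfiguration()

// Start only after the configuration block is closed; startRunning()
// blocks, so call it off the main thread.
DispatchQueue.global(qos: .userInitiated).async {
    session.startRunning()
}

Since the crash is intermittent, it is worth checking whether a configuration change and startRunning can race on different threads after several opens and closes.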
Hello,
I'm wondering if there is a way to programmatically write a series of UIImages into an APNG, similar to what the code below does for GIFs (credit: https://github.com/AFathi/ARVideoKit/tree/swift_5). I've tried implementing a similar solution, but it doesn't seem to work. My code is included below.
I've also done a lot of searching and have found lots of code for displaying APNGs, but have had no luck with code for writing them.
Any hints or pointers would be appreciated.
func generate(gif images: [UIImage], with delay: Float, loop count: Int = 0, _ finished: ((_ status: Bool, _ path: URL?) -> Void)? = nil) {
    currentGIFPath = newGIFPath
    gifQueue.async {
        let gifSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFLoopCount as String: count]]
        let imageSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFDelayTime as String: delay]]
        guard let path = self.currentGIFPath else { return }
        guard let destination = CGImageDestinationCreateWithURL(path as CFURL, UTType.gif.identifier as CFString, images.count, nil)
        else { finished?(false, nil); return }
        //logAR.message("\(destination)")
        CGImageDestinationSetProperties(destination, gifSettings as CFDictionary)
        for image in images {
            if let imageRef = image.cgImage {
                CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
            }
        }
        if !CGImageDestinationFinalize(destination) {
            finished?(false, nil); return
        } else {
            finished?(true, path)
        }
    }
}
My adaptation of the above code for APNGs (doesn't work; outputs empty file):
func generateAPNG(images: [UIImage], delay: Float, count: Int = 0) {
    let apngSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGLoopCount as String: count]]
    let imageSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGDelayTime as String: delay]]
    guard let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.png.identifier as CFString, images.count, nil)
    else { fatalError("Failed") }
    CGImageDestinationSetProperties(destination, apngSettings as CFDictionary)
    for image in images {
        if let imageRef = image.cgImage {
            CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
        }
    }
}
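One difference worth noting between the two versions: the GIF code calls CGImageDestinationFinalize before reporting success, while the APNG adaptation never finalizes the destination. An unfinalized CGImageDestination writes nothing to disk, which would match the empty output file. The missing step, for comparison:

// Without this call, CGImageDestination never writes its output.
if !CGImageDestinationFinalize(destination) {
    print("APNG finalize failed")
}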
Generate tones, etc. in code? Is there a way?
Regards, Patrick
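One commonly used approach is AVAudioEngine's AVAudioSourceNode, which hands you buffers to fill with computed samples. A minimal sine-tone sketch (the frequency and amplitude values are arbitrary illustrations):

import AVFoundation

let engine = AVAudioEngine()
let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
let frequency = 440.0
var phase = 0.0

let source = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    let increment = 2.0 * .pi * frequency / sampleRate
    for frame in 0..<Int(frameCount) {
        let sample = Float(sin(phase)) * 0.25   // keep the level modest
        phase += increment
        if phase >= 2.0 * .pi { phase -= 2.0 * .pi }
        for buffer in buffers {
            UnsafeMutableBufferPointer<Float>(buffer)[frame] = sample
        }
    }
    return noErr
}

engine.attach(source)
engine.connect(source, to: engine.mainMixerNode, format: nil)
try engine.start()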
Starting with Sonoma 14.2 it is no longer possible to connect Canon cameras to an app via USB using Canon's EDSDK framework. This worked fine up to Sonoma 14.1.
The app using the EDSDK does not crash, but the SDK no longer reports any connected cameras. The camera is connected and can be seen in the system report, as well as in e.g. gphoto2 and even in the EOS Utility software.
It seems that 14.2 introduced a breaking change to camera access from within apps.
I've tried upgrading to the newest EDSDK version and checked both with and without App Sandbox. There is no way to find the camera on 14.2.
Hi, my team is developing an iOS app which connects to Apple Music (currently mostly via the web API, though some tests have been done with MusicKit). In a separate project we used a competing service whose web API times out after 30 seconds if the user doesn't interact with the app somehow (scrolling, buttons, etc.), forcing them to reconnect.
So, the questions I have are:
Is there any timeout after stopping tracks in Apple Music, or does the connection to the Apple Music app stay active?
If the connection can drop on iOS as well, is there any limitation on streaming Apple Music content via the API (i.e., are there licensing issues as well)?
Even with sample code from Apple, I get the same warning every time:
"Error returned from daemon: Error Domain=com.apple.accounts Code=7 "(null)""
Failed to log access with error: access=<PATCCAccess 0x28316b070> accessor:<<PAApplication 0x283166df0 identifierType:auditToken identifier:{pid:1456, version:3962}>> identifier:2EE1A54C-344A-40AB-9328-3F8E8B5E8A85 kind:intervalEvent timestampAdjustment:0 visibilityState:0 assetIdentifierCount:0 tccService:kTCCServicePhotos, error=Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 235 named com.apple.privacyaccountingd" UserInfo={NSDebugDescription=connection to service with pid 235 named com.apple.privacyaccountingd}
The entitlements have been reviewed.
Whom can I contact about removing the gray frame from around the full-screen video player in my iOS mobile application? This is an Apple iOS feature that I have no control over.
The screenshot attached below shows the full-screen view of a video when the iOS phone is held sideways. The issue is the big gray frame around the video: it takes up too much space and needs to be removed so the video can fill the screen.
I recently downloaded Apple Music for Windows 10 and started a new music library, then downloaded all my purchased music albums to my computer. However, the album art is not embedded in any of the m4a files; it is stored in a separate artwork folder. I have Visual Studio 2022 and was wondering how I might go about accessing the Apple Music app through an API. I want a script that goes through all my purchased music, takes the album art from the artwork folder, and embeds it in the purchased m4a files. I found some other programs/scripts online, but they are all for iTunes or for Mac, and none of the old scripts will work because they don't recognize the new Apple Music for Windows music library. I can't go back to iTunes because Apple removed all the music options from the iTunes for Windows app. Any ideas on how to get started writing the app/script would be greatly appreciated. Can I download something from Apple that makes it easy to access the API from Visual Studio 2022 on Windows?
Developing for iPhone/iPad/Mac.
I have an idea for a music training app, but I need to know of supporting libraries for recognizing a musical note's fundamental frequency in close to real time (about 100 ms delay). Accuracy should be within a few cents (hundredths of a semitone).
A search for "music" turned up the Core MIDI library -- fine if I want to take input from MIDI, but I want to be open to audio input too.
And I found MusicKit, which seems to be a programmer's API for digging into the Apple Music catalog rather than analyzing audio.
Meta questions:
Should I be using different search terms?
Where are libraries listed?
What are the big names in third-party libraries?
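For the real-time pitch question, one starting point that needs no third-party library is the Accelerate framework. A naive autocorrelation sketch (the function name, frequency bounds, and buffer size are illustrative assumptions; a production app would want a windowed, FFT-based method):

import Accelerate

// Estimate the fundamental frequency of a mono buffer by picking the
// lag with the strongest autocorrelation within a plausible pitch range.
func estimatePitch(samples: [Float], sampleRate: Float) -> Float? {
    let minFreq: Float = 60, maxFreq: Float = 1600
    let minLag = Int(sampleRate / maxFreq)
    let maxLag = Int(sampleRate / minFreq)
    guard samples.count > maxLag, minLag > 0 else { return nil }

    var bestLag = 0
    var bestValue: Float = 0
    for lag in minLag...maxLag {
        var sum: Float = 0
        // Dot product of the signal with a lagged copy of itself.
        vDSP_dotpr(samples, 1, Array(samples[lag...]), 1, &sum,
                   vDSP_Length(samples.count - lag))
        if sum > bestValue {
            bestValue = sum
            bestLag = lag
        }
    }
    return bestLag > 0 ? sampleRate / Float(bestLag) : nil
}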
I have repeatedly confirmed that if you limit the connection speed to a host serving a video file (mp4), AVPlayer stalls, but after you restore the high-speed connection to the host, the player does not resume playback. If you check the status, there are no errors, just an empty buffer:
AVPlayer.error is nil.
AVPlayerItem.error is nil.
AVPlayerItem.isPlaybackBufferEmpty is true
AVPlayerItem.isPlaybackLikelyToKeepUp is false
Even if you wait a long time, nothing happens, and tapping the play button doesn't help either. The player is frozen forever.
Only calling "seek" or the "playImmediately" method unfreezes the player and resumes playback.
It doesn't happen every time, maybe one time in four.
It seems like AVPlayer has a bug. What do you think?
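A sketch of the workaround described above, wired up as a KVO observation (the function name is illustrative, and `player`/`item` are assumed to come from your own playback code):

import AVFoundation

var observation: NSKeyValueObservation?

func installStallWorkaround(player: AVPlayer, item: AVPlayerItem) {
    observation = item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { item, _ in
        // If the buffer has refilled but the player is still paused
        // (rate == 0) without an error, kick it manually.
        if item.isPlaybackLikelyToKeepUp, player.rate == 0, player.error == nil {
            player.playImmediately(atRate: 1.0)
        }
    }
}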
After encoding with the VideoToolbox library and sending the stream over the network via NDI, the picture can be received and displayed successfully on a Mac, but on Windows only the NDI source name shows up, with no picture displayed.
I'd like to know whether VideoToolbox is unable to encode in a way that Windows can decode correctly, and how this problem should be solved.
I am trying to set up HLS with MV HEVC. I have an MV HEVC MP4 converted with AVAssetWriter that plays as a "spatial video" in Photos in the simulator. I've used ffmpeg to fragment the video for HLS (sample m3u8 file below).
The HLS version of the mp4 plays on a VideoMaterial with an AVPlayer in the simulator, but it is hard to determine whether the streamed video is actually stereo. Is there any guidance on confirming that the streamed mp4 video is properly being read as stereo?
Additionally, I see that REQ-VIDEO-LAYOUT is required for multivariant HLS. However, if there is ONLY stereo video in the playlist, is it still needed? Are there any other configurations needed to make the device read the stream as stereo?
Sample m3u8 playlist
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:13
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:12.512500,
sample_video0.ts
#EXTINF:8.341667,
sample_video1.ts
#EXTINF:12.512500,
sample_video2.ts
#EXTINF:8.341667,
sample_video3.ts
#EXTINF:8.341667,
sample_video4.ts
#EXTINF:12.433222,
sample_video5.ts
#EXT-X-ENDLIST
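For reference, a hypothetical multivariant playlist advertising the stereo layout might look like the sketch below; the bandwidth, codec string, and resolution are placeholders, and REQ-VIDEO-LAYOUT requires playlist version 12:

#EXTM3U
#EXT-X-VERSION:12
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.B0",RESOLUTION=1920x1080,REQ-VIDEO-LAYOUT="CH-STEREO"
sample_video.m3u8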
I'm running this SwiftUI sample app for photos without any modifications except for adding my developer profile, which is necessary to build it. When I tap on the thumbnail to see the photo library (after granting access to my photo library), I see that some of the thumbnails are stuck in a loading state, and when I tap on thumbnails, I only see a low-resolution image (the thumbnail), not the full-resolution image that should load.
In the console I can see this error that occurs when tapping on a thumbnail to see the full-resolution image:
CachedImageManager requestImage error: The operation couldn’t be completed. (PHPhotosErrorDomain error 3164.)
When I make a few modifications necessary to run the app as a native macOS app, all the thumbnails load immediately, and clicking on them reveals the full-resolution images.
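For what it's worth, a sketch of one thing to check, assuming error 3164 maps to PHPhotosError.networkAccessRequired: iCloud-backed originals need network access explicitly enabled on the request (`asset` here stands in for the PHAsset being loaded):

import Photos

let options = PHImageRequestOptions()
options.isNetworkAccessAllowed = true   // allow fetching the full-size original from iCloud
options.deliveryMode = .highQualityFormat

PHImageManager.default().requestImage(
    for: asset,
    targetSize: PHImageManagerMaximumSize,
    contentMode: .aspectFit,
    options: options
) { image, info in
    // `image` should now be the full-resolution version when it is available
}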
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything was great until Sonoma came out and I found myself consuming 40% CPU for the exact same workload.
So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps; the CPU consumption stays at 40%.
"Reactions" is nothing but a useless toy to please some WWDC keynote fanboys, I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery.
Now, how do I make it go away, for ever?
Best regards
Jacob
I have created a recording application, but if the user switches the device off or kills the application, how can I save the ongoing recording at that moment?
I want to add a small play button to play a song preview on my site. I don't want to use the embedded player because I want to style it myself. If I use the MusicKit Web library, am I legally required to add Apple branding? I don't see anywhere that says it's required, and I want to confirm.
We have to play some encrypted videos from server.
In AVAssetResourceLoaderDelegate we get the CKC data correctly and respond with it. The video then starts playing and stops immediately. When we check the playerItem's error description we get Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}. Has anyone encountered this?
When I try to initialize SCContentFilter(desktopIndependentWindow: window), I get a very weird error:
Assertion failed: (did_initialize), function CGS_REQUIRE_INIT, file CGInitialization.c, line 44.
This is a brand new CLI project on Sonoma 14.2.1. Any ideas?