I have a macOS app that uses screen capture logic. It was originally coded using the Quartz CG API:
if let cgimage = CGDisplayCreateImage(CGMainDisplayID(), rect: cgRect) {...}
and this worked as expected even when capturing a screen rect that began on the secondary monitor and ended on the primary monitor (or was entirely contained on the secondary monitor).
However, that API is now deprecated and you're supposed to use ScreenCaptureKit instead, so I have attempted to convert the code. The trial code is:
let scConfig = SCStreamConfiguration()
scConfig.sourceRect = drect
scConfig.width = Int(drect.width)
scConfig.height = Int(drect.height)
SCScreenshotManager.captureImage(contentFilter: sFilter, configuration: scConfig) { image, error in
    if let cgim = image {
        print("image dims \(cgim.width), \(cgim.height), requested: \(drect)")
        self.writeToFile2(cgim)
    } else {
        print("SCREEN CAP failed")
    }
}
...
where sFilter was previously set up for the main display (with no exclusions). This code also "works" as long as the capture rect is entirely on the primary monitor, but it fails if the rect spans both monitors or is fully contained on the secondary monitor. (By "fails" I mean it produces an empty image.)
So my question is: how do I use ScreenCaptureKit to obtain a screenshot of a rectangle that spans dual monitors?
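The workaround I'm experimenting with (untested sketch below) is to split the global rect per display, capture each piece with its own filter, and stitch the results afterwards; it assumes sourceRect is interpreted in each display's local coordinate space:

import ScreenCaptureKit

// Untested sketch: capture the intersection of the requested rect with each
// display separately, converting to display-local coordinates; stitching the
// pieces into one CGImage is omitted here.
func captureSpanning(rect: CGRect) async throws -> [CGImage] {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    var pieces: [CGImage] = []
    for display in content.displays {
        let intersection = rect.intersection(display.frame)
        guard !intersection.isEmpty else { continue }
        let config = SCStreamConfiguration()
        // Convert from global coordinates to this display's local space.
        config.sourceRect = CGRect(x: intersection.origin.x - display.frame.origin.x,
                                   y: intersection.origin.y - display.frame.origin.y,
                                   width: intersection.width,
                                   height: intersection.height)
        config.width = Int(intersection.width)
        config.height = Int(intersection.height)
        let filter = SCContentFilter(display: display, excludingWindows: [])
        pieces.append(try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: config))
    }
    return pieces
}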
So I have been using the iOS 18 beta and I would like to say, first and foremost, it's a really great update. I love the timed messages and the car sickness stuff, that's awesome. But my only problem with the update is Screen Time. I have Screen Time on my phone, and when I updated to the iOS 18 beta I can't seem to claim more screen time. I have my Screen Time set so that I can ignore it once I run out, but in the new update I can press the button all I want and it won't give me more screen time. This is, I imagine, a minor fix, and I would really appreciate it if this was fixed soon. But other than that small detail, amazing update, love what you're doing over there at Apple!
It seems that there’s still no way to get all TIFF tags from a TIFF image, is that right? I've got these GeoTIFF images that have a handful of specialized TIFF tags in them. Calling CGImageSourceCopyPropertiesAtIndex(), I can see basic properties common to all TIFF images, like dimensions and color/pixel information, but no others.
Short of including libtiff, is there another way to get at the metadata? I've tried all of the options in CGImageSourceCopyAuxiliaryDataInfoAtIndex.
I've filed a few bug reports about this since 2020, all ignored.
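For reference, this is roughly how I'm reading the properties today ("image.tif" is a placeholder path); the GeoTIFF-specific tags never show up in the resulting dictionary:

import Foundation
import ImageIO

// Sketch: dump the TIFF properties that CGImageSource exposes.
let url = URL(fileURLWithPath: "image.tif") as CFURL
if let source = CGImageSourceCreateWithURL(url, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    print(props[kCGImagePropertyTIFFDictionary] ?? "no TIFF dictionary")
}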
I'm a tracking company and have my own tracking platform. I'm looking for a solution that uses tag devices for animals, similar to AirTags, but running on my platform.
Is there a way to allow my platform to interface with the Find My network to get the location data of my tags?
Good morning,
I'm trying to use MusicKit in order to get the last played songs and put them into a local DB, to be played later. Following the guide on developer.apple.com, I created the required App Services integration. Below is a minimal working version of what I'm doing:
func requestMusicAuthorization() async {
    let status = await MusicAuthorization.request()
    switch status {
    case .authorized:
        isAuthorizedForMusicKit = true
        error = nil
    case .restricted:
        error = NSError(domain: "Music access is restricted", code: -10)
    case .notDetermined:
        break
    case .denied:
        error = NSError(domain: "Music access is denied", code: -10)
    @unknown default:
        break
    }
}
On the SwiftUI ContentView there's something like this:
.onAppear {
    Task {
        await requestMusicAuthorization()
        if MusicManager.shared.isAuthorizedForMusicKit {
            do {
                let request = MusicRecentlyPlayedRequest<Song>()
                let response = try await request.response()
                let songs: [Song] = response.items.map { $0 }
                // do some CloudKit handling with songs...
                print("Recent songs: \(songs)")
            } catch {
                NSLog(error.localizedDescription)
            }
        }
    }
}
Everything seems to work fine, but my console log is full of garbage like this:
MSVEntitlementUtilities - Process MyMusicApp PID[33633] - Group: (null) - Entitlement: com.apple.accounts.appleaccount.fullaccess - Entitled: NO - Error: (null)
Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
Is there something I'm missing? Should I ignore this and go forward with my implementation? Any help is really appreciated.
I have recently downloaded iOS 18 and my phone is on dark mode, but some things are in white/light mode. Is there any way to fix this? I'll post some screenshots showing my problem.
How to add multiple resolutions to CMIO camera extension
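A sketch of the direction that seems plausible, based on the camera extension template (sizes and frame rates are placeholders): have the stream source build one CMIOExtensionStreamFormat per resolution and return them all from its formats property.

import CoreMedia
import CoreMediaIO
import CoreVideo

// Sketch: one CMIOExtensionStreamFormat per supported resolution; return the
// array from the CMIOExtensionStreamSource's `formats` property.
func makeFormats() -> [CMIOExtensionStreamFormat] {
    let sizes: [(Int32, Int32)] = [(1920, 1080), (1280, 720), (640, 480)]
    return sizes.compactMap { width, height in
        var description: CMFormatDescription?
        CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                       codecType: kCVPixelFormatType_32BGRA,
                                       width: width,
                                       height: height,
                                       extensions: nil,
                                       formatDescriptionOut: &description)
        guard let description else { return nil }
        return CMIOExtensionStreamFormat(formatDescription: description,
                                         maxFrameDuration: CMTime(value: 1, timescale: 30),
                                         minFrameDuration: CMTime(value: 1, timescale: 60),
                                         validFrameDurations: nil)
    }
}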
I have an IOSurface and I want to turn it into a CIImage. However, the initializer of CIImage takes an IOSurfaceRef instead of an IOSurface.
On most platforms, this is not an issue because the two types are toll-free bridgeable... except for Mac Catalyst, where this fails.
I observed the same back in Xcode 13 on macOS. But there I could force-cast the IOSurface to an IOSurfaceRef:
let image = CIImage(ioSurface: surface as! IOSurfaceRef)
This cast fails at runtime on Catalyst.
I found that unsafeBitCast(surface, to: IOSurfaceRef.self) actually works on Catalyst, but it feels very wrong.
Am I missing something? Why aren't the types bridgeable on Catalyst?
Also, there should ideally be an init for CIImage that takes an IOSurface instead of a ref.
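The workaround I've settled on for now looks like this (sketch; the unsafeBitCast relies on the two types being layout-compatible, which is exactly what the failed bridge makes me doubt):

import CoreImage
import IOSurface

extension CIImage {
    // Sketch: wrap the cast so call sites don't care about the platform.
    convenience init(surface: IOSurface) {
        #if targetEnvironment(macCatalyst)
        self.init(ioSurface: unsafeBitCast(surface, to: IOSurfaceRef.self))
        #else
        self.init(ioSurface: surface as! IOSurfaceRef)
        #endif
    }
}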
I am looking to do OOB (out-of-band) pairing using a QR code with a device from an iOS app. I referred to the documentation but could not find whether this is feasible or not. Some forum posts say it is not feasible, others say it is. May I know the latest status from the Apple developer support team?
Hello Apple,
I am concerned about the new iOS Screen Mirroring that is available on iOS.
I have an app that is only meant to be viewed on iPhones (not Macs or other computers), due to security reasons.
I am assuming that Screen Mirroring uses AirPlay underneath. Is there an API planned or coming that can disable this functionality, or is there a way for my app to opt out of iOS Screen Mirroring?
Thanks.
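The closest thing I've found so far (sketch below) is detecting when the screen is being captured or mirrored and blanking sensitive content; UIScreen.isCaptured covers recording and AirPlay mirroring generally, but it is not an opt-out of the new feature specifically.

import UIKit

// Sketch: hide sensitive content while the screen is captured or mirrored.
// `isCaptured` is true during recording, AirPlay, and screen mirroring.
final class CaptureGuard {
    var onCaptureChange: ((Bool) -> Void)?

    init() {
        NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.onCaptureChange?(UIScreen.main.isCaptured)
        }
    }
}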
Hi everyone, I'm currently developing an iOS app using React Native and recently got accepted into the Apple Music Global Affiliate Program. To fully utilize this opportunity, I need to implement the following functionalities:
Authorize Apple Music usage
Play Apple Music within my app
Identify if a user has an Apple Music subscription
Initiate and complete Apple Music subscription within my app
I've successfully implemented the first three functionalities using the react-native-apple-music module. Now, I need your help to understand how I can directly trigger the Apple Music subscription process from within my app.
Thank you for your help!
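For what it's worth, the direction I've been exploring for the fourth item is StoreKit's SKCloudServiceSetupViewController, exposed to React Native through a small native module (sketch; the affiliate token is a placeholder):

import StoreKit
import UIKit

// Sketch: present Apple's subscription offer sheet from the native side.
func showAppleMusicSubscription(from presenter: UIViewController) {
    let controller = SKCloudServiceSetupViewController()
    let options: [SKCloudServiceSetupOptionsKey: Any] = [
        .action: SKCloudServiceSetupAction.subscribe,
        .messageIdentifier: SKCloudServiceSetupMessageIdentifier.playMusic,
        .affiliateToken: "YOUR_AFFILIATE_TOKEN" // placeholder
    ]
    controller.load(options: options) { didSucceed, error in
        if didSucceed {
            presenter.present(controller, animated: true)
        } else if let error {
            print("Setup failed: \(error)")
        }
    }
}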
I'm working on a macOS application that captures audio and video. When the user selects a video capture source (most likely an Elgato box), I would like the application to automatically select the audio input from the same device. I was achieving this by pairing video and audio sources that had the same name, but this doesn't work when the user plugs in two capture devices of the same make and model.
With the command system_profiler SPUSBDataType I can list all the USB devices, and I can see that the two Elgato boxes have different serial numbers. If I could find this serial number, then I could figure out which AVCaptureDevices come from the same hardware.
Is there a way to get the manufacturer's serial number from the AVCaptureDevice object? Or a way to identify the USB device for an AVCaptureDevice, and from there I could get the serial or some other unique ID?
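For reference, listing USB serial numbers via IOKit works (sketch below); the part I'm missing is a supported way to tie one of them back to a given AVCaptureDevice:

import Foundation
import IOKit
import IOKit.usb

// Sketch: collect the serial numbers of attached USB devices.
// "USB Serial Number" is the standard IORegistry property key; newer
// systems may register devices under "IOUSBHostDevice" instead.
func usbSerialNumbers() -> [String] {
    var serials: [String] = []
    var iterator: io_iterator_t = 0
    let matching = IOServiceMatching("IOUSBDevice")
    guard IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == KERN_SUCCESS else {
        return serials
    }
    var device = IOIteratorNext(iterator)
    while device != 0 {
        if let value = IORegistryEntryCreateCFProperty(device, "USB Serial Number" as CFString,
                                                       kCFAllocatorDefault, 0)?.takeRetainedValue() as? String {
            serials.append(value)
        }
        IOObjectRelease(device)
        device = IOIteratorNext(iterator)
    }
    IOObjectRelease(iterator)
    return serials
}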
Hello,
We have a music app that reads MPMediaItems.
We get items using MPMediaQuery, but we realized that some downloaded tracks from Apple Music were fetched too. Not all downloaded tracks, only those that were played recently.
Of course, since these tracks are protected with DRM we can't play them in our player.
It's weird to get them in our query, because we added predicates specifically so as not to fetch protected assets and iCloud items:
MPMediaPropertyPredicate(value: false, forProperty: MPMediaItemPropertyHasProtectedAsset)
MPMediaPropertyPredicate(value: false, forProperty: MPMediaItemPropertyIsCloudItem)
To be sure, we added a second check on each item we fetch:
extension MPMediaItem {
    public func isValid() -> Bool {
        return self.assetURL != nil && !self.isCloudItem && !self.hasProtectedAsset
    }
}
But we still get these items, and their hasProtectedAsset attribute always returns false.
I don't know if it's a bug, but since we can't detect these items as Apple Music downloaded tracks, we can't do either of the following:
filter them out so they are not added to our application library
OR
switch to MPMusicPlayerController.applicationMusicPlayer to allow the user to play them
We have an app with a broadcast extension using an RPBroadcastSampleHandler. The implementation is working fine; however, for quite a few users the extension suddenly crashes during the broadcast.
The stack trace of the crashing thread always looks like the shortened sample below. (Full crash reports and stack traces are attached to the submitted Feedbacks.) Looking at the stack trace, none of our code is running; it's just ReplayKit code handling XPC messages at that moment:
Thread:
#0 0x00000001e2cf342c in __pthread_kill ()
#1 0x00000001f6a51c0c in pthread_kill ()
#2 0x00000001a1bfaba0 in abort ()
#3 0x00000001a9e38588 in malloc_vreport ()
#4 0x00000001a9e35430 in malloc_zone_error ()
[...]
#18 0x0000000218ac91bc in -[RPBroadcastSampleHandler processPayload:completion:] ()
#19 0x0000000198b81360 in __NSXPCCONNECTION_IS_CALLING_OUT_TO_EXPORTED_OBJECT_S2__ ()
Is anyone aware of these issues with ReplayKit? Are there known workarounds? Could anything we're doing be contributing to crashes like this?
Would greatly appreciate it if anyone from Apple DTS could look into this and flag the below Feedbacks to the relevant teams!
Feedback IDs: FB13949098, FB13949188
Hello, I'm currently working on a SharePlay feature that allows users to pull 3D models from iCloud and view them via volumes/immersive spaces on the Vision Pro. I was able to get sharing working with multiple windows recently, so now all that's left is to sync/share the model in the SharePlay session.
As I understand it, we should generally use GroupSessionMessenger for commands and light data like model positioning/syncing properties, whereas bigger pieces of data (images/videos/models) should be sent through GroupSessionJournal, which the group session manages and syncs for all users in the call.
I have a button to get the current user's model data and add it to the journal via
/// modelData is type `Data`
try await journal.add(modelData)
I have also set up a task to observe/receive updates to the journal's attachments when receiving a group session.
for await groupSession in MyModelActivity.sessions() {
    ...
    tasks.insert(Task {
        for await attachments in journal.attachments {
            for attachment in attachments {
                do {
                    let modelData = try await attachment.load(Data.self) // throws error here - `notSupported`
                    let modelUrl = writeModelDataToTempDirectory(modelData: modelData)
                    self.modelUrlToLoadForGroupSession = modelUrl
                } catch let error {
                    print("Error: \(error)")
                }
            }
        }
    })
}
I'm not quite sure why an error is thrown when attempting to load the attachment data on the other devices; any thoughts? The documentation for add(_:) and load(_:) says that the attachment should conform to Transferable, but Data already conforms to Transferable.
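One untested idea I've been considering is wrapping the payload in my own Transferable type instead of handing raw Data to the journal, in case the journal wants an explicit transfer representation (ModelAttachment is a hypothetical name):

import CoreTransferable
import Foundation
import UniformTypeIdentifiers

// Untested sketch: a custom Transferable wrapper around the model data.
struct ModelAttachment: Codable, Transferable {
    let modelData: Data

    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .data)
    }
}

The call sites would then become journal.add(ModelAttachment(modelData: modelData)) and attachment.load(ModelAttachment.self).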
I would like to enable users of my iOS app to place an object they want to scan, for example, on a turntable, instead of walking around it, in order to generate a 3D model of it afterward. How can I achieve this with the Object Capture API?
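For the reconstruction step, this is the direction I have in mind: a sketch assuming the turntable shots end up as a folder of sequential images (paths and detail level are placeholders); whether the static background confuses pose estimation is exactly what I'm unsure about.

import Foundation
import RealityKit

// Sketch: reconstruct a model from turntable shots stored as an image folder.
func reconstruct(imagesAt inputURL: URL, writingTo outputURL: URL) async throws {
    var configuration = PhotogrammetrySession.Configuration()
    configuration.sampleOrdering = .sequential // frames come from a steady rotation
    let session = try PhotogrammetrySession(input: inputURL, configuration: configuration)
    try session.process(requests: [.modelFile(url: outputURL, detail: .reduced)])
    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Model written to \(outputURL)")
        }
    }
}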
Hello there,
I am faced with the following situation:
We are building a web app that manages playlists for different platforms, including Apple music
We have the concept of teams in there, where a user can be part of multiple teams, and teams are managed by team admin
A team admin could manage multiple teams
The problem here is that a team admin can't sign in to the Apple Music account for multiple teams: if, on the same computer, we let the user sign in once and store the Music User Token, we can't do another login unless we unauthorize the previous one.
Is there anything we can do about this? Thanks
Hi there,
I've been wondering about getting a 'music user token' for manipulating users' playlists.
The situation I find myself in is that this can only be done on the front end, which means exposing the 'developer token' I need to generate. The 'developer token' is the key to our app; if someone takes it, they can do anything with it. Am I wrong?
Thanks for your time!
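For what it's worth, the mitigation we're considering is minting the developer token on our own server and handing only short-lived tokens to the front end, so the signing key never ships to the client. A minimal sketch using CryptoKit (team ID, key ID, and the .p8 contents are placeholders):

import Foundation
import CryptoKit

// Sketch: mint a MusicKit developer token (an ES256 JWT) server-side.
private func base64URL(_ data: Data) -> String {
    data.base64EncodedString()
        .replacingOccurrences(of: "+", with: "-")
        .replacingOccurrences(of: "/", with: "_")
        .replacingOccurrences(of: "=", with: "")
}

func makeDeveloperToken(teamID: String, keyID: String, p8PEM: String) throws -> String {
    let header = ["alg": "ES256", "kid": keyID]
    let now = Int(Date().timeIntervalSince1970)
    let payload: [String: Any] = ["iss": teamID, "iat": now, "exp": now + 60 * 60 * 24]
    let headerB64 = base64URL(try JSONSerialization.data(withJSONObject: header))
    let payloadB64 = base64URL(try JSONSerialization.data(withJSONObject: payload))
    let signingInput = Data("\(headerB64).\(payloadB64)".utf8)
    let key = try P256.Signing.PrivateKey(pemRepresentation: p8PEM)
    let signature = try key.signature(for: signingInput) // SHA-256 is applied internally
    return "\(headerB64).\(payloadB64).\(base64URL(signature.rawRepresentation))"
}

The front end then only ever sees a token that expires, never the key itself.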
Hello everyone,
I am thrilled about the iPhone Mirroring demo at WWDC24 and I have a few thoughts to share.
Will it work through a local network, or can the iPhone be accessed over a global network? Will there be an API to initiate iPhone Mirroring via an app? This would be a great feature for MDMs, allowing administrators to provide support for their users. Could you share more details from the development perspective?
Hi, I have tried an RPBroadcastSampleHandler broadcast extension with RPSystemBroadcastPickerView, but I don't understand how I can mirror my iOS screen to an Android smart TV.
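For context, the extension side I have so far looks roughly like this (sketch): the frames arrive in processSampleBuffer, and getting them to an Android TV would be up to my own encoding and network transport, since AirPlay itself isn't available for that.

import ReplayKit

// Sketch: frames arrive here once the broadcast starts; forwarding them to an
// Android TV requires your own encoding (e.g. H.264) and network transport.
class SampleHandler: RPBroadcastSampleHandler {
    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // encode and send the video frame to the TV
            break
        case .audioApp, .audioMic:
            break
        @unknown default:
            break
        }
    }
}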