Since updating to iOS 18.1 my WhatsApp no longer allows reply in my car and will only read one message at a time…
General
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
I've been using CGWindowListCreateImage, which automatically creates an image with the size of the captured window.
But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could set the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale, but that won't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false.
Is there a way to take full-resolution screenshots at the correct size, and what's the best way to do it?
import Cocoa
import ScreenCaptureKit

class ViewController: NSViewController {

    @IBOutlet weak var imageView: NSImageView!

    override func viewDidAppear() {
        imageView.imageScaling = .scaleProportionallyUpOrDown
        view.wantsLayer = true
        view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1)

        Task {
            let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows
            let window = windows[0]
            let filter = SCContentFilter(desktopIndependentWindow: window)
            let configuration = SCStreamConfiguration()
            configuration.ignoreShadowsSingleWindow = false
            configuration.showsCursor = false
            configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale)
            configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale)
            print(filter.contentRect)
            let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration)
            imageView.image = NSImage(cgImage: windowImage, size: CGSize(width: windowImage.width, height: windowImage.height))
        }
    }
}
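For comparison, here is a minimal sketch of the legacy CGWindowListCreateImage call I'm migrating away from (assuming the target window's CGWindowID is already known); it sizes the output automatically and includes the shadow unless .boundsIgnoreFraming is passed:
import CoreGraphics

// Sketch only: `windowID` is assumed to be a valid CGWindowID for the target window.
// CGWindowListCreateImage is deprecated in favor of ScreenCaptureKit.
func legacyScreenshot(of windowID: CGWindowID) -> CGImage? {
    // CGRect.null captures the window's own bounds (including its shadow),
    // and .bestResolution returns a full Retina-resolution image.
    CGWindowListCreateImage(.null,
                            .optionIncludingWindow,
                            windowID,
                            [.bestResolution])
}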
Hello,
I'm trying to get the video from an HDMI USB capture card and show it in a preview layer at 60 fps. The device I am using (ShadowCast 2) supports 1080p at 60 fps in "yuvs" and "420v".
This is my code for building the preview layer, with the uninteresting parts and the error handling stripped away.
I am using the AVFrameRateRange because the capture device doesn't directly report 60.00 but <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)> fps.
@Observable
final class AVFoundationService: AVService {

    // Live View
    private let session: AVCaptureSession = .init()

    var previewLayer: AVCaptureVideoPreviewLayer {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspect
        return layer
    }

    var activeVideoDevice: AVCaptureDevice? {
        // TODO: implement correct logic
        if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) {
            return device
        }
        return AVCaptureDevice.default(for: .video)
    }

    func setupStreamDemo(completion: @escaping (Error?) -> Void) {
        session.beginConfiguration()
        if let device = activeVideoDevice {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("explode")
                }
                for format in device.formats {
                    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
                    if dimensions.width == 1920 && dimensions.height == 1080 && format.formatDescription.mediaSubType.description == "'yuvs'" {
                        let foundFPS = format.videoSupportedFrameRateRanges.first {
                            Int($0.minFrameRate) == 60 && Int($0.maxFrameRate) == 60
                        }
                        try device.lockForConfiguration()
                        device.activeFormat = format
                        device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration
                        device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration
                        device.unlockForConfiguration()
                    }
                }
            } catch {
                return completion(error)
            }
        }
        session.commitConfiguration()
        session.startRunning()
        completion(nil)
    }
}
I am using the following code in SwiftUI to show the AVCaptureVideoPreviewLayer.
struct VideoPreviewView: NSViewRepresentable {
    private let previewLayer: AVCaptureVideoPreviewLayer

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        view.layer = self.previewLayer
        view.layer?.frame = view.bounds
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {
        if let layer = nsView.layer as? AVCaptureVideoPreviewLayer {
            layer.session = self.previewLayer.session
        }
    }
}
When I run my app, it ignores whatever I set on device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set it to 10 fps it runs at 30; if I set 60 it still runs at 30.
If I launch QuickTime alongside my app and start a recording from the USB capture card, it switches to 60 fps mode.
I am on macOS Sequoia 15.0 with Xcode 16.0.
What am I doing wrong?
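To make the "it runs at 30" observation measurable, here is a minimal sketch (a helper of my own, not part of the service above) that attaches an AVCaptureVideoDataOutput to the session and logs the interval between delivered frames:
import AVFoundation

// Sketch only: logs the interval between delivered frames so the effective
// frame rate of the session can be checked independently of the preview layer.
final class FrameRateProbe: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "frame-rate-probe")
    private var lastTimestamp: CMTime?

    func attach(to session: AVCaptureSession) {
        if session.canAddOutput(output) {
            session.addOutput(output)
            output.setSampleBufferDelegate(self, queue: queue)
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if let last = lastTimestamp {
            let delta = CMTimeSubtract(timestamp, last).seconds
            print(String(format: "frame interval: %.4f s (~%.1f fps)", delta, 1.0 / delta))
        }
        lastTimestamp = timestamp
    }
}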
I am facing an issue with the voicemail feature. When someone calls me, it goes to voicemail only if I tap the voicemail icon. If I do not respond to the incoming call at all, the caller is not given the option to record a voicemail. I have tried switching voicemail off and on; it works on other devices (iPhone 14 and 15) for my family members, just not for me. How can I resolve this?
I'm attempting to use AVExternalStorageDevice.requestAccess on iOS 18 using Xcode 16.
When calling requestAccess, a dialog does appear, but the completionHandler closure is never called to indicate whether access was granted. If using the async version, the function just never returns.
Calling requestAccess also results in a mediaServicesWereReset (-11819) error without fail.
Supposedly, "the system only presents the dialog to a person the first time your app calls the method." That also doesn't appear to be the case. The dialog appears every time requestAccess is called, regardless of previous invocations and whether "Allow" or "Don't Allow" was selected.
The dialog itself says "You can change this in Privacy settings." I cannot find this permission anywhere in the Settings app, neither under Privacy & Security nor under the app-specific settings page.
Has anyone else experienced these issues? Am I missing something here? I did suspect a permissions issue and tried adding an NSRemovableVolumesUsageDescription entry to the app. This did not appear to change anything.
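For reference, this is roughly how I call it (a minimal sketch; the wrapper function names are mine, and I'm assuming the async spelling mirrors the completion-handler one):
import AVFoundation

// Sketch of the call pattern described above (iOS 18 / Xcode 16).
func requestExternalStorageAccess() {
    // Completion-handler variant: on iOS 18 the closure reportedly never fires.
    AVExternalStorageDevice.requestAccess { granted in
        print("external storage access granted: \(granted)")
    }
}

// Async variant: reportedly never returns on iOS 18.
func requestExternalStorageAccessAsync() async {
    let granted = await AVExternalStorageDevice.requestAccess()
    print("external storage access granted: \(granted)")
}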
How can I obtain the invoice order ID for the user's purchase order? For example, "MT2345678".
Hello,
I noticed that SFSpeechRecognizer is broken on iOS 18. During a recognition task, it keeps dropping the recognized text on every pause. For example, if you say "how are you fine", it will drop the "how are you" part and only give you "fine" as the result.
Say "how are you <pause> fine"
// iOS 17 ✅ (perfect final result)
How
How are
How are you
How are you.
How are you. Fine.
// iOS 18 ❌
How
How are
How are you
How are you
Fine
(The text before the pause is dropped, and punctuation is no longer recognized.)
Reproducing the issue:
Download the official sample project.
Run it on an iOS 18 device or simulator.
Say "how are you fine"
Only "fine" will be displayed.
I'm experiencing an issue with QuickLook in iOS 18 where .reality files with audio playback are affected. When I open a .reality file that includes audio, the audio track plays twice: once from the moment the file is opened, and again from the start of the animation. This results in duplicate audio playback.
I've tested this issue on multiple devices running iOS 16, 17, and 18, and the problem only occurs on iOS 18. I've tried restarting the devices and checking for any software updates, but the issue persists.
Steps to reproduce:
Open a .reality file with audio playback in QuickLook on an iOS 18 device.
Observe the audio playback.
Expected result:
The audio track should play only once, from the start of the animation.
Actual result:
The audio track plays twice, once from the moment the file is opened and again from the start of the animation.
Device and iOS version:
I've tested this issue on an iPhone 12 Pro and iPhone 13 Pro running iOS 18, an iPhone 13 running iOS 16, and an iPhone 11 Pro running iOS 17.
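For reproduction outside the Files app, this is a minimal sketch of how the file can be presented with QLPreviewController ("Scene.reality" is a placeholder for the affected file):
import UIKit
import QuickLook

// Sketch only: presents a bundled .reality file in QuickLook.
final class RealityPreviewController: UIViewController, QLPreviewControllerDataSource {

    func showPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        Bundle.main.url(forResource: "Scene", withExtension: "reality")! as NSURL
    }
}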
I have Sync 3.4 on my Ford F-150. CarPlay does not work with iOS 18.1; Sync says I don't have anything connected. Something is up.
It's simple to reproduce. The bug is that when you queue a bunch of songs to play, it always queues fewer than you gave it.
Here, I'm attempting to play an Apple-curated playlist, but it only queues a subset: usually fewer than 15, and as low as 1 out of 100. Use the system's forward and backward controls to test it.
Here is the code; just paste it into the ContentView file and make sure you have the capability to run it.
import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    var songs: [Song]?
    if let t = tracks {
        songs = [Song]()
        for track in t {
            if case let .song(song) = track {
                songs?.append(song)
                print("track is song \(track.debugDescription)")
            } else {
                print("track not song \(track.debugDescription)")
            }
        }
    }
    if let songs = songs {
        let topSongs = MusicItemCollection(songs)
        return topSongs
    }
    return nil
}

func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }
    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()
        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }
        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])
        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }
        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        let q = ApplicationMusicPlayer.Queue(for: t)
        musicPlayer.queue = q
        try await musicPlayer.play()
        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
I was wondering if anyone could assist with the following query.
Apple's Private Relay functionality requires companies to register all email-sending subdomains for the service to function properly. With 26 markets and 3 subdomains per market for one department, and another department with around 20 markets and even more subdomains, the limit of 100 sending domains is exceeded.
As a result, we’re unable to register all the domains currently being used to send emails to our customers.
Does anyone have any recommendations for overcoming this?
Hi,
I’ve encountered a bug related to including tracks as a relationship in the playlists list. The issue arises when there is more than one playlist.
Specifically:
Single Playlist: The functionality works as expected.
Multiple Playlists: The application crashes.
Please let me know if you need additional information or if there are any updates on this issue.
Thank you!
curl --request GET \
--url 'https://api.music.apple.com/v1/me/library/playlists?include=tracks' \
--header 'Authorization: Bearer {token}' \
--header 'Music-User-Token: {token}'
Hello Apple,
I am yet again concerned about the new iOS Screen Mirroring that is going to be available in the iOS 18 stable release.
I have an app that is only meant to be viewed on iPhones (not Macs or other computers), for various security reasons.
I have raised a Feedback Assistant report on this and Apple has ignored it.
Other apps that might benefit from a disabling/detecting API for iOS Screen Mirroring may be Snapchat, DAZN, Sony, Netflix, Amazon Prime, etc.
Are there still plans for an API that can disable this functionality, now or in the future? I am sure a company like Snapchat doesn't want people screenshotting photos via iOS Screen Mirroring without the app knowing.
Thanks.
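For context, the closest existing hook I know of is the generic capture flag on UIScreen, sketched below; it does not distinguish the new iOS Screen Mirroring from AirPlay or screen recording, which is exactly why a dedicated API would still be needed:
import UIKit

// Sketch only: observes the generic "is being captured" state.
// Whether this covers the new iOS 18 Screen Mirroring is the open question.
final class CaptureObserver {
    private var token: NSObjectProtocol?

    func start() {
        token = NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("screen captured: \(UIScreen.main.isCaptured)")
        }
    }
}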
MPNowPlayingInfoPropertyInternationalStandardRecordingCode is not working in iOS 18 beta 6 + Xcode 16.0 beta.
Reproduce:
“Becoming a now playable app” demo with:
1. Info.plist MusicHapticsSupported set to YES;
2. A song with the correct ISRC:
nowPlayingInfo[MPNowPlayingInfoPropertyInternationalStandardRecordingCode] = metadata.isrc
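For completeness, this is roughly how the now-playing info is published (a minimal sketch; the title and ISRC values are placeholders, not the track I'm testing with):
import MediaPlayer

// Sketch only: publishes now-playing metadata including the ISRC.
func publishNowPlayingInfo() {
    var nowPlayingInfo: [String: Any] = [
        MPMediaItemPropertyTitle: "Demo Track",
        MPNowPlayingInfoPropertyInternationalStandardRecordingCode: "USUM71703861" // placeholder ISRC
    ]
    nowPlayingInfo[MPNowPlayingInfoPropertyPlaybackRate] = 1.0
    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}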
Hello, I noticed that CADisplayLink seems to report incorrect targetTimestamp and timestamp values in the iOS 18 simulator. If you compute the actual duration of a frame, it always comes out negative.
This only occurs in the iOS 18 simulator.
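A minimal sketch of how the per-frame duration can be computed (the class name is mine); this is the value that comes out negative on the iOS 18 simulator:
import UIKit
import QuartzCore

// Sketch only: logs the frame duration derived from CADisplayLink timestamps.
final class DisplayLinkProbe {
    private var link: CADisplayLink?

    func start() {
        link = CADisplayLink(target: self, selector: #selector(step(_:)))
        link?.add(to: .main, forMode: .common)
    }

    @objc private func step(_ link: CADisplayLink) {
        // Expected to be positive (~16.7 ms at 60 Hz); reportedly negative on the iOS 18 simulator.
        let duration = link.targetTimestamp - link.timestamp
        print("frame duration: \(duration)")
    }
}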
Can we use an enterprise app to communicate with the eSIM of an iPhone through the LPA module or an OMA channel? How do we connect with the LPA or OMA, and is it a paid capability?
Having a focused sample Xcode project which demonstrates the issue you are facing is critical to Developer Technical Support engineers being able to assist you. Based on your answers to the questions above, you’ll need to collect additional information before submitting a code-level support request. If you’re unable to provide a sample Xcode project, or are unsure how to proceed, please ask your question in the Apple Developer Forums.
I've uninstalled and reinstalled VLC media player multiple times. However, I receive the following crash report when opening the app after each installation:
Translated Report (Full Report Below)
Process: VLC [71778]
Path: /Applications/VLC2.app/Contents/MacOS/VLC
Identifier: org.videolan.vlc
Version: 3.0.21 (3.0.21)
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2024-08-14 12:27:24.3563 -0400
OS Version: macOS 14.6.1 (23G93)
Report Version: 12
Anonymous UUID: 9DDE1CE7-A635-1165-0FE9-04EA599A542F
Sleep/Wake UUID: E22A843E-7A51-414F-BA7F-AB35B1674915
Time Awake Since Boot: 300000 seconds
Time Since Wake: 267959 seconds
System Integrity Protection: enabled
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000102af4ae8
Exception Codes: 0x0000000000000001, 0x0000000102af4ae8
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [71778]
VM Region Info: 0x102af4ae8 is not in any region. Bytes after previous region: 2793 Bytes before following region: 783640
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
__LINKEDIT 102ae8000-102af4000 [ 48K] r--/rwx SM=COW /Applications/VLC2.app/Contents/Frameworks/Breakpad.framework/Versions/A/Resources/breakpadUtilities.dylib
---> GAP OF 0xc0000 BYTES
__TEXT 102bb4000-102c78000 [ 784K] r-x/rwx SM=COW /Applications/VLC2.app/Contents/MacOS/lib/libvlccore.9.dylib
Application Specific Information:
*** multi-threaded process forked ***
crashed on child side of fork pre-exec
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libsystem_trace.dylib 0x193d8b0c0 _os_log_preferences_refresh + 68
1 libsystem_trace.dylib 0x193d8bb20 os_log_type_enabled + 712
2 CoreFoundation 0x1940da800 _CFBundleCopyPreferredLanguagesInList + 516
3 CoreFoundation 0x1940e75a4 _CFBundleCopyLanguageSearchListInBundle + 124
4 CoreFoundation 0x1940e738c _copyQueryTable + 64
5 CoreFoundation 0x1940e6d5c _copyResourceURLsFromBundle + 376
6 CoreFoundation 0x1940e6118 _CFBundleCopyFindResources + 1400
7 CoreFoundation 0x1940e5b90 CFBundleCopyResourceURL + 56
8 CoreAudio 0x1966c3b58 HALSystem::InitializeShell() + 1412
9 CoreAudio 0x1966c3274 HALSystem::CheckOutInstance() + 192
10 CoreAudio 0x19693360c AudioObjectSetPropertyData_mac_imp + 116
11 libauhal_plugin.dylib 0x10290915c 0x102904000 + 20828
12 VLC 0x1025df4dc 0x1025d8000 + 29916
13 dyld 0x193cb3154 start + 2476
Thread 0 crashed with ARM Thread State (64-bit):
x0: 0x00000001fbe8cfec x1: 0x0000000193da0985 x2: 0x0000000001000104 x3: 0x0000000000000000
x4: 0x0000000193da0937 x5: 0x000000016d826500 x6: 0x0000000000000074 x7: 0x0000000000000000
x8: 0x0000000102af4ae6 x9: 0x00000001fbe97610 x10: 0x0000000000000001 x11: 0x0000000143909730
x12: 0x0000000000000001 x13: 0x000000016d8266f0 x14: 0xaaaaaaaaaaaaaaaa x15: 0x0000000193da01db
x16: 0x0000000193ffd7d4 x17: 0x000000020658e3e0 x18: 0x0000000000000000 x19: 0x0000000143909700
x20: 0x0000000143909700 x21: 0x0000000102af4aea x22: 0x0000000102af4aea x23: 0x0000000143d069f0
x24: 0x0000000143d075a0 x25: 0x0000000000000016 x26: 0x0000000000000000 x27: 0x0000000143d07c60
x28: 0x0000000143d06af0 fp: 0x000000016d826a30 lr: 0x0000000193d8b0a4
sp: 0x000000016d8269e0 pc: 0x0000000193d8b0c0 cpsr: 0x20001000
far: 0x0000000102af4ae8 esr: 0x92000007 (Data Abort) byte read Translation fault
Hello,
I am trying to make use of SCContentSharingPicker for my app and I wonder how I can detect when the SCContentSharingPicker is closed.
I could open the picker screen with the following simple code:
SCContentSharingPicker.shared.isActive = true
SCContentSharingPicker.shared.add(self)
SCContentSharingPicker.shared.present()
And I closed it with the "Cancel" button at the top right corner.
Initially I was expecting to get an event through an observer like the one below, but realised that it's only called when a stream is canceled.
extension ContentPickerButton: SCContentSharingPickerObserver {
    func contentSharingPicker(_ picker: SCContentSharingPicker, didCancelFor stream: SCStream?) {
        logger.info("Picker canceled for stream \(stream)")
    }
}
I would like to get a picker close event so that I can deactivate the picker. (Otherwise, the camera icon stays in the menu bar.)
How do we get a close event?
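For completeness, these are the other observer callbacks I have implemented alongside it (a sketch; ContentPickerButton and logger come from the code above):
// The remaining SCContentSharingPickerObserver requirements (sketch).
extension ContentPickerButton {
    // Fires when the user confirms a selection in the picker.
    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didUpdateWith filter: SCContentFilter,
                              for stream: SCStream?) {
        logger.info("Picker updated with a new filter")
    }

    // Fires if the picker fails to start.
    func contentSharingPickerStartDidFailWithError(_ error: Error) {
        logger.error("Picker failed to start: \(error.localizedDescription)")
    }
}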
The scenario is quite simple:
1. Run an application which uses [SCShareableContent getShareableContentExcludingDesktopWindows] and invokes captureImageWithFilter in the completionHandler.
2. Delay invoking captureImageWithFilter for several seconds and switch the user session before calling it.
WindowServer crashes if the app runs in the inactive session, as in the sketch below.
How can this be managed correctly? Is there any way to avoid this crash?
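A minimal Swift sketch of the sequence described above (the 10-second delay and the display-based filter are stand-ins for my actual code):
import ScreenCaptureKit

// Sketch only: fetch shareable content, wait several seconds (switch the user
// session during this window), then take the screenshot from the now-inactive session.
func delayedScreenshot() async throws {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    guard let display = content.displays.first else { return }

    // Switch to another user session while this sleep is running.
    try await Task.sleep(for: .seconds(10))

    let filter = SCContentFilter(display: display, excludingWindows: [])
    let configuration = SCStreamConfiguration()
    // WindowServer reportedly crashes here when the app's session is inactive.
    _ = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration)
}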