Hi folks:
I've been creating .reality files in Reality Composer for over a year. Some of the files are up to 500 MB, and until about a month ago they opened fine as AR projected experiences on even basic iPhones and iPads. Now, I think since iOS 18, a 64 MB file will open as an AR experience, but files from roughly 350 MB up don't open. Files just opens a window displaying the name of the file, that it's a .reality file, and the file size, but it no longer opens into either an AR or Object display of the .reality scene. Has a new limit been put on the size of .reality files that Files will open, or what else is going on here? I have a client who was about to launch an experience based on a .reality file I can no longer open. Please help.
Hi,
I just generated an HDR10 MV-HEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
Then I uploaded the segment files and prog_index.m3u8 to a web server.
I find that Safari cannot play the HLS stream.
The URL is http://ip/vod/prog_index.m3u8.
I have also checked that if I remove the Transfer characteristics : PQ tag when generating the MV-HEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
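One thing I plan to check (an assumption on my part, not a confirmed fix): serving a multivariant (master) playlist that declares VIDEO-RANGE=PQ for the stream, rather than pointing Safari at prog_index.m3u8 directly. A minimal sketch, with placeholder BANDWIDTH/CODECS/RESOLUTION values:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.B0",RESOLUTION=3840x2160,VIDEO-RANGE=PQ
prog_index.m3u8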
Since updating to iOS 18.1, WhatsApp no longer allows me to reply from my car and will only read one message at a time…
Hi all,
Apple dropping ongoing development for FireWire devices that were supported by the Core Audio driver standard is a catastrophe for a lot of struggling musicians, who need to keep up to date on the security updates that come with new OS releases while continuing to use their hard-earned investments in very expensive, still-pristine audio devices that have been reduced to e-waste by Apple's seemingly tone-deaf disregard of the cries for ongoing support.
I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel MacBook Pro up to date with the latest security updates and OS features.
This is probably not the first time you gurus have had someone make this logical leap, but I was wondering whether it might somehow be possible to shoehorn the code from previous versions of macOS, which let the Mac speak to the audio features of such devices, into the Ventura version of the OS.
Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality.
There must be hundreds of thousands of people who would happily spare some cash to keep their multi-thousand-dollar investment in gear from being so thoughtlessly resigned to the scrap heap.
Any comments or layman-friendly explanations as to why this couldn’t happen would be gratefully received!
Thanks,
em
I've been using CGWindowListCreateImage, which automatically creates an image with the size of the captured window.
But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could set the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale, but that won't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false.
Is there a way to take full-resolution screenshots of the correct size, and what's the best way to do it?
import Cocoa
import ScreenCaptureKit

class ViewController: NSViewController {
    @IBOutlet weak var imageView: NSImageView!

    override func viewDidAppear() {
        imageView.imageScaling = .scaleProportionallyUpOrDown
        view.wantsLayer = true
        view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1)

        Task {
            // Pick the first shareable window and build a single-window filter.
            let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows
            let window = windows[0]
            let filter = SCContentFilter(desktopIndependentWindow: window)

            let configuration = SCStreamConfiguration()
            configuration.ignoreShadowsSingleWindow = false
            configuration.showsCursor = false
            // Convert points to pixels; this does not account for the extra
            // margin the shadow adds around the window.
            configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale)
            configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale)
            print(filter.contentRect)

            let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration)
            imageView.image = NSImage(cgImage: windowImage, size: CGSize(width: windowImage.width, height: windowImage.height))
        }
    }
}
Hello,
I noticed that SFSpeechRecognizer is broken on iOS 18. During a recognition task, it keeps dropping the recognized text on every pause. For example, if you say "how are you fine", it will drop the "how are you" part and only give you "fine" as the result.
Say "how are you <pause> fine"
// iOS 17 ✅ (perfect final result)
How
How are
How are you
How are you.
How are you. Fine.
// iOS 18 ❌
How
How are
How are you
How are you
Fine
(The text before the pause is dropped, and it fails to recognize the punctuation.)
Reproducing the issue:
Download the official sample project.
Run it on an iOS 18 device or simulator.
Say "how are you fine"
Only "fine" will be displayed.
Hello,
I am trying to get video from an HDMI USB capture card and show it in a preview layer at 60 fps. The device I am using (ShadowCast 2) supports 1080p at 60 fps in "yuvs" and "420v".
This is my code to build the preview layer, with the uninteresting parts stripped away and error handling removed.
I am using AVFrameRateRange because the capture device does not report exactly 60.00, but rather <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)> fps.
@Observable
final class AVFoundationService: AVService {
    // Live view
    private let session: AVCaptureSession = .init()

    var previewLayer: AVCaptureVideoPreviewLayer {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspect
        return layer
    }

    var activeVideoDevice: AVCaptureDevice? {
        // TODO: implement correct logic
        if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) {
            return device
        }
        return AVCaptureDevice.default(for: .video)
    }

    func setupStreamDemo(completion: @escaping (Error?) -> Void) {
        session.beginConfiguration()
        if let device = activeVideoDevice {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("explode")
                }
                for format in device.formats {
                    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
                    if dimensions.width == 1920 && dimensions.height == 1080 && format.formatDescription.mediaSubType.description == "'yuvs'" {
                        // Find the 60 fps range (the original compared minFrameRate twice).
                        let foundFPS = format.videoSupportedFrameRateRanges.first {
                            Int($0.minFrameRate) == 60 && Int($0.maxFrameRate) == 60
                        }
                        try device.lockForConfiguration()
                        device.activeFormat = format
                        device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration
                        device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration
                        device.unlockForConfiguration()
                    }
                }
            } catch {
                return completion(error)
            }
        }
        session.commitConfiguration()
        session.startRunning()
        completion(nil)
    }
}
I am using the following code in SwiftUI to show the AVCaptureVideoPreviewLayer.
struct VideoPreviewView: NSViewRepresentable {
    private let previewLayer: AVCaptureVideoPreviewLayer

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        view.layer = self.previewLayer
        view.layer?.frame = view.bounds
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {
        if let layer = nsView.layer as? AVCaptureVideoPreviewLayer {
            layer.session = self.previewLayer.session
        }
    }
}
When I now run my app, it ignores whatever I set on device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set it to 10 fps, it runs at 30; if I set 60, it runs at 30.
If I start QuickTime in parallel with my app and start a recording from my USB capture card, it switches to the 60 fps mode.
I am on macOS Sequoia 15.0 with Xcode 16.0.
What am I doing wrong?
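For reference, a small diagnostic I can run after startRunning() (inside the service above) to read back what the device actually kept:
// Diagnostic: read back the device state after startRunning() to see whether
// the requested format and 60 fps frame duration actually stuck.
if let device = activeVideoDevice {
    print("activeFormat:", device.activeFormat)
    print("min frame duration:", device.activeVideoMinFrameDuration) // expect 1000000/60000240
    print("max frame duration:", device.activeVideoMaxFrameDuration)
}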
I am facing an issue with the voicemail feature. When someone calls me, it goes to voicemail only if I tap the voicemail icon. If I do not respond at all to the incoming call, the caller is not given an option to record a voicemail. I have tried switching voicemail off and on; it's working on other devices (iPhone 14 and 15) for my family members, just not for me. How can I resolve this?
I'm attempting to use AVExternalStorageDevice.requestAccess on iOS 18 using Xcode 16.
When calling requestAccess, a dialog does appear, but the completionHandler closure is never called to indicate whether access was granted. If using the async version, the function just never returns.
Calling requestAccess also results in a mediaServicesWereReset (-11819) error without fail.
Supposedly, "the system only presents the dialog to a person the first time your app calls the method." That also doesn't appear to be the case. The dialog appears every time requestAccess is called, regardless of previous invocations and whether "Allow" or "Don't Allow" was selected.
The dialog itself says "You can change this in Privacy settings." I cannot find this permission anywhere in the Settings app, neither under Privacy & Security nor under the app-specific settings page.
Has anyone else experienced these issues? Am I missing something here? I did suspect permissions issues and tried adding a NSRemovableVolumesUsageDescription entry to the app. This did not appear to change anything.
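For reference, a minimal sketch of the async variant I'm calling (which, as described, never returns):
import AVFoundation

// Minimal sketch of the problematic call. On iOS 18 the dialog appears,
// but this function reportedly never resumes.
func requestExternalStorageAccess() async {
    let granted = await AVExternalStorageDevice.requestAccess()
    print("External storage access granted:", granted)
}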
How can I obtain the invoice order ID for a user's purchase order? For example, "MT2345678".
I have SYNC 3.4 on my Ford F-150, and CarPlay does not work with iOS 18.1; SYNC says I don't have anything connected. Something is up.
I have a macOS app that uses screen capture logic. It was originally coded using the Quartz CG API:
if let cgimage = CGDisplayCreateImage(CGMainDisplayID(), rect: cgRect) {...}
and this worked as expected, even when capturing a screen rect that began on the secondary monitor and ended on the primary monitor (or was entirely contained on the secondary monitor).
However, that API is now deprecated and you're supposed to use ScreenCaptureKit instead, so I have attempted to convert the code. The trial code is:
let scConfig = SCStreamConfiguration()
scConfig.sourceRect = drect
scConfig.width = Int(drect.width)
scConfig.height = Int(drect.height)
SCScreenshotManager.captureImage(contentFilter: sFilter, configuration: scConfig) { image, error in
    if let cgim = image {
        print("image dims \(cgim.width), \(cgim.height), requested: \(drect)")
        self.writeToFile2(cgim)
    } else {
        print("SCREEN CAP failed")
    }
}
...
where sFilter was previously set based on the main display (with no exclusions). This code also "works" as long as the capture rect is entirely on the primary monitor, but it fails if the rect spans both monitors or is fully contained on the secondary monitor (by "fails" I mean it produces an empty image).
So my question is: how do I use ScreenCaptureKit to obtain a screenshot of a rectangle that spans dual monitors?
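One workaround I've been considering (an untested sketch, not a confirmed approach): since SCContentFilter is per-display, capture the intersection of the target rect with each display separately and composite the pieces afterwards. The coordinate conversion below assumes SCDisplay.frame and the target rect share the same global coordinate space:
import ScreenCaptureKit

// Untested sketch: capture a globally-specified rect piecewise, one image per
// display it intersects. The caller would stitch the pieces back together.
func captureSpanningRect(_ globalRect: CGRect) async throws -> [CGImage] {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    var pieces: [CGImage] = []
    for display in content.displays {
        let overlap = globalRect.intersection(display.frame)
        guard !overlap.isNull, !overlap.isEmpty else { continue }
        let filter = SCContentFilter(display: display, excludingWindows: [])
        let config = SCStreamConfiguration()
        // sourceRect is in the display's own coordinate space.
        config.sourceRect = CGRect(x: overlap.origin.x - display.frame.origin.x,
                                   y: overlap.origin.y - display.frame.origin.y,
                                   width: overlap.width,
                                   height: overlap.height)
        config.width = Int(overlap.width)
        config.height = Int(overlap.height)
        pieces.append(try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: config))
    }
    return pieces
}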
I'm experiencing an issue with QuickLook in iOS 18 that affects .reality files with audio playback. When I open a .reality file that includes audio, the audio track plays twice: once from the moment the file is opened, and again from the start of the animation, resulting in duplicate audio playback.
I've tested this issue on multiple devices running iOS 16, 17, and 18, and the problem only occurs on iOS 18. I've tried restarting the devices and checking for any software updates, but the issue persists.
Steps to reproduce:
Open a .reality file with audio playback in QuickLook on an iOS 18 device.
Observe the audio playback.
Expected result:
The audio track should play only once, from the start of the animation.
Actual result:
The audio track plays twice, once from the moment the file is opened and again from the start of the animation.
Device and iOS version:
I've tested this issue on an iPhone 12 Pro and an iPhone 13 Pro running iOS 18, an iPhone 13 running iOS 16, and an iPhone 11 Pro running iOS 17.
It's simple to reproduce. The bug: when you queue a bunch of songs to play, it always queues fewer than what you gave it.
Here, I'm attempting to play an Apple-curated playlist; it will only queue a subset, usually fewer than 15, but as low as 1 out of 100. Use the system's forward and backward controls to test it.
Here is the code. Just paste it into the ContentView file and make sure you have the capability to run it.
import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    var songs: [Song]?
    if let t = tracks {
        songs = [Song]()
        for track in t {
            if case let .song(song) = track {
                songs?.append(song)
                print("track is song \(track.debugDescription)")
            } else {
                print("track not song \(track.debugDescription)")
            }
        }
    }
    if let songs = songs {
        let topSongs = MusicItemCollection(songs)
        return topSongs
    }
    return nil
}

func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }
    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()
        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }
        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])
        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }
        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        let q = ApplicationMusicPlayer.Queue(for: t)
        musicPlayer.queue = q
        try await musicPlayer.play()
        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
I was wondering if anyone could assist with the following query.
Apple's Private Relay functionality requires companies to register all email-sending subdomains for the service to function properly. With 26 markets and 3 subdomains per market for one department, and another department with around 20 markets and even more subdomains, we exceed the limit of 100 sending domains.
As a result, we're unable to register all the domains currently being used to send emails to our customers.
Does anyone have any recommendations for overcoming this?
Hello, we are embedding a PHPickerViewController with UIKit (adding the view controller as a child, embedding its view, and calling didMoveToParent) in our app, using the compact mode. We are disabling the following capabilities: .collectionNavigation, .selectionActions, and .search.
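For reference, a minimal sketch of that embedding (iOS 17+ APIs; names are illustrative rather than our exact code):
import PhotosUI
import UIKit

// Minimal sketch of the embedding described above.
func embedPicker(in parent: UIViewController) {
    var config = PHPickerConfiguration(photoLibrary: .shared())
    config.mode = .compact
    config.disabledCapabilities = [.collectionNavigation, .selectionActions, .search]
    let picker = PHPickerViewController(configuration: config)
    parent.addChild(picker)
    parent.view.addSubview(picker.view)
    picker.view.frame = parent.view.bounds
    picker.didMove(toParent: parent)
}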
One of our users, on iOS 17.2.1 with an iPhone 12, encountered a crash with the following stack trace:
Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x9fbc __pthread_kill + 8
1 libsystem_pthread.dylib 0x5680 pthread_kill + 268
2 libsystem_c.dylib 0x75b90 abort + 180
3 PhotoFoundation 0x33b0 -[PFAssertionPolicyCrashReport notifyAssertion:] + 66
4 PhotoFoundation 0x3198 -[PFAssertionPolicyComposite notifyAssertion:] + 160
5 PhotoFoundation 0x374c -[PFAssertionPolicyUnique notifyAssertion:] + 176
6 PhotoFoundation 0x2924 -[PFAssertionHandler handleFailureInFunction:file:lineNumber:description:arguments:] + 140
7 PhotoFoundation 0x3da4 _PFAssertFailHandler + 148
8 PhotosUI 0x22050 -[PHPickerViewController _handleRemoteViewControllerConnection:extension:extensionRequestIdentifier:error:completionHandler:] + 1356
9 PhotosUI 0x22b74 __66-[PHPickerViewController _setupExtension:error:completionHandler:]_block_invoke_3 + 52
10 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
11 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
12 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
13 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
14 CoreFoundation 0x3701c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
15 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
16 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
17 GraphicsServices 0x34f8 GSEventRunModal + 164
18 UIKitCore 0x22c62c -[UIApplication _run] + 888
19 UIKitCore 0x22bc68 UIApplicationMain + 340
20 WorkAngel 0x8060 main + 20 (main.m:20)
21 ??? 0x1bd62adcc (Missing)
Please share if you have any ideas as to what might have caused this, or what to look at in such a case. Unfortunately, I haven't been able to reproduce it myself.
Hi,
I’ve encountered a bug related to including tracks as a relationship in the playlists list. The issue arises when there is more than one playlist.
Specifically:
Single Playlist: The functionality works as expected.
Multiple Playlists: The application crashes.
Please let me know if you need additional information or if there are any updates on this issue.
Thank you!
curl --request GET \
--url 'https://api.music.apple.com/v1/me/library/playlists?include=tracks' \
--header 'Authorization: Bearer {token}' \
--header 'Music-User-Token: {token}'
Hello Apple,
I am yet again concerned about the new iOS Screen Mirroring that is going to be available in the iOS 18 stable release.
I have an app that is only meant to be viewed on iPhones (not Macs or other computers, due to various security reasons).
I have raised a Feedback Assistant report on this, and Apple has ignored it.
Other apps that might benefit from an API for disabling or detecting iOS Screen Mirroring include Snapchat, DAZN, Sony, Netflix, Amazon Prime, etc.
Are there still plans for an API that can disable this functionality, now or in the future? I am sure a company like Snapchat doesn't want people capturing photos via iOS Screen Mirroring without the app knowing.
Thanks.
Hi, I have tried the RPBroadcastSampleHandler broadcast extension and RPSystemBroadcastPickerView, but I don't understand how I can mirror my iOS screen to an Android smart TV.
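For context, here is a minimal sketch of how I'm presenting the system broadcast picker bound to a broadcast upload extension (the extension bundle ID is a hypothetical placeholder); my understanding is that any mirroring to another device would then have to be implemented inside the extension, e.g. by streaming the sample buffers over the network:
import ReplayKit
import UIKit

// Sketch: show the system broadcast picker bound to a broadcast upload
// extension. "com.example.app.BroadcastExtension" is a placeholder.
func addBroadcastPicker(to view: UIView) {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    picker.preferredExtension = "com.example.app.BroadcastExtension"
    picker.showsMicrophoneButton = false
    view.addSubview(picker)
}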
MPNowPlayingInfoPropertyInternationalStandardRecordingCode is not working on iOS 18 beta 6 with Xcode 16.0 beta.
To reproduce, start from the "Becoming a now playable app" demo with:
1. the Info.plist key MusicHapticsSupported set to YES;
2. a song with a correct ISRC:
nowPlayingInfo[MPNowPlayingInfoPropertyInternationalStandardRecordingCode] = metadata.isrc
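For reference, a minimal sketch of where that assignment sits (assuming isrc holds the track's 12-character ISRC string from the app's own model):
import MediaPlayer

// Sketch: publish the ISRC alongside the rest of the now-playing info.
func publishISRC(_ isrc: String) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyInternationalStandardRecordingCode] = isrc
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}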