Hi all,
So, in my app Transloader, when the app is terminated, I sync the Mac's "turned off" status to iCloud.
This works on Intel Macs and on Apple silicon Macs when the user manually quits the app.
However, on Apple silicon Macs, when the user shuts down or restarts the Mac, the app is terminated right away (it doesn't even receive the -applicationWillTerminate: call), so I'm unable to properly sync the status. As far as I know, this still works on Intel Macs.
So, is there any new API to extend the Mac's shutdown a little longer to finish my sync?
Currently, I implement -applicationShouldTerminate: to return NSTerminateLater and, after syncing, call -replyToApplicationShouldTerminate: with YES.
Again, this works fine on Intel Macs both when the user manually quits and when the Mac shuts down. On Apple silicon, it only works when the user manually quits - a shutdown terminates the app without it even receiving an -applicationWillTerminate: call.
Both NSSupportsSuddenTermination and NSSupportsAutomaticTermination are disabled in the app's Info.plist.
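For reference, the delay-termination pattern described above looks roughly like this (syncShutdownStatusWithCompletion: is a hypothetical stand-in for my iCloud sync; the delegate methods are the standard NSApplicationDelegate ones):

```objc
// AppDelegate.m — sketch of the NSTerminateLater pattern described above.
- (NSApplicationTerminateReply)applicationShouldTerminate:(NSApplication *)sender {
    // Kick off the async iCloud sync (hypothetical method), then allow
    // termination to proceed once it has finished.
    [self syncShutdownStatusWithCompletion:^{
        [sender replyToApplicationShouldTerminate:YES];
    }];
    return NSTerminateLater;
}
```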
Thank you for any insights,
Matt
I'm appending CVPixelBufferRefs to an AVAssetWriterInputPixelBufferAdaptor which is connected to an AVAssetWriterInput to write to a file using AVAssetWriter.
So, I'm calling
[pixelAdaptor appendPixelBuffer:pxbuffer withPresentationTime:someCMTime];
and it works as it should.
Now, I'm doing this sequentially: at kCMTimeZero I append the first frame, at 0.5s another frame, at 1s another, and so on.
I'm wondering: do I have to append sequentially, or can I append in arbitrary order? For the sake of argument, the other way around - append a buffer at 1s first, then at 0.5s, and then at 0s?
The reason I ask: I've tried it, and it crashes, but I'm not sure whether the crash is because I appended at non-sequential times, or whether there's another reason (such as asynchronous appending).
I'm only appending when the input tells me to (readyForMoreMediaData), on a serial dispatch queue.
The question is: Should non-sequential appending work, or is it by design that it doesn't?
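For context, my appending loop follows the standard pull model, roughly like this (nextPixelBuffer and frameIndex are hypothetical stand-ins for my frame source; the 2 fps timescale yields the 0.5s steps mentioned above):

```objc
// Sketch of the sequential, readyForMoreMediaData-driven appending described above.
dispatch_queue_t queue = dispatch_queue_create("com.example.writerQueue",
                                               DISPATCH_QUEUE_SERIAL);
[writerInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
    while (writerInput.isReadyForMoreMediaData) {
        CVPixelBufferRef buffer = [self nextPixelBuffer]; // hypothetical frame source
        if (buffer == NULL) {
            [writerInput markAsFinished];
            break;
        }
        // Presentation times increase monotonically: 0s, 0.5s, 1s, …
        CMTime time = CMTimeMake(self.frameIndex++, 2); // timescale 2 ⇒ 0.5s steps
        [pixelAdaptor appendPixelBuffer:buffer withPresentationTime:time];
        CVPixelBufferRelease(buffer);
    }
}];
```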
Thank you,
Matthias
Hi,
when I try to push to my git repository using Xcode 12 beta, I get the following error:
An unknown error occurred
username does not match previous requests (-1)
It works correctly using Xcode 11, using the same copy / files on disk.
Is this a known issue?
Thank you,
Matthias
The LinkPresentation framework, specifically LPMetadataProvider, makes my app crash after calling -startFetchingMetadataForURL:completionHandler: and before the completionHandler is invoked, which suggests it's a bug within the framework, not my app. The crash is in ***: WebKit something, on macOS. On iOS, the framework performs nicely.
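A minimal reproduction of the call that triggers the crash looks like this (the URL is just an example):

```objc
// Sketch: fetch link metadata; on macOS the crash occurs inside the
// framework before the completion handler below ever runs.
LPMetadataProvider *provider = [[LPMetadataProvider alloc] init];
NSURL *url = [NSURL URLWithString:@"https://www.apple.com"];
[provider startFetchingMetadataForURL:url
                    completionHandler:^(LPLinkMetadata *metadata, NSError *error) {
    NSLog(@"metadata: %@, error: %@", metadata, error);
}];
```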
Hello, everyone. My name is Matt, I'm the developer of Eternal Storms Software. I recently released a freeware app, SiriMote, outside of the Mac App Store.
The reason I could not release it on the Mac App Store is that it uses CGEventPost to simulate keypresses (for example, when the play/pause button is pressed on the Siri Remote, the Play/Pause media key (on the F8 key) of the Mac's keyboard is pressed), and CGEventPost is ignored inside the sandbox. So I was wondering - is there a way to do this inside the sandbox?
I guess an alternative would be using the Scripting Bridge, but then I would have to specifically communicate with particular apps. The nice thing about CGEventPost is that any app that responds to the media keys can be used with SiriMote. I'd love a more open approach, like CGEventPost.
Any hints appreciated!
Thank you kindly,
Matt
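For reference, simulating a media-key press this way is commonly done by constructing a system-defined NSEvent and posting its CGEvent with CGEventPost - roughly the following sketch (NX_KEYTYPE_PLAY and the subtype constant come from IOKit's <IOKit/hidsystem/ev_keymap.h>; the helper name is my own):

```objc
// Sketch: simulate a press of the Play/Pause media key via CGEventPost.
// keyType is e.g. NX_KEYTYPE_PLAY from <IOKit/hidsystem/ev_keymap.h>.
static void PostMediaKey(int keyType, BOOL keyDown) {
    NSInteger data1 = (keyType << 16) | ((keyDown ? 0xA : 0xB) << 8);
    NSEvent *event = [NSEvent otherEventWithType:NSEventTypeSystemDefined
                                        location:NSZeroPoint
                                   modifierFlags:(keyDown ? 0xA00 : 0xB00)
                                       timestamp:0
                                    windowNumber:0
                                         context:nil
                                         subtype:8 // NX_SUBTYPE_AUX_CONTROL_BUTTONS
                                           data1:data1
                                           data2:-1];
    CGEventPost(kCGHIDEventTap, [event CGEvent]);
}

// Usage: a full press is a key-down followed by a key-up.
// PostMediaKey(NX_KEYTYPE_PLAY, YES);
// PostMediaKey(NX_KEYTYPE_PLAY, NO);
```

As noted above, posting to the HID event tap like this is exactly what the sandbox blocks, which is the heart of my question.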