Hi,
I'm using ScreenCaptureKit on macOS 14+ to record a single window. I've noticed that the Presenter Overlay only appears when capturing the entire screen, but it does not appear when recording a specific window or a region.
Is there a way to enable the Presenter Overlay while recording a single window or a defined region, similar to how it works with full-screen capture?
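For reference, this is roughly how I set up the single-window capture (a minimal sketch; window selection is simplified and stream-output handling is omitted):

import ScreenCaptureKit

func startWindowCapture() async throws {
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    guard let window = content.windows.first else { return }

    // Capturing a single window: Presenter Overlay does not appear here,
    // whereas a display-based filter shows it.
    let filter = SCContentFilter(desktopIndependentWindow: window)

    let configuration = SCStreamConfiguration()
    configuration.width = Int(window.frame.width)
    configuration.height = Int(window.frame.height)

    // addStreamOutput(...) and frame handling omitted for brevity.
    let stream = SCStream(filter: filter, configuration: configuration, delegate: nil)
    try await stream.startCapture()
}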
Any guidance or clarification would be greatly appreciated.
Thanks in advance!
Hello,
I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework.
However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows peaks that report roughly double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth-magnetic-field average. The peaks on iOS are also asymmetric.
I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated.
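For reference, this is roughly how I read the calibrated field on iOS (a minimal sketch; the update interval and reference frame are just what I happened to pick):

import CoreMotion

let motionManager = CMMotionManager()

func startMagneticFieldUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical, to: .main) { motion, _ in
        guard let calibrated = motion?.magneticField else { return }
        // calibrated.field is in microteslas; accuracy reports the calibration state.
        let b = calibrated.field
        let magnitude = (b.x * b.x + b.y * b.y + b.z * b.z).squareRoot()
        print("|B| = \(magnitude) µT, accuracy: \(calibrated.accuracy.rawValue)")
    }
}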
Thank you!
We've been using the WeatherKit API for a few years now. Everything has been pretty stable. We'll periodically get 404 errors, but they usually disappear within a couple days.
Starting March 5th we've again been getting 404 errors that slowly ramped up to March 20th and continued. We have had no code changes on our end, so something seems to have changed / broken on the server side of things.
Here are some example API calls that are now returning a 404 error:
https://weatherkit.apple.com/api/v1/weather/en/35.9981205/-78.8920444?dataSets=forecastDaily&dailyStart=2025-03-21T05:00:00Z&timezone=America/New_York&countryCode=US
https://weatherkit.apple.com/api/v1/weather/en/41.4789363/-81.7404134?dataSets=forecastDaily&dailyStart=2025-03-21T04:56:00Z&timezone=America/New_York&countryCode=US
Does anyone have any insights or information on this?
Also if Apple is listening, an error more meaningful than 404 would be much much appreciated.
Hi there,
I am using WeatherKit to display weather forecast information in an app.
I would like to include some information about when the weather forecast was issued for my users to see.
This information is included in the response Metadata as documented in the WeatherKit REST API docs:
https://developer.apple.com/documentation/weatherkitrestapi/metadata
Specifically there is a “reportedTime” property which I would like to use here.
However, since I am consuming WeatherKit via the Swift API, I don’t see this property available there.
How can I access the reportedTime property via the WeatherKit Swift APIs? Or is it not exposed via the Swift APIs?
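For reference, this is roughly what I'm doing today (a minimal sketch; the location is hypothetical). As far as I can tell, the Swift metadata exposes date and expirationDate, but nothing that looks like the REST API's reportedTime:

import CoreLocation
import WeatherKit

func fetchDailyForecast() async throws {
    let location = CLLocation(latitude: 51.5, longitude: -0.1)
    let daily = try await WeatherService.shared.weather(for: location, including: .daily)
    print("Fetched at:", daily.metadata.date)
    print("Expires at:", daily.metadata.expirationDate)
}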
Hi,
I'd like to develop an app which runs speech recognition even after going into the background. I know I can accomplish this using the audio background mode and then processing the audio, but I am not sure whether this workaround would be accepted into the App Store because of the processing limitations while in the background.
How can I accomplish this while still being compliant with Apple's privacy policy and other restrictions?
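For reference, here's the kind of setup I have in mind (a minimal sketch; the "audio" background mode is assumed to be enabled in the target's capabilities, and permission handling is omitted):

import AVFoundation
import Speech

final class BackgroundTranscriber {
    private let engine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        // Keep the audio session active so capture continues in the background.
        try AVAudioSession.sharedInstance().setCategory(.record, mode: .measurement)
        try AVAudioSession.sharedInstance().setActive(true)

        let request = SFSpeechAudioBufferRecognitionRequest()
        self.request = request

        // Feed microphone buffers into the recognition request.
        let input = engine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024, format: input.outputFormat(forBus: 0)) { buffer, _ in
            request.append(buffer)
        }
        try engine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result { print(result.bestTranscription.formattedString) }
        }
    }
}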
Thanks,
Marek
My CoreSpotlight extension seems to exceed the 6 MB memory limit. What’s the best way to debug this?
I've tried to attach the debugger on the Simulator but the extension seems to be never launched when I trigger the reindex from Developer settings. Is this supposed to work?
On device, I am able to attach the debugger. However, I can neither transfer the debug session to Instruments, nor display the memory graph. So I've no idea how the memory is used.
Any recommendations on how to move forward? Is there a way to temporarily disable the memory limit, since even with LLDB attached the extension is killed?
I recently used OpenCore Legacy Patcher to update my old 2012 MacBook Pro so it could run a new piece of DJ software. The update went smoothly, but now the DJ software won't open; it just gives me a crash report. I'm totally stumped.
The crash report:
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 4 Illegal instruction: 4
Terminating Process: exc handler [3839]
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 Engine DJ 0x10c9e3d81 0x10c28f000 + 7687553
1 dyld 0x7ff807632729 invocation function for block in dyld4::Loader::findAndRunAllInitializers(dyld4::RuntimeState&) const + 241
2 dyld 0x7ff80766b34e invocation function for block in dyld3::MachOAnalyzer::forEachInitializer(Diagnostics&, dyld3::MachOAnalyzer::VMAddrConverter const&, void (unsigned int) block_pointer, void const*) const + 133
3 dyld 0x7ff80765fb73 invocation function for block in dyld3::MachOFile::forEachSection(void (dyld3::MachOFile::SectionInfo const&, bool, bool&) block_pointer) const + 543
4 dyld 0x7ff80761a07b dyld3::MachOFile::forEachLoadCommand(Diagnostics&, void (load_command const*, bool&) block_pointer) const + 249
5 dyld 0x7ff80765ebe8 dyld3::MachOFile::forEachSection(void (dyld3::MachOFile::SectionInfo const&, bool, bool&) block_pointer) const + 176
6 dyld 0x7ff807661266 dyld3::MachOFile::forEachInitializerPointerSection(Diagnostics&, void (unsigned int, unsigned int, bool&) block_pointer) const + 116
7 dyld 0x7ff80766b084 dyld3::MachOAnalyzer::forEachInitializer(Diagnostics&, dyld3::MachOAnalyzer::VMAddrConverter const&, void (unsigned int) block_pointer, void const*) const + 390
8 dyld 0x7ff8076325c2 dyld4::Loader::findAndRunAllInitializers(dyld4::RuntimeState&) const + 150
9 dyld 0x7ff807638af7 dyld4::JustInTimeLoader::runInitializers(dyld4::RuntimeState&) const + 21
10 dyld 0x7ff807632928 dyld4::Loader::runInitializersBottomUp(dyld4::RuntimeState&, dyld3::Array<dyld4::Loader const*>&, dyld3::Array<dyld4::Loader const*>&) const + 276
11 dyld 0x7ff807636141 dyld4::Loader::runInitializersBottomUpPlusUpwardLinks(dyld4::RuntimeState&) const::$_0::operator()() const + 147
12 dyld 0x7ff8076329bc dyld4::Loader::runInitializersBottomUpPlusUpwardLinks(dyld4::RuntimeState&) const + 90
13 dyld 0x7ff80764e1f7 dyld4::APIs::runAllInitializersForMain() + 277
14 dyld 0x7ff80761f52e dyld4::prepare(dyld4::APIs&, dyld3::MachOAnalyzer const*) + 3433
15 dyld 0x7ff80761e792 dyld4::start(dyld4::KernelArgs*, void*, void*)::$_0::operator()() const + 572
16 dyld 0x7ff80761e27f start + 1727
Thread 1:
0 libsystem_pthread.dylib 0x7ff8079a8bcc start_wqthread + 0
Thread 2:
0 libsystem_pthread.dylib 0x7ff8079a8bcc start_wqthread + 0
Thread 0 crashed with X86 Thread State (64-bit):
rax: 0x00006000012acc80 rbx: 0x00006000032b3c90 rcx: 0x00006000012acd00 rdx: 0x000000011008e000
rdi: 0x0000000000000000 rsi: 0x00006000012ac000 rbp: 0x00007ff7b3c5a9f0 rsp: 0x00007ff7b3c5a9c0
r8: 0x0000000000000002 r9: 0x000000000000001b r10: 0x00000000001ff800 r11: 0x0000000000000080
r12: 0x000000010eeae278 r13: 0x000000010c28f6e8 r14: 0x00007ff84a461050 r15: 0x00007ff84a4614a0
rip: 0x000000010c9e3d81 rfl: 0x0000000000010207 cr2: 0x0000000000000000
Logical CPU: 2
Error Code: 0x00000000
Trap Number: 6
On iOS Bundle.main.preferredLocalizations returns the list of languages the application bundle supports in user-preferred order with the first element being the language the application is running in.
Additionally, Locale.preferredLanguages returns the list of languages in the order they are presented in Preferences.app > General > Language & Region > Preferred Languages, with the first element being the user's "primary language" (i.e. the language the system is running in).
However, this only seems to hold as long as the user has not chosen a per-app language that differs from the primary language; if they have, Locale.preferredLanguages.first is equal to Bundle.main.preferredLocalizations.first, regardless of the latter's position in the Preferred Languages list.
Furthermore, this seems to change depending on the value of the "AppleLanguages" key in the User Defaults' global domain (cf. https://stackoverflow.com/a/42648166).
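For reference, this is the kind of check I'm using to observe the behaviour (a minimal sketch):

import Foundation

// Order of the user's preferred languages; the first element is normally the
// system language, but with a per-app language set it appears to become that
// per-app language instead.
print(Locale.preferredLanguages)

// Localizations this bundle supports, in user-preferred order; the first
// element is the language the app is actually running in.
print(Bundle.main.preferredLocalizations)

// The global-domain value that seems to drive both of the above.
print(UserDefaults.standard.array(forKey: "AppleLanguages") ?? [])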
Is this behaviour documented anywhere?
Addendum: I know that according to https://forums.developer.apple.com/forums/thread/718512?answerId=733680022#733680022
AppleLanguages is an implementation detail, not something that’s considered API.
Locale.preferredLanguages is API, though.
Hey everyone,
I have an app using the Screen Time API, and I've had quite a few reports from users saying that our monitoring features stop working until they open our app. What happens is that the activities and schedules set with the device activity monitor seem to disappear. We check for this when the app is re-opened and schedule them again, which is why the monitoring starts working again.
Of course, our current solution is not optimal since our app is mainly passive. Has anyone experienced this kind of issue?
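Roughly, the check we run on launch looks like this (a minimal sketch; the activity name and schedule are hypothetical):

import DeviceActivity

let center = DeviceActivityCenter()
let dailyActivity = DeviceActivityName("daily")

func reregisterMonitoringIfNeeded() {
    // The scheduled activity sometimes seems to have vanished, so we
    // re-register it whenever the app comes to the foreground.
    guard !center.activities.contains(dailyActivity) else { return }

    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 0, minute: 0),
        intervalEnd: DateComponents(hour: 23, minute: 59),
        repeats: true
    )
    do {
        try center.startMonitoring(dailyActivity, during: schedule)
    } catch {
        print("Failed to restart monitoring: \(error)")
    }
}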
Hi everyone,
I’m developing a screen-time and focus app that uses Apple’s Family Controls framework to block distracting apps and reward users through a gamified experience.
Apple approved Family Controls (Distribution) for the main app identifier,
but they did not approve the required extension target (which implements DeviceActivityMonitor, required for the framework to function).
Because of this, I can’t archive or upload the app to TestFlight — and I’ve been stuck in limbo for over 2 months.
This is severely delaying my launch. The app works perfectly, but I can’t release it because the extension wasn’t included in the entitlement approval.
Has anyone else run into this with Family Controls?
Is there any known way to escalate or fast-track approval for additional app IDs or targets?
Would really appreciate any help, advice, or hacks. 🙏
Thanks,
Enzer
I've discovered a bug in the Phone app on iOS related to how long verdicts are displayed.
When a call is identified by a third-party Caller ID app, long verdicts display correctly during the call (they auto-scroll) and in the call log (with an ellipsis at the end). However, on the call details screen, the text is strangely truncated - showing only the beginning of the string and the last word.
For testing, I used this verdict: "Musclemen grow on trees. They can tense their muscles and look good in a mirror. So what? I'm interested in practical strength that's going to help me run, jump, twist, punch."
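For context, the verdict is supplied from our Call Directory extension roughly like this (a minimal sketch; the phone number is hypothetical):

import CallKit

final class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // The long verdict string above is passed as the identification label.
        let number: CXCallDirectoryPhoneNumber = 12_345_678_900
        let verdict = "Musclemen grow on trees. They can tense their muscles and look good in a mirror. So what? ..."
        context.addIdentificationEntry(withNextSequentialPhoneNumber: number, label: verdict)
        context.completeRequest()
    }
}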
I'll attach screenshots demonstrating the problem:
I try to update the remoteHandle using CXCallUpdate for an outgoing call, but this works only on iOS 15, not on iOS 17 or 18 (I haven't tested 16). The problem affects only outgoing calls; for incoming calls the update works fine.
func startOutgoingCall(with callID: UUID, userID: String) {
    // Start the call with a generic handle identifying the peer.
    let handle = CXHandle(type: .generic, value: userID)
    let action = CXStartCallAction(call: callID, handle: handle)
    callController.requestTransaction(with: action) { [weak self] error in
        // ...
    }
}

func updateOutgoingCall(with callID: UUID, groupID: String) {
    // Later, when the p2p call becomes a group call, try to replace the handle.
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: groupID)
    provider.reportCall(with: callID, updated: update)
}
I also tried the phoneNumber type, but it seems the initial handle that I set on CXStartCallAction cannot be changed (neither its value nor its type).
I use this handle value to implement recall by tapping a call in the Recents tab of the system address book. But since my calls can transform from a p2p call into a group call, I need to update the handle value or find some other way to pass call-identification info.
When using Mobile Device Management (MDM), it is possible to restrict access to contacts on managed devices.
Please see https://support.apple.com/en-us/101932
According to this article, contacts are considered "managed" if they come from either a managed app or a managed account.
Does this principle also apply when an MDM-managed app provides contacts via the Contact Provider Extension (CPE)? Specifically, would such "managed" contacts remain inaccessible to unmanaged apps? Our goal is to ensure that unmanaged apps cannot access these contacts.
Thanks!
I want to start monitoring again from the function below in my DeviceActivityMonitorExtension. My startMonitoring function looks like this.
override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name, activity: DeviceActivityName) {
    super.eventDidReachThreshold(event, activity: activity)
    startMonitoring()
}

public func startMonitoring() {
    let startTime = DateComponents(hour: 0, minute: 0, second: 0)
    let endTime = DateComponents(hour: 23, minute: 59, second: 59)

    let schedule = DeviceActivitySchedule(
        intervalStart: startTime,
        intervalEnd: endTime,
        repeats: true
        // warningTime: DateComponents(minute: 1)
    )

    let selection: FamilyActivitySelection = savedSelection() ?? FamilyActivitySelection()
    let center = DeviceActivityCenter()

    let selections = self.savedSelection() ?? FamilyActivitySelection()
    let applications = selections.applicationTokens
    let categories = selections.categoryTokens
    let webCategories = selections.webDomainTokens

    let store = ManagedSettingsStore()
    store.shield.applicationCategories = ShieldSettings.ActivityCategoryPolicy.specific(categories, except: Set())
    store.shield.applications = applications
    store.shield.webDomains = webCategories

    let scheduleHard = DeviceActivitySchedule(
        intervalStart: startTime,
        intervalEnd: endTime,
        repeats: true
        // warningTime: DateComponents(minute: 1)
    )

    let event = DeviceActivityEvent(
        applications: selection.applicationTokens,
        categories: selection.categoryTokens,
        webDomains: selection.webDomainTokens,
        threshold: DateComponents(minute: 0) // time limit to use the app, e.g. 15 minutes
    )

    do {
        try center.startMonitoring(
            .weekend,
            during: scheduleHard,
            events: [.weekend: event]
        )
        print("ScreenTime Monitoring Started")
    } catch let error {
        print(error.localizedDescription)
    }
}
Please provide a solution for starting monitoring from the DeviceActivityMonitorExtension's eventDidReachThreshold function, or suggest another way to achieve this.
Hello!
It's my first time posting in this forum. Apple Intelligence is enabled by default in Workspace ONE (WS1). I was wondering whether Apple Intelligence can access or process corporate data within WS1-managed corporate apps, which are containerized.
Thank you!
We are developing a parental control app that needs to block a large number of apps. The child's device has more than 200 apps installed that the parent wants to block and disable, but the parent can't block more than 50 apps. Is there any option to block all 200 apps on the child's device?
Hello, I am currently optimizing the performance of my application. I would like to obtain information about the app being killed after the user moves it to the background, in order to evaluate whether the application is running normally in the background. I noticed that there is Background Termination information in Xcode > Organizer that records background exits. I would like to know how this information is collected, and what Apple considers a healthy value for this metric.
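For what it's worth, the background-exit counts in Organizer appear to be aggregated from the same data MetricKit delivers on-device, so you can also inspect them per user yourself (a minimal sketch):

import MetricKit

final class ExitMetricsSubscriber: NSObject, MXMetricManagerSubscriber {
    func start() {
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXMetricPayload]) {
        for payload in payloads {
            guard let exits = payload.applicationExitMetrics?.backgroundExitData else { continue }
            // Cumulative counts of background exits, broken down by reason.
            print("Normal background exits:", exits.cumulativeNormalAppExitCount)
            print("Memory-pressure exits:", exits.cumulativeMemoryPressureExitCount)
            print("Watchdog exits:", exits.cumulativeAppWatchdogExitCount)
            print("Memory-limit exits:", exits.cumulativeMemoryResourceLimitExitCount)
        }
    }
}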
Speech Framework
I've been checking for SFSpeechRecognitionMetadata to determine the end of a sentence when using voice recognition.
Yet it doesn't detect small pauses, only large ones, so I've transcribed basically an entire paragraph before moving on to the next one.
Besides implementing my own timer, are there any other ways to detect more natural pauses marking the end of a sentence, similar to the browser's Web Speech recognition? Since that works in Safari, I assume there should be some equivalent feature in macOS.
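For reference, this is roughly how I'm checking for the metadata today (a minimal sketch; audio setup is omitted, and the recognizer and request are assumed to already exist):

import Speech

func observe(recognizer: SFSpeechRecognizer,
             request: SFSpeechAudioBufferRecognitionRequest) -> SFSpeechRecognitionTask {
    return recognizer.recognitionTask(with: request) { result, _ in
        guard let result else { return }
        // speechRecognitionMetadata only becomes non-nil once a segment is
        // considered finished, which in practice seems to require a long pause.
        if let metadata = result.speechRecognitionMetadata {
            print("Segment finished after \(metadata.speechDuration)s:",
                  result.bestTranscription.formattedString)
        }
    }
}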
I'd like to write a macOS app that makes use of the ASAM functionality described here: https://developer.apple.com/documentation/devicemanagement/autonomoussingleappmode
I have tried to use the example with Safari: I enrolled a Mac with MDM and installed the profile, but when I open Safari it does not enter Single App Mode. I've only tried it with Safari so far, but eventually I want to be able to use my own app.
Is there an API that has to be used to enter Single App Mode programmatically? I've found the Assessment API, but as I do not have the required entitlements to use it, I'm looking for another solution. The documentation on ASAM does not mention the Assessment API at all; is it the only way to enter "a" single app mode on macOS? How is the Assessment API linked to ASAM? As far as I understand, there's the com.apple.developer.automatic-assessment-configuration entitlement, but apps that have it do not need to be configured via MDM? I'm really confused as to what's actually required to get into Single App Mode on macOS. The app I'm trying to write isn't really related to an assessment task, but I am doing this for an academic institution, so maybe requesting the entitlement would be feasible.
The documentation on ASAM also mentions that the app is granted access to the "Accessibility" API, and I've found UIAccessibility/requestGuidedAccessSession, but this does not seem to be available on macOS proper?
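For completeness, this is the Assessment API entry point I found (a minimal sketch; it requires the com.apple.developer.automatic-assessment-configuration entitlement, which is exactly what I don't have yet):

import AutomaticAssessmentConfiguration

final class AssessmentController: NSObject, AEAssessmentSessionDelegate {
    private var session: AEAssessmentSession?

    func begin() {
        let configuration = AEAssessmentConfiguration()
        let session = AEAssessmentSession(configuration: configuration)
        session.delegate = self
        self.session = session
        session.begin()
    }

    func assessmentSessionDidBegin(_ session: AEAssessmentSession) {
        print("Assessment session active")
    }

    func assessmentSession(_ session: AEAssessmentSession, failedToBeginWithError error: Error) {
        print("Failed to begin assessment session: \(error)")
    }
}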
Any help on this would be greatly appreciated.
Is it possible to integrate a button in an app that displays advertisements to support charities, without offering any direct reward to the user?