Hey there, my application allows users to have video calls with each other using Agora. I have successfully set up incoming-call functionality on Android, but on iOS I am struggling to get the call UI to appear when the app is not running, in the background, or the device is locked.
To my knowledge this is because iOS imposes much stricter restrictions on waking the app for this. When I initially set it up it worked while the app was in the background, but I think I was failing to report the call to CallKit in time, and now it's not working.
I'm not sure whether I need access to this entitlement:
com.apple.developer.pushkit.unrestricted-voip
which I believe is only granted in special cases, or whether, as long as I report the call to CallKit quickly enough, I won't hit this issue and it will consistently work in the background.
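Here is roughly how I'm wiring it up at the moment (a simplified sketch of what I believe the correct pattern is, not my real Agora integration; the callerName payload key and the provider configuration are placeholders):
import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    // One CXProvider kept alive for the app's lifetime; this configuration is illustrative.
    private let provider: CXProvider = {
        let config = CXProviderConfiguration()
        config.supportsVideo = true
        config.supportedHandleTypes = [.generic]
        return CXProvider(configuration: config)
    }()

    func pushRegistry(_ registry: PKPushRegistry, didUpdate pushCredentials: PKPushCredentials, for type: PKPushType) {
        // Send pushCredentials.token to the signaling server.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        guard type == .voIP else { completion(); return }

        // Build the call update from the push payload ("callerName" is a placeholder key).
        let update = CXCallUpdate()
        let callerName = payload.dictionaryPayload["callerName"] as? String ?? "Unknown"
        update.remoteHandle = CXHandle(type: .generic, value: callerName)
        update.hasVideo = true

        // Report the call before this method returns; deferring this is what gets the app
        // terminated on iOS 13+ and eventually stops VoIP push delivery altogether.
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error = error { print("Failed to report call: \(error)") }
            completion()
        }
    }
}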
Ref: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.developer.usernotifications.filtering?language=objc
Currently, it seems impossible to enable this entitlement for local development and testing without first going through Apple’s approval process.
I would like to be able to test how it works on a side project without having to submit the request form, which seems designed for real production apps.
Is there any trick I can use?
I have code that looks like this:
import UIKit
import AVFAudio
import LiveCommunicationKit

@available(iOS 17.4, *)
class LiveCallKit: NSObject, ConversationManagerDelegate {

    func conversationManager(_ manager: ConversationManager, conversationChanged conversation: Conversation) {
    }

    func conversationManagerDidBegin(_ manager: ConversationManager) {
    }

    func conversationManagerDidReset(_ manager: ConversationManager) {
    }

    func conversationManager(_ manager: ConversationManager, perform action: ConversationAction) {
    }

    func conversationManager(_ manager: ConversationManager, timedOutPerforming action: ConversationAction) {
    }

    func conversationManager(_ manager: ConversationManager, didActivate audioSession: AVAudioSession) {
    }

    func conversationManager(_ manager: ConversationManager, didDeactivate audioSession: AVAudioSession) {
    }

    @objc public enum InterfaceKind: Int, Sendable, Codable, Hashable {
        /// Decline / hang up.
        case reject
        /// Answer.
        case answer
    }

    var session: ConversationManager
    var callId: UUID
    var completionHandler: ((_ actionType: InterfaceKind, _ payload: [AnyHashable: Any]) -> Void)?
    var payload: [AnyHashable: Any]?

    @objc init(icon: UIImage!) {
        let data: Data = icon.pngData()!
        let cfg = ConversationManager.Configuration(ringtoneName: "ring.mp3",
                                                    iconTemplateImageData: data,
                                                    maximumConversationGroups: 1,
                                                    maximumConversationsPerConversationGroup: 1,
                                                    includesConversationInRecents: false,
                                                    supportsVideo: false,
                                                    supportedHandleTypes: Set([Handle.Kind.generic]))
        self.session = ConversationManager(configuration: cfg)
        self.callId = UUID()
        super.init()
        self.session.delegate = self
    }

    @objc func toIncoming(_ payload: [AnyHashable: Any],
                          displayName: String,
                          actBlock: @escaping (_ actionType: InterfaceKind, _ payload: [AnyHashable: Any]) -> Void) async {
        self.completionHandler = actBlock
        do {
            self.payload = payload
            self.callId = UUID()
            var update = Conversation.Update(members: [Handle(type: .generic, value: displayName, displayName: displayName)])
            let actNumber = Handle(type: .generic, value: displayName, displayName: displayName)
            update.activeRemoteMembers = Set([actNumber])
            update.localMember = Handle(type: .generic, value: displayName, displayName: displayName)
            update.capabilities = [.playingTones]
            try await self.session.reportNewIncomingConversation(uuid: self.callId, update: update)
            try await Task.sleep(nanoseconds: 2_000_000_000)
        } catch {
        }
    }
}
I want to listen for the answer/decline button events, but I can't find a solution. Please give me a code demo.
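Here is what I have tried so far as the body of conversationManager(_:perform:) in the class above. I'm assuming the JoinConversationAction / EndConversationAction subclasses and the fulfill()/fail() calls, patterned after CallKit's CXAction, so this may not be the exact API:
func conversationManager(_ manager: ConversationManager, perform action: ConversationAction) {
    switch action {
    case let join as JoinConversationAction:
        // "Answer" tapped in the system UI.
        completionHandler?(.answer, payload ?? [:])
        join.fulfill()
    case let end as EndConversationAction:
        // "Decline" tapped or the conversation was ended.
        completionHandler?(.reject, payload ?? [:])
        end.fulfill()
    default:
        action.fail()
    }
}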
I've successfully started the Live Caller ID Lookup example and initialized the PIRService.
I added several identities to the input.txtpb file, some with block: true and others with block: false.
Here is the file, with the phone digits modified:
identities {
  key: "+40790123123"
  value {
    name: "Blocking 1"
    cache_expiry_minutes: 7
    block: true
  }
}
identities {
  key: "+972526111111"
  value {
    name: "Blocking 2"
    cache_expiry_minutes: 7
    block: true
  }
}
identities {
  key: "+123"
  value {
    name: "Adam"
    cache_expiry_minutes: 8
    block: false
    category: IDENTITY_CATEGORY_PERSON
  }
}
identities {
  key: "+972526111112"
  value {
    name: "Identified Business Name 1"
    cache_expiry_minutes: 1
    block: false
    category: IDENTITY_CATEGORY_BUSINESS
  }
}
identities {
  key: "+972526111113"
  value {
    name: "Identified Business Name 2"
    cache_expiry_minutes: 1
    block: false
    category: IDENTITY_CATEGORY_BUSINESS
  }
}
The main issue is that only the number +40790123123 was actually blocked, while "Blocking 2" appeared as an identified contact with its assigned name displayed.
Notably, the only number that was blocked was a foreign number whose country code differs from that of the number being called; the other numbers belong to the same country.
Can someone clarify whether this is a bug in the example project or an issue with the data file?
I've discovered a bug in the Phone app on iOS related to how long verdicts are displayed.
When a call is identified by a third-party Caller ID app, long verdicts display correctly during the call (they auto-scroll) and in the call log (with an ellipsis at the end). However, on the call details screen, the text is strangely truncated - showing only the beginning of the string and the last word.
For testing, I used this verdict: "Musclemen grow on trees. They can tense their muscles and look good in a mirror. So what? I'm interested in practical strength that's going to help me run, jump, twist, punch."
I'll attach screenshots demonstrating the problem:
I'm reaching out to see if anyone else is experiencing issues with the Live Caller ID feature on iOS. We recently encountered a problem where the feature stopped working entirely.
Here's a brief overview of the situation:
We were monitoring test traffic on our backend and noticed everything came to a halt around 1:00 AM UTC on November 15th.
After this time, any attempts to reach our backend through calls failed completely.
I tested this across multiple devices running iOS 18.2 and iOS 18.0.
I used both TestFlight builds and development builds via Xcode, which should communicate directly with our backend.
I experienced the problem on our main application as well as a dedicated test app.
To troubleshoot further, I even set up a local server on localhost and tried directing requests there, but the requests did not reach the local server when a call was received.
Further debugging in Console.app revealed the following error:
identity request returned error: Error Domain=com.apple.CipherML Code=400 "Error Domain=com.apple.CipherML Code=401 "Unable to request data by keywords batch: failed to fetch token issuer directory"
However, when I manually tried to hit our server endpoint using curl, the request successfully reached the server:
curl https://our_server/something
hb_method=GET hb_uri=/something [Hummingbird] Request -- log on backend
This suggests that while our backend is responsive, the requests from the iOS client side are simply not being initiated.
I have tried setting 'apns-expiration' to the current time + 30 seconds, and also to '0'. But my VoIP app still receives the VoIP push notification after 2–3 minutes. By that time the caller has already hung up, yet the receiver's phone still rings when the push arrives, because we have to report it to CallKit.
Am I missing something, or is there simply no way around this? Does 'apns-expiration' guarantee neither timely delivery of VoIP push notifications nor that they are discarded once expired?
I have already set 'apns-priority' to 10, as recommended.
I'm trying to update remoteHandle using CXCallUpdate for an outgoing call, but this only works on iOS 15, not on 17 or 18 (I didn't test 16). The problem affects only outgoing calls; for incoming calls the update works fine.
func startOutgoingCall(with callID: UUID, userID: String) {
    let handle = CXHandle(type: .generic, value: userID)
    let action = CXStartCallAction(call: callID, handle: handle)
    callController.requestTransaction(with: action) { [weak self] error in
        // ...
    }
}

func updateOutgoingCall(with callID: UUID, groupID: String) {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: groupID)
    provider.reportCall(with: callID, updated: update)
}
I also tried the .phoneNumber type, but it seems the initial handle I set on CXStartCallAction cannot be changed (neither its value nor its type).
I use this handle value to implement recall by tapping a call in the Recents tab of the system Phone app. But since my calls can turn from p2p into group calls, I need to update the handle value or find some other way to pass call identification info.
I thought it was impossible to have CallKit show the system UI for outgoing calls, but then I saw this:
"For incoming and outgoing calls, CallKit displays the same interfaces as the Phone app..."
https://developer.apple.com/documentation/callkit
How do I present it, though? Or is this a documentation error?
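For context, my understanding of the documented flow for outgoing calls is to request a CXStartCallAction through a CXCallController and then report progress on the CXProvider, which updates the system's in-call UI rather than presenting a full screen of its own. A minimal sketch of that flow (the controller and provider instances here are placeholders, and normally the progress reporting would live in the CXProviderDelegate's CXStartCallAction handler):
import CallKit

final class OutgoingCallStarter {
    // Placeholder objects; a real app keeps a single CXProvider for its whole lifetime.
    private let callController = CXCallController()
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func placeCall(to userID: String) {
        let callID = UUID()
        let handle = CXHandle(type: .generic, value: userID)
        let startAction = CXStartCallAction(call: callID, handle: handle)

        callController.request(CXTransaction(action: startAction)) { [weak self] error in
            guard error == nil else { return }
            // Tell the system the call is connecting, then connected, once signaling confirms it.
            self?.provider.reportOutgoingCall(with: callID, startedConnectingAt: nil)
            self?.provider.reportOutgoingCall(with: callID, connectedAt: nil)
        }
    }
}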
Hello! We're currently testing Live Caller ID implementation and noticed an issue with userIdentifier values in our database.
Initially, we expected to have approximately 100 records (one per user), but the database grew to about 10,000 evaluationKey entries. Upon investigation, we discovered that the userIdentifier (extracted from the "User-Identifier" header) for the same device remains constant throughout a day but changes after a few days.
We store these evaluation keys using a composite key pattern "userIdentifier/configHash". All these entries have the same configHash but different userIdentifier values.
This behavior leads to unnecessary database growth as new entries are created for the same users with different userIdentifier values.
Could you please clarify:
Is this the expected behavior for userIdentifier to change over time?
If yes, is there a specific TTL (time-to-live) for userIdentifier?
If this is not intended, could this be a potential iOS bug?
This information would help us optimize our database storage and implement proper cleanup procedures.
Thank you for your assistance!
In the documentation for the example Live Caller ID server (https://swiftpackageindex.com/apple/live-caller-id-lookup-example/main/documentation/pirservice/testinginstructions) there is an example service-config.json file shown (without thorough documentation).
That config file, and the whole of the instructions, center around there being two datasets of numbers: block and identity.
My question is: is it possible to specify more than one dataset of a given kind, i.e. for both block1 and block2 to be specified?
The use case would be this: suppose the Live Caller ID server has a set of numbers it has identified as nuisance callers and lists them in the block section. User A might want all of these nuisance callers blocked, but user B does not. The Live Caller ID extension on user A's handset would therefore need to use a different dataset on the server than user B's, so that calls from that set of numbers are blocked for user A but not for user B.
Note that I'm not suggesting the Caller ID server should be capable of storing individual users' preferences. All that would be required is two datasets: one where the blocked content is empty and one where it is populated. A user/app could then switch between them as the user indicates.
Is that possible?
If the database structure, service-config.json, etc. cannot be configured to permit that, could two different servers be set up to achieve this instead, i.e. with the server URL used by the app's extension set at run time rather than at compile time?
With the Live Caller ID example server, the caller lookup dataset is defined in an input.txtpb file and processed by running the ConstructDatabase command, which creates a block.binpb and an identity.binpb file.
In other words, a static input file is being processed into static block and identity files.
However, in the real world, the data for identified and blocked numbers is in a constant state of flux and evolution: new numbers become available, old ones become stale, numbers which were initially considered safe come to be considered malicious, and so on.
Is the example server just that, merely an example using fixed datasets, while an actual production server is able to use live, ever-changing data to formulate its response to the query from iOS?
Here's a concrete use case: suppose it's a requirement to permit US NANP numbers but to block everything else. The set of numbers outside the US NANP is so large and ever-changing that it would be unfeasible to attempt to capture them in an input.txtpb file, process it, and then re-capture and re-process it endlessly. Instead, what would be required is the ability for the Live Caller ID server to evaluate at query time, using a regular expression for example, whether a number is NANP or not.
Is this possible?
I am developing a VoIP service.
Usually, when a VoIP push is received, CallKit is presented immediately and the app can be used as designed.
However, there is an extremely intermittent phenomenon (hard to reproduce) where the app does not wake up even though a VoIP push is received. Only after a long time does the app wake up and activate CallKit (long after the call came in…).
Has anyone experienced this? I wonder whether there are any known reports tied to particular OS versions. (I have observed that it does not occur on 17.x, but it is difficult to guarantee, because it happens extremely intermittently.)
The app is not running in the background, but could this be happening when there are a lot of pending operations in the background?
I need help urgently
I completed the CallKit Demo with the same code.
When I switched to LiveCommunicationKit, the code works perfectly when the app is in the foreground, but it crashes in the background.
If I switch the incoming-call reporting from LCK back to CallKit, it works fine. What is the reason?
I changed the method from
func pushRegistry(_ registry: PKPushRegistry,
didReceiveIncomingPushWith payload: PKPushPayload,
for type: PKPushType, completion: @escaping () -> Void)
to
func pushRegistry(_ registry: PKPushRegistry,
didReceiveIncomingPushWith payload: PKPushPayload,
for type: PKPushType) async
It crashes before the print "receive voip noti" is shown.
Here is the core code:
var providerDelegate: ProviderDelegate?

func pushRegistry(_ registry: PKPushRegistry,
                  didReceiveIncomingPushWith payload: PKPushPayload,
                  for type: PKPushType, completion: @escaping () -> Void) {
    if type != .voIP { return }
    guard let uuidString = payload.dictionaryPayload["uuid"] as? String,
          let uuid = UUID(uuidString: uuidString),
          let handle = payload.dictionaryPayload["handle"] as? String,
          let hasVideo = payload.dictionaryPayload["hasVideo"] as? Bool,
          let callerID = payload.dictionaryPayload["callerID"] as? String else {
        return
    }
    print("receive voip noti: \(type):\(payload.dictionaryPayload)")

    if #available(iOS 17.4, *) {
        // This path only works when the app is in the foreground
        var update = Conversation.Update(members: [Handle(type: .generic, value: callerID, displayName: callerID)])
        if hasVideo {
            update.capabilities = [.video, .playingTones]
        } else {
            update.capabilities = .playingTones
        }
        Task { @MainActor in
            do {
                print("LCKit report start")
                try await LCKitManager.shared.reportNewIncomingConversation(uuid: uuid, update: update)
                print("LCKit report success")
                completion()
            } catch {
                print("LCKit report failed")
                print(error)
                completion()
            }
        }
    } else {
        // This path works perfectly
        providerDelegate?.reportIncomingCall(uuid: uuid, callerID: callerID, handle: handle, hasVideo: hasVideo) { _ in
            completion()
        }
    }
}
@available(iOS 17.4, *)
final class LCKitManager {
    static let shared = LCKitManager()

    let manager: ConversationManager

    init() {
        manager = ConversationManager(configuration: type(of: self).configuration)
        manager.delegate = self // ConversationManagerDelegate conformance omitted here for brevity
    }

    static var configuration: ConversationManager.Configuration {
        ConversationManager.Configuration(ringtoneName: "Ringtone.aif",
                                          iconTemplateImageData: #imageLiteral(resourceName: "IconMask").pngData(),
                                          maximumConversationGroups: 1,
                                          maximumConversationsPerConversationGroup: 1,
                                          includesConversationInRecents: true,
                                          supportsVideo: false,
                                          supportedHandleTypes: [.generic])
    }

    func reportNewIncomingConversation(uuid: UUID, update: Conversation.Update) async throws {
        try await manager.reportNewIncomingConversation(uuid: uuid, update: update)
    }
}
final class ProviderDelegate: NSObject, ObservableObject {
    static let providerConfiguration: CXProviderConfiguration = {
        let providerConfiguration: CXProviderConfiguration
        if #available(iOS 14.0, *) {
            providerConfiguration = CXProviderConfiguration()
        } else {
            providerConfiguration = CXProviderConfiguration(localizedName: "Name")
        }
        providerConfiguration.supportsVideo = false
        providerConfiguration.maximumCallGroups = 1
        providerConfiguration.maximumCallsPerCallGroup = 1
        let iconMaskImage = #imageLiteral(resourceName: "IconMask")
        providerConfiguration.iconTemplateImageData = iconMaskImage.pngData()
        providerConfiguration.ringtoneSound = "Ringtone.aif"
        providerConfiguration.includesCallsInRecents = true
        providerConfiguration.supportedHandleTypes = [.generic]
        return providerConfiguration
    }()

    private let provider: CXProvider

    override init() {
        provider = CXProvider(configuration: type(of: self).providerConfiguration)
        super.init()
        provider.setDelegate(self, queue: nil) // CXProviderDelegate methods omitted here for brevity
    }

    func reportIncomingCall(uuid: UUID, callerID: String, handle: String, hasVideo: Bool, completion: ((Error?) -> Void)? = nil) {
        let callerUUID = UUID()
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: callerID)
        update.hasVideo = hasVideo
        update.localizedCallerName = callerID
        // Report the incoming call to the system
        provider.reportNewIncomingCall(with: callerUUID, update: update) { [weak self] error in
            completion?(error)
        }
    }
}
Issue:
Under certain conditions, using CallKit does not automatically enable the microphone.
Steps to Reproduce:
1. Start an outgoing call, then the user manually mutes the audio.
2. Receive a native incoming call, end the current call, then answer the new incoming call. (This order is important.)
3. End the incoming call.
4. Start another outgoing call and observe the microphone; do not manually mute or unmute.
Actual Behavior:
The audio icon indicates that the audio is unmuted, but the microphone remains off, and the small yellow dot in the top status bar (which represents the microphone) does not appear.
Expected Behavior:
The microphone should be on, consistent with the audio icon display, and the small yellow dot should appear in the top status bar.
Device:
iPhone 16 Pro & iPhone 15 Pro, iOS 18.0+
Can it be reproduced using Speakerbox (the CallKit demo)?
YES
The example database/server provided by Apple for Live Caller ID contains a hardcoded database with a tiny number of pre-defined numbers.
However, it is not intended to be representative of a live, real-world production server.
The question, then, is how can that be accomplished if it's a requirement that the data be KPIR encrypted?
In real-world scenarios, the factors that affect whether a number should be blocked are continually changing and evolving on a minute-by-minute basis, as new information becomes available or existing information changes.
If the database holds tens or hundreds of millions of constantly changing phone numbers, then meeting the Live Caller ID requirement that the data be KPIR encrypted would imply re-encrypting those millions of entries endlessly, forever.
That seems unfeasible and impractical to implement.
So how do the designers of this feature at Apple envisage a real-world server, backed by millions of changing records, meeting the requirement that the data be KPIR encrypted?
CallKit and WebRTC are used to implement the call functionality.
You can select video, voice, or text calling as your calling method.
When making a text call, the voice input is grayed out and cannot be used. Is there a solution?
In mainland China, CallKit is not available. Recently, I discovered LiveCommunicationKit.
Can it replace CallKit?
There is no relevant introduction in the documentation. Can someone help me understand how to use it?
https://developer.apple.com/documentation/livecommunicationkit
Hello,
We have a Push-to-Talk (PTT) application that is already well established and widely used. Our app has the proper VoIP entitlement, which we are using to wake up the app and establish a WebSocket connection for real-time communication. We are also using CallKit as a supporting mechanism, but not as the primary interaction upon receiving the VoIP Push, since our use case differs from traditional full-duplex VoIP calls.
While our implementation works correctly in many cases, we have noticed a consistent issue where, after multiple VoIP Push notifications, the system still delivers the push, but prevents the WebSocket from reconnecting.
At this point, all connection attempts return errors such as:
• "Software caused connection abort"
This issue persists until the app is manually relaunched, after which the behavior resets and repeats.
We are aware that VoIP Push was originally designed for full-duplex calls, but since Apple allows its use for other purposes through the entitlement, we would like to understand why this limitation is occurring and how to handle it properly.
Questions:
1. Is iOS enforcing stricter background execution rules after multiple VoIP Push events within a short period?
2. Are there any recommended best practices to ensure reliable WebSocket reconnection in this scenario?
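To make the scenario concrete, here is a minimal sketch of the reconnect-with-backoff pattern being described; the endpoint URL and backoff values are placeholders, not our production implementation:
import Foundation

final class SocketReconnector {
    private let url = URL(string: "wss://example.com/socket")! // placeholder endpoint
    private var task: URLSessionWebSocketTask?
    private var attempt = 0

    func connect() {
        let task = URLSession.shared.webSocketTask(with: url)
        self.task = task
        task.resume()
        receive()
    }

    private func receive() {
        task?.receive { [weak self] result in
            switch result {
            case .success:
                self?.attempt = 0
                self?.receive() // keep listening
            case .failure(let error):
                // This is where we see "Software caused connection abort" after several VoIP pushes.
                print("socket error: \(error)")
                self?.scheduleReconnect()
            }
        }
    }

    private func scheduleReconnect() {
        attempt += 1
        let delay = min(pow(2.0, Double(attempt)), 30) // capped exponential backoff
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
            self?.connect()
        }
    }
}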
I am implementing flutter_callkit_incoming for handling call notifications in my Flutter app. However, I am facing an issue where VoIP push notifications are not consistently received when the app is in the background or terminated.
According to Apple’s documentation:
"On iOS 13.0 and later, if you fail to report a call to CallKit, the system will terminate your app. Repeatedly failing to report calls may cause the system to stop delivering any more VoIP push notifications to your app."
I have followed the official installation guide (flutter_callkit_incoming installation) and implemented all the necessary configuration. However, VoIP notifications sometimes get lost and are not delivered reliably.
Here is the payload I am using:
{
  "notification": { "title": "New Alert", "body": "@H is calling you..." },
  "android": {
    "notification": {
      "channelId": "channel_id",
      "sound": "sound_name.mp3"
    }
  },
  "apns": { "payload": { "aps": {} } },
  "data": {
    "title": "New Call",
    "body": "@H is calling you...",
    "notificationType": "CALL",
    "type": "NOTIFICATION",
    "sound": "sound_name"
  },
  "token": "token"
}
I expect the call notification to appear even when the app is in the background or killed state. Has anyone encountered this issue and found a solution? Any insights would be greatly appreciated.
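For reference, the native iOS side that flutter_callkit_incoming ultimately relies on needs a PKPushRegistry registered for VoIP pushes, and the server must target the VoIP device token rather than the regular FCM/APNs token shown in the payload above. Below is my own simplified sketch of that registration, not the plugin's actual code; the delegate here only logs, whereas a real handler must report the call to CallKit before calling completion():
import UIKit
import PushKit

class AppDelegate: UIResponder, UIApplicationDelegate, PKPushRegistryDelegate {
    var voipRegistry: PKPushRegistry?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        // Register for VoIP pushes; without this, APNs VoIP pushes are never delivered to the app.
        let registry = PKPushRegistry(queue: .main)
        registry.delegate = self
        registry.desiredPushTypes = [.voIP]
        voipRegistry = registry
        return true
    }

    func pushRegistry(_ registry: PKPushRegistry, didUpdate pushCredentials: PKPushCredentials, for type: PKPushType) {
        // This VoIP device token (not the FCM token) is what the server must send VoIP pushes to.
        let token = pushCredentials.token.map { String(format: "%02x", $0) }.joined()
        print("VoIP token: \(token)")
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // A real implementation must report the call to CallKit here, before calling completion().
        print("VoIP push: \(payload.dictionaryPayload)")
        completion()
    }
}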