We are experiencing an issue where our app gets stuck during launch. The splash screen appears for some time, and then the app either becomes unresponsive or closes unexpectedly. However, no crash logs are captured in Xcode or Firebase Crashlytics, which suggests the app is not crashing but is being terminated by the system (for example, by the launch watchdog). This issue is preventing affected users from launching the app properly.
Additionally, some users have reported occasional lag and slow performance when using the app. The issue occurs only for a specific subset of users and appears to be related to other Electronic Logging Device (ELD) apps running in the background. When these apps are active, our app struggles to launch and sometimes becomes unresponsive.
We suspect that this behavior could be related to system resource allocation, such as high memory consumption by background apps, which might be affecting our app's ability to launch correctly. However, we have been unable to reproduce the issue on our end despite multiple attempts.
Actions Performed During App Launch:
Firebase configuration
API requests, including:
Fetching account details
Registering the FCM token with the server
Asynchronous background requests to fetch POI details
Creating a local database and storing POI data in local storage
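For reference, here is a minimal sketch of how we could defer the non-critical launch work above off the main launch path so the system never sees a blocked launch (assumed structure with hypothetical helper names, not our production code):
import FirebaseCore
import UIKit

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Keep only essential, fast setup on the launch path.
        FirebaseApp.configure()

        // Defer network requests and database creation to a background task.
        Task(priority: .utility) {
            await AccountService.shared.fetchAccountDetails()      // hypothetical helper
            await PushService.shared.registerFCMToken()            // hypothetical helper
            await POIStore.shared.createDatabaseAndCachePOIs()     // hypothetical helper
        }
        return true
    }
}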
We would like guidance from Apple regarding potential causes and debugging strategies, especially in scenarios where the app does not produce crash logs but still fails to launch properly. Any insights into memory management, conflicts with background applications, or system resource constraints would be highly appreciated.
Steps to Reproduce:
Install and launch the app on an affected device.
Observe that the app gets stuck on the launch screen.
After some time, the app terminates unexpectedly.
The issue is inconsistent and occurs only for certain users.
The presence of other ELD apps running in the background appears to influence the issue.
Posts under CarPlay tag
With CarPlay communication plugin R18.1, I followed these steps to integrate Enhanced Siri; afterwards, music audio was output from CarPlay, and there was no option to route the output to the car.
============================================
Enhanced Siri
Declare supported audioFormats for the AuxIn and AuxOut streams
Since the AuxIn and AuxOut streams for Siri do not have to be active at the same time, the accessory must claim audio format support for AuxIn audio and AuxOut audio independently. The audio formats for each stream can differ from each other (e.g., 48 kHz for AuxOut and 16 kHz for AuxIn). The new audio types represent these new streams: AuxIn/speechRecognition and AuxOut/speechRecognition.
Check if connected device supports the feature
AirPlayReceiverSessionHasFeatureEnhancedSiri()
Claim support in the Setup Response message if device supports the feature
Add kAirPlayKeyAccessoryEnabledFeature_EnhancedSiri key through the AirPlayReceiverServer delegate AirPlayReceiverServerCopyProperty_f
function for the kAirPlayKey_AccessoryEnabledFeatures key.
Helper function: CFArrayAppendValue()
Add Enhanced Siri parameters dictionary in the INFO message
Add dictionary through the AirPlayReceiverServer delegate AirPlayReceiverServerCopyProperty_f
function for the kAirPlayKey_EnhancedSiriInfo key.
kAirPlayKey_EnhancedSiriInfo dictionary parameters:
Voice activation of Siri - kAirPlayKey_EnhancedSiriVoice
Current language of voice model - kAirPlayKey_VoiceModelCurrentLanguage
Supported languages of voice model - kAirPlayKey_VoiceModelSupportedLanguages
Enhanced Button activation of Siri - kAirPlayKey_EnhancedSiriButton
Supported zone(s) if any - kAirPlayKey_SupportedSiriTriggerZones
Update AudioStream
Get the state of the AuxIn stream by providing an implementation of AudioStreamUpdateState() - off, local buffering, or streaming to device.
Decouple input streams from output streams. AuxIn is an independent input-only stream. The property kAudioStreamProperty_Direction indicates whether the stream is input, output, or input & output.
Provide a handler for the AirPlayReceiverSessionDelegate setEnhancedSiriParams_f
This will provide additional information:
Activation type
Setting the language of the voice model
Invoke the Communication Plugin to start buffering
Once the activation type has been specified, the accessory can request the plugin to start buffering using
AirPlayReceiverSessionAuxInStart().
Use the new APIs to trigger Siri:
AirPlayReceiverSessionRequestSiriActionWithLatency()
AirPlayReceiverSessionRequestSiriVoiceActivationWithLatency()
AirPlayReceiverSessionRequestSiriVoiceActivationWithSample()
Button presses and voice activations should use these new APIs, which add a timestamp to the activation. These APIs allow a choice of a latency or a sample for button and voice activations.
If there is a delay between the user pressing the button and the device being notified of the press, the latency value should represent that time.
If the accessory can determine which zone activated, it can provide the zone with the request.
Invoke the Communication Plugin to stop buffering
You may need to invoke the plugin to stop buffering (AirPlayReceiverSessionAuxInStop()) if exclusive access to the microphone is necessary.
Such instances include, but are not limited to:
Native voice recognition session
Telephony
Another stream function which uses the microphone starts
The modesChanged notification can be used to determine whether a resource is in use.
Note: if the session ends, the plugin automatically stops buffering.
Why is there no option as a CarPlay developer to create an app for tracking and entering your car's maintenance records? I know the pat reply would be that Apple doesn't want you doing this while the car is in motion, but I would normally do this while parked at the dealership or another service provider, no?
I'm working on adding CarPlay support to an audio app and I'd like to mimic the behavior of the Apple Music app on launch.
Forgive me, but I think using Gherkin syntax here will help to best describe the desired behavior:
GIVEN the Apple Music app is in a cold state (not launched or in memory)
AND another audio app is actively playing audio
WHEN I launch the Apple Music app from CarPlay
THEN the Now Playing template is shown via a push
AND the appropriate Now Playing info is shown
AND the Now Playing button is shown on the tab bar
AND the actively playing audio from another audio app is NOT interrupted
The current behavior I see in my own app is that I can push on the Now Playing template and fill out the MPNowPlayingInfoCenter's info dictionary, but it won't render the info or show the Now Playing button on the tab bar until I start playing audio.
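For reference, the flow described above looks roughly like this (a simplified sketch with placeholder values, not my exact code):
import CarPlay
import MediaPlayer

func showNowPlaying(on interfaceController: CPInterfaceController) {
    // Push the shared Now Playing template.
    interfaceController.pushTemplate(CPNowPlayingTemplate.shared, animated: true, completion: nil)

    // Fill out the system Now Playing info; CarPlay renders from this center.
    var info: [String: Any] = [:]
    info[MPMediaItemPropertyTitle] = "Episode title"        // placeholder
    info[MPMediaItemPropertyArtist] = "Show name"           // placeholder
    info[MPMediaItemPropertyPlaybackDuration] = 1800.0
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = 0.0
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}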
Also, is there a way to hide the Now Playing button after the queue of content has finished playing? I'm able to pop the Now Playing template, but the Now Playing button is still present and tapping it will navigate the user to the now blank Now Playing template.
I am currently implementing multiple scenes in my React Native / Swift application (one scene for the phone and one scene for CarPlay). I am facing an issue where one scene renders completely white (on the iPhone), yet I can see in the console that the code is running (for example, if I add a console.log to App.tsx, I can see that log in Xcode).
There are no errors when building the app in Xcode, and when testing with the simulator, CarPlay appears to render the correct output, but no component is rendered on the simulated phone screen (just white).
AppDelegate.swift
import CarPlay
import React
import React_RCTAppDelegate
import ReactAppDependencyProvider
import UIKit
@main
class AppDelegate: RCTAppDelegate {
var rootView: UIView?;
static var shared: AppDelegate { return UIApplication.shared.delegate as! AppDelegate }
override func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey : Any]? = nil) -> Bool {
self.moduleName = "appName"
self.dependencyProvider = RCTAppDependencyProvider()
self.initialProps = [:]
self.rootView = self.createRootView(
with: RCTBridge(
delegate: self,
launchOptions: launchOptions
),
moduleName: self.moduleName!,
initProps: self.initialProps!
);
return super.application(application, didFinishLaunchingWithOptions: launchOptions)
}
override func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
if (connectingSceneSession.role == UISceneSession.Role.carTemplateApplication) {
let scene = UISceneConfiguration(name: "CarPlay", sessionRole: connectingSceneSession.role)
scene.delegateClass = CarSceneDelegate.self
return scene
}
let scene = UISceneConfiguration(name: "Phone", sessionRole: connectingSceneSession.role)
scene.delegateClass = PhoneSceneDelegate.self
return scene
}
override func application(_ application: UIApplication, didDiscardSceneSessions sceneSessions: Set<UISceneSession>) {}
override func sourceURL(for bridge: RCTBridge) -> URL? {
self.bundleURL()
}
override func bundleURL() -> URL? {
#if DEBUG
RCTBundleURLProvider.sharedSettings().jsBundleURL(forBundleRoot: "index")
#else
Bundle.main.url(forResource: "main", withExtension: "jsbundle")
#endif
}
}
PhoneSceneDelegate.swift
import Foundation
import UIKit
import SwiftUI
class PhoneSceneDelegate: UIResponder, UIWindowSceneDelegate {
var window: UIWindow?;
func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
if session.role != .windowApplication {
return
}
guard let appDelegate = (UIApplication.shared.delegate as? AppDelegate) else {
return
}
guard let windowScene = (scene as? UIWindowScene) else {
return
}
let rootViewController = UIViewController()
rootViewController.view = appDelegate.rootView;
let window = UIWindow(windowScene: windowScene)
window.rootViewController = rootViewController
self.window = window
window.makeKeyAndVisible()
}
}
App.tsx
import React, {useEffect, useState} from 'react';
import type {PropsWithChildren} from 'react';
import {CarPlay, ListTemplate} from 'react-native-carplay';
import {
ScrollView,
StatusBar,
StyleSheet,
Text,
useColorScheme,
View,
} from 'react-native';
import {
Colors,
DebugInstructions,
Header,
LearnMoreLinks,
ReloadInstructions,
} from 'react-native/Libraries/NewAppScreen';
type SectionProps = PropsWithChildren<{
title: string;
}>;
function Section({children, title}: SectionProps): React.JSX.Element {
const isDarkMode = useColorScheme() === 'dark';
return (
<View style={styles.sectionContainer}>
<Text
style={[
styles.sectionTitle,
{
color: isDarkMode ? Colors.white : Colors.black,
},
]}>
{title}
</Text>
<Text
style={[
styles.sectionDescription,
{
color: isDarkMode ? Colors.light : Colors.dark,
},
]}>
{children}
</Text>
</View>
);
}
function App(): any { // React.JSX.Element
const isDarkMode = useColorScheme() === 'dark';
const backgroundStyle = {
backgroundColor: isDarkMode ? Colors.darker : Colors.lighter,
};
const [carPlayConnected, setCarPlayConnected] = useState(CarPlay.connected);
useEffect(() => {
function onConnect() {
setCarPlayConnected(true);
CarPlay.setRootTemplate(new ListTemplate(/** This renders fine on the CarPlay side */));
}
function onDisconnect() {
setCarPlayConnected(false);
}
CarPlay.registerOnConnect(onConnect);
CarPlay.registerOnDisconnect(onDisconnect);
return () => {
CarPlay.unregisterOnConnect(onConnect);
CarPlay.unregisterOnDisconnect(onDisconnect);
};
});
if (carPlayConnected) {
console.log('car play connected');
} else {
console.log('car play not connected');
}
const safePadding = '5%';
// This doesn't render on the phone?
return (
<View style={backgroundStyle}>
<StatusBar
barStyle={isDarkMode ? 'light-content' : 'dark-content'}
backgroundColor={backgroundStyle.backgroundColor}
/>
<ScrollView
style={backgroundStyle}>
<View style={{paddingRight: safePadding}}>
<Header/>
</View>
<View
style={{
backgroundColor: isDarkMode ? Colors.black : Colors.white,
paddingHorizontal: safePadding,
paddingBottom: safePadding,
}}>
<Section title="Step One">
Edit <Text style={styles.highlight}>App.tsx</Text> to change this
screen and then come back to see your edits.
</Section>
<Section title="See Your Changes">
<ReloadInstructions />
</Section>
<Section title="Debug">
<DebugInstructions />
</Section>
<Section title="Learn More">
Read the docs to discover what to do next:
</Section>
<LearnMoreLinks />
</View>
</ScrollView>
</View>
);
}
const styles = StyleSheet.create({
sectionContainer: {
marginTop: 32,
paddingHorizontal: 24,
},
sectionTitle: {
fontSize: 24,
fontWeight: '600',
},
sectionDescription: {
marginTop: 8,
fontSize: 18,
fontWeight: '400',
},
highlight: {
fontWeight: '700',
},
});
export default App;
I have been attempting to get this working for some 20+ hours now, with no luck searching for answers elsewhere. I am very new to building apps with React Native and Swift, so I could do with some support.
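The CarSceneDelegate referenced in configurationForConnecting above is not included; for context, a typical CPTemplateApplicationSceneDelegate for this kind of setup looks roughly like the following sketch (assumed, not taken from the project):
import CarPlay
import UIKit

class CarSceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        // Hand the interface controller (and carWindow, if needed) to react-native-carplay here;
        // the exact bridge call depends on the library version.
    }

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) {
        // Tear down the react-native-carplay connection here.
    }
}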
Is there any way I can customize the CarPlay template to look like this?
Hi there, I'm facing an issue when disconnecting CarPlay: the navigation session seems to be left in a state where it is not properly finished. When I reconnect CarPlay, the "Metadata in instrument cluster or HUD" no longer updates until I start another navigation session and stop that one.
You can see that the instruction on the left of this screen recording stops updating after a reconnect.
https://www.youtube.com/watch?v=sncxyJULjQk
I have modified the CoastalRoads sample app to add support for the HUD cluster and to auto-start a navigation simulation when CarPlay connects.
https://github.com/g4rb4g3/CoastalRoads
Can anyone tell me what I have to do when CarPlay disconnects so that I can start a new navigation session on reconnect with a working HUD cluster?
A fun fact is that Apple Maps handles this quite nicely (https://www.youtube.com/watch?v=OpJEIyGcwdo); it somehow manages to finish the navigation session and brings up the HUD cluster just fine on reconnect.
I wonder how I can achieve the same. Does anyone have an idea?
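One thing that might be worth trying is explicitly finishing the active navigation session when the CarPlay scene disconnects; a sketch of that idea (with an assumed property for the session, and not a confirmed fix):
import CarPlay
import UIKit

class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    // Assumed property holding the session returned by CPMapTemplate.startNavigationSession(for:).
    var activeNavigationSession: CPNavigationSession?

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) {
        // Finish the trip so the session is not left dangling across reconnects.
        activeNavigationSession?.finishTrip()
        activeNavigationSession = nil
    }

    // didConnect and the rest of the delegate are omitted for brevity.
}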
When are you guys going to fix the CarPlay issues with this new update? I use this for work and it’s really an issue. Nothing is working and it takes up entirely too much space.
Is there any way I can get updates when I change CarPlay style settings?
I've tried CPSessionConfigurationDelegate.contentStyleChanged and CPTemplateApplicationSceneDelegate.contentStyleDidChange, but they always produce the same result.
When I choose:
Automatic -> I receive light in case of daylight;
Always Dark and Always Show Dark Map toggle on -> dark
Always Dark and Always Show Dark Map toggle off -> light.
But this seems wrong, because CarPlay's toolbar is still dark while I receive light.
Is there a way to get a dark style when choosing Always Dark and Always Show Dark Map toggle off? Or at least get updates when the Always Show Dark Map toggle changes?
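For reference, this is roughly how the CPSessionConfiguration callback can be wired up (a simplified sketch, not my exact code); CPInterfaceController.carTraitCollection is another value that can be compared against it:
import CarPlay

class CarPlayStyleObserver: NSObject, CPSessionConfigurationDelegate {
    private var sessionConfiguration: CPSessionConfiguration!

    override init() {
        super.init()
        sessionConfiguration = CPSessionConfiguration(delegate: self)
        // Initial value, e.g. [.light] or [.dark].
        print("initial contentStyle:", sessionConfiguration.contentStyle)
    }

    func sessionConfiguration(_ sessionConfiguration: CPSessionConfiguration,
                              contentStyleChanged contentStyle: CPContentStyle) {
        print("contentStyleChanged:", contentStyle.contains(.dark) ? "dark" : "light")
    }

    func sessionConfiguration(_ sessionConfiguration: CPSessionConfiguration,
                              limitedUserInterfacesChanged limitedUserInterfaces: CPLimitableUserInterface) {
        // Not relevant to content style; included only for completeness.
    }
}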
Hello,
I have a CarPlay navigation app and use AVSpeechSynthesizer to speak directions to the user. Everything works great in the CarPlay simulator as well as when plugged into my GMC truck. However, I found out yesterday that for one of my users with a Ford truck, the audio would cut in and out.
After much troubleshooting, I was able to replicate this on my own truck when using Bluetooth to connect to CarPlay. My user was also using Bluetooth. Has anyone else experienced this? Is there a fix for the problem?
import SwiftUI
import AVFoundation
class TextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
private var speechSynthesizer = AVSpeechSynthesizer()
static let shared = TextToSpeechService()
override init() {
super.init()
speechSynthesizer.delegate = self
}
func configureAudioSession() {
speechSynthesizer.delegate = self
do {
try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers, .allowBluetooth])
} catch {
print("Failed to set audio session category: \(error.localizedDescription)")
}
}
func speak(_ text: String) {
Task(priority: .high) {
let speechUtterance = AVSpeechUtterance(string: text)
speechUtterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
speechSynthesizer.speak(speechUtterance)
}
}
func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
Task {
stopSpeech()
try AVAudioSession.sharedInstance().setActive(false)
}
}
func stopSpeech() {
speechSynthesizer.stopSpeaking(at: .immediate)
}
}
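One detail in the configuration above that may be worth experimenting with (an observation, not a confirmed fix): .allowBluetooth selects the low-bandwidth hands-free (HFP) route and is really intended for .playAndRecord sessions, while .allowBluetoothA2DP keeps playback on the high-quality A2DP route. A sketch of that variant:
import AVFoundation

func configureAudioSessionForPrompts() {
    do {
        // Same category and mode as above, but preferring the A2DP Bluetooth route.
        try AVAudioSession.sharedInstance().setCategory(
            .playback,
            mode: .voicePrompt,
            options: [.mixWithOthers, .allowBluetoothA2DP]
        )
    } catch {
        print("Failed to set audio session category: \(error.localizedDescription)")
    }
}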
Right now it looks like a CarPlay app has to fit the supported categories, such as messaging, navigation, and music apps. What about an automotive app itself? How much flexibility is there?
We have the official service app for one car brand (around 1M+ car users), and we are looking to merge the experience between outside the car and inside the car. Can we add features to the app that surface information belonging to the car, such as trip calculations, car info displayed on the screen, or service-time reminders?
Following that question, can we get a brief from Apple about exactly what CarPlay supports now and in the next generation?
Or can we work closely with your team as a partner to make this happen and develop it into a flagship product? We can share some data and discuss it with real insight.
Hi you all,
I wrote a CarPlay Driving Task app that can start navigation to a well-defined target. This works as expected (the navigation app is started with the destination), but sometimes, for important reasons, my application receives an update to the current destination. I should then interrupt the current navigation and switch to the new destination, but I'm not able to interrupt or update it. Do you have any idea whether this is possible?
Thanks in advance!
Hello everyone,
I am currently working on an app project aimed at users who want to quickly and easily capture their ideas and notes while on the go. The basic concept is to develop an iOS app where users can store both typed notes and voice recordings – essentially a "brain dump" solution. The core functionality (storing, editing, synchronizing via CloudKit, etc.) will be handled within the iOS app.
In addition, I plan to integrate a CarPlay extension that allows the driver to start and stop a recording – ideally through a minimalist interface featuring a large record button and a "Done" button. Since the iPhone is often not within immediate reach in the car, the CarPlay integration should serve as a quick trigger to initiate the recording in the iOS app.
My questions are as follows:
Has anyone had experience implementing a CarPlay extension for an app that primarily handles notes and voice recordings, rather than falling into the traditional categories like navigation, audio, or communication?
Has such a concept ever been approved by Apple, or are there known hurdles and guidelines that must be observed?
Are there alternative approaches to implementing CarPlay integration in this context in a compliant and effective manner?
I would greatly appreciate any feedback, shared experiences, and tips on best practices.
Thank you in advance and best regards!
Hello,
Could you please help me with the below,
How to display a toast message to the user in CarPlay after a successful operation?
How to show a spinner or an activity indicator just before performing some operation?
I have referred to the CarPlay design guidelines PDF, in which I couldn't find support for the above two. But I could see a loader within a button in one of the default apps in the CarPlay simulator.
Kindly help me with these queries
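The closest thing I could find with public API is presenting a CPAlertTemplate and dismissing it after a short delay to approximate a toast; a sketch of that idea (and I am not sure it is the intended approach):
import CarPlay
import Foundation

func showTransientMessage(_ text: String, on interfaceController: CPInterfaceController) {
    let alert = CPAlertTemplate(titleVariants: [text], actions: [])
    interfaceController.presentTemplate(alert, animated: true, completion: nil)

    // Dismiss automatically after a couple of seconds to mimic a toast.
    DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
        interfaceController.dismissTemplate(animated: true, completion: nil)
    }
}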
Hi all!
Based on documentation: https://developer.apple.com/documentation/carplay/cplistitem/handler
If you need to perform asynchronous tasks, dispatch them to a background queue and call the completion closure or completionBlock when they complete.
In the normal case, it works perfectly. But if the handler takes too long, for example 10 seconds (streaming with retries, app business logic), calling the completionBlock inside the handler no longer does anything.
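For reference, this is roughly the shape of the code (simplified, with a hypothetical stand-in for the real business logic):
import CarPlay
import Foundation

func makeListItem() -> CPListItem {
    let item = CPListItem(text: "Play station", detailText: nil)
    item.handler = { _, completion in
        // Long-running work dispatched to a background queue, as the documentation suggests.
        DispatchQueue.global(qos: .userInitiated).async {
            startStreamingWithRetries()   // hypothetical stand-in; can take ~10 seconds
            DispatchQueue.main.async {
                completion()              // called late; appears to have no effect
            }
        }
    }
    return item
}

func startStreamingWithRetries() { /* placeholder for the real streaming/retry logic */ }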
Is there a timeout on the completionBlock?
Thanks!
Hello,
Please guide me if there is a way to show a simple toast telling the user that the action has been performed. When a user taps a button, an API returns a status, based on which I need to show an appropriate message as a toast. Is this possible in CarPlay? If not, why? Please suggest any alternative for this.
Awaiting your response
Thanks a lot!!
Messaging is missing from CarPlay. Unable to send or receive messages. Not even an option to add it to CarPlay. This is my most heavily used feature while I’m in the car.
I am wondering how to change the measurement units shown on screen in my CPMapTemplate. In my screenshot below, the distance is in miles; how can I change that to kilometers?
Does this need to come from my route data?
I am not seeing this anywhere in the CarPlay programming guide or in the documentation.
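In case it is relevant, the distances shown for maneuvers and trips come from the Measurement<UnitLength> values supplied in CPTravelEstimates, so one thing to check is the unit used when building them. A sketch (with assumed values; the system may still localize the display for the user's region):
import CarPlay

// Build travel estimates explicitly in kilometers.
let distance = Measurement(value: 2.4, unit: UnitLength.kilometers)
let estimates = CPTravelEstimates(distanceRemaining: distance, timeRemaining: 180)

// For an upcoming maneuver:
let maneuver = CPManeuver()
maneuver.initialTravelEstimates = estimates

// For the whole trip on the map template (mapTemplate and trip are assumed to exist):
// mapTemplate.update(estimates, for: trip, with: .default)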
Dear Developers,
I'm exploring the feasibility of playing video content on CarPlay when the vehicle is in a parked state. Could you please provide guidance on whether this is supported and, if so, the best practices for implementation?
System: iOS 18.1.1
When connected to CarPlay, after playing a song, check the playback page (CPNowPlayingTemplate). This error appears on a BMW car, as shown in the picture:
In our project, this is achieved using the following methods:
UIImage *image1 = [UIImage imageNamed:@"imageName"];
CPNowPlayingImageButton *button1 = [[CPNowPlayingImageButton alloc] initWithImage:image1 handler:^(__kindof CPNowPlayingButton * _Nonnull action) {
//do something
}];
UIImage *image2 = [UIImage imageNamed:@"imageName"];
CPNowPlayingImageButton *button2 = [[CPNowPlayingImageButton alloc] initWithImage:image2 handler:^(__kindof CPNowPlayingButton * _Nonnull action) {
//do something
}];
NSArray<CPNowPlayingButton *> *buttons;
buttons = @[button1,button2];
[[CPNowPlayingTemplate sharedTemplate] updateNowPlayingButtons:buttons];
Is there any way to solve this problem?