I have received two strange crash reports from an iPad11,7 running iPadOS 15.1 and an iPad11,2 running iPadOS 15.2.
On both occasions, the crashed thread calls CFURLRequestSetMainDocumentURL, which in turn calls _dispatch_source_set_runloop_timer_4CF in libdispatch, after which the application crashes with SIGSEGV and SEGV_MAPERR.
The crashed thread's call stack is displayed below. Full crash logs are attached as well. What could this be?
Exception Type: SIGSEGV
Exception Codes: SEGV_MAPERR at 0x21d
Crashed Thread: 20
Thread 20 Crashed:
0 libdispatch.dylib 0x00000001829e1784 _dispatch_source_set_runloop_timer_4CF + 36
1 CFNetwork 0x00000001834fc824 CFURLRequestSetMainDocumentURL + 2240
2 CFNetwork 0x00000001836b89a8 _CFNetworkErrorGetLocalizedDescription + 693652
3 CFNetwork 0x00000001834fdb1c CFURLRequestSetMainDocumentURL + 7096
4 CFNetwork 0x00000001834f3c34 CFURLRequestSetURL + 9668
5 libdispatch.dylib 0x00000001829ca914 _dispatch_call_block_and_release + 28
6 libdispatch.dylib 0x00000001829cc660 _dispatch_client_callout + 16
7 libdispatch.dylib 0x00000001829d3de4 _dispatch_lane_serial_drain + 668
8 libdispatch.dylib 0x00000001829d498c _dispatch_lane_invoke + 440
9 libdispatch.dylib 0x00000001829d5c74 _dispatch_workloop_invoke + 1792
10 libdispatch.dylib 0x00000001829df1a8 _dispatch_workloop_worker_thread + 652
11 libsystem_pthread.dylib 0x00000001f1eea0f4 _pthread_wqthread + 284
12 libsystem_pthread.dylib 0x00000001f1ee9e94 start_wqthread + 4
second_crashlog.txt
report-2517628380750009999-e4d7ea06-6f22-4b7e-b129-045599e1dee5.txt
Hello, sometimes when I use NSMetadataQuery to observe my file changes on macOS, it crashes for the reason below. It's odd, and I have no clue what causes it, because my code never accesses the results by index. Can anyone help? Thanks!
Application Specific Information:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[_NSMetadataQueryResultArray objectAtIndex:]: index (251625) out of bounds (251625)'
terminating with uncaught exception of type NSException
abort() called
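For reference, a minimal query setup of the kind involved looks roughly like this (an illustrative sketch, not my exact code; the relevant detail, as I understand it, is that results is only safe to enumerate while live updates are paused with disableUpdates()/enableUpdates()):
import Foundation

let query = NSMetadataQuery()
query.searchScopes = [NSMetadataQueryUbiquitousDocumentsScope]
query.predicate = NSPredicate(format: "%K LIKE '*'", NSMetadataItemFSNameKey)

NotificationCenter.default.addObserver(forName: .NSMetadataQueryDidUpdate,
                                       object: query,
                                       queue: .main) { _ in
    // Pause live updates so the result array cannot mutate while it is being read.
    query.disableUpdates()
    defer { query.enableUpdates() }
    for case let item as NSMetadataItem in query.results {
        _ = item.value(forAttribute: NSMetadataItemFSNameKey)
    }
}
query.start()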
I was playing around a bit with the new AttributedString and a few questions came up.
I saw this other forum question "JSON encoding of AttributedString with custom attributes", but I did not completely understand the answer and how I would need to use it.
I created my custom attribute where I just want to store additional text like this:
enum AdditionalTextAttribute: CodableAttributedStringKey, MarkdownDecodableAttributedStringKey {
    typealias Value = AttributedString
    static let name = "additionalText"
}
I then extended the AttributeScopes like this:
extension AttributeScopes {
    struct MyAppAttributes: AttributeScope {
        let additionalText: AdditionalTextAttribute
        let swiftUI: SwiftUIAttributes
    }
    var myApp: MyAppAttributes.Type { MyAppAttributes.self }
}
and I also implemented the AttributeDynamicLookup like this:
extension AttributeDynamicLookup {
    subscript<T: AttributedStringKey>(dynamicMember keyPath: KeyPath<AttributeScopes.MyAppAttributes, T>) -> T { self[T.self] }
}
So next I created my AttributedString and added some attributes to it:
var attStr = AttributedString("Hello, here is some text.")
let range1 = attStr.range(of: "Hello")!
let range2 = attStr.range(of: "text")!
attStr[range1].additionalText = AttributedString("Hi")
attStr[range2].foregroundColor = .blue
attStr[range2].font = .caption2
Next I tried to create some JSON from my string and took a look at it like this:
let jsonData = try JSONEncoder().encode(attStr)
print(String(data: jsonData, encoding: .utf8) ?? "no data")
//print result: ["Hello",{},", here is some ",{},"text",{"SwiftUI.ForegroundColor":{},"SwiftUI.Font":{}},".",{}]
I guess it makes sense that both SwiftUI.ForegroundColor and SwiftUI.Font are empty, because neither of them conforms to the Codable protocol.
My first question would be: Why does my additionalText attribute not show up here?
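For what it's worth, my current suspicion is that the custom scope has to be passed explicitly when encoding and decoding. The following is only a rough, untested sketch of what I mean, assuming the CodableWithConfiguration support that ships with AttributedString (the Wrapper type is just for illustration):
struct Wrapper: Codable {
    var text: AttributedString

    init(text: AttributedString) { self.text = text }

    enum CodingKeys: CodingKey { case text }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Decoding with an explicit scope should bring the custom attributes back.
        text = try container.decode(AttributedString.self, forKey: .text,
                                    configuration: AttributeScopes.MyAppAttributes.self)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        // Encoding with an explicit scope should include additionalText as well.
        try container.encode(text, forKey: .text,
                             configuration: AttributeScopes.MyAppAttributes.self)
    }
}

let wrappedData = try JSONEncoder().encode(Wrapper(text: attStr))
print(String(data: wrappedData, encoding: .utf8) ?? "no data")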
I next tried to extend Color to make it codable like this:
extension Color: Codable {
    enum CodingKeys: CodingKey {
        case red, green, blue, alpha
        case desc
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        guard let cgColor = self.cgColor,
              let components = cgColor.components else {
            if description.isEmpty { throw CodingErrors.encoding }
            try container.encode(description, forKey: .desc)
            return
        }
        try container.encode(components[0], forKey: .red)
        try container.encode(components[1], forKey: .green)
        try container.encode(components[2], forKey: .blue)
        try container.encode(components[3], forKey: .alpha)
    }

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        if let description = try container.decodeIfPresent(String.self, forKey: .desc) {
            if description == "blue" {
                self = Color.blue
                return
            }
            throw CodingErrors.decoding
        }
        let red = try container.decode(CGFloat.self, forKey: .red)
        let green = try container.decode(CGFloat.self, forKey: .green)
        let blue = try container.decode(CGFloat.self, forKey: .blue)
        let alpha = try container.decode(CGFloat.self, forKey: .alpha)
        self.init(CGColor(red: red, green: green, blue: blue, alpha: alpha))
    }
}
But it looks like even though Color is now Codable, the encoding function does not get called when I pass my attributed string to the JSONEncoder.
So my next question is: Does it just not work? Or am I missing something here?
Coming to my last question: If JSONEncoder does not work, how would I store an AttributedString to disk?
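As for persisting, my working assumption (again unverified) is that wrapping the string in a Codable type via the CodableConfiguration property wrapper and writing the encoded data to a file would be the way to go. A sketch:
struct Document: Codable {
    // The property wrapper carries the custom scope through encode/decode.
    @CodableConfiguration(from: AttributeScopes.MyAppAttributes.self)
    var contents: AttributedString = AttributedString()
}

var document = Document()
document.contents = attStr

let documentData = try JSONEncoder().encode(document)
let fileURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("attributed-string.json")
try documentData.write(to: fileURL, options: .atomic)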
I am testing an App Clip on TestFlight to show the App Clip Card, but it only shows a white Card with the message: "This app clip is not currently available in your country or region" (if using Local Experiences, it shows normally).
I have fully set up the apple-app-site-association file and the App Clip Experience, and the Domain URL Status is validated ... I don't understand why. Could it be that the app is only newly "Ready for Sale", so the Card doesn't show yet? I want to let customers test the App Clip Card on TestFlight without using Local Experiences.
If anyone knows, please help, thank you.
I am trying to encode an AttributedString to JSON and then decode it back to an AttributedString. But when the AttributedString both (1) contains emoji, and (2) has any attributes assigned, the decoding seems to fail, producing a truncated AttributedString. By dump-ing the decoded value, I can see that the full string is still in there (in the guts property) but it is missing in normal uses of the AttributedString.
Below is an example that reproduces the problem.
import Foundation
// An arbitrary AttributedString with emoji
var attrString = AttributedString("12345💕☺️💕☺️💕☺️12345")
// Set an attribute (doesn't seem to matter which one)
attrString.imageURL = URL(string: "http://www.dummy.com/dummy.jpg")!
// Encode the AttributedString
var encoder = JSONEncoder()
encoder.outputFormatting = .prettyPrinted
let data = try! encoder.encode(attrString)
// Print the encoded JSON
print("encoded JSON for AttributedString:")
print(String(data: data, encoding: .utf8)!)
// Output from above omitted, but it looks correct with the full string represented
// Decode the AttributedString and print it
let decoder = JSONDecoder()
let decodedAttrString = try! decoder.decode(AttributedString.self, from: data)
print("decoded AttributedString:")
print(decodedAttrString)
// Output from above is a truncated AttributedString:
//
// 12345💕☺️ {
// NSImageURL = http://www.dummy.com/dummy.jpg
// }
print("dump of AttributedString:")
dump(decodedAttrString)
// Interestingly, `dump` shows that the full string is still in there:
//
// ▿ 12345💕☺️ {
// NSImageURL = http://www.dummy.com/dummy.jpg
// }
// ▿ _guts: Foundation.AttributedString.Guts #0
// - string: "12345💕☺️💕☺️💕☺️12345"
// ▿ runs: 1 element
// ...
//
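A possible workaround I'm considering, sketched here but not yet verified: archive the string via NSAttributedString and NSKeyedArchiver instead of Codable, so the JSON path is avoided entirely.
// Untested workaround sketch: round-trip through NSAttributedString archiving.
let nsVersion = try NSAttributedString(attrString, including: \.foundation)
let archived = try NSKeyedArchiver.archivedData(withRootObject: nsVersion,
                                                requiringSecureCoding: true)
if let unarchived = try NSKeyedUnarchiver.unarchivedObject(ofClass: NSAttributedString.self,
                                                           from: archived) {
    let roundTripped = try AttributedString(unarchived, including: \.foundation)
    print(roundTripped)
}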
Hi,
can I create a custom tool for PKToolPicker?
The documentation page for PKTool says: "Don't adopt this protocol in your own objects. Instead, create a tool object to provide users with the desired tool behavior."
Best regards,
Matej Klemen
Hi,
I'm trying out the beta for MusicKit. In the current version of my app, my widget can show multiple albums. I preload the images of the album covers. In the betas, the URL that is returned for the artwork starts with "musickit://", which does not work with URLSession. How can I preload the data using the new URL scheme?
Current code:
func fetchArtworkFor(musicID: MusicItemID, url: URL?) async throws -> UIImage? {
    guard let url = url else {
        return nil
    }
    let urlRequest = URLRequest(url: url)
    let data = try await URLSession.shared.data(for: urlRequest)
    let image = UIImage(data: data.0)
    return image
}
// Some other function
for album in albumsToShow {
    if let url = album.artwork?.url(width: context.family.imageHeight, height: context.family.imageHeight),
       let image = try? await fetchArtworkFor(musicID: album.id, url: url) {
        images[album] = image
    }
}
We have been having problems with our App Clip not working when shared through iMessage. The app and App Clip are published and work correctly when scanning a QR code that points to the URL https://www.coderus.com/locations?loc=1. However, if this same URL is shared through the iMessage app, a link to the website displays and not the App Clip card.
We have confirmed that:
AASA file is available and has the type application/json
Both devices are above iOS 14
Both devices are in each other's contacts
The website has the meta tag for the smart app clip banner
The website has a meta tag for the og:image
Launch experiences have been configured on AppStoreConnect - as said before, the QR codes work correctly
The link leads to a 404 page; I wasn't sure whether there needs to be an actual page at that URL, as App Clips seem to work fine without one when scanning the QR code through the Camera app.
I am trying to run my navigation app on a physical device and view it using CarPlay Simulator (from the Xcode Additional Tools, NOT Hardware > Display > CarPlay). However, when I try to use the app, the device has a red dot next to it and the simulator shows nothing.
What I've tried:
Running on a real CarPlay device (my car): the app works as intended, but I want to run the simulator so I can do live debugging
Forgetting the CarPlay device and reconnecting
All steps of "Troubleshooting CarPlay Simulator" (updating to the latest iOS, restarting the phone, turning off the hotspot, not being connected to any other CarPlay devices, ensuring the firewall allows incoming connections)
Tried both the Xcode 13 CarPlay Simulator and the Xcode 14 beta CarPlay Simulator
Tried both work and personal laptops/phones
Ideas:
I am running on an M1 laptop, which could be messing with something. I am also running Xcode in Rosetta (the app has packages that cannot compile without Rosetta), but I don't believe this should be a problem because I am running on a physical device, not the Xcode simulator.
I also can't run via Hardware > Display > CarPlay because of the error "Application does not implement CarPlay template application lifecycle methods in its scene delegate", and I can't figure out how to fix it ("EXCLUDED_ARCHS[sdk=iphonesimulator*]" = "arm64" does not work); a rough sketch of what I think that error wants is below.
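For reference, my understanding is that the error wants the CarPlay scene to have a delegate adopting CPTemplateApplicationSceneDelegate. A minimal sketch of what I believe it is asking for (the class name and template contents are illustrative, not my real code):
import CarPlay
import UIKit

// This class must be referenced from the CPTemplateApplicationSceneSessionRoleApplication
// entry of UIApplicationSceneManifest in Info.plist.
final class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    var interfaceController: CPInterfaceController?

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController
        // Give CarPlay a root template so it has something to display.
        let item = CPListItem(text: "Hello CarPlay", detailText: nil)
        let list = CPListTemplate(title: "MyApp", sections: [CPListSection(items: [item])])
        interfaceController.setRootTemplate(list, animated: true, completion: nil)
    }
}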
I have added the WeatherKit capability and enabled it in the developer account for the respective app id. I continually get this error. Any ideas on how to fix this?
Failed to generate jwt token for: com.apple.weatherkit.authservice with error: Error Domain=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors
App clip cards are not being displayed correctly according to the url prefixing documentation which states:
"The system then chooses the App Clip experience with the URL that has the most specific matching prefix."
This video also outlines the same strategy for invoking different app clip cards with a matching prefix (start video at 12:46).
I have the following two advanced app clip experiences associated with my app:
https://example.com/card1 -> opens correctly
https://example.com/card1/subcard1 -> opens the same card as above
Even though the second experience has a more specific URL, it still opens the App Clip experience for the shorter URL.
Both app clips were submitted over a week ago at the same time, so I don't believe it's a propagation issue.
I'm trying to use ScreenCaptureKit on a Mac Catalyst app, on macOS 12.5.1.
I'm not sure if I'm doing something wrong, but it crashes as soon as I try to request SCShareableContent. It crashes on internal code, calling a method it can't find, which makes me think this is a bug in the framework rather than incorrect configuration.
Any hints on how to work around this problem?
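For reference, this is roughly the call that triggers it (a simplified sketch using the async variant of the API):
import ScreenCaptureKit

func fetchShareableContent() async {
    do {
        // Crashes as soon as the daemon call is made on Mac Catalyst.
        let content = try await SCShareableContent.excludingDesktopWindows(false,
                                                                           onScreenWindowsOnly: true)
        print("displays: \(content.displays.count), windows: \(content.windows.count)")
    } catch {
        print("SCShareableContent failed: \(error)")
    }
}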
The crash is:
** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[RPDaemonProxy fetchShareableContentWithOption:windowID:withCompletionHandler:]: unrecognized selector sent to instance 0x6000037d5dc0'
terminating with uncaught exception of type NSException
ScreenCaptureKit-Crash.txt
My app is mostly implemented in UIKit. Will AppIntents work with UIKit? If so, which (scene or app) delegate method gets called to start the intent?
I'm working on an app accompanying a toy that allows you to drop a marble on a self-made track.
As a nice bonus I wanted to make it possible to drop a marble using Siri Shortcuts, Siri or the HomePod. So the new iOS 16 App Intents work great for this.
The App Intent documentation is sparse, but I got the App Intent to work, and it even shows a custom error message when something goes wrong.
However, I now want to promote the feature. SiriTipUIView is meant for this, but I'm seeing an issue: the application name is missing from the tip UI; instead, the phrase starts with a space.
The code for the App Shortcuts
struct MyAppShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts = [
        AppShortcut(intent: DropMarbleIntent(), phrases: [
            "\(.applicationName) drop marble",
            "\(.applicationName) drop a marble",
            "Drop a \(.applicationName)",
            "Drop \(.applicationName)"
        ])
    ]
}
The code for the SiriTipUIView (just for testing)
let tipView = SiriTipUIView()
tipView.setIntent(intent: DropMarbleIntent())
tipView.sizeToFitUsingConstraints()
tipView.allowsDismissal = true
presentedSubscription = tipView.publisher(for: \.isPresented).sink { isPresented in
    if isPresented == false {
        self.tableView.tableHeaderView = nil
    }
}
tableView.tableHeaderView = tipView
This happens on any iOS 16 simulator and on an iPhone 13 Pro running the iOS 16 release version.
Am I missing something, or should I report a bug using feedback?
Hello, has anyone encountered the issue of NSApplicationServices being invalid when uploading a TestFlight build?
We are facing an issue with our latest iOS build.
For context, we are trying to add support for Apple Watch connectivity with tvOS. After uploading our build, we get the following error:
Invalid Info.plist key. The key 'NSApplicationServices' in bundle myapp.app/Watch/watch.app is invalid.
However, the doc indicates that NSApplicationServices must be declared in the Info.plist file (source: https://developer.apple.com/documentation/devicediscoveryui/connecting_a_tvos_app_to_other_devices_over_the_local_network?changes=_1_7)
Dev environment:
Xcode v14.0 (14A309) to dev and archive
Deployment target: watchOS 6.0 & iOS 13.0
The watch app project is separated into a Watch App target and a Watch App Extension target, and is not a watchOS-only app.
Value of key NSApplicationServices in Watch App plist:
<key>NSApplicationServices</key>
<dict>
    <key>Advertises</key>
    <array>
        <dict>
            <key>NSApplicationServiceIdentifier</key>
            <string>MyAppConnectId</string>
        </dict>
    </array>
</dict>
We tried creating a new watch app with the NSApplicationServices key in the watch app plist, but it still doesn't work and produces the same error.
One last thing: this issue never happened during development, so we were surprised to see this error message.
FYI, the docs we are referring to:
https://developer.apple.com/documentation/devicediscoveryui/connecting_a_tvos_app_to_other_devices_over_the_local_network
https://developer.apple.com/documentation/bundleresources/information_property_list/nsapplicationservices/
Anyone who is facing this issue, please comment on the post or contact me. Thanks in advance!
let myE_mail = "whailong" + "2010" + "@" + "g" + "ma" + "il." + "com"
I'm using the AppIntents framework introduced in iOS 16. My goal is to create an AppIntent that performs a long-running task but does not open my app when run. When I run the intent from the Shortcuts app, I see an error message that says the shortcut "was interrupted because it didn't finish executing in time." Is there a way to signal progress to the user of a long-running AppIntent, or to get more time from the system before the AppIntent is cancelled?
I've requested the Family Controls (distribution) capability via:
https://developer.apple.com/contact/request/family-controls-distribution
and got approved.
I've now created new provisioning profiles with Family Controls checked in the identifiers and uploaded manually. Yet I still get:
Provisioning profile "redoAppStore" doesn't support the Family Controls capability.
Provisioning profile "redoAppStore" doesn't include the com.apple.developer.family-controls entitlement.
The Family Controls capability is added to my main target (iOS app) as well.
What should I do to get it uploaded?
Hello! I believe there is a bug: the ShieldConfigurationDataSource extension does not update when the app to be blocked is already open and ManagedSettingsStore.shield.applications is set to that app. The shield comes up but uses a stale ShieldConfiguration that does not reflect the current state of the app.
I've been able to replicate the issue in an independent app, "OffScreen". If you start a blocking time range from 10:00-10:15, it will say "No Twitter until 10:15"; then open Twitter at 10:15. If there is another blocking time range from 10:16-10:31, the app will stay open until 10:16, when the shield reactivates and still says "No Twitter until 10:15" when it should say "No Twitter until 10:31".
thanks!
Does anyone know if this is even supported at the moment? I've tried pairing a sample light application based on Silicon Labs' EFR32 (the SiliconLabs/matter module on GitHub), but it's unable to find the OTBR on the network. I'm also using the Raspberry Pi 4/RCP combination as directed by Silicon Labs' documentation. I have no problems pairing the devices to the OTBR directly through its chiptool interface, but Home isn't discovering the Thread network or showing me options to add the OTBR devices.
I'm currently running iOS 16.1 beta on my iPhone and I'm using a HomePod Mini which is on version 16.0 and has the HomeKit Matter Support profile installed.
I'm trying to migrate complications built with CLKComplication to WidgetKit.
I have implemented the required methods from https://developer.apple.com/documentation/widgetkit/converting-a-clockkit-app, but the migration is not working. There is no evidence that the migration method is even called.
It is the same with Xcode 14.0.1 and Xcode 14.1 RC.
class ComplicationController: NSObject, CLKComplicationDataSource, CLKComplicationWidgetMigrator {
    ...

    @available(watchOS 9.0, *)
    var widgetMigrator: CLKComplicationWidgetMigrator {
        return self
    }

    @available(watchOS 9.0, *)
    func widgetConfiguration(from complicationDescriptor: CLKComplicationDescriptor) async -> CLKComplicationWidgetMigrationConfiguration? {
        return CLKComplicationStaticWidgetMigrationConfiguration(kind: "MyWidget", extensionBundleIdentifier: "com.example.myapp.mywatchkitapp.mywidget")
    }
}
What's wrong? Has anyone been able to migrate?
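For completeness, the widget extension declares the target widget roughly like this (a sketch; as far as I understand, the parts that matter for migration are that the kind string matches what widgetConfiguration(from:) returns and that the extension bundle identifier matches the widget extension):
import WidgetKit
import SwiftUI

struct MyWidgetEntry: TimelineEntry {
    let date: Date
}

struct MyWidgetProvider: TimelineProvider {
    func placeholder(in context: Context) -> MyWidgetEntry { MyWidgetEntry(date: .now) }
    func getSnapshot(in context: Context, completion: @escaping (MyWidgetEntry) -> Void) {
        completion(MyWidgetEntry(date: .now))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<MyWidgetEntry>) -> Void) {
        completion(Timeline(entries: [MyWidgetEntry(date: .now)], policy: .never))
    }
}

struct MyWidget: Widget {
    var body: some WidgetConfiguration {
        // "MyWidget" must match the kind passed to
        // CLKComplicationStaticWidgetMigrationConfiguration above.
        StaticConfiguration(kind: "MyWidget", provider: MyWidgetProvider()) { entry in
            Text(entry.date, style: .time)
        }
        .supportedFamilies([.accessoryCircular, .accessoryRectangular, .accessoryInline])
    }
}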