
Reply to Image Playground Sheet Appears Blank First Time Presented?
It was more of a general question about whether the API is working correctly for others, which of course can be very useful information in itself: it's pretty common for Apple APIs, especially SwiftUI, to behave in strange ways and sometimes just not work at all. So that was the purpose - knowing it works fine for everyone else would be useful. That's what the 'just me' question means. But sure, here's the code. It's basically the same as the sample in the documentation, and since this modifier just uses a Bool state for presentation, posting it didn't seem useful. With the code below:

1. When the button is pressed, the sheet presents as expected.
2. The sheet is blank, i.e. totally white.
3. Wait a little bit to see what happens (nothing happens).
4. Dismiss the sheet with interactive pull-to-dismiss.
5. Tap the button again.
6. The sheet appears and the Image Playground content loads.

The code is as follows in case something does jump out. Maybe you'll see something that I do not.

import SwiftUI

#if canImport(ImagePlayground)
import ImagePlayground
#endif

@available(iOS 18.2, *)
struct TripCustomImageGenerationView: View {
    @Environment(\.supportsImagePlayground) private var supportsImagePlayground

    @State private var showImagePlayground: Bool = false
    @State private var data: Data?

    var body: some View {
        if supportsImagePlayground {
            Button("Generate Custom Image") {
                showImagePlayground = true
            }
            .imagePlaygroundSheet(isPresented: $showImagePlayground, concept: "Test Image Concept") { url in
                self.data = try? Data(contentsOf: url)
            }
        } else {
            EmptyView()
        }
    }
}
Dec ’24
Reply to Making onscreen content available to Siri not requesting my Transferable
Hi Ed, I haven't split it out into a test project yet, but that may be the next step. This is a complex app, which always makes the extraction a bit more of a process. One interesting note: I have this functionality allowing Siri to 'read' the screen in this way for two types of content. One is in the app's photo gallery, which is very similar to the sample code. That one works - when it shares with ChatGPT, it correctly identifies the type as 'photo', not 'screenshot', and my Transferable implementation is called. The second one, which is not working, uses the .reader.document schema, i.e. I want to share a text document with Siri, but instead it only wants to share a screenshot. Looking at the code, other than the schema type being different, the mechanics are basically the same, which is where I'm confused, and I was curious if anyone else had done this and could be helpful. Perhaps a sample is the next thing...
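To show what I mean by 'the mechanics', here is a minimal sketch of the onscreen-content wiring for the document case. The names (DocumentEntity, fullText, the activity type string) are placeholders rather than the real app's code, and the real entity also adopts the .reader.document assistant schema, which I've left out here to keep the sketch self-contained:

import AppIntents
import CoreTransferable
import SwiftUI

// Placeholder entity standing in for the real document model.
struct DocumentEntity: AppEntity, Transferable {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String
    var fullText: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // The Transferable content Siri should pick up instead of falling back to a screenshot.
    static var transferRepresentation: some TransferRepresentation {
        ProxyRepresentation(exporting: \.fullText)
    }
}

struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [DocumentEntity] { [] }
}

// Advertise the onscreen entity via NSUserActivity (iOS 18.2) so Siri can reference it.
struct DocumentView: View {
    let document: DocumentEntity

    var body: some View {
        Text(document.fullText)
            .userActivity("com.example.viewingDocument") { activity in
                activity.title = document.title
                activity.appEntityIdentifier = EntityIdentifier(for: document)
            }
    }
}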
Dec ’24
Reply to Guidance Implementing IndexedEntity and CSSearchableItemAttributeSet
After spending some time talking with DTS and reviewing the updated App Intents sample code located here https://developer.apple.com/documentation/appintents/acceleratingappinteractionswithappintents, I've got my answer. The sample provides one way to integrate Spotlight, by indexing a separate model and associating the app entity. What I discovered was that if you want to implement IndexedEntity, you can indeed provide an extended attribute set, you just need to start with the existing one, not create your own. Pretty simple:

extension HotelEntity: IndexedEntity {
    var attributeSet: CSSearchableItemAttributeSet {
        let existingAttributes = defaultAttributeSet // this is the key
        existingAttributes.displayName = "\(name) displayName"
        existingAttributes.title = "\(name) title"
        existingAttributes.domainIdentifier = "\(name) domainIdentifier"
        existingAttributes.identifier = "\(name) identifier"
        existingAttributes.contentDescription = "\(name) contentDescription"
        existingAttributes.namedLocation = "\(name) namedLocation"
        return existingAttributes
    }
}

With that, it's all working!
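For anyone following along, the entities still have to be handed to Spotlight for the attribute set above to take effect. A minimal sketch of that step, assuming the iOS 18 indexAppEntities API and a placeholder allHotels collection (not the real app's code):

import CoreSpotlight

// Index the app entities directly; Spotlight uses the customized attributeSet for each item.
// HotelEntity and allHotels are placeholders for illustration.
func indexHotels(_ allHotels: [HotelEntity]) async {
    do {
        try await CSSearchableIndex.default().indexAppEntities(allHotels)
    } catch {
        print("Failed to index hotels in Spotlight: \(error)")
    }
}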
Nov ’24