I'm trying to use the Background Assets framework in an iOS app. While debugging, I used the following command to trigger the installation event:
xcrun backgroundassets-debug -b <bundleID> -s --app-install -d <Device ID>
This command worked flawlessly on an iPhone.
However, when I attempted to trigger the installation event on a Mac, I encountered the following error message:
The requested device to send simulation events to is not available.
Verify that the device is connected to this Mac.
Note that the xcrun backgroundassets-debug -l command only lists connected devices; the Mac itself does not appear in that list.
WWDC23
Discuss the latest Apple technologies announced at WWDC23.
Posts under WWDC23 tag
55 Posts
Adding an inspector and toolbar to Xcode's app template, I have:
struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .padding()
        .toolbar {
            Text("test")
        }
        .inspector(isPresented: .constant(true)) {
            Text("this is a test")
        }
    }
}
In the preview canvas, this renders as I would expect:
However, when running the app:
Am I missing something?
(The relevant WWDC video is wwdc2023-10161; I couldn't add that as a tag.)
Rapidly tapping on a Button in an interactive widget bypasses the button's AppIntent action, and launches the host app instead.
I've filed a radar for this, but is there any known workaround for this behaviour?
This doesn't seem to happen with Apple's first-party app widgets.
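One partial mitigation sometimes tried while waiting on a fix is to guard the intent's perform() against rapid repeat invocations. Below is a minimal, hypothetical plain-Swift sketch of such a guard (TapDebouncer and the cooldown value are my own names, not an Apple API). Note it cannot stop the system from launching the host app; it only keeps a repeated invocation from double-firing the action:

```swift
import Foundation

/// Hypothetical guard: ignores repeat triggers that arrive within a
/// cooldown window. Call shouldFire() at the top of the intent's perform().
final class TapDebouncer {
    private var lastFire: Date?
    private let cooldown: TimeInterval

    init(cooldown: TimeInterval = 0.3) {
        self.cooldown = cooldown
    }

    /// Returns true (and records the time) only if the cooldown has elapsed
    /// since the last accepted trigger.
    func shouldFire(at now: Date = Date()) -> Bool {
        if let last = lastFire, now.timeIntervalSince(last) < cooldown {
            return false
        }
        lastFire = now
        return true
    }
}
```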
I'm trying to create Swift macros for initializers, but I can't get at the members of the inherited class.
In the video "The SwiftUI cookbook for focus", a key detail is left out.
https://developer.apple.com/videos/play/wwdc2023/10162/?time=1130
selectRecipe has no implementation shown, which leaves out a vital detail: how to handle up and down keyboard presses.
If a LazyVGrid has 4 items per row with the current shape of the window and the user presses the down key, how is the application supposed to know which item is directly underneath the currently focused one? Or if they press up and they need to know which is directly above? What happens when the user resizes the window and the number of items per row changes?
This would seem to require knowing the exact current layout of the window to return the correct recipe ID. The code provided isn't wrapped in a complex GeometryReader so I assume there's some magic I am missing here.
I am trying to create a similar LazyVGrid that can be navigated with the keyboard as with the recipes grid here but have no means of implementing .onMoveCommand in such a way that makes sense.
At the moment, SwiftUI seems to be intentionally built in such a way to defy all attempts to implement keyboard navigation.
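For what it's worth, the row/column arithmetic for arrow-key moves is straightforward once the current column count is known. Here's a minimal sketch assuming a row-major layout and a known columns value; targetIndex and MoveDirection are hypothetical helper names, not SwiftUI API:

```swift
enum MoveDirection { case up, down, left, right }

/// Given a flat item index in a grid laid out row-major with `columns`
/// items per row, return the index reached by a keyboard move, or nil
/// if the move would leave the grid.
func targetIndex(from index: Int, direction: MoveDirection,
                 columns: Int, count: Int) -> Int? {
    precondition(columns > 0 && index >= 0 && index < count)
    let candidate: Int
    switch direction {
    case .up:
        candidate = index - columns
    case .down:
        candidate = index + columns
    case .left:
        // Block the move off the left edge of a row.
        candidate = (index % columns == 0) ? -1 : index - 1
    case .right:
        // Block the move off the right edge of a row.
        candidate = (index % columns == columns - 1) ? count : index + 1
    }
    return (0..<count).contains(candidate) ? candidate : nil
}
```

The hard part the video glosses over is still knowing `columns` at the moment of the key press: with adaptive columns you would have to derive it from the current layout width, and recompute it whenever the window resizes.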
Is there a limit to the size of the object you can capture with Object Capture? For example, could it capture a horse or a similarly sized animal?
I'd love to play around with DockKit, but I didn't see anything mentioned about hardware. I'm assuming Apple isn't releasing their own motorized dock and haven't seen anything about how to get hardware recognized by the accessory manager.
I'd like to prototype a dock myself using an ESP32 and some stepper motors. I've already got this working with Bluetooth communication from iOS via CoreBluetooth, but I don't know if there are specific service and characteristic UUIDs that the system looks for to say it's compatible with DockKit.
Would really love to start playing with this, anyone got any insights on how to get up and running?
I'm trying to put together an app intent that allows a user to navigate to a specific part of my app.
I've built a basic intent, and set up an AppEnum with a case for each "screen" in my app a user should be allowed to navigate to (e.g. "All Posts", "Favourite Posts", etc.).
In addition, I'd like to include additional parameters based on the enum selected. For example, I'd like to include an enum case "Post" where a user can configure a specific post to navigate to.
This would mean I can have an enum of "All Posts", "Specific Post", "Favourite Posts" etc. which is cleaner than having a separate intent for "Open Specific Post"...
Is this possible? I can see ParameterSummaryBuilder, AppIntent.Switch etc. but there are no docs or examples using these.
Can you provide more information on whether this is possible, and show an example of Swift code to do this?
Thanks!
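Since AppEnum cases can't carry associated values, one common shape for this is a flat screen enum plus an optional parameter that is only meaningful for the "specific post" case; the ParameterSummaryBuilder Switch mentioned above looks like the mechanism intended to surface that parameter only when the matching case is selected. A minimal plain-Swift sketch of the parameter model (ScreenKind, NavigationRequest, and the route strings are hypothetical names, not AppIntents API):

```swift
/// Flat enum mirroring the shape of an AppEnum (String raw values,
/// CaseIterable, no associated values).
enum ScreenKind: String, CaseIterable {
    case allPosts, favouritePosts, specificPost
}

/// The intent's parameters: a screen plus an optional post identifier
/// that only matters for .specificPost.
struct NavigationRequest {
    var screen: ScreenKind
    var postID: String?

    /// Resolve to a route, failing when a post ID is required but missing.
    func route() -> String? {
        switch screen {
        case .allPosts:
            return "all-posts"
        case .favouritePosts:
            return "favourite-posts"
        case .specificPost:
            return postID.map { "post/\($0)" }
        }
    }
}
```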
Hi,
I am getting a linking error when building my app to run on an iOS 17 device using Xcode 15. The same project builds and runs fine with Xcode 14 and iOS 16. The linking error just says:
clang: error: unable to execute command: Segmentation fault: 11
clang: error: linker command failed due to signal (use -v to see invocation)
Not sure what I should try to overcome this. I can't run my app on an iOS17 device. It builds, links and runs just fine on a simulator.
I liked the TipKit presentation -- nice and short and to the point, great introduction!
All the code snippets were in SwiftUI. Will TipKit be available for regular UIKit / AppKit apps as well, or is it restricted to only being used within SwiftUI apps?
thanks
Is it possible to use an iPhone running iOS 17 with Xcode 14.3.1?
I tried the old method, but it didn't work. In Xcode 15's DeviceSupport folder, there is no folder for iOS 17.
There was no mention of whether the Vision Pro can be used outside. Several of the other AR/VR systems out there prohibit this (the sensors overload). Can the Vision Pro be used in sunlight?
Thanks!