Post not yet marked as solved
I'm playing around with the new Swift Charts. So far everything is rather straightforward, but I'm looking for a way to change the time frame/scope of the chart. For example, recreating the Heart Rate chart in the Health app: in other words, only showing a subsection of the data and being able to pan the x-axis left/right to reveal more data.
I know it's early, but anyone have any ideas how that might be done?
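One idea (a sketch only, not an official pattern; the HeartRateSample type and the points-to-seconds scale factor are my own assumptions) is to keep the full data set in memory, show only a time window of it, and shift that window with a DragGesture:

```swift
import SwiftUI
import Charts

struct HeartRateSample: Identifiable {
    let id = UUID()
    let date: Date
    let bpm: Double
}

struct PannableChart: View {
    let samples: [HeartRateSample]          // full data set
    @State private var windowStart: Date    // left edge of the visible window
    @State private var dragAnchor: Date?    // windowStart captured when a drag begins
    let windowLength: TimeInterval = 3600   // one hour visible at a time

    init(samples: [HeartRateSample]) {
        self.samples = samples
        _windowStart = State(initialValue: samples.first?.date ?? .now)
    }

    var body: some View {
        // Only the samples inside the current window are plotted.
        Chart(samples.filter {
            $0.date >= windowStart &&
            $0.date <= windowStart.addingTimeInterval(windowLength)
        }) { sample in
            LineMark(x: .value("Time", sample.date),
                     y: .value("BPM", sample.bpm))
        }
        .gesture(
            DragGesture()
                .onChanged { value in
                    if dragAnchor == nil { dragAnchor = windowStart }
                    // Assumed scale: the visible window maps to roughly 300pt of chart width.
                    let secondsPerPoint = windowLength / 300
                    windowStart = dragAnchor!
                        .addingTimeInterval(-value.translation.width * secondsPerPoint)
                }
                .onEnded { _ in dragAnchor = nil }
        )
    }
}
```

Dragging left advances the window, dragging right rewinds it; you could clamp windowStart to the data's date range to avoid panning into empty space.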
Excellent work on Passkeys.
For context, I’m soon to release a Password Manager app that is built specifically for Apple devices only (iOS, iPadOS, macOS). A user’s vault items are encrypted on their own device and synced end-to-end encrypted via their own private iCloud database. As you’d expect, the app requires the user to enter their master password to unlock their vaults, and allows them to optionally enable Touch or Face ID for a passwordless unlock experience.
In this scenario where there is no third-party server involved, and auth takes place on-device only, is there any meaningful way an app like this can or should take advantage of Passkeys?
The only thing I can think of so far would be to allow the user to unlock their vault with a passkey instead of a master password. But aside from the UX convenience for the user, I'm not entirely sure whether there would be any major security advantage in doing so over the app's existing auth/unlock flow.
I have compiled my iOS app for both the simulator and a device. When I click 'Recent Build Timeline' in Xcode 14 Beta 1, nothing happens and no error is shown.
Do I need to enable or set up something else to get it to work?
I remember there's a message at the end of every session video saying that you cannot modify the video. So, if I trim the video and send it to my friend, is that allowed?
Bonus: where can I find the background music for every daily debrief video (if I can)?
Hello! I am starting to dig into the docs on object and mesh shaders. I see that the Metal API on the CPU side has new functions for setting object and mesh buffers in the new programmable stage. But I don't see corresponding changes to the API for MTLIndirectCommandBuffer. Will we be able to use the GPU to encode draw commands using a pipeline that leverages the new shader types?
Thanks,
All of my builds get stuck on the Archive action. It keeps running forever (10+ hours, where a clean build previously took about 30 minutes) and never finishes, despite all subtasks showing as finished (green check).
This started happening on a workflow that had worked reliably for months, right after WWDC22 started. Is there a problem with the new version of Xcode Cloud?
Exciting to see the new medication tracking features coming to Health. As a medical app developer, we're wondering whether read/write access to these is coming to HealthKit.
We're currently building prescription management into our app so it would be great to be able to let users also add these to Health to handle their adherence tracking.
I found out that, for some reason, /Library/Developer/CoreSimulator/Volumes/watchOS_20R5287q/Library/Developer/CoreSimulator/Profiles/Runtimes/watchOS 9.0.simruntime/Contents/Resources/RuntimeRoot/usr/lib/swift/libswiftCloudKit.dylib is not a file but a symlink to /System/Library/Frameworks/CloudKit.framework/CloudKit, which is a folder, not a file. As a result, any app that uses CloudKit crashes instantly after launch on my watchOS 9 and iOS 16 simulators.
On the watchOS 8.5 and iOS 15.5 simulators everything works. Can you confirm whether this is a known bug? I tried redownloading the simulators, but the result is the same.
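For anyone wanting to reproduce the check, a quick diagnostic sketch (the path is the one from the post; adjust to your runtime version) that confirms whether the dylib is a symlink resolving to a directory rather than a Mach-O file:

```swift
import Foundation

let path = "/Library/Developer/CoreSimulator/Volumes/watchOS_20R5287q/Library/Developer/CoreSimulator/Profiles/Runtimes/watchOS 9.0.simruntime/Contents/Resources/RuntimeRoot/usr/lib/swift/libswiftCloudKit.dylib"
let fm = FileManager.default

// attributesOfItem(atPath:) does not traverse the final symlink,
// so it tells us whether the path itself is a link.
if let attrs = try? fm.attributesOfItem(atPath: path),
   attrs[.type] as? FileAttributeType == .typeSymbolicLink,
   let dest = try? fm.destinationOfSymbolicLink(atPath: path) {
    var isDir: ObjCBool = false
    if fm.fileExists(atPath: dest, isDirectory: &isDir), isDir.boolValue {
        print("Broken runtime: \(path) is a symlink to the directory \(dest)")
    }
}
```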
As a follow-up to a Networking lab conversation I had on Tuesday I have the following question:
Is there a way to use nscurl to connect to a server that requires client authentication by providing a client certificate? None of the documented options seem to allow that, but maybe there is an undocumented one...
Voice Isolation does a great job with noise suppression when the user is holding the phone in hand (the FaceTime use case). But when the phone is about 4 feet away from the user, Voice Isolation quality drops substantially, and we are seeing that it is better not to use it.
Our use case requires the user to mount the phone on a tripod and sit approximately 4 feet away from the camera. In this case we see the worst performance from Voice Isolation, presumably because of heavy signal processing on a weaker original signal to begin with.
I'm in the process of building a SwiftUI app with a Sidebar and Detail View. I've run into some issues and I need some help fixing them:
When the app is launched, my detail view shows up to the right of the sidebar. Great! However, the button that is supposed to navigate to that view isn't highlighted, which could cause user confusion. How do I make this button "light up" (with the blue background indicating to the user that it is selected) and write the code so that this view opens when the app launches (like in other Apple apps; see Music and Files)?
When I click on one of my sidebar items (which are just NavigationLinks), the background doesn't turn blue (highlight) to indicate that the item is selected. I feel like this would cause user confusion, and I'd like to figure out why my code doesn't do this. One side note and a useful piece of information: whenever I click a NavigationLink in the sidebar, the console prints this: onChange(of: UpdateTrigger) action tried to update multiple times per frame.
In the macOS app, upon launch, the detail view shows up. Great. What doesn't show up is my sidebar with the options in it. How do I make the sidebar show up no matter what, unless the user specifically hides it?
Attached are some images and my code. Thanks, y'all!
struct ContentView: View {
    var body: some View {
        // NavigationSplitView for the sidebar
        NavigationSplitView {
            // List of the options
            List {
                // NavigationLink for my button
                NavigationLink {
                    // Linking to my main view that I want to show up upon launch of the app
                    MainView()
                } label: {
                    // Label to make it look pretty
                    Label("Main View", systemImage: "icloud")
                }
            }
            // Make the sidebar actually look like a sidebar
            // (this modifier belongs on the List, not on the NavigationLink)
            .listStyle(.sidebar)
            // NavigationTitle at the top of the sidebar
            .navigationTitle("Cling")
        } detail: {
            // Make it so that the main screen shows up upon launch (however, this doesn't make the button light up to signify that the user has selected that view)
            MainView()
        }
    }
}
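For the highlighting questions, one approach that fits the iOS 16/macOS 13 APIs is to drive the sidebar with List(selection:) and value-based NavigationLinks; pre-seeding the selection also makes the first row appear selected at launch. A sketch (SidebarItem is an assumed name; MainView comes from the post):

```swift
import SwiftUI

enum SidebarItem: Hashable {
    case main
}

struct ContentView: View {
    // Pre-select the main item so its row is highlighted at launch.
    @State private var selection: SidebarItem? = .main

    var body: some View {
        NavigationSplitView {
            // Binding the List's selection is what makes rows highlight
            // and stay highlighted when tapped.
            List(selection: $selection) {
                NavigationLink(value: SidebarItem.main) {
                    Label("Main View", systemImage: "icloud")
                }
            }
            .listStyle(.sidebar)
            .navigationTitle("Cling")
        } detail: {
            // The detail column is derived from the selection,
            // so it shows MainView immediately on launch.
            switch selection {
            case .main:
                MainView()
            case nil:
                Text("Select an item")
            }
        }
    }
}
```

The key difference from closure-based NavigationLinks is that NavigationLink(value:) participates in the List's selection, which is what produces the persistent highlight.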
What is the correct approach to save the image (e.g. CVPixelBuffer to PNG) obtained after calling the captureHighResolutionFrame method?
The documentation says: "ARKit captures pixel buffers in a full-range planar YCbCr format (also known as YUV) according to the ITU R. 601-4 standard."
Should I change the color space of the image (YCbCr to RGB using Metal)?
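For a simple PNG export, one commonly used route (a sketch, not necessarily the only correct approach) is to let Core Image do the YCbCr-to-RGB conversion rather than writing Metal code yourself; CIContext handles the pixel format of ARKit's buffers:

```swift
import CoreImage
import ImageIO
import UniformTypeIdentifiers

// Sketch: convert an ARKit-captured CVPixelBuffer to a PNG file.
// CIContext performs the YCbCr -> RGB conversion internally.
func writePNG(from pixelBuffer: CVPixelBuffer, to url: URL) throws {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        throw CocoaError(.fileWriteUnknown)
    }
    guard let dest = CGImageDestinationCreateWithURL(
        url as CFURL, UTType.png.identifier as CFString, 1, nil
    ) else {
        throw CocoaError(.fileWriteUnknown)
    }
    CGImageDestinationAddImage(dest, cgImage, nil)
    guard CGImageDestinationFinalize(dest) else {
        throw CocoaError(.fileWriteUnknown)
    }
}
```

You would pass frame.capturedImage from the ARFrame returned by captureHighResolutionFrame. Manual Metal conversion is only needed if you want full control over the color pipeline or real-time performance.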
Hi all,
Like many others, I accidentally installed Ventura on my M1 Max MBP, but when I attempt to log in on Ventura, the progress bar slowly moves as if trying to log in, and then the system reboots. It does so every time, so I can't log in. I tried booting into Safe Mode, but the same problem occurs, and I tried reinstalling Ventura and it still has the same problem.
This is on my M1 Max MBP w/32GB/4TB. I think the problem may have occurred because I didn't have enough disk space for the Ventura update (it popped up a warning), but after I dismissed the warning it went ahead and installed anyway. So I don't know if that prevented some of the necessary files from installing/updating or if the problem is related to something else.
I also tried to Restore from Time Machine using the Recovery Mode, and I am trying to restore from a snapshot on Macintosh HD (which is encrypted). But selecting Macintosh HD and then clicking Unlock does nothing. I am able to successfully unlock the disk while using Disk Utility and to Reinstall Ventura but Restore from Time Machine doesn’t want to cooperate.
Is there a way to create another User Account via the Terminal? So I can try logging in with a clean account created by Ventura?
Any suggestions or ideas would be much appreciated.
With WebXR and event exchange maybe coming next summer, would you please consider adding Live Text to ARQL?
Currently an interactive AR scene cannot redirect the user to any webpage other than the originating one, and only a single tapToPay event can be sent at the end of an ARQL session, without any reasonable payload.
I've come up with an in-house workflow to build interactive USDZ archives for product configuration that dynamically show/update an https URL representing the user's choice, which forwards you to an order form / webshop.
A sample project can be evaluated here:
https://kreativekk.de/Swivel.html
Currently users have to take a screenshot of their configuration, including this URL, and switch over to the Photos app to let it recognize the link and offer to open it.
Continuity Camera is a way to stream raw video and metadata from iPhone to Mac. Is it possible for a local recording app on iPhone to use Continuity Camera to stream a preview from iPhone to Mac?
Can Continuity Camera be made available on iPad, so that one can stream video/metadata to the iPad screen (the use case being a need to use a better camera when the user does not have a MacBook)?
Post marked as Apple Recommended
Short question: Is it possible to disable iCloud Keychain synchronization of passkeys on demand?
This would ensure and allow device-specific binding where necessary.
I was wondering if Instruments provides a way to visualize tree-like data structures (e.g. a view hierarchy) when building a custom plugin. The last time I checked, none of the built-in detail view types (lists, aggregations, call trees, narrative, and Time Slice styles) seemed to be a good fit for this.
Looking at this year's "Visualize and optimize Swift concurrency" session, there's a mention of a new set of instruments that use a detail UI ("Task force") close to what would be ideal for my use case, but I'm not sure whether this is available to third-party developers, nor where to find the documentation about it.
Thanks!
Just a general question: is it allowed to join an online lab together with a colleague, or must it be one person only?
If I was accepted for a lab session, can a person from my team (same developer account) join the WebEx meeting for the lab session?
I noticed that this year the Developer app is not letting me play the videos in full-screen mode. I have confirmed this on a 2019 16-inch MacBook Pro and a 2021 11-inch iPad Pro. Is anyone else seeing this issue?