Search results for "Visual Studio Maui IOS"

105,723 results found

Post · Replies · Boosts · Views · Activity
Custom USB Network Device Driver on iPhone
Hi, We are using the AX88772C as a USB-to-Ethernet bridge in a product we are developing. Because the chip does not follow the NCM protocol, it is not supported by the default networking drivers on the iPhone. Initially, we intended to use DriverKit to develop a userspace driver for this device. However, we have been informed that DriverKit is only available on iPadOS, not iOS. Given that, we have found two possible alternatives: the first is IOKit, and the second is an External Accessory session (EASession). What are the limitations of each of these options? We need the ability to send and receive USB packets to control and bulk endpoints. Is this possible with either of the options defined above? Would either of these options require the device to be MFi certified? We have read that some APIs within IOKit require the Apple device to be jailbroken. Is there a list of features that can be used without a jailbroken device? Documentation on these two options is limited, so any official documentation
Replies: 5 · Boosts: 0 · Views: 129 · Activity: 3w
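For reference, a minimal sketch of the EASession route mentioned above. As far as I know, EASession requires an MFi-certified accessory and exposes only a serial byte stream tunneled by the accessory firmware, not raw access to control and bulk endpoints; the protocol string "com.example.usbbridge" is a hypothetical placeholder that would also need to be listed under UISupportedExternalAccessoryProtocols in the app's Info.plist.

import ExternalAccessory

// Sketch only: opens an EASession to the first connected accessory that
// declares a hypothetical protocol string, then wires up its streams.
final class AccessoryLink: NSObject, StreamDelegate {
    private let protocolString = "com.example.usbbridge" // placeholder
    private var session: EASession?

    func open() {
        guard let accessory = EAAccessoryManager.shared().connectedAccessories
            .first(where: { $0.protocolStrings.contains(protocolString) }) else { return }

        session = EASession(accessory: accessory, forProtocol: protocolString)
        let streams: [Stream?] = [session?.inputStream, session?.outputStream]
        for stream in streams.compactMap({ $0 }) {
            stream.delegate = self
            stream.schedule(in: .current, forMode: .default)
            stream.open()
        }
    }

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        if eventCode == .hasBytesAvailable, let input = aStream as? InputStream {
            var buffer = [UInt8](repeating: 0, count: 1024)
            let count = input.read(&buffer, maxLength: buffer.count)
            print("received \(count) bytes") // whatever the firmware forwarded
        }
    }
}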
AlarmKit plays system error tone instead of custom sound files (iOS 26.0)
AlarmKit custom sounds are universally broken in iOS 26.0 stable: instead of playing your custom sound, it plays a system error/timeout beep. I've spent days investigating why custom sounds result in what sounds like an error beep (like when you cancel an operation or hit a timeout) instead of the actual audio file. I can now prove this is an Apple bug, not an implementation error. Evidence:
Test 1: My implementation. Followed Apple's documentation exactly; tried both the bundle and Library/Sounds (as documented). Result: system error beep (not my audio).
Test 2: Professional apps. Tested ADHDAlarms (popular AlarmKit example by jacobsapps) https://github.com/jacobsapps/ADHDAlarms. Their airhorn.mp3 custom sound: same error beep (not an airhorn). Their default sound: works perfectly.
Test 3: Device testing. Physical iPhone (iOS 26.0 - 23A341): broken. iOS Simulator: broken. Not device-specific.
Files are found correctly, but the actual audio file is never played. Instead, you hear what sounds like a s
Replies: 3 · Boosts: 0 · Views: 91 · Activity: 3w
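As an aside, a Foundation-only sketch for double-checking the "files are found correctly" claim above, probing the two locations the post mentions; "airhorn" is the example file from the linked ADHDAlarms project, and nothing here touches AlarmKit itself.

import Foundation

// Returns the URL of a custom alarm sound if it is discoverable in either
// the app bundle or Library/Sounds, the two documented locations.
func locateSound(named name: String, ext: String = "mp3") -> URL? {
    if let bundled = Bundle.main.url(forResource: name, withExtension: ext) {
        return bundled
    }
    let library = FileManager.default.urls(for: .libraryDirectory, in: .userDomainMask)[0]
    let candidate = library.appendingPathComponent("Sounds/\(name).\(ext)")
    return FileManager.default.fileExists(atPath: candidate.path) ? candidate : nil
}

// locateSound(named: "airhorn") returning a URL while the alarm still plays
// the error beep would support the post's conclusion that lookup succeeds
// but playback fails.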
Reply to Safe areas ignored after navigating a WebView/WebPage back in a NavigationStack
Yes, I'm having the same problem (with a wrapped WKWebView, but presumably also the new WebView). The jumping-on-back-button issue was also a problem before iOS 26, but you could work around it by using .ignoresSafeArea(.all, edges: .bottom). However, in iOS 26 Apple wants the content to flow behind the toolbar, meaning the above fix is no longer suitable. Did you manage to find a workaround?
Topic: Safari & Web · SubTopic: General · Activity: 3w
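For context, a minimal sketch of the setup this reply describes: a wrapped WKWebView plus the pre-iOS 26 bottom-edge workaround. The view names and URL are illustrative.

import SwiftUI
import WebKit

// Minimal wrapped WKWebView, as mentioned in the reply.
struct WrappedWebView: UIViewRepresentable {
    let url: URL

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.load(URLRequest(url: url))
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {}
}

struct PageView: View {
    var body: some View {
        WrappedWebView(url: URL(string: "https://example.com")!)
            // The pre-iOS 26 workaround from the reply; under iOS 26 it
            // conflicts with content flowing behind the toolbar.
            .ignoresSafeArea(.all, edges: .bottom)
    }
}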
Reply to I Need some clarifications about FoundationModels
Happy to help answer some of these!
Q: Is the language model provided by FoundationModels designed and trained by Apple? Or is it based on an open-source model?
A: Yes. The Foundation Models framework uses the on-device system foundation model. It's an ~3 billion parameter model designed and trained by Apple.
Q: Is this on-device model available on iOS (and iPadOS), or is it limited to macOS?
A: The exact same on-device model is available on iOS, iPadOS, macOS, and visionOS on all devices that support Apple Intelligence.
Q: When I write code in Xcode, is code completion powered by this same local model? If so, why isn't the same model available in the left-hand chat sidebar in Xcode (so that I can use it there instead of relying on ChatGPT)?
A: No. The general on-device model you access via Foundation Models is not a coding model and isn't suitable for most code tasks. Xcode on Mac has a separate built-in coding model to help with code completion.
Q: Can I grant this local model access to my personal data (photos, contacts, SMS, emails) so it can answer questions based on that information?
A: Yes and no. You can give the on-device model access to that ki
Activity: 3w
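For anyone skimming, a minimal sketch of what using this on-device model looks like with LanguageModelSession; the prompt text is illustrative.

import FoundationModels

// One-shot request to the on-device system model discussed above.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize in one sentence: \(text)")
    return response.content
}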
Reply to App Clips don't work
We have the same issue. The App Clip was working well before, but with iOS 26 we get the error "This operation couldn't be completed. (ASDErrorDomain error 507.)". The App Clip works when run via TestFlight, but not when opened via a URL, so it seems the error occurs at App Clip invocation. Any help to fix the issue is appreciated.
Topic: UI Frameworks · SubTopic: General · Activity: 3w
Reply to iOS 26 beta: AppClips are not working properly
We have the same issue. It was working well with iOS 18, but with iOS 26 we get the error "This operation couldn't be completed. (ASDErrorDomain error 507.)". The App Clip works when run via TestFlight, but not when opened via a URL, so it seems the error occurs at App Clip invocation. Any help to fix the issue is appreciated.
Topic: App & System Services · SubTopic: General · Activity: 3w
Reply to Supervised devices show wifi setup screen on restart
This behavior still exists in iOS 26.0.1 and iPadOS 26.0.1, even after upgrading the Apple Configurator host to macOS Tahoe 26.0.1. On every restart, "Software Update Complete" is displayed, along with "Your iPad has been updated to iPadOS 26.0.1." or "Your iPhone has been updated to iOS 26.0.1.", followed by the Welcome screen with the Get Started button. (Upgrading the Apple Configurator host to macOS Tahoe 26.0.1 did, however, fix the blank Profiles bug in Apple Configurator version 2.18 10A23.)
Activity: 3w
UIScene-based state restoration on tvOS not working
I can’t get UIScene-based state restoration to work on tvOS as it does on iOS. UISceneSession.stateRestorationActivity is always nil in UIWindowSceneDelegate.scene(_:willConnectTo:options:) despite UIWindowSceneDelegate.stateRestorationActivity(for:) being called in the previous lifecycle. The NSUserActivityType is correctly configured in the Info.plist. Has anyone encountered the same issue or knows how to get this to work? Sample Project https://github.com/antiraum/tvosSceneStateRestoration Feedback FB20451479
Replies: 1 · Boosts: 0 · Views: 55 · Activity: 3w
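For reference, a sketch of the restoration pattern the post describes (the linked sample project is the authoritative reproduction); the activity type and userInfo payload are placeholders.

import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // Called when the scene disconnects; on iOS the returned activity comes
    // back via session.stateRestorationActivity on the next launch.
    func stateRestorationActivity(for scene: UIScene) -> NSUserActivity? {
        let activity = NSUserActivity(activityType: "com.example.restoration") // placeholder
        activity.userInfo = ["selectedTab": 1]
        return activity
    }

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        // Per the post, this is always nil on tvOS even though the
        // method above was called in the previous lifecycle.
        if let activity = session.stateRestorationActivity {
            print("restoring from \(activity.activityType)")
        }
    }
}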
iOS 26 VoiceOver is reporting an extra tab
Feedback number: FB20451665. When building with Xcode 26, VoiceOver reports an extra tab when swiping through tabs. Please see the sample project below:

/* This is a sample project to show that I believe there is a VoiceOver bug in iOS 26. When swiping through tabs with VoiceOver active, there always appears to be an extra tab. Here I have 5 tabs; when on tab one, VoiceOver reads out "tab 1 of 6", then "tab 2 of 6", all the way to the last tab, where it reads out "tab 5 of 6", never "tab 6 of 6". Is there a possibility that VoiceOver is picking up the underlying `more` tab and reading that out? This has also been reportedly found in the Files app here: https://www.applevis.com/comment/195441#comment-195441 */
struct ContentView: View {
    var body: some View {
        TabView {
            /// Activating this has VoiceOver telling us there are 6 tabs.
            Tab(RootTab.home.title, systemImage: "circle.fill") {
                Text("This is the \(RootTab.home.title.capitalized) screen")
            }
            .accessibilityLabel("\(RootTab.home.title.capitalized) tab")
            .acc
Replies: 5 · Boosts: 0 · Views: 2.0k · Activity: 3w
Reply to MapKit JS Look Around not pointing camera towards the lat/lng entered
That sample code doesn't show how to create a place from a lat/lng or look up a place by address, so I'm not sure it would help my use case. Even though a lat/lng is a 2D point with no heading information, the point where the camera is versus the requested point defines a line the camera should point along. The Apple Maps app does this automatically. We're also using Look Around in our iOS app, and the heading there is calculated automatically from the same lat/lng we're providing to MapKit JS. (Screenshots below: native app Look Around; MapKit JS Look Around.)
Activity: 3w
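To make the geometry in this reply concrete, a small sketch deriving the initial bearing from the camera's placement point to the requested point. MapKit JS itself is JavaScript, so this Swift version only illustrates the math the native side presumably applies, not any MapKit JS API.

import CoreLocation

// Initial great-circle bearing from `start` to `end`, in degrees
// clockwise from true north (0 = north, 90 = east).
func bearing(from start: CLLocationCoordinate2D,
             to end: CLLocationCoordinate2D) -> CLLocationDirection {
    let lat1 = start.latitude * .pi / 180
    let lat2 = end.latitude * .pi / 180
    let dLon = (end.longitude - start.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let degrees = atan2(y, x) * 180 / .pi
    return (degrees + 360).truncatingRemainder(dividingBy: 360)
}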
What’s the best way to improve my app’s rating and get more positive reviews?
My iOS app currently holds a 3.5★ rating with limited reviews, and I’d like to raise it by motivating happy users to share feedback. I’m looking for ethical ways to do this without being pushy. What are the best strategies and timing for review prompts to boost ratings while keeping users satisfied?
Replies: 4 · Boosts: 0 · Views: 202 · Activity: 3w
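For what it's worth, the standard ethical mechanism here is StoreKit's system review prompt, which the OS rate-limits and which App Review guidance points to for in-app rating requests. A minimal SwiftUI sketch, where the export-count milestone is an illustrative placeholder for a happy-path moment:

import SwiftUI
import StoreKit

struct ExportCompleteView: View {
    @Environment(\.requestReview) private var requestReview
    @AppStorage("exportCount") private var exportCount = 0

    var body: some View {
        Text("Export finished")
            .onAppear {
                exportCount += 1
                // Ask only after a clearly positive milestone, never on launch;
                // the system decides whether the dialog actually appears.
                if exportCount == 3 {
                    requestReview()
                }
            }
    }
}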
I Need some clarifications about FoundationModels
Hello, I'm experimenting with Apple's on-device language model via the FoundationModels framework in Xcode (using LanguageModelSession in my code). I'd like to confirm a few points:
• Is the language model provided by FoundationModels designed and trained by Apple? Or is it based on an open-source model?
• Is this on-device model available on iOS (and iPadOS), or is it limited to macOS?
• When I write code in Xcode, is code completion powered by this same local model? If so, why isn't the same model available in the left-hand chat sidebar in Xcode (so that I can use it there instead of relying on ChatGPT)?
• Can I grant this local model access to my personal data (photos, contacts, SMS, emails) so it can answer questions based on that information? If yes, what APIs, permission prompts, and privacy constraints apply?
Thanks
Replies: 3 · Boosts: 0 · Views: 538 · Activity: 3w
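Related to the second bullet: a minimal sketch for checking at runtime whether the on-device model is usable on a given device, assuming the SystemLanguageModel availability API from the Foundation Models framework.

import FoundationModels

// Reports whether the on-device system model can be used here; it is
// gated on Apple Intelligence support and setup, not on the OS alone.
func modelStatus() -> String {
    switch SystemLanguageModel.default.availability {
    case .available:
        return "ready"
    case .unavailable(let reason):
        return "unavailable: \(reason)"
    @unknown default:
        return "unavailable"
    }
}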