We operate a social network application, SportsYou, with over 3 million monthly active users, and we are experiencing significant issues with push notification delivery through APNs. A large number of users report that they are not receiving push notifications. Our infrastructure uses AWS SNS integrated with APNs to deliver notifications. However, AWS CloudWatch consistently reports successful delivery (a Success response) even though users confirm they never received the notifications. Because we receive success responses from AWS SNS, our system does not attempt to recreate or refresh the device endpoints, which leaves us unable to detect or recover from these delivery failures automatically. The issue is widespread and inconsistent: it affects users across different iOS versions, device models, and versions of our application, and we cannot identify a clear pattern that would help us isolate the root cause. With millions of active users, even a small pe
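For anyone debugging a similar symptom, note that an SNS publish "Success" only means SNS accepted the message, not that APNs delivered it; stale device tokens are a common contributor. Below is a minimal client-side sketch, not taken from the post above, that re-registers for remote notifications on every launch and sends the current token to the backend so stale endpoints can be refreshed. The TokenUploader helper and its behaviour are hypothetical placeholders.

import UIKit
import UserNotifications

enum TokenUploader {
    // Hypothetical placeholder: in a real app this would POST the token to
    // your server, which then updates the matching SNS platform endpoint.
    static func upload(_ token: String) {
        print("would upload token: \(token)")
    }
}

final class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, _ in
            guard granted else { return }
            DispatchQueue.main.async {
                // Re-register on every launch; the APNs token can change over time.
                application.registerForRemoteNotifications()
            }
        }
        return true
    }

    func application(_ application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
        // Convert the token to hex and ship the latest value to the backend.
        let token = deviceToken.map { String(format: "%02x", $0) }.joined()
        TokenUploader.upload(token)
    }

    func application(_ application: UIApplication,
                     didFailToRegisterForRemoteNotificationsWithError error: Error) {
        print("APNs registration failed: \(error)")
    }
}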
Search results for "Visual Studio Maui IOS" (105,699 results found)
Hey there, I just upgraded to macOS Tahoe on an Apple MacBook Pro 2019 16-inch. I'm using IntelliJ IDEA and Flutter to develop a mobile app, which I test in the Simulator app running iOS 18.4. The issue: when I start the Simulator app (both during the loading phase and while it is running), the audio from an already open YouTube tab in Safari (this happens in Chrome as well) glitches and becomes noise. A fix I found online is to kill the audio daemon on macOS, which works using the command: sudo killall coreaudiod. This kills the audio process (while the simulator is running), macOS then restarts the audio daemon, and the audio works fine alongside the open simulator. I just want to ask: is there a permanent fix for this? Is Apple working on a fix in an upcoming update?
[quote='861033022, DTS Engineer, /thread/802846?answerId=861033022#861033022'] I’m still researching the exact details of those limitations [/quote] Hey hey, that went quicker than I expected. As things currently stand on iOS 26, an app can only host extensions that it contains. Needless to say, this significantly undermines the utility of the ExtensionKit. While it’s possible that you might find a creative use for it, there’s one specific situation where it’s super useful, namely, using an extension to host code that is either unreliable or deals with untrusted data. For more on that last point, see Creating enhanced security helper extensions. There’s obviously a lot of demand from third-party developers to broaden the scope of ExtensionKit on iOS. If you have a specific use case in mind, feel free to file an enhancement request with the details. That’s particularly important if your use case is limited in some way. As I mentioned above, we already have an ER for the sort of general suppor
Topic: App & System Services
SubTopic: Processes & Concurrency
My iOS app currently holds a 3.5★ rating with limited reviews, and I’d like to raise it by motivating happy users to share feedback. I’m looking for ethical ways to do this without being pushy. What are the best strategies and timing for review prompts to boost ratings while keeping users satisfied?
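Not an official recommendation, just a sketch of the usual mechanism: StoreKit's review request API rate-limits itself, so the main lever is choosing a moment of success rather than prompting on launch. The completedWorkouts counter, its threshold, and the view name below are made-up illustration values, not taken from the post above.

import SwiftUI
import StoreKit

struct WorkoutSummaryView: View {
    // System-provided action; the OS decides whether a prompt is actually shown
    // and enforces its own display limits.
    @Environment(\.requestReview) private var requestReview

    // Hypothetical "moment of success" counter persisted across launches.
    @AppStorage("completedWorkouts") private var completedWorkouts = 0

    var body: some View {
        Button("Finish workout") {
            completedWorkouts += 1
            // Ask only after repeated positive moments, never mid-task.
            if completedWorkouts == 5 {
                requestReview()
            }
        }
    }
}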
Topic: App Store Distribution & Marketing
SubTopic: App Store Connect
Tags: App Store, App Review, App Store Connect
I’m not sure what you’re asking for here. Based on your contribution to other threads (like here), I get the feeling that you’re looking for a sample that shows how to communicate between iOS and Android apps. We don’t have such a sample, and it’s not something I’m going to create for you here on the forums. If you’d like to see that, please do file an enhancement request for it. And post your bug number, just for the record. If I’ve misunderstood your message, I’d appreciate more details on what you’re looking for. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: App & System Services
SubTopic: Networking
Hi, as described by the subject, I'm trying to find a way to remove the preinstalled iOS support via a command-line tool. I need to do that because I need to use the universal architectureVariant in order to build on old Intel-based iOS simulators, but on iOS 26. As described on this page, I can use this command to download the architecture I need: xcodebuild -downloadPlatform iOS -architectureVariant universal However, launching this command I receive this error: iOS is already downloaded as arm64Only. To replace with universal, first delete the existing one. Is there any way to remove the currently installed iOS platform via the command line? In particular, I'm searching for a way to do what the Delete button in the attached screenshot does. Thank you
My Xcode project has the following configuration: 1 iOS app target, 1 Xcode framework target (Mach-O type: Dynamic Library), and 5 static libraries. Dependencies: all the static libraries are target dependencies of the framework, and the framework is the only target dependency of the iOS app. For the iOS app target, under the General tab > Frameworks, Libraries & Embedded Content, I've set the framework to Do Not Embed. So now I have a dynamic framework which won't be copied to the .app bundle in the build output. As per my understanding, this should result in a runtime error: dyld should not be able to find the framework files, since they were not embedded in the final .app bundle. But regardless, my app runs without any errors, using all the methods exposed by the framework. What is the correct understanding here? What exactly does Embed/Do Not Embed mean (apart from excluding the files from the .app bundle)? When both settings are specified, is there any priority or precedence of one setting o
[quote='860843022, herman602, /thread/802846?answerId=860843022#860843022, /profile/herman602'] Or can an app built with Xcode 26 also run this feature on earlier iOS versions? [/quote] No. This is a new facility in iOS 26. With that out of the way, let’s talk terminology. In app extension parlance: The container app is the one in which the appex is embedded. The host app is the one using the appex. In general, ExtensionKit lets you create a host app that invokes app extensions provided by other developers. Indeed, that’s how it works on macOS. On iOS, however, there are limitations. I’m still researching the exact details of those limitations, but it’s certainly true that iOS apps cannot host extensions created by other third-party developers (FB18784426). Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: App & System Services
SubTopic: Processes & Concurrency
[quote='860903022, nam-common, /thread/802640?answerId=860903022#860903022, /profile/nam-common'] For example are there limits to how we can interact with the device? [/quote] An MFi accessory can support the External Accessory framework, which has a very specific communication model. Notably, it doesn’t expose a USB-like API. Rather, your accessory has to implement an MFi-specific on-the-wire protocol that ‘connects’ it to the input and output streams exposed to EA. After that, it’s up to you what commands to run over those streams. I can’t go into the details about the MFi side of this because that info isn’t public. Creating an EA-compatible accessory is a bit of a faff, so I’d definitely explore options outside of that space. For example, an off-the-shelf USB Ethernet dongle will Just Work™ with iOS, at which point you can communicate using standard networking APIs. That definitely has its limitations [1], but EA also has a bunch of limitations. And the advantage with a USB Ethernet dongle is tha
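For readers weighing the EA route, here is a rough sketch of the stream model described above, assuming a hypothetical protocol string "com.example.myproto" that the accessory declares and that is listed under UISupportedExternalAccessoryProtocols in the app's Info.plist; the MFi on-the-wire details are separate and not shown.

import ExternalAccessory
import Foundation

final class AccessoryLink: NSObject, StreamDelegate {
    private var session: EASession?

    // Hypothetical protocol string; it must match one declared by the accessory.
    private let protocolString = "com.example.myproto"

    func connect() {
        guard let accessory = EAAccessoryManager.shared().connectedAccessories
            .first(where: { $0.protocolStrings.contains(protocolString) }) else {
            print("no matching accessory connected")
            return
        }
        session = EASession(accessory: accessory, forProtocol: protocolString)

        // EA exposes plain input/output streams; the commands you run over
        // them are entirely up to you and your accessory firmware.
        let streams: [Stream?] = [session?.inputStream, session?.outputStream]
        for stream in streams.compactMap({ $0 }) {
            stream.delegate = self
            stream.schedule(in: .current, forMode: .default)
            stream.open()
        }
    }

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        switch eventCode {
        case .hasBytesAvailable:
            // Read accessory data from the input stream here.
            break
        case .hasSpaceAvailable:
            // Write your commands to the output stream here.
            break
        default:
            break
        }
    }
}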
Topic: App & System Services
SubTopic: Drivers
Yes, my iOS file provider is an NSFileProviderReplicatedExtension, and it is available on iOS, macOS and visionOS, so that works OK. The UI extension is UIKit-based, and that one is currently available on iOS only. I am now trying to extend it to visionOS. macOS will follow after that, as it needs code changes.
Topic: App Store Distribution & Marketing
SubTopic: App Store Connect
I use .scaleEffect(x: 1, y: -1, anchor: .center) to reverse the messages list, so that the latest message is always at the bottom. This works correctly on iOS 18, but it blurs the whole view on iOS 26. Complete code:

ScrollView {
    Rectangle()
        .fill(.clear)
        .frame(height: 10)
    // if messages.isEmpty {
    //     MessagesEmpty()
    //         .padding(.horizontal, 10)
    //         .scaleEffect(x: 1, y: -1, anchor: .center)
    // }
    MessageInput(chat: chat)
        .padding(.horizontal, 10)
        .scaleEffect(x: 1, y: -1, anchor: .center)
        .id("messag-input-identifier")
    LazyVStack(spacing: 10) {
        ForEach(messages) { (message: Message) in
            MessageItem(message: message, activation: $activeMessageId, audioAdapter: AudioAdapter.shared)
                .id(message.id)
        }
        .padding(.horizontal, 10)
        .scaleEffect(x: 1, y: -1, anchor: .center)
    }
    Rectangle()
        .fill(.clear)
        .frame(height: 20)
}
.scaleEffect(x: 1, y: -1, anchor: .center)

As shown in the screenshots WechatIMG49.jpg (iOS 26 beta, incorrect) and WechatIMG50.jpg (iOS 18, correct), my messages list displays normally on iOS
I'm currently testing this on a physical device (iPhone 12 Pro Max, iOS 26). Through Shortcuts, I know for a fact that I am able to successfully trigger the perform code to do what's needed. In addition, if I just tell Siri the phrase without my unit parameter and it asks me which unit, I am able to, once again, successfully call my perform. The problem is that with any of my phrases that include my unit, it either just opens my application or says it can't understand. Here is my sample code. My entity:

import Foundation
import AppIntents

struct Unit: Codable, Identifiable {
    let nickname: String
    let ipAddress: String
    let id: String
}

struct UnitEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(
            name: LocalizedStringResource("Unit", table: "AppIntents")
        )
    }

    static let defaultQuery = UnitEntityQuery()

    // Unique identifier
    var id: Unit.ID

    // @Property allows this data to be available to Shortcuts, Siri, etc.
    @Property var name: String

    // By not including @Proper
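For context on how a parameterized phrase is typically wired up, here is a minimal sketch. The intent name (ControlUnitIntent), its parameter, and the phrase text are hypothetical, not taken from the post above, and it assumes the UnitEntity above also provides a display representation and a query whose suggestedEntities() returns the possible units, since Siri can only match an entity parameter inside a phrase against suggested values.

import AppIntents

struct ControlUnitIntent: AppIntent {
    static var title: LocalizedStringResource = "Control Unit"

    @Parameter(title: "Unit")
    var unit: UnitEntity

    func perform() async throws -> some IntentResult {
        // Talk to the selected unit here.
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ControlUnitIntent(),
            phrases: [
                // Each phrase must mention the app name; \(\.$unit) lets Siri
                // match one of the entity values suggested by the query.
                "Start \(\.$unit) with \(.applicationName)"
            ],
            shortTitle: "Control Unit",
            systemImageName: "bolt"
        )
    }
}

// Whenever the set of available units changes, refresh the phrase variants:
// MyAppShortcuts.updateAppShortcutParameters()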
Topic: App & System Services
SubTopic: Widgets & Live Activities
Tags: Siri and Voice, Intents, App Intents
I'm having problems with my released app with iOS & watchOS 26 support. I've added AppIntentConfiguration support in the watchOS app so that users can configure the complication. My complications also support multiple families, and so I have slightly different configuration options available depending on whether the complication is in the .accessoryRectangular slot or the .accessoryCircular one. This works fine on Apple Watch when editing the watch face: there you can select the configuration options and they are correct for the different variants. However, on iOS, when configuring in the Apple Watch app on iPhone, the different complication size is ignored and the same configuration options are offered, meaning they are wrong for one of them. I created a sample project; here is the app intent code:

struct TestWidgetConfigurationIntent: AppIntent, WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "New Widgets with Configuration"
    static var description = IntentDescription("Lots of stuff.")
    static
Topic: App & System Services
SubTopic: Widgets & Live Activities
Tags: WatchKit, watchOS, WidgetKit, App Intents
[quote] When we tried using this feature, we were not able to get it to work on iOS 26. We stumbled upon this ticket (https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022) in the Apple Developer forum, in which possibly an Apple engineer claims it is supported ONLY for iPadOS 26. [/quote] That engineer would be me, and yes, what I said there is correct. Background GPU access is currently only available on iPads. [quote] 96% of the users are on iPhone (compared to iPad), and if we refer to the official documentation above, it claims that this feature should work on iOS 26. [/quote] What the documentation says is that the APIs themselves are available on iOS and iPadOS and that background GPU access is not available on all devices, both of which are true. This both simplifies the implementation on both platforms and allows for future API evolution without requiring major API revision. This is the same basic pattern most of our APIs use when functionality is only available on some hardware. __
Topic: Media Technologies
SubTopic: Video
We build mobile apps for creators to edit their videos. After editing a video, the creator has to export it so that it can be uploaded to YouTube. The export is a time-consuming and GPU-intensive process. The creator can exit the app for various reasons, like receiving a call or putting the app in the background, and this causes the export to fail :( With this limitation in mind, Apple announced that iOS 26 would start to support background GPU access. Here is the official documentation: https://developer.apple.com/documentation/BundleResources/Entitlements/com.apple.developer.background-tasks.continued-processing.gpu When we tried using this feature, we were not able to get it to work on iOS 26. We stumbled upon this ticket (https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022) in the Apple Developer forum, in which possibly an Apple engineer claims it is supported ONLY for iPadOS 26. This is a very big bummer for us. 96%
Topic: Media Technologies
SubTopic: Video