We have found that on the iOS 26 beta, some of our app icons built from an Xcode 16 asset catalog containing a single 1024x1024 .png file have the Liquid Glass effect applied to them, while others do not.
The documentation states:
“If you choose not to use Icon Composer, you can still use an AppIcon asset catalog in your project containing individual app icon images and let the system apply the Liquid Glass material.”
and
“If you prefer, you can take advantage of the system’s automatically generated treatment that is applied to all app icons.”
Is there any insight into how the system treats app icons that have not yet been updated with Icon Composer?
Hi,
I've had a new deck installed in my car for about 1.5 weeks.
I'm having compatibility issues with my iPhone 15 Pro Max.
It happens both wired and wirelessly: I get the error "Accessory not supported by this device". It used to happen all the time; now it's 50/50, and sometimes it works.
I've removed and re-paired Bluetooth multiple times on both the phone and the deck. I bought a Belkin USB-C to USB-A cable today, and it seemed to fix it, but the problem comes back.
I've also changed the setting under Face ID & Passcode > Allow Access When Locked > Accessories.
The car stereo installer reckons it's definitely an issue with the phone, not the deck; I'm inclined to believe him, since the error says "by this device".
Any advice appreciated.
Topic:
Media Technologies
SubTopic:
Audio
We’ve run into what looks like a gap in how forceAirDropUnmanaged is enforced on iOS devices.
Setup:
Device: iOS 17.x (unsupervised, enrolled in MDM)
MDM Restriction: forceAirDropUnmanaged = true
Managed Open-In restriction also applied (block unmanaged destinations).
Verified: from a managed app, the AirDrop icon is hidden in the share sheet. This part works as expected.
Issue:
When two iOS devices are brought close together, the proximity-initiated AirDrop / NameDrop flow still allows transfer of photos, videos, or files between devices. In this path, forceAirDropUnmanaged does not appear to apply, even though the same restriction works correctly in the standard sharing pane.
What I’d expect: If forceAirDropUnmanaged is enabled, all AirDrop transfer paths (including proximity/NameDrop) should be treated as unmanaged, and thus blocked when “Managed Open-In to unmanaged destinations” is restricted.
What I observe instead:
Share sheet → AirDrop hidden ✅
Proximity/NameDrop → transfer still possible ❌
Questions for Apple / Community:
Is this a known limitation or expected behavior?
Is there a different restriction key (or combination) that also covers proximity-based AirDrop?
If not currently supported, should this be filed as Feedback (FB) to request alignment between share sheet AirDrop and NameDrop enforcement?
This behaviour introduces a compliance gap for organisations relying on MDM to control data exfiltration on unsupervised or user-enrolled devices. Any clarification or guidance would be greatly appreciated.
Topic:
Business & Education
SubTopic:
Device Management
Tags:
Privacy
Apple Business Manager
Device Management
In iOS 18 betas, the App Store search bar has been moved to the bottom of the screen. This breaks years of usability and is inconsistent with Apple’s own apps—Calendar, Reminders, Maps, Safari, Files, Wallet, and Shortcuts—all of which keep search at the top.
I (and many others) hold the phone in one hand and tap with the other. Top placement is faster, more natural, and aligns with established Apple design. The “thumb reach” argument does not fit real-world usage for a large portion of users.
What I want is consistency across all Apple apps: put the search bar at the top everywhere.
Apple already made this mistake with Safari’s bottom address bar in iOS 15 and had to add a toggle after backlash. Please don’t repeat history.
Feedback ID: FB19598638
If you agree, please follow your own feedback and reference this thread. The more reports Apple sees, the more likely this gets fixed.
Dear Apple Review Team,
The app I submitted has been pending for several days and hasn't entered the review process yet. Could you please help check if there is any issue causing this delay?
App Name: 小马AI美术
Bundle ID: com.hangzhouyq.art.xmai
I am developing an app for the iPadOS 26 beta. I would like to switch the "Multitasking & Gestures" setting to "Full Screen Apps" from my app's code. Is there any public API available to do this?
By the way, my app is a Cordova app.
Thank you
Topic:
UI Frameworks
SubTopic:
General
Hello All,
It’s been almost a month, and I’m trying to submit my app’s first version, but I keep getting rejections with the reason “App Completeness – No Connectivity.”
Here’s the timeline:
• Initially, the app was rejected for app completeness.
• I scheduled a review appointment, and during that session, the reviewer confirmed the app was working fine.
• The reviewer only asked for permission policy changes, which I implemented and resubmitted.
• After resubmission, the app was rejected again with the same “App Completeness – No Connectivity” issue.
The reviewer in the first session mentioned it might have been a connectivity issue on the device during testing. So, I scheduled another review appointment, but that appointment request was declined.
To date, I have made 13 submissions, and 10 were rejected for the same reason. I have also raised multiple appeals, but haven’t received any response.
The app works perfectly on multiple devices and iOS versions, and TestFlight testing is successful.
Has anyone faced this issue before? How can I resolve this and prevent repeated rejections? Any suggestions would be appreciated.
Thank you.
Unexpected widget refresh
My app is a music application. When a song is played or paused, its status is synchronized to Now Playing, and the app also provides widgets. During testing, I found that pausing or resuming playback triggers a widget timeline refresh, which is completely unexpected; switching songs, however, does not.
Looking at Apple’s logs:
By default 21:27:08.094490+0800 mediaremoted Set: origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer setting inferred playback state from to
By default 21:27:08.094607+0800 mediaremoted [MRDNowPlayingPlayerClient] PlaybackState changed from Playing to Paused for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer
By default 21:27:08.094713+0800 mediaremoted [MRDNowPlayingPlayerClient] isPlaying changed to false for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer
By default 21:27:08.111861+0800 mediaremoted Posted Active Now Playing Notification kMRMediaRemoteNowPlayingApplicationPlaybackStateDidChangeNotification for path origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer
By default 21:27:08.115550+0800 mediaremoted Response: handlePlaybackQueueRequest<B0BDBB4E-C539-4D39-B51C-718115EBD7C4 assistantd-2659 /M/L/AF/R[0:1]> returned for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer in 0.0005 seconds
By default 21:27:08.119344+0800 assistantd Response: playbackQueue<B0BDBB4E-C539-4D39-B51C-718115EBD7C4 assistantd-2659 /M/L/AF/R[0:1]> returned <> for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer in 0.0010 seconds
By default 21:27:08.122322+0800 SpringBoard Response: playbackState<63A30582-E3C2-4F4D-AC57-8E5841FAD568> returned for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer in 0.0005 seconds
By default 21:27:08.126132+0800 chronod Observed com.company.musicdev stopped running for exempt reason: nowPlaying - remainingReasons: None
By default 21:27:08.126285+0800 chronod [com.company.musicdev::com.company.musicdev.musicdesktopwidget:VisionWidget_medium4158108784:systemMedium:3758765227620768254:338.00/158.00/21.60:(null)~(null)] on local marked as requiring reload
By default 21:27:08.126455+0800 chronod [com.company.musicdev::com.company.musicdev.musicdesktopwidget:VisionWidget_medium4158108784:3758765227620768254] Reload with configuration [systemRequest(sessionEnded)-immediate-free-1]
By default 21:27:08.126854+0800 mediaremoted Response: handlePlaybackQueueRequest<ACF764D9-05A6-41FF-8BB5-8CB81A8BC163 assistantd-2659 /M/L/AF/R[0:1]> returned for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer in 0.0008 seconds
By default 21:27:08.127960+0800 assistantd Response: playbackQueue<ACF764D9-05A6-41FF-8BB5-8CB81A8BC163 assistantd-2659 /M/L/AF/R[0:1]> returned <> for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer in 0.0016 seconds
By default 21:27:08.128091+0800 wifid Response: playbackState returned for origin-iPhone-1280262988/client-com.company.musicdev-10059 (music)/player-MediaRemote-DefaultPlayer in 0.0115 seconds
When the playback state changes, you can see the widget being triggered to reload. Is this a system mechanism? I checked my code, and there is no operation that actively refreshes the widget when playback is paused or resumed. Can this behavior be avoided?
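For context, the only explicit app-initiated reload requests I'm aware of are the WidgetCenter calls sketched below (the kind string is a placeholder), and none of them appear anywhere in my project, which is why the reloads in the log look system-initiated to me:
import WidgetKit
// Sketch: the on-demand reload APIs an app can call itself. If these are
// absent from the project, a reload like the one in the log above must
// come from the system (or from the timeline's own reload policy).
WidgetCenter.shared.reloadTimelines(ofKind: "musicdesktopwidget")   // placeholder kind string
WidgetCenter.shared.reloadAllTimelines()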
I'm a bit of a novice at app development, but I have a paid developer account and have registered the identifier for MusicKit on the developer website (using the bundle identifier I selected in Xcode). However, the option to add MusicKit as a capability is not available in Xcode.
I've manually updated the certificates, closed and reopened Xcode, started a new project, and tried a different demo project.
Apologies if I am missing something obvious but could someone help me get this capability added?
I'm not sure at which beta it happened, but somewhere during the iOS 26 Public Beta releases, my iCloud Contacts have duplicated close to 100 times each. When I scroll to the bottom of the list to try and manage duplicates, it indicates I only have 2 duplicates. I should be sitting around 350 contacts and instead, I now have over 10k.
Anyone else with this issue? I have filed Feedback in the app. I'm not certain whether it was related to iOS 26 or iPadOS 26, since I have both, but I noticed it first on my iPhone. I confirmed that the issue has affected my iCloud Contacts and therefore all of my Apple devices, including those on older iOS and macOS versions. Short of manually deleting all the duplicates, I'm at a loss as to how I can correct this.
This has been an issue since installing the first beta, and I haven't seen anyone else say anything about it, so I feel like I'm going kind of bonkers. In Dark Mode, double-tapping to select a word with the Liquid Glass keyboard, for example in Messages or Safari, produces a flashlight-like effect over the text with each tap. While this is a fine animation and pretty helpful when moving the cursor, the brightness of the flashlight effect on white text in Dark Mode makes it impossible to see what you're actually tapping to select. The brightness seems to intensify 100x when tapping rapidly. This is not an issue in Light Mode, where the text itself is black.
To recreate this, just turn your phone to Dark Mode, go into Messages and type a few words into a thread. Try to double-tap to select a word in the text box. The brightness of the selected word should intensify to the point of being unable to see the word itself.
Has anyone been bothered by this? Is there a way to fix or adjust it? I've tried reducing transparency and a bunch of other settings but nothing has worked.
General:
Forums subtopic: App & System Services > Networking
TN3151 Choosing the right networking API
Networking Overview document — Despite being in the documentation archive, this is still really useful.
TLS for App Developers forums post
Choosing a Network Debugging Tool documentation
WWDC 2019 Session 712 Advances in Networking, Part 1 — This explains the concept of constrained networking, which is Apple’s preferred solution to questions like How do I check whether I’m on Wi-Fi?
TN3135 Low-level networking on watchOS
TN3179 Understanding local network privacy
Adapt to changing network conditions tech talk
Understanding Also-Ran Connections forums post
Extra-ordinary Networking forums post
Foundation networking:
Forums tags: Foundation, CFNetwork
URL Loading System documentation — NSURLSession, or URLSession in Swift, is the recommended API for HTTP[S] on Apple platforms.
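A minimal URLSession sketch, just to anchor that recommendation (the URL is a placeholder):
import Foundation
// Minimal sketch: fetch a resource with URLSession using async/await.
func fetchExample() async throws -> Data {
    let url = URL(string: "https://example.com/")!
    let (data, response) = try await URLSession.shared.data(from: url)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    print("Received \(data.count) bytes, HTTP status \(status)")
    return data
}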
Network framework:
Forums tag: Network
Network framework documentation — Network framework is the recommended API for TCP, UDP, and QUIC on Apple platforms (see the short sketch after this list).
Building a custom peer-to-peer protocol sample code (aka TicTacToe)
Implementing netcat with Network Framework sample code (aka nwcat)
Configuring a Wi-Fi accessory to join a network sample code
Moving from Multipeer Connectivity to Network Framework forums post
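The short sketch referenced above (host and port are placeholders; error handling is minimal):
import Network
// Minimal sketch: open a TCP connection and send a few bytes.
let connection = NWConnection(host: "example.com", port: 80, using: .tcp)
connection.stateUpdateHandler = { state in
    print("Connection state: \(state)")
}
connection.start(queue: .main)
connection.send(content: Data("GET / HTTP/1.0\r\n\r\n".utf8),
                completion: .contentProcessed { error in
                    if let error { print("Send failed: \(error)") }
                })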
Network Extension (including Wi-Fi on iOS):
See Network Extension Resources
Wi-Fi Fundamentals
TN3111 iOS Wi-Fi API overview
Wi-Fi Aware framework documentation
Wi-Fi on macOS:
Forums tag: Core WLAN
Core WLAN framework documentation
Wi-Fi Fundamentals
Secure networking:
Forums tag: Security
Apple Platform Security support document
Preventing Insecure Network Connections documentation — This is all about App Transport Security (ATS).
Available trusted root certificates for Apple operating systems support article
Requirements for trusted certificates in iOS 13 and macOS 10.15 support article
About upcoming limits on trusted certificates support article
Apple’s Certificate Transparency policy support article
What’s new for enterprise in iOS 18 support article — This discusses new key usage requirements.
Technote 2232 HTTPS Server Trust Evaluation
Technote 2326 Creating Certificates for TLS Testing
QA1948 HTTPS and Test Servers
Miscellaneous:
More network-related forums tags: 5G, QUIC, Bonjour
On FTP forums post
Using the Multicast Networking Additional Capability forums post
Investigating Network Latency Problems forums post
WirelessInsights framework documentation
iOS Network Signal Strength
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
Xcode version 16.4 (16F6) now only supports uploading a 1024x1024 icon, but during archiving it keeps prompting that a 120x120 icon is missing. Attempts to add the image in Assets didn't work. How can this issue be resolved?
I have an app that displays a MapView. While I am in Light Mode everything is fine. I can scroll around the map, and my overlays (made with a UIVisualEffectView containing a UIGlassEffect) stay light and look good!
As soon as I switch my phone to Dark Mode, some of my buttons change color depending on what's underneath them (a light residential area versus darker wooded areas). But not all of them, only where it's supposedly lighter or darker underneath. This makes my whole UI look strange: some buttons bright, some dark.
Is there a way to lock a "color" or interfaceStyle on the effect view? In Light Mode everything is fine, but in Dark Mode it just looks super strange.
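To make the question concrete, this is the kind of thing I mean by locking the style (a minimal sketch; glassView is a placeholder for my overlay view, and I don't know whether this actually affects how the glass material adapts to the map underneath):
import UIKit
// Sketch: pin the effect view's traits to Light Mode regardless of the
// system appearance. Whether the glass material honors this is exactly
// what I'm unsure about.
let glassView = UIVisualEffectView(effect: UIGlassEffect())
glassView.overrideUserInterfaceStyle = .light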
I am trying to get MIDI output from the AU Host demo app using the recent MIDI processor example. The processor works correctly in Logic Pro, but I cannot send MIDI from the AUv3 extension in standalone mode using the default host app to another program (e.g., Ableton).
The MIDI manager, which is part of the standalone host app, works fine, and I can send MIDI using it directly—Ableton receives it without issues. I have already set the midiOutputNames in the extension, and the midiOutBlock is mapped. However, the MIDI data from the AUv3 extension does not reach Ableton in standalone mode. I suspect the issue is that midiOutBlock might never be called in the plugin, or perhaps an input to the plugin is missing, which prevents it from sending MIDI. I am currently using the default routing.
I have modified the MIDI manager such that it works well as described above. Here is a part of my code for SimplePlayEngine.swift and my MIDIManager.swift for reference:
import AVFoundation
import AudioToolbox
import CoreMIDI
import Observation
@MainActor
@Observable
public class SimplePlayEngine {
// Note: this block currently ignores the AU's MIDI output bytes and just returns noErr.
private let midiOutBlock: AUMIDIOutputEventBlock = { sampleTime, cable, length, data in return noErr }
var scheduleMIDIEventListBlock: AUMIDIEventListBlock? = nil
public init() {
engine.attach(player)
engine.prepare()
setupMIDI()
}
private func setupMIDI() {
if !MIDIManager.shared.setupPort(midiProtocol: MIDIProtocolID._2_0, receiveBlock: { [weak self] eventList, _ in
if let scheduleMIDIEventListBlock = self?.scheduleMIDIEventListBlock {
_ = scheduleMIDIEventListBlock(AUEventSampleTimeImmediate, 0, eventList)
}
}) {
fatalError("Failed to setup Core MIDI")
}
}
func initComponent(type: String, subType: String, manufacturer: String) async -> ViewController? {
reset()
guard let component = AVAudioUnit.findComponent(type: type, subType: subType, manufacturer: manufacturer) else {
fatalError("Failed to find component with type: \(type), subtype: \(subType), manufacturer: \(manufacturer))" )
}
do {
let audioUnit = try await AVAudioUnit.instantiate(
with: component.audioComponentDescription, options: AudioComponentInstantiationOptions.loadOutOfProcess)
self.avAudioUnit = audioUnit
self.connect(avAudioUnit: audioUnit)
return await audioUnit.loadAudioUnitViewController()
} catch {
return nil
}
}
private func startPlayingInternal() {
guard let avAudioUnit = self.avAudioUnit else { return }
setSessionActive(true)
if avAudioUnit.wantsAudioInput { scheduleEffectLoop() }
let hardwareFormat = engine.outputNode.outputFormat(forBus: 0)
engine.connect(engine.mainMixerNode, to: engine.outputNode, format: hardwareFormat)
do { try engine.start() } catch {
isPlaying = false
fatalError("Could not start engine. error: \(error).")
}
if avAudioUnit.wantsAudioInput { player.play() }
isPlaying = true
}
private func resetAudioLoop() {
guard let avAudioUnit = self.avAudioUnit else { return }
if avAudioUnit.wantsAudioInput {
guard let format = file?.processingFormat else { fatalError("No AVAudioFile defined.") }
engine.connect(player, to: engine.mainMixerNode, format: format)
}
}
public func connect(avAudioUnit: AVAudioUnit?, completion: @escaping (() -> Void) = {}) {
guard let avAudioUnit = self.avAudioUnit else { return }
engine.disconnectNodeInput(engine.mainMixerNode)
resetAudioLoop()
engine.detach(avAudioUnit)
func rewiringComplete() {
scheduleMIDIEventListBlock = auAudioUnit.scheduleMIDIEventListBlock
if isPlaying { player.play() }
completion()
}
let hardwareFormat = engine.outputNode.outputFormat(forBus: 0)
engine.connect(engine.mainMixerNode, to: engine.outputNode, format: hardwareFormat)
if isPlaying { player.pause() }
let auAudioUnit = avAudioUnit.auAudioUnit
if !auAudioUnit.midiOutputNames.isEmpty { auAudioUnit.midiOutputEventBlock = midiOutBlock }
engine.attach(avAudioUnit)
if avAudioUnit.wantsAudioInput {
engine.disconnectNodeInput(engine.mainMixerNode)
if let format = file?.processingFormat {
engine.connect(player, to: avAudioUnit, format: format)
engine.connect(avAudioUnit, to: engine.mainMixerNode, format: format)
}
} else {
let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: 2)
engine.connect(avAudioUnit, to: engine.mainMixerNode, format: stereoFormat)
}
rewiringComplete()
}
}
and my MIDI Manager
import Combine
import CoreMIDI
@MainActor
class MIDIManager: Identifiable, ObservableObject {
func setupPort(midiProtocol: MIDIProtocolID,
receiveBlock: @escaping @Sendable MIDIReceiveBlock) -> Bool {
guard setupClient() else { return false }
if MIDIInputPortCreateWithProtocol(client, portName, midiProtocol, &port, receiveBlock) != noErr {
return false
}
for source in self.sources {
if MIDIPortConnectSource(port, source, nil) != noErr {
print("Failed to connect to source \(source)")
return false
}
}
setupVirtualMIDIOutput()
return true
}
private func setupVirtualMIDIOutput() {
let virtualStatus = MIDISourceCreate(client, virtualSourceName, &virtualSource)
if virtualStatus != noErr {
print("❌ Failed to create virtual MIDI source: \(virtualStatus)")
} else {
print("✅ Created virtual MIDI source: \(virtualSourceName)")
}
}
func sendMIDIData(_ data: [UInt8]) {
print("hey")
var packetList = MIDIPacketList()
withUnsafeMutablePointer(to: &packetList) { ptr in
let pkt = MIDIPacketListInit(ptr)
_ = MIDIPacketListAdd(ptr, 1024, pkt, 0, data.count, data)
if virtualSource != 0 {
let status = MIDIReceived(virtualSource, ptr)
if status != noErr {
print("❌ Failed to send MIDI data: \(status)")
} else {
print("✅ Sent MIDI data: \(data)")
}
}
}
}
}
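What I suspect is missing is something like the sketch below, which would replace the no-op midiOutBlock above and forward the AU's MIDI 1.0 output bytes to the virtual source. This is only an assumption about the wiring, and hopping to the main actor from the render-side callback is not realtime-safe; it's for illustration only:
// Sketch: forward whatever the AUv3 emits on its MIDI output instead of
// discarding it. MIDIManager.sendMIDIData is the existing helper shown above.
private let midiOutBlock: AUMIDIOutputEventBlock = { _, _, length, bytes in
    let data = Array(UnsafeBufferPointer(start: bytes, count: length))
    Task { @MainActor in
        MIDIManager.shared.sendMIDIData(data)
    }
    return noErr
}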
Hello, can someone share how long it takes to verify payment data? It's been more than 30 days, and it still says "under moderation".
Topic:
App Store Distribution & Marketing
SubTopic:
General
I have a public and accessible .well-known/apple-app-site-association file for both example.com and blog.example.com, with "paths": ["*"].
Both example.com and blog.example.com are added in Associated Domains, and any link containing example.com or example.com/path normally deep links into my app.
I used to have an *.example.com entry that successfully deep linked all my subdomains into my app, but I had to remove it because some subdomains now need to link to other apps, while some should still link to the same app.
I removed the * entry but left blog.example.com, as that specific subdomain still needs to deep link into my app. But now blog.example.com is not even being recognized by my app, and any link starting with blog.example.com just opens in Safari.
What am I missing? Why is this happening?
I'm building a SwiftUI app for iPad using a NavigationSplitView as the navigation root. Below is a simplified version of the app's navigation. There are a Home Page and a Settings Page, each with its own NavigationStack. The page that appears in the detail column depends on the sidebar's selection value.
The issue I'm facing is that when I navigate deeply into the Home Page's NavigationStack (e.g., to a Home Page Child view), switch to the Settings Page, and then switch back to the Home Page, the Home Page's navigation path has been reset to [] and the previous state is lost. The same issue occurs if I navigate deeply into the Settings Page (e.g., to a Settings Page Child view), switch to the Home Page, and then return to the Settings Page: the navigation state for the Settings Page is lost, and it reverts to the root of the NavigationStack.
Why is this happening, and how can I fix it so that switching pages in the sidebar doesn't reset the NavigationStack of each individual page in the detail column? Thank you.
struct ContentView: View {
@State var selection: String?
@State var firstPath = [String]()
@State var secondPath = [String]()
var body: some View {
NavigationSplitView {
List(selection: $selection) {
Text("Home")
.tag("home")
Text("Settings")
.tag("settings")
}
} detail: {
if selection == "home" {
HomePage(path: $firstPath)
} else {
SettingsPage(path: $secondPath)
}
}
}
}
struct HomePage: View {
@Binding var path: [String]
var body: some View {
NavigationStack(path: $path) {
NavigationLink("Home Page", value: "Home")
.navigationDestination(for: String.self) { _ in
Text("Home Page Child")
}
}
}
}
struct SettingsPage: View {
@Binding var path: [String]
var body: some View {
NavigationStack(path: $path) {
NavigationLink("Settings Page", value: "Settings")
.navigationDestination(for: String.self) { _ in
Text("Settings Page Child")
}
}
}
}
#Preview {
ContentView()
}
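One workaround I'm considering, sketched below (not a confirmed fix), is to keep both detail pages in the hierarchy so their NavigationStacks are never torn down, and only toggle which one is visible:
// Sketch: replaces the if/else in ContentView's detail column. Both stacks
// stay alive; opacity and hit testing switch between them.
detail: {
    ZStack {
        HomePage(path: $firstPath)
            .opacity(selection == "home" ? 1 : 0)
            .allowsHitTesting(selection == "home")
        SettingsPage(path: $secondPath)
            .opacity(selection == "home" ? 0 : 1)
            .allowsHitTesting(selection != "home")
    }
}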
I've noticed that in iOS 26, Navigation Bar Items' Image Insets parameters as set in Xcode are not being read correctly. Specifically, it appears that on iOS 26 beta, negative inset numbers are being read as positive.
Feedback report FB19838333 includes a sample project demonstrating this bug.
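In the meantime, a hypothetical, untested workaround might be to set the insets in code at runtime instead of in the storyboard (item here is a placeholder for the affected bar button item):
// Untested sketch: apply the image insets programmatically so the
// storyboard values are not relied upon.
let item = navigationItem.rightBarButtonItem
item?.imageInsets = UIEdgeInsets(top: 0, left: -8, bottom: 0, right: 8)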
I have a standard Mac project created using Xcode templates with Storyboard. I manually removed the window from the Storyboard file and use ViewController.h as the base class for other ViewControllers. I create new windows and ViewControllers programmatically, implementing the UI entirely through code.
However, I've discovered that on all macOS versions prior to macOS 26 beta, the application would trigger automatic termination due to App Nap under certain circumstances. The specific steps are:
Close the application window (the application doesn't exit immediately at this point)
Click on other windows or the desktop, causing the application to lose focus (though this description might not be entirely accurate - the app might have already lost focus when the window closed, but the top menu bar still shows the current application immediately after window closure)
The application automatically terminates
In previous versions, I worked around this issue by manually executing [[NSApplication sharedApplication] run]; to create an additional main event loop, keeping the application active (although I understand this isn't an ideal approach).
However, in macOS 26 beta, I've found that calling [[NSApplication sharedApplication] run]; causes all NSButton controls I created in the window to become unresponsive to click events, though they still respond to mouse enter events.
Through recent research, I've learned that the best practice for preventing App Nap automatic termination is to use [[NSProcessInfo processInfo] disableAutomaticTermination:@"User interaction required"]; to increment the automatic termination reference count.
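For reference, a minimal sketch of that approach in Swift (the reason string is arbitrary, and each disable call should later be balanced by a matching enable call):
// Sketch: raise the automatic-termination counter while the app needs to
// stay alive without visible windows, then lower it again afterwards.
let reason = "User interaction required"
ProcessInfo.processInfo.disableAutomaticTermination(reason)
// ... later, when the app no longer needs to stay alive ...
ProcessInfo.processInfo.enableAutomaticTermination(reason)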
My questions are:
Why did calling [[NSApplication sharedApplication] run]; work properly on macOS versions prior to 15.6.1?
Why does calling [[NSApplication sharedApplication] run]; in macOS 26 beta cause buttons to become unresponsive to click events?