Overview

Post · Replies · Boosts · Views · Activity

ShieldConfiguration: Shield does not update when token is moved from one store to another (while in foreground)
Hello, I have noticed that the ShieldConfiguration is only requested when opening a target app, and never when the application token is moved to a different shield while the target app remains in the foreground. This causes problems because the wrong (recycled) ShieldConfiguration is often displayed instead of a freshly requested one. This bug has been around since the introduction of the Screen Time API in 2020 and has not been addressed. Bug reports: FB14237883, FB17902392. Please fix asap!! It is not acceptable to have bugs go unaddressed for more than 5 years. Most concerning: this still reproduces on iOS 26 beta 7!! Thanks a lot for your help.
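A minimal sketch (my own illustration, with hypothetical store names) of the pattern described above: the token starts out shielded by one ManagedSettingsStore and is then moved to a second store while the target app stays in the foreground, at which point the ShieldConfigurationDataSource extension is expected to be asked for a fresh configuration:

import FamilyControls
import ManagedSettings
import ManagedSettingsUI
import UIKit

// Hypothetical named stores used only to illustrate the report.
let focusStore = ManagedSettingsStore(named: ManagedSettingsStore.Name("focus"))
let breakStore = ManagedSettingsStore(named: ManagedSettingsStore.Name("break"))

func moveShield(for token: ApplicationToken) {
    // Initially the app is shielded by the "focus" store.
    focusStore.shield.applications = [token]

    // Later, while the target app is still in the foreground, the token is
    // moved to the "break" store. Expected: the extension below is asked for
    // a fresh configuration. Observed (per the report): the old one is reused.
    focusStore.shield.applications = nil
    breakStore.shield.applications = [token]
}

// Lives in the ShieldConfiguration app extension target.
class ShieldConfigurationExtension: ShieldConfigurationDataSource {
    override func configuration(shielding application: Application) -> ShieldConfiguration {
        ShieldConfiguration(
            backgroundColor: .systemIndigo,
            title: ShieldConfiguration.Label(text: "Blocked", color: .white)
        )
    }
}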
0
0
134
3w
Liquid Glass - prevent automatic color changes
I have an app that displays a MapView. In light mode everything is fine: I can scroll around the map and my overlays (built with a UIVisualEffectView containing a UIGlassEffect) stay light and look good. As soon as I switch my phone to dark mode, some of my buttons change color depending on what's underneath them (a light residential area versus darker wooded areas). But not all of them, only where it's supposedly lighter or darker underneath. This makes my whole UI look strange: some buttons bright, some dark. Is there a way to lock a "color" or interfaceStyle on the effect view? In light mode everything is fine, but in dark mode it just looks super strange.
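A sketch of the overlay setup being described, plus one thing that may be worth trying rather than a confirmed fix: overriding the interface style on the effect view itself so it no longer adapts. Whether UIGlassEffect honours overrideUserInterfaceStyle, or whether the brightness-based adaptation is driven purely by the content behind the glass, is exactly the open question here.

import UIKit

// Sketch of a glass-effect overlay like the buttons described above.
func makeGlassOverlay() -> UIVisualEffectView {
    let effectView = UIVisualEffectView(effect: UIGlassEffect())
    effectView.layer.cornerRadius = 12
    effectView.clipsToBounds = true

    // Possible workaround to try (untested with UIGlassEffect): pin the view
    // to the light appearance so it stops adapting to dark mode and, perhaps,
    // to the brightness of the map content underneath it.
    effectView.overrideUserInterfaceStyle = .light

    return effectView
}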
Topic: Design · SubTopic: General
0
0
1.1k
2w
Xcode Crashes While Opening or Searching Specific Files
My Xcode crashes over and over again while I'm searching for a specific file like "TingMusic". I have tried uninstalling and reinstalling Xcode (16.0 through 16.4), clearing the derived data folder, and rebooting my Mac, but none of these worked. 😭

Translated Report (Full Report Below)

Process: Xcode [1811]
Path: /Applications/Xcode.app/Contents/MacOS/Xcode
Identifier: com.apple.dt.Xcode
Version: 16.4 (23792)
Build Info: IDEApplication-23792000000000000~2 (16F6)
App Item ID: 497799835
App External ID: 874973124
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 501

Date/Time: 2025-09-02 10:51:26.8582 +0800
OS Version: macOS 15.6 (24G84)
Report Version: 12
Anonymous UUID: 9835064A-AD7C-EE47-64DE-49587A7EC956

Time Awake Since Boot: 320 seconds

System Integrity Protection: enabled

Crashed Thread: 0  Dispatch queue: com.apple.main-thread

Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000

Termination Reason: Namespace SIGNAL, Code 6 Abort trap: 6
Terminating Process: Xcode [1811]

Application Specific Information:
abort() called

Application Specific Signatures:
isSameDocumentAsURL

Kernel Triage:
VM - (arg = 0x3) mach_vm_allocate_kernel failed within call to vm_map_enter

Thread 0 Crashed::  Dispatch queue: com.apple.main-thread
0   libsystem_kernel.dylib    0x186d5e388  __pthread_kill + 8
1   libsystem_pthread.dylib   0x186d9788c  pthread_kill + 296
2   libsystem_c.dylib         0x186ca0a3c  abort + 124
3   IDEKit                    0x10890d888  +[IDEAssertionHandler _handleAssertionWithLogString:assertionSignature:assertionReason:extraBacktrace:] + 964
4   IDEKit                    0x10890dcf8  -[IDEAssertionHandler handleFailureInMethod:object:fileName:lineNumber:assertionSignature:messageFormat:arguments:] + 872
5   DVTFoundation             0x1047839a0  _DVTAssertionHandler + 412
6   DVTFoundation             0x104783b04  _DVTAssertionFailureHandler + 196
7   IDEKit                    0x1086e5178  -[IDENavigableItemCoordinator _navigableItemForFilePath:inWorkspace:withSeenFileReferences:computedNavItemsByContainerFilePath:allowLeaf:] + 2240
8   IDEKit                    0x1086e3d24  -[IDENavigableItemCoordinator _structureNavigableItemForFileURL:inWorkspace:error:] + 268
9   IDEKit                    0x1086e4658  -[IDENavigableItemCoordinator structureNavigableItemForDocumentURL:inWorkspace:error:] + 152
10  IDEKit                    0x1086e89bc  +[IDENavigableItemCoordinator temporaryStructureItemForDocumentURL:forWorkspace:error:inScope:] + 116
11  IDEKit                    0x1088689e0  +[IDEOpenQuicklySubpathGenerator subpathForURL:lineNumber:isFromProject:showFileName:fromWorkspace:withAttributes:] + 716
12  IDEKit                    0x108869a88  -[IDEOpenQuicklyResult(SubPath) subPathForWorkspace:withAttributes:] + 356
13  IDEKit                    0x1088eee54  -[IDEOpenQuicklyResultDisplayRecord subtitle] + 172
14  IDEKit                    0x10894bb1c  -[IDEOpenQuicklyResult(ViewExtension) cellViewForOutlineView:displayRecord:delegate:] + 596
15  IDEKit                    0x108849564  -[IDEQuickSearchWindowController outlineView:viewForTableColumn:item:] + 72
16  AppKit                    0x18ae314a8  -[NSTableView(NSTableViewViewBased) makeViewForTableColumn:row:] + 176
17  AppKit                    0x18ae30c0c  -[NSTableRowData _addViewToRowView:atColumn:row:] + 228
18  AppKit                    0x18ae2ed5c  -[NSTableRowData _initializeRowView:atRow:] + 328
19  AppKit                    0x18ae2d8ec  -[NSTableRowData _preparedRowViewForRow:storageHandler:] + 140
20  AppKit                    0x18ae2d7bc  -[NSTableRowData _addRowViewForVisibleRow:withPriorView:] + 268
21  AppKit                    0x18ae2d5e4  -[NSTableRowData _addRowViewForVisibleRow:] + 316
22  AppKit                    0x18ae2cdf0  -[NSTableRowData _updateVisibleRowEntries] + 640
23  AppKit                    0x18ae2c854  -[NSTableRowData updateVisibleRowViews] + 612
24  AppKit                    0x18ae2bf08  -[NSTableView layout] + 148
25  AppKit                    0x18b8d0fa8  ___NSViewLayout_block_invoke + 632
26  AppKit                    0x18ade7000  NSPerformVisuallyAtomicChange + 108
27  AppKit                    0x18adeb8d8  _NSViewLayout + 96
28  AppKit                    0x18b8c72cc  __36-[NSView _layoutSubtreeWithOldSize:]_block_invoke + 372
29  AppKit                    0x18ade7000  NSPerformVisuallyAtomicChange + 108
30  AppKit                    0x18adeb86c  -[NSView _layoutSubtreeWithOldSize:] + 100
31  AppKit                    0x18b8c7410  __36-[NSView _layoutSubtreeWithOldSize:]_block_invoke + 696
32  AppKit                    0x18ade7000  NSPerformVisuallyAtomicChange + 108
33  AppKit                    0x18adeb86c  -[NSView _layoutSubtreeWithOldSize:] + 100
34  AppKit                    0x18b8c7410  __36-[NSView _layoutSubtreeWithOldSize:]_block_invoke + 696
35  AppKit                    0x18ade7000  NSPerformVisuallyAtomicChange + 108
36  AppKit                    0x18adeb86c  -[NSView _layoutSubtreeWithOldSize:] + 100
37  AppKit                    0x18b8c7410  __36-[NSView _layoutSubtreeWithOldSize:]_block_invoke + 696
38  AppKit                    0x18ade7000  NSPerformVisuallyAtomicChange + 108
39  AppKit                    0x18adeb86c  -[NSView _layoutSubtreeWithOldSize:] + 100
40  AppKit                    0x18b8c7410  __36-[NSView _layoutSubtreeWithOldSize:]_block_invoke + 696
41  AppKit                    0x18ade7000  NSPerformVisuallyAtomicChange + 108
42  AppKit                    0x18adeb86c  -[NSView _layoutSubtreeWithOldSize:] + 100
43  AppKit                    0x18b8c7410  __36-[NSView _layoutSubtreeWithOldSize:]_block_invoke + 696
0
0
149
2w
Threading guarantees with AVCaptureVideoDataOutput
I'm writing some camera functionality that uses AVCaptureVideoDataOutput. I've set it up so that it calls my AVCaptureVideoDataOutputSampleBufferDelegate on a background thread, by creating my own dispatch_queue and configuring the AVCaptureVideoDataOutput with it. My question is: if I configure my AVCaptureSession differently, or even stop it altogether, is that guaranteed to flush all pending jobs on my background queue? A more practical example is below, showing how the background thread accesses something owned by the foreground thread; I wonder when/how it's safe to clean up that resource. I have a setup similar to the following:

// Foreground thread logic
dispatch_queue_t queue = dispatch_queue_create("avf_camera_queue", nullptr);

AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
setupInputDevice(captureSession); // Connects the AVCaptureDevice...

// Store some arbitrary data to be attached to the frame, owned by the foreground thread
FrameMetaData frameMetaData = ...;

MySampleBufferDelegate *sampleBufferDelegate = [[MySampleBufferDelegate alloc] init];

// Capture frameMetaData by reference in the lambda
[sampleBufferDelegate setFrameMetaDataGetter: [&frameMetaData]() { return &frameMetaData; }];

AVCaptureVideoDataOutput *captureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[captureVideoDataOutput setSampleBufferDelegate:sampleBufferDelegate queue:queue];
[captureSession addOutput:captureVideoDataOutput];

[captureSession startRunning];
[captureSession stopRunning];
// Is it now safe to destroy frameMetaData, or do we need a manual barrier?

And then in MySampleBufferDelegate:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Invokes the callback set above
    FrameMetaData *frameMetaData = frameMetaDataGetter();
    emitSampleBuffer(sampleBuffer, frameMetaData);
}
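Independently of what stopRunning guarantees, one conservative pattern is to synchronously drain the delegate queue after stopping the session. This at least ensures that callbacks already enqueued have finished before shared state is torn down; it does not by itself answer whether new callbacks can still arrive afterwards. A Swift sketch of that idea:

import AVFoundation

// Sketch: tear down shared state only after everything already enqueued on
// the serial delegate queue has finished running.
final class CameraController {
    private let session = AVCaptureSession()
    private let delegateQueue = DispatchQueue(label: "avf_camera_queue") // serial

    // Call from a different thread than delegateQueue (e.g. the main thread),
    // otherwise the sync below would deadlock.
    func shutDown(cleanup: () -> Void) {
        session.stopRunning()

        // An empty sync block acts as a barrier on a serial queue: it returns
        // only after previously enqueued didOutput callbacks have completed.
        delegateQueue.sync { }

        // Nothing on delegateQueue can still be touching the shared state now.
        cleanup()
    }
}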
0
0
303
2w
VPP Asset allocation getting delayed
We are experiencing a critical issue where VPP app installations are consistently taking an excessive amount of time, leading to significant delays in asset association. This is a systemic problem that affects all VPP apps, not just an isolated case.

Apps:
39470db7-e475-4269-9709-c80641657027 => com.zimride.instant
d0876900-2579-463e-99f1-b7c85ef5c5e8 => com.microsoft.azureauthenticator

Troubleshooting: We have performed extensive troubleshooting and can confirm the following:
VPP Token: The VPP token has been successfully renewed and is currently active and valid.
License Availability: We've verified that there are sufficient VPP licenses available for the apps being deployed.
Device Status: On the affected devices we have restarted the devices, switched to different Wi-Fi networks, and uninstalled and reinstalled the apps.
App Status: The issue is not limited to a single app; all VPP apps are failing to install.
License Revocation: We attempted to revoke and reassign licenses for some devices, but this did not resolve the issue. The app was not pushed, and the pending status remained.

Through our internal investigation, we have determined that the core issue is that the asset association status consistently takes an excessive amount of time, which appears to prevent the app installation queue from processing. We have also observed a significant delay in the processing of events within the notification channel: the time between an event being created and a response being received is excessively long, indicating a potential backlog or issue. A few recent examples:

Event ID: 39470db7-e475-4269-9709-c80641657027 (com.zimride.instant). Created: 2025-08-26 01:02:04; Response: 2025-08-26 01:34:05
Event ID: d0876900-2579-463e-99f1-b7c85ef5c5e8 (com.microsoft.azureauthenticator). Created: 2025-08-25 21:16:29; Response: 2025-08-25 22:21:07

We would appreciate your help in the following areas:
Resolution: Are there any known solutions or workarounds for an asset association status that takes an excessive amount of time?
Best Practices: Are there any recommended best practices or additional parameters we should check on the MDM side that might influence the queueing of VPP app assignments?
Queueing Parameters: Could you provide insight into the parameters or conditions that affect the queueing and processing of VPP app installations on Apple's servers?

Please let us know if there is any additional information or logs we can provide.
0
0
355
3w
Text with Liquid Glass effect
I'm looking for a way to apply the Liquid Glass effect to a Text, and I'm running into issues. Applying a gradient to a Text is no problem, like below:

Text("Liquid Glass")
    .font(Font.system(size: 30, weight: .bold))
    .multilineTextAlignment(.center)
    .foregroundStyle(
        LinearGradient(
            colors: [.blue, .mint, .green],
            startPoint: .leading,
            endPoint: .trailing
        )
    )

But I can't figure out how to apply .glassEffect via .mask or .foregroundStyle. The Glass and Shape arguments don't seem compatible with the shape of the Text itself. Is there any way to achieve this effect on a Text? Thanks in advance for your answers.
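One untested idea, offered only as a sketch rather than a confirmed solution: apply .glassEffect to a container and then mask that container with the Text, so the effect only shows through the glyphs. Whether the glass rendering survives a mask is an open question.

import SwiftUI

struct GlassText: View {
    var body: some View {
        // Sketch: draw the glass effect on an otherwise empty container,
        // then clip it so it only shows through the text glyphs.
        Color.clear
            .frame(width: 240, height: 60)
            .glassEffect()
            .mask(
                Text("Liquid Glass")
                    .font(.system(size: 30, weight: .bold))
                    .multilineTextAlignment(.center)
            )
    }
}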
0
0
176
2w
Use as ringtone
Through iOS 26 public beta 5, the "Use as Ringtone" option freezes the Music app, which must then be closed and restarted. Is this capability supposed to apply to songs in Apple Music, or only to MP3/WMA music files in the Files app? If it only applies to music files and not to Apple Music, then it should not appear as an option in the share menu for an Apple Music song. I have an iPhone 15 Pro.
0
0
226
3w
Navigation title not visible in UISplitViewController on Mac Catalyst with iOS 26
We are using a column-style split view controller as the root view of our app, and on iOS 26 the navigation titles of the primary and supplementary view controllers are not visible, while the secondary view controller's title is displayed in the supplementary column. It looks like the split view hides all of the child view controllers' titles and shows the secondary view's title as the global title in Mac Catalyst. The left and right bar button items show up properly for the individual view controllers. We are facing this weird issue in the iOS 26 betas. The secondary navigation title is also only visible when windowScene.titlebar.titleVisibility is not hidden. Kindly suggest a fix for this issue, as we can't use the secondary view's navigation title to show the supplementary view's data. The issue does not arise with old-style split views, or when the split view is embedded in another split view. Refer to the sample code and attachment here:

let splitView = UISplitViewController(style: .tripleColumn)
splitView.preferredDisplayMode = .twoBesideSecondary
splitView.setViewController(SplitViewChildVc(title: "Primary"), for: .primary)
splitView.setViewController(SplitViewChildVc(title: "Supplementary"), for: .supplementary)
splitView.setViewController(SplitViewChildVc(title: "Secondary"), for: .secondary)

class SplitViewChildVc: UIViewController {
    let viewTitle: String

    init(title: String = "Default") {
        self.viewTitle = title
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.title = viewTitle
        self.navigationItem.title = viewTitle
        if #available(iOS 26.0, *) {
            navigationItem.subtitle = "Subtitle"
        }
        let leftButton = UIBarButtonItem(barButtonSystemItem: .cancel, target: nil, action: nil)
        navigationItem.leftBarButtonItem = leftButton
        let rightButton = UIBarButtonItem(barButtonSystemItem: .add, target: nil, action: nil)
        navigationItem.rightBarButtonItem = rightButton
    }
}
0
0
192
2w
Can individual Apple Developer accounts stream full tracks with MusicKit?
I have implemented fetching Apple Music preview songs using a Swift framework integrated into a Unity app. My requirement is to fetch full tracks from a user's Apple Music library and play them inside Unity. To do this, I understand that I need to handle authentication, generate a Developer Token, and then obtain a Music User Token to access the user's Apple Music content.

Currently, I have an Individual Apple Developer account (not an Organization account). Based on my research, it seems that:
With an Individual account, I can implement this functionality and even upload builds to TestFlight for internal testing.
However, when releasing the app publicly on the App Store, full-track playback may be restricted for Individual accounts and allowed only for Organization accounts.

👉 Can you confirm whether this understanding is correct?
👉 Specifically, is it possible for an Individual account to fetch and play full-length tracks from a subscribed Apple Music user's library (at least for internal/TestFlight testing)?
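For reference, a minimal sketch of the playback flow being described, using MusicKit for Swift (which can manage the developer token and Music User Token for you once MusicKit is enabled for the App ID). Whether full-track playback is gated by account type is the open question above; this code does not settle it.

import MusicKit

// Minimal sketch (iOS 16+): authorize, fetch one song from the user's library,
// and play the full track for a subscribed Apple Music user.
func playFirstLibrarySong() async throws {
    let status = await MusicAuthorization.request()
    guard status == .authorized else { return }

    var request = MusicLibraryRequest<Song>()
    request.limit = 1
    let response = try await request.response()
    guard let song = response.items.first else { return }

    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()
}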
0
0
146
2w
Potential Documentation Error in kAudioAggregateDevicePropertyTapList Sample Code
Hi, I believe I've found a potential error in the sample code on the documentation page for creating and using a process tap with an aggregate device. The issue is in the section explaining how to add a tap to the aggregate device. I have already filed a Feedback Assistant ticket on this (ID: FB17411663) but haven't heard back for months.

Capturing system audio with Core Audio taps

The sample code for modifying kAudioAggregateDevicePropertyTapList incorrectly uses the tapID as the target AudioObjectID when calling AudioObjectSetPropertyData:

// (Code to get the list and potentially modify listAsArray)
if var listAsArray = list as? [CFString] {
    // ... (modification logic) ...

    // Set the list back on the aggregate device. <--- The comment is correct
    list = listAsArray as CFArray
    _ = withUnsafeMutablePointer(to: &list) { list in
        // INCORRECT: This call uses tapID as the target object.
        AudioObjectSetPropertyData(tapID, &propertyAddress, 0, nil, propertySize, list)
    }
}

kAudioAggregateDevicePropertyTapList is a property that belongs to the aggregate device, not the individual tap. Therefore, to set this property, the AudioObjectSetPropertyData call must target the AudioObjectID of the aggregate device itself. Using tapID as the first argument is logically incorrect for this operation and will not update the aggregate device as intended.

Furthermore, the preceding AudioObjectGetPropertyData call that fetches the list also appears to use the incorrect tapID as its target in the sample. The AudioObjectID for both getting and setting this property should be the ID of the aggregate device:

_ = AudioObjectGetPropertyData(aggregateDeviceID, &propertyAddress, 0, nil, &propertySize, &list)
_ = AudioObjectSetPropertyData(aggregateDeviceID, &propertyAddress, 0, nil, propertySize, newList)

Thank you!
0
0
268
2w
Google showing outdated iOS app icon in search results (icon was changed 8 months ago)
We updated the icon for our iOS app over 8 months ago. The new icon has been live in App Store Connect and on the App Store listing since then, and the old icon is no longer included in the build or metadata. However, when searching for the app on Google from an iOS device, the App Store result (on Google) still displays the old icon. Everywhere else (including the App Store itself), the new icon shows correctly.

What we've tried already:
Triple-checked that the old icon is not present in the app build or website
Added MobileApplication schema with the new icon to our website
Waited... (8 months and counting)

It seems this might be related to how App Store metadata is cached externally. Has anyone else run into this, and if so, did you find a way to get Google to refresh the icon it shows in search results?
0
0
81
3w
Programme enrollment
I started the Apple Developer Program enrollment for my organization about 2 months ago. I received an email stating that Apple needed documentation, referring to an email titled "Request for Enrollment Documents" sent last week. However, I never received that email with the actual document request. I have written several times through the online platform in our account asking Apple to resend it, but I have not received any reply for over a month. Instead, every Thursday I only receive the same automated email reminding me to upload the documents. I even called Apple Business Support, but they told me they could not help because the Apple Developer team is separate. How can I reach the correct team to resolve this issue?
0
0
101
2w
Domain Verification Failed for Apple Pay – Tried Everything
I am attempting to verify my domain https://technoq.genesistechnologies.tech for use with an Apple Pay Merchant ID. However, when I attempt verification, the process fails with the message "Domain verification failed." Unfortunately, no additional details are provided.

I have already completed the following steps:
Downloaded the verification file apple-developer-merchantid-domain-association.txt.
Placed it in the .well-known directory as instructed.
Confirmed that it is publicly accessible at: https://technoq.genesistechnologies.tech/.well-known/apple-developer-merchantid-domain-association.txt
Verified that a valid SSL certificate is configured for the domain.

Could you please advise on why the verification might be failing and what additional steps I should take to resolve this issue?
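As a quick self-diagnostic (a sketch using plain URLSession, nothing Apple-specific), the snippet below fetches the file and surfaces two common causes of this failure: redirects and non-200 responses. A CDN or WAF rule that blocks non-browser clients is another frequent culprit worth checking on the server side.

import Foundation

// Quick self-check: fetch the association file and report redirects or a
// non-200 status, which commonly break Apple Pay domain verification.
let url = URL(string: "https://technoq.genesistechnologies.tech/.well-known/apple-developer-merchantid-domain-association.txt")!

final class NoRedirectDelegate: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession, task: URLSessionTask,
                    willPerformHTTPRedirection response: HTTPURLResponse,
                    newRequest request: URLRequest,
                    completionHandler: @escaping (URLRequest?) -> Void) {
        print("Redirected to \(request.url?.absoluteString ?? "?"); Apple's verifier may not follow this.")
        completionHandler(nil) // stop at the first redirect so it is visible
    }
}

let session = URLSession(configuration: .ephemeral, delegate: NoRedirectDelegate(), delegateQueue: nil)
let done = DispatchSemaphore(value: 0)
session.dataTask(with: url) { data, response, error in
    if let error { print("Request failed: \(error)") }
    if let http = response as? HTTPURLResponse { print("Status: \(http.statusCode)") }
    if let data { print("Fetched \(data.count) bytes") }
    done.signal()
}.resume()
done.wait()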
0
0
73
4w
Thai input glitch after Command + Delete in macOS beta
Issue: When using the shortcut Command + Delete to clear a line of text, the next character I type in Thai unexpectedly appears as an English character, even though the input source is still set to Thai. After that, subsequent characters return to Thai as expected.

Details:
Affected apps: Notes, Messages, and some other native apps
Not affected: browser text fields (Safari, Chrome, etc.)
Does not occur when using Option + Delete or just Delete
macOS [insert beta version + build number]
Mac model: [insert model]
Input sources: Thai – Kedmanee, English – U.S.

Steps to reproduce:
1. Open Notes (or Messages).
2. Switch to Thai input.
3. Type a few Thai words.
4. Press Command + Delete.
5. Type again — the first character shows up in English.

Expected: The first character should remain in Thai, consistent with the active input source.
Actual: The first character shows as English, then input switches back to Thai.
0
0
763
3w
CarPlay: CPListTemplate item limit and image memory usage
I’m working with CPListTemplate in CarPlay and have run into two issues:

Item limit: The documentation states that maximumItemCount is 500. In practice, when providing a list of ~2–4k items, only the first 500 are displayed. However, Apple Music on CarPlay seems to handle larger lists without this limitation. Is there an API-level approach or recommended pattern to support lists beyond this cap?

Image memory usage: Cells don’t appear to load lazily. Even with small images, the first 500 items load all their artwork into memory immediately, resulting in ~400–700 MB usage and high CPU load. This seems excessive for CarPlay environments. Is there a best practice for deferring or managing image loading within CPListTemplate?

Any official guidance or known workarounds for these two issues would be very helpful.
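On the artwork side, one pattern worth trying (a sketch that assumes artwork can be fetched asynchronously by your own loader, here a hypothetical loadArtwork function): create each CPListItem without an image, downscale artwork to CPListItem.maximumImageSize, and assign it via setImage(_:) only once it has loaded, so the template is not holding hundreds of full-size images up front.

import CarPlay
import UIKit

// Hypothetical async artwork loader; substitute your own caching loader.
func loadArtwork(for mediaID: String) async -> UIImage? { nil }

func makeItem(title: String, mediaID: String) -> CPListItem {
    // Create the item without an image so nothing is decoded up front.
    let item = CPListItem(text: title, detailText: nil)

    Task {
        guard let image = await loadArtwork(for: mediaID) else { return }

        // Downscale to the size CarPlay actually displays before handing it over.
        let targetSize = CPListItem.maximumImageSize
        let scaled = UIGraphicsImageRenderer(size: targetSize).image { _ in
            image.draw(in: CGRect(origin: .zero, size: targetSize))
        }

        await MainActor.run { item.setImage(scaled) }
    }
    return item
}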
0
0
82
2w
iMessage not connecting on iOS 26
Since installing the iOS 26 beta, I can no longer connect to iMessage using my phone number. It connects using email addresses, but when trying to connect with my phone number, it just gives me the buffering spinner and never connects. I am on beta 4 and still no joy. Help! I have become a green bubbler!!
0
0
237
3w
Broadcasts and Multicasts, Hints and Tips
For important background information, read Extra-ordinary Networking before reading this.

Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

Broadcasts and Multicasts, Hints and Tips

I regularly see folks struggle with broadcasts and multicasts on Apple platforms. This post is my attempt to clear up some of the confusion.

This post covers both IPv4 and IPv6. There is, however, a key difference. In IPv4, broadcasts and multicasts are distinct concepts. In contrast, IPv6 doesn’t support broadcast as such; rather, it treats broadcasts as a special case of multicasts. IPv6 does have an all-nodes multicast address, but it’s rarely used.

Before reading this post, I suggest you familiarise yourself with IP addresses in general. A good place to start is The Fount of All Knowledge™.

Service Discovery

A lot of broadcast and multicast questions come from folks implementing their own service discovery protocol. I generally recommend against doing that, for the reasons outlined in the Service Discovery section of Don’t Try to Get the Device’s IP Address.

There are, however, some good reasons to implement a custom service discovery protocol. For example, you might be working with an accessory that only supports this custom protocol [1]. If you must implement your own service discovery protocol, read this post and also read the advice in Don’t Try to Get the Device’s IP Address.

IMPORTANT Sometimes I see folks implementing their own version of mDNS. This is almost always a mistake:

If you’re using third-party tooling that includes its own mDNS implementation, it’s likely that this tooling allows you to disable that implementation and instead rely on the Bonjour support that’s built in to all Apple platforms.

If you’re doing some weird low-level thing with mDNS or DNS-SD, it’s likely that you can do that with the low-level DNS-SD API.

[1] And whose firmware you can’t change! I talk more about this in Working with a Wi-Fi Accessory.

API Choice

Broadcasts and multicasts typically use UDP [1]. TN3151 Choosing the right networking API describes two recommended UDP APIs:

Network framework

BSD Sockets

Our general advice is to prefer Network framework over BSD Sockets, but UDP broadcasts and multicasts are an exception to that rule. Network framework has very limited UDP broadcast support. And while its support for UDP multicasts is less limited, it’s still not sufficient for all UDP applications. In cases where Network framework is not sufficient, BSD Sockets is your only option.

[1] It is possible to broadcast and multicast at the Ethernet level, but I almost never see questions about that.

UDP Broadcasts in Network Framework

Historically I’ve claimed that Network framework was useful for UDP broadcasts in very limited circumstances (for example, in the footnote on this post). I’ve since learnt that this isn’t the case. Or, more accurately, this support is so limited (r. 122924701) as to be useless in practice. For the moment, if you want to work with UDP broadcasts, your only option is BSD Sockets.

UDP Multicasts in Network Framework

Network framework supports UDP multicast using the NWConnectionGroup class with the NWMulticastGroup group descriptor. This support has limits. The most significant limit is that it doesn’t support broadcasts; it’s for multicasts only.

Note This is only relevant to IPv4. Remember that IPv6 doesn’t support broadcasts as a separate concept.

There are other limitations, but I don’t have a good feel for them. I’ll update this post as I encounter issues.
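To make the NWConnectionGroup route concrete, here is a minimal sketch (my own illustration, not from the original post) of joining a multicast group, receiving datagrams, and sending one to the group; the group address and port are arbitrary examples.

import Foundation
import Network

// Arbitrary example group and port; use values appropriate for your protocol.
let groupDescriptor = try NWMulticastGroup(for: [
    .hostPort(host: "239.255.0.1", port: 5555)
])

let group = NWConnectionGroup(with: groupDescriptor, using: .udp)

group.setReceiveHandler(maximumMessageSize: 1500, rejectOversizedMessages: true) { message, content, isComplete in
    print("Received \(content?.count ?? 0) bytes from \(String(describing: message.remoteEndpoint))")
}

group.stateUpdateHandler = { state in
    print("Group state: \(state)")
    if case .ready = state {
        // Send a datagram to every member of the group.
        group.send(content: Data("hello".utf8)) { error in
            print("Send complete, error: \(String(describing: error))")
        }
    }
}

group.start(queue: .main)

// Keep the tool alive so the receive handler can fire.
dispatchMain()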
Local Network Privacy

Some Apple platforms support local network privacy. This impacts broadcasts and multicasts in two ways:

Broadcasts and multicasts require local network access, something that’s typically granted by the user.

Broadcasts and multicasts are limited by a managed entitlement (except on macOS).

TN3179 Understanding local network privacy has lots of additional info on this topic, including the list of platforms to which it applies.

Send, Receive, and Interfaces

When you broadcast or multicast, there’s a fundamental asymmetry between send and receive:

You can reasonably receive datagrams on all broadcast-capable interfaces.

But when you send a datagram, it has to target a specific interface.

The sending behaviour is the source of many weird problems. Consider the IPv4 case. If you send a directed broadcast, you can reasonably assume it’ll be routed to the correct interface based on the network prefix. But folks commonly send an all-hosts broadcast (255.255.255.255), and it’s not obvious what happens in that case.

Note If you’re unfamiliar with the terms directed broadcast and all-hosts broadcast, see IP address.

The exact rules for this are complex, vary by platform, and can change over time. For that reason, it’s best to write your broadcast code to be interface specific. That is:

Identify the interfaces on which you want to work.

Create a socket per interface.

Bind that socket to that interface.

Note Use the IP_BOUND_IF (IPv4) or IPV6_BOUND_IF (IPv6) socket options rather than binding to the interface address, because the interface address can change over time.

Extra-ordinary Networking has links to other posts which discuss these concepts and the specific APIs in more detail.

Miscellaneous Gotchas

A common cause of mysterious broadcast and multicast problems is folks who hard code BSD interface names, like en0. Doing that might work for the vast majority of users but then fail in some obscure scenarios. BSD interface names are not considered API and you must not hard code them. Extra-ordinary Networking has links to posts that describe how to enumerate the interface list and identify interfaces of a specific type.

Don’t assume that there’ll be only one interface of a given type. This might seem obviously true, but it’s not. For example, our platforms support peer-to-peer Wi-Fi, so each device has multiple Wi-Fi interfaces.

When sending a broadcast, don’t forget to enable the SO_BROADCAST socket option.

If you’re building a sandboxed app on the Mac, working with UDP requires both the com.apple.security.network.client and com.apple.security.network.server entitlements.

Some folks reach for broadcasts or multicasts because they’re sending the same content to multiple devices and they believe that it’ll be faster than unicasts. That’s not true in many cases, especially on Wi-Fi. For more on this, see the Broadcasts section of Wi-Fi Fundamentals.

Snippets

To send a UDP broadcast:

func broadcast(message: Data, to interfaceName: String) throws {
    let fd = try FileDescriptor.socket(AF_INET, SOCK_DGRAM, 0)
    defer { try! fd.close() }
    try fd.setSocketOption(SOL_SOCKET, SO_BROADCAST, 1 as CInt)
    let interfaceIndex = if_nametoindex(interfaceName)
    guard interfaceIndex > 0 else { throw … }
    try fd.setSocketOption(IPPROTO_IP, IP_BOUND_IF, interfaceIndex)
    try fd.send(data: message, to: ("255.255.255.255", 2222))
}

Note This snippet uses the helpers from Calling BSD Sockets from Swift.
To receive UDP broadcasts:

func receiveBroadcasts(from interfaceName: String) throws {
    let fd = try FileDescriptor.socket(AF_INET, SOCK_DGRAM, 0)
    defer { try! fd.close() }
    let interfaceIndex = if_nametoindex(interfaceName)
    guard interfaceIndex > 0 else { fatalError() }
    try fd.setSocketOption(IPPROTO_IP, IP_BOUND_IF, interfaceIndex)
    try fd.setSocketOption(SOL_SOCKET, SO_REUSEADDR, 1 as CInt)
    try fd.setSocketOption(SOL_SOCKET, SO_REUSEPORT, 1 as CInt)
    try fd.bind("0.0.0.0", 2222)
    while true {
        let (data, (sender, port)) = try fd.receiveFrom()
        …
    }
}

IMPORTANT This code runs synchronously, which is less than ideal. In a real app you’d run the receive asynchronously, for example, using a Dispatch read source. For an example of how to do that, see this post.

If you need similar snippets for multicast, lemme know. I’ve got them lurking on my hard disk somewhere (-:

Other Resources

Apple’s official documentation for BSD Sockets is in the man pages. See Reading UNIX Manual Pages. Of particular interest are:

setsockopt man page

ip man page

ip6 man page

If you’re not familiar with BSD Sockets, I strongly recommend that you consult third-party documentation for it. BSD Sockets is one of those APIs that looks simple but, in reality, is ridiculously complicated. That’s especially true if you’re trying to write code that works on BSD-based platforms, like all of Apple’s platforms, and non-BSD-based platforms, like Linux. I specifically recommend UNIX Network Programming, by Stevens et al, but there are lots of good alternatives. https://unpbook.com

Revision History

2025-09-01 Fixed a broken link.

2025-01-16 First posted.
0
0
473
2w