Search results for

İOS 26 beta battery %1

253,795 results found

Post · Replies · Boosts · Views · Activity

Reply to Can an IP address manually be entered into Xcode to wirelessly connect to an iOS device?
There are two main network-level requirements for device connectivity with Xcode:

- The network supports Bonjour
- The network supports IPv6 with link-local traffic

It's likely that the network where you can't connect the devices to Xcode is missing one of those requirements, and you need to work with the network owner to address those configurations. If the network owner here is a corporate IT department, the link above to TN3158 tackles a particular slice of these configuration requirements for some VPN and security configurations, so your IT organization may get some value from reading through that document, though there are many other types of IT configurations that fall beyond the scope of that document. There isn't a way to configure a device connection with just an IP address, with either IPv4 or IPv6. You're correct that this was possible a long time ago, but the inner workings of how Xcode and paired iOS devices communicate have evolved significantly since that time. — Ed Ford, DTS Engineer
5d
When trying to run SwiftUI previews, it fails with "SimulatorShutdownUnexpectedlyError - Simulator was shutdown during an update"
My main app target builds fine and can run on the Simulator without issue. Whenever I try to run a Preview, I get this error:

== DATE: Monday, November 3, 2025 at 2:52:23 PM Pacific Standard Time 2025-11-03T22:52:23Z
== PREVIEW UPDATE ERROR:
SimulatorShutdownUnexpectedlyError: Simulator was shutdown during an update
Simulator [F85A5AF1-F52C-4662-AFCD-762F87AF537D] failed to boot and may have crashed.

This seems to have started happening after updating to macOS 26. I've tried reinstalling all Simulators, tried Xcode 26, deleted derived data, and restarted Xcode and my Mac several times. What other troubleshooting steps can I take?
2 replies · 0 boosts · 40 views · 5d
Reply to Matter Media Playback Cluster
First off, I need to clarify the vocabulary here a little bit: "Apple-specific Controller (HomePod or Apple TV)". In terms of the Matter specification and my own vocabulary, an ecosystem controller is any component (software or hardware) that maintains its own set of Matter ecosystem credentials, allowing it to directly communicate with Matter accessories. Most ecosystem vendors sell some kind of external hardware/accessory, but nothing about Matter actually requires that. Home Hub is the name HomeKit uses for a device located in the home that HomeKit can use to centralize the routing of commands to accessories. See this support article for more background. HomeKit does require a Home Hub to support Matter; however, this is best understood as a practical engineering choice on our part and not a fundamental limitation of Matter. "I had an additional clarification request. Does using Apple's Home exposition of the Matter API require using an Apple-specific Controller (HomePod or Apple TV), or can any type
5d
Reply to Mac dictation repeating sentences
I tried to post the same question in the Apple Support Community and was sent this reply: "Thanks for participating in the Apple Support Community. We removed your post 'OS 26 dictation repeating sentences...' because it contained information about beta software. To comment or ask questions about beta software, go to our Apple Developer Forums page: https://developer.apple.com/forums/ Here you can share your 'OS 26 dictation repeating sentences...' post, make comments, and ask questions."
5d
NSOutlineView incorrectly draws disclosure indicator when item views are SwiftUI views.
I am using an NSOutlineView via NSViewRepresentable in a SwiftUI application running on macOS. Everything has been working fine. Until recently, I've been returning a custom NSView for each item using the standard:

func outlineView(_ outlineView: NSOutlineView, viewFor tableColumn: NSTableColumn?, item: Any) -> NSView? {
    // View recycling omitted.
    return MyItemView(item)
}

Now I want to explore using a little bit more SwiftUI by returning an NSHostingView from this delegate method:

func outlineView(_ outlineView: NSOutlineView, viewFor tableColumn: NSTableColumn?, item: Any) -> NSView? {
    // View recycling omitted.
    let rootView = MySwiftUIView(item)
    let hostingView = NSHostingView(rootView: rootView)
    return hostingView
}

For the most part, this appears to be working fine. NSOutlineView is even correctly applying highlight styling, so that's great. But there's one small glitch: the outline view's disclosure triangles do not align with the hosting view's content. The disclosure triangles appear
1 reply · 0 boosts · 99 views · 5d
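As a complement to the post above, here is a minimal sketch of the hosting-view approach. The `ItemRow` view is a hypothetical stand-in for `MySwiftUIView`, and the `sizingOptions` line (macOS 13+) is an avenue to experiment with so AppKit can derive row metrics from the SwiftUI content's intrinsic size; it is not a confirmed fix for the disclosure-triangle misalignment.

```swift
import AppKit
import SwiftUI

// Hypothetical row content; stands in for MySwiftUIView(item) in the post.
struct ItemRow: View {
    let title: String
    var body: some View {
        Text(title).padding(.vertical, 2)
    }
}

func outlineView(_ outlineView: NSOutlineView,
                 viewFor tableColumn: NSTableColumn?,
                 item: Any) -> NSView? {
    // View recycling omitted, as in the post.
    let hostingView = NSHostingView(rootView: ItemRow(title: String(describing: item)))
    // Constrain sizing to the intrinsic content size so the outline view can
    // compute row geometry; whether this affects disclosure-triangle
    // alignment is something to verify, not a documented fix.
    hostingView.sizingOptions = [.intrinsicContentSize]
    return hostingView
}
```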
Reply to [DEXT Migration Issue] IOUserSCSIParallelInterfaceController fails to handle low-level I/O from `diskutil`
"We conducted a test in our legacy KEXT environment, and the results indicate a behavioral difference between the KEXT and DriverKit frameworks in handling I/O requests."

So, the first thing to understand is that IOKit and DriverKit are NOT fundamentally different/separate technologies. The best way to understand DriverKit is that it's implemented as a very specialized user client built on top of our existing IOKit infrastructure. Note that this dynamic is quite direct. For example, a DEXT's IOKitPersonalities dictionary doesn't just look like an IOKit matching dictionary, it IS an IOKit matching dictionary. How DEXT loading/matching actually works is:

1. The matching dictionary is added into the kernel's KEXT matching set just like any KEXT's would be.
2. When hardware is attached, that matching dictionary is used to match and load an in-kernel driver in EXACTLY the same way ANY other KEXT would be.
3. Once the kernel driver finishes loading, the system then uses the DEXT keys inside that matching dictionary to create your DEXT
Topic: App & System Services · SubTopic: Drivers
5d
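To make the "it IS an IOKit matching dictionary" point concrete, here is a hedged sketch of what a DEXT's IOKitPersonalities entry can look like. The personality name, bundle identifier, and IOProviderClass are placeholders, not values from the thread; the standard IOKit keys drive in-kernel matching, while the IOUser* keys tell the system how to launch the dext.

```xml
<key>IOKitPersonalities</key>
<dict>
    <key>ExampleDriver</key>
    <dict>
        <!-- Standard IOKit matching keys, exactly as a KEXT would use them -->
        <key>IOProviderClass</key>
        <string>IOPCIDevice</string>
        <key>IOClass</key>
        <string>IOUserService</string>
        <!-- DEXT-specific keys the system uses to create the dext
             once the in-kernel stub has matched and loaded -->
        <key>IOUserClass</key>
        <string>ExampleDriver</string>
        <key>IOUserServerName</key>
        <string>com.example.ExampleDriver</string>
        <key>CFBundleIdentifier</key>
        <string>com.example.ExampleDriver</string>
    </dict>
</dict>
```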
Reply to No mic capture on iOS 18.5
We discovered the iOS version is not the culprit. All iPhone models from the 14 series onward are affected by this problem, even on iOS 17. It must be some glitched interaction between hardware and a software feature. Affected models do not produce any audio, and magnitude logging from the C++ engine doesn't produce anything.

static OSStatus mixerPlaybackCallack(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData) {
    AudioEngine *self = (AudioEngine *)inRefCon;
    uint32_t toget = inNumberFrames << 2;
    //printf("mixerPlaybackCallack: bus: %u, frames: %u\n", (uint32_t)inBusNumber, (uint32_t)inNumberFrames);
    if (inBusNumber < kMaxRemoteLines) {
        auto& audiobuffer = self->m_remoteBuffers[inBusNumber];
        int32_t len = 0;
        if (uint8_t *buf = audiobuffer.tail(len)) {
            int32_t tocopy = len > toget ? toget : len;
            memcpy(ioData->mBuffers[0].mData, buf, tocopy);
            audiobuffer.consume(tocopy);
            if
5d
iOS 26 Full Keyboard Access (navigation) and WKWebView
We use an embedded WKWebView for several screens in our app. Recently, we have been testing keyboard navigation via Full Keyboard Access in our apps. On iOS 18, everything works pretty much as expected. On iOS 26, it does not: you can tab away from the webview and then never tab back to it for keyboard navigation. Is this a known issue? Are there workarounds for this issue that anyone is aware of?
2 replies · 0 boosts · 272 views · 5d
Is Anyone Else Stuck in “Waiting for Review” for 1 Month? This Is Getting Ridiculous.
I’m genuinely on the edge of losing my sanity right now. My apps have been stuck in “Waiting for Review” for almost a MONTH, yes, 30 days, and absolutely nothing is happening. At this point, I’m not even sure if my apps are in a review queue or a black hole. Here’s my situation:

- 2 apps submitted, both still “Waiting for Review”
- 30 days of zero movement
- Contacted support multiple times, replies that lead nowhere
- Expedited Review request APPROVED 4+ days ago… and guess what? Still no movement. Not even a “Hi, we’re looking at it.”

I feel like I’m talking to a wall. This is starting to feel like a glitch in the Matrix. This is seriously impacting my launch, marketing, business timeline, and user acquisition strategy. I love Apple, I love the platform, but this silence and lack of transparency are painful for developers. What we need is very simple: a realistic timeline. 1 week? 2 weeks? A month? If expedited review is approved, why is nothing happening? If there’s an issue, just tell me so I can fix it inst
1 reply · 0 boosts · 110 views · 5d
My Apps Have Been “Waiting for Review” for 1 Month – No Response, No Progress
I’m honestly about to lose my mind at this point. My apps have been stuck in “Waiting for Review” for almost a month, and I still have zero progress, zero response, and zero clarity on what’s going on. Here are the facts:

- Both of my apps were submitted for review nearly 30 days ago
- Status: still “Waiting for Review” with no movement at all
- I contacted Apple Support multiple times — no useful feedback was provided
- One of my expedited review requests was approved 4+ days ago, yet the status of the app hasn’t changed whatsoever

This situation is extremely frustrating. I’m a developer trying to run a business, and these unexplained delays are causing serious damage to my release timeline, marketing schedule, and user acquisition plans. I genuinely don’t understand why things are stuck for this long. All I’m asking for is basic transparency:

- Why is the review not starting?
- Why is the expedite request approved but not reflected in the review queue?
- Is there a hidden issue blocking my apps that I am not bei
1 reply · 0 boosts · 126 views · 5d
Reply to UITextField selects all text on focus when the content is long — how to keep the caret at the end?
I've tried your code with my iOS 26.x device + Xcode 26.1 beta 2 (17B5035f) and can't reproduce the behavior. Have you tried with the latest system and tool versions? I am curious if you still see the behavior there. I'd say changing the selectedTextRange of the text field is the right way to go, but the system can indeed override the result, so you might try delaying your change a bit. The following code makes the change 100 ms later:

func textFieldDidBeginEditing(_ textField: UITextField) {
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
        let end = textField.endOfDocument
        textField.selectedTextRange = textField.textRange(from: end, to: end)
    }
}

Best,
—— Ziqiao Chen
Worldwide Developer Relations.
Topic: UI Frameworks · SubTopic: UIKit
5d
Happy Eyeballs cancels also-ran only after WebSocket handshake (duplicate WS sessions)
Hi everyone 👋 When using NWConnection with NWProtocolWebSocket, I’ve noticed that Happy Eyeballs cancels the losing connection only after the WebSocket handshake completes on the winning path. As a result, both the IPv4 and IPv6 attempts can send the GET / Upgrade request in parallel, which may cause duplicate WebSocket sessions on the server.

Standards context: RFC 8305 §6 (Happy Eyeballs v2) states: “Once one of the connection attempts succeeds (generally when the TCP handshake completes), all other connection attempts that have not yet succeeded SHOULD be canceled.” This “SHOULD” is intentionally non-mandatory — implementations may reasonably delay cancellation to account for additional factors (e.g. TLS success or ALPN negotiation). So Network.framework’s current behavior — canceling after the WebSocket handshake — is technically valid, but it can have practical side effects at the application lay
1 reply · 0 boosts · 38 views · 5d
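For readers unfamiliar with the setup described above, here is a minimal sketch of a WebSocket NWConnection where both address families may race. The endpoint URL is a placeholder, and the client-generated session header is one mitigation idea (server-side deduplication of the two upgrade requests), not an API-level fix for the cancellation timing.

```swift
import Foundation
import Network

// Placeholder endpoint; both A and AAAA answers may be raced by Happy Eyeballs.
let endpoint = NWEndpoint.url(URL(string: "wss://example.com/socket")!)

let parameters = NWParameters.tls
let wsOptions = NWProtocolWebSocket.Options()
// Mitigation idea: carry a client-generated ID so the server can deduplicate
// sessions if two upgrade requests arrive from the racing attempts.
wsOptions.setAdditionalHeaders([(name: "X-Client-Session", value: UUID().uuidString)])
parameters.defaultProtocolStack.applicationProtocols.insert(wsOptions, at: 0)

let connection = NWConnection(to: endpoint, using: parameters)
connection.stateUpdateHandler = { state in
    if case .ready = state {
        // Only here has the winning path completed the WS handshake; the
        // losing attempt may already have sent its own upgrade request.
        print("connected: \(String(describing: connection.currentPath?.remoteEndpoint))")
    }
}
connection.start(queue: .main)
```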