Search results for

DTiPhoneSimulatorErrorDomain Code 2

158,670 results found


recent JAX versions fail on Metal
Hi, I'm not sure whether this is the appropriate forum for this topic; I just followed a link from the JAX Metal plugin page https://developer.apple.com/metal/jax/. I'm writing a Python app with JAX, and recent JAX versions (e.g. v0.8.2) fail on Metal. I have to downgrade JAX quite far to make it work: pip install jax==0.4.35 jaxlib==0.4.35 jax-metal==0.1.1. Can we get an updated release of jax-metal that would fix this issue? Here is the error I get with JAX v0.8.2:
WARNING:2025-12-26 09:55:28,117:jax._src.xla_bridge:881: Platform 'METAL' is experimental and not all JAX functionality may be correctly supported!
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
W0000 00:00:1766771728.118004 207582 mps_client.cc:510] WARNING: JAX Apple GPU support is experimental and not all JAX functionality is correctly supported!
Metal device set to: Apple M3 Max
systemMemory: 36.00 GB
maxCacheSize: 13.50 GB
I0000 00:00:1766771728.129886 207582 service.cc:145] XLA service 0x600001fad300 in
Replies: 0 · Boosts: 0 · Views: 406 · Activity: 2w
Too many bugs porting to iOS 26
Has anyone noticed that there are way too many bugs caused by Liquid Glass? I'm noticing Apple adding private layer classes between existing UI views to add the glassy effect, with no way to disable them. Some of these bugs have no workaround without rewriting the code from scratch. For example, the search controller doesn't work anymore on iOS 26 but does work on iOS 18. The tab bar controller doesn't show at all on iOS 26 but does on iOS 18. I'm wondering if other people are facing the same issues.
Topic: UI Frameworks SubTopic: General
Replies: 0 · Boosts: 0 · Views: 338 · Activity: 2w
Pre-inference AI Safety Governor for FoundationModels (Swift, On-Device)
Greetings, and Happy Holidays. I've been building an on-device AI safety layer called Newton Engine, designed to validate prompts before they reach FoundationModels (or any LLM). Wanted to share v1.3 and get feedback from the community.
The Problem: Current AI safety is post-training: baked into the model, probabilistic, not auditable. When Apple Intelligence ships with FoundationModels, developers will need a way to catch unsafe prompts before inference, with deterministic results they can log and explain.
What Newton Does: Newton validates every prompt pre-inference and returns a Phase (0/1/7/8/9), a shape classification, a confidence score, and a full audit trace. If validation fails, generation is blocked. If it passes (Phase 9), the prompt proceeds to the model.
v1.3 Detection Categories (14 total): Jailbreak / prompt injection; Corrosive self-negation (I hate myself); Hedged corrosive (Not saying I'm worthless, but...); Emotional dependency (You're the only one who understands); Third-person manipulation (If you refuse, you
Replies: 1 · Boosts: 0 · Views: 257 · Activity: 2w
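For readers unfamiliar with the gating pattern the post describes, here is a minimal, hypothetical Swift sketch of a pre-inference check: the type names (ValidationResult, PromptValidator, SafetyGateError) are placeholders rather than Newton Engine's actual API, and the FoundationModels call is left abstract as a closure.

import Foundation

// Hypothetical result shape mirroring what the post describes:
// a phase, a shape classification, a confidence score, and an audit trace.
struct ValidationResult {
    let phase: Int            // 0 / 1 / 7 / 8 / 9
    let shape: String
    let confidence: Double
    let auditTrace: [String]
    var passed: Bool { phase == 9 }
}

// Placeholder protocol; the real validator's interface may differ.
protocol PromptValidator {
    func validate(_ prompt: String) -> ValidationResult
}

enum SafetyGateError: Error { case blocked(ValidationResult) }

// Pre-inference gate: validate first, and only then hand the prompt to the model.
func respondSafely(
    to prompt: String,
    validator: PromptValidator,
    generate: (String) async throws -> String   // e.g. a FoundationModels session call
) async throws -> String {
    let result = validator.validate(prompt)
    guard result.passed else {
        // Deterministic, loggable refusal: generation never starts.
        print("Blocked at phase \(result.phase): \(result.auditTrace)")
        throw SafetyGateError.blocked(result)
    }
    return try await generate(prompt)
}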
iOS 26 + UINavigationBar crash in layoutSubviews - new observation tracking
My #1 crash report since iOS 26 is the following:
1 libobjc.A.dylib objc_exception_throw
2 Foundation -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
3 UIKitCore -[UINavigationBar layoutSubviews]
4 UIKitCore UIView._layoutSubviewsWithObservationTracking()
5 UIKitCore @objc UIView._layoutSubviewsWithObservationTracking()
Thread 0
9 libobjc.A.dylib objc_exception_throw
10 Foundation -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
11 UIKitCore -[UINavigationBar layoutSubviews]
12 UIKitCore UIView._layoutSubviewsWithObservationTracking()
13 UIKitCore @objc UIView._layoutSubviewsWithObservationTracking()
I tried many things, without success so far. It seems related to the new view updates with observation tracking: link to documentation. Any lead on how I should investigate this? Many thanks in advance!
Topic: UI Frameworks SubTopic: UIKit
Replies: 1 · Boosts: 0 · Views: 300 · Activity: 2w
SoundAnalysis built-in classifier fails in background (SNErrorCode.operationFailed)
I'm seeing consistent failures using SoundAnalysis live classification when my app moves to the background.
Setup: iOS 17.x; AVAudioEngine mic capture; SNAudioStreamAnalyzer; SNClassifySoundRequest(classifierIdentifier: .version1); UIBackgroundModes = audio; AVAudioSession .record / .playAndRecord, active. Audio capture + level metering continue working in the background (mic indicator stays on).
Issue: As soon as the app enters the background / the screen locks, SoundAnalysis starts failing every second with domain com.apple.SoundAnalysis, code 2 (SNErrorCode.operationFailed). Audio capture itself continues normally. When the app returns to the foreground, classification immediately resumes without restarting the engine/analyzer.
Question: Is live background sound classification with the built-in SoundAnalysis classifier officially unsupported or known to fail in the background? If so, is a custom Core ML model the only supported approach for background detection? Or is there a required configuration I'm missing to keep SNC
Replies: 0 · Boosts: 0 · Views: 103 · Activity: 2w
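For context, here is a minimal Swift sketch of the setup the post describes (a stream analyzer fed from an AVAudioEngine tap, using the built-in version1 classifier). The buffer size, session options, and class names are illustrative choices, and the sketch does not by itself answer the background question.

import AVFoundation
import SoundAnalysis

// Receives classification results; must be retained for the analyzer's lifetime.
final class ResultsObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
    func request(_ request: SNRequest, didFailWithError error: Error) {
        // In the scenario above, this reportedly fires every second in the
        // background with com.apple.SoundAnalysis code 2 (.operationFailed).
        print("Sound analysis failed: \(error)")
    }
}

final class LiveClassifier {
    private let engine = AVAudioEngine()
    private let observer = ResultsObserver()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        // Record-capable session; UIBackgroundModes = audio is set in Info.plist.
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.record, mode: .default)
        try session.setActive(true)

        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: observer)
        self.analyzer = analyzer

        // Feed mic buffers into the analyzer as they arrive.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }
}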
I have a question about Safari running in the background on iOS and iPadOS.
I coded two demo websites as follows (both written in NextJS):
Website 1: an interval counter that increments every 1 second.
Website 2: uses the MediaRecorder API (described in the WebKit documentation: https://webkit.org/blog/11353/mediarecorder-api/). In the ondataavailable callback, I periodically send a blob (once every second) to my server; in the backend, I coded a POST API to upload this blob.
I noticed that with website 1, the interval doesn't fire when Safari is in the background, on both iOS and iPadOS. However, website 2 works, meaning it still calls my API normally (I tried running Safari in the background for about 1-2 hours and it kept working). So, does this mean Apple allows native APIs like MediaRecorder and their callbacks to run in the background?
Replies: 0 · Boosts: 0 · Views: 346 · Activity: 2w
Apple account stuck in pending status after migrating from an individual developer account to a company account
The situation is as follows: we upgraded from an individual developer account to a company account about 2–3 weeks ago. When it came time to renew the account, it was strange that the fee displayed was $99 instead of $299. We had no way to change this, so we proceeded with the $99 payment. After that, Apple introduced new terms and conditions. When we tried to accept them, the account status did not change at all. I clicked "Accept" many times, but it remained in pending status. When I checked the browser console (F12), I noticed that Apple was returning a 500 error, but this error was not shown anywhere in the UI. I have called and emailed Apple support many times. However, they insist that our company account is already active and that we just need to accept the terms and conditions. Unfortunately, we are unable to accept them due to the error returned by Apple's system. I explained this to them, but they still insist that we simply follow their instructions. As a result, we are currently
Replies: 1 · Boosts: 0 · Views: 172 · Activity: 2w
[Matter] Device cannot be commissioned to Google Home through iOS
Hi, we are facing an issue where commissioning our Matter device to Google Home through an iOS device fails 100% of the time. Here is our test summary regarding the issue:
TestCase1 [OK]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 with an Android device (see log DoorWindow_2.0.1_Google_Success.txt)
TestCase2 [NG]: Commissioning the Matter 1.4.0 device to Google Nest Hub 2 with an iPhone 13 or iPhone 16 (see log DoorWindow_2.0.1_Google_by_iOS_NG.txt)
TestCase3 [OK]: Commissioning our Matter 1.3.0 device to Google Nest Hub 2 with an iPhone 13
In TestCase2, we noticed that the device was first commissioned to iOS (Apple keychain), and then iOS opened a commissioning window again to commission it into Google's ecosystem; the device failed at step 2 above. So we also tried: commissioning the device to Apple Home works as expected, but then sharing the device to the Google Home app on iOS also fails. Commissioning the device to Apple Home works as expected, next share the device to Google Home app o
Replies: 1 · Boosts: 0 · Views: 107 · Activity: 2w
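For reference, a rough Swift sketch of how a third-party ecosystem app on iOS starts Matter commissioning via the MatterSupport framework, which is the flow described above (the accessory joins the iOS fabric first, then is handed to the ecosystem through the app's Matter extension). The ecosystem and home names are placeholders, and this is not the Google Home app's actual code.

import MatterSupport

// Kicks off the system commissioning flow from an ecosystem app.
// iOS commissions the accessory onto its own fabric first (the "Apple
// keychain" step mentioned above) before handing credentials to the
// ecosystem, which is the stage the post reports failing.
func commissionAccessory() async {
    let home = MatterAddDeviceRequest.Home(displayName: "Living Room")
    let topology = MatterAddDeviceRequest.Topology(ecosystemName: "Example Ecosystem",
                                                   homes: [home])
    let request = MatterAddDeviceRequest(topology: topology)
    do {
        try await request.perform()   // presents the system commissioning UI
        print("Commissioning flow completed")
    } catch {
        print("Commissioning failed: \(error)")
    }
}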
[Matter] Device cannot be commissioned to Google Home through iOS
Hi, we are facing an issue where commissioning our Matter device to Google Home through an iOS device fails 100% of the time. Here is our test summary regarding the issue:
TestCase1 [OK]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 with an Android device (see log DoorWindow_2.0.1_Google_Success.txt)
TestCase2 [NG]: Commissioning the Matter 1.4.0 device to Google Nest Hub 2 with an iPhone 13 or iPhone 16 (see log DoorWindow_2.0.1_Google_by_iOS_NG.txt)
TestCase3 [OK]: Commissioning our Matter 1.3.0 device to Google Nest Hub 2 with an iPhone 13
In TestCase2, we noticed that the device was first commissioned to iOS (Apple keychain), and then iOS opened a commissioning window again to commission it into Google's ecosystem; the device failed at step 2 above. So we also tried: commissioning the device to Apple Home works as expected, but then sharing the device to the Google Home app on iOS also fails. Commissioning the device to Apple Home works as expected, next share the device to Google Home app o
Replies: 2 · Boosts: 0 · Views: 108 · Activity: 2w
issue with iconv() on macOS using "WCHAR_T//TRANSLIT"
Hello, I am working on a cross-platform application that uses libiconv to convert strings to/from Unicode. I need to modify the existing code for compatibility with macOS. However, the call to iconv() fails with an unclear errno value (92) when using WCHAR_T:
std::wstring ConvertToWchar(const std::string& iconvCodeSet, const std::string_view str) {
    iconv_t conv = iconv_open("WCHAR_T//TRANSLIT", iconvCodeSet.c_str());
    if (conv == (iconv_t)-1) {
        std::cerr << "iconv_open() failed" << std::endl;
        return {};
    }
    std::wstring out(str.size(), L'\0');
    auto inPtr = (char*)str.data();
    size_t inSize = str.size();
    auto outPtr = (char*)out.data();
    size_t outSize = out.size() * sizeof(wchar_t);
    if (iconv(conv, &inPtr, &inSize, &outPtr, &outSize) == (size_t)-1) {
        std::cerr << "iconv() failed. errno = " << errno << std::endl;
        return {};
    }
    if (iconv(conv, nullptr, &inSize, &outPtr, &outSize) == (size_t)-1) {
        std::cerr << "iconv() failed. errno
Replies: 2 · Boosts: 0 · Views: 143 · Activity: 2w