When I try to use an entity created in Core Data, it gives me: 'PlayerData' is ambiguous for type lookup in this context
Swift
Swift is a powerful and intuitive programming language for Apple platforms and beyond.
Posts under Swift tag
I have the following function:
private func SetupLocaleObserver() {
    NotificationCenter.default.addObserver(
        forName: NSLocale.currentLocaleDidChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        print("Locale changed to: \(Locale.current.identifier)")
    }
}
I call this function inside the viewDidLoad() method of my view controller. The expectation was that whenever I change the system or app-specific language preference, the locale changes, and this change triggers my closure, which should print "Locale changed to: ..." to the console.
However, the app gets terminated with a SIGKILL whenever I change the language from Settings. Sometimes my closure runs, but most of the time it does not; maybe the app dies even before the closure is executed.
So the question is: what is the use of this particular notification if the corresponding closure isn't guaranteed to be executed before the app dies? Or am I using it the wrong way?
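For reference, a minimal sketch of the same observer with the returned token retained; the block-based addObserver(forName:) API returns a token that should be kept so the observation can be removed later. As far as I know this doesn't change the termination itself: the system deliberately terminates running apps when the device language is changed, so the SIGKILL is expected behavior rather than a crash.

import UIKit

final class LocaleAwareViewController: UIViewController {
    // Keep the token; without it the observation cannot be removed later.
    private var localeObserver: NSObjectProtocol?

    override func viewDidLoad() {
        super.viewDidLoad()
        localeObserver = NotificationCenter.default.addObserver(
            forName: NSLocale.currentLocaleDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Fires for locale-affecting changes that don't relaunch the app.
            print("Locale changed to: \(Locale.current.identifier)")
        }
    }

    deinit {
        if let localeObserver {
            NotificationCenter.default.removeObserver(localeObserver)
        }
    }
}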
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down here.
We've been noticing some crashes in App Store Connect caused by layoutSublayers being called on a background thread.
After debugging the issue a bit, we found that all calls which modified the AVCaptureSession or preview layer were indeed made on the main thread. It would be useful to see what results in AVCaptureVideoPreviewLayer.updateFormatDescription being called.
I've attached the crashlog below.
Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
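Not an answer to what triggers updateFormatDescription internally, but a sketch of a guard that can help while debugging: assert the expected queue at every call site that touches the session or preview layer, so a background-thread mutation traps immediately at the call site instead of surfacing later in layoutSublayers. The helper name and configuration here are illustrative.

import AVFoundation

// Hypothetical helper: trap right away if the preview layer is touched
// off the main thread, rather than crashing later in layoutSublayers.
func attachPreview(_ previewLayer: AVCaptureVideoPreviewLayer, to session: AVCaptureSession) {
    dispatchPrecondition(condition: .onQueue(.main))
    previewLayer.session = session
    previewLayer.videoGravity = .resizeAspectFill
}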
Last night my iPhone game crashed while running in debug mode on my iPhone. I just plugged it into my Mac, and was able to find the ips file. The stack trace shows the function in my app where it crashed, and then a couple of frames in libswiftCore.dylib before an assertion failure.
My question is - I've got absolutely no idea what the assertion failure actually was, all I have is...
0 libswiftCore.dylib 0x1921412a0 closure #1 in closure #1 in closure #1 in _assertionFailure(_:_:file:line:flags:) + 228
1 libswiftCore.dylib 0x192141178 closure #1 in closure #1 in _assertionFailure(_:_:file:line:flags:) + 327
2 libswiftCore.dylib 0x192140b4c _assertionFailure(_:_:file:line:flags:) + 183
3 MyGame.debug.dylib 0x104e52818 SentryBrain.takeTurn(actor:) + 1240
...
How do I figure out what the assertion failure actually was? And how do I figure out which line of code in takeTurn(...) triggered the failing assertion?
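In the meantime, one habit that makes future crash logs like this easier to diagnose: give every assertion a message, since the message text is typically captured in the crash report (in the Application Specific Information section of the .ips). An illustrative example, not the poster's actual takeTurn code:

// Illustrative only; not the actual SentryBrain.takeTurn(actor:) code.
func element(at index: Int, in names: [String]) -> String {
    // precondition's message is recorded with the crash, so the .ips
    // says *why* the assertion fired, not just where.
    precondition(names.indices.contains(index),
                 "index \(index) out of range (count: \(names.count))")
    return names[index]
}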
I'm currently working on implementing a character limit for Korean text input using UITextField, but I've encountered two key issues.
1. How can I determine if Korean input is complete?
I understand that markedTextRange represents provisional (composing) text during multistage text input systems (such as Korean, Japanese, Chinese).
While testing with Korean input, I expected markedTextRange to reflect the composing state.
However, it seems that markedTextRange remains nil throughout the composition process.
2. Problems limiting character count for Korean input
I’ve tried two methods to enforce a character limit. Both lead to incorrect behavior due to how Korean characters are composed.
Method 1 – Before replacement:
func textField(_ textField: UITextField, shouldChangeCharactersIn range: NSRange, replacementString string: String) -> Bool {
    guard let text = textField.text else { return true }
    return text.count <= 5
}
This checks the text length before applying the replacementString.
The issue is that when the user enters a character that is meant to combine with the previous one to form a composed character, the input should result in a single, combined character.
However, because the character limit check is based on the state before the replacement is applied, the second character does not get composed as expected.
Method 2 – After change:
textField.addTarget(self, action: #selector(editingChanged), for: .editingChanged)

@objc private func editingChanged(_ sender: UITextField) {
    guard var text = sender.text else { return }
    if text.count > limitCount {
        text.removeLast()
        sender.text = text
    }
}
This removes the last character if the count exceeds the limit after the change.
But when a user keeps typing past the limit, the last character is overwritten by new input.
I suspect this happens because the .editingChanged event occurs before the multistage input is finalized, and the final composed character is applied after that event.
My understanding of the input flow:
Standard input:
shouldChangeCharactersIn is called
replacementString is applied
.editingChanged is triggered
With multistage input (Korean, etc.):
shouldChangeCharactersIn is called
replacementString is applied
.editingChanged is triggered
Final composed character is inserted (after all the above)
Conclusion
Because both approaches lead to incorrect character-count behavior with Korean input, I believe I need a new strategy.
Is there an officially recommended way to handle multistage input properly with UITextField in this context?
Any advice or clarification would be greatly appreciated.
macOS 15.5 (24F74)
Xcode 16.4 (16F6)
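For what it's worth, a common community workaround (not an officially documented recommendation) is to trim in .editingChanged using prefix(_:) so whole grapheme clusters are kept, and to skip enforcement while marked text is present. The markedTextRange guard helps Japanese and Chinese input; as observed above, the Korean keyboard may compose without marked text, in which case the prefix-based trim is what avoids the overwrite symptom. A minimal sketch, assuming the limitCount property from the post:

@objc private func editingChanged(_ sender: UITextField) {
    // Don't enforce the limit while a multistage composition is active.
    guard sender.markedTextRange == nil, let text = sender.text else { return }
    if text.count > limitCount {
        // prefix(_:) counts Characters (grapheme clusters), so a composed
        // Hangul syllable is kept or dropped as one unit instead of being
        // overwritten by the next keystroke.
        sender.text = String(text.prefix(limitCount))
    }
}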
Hi everyone 👋,
I’m new to the Apple Developer Forums and just getting started with building apps for iOS/macOS. I’ve explored the documentation but wanted to introduce myself and ask for some advice from experienced developers.
Currently working on:
An iOS app using SwiftUI
Learning more about integrating Sign in with Apple
Exploring best practices for App Store submission
Here’s what I’d like to know:
What is the recommended approach to saving minimal app state before termination?
Are there SwiftUI lifecycle methods or SceneDelegate hooks I should be aware of?
Is UserDefaults the best tool for small state preservation in this context?
Would love to hear from anyone who’s implemented similar behavior — even a high-level suggestion would help.
Thanks!
Harjeet Singh
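A minimal sketch of one common pattern for the questions above: watch the scenePhase environment value in the SwiftUI App and persist small state to UserDefaults when the scene moves to the background, which is the last reliable point before a possible termination. The key names are illustrative, and the two-parameter onChange(of:) overload shown here requires iOS 17:

import SwiftUI

@main
struct MyApp: App {
    @Environment(\.scenePhase) private var scenePhase
    // @AppStorage reads/writes UserDefaults; fine for small state like this.
    @AppStorage("lastOpenedTab") private var lastOpenedTab = 0

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .onChange(of: scenePhase) { _, newPhase in
            if newPhase == .background {
                // iOS offers no termination callback you can rely on;
                // save on backgrounding instead.
                UserDefaults.standard.set(Date(), forKey: "lastBackgrounded")
            }
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Hello") }
}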
I have an @objc function used for a notification.
kTag is an Int constant; fieldBeingEdited is an Int variable.
The following code fails at compilation with the error Command CompileSwift failed with a nonzero exit code if I capture self (I edited the code down to a minimal case):
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { [self] () -> Void in
        switch fieldBeingEdited {
        case kTag: break
        default: break
        }
    }
}
If I explicitly use self, it compiles, even with self captured:
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { [self] () -> Void in
        switch fieldBeingEdited { // <<-- no need for self here
        case self.kTag: break // <<-- self here
        default: break
        }
    }
}
This compiles as well:
@objc func keyboardDone(_ sender: UIButton) {
    DispatchQueue.main.async { () -> Void in
        switch self.fieldBeingEdited { // <<-- explicit self here
        case self.kTag: break // <<-- self here
        default: break
        }
    }
}
Is it a compiler bug, or am I missing something?
I didn't find a suggestion box on Swift's website, so I'll post it here.
Swift Charts is great but limited. I need more data on a single chart. Candlestick and OHLC chart types would be an excellent addition. Hopefully, influencers from Apple can make that happen.
Thanks.
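Until something ships, a candlestick can be approximated today by composing existing marks: a RuleMark for the high-low wick plus a RectangleMark for the open-close body. A minimal sketch with an illustrative OHLC type:

import SwiftUI
import Charts

struct OHLC: Identifiable {
    let id = UUID()
    let date: Date
    let open: Double, high: Double, low: Double, close: Double
}

struct CandlestickChart: View {
    let bars: [OHLC]  // illustrative sample data

    var body: some View {
        Chart(bars) { bar in
            // High-low wick.
            RuleMark(x: .value("Date", bar.date),
                     yStart: .value("Low", bar.low),
                     yEnd: .value("High", bar.high))
            // Open-close body, colored by direction.
            RectangleMark(x: .value("Date", bar.date),
                          yStart: .value("Open", bar.open),
                          yEnd: .value("Close", bar.close),
                          width: .fixed(8))
            .foregroundStyle(bar.close >= bar.open ? Color.green : Color.red)
        }
    }
}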
Hi all,
I’m running into a persistent build issue with my Swift project ORSOFINAL after migrating from Xcode stable to Xcode-beta.app 26 (June 2025 version).
⸻
💥 Errors displayed:
1. C99 was enabled in PCH file but is currently disabled
2. module file .../ModuleCache.noindex/SwiftShims-AXUM98L131W4...pcm cannot be loaded due to a configuration mismatch with the current compilation
3. missing required module 'SwiftShims'
⸻
🛠 What I’ve already tried:
• xcode-select -s /Applications/Xcode-beta.app/Contents/Developer
• Deleted ~/Library/Developer/Xcode/DerivedData, ModuleCache.noindex, Archives, and Products
• Ran sudo xcodebuild -runFirstLaunch
• Clean Build Folder in Xcode-beta
• Verified Command Line Tools setting points to Xcode-beta
⸻
❓Looking for guidance on:
• Whether this is a known bug in Xcode-beta
• If SwiftShims/PCM conflicts are expected between versions
• Best practices to safely migrate from Xcode stable to beta for Swift-based projects
Any advice is much appreciated.
Thanks,
Mathéo
The Swift syntax compilation reported an error, as follows. How should I make my code compatible?
The other day I was playing with iBeacon and found out that CLBeaconIdentityConstraint will be deprecated after iOS 18.5. So I've written code with BeaconIdentityCondition, in reference to this Apple sample project.
import Foundation
import CoreLocation

let monitorName = "BeaconMonitor"

@MainActor
public class BeaconViewModel: ObservableObject {
    private let manager: CLLocationManager
    static let shared = BeaconViewModel()
    public var monitor: CLMonitor?
    @Published var UIRows: [String: [CLMonitor.Event]] = [:]

    init() {
        self.manager = CLLocationManager()
        self.manager.requestWhenInUseAuthorization()
    }

    func startMonitoringConditions() {
        Task {
            print("Set up monitor")
            monitor = await CLMonitor(monitorName)
            await monitor!.add(getBeaconIdentityCondition(), identifier: "TestBeacon")
            for identifier in await monitor!.identifiers {
                guard let lastEvent = await monitor!.record(for: identifier)?.lastEvent else { continue }
                UIRows[identifier] = [lastEvent]
            }
            for try await event in await monitor!.events {
                guard let lastEvent = await monitor!.record(for: event.identifier)?.lastEvent else { continue }
                if event.state == lastEvent.state {
                    continue
                }
                UIRows[event.identifier] = [event]
                UIRows[event.identifier]?.append(lastEvent)
            }
        }
    }

    func updateRecords() async {
        UIRows = [:]
        for identifier in await monitor?.identifiers ?? [] {
            guard let lastEvent = await monitor!.record(for: identifier)?.lastEvent else { continue }
            UIRows[identifier] = [lastEvent]
        }
    }

    func getBeaconIdentityCondition() -> CLMonitor.BeaconIdentityCondition {
        CLMonitor.BeaconIdentityCondition(uuid: UUID(uuidString: "abc")!, major: 123, minor: 789)
    }
}
It works, except that my sample app can take as long as 90 seconds to see event changes. You would get an instant update the old-fashioned way (CLBeacon and CLBeaconIdentityConstraint). Is there anything I can do to see changes faster? Thanks.
Hi everyone,
I’m currently trying to create a pure backdrop blur effect in my iOS app (SwiftUI / UIKit), similar to the backdrop-filter: blur(20px) effect in CSS. My goal is simple:
• Apply a Gaussian blur (radius ~20px) to the background content
• Overlay a semi-transparent black layer (opacity 0.3)
• Avoid any predefined color tint from UIBlurEffect or .ultraThinMaterial, etc.
However, every method I’ve tried so far (e.g., .ultraThinMaterial, UIBlurEffect(style:)) always introduces a built-in tint, which makes the result look gray or washed out. Even when layering a black color with opacity 0.3 over .ultraThinMaterial, it doesn’t give the clean, transparent-black + blur look I want.
What I’m looking for:
• A clean 20px blur effect (like CIGaussianBlur)
• No color shift/tint added by default
• A layer of black at 30% opacity on top of the blur
• Ideally works live (not a static snapshot blur)
Has anyone achieved something like this in UIKit or SwiftUI? Would really appreciate any insights, workarounds, or libraries that can help.
Thanks in advance!
Ben
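One widely used workaround (a community trick, not a documented API contract) is to drive a UIBlurEffect through a paused UIViewPropertyAnimator, which yields a lighter blur without the full material treatment, and then stack a 30% black layer on top. A minimal UIKit sketch; since the pausing behavior is undocumented, it's worth re-testing on each OS release:

import UIKit

// Community trick: pause an animator mid-transition to a blur effect to
// get a fractional blur amount without the built-in material tint.
final class VariableBlurView: UIVisualEffectView {
    private var animator: UIViewPropertyAnimator?

    init(intensity: CGFloat) {
        super.init(effect: nil)
        let animator = UIViewPropertyAnimator(duration: 1, curve: .linear)
        animator.addAnimations { [weak self] in
            self?.effect = UIBlurEffect(style: .regular)
        }
        animator.fractionComplete = intensity  // e.g. 0.25 for a softer blur
        animator.pausesOnCompletion = true
        self.animator = animator

        // Semi-transparent black layer over the blur, as in the CSS version.
        let dim = UIView()
        dim.backgroundColor = UIColor.black.withAlphaComponent(0.3)
        dim.frame = bounds
        dim.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        contentView.addSubview(dim)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}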
I found that the aggregated device correctly obtains input channels in the standard microphone mode. However, in voice isolation mode, it only retrieves channels from the first sub-device in the aggregated device's list. If I want to properly obtain channel information in voice isolation mode, how should I do it?
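For context, a sketch of how an input channel count is usually read from a device via Core Audio's stream configuration property (assuming a macOS HAL query); whether voice isolation mode reports all sub-devices through this path is exactly what's in question here:

import CoreAudio

// Count input channels by reading kAudioDevicePropertyStreamConfiguration.
func inputChannelCount(of deviceID: AudioDeviceID) -> Int {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyStreamConfiguration,
        mScope: kAudioDevicePropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain
    )
    var size: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(deviceID, &address, 0, nil, &size) == noErr else { return 0 }
    let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(size),
                                               alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, raw) == noErr else { return 0 }
    let bufferList = UnsafeMutableAudioBufferListPointer(raw.assumingMemoryBound(to: AudioBufferList.self))
    return bufferList.reduce(0) { $0 + Int($1.mNumberChannels) }
}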
According to the docs, .focusedObject() usage should be moved to .focusedValue() when migrating to @Observable, but there is no .focusedSceneValue() overload that accepts an Observable like there is with .focusedValue(). So how are we supposed to migrate .focusedSceneObject() to @Observable?
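One workaround I've seen (a sketch, not an official migration path): declare your own FocusedValueKey whose value type is the @Observable class, and pass the instance through the explicit focusedSceneValue(_:_:) overload. The names below are illustrative:

import SwiftUI
import Observation

@Observable final class PlayerModel {
    var name = ""
}

// Plain FocusedValueKey carrying the observable reference.
struct PlayerModelKey: FocusedValueKey {
    typealias Value = PlayerModel
}

extension FocusedValues {
    var playerModel: PlayerModel? {
        get { self[PlayerModelKey.self] }
        set { self[PlayerModelKey.self] = newValue }
    }
}

// In the scene's root view:  .focusedSceneValue(\.playerModel, model)
// In a consumer:             @FocusedValue(\.playerModel) private var playerModel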
Hi, I am making an AI-powered app that makes requests to the OpenAI API. For security, I set up a Vercel backend that handles the API calls, while my frontend makes a call to my Vercel-hosted HTTPS endpoint. Interestingly, whenever I try to make that call on my device, an iPhone, I get this error:
Task <91AE4DE0-2845-4348-89B4-D3DD1CF51B65>.<10> finished with error [-1003] Error Domain=NSURLErrorDomain Code=-1003 "A server with the specified hostname could not be found." UserInfo={_kCFStreamErrorCodeKey=-72000, NSUnderlyingError=0x1435783f0 {Error Domain=kCFErrorDomainCFNetwork Code=-1003 "(null)" UserInfo={_kCFStreamErrorDomainKey=10, _kCFStreamErrorCodeKey=-72000, _NSURLErrorNWResolutionReportKey=Resolved 0 endpoints in 3ms using unknown from query, _NSURLErrorNWPathKey=satisfied (Path is satisfied), interface: pdp_ip0[lte], ipv4, ipv6, dns, expensive, uses cell}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask <91AE4DE0-2845-4348-89B4-D3DD1CF51B65>.<10>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask <91AE4DE0-2845-4348-89B4-D3DD1CF51B65>.<10>"
), NSLocalizedDescription=A server with the specified hostname could not be found., NSErrorFailingURLStringKey=https://[my endpoint], NSErrorFailingURLKey=https://[my endpoint], _kCFStreamErrorDomainKey=10}
I'm completely stuck, because when I make HTTPS requests directly to other APIs, like OpenAI's endpoint, without the proxy, the server is found completely fine. Running my endpoint in the terminal with curl also works as intended, as I can see API key usage. But for some reason, in my project, it does not work. I've looked through almost every post I could find online, but all of the solutions are outdated and unhelpful.
I'm willing to schedule a call or meeting to resolve this issue and get more in-depth help as well.
In reference to this webpage, I'm turning my iPad into an iBeacon device.
import CoreLocation
import CoreBluetooth

class BeaconViewModel: NSObject, ObservableObject, CBPeripheralManagerDelegate {
    private var peripheralManager: CBPeripheralManager?
    private var beaconRegion: CLBeaconRegion?
    private var beaconIdentityConstraint: CLBeaconIdentityConstraint?
    //private var beaconCondition: CLBeaconIdentityCondition?

    override init() {
        super.init()
        if let uuid = UUID(uuidString: "abc") {
            beaconIdentityConstraint = CLBeaconIdentityConstraint(uuid: uuid, major: 123, minor: 456)
            beaconRegion = CLBeaconRegion(beaconIdentityConstraint: beaconIdentityConstraint!, identifier: "com.example.myDeviceRegion")
            peripheralManager = CBPeripheralManager(delegate: self, queue: nil, options: nil)
        }
    }

    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        switch peripheral.state {
        case .poweredOn:
            startAdvertise()
        case .poweredOff:
            peripheralManager?.stopAdvertising()
        default:
            break
        }
    }

    func startAdvertise() {
        guard let beaconRegion = beaconRegion else { return }
        let peripheralData = beaconRegion.peripheralData(withMeasuredPower: nil)
        peripheralManager?.startAdvertising(((peripheralData as NSDictionary) as! [String: Any]))
    }

    func stopAdvertise() {
        peripheralManager?.stopAdvertising()
    }
}
In the initializer, I'm using CLBeaconIdentityConstraint to constrain the beacon. Xcode says that this class is deprecated and suggests that we use CLBeaconIdentityCondition. But if I try to use it, Xcode says
Cannot find type 'CLBeaconIdentityCondition' in scope
I've just updated Xcode to 16.4. I still get the same error. So how do we use CLBeaconIdentityCondition to constrain the beacon? My macOS version is Sequoia 15.5. Thanks.
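For what it's worth, in the CLMonitor sample earlier on this page the condition type is spelled as the nested CLMonitor.BeaconIdentityCondition rather than a top-level class, which may be why the bare name isn't found in scope. A hedged sketch (the UUID is a placeholder):

import CoreLocation

// Sketch: the beacon condition as a nested type on CLMonitor.
func makeBeaconCondition() -> CLMonitor.BeaconIdentityCondition {
    // Placeholder UUID; substitute your real beacon UUID string.
    CLMonitor.BeaconIdentityCondition(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        major: 123,
        minor: 456
    )
}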
After correctly inserting two types of objects as nodes in an augmented reality setting, I replicated exactly the same procedure with a third kind of object, which unfortunately refuses to show up. I checked the flow and it is the same as for the other objects, as is the content of the LocationAnnotation, but there is surely something that escapes me. Could someone help with some ideas?
This is the common code, part of the class:
func appendInAR(ghostElement: Ghost) {
    let ghostElementAnnotationLocation = GhostLocationAnnotationNode(ghost: ghostElement)
    ghostElementAnnotationLocation.scaleRelativeToDistance = true
    sceneLocationView.addLocationNodeWithConfirmedLocation(locationNode: ghostElementAnnotationLocation)
    shownGhostsAnnotations.append(ghostElementAnnotationLocation)
}
Problem Description
I'm encountering an issue with SCNTechnique where the clearColor setting is being ignored when multiple passes share the same depth buffer. The clear color always appears as the scene background, regardless of what value I set. The minimal project for reproducing the issue: https://www.dropbox.com/scl/fi/30mx06xunh75wgl3t4sbd/SCNTechniqueCustomSymbols.zip?rlkey=yuehjtk7xh2pmdbetv2r8t2lx&st=b9uobpkp&dl=0
Problem Details
In my SCNTechnique configuration, I have two passes that need to share the same depth buffer for proper occlusion handling:
"passes": [
"box1_pass": [
"draw": "DRAW_SCENE",
"includeCategoryMask": 1,
"colorStates": [
"clear": true,
"clearColor": "0 0 0 0" // Expecting transparent black
],
"depthStates": [
"clear": true,
"enableWrite": true
],
"outputs": [
"depth": "box1_depth",
"color": "box1_color"
],
],
"box2_pass": [
"draw": "DRAW_SCENE",
"includeCategoryMask": 2,
"colorStates": [
"clear": true,
"clearColor": "0 0 0 0" // Also expecting transparent black
],
"depthStates": [
"clear": false,
"enableWrite": false
],
"outputs": [
"depth": "box1_depth", // Sharing the same depth buffer
"color": "box2_color",
],
],
"final_quad": [
"draw": "DRAW_QUAD",
"metalVertexShader": "myVertexShader",
"metalFragmentShader": "myFragmentShader",
"inputs": [
"box1_color": "box1_color",
"box2_color": "box2_color",
],
"outputs": [
"color": "COLOR"
]
]
]
And the Metal shader used to display box1_color and box2_color side by side:
fragment half4 myFragmentShader(VertexOut in [[stage_in]],
                                texture2d<half, access::sample> box1_color [[texture(0)]],
                                texture2d<half, access::sample> box2_color [[texture(1)]]) {
    // Sampler declaration was missing from the original snippet; assuming linear sampling.
    constexpr sampler s(coord::normalized, address::clamp_to_edge, filter::linear);
    half4 color1 = box1_color.sample(s, in.texcoord);
    half4 color2 = box2_color.sample(s, in.texcoord);
    if (in.texcoord.x < 0.5) {
        return color1;
    }
    return color2;
};
Expected Behavior
Both passes should clear their color targets to transparent black (0, 0, 0, 0)
The depth buffer should be shared between passes for proper occlusion
Actual Behavior
Both box1_color and box2_color targets contain the scene background instead of being cleared to transparent (see attached image)
This happens even when I explicitly set clearColor: "0 0 0 0" for both passes
Setting scene.background.contents = UIColor.clear makes the clearColor work as expected, but I need to keep the scene background for other purposes
What I've Tried
Setting different clearColor values - all are ignored when sharing depth buffer
Using DRAW_NODE instead of DRAW_SCENE - didn't solve the issue
Creating a separate pass to capture the background - the background still appears in the other passes
Various combinations of clear flags and render orders
Environment
iOS/macOS, running with "My Mac (Designed for iPad)"
Xcode 16.2
Question
Is this a known limitation of SceneKit when passes share a depth buffer? Is there a workaround to achieve truly transparent clear colors while maintaining a shared depth buffer for occlusion testing?
The core issue seems to be that SceneKit automatically renders the scene background in every DRAW_SCENE pass when a shared depth buffer is detected, overriding any clearColor settings.
Any insights or workarounds would be greatly appreciated. Thank you!
I would like to clarify that my app is a Reader app, a hybrid application built with Vue.js and Capacitor. To comply with Apple’s guidelines, I am not using any third-party SDKs for account management or payments. Instead, I am attempting to use the official StoreKit External Link Account API as required.
To achieve this, I created a custom native Capacitor plugin in Swift, which calls the StoreKit 2 classes (SKStoreExternalLinkAccountRequest and SKStoreExternalLinkAccountViewController) to present the required modal before redirecting users to manage their accounts externally.
However, I am encountering a technical issue:
When building the app in Xcode 16 (with iOS Deployment Target set to 16+), the Swift compiler cannot find the StoreKit 2 classes (SKStoreExternalLinkAccountRequest and SKStoreExternalLinkAccountViewController).
I have attached a screenshot showing the error in Xcode.
Could you please clarify if there are any additional requirements or steps needed to access these StoreKit 2 APIs in a hybrid (Capacitor/Vue) app?
Is there any limitation for hybrid apps, or is there a specific configuration needed in Xcode or the project to make these APIs available?
I am committed to fully complying with Apple’s guidelines and want to ensure the best and safest experience for my users.
Any guidance or documentation you can provide would be greatly appreciated.
My plugin (see attached screenshot):
My app in Xcode, where the build failed (see attached screenshot):
I would really appreciate it if someone could help me.
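For comparison, the Swift-concurrency surface of this feature in StoreKit 2 is the ExternalLinkAccount type; if the SKStore* class names above don't resolve, it may be worth trying this spelling. A sketch, assuming the com.apple.developer.storekit.external-link.account entitlement and the related Info.plist configuration are in place:

import StoreKit

@available(iOS 16.0, *)
func openExternalAccountLink() async {
    // canOpen is false when the entitlement/region requirements aren't met.
    guard await ExternalLinkAccount.canOpen else { return }
    do {
        // Presents the system confirmation sheet before leaving the app.
        try await ExternalLinkAccount.open()
    } catch {
        print("ExternalLinkAccount.open() failed: \(error)")
    }
}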