Build, test, and submit your app using Xcode, Apple's integrated development environment.


Posts under Xcode subtopic

Post

Replies

Boosts

Views

Activity

Firebase Phone Auth OTP not working on TestFlight
Hi, I'm working in Unity and have implemented Firebase Phone Number Authentication. Everything works fine when I install the build directly from Xcode: the App Attest screen shows up, the user receives the OTP on their phone, and login works. But when I download the same build from TestFlight, it gets stuck after the user sends the OTP request. I've added the Push Notifications and App Attest capabilities, and I've also enabled Remote Notifications. In the device log I see an error about the mobile provisioning file, even though I've added that to my account as well. Is it expected behavior that phone number authentication does not work on TestFlight? If so, how can I get the app approved by Apple, since they need to test it before approving it? Thanks!
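(Not an answer from the thread, just a hedged sketch.) Firebase phone auth typically verifies the app via a silent push, and TestFlight builds use the production APNs environment while Xcode installs use the sandbox, so a mismatch there can produce exactly this kind of stall. In a native Swift app the wiring looks roughly like the sketch below; the .prod/.sandbox split and the delegate methods are assumptions about your setup, and in a Unity build the Firebase plugin normally handles this itself, so treat it only as a comparison point.
import UIKit
import FirebaseAuth

// Hypothetical native AppDelegate excerpt showing how the APNs token reaches
// FirebaseAuth so phone-number verification can complete without falling back.
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
        // Assumption: TestFlight/App Store builds run against the production APNs
        // environment, debug installs from Xcode against the sandbox.
        #if DEBUG
        Auth.auth().setAPNSToken(deviceToken, type: .sandbox)
        #else
        Auth.auth().setAPNSToken(deviceToken, type: .prod)
        #endif
    }

    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Let FirebaseAuth consume the verification pushes it sent.
        if Auth.auth().canHandleNotification(userInfo) {
            completionHandler(.noData)
            return
        }
        completionHandler(.newData)
    }
}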
0
0
215
Feb ’25
Test Plans: application data for macOS apps
In the test plan settings, it's possible to select an .xcappdata bundle for a set of tests. I couldn't find any documentation about it in the context of macOS app testing. What I'm trying to achieve is having the sandboxed app container (/Users/{user}/Library/Containers/{bundle-id}/Data/) replaced before running a test suite. Is this possible with the latest Xcode? What should the .xcappdata structure be to make it work on macOS?
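(No answer here; for reference only.) The layout Xcode produces and accepts for an iOS .xcappdata bundle is sketched below. Whether macOS testing maps this onto the sandboxed container at ~/Library/Containers/{bundle-id}/Data/ is exactly the open question, so treat it as an assumption to experiment with, not documented macOS behavior.
MyAppData.xcappdata/
    AppData/
        Documents/
        Library/
        tmp/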
0
1
263
Feb ’25
Broken Xcode 16 autocomplete using Tab
I've recently upgraded to Xcode 16 and noticed a change in how the Tab key behaves during autocomplete. (Related, unanswered and now closed post: https://discussions.apple.com/thread/255762888) Previously, pressing Tab would extend the typed text up to the first point of choice. For example, with two classes NSViewController and NSViewCoordinator, typing "NSV" + Tab used to complete to NSViewCo. Now, in Xcode 16, pressing Tab selects the first suggestion by default instead of completing up to the choice point. That is very inconvenient, because I often just want to see all possible completions for a prefix without typing the whole prefix manually. There seems to be no way to restore the previous behavior, which is very sad. I reverse engineered Xcode 16, and what did I find? They install a new code-completion handler in place of the old one, with no way to configure this behavior via settings or UserDefaults. I hope this thread gathers votes and attracts the Xcode devs.
0
1
298
Feb ’25
Crash report from user unexpectedly looks like beta app
According to this page: https://developer.apple.com/documentation/xcode/interpreting-the-json-format-of-a-crash-report the storeInfo element of a JSON crash log may contain a deviceIdentifierForVendor element, but, quote, "This field is only present for TestFlight builds of an app". A user has just sent me a crash report which contains this key, but this isn't a TestFlight build. The suspicion is of course that the app is "cracked" in some way, but having interacted with the user for some time I think this is unlikely. He has had various issues with the app, affecting multiple devices; I have been wondering if there is something wrong with his Apple account. The docs also say that this field "replaces the CrashReporter Key field", yet this crash report contains both. So my question for other developers: if you have any .ips crash logs obtained from non-TestFlight users of your apps, could you please grep them for deviceIdentifierForVendor and let me know what you find? If any Apple people have any clues about how this could innocently end up in a crash report, please let me know. Maybe it's a documentation bug?
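(Not from the thread.) For anyone who wants to run the check the poster asks for, a small self-contained sketch that scans a folder of .ips files for the key; the folder path in the usage comment is illustrative only.
import Foundation

// Prints every .ips crash report in `folder` that mentions deviceIdentifierForVendor.
func scanCrashReports(in folder: URL) throws {
    let files = try FileManager.default.contentsOfDirectory(at: folder,
                                                            includingPropertiesForKeys: nil)
    for file in files where file.pathExtension == "ips" {
        let contents = try String(contentsOf: file, encoding: .utf8)
        if contents.contains("deviceIdentifierForVendor") {
            print("Found deviceIdentifierForVendor in \(file.lastPathComponent)")
        }
    }
}

// Example usage (the path is illustrative; point it at wherever your .ips files live):
// try scanCrashReports(in: URL(fileURLWithPath: NSHomeDirectory() + "/Library/Logs/DiagnosticReports"))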
0
0
389
Dec ’24
How to start and automatically update/end a local live activity?
I'm implementing a timer feature and facing the issue that the Live Activity I start keeps showing after the timer is complete. The body of the Live Activity widget is more or less:
ActivityConfiguration(for: WhendyWidgetAttributes.self) { context in
    VStack {
        Text(context.state.timerEndDate, style: .timer)
        // if Date.now < timerEndTime { Text("Done") }
        self.expandedView(state: context.state)
    }
}
…
Ideally I could get the activity to show something else when it is done, but I don't know how to get it to re-evaluate its body once the end time is reached. I create the activity with
let activity = try ActivityKit.Activity.request(
    attributes: attributes,
    content: .init(
        state: .init(timerEndDate: timerEndDate),
        staleDate: timerEndDate
    ),
    pushType: nil
)
Can I schedule the activity to refresh its body (and re-evaluate Date.now) once the timerEndDate is reached?
Considered approaches:
Using staleDate: the activity never shows that it has become stale. Would it be expected to show its staleness?
Scheduling dismissal: I also thought about starting and immediately stopping the activity with a delayed dismissal, but unfortunately that seems to be limited to a 4-hour window, and I'd like longer timers too.
Remote updates: I understand I could use remote notifications to update the Live Activity, but I'd really like to keep things local, since all the functionality can be planned locally.
Background tasks: I understand these don't run reliably or at a predictable time.
A Timer in the app that updates the content: I think this would only update the activity while the app is in the foreground.
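(Not from the thread.) One local option, under stated assumptions: when the app next runs at or after timerEndDate, push a new content state that flags completion and let the widget branch on that flag. The isDone property and this particular ContentState are assumptions about your code, and it still won't fire at the exact deadline while the app is suspended; it's a sketch, not a confirmed ActivityKit pattern.
import ActivityKit
import Foundation

// Hypothetical stand-in for the real attributes type, with an explicit "finished"
// flag the widget body can branch on instead of comparing against Date.now.
struct WhendyWidgetAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var timerEndDate: Date
        var isDone: Bool
    }
}

// Called from the app (e.g. on foreground return) once the deadline has passed;
// the update makes the activity re-render its body with the new state.
func markTimerFinished(_ activity: Activity<WhendyWidgetAttributes>) async {
    let finished = WhendyWidgetAttributes.ContentState(
        timerEndDate: activity.content.state.timerEndDate,
        isDone: true
    )
    await activity.update(ActivityContent(state: finished, staleDate: nil))
    // Or end it outright with a dismissal policy:
    // await activity.end(ActivityContent(state: finished, staleDate: nil),
    //                    dismissalPolicy: .default)
}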
0
0
535
Nov ’24
Xcode 16 missing dSYM in Firebase
The problem is that when building the application in Debug mode with Xcode 16.1, the dSYM files fail to upload to Crashlytics. This worked with the latest Xcode 15 release. The workaround is to disable Debug Dylib Support in the target's Build Settings; however, that causes SwiftUI previews to stop working.
Reproducing the issue: set ENABLE_DEBUG_DYLIB=YES in the build options and build the application in Xcode 16.1.
Firebase SDK Version: 11.4.0. Xcode Version: 16.1. Installation Method: Swift Package Manager. Firebase Version: 11.5.0.
Relevant log output: warning: (arm64) /Users/dustin/Library/Developer/Xcode/DerivedData/MyAppName-cicejndcecececfe/Build/Products/Debug-iphonesimulator/MyAppName.app/MyAppName empty dSYM file detected, dSYM was created with an executable with no debug info.
The warning appears to come from the Xcode/lldb toolchain rather than Crashlytics (https://lldb.llvm.org/cpp_reference/SymbolFileDWARF_8cpp_source.html, line 655). This is probably something on Apple's side; Crashlytics only consumes the dSYM that Xcode generates. (ref: https://github.com/firebase/firebase-ios-sdk/issues/14054#issuecomment-2477235548) This is related to: Firebase Issue
0
8
2.4k
Nov ’24
What to return from func focusItems(in rect: CGRect) -> [any UIFocusItem]
My AR app Virtual Tags has stopped showing its camera information since July, and I get a warning telling: "ARCL.SceneLocationView implements focusItemsInRect: - caching for linear focus movement is limited as long as this view is on screen." I tried implementing the method, but the documentation does not explain what it should return or how to generate it. Is someone able to help me? Thanks,
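(Not an answer from ARCL.) A minimal sketch of what implementations of this UIFocusItemContainer requirement usually return, assuming the focusable subviews are the items you want the focus engine to consider; the subclass name and the filtering are illustrative, not ARCL's actual behavior.
import UIKit

// Hypothetical subclass standing in for ARCL.SceneLocationView.
class MySceneLocationView: UIView {
    // Return the focus items located inside `rect`, expressed in this view's
    // coordinate space. An empty array is also valid if nothing inside is focusable.
    override func focusItems(in rect: CGRect) -> [any UIFocusItem] {
        return subviews.filter { $0.canBecomeFocused && $0.frame.intersects(rect) }
    }
}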
0
0
301
Feb ’25
SceneKit Performance Issues with Large Node Counts on iPad (10th Gen, iPadOS 18.3)
We’re developing an iPad application that visualizes 2D and 3D building floor plans, including a mesh network of nodes that control lighting and climate. The node count ranges from 1,000 to 15,000. We’re using SceneKit to dynamically render the floor plan and node mesh on an iPad 10th generation running iPadOS 18.3. While the core visualization works, we are experiencing significant performance degradation as the node count increases. Specifically: At 750–1,000 nodes, UI responsiveness noticeably declines. At 2,000 nodes, navigating the floor plan becomes nearly unusable. We attempted to optimize performance with a Geometric Pool algorithm, but the impact was minimal. Strangely, the same iPad handles 30,000+ 3D objects effortlessly when using Unity or Unreal Engine, raising the question of whether SceneKit may not be optimized for this scale. Our questions: Is SceneKit suitable for visualizing such large node counts, or are we hitting an inherent limitation of the framework? Are there best practices or optimization techniques for SceneKit that we might be missing? Should we consider a hybrid approach or fully transition to a different 3D engine for this use case? We’ve attached a code sample below demonstrating the issue. Any insights, suggestions, or experiences would be greatly appreciated! ContentView.swift
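(Not from the thread.) A hedged sketch of one common SceneKit mitigation, assuming the nodes are mostly static and share a small set of geometries: reuse one geometry so materials are shared, then collapse the group with flattenedClone() so SceneKit can batch it into far fewer draw calls. The function name and node shape are illustrative only.
import SceneKit
import UIKit

// Builds `count` nodes that all share one geometry/material, then flattens them
// into a single node so SceneKit can render the group with far fewer draw calls.
func makeNodeCloud(count: Int, positions: [SCNVector3]) -> SCNNode {
    let sharedGeometry = SCNSphere(radius: 0.05)
    sharedGeometry.firstMaterial?.diffuse.contents = UIColor.systemTeal

    let container = SCNNode()
    for index in 0..<min(count, positions.count) {
        let node = SCNNode(geometry: sharedGeometry) // shared geometry, not a copy
        node.position = positions[index]
        container.addChildNode(node)
    }
    // Flattening is only appropriate for nodes that don't move or animate individually.
    return container.flattenedClone()
}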
0
1
325
Feb ’25
Xcode 16.2 forgets my Git credentials constantly
Like the title says, I'll commit some changes with no problem, with my Git name and email displayed properly, then work for a while longer, and when I use the Integrate menu to stage and commit I find the Git name/email are empty. My credentials are properly entered in Settings. The only fix I've found is quitting and restarting. Any less frustrating option?
0
0
156
Feb ’25
WKWebView
I am using Xcode 15.2 and loading a URL in a WKWebView, but a hand cursor appears over the web view and scrolling up and down does not work. I tried adjusting the scroll bounds and isUserInteractionEnabled, but the hand cursor is still visible. How can I fix this?
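(Not from the thread.) A minimal sketch of the scroll-related settings worth double-checking when a WKWebView won't scroll, assuming the web view fills its parent and nothing else intercepts touches; this is illustrative, not a confirmed fix for the hand-cursor symptom.
import UIKit
import WebKit

class WebViewController: UIViewController {
    private var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()

        webView = WKWebView(frame: view.bounds)
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        // Make sure nothing has disabled the built-in scroll view.
        webView.scrollView.isScrollEnabled = true
        webView.scrollView.bounces = true
        webView.isUserInteractionEnabled = true

        view.addSubview(webView)
        webView.load(URLRequest(url: URL(string: "https://www.example.com")!))
    }
}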
0
0
127
Feb ’25
Issue loading account information on a virtual machine
I'm not able to see my account information on a Mac mini with an M2 CPU. I log in to my account in Settings, but in Xcode 16.1 it fails with the decoding error "There was a failure decoding response". I think this is the same as here: https://developer.apple.com/forums/thread/767673 https://developer.apple.com/forums/thread/769069 https://developer.apple.com/forums/thread/759877
0
0
191
Nov ’24
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).
What I have so far: successfully capturing the mesh using ARMeshAnchor, and converting the mesh into an MDLAsset and exporting .obj. I need help generating the .jpg texture and linking it to the .mtl file.
private func exportScannedObject() {
    guard let camera = arView.session.currentFrame?.camera else { return }

    func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }
        let asset = MDLAsset()
        for anchor in meshAnchors {
            let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)
            // Apply a gray material to the mesh
            let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
            material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color
            if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                for submesh in submeshes {
                    submesh.material = material
                }
            }
            asset.add(mdlMesh)
        }
        return asset
    }

    func export(asset: MDLAsset) throws -> URL {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let url = directory.appendingPathComponent("scaned.obj")
        if MDLAsset.canExportFileExtension("obj") {
            do {
                try asset.export(to: url)
                return url
            } catch let error {
                fatalError(error.localizedDescription)
            }
        } else {
            fatalError("Can't export USD")
        }
    }

    if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
       let asset = convertToAsset(meshAnchors: meshAnchors) {
        do {
            let url = try export(asset: asset)
            showScanPreview(url)
        } catch {
            print("export error")
        }
    }
}

extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
        return vertex
    }

    // helps from StackOverflow:
    // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
    func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
        func convertVertexLocalToWorld() {
            let verticesPointer = vertices.buffer.contents()
            for vertexIndex in 0..<vertices.count {
                let vertex = self.vertex(at: UInt32(vertexIndex))
                var vertexLocalTransform = matrix_identity_float4x4
                vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3
                let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                let componentStride = vertices.stride / 3
                verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
            }
        }
        convertVertexLocalToWorld()

        let allocator = MTKMeshBufferAllocator(device: device)
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
        let submesh = MDLSubmesh(indexBuffer: indexBuffer, indexCount: faces.count * faces.indexCountPerPrimitive, indexType: .uInt32, geometryType: .triangles, material: nil)
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)
        let mesh = MDLMesh(vertexBuffer: vertexBuffer, vertexCount: vertices.count, descriptor: vertexDescriptor, submeshes: [submesh])
        return mesh
    }
}
What I need help with:
How do I generate the JPG texture from the AR scene?
How do I save an MTL file linking the OBJ model to the texture?
How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?
I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
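(Not from the thread.) A hedged sketch of one simple way to get a colour texture next to the OBJ: save the current camera frame as a JPEG and point the material's baseColor at that file via a URL, so ModelIO's OBJ exporter can write an MTL referencing it. Proper texture baking (projecting camera frames onto per-vertex UVs) is a much bigger job and is not shown; the overall flow below is an assumption, not a verified solution.
import ARKit
import ModelIO
import CoreImage
import CoreGraphics

// Writes the current camera image as scan_texture.jpg and returns a material whose
// baseColor property points at that file. The intent is that exporting an MDLAsset
// containing this material as .obj also produces a .mtl referencing the JPEG.
func makeTexturedMaterial(from frame: ARFrame, in directory: URL) throws -> MDLMaterial {
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
    let context = CIContext()
    guard let jpegData = context.jpegRepresentation(of: ciImage,
                                                    colorSpace: CGColorSpaceCreateDeviceRGB()) else {
        throw NSError(domain: "TextureExport", code: 1)
    }

    let textureURL = directory.appendingPathComponent("scan_texture.jpg")
    try jpegData.write(to: textureURL)

    let material = MDLMaterial(name: "ScannedMaterial",
                               scatteringFunction: MDLScatteringFunction())
    material.setProperty(MDLMaterialProperty(name: "baseColor",
                                             semantic: .baseColor,
                                             url: textureURL))
    return material
}
Note that an external viewer will only show the texture if the exported mesh also carries texture coordinates; the LiDAR mesh built above has none, so this sketch at most gets the JPG/MTL plumbing in place.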
0
0
417
Feb ’25
Apple developer
In Xcode I have selected the developer team, but it shows these errors: "Communication with Apple failed. Your team has no devices from which to generate a provisioning profile. Connect a device to use or manually add device IDs in Certificates, Identifiers & Profiles. https://developer.apple.com/account/" and also "No profiles for 'com.kuntaldoshi.homeautomation' were found. Xcode couldn't find any iOS App Development provisioning profiles matching 'com.kuntaldoshi.homeautomation'."
0
0
341
Feb ’25
Advertising supported apps
Apple is pushing developers to use AdAttributionKit, but when I look at various ad networks, their sample code isn't using it. I'd like to find a simple example using a banner ad and a third-party ad network like AdMob. Also, I'm not clear on what postbacks really do, or whether I need them at all. If anyone can point me to clear, up-to-date documentation, that would be great!
0
0
370
Dec ’24
Swift, kevent, and wth?!?!?
I have this code:
var eventIn = kevent(ident: UInt(self.socket),
                     filter: Int16(EVFILT_WRITE),
                     flags: UInt16((EV_ADD | EV_ENABLE)),
                     fflags: 0,
                     data: 0,
                     udata: nil)
I looked at it and thought: why do I have those extra parentheses? So I changed it to
var eventIn = kevent(ident: UInt(self.socket),
                     filter: Int16(EVFILT_WRITE),
                     flags: UInt16(EV_ADD | EV_ENABLE), // changed line!
                     fflags: 0,
                     data: 0,
                     udata: nil)
and then kevent gave me EBADF. Does this make sense to anyone?
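(Not from the thread.) The two initializers above produce identical values, so the extra parentheses are unlikely to be the cause; EBADF from kevent(2) normally points at a closed or invalid descriptor (the kqueue fd or the fd in the changelist) at the moment of the call. A minimal, self-contained registration sketch for comparison, with the error check on the kevent call itself; socket creation and lifetime are assumed to happen elsewhere.
import Darwin

// Registers `fd` for write-readiness on a fresh kqueue and reports errno on failure.
// UInt16(EV_ADD | EV_ENABLE) and UInt16((EV_ADD | EV_ENABLE)) are the same value.
func registerForWrite(fd: Int32) -> Bool {
    let kq = kqueue()
    guard kq != -1 else {
        perror("kqueue")
        return false
    }
    var change = kevent(ident: UInt(fd),
                        filter: Int16(EVFILT_WRITE),
                        flags: UInt16(EV_ADD | EV_ENABLE),
                        fflags: 0,
                        data: 0,
                        udata: nil)
    // EBADF here means kq or fd is not a valid open descriptor at the time of the call.
    if kevent(kq, &change, 1, nil, 0, nil) == -1 {
        perror("kevent")
        close(kq)
        return false
    }
    close(kq)
    return true
}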
0
0
211
Feb ’25