I read online that there is no way to extract the call log from an iPhone. I want to develop an app to help people remember to call their mom, and if they did, the nagging would disappear automatically. I'm looking for any workaround to know when a user called someone, without having them log it manually.
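For what it's worth, the closest documented hook I know of is CallKit's CXCallObserver, which only reports call state changes while your app happens to be running; it does not expose the call history or the number that was dialed. A minimal sketch, assuming all you need is a timestamp of the last outgoing call (the persistence key is made up):

import CallKit
import Foundation

final class CallActivityMonitor: NSObject, CXCallObserverDelegate {
    private let observer = CXCallObserver()

    override init() {
        super.init()
        observer.setDelegate(self, queue: nil)
    }

    func callObserver(_ callObserver: CXCallObserver, callChanged call: CXCall) {
        // Treat any outgoing call that has ended as "the user made a call".
        if call.isOutgoing && call.hasEnded {
            UserDefaults.standard.set(Date(), forKey: "lastOutgoingCallDate") // assumed key
        }
    }
}

Since the observer cannot tell who was called, the nag could at best be cleared on "any outgoing call", and only if the app is alive at that moment.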
Search results for "build disappears" (50,315 results found)
Hello, I’m seeking clarification on whether Apple provides any framework or API that enables deep integration between Siri and advanced AI assistants (such as ChatGPT), including system-level functions like voice interaction, navigation, cross-platform syncing, and operational access similar to Siri’s own capabilities. If no such option exists today, I would appreciate guidance on the recommended path or approved third-party solutions for building a unified, voice-first experience across Apple’s ecosystem. Thank you for your time and insight.
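As far as I can tell, the supported path for exposing app functionality to Siri (and, on newer releases, to Apple Intelligence) is the App Intents framework rather than any assistant-to-assistant bridge. A minimal sketch of a voice-invocable intent; the intent name and dialog are placeholders:

import AppIntents

struct AskAssistantIntent: AppIntent {
    static var title: LocalizedStringResource = "Ask My Assistant"

    @Parameter(title: "Question")
    var question: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Forward the question to your own AI backend here (omitted).
        return .result(dialog: "You asked: \(question)")
    }
}

Anything beyond that (system-level navigation, Siri-equivalent operational access) is not something I have seen a public API for.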
I'm trying to benchmark a Core Image filter chain's memory footprint and noticed a weird quirk in Instruments. On a real device, even with a simple Core Image chain, the memory balloons each time I run the filter. See attached screenshots. Running on iPhone 17 Pro: Running on simulator (M2 MacBook Pro): As you can see, there's a huge build-up of 4 MB VM: IOSurface allocations on the real device, but the simulator seems to clean them up correctly. Here's my basic code:
func processImage() {
    guard let inputImage = ContentViewModel.loadImageFromBundle(name: "kitty.HEIC") else {
        print("Failed to load sample_image from bundle")
        return
    }
    var outputImage = inputImage
    outputImage = outputImage.applyingFilter("CIBloom", parameters: [
        kCIInputRadiusKey: 20,
        kCIInputIntensityKey: 0.8
    ])
    DispatchQueue.global(qos: .userInitiated).async {
        let data = self.context.jpegRepresentation(of: outputImage, colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
        if let data = data, let uiImage = UIImage(data: data) {
            DispatchQueue.main.async {
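Not an explanation for the device/simulator difference, but two things I would try before anything else: reuse a single CIContext and wrap each render in an autoreleasepool so the IOSurface-backed buffers can be reclaimed between runs. A minimal sketch under those assumptions:

import CoreImage
import CoreGraphics
import Foundation

final class FilterRunner {
    // One shared context; creating a CIContext per render is a common source of memory growth.
    private let context = CIContext()

    func renderJPEG(from image: CIImage) -> Data? {
        autoreleasepool {
            // Same output settings as the original snippet.
            context.jpegRepresentation(of: image,
                                       colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
        }
    }
}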
The test target does not build, failing with "no such file or directory: 'CoreGraphics'". I'm not sure why I get this error; I did configure the test target with BUNDLE_LOADER set to $(BUILT_PRODUCTS_DIR)/MyExistingApp.app/MyExistingApp and TEST_HOST set to $(BUNDLE_LOADER). On the app target (not the test target), the Symbols Hidden by Default build setting is NO, unlike the test target, where it is set to YES. Is there any other build setting I need to set? I'm also not sure whether there is anything else to take into account when using Xcode 26.1.1 and the Swift Testing framework.
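For comparison, the arrangement Xcode typically generates for a hosted unit-test target is the inverse reference direction, with TEST_HOST pointing at the app binary and BUNDLE_LOADER reusing it; in xcconfig form, reusing the app name from above:

TEST_HOST = $(BUILT_PRODUCTS_DIR)/MyExistingApp.app/MyExistingApp
BUNDLE_LOADER = $(TEST_HOST)

The resolved values should come out the same either way, so this alone may not explain the 'CoreGraphics' error.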
Hello everyone, I’m looking for guidance regarding my app review timeline, as things seem unusually delayed compared to previous submissions. My iOS app was rejected on November 19th due to AI-related policy questions. I immediately responded to the reviewer with detailed explanations covering:
Model used (Gemini Flash 2.0 / 2.5 Lite)
How the AI only generates neutral, non-directive reflective questions
How the system prevents any diagnosis, therapy-like behavior or recommendations
Crisis-handling limitations
Safety safeguards at generation and UI level
Internal red-team testing and results
Data retention, privacy, and non-use of data for model training
After sending the requested information, I resubmitted the build on November 19th at 14:40. Since then:
November 20th (7:30) → Status changed to In Review.
November 21st, 22nd, 23rd, 24th, 25th → No movement, still In Review.
My open case on App Store Connect is still pending without updates. Because of the previous rejection, I expected a short delay
Hello, I’m seeking help regarding an App Review situation that has become increasingly difficult to resolve. Our app has been in review since July 29, and despite multiple rounds of communication, detailed explanations, updated builds, and full implementation of all required changes, the review process continues to stall without clear feedback. Most recently, we received new review questions regarding the External Purchase entitlement and the “Notify Me When Open” feature. We provided a detailed explanation addressing each point, including clarification that:
All digital purchases, including Online Groups, use the StoreKit External Purchase API.
The External Purchase Modal Sheet appears before every purchase flow.
The “Notify Me When Open” feature does not initiate or bypass a payment flow; it only notifies users when a time-scheduled course becomes available, after which the user proceeds through the standard StoreKit External Purchase Modal Sheet.
After submitting these explanations, our app return
Hello, I am currently implementing External Purchase Link and External Purchase Custom Link and am encountering an issue where both ExternalPurchaseLink.canOpen and ExternalPurchaseCustomLink.isEligible always return false under all test conditions. I would like to confirm whether my setup is missing any required steps or whether this behavior is expected. Below are the details of my current environment and configuration:
🔧 1. Development Environment
Xcode: 16.3, 16.4, 26.0 beta 4
Devices: iPhone running iOS 26.2 beta; iPhone running iOS 16.7.12; macOS 15.5 (real device testing); Simulator iOS 18.0
Build Type: Local development build using a Developer Provisioning Profile, with a sandbox account signed in during testing
🔑 2. Entitlements (Developer site & Xcode)
In Certificates → Identifiers → App ID, both capabilities are enabled:
StoreKit External Purchase
StoreKit External Purchase Link
The .entitlements file in Xcode includes:
com.apple.developer.storekit.external-purchase = YES
com.apple.dev
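For what it's worth, this is the minimal check I would expect to work once the entitlements and storefront/eligibility conditions are satisfied; the property names are the ones from the post plus ExternalPurchaseLink.open(), and the surrounding SwiftUI view is hypothetical:

import StoreKit
import SwiftUI

struct ExternalPurchaseButton: View {
    @State private var canOpenLink = false

    var body: some View {
        Button("Purchase on our website") {
            Task {
                // Presents the system disclosure sheet before leaving the app.
                try? await ExternalPurchaseLink.open()
            }
        }
        .disabled(!canOpenLink)
        .task {
            // Both properties are async, and they depend on storefront/region eligibility,
            // not only on the entitlements being present in the profile.
            canOpenLink = await ExternalPurchaseLink.canOpen
            let customEligible = await ExternalPurchaseCustomLink.isEligible
            print("canOpen: \(canOpenLink), custom isEligible: \(customEligible)")
        }
    }
}

If a sketch like this still prints false everywhere, a provisioning profile that does not actually contain the entitlements would be my first suspect.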
I’m building a photo‑gallery view that mimics the iOS Photos app when it’s zoomed out to the maximum level: all years are displayed at once, with roughly 400 tiny thumbnails per page. The user experience of the system app is that the view is instantly visible, and scrolling keeps thumbnails appearing instantly. I’ve already tried fetching thumbnails with PHImageManager and PHCachingImageManager, requesting the .fastFormat representation. However, the thumbnails still take several seconds to load, so the scrolling experience is noticeably laggy compared to the system app. Is there another approach or technique, perhaps a different caching strategy, pre‑fetching, or a lower‑level API, that would allow me to retrieve and display thumbnails as quickly as (or faster than) the native Photos app? Any guidance or code snippets would be greatly appreciated.
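A sketch of the prefetch-driven pattern I would try, assuming a UICollectionView with a prefetching data source and a small fixed tile size (the cell wiring is omitted and the 80x80 size is an assumption):

import Photos
import UIKit

final class ThumbnailProvider {
    private let cachingManager = PHCachingImageManager()
    private let targetSize = CGSize(width: 80, height: 80)   // assumed tile size in pixels
    private let options: PHImageRequestOptions = {
        let o = PHImageRequestOptions()
        o.deliveryMode = .fastFormat        // accept a low-quality image quickly
        o.resizeMode = .fast
        o.isNetworkAccessAllowed = false    // avoid iCloud round-trips while scrolling
        return o
    }()

    // Call from collectionView(_:prefetchItemsAt:) so decoding happens ahead of display.
    func startCaching(_ assets: [PHAsset]) {
        cachingManager.startCachingImages(for: assets,
                                          targetSize: targetSize,
                                          contentMode: .aspectFill,
                                          options: options)
    }

    func requestThumbnail(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
        cachingManager.requestImage(for: asset,
                                    targetSize: targetSize,
                                    contentMode: .aspectFill,
                                    options: options) { image, _ in
            completion(image)
        }
    }
}

The main difference from plain per-cell requests is that startCachingImages is driven by the prefetch API, so decoding starts before the tiles scroll on screen.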
Hello, I am studying the Building peer-to-peer apps codebase https://developer.apple.com/documentation/wifiaware/building-peer-to-peer-apps and am wondering why no connection is ever started. I searched the codebase and didn't find .start() being called even once. The start function I'm referencing: https://developer.apple.com/documentation/network/networkconnection/start() Are NetworkConnections started automatically? Note that I am using QUIC NetworkConnections (NetworkConnection) in what I'm trying to do.
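I can't speak to the newer NetworkConnection type, but for contrast, the older NWConnection API does require an explicit start before anything happens on the wire. A minimal QUIC sketch with a placeholder endpoint and ALPN value:

import Network

let quicOptions = NWProtocolQUIC.Options()
quicOptions.alpn = ["h3"]   // placeholder ALPN
let connection = NWConnection(host: "example.com", port: 4433, using: NWParameters(quic: quicOptions))

connection.stateUpdateHandler = { state in
    print("connection state: \(state)")
}
connection.start(queue: .main)   // nothing happens until this is called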
Hello, I'm trying to wrap my head around how to properly set up an Xcode project to produce a static library. Why? I want the files installed at /usr/local/lib/libXXX.a and /usr/local/include/XXX/xxx.h so the library can be used Unix-style in other projects.
That's not really the way that macOS works.
I could write an old-style Makefile and have Xcode call the Makefile, but there must be an easier way to do this.
What's wrong with a Makefile? Xcode is designed to build iOS apps. There is no easy way to make it build open-source-style archives and headers. And why should there be? Any open-source project would be using standard tools to do this kind of thing. They would never, ever use Xcode.
This is for cross-platform development, so having it packaged into a framework would not solve it either.
The Mac and Xcode are not useful for cross-platform development. Just use whatever standard tools fit your technical and social requirements: autotools, CMake, Google-build-engine-du-jour, whatever. Now if you wanted to build
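Since the thread mentions falling back to a Makefile, here is roughly the minimal version for the layout described above; xxx.c and the XXX names are placeholders, and recipe lines are indented with a tab as make requires:

PREFIX = /usr/local

libXXX.a: xxx.o
	ar rcs $@ $^

xxx.o: xxx.c XXX/xxx.h
	$(CC) -c -o $@ xxx.c

install: libXXX.a
	install -d $(PREFIX)/lib $(PREFIX)/include/XXX
	install -m 644 libXXX.a $(PREFIX)/lib/
	install -m 644 XXX/xxx.h $(PREFIX)/include/XXX/

Whether this lives next to the Xcode project or replaces it is a matter of taste; an External Build System target in Xcode can invoke it if needed.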
Topic: Developer Tools & Services
SubTopic: Xcode
When using a Swift build plugin, the generated code definitions are available through autocomplete, but it is currently not possible to view them directly in Xcode using Option+click. An example of such a plugin is swift-openapi-generator. According to the Meet Swift Package plugins session from WWDC22, the generated code is stored with the other build artifacts. It would be immensely helpful if there were support for viewing these intermediate files in read-only mode using Option+click. Currently, I have to resort to opening these files through Finder, or opening the project in VS Code, where viewing the generated files using Cmd+click works without a problem. Am I missing something? If not, it seems like a big oversight that this is not supported to the same extent in Apple's own tools.
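For context, this is the kind of setup where the issue shows up; a trimmed Package.swift using swift-openapi-generator as the build plugin (package name, platforms, and version numbers are assumptions):

// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyService",   // placeholder name
    platforms: [.iOS(.v16), .macOS(.v13)],
    dependencies: [
        .package(url: "https://github.com/apple/swift-openapi-generator", from: "1.0.0"),
        .package(url: "https://github.com/apple/swift-openapi-runtime", from: "1.0.0"),
    ],
    targets: [
        .target(
            name: "MyService",
            dependencies: [
                .product(name: "OpenAPIRuntime", package: "swift-openapi-runtime"),
            ],
            // The generated sources land in the package's build directory,
            // which is what Option+click in Xcode currently can't navigate into.
            plugins: [
                .plugin(name: "OpenAPIGenerator", package: "swift-openapi-generator"),
            ]
        ),
    ]
)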
Hello! I'm encountering a weird issue on TestFlight/App Store Connect. I have builds that have been in the Processing state since the 19th (that's 5 days ago as of this post). I've tried generating new builds to see if it was just a fluke during those days, but new builds generated today also get stuck at Processing. It's starting to become an issue because nobody on our team who isn't a developer can test the latest changes. Is there any way I can get them unstuck? Thanks!
Hi, I created a feedback report from the build navigator in Xcode Cloud but have not received an ID for it yet. Even Feedback Assistant doesn't show anything. As soon as the ID is available, I will post it here. Kind regards, Maik
Topic: Developer Tools & Services
SubTopic: Xcode Cloud
Responding to user reviews boosts ASO by improving sentiment, restoring ratings, and building user trust to enhance overall ranking.
Topic: App Store Distribution & Marketing
SubTopic: App Store Connect
Hello, we are experiencing a persistent issue where macOS builds in Xcode Cloud consistently hang at the Archive stage. The build itself completes successfully, but no artifacts appear, and it seems the build gets stuck during artifact upload. These builds remain in this state for several days (currently 3 days and counting), not failing but never finishing. We opened a support ticket (102756662562), but we have not received any response yet. We rely on Xcode Cloud for our entire CI/CD pipeline, and at the moment our workflow is completely blocked because of this issue. Has anyone encountered something similar or found a workaround? Thank you.