Hi, sorry to hear you are having problems getting previews working. To verify that the simulators work outside of previews on your system, could you attempt a build and run of your app (⌘R)? If that works, the best next step is to file a feedback report with diagnostics so we can take a look. We will need the diagnostics Xcode Previews generates in order to make sure we understand the error the previews system is encountering. Install the logging profile on your Mac running Xcode, and on your physical preview device if you are using one (iOS or visionOS), using the instructions available here: https://developer.apple.com/bug-reporting/profiles-and-logs/?name=swift Then, when you reproduce the problem in Xcode, either (a) an error banner will appear; click the Diagnostics button
Building for iOS Simulator, but the linked and embedded framework 'XX.framework' was built for
I am working with an open source watch called PineTime to implement ANCS support so users can receive iOS notifications on the watch. I am having trouble discovering the ANCS on an iOS device. The watch is properly paired and bonded. I realize that the ANCS may not always be present, so I also subscribed to the GATT Service Changed characteristic, but I never received a notification/indication for the service becoming active. Is there something I am missing? Thanks!
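For reference, the discover-and-subscribe flow being described looks roughly like this. The sketch below uses Swift and CoreBluetooth purely to illustrate the GATT sequence (PineTime firmware would do the equivalent in its own BLE stack, e.g. NimBLE); the UUIDs are the published ANCS and Service Changed identifiers, and the class name is a placeholder.

import CoreBluetooth

// Illustrative central-side flow only; embedded firmware would use its own
// BLE stack rather than CoreBluetooth.
final class ANCSDiscoverer: NSObject, CBPeripheralDelegate {
    static let ancsService = CBUUID(string: "7905F431-B5CE-4E99-A40F-4B1E122D00D0")
    static let genericAttributeService = CBUUID(string: "1801") // hosts Service Changed
    static let serviceChanged = CBUUID(string: "2A05")

    func startDiscovery(on peripheral: CBPeripheral) {
        peripheral.delegate = self
        // Look for both ANCS and the Generic Attribute service in one pass.
        peripheral.discoverServices([Self.ancsService, Self.genericAttributeService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics(nil, for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService,
                    error: Error?) {
        for characteristic in service.characteristics ?? []
        where characteristic.uuid == Self.serviceChanged {
            // Service Changed is indicate-only; this requests indications.
            peripheral.setNotifyValue(true, for: characteristic)
        }
    }
}

One detail worth double-checking on the firmware side: iOS only exposes ANCS over an encrypted link to a bonded central, so it can pay to re-run service discovery after encryption is established rather than relying solely on a Service Changed indication.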
I can also confirm that the same thing is happening when communicating from the app to the watch: the watch app wakes in the background to process data from the iPhone app and calls reloadAllTimelines, but the watch face complication does not reload. I'm guessing this is an iOS 18 bug, because I've only started hearing about it from my customers in the last few weeks.
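For context, the refresh path being described looks roughly like this on the watch side; the delegate class and the persistence step are placeholders, not the poster's actual code.

import WatchConnectivity
import WidgetKit

// Sketch of the watch-side refresh path: receive data from the iPhone app,
// persist it, then ask WidgetKit to regenerate the complication timelines.
final class SessionDelegate: NSObject, WCSessionDelegate {
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}

    func session(_ session: WCSession, didReceiveUserInfo userInfo: [String: Any]) {
        // Persist the incoming data where the complication's timeline provider
        // can read it (e.g. App Group storage), then request a reload.
        WidgetCenter.shared.reloadAllTimelines()
    }
}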
Hi everyone, I am a beginner in iOS/Swift programming. I'm trying to develop a mobile application that lets the user mount a network drive in the iPhone Files app via the WebDAV protocol. I read that WebDAV is no longer implemented in iOS because Apple considers it deprecated. To accomplish this task, I decided to separate responsibilities as follows:
Framework: WebDav (responsible for communication with the WebDAV server)
FileProviderExtension: FileBridge (responsible for bridging the gap between the WebDav framework and the iOS Files app)
Main App
I also have an App Group that includes the main application and the file provider extension. Initially, to gauge the feasibility and complexity of this task, I'd like to make a simplistic version that simply displays the files on my drive in the Files app, without necessarily being able to interact with them. FileProviderExtension.swift:
import FileProvider
import WebDav
class FileProviderExtension: NS
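Where the snippet cuts off, the class would typically adopt the modern replicated-extension API. A minimal skeleton, assuming the NSFileProviderReplicatedExtension route (all bodies are stubs; the WebDAV calls are only sketched in comments):

import FileProvider

final class FileProviderExtension: NSObject, NSFileProviderReplicatedExtension {

    required init(domain: NSFileProviderDomain) {
        super.init()
    }

    func invalidate() {
        // Cancel any in-flight WebDAV requests here.
    }

    func item(for identifier: NSFileProviderItemIdentifier,
              request: NSFileProviderRequest,
              completionHandler: @escaping (NSFileProviderItem?, Error?) -> Void) -> Progress {
        // Map the identifier to a WebDAV path and return its metadata.
        completionHandler(nil, NSFileProviderError(.noSuchItem))
        return Progress()
    }

    func fetchContents(for itemIdentifier: NSFileProviderItemIdentifier,
                       version requestedVersion: NSFileProviderItemVersion?,
                       request: NSFileProviderRequest,
                       completionHandler: @escaping (URL?, NSFileProviderItem?, Error?) -> Void) -> Progress {
        // Download the file via a WebDAV GET into a temporary URL.
        completionHandler(nil, nil, NSFileProviderError(.noSuchItem))
        return Progress()
    }

    func createItem(basedOn itemTemplate: NSFileProviderItem,
                    fields: NSFileProviderItemFields,
                    contents url: URL?,
                    options: NSFileProviderCreateItemOptions = [],
                    request: NSFileProviderRequest,
                    completionHandler: @escaping (NSFileProviderItem?, NSFileProviderItemFields, Bool, Error?) -> Void) -> Progress {
        // Read-only sketch: writes are not supported yet.
        completionHandler(nil, [], false, NSFileProviderError(.noSuchItem))
        return Progress()
    }

    func modifyItem(_ item: NSFileProviderItem,
                    baseVersion version: NSFileProviderItemVersion,
                    changedFields: NSFileProviderItemFields,
                    contents newContents: URL?,
                    options: NSFileProviderModifyItemOptions = [],
                    request: NSFileProviderRequest,
                    completionHandler: @escaping (NSFileProviderItem?, NSFileProviderItemFields, Bool, Error?) -> Void) -> Progress {
        completionHandler(nil, [], false, NSFileProviderError(.noSuchItem))
        return Progress()
    }

    func deleteItem(identifier: NSFileProviderItemIdentifier,
                    baseVersion version: NSFileProviderItemVersion,
                    options: NSFileProviderDeleteItemOptions = [],
                    request: NSFileProviderRequest,
                    completionHandler: @escaping (Error?) -> Void) -> Progress {
        completionHandler(NSFileProviderError(.noSuchItem))
        return Progress()
    }

    func enumerator(for containerItemIdentifier: NSFileProviderItemIdentifier,
                    request: NSFileProviderRequest) throws -> NSFileProviderEnumerator {
        // Return an enumerator that issues a WebDAV PROPFIND for the container
        // and feeds the resulting items to the observer.
        throw NSFileProviderError(.notAuthenticated)
    }
}

For the read-only goal, the enumerator is the piece that matters: returning metadata items built from a PROPFIND response is enough to make the files appear in the Files app.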
Hi, the new style of tab bar appears at the top on iPadOS, sort of in a picker style. How can I force SwiftUI to use the old-style tab bar at the bottom on iPadOS, the same as on iOS? — Kind Regards
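I'm not aware of a documented SwiftUI switch that restores the bottom placement on iPadOS 18, so one common workaround is a hand-rolled bottom bar. A minimal sketch with placeholder tabs and views:

import SwiftUI

// Workaround sketch: a custom bottom bar driven by a selection state,
// bypassing TabView's iPadOS 18 top placement entirely. Tab names, SF Symbol
// names, and content views are placeholders.
enum Tab: String, CaseIterable {
    case home = "house", search = "magnifyingglass", settings = "gear"
}

struct BottomTabContainer: View {
    @State private var selection: Tab = .home

    var body: some View {
        VStack(spacing: 0) {
            Group {
                switch selection {
                case .home: Text("Home")
                case .search: Text("Search")
                case .settings: Text("Settings")
                }
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            Divider()
            HStack {
                ForEach(Tab.allCases, id: \.self) { tab in
                    Button { selection = tab } label: {
                        Image(systemName: tab.rawValue)
                            .frame(maxWidth: .infinity)
                    }
                    .foregroundStyle(selection == tab ? .primary : .secondary)
                }
            }
            .padding(.vertical, 8)
        }
    }
}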
I had this problem on iOS 18.2. I fixed it by changing the order of my keyboards in Settings > General > Keyboard.
When I try to run my project on the simulator, it tells me there is a bug. It is not in the code I wrote; I believe it is in the compiler. Everything appears to work and the build succeeds, but the simulated phone's screen turns white and stops there. I don't know how to debug it or what to do! Picture of what happens with the phone: Picture of the debugging area:
I use an AppIntent to trigger a widget refresh. The AppIntent is used on a Button or Toggle, as follows:

var isAudibleArming = false

struct SoundAlarmIntent: AppIntent {
    static var title: LocalizedStringResource = "SoundAlarmIntent"

    func perform() async throws -> some IntentResult {
        isAudibleArming = true
        return .result()
    }
}

func timeline(
    for configuration: DynamicIntentWidgetPersonIntent,
    in context: Context
) async -> Timeline<Entry> {
    var entries: [Entry] = []
    let currentDate = Date()
    let entry = Entry(person: person(for: configuration))
    entries.append(entry)
    if isAudibleArming {
        let entry2 = Entry(person: Person(name: "Friend4", dateOfBirth: currentDate.adding(.second, value: 6)))
        entries.append(entry2)
    }
    return .init(entries: entries, policy: .never)
}

The timeline function fires, with entry corresponding to view1 and entry2 corresponding to view2. I expect view1 to show immediately and view2 to show 6 seconds later. This works correctly on iOS 17, but on iOS 18.2 the 6-second delay does not take effect as expected.
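One thing the snippet relies on is isAudibleArming being visible to both the intent and the timeline provider; depending on which process runs perform(), a process-global can read stale. A hedged sketch of sharing the flag through an App Group instead (the suite name and key are placeholders):

import AppIntents

// Sketch: persist the flag in App Group storage so the timeline provider
// reads the current value wherever it runs. Names are placeholders.
let sharedDefaults = UserDefaults(suiteName: "group.example.widget")

struct SoundAlarmIntent: AppIntent {
    static var title: LocalizedStringResource = "Sound Alarm"

    func perform() async throws -> some IntentResult {
        sharedDefaults?.set(true, forKey: "isAudibleArming")
        return .result() // WidgetKit reloads the widget after perform() returns
    }
}

The timeline provider would then check sharedDefaults?.bool(forKey: "isAudibleArming") instead of the global.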
We are a team of engineers working on an app intended to visualize medical images. The situations where the app is used involve time-critical decision making for acute clinical conditions, so stability and performance are of utmost importance and directly support timely treatment action. The app uses multiple libraries and tools such as vtk, webgl, opengl, webkit, and gl-matrix. The problem can be described as follows: when a 3D volume is rendered in the app and we try to rotate it, the rotation is slow, unresponsive, and laggy. Specifically, we have noticed that on iOS 18.1 the volume rotation is much smoother compared to the latest iOS 18.2. Earlier, we faced a somewhat similar issue with iOS 17, but it improved in iOS 18.1. This performance regression is affecting the user experience in our healthcare application. We have taken reference from the cornerstone.js code and you can reproduce the issue
We want to add the following to our iOS mobile app. AirPods announcing a push notification already works. Now we want the user to be able to say "Reply to this" and have the reply sent to that notification, but Siri says this is not supported in our app. So basically we need the feature "Listen and respond to messages with AirPods". Do we need to add any integration inside the app for this, or will it work directly through Siri settings? Is it possible in a non-messaging app? Is it possible without syncing contacts?
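As far as I understand, spoken replies are tied to SiriKit messaging support plus communication notifications, so some in-app integration is needed. A minimal sketch of the communication-notification side, donating an INSendMessageIntent from a notification service extension (sender name and identifiers are placeholders):

import Intents
import UserNotifications

// Sketch: inside a UNNotificationServiceExtension, tag an incoming push as a
// communication notification by donating an INSendMessageIntent and updating
// the notification content from it.
final class NotificationService: UNNotificationServiceExtension {
    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        let sender = INPerson(personHandle: INPersonHandle(value: "user123", type: .unknown),
                              nameComponents: nil,
                              displayName: "Alice",
                              image: nil,
                              contactIdentifier: nil,
                              customIdentifier: "user123")
        let intent = INSendMessageIntent(recipients: nil,
                                         outgoingMessageType: .outgoingMessageText,
                                         content: request.content.body,
                                         speakableGroupName: nil,
                                         conversationIdentifier: "thread-1",
                                         serviceName: nil,
                                         sender: sender,
                                         attachments: nil)
        let interaction = INInteraction(intent: intent, response: nil)
        interaction.direction = .incoming
        interaction.donate(completion: nil)

        // Fall back to the unmodified content if updating fails.
        let updated = (try? request.content.updating(from: intent)) ?? request.content
        contentHandler(updated)
    }
}

Replying by voice additionally requires the app to handle INSendMessageIntent (SiriKit Messaging), which is why it is limited to messaging-style apps; because the intent supplies the sender itself, this does not require syncing the user's contacts.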
I have an app written in Swift with multiple pod dependencies installed. When the app is generated with Xcode 15.4, its size is ~148 MB; when the same app is generated with Xcode 16.2, the size is ~246 MB. When I extracted and analysed the app, I observed that one of the frameworks installed via pods (DocumentReaderCore) was consuming most of the extra size (42.9 MB vs 215 MB). The binary at Payload/.app/Frameworks/DocumentReaderCore.framework/DocumentReaderCore was reported as a Linux executable when generated using Xcode 15.4, whereas the one from Xcode 16.2 was reported as a plain document type and was 215 MB.
Hi, I am using fastlane and match to upload an app to TestFlight. The app requires the com.apple.developer.storekit.external-link.account entitlement. My identifier has the capability enabled, and when I look at the provisioning profile it also has it, but when the app has been uploaded the entitlement is missing (although all other entitlements are there). Now it gets weird: every time I run my flow, I delete derived data, delete all downloaded provisioning profiles, and use match to redownload them read-only (force is not an option). It does not work. Then I go to the Apple Developer site, toggle the capability off and on, and save; this invalidates the profile. I press edit and save, so no real changes. I run my flow and the app is uploaded correctly WITH the correct capabilities. I run the flow again without the manual steps and the entitlement is missing once again. Repeating the toggle steps reproduces the same behavior. Same flow, same code, same settings, app and profile... I am building automated
You can use Easy App Reports to extract all your iOS reviews from all countries (and for multiple apps) at the same time.
I am trying to convert a JPG image to JP2 (JPEG 2000) format using the ImageMagick library on iOS. However, although the file extension changes to .jp2, the image format does not seem to change: the output image is still treated as a JPG file, not a true JP2. Here is the code:

- (IBAction)convertButtonClicked:(id)sender {
    NSString *jpgPath = [[NSBundle mainBundle] pathForResource:@"Example" ofType:@"jpg"];
    NSString *tempFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"Converted.jp2"];

    MagickWand *wand = NewMagickWand();
    if (MagickReadImage(wand, [jpgPath UTF8String]) == MagickFalse) {
        char *description;
        ExceptionType severity;
        description = MagickGetException(wand, &severity);
        NSLog(@"Error reading image: %s", description);
        MagickRelinquishMemory(description);
        return;
    }
    if (MagickSetFormat(wand, "JP2") == MagickFalse) {
        char *description;
        ExceptionType severity;
        description = MagickGetException(wand, &severity);
        NSLog(@"Error setting image format to JP2: %s", description);
        MagickRelinquishMemory(description);
        return;
    }
    MagickWriteImage(wand, [tempFilePath UTF8String]);
    DestroyMagickWand(wand);
}
I have developed a mobile app using SwiftUI that supports Google Maps. Now I am in the process of building a CarPlay application. I assume CarPlay only supports Apple MapKit, as I could not find any way to integrate Google Maps. Below are a few queries:
Could you please guide me on how to obtain the user's current location when the CarPlay app launches?
Is there a way for CarPlay to get details from the mobile app (not quite sure, since it uses Google Maps)?
If the user is logged out of the mobile app, what is the flow in CarPlay? Is there a standard login page asking the user to log in to the mobile app first?
Is there any UI asking the user to grant location access in CarPlay?
This is my first CarPlay app. Kindly point me to a document that covers these details. Thanks a ton!!
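On the current-location question: a CarPlay scene runs in the same process as the iPhone app, so plain CoreLocation works from the scene delegate. A minimal sketch, assuming a navigation app with the CarPlay navigation entitlement (class name and template choice are placeholders):

import CarPlay
import CoreLocation

// Sketch of a CarPlay scene delegate that starts location updates as soon as
// the scene connects. Authorization state is shared with the iPhone app,
// since both run in the same process.
final class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    var interfaceController: CPInterfaceController?
    let locationManager = CLLocationManager()

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
        // CPMapTemplate requires the CarPlay navigation entitlement.
        let mapTemplate = CPMapTemplate()
        interfaceController.setRootTemplate(mapTemplate, animated: true, completion: nil)
    }
}

Because both sides share one process, login state and other details can simply be read from the app's existing storage; there is no separate CarPlay login UI, so apps typically show a template prompting the user to sign in on the iPhone first.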