Context: SwiftUI TextField with a String for simple math using NSExpression.
I first prepare the input string to an extent, but malformed input made of valid characters still fails, as expected. Let's say preparedExpression is "5--".
let expr = NSExpression(format: preparedExpression)
gives
FAULT: NSInvalidArgumentException: Unable to parse the format string "5-- == 1"; (user info absent)
How can I use NSExpression so that the preparedExpression is either pre-tested before it is actually evaluated, or the error is handled gracefully in a way I can use to alert the user to try again?
Is there a Swift alternative to NSExpression that I've missed?
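One pragmatic option, offered as a sketch rather than a prescription: pre-test the prepared string before handing it to NSExpression, because NSExpression(format:) raises an Objective-C exception on malformed input that plain Swift code cannot catch. The validator below is a hypothetical, minimal one that only accepts "number operator number ..." shapes; adapt it to whatever characters you actually allow.

import Foundation

// A minimal sketch, not a full parser: reject obviously malformed arithmetic
// before NSExpression ever sees it, since a parse failure raises an
// Objective-C exception that Swift cannot catch directly.
func evaluate(_ preparedExpression: String) -> NSNumber? {
    // Accept only "number (operator number)*", e.g. "5-2" or "3.5*2+1".
    let pattern = #"^\s*\d+(\.\d+)?(\s*[-+*/]\s*\d+(\.\d+)?)*\s*$"#
    guard preparedExpression.range(of: pattern, options: .regularExpression) != nil else {
        return nil   // malformed; alert the user to try again
    }
    let expr = NSExpression(format: preparedExpression)
    return expr.expressionValue(with: nil, context: nil) as? NSNumber
}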
Hello everyone,
Since Xcode versions later than 15, the following error handling produces the error "Pattern of type 'DecodingError' cannot match 'Never'":
func getSupportedCountries() async {
    // fetch all documents from the "countries" collection in Firestore
    let queryCountries = try? await db.collection("countries").getDocuments()
    if queryCountries != nil {
        self.countries = (queryCountries!.documents.compactMap({ (queryDocumentSnapshot) -> Country? in
            let result = Result { try? queryDocumentSnapshot.data(as: Country.self) }
            switch result {
            case .success(let country):
                if let country = country {
                    // A Country value was successfully initialized from the DocumentSnapshot
                    self.errorMessage = nil
                    return country
                }
                else {
                    // A nil value was successfully initialized from the DocumentSnapshot,
                    // or the DocumentSnapshot was nil
                    self.errorMessage = "Document doesn't exist."
                    return nil
                }
            case .failure(let error):
                // A Country value could not be initialized from the DocumentSnapshot
                switch error {
                case DecodingError.typeMismatch(_, let context):
                    self.errorMessage = "\(error.localizedDescription): \(context.debugDescription)"
                case DecodingError.valueNotFound(_, let context):
                    self.errorMessage = "\(error.localizedDescription): \(context.debugDescription)"
                case DecodingError.keyNotFound(_, let context):
                    self.errorMessage = "\(error.localizedDescription): \(context.debugDescription)"
                case DecodingError.dataCorrupted(let key):
                    self.errorMessage = "\(error.localizedDescription): \(key)"
                default:
                    self.errorMessage = "Error decoding document: \(error.localizedDescription)"
                }
                return nil
            }
        }))
    } else {
        self.errorMessage = "No documents in 'countries' collection"
        return
    }
}
The interesting part of the code, where Xcode shows the error, is from "switch error" downwards.
Does anyone have an idea what's wrong?
Any help appreciated!
Thanks, Peter
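For what it's worth, a minimal sketch of what I believe is going on and the likely fix: with newer toolchains Result { ... } infers its Failure type from the closure, and because try? swallows the thrown error the closure cannot throw, so Failure becomes Never and a DecodingError pattern can no longer match. Using plain try keeps the error inside the Result (the inner switch over the DecodingError cases can stay exactly as it was):

// Sketch of the likely fix: use `try`, not `try?`, inside Result so the
// failure case carries the real error instead of collapsing to Never.
let result = Result { try queryDocumentSnapshot.data(as: Country.self) }
switch result {
case .success(let country):
    self.errorMessage = nil
    return country
case .failure(let error as DecodingError):
    // switch over the DecodingError cases here, as in the original code
    self.errorMessage = "Error decoding document: \(error)"
    return nil
case .failure(let error):
    self.errorMessage = "Error decoding document: \(error.localizedDescription)"
    return nil
}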
I am currently studying the Accelerate library by referring to Apple documentation.
Here is the link to the referenced document:
https://developer.apple.com/documentation/accelerate/veclib/vforce
When I executed the sample code provided at the bottom of the document, I found a case where the results were different.
let n = 10_000
let x = (0..<n).map { _ in
    Float.random(in: 1 ... 10_000)
}
let y = x.map {
    return sqrt($0)
}
and
let y = [Float](unsafeUninitializedCapacity: n) { buffer, initializedCount in
    vForce.sqrt(x,
                result: &buffer)
    initializedCount = n
}
The code below is provided to observe the issue described above.
import Accelerate

Task {
    let n = 1 // 10_000
    let x = (0..<n).map { _ in
        Float(6737.015) // Float.random(in: 1 ... 10_000)
    }
    let y = x.map {
        return sqrt($0)
    }
    try? await Task.sleep(nanoseconds: 1_000_000_000)
    let z = [Float](unsafeUninitializedCapacity: n) { buffer, initializedCount in
        vForce.sqrt(x, result: &buffer)
        initializedCount = n
    }
}
For a value of 6737.015 when calculating the square root:
Using the sqrt(_:) function gives the result 82.07932, while using the vForce.sqrt(_:result:) function gives the result 82.07933.
Using a calculator, the value comes out as 82.07932139, which shows that the result from vForce is incorrect.
Could you explain the reason behind this difference?
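Not an authoritative answer, but a way to quantify it: the two printed values are only about one unit in the last place (ulp) apart for a Float near 82, and as far as I understand the vForce functions trade a tiny amount of accuracy for vectorized speed, while the scalar sqrt returns the correctly rounded result. A small sketch to measure the gap in ulps instead of comparing printed digits:

import Accelerate

// Sketch: express the difference between scalar sqrt and vForce.sqrt in ulps
// of the result, rather than eyeballing decimal printouts.
let v: Float = 6737.015
let scalar = sqrt(v)
var vectorized = [Float](repeating: 0, count: 1)
vForce.sqrt([v], result: &vectorized)
let ulps = (scalar - vectorized[0]) / scalar.ulp
print(scalar, vectorized[0], "difference in ulps:", ulps)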
Hey everyone,
I’m learning async/await and trying to fetch an image from a URL off the main thread to avoid overloading it, while updating the UI afterward. Before starting the fetch, I want to show a loading indicator (UI-related work). I’ve implemented this in two different ways using Task and Task.detached, and I have some doubts:
Is using Task { @MainActor in ... } the better approach?
I added @MainActor because, after await, the resumed execution might not return to the Task's original actor. Is this the right way to ensure UI updates are done safely?
Does calling fetchImage() on @MainActor force it to run entirely on the main thread?
I used an async data fetch function (not explicitly marked with any actor). If I were to use a completion handler instead, would the function run on the main thread?
Is using Task.detached overkill here?
I tried Task.detached to ensure the fetch runs on a non-main actor. However, it seems to involve unnecessary actor hopping since I still need to hop back to the main actor for UI updates. Is there any scenario where Task.detached would be a better fit?
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // MARK: First approach
        Task { @MainActor in
            showLoading()
            let image = try? await fetchImage() // Will the image fetch happen on the main thread?
            updateImageView(image: image)
            hideLoading()
        }
        // MARK: Second approach
        Task { @MainActor in
            showLoading()
            let detachedTask = Task.detached {
                try await self.fetchImage()
            }
            updateImageView(image: try? await detachedTask.value)
            hideLoading()
        }
    }
    func fetchImage() async throws -> UIImage {
        let url = URL(string: "https://via.placeholder.com/600x400.png?text=Example+Image")!
        // Async data function call
        let (data, response) = try await URLSession.shared.data(from: url)
        guard let httpResponse = response as? HTTPURLResponse, httpResponse.statusCode == 200 else {
            throw URLError(.badServerResponse)
        }
        guard let image = UIImage(data: data) else {
            throw URLError(.cannotDecodeContentData)
        }
        return image
    }
    func showLoading() {
        // Show loader handling
    }
    func hideLoading() {
        // Hides the loader
    }
    func updateImageView(image: UIImage?) {
        // Image view updated
    }
}
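For what it's worth, a sketch of a third variant under one assumption worth checking: because UIViewController is annotated @MainActor, members of the subclass (including fetchImage()) inherit main-actor isolation, so in the first approach only fetchImage's own synchronous code runs on the main thread; the URLSession download itself happens off it while the task is suspended. Marking the fetch nonisolated moves even that synchronous work to the cooperative pool, without reaching for Task.detached:

// Sketch: keep UI work on the main actor, move the fetch off it explicitly.
// (Assumption: the class is main-actor isolated via UIViewController.)
nonisolated func fetchImage() async throws -> UIImage {
    let url = URL(string: "https://via.placeholder.com/600x400.png?text=Example+Image")!
    let (data, _) = try await URLSession.shared.data(from: url)
    guard let image = UIImage(data: data) else { throw URLError(.cannotDecodeContentData) }
    return image
}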
I want to build a Swift library package that uses modified builds of OpenSSL and curl.
I have already statically compiled both and verified I can use them in an Objective-C framework on my target platform (iOS & iOS Simulator). I'm using XCFramework files that contain the static library binaries and headers:
openssl.xcframework/
    ios-arm64/
        openssl.framework/
            Headers/
                [...]
            openssl
    ios-arm64_x86_64-simulator/
        openssl.framework/
            Headers/
                [...]
            openssl
    Info.plist
I'm not sure how I'm supposed to set up my Swift package to import these libraries.
I can use .systemLibrary, but that seems to use the copies of libssl and libcurl installed on my system, and I can't figure out how to use its path: parameter.
I also tried using a .binaryTarget pointing to the XCFramework files, but that didn't seem to work as there is no module generated and I'm not sure how to make one myself.
At a high level, this is what I'm trying to accomplish: libcrypto and libssl come from the provided openssl.xcframework file, and libcurl comes from curl.xcframework.
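For reference, a rough sketch of one way this can be wired up in Package.swift; the package and directory names here are hypothetical, and note that for Swift to import the C headers the frameworks generally need a module map next to the headers (or you wrap the headers in a small C target instead):

// swift-tools-version:5.9
import PackageDescription

// Sketch only: each prebuilt XCFramework becomes a binaryTarget, and a Swift
// target depends on both. Paths and names below are placeholders.
let package = Package(
    name: "MyNetworkingKit",
    platforms: [.iOS(.v14)],
    products: [
        .library(name: "MyNetworkingKit", targets: ["MyNetworkingKit"])
    ],
    targets: [
        .binaryTarget(name: "openssl", path: "Binaries/openssl.xcframework"),
        .binaryTarget(name: "curl", path: "Binaries/curl.xcframework"),
        .target(name: "MyNetworkingKit", dependencies: ["openssl", "curl"])
    ]
)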
Hello dear community,
I have the sample code from Apple, “CapturingDepthUsingLiDAR”, to access the LiDAR on my iPhone 12 Pro. My goal is to use the photo output function to generate a point cloud from a single image and then save it as a .ply file. So far I have tested different approaches to create a .ply file from the depth map, the intrinsic camera data, and the RGBA values. Unfortunately, I have had no success so far; the result has always been an incorrect point cloud.
My question now is whether there are already approaches to this and whether anyone has any experience with it.
Thank you very much in advance!!!
Hello, I'm having an error in a SwiftUI project of mine. I use fullScreenCover to navigate through views. Normally it works, but at one point it doesn't. I go through MainMenu -> SomeOtherView -> GameView -> AfterGameView -> SomeOtherView -> MainMenu. When it finally returns to MainMenu, the main menu shows for a split second and then goes back to GameView. An error in the console caught my attention:
> A new orientation transaction token is being requested while a valid one already exists. reason=Fullscreen transition (dismissing): fromVC=<_TtGC7SwiftUI29PresentationHostingControllerVS_7AnyView_: 0x10795ca00>; toVC=<_TtGC7SwiftUI29PresentationHostingControllerVS_7AnyView_: 0x1071c3400>;; windowOrientation=portrait; sceneOrientation=portrait; existingTransaction=<_UIForcedOrientationTransactionToken: 0x600001804a40; state: active; originalOrientation: portrait (1)>
I can't find the solution. I need help ASAP, as I'm about to release a bug-fix update to the App Store.
I'm dealing with a strange bug where I am requesting read access for 'appleExerciseTime' and 'activitySummaryType', and despite enabling both in the permission sheet, they are being set to 'sharingDenied'.
I'm writing a Swift Test for making sure permissions are being granted.
@Test
func PermissionsGranted() async throws {
    try await self.manager.getPermissions()
    for type in await manager.allHealthTypes {
        let status = await manager.healthStore.authorizationStatus(for: type)
        #expect(status == .sharingAuthorized, "\(type) authorization status is \(status)")
    }
}

let healthTypesToShare: Set<HKSampleType> = [
    HKQuantityType(.bodyMass),
    HKQuantityType(.bodyFatPercentage),
    HKQuantityType(.leanBodyMass),
    HKQuantityType(.activeEnergyBurned),
    HKQuantityType(.basalEnergyBurned),
    HKObjectType.workoutType()
]

let allHealthTypes: Set<HKObjectType> = [
    HKQuantityType(.bodyMass),
    HKQuantityType(.bodyFatPercentage),
    HKQuantityType(.leanBodyMass),
    HKQuantityType(.activeEnergyBurned),
    HKQuantityType(.basalEnergyBurned),
    HKQuantityType(.appleExerciseTime),
    HKObjectType.activitySummaryType()
]

let healthStore = HKHealthStore()

func getPermissions() async throws {
    try await healthStore.requestAuthorization(toShare: self.healthTypesToShare, read: self.allHealthTypes)
}
After 'getPermissions' runs, the permission sheet shows up on the Simulator, and I accept all. I've double checked that the failing permissions show up on the sheet and are enabled. Then the test fails with:
Expectation failed: (status → HKAuthorizationStatus(rawValue: 1)) == (.sharingAuthorized → HKAuthorizationStatus(rawValue: 2)) HKActivitySummaryTypeIdentifier authorization status is HKAuthorizationStatus(rawValue: 1)
With the rawValue of '1' being 'sharingDenied'. All other permissions are granted. Is there a workaround here, or something I'm potentially doing wrong?
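In case it helps narrow things down: HKHealthStore.authorizationStatus(for:) reports the app's permission to save (share) a type, and read authorization is deliberately not disclosed, so read-only types such as appleExerciseTime and the activity summary never report .sharingAuthorized even when read access was granted in the sheet. A sketch of a test restricted to the shareable types, under that assumption:

// Sketch: only assert on types the app asked to share, since
// authorizationStatus(for:) reflects write permission, not read permission.
@Test
func sharePermissionsGranted() async throws {
    try await manager.getPermissions()
    for type in await manager.healthTypesToShare {
        let status = await manager.healthStore.authorizationStatus(for: type)
        #expect(status == .sharingAuthorized, "\(type) authorization status is \(status)")
    }
}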
I'm getting this error several times when presenting a modal window over my split view window, running on my Mac using Swift/Mac Catalyst in Xcode 14.2. When I click the Cancel button in the window, I get "Scene destruction request failed with error: (null)" right after an unwind segue.
2023-07-04 16:50:45.488538-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.488972-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.496702-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.496800-0500 Recipes[27836:1295134] [WindowHosting] UIScene property of UINSSceneViewController was accessed before it was set.
2023-07-04 16:50:45.994147-0500 Recipes[27836:1295134] Unbalanced calls to begin/end appearance transitions for <UINavigationController: 0x7f7fdf068a00>.
bleep
2023-07-04 16:51:00.655233-0500 Recipes[27836:1297298] Scene destruction request failed with error: (null)
I don't quite understand what all this means. (The "bleep" was a debugging print I put in the unwind segue.) I'm working through Apple's Mac Catalyst tutorial, but it seems to be riddled with bugs and coding issues, even in the final part of the completed app, which I downloaded and ran. I don't see these problems in the iPad simulator.
I don't know if it's because Catalyst has problems itself or there's something else going on that I can fix myself. Any insight into these errors would be very much appreciated!
PS: The app seems to run OK on the Mac without crashing, despite the multiple issues.
Hi guys,
I've been struggling for a few days with this really weird behaviour.
We made an app for our e-commerce website and found out that a part of the product page is missing.
For some reason, the header, the first blocks of the page, and the footer are displayed, but a massive part of the content in between is missing. This content is not loaded through AJAX, which is why I don't understand why it isn't displayed.
You can see here 2 screenshots of what the page should look like and what the page looks like with WKWebView.
I've been inspecting this with Safari; there isn't any blocking error in the console, and the HTML elements are just empty. The div with class "row" is there with nothing in it.
The same website works perfectly in the native Android WebView.
If anyone has any clue about what's going wrong, I'd appreciate the help.
Consider this Swift struct:
public struct Example
{
    public func foo(callback: () -> Void)
    {
        ....
    }

    public func blah(i: Int)
    {
        ....
    }

    ....
}
Using Swift/C++ interop, I can create Example objects and call methods like blah. But I can't call foo because Swift/C++ interop doesn't currently support passing closures (right?).
On the other hand, Swift/Objective-C interop does support passing Objective-C blocks to Swift functions. But I can't use that here because Example is a Swift struct, not a class. I could change it to a class and update everything to work with reference rather than value semantics, but then I would also have to change the Objective-C++ code to create the object and call its methods using Objective-C syntax. I'd like to avoid that.
Is there some hack that I can use to make this possible? I'm hoping that I can wrap a C++ std::function in some sort of opaque wrapper and pass that to swift, or something.
Thanks for any suggestions!
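One direction that might be worth trying, offered as an unverified sketch: keep Example a struct, but expose the callback as a C-convention function pointer plus an opaque context pointer instead of a Swift closure. The C++ side can heap-allocate its std::function (or lambda state) and pass its address as the context; whether your interop toolchain bridges @convention(c) parameters cleanly to C++ is something to confirm before committing to this.

// Sketch only: a plain C function pointer avoids Swift closures at the
// interop boundary; the context pointer carries the C++ callable's state.
public struct Example {
    public func foo(context: UnsafeMutableRawPointer?,
                    callback: @convention(c) (UnsafeMutableRawPointer?) -> Void) {
        // ... do the work, then notify the caller
        callback(context)
    }
}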
Hello, I have a problem with the .onMove function. I believe I have set everything up properly, but the move does not work correctly. When I try to move an item, it is highlighted first, as expected. Then, while I am dragging it through the list, it disappears for some reason, and at the end of the move it snaps back to its initial place. (I target iOS 16.0 minimum, so I don't have to include an EditButton(); it behaves the same in edit mode, though.)
import SwiftUI

struct Animal: Identifiable {
    var id = UUID()
    var name: String
}

struct ListMove: View {
    @State var animals = [Animal(name: "Dog"), Animal(name: "Cat"), Animal(name: "Cow"), Animal(name: "Goat"), Animal(name: "Chicken")]

    var body: some View {
        List {
            ForEach(animals) { animal in
                Text(animal.name)
            }
            .onMove(perform: move)
        }
    }

    func move(from source: IndexSet, to destination: Int) {
        animals.move(fromOffsets: source, toOffset: destination)
    }
}

#Preview {
    ListMove()
}
I get this red warning in Xcode every time my app syncs to iCloud. My model has only basic types and enums that conform to Codable, so I'm not sure what the problem is.
The app works well and synchronization works, but the warning doesn't look good.
Maybe someone has an idea how to debug it.
I use an AppIntent to trigger a widget refresh. The AppIntent is used on a Button or Toggle, as follows:
var isAudibleArming = false

struct SoundAlarmIntent: AppIntent {
    static var title: LocalizedStringResource = "SoundAlarmIntent"

    func perform() async throws -> some IntentResult {
        isAudibleArming = true
        return .result()
    }
}

func timeline(for configuration: DynamicIntentWidgetPersonIntent, in context: Context) async -> Timeline<Entry> {
    var entries: [Entry] = []
    let currentDate = Date()
    let entry = Entry(person: person(for: configuration))
    entries.append(entry)
    if isAudibleArming {
        let entry2 = Entry(person: Person(name: "Friend4", dateOfBirth: currentDate.adding(.second, value: 6)))
        entries.append(entry2)
    }
    return .init(entries: entries, policy: .never)
}
The timeline function fires, with entry corresponding to view1 and entry2 corresponding to view2. I expect view1 to show immediately and view2 to show 6 seconds later. That is the behavior I get on iOS 17. On iOS 18.2, however, the 6-second delay no longer takes effect: view1 flashes and view2 appears immediately instead of after 6 seconds.
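In case it is useful for comparison, a sketch of the same timeline built with explicitly dated entries, under the assumption that Entry conforms to TimelineEntry and can take a date; relying on entry order alone leaves the scheduling ambiguous, whereas dating the second entry states the 6-second delay directly:

// Sketch: date the second entry 6 seconds in the future instead of relying on
// ordering. `Entry(date:person:)` is assumed to exist here.
func timeline(for configuration: DynamicIntentWidgetPersonIntent, in context: Context) async -> Timeline<Entry> {
    let now = Date()
    var entries = [Entry(date: now, person: person(for: configuration))]
    if isAudibleArming {
        entries.append(Entry(date: now.addingTimeInterval(6),
                             person: Person(name: "Friend4", dateOfBirth: now.addingTimeInterval(6))))
    }
    return Timeline(entries: entries, policy: .never)
}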
These helper methods let you call modifier functions in the short style that is standard for SwiftUI.
extension View {
    @inline(__always)
    func modify(_ block: (_ view: Self) -> some View) -> some View {
        block(self)
    }

    @inline(__always)
    func modify<V: View, T>(_ block: (_ view: Self, _ data: T) -> V, with data: T) -> V {
        block(self, data)
    }
}
_
DISCUSSION
Suppose you have modifier methods:
func addBorder(view: some View) -> some View {
    view.padding().border(Color.red, width: borderWidth)
}

func highlight(view: some View, color: Color) -> some View {
    view.border(Color.red, width: borderWidth).overlay { color.opacity(0.3) }
}
_
Ordinary Approach
Your code might look like this:
var body: some View {
    let image = Image(systemName: "globe")
    let borderedImage = addBorder(view: image)
    let highlightedImage = highlight(view: borderedImage, color: .red)

    let text = Text("Some Text")
    let borderedText = addBorder(view: text)
    let highlightedText = highlight(view: borderedText, color: .yellow)

    VStack {
        highlightedImage
        highlightedText
    }
}
This code doesn't look like standard SwiftUI code.
_
Better Approach
The helper methods modify(_:) and modify(_:with:) described above let you write code in the short style typical of SwiftUI:
var body: some View {
    VStack {
        Image(systemName: "globe")
            .modify(addBorder)
            .modify(highlight, with: .red)

        Text("Some Text")
            .modify(addBorder)
            .modify(highlight, with: .yellow)
    }
}
The system provides the AnyShape type eraser, which animates correctly, but it doesn't provide AnyInsettableShape. Here is my implementation of AnyInsettableShape (and of AnyAnimatableData, which is needed to support animation).
Let me know if there is a simpler solution.
struct AnyInsettableShape: InsettableShape {
    private let _path: (CGRect) -> Path
    private let _inset: (CGFloat) -> AnyInsettableShape
    private let _getAnimatableData: () -> AnyAnimatableData
    private let _setAnimatableData: (_ data: AnyAnimatableData) -> AnyInsettableShape

    init<S>(_ shape: S) where S: InsettableShape {
        _path = { shape.path(in: $0) }
        _inset = { AnyInsettableShape(shape.inset(by: $0)) }
        _getAnimatableData = { AnyAnimatableData(shape.animatableData) }
        _setAnimatableData = { data in
            guard let otherData = data.rawValue as? S.AnimatableData else { assertionFailure(); return AnyInsettableShape(shape) }
            var shape = shape
            shape.animatableData = otherData
            return AnyInsettableShape(shape)
        }
    }

    var animatableData: AnyAnimatableData {
        get { _getAnimatableData() }
        set { self = _setAnimatableData(newValue) }
    }

    func path(in rect: CGRect) -> Path {
        _path(rect)
    }

    func inset(by amount: CGFloat) -> some InsettableShape {
        _inset(amount)
    }
}
struct AnyAnimatableData: VectorArithmetic {
    init<T: VectorArithmetic>(_ value: T) {
        self.init(optional: value)
    }

    private init<T: VectorArithmetic>(optional value: T?) {
        rawValue = value
        _scaleBy = { factor in
            (value != nil) ? AnyAnimatableData(value!.scaled(by: factor)) : .zero
        }
        _add = { other in
            AnyAnimatableData(value! + (other.rawValue as! T))
        }
        _subtract = { other in
            AnyAnimatableData(value! - (other.rawValue as! T))
        }
        _equal = { other in
            value! == (other.rawValue as! T)
        }
        _magnitudeSquared = {
            (value != nil) ? value!.magnitudeSquared : .zero
        }
        _zero = {
            AnyAnimatableData(T.zero)
        }
    }

    fileprivate let rawValue: (any VectorArithmetic)?

    private let _scaleBy: (_: Double) -> AnyAnimatableData
    private let _add: (_ other: AnyAnimatableData) -> AnyAnimatableData
    private let _subtract: (_ other: AnyAnimatableData) -> AnyAnimatableData
    private let _equal: (_ other: AnyAnimatableData) -> Bool
    private let _magnitudeSquared: () -> Double
    private let _zero: () -> AnyAnimatableData

    mutating func scale(by rhs: Double) {
        self = _scaleBy(rhs)
    }

    var magnitudeSquared: Double {
        _magnitudeSquared()
    }

    static let zero = AnyAnimatableData(optional: nil as Double?)

    @inline(__always)
    private var isZero: Bool { rawValue == nil }

    static func + (lhs: AnyAnimatableData, rhs: AnyAnimatableData) -> AnyAnimatableData {
        guard let (lhs, rhs) = fillZeroTypes(lhs, rhs) else { return .zero }
        return lhs._add(rhs)
    }

    static func - (lhs: AnyAnimatableData, rhs: AnyAnimatableData) -> AnyAnimatableData {
        guard let (lhs, rhs) = fillZeroTypes(lhs, rhs) else { return .zero }
        return lhs._subtract(rhs)
    }

    static func == (lhs: AnyAnimatableData, rhs: AnyAnimatableData) -> Bool {
        guard let (lhs, rhs) = fillZeroTypes(lhs, rhs) else { return true }
        return lhs._equal(rhs)
    }

    @inline(__always)
    private static func fillZeroTypes(_ lhs: AnyAnimatableData, _ rhs: AnyAnimatableData) -> (AnyAnimatableData, AnyAnimatableData)? {
        switch (!lhs.isZero, !rhs.isZero) {
        case (true, true): (lhs, rhs)
        case (true, false): (lhs, lhs._zero())
        case (false, true): (rhs._zero(), rhs)
        case (false, false): nil
        }
    }
}
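A small usage sketch, in case it helps readers see where the eraser earns its keep: one stored AnyInsettableShape can feed strokeBorder (which requires InsettableShape), and the erased animatableData still interpolates when the wrapped shape type stays the same and only its animatable values change.

import SwiftUI

// Usage sketch for AnyInsettableShape: strokeBorder needs an InsettableShape,
// and the corner radius animates through the erased animatableData.
struct Demo: View {
    @State private var radius: CGFloat = 0
    var shape: AnyInsettableShape {
        AnyInsettableShape(RoundedRectangle(cornerRadius: radius))
    }
    var body: some View {
        shape
            .strokeBorder(Color.blue, lineWidth: 6)
            .frame(width: 160, height: 100)
            .animation(.easeInOut, value: radius)
            .onTapGesture { radius = radius == 0 ? 40 : 0 }
    }
}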
Hi, I'm relatively new to iOS development and kindly ask for some feedback on a strategy to achieve this desired behavior in my app.
My Question:
What would be the best strategy for precisely timed sound-effect playback while an app is in the background? Is this even possible?
Context:
I created a basic countdown timer app (targeting iOS 17 with Swift/SwiftUI). Countdown sessions can last 30-60 minutes. When the timer is started it progresses through a series of sub-intervals and plays a short sound for each one. I used AVAudioPlayer and everything works fine while the app is in the foreground. I'm considering switching to AVAudioEngine because precise timing is very important, and from what I've read it should offer better precision.
I'm already setting "App plays audio or streams audio/video using AirPlay" in my Info.plist, and have configured:
AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: .mixWithOthers)
Curiously, when testing on my iPhone 13 mini, sounds sometimes still play when the app is in the background, but not always.
What I've considered:
Background Tasks: Would they make any sense for this use-case? Seems like not if the allowed time is short & limited by the system.
Pre-scheduling all Sounds: Not sure this would even work and seems like a lot of memory would be needed (could be hundreds of intervals).
ActivityKit Alerts: works but with a ~50ms delay which is too long for my purposes.
Pre-Render all SFX to 1 large audio file: Seems like a lot of work and processing time and probably not worth it. I hope there's a better solution.
I'd really appreciate any feedback.
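In case it helps the discussion, a rough sketch (not a verified background-audio recipe) of the AVAudioEngine direction: an AVAudioPlayerNode can schedule a file at a sample-accurate AVAudioTime, which is one way to get tighter timing than firing sounds from a timer. Whether playback actually continues in the background still depends on the audio background mode and on the session and engine staying active, and the timeline math below is an assumption to verify against lastRenderTime in a real app.

import AVFoundation

// Sketch: schedule each cue at a sample-accurate offset on the player's timeline.
final class IntervalSounds {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    func start(with file: AVAudioFile, cueOffsetsInSeconds: [Double]) throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
        try engine.start()

        let rate = file.processingFormat.sampleRate
        for seconds in cueOffsetsInSeconds {
            let when = AVAudioTime(sampleTime: AVAudioFramePosition(seconds * rate), atRate: rate)
            player.scheduleFile(file, at: when, completionHandler: nil)
        }
        player.play()
    }
}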
I’m experiencing a crash at runtime when trying to extract audio from a video. This issue occurs on both iOS 18 and earlier versions. The crash is caused by the following error:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: '*** -[AVAssetExportSession exportAsynchronouslyWithCompletionHandler:] Cannot call exportAsynchronouslyWithCompletionHandler: more than once.
(0x1851435ec 0x1826dd244 0x1970c09c0 0x214d8f358 0x214d95899 0x190a1c8b9 0x214d8efd9 0x30204cef5 0x302053ab9 0x190a5ae39)
libc++abi: terminating due to uncaught exception of type NSException
My previous code worked fine, but it's crashing with Swift 6.
Does anyone know a solution for this?
Previous code:
func extractAudioFromVideo(from videoURL: URL, exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil, completion: @escaping (Swift.Result<URL, Error>) -> Void) {
    let asset = AVAsset(url: videoURL)
    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }
    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    exportSession.outputFileType = .m4a
    exportSession.outputURL = outputURL
    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler = exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }
    // Periodically check the progress of the export session
    let timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
        audioExportProgressPublisher.send(exportSession.progress)
    }
    // Export the audio track asynchronously
    exportSession.exportAsynchronously {
        switch exportSession.status {
        case .completed:
            completion(.success(outputURL))
        case .failed:
            completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
        case .cancelled:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
        default:
            completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
        }
        // Invalidate the timer when the export session completes or is cancelled
        timer.invalidate()
    }
}
New code:
func extractAudioFromVideo(from videoURL: URL, exportHandler: ((AVAssetExportSession, CurrentValueSubject<Float, Never>?) -> Void)? = nil, completion: @escaping (Swift.Result<URL, Error>) -> Void) {
    let asset = AVAsset(url: videoURL)
    // Create an AVAssetExportSession to export the audio track
    guard let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A) else {
        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to create AVAssetExportSession"])))
        return
    }
    // Set the output file type and path
    guard let filename = videoURL.lastPathComponent.components(separatedBy: ["."]).first else { return }
    let outputURL = VideoUtils.getTempAudioExportUrl(filename)
    VideoUtils.deleteFileIfExists(outputURL.path)
    exportSession.outputFileType = .m4a
    exportSession.outputURL = outputURL
    let audioExportProgressPublisher = CurrentValueSubject<Float, Never>(0.0)
    if let exportHandler {
        exportHandler(exportSession, audioExportProgressPublisher)
    }
    let task = Task {
        if #available(iOS 18.0, *) {
            // Handle export for iOS 18 and later
            let states = exportSession.states(updateInterval: 0.1)
            for await state in states {
                switch state {
                case .pending, .waiting:
                    break
                case .exporting(progress: let progress):
                    print("Exporting: \(progress.fractionCompleted)")
                    if progress.isFinished {
                        completion(.success(outputURL))
                    } else if progress.isCancelled {
                        completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
                    } else {
                        audioExportProgressPublisher.send(Float(progress.fractionCompleted))
                    }
                }
            }
            try await exportSession.export(to: outputURL, as: .m4a) // Only call export once
        } else {
            // Handle export for iOS versions below 18
            let publishTimer = Timer.publish(every: 0.1, on: .main, in: .common)
                .autoconnect()
                .sink { [weak exportSession] _ in
                    guard let exportSession = exportSession else { return }
                    audioExportProgressPublisher.send(exportSession.progress)
                }
            // Only call export once
            await exportSession.export()
            // Handle the export session's status
            switch exportSession.status {
            case .completed:
                completion(.success(outputURL))
            case .failed:
                completion(.failure(exportSession.error ?? NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown error occurred while exporting audio"])))
            case .cancelled:
                completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Export session was cancelled"])))
            default:
                completion(.failure(NSError(domain: "com.example.app", code: -1, userInfo: [NSLocalizedDescriptionKey: "Unknown export session status"])))
            }
            // Invalidate the timer when the export session completes or is cancelled
            publishTimer.cancel()
        }
    }
    task.cancel()
}
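For comparison, a sketch of the iOS 18+ branch only, under the assumption that the crash comes from the export being kicked off more than once on the same session (for example once through the old exportAsynchronously path reached via the exportHandler machinery and once through export(to:as:)). One more thing worth checking: task.cancel() immediately after creating the Task cancels the work. Here the export is started exactly once and progress is observed from a separate task:

// Sketch (iOS 18+ path): one export call, progress observed concurrently.
if #available(iOS 18.0, *) {
    Task {
        let monitor = Task {
            for await state in exportSession.states(updateInterval: 0.1) {
                if case .exporting(let progress) = state {
                    audioExportProgressPublisher.send(Float(progress.fractionCompleted))
                }
            }
        }
        do {
            try await exportSession.export(to: outputURL, as: .m4a)   // single export call
            completion(.success(outputURL))
        } catch {
            completion(.failure(error))
        }
        monitor.cancel()
    }
}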
Does SwiftUI now support having a chart with two different Y axes? ChatGPT seems to think it does, but I keep getting compiler errors in Xcode.
I recently encountered an issue with Xcode 16.2 while attempting to integrate Settings.bundle into a new app. I added Settings.bundle as a new file (using the provided template), but when I ran the app (the standard simple "Hello World" project), the expected three default controls (Name, Enabled, Slider) did not appear in the app's settings.
To troubleshoot, I downgraded my system to macOS Sonoma 14.7.2 and Xcode 15.4 (on a 2023 Mac Mini, M2). After this downgrade, everything worked as expected. With a new project, adding Settings.bundle, and running the app, the settings entry for the app appeared, including the three default fields.
This behavior suggests a potential issue or incompatibility with Xcode 16.2.