Dive into the world of programming languages used for app development.


UITextField not deinitialized since iOS 17: -[RTIInputSystemClient remoteTextInputSessionWithID:performInputOperation:] perform input operation requires a valid sessionID
A problem more and more developers face is that a UITextField does not get deinitialized when it is supposed to. This can cause multiple problems. For example: I am dismissing an "ENTER_NAME" view controller. The view controller gets deinitialized correctly, but the text field doesn't. If I then access a new instance of the "ENTER_NAME" view controller, I get the following error: -[RTIInputSystemClient remoteTextInputSessionWithID:performInputOperation:] perform input operation requires a valid sessionID. I tested this scenario on iOS 16 and lower and everything works properly. Please let me know below if anyone has found a solution for this. Thanks
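A workaround worth trying (a sketch, not a confirmed fix — the view controller and field names here are placeholders) is to explicitly end editing before the screen goes away, so the remote text-input session is torn down together with the view controller:

```swift
import UIKit

final class EnterNameViewController: UIViewController {
    let nameField = UITextField()

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Explicitly resign first responder before dismissal so the
        // text field can release its RTI input session instead of
        // keeping a stale sessionID alive.
        nameField.resignFirstResponder()
        view.endEditing(true)
    }
}
```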
0 replies · 2 boosts · 1.1k views · Mar ’24
Adding Python Interpreter to iOS App: Risks of Rejection and Execution Guidance
Hi Apple community, I'm currently working on an iOS app development project and considering integrating a Python interpreter for certain functionality where Python scripts need to be executed. However, I'm concerned about potential App Store rejection due to this addition. Has anyone had experience with this, and if so, what risks or precautions should I be aware of? Additionally, I'd appreciate some guidance on executing Python scripts within the same project. Are there specific best practices or considerations I should keep in mind? Your insights and experiences will be invaluable in navigating this aspect of iOS development. Thanks in advance for your help!
0 replies · 0 boosts · 445 views · Mar ’24
AVAssetExportPresetHEVCHighestQualityWithAlpha decreases quality a lot when exporting video
Problem
I need to import a video, process it, and then export the video with alpha. I noticed the exported video gets a lot grayer / loses quality compared to the original. I don't need any compression. Sidenote: I need to export videos with transparency enabled; that's why I use AVAssetExportPresetHEVCHighestQualityWithAlpha. It seems that that is causing the problem, since AVAssetExportPresetHighestQuality looks good. These are side-by-side frames of the original and a processed video. The left is the original frame, the right is the processed video: https://i.stack.imgur.com/ORqfz.png This is another example where the bottom is exported and the top is the original. You can see at the bar where the YouTube NL is displayed that the top one is almost fully black, while the bottom one (exported) is really gray: https://i.stack.imgur.com/s8lCn.png As far as I know, I don't do anything special; I just load the video and directly export it. It still loses quality. How can I prevent this?

Reproduction path
You can either clone the repository or see the code below. The repository is available here: https://github.com/Jasperav/VideoCompression/tree/main/VideoCompressionTests. After you clone it, run the only unit test and check the logging for where the output video is stored. You can then observe that temp.mov is a lot grayer than the original video. The code for importing and exporting the video is below. As far as I can see, I just import and directly export the movie without modifying it. What's the problem?

import AppKit
import AVFoundation
import Foundation
import Photos
import QuartzCore
import OSLog

let logger = Logger()

class VideoEditor {
    func export(url: URL, outputDir: URL) async {
        let asset = AVURLAsset(url: url)
        let extract = try! await extractData(videoAsset: asset)
        try! await exportVideo(outputPath: outputDir, asset: asset, videoComposition: extract)
    }

    private func exportVideo(outputPath: URL, asset: AVAsset, videoComposition: AVMutableVideoComposition) async throws {
        let fileExists = FileManager.default.fileExists(atPath: outputPath.path())
        logger.debug("Output dir: \(outputPath), exists: \(fileExists), render size: \(String(describing: videoComposition.renderSize))")
        if fileExists {
            do {
                try FileManager.default.removeItem(atPath: outputPath.path())
            } catch {
                logger.error("remove file failed")
            }
        }
        let dir = outputPath.deletingLastPathComponent().path()
        logger.debug("Will try to create dir: \(dir)")
        try? FileManager.default.createDirectory(atPath: dir, withIntermediateDirectories: true)
        var isDirectory = ObjCBool(false)
        guard FileManager.default.fileExists(atPath: dir, isDirectory: &isDirectory), isDirectory.boolValue else {
            logger.error("Could not create dir, or dir is a file")
            fatalError()
        }
        guard let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHEVCHighestQualityWithAlpha) else {
            logger.error("generate export failed")
            fatalError()
        }
        exporter.outputURL = outputPath
        exporter.outputFileType = .mov
        exporter.shouldOptimizeForNetworkUse = false
        exporter.videoComposition = videoComposition
        await exporter.export()
        logger.debug("Status: \(String(describing: exporter.status)), error: \(String(describing: exporter.error))")
        if exporter.status != .completed {
            fatalError()
        }
    }

    private func extractData(videoAsset: AVURLAsset) async throws -> AVMutableVideoComposition {
        guard let videoTrack = try await videoAsset.loadTracks(withMediaType: .video).first else {
            fatalError()
        }
        let composition = AVMutableComposition(urlAssetInitializationOptions: nil)
        guard let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: videoTrack.trackID) else {
            fatalError()
        }
        let duration = try await videoAsset.load(.duration)
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: duration), of: videoTrack, at: CMTime.zero)
        let naturalSize = try await videoTrack.load(.naturalSize)
        let preferredTransform = try await videoTrack.load(.preferredTransform)
        let mainInstruction = AVMutableVideoCompositionInstruction()
        mainInstruction.timeRange = CMTimeRange(start: CMTime.zero, end: duration)
        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        let videoComposition = AVMutableVideoComposition()
        let frameRate = try await videoTrack.load(.nominalFrameRate)
        videoComposition.frameDuration = CMTimeMake(value: 1, timescale: Int32(frameRate))
        mainInstruction.layerInstructions = [layerInstruction]
        videoComposition.instructions = [mainInstruction]
        videoComposition.renderSize = naturalSize
        return videoComposition
    }
}
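One thing worth trying (a hedged sketch, not a confirmed fix for the HEVC-with-alpha preset specifically): a video composition that leaves its color properties unset can trigger a color-space conversion on export, which often shows up as washed-out, gray output. Tagging the composition explicitly with the source's color space (assumed Rec. 709 here) may prevent that conversion:

```swift
import AVFoundation

// Assumption: the source video is Rec. 709. Pinning the composition's
// color properties tells the exporter not to re-tag or convert colors.
func applyColorProperties(to videoComposition: AVMutableVideoComposition) {
    videoComposition.colorPrimaries = AVVideoColorPrimaries_ITU_R_709_2
    videoComposition.colorTransferFunction = AVVideoTransferFunction_ITU_R_709_2
    videoComposition.colorYCbCrMatrix = AVVideoYCbCrMatrix_ITU_R_709_2
}
```

Call this on the composition returned by extractData before handing it to the exporter.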
0 replies · 0 boosts · 707 views · Mar ’24
Products.product always returns an empty array
Hello, I'm trying to add in-app purchases to a macOS app but failing to load the local test products. From the docs and various examples I've found online, it should be pretty straightforward with StoreKit 2. So far, I've done the following:
- added the in-app purchase capability
- created the local .storekit file
- added consumable and non-consumable products
- updated the scheme to use the test StoreKit configuration
- verified the products are present and can be purchased by using Debug > StoreKit > Manage Transactions to make direct purchases
- verified the product IDs are correct

To simplify things I tried creating a barebones app. I created a new .storekit file and added a single non-consumable product with ID product1. I used the default view and just a task to retrieve the products:

import SwiftUI
import StoreKit

struct ContentView: View {
    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundStyle(.tint)
            Text("Hello, world!")
        }
        .task {
            let products = try? await Product.products(for: ["product1"])
            print(products)
        }
        .padding()
    }
}

The print statement output is: Optional([]) I know I must be doing something wrong, but I'm completely missing it. I hope someone can help me. Thanks
2 replies · 0 boosts · 1.4k views · Mar ’24
Host Card Emulation iOS 17.4+
Hello everyone, Apple has finally released iOS 17.4, which allows you to develop apps with HCE technology. Unfortunately, besides the API (https://developer.apple.com/documentation/corenfc/cardsession), I can't find any example projects. I understand that the update was released only recently, but if anyone has already tried to develop an app of this kind, any help is welcome!
0 replies · 0 boosts · 1.3k views · Mar ’24
Swift/C++ interop: Swift functions with C++ enum arguments not available in C++
I'm trying to develop a mixed-language (C++ and Swift) framework where calls are mainly from C++ to Swift. However, I'm having trouble getting Swift functions to be available in C++ when they take a C++ enum value as a parameter. Assume the following C++ header:

#pragma once

enum class MyCppEnumClass { A, B, C };
enum class MyCppEnum { D, E, F };

and the following Swift source file:

public func swiftFunctionWithCppEnumClass(_ value: MyCppEnumClass) {
    print("Swift function with C++ enum: \(value)")
}

public func swiftFunctionWithCppEnum(_ value: MyCppEnum) {
    print("Swift function with C++ enum: \(value)")
}

The project compiles correctly this way, so the bridging of the enum types does work, making both the enum and the enum class available in Swift. However, when inspecting the generated Swift header, I'm seeing this:

namespace CxxInteropExperiments SWIFT_PRIVATE_ATTR SWIFT_SYMBOL_MODULE("CxxInteropExperiments") {
// Unavailable in C++: Swift global function 'swiftFunctionWithCppEnum(_:)'.
// Unavailable in C++: Swift global function 'swiftFunctionWithCppEnumClass(_:)'.
} // namespace CxxInteropExperiments

I can't find any information on swift.org (https://www.swift.org/documentation/cxx-interop/) saying that this would be unsupported. Currently the only solution I can find is to 'mirror' the enum with a native Swift enum and implement a convert function in C++, like so:

public enum MySwiftEnum {
    case A
    case B
    case C
}

public func swiftFunctionWithSwiftEnum(_ value: MySwiftEnum) {
    print("Swift function with Swift enum: \(value)")
}

#include <CxxInteropExperiments/CxxInteropExperiments-Swift.h>

CxxInteropExperiments::MySwiftEnum convert(MyCppEnumClass e) {
    switch (e) {
    case MyCppEnumClass::A: return CxxInteropExperiments::MySwiftEnum::A();
    case MyCppEnumClass::B: return CxxInteropExperiments::MySwiftEnum::B();
    case MyCppEnumClass::C: return CxxInteropExperiments::MySwiftEnum::C();
    }
}

void callSwiftFunctionWithEnum(MyCppEnumClass e) {
    CxxInteropExperiments::swiftFunctionWithSwiftEnum(convert(e));
}

and not use C++ enums or enum classes in Swift function signatures that I want to be able to use from C++. Am I missing something obvious, or is passing C++ enum values directly to Swift functions just not possible? Any help is appreciated.
3 replies · 0 boosts · 889 views · Mar ’24
CreateML model doesn't work as expected when added to my application (Swift)
I have a trained model to identify squats (good and bad repetitions). It seems to work perfectly in CreateML when I preview it with some test data, but once I add it to my app the model seems to be inaccurate and the majority of the time mixes up the actions. Does anyone know if the issue is code-related, or is it something to do with the model itself and how it analyses live data? Below I have added one of my functions for "Good Squats", which most of the time doesn't even get called (even with lower confidence). The majority of the time the model classes everything as a bad squat even though it is clearly not. Could the problem be that my dataset doesn't have enough videos?

print("GoodForm")
squatDetected = true
DispatchQueue.main.asyncAfter(deadline: .now() + 1.5) {
    self.squatDetected = false
}
DispatchQueue.main.async {
    self.showGoodFormAlert(with: confidence)
    AudioServicesPlayAlertSound(SystemSoundID(1322))
}
}

Any help would be appreciated.
2 replies · 0 boosts · 601 views · Mar ’24
Sending UDP multicast with Network framework on iOS 17
Hi, I am trying to send data to a multicast address using NWConnectionGroup. I tried the code below on iOS 15 and it works: it sends data and I receive the sent data. But when I try it on iOS 17, it connects successfully to the host and port, yet it doesn't send data. Why can't I send data on iOS 17? Do I need to get a specific permission?

var group: NWConnectionGroup!

init() {
    connect(withHost: "235.10.10.100", port: 3333)
}

func connect(withHost: NWEndpoint.Host, port: NWEndpoint.Port) {
    let multicast = try! NWMulticastGroup(for: [.hostPort(host: withHost, port: port)])
    group = NWConnectionGroup(with: multicast, using: .udp)
    group.setReceiveHandler(maximumMessageSize: 128) { message, data, b in
        print("Received message:")
    }
    group.stateUpdateHandler = { newState in
        print("Group entered state \(String(describing: newState))")
    }
    group.start(queue: .global())
}

func sendData() {
    let groupSendContent = Data("helloAll".utf8)
    self.group.send(content: groupSendContent) { error in
        print("Send complete with error \(String(describing: error))")
    }
}
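One thing worth checking (hedged — I can't confirm it explains the iOS 15 vs. iOS 17 difference on its own): Apple gates multicast networking behind the com.apple.developer.networking.multicast entitlement, which must be requested from Apple and added to the app's entitlements file. A sketch of the entitlements entry:

```xml
<!-- App entitlements file: required for NWConnectionGroup multicast use -->
<key>com.apple.developer.networking.multicast</key>
<true/>
```

Without the granted entitlement, joining or sending to a multicast group can fail silently or with a policy error, so it is worth inspecting the error passed to the send completion handler and the group's stateUpdateHandler output.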
1 reply · 0 boosts · 514 views · Mar ’24
UICollectionView content becomes small for a few seconds on cell tap
See images: [Before] [After] In practice, when I tap a cell it should tag the element. On iOS 17 it works perfectly; on iOS 15.8.1, however, it has the ugly effect of shrinking the entire content of the collection view for a few seconds. Can anybody help me? I don't know where to look anymore.
0 replies · 0 boosts · 596 views · Mar ’24
Audio Extension App Audio Visualisation
I’m new to this and I haven’t been able to find proper documentation. I am developing an audio extension application for GarageBand/Logic Pro and I want to be able to visualize the frequency magnitude data that’s being processed. How do I pass the audio data from the C++ code to SwiftUI to visualize?
0 replies · 0 boosts · 379 views · Mar ’24
macOS 14.4 ServiceManagement: cannot install sandboxed daemon from sandboxed main application anymore
I have made an app that requires a daemon to run. For this I use the ServiceManagement framework and SMAppService.register to register the daemon. The macOS 14.4 update broke the installation process: the daemon cannot be installed anymore, and instead an error is returned when trying to install the helper. The installation works on macOS 14.3.1 or lower. I have narrowed the error down to the main app being sandboxed. Both the daemon and the main app are sandboxed (as macOS 14.2 introduced the restriction that a sandboxed app can only run/install a sandboxed daemon, https://developer.apple.com/documentation/macos-release-notes/macos-14_2-release-notes#ServiceManagement). I have been able to confirm that removing the sandbox on the main application results in the register function working again on macOS 14.4. However, the release notes for 14.4 do not mention anything regarding the ServiceManagement API or anything related. So my question is: what has changed in macOS 14.4 so that the register function for a daemon causes an error when the main app is sandboxed? And moreover, how can I prevent this error without removing the sandbox?

Information regarding the error: the .register function returns the following error:

Error Domain=SMAppServiceErrorDomain Code=22 "Invalid argument" UserInfo={NSLocalizedFailureReason=Invalid argument}

I have also created a log file according to the procedure at the link below and attached it to this post (out2 2.log): https://forums.developer.apple.com/forums/thread/707482#716553022 It appears from the log file, and from observing the logs in the Console app, that the error "plist changed between client and smd" causes the issue, but I don't understand what causes this error.

(I already use the com.apple.security.temporary-exception.sbpl entitlement in the daemon so that it can write to a specific file that the pmset command writes to when invoked. This is to indicate that I would prefer to keep the main app sandboxed. I could also just remove the sandbox, but I don't want to do that.)
2 replies · 0 boosts · 882 views · Mar ’24
Swift 5.9 Noncopyable and os_unfair_lock crash
Hello, I recently implemented a lock that uses OSAllocatedUnfairLock on iOS 16+ and os_unfair_lock below iOS 16. I know that using os_unfair_lock from Swift is tricky and error-prone. For example, the famous open-source library Alamofire had even been using os_unfair_lock incorrectly from the point of view of Swift's lifecycle rules. (They fixed it as in link [1].) So, I implemented a lock like the one below. To use os_unfair_lock safely, I used the noncopyable types feature (~Copyable) added in Swift 5.9 [2]. Also, I allocated memory for the os_unfair_lock on the heap.

public struct UnfairLock: ~Copyable {
    public init() {
        if #available(iOS 16.0, *) {
            _osAllocatedUnfairLock = OSAllocatedUnfairLock()
        } else {
            self.unfairLock = UnsafeMutablePointer.allocate(capacity: 1)
        }
    }

    deinit {
        if #unavailable(iOS 16.0) {
            unfairLock!.deallocate()
        }
    }

    public func lock() {
        if #available(iOS 16.0, *) {
            osAllocatedUnfairLock.lock()
        } else {
            os_unfair_lock_lock(unfairLock!)
        }
    }

    public func unlock() {
        if #available(iOS 16.0, *) {
            osAllocatedUnfairLock.unlock()
        } else {
            os_unfair_lock_unlock(unfairLock!)
        }
    }

    public func with<T>(_ closure: () -> T) -> T {
        lock()
        defer { unlock() }
        return closure()
    }

    private var _osAllocatedUnfairLock: Any?
    private var unfairLock: UnsafeMutablePointer<os_unfair_lock_s>?

    @available(iOS 16.0, *)
    private var osAllocatedUnfairLock: OSAllocatedUnfairLock<Void> {
        // swiftlint:disable force_cast
        _osAllocatedUnfairLock as! OSAllocatedUnfairLock
        // swiftlint:enable force_cast
    }
}

However, I got several crashes from iOS 14-15 users like this (the app targets iOS 14+, and on iOS 16+ it uses OSAllocatedUnfairLock). (Sorry for using a third-party crash reporting tool's log, but I think it is enough to understand the issue.)

BUG IN CLIENT OF LIBPLATFORM: os_unfair_lock is corrupt
Crashed: com.foo.bar.queue
0  libsystem_platform.dylib 0x6144   _os_unfair_lock_corruption_abort + 88
1  libsystem_platform.dylib 0xa20    _os_unfair_lock_lock_slow + 320
2  FoooBarr                 0x159416c closure #1 in static FooBar.baz() + 6321360
3  FoooBarr                 0x2e65b8  thunk for @escaping @callee_guaranteed @Sendable () -> () + 4298794424 (<compiler-generated>:4298794424)
4  libdispatch.dylib        0x1c04   _dispatch_call_block_and_release + 32
5  libdispatch.dylib        0x3950   _dispatch_client_callout + 20
6  libdispatch.dylib        0x6e04   _dispatch_continuation_pop + 504
7  libdispatch.dylib        0x6460   _dispatch_async_redirect_invoke + 596
8  libdispatch.dylib        0x14f48  _dispatch_root_queue_drain + 388
9  libdispatch.dylib        0x15768  _dispatch_worker_thread2 + 164
10 libsystem_pthread.dylib  0x1174   _pthread_wqthread + 228
11 libsystem_pthread.dylib  0xf50    start_wqthread + 8

(libplatform's source code [3] suggests that __ulock_wait returns an error, but I don't know the details.) Per @eskimo's suggestion in [4], I will change my code to use NSLock until OSAllocatedUnfairLock is available on all users' devices (i.e. iOS 16+), but I still want to know why this crash happens. I believed that making the struct noncopyable was enough to use os_unfair_lock safely, but it seems that it is not. Did I miss something? Or is there any other way to use os_unfair_lock safely?
[1] https://github.com/Alamofire/Alamofire/commit/1b89a57c2f272408b84d20132a2ed6628e95d3e2
[2] https://github.com/apple/swift-evolution/blob/1b0b339bc3072a83b5a6a529ae405a0f076c7d5d/proposals/0390-noncopyable-structs-and-enums.md
[3] https://github.com/apple-open-source/macos/blob/ea4cd5a06831aca49e33df829d2976d6de5316ec/libplatform/src/os/lock.c#L555
[4] https://forums.developer.apple.com/forums/thread/712379
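One detail worth checking (my reading of the posted code, not a confirmed diagnosis): UnsafeMutablePointer.allocate returns uninitialized memory, and os_unfair_lock must start out as all zeros (OS_UNFAIR_LOCK_INIT). If the allocated bytes happen to be nonzero, the very first os_unfair_lock_lock can abort with "os_unfair_lock is corrupt". A hedged sketch of the missing initialization:

```swift
import os

// Sketch: initialize the heap storage to an unlocked os_unfair_lock
// before first use, and deinitialize it before deallocating.
let ptr = UnsafeMutablePointer<os_unfair_lock_s>.allocate(capacity: 1)
ptr.initialize(to: os_unfair_lock())   // zeroed, i.e. unlocked state

os_unfair_lock_lock(ptr)
os_unfair_lock_unlock(ptr)

ptr.deinitialize(count: 1)
ptr.deallocate()
```

In the posted init, adding unfairLock!.initialize(to: os_unfair_lock()) right after allocate (and a matching deinitialize in deinit) would follow this pattern; noncopyability prevents copies of the struct, but it does not initialize the allocated memory.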
1 reply · 0 boosts · 793 views · Mar ’24
Cannot load module '***' built with SDK 'iphoneos16.4' when using SDK 'iphoneos17.0'
While building, Xcode gives me the following error:

Cannot load module '***' built with SDK 'iphoneos16.4' when using SDK 'iphoneos17.0': /Users/***/Library/Developer/Xcode/DerivedData/project-biopprgumksoaqgnrhztiivzzjkq/Build/Products/Debug-iphoneos/Turf.framework/Modules/Turf.swiftmodule/arm64-apple-ios.swiftmodule

It works fine on Xcode 14.3 but gives this error on Xcode 15.
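A common remedy (a sketch, under the assumption that the Turf framework is a prebuilt binary you or your dependency vendor can rebuild) is to build the framework with library evolution enabled, so consumers load its SDK-independent .swiftinterface instead of the SDK-pinned .swiftmodule:

```
// xcconfig / build settings sketch for the framework target
// (Xcode UI name: "Build Libraries for Distribution")
BUILD_LIBRARY_FOR_DISTRIBUTION = YES
```

If you don't control the framework, the usual alternatives are updating to a version of the dependency built for the newer SDK, or rebuilding it from source with your current Xcode.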
1 reply · 0 boosts · 3k views · Mar ’24
iOS SceneDelegate methods not invoked when the class is extended in a different target
I have an iOS project with the following targets:

SwiftExtensions (AppTarget) -> depends on Experience
Experience (StaticLibrary) -> depends on Lifecycle
Lifecycle (StaticLibrary)

I have defined the SceneDelegate in the Lifecycle library:

public class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    // scene(_:willConnectTo:options:) is implemented in Experience
    // scene(_:openURLContexts:) is implemented in Experience
    // Other methods such as sceneWillEnterForeground(_:), sceneDidBecomeActive(_:) etc.
}

As shown above, scene(_:willConnectTo:options:) and scene(_:openURLContexts:) are not defined here. In the Experience library, SceneDelegate is extended:

extension SceneDelegate {
    func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
        NSLog("[Experience]: SceneDelegate.scene(_:willConnectTo:options:)")
        if connectionOptions.urlContexts.isEmpty {
            NSLog("[Experience]: Not launched using file!")
        } else {
            NSLog("[Experience]: Launched using file!")
            let urlContexts: Set<UIOpenURLContext> = connectionOptions.urlContexts
            for (index, urlContext) in urlContexts.enumerated() {
                NSLog(String(format: "[Experience]: url[%d] = %@ ", index, urlContext.url.path()))
            }
        }
    }

    func scene(_ scene: UIScene, openURLContexts URLContexts: Set<UIOpenURLContext>) {
        NSLog("[Experience]: SceneDelegate.scene(_:openURLContexts:)")
        for (index, urlContext) in URLContexts.enumerated() {
            NSLog(String(format: "[Experience]: url[%d] = %@ ", index, urlContext.url.path()))
        }
    }
}

Now, when I tap the app icon, scene(_:willConnectTo:options:) is not invoked. When I tap an associated file type, scene(_:willConnectTo:options:) is not invoked, again. If the app is running in the background and I foreground it by tapping an associated file type, scene(_:openURLContexts:) is not invoked. Basically, when I define these two methods outside the target, despite the public access modifier, iOS doesn't invoke my delegate methods defined in the other library. My understanding was that extensions can be used to break up a class. What am I missing here?
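A hedged guess at the cause (not a confirmed answer): UIKit discovers optional scene-delegate methods at runtime via the Objective-C runtime's responds(to:) check, and since Swift 4 methods declared in an extension are not automatically exposed to Objective-C, especially when the extension lives in a different module from the conformance. Marking the extension methods @objc (and public, since they cross module boundaries) so the runtime can see them may help; a sketch with the same names as the question:

```swift
import UIKit

extension SceneDelegate {
    // Hypothetical fix: explicit @objc exposes the method to the
    // Objective-C runtime so UIKit's responds(to:) check finds it
    // even though it is declared in an extension in another module.
    @objc public func scene(_ scene: UIScene,
                            willConnectTo session: UISceneSession,
                            options connectionOptions: UIScene.ConnectionOptions) {
        NSLog("[Experience]: SceneDelegate.scene(_:willConnectTo:options:)")
    }
}
```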
0 replies · 0 boosts · 594 views · Mar ’24
RealityKit Plane flickering bug in Model with VideoMaterial
I am working on an AR app on iOS. I found this issue and can't find a quick solution at the moment, nor any insight into what is happening.

Context: The AR app contains a 3D model of a sphere that is cut in half. The models are created with 3D modeling software.

Additional context: The sphere model is placed inside the environment. When the user enters the sphere, the sphere materials are set to video materials containing the jungle-like content visible in the video. The vertical center of the sphere is the floor on which the user moves and looks around. Each ARPlaneAnchor is connected to an ARAnchorEntity with its ModelEntity (Plane) for visualization and sphere placement.

The bug (video): https://www.youtube.com/shorts/58860U1IkhM As the user moves inside the sphere, parts of the video material start to show square plane camera-feed parts (background).

What has been tried:
- changing the material type on the floor plane (a physically based material seems a little less bad)
- changing the culling of the materials (no effect); the issue is not related to z-fighting
- when a solid material is used for the floor plane, the flickering is not visible; when a solid material with alpha is used, both are visible (alpha material and flickering background)
- editing the .usdz file (does not work at all), changing shadows and other properties
- checking the .usdz file with usdz tools (usdzconvert: all tests passed, and fixing opacity did not help)
- changing the video type (.mp4 to .mov)
- Google & ChatGPT

Similar issues: How do I eliminate flickering ... Plane entity's grounding shadow flicking in RealityKit AR flickering

Observations:
- The flickering background is always tied to the floor plane (can provide screenshots). This seems to highlight either a point of contact with the 3D floor plane or something else.
- The flickering happens at a certain angle and position. It does not happen all the time in all positions, which is weird.
- This problem didn't happen with the old sphere 3D file. The difference is a new 3D-generated floor plane WHICH IS DISABLED. It seems that even if it is disabled it is still being used somehow.
- I had a similar issue where the floor planes would change color (the color tone would go lighter and darker). That issue was solved by disabling automatic shadow rendering. Shadow rendering inside an object does not seem to work properly. The main difference is that the previous issue changed the color brightness, not the transparency, and the whole plane's color changed rather than just part of it.
- Logging all the available planes shows only the expected planes (1-3, of which 1-2 are floor planes and 1 is an image plane).

Any ideas, solutions, or feedback are welcome. SO Post: https://stackoverflow.com/questions/78139934/realitykit-plane-flickering-bug-in-model-with-videomaterial Thank you for your time.
0 replies · 1 boost · 464 views · Mar ’24
Object Detection Model returns empty dictionary of results
I have made an object detection model, and when I load my model in Xcode I can see the expected results when inputting an image on the model's preview page. However, when I try to use the model in my code it returns 0 detections, even though the same image returns the expected result in preview. Does anyone know what could be wrong?

func detectLeaf(in image: UIImage) -> Int {
    var leafDetected = 0
    guard let modelURL = Bundle.main.url(forResource: "LeafModel", withExtension: "mlmodelc") else {
        print("Model file is missing")
        return leafDetected
    }
    do {
        let visionModel = try VNCoreMLModel(for: MLModel(contentsOf: modelURL))
        let objectRecognition = VNCoreMLRequest(model: visionModel) { request, error in
            if let results = request.results {
                print(results.count)
                print(results)
                leafDetected = results.count
            }
        }
        let handler = VNImageRequestHandler(cgImage: image.cgImage!)
        try handler.perform([objectRecognition])
        return leafDetected
    } catch {
        print("Error loading model: \(error.localizedDescription)")
        return 0
    }
}
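One common culprit in this preview-vs-code mismatch (a hedged sketch, not a confirmed diagnosis) is image orientation: a UIImage's cgImage carries no orientation, so Vision may feed the model rotated pixels. Forwarding the UIImage's orientation to the request handler is worth trying:

```swift
import UIKit
import Vision
import ImageIO

// Helper (assumed, not built in): map UIKit's orientation to the
// EXIF-style orientation that Vision expects.
extension CGImagePropertyOrientation {
    init(_ uiOrientation: UIImage.Orientation) {
        switch uiOrientation {
        case .up: self = .up
        case .down: self = .down
        case .left: self = .left
        case .right: self = .right
        case .upMirrored: self = .upMirrored
        case .downMirrored: self = .downMirrored
        case .leftMirrored: self = .leftMirrored
        case .rightMirrored: self = .rightMirrored
        @unknown default: self = .up
        }
    }
}

// Sketch: build the handler with the orientation attached so Vision
// rotates the pixels the way the model saw them during training.
func makeHandler(for image: UIImage) -> VNImageRequestHandler? {
    guard let cgImage = image.cgImage else { return nil }
    return VNImageRequestHandler(cgImage: cgImage,
                                 orientation: CGImagePropertyOrientation(image.imageOrientation))
}
```

If orientation isn't the issue, also check whether the results need casting to VNRecognizedObjectObservation and whether a confidence threshold is filtering everything out.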
2 replies · 0 boosts · 597 views · Mar ’24
Exporting localization using xcodebuild with a project containing macros
I have a workspace with multiple packages, and due to a bug in Xcode I cannot export the app localizations using the Xcode GUI tool; I need to resort to running a command from the terminal:

xcodebuild -exportLocalizations -localizationPath . -workspace <path_workspace> -sdk iphoneos -exportLanguage en

One of my packages contains some macros, and I use them from my code without any problem; the code compiles. But when I try to export localizations using that command, the build fails due to "compiler plugin not loaded". So I cannot use Xcode's normal exporting because of the Xcode bug, and I cannot export by running a command because of the macro problem. What should I do? This situation is very discouraging; do you have any suggestions? I've found a similar problem
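A hedged sketch of a possible workaround (I can't confirm it fixes this exact failure): Xcode 15's xcodebuild accepts validation-skipping flags that allow macro and package plugins to load in non-interactive builds, which is a common cause of "compiler plugin not loaded" from the command line:

```shell
# <path_workspace> is a placeholder for your .xcworkspace path.
xcodebuild -exportLocalizations \
  -localizationPath . \
  -workspace <path_workspace> \
  -sdk iphoneos \
  -exportLanguage en \
  -skipMacroValidation \
  -skipPackagePluginValidation
```

Note the two skip flags bypass the "trust this macro" prompt, so only use them for packages you trust.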
2 replies · 0 boosts · 810 views · Mar ’24