
Reply to Macro-mode in AVCaptureDevice(custom camera)
Close focussing requires the use of a virtual device that includes some form of wide-angle lens, ideally the ultra-wide. However, for my application's needs (small labels with small fonts) this can mean that the camera is very close to the subject, thereby blocking illumination - resulting in poorer OCR results. My solution is to use the device's video zoom factor capability (if available), which is a digital zoom rather than an optical one (i.e. it's based on cropping the sensor output).

The code sample below is used in the Photo Preview UI, with reference to the class that manages the camera set-up and delegates ("camera") and to the available device determined by that class instance. Of course, zooming (cropping) too much can also cause OCR results to deteriorate, so it's a case of trial and error. There is a videoMaxZoomFactor read-only property for applicable device formats.

Slider(
    value: $zoomFactor,
    in: 1.0...4.0,
    step: 0.5
) {
    Text("Zoom")
} minimumValueLabel: {
    Text("1x")
} maximumValueLabel: {
    Text("4x")
} onEditingChanged: { _ in
    do {
        try camera.device?.lockForConfiguration()
        camera.device?.ramp(toVideoZoomFactor: self.zoomFactor, withRate: 1.0)
        camera.device?.unlockForConfiguration()
    } catch {
        print("**** Unable to set camera zoom")
    }
}

Regards, Michaela
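PS: since I mentioned videoMaxZoomFactor above, here's a minimal sketch of clamping the requested zoom to the active format's limit before ramping - assuming camera.device is the active AVCaptureDevice, as in my code above:

if let device = camera.device {
    // videoMaxZoomFactor is the upper bound for the active format
    let maxZoom = device.activeFormat.videoMaxZoomFactor
    let target = min(CGFloat(zoomFactor), maxZoom)
    do {
        try device.lockForConfiguration()
        device.ramp(toVideoZoomFactor: target, withRate: 1.0)
        device.unlockForConfiguration()
    } catch {
        print("**** Unable to set camera zoom: \(error)")
    }
}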
Feb ’25
Reply to Macro-mode in AVCaptureDevice(custom camera)
Further to Greg's reply to my solution, I confirm that builtInDualCamera does not provide close focus, whereas builtInDualWideCamera does. I've therefore revised my solution (to support different iPhone / iPad models) as below:

func getMacroDevice() -> AVCaptureDevice? {
    if let device = AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back) {
        return device
    }
    if let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) {
        return device
    }
    if let device = AVCaptureDevice.default(for: .video) {
        return device
    }
    return nil
}

I'm still puzzled by the "default behaviour" not working and will explore further at https://developer.apple.com/documentation/avfoundation/avcapturedevice/activeprimaryconstituentdeviceswitchingbehavior

UPDATE: using device = AVCaptureDevice.default(for: .video) and then reading that device's .primaryConstituentDeviceSwitchingBehavior returns .unsupported on my iPhone 15 Pro under iOS 18.3. So it seems that our code must specify a primary constituent device, as per my code above; otherwise the default device does not qualify for switching, even where it's available. Perhaps this is what Greg was getting at with his original answer. Also, the M1 iPad Pro has a .builtInDualWideCamera, which is probably its default.

Regards, Michaela
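PS: for anyone else exploring this, below is a minimal sketch of how the switching behaviour might be inspected and enabled, assuming device is one of the virtual devices returned by getMacroDevice() above (choosing .auto is my assumption, based on the documentation link):

if device.primaryConstituentDeviceSwitchingBehavior != .unsupported {
    do {
        try device.lockForConfiguration()
        // .auto lets the system switch constituents (e.g. to the ultra-wide for close focus)
        device.setPrimaryConstituentDeviceSwitchingBehavior(.auto, restrictedSwitchingBehaviorConditions: [])
        device.unlockForConfiguration()
    } catch {
        print("**** Unable to set switching behaviour: \(error)")
    }
}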
Jan ’25
Reply to Macro-mode in AVCaptureDevice(custom camera)
I solved the problem by explicitly setting the device, as per the function below:

func getMacroDevice() -> AVCaptureDevice? {
    if let device = AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back) {
        return device
    }
    if let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) {
        return device
    }
    if let device = AVCaptureDevice.default(for: .video) {
        return device
    }
    return nil
}

I use this function in the session.beginConfiguration() part of the camera functionality to set the device (sketched in the PS below). The order is important: on my iPhone 15 Pro, if the Dual Camera is the first choice then macro mode doesn't occur. Also, on my M1 iPad Pro, builtInDualCamera and builtInTripleCamera don't apply (the let fails): the iPad will close focus using the default camera device.

It seems to me that if the default behaviour is to switch automatically, then in my development environment and on my devices the supposed behaviour is not working.

Regards, Michaela
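PS: for context, this is roughly how the chosen device gets wired into the session - a sketch, assuming session is the app's AVCaptureSession and any existing input has already been removed:

session.beginConfiguration()
if let device = getMacroDevice(),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}
session.commitConfiguration()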
Jan ’25
Reply to Macro-mode in AVCaptureDevice(custom camera)
I too need this capability, for photographing product labels and OCR text recognition. The small size (and font) of the labels requires a macro function. The Camera app on my iPhone 15 Pro (iOS 18.3) automatically enters macro mode and creates very sharp label images, which OCR accurately. However, within my iOS Swift app using AVCapture there appears to be no way of enabling this behaviour: close focus is not possible, and therefore the labels do not OCR properly.
Jan ’25
Reply to New Vision API
I've just done a prototype app for recognising text from photos of product labels, using the new Vision API with Xcode 16.2 beta 2 and images from an iPad Pro (2020 M1 chip) and an iPhone 15 Pro. The recognition is very accurate, even correctly recognising neatly handwritten label text. These are the settings I'm using:

ocrRequest.recognitionLevel = .accurate
ocrRequest.usesLanguageCorrection = true
ocrRequest.automaticallyDetectsLanguage = true

Some of the labels are very small (1cm x 2cm), but even then, with the iPhone camera on 2x or 3x and macro mode, the recognition is near perfect.

Regards, Michaela
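PS: for completeness, here's a minimal sketch of how those settings sit in a full request with the new Swift-only Vision API (the function name and the CGImage input are just for illustration):

import Vision

func recognizeLabelText(in image: CGImage) async throws -> [String] {
    var ocrRequest = RecognizeTextRequest()
    ocrRequest.recognitionLevel = .accurate
    ocrRequest.usesLanguageCorrection = true
    ocrRequest.automaticallyDetectsLanguage = true

    // Perform the request, then return the best candidate string per observation
    let observations = try await ocrRequest.perform(on: image)
    return observations.compactMap { $0.topCandidates(1).first?.string }
}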
Nov ’24
Reply to Heic format image incompatibility issue
I don't know if it's related, but in a test project I'm using to try out text recognition from photos, Xcode does not recognise .HEIC as a valid extension for images in the Asset catalogue. Changing the extension to .heic (i.e. lowercase) in Finder's Get Info solves the issue. A device/app interoperability issue, methinks. Images from my iPhone and iPad all have the .HEIC extension, which (of course) doesn't get changed to lowercase on AirDrop to my Mac. I'm using Xcode 16.2 beta 2 with iOS devices on 18.2 and the Mac on 15.2.

Regards, Michaela
Nov ’24
Reply to SwiftUI FormView not updating after value creation/updating in a SubView
UPDATE: The problem with this sample (test) code was the way I passed the CoreData entity to the Form. My in-development app updates the NSManaged vars correctly and recomputes the derived data, but there's a problem (yet to be resolved) in getting all of those values back into the Form for display - something to do with the way I've generalised the Form's processing. Apologies for any inconvenience.

Regards, Michaela
Nov ’24
Reply to iOS 17.2 Simulator Error Xcode
I was having the same problem with the Xcode 15 betas on an M1 MacMini running the Ventura betas. For me it seems to have been a disk-space problem: developer files on the main HD kept ballooning to over 60GB on an already constrained machine. When I moved Xcode to an external USB-C SSD and set Locations in Xcode's Settings to that drive, the download and registration worked fine - plus there's more free space for other uses.
Dec ’23
Reply to How to fix NSCloudKitMirroringDelegate unhandled exception after faulty model change
Fixed the problem by these steps:

1. Created a backup of the MacMini's persistent store, using code for a temporary persistent store coordinator and migratePersistentStore (see the PS below for a sketch).
2. Also created CSV file backups of all objects of all entities (as a fall-back in case the migration didn't work correctly).
3. Deleted all objects of all entities in the MacMini persistent store.
4. Waited for the deletions (step 3) to be propagated to the CloudKit container.
5. Checked the CloudKit container (Web Console) and manually deleted any entity objects not deleted by steps 3 and 4 - there were quite a few.
6. Restored the MacMini persistent store by code for:
   a. persistentStoreCoordinator.destroyPersistentStore of the original MacMini store
   b. persistentStoreCoordinator.addPersistentStore based on the backup store
   c. persistentStoreCoordinator.migratePersistentStore from the backup to the "new" store (step 6b)
7. Waited for the "new store" entity objects to be propagated to the CloudKit container.
8. Checked the CloudKit container (Web Console) - all good.
9. Checked that additions/deletions worked correctly, whether made on the MacMini or via the CloudKit Console - all good.

Somewhat of a pain, but necessary given the amount of data and its complexity. An advantage is that I now have backup and restore functions in the File menu of the MacMini app.

Regards, Michaela
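PS: for anyone wanting to replicate step 1, this is roughly the shape of the backup code - a sketch only, assuming container is the app's loaded NSPersistentCloudKitContainer; the function and parameter names are illustrative:

import CoreData

func backupStore(from container: NSPersistentContainer, to backupURL: URL) throws {
    guard let sourceStore = container.persistentStoreCoordinator.persistentStores.first,
          let sourceURL = sourceStore.url else { return }
    // Temporary coordinator, so the live store and its coordinator are untouched
    let tempCoordinator = NSPersistentStoreCoordinator(managedObjectModel: container.managedObjectModel)
    let intermediate = try tempCoordinator.addPersistentStore(
        ofType: NSSQLiteStoreType,
        configurationName: nil,
        at: sourceURL,
        options: [NSReadOnlyPersistentStoreOption: true])  // open read-only, to be safe
    // Copy the store (and its journal files) to the backup location
    _ = try tempCoordinator.migratePersistentStore(
        intermediate, to: backupURL, options: nil, withType: NSSQLiteStoreType)
}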
Oct ’23
Reply to How to fix NSCloudKitMirroringDelegate unhandled exception after faulty model change
I've created a test app, with CoreData records synced via CloudKit; records created on the Mac get synced to the iPad Pro. It's a very simple app, just listing records from a FetchedResults. I then did Reset Development Environment in the CloudKit Console web portal, with the app closed on both devices. This reset, of course, deleted all records, the schema and the coredata.cloudkit zone from the Development environment. On restarting each app, the CoreData records on both devices remained and the app worked correctly, fetching existing records and also recreating the CloudKit zone, schema and records. Thereafter, new records on the Mac were correctly synced to the iPad Pro.

I'll do some more testing, making the test app more complex with the sort of relationship I was trying to achieve in the messed-up app, then I might be bold enough to use this approach to fix the problem app. However, it would still be better if I could somehow purge the CloudKit update queue - although I can't be sure that the schema is correct anyway: probably not.

Regards, Michaela
Oct ’23
Reply to preload core data from cloud kit
This is what I use in my apps (during the initialisation of my Data Controller):

self.mainContainer = {
    let container = NSPersistentCloudKitContainer(name: "MyDataModel")
    container.loadPersistentStores(completionHandler: { description, error in
        if let error = error {
            print("**** ERROR loading persistent store \(error)")
        }
        // Set up auto merge of CloudKit data
        container.viewContext.automaticallyMergesChangesFromParent = true
        container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
        // Set the query generation to .current, for dynamically updating views from CloudKit
        try? container.viewContext.setQueryGenerationFrom(.current)
    })
    return container
}()

The key lines are the two below the "auto merge" comment. Also, be sure to have enabled Remote Notifications in the Background Modes of the app's Signing & Capabilities.

I hope this helps. Regards, Michaela
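PS: a minimal sketch of how the container might be exposed to SwiftUI so that views update as CloudKit records arrive - DataController is the illustrative name of the class that owns mainContainer, assumed here to be an ObservableObject:

import SwiftUI

@main
struct MyApp: App {
    @StateObject private var dataController = DataController()

    var body: some Scene {
        WindowGroup {
            ContentView()
                // @FetchRequest-backed views re-render as synced changes merge in
                .environment(\.managedObjectContext, dataController.mainContainer.viewContext)
        }
    }
}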
Apr ’23
Reply to ECG Anomaly Dataset
If the app referenced by Claude31 is available to you/your university, then that would be the way to go for getting data from a variety of volunteer subjects who have an ECG-capable Apple Watch and an iPhone. However, if you're a developer with an Apple Watch and iPhone, and/or have colleagues with them, and you're only collecting your own data (or your colleagues'), then a fairly simple Swift app will suffice to get the raw data, which are a timestamp and a voltage. The Apple documentation is at https://developer.apple.com/documentation/healthkit/hkelectrocardiogram

I use my app to create a CSV file, which I then import into an ECG viewer. I use this setup quite frequently, as well as data from a Polar H10 ECG app, to monitor my various arrhythmias and provide data to EDs as required (unfortunately, too often for comfort!).

Regards, Michaela
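PS: in case it helps, here's a minimal sketch of pulling the raw voltages, assuming healthStore is an authorised HKHealthStore and ecg is an HKElectrocardiogram already fetched with an HKSampleQuery (the CSV assembly is illustrative):

import HealthKit

func exportVoltages(from ecg: HKElectrocardiogram, using healthStore: HKHealthStore) {
    var csvRows = ["seconds,microvolts"]
    let voltageQuery = HKElectrocardiogramQuery(ecg) { _, result in
        switch result {
        case .measurement(let measurement):
            // The Watch records a single lead, similar to Lead I
            if let voltage = measurement.quantity(for: .appleWatchSimilarToLeadI) {
                let microvolts = voltage.doubleValue(for: .voltUnit(with: .micro))
                csvRows.append("\(measurement.timeSinceSampleStart),\(microvolts)")
            }
        case .done:
            print(csvRows.joined(separator: "\n"))  // write to a file in a real app
        case .error(let error):
            print("ECG query failed: \(error)")
        @unknown default:
            break
        }
    }
    healthStore.execute(voltageQuery)
}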
Mar ’23