I am trying to recreate the iOS Messages app photo selection UI, where a PHPickerViewController is displayed half screen, with the message text field and a scrolling photo viewer on top. I have that UI mostly working, but cannot figure out how to allow a user to remove an image from my scrolling photo viewer (just like in the iOS Messages app).
When the picker is initially displayed, I can show selected images using the preselectedAssetIdentifiers. However, if the user taps the "x" to remove an image from the scrolling photo viewer, there is no way that I have found to update that selection in the picker.
I can dismiss/show a new picker with the animated property set to false, but that creates a very apparent bounce in the screen. Are there any ways I am missing to accomplish this?
Here is what I have so far:
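A possible direction, assuming you can require iOS 17 (this is a sketch, not my actual code): PHPickerViewController gained deselectAssets(withIdentifiers:), which updates an embedded picker's selection in place instead of recreating the picker.

import PhotosUI

// Hedged sketch (iOS 17+): when the user taps "x" on a thumbnail in the custom
// photo strip, tell the live picker to drop that asset. `identifier` is whatever
// asset identifier the strip stored for that thumbnail.
func userDidRemoveThumbnail(identifier: String, from picker: PHPickerViewController) {
    picker.deselectAssets(withIdentifiers: [identifier])
}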
Hi,
I'm using Core Graphics to load a .DNG photo shot by a Leica Q3 camera.
The photo is shot in portrait; however, the embedded preview is rotated 90 degrees to landscape.
I load the photo like this:
import ImageIO

let options = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return } // creation can fail, so unwrap
let cgimage = CGImageSourceCreateImageAtIndex(source, 0, options)
let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
When doing this, I can see that the orientation property is 1, indicating that the orientation is 'Up', which it isn't.
If I don't specify the kCGImageSourceDecodeToHDR option (essentially passing nil options), the orientation property is 8 (rotated 90 degrees).
What puzzles me is how a change to the CGImageSourceCreateImageAtIndex call can influence the later call to CGImageSourceCopyPropertiesAtIndex.
I would expect these to work independently?
Cheers
Thomas
I have a complex CoreImage pipeline which I'm keen to optimise. I'm aware that calling back to the CPU can have a significant impact on the performance - what is the best way to find out where this is happening?
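If useful, a hedged pointer based on the WWDC 2020 session "Optimize the Core Image pipeline for your video app": the CI_PRINT_TREE environment variable makes Core Image log its render graph, which is where CPU render passes become visible; the Core Image template in Instruments is the other usual tool. A minimal sketch:

import CoreImage
import Foundation

// Hedged sketch: CI_PRINT_TREE is normally set in the Xcode scheme's environment
// variables; setting it here before the first CIContext is created should be
// equivalent. The logged graph marks each render pass, making CPU fallbacks visible.
setenv("CI_PRINT_TREE", "1", 1)
let context = CIContext()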
This is an issue with the Insta360 Flow Pro 2.
My iOS app uses DockKit to control the gimbal; in particular, my app disables tracking and sends angular velocity commands to control the gimbal's orientation. I only try to modify the yaw (rotation around the vertical axis), never the pitch or roll. Note that I don't send the gimbal to a particular orientation directly; I modify the velocity.
Everything works great for a long period of time: typically for a continuous run of 4-6 hours; in the most recent case, I managed about 36 hours of continuous operation before the following problem occurred.
I came back to check on the system, and because no visual activity had occurred in the camera's field of view for a while, the phone had commanded the gimbal to rotate back to a yaw angle of 0 degrees.
So the phone in the gimbal should have been looking straight ahead (i.e. the 0 degree yaw position), but it was definitely looking off at an angle. I've seen this twice now. The first time, when it should have been looking straight ahead, it was in fact looking 60 degrees off center. This time (caught on video, see below), it was off by 22 degrees from center.
Here's the weird part: the gimbal reports this way-off-center positioning as zero degrees (well, close enough to zero, like 0.2, which is fine). But, mechanically, the gimbal still knows where zero degrees is: if we double-click the trigger of the Flow Pro 2, which is supposed to reset the gimbal to 0 degrees yaw and pitch, the gimbal responds correctly and reorients to the 0-degree position. However, the yaw values it reports are not zero but, as shown in my video, 22 degrees or so off axis.
Power cycling the gimbal and restarting immediately fixes the problem. Also, I switched from my app to the Insta360 app, which caused the phone to flip from landscape to portrait; when I returned to my app and switched back to landscape, the gimbal started reporting correct yaw angles again.
Is there a possibility this is a bug in the DockKit framework? Has anyone seen this? I have a case open with Insta360, but although it's clearly a software issue, it's not clear if it's in Insta360's code or the DockKit layer. Any ideas for how I can get out of this mode? My concern is that the phone is on a tripod about 10' off the floor and not very accessible. Also, if all goes well, we may have about 50 of these systems running, and having to fix them one by one after a few hours is not good.
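For reference, the control pattern looks roughly like this (a hedged sketch; the setSystemTrackingEnabled and setAngularVelocity names are from my recollection of the WWDC 2023 DockKit session and may not match the current headers exactly):

import DockKit
import Spatial

// Hedged sketch of the yaw-only control loop described above. Treat the exact
// DockKit names as assumptions; the point is that only the velocity around the
// vertical axis is ever commanded, never pitch or roll.
func drive(_ accessory: DockAccessory, yawRate: Double) async throws {
    try await DockAccessoryManager.shared.setSystemTrackingEnabled(false)
    try await accessory.setAngularVelocity(Vector3D(x: 0, y: yawRate, z: 0))
}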
For a demonstration of this bug, see the following video:
https://octoparry.com/offset.MOV
Any help greatly appreciated.
I want to edit an image in the Preview app, but the only options are 90-degree rotations to the left or right; there is no option to rotate by an arbitrary angle. Please look into this and provide that option in a future update.
Hello everyone,
I am looking for a way to import photos programmatically (e.g. using AppleScript) into the Photos library on macOS and also push them to the Shared Library, like it can be done using the standard GUI of the Photos application.
Maybe it is not possible using AppleScript; perhaps it can be done with a short Swift script and PhotoKit, I do not know.
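If PhotoKit turns out to be the route, a minimal sketch of the import half (the Shared Library push is the part I cannot find a public API for, so that remains open):

import Photos

// Hedged sketch: import an image file into the Photos library with PhotoKit.
// Moving it onward to the iCloud Shared Photo Library is the unresolved part.
func importPhoto(at url: URL) async throws {
    try await PHPhotoLibrary.shared().performChanges {
        _ = PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: url)
    }
}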
Any help is appreciated!
Thomas
Hello everyone,
I’m working on an iOS app that fetches videos from the "Recently Deleted" album using the Photos framework in Swift. However, I’m unable to fetch any videos, even though the "Recently Deleted" album contains 233 items (including videos), as seen in the Photos app.
Environment:
iOS Version: 18.3.1
Xcode Version: 16.2
Swift Version: Swift 5
Device: iPhone (simulator and physical device both tested)
Photo Library Permission: "All Photos" access granted
Recently Deleted Lock: Face ID/Passcode is disabled for "Recently Deleted"
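In case it helps to narrow this down, a repro sketch that lists the smart albums PhotoKit exposes; I am not aware of a public PHAssetCollectionSubtype for Recently Deleted, which would explain an empty fetch:

import Photos

// Hedged sketch: print every smart album the framework will hand back. If
// Recently Deleted is absent here, it is likely not reachable via public PhotoKit API.
let smartAlbums = PHAssetCollection.fetchAssetCollections(
    with: .smartAlbum, subtype: .any, options: nil
)
smartAlbums.enumerateObjects { collection, _, _ in
    print(collection.localizedTitle ?? "untitled",
          "subtype:", collection.assetCollectionSubtype.rawValue)
}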
When I get results from picker(_:didFinishPicking:), i.e. func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]), and I load the image using itemProvider.loadFileRepresentation (the itemProvider is the NSItemProvider provided by the PHPickerResult), will the URL returned by this method be guaranteed to have a file extension, i.e. "file://image.jpeg" and not "file://image"?
I want to know if I need to just check the extension to know its file type.
(FYI in case this makes a difference, I'm only interested in user screenshots and screen recordings.)
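In case a concrete check helps, a minimal sketch of the alternative I would compare against: rather than trusting the extension on the returned URL, inspect the item provider's registered type identifiers (screenshots are typically PNG; screen recordings are movies).

import PhotosUI
import UniformTypeIdentifiers

// Hedged sketch: registeredTypeIdentifiers reveals the payload type regardless of
// whether the temporary file URL carries an extension.
func inspect(_ result: PHPickerResult) {
    let provider = result.itemProvider
    print("types:", provider.registeredTypeIdentifiers)
    if provider.hasItemConformingToTypeIdentifier(UTType.image.identifier) {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { url, _ in
            // The URL is a temporary copy; compare its extension against the types above.
            print("extension:", url?.pathExtension ?? "none")
        }
    }
}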
I am developing a video streaming app for iPhone.
The minimum version is iOS 13.
I want to connect an external USB camera to the iPhone app and stream from it.
I have looked through a lot of information and have not found how to do this.
Is it possible to do this? Is there any documentation on this?
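For what it's worth, a hedged sketch of the modern API surface: iOS/iPadOS 17 added AVCaptureDevice.DeviceType.external, though as far as I know UVC cameras are supported on USB-C iPads rather than iPhone, and nothing comparable reaches back to iOS 13.

import AVFoundation

// Hedged sketch: discover an external (UVC) camera, where the platform supports it.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
if let camera = discovery.devices.first {
    print("Found external camera:", camera.localizedName)
}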
Has Objective-C been deprecated?
Hi all,
On macOS, how can I disable or enable the built-in camera programmatically or by script?
I'm working on an application that uses the iPhone camera for scientific purposes, and as a result would like to receive sensor data in as unprocessed a format as possible.
I'm using AVCapturePhotoOutput to take Bayer RAW stills and receiving data in kCVPixelFormatType_14Bayer_RGGB format.
However, I'm puzzled as to the content of the bits. I simply demosaic the image by taking each 2x2 square:
RG
GB
and use R, (G+G)/2, B to get 16-bit RGB values - and this indeed works.
However, I am puzzled by the values we are getting, as they seem to be approximately in the range 2048-16383. The top value is understandable: it is the maximum that you can fit in 14 bits (as implied by the pixel format type).
However, we don't seem to be able to get lower than ~2048 no matter how black/dark we make the sensor.
I'm aware that the sensor is probably not 14-bit (we're using the iPhone 16e camera) and that maybe this has to do with the way the sensor data is packaged.
The Advances in iOS Photography video (https://developer.apple.com/videos/play/wwdc2016/501/) describes it as "10-bit sensor RAW packaged in 14 bits per pixel instead of eight."
Is there any documentation describing what is going on here? It's vital for our use that we get as close to the raw camera sensor light readings as possible, so any pointers as to the mapping (e.g. decompanding?) being used would be extremely useful.
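One hedged reading of that quote that is at least consistent with the numbers: if the 10-bit sensor value is scaled by 16 into the 14-bit container, a black-level pedestal of 128 on the 10-bit scale lands exactly on the observed floor.

// Hedged arithmetic, not a documented mapping: both the x16 packing and the
// 128-count pedestal are assumptions on my part.
let blackLevel10 = 128
let scale = 16                       // 10-bit value -> 14-bit container
let floor14 = blackLevel10 * scale   // 2048, the observed minimum
let ceiling14 = (1 << 14) - 1        // 16383, the observed maximum
print(floor14, ceiling14)

If that holds, subtracting the pedestal before demosaicing would be the first step toward linear sensor readings.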
Many thanks in advance for your help.
I have an iOS app that includes a Photo Editing Extension and is optimized for Mac Catalyst so you can edit photos in the Photos app on your Mac. This has worked really well but now I am encountering an error alert trying to open the photo editing extension:
RBSLaunchRequest error trying to launch plugin com.company.TestEditor.TestPhotoEditor (B7A616A7-25A8-4E02-8B32-5CAB37C8B4B2): Error Domain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7f08fafd0 {Error Domain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}}
Steps to reproduce:
1. Create a new iOS app project in Xcode
2. Create a new target and choose iOS > Photo Editing Extension
3. For both targets in the project, add Mac Catalyst as a supported destination
4. Run the app on My Mac (Mac Catalyst)
5. Open the Photos app, double-click a photo, click Edit, click the more-plugins button, and click TestPhotoEditor in the list
macOS 15.4.1 + Xcode 16.3
Hi guys,
How can I achieve the following on macOS when a USB device (Camera/Mic/Speaker) is connected:
When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (Camera/Mic/Speaker).
When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (Camera/Mic/Speaker).
I want to use an IOKit extension to hide or ignore the built-in camera to meet this requirement. However, the extension can't be loaded due to invalid permissions on macOS 15.4.1 (build 24E263). I also tried running it on macOS 14.4.1, where it can be loaded but won't auto-load after restarting the laptop because the KDK version doesn't match.
Could you please give me some suggestions? Is it possible to hide the built-in camera on M-series (Apple silicon) Macs? Is there any other way to achieve this? Thanks a lot.
Hi guys,
Can I use CMIO to achieve the following feature on macOS when a USB device (Camera/Mic/Speaker) is connected:
When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (Camera/Mic/Speaker).
When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (Camera/Mic/Speaker).
Does the library exist in Xcode 16.4?
"import WorldCaptureKit" gives the error "No such module 'WorldCaptureKit'".
And I do not find any information about the library in the Apple documentation.
But AI tools keep suggesting that I use the library.
Can I use the IOKit USB library to disable the built-in camera?
iOS 26 added smoothness to CIRoundedRectangleGenerator, for use with CIFilter.roundedRectangleGenerator. What should the smoothness value be to achieve the same corner curve as CALayerCornerCurve.continuous? Does it need to be calculated based on the extent size, if so, how?
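For concreteness, a hedged sketch of the filter setup (the smoothness property is taken from the description above; the value that reproduces CALayerCornerCurve.continuous is exactly the open question):

import CoreImage.CIFilterBuiltins

let filter = CIFilter.roundedRectangleGenerator()
filter.extent = CGRect(x: 0, y: 0, width: 200, height: 100)
filter.radius = 24
filter.color = .white
// Placeholder value; unknown what matches CALayerCornerCurve.continuous,
// or whether it must be derived from the extent size.
filter.smoothness = 0.6
let output = filter.outputImage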
Some users have reported an error editing portrait photo assets in my app:
The operation couldn’t be completed. (CINonLocalizedDescriptionKey error 3.)
What is that error? Will affected photos always encounter this error (due to data corruption for example) or can it be resolved in a future iOS update?
FB16241301
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. That no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
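For reference, the call in question, roughly as my app issues it (a minimal sketch with error handling elided):

import Photos

// Revert an edited asset to its original content. The documented precondition is
// that the original content is local, hence the question about downloading.
func revertToOriginal(_ asset: PHAsset) async throws {
    try await PHPhotoLibrary.shared().performChanges {
        PHAssetChangeRequest(for: asset).revertAssetContentToOriginal()
    }
}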