Core Image


Use built-in or custom filters to process still and video images using Core Image.

Core Image Documentation

Posts under Core Image tag

50 Posts
Post not yet marked as solved
1 Reply
607 Views
I'm trying to create a sky mask for pictures taken with my iPhone. The documentation says Core Image supports semantic segmentation for sky in addition to the person-related types (skin, hair, etc.), but so far I haven't found the proper workflow to use it. I first watched https://developer.apple.com/videos/play/wwdc2019/225/ and understood that images must be captured with segmentation enabled, with code along these lines: photoSettings.enabledSemanticSegmentationMatteTypes = self.photoOutput.availableSemanticSegmentationMatteTypes and photoSettings.embedsSemanticSegmentationMattesInPhoto = true. I capture the image on my iPhone, save it in HEIC format, and later try to load the matte like this: let skyMatte = CIImage(contentsOf: imageURL, options: [.auxiliarySemanticSegmentationSkyMatte: true]). Unfortunately, self.photoOutput.availableSemanticSegmentationMatteTypes only ever gives me person-related types and never a sky type. In fact, AVSemanticSegmentationMatte.MatteType only lists [Hair, Skin, Teeth, Glasses], with no sky at all. So how am I supposed to use semanticSegmentationSkyMatteImage? Is there any simple workaround?
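A minimal sketch of the capture configuration and matte loading described above, assuming a configured AVCapturePhotoOutput named photoOutput (names are illustrative):

import AVFoundation
import CoreImage

// Request embedded segmentation mattes at capture time. Only the matte types
// the photo output actually reports as available can be enabled.
func makePhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    settings.enabledSemanticSegmentationMatteTypes = photoOutput.availableSemanticSegmentationMatteTypes
    settings.embedsSemanticSegmentationMattesInPhoto = true
    return settings
}

// Try to read a sky matte back from a saved HEIC; returns nil when the file
// carries no sky matte auxiliary image (the behaviour reported above).
func loadSkyMatte(from imageURL: URL) -> CIImage? {
    CIImage(contentsOf: imageURL, options: [.auxiliarySemanticSegmentationSkyMatte: true])
}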
Posted Last updated
.
Post not yet marked as solved
1 Reply
911 Views
Hello, I am a student doing research for my thesis on Create ML, shape recognition, and image processing. For this subject I would like to find details of the steps Create ML uses, such as the pre-processing techniques, the feature-extraction methods, the filters applied, etc.
Posted Last updated
.
Post not yet marked as solved
1 Reply
659 Views
When initializing a CIColor with a dynamic UIColor (like the system colors that resolve differently based on light/dark mode) on macOS 14 (Mac Catalyst), the resulting CIColor is invalid/uninitialized. For instance:
po CIColor(color: UIColor.systemGray2) → <uninitialized>
po CIColor(color: UIColor.systemGray2.resolvedColor(with: .current)) → <CIColor 0x60000339afd0 (0.388235 0.388235 0.4 1) ExtendedSRGB>
But also, not all colors work even when resolved:
po CIColor(color: UIColor.systemGray.resolvedColor(with: .current)) → <uninitialized>
I think this is caused by the color space of the resulting UIColor:
po UIColor.systemGray.resolvedColor(with: .current) → kCGColorSpaceModelRGB 0.596078 0.596078 0.615686 1
po UIColor.systemGray2.resolvedColor(with: .current) → UIExtendedSRGBColorSpace 0.388235 0.388235 0.4 1
This worked correctly before in macOS 13.
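A hedged sketch of a possible workaround along the lines the post already hints at: resolve the dynamic color against an explicit trait collection and convert it to extended sRGB before building the CIColor (the helper name is hypothetical):

import UIKit
import CoreImage

// Hypothetical helper: resolve a dynamic UIColor and force it into extended sRGB
// so CIColor never sees a color in a non-RGB color space.
func ciColor(from dynamicColor: UIColor, traits: UITraitCollection = .current) -> CIColor? {
    let resolved = dynamicColor.resolvedColor(with: traits)
    guard let srgb = CGColorSpace(name: CGColorSpace.extendedSRGB),
          let converted = resolved.cgColor.converted(to: srgb, intent: .defaultIntent, options: nil) else {
        return nil
    }
    return CIColor(cgColor: converted)
}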
Posted Last updated
.
Post not yet marked as solved
2 Replies
801 Views
I have written two custom Core Image Metal kernels, which I'm using to produce a CIImage by chaining several filters. I'm drawing the output image in a simple view, and whatever representation I use (CIImage, NSImage, CGImageRef), the image appears corrupted on screen, like some sort of graphics corruption (I've tried on two different machines with different systems). However, if I add a step to write the image to disk from the CIImage, then read it back from disk and draw it in that very same view, everything is fine and the image appears correctly. What could possibly be happening here?
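One way to narrow this down, sketched under the assumption that the corruption comes from the view re-evaluating the lazy CIImage on every draw, is to render the filter chain eagerly through a CIContext and draw the resulting CGImage instead (a diagnostic step, not a confirmed fix):

import CoreImage

// Reuse one CIContext; creating a context per frame is expensive.
let context = CIContext()

// Force the Metal-kernel filter chain to be evaluated once, up front,
// instead of letting the view render the lazy CIImage on each pass.
func renderToCGImage(_ image: CIImage) -> CGImage? {
    context.createCGImage(image, from: image.extent)
}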
Posted Last updated
.
Post not yet marked as solved
0 Replies
714 Views
When using the heif10Representation and writeHEIF10Representation APIs of CIContext, the resulting image doesn’t contain an alpha channel. When using the heifRepresentation and writeHEIFRepresentation APIs, the alpha channel is properly preserved, i.e., the resulting HEIC will contain a urn:mpeg:hevc:2015:auxid:1 auxiliary image. This image is missing when exporting as HEIF10. Is this a bug or is this intentional? If I understand the spec correctly, HEIF10 should be able to support alpha via auxiliary image (like HEIF8).
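For reference, a minimal sketch of the two export paths being compared above (the context, image, and output URLs are placeholders):

import CoreImage

func exportBothVariants(_ image: CIImage, using context: CIContext) throws {
    let colorSpace = CGColorSpace(name: CGColorSpace.displayP3)!
    // 8-bit HEIF export: the alpha channel survives as an auxiliary image.
    try context.writeHEIFRepresentation(of: image,
                                        to: URL(fileURLWithPath: "/tmp/heif8.heic"),
                                        format: .RGBA8,
                                        colorSpace: colorSpace,
                                        options: [:])
    // 10-bit HEIF export: in the tests described above the alpha auxiliary image is dropped.
    try context.writeHEIF10Representation(of: image,
                                          to: URL(fileURLWithPath: "/tmp/heif10.heic"),
                                          colorSpace: colorSpace,
                                          options: [:])
}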
Posted Last updated
.
Post not yet marked as solved
1 Reply
1.2k Views
Environment: Xcode 14, iOS 16.
Error: [Unknown process name] CGImageCreate: invalid image byte order info for bitsPerPixel != 32 = 16384
When I run my app on an iPhone, it can't display the resulting image. It seems something breaks when running on iOS 16. How can I solve this problem?
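The log appears to indicate a 32-bit byte-order flag (16384, i.e. kCGBitmapByteOrder32Big) being combined with a bitsPerPixel value that isn't 32. A sketch of a consistent 32-bit RGBA CGImage creation, with illustrative parameter values:

import CoreGraphics
import Foundation

// The byte-order flag must match bitsPerPixel: the 32-bit byte-order constants
// are only valid when bitsPerPixel == 32, which is what the error message complains about.
func makeRGBA8Image(from data: Data, width: Int, height: Int) -> CGImage? {
    guard let provider = CGDataProvider(data: data as CFData) else { return nil }
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        .union(.byteOrder32Big)                  // matches bitsPerPixel = 32 below
    return CGImage(width: width,
                   height: height,
                   bitsPerComponent: 8,
                   bitsPerPixel: 32,             // 4 components x 8 bits
                   bytesPerRow: width * 4,
                   space: CGColorSpaceCreateDeviceRGB(),
                   bitmapInfo: bitmapInfo,
                   provider: provider,
                   decode: nil,
                   shouldInterpolate: false,
                   intent: .defaultIntent)
}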
Posted
by flygod.
Last updated
.
Post not yet marked as solved
4 Replies
3.3k Views
Is this accessible from Swift directly? "Visual Look Up: Lift subject from background. Lift the subject from an image or isolate the subject by removing the background. This works in Photos, Screenshot, Quick Look, Safari, and more." (Source: macOS Ventura Preview - New Features - Apple.) I see that Shortcuts now has a native Remove Background command that wasn't there in iOS 15 or macOS 12. Is there any way to call that from Swift besides x-callback URL schemes?
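For what it's worth, newer OS releases expose subject lifting to Swift through Vision's VNGenerateForegroundInstanceMaskRequest; a minimal sketch follows (availability and exact behaviour should be checked against the current documentation):

import Vision
import CoreGraphics
import CoreVideo

// Ask Vision for a foreground-instance mask and return the input image with
// its background removed, as a pixel buffer.
func liftSubject(from cgImage: CGImage) throws -> CVPixelBuffer? {
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    guard let observation = request.results?.first else { return nil }
    return try observation.generateMaskedImage(ofInstances: observation.allInstances,
                                               from: handler,
                                               croppedToInstancesExtent: false)
}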
Posted Last updated
.
Post marked as solved
3 Replies
1.9k Views
When capturing RAW (not ProRAW) photos using AVCapturePhotoOutput, the resulting images are subject to a strange overexposed effect when viewed in Apple software. I have been able to recreate this in multiple iOS apps which allow RAW capture. Some users report previously normal images transforming over the span of a few minutes. I have actually watched this happen in real time: if you observe the camera roll after taking a few RAW photos, the highlights in some will randomly blow out of proportion after whatever is causing this issue kicks in. The issue can also be triggered by zooming in to one of these images from the stock Photos app. Once the overexposure happens on a given image, there doesn't appear to be a way to get it to display normally again. However, if you AirDrop an image to a different device and then back, you can see it display normally at first and then break once more. Interestingly, the photo displays completely fine when viewed in Affinity Photo or random photo viewers on Ubuntu. Sometimes the issue is not that bad, but it is often egregious, resulting in completely white areas of a previously balanced photo (see https://discussions.apple.com/thread/254424489). This definitely seems like a bug, but is there any way to prevent it? Could there be an issue with color profiles? This is not the same issue in which users think RAW photos are broken because they are viewing the associated JPG: this happens even with photos that have no embedded JPG or HEIC preview.
Very similar (supposedly fixed) issue on macOS: https://www.macworld.com/article/1444389/overexposed-raw-image-export-macos-monterey-photos-fixed.html
Numerous similar complaints:
https://discussions.apple.com/thread/254424489
https://discussions.apple.com/thread/253179308
https://discussions.apple.com/thread/253773180
https://discussions.apple.com/thread/253954972
https://discussions.apple.com/thread/252354516
Posted
by tenuki.
Last updated
.
Post not yet marked as solved
0 Replies
643 Views
I am currently working on a SwiftUI video app. When I load a slow-motion video recorded at 240 fps (239.68), I use asset.loadTracks and then .load(.nominalFrameRate), which returns 30 fps (29.xx), asset being AVAsset(url:). The duration from asset.load(.duration) is also 8 times longer than the original duration. Do you know how to get the 239.68 value that the Apple Photos app displays? Is it stored somewhere in the video metadata, or is it computed?
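A minimal sketch of the loading calls mentioned above, assuming url points at the original video resource (PHAsset/PHAssetResource handling omitted):

import AVFoundation

// Read the video track's nominal frame rate with the async loading API.
// A 240 fps slow-motion capture should report roughly 239.68 here; a re-exported
// or composed copy of the asset may report 30 fps instead.
func nominalFrameRate(of url: URL) async throws -> Float? {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return nil }
    return try await track.load(.nominalFrameRate)
}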
Posted
by Timlead.
Last updated
.