Post not yet marked as solved
I use VNDetectHumanBodyPoseRequest to detect a body pose in an image from my Xcode asset catalog (downloaded from an image website), but I get the error below:
2021-12-24 21:50:19.945976+0800 Guess My Exercise[91308:4258893] [espresso] [Espresso::handle_ex_plan] exception=Espresso exception: "I/O error": Missing weights path cnn_human_pose.espresso.weights status=-2
Unable to perform the request: Error Domain=com.apple.vis Code=9 "Unable to setup request in VNDetectHumanBodyPoseRequest" UserInfo={NSLocalizedDescription=Unable to setup request in VNDetectHumanBodyPoseRequest}.
Below is my code:
let image = UIImage(named: "image2")
guard let cgImage = image?.cgImage else { return }
let requestHandler = VNImageRequestHandler(cgImage: cgImage)
let request = VNDetectHumanBodyPoseRequest(completionHandler: bodyPoseHandler)
do {
    // Perform the body pose-detection request.
    try requestHandler.perform([request])
} catch {
    print("Unable to perform the request: \(error).")
}

func bodyPoseHandler(request: VNRequest, error: Error?) {
    guard let observations =
            request.results as? [VNHumanBodyPoseObservation] else {
        return
    }
    let poses = Pose.fromObservations(observations)
    self.drawPoses(poses, onto: self.simage!)
    // Process each observation to find the recognized body pose points.
}
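For reference, once the request does succeed, individual joints can be read from each observation. A minimal sketch (the 0.3 confidence threshold is an arbitrary illustrative value, not part of the original code):

```swift
import Vision

// Sketch: extract recognized joints from a body-pose observation.
// The 0.3 confidence cutoff is an arbitrary illustrative value.
func jointPoints(from observation: VNHumanBodyPoseObservation) -> [VNHumanBodyPoseObservation.JointName: CGPoint] {
    guard let recognized = try? observation.recognizedPoints(.all) else { return [:] }
    var result: [VNHumanBodyPoseObservation.JointName: CGPoint] = [:]
    for (joint, point) in recognized where point.confidence > 0.3 {
        // Vision reports normalized coordinates with a lower-left origin.
        result[joint] = point.location
    }
    return result
}
```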
I haven't been able to generate a HEIC photo with IPTC metadata; with JPEG it works, but not with HEIC.
I'm using:
the CIContext method heifRepresentation(of: image, format: CIFormat.RGBA8, colorSpace: colorSpace, options: options)
to generate the photo data. The image is a CIImage and carries the IPTC metadata, but the final photo doesn't have it.
If I use:
the CIContext method jpegRepresentation(of: image, colorSpace: ColorSpace.deviceRGB, options: options)
the final JPEG photo has the IPTC information.
Anyone with the same issue or with an idea about what's going on?
Color previews disappeared after the Xcode 13.1 update. A preview appears when a color is assigned to a variable, but previews directly in the code are no longer visible. What can I do to see color previews again?
Version 13.1 (13A1030d)
Question
What may cause this crash on iOS 14 and 15? Images of the crash stack trace are below:
If I enlarge the image 8x with a pinch-out gesture, the displayed area of the subimage and its touchable area no longer match.
It looks as if the touchable area has been shifted from the image area toward the top left by about 20 dp.
Please guide me on how to make the displayed area and the touchable area of the subimage on the image view match exactly.
I set the autoresizesSubviews property of the image view to true,
and the subimages are scaled with CATransform3DMakeScale when pinched out.
I am downloading image files in zip format to a folder inside my app's Files folder.
When the folder is opened, the file is shown as a .zip; when selected, it unzips and creates a new file (in my case a JPG), leaving the zip file in place as well.
My question is: how do I unzip the file in Swift, since iOS evidently has the capability?
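For what it's worth, Foundation itself exposes no public unzip API (what the Files app does is not available to third-party code directly). One common approach, assuming a third-party dependency is acceptable, is the ZIPFoundation package, which adds an unzipItem method to FileManager. A sketch:

```swift
import Foundation
import ZIPFoundation  // third-party package, not a system framework

// Sketch: unzip an archive into a destination directory using
// ZIPFoundation's FileManager extension.
func unzip(_ zipURL: URL, to destination: URL) throws {
    let fm = FileManager.default
    try fm.createDirectory(at: destination, withIntermediateDirectories: true)
    try fm.unzipItem(at: zipURL, to: destination)
}
```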
Hello,
I am trying to create an animated sequence of HEIC images, but I cannot save the frame duration property. It seems this is a well-known bug: https://github.com/SDWebImage/SDWebImage/issues/3120
The kCGImagePropertyHEICSDictionary is never saved.
Here's a sample project to reproduce the bug: ImageIOHEICSEncodeDecodeBug.zip
Has anybody managed to save this information in a HEIC sequence?
Thanks!
Here's how I am writing and reading the image sequence:
- (void)testHEICSBug {
    // First, load an animated image (GIF).
    // You can change the type to PNG (animated PNG format); same result.
    NSData *GIFData = [NSData dataWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"image1" ofType:@"gif"]];
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)GIFData, nil);
    NSUInteger frameCount = CGImageSourceGetCount(source);
    NSAssert(frameCount > 1, @"GIF frame count > 1");
    // Split into a frames array, encode to HEICS.
    NSMutableData *heicsData = [NSMutableData data];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)heicsData, (__bridge CFStringRef)AVFileTypeHEIC, frameCount, nil);
    for (int i = 0; i < frameCount; i++) {
        // First get the GIF input image and duration.
        CGImageRef cgImage = CGImageSourceCreateImageAtIndex(source, i, nil);
        NSDictionary *inputProperties = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, i, nil);
        NSDictionary *inputDictionary = inputProperties[(__bridge NSString *)kCGImagePropertyGIFDictionary];
        NSTimeInterval duration = [inputDictionary[(__bridge NSString *)kCGImagePropertyGIFUnclampedDelayTime] doubleValue];
        NSAssert(cgImage, @"CGImage not nil");
        NSAssert(duration > 0, @"Input duration > 0");
        // Then, encode into a HEICS animated image.
        NSMutableDictionary *outputProperties = [NSMutableDictionary dictionary];
        outputProperties[(__bridge NSString *)kCGImagePropertyHEICSDictionary] = @{(__bridge NSString *)kCGImagePropertyHEICSUnclampedDelayTime : @(duration)};
        CGImageDestinationAddImage(destination, cgImage, (__bridge CFDictionaryRef)outputProperties);
    }
    // Output the HEICS image data.
    BOOL result = CGImageDestinationFinalize(destination);
    NSAssert(result, @"Encode HEICS failed");
    // Next, try to use ImageIO to decode the HEICS and check the duration.
    CGImageSourceRef newSource = CGImageSourceCreateWithData((__bridge CFDataRef)heicsData, nil);
    frameCount = CGImageSourceGetCount(newSource);
    NSAssert(frameCount > 1, @"New HEICS should be an animated image");
    NSUInteger frameIndex = 1; // I pick the 2nd frame; actually any frame shows this issue.
    NSDictionary *newProperties = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(newSource, frameIndex, nil);
    NSDictionary *newDictionary = newProperties[(__bridge NSString *)kCGImagePropertyHEICSDictionary];
    NSTimeInterval newDuration = [newDictionary[(__bridge NSString *)kCGImagePropertyHEICSUnclampedDelayTime] doubleValue];
    CGImageRef newImage = CGImageSourceCreateImageAtIndex(newSource, frameIndex, nil);
    // Now, check the HEICS frame duration. However, it's nil :(
    // Only the image is kept.
    NSAssert(newImage, @"frame image is not nil");
    NSAssert(newDuration > 0, @"Decoding the HEICS (encoded from GIF) loses the frame duration");
}
I am trying to set the description of an image. The metadata tag necessary to add the description is of type alternateText.
I create the child tag with the new description value:
let childTag = CGImageMetadataTagCreate(identifier.section.namespace, identifier.section.prefix, "[x-default]" as CFString, .string, value as CFTypeRef)
I then set the description tag like this:
let parentTag = CGImageMetadataTagCreate(identifier.section.namespace, identifier.section.prefix, identifier.tagKey, .alternateText, [childTag] as CFTypeRef)
let result = CGImageMetadataSetTagWithPath(metadata, nil, identifier.tagKey, parentTag)
However, when I write the image file, I get a runtime error message and the operation fails:
XMP Error: AltText array items must have an xml:lang qualifier
So, I create the qualifier tag like this:
let qualifierTag = CGImageMetadataTagCreate("http://www.w3.org/XML/1998/namespace" as CFString, "xml" as CFString, "lang" as CFString, .string, "x-default" as CFTypeRef)
But I have not found a way to associate this qualifier tag to the child tag with the description value.
What is the way to do it?
Memory use jumps to 498 MB when the image is loaded but not displayed, and drops to 242 MB when the view controller is popped with the navigation controller's back button.
UIImage(contentsOfFile:) is not supposed to cache; I assume that is where the roughly 240 MB drop comes from, but why is the other half of the memory, about 240 MB, still being retained?
func save_ImageJPG() -> Bool {
    // Load the TIF from the app folder; image is approx. 8000 x 6000 pixels.
    let s = currentMap_Name.replacingOccurrences(of: " ", with: "%20")
    let filePath = getDocumentsDirectory().appendingPathComponent("Folder").appendingPathComponent(s)
    let image = UIImage(contentsOfFile: filePath.path)
    // Change the file type to jpg.
    let f = URL(string: s)?.deletingPathExtension().absoluteString
    let e = URL(string: f! + ".jpg")?.absoluteString
    let path = getDocumentsDirectory().appendingPathComponent("TempFolder").appendingPathComponent(e!)
    if !FileManager.default.fileExists(atPath: path.path) {
        let jpgData = image!.jpegData(compressionQuality: 1.0)!
        try? jpgData.write(to: path)
        return true
    }
    return false
}
Or is it the jpgData = image!.jpegData(compressionQuality: 1.0) call that is retaining the memory?
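One thing worth trying (an assumption on my part, not a confirmed diagnosis): intermediate buffers created during decode and re-encode are often held by the current autorelease pool until the run loop drains, so wrapping the conversion in an explicit autoreleasepool can release them immediately. A sketch with hypothetical URL parameters:

```swift
import UIKit

// Sketch: wrap the decode + re-encode in an explicit autorelease pool
// so intermediate image buffers are freed when the block exits rather
// than when the run loop next drains the pool. tifURL/jpgURL are
// illustrative parameters, not from the original code.
func writeJPG(from tifURL: URL, to jpgURL: URL) -> Bool {
    return autoreleasepool {
        guard let image = UIImage(contentsOfFile: tifURL.path),
              let jpgData = image.jpegData(compressionQuality: 1.0) else {
            return false
        }
        return (try? jpgData.write(to: jpgURL)) != nil
    }
}
```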
1) A portrait image imported from the Files app comes in as landscape. How can I fix this bug?
a) Image saved from the iPad 11 Pro front camera.
b) Saved image.
c) Image thumbnail in the Files app import window.
d) Image imported in landscape mode.
2) The landscape image can be fixed using the extension below, but the output image obtained on iOS 15 is slightly different from that on iOS 14.
How can I fix both issues?
An image captured with an iPad 11 Pro in portrait mode, when imported from the Files app using a document picker, is loaded in landscape mode (as seen in the preview while debugging),
while the thumbnail of the saved image is in portrait mode in the Files app.
Dear community,
Yesterday I updated Xcode from 12.5.1 to 13.0 and ran into a lot of compile errors. Before the update everything was working great, so I suppose some syntax rules changed with the update for iOS 15.
.background(Color.systemBlue)
error: Type 'Color' has no member 'systemBlue'
.maxWidth(.infinity)
errors:
Cannot infer contextual base in reference to member 'infinity'
Value of type 'Image' has no member 'maxWidth'
.aspectRatio(contentMode: .fit)
error: Cannot infer contextual base in reference to member 'fit'
Image(.system("line.horizontal.3.circle.fill"))
error: Type 'String' has no member 'system'
.font(.largeTitle, weight: .bold)
errors:
- Cannot infer contextual base in reference to member 'bold'
- Extra argument 'weight' in call
Image(section.logo)
.resizable()
.height(32)
error: Value of type 'Image' has no member 'height'
Command CompileSwift failed with a nonzero exit code
I hope this one will fix itself.
Basically I got 23 errors, and before yesterday the code was working great.
Thanks to those who will help me, hoping it will be useful to other users,
Martin
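Judging only from the error messages above, most of the failing calls are not standard SwiftUI modifiers (they may have come from a helper extension that no longer compiles). Assuming the intent of each line, here is how the same effects are written with stock SwiftUI APIs; the view name and modifier order are illustrative:

```swift
import SwiftUI

// Sketch: standard-SwiftUI equivalents of the failing calls.
struct FixedSection: View {
    var body: some View {
        Image(systemName: "line.horizontal.3.circle.fill") // not Image(.system(...))
            .resizable()
            .aspectRatio(contentMode: .fit)                // valid on a resizable Image
            .frame(height: 32)                             // not .height(32)
            .frame(maxWidth: .infinity)                    // not .maxWidth(.infinity)
            .font(.largeTitle.bold())                      // not .font(.largeTitle, weight: .bold)
            .background(Color(.systemBlue))                // Color has no .systemBlue; wrap UIColor
    }
}
```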
I use the Image I/O call CGImageDestinationCopyImageSource() to update image metadata, and noticed it has failed to update existing metadata fields since macOS 12 beta 5; upgrading to beta 6 does not fix it. macOS 11 and below don't have this issue. Do you plan to fix it in coming betas, and when?
Precondition:
have a file that has any IPTC. For example, IPTC Title = "Old".
Steps:
Update the metadata (e.g. change IPTC Title to "New") with the following code:
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:(id)metadataToApply,kCGImageDestinationMetadata, kCFBooleanTrue,kCGImageDestinationMergeMetadata, nil];
CFErrorRef metaError = NULL;
BOOL isSuccess = CGImageDestinationCopyImageSource(idst, srf, (__bridge CFDictionaryRef)options, &metaError);
Observed:
At the file level it silently fails; IPTC Title is still "Old".
Hi,
I'm trying to convert images picked from the user's camera roll into JPEG data to upload to a server.
I'm doing the following to convert to Data:
let imageData: Data? = uiImage.jpegData(compressionQuality: 0)
This works fine when converting screenshots, but when I attempt to convert an image taken with the device's camera, it gives me the following error:
[Metal] 9072 by 12198 iosurface is too large for GPU
It still converts it into data, but the data then just contains a blank image, not the selected image.
I'm at a loss as to where to go with this one. I've tried converting HEIC images preloaded in the device's bundle and that works fine, but with ones selected from the camera roll I get the stated error.
Any help would be appreciated. This is the sample code I'm using to test the issue (I'm aware it's quite pointless in this form):
func convertImage(image: Image?) -> Image? { // 1
    let key = "\(String.random(length: 15))"
    let uiImage: UIImage = image.asUIImage()
    let imageData: Data? = uiImage.jpegData(compressionQuality: 0)
    guard let imageData = imageData else { return nil }
    let encodedImage = imageData.base64EncodedString()
    let fileManager = FileManager.default
    let documentsPath = fileManager.urls(for: .documentDirectory, in: .userDomainMask).first
    let localImageUrl = documentsPath?.appendingPathComponent(key)
    guard let localImageUrl = localImageUrl else {
        return nil
    }
    try! encodedImage.write(to: localImageUrl, atomically: true, encoding: String.Encoding.utf8)
    guard let fileContents = try? String(contentsOf: localImageUrl) else {
        return nil
    }
    let dataDecoded: Data = Data(base64Encoded: fileContents, options: .ignoreUnknownCharacters)!
    let decodedImage = UIImage(data: dataDecoded)
    return Image(uiImage: decodedImage ?? UIImage(named: "placeholder")!)
}
import SwiftUI

struct UploadTesting: View {
    @ObservedObject var crViewModel = ChatroomsViewModel()
    @State var showImagePicker = false
    @State var selectedImage: Image? = nil
    @State var convertedImage: Image?

    var body: some View {
        VStack {
            Button("Select") {
                showImagePicker.toggle()
            }
            selectedImage?.resizable().scaledToFit().frame(width: 300, height: 300)
            Button("Convert") {
                convertedImage = crViewModel.convertImage(image: selectedImage)
            }
            convertedImage?.resizable().scaledToFit().frame(width: 200, height: 200).background(Color.red)
        }
        .sheet(isPresented: $showImagePicker) {
            ImagePicker(image: $selectedImage)
        }
    }
}
Video of issue: https://streamable.com/h0uqly
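Since the reported surface (9072 by 12198) is larger than common GPU texture limits, one workaround worth trying (an assumption, not a confirmed fix) is to downscale the image with UIGraphicsImageRenderer before encoding. The maxSide value below is arbitrary and illustrative:

```swift
import UIKit

// Sketch: downscale before JPEG encoding so the intermediate surface
// stays within typical GPU texture limits. maxSide is an arbitrary
// illustrative cap, not a documented constant.
func jpegDataDownscaled(_ image: UIImage, maxSide: CGFloat = 4096,
                        quality: CGFloat = 0.8) -> Data? {
    let largest = max(image.size.width, image.size.height)
    guard largest > maxSide else { return image.jpegData(compressionQuality: quality) }
    let scale = maxSide / largest
    let newSize = CGSize(width: image.size.width * scale,
                         height: image.size.height * scale)
    let renderer = UIGraphicsImageRenderer(size: newSize)
    let resized = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
    return resized.jpegData(compressionQuality: quality)
}
```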
Hello everyone, it's a nice holiday.
I changed pixel values of a UIImage to produce another image.
The code is this:
func normalize(_ image: UIImage) -> UIImage {
    let width = 320
    let height = 240
    let bytesPerComponent = 4
    let channel = 1
    let imageSize = width * height * bytesPerComponent
    guard Int(image.size.width) == width,
          Int(image.size.height) == height,
          image.cgImage?.bitsPerComponent == 8,
          image.cgImage?.bitsPerPixel == 8
    else { // the input must match: width x height, UInt8, 1 channel
        fatalError("\(#function) : you should fix the image settings")
    }
    let cgimage = image.cgImage
    guard let originalData = cgimage?.dataProvider?.data,
          let original = CFDataGetBytePtr(originalData)
    else {
        fatalError("\(#function) : getting the original buffer failed.")
    }
    imageData = UnsafeMutablePointer<Float32>.allocate(capacity: width * height)
    guard let imageData = imageData else { fatalError("\(#function) : unknown error") }
    // 0 ... 255 (UInt8) -> -1.0 ... 1.0 (Float32)
    let normValue: Float32 = 127.5
    for w in 0..<width {
        for h in 0..<height {
            imageData[h * width + w] = (Float32(original[h * width + w]) - normValue) / normValue
        }
    }
    guard let provider = CGDataProvider(data: Data(bytesNoCopy: imageData,
                                                   count: imageSize,
                                                   deallocator: .none) as CFData)
    else {
        fatalError("\(#function) : provider load failed.")
    }
    guard let cgImage = CGImage(width: width,
                                height: height,
                                bitsPerComponent: bytesPerComponent * 8,
                                bitsPerPixel: bytesPerComponent * channel * 8,
                                bytesPerRow: bytesPerComponent * channel * width,
                                space: CGColorSpaceCreateDeviceGray(),
                                bitmapInfo: CGBitmapInfo(rawValue: 0),
                                provider: provider,
                                decode: nil,
                                shouldInterpolate: false,
                                intent: .defaultIntent)
    else {
        fatalError("\(#function) : creating the CGImage failed with an unknown error")
    }
    return UIImage(cgImage: cgImage)
}
This code runs every frame and sometimes produces this log, and the logs pile up:
[Mars] mapData:843: *** ImageIO - mmapped file changed (old: 6782 new: 6875)
I don't know what the "843" means, and the numbers at the tail of the log change every time.
I don't know whether it is a warning, an error, or something else.
The code works well and there seems to be no problem, but I worry about the flood of logs and possible threading problems; it just feels wrong. Has anyone seen this log?
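One plausible reading (an assumption, since the log is undocumented): "mmapped file changed" suggests the backing file of a memory-mapped image changed on disk while still mapped. If the source images come from files, reading the bytes eagerly (rather than memory-mapping) decouples the UIImage from the file. A sketch:

```swift
import UIKit

// Sketch: load the file's bytes up front instead of memory-mapping,
// so later changes to the file on disk cannot invalidate the mapping.
// The URL parameter is illustrative; the original post does not show
// where its images are loaded from.
func loadImageEagerly(at url: URL) -> UIImage? {
    // Data(contentsOf:) without .mappedIfSafe copies the bytes eagerly.
    guard let data = try? Data(contentsOf: url) else { return nil }
    return UIImage(data: data)
}
```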
I'm going insane over this. I'm using Xcode 13 beta 6 on an M1 MacBook Air. Every time I pick an image, I receive this error message. At this point I don't know what to do because the message is so vague. Please help me.
@Binding var image: UIImage

func makeUIViewController(context: Context) -> UIImagePickerController {
    let imagePicker = UIImagePickerController()
    imagePicker.delegate = context.coordinator
    return imagePicker
}

func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {
}

func makeCoordinator() -> Coordinator {
    Coordinator(parent: self)
}

final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    var parent: PhotoPicker

    init(parent: PhotoPicker) {
        self.parent = parent
    }

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            parent.image = image
        }
        picker.dismiss(animated: true, completion: nil)
    }
}
}

@State private var isPresented: Bool = false
@State private var avatarImage = UIImage(systemName: "person") ?? UIImage()

var body: some View {
    VStack {
        Image(uiImage: avatarImage)
            .resizable()
            .scaledToFill()
            .frame(width: 150, height: 150)
            .clipShape(Circle())
            .padding()
            .onTapGesture {
                isPresented = true
            }
        Spacer()
    }
    .navigationTitle("Profile")
    .sheet(isPresented: $isPresented) {
        PhotoPicker(image: $avatarImage)
    }
}
}
In my singleton class I use [UIImage imageNamed:] inside a dispatch_once block, and it sometimes crashes. I have thought about it for many days. My image is stored in my app (not in Assets, but in my custom resource folder). It's unbelievable; can anyone tell me why?
This is my code:
static dispatch_once_t onceToken;
static AClass *manager = nil;
dispatch_once(&onceToken, ^{
    manager = [[AClass alloc] init];
    dispatch_async(dispatch_get_global_queue(0, 0), ^{
        NSMutableArray *imgDataList = [NSMutableArray array];
        for (int i = 0; i <= 15; i++) {
            NSString *imageName = [NSString stringWithFormat:@"imageNamedxx_%d@2x.png", i];
            UIImage *img = [UIImage imageNamed:imageName];
            if (img) {
                [imgDataList addObject:img];
            }
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            manager.liveImgData = [imgDataList copy];
        });
    });
});
return manager;
}
1   libobjc.A.dylib   objc_exception_throw
2   CoreFoundation    -[CFPrefsSearchListSource addManagedSourceForIdentifier:user:]
3   Foundation        -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
4   UIKitCore         -[_UIImageCGImageContent initWithCGImage:scale:]
5   UIKitCore         -[CUINamedImage(UIKitAdditions) UIImageWithAsset:configuration:flippedHorizontally:optionalVectorImage:]
6   UIKitCore         -[UIImageAsset imageWithConfiguration:]
7   UIKitCore         -[_UIPathLazyImageAsset imageWithConfiguration:]
8   UIKitCore         +[UIImage imageNamed:inBundle:withConfiguration:]
    myApp             AClass - line 42 ([UIImage imageNamed:])
                      __34+[AClass shareInstance]_block_invoke_2 + 42