Post not yet marked as solved
Do we have any API to upload app screenshots and metadata, or to create experiments with screenshots and metadata, for App Store Connect?
This is a macOS app. When I use Image("my file url") in my CoverImageView, I can successfully export a .png file that contains the Image("my file url") view.
However, this doesn't happen when I use Image(nsImage: NSImage(data: imageData.imagedata)). Here imageData.imagedata is a Data value that is set when I tap a button and pick an image (shown later in my code).
When I select an image, it is visible in my app's view.
However, when I save this view (CoverImageView) to a .png file, it contains only the blue rectangle; the picked image is gone!
Here is the CoverImageView from which I want to create a .png file:
struct CoverImageView: View {
    @EnvironmentObject var imageData: ImageData

    var body: some View {
        ZStack {
            Rectangle()
                .frame(width: 512, height: 600)
                .foregroundColor(.blue)
            Image(nsImage: NSImage(data: imageData.imagedata) ?? NSImage(byReferencing: URL(fileURLWithPath: "")))
                .resizable()
                .aspectRatio(contentMode: .fit)
                .frame(width: 512, height: 512)
        }
    }
}
This is the main view, PhotoTestView:
struct PhotoTestView: View {
    @State private var imageUrl: URL = URL(fileURLWithPath: "")
    @EnvironmentObject var imageData: ImageData

    var body: some View {
        VStack {
            CoverImageView()
            Divider()
                .frame(width: 1024)
            HStack {
                Button(action: {
                    if let openURL = ImageProcess().showOpenPanel() {
                        imageUrl = openURL
                        if let codedImages = try? Data(contentsOf: openURL) {
                            imageData.imagedata = codedImages
                        }
                    }
                }, label: {
                    Image(systemName: "doc.badge.plus")
                })
                Button(action: {
                    ImageProcess().saveImage()
                }, label: {
                    Image(systemName: "square.and.arrow.down")
                })
            }.padding()
        }
    }
}
The View extension that creates a PNG from a view:
extension View {
    func imageRepresentation(rect: CGRect) -> NSBitmapImageRep? {
        let hosting = NSHostingView(rootView: self)
        hosting.setFrameSize(rect.size)
        hosting.setBoundsSize(rect.size)
        hosting.layout()
        hosting.layerContentsRedrawPolicy = .onSetNeedsDisplay
        hosting.setNeedsDisplay(rect)
        if let imageRepresentation = hosting.bitmapImageRepForCachingDisplay(in: rect) {
            hosting.cacheDisplay(in: rect, to: imageRepresentation)
            return imageRepresentation
        }
        return nil
    }

    func asImage(rect: CGRect) -> NSImage? {
        if let cgImage = imageRepresentation(rect: rect)?.cgImage {
            return NSImage(cgImage: cgImage, size: rect.size)
        }
        return nil
    }

    func asPngData(rect: CGRect) -> Data? {
        return imageRepresentation(rect: rect)?.representation(using: .png, properties: [:])
    }
}
PNG file reader and save:
struct ImageProcess {
    func showOpenPanel() -> URL? {
        let openPanel = NSOpenPanel()
        openPanel.allowedContentTypes = [.image]
        openPanel.allowsMultipleSelection = false
        openPanel.canChooseDirectories = false
        openPanel.canChooseFiles = true
        let response = openPanel.runModal()
        return response == .OK ? openPanel.url : nil
    }

    func saveURL() -> URL? {
        let savePanel = NSSavePanel()
        savePanel.allowedContentTypes = [.png]
        savePanel.canCreateDirectories = true
        savePanel.isExtensionHidden = false
        savePanel.allowsOtherFileTypes = false
        savePanel.title = "Save your image"
        savePanel.message = "Choose a folder and a name to store your image."
        savePanel.nameFieldLabel = "File name:"
        let response = savePanel.runModal()
        return response == .OK ? savePanel.url : nil
    }

    func saveImage() {
        let view = CoverImageView().environmentObject(ImageData())
        let imageData = view.asPngData(rect: CGRect(x: 0, y: 0, width: 1024, height: 768))
        if let url = saveURL() {
            try? imageData!.write(to: url)
        }
        // print(imageData)
    }
}
Could you help me?
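For what it's worth, one thing that stands out in saveImage() above is that it renders a brand-new CoverImageView with a freshly constructed ImageData(), whose imagedata is presumably empty, rather than the instance the on-screen view observes. A minimal sketch of passing the shared model through instead (assuming ImageData is the ObservableObject used above, and reusing the asPngData extension from the post):

```swift
// Hypothetical variant of saveImage() that receives the shared model
// instead of constructing an empty ImageData().
func saveImage(using imageData: ImageData) {
    let view = CoverImageView().environmentObject(imageData)
    // Render at the same size the view actually uses on screen.
    guard let pngData = view.asPngData(rect: CGRect(x: 0, y: 0, width: 512, height: 600)),
          let url = saveURL() else { return }
    try? pngData.write(to: url)
}
```

The call site in PhotoTestView would then be ImageProcess().saveImage(using: imageData), so the exported view sees the same picked image data as the one on screen.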
Hi,
I'm working on building a Mac app in Swift that makes batch conversions between the .openexr and .png file formats, in both directions. I would like to know what kind of library I could use. I found that the Mac system can directly convert .openexr into other formats by right-clicking the OpenEXR file. I would also like to know whether the conversion can be done in reverse with some supported library. Thanks.
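Since ImageIO is what powers the Finder's built-in conversion, it may be worth checking whether CGImageSource/CGImageDestination already handle OpenEXR on your target macOS version before reaching for a third-party library (the OpenEXR reference library being the obvious alternative). A sketch, assuming UTType.exr appears among CGImageDestinationCopyTypeIdentifiers() on your system, which you should verify:

```swift
import ImageIO
import UniformTypeIdentifiers

// Convert a single file between formats via ImageIO.
// destType would be UTType.png or UTType.exr; EXR *writing* support is an
// assumption here and should be checked against CGImageDestinationCopyTypeIdentifiers().
func convert(_ src: URL, to dst: URL, as destType: UTType) -> Bool {
    guard let source = CGImageSourceCreateWithURL(src as CFURL, nil),
          let dest = CGImageDestinationCreateWithURL(dst as CFURL,
                                                     destType.identifier as CFString,
                                                     1, nil) else { return false }
    CGImageDestinationAddImageFromSource(dest, source, 0, nil)
    return CGImageDestinationFinalize(dest)
}
```

Batch conversion would then just be this function mapped over the contents of a folder.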
Converting a UIImage to a CIImage loses every element: position, rotation, scale, etc.
I implemented a video editor, so I added pan, rotate, and pinch gesture handling to a UIImageView. When I save the video, I convert the UIImageView's image to a CIImage, but everything is lost.
Please help.
CIFilter *filter = [CIFilter filterWithName:@"CIAdditionCompositing"];
UIImageView *imageView = self.subviews[0];
CIImage *ciImage = [CIImage imageWithCGImage:imageView.image.CGImage];
_playerItem.videoComposition = [AVVideoComposition
    videoCompositionWithAsset:_playerItem.asset
    applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *_Nonnull request) {
        if (filter == nil) {
        } else {
            CIImage *image = request.sourceImage.imageByClampingToExtent;
            [filter setDefaults];
            [filter setValue:image forKey:@"inputBackgroundImage"];
            [filter setValue:ciImage forKey:@"inputImage"];
            CIImage *outputImage = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];
            [request finishWithImage:outputImage context:nil];
        }
    }];
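Converting a UIImage to a CIImage only captures the bitmap; the gesture-driven position, rotation, and scale live on the UIImageView's transform and frame, not in the image itself. One hedged approach is to bake the view's transform into the CIImage before handing it to the filter. A sketch (coordinate flipping between UIKit's top-left origin and Core Image's bottom-left origin is glossed over and must be handled for real video output):

```swift
import UIKit
import CoreImage

// Sketch: carry a UIImageView's on-screen transform over to Core Image.
// A real implementation must also flip/translate into the video's
// coordinate space, since Core Image's origin is bottom-left.
func ciImage(from imageView: UIImageView) -> CIImage? {
    guard let cgImage = imageView.image?.cgImage else { return nil }
    var image = CIImage(cgImage: cgImage)
    // Scale the bitmap to the view's bounds first.
    let sx = imageView.bounds.width / image.extent.width
    let sy = imageView.bounds.height / image.extent.height
    image = image.transformed(by: CGAffineTransform(scaleX: sx, y: sy))
    // Then apply the gesture transform (rotation/pinch) and the position.
    image = image.transformed(by: imageView.transform)
    return image.transformed(by: CGAffineTransform(translationX: imageView.frame.minX,
                                                   y: imageView.frame.minY))
}
```

The resulting CIImage can then be used as the inputImage in the compositing handler above.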
Hello, I have a crash that I don't know how to solve. It appears on any model and any system, 2-3 times in almost every version.
Because I am responsible for the module that crashed and have had no solution until today, I am asking for your help and hope that the Apple engineers can give me an answer.
Here is the latest crash stack:
iPhone 13 Pro Max
15.3.1
0 CoreFoundation _CFGetTypeID + 148
1 ImageIO _CGImageSourceCopyPropertiesAtIndex + 164
2 ImageIO _CGImageSourceCopyPropertiesAtIndex + 164
3 UIKitCore __UIImageGetOrientationAndScale + 68
4 UIKitCore _ImageSourceAtPath + 332
5 UIKitCore __UIImageSourceAtPath + 316
6 UIKitCore -[UIImage initWithContentsOfFile:cache:] + 72
7 UIKitCore +[UIImage imageWithContentsOfFile:] + 60
8 sohunews +[UIImage(themeImage) readImageFromDisk:context:] (SNThemeManager.m:0)
9 sohunews +[UIImage(themeImage) snImageNamed:] (SNThemeManager.m:0)
10 sohunews -[SNRollingNewsVideoCell addFlameAnimationImage] (SNRollingNewsVideoCell.m:0)
11 sohunews -[SNRollingNewsVideoCell initflameAnimation] (SNRollingNewsVideoCell.m:1082)
12 sohunews -[SNRollingNewsVideoCell initWithStyle:reuseIdentifier:] (SNRollingNewsVideoCell.m:169)
13 sohunews -[SNNewsPageTableDataSource tableView:cellForRowAtIndexPath:] (SNNewsPageTableDataSource.m:635)
14 UIKitCore -[UITableView _createPreparedCellForGlobalRow:withIndexPath:willDisplay:] + 1536
17 UIKitCore -[UITableView _visibleCellsUsingPresentationValues:] + 452
18 sohunews -[SNNormalChannelView handleTimer:] (SNNormalChannelView.m:0)
19 sohunews -[SNNormalChannelView viewDidAppear:] (SNNormalChannelView.m:1231)
20 sohunews -[SNNewsPageView collectionView:willDisplayCell:forItemAtIndexPath:] (SNNewsPageView.m:315)
21 UIKitCore -[UICollectionView _notifyWillDisplayCellIfNeeded:forIndexPath:] + 160
27 UIKitCore -[UIView(Hierarchy) layoutBelowIfNeeded] + 552
28 sohunews -[SNRollingNewsViewController reloadData] (SNRollingNewsViewController.m:0)
29 sohunews -[SNRollingNewsViewController updateChannels:] (SNRollingNewsViewController.m:2284)
30 CoreFoundation ___CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__ + 28
31 CoreFoundation ____CFXRegistrationPost_block_invoke + 52
32 CoreFoundation __CFXRegistrationPost + 456
33 CoreFoundation __CFXNotificationPost + 716
34 Foundation -[NSNotificationCenter postNotificationName:object:userInfo:] + 96
35 sohunews +[SNNotificationManager postNotificationName:object:] (SNNotificationManager.m:0)
36 sohunews __28-[SNChannelModel doRequest:]_block_invoke (SNChannelModel.m:0)
37 sohunews -[SNRequestManager requestSucceeded:responseObject:] (SNRequestManager.m:0)
38 sohunews __124-[AFHTTPSessionManager dataTaskWithHTTPMethod:URLString:parameters:headers:uploadProgress:downloadProgress:success:failure:]_block_invoke_2 (AFHTTPSessionManager.m:290)
39 sohunews __72-[AFURLSessionManagerTaskDelegate URLSession:task:didCompleteWithError:]_block_invoke_2.108 (AFURLSessionManager.m:238)
40 libdispatch.dylib __dispatch_call_block_and_release + 32
41 libdispatch.dylib __dispatch_client_callout + 20
42 libdispatch.dylib __dispatch_main_queue_callback_4CF + 1036
43 CoreFoundation ___CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
44 CoreFoundation ___CFRunLoopRun + 2540
45 CoreFoundation _CFRunLoopRunSpecific + 600
46 GraphicsServices _GSEventRunModal + 164
47 UIKitCore -[UIApplication _run] + 1100
48 UIKitCore _UIApplicationMain + 364
49 sohunews main (main.m:15)
50 ??? 0x0000000104488000 + 0
I hope to hear from you soon!
Best wishes!
Is it possible to render an MTKView in different color spaces? For example, if I wanted to render in CMYK, is there a way to convert the colors to that color space on every frame before presenting on screen, or is that something that needs to be handled in a shader?
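As far as I know, CMYK is not usable as a drawable pixel format, so a CMYK preview would have to be simulated per-pixel in a fragment shader. What MTKView does expose, on macOS, is a colorspace property (backed by CAMetalLayer.colorspace) for RGB-based spaces, which controls how the system interprets the drawable at presentation time. A sketch of that layer-level route, assuming macOS:

```swift
import MetalKit

// Sketch: ask the system to interpret the drawable in a wider RGB space.
// CMYK itself is not renderable; a CMYK soft-proof would need a
// CMYK -> RGB conversion in the fragment shader before presenting.
func configure(_ view: MTKView) {
    view.colorPixelFormat = .rgba16Float
    view.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
}
```

On iOS, where MTKView has no colorspace property, the shader route is the only option.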
Hi. I'd like to be able to do a flood fill on images, either UIImage or CGImage, and was wondering whether there is a built-in way to do this in Apple's standard frameworks, i.e., take a bitmap image, specify a point and a color, and fill the enclosed area with that color, no matter what shape it is.
I've seen a few examples of algorithms that do this, but they're quite large and complicated, so I'm trying to avoid them.
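To my knowledge there is no built-in flood fill in UIKit, Core Graphics, or Core Image, so some form of the classic algorithm is unavoidable; it can be fairly compact, though. A sketch over a raw RGBA8 buffer (width * height * 4 bytes), which assumes you have already drawn the CGImage into such a buffer with a CGContext:

```swift
// Sketch: stack-based 4-way flood fill on an RGBA8 pixel buffer.
func floodFill(_ pixels: inout [UInt8], width: Int, height: Int,
               x: Int, y: Int, with color: (UInt8, UInt8, UInt8, UInt8)) {
    func idx(_ x: Int, _ y: Int) -> Int { (y * width + x) * 4 }
    let s = idx(x, y)
    let target = (pixels[s], pixels[s + 1], pixels[s + 2], pixels[s + 3])
    guard target != color else { return }
    var stack = [(x, y)]
    while let (px, py) = stack.popLast() {
        let i = idx(px, py)
        // Only recolor pixels that still match the seed color.
        guard (pixels[i], pixels[i + 1], pixels[i + 2], pixels[i + 3]) == target else { continue }
        (pixels[i], pixels[i + 1], pixels[i + 2], pixels[i + 3]) = color
        if px > 0 { stack.append((px - 1, py)) }
        if px + 1 < width { stack.append((px + 1, py)) }
        if py > 0 { stack.append((px, py - 1)) }
        if py + 1 < height { stack.append((px, py + 1)) }
    }
}
```

For anti-aliased edges you would compare against a color tolerance instead of exact equality, which is where the published implementations gain most of their size.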
I recently purchased an iPhone, coming from Android, and placed all my documents on iCloud. In the Files app, I tap on some JPEG files in my Documents folder. They flash about 5 times; each time, the image shows briefly, but after that no image is shown. Sometimes they don't even flash: the Files app only shows the loader with the word LOADING below it, and no preview appears. The iOS version is 15.4.1.
One thing that differentiates a file with the problem from one without is the file size: the JPEGs with issues on my physical device are much larger, on the order of 20 to 60 MB.
If I start to share a file and choose Save to iPhone instead, I can view it normally in the Photos app, but these are document images I do not want in Photos; for the sake of organization, they belong in a folder with other document types.
Interestingly, on my MacBook Pro I open an iOS simulator (in this case a 15.4 iPod touch), follow exactly the same steps, and all JPEG files are shown in the Files app preview without any issue.
This is an annoying bug, as I was unable to open my image documents when I needed them, and such a pricey product as an iPhone should not fail at something as basic as showing a JPEG file.
These files are not corrupt in any way: I can open them normally in Photos, on my MacBook Pro, and everywhere else, including the simulator on a previous iOS release and an Android device.
Some EXIF entries require float content as a string; for example, FNumber often looks like "2.8" or "1.4". When I edit EXIF data using CGImageDestinationCopyImageSource, most of the EXIF entries that require such float-like content are truncated to their first character.
To stay with the example above, the EXIF dictionary is written with "2" or "1"; the dot and the rest of the string are truncated. 🤯
Expected: "2.8"
Actual: "2"
All other EXIF data is written correctly (I use about 40 fields), but the following keys are reduced to an integer-style value:
kCGImagePropertyExifFNumber
kCGImagePropertyExifFocalLength
kCGImagePropertyGPSAltitude
kCGImagePropertyGPSImgDirection
kCGImagePropertyGPSSpeed
I am using this code to set the entries:
let metaData = CGImageMetadataCreateMutable()
if !CGImageMetadataSetValueMatchingImageProperty(metaData, kCGImagePropertyExifDictionary, kCGImagePropertyExifFNumber, 2.8 as CFTypeRef) {
    print("kCGImagePropertyExifFNumber EXIF not written")
}
let destOptions: [String: AnyObject] = [
    kCGImageDestinationMergeMetadata as String: NSNumber(value: 1),
    kCGImageDestinationMetadata as String: metaData
]
if !CGImageDestinationCopyImageSource(destination, cgImgSource, destOptions as CFDictionary, nil) {
    print("Error making CGImageDestinationCopyImageSource")
}
The dictionary being created shows me the correct values, but when I open the updated JPG, I only see the truncated floating point numbers.
I tried it with
2.8 as CFString
"2.8" as CFString
2.8 as CFTypeRef
"2.8" as CFTypeRef
2.8 as CFNumber
2.8 as NSNumber
2.8 as NSString
Only with kCGImagePropertyGPSLongitude and kCGImagePropertyGPSLatitude does this behavior not occur.
if !CGImageMetadataSetValueMatchingImageProperty(metaData, kCGImagePropertyGPSDictionary, kCGImagePropertyGPSLatitude, 53.997853 as CFTypeRef) {
    print("kCGImagePropertyGPSLatitude GPS not written")
}
Using CGImageDestinationCreateWithData does not do this and writes the data correctly.
I find that strange, because most photos aren't taken with integer values. So either I have found a bug or I am doing something wrong. What's going on here? Is there some kind of rounding rule?
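One workaround worth trying is to bypass CGImageMetadata entirely and write the EXIF values through a plain image-properties dictionary with CGImageDestinationAddImageFromSource, which is the same kind of path the working CGImageDestinationCreateWithData case exercises. A hedged sketch (destination and cgImgSource as in the code above):

```swift
import ImageIO

// Sketch: re-add the image with an EXIF properties dictionary instead of
// going through CGImageMetadataSetValueMatchingImageProperty.
let exif: [CFString: Any] = [
    kCGImagePropertyExifFNumber: 2.8,
    kCGImagePropertyExifFocalLength: 50.0   // placeholder value
]
let props: [CFString: Any] = [kCGImagePropertyExifDictionary: exif]
CGImageDestinationAddImageFromSource(destination, cgImgSource, 0, props as CFDictionary)
if !CGImageDestinationFinalize(destination) {
    print("Finalize failed")
}
```

The trade-off is that this path re-encodes the image data, unlike CGImageDestinationCopyImageSource, which rewrites metadata losslessly.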
In my Xamarin iOS app (iOS 15 and later), I try to save EXIF data by creating a destination with CGImageDestination.Create() and adding the changed metadata there. The data is displayed correctly in the console, but after saving with NSData.Save the data is no longer in the EXIF when I print it again. I tried changing CGImageProperties.ExifUserComment and TIFFImageDescription, but neither is saved. Is there a limitation on which EXIF data can be saved, or am I doing something wrong? Here is my example code, where I change the existing metadata, set it in the EXIF dictionary, and then save it with NSData.Save().
var img = UIImage.FromFile(file);
NSData ns = img.AsJPEG();
var dicMetadata = ns.ExtractMetaDataFromImageData();
NSMutableDictionary dicDescription = (NSMutableDictionary)dicMetadata.ObjectForKey(ImageIO.CGImageProperties.ExifDictionary);
if (dicDescription == null) {
    dicDescription = new NSMutableDictionary();
}
dicDescription.SetValueForKey(FromObject(bf.name.Trim()), ImageIO.CGImageProperties.ExifUserComment);
dicMetadata.SetValueForKey(dicDescription, ImageIO.CGImageProperties.ExifDictionary);
File.Delete(file);
var imgSrc = ImageIO.CGImageSource.FromData(ns);
var outImageData = new NSMutableData();
using (var d = ImageIO.CGImageDestination.Create(outImageData, imgSrc.TypeIdentifier, 1, new ImageIO.CGImageDestinationOptions())) {
    if (d == null) {
        Console.Write("could not generate dest");
    }
    d.AddImage(imgSrc, 0, dicMetadata);
    d.Close();
}
NSError writeError;
var imageSaved = outImageData.Save(file, NSDataWritingOptions.Atomic, out writeError);
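For comparison, here is the same write expressed directly in Swift with ImageIO; if this path round-trips the user comment on the device, the issue is likely in how the Xamarin binding or the UIImage.AsJPEG() re-encode handles metadata, rather than an EXIF limitation. The file URLs and comment text are placeholders:

```swift
import ImageIO
import UniformTypeIdentifiers

// Sketch: rewrite a JPEG with an EXIF UserComment via ImageIO.
func writeUserComment(_ comment: String, from src: URL, to dst: URL) -> Bool {
    guard let source = CGImageSourceCreateWithURL(src as CFURL, nil),
          let dest = CGImageDestinationCreateWithURL(dst as CFURL,
                                                     UTType.jpeg.identifier as CFString,
                                                     1, nil) else { return false }
    let props: [CFString: Any] = [
        kCGImagePropertyExifDictionary: [kCGImagePropertyExifUserComment: comment]
    ]
    CGImageDestinationAddImageFromSource(dest, source, 0, props as CFDictionary)
    return CGImageDestinationFinalize(dest)
}
```

Note that reading the file back with CGImageSourceCopyPropertiesAtIndex is the reliable way to verify the comment actually landed in the output.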
Here's the basic code of my ToDo app:
Data:
struct ToDo: Codable {
    var title: String
    var isCompleted: Bool
    var dateCreated: Date
    var notes: String

    static let DocumentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    static let ArchiveURL = DocumentsDirectory.appendingPathComponent("todos").appendingPathExtension("plist")

    static func loadToDos() -> [ToDo]? {
        guard let codedToDos = try? Data(contentsOf: ArchiveURL) else { return nil }
        let propertyListDecoder = PropertyListDecoder()
        return try? propertyListDecoder.decode(Array<ToDo>.self, from: codedToDos)
    }

    static func saveToDos(_ todos: [ToDo]) {
        let propertyListEncoder = PropertyListEncoder()
        let codedToDos = try? propertyListEncoder.encode(todos)
        try? codedToDos?.write(to: ArchiveURL, options: .noFileProtection)
    }
}
I have a table view controller that displays an item's detail; it has a noteTextView (a UITextView) for editing/adding todo.notes. A camera button allows the user to insert images into the text view's notes using NSTextAttachment():
internal func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    let attachment = NSTextAttachment()
    let image = info[.originalImage] as! UIImage
    attachment.image = image
    // Resize the photo to fit in noteTextView: calculate a new size, leaving a little space to the right of the image
    let newImageWidth = (noteTextView.bounds.size.width - 20)
    let scale = newImageWidth / image.size.width
    let newImageHeight = image.size.height * scale
    attachment.bounds = CGRect(x: 0, y: 0, width: newImageWidth, height: newImageHeight)
    let imageString = NSAttributedString(attachment: attachment)
    // Insert this attributed string at the cursor position
    noteTextView.textStorage.insert(imageString, at: noteTextView.selectedRange.location)
    picker.dismiss(animated: true, completion: nil)
}
The code works well. Now, how do I save it (with the images) to the Documents directory, and how do I load it back? I can already save/load without images.
Thanks.
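Because the images live inside noteTextView.attributedText as NSTextAttachments, one approach is to persist the attributed string itself as RTFD data, which embeds the images; the resulting Data value round-trips through the existing Codable/plist pipeline. A sketch, assuming a hypothetical notesData: Data? field is added to ToDo alongside notes:

```swift
import UIKit

// Sketch: serialize/deserialize an attributed string (with image
// attachments) as RTFD data, suitable for storing in a plist-encoded model.
func archiveNotes(_ textView: UITextView) -> Data? {
    let text = textView.attributedText ?? NSAttributedString()
    return try? text.data(from: NSRange(location: 0, length: text.length),
                          documentAttributes: [.documentType: NSAttributedString.DocumentType.rtfd])
}

func restoreNotes(_ data: Data, into textView: UITextView) {
    textView.attributedText = try? NSAttributedString(
        data: data,
        options: [.documentType: NSAttributedString.DocumentType.rtfd],
        documentAttributes: nil)
}
```

With 20+ MB images, storing the RTFD blobs as separate files in the Documents directory and keeping only file names in the plist would keep loadToDos() fast.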
Since the release of iOS 16, many users of our app have complained that many WebP images are suddenly broken, which makes the app unusable.
We use Google's WebP library to decode WebP data: we create a CGDataProviderRef from the WebPDecoderConfig output using the MODE_RGB colorspace and feed it to CGImageCreate.
On the iOS 16 beta, CGImageCreate just returns nil, which is unexpected.
Only if we use WebPDecoderConfig with MODE_rgbA does it work correctly and return a valid image ref. On iOS versions before 16, it works correctly with exactly the same image.
Will Apple fix this issue?
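Pending a fix, one workaround is to stop relying on a 24-bit, alpha-less MODE_RGB buffer and instead hand CGImageCreate a 32-bit-per-pixel buffer with an explicit alpha setting, which is effectively what the working MODE_rgbA path does. A sketch (decodedBuffer, width, and height are assumed to come from your WebPDecode output):

```swift
import CoreGraphics

// Sketch: wrap a decoded RGBA8888 buffer (e.g. WebP MODE_rgbA output)
// in a CGImage; iOS 16 appears stricter about 24-bit alpha-less layouts.
func makeImage(from decodedBuffer: CFData, width: Int, height: Int) -> CGImage? {
    guard let provider = CGDataProvider(data: decodedBuffer) else { return nil }
    return CGImage(width: width, height: height,
                   bitsPerComponent: 8, bitsPerPixel: 32,
                   bytesPerRow: width * 4,
                   space: CGColorSpaceCreateDeviceRGB(),
                   bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue),
                   provider: provider, decode: nil,
                   shouldInterpolate: false, intent: .defaultIntent)
}
```

Whether the MODE_RGB rejection is an intentional tightening or a regression is something only Apple can answer; filing a Feedback with a sample image seems worthwhile either way.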
I know this is a strange and easy question, but I just can't find any reference.
I'm using SwiftUI and want to:
export to PDF (the best I can find is "Creating PDFs on macOS without UIKit")
export to an image
export to HTML
export to everything possible
BONUS: print the page
I can only find UIKit-based examples, and they are all outdated. And, if possible, how do I integrate ShareLink to share these (PDF, image, HTML, etc.) and write them to disk using NSSavePanel?
Even better: how do I do this on all of the platforms?
(I know I'm greedy 😂)
Thanks for any help 🙏
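On iOS 16 / macOS 13 and later, ImageRenderer covers the image and PDF cases without any UIKit/AppKit view hosting, and the resulting file URL can be handed straight to ShareLink or an NSSavePanel. A sketch of the PDF case (HTML export has no SwiftUI equivalent that I know of):

```swift
import SwiftUI

// Sketch: render any SwiftUI view to a PDF file (iOS 16 / macOS 13+).
@MainActor
func exportPDF<V: View>(_ view: V, to url: URL, size: CGSize) {
    let renderer = ImageRenderer(content: view.frame(width: size.width, height: size.height))
    renderer.render { rendered, draw in
        var box = CGRect(origin: .zero, size: rendered)
        guard let ctx = CGContext(url as CFURL, mediaBox: &box, nil) else { return }
        ctx.beginPDFPage(nil)
        draw(ctx)          // draw the SwiftUI content into the PDF context
        ctx.endPDFPage()
        ctx.closePDF()
    }
}
```

For plain image export, the same ImageRenderer exposes cgImage (and uiImage/nsImage per platform), and ShareLink(item: url) shares the written file on every platform that has the share sheet.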
I use AVCapturePhotoOutput to take still pictures.
I print the metadata with the following code:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard let data = photo.fileDataRepresentation() else { return }
    let ciImage = CIImage(data: data)
    print(ciImage?.properties)
}
It yields the following CompositeImage value:
CompositeImage = 2
However, CompositeImage is always 2, even when HDR is turned off as shown below:
device.automaticallyAdjustsVideoHDREnabled = false
device.isVideoHDREnabled = false
The expected value is 1.
Could you help me find the cause of this?
I want to store the number of images composited with HDR in kCGImagePropertyExifSourceImageNumberOfCompositeImage, but I don't know how to get the composite count.
Could you please tell me how to do that?
I want to add GPS location data to the EXIF of a UIImage; that is, I want to write latitude and longitude as EXIF GPS information in the image.
How should I implement it?
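EXIF/GPS fields can't be attached to a UIImage object itself; they live in the encoded file, so the usual route is to write the image data through ImageIO with a GPS dictionary. A sketch (note the hemisphere reference fields, since the numeric values are stored unsigned):

```swift
import UIKit
import ImageIO
import UniformTypeIdentifiers

// Sketch: encode a UIImage as JPEG with embedded GPS metadata.
func jpegData(from image: UIImage, latitude: Double, longitude: Double) -> Data? {
    guard let cgImage = image.cgImage else { return nil }
    let gps: [CFString: Any] = [
        kCGImagePropertyGPSLatitude: abs(latitude),
        kCGImagePropertyGPSLatitudeRef: latitude >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude: abs(longitude),
        kCGImagePropertyGPSLongitudeRef: longitude >= 0 ? "E" : "W"
    ]
    let props: [CFString: Any] = [kCGImagePropertyGPSDictionary: gps]
    let data = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil)
    else { return nil }
    CGImageDestinationAddImage(dest, cgImage, props as CFDictionary)
    return CGImageDestinationFinalize(dest) ? data as Data : nil
}
```

Be aware that going through image.cgImage re-encodes the photo; to preserve an original capture's other EXIF fields, you would instead create a CGImageSource from the original file data and merge the GPS dictionary into its existing properties.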
Hello developers,
I have an original image that is an ordinary one, and I set it in an image view like this:
imageview.image = UIImage(cgImage: image)
Then I set the same image as a SceneKit background like this:
scnview.scene.background.contents = image
I use the same image in two different views, and the one in the SCNView is darker than the original, and I don't know why.
I found someone on Google with the same problem, but there is no answer:
(https://stackoverflow.com/questions/60679819/adding-uiimageview-to-arscnview-scene-background-causes-saturation-hue-to-be-off)
I checked the SCNView's pixel format: it was bgra8unorm.
What's the problem??