I set the device format and color space to Apple Log and turned off HDR, so why is the movie output still in HDR format rather than ProRes Log?
Full runnable demo here:
https://github.com/SpaceGrey/ColorSpaceDemo
session.sessionPreset = .inputPriority

// Get the back camera.
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .back)
backCamera = deviceDiscoverySession.devices.first!

try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false

// Pick a format that supports Apple Log and make it active.
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// Add the movie file output.
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080, "AVVideoCodecKey": apch, <----- ProRes is enabled.
"AVVideoCompressionPropertiesKey": {
    AverageBitRate = 220029696;
    ExpectedFrameRate = 30;
    PrepareEncodedSampleBuffersForPaddedWrites = 1;
    PrioritizeEncodingSpeedOverQuality = 0;
    RealTime = 1;
}]
*/
previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
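For reference, a sketch of what I could try next: requesting the ProRes codec explicitly on the movie output instead of relying on the session to derive it from the device format. This is just my assumption of a workaround, not a confirmed fix; `output` and `connection` are the same objects as above, and `apch` in the log corresponds to ProRes 422 HQ.

// Assumption: explicitly setting the codec on the connection, after
// verifying it is supported, instead of leaving it to session defaults.
let keys = output.supportedOutputSettingsKeys(for: connection)
if keys.contains(AVVideoCodecKey),
   output.availableVideoCodecTypes.contains(.proRes422HQ) {
    output.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.proRes422HQ], for: connection)
}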
I am using the DepthAnything v2 model provided by Apple on the developer website. On my iPhone 15 Pro, if I choose .all or .cpuAndNeuralEngine, it gets stuck loading the model:
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU // Works normally when the Neural Engine is not used.
let model = try await DepthModel.load(configuration: config)
and fails with the following error (the embedded Chinese message translates to "Unable to communicate with the helper program"):
E5RT encountered an STL exception. msg = MILCompilerForANE error: failed to compile ANE model using ANEF. Error=无法与帮助程序通信。.
E5RT: MILCompilerForANE error: failed to compile ANE model using ANEF. Error=无法与帮助程序通信。 (11)
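As a stopgap, I'm considering loading with a fallback. This is only a sketch under my assumptions: DepthModel is the Xcode-generated model class from the snippet above, and retrying on .cpuAndGPU after an ANE failure is my own workaround, not a documented fix.

import CoreML

// Try the preferred compute units first; if ANE compilation fails,
// retry on CPU+GPU only (the configuration that loads normally above).
func loadDepthModel() async throws -> DepthModel {
    let preferred = MLModelConfiguration()
    preferred.computeUnits = .all
    do {
        return try await DepthModel.load(configuration: preferred)
    } catch {
        let fallback = MLModelConfiguration()
        fallback.computeUnits = .cpuAndGPU
        return try await DepthModel.load(configuration: fallback)
    }
}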
I added the alternate icon names to the build settings of my app. Then I also added the icon to my asset catalog.
The app builds and works normally, but I noticed that when I changed the icon to 'AppIcon-Defaults', the icon-change alert didn't appear, although the change succeeded.
When I uploaded the archive to the App Store, I got:
ITMS-90895: Missing Icon - The Info.plist key CFBundleIcons.CFBundleAlternateIcons contains an entry “AppIcon-Defaults” that references asset “AppIcon-Defaults.” No such asset is present in the asset catalog.
Then I used xcrun --sdk iphoneos assetutil --info Assets.car to check the assets inside the archived app, and it does contain the asset.
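For context, this is roughly how I switch icons at runtime, using the standard UIApplication API (a sketch; the icon name matches my asset catalog above):

import UIKit

// Assumption: "AppIcon-Defaults" is listed under CFBundleAlternateIcons.
// Passing nil selects the primary icon instead; only true alternates
// belong in CFBundleAlternateIcons.
func setIcon(named name: String?) {
    guard UIApplication.shared.supportsAlternateIcons else { return }
    UIApplication.shared.setAlternateIconName(name) { error in
        if let error {
            print("Icon change failed: \(error.localizedDescription)")
        }
    }
}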
I extracted the gain map info from an image using
let url = Bundle.main.url(forResource: "IMG_1181", withExtension: "HEIC")
let source = CGImageSourceCreateWithURL(url! as CFURL, nil)
let portraitData = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source!, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as! [AnyHashable : Any]
let metaData = portraitData[kCGImageAuxiliaryDataInfoMetadata] as! CGImageMetadata
Then I printed all the metadata tags:
func printMetadataProperties(from metadata: CGImageMetadata) {
    guard let tags = CGImageMetadataCopyTags(metadata) as? [CGImageMetadataTag] else {
        return
    }
    for tag in tags {
        if let prefix = CGImageMetadataTagCopyPrefix(tag) as String?,
           let namespace = CGImageMetadataTagCopyNamespace(tag) as String?,
           let key = CGImageMetadataTagCopyName(tag) as String?,
           let value = CGImageMetadataTagCopyValue(tag) {
            print("Namespace: \(namespace), Key: \(key), Prefix: \(prefix), value: \(value)")
        }
    }
}
//Namespace: http://ns.apple.com/ImageIO/1.0/, Key: hasXMP, Prefix: iio, value: True
//Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapVersion, Prefix: HDRGainMap, value: 131072
//Namespace: http://ns.apple.com/HDRGainMap/1.0/, Key: HDRGainMapHeadroom, Prefix: HDRGainMap, value: 3.586325
I want to create a new CGImageMetadata with the same tags, but when it comes to the HDR tags, adding them to the metadata always fails:
let tag = CGImageMetadataTagCreate(
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    "HDRGainMapHeadroom" as CFString,
    .default,
    3.56 as CFNumber
)
let path = "HDRGainMap:HDRGainMapHeadroom" as CFString
let success = CGImageMetadataSetTagWithPath(metadata, nil, path, tag!) // Always false.
Setting the hasXMP tag works fine. Is the HDRGainMap namespace private to Apple?
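One thing I'm considering (my assumption, not a confirmed answer): a custom XMP namespace may need to be registered on a mutable metadata object before CGImageMetadataSetTagWithPath accepts a path using its prefix. A minimal sketch:

import ImageIO

var error: Unmanaged<CFError>?
let mutableMetadata = CGImageMetadataCreateMutable()

// Register the HDRGainMap namespace/prefix pair before using it in a path.
if CGImageMetadataRegisterNamespaceForPrefix(
    mutableMetadata,
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    &error
), let tag = CGImageMetadataTagCreate(
    "http://ns.apple.com/HDRGainMap/1.0/" as CFString,
    "HDRGainMap" as CFString,
    "HDRGainMapHeadroom" as CFString,
    .default,
    3.56 as CFNumber
) {
    let ok = CGImageMetadataSetTagWithPath(mutableMetadata, nil, "HDRGainMap:HDRGainMapHeadroom" as CFString, tag)
    print("Tag set: \(ok)")
}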
I've requested photo library authorization in my main app:
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }
and added the privacy description to both the main app and the extension. But no matter whether the device is locked or unlocked, when I call
let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let count = fetchResult.count
the count is always zero, even after a new photo is saved to the album in the same session.
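As a diagnostic, this sketch checks the authorization status as seen from the extension process before fetching. My assumption is that the extension may report a different status than the main app, since the status is evaluated per process:

import Photos

let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
print("Extension auth status: \(status.rawValue)")
if status == .authorized || status == .limited {
    let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    print("Asset count: \(fetchResult.count)")
}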
In my SwiftUI view, I try to load the image from data:
var body: some View {
    Group {
        if let data = model.detailImageData, let uiimage = UIImage(data: data) { // No memory issue.
            Image(uiImage: uiimage)
                .resizable()
                .scaledToFit()
        }
    }
}
But I want to get the HDR version of my image, so I use:
if let data = model.detailImageData, let uiimage = UIImageReader.default.image(data: data) { // Memory leaks!
When I change the data, the memory of the previous image is never freed, which eventually causes my app to crash. You can see this in the Instruments screenshot.
I use this code to show the image in HDR in SwiftUI:
struct HDRImageView: UIViewRepresentable {
    // Set up a common reader for all UIImage read requests.
    static let reader: UIImageReader = {
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        return UIImageReader(configuration: config)
    }()

    let data: Data?
    let enableHDR: Bool

    func makeUIView(context: Context) -> UIImageView {
        let view = UIImageView()
        view.preferredImageDynamicRange = enableHDR ? .high : .standard
        update(view)
        // Set this view to fit itself to the parent view.
        view.setContentCompressionResistancePriority(.defaultLow, for: .horizontal)
        view.setContentCompressionResistancePriority(.defaultLow, for: .vertical)
        view.setContentHuggingPriority(.required, for: .horizontal)
        view.setContentHuggingPriority(.required, for: .vertical)
        return view
    }

    func updateUIView(_ view: UIImageView, context: Context) {
        update(view)
    }

    func update(_ view: UIImageView) {
        autoreleasepool { // Not working.
            if let data = data {
                view.image = nil // Setting to nil first doesn't help either.
                view.image = HDRImageView.reader.image(data: data)
            } else {
                view.image = nil
            }
            view.preferredImageDynamicRange = enableHDR ? .high : .standard
        }
    }
}
But when I update the input data, the old image data never seems to be freed. After several updates, the app takes up too much memory and crashes. Instruments shows that VM: ImageIO_Surface_Data and VM: Image IO are what take up the memory.
If I change HDRImageView to a plain Image(uiImage: UIImage(data:)), the issue no longer occurs.
Is this a memory leak, and how can I solve it?
Update: I then tried Image(_:cgImage), and the result appears to be the same.
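One workaround I'm considering (my assumption being that the retained IOSurface data is tied to the reader's lifetime, which is not confirmed behavior): create a short-lived UIImageReader per decode instead of sharing a static one, so any cache the reader holds can be released with it:

// Hypothetical helper: a fresh reader per decode, so IOSurface-backed
// caches tied to the reader do not outlive this call.
func decodeHDRImage(from data: Data) -> UIImage? {
    var config = UIImageReader.Configuration()
    config.prefersHighDynamicRange = true
    let reader = UIImageReader(configuration: config)
    return reader.image(data: data)
}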
I have an idea that uses the IMU in the AirPods, but on the challenge website I can't find any information about whether this is allowed.