IOSurface


Share hardware-accelerated buffer data across multiple processes and frameworks using IOSurface.


Posts under IOSurface tag

9 Posts
Post not yet marked as solved
2 Replies
445 Views
(more details on StackOverflow) I'm getting messages like the following, SOMETIMES, when I draw to a CGContext:

```
IOSurface creation failed: e00002c2 parentID: 00000000 properties: {
    IOSurfaceAddress = 5207703552;
    IOSurfaceAllocSize = 9461418;
    IOSurfaceCacheMode = 0;
    IOSurfaceName = CMPhoto;
    IOSurfacePixelFormat = 1246774599;
}
```

Call to context.draw():

```swift
context.draw(photo.image, in: CGRect(x: 0, y: top, width: width, height: height), byTiling: false)
```

The results are just fine, so the draw seems to be working. It also, most often, draws without producing this error, but it fails pretty often. I'm not sure where to begin looking to sort out what I might need to do differently to avoid this error message in the console.

Complete code:

```swift
import Foundation
import SwiftUI

func generateSpritesImage(thumbPhotos: [Photo], width: Int, filename: URL) -> [Int] {
    var indices = [Int]()
    let totalHeight = thumbPhotos.reduce(0) { $0 + $1.heightOfImage(ofWidth: width) }
    debugPrint("creating context")
    let context = CGContext(data: nil,
                            width: width,
                            height: totalHeight,
                            bitsPerComponent: 8,
                            bytesPerRow: 0,
                            space: CGColorSpace(name: CGColorSpace.sRGB)!,
                            bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue)!
    var top = totalHeight
    for photo in thumbPhotos {
        let height = photo.heightOfImage(ofWidth: width)
        indices.append(top - totalHeight)
        top -= height
        debugPrint("drawing \(photo.filteredFileURL())")
        context.draw(photo.image, in: CGRect(x: 0, y: top, width: width, height: height), byTiling: false)
    }
    debugPrint("write jpeg")
    writeJpegFromContext(context: context, filename: filename)
    return indices
}

func writeJpegFromContext(context: CGContext, filename: URL) {
    let cgImage = context.makeImage()!
    let bitmapRep = NSBitmapImageRep(cgImage: cgImage)
    let jpegData = bitmapRep.representation(using: NSBitmapImageRep.FileType.jpeg, properties: [:])!
    try! jpegData.write(to: filename)
}
```

Sample of output:

```
"drawing 0002-_MG_8542.jpg"
"drawing 0003-_MG_8545.jpg"
"drawing 0004-_MG_8550.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5211357184; IOSurfaceAllocSize = 9983331; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0005-_MG_8555.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5221351424; IOSurfaceAllocSize = 10041215; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0006-_MG_8562.jpg"
"drawing 0007-_MG_8563.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5376163840; IOSurfaceAllocSize = 10109756; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0008-_MG_8584.jpg"
"drawing 0009-_MG_8618.jpg"
IOSurface creation failed: e00002c2 parentID: 00000000 properties: { IOSurfaceAddress = 5394612224; IOSurfaceAllocSize = 8425564; IOSurfaceCacheMode = 0; IOSurfaceName = CMPhoto; IOSurfacePixelFormat = 1246774599; }
"drawing 0010-_MG_8627.jpg"
"drawing 0011-_MG_8649.jpg"
"drawing 0012-_MG_8658.jpg"
"drawing 0013-_MG_8665.jpg"
"drawing 0014-_MG_8677.jpg"
"drawing 0015-_MG_8675.jpg"
"drawing 0016-_MG_8676.jpg"
"drawing 0017-IMGP0873.jpg"
"drawing 0018-_MG_8719.jpg"
"drawing 0019-_MG_8743.jpg"
...
```
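One direction to probe, offered only as a sketch and not a confirmed fix: the failing allocations are tagged CMPhoto, which appears to come from the image-decoding path behind photo.image, so wrapping each draw in an autoreleasepool may let those transient decode buffers go away between iterations. This reuses Photo, photo.image, and the surrounding variables from the code above.

```swift
// Sketch only, not a confirmed fix: the loop from generateSpritesImage above,
// with each iteration wrapped in an autoreleasepool so any transient decode
// buffers behind photo.image (the "CMPhoto" surfaces in the log) can be
// released between photos instead of piling up for the whole run.
for photo in thumbPhotos {
    autoreleasepool {
        let height = photo.heightOfImage(ofWidth: width)
        indices.append(top - totalHeight)
        top -= height
        context.draw(photo.image,
                     in: CGRect(x: 0, y: top, width: width, height: height),
                     byTiling: false)
    }
}
```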
Posted by MCargal. Last updated.
Post not yet marked as solved
0 Replies
423 Views
I have this code to create an IOSurface from a bitmap image:

```objc
auto src = loadSource32f();                 // rgba 32-bit float image
const auto desc = src->getDescriptor();     // metadata for that image
auto pixelFmt = CGMTLBufferManager::getCVPixelFormat( desc.channelBitDepth, desc.channelOrder ); // returns proper `RGfA`
int width = static_cast<int>( desc.width );
int height = static_cast<int>( desc.height );
int trowbytes = static_cast<int>( desc.trueRowbytes() ); // returns proper rowbytes value

CFMutableDictionaryRef properties = CFDictionaryCreateMutable( kCFAllocatorDefault, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks );
CFDictionarySetValue( properties, kIOSurfaceWidth, CFNumberCreate( kCFAllocatorDefault, kCFNumberIntType, &width ) );
CFDictionarySetValue( properties, kIOSurfaceHeight, CFNumberCreate( kCFAllocatorDefault, kCFNumberIntType, &height ) );
CFDictionarySetValue( properties, kIOSurfacePixelFormat, CFNumberCreate( kCFAllocatorDefault, kCFNumberIntType, &pixelFmt ) );
CFDictionarySetValue( properties, kIOSurfaceBytesPerRow, CFNumberCreate( kCFAllocatorDefault, kCFNumberIntType, &trowbytes ) );

NSDictionary *nsprops = ( __bridge NSDictionary * )properties;
IOSurface *oSurface = [[IOSurface alloc] initWithProperties:nsprops];
CFRelease( properties );
ASSERT_TRUE( oSurface );
auto ioSurface = (IOSurfaceRef) oSurface;
```

I tested that the pixels are properly written into the IOSurface:

```objc
// copy data to surface
memcpy([oSurface baseAddress], src->getRawPtr(), src->getSizeInBytes());
auto surfPtr = (uint8_t*)[oSurface baseAddress];
// extract raw surface data and write it into a file
saveOutputRaw(desc, surfPtr, getFileName("IOSurfaceTestSurfaceRaw"));
```

And I see this: (screenshot in the original post)

Now I want to create an MTLTexture based on the IOSurface:

```objc
// create texture
auto fmt = IOSurfaceGetPixelFormat( ioSurface );
auto w = IOSurfaceGetWidth( ioSurface );
auto h = IOSurfaceGetHeight( ioSurface );
auto rowbytes = IOSurfaceGetBytesPerRow( ioSurface );
MTLTextureDescriptor *textureDescriptor = [MTLTextureDescriptor texture2DDescriptorWithPixelFormat:CGMTLBufferManager::getMTLPixelFormat( fmt )
                                                                                              width:w
                                                                                             height:h
                                                                                          mipmapped:NO];
textureDescriptor.usage = MTLTextureUsageShaderRead | MTLTextureUsageShaderWrite;
textureDescriptor.storageMode = MTLStorageModeShared;
auto device = MTLCreateSystemDefaultDevice();
id<MTLTexture> surfaceTex = [device newTextureWithDescriptor:textureDescriptor iosurface:ioSurface plane:0];
```

And now I want to test this:

```objc
auto region = MTLRegionMake2D(0, 0, w, h);
auto bufSize = [oSurface allocationSize];
// get texture bytes
auto outBuf2 = std::vector<uint8_t>(bufSize);
[surfaceTex getBytes:outBuf2.data() bytesPerRow:rowbytes fromRegion:region mipmapLevel:0];
// save to file
saveOutputRaw(desc, outBuf2.data(), getFileName("IOSurfaceTestCreateTex"));
// get bytes
saveOutputRaw(desc, surfPtr, getFileName("IOSurfaceTestCreateRaw"));
```

And I get this result: (screenshot in the original post)

I also tried replaceRegion and a blit encoder copyFromTexture:toTexture:, as well as a managed texture with syncing, but the result is always the same: only the first 22 pixels get filled and the rest is transparent. I have no idea what I'm missing. Please help.
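Not a confirmed diagnosis, but one thing worth ruling out is a mismatch between the rowbytes passed in and the bytesPerRow the IOSurface actually ends up with, since IOSurface can require row alignment. Below is a Swift sketch (the poster's code is Objective-C++; makeAlignedSurface and all parameter names are illustrative) that aligns the properties with IOSurfaceAlignProperty and copies the source row by row using the surface's reported bytesPerRow.

```swift
import Foundation
import IOSurface

// Sketch only: create an IOSurface with aligned row/allocation sizes and copy
// pixel data row by row using the surface's *actual* bytesPerRow, which may
// differ from width * bytesPerPixel. All names here are placeholders for the
// values in the question (e.g. bytesPerPixel would be 16 for 32-bit float RGBA).
func makeAlignedSurface(width: Int, height: Int,
                        bytesPerPixel: Int,
                        pixelFormat: OSType,
                        srcBase: UnsafeRawPointer,
                        srcRowBytes: Int) -> IOSurfaceRef? {
    let alignedRowBytes = IOSurfaceAlignProperty(kIOSurfaceBytesPerRow, width * bytesPerPixel)
    let allocSize = IOSurfaceAlignProperty(kIOSurfaceAllocSize, alignedRowBytes * height)

    let props: [CFString: Any] = [
        kIOSurfaceWidth: width,
        kIOSurfaceHeight: height,
        kIOSurfacePixelFormat: pixelFormat,
        kIOSurfaceBytesPerElement: bytesPerPixel,
        kIOSurfaceBytesPerRow: alignedRowBytes,
        kIOSurfaceAllocSize: allocSize
    ]
    guard let surface = IOSurfaceCreate(props as CFDictionary) else { return nil }

    // Lock, copy each row separately (dst rows may be wider than src rows), unlock.
    guard IOSurfaceLock(surface, [], nil) == 0 else { return nil }   // 0 == kIOReturnSuccess
    let dstBase = IOSurfaceGetBaseAddress(surface)
    let dstRowBytes = IOSurfaceGetBytesPerRow(surface)
    for row in 0..<height {
        memcpy(dstBase + row * dstRowBytes,
               srcBase + row * srcRowBytes,
               min(srcRowBytes, dstRowBytes))
    }
    _ = IOSurfaceUnlock(surface, [], nil)
    return surface
}
```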
Posted by BartW. Last updated.
Post marked as solved
1 Reply
352 Views
Hi! I have a mid-2015 MacBook running Xcode 14.2 and an iPhone 11 Pro with iOS 17.1. How can I deploy my app to the App Store if I can't run a test on my device? The phone is on iOS 17.1, but my Xcode doesn't have the iOS 17 SDK because my MacBook is too old to run a newer Xcode. Thank you.
Posted by etavares. Last updated.
Post not yet marked as solved
0 Replies
336 Views
I purchased an Apple Developer Program membership. I enrolled and paid Apple for a one-year membership; they sent me a confirmation email, and the active subscription shows under Purchases and in the Apple Developer app. But they did not give me App Store Connect access. What is the point of buying the membership if I can't publish my app on the App Store? I attached a screenshot as proof that I have an active membership, yet I still have no App Store Connect access. I contacted Apple Developer Support by phone and by email, and they did not reply with any solution. They are not refunding my money, and they have not given me the product I paid for. This feels like a scam.
Posted. Last updated.
Post not yet marked as solved
0 Replies
364 Views
I've watched this issue for a long time, but it seems it hasn't been fixed yet. My use case is assigning a UIView to the contents property of SCNMaterialProperty. It works fine in terms of rendering, but when I assign nil to the property, the IOSurface memory that SceneKit allocated is not destroyed. I've searched around, and many other developers have been hit by this issue. I did a 'Game Memory' profiling run on a toy example, and the 134 MB allocated by SceneKit had not been released after I assigned nil. I'm sure I released every relevant UIView and view controller used for the contents.
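A possible workaround to try, offered only as a sketch and not a fix for the underlying leak: instead of handing SceneKit the live UIView, snapshot the view into a UIImage and assign that, so SceneKit never has to keep a live-view backing surface alive. This assumes a static snapshot of the view is acceptable for the material; setMaterialContents is an illustrative helper name.

```swift
import UIKit
import SceneKit

// Sketch of a workaround: render the view into a plain UIImage and use that
// as the material contents, instead of the UIView itself.
func setMaterialContents(_ material: SCNMaterial, from view: UIView) {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    let snapshot = renderer.image { context in
        view.layer.render(in: context.cgContext)
    }
    material.diffuse.contents = snapshot   // plain UIImage, no live-view surface

    // Clearing later works the same way:
    // material.diffuse.contents = nil
}
```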
Posted. Last updated.
Post not yet marked as solved
0 Replies
2.0k Views
To create a custom keyboard in iOS, you need to add a new target in Xcode and build a custom keyboard extension. Follow these steps:

1. Open Xcode and create a new project.
2. Choose "Application" under the "iOS" tab and select "Keyboard Extension" from the list of templates.
3. Choose a name for your keyboard and click "Finish" to create the project. Xcode creates a new target for your keyboard extension.
4. Open the "MainInterface.storyboard" file in your project and design your keyboard layout.
5. Add new keys to the keyboard by dragging them from the "Object Library" onto the keyboard view.
6. Customize the appearance and behavior of each key by adding IBActions and IBOutlets to your view controller code.
7. Test your keyboard by running the app on a device or simulator.
8. To enable your keyboard on your device, go to "Settings" > "General" > "Keyboard" > "Keyboards" > "Add New Keyboard" and select your custom keyboard.
9. Once your custom keyboard appears in the list of keyboards, you can switch to it by tapping the globe icon on the iOS keyboard.

You can add further features to your keyboard, such as autocorrection, predictive text, and gestures; refer to Apple's documentation on custom keyboard extensions. Once you are satisfied with your keyboard, you can submit it to the App Store for others to download and use. That's it! With these steps, you can create a custom keyboard in iOS and customize it to meet your needs. Visit the blog for more details: https://blog.yudiz.com/custom-keyboard-extension-in-ios-app-development/
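To make the view-controller part of the steps above concrete, here is a minimal sketch of a keyboard extension's UIInputViewController. The key title and layout are illustrative, not part of any template; it wires one key to textDocumentProxy and adds the required "next keyboard" (globe) button.

```swift
import UIKit

// Minimal keyboard-extension view controller sketch.
class KeyboardViewController: UIInputViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // One example key that types a character.
        let keyButton = UIButton(type: .system)
        keyButton.setTitle("A", for: .normal)
        keyButton.translatesAutoresizingMaskIntoConstraints = false
        keyButton.addTarget(self, action: #selector(keyTapped), for: .touchUpInside)
        view.addSubview(keyButton)

        // The "next keyboard" (globe) button every custom keyboard must offer.
        let nextKeyboardButton = UIButton(type: .system)
        nextKeyboardButton.setTitle("🌐", for: .normal)
        nextKeyboardButton.translatesAutoresizingMaskIntoConstraints = false
        nextKeyboardButton.addTarget(self, action: #selector(handleInputModeList(from:with:)), for: .allTouchEvents)
        view.addSubview(nextKeyboardButton)

        NSLayoutConstraint.activate([
            keyButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            keyButton.centerYAnchor.constraint(equalTo: view.centerYAnchor),
            nextKeyboardButton.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 16),
            nextKeyboardButton.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -8)
        ])
    }

    @objc private func keyTapped() {
        // Insert text into whatever text field the keyboard is attached to.
        textDocumentProxy.insertText("A")
    }
}
```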
Posted. Last updated.
Post not yet marked as solved
4 Replies
2.1k Views
I'm trying to send an IOSurfaceRef across an NSXPCConnection on macOS 10.13, and I'm having trouble with the solution that was provided in the forum thread "Efficiently sending data from an XPC process to the host application" (https://developer.apple.com/forums/thread/126716). From that thread:

> However, that specific problem got resolved on 10.12 where we introduced a new Objective-C IOSurface object, and that object is transportable directly over NSXPCConnection. So double yay!

But it doesn't seem to work. I have a very simple service protocol that includes

```objc
- (void)sendFrame:(IOSurfaceRef)frame;
```

along with some basic NSString-sending methods that successfully transfer across my NSXPCConnection. I have a valid (non-NULL) IOSurface in my app that I send to my helper app with sendFrame, and when the call is executed in the helper, the resulting frame is always NULL.

On the other hand, I've also tried creating an IOSurface with the (deprecated) kIOSurfaceIsGlobal property and sending the IOSurface's ID instead with

```objc
- (void)sendFrameID:(uint32_t)frameID;
```

and

```objc
[_service sendFrameID:IOSurfaceGetID(surface)];
```

On the helper app side, I look up the ID to get an IOSurfaceRef:

```objc
IOSurfaceRef frame = IOSurfaceLookup(frameID);
```

and that works correctly: I get a valid IOSurface and can see the same pixel contents in both the app and the helper.

So what is meant by the new IOSurface object in 10.12 being "transportable directly" over NSXPCConnection? How is it supposed to work? I'm specifically interested in no-copy transfer. Thanks!
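A sketch of how "transportable directly" is usually read, with illustrative names throughout (FrameService, FrameReceiver): the protocol method takes the Objective-C IOSurface object, which adopts NSSecureCoding, rather than a raw IOSurfaceRef, so NSXPCConnection can encode it like any other secure-coded argument. This is not a confirmed explanation of the NULL frame in the question, just the pattern the quoted thread appears to describe, shown in Swift.

```swift
import Foundation
import IOSurface

// The protocol carries the Objective-C IOSurface *object* (NSSecureCoding),
// not the CF-style IOSurfaceRef, so NSXPCConnection knows how to encode it.
@objc protocol FrameService {
    func sendFrame(_ frame: IOSurface)
}

// Helper-app side: receive the surface and use it directly.
final class FrameReceiver: NSObject, FrameService {
    func sendFrame(_ frame: IOSurface) {
        // Same pixel storage the sender sees; no copy is implied by the transfer.
        print("received surface \(frame.width)x\(frame.height), \(frame.allocationSize) bytes")
    }
}

// App side: send an IOSurface object over an already-established connection.
func send(_ surface: IOSurface, over connection: NSXPCConnection) {
    (connection.remoteObjectProxy as? FrameService)?.sendFrame(surface)
}
```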
Posted by diverdi. Last updated.
Post not yet marked as solved
0 Replies
652 Views
```swift
extension MTLTexture {
    func toCVPixelBufferInBGRA8() -> CVPixelBuffer? {
        let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                     kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
        var pixelBufferOut: Unmanaged<CVPixelBuffer>?
        if let iosurface = self.iosurface {
            CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, iosurface, attrs, &pixelBufferOut)
        }
        guard let pixelBuffer = pixelBufferOut?.takeRetainedValue() else { return nil }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        let width = vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer))
        let height = vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer))
        let srcBytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let srcBaseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
        var srcBuffer = vImage_Buffer(data: srcBaseAddress, height: height, width: width, rowBytes: srcBytesPerRow)

        if let src = srcBaseAddress?.assumingMemoryBound(to: UInt16.self) {
            var n = 0
            n = 1
            print("srcBaseAddress", src[0 + 4 * n], src[1 + 4 * n], src[2 + 4 * n], src[3 + 4 * n])
            n = 1334
            print("srcBaseAddress", src[0 + 4 * n], src[1 + 4 * n], src[2 + 4 * n], src[3 + 4 * n])
        }

        let dstBytesPerRow = Int(width) * 5
        guard let dstBaseAddress = malloc(Int(height) * dstBytesPerRow) else { return nil }
        var dstBuffer = vImage_Buffer(data: dstBaseAddress, height: height, width: width, rowBytes: dstBytesPerRow)
        let error = vImageConvert_ARGB2101010ToARGB8888(&srcBuffer, &dstBuffer, 0, 1023, [0, 1, 2, 3], vImage_Flags(kvImageNoFlags))
        guard error == kvImageNoError else {
            free(dstBaseAddress)
            return nil
        }

        var dstCVPixelBuffer: CVPixelBuffer?
        let releaseCallback: CVPixelBufferReleaseBytesCallback = { _, pointer in
            if let pointer = pointer {
                free(UnsafeMutableRawPointer(mutating: pointer))
            }
        }
        guard CVPixelBufferCreateWithBytes(nil, Int(width), Int(height), kCVPixelFormatType_32BGRA, dstBaseAddress, dstBytesPerRow, releaseCallback, nil, attrs, &dstCVPixelBuffer) == kCVReturnSuccess else {
            free(dstBaseAddress)
            return nil
        }

        let dst = dstBaseAddress.assumingMemoryBound(to: UInt8.self)
        var n = 0
        n = 1
        print("dstBaseAddress", dst[0 + 4 * n], dst[1 + 4 * n], dst[2 + 4 * n], dst[3 + 4 * n])
        n = 1334
        print("dstBaseAddress", dst[0 + 4 * n], dst[1 + 4 * n], dst[2 + 4 * n], dst[3 + 4 * n])
        print("")
        return dstCVPixelBuffer
    }
}
```

I have a few print statements, and you can use the output to see the color values. When the Metal texture is red and the pixel format is kCVPixelFormatType_40ARGBLEWideGamut, the color value is B=30144 G=25216 R=54592 A=57216. After the conversion, the color in kCVPixelFormatType_32BGRA is 255 126 13 128, but I do not know which of those is A, R, G, B.
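On the final question: in a kCVPixelFormatType_32BGRA buffer the bytes of each pixel are stored in memory in the order B, G, R, A. A tiny sketch using the sample values printed above (assuming the destination buffer really is laid out as 32BGRA):

```swift
// Byte order for kCVPixelFormatType_32BGRA is B, G, R, A in memory, so
// dst[0], dst[1], dst[2], dst[3] map to blue, green, red, alpha.
// The values below are the ones printed in the question.
let pixel: [UInt8] = [255, 126, 13, 128]
let (b, g, r, a) = (pixel[0], pixel[1], pixel[2], pixel[3])
print("B=\(b) G=\(g) R=\(r) A=\(a)")   // B=255 G=126 R=13 A=128
```

If the source texture really is red, a large value in the blue byte suggests the channels may be getting permuted somewhere in the conversion (for example by the permuteMap argument), which could be worth checking.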
Posted by woodfung. Last updated.