Post not yet marked as solved
We are spending a lot of time optimizing our Metal work for discrete graphics cards and external GPUs with eGPU support. Is this something Apple Silicon will support, since Thunderbolt 3 won't be part of these machines? I am worried.
I have seen this question come up a few times here on the Apple Developer forums (recently noted here - https://developer.apple.com/forums/thread/655505), though I still find myself unsure of what technology and steps are required to achieve the goal.
In general, my colleague and I are trying to use Apple's Visualizing a Point Cloud Using Scene Depth - https://developer.apple.com/documentation/arkit/visualizing_a_point_cloud_using_scene_depth sample project from WWDC 2020 and save the rendered point cloud as a 3D model. I've seen this achieved (there are quite a few samples of the final exports available on popular 3D modeling websites), but remain unsure how to do so.
From what I can ascertain, Model I/O seems like an ideal framework choice: create an empty MDLAsset and append an MDLObject for each point to, finally, end up with a model ready for export.
How would one go about converting each "point" to an MDLObject to append to the MDLAsset? Or am I going down the wrong path?
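One possible approach, rather than one MDLObject per point, is to pack every point into a single MDLMesh with no submeshes and add that to the asset. This is only a sketch under the assumption that the points are available as an array of SIMD3&lt;Float&gt; pulled from the sample's particle buffer:

```swift
import Foundation
import ModelIO

// Sketch: build one MDLMesh whose vertex buffer holds all points,
// instead of appending an MDLObject per point.
func makePointCloudAsset(points: [SIMD3<Float>]) -> MDLAsset {
    let allocator = MDLMeshBufferDataAllocator()
    let data = Data(bytes: points,
                    count: points.count * MemoryLayout<SIMD3<Float>>.stride)
    let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

    // Describe a single float3 position attribute.
    let descriptor = MDLVertexDescriptor()
    descriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                  format: .float3,
                                                  offset: 0,
                                                  bufferIndex: 0)
    descriptor.layouts[0] = MDLVertexBufferLayout(stride: MemoryLayout<SIMD3<Float>>.stride)

    // No submeshes: the mesh is just a cloud of vertices.
    let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                       vertexCount: points.count,
                       descriptor: descriptor,
                       submeshes: [])

    let asset = MDLAsset()
    asset.add(mesh)
    return asset
}
```

The asset could then be written out with asset.export(to:) to a URL of your choosing; note that some export formats may expect submeshes, so a point-friendly format (e.g. PLY) is worth trying first.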
I notice that when I open the Photos app on my iPhone 12 Pro, photos and videos shot in HDR appear brighter than the overall display brightness level.
On macOS, there are APIs like edrMetadata on CAMetalLayer and maximumExtendedDynamicRangeColorComponentValue on NSScreen.
I did see CAMetalLayer.wantsExtendedDynamicRangeContent, but I'm not sure if this does what I'm looking for.
The "Using Color Spaces to Display HDR Content" - https://developer.apple.com/documentation/metal/drawable_objects/displaying_hdr_content_in_a_metal_layer/using_color_spaces_to_display_hdr_content?language=objc documentation page describes setting the .colorspace on the CAMetalLayer for BT2020_PQ content, but it's not clear if this is referring to macOS or iOS. Is that the right way to get colors to be "brighter" than 1.0 on "XDR" mobile displays?
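For reference, one possible iOS-side setup following that documentation page might look like the sketch below. This is untested guesswork; the exact availability of these properties varies by OS release:

```swift
import MetalKit

// Sketch: configure an MTKView's backing CAMetalLayer for HDR content,
// assuming the color-space approach from the linked documentation page.
func configureForHDR(_ view: MTKView) {
    guard let layer = view.layer as? CAMetalLayer else { return }
    // Use an extended-range or 16-bit float pixel format so values > 1.0 survive.
    layer.pixelFormat = .rgba16Float
    // Tag the content as PQ BT.2020 / BT.2100, as the doc page describes.
    layer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)
    if #available(iOS 16.0, *) {
        layer.wantsExtendedDynamicRangeContent = true
    }
}
```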
I have CVPixelBuffers in kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange, which is 10-bit HDR. I need to convert these to RGB and display them in an MTKView. I need to know the correct pixel format to use, the BT.2020 conversion matrix, and how to display the 10-bit RGB pixel buffer in an MTKView.
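For the matrix part, the ITU-R BT.2020 constants (Kr = 0.2627, Kb = 0.0593) give the conversion below. This is a plain-Swift reference sketch for a single 10-bit video-range sample; the same constants would be transcribed into the Metal shader that does the real per-pixel work:

```swift
import simd

// Reference conversion: 10-bit video-range BT.2020 YCbCr -> linear-light-relative RGB.
// Luma spans 64...940, chroma 64...960 centered at 512.
func bt2020ToRGB(y: UInt16, cb: UInt16, cr: UInt16) -> SIMD3<Float> {
    let yn  = (Float(y)  -  64.0) / 876.0
    let cbn = (Float(cb) - 512.0) / 896.0
    let crn = (Float(cr) - 512.0) / 896.0

    // Derived from Kr = 0.2627, Kb = 0.0593 (ITU-R BT.2020).
    let r = yn + 1.4746 * crn
    let g = yn - 0.16455 * cbn - 0.57135 * crn
    let b = yn + 1.8814 * cbn
    return SIMD3<Float>(r, g, b)
}
// Sanity check: reference white bt2020ToRGB(y: 940, cb: 512, cr: 512) ≈ (1, 1, 1)
```

Note this produces non-linear (transfer-encoded) RGB; the HLG/PQ transfer function still has to be handled separately when displaying.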
Hello,
We recently noticed that copying pixel data from a Metal texture to memory is a lot slower on the new iPhones equipped with the A14 Bionic.
We tracked down the guilty function on MTLTexture and found that getBytes(_:bytesPerRow:from:mipmapLevel:) runs 8 to 20 times slower than on two-year-old iPhones (iPhone XR). To measure how long it takes, we used signposts.
We've created a demo project where we convert a MTLTexture to a CVPixelBuffer: https://github.com/alikaragoz/UsingARenderPipelineToRenderPrimitives
The interesting part is located at this line: https://github.com/alikaragoz/UsingARenderPipelineToRenderPrimitives/blob/41f7f4385a490e889b94ee2c8913ce532a43aacb/Renderer/MetalUtils.swift#L40
Do you have any idea what could be the issue?
Given six KTX, ASTC-compressed textures -- all equal in size and attributes -- a.ktx, b.ktx, c.ktx, d.ktx, e.ktx, and f.ktx, I can embed them in the bundle and then create a working cube via:
let cube = MDLTexture(cubeWithImagesNamed: [ "a.ktx", "b.ktx", "c.ktx", "d.ktx", "e.ktx", "f.ktx"])
This can be assigned to background.contents and works great.
If, on the other hand, I have loaded those six textures from some other source into six separate MTLTextures, I cannot provide them as an array to background.contents (it fails with "image at index 0 is NULL"). I have attempted to create a cube MTLTexture with the appropriate MTLTextureDescriptor.textureCubeDescriptor (using the pixel format and other attributes from the source textures), then copying the data via MTLBlitCommandEncoder; however, the end result, while error free, is a cube that is wholly purple.
I suspect this may be that the source textures are ASTC compressed, but am a bit at a loss as the documentation is rather sparse. Everything else seems to be incredibly easy relative to this very simple need of creating a cube from textures that aren't named bundle items.
Any guidance or hints would be greatly appreciated.
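For comparison, the blit-per-slice shape I would expect to work looks roughly like this sketch (assuming six square, same-format source textures; a wholly purple result often suggests a slice or format mismatch rather than a blit error):

```swift
import Metal

// Sketch: build a cube texture from six existing 2D MTLTextures by blitting
// each one into the corresponding cube slice (0...5).
func makeCube(from faces: [MTLTexture],
              device: MTLDevice,
              queue: MTLCommandQueue) -> MTLTexture? {
    guard let first = faces.first, faces.count == 6 else { return nil }

    let desc = MTLTextureDescriptor.textureCubeDescriptor(pixelFormat: first.pixelFormat,
                                                          size: first.width,
                                                          mipmapped: false)
    guard let cube = device.makeTexture(descriptor: desc),
          let cmd = queue.makeCommandBuffer(),
          let blit = cmd.makeBlitCommandEncoder() else { return nil }

    for (slice, face) in faces.enumerated() {
        blit.copy(from: face, sourceSlice: 0, sourceLevel: 0,
                  sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0),
                  sourceSize: MTLSize(width: face.width, height: face.height, depth: 1),
                  to: cube, destinationSlice: slice, destinationLevel: 0,
                  destinationOrigin: MTLOrigin(x: 0, y: 0, z: 0))
    }
    blit.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()
    return cube
}
```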
Hello,
I have two quads with different vertex coordinates.
How can I multiply the first quad's color by the second quad's color so that only selected components (red, green, or blue) are kept?
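One way to express this, assuming both quads' colors are available as textures in a single fragment shader, is a masked multiply. The texture indices, texcoord name, and mask buffer below are assumptions about your pipeline, not established API:

```metal
// Sketch: multiply the two quads' colors, keeping only selected channels
// via a mask (1 = keep the product, 0 = zero out the channel).
fragment float4 masked_multiply(VertexOutput in [[stage_in]],
                                texture2d<float> tex0 [[texture(0)]],
                                texture2d<float> tex1 [[texture(1)]],
                                constant float3 &mask [[buffer(0)]])
{
    constexpr sampler s(address::clamp_to_edge, filter::linear);
    float3 c0 = tex0.sample(s, in.texCoord).rgb;
    float3 c1 = tex1.sample(s, in.texCoord).rgb;
    // e.g. mask = float3(1, 0, 0) keeps only the red product.
    return float4(c0 * c1 * mask, 1.0);
}
```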
I am just starting to learn AR. Thanks for the help.
I am trying to anchor large objects to a certain location in an open area. I tried anchoring with an image and with an object in Reality Composer, but after anchoring, the objects do not stay in the same place when I move. ARGeoTrackingConfiguration is not available in my region. And if I scan the surrounding world and relocalize against it later, the area will not be recognized after rain or the slightest change to the terrain (for example, mowing the lawn). What do you advise?
Hello everyone, I'm a beginner graphics programmer.
I want to use a 3D texture in Metal for my projects, but I can't because of an error.
I tried the example from this link:
fragment half4 mip_fragment(VertexOutput in [[stage_in]],
                            texture2d<float> backface [[texture(0)]],
                            texture3d<float> volume [[texture(1)]])
{
    constexpr sampler s(s_address::clamp_to_edge, t_address::clamp_to_edge,
                        min_filter::linear, mag_filter::linear);
    float3 rgb = backface.sample(s, in.pixelCoord).rgb;
    float3 lookupColor = volume.sample(s, rgb, 0).rgb;
    return half4(half3(lookupColor), 1.h);
}
But I get this error:
Fragment Function(mip_fragment): incorrect type of texture (MTLTextureType2D) bound at texture binding at index 1 (expect MTLTextureType3D) for volume[0].
And the app crashes. Please help me.
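The validation message says a MTLTextureType2D texture was bound at index 1, where the shader declares texture3d&lt;float&gt;. A sketch of creating and binding an actual 3D texture on the host side (pixel format and size here are placeholders):

```swift
import Metal

// Sketch: create the volume as a real 3D texture so it matches the
// shader's texture3d<float> declaration at index 1.
func makeVolumeTexture(device: MTLDevice, size: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor()
    desc.textureType = .type3D          // the crucial part: not .type2D
    desc.pixelFormat = .rgba8Unorm
    desc.width = size
    desc.height = size
    desc.depth = size
    desc.usage = .shaderRead
    return device.makeTexture(descriptor: desc)
}

// ...then bind it at the index the shader expects:
// encoder.setFragmentTexture(volumeTexture, index: 1)
```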
I am trying to push content to an MTKView in SwiftUI, wrapped in a UIViewRepresentable, by manually calling draw(in:) on the MTKViewDelegate.
My question is how to obtain and release the correct drawable from the 3 available.
As I only want to push draw calls from an external source, the view settings are:
mtkView.isPaused = true // only push data
mtkView.enableSetNeedsDisplay = false // only push data from our single source
mtkView.framebufferOnly = true // we don't render to anything but the screen
The MTKViewDelegate draw call is as follows:
func draw(in view: MTKView) {
    autoreleasepool {
        let passDescriptor = view.currentRenderPassDescriptor!
        // make command buffer, encoder from descriptor
        // encode data
        let drawable = view.currentDrawable!
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
This works fine for the first triggered draw, but the second draw call raises "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead." and "Each CAMetalLayerDrawable can only be presented once!"
Setting mtkView.isPaused = false renders fine, so I suppose the internal render loop handles calling nextDrawable(). How should I ensure that I am getting the next drawable and releasing the current one when I take control of drawing?
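One likely fix, sketched below: trigger the view's own draw() when new data arrives, instead of invoking the delegate's draw(in:) directly. That way MTKView runs its normal pass and cycles its drawable pool, so the delegate always receives a fresh, unpresented currentDrawable (the class and method names here are illustrative):

```swift
import MetalKit

// Sketch: push-driven rendering with isPaused = true and
// enableSetNeedsDisplay = false.
final class PushRenderer {
    weak var mtkView: MTKView?

    func newFrameArrived() {
        // draw() synchronously invokes the delegate's draw(in:) with a
        // valid drawable, even when the view is paused.
        mtkView?.draw()
    }
}
```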
Best regards,
I'm using Xcode 12.5.1. I followed a Swift Goose tutorial for an outline view and tree view exactly, but it fails with:
2021-09-04 09:32:44.832345-0700 treeview1[21236:868981] Metal API Validation Enabled
2021-09-04 09:32:44.851673-0700 treeview1[21236:868981] MTLIOAccelDevice bad MetalPluginClassName property (null)
2021-09-04 09:32:44.853029-0700 treeview1[21236:868981] +[MTLIOAccelDevice registerDevices]: Zero Metal services found
When I download Swift Goose's project code from GitHub, it works without any error. The only things I can think of are: (1) when I set up a new Xcode project for macOS and app, I'm missing something, even though it basically looks all the same; or (2) something is missing in my Xcode setup because I've got the "free" Xcode versus the developer version? Something linking behind the scenes? Is there a difference?
Hello, could you please advise how to use UIScrollView with MTKView? I have 3 MTKViews and want to scroll them up and down. I have an error with the drawable.
Does Metal support utilizing the ray tracing acceleration hardware available in the Radeon 6000 series GPUs?
Can the 6000 series run the WWDC20 session 10012: Discover Ray Tracing with Metal demo?
Do any AMD GPUs support it or only the Intel integrated ones? The WWDC session video shows a sample forest scene running on a Mac Pro with the W5000 series AMD GPU.
Please see: https://developer.apple.com/forums/thread/651077
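As a starting point, API-level ray tracing support can be queried at runtime (this reports whether the Metal ray tracing API is available on a device, not whether dedicated ray tracing hardware is used):

```swift
import Metal

// Sketch: list each GPU and whether it supports the Metal ray tracing API.
for device in MTLCopyAllDevices() {
    if #available(macOS 11.0, *) {
        print("\(device.name): supportsRaytracing = \(device.supportsRaytracing)")
    }
}
```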
I have a 3D scene with a perspective camera and I'd like some of the elements to be projected using an orthographic projection instead.
My use case is that I have some 3D elements with attached text nodes. I'd like the text on these nodes to always be the same size no matter how far away the camera is. Is there a way I can use SceneKit to mix and match? Or is there another technique I can use?
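One possible technique without switching projections: rescale the text node with its distance from the camera, so the perspective shrink is cancelled out. A sketch, where referenceDistance is an assumed tuning value (the distance at which the node has scale 1):

```swift
import SceneKit

// Sketch: keep a node at constant apparent size under a perspective camera
// by rescaling it with the camera distance each frame.
func constantSizeConstraint(camera: SCNNode,
                            referenceDistance: Float = 1.0) -> SCNTransformConstraint {
    return SCNTransformConstraint(inWorldSpace: false) { node, transform in
        let d = simd_distance(node.simdWorldPosition, camera.simdWorldPosition)
        node.simdScale = SIMD3<Float>(repeating: d / referenceDistance)
        return transform
    }
}

// Usage: combine with a billboard constraint so the text also faces the camera.
// textNode.constraints = [constantSizeConstraint(camera: cameraNode),
//                         SCNBillboardConstraint()]
```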
I am getting this error while running the app with Xcode, especially when I am moving things in the view, like swiping with my finger or rotating the screen. I have not been able to reproduce it when running the app as a normal user. I am just wondering if Xcode is doing something unusual in debugging that is hitting the app's animations.
While developing my Metal application I noticed that making a draw call is a lot slower than using a tile shader. In particular, when operating on a 4K-resolution texture, it takes about 3 ms to complete a draw call while the tile shader takes about 150 ns. I was wondering: is a tile shader the preferred approach for drawing with Metal now? Or is there any particular reason why a typical draw call should be used?
I was following raywenderlich's Metal tutorial, but got stuck rendering a texture on a plane; it seems to be showing only one color of the image, not the entire image. I'm running on an iPad with iOS 12.3.
The weirdest thing is that I can render a multicolored rectangle when the texture is nil, but can't render the texture.
Here's a repo for the project: https://github.com/TheJoseph-Dev/MyMetalProgram
Can anyone help me?
I am new to Metal, and am trying to move a material's normal texture by an offset while also taking advantage of Metal's geometry modifier. When I was using a PhysicallyBasedMaterial, I used this call in the session function in the ViewController:
waterMaterial.textureCoordinateTransform.offset.x += 0.0001
The normal map is a PNG. This moved the texture every frame. Now that I'm using a CustomMaterial to take advantage of a geometryModifier, this no longer works. I can see the texture and am using the shader successfully, but the texture itself is not moving. I assume I need to do this in my Metal shader file, possibly starting in this direction:
[[visible]]
void moveTexture(realitykit::geometry_parameters params)
{
    auto normal = params.textures().normal();
}
Any help replicating the above functionality in Metal would be much appreciated.
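A sketch of one possible direction (untested): with CustomMaterial, the normal map is sampled in a surface shader rather than the geometry modifier, so the offset would be applied there. This assumes the running offset is fed from Swift each frame via customMaterial.custom.value, which surfaces in the shader as custom_parameter:

```metal
#include <RealityKit/RealityKit.h>

// Sketch: scroll the normal map's UVs in a CustomMaterial surface shader.
[[visible]]
void waterSurface(realitykit::surface_parameters params)
{
    float2 uv = params.geometry().uv0();
    // x component of custom.value is assumed to hold the accumulated offset.
    uv.x += params.uniforms().custom_parameter().x;

    constexpr sampler s(address::repeat, filter::linear);
    half3 n = params.textures().normal().sample(s, uv).rgb;
    params.surface().set_normal(float3(n));
}
```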
I have tried everything, but it looks to be impossible to get MTKView to display the full range of colors of an HDR CIImage made from a CVPixelBuffer (in 10-bit YUV format). Only built-in layers such as AVCaptureVideoPreviewLayer, AVPlayerLayer, and AVSampleBufferDisplayLayer are able to fully display HDR images on iOS. Is MTKView incapable of displaying the full BT2020_HLG color range? Why does MTKView clip colors even when I set the pixel color format to bgra10_xr or bgra10_xr_srgb?
convenience init(frame: CGRect, contentScale: CGFloat) {
    self.init(frame: frame)
    contentScaleFactor = contentScale
}

convenience init(frame: CGRect) {
    let device = MetalCamera.metalDevice
    self.init(frame: frame, device: device)
    colorPixelFormat = .bgra10_xr
    self.preferredFramesPerSecond = 30
}

override init(frame frameRect: CGRect, device: MTLDevice?) {
    guard let device = device else {
        fatalError("Can't use Metal")
    }
    guard let cmdQueue = device.makeCommandQueue(maxCommandBufferCount: 5) else {
        fatalError("Can't make Command Queue")
    }
    commandQueue = cmdQueue
    context = CIContext(mtlDevice: device, options: [CIContextOption.cacheIntermediates: false])
    super.init(frame: frameRect, device: device)
    self.framebufferOnly = false
    self.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
}
And then rendering code:
override func draw(_ rect: CGRect) {
    guard let image = self.image else {
        return
    }
    let dRect = self.bounds
    let drawImage: CIImage
    let targetSize = dRect.size
    let imageSize = image.extent.size
    let scalingFactor = min(targetSize.width / imageSize.width, targetSize.height / imageSize.height)
    let scalingTransform = CGAffineTransform(scaleX: scalingFactor, y: scalingFactor)
    let translation = CGPoint(x: (targetSize.width - imageSize.width * scalingFactor) / 2,
                              y: (targetSize.height - imageSize.height * scalingFactor) / 2)
    let translationTransform = CGAffineTransform(translationX: translation.x, y: translation.y)
    let scalingTranslationTransform = scalingTransform.concatenating(translationTransform)
    drawImage = image.transformed(by: scalingTranslationTransform)
    let commandBuffer = commandQueue.makeCommandBufferWithUnretainedReferences()
    guard let texture = self.currentDrawable?.texture else {
        return
    }
    var colorSpace: CGColorSpace
    if #available(iOS 14.0, *) {
        colorSpace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)!
    } else {
        // Fallback on earlier versions
        colorSpace = drawImage.colorSpace ?? CGColorSpaceCreateDeviceRGB()
    }
    NSLog("Image \(colorSpace.name), \(image.colorSpace?.name)")
    context.render(drawImage, to: texture, commandBuffer: commandBuffer, bounds: dRect, colorSpace: colorSpace)
    commandBuffer?.present(self.currentDrawable!, afterMinimumDuration: 1.0 / Double(self.preferredFramesPerSecond))
    commandBuffer?.commit()
}
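One thing worth checking, sketched below (untested): tagging the view's backing CAMetalLayer itself with the HLG color space, so the extended-range pixel format is interpreted as BT.2100 HLG rather than the default sRGB. This alone may not be sufficient on all devices:

```swift
import MetalKit

// Sketch: set the CAMetalLayer's colorspace to BT.2100 HLG (iOS 14+).
func tagLayerForHLG(_ view: MTKView) {
    guard let layer = view.layer as? CAMetalLayer,
          let hlg = CGColorSpace(name: CGColorSpace.itur_2100_HLG) else { return }
    layer.colorspace = hlg
}
```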
When I use Metal to render on iOS 15, switching the application to the background results in a Metal rendering failure. What can I do?
Error:
Execution of the command buffer was aborted due to an error during execution.Insufficient Permission (to submit GPU work from background) (00000006:kIOGPUCommandBufferCallbackErrorBackgroundExecutionNotPermitted)
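The error indicates GPU work was submitted while the app was backgrounded, which iOS does not permit. One common pattern, sketched here with an illustrative helper class, is to pause the render loop around the background transition:

```swift
import UIKit
import MetalKit

// Sketch: pause GPU work while the app is backgrounded, so no command
// buffers are committed without permission to run.
final class BackgroundPauser {
    private let view: MTKView

    init(view: MTKView) {
        self.view = view
        let nc = NotificationCenter.default
        nc.addObserver(self, selector: #selector(pause),
                       name: UIApplication.didEnterBackgroundNotification, object: nil)
        nc.addObserver(self, selector: #selector(resume),
                       name: UIApplication.willEnterForegroundNotification, object: nil)
    }

    @objc private func pause()  { view.isPaused = true }
    @objc private func resume() { view.isPaused = false }
}
```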