Post not yet marked as solved
After updating to build 4.2.9, I have a weird bug. The host keeps crashing, and when I read the message, it displays:
validateRenderPassDescriptor:782: failed assertion `RenderPass Descriptor Validation
Texture at colorAttachment[0] has usage (0x01) which doesn't specify MTLTextureUsageRenderTarget (0x04)
This happens when I run in debug mode and try to hook up the Motion template. I found that the output texture is created with usage (MTLTextureUsageShaderRead) only, without MTLTextureUsageRenderTarget.
Has anyone else run into this problem?
I'm using FxPlug 4.2.9, Motion 5.7, and Final Cut Pro 10.7.1, running on Sonoma 14.2.1.
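For context, a texture that will be bound as colorAttachment[0] must include MTLTextureUsageRenderTarget in its descriptor's usage. A minimal Swift sketch (the size and pixel format here are placeholders, not values from FxPlug):

```swift
import Metal

// Assumes a Metal-capable device; returns nil otherwise.
func makeRenderTargetTexture(device: MTLDevice,
                             width: Int, height: Int) -> MTLTexture? {
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .bgra8Unorm,   // placeholder format
        width: width,
        height: height,
        mipmapped: false)
    // Without .renderTarget here, binding the texture as
    // colorAttachment[0] trips the validation assertion quoted above.
    desc.usage = [.shaderRead, .renderTarget]
    return device.makeTexture(descriptor: desc)
}
```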
Post not yet marked as solved
Namaste!
I'm putting together an FCPX effect that is supposed to increase resolution with AI upscaling, but the only way to add resolution is by scaling. The problem is that scaling causes the video to clip.
I want to be able to apply this "Resolution Upscale" effect to a 480p video and have it output a 720p or 1080p AI-upscaled video, but neither FxPlug nor Motion effects allow such a thing.
The FxPlug always gets a 640x480 input (correct) but only a 640x480 output.
What is the FxPlug code or Motion configuration/concept for upscaling the resolution without affecting the scale? Is there a way to do this in Motion/FxPlug?
Scaling up with the FxPlug effect and then scaling down in a parent Motion group doesn't do anything.
Setting the group's 2D Fixed Resolution doesn't output different dimensions; the debug output from the FxPlug still says the input and output are 640x480, even when the group is set to a fixed resolution of 1920x1080.
Building a hierarchy of groups with different settings for 2D Fixed Resolution and 3D Flatten doesn't work either. In these cases, the debug output still says 640x480 for both input and output, so the plug-in isn't aware of the Fixed Resolution change.
Does there need to be a new FxPlug property, via [properties:...], like "kFxPropertyKey_ResolutionChange", and an API for changing the destination image resolution (without changing the destination rect size)?
How do we do this?
Post not yet marked as solved
When I try to import MetalFX on visionOS, Xcode shows the error "No such module 'MetalFX'", but the documentation says visionOS is supported.
Post not yet marked as solved
Greetings!
I have used Apple's ARKit documentation to create a simple ARKit application which utilizes SceneKit (I tried Metal too).
I am currently unsure how to make use of smoothedSceneDepth (or sceneDepth) to acquire the depth data from the depth map delivered in the view.
Is there any particular method or way that I can access this data for displaying the depth?
I would be grateful for any inputs or suggestions.
Thanks in advance
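For what it's worth, smoothed scene depth is normally read per-frame from ARFrame; a minimal Swift sketch (the class name and wiring are assumptions, and it requires a LiDAR-equipped device):

```swift
import ARKit

final class DepthReader: NSObject, ARSessionDelegate {
    func start(session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        // Opt in to the smoothed depth buffer (LiDAR devices only).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
            config.frameSemantics.insert(.smoothedSceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of Float32 depth values in meters.
        guard let depth = frame.smoothedSceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depth)
        let height = CVPixelBufferGetHeight(depth)
        print("Depth map: \(width)x\(height)")
    }
}
```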
Hi,
I'm trying to wrap my head around FxPlug in Xcode. I already sell Final Cut Pro titles for a company; these titles were built in Motion.
However, they want me to move them to an app, and I'm looking for any help on how to accomplish this.
What the app should do:
Allow users with an active subscription to our website to access the titles within FCPX, and deny access to users who are not active subscribers.
Post not yet marked as solved
Hi,
Is transparency supported in MetalFX?
I have a project that sets a texture to a particular alpha value, and it works fine. However, as soon as I enable MetalFX, the transparency stops working and the alpha value ends up at 1.0.
If transparency is supported in MetalFX, how do I enable it?
Thank you
Post not yet marked as solved
I wrote an iOS plug-in to integrate MetalFX spatial upscaling into a Unity URP project.
C# Code in Unity:
namespace UnityEngine.Rendering.Universal
{
    /// <summary>
    /// Renders the post-processing effect stack.
    /// </summary>
    internal class PostProcessPass : ScriptableRenderPass
    {
        RenderTexture _dstRT = null;

        [DllImport("__Internal")]
        private static extern void MetalFX_SpatialScaling(IntPtr srcTexture, IntPtr dstTexture, IntPtr outTexture);
    }
}
void RenderFinalPass(CommandBuffer cmd, ref RenderingData renderingData)
{
    // ......
    case ImageUpscalingFilter.MetalFX:
    {
        var upscaleRtDesc = tempRtDesc;
        upscaleRtDesc.width = cameraData.pixelWidth;
        upscaleRtDesc.height = cameraData.pixelHeight;
        RenderingUtils.ReAllocateIfNeeded(ref m_UpscaledTarget, upscaleRtDesc, FilterMode.Point, TextureWrapMode.Clamp, name: "_UpscaledTexture");
        var metalfxInputSize = new Vector2(cameraData.cameraTargetDescriptor.width, cameraData.cameraTargetDescriptor.height);
        if (_dstRT == null)
        {
            _dstRT = new RenderTexture(upscaleRtDesc.width, upscaleRtDesc.height, 0, RenderTextureFormat.ARGB32);
            _dstRT.Create();
        }
        // call native plugin
        cmd.SetRenderTarget(m_UpscaledTarget, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.DontCare);
        MetalFX_SpatialScaling(sourceTex.rt.GetNativeTexturePtr(), m_UpscaledTarget.rt.GetNativeTexturePtr(), _dstRT.GetNativeTexturePtr());
        Graphics.CopyTexture(_dstRT, m_UpscaledTarget.rt);
        sourceTex = m_UpscaledTarget;
        PostProcessUtils.SetSourceSize(cmd, upscaleRtDesc);
        break;
    }
    // .....
}
Objective-C code in iOS:
Header file:
#import <Foundation/Foundation.h>
#import <MetalFX/MTLFXSpatialScaler.h>
@protocol MTLTexture;
@protocol MTLDevice;
API_AVAILABLE(ios(16.0))
@interface MetalFXDelegate : NSObject
{
    int mode;
    id<MTLDevice> _device;
    id<MTLCommandQueue> _commandQueue;
    id<MTLTexture> _outTexture;
    id<MTLFXSpatialScaler> _mfxSpatialScaler;
    id<MTLBlitCommandEncoder> _mfxSpatialEncoder;
}
- (void)SpatialScaling: (MTLTextureRef) srcTexture
            dstTexture: (MTLTextureRef) dstTexture
            outTexture: (MTLTextureRef) outTexture;
- (void)saveTexturePNG: (MTLTextureRef) texture
                   url: (CFURLRef) url;
@end
Implementation file:
#import "MetalFXOC.h"
@implementation MetalFXDelegate
- (id)init
{
    self = [super init];
    return self;
}
static MetalFXDelegate* delegateObject = nil;
- (void)SpatialScaling: (MTLTextureRef) srcTexture
            dstTexture: (MTLTextureRef) dstTexture
            outTexture: (MTLTextureRef) outTexture {
    int width = (int)srcTexture.width;
    int height = (int)srcTexture.height;
    int dstWidth = (int)dstTexture.width;
    int dstHeight = (int)dstTexture.height;
    if (_mfxSpatialScaler == nil) {
        MTLFXSpatialScalerDescriptor* desc;
        desc = [[MTLFXSpatialScalerDescriptor alloc] init];
        desc.inputWidth = width;
        desc.inputHeight = height;
        desc.outputWidth = dstWidth;   ///_screenWidth
        desc.outputHeight = dstHeight; ///_screenHeight
        desc.colorTextureFormat = srcTexture.pixelFormat;
        desc.outputTextureFormat = dstTexture.pixelFormat;
        if (@available(iOS 16.0, *)) {
            desc.colorProcessingMode = MTLFXSpatialScalerColorProcessingModePerceptual;
        } else {
            // Fallback on earlier versions
        }
        _device = MTLCreateSystemDefaultDevice();
        _mfxSpatialScaler = [desc newSpatialScalerWithDevice:_device];
        if (_mfxSpatialScaler == nil) {
            return;
        }
        _commandQueue = [_device newCommandQueue];
        MTLTextureDescriptor *texdesc = [[MTLTextureDescriptor alloc] init];
        texdesc.width = (int)dstTexture.width;
        texdesc.height = (int)dstTexture.height;
        texdesc.storageMode = MTLStorageModePrivate;
        texdesc.usage = MTLTextureUsageRenderTarget | MTLTextureUsageShaderRead | MTLTextureUsageShaderWrite;
        texdesc.pixelFormat = dstTexture.pixelFormat;
        _outTexture = [_device newTextureWithDescriptor:texdesc];
    }
    id<MTLCommandBuffer> upscaleCommandBuffer = [_commandQueue commandBuffer];
    upscaleCommandBuffer.label = @"Upscale Command Buffer";
    _mfxSpatialScaler.colorTexture = srcTexture;
    _mfxSpatialScaler.outputTexture = _outTexture;
    [_mfxSpatialScaler encodeToCommandBuffer:upscaleCommandBuffer];
    // outTexture = _outTexture;
    id<MTLCommandBuffer> textureCommandBuffer = [_commandQueue commandBuffer];
    id<MTLBlitCommandEncoder> _mfxSpatialEncoder = [textureCommandBuffer blitCommandEncoder];
    [_mfxSpatialEncoder copyFromTexture:_outTexture toTexture:outTexture];
    [_mfxSpatialEncoder endEncoding];
    [upscaleCommandBuffer commit];
}
@end
extern "C" {
    void MetalFX_SpatialScaling(void* srcTexturePtr, void* dstTexturePtr, void* outTexturePtr) {
        if (delegateObject == nil) {
            if (@available(iOS 16.0, *)) {
                delegateObject = [[MetalFXDelegate alloc] init];
            } else {
                // Fallback on earlier versions
            }
        }
        if (srcTexturePtr == nil || dstTexturePtr == nil || outTexturePtr == nil) {
            return;
        }
        id<MTLTexture> srcTexture = (__bridge id<MTLTexture>)(void *)srcTexturePtr;
        id<MTLTexture> dstTexture = (__bridge id<MTLTexture>)(void *)dstTexturePtr;
        id<MTLTexture> outTexture = (__bridge id<MTLTexture>)(void *)outTexturePtr;
        if (@available(iOS 16.0, *)) {
            [delegateObject SpatialScaling: srcTexture
                               dstTexture: dstTexture
                               outTexture: outTexture];
        } else {
            // Fallback on earlier versions
        }
        return;
    }
}
With the C# and Objective-C code above, the result on screen is black.
If I save the MTLTexture to a PNG in the iOS plug-in, the PNG is fine (not black), so I think the write of outTexture back to Unity is failing.
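One detail worth double-checking in the snippet above: the blit that copies into the Unity-owned texture is encoded on textureCommandBuffer, but only upscaleCommandBuffer is ever committed, so the copy may never execute. A hedged Swift sketch of the intended flow (placeholder names, not the plug-in's actual code), committing both buffers in order:

```swift
import Metal
import MetalFX

// Encodes the spatial scaler into a scratch texture, then blits the
// result into the caller's texture, committing BOTH command buffers.
// Assumes all textures already have matching sizes and pixel formats.
func upscaleAndCopy(queue: MTLCommandQueue,
                    scaler: MTLFXSpatialScaler,
                    source: MTLTexture,
                    scratch: MTLTexture,
                    destination: MTLTexture) {
    guard let upscale = queue.makeCommandBuffer(),
          let copy = queue.makeCommandBuffer(),
          let blit = copy.makeBlitCommandEncoder() else { return }

    scaler.colorTexture = source
    scaler.outputTexture = scratch
    scaler.encode(commandBuffer: upscale)

    blit.copy(from: scratch, to: destination)
    blit.endEncoding()

    upscale.commit()
    copy.commit()          // without this commit, the copy never runs
    copy.waitUntilCompleted()
}
```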
Post not yet marked as solved
I have tested SpatialUpscaling with a Unity URP sample project: https://github.com/mao-test-h/MetalFXSamples.
I used an iPhone 13 with iOS Beta 7. The performance and quality are both worse than native.
Post not yet marked as solved
I use a Mac mini with macOS Ventura 13.3.1. While running the MetalFX sample code on the Mac and choosing the temporal scaler, makeTemporalScaler returns nil and prints "The temporal scaler effect is not usable!". If I choose the spatial scaler, it is fine.
guard let temporalScaler = desc.makeTemporalScaler(device: device) else {
    print("The temporal scaler effect is not usable!")
    mfxScalingMode = .defaultScaling
    return
}
Sample code:
https://developer.apple.com/documentation/metalfx/applying_temporal_antialiasing_and_upscaling_using_metalfx?language=objc
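Temporal scaling is only supported on some GPUs, so it can help to query support before building the scaler; a hedged Swift sketch (the descriptor is assumed to be configured as in the sample):

```swift
import Metal
import MetalFX

// Returns a temporal scaler only if the current GPU supports one;
// unsupported GPUs (which cause makeTemporalScaler to return nil)
// are caught up front.
func makeTemporalScalerIfSupported(device: MTLDevice,
                                   desc: MTLFXTemporalScalerDescriptor) -> MTLFXTemporalScaler? {
    guard MTLFXTemporalScalerDescriptor.supportsDevice(device) else {
        return nil
    }
    return desc.makeTemporalScaler(device: device)
}
```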
Post not yet marked as solved
In my game project, there is a functions.data file at /AppData/Library/Caches/[bundleID]/com.apple.metal/functions.data.
When we reboot and launch the game, this file is reset to about 40KB; normally this file is about 30MB. This is done by Metal. Is there any way to avoid it?
Post not yet marked as solved
Hi Apple, we have purchased Apple silicon Macs at our studio and intend to use Unreal Engine with them. Epic Games said it's left to you to allow this on the hardware, and I have also learned that the M-series chips support ray tracing.
Are you in talks with Epic Games? Should we expect this feature soon? If so, how soon? Or will it not be possible on the current Apple silicon Macs?
An answer would really point our company in the right direction.
Thank you very much.
Post not yet marked as solved
Is there a reference for how much time MetalFX temporal antialiasing + upscaling should take? On an M1 Pro, for a test frame with an otherwise extremely short render time (see attached image; the render takes ~600µs), the upscaler adds an additional ~10ms just for MetalFX temporal antialiasing + upscaling (see attached image). This obviously leaves very little time to keep the frame time under 16ms (60fps) and makes 8ms frame times (120fps) impossible.
The upscaling is working with an input render size of 1,904x1,370 and output size of 2,856x2,055, so it's not like the image is being upscaled to an outrageous size or anything.
Any thoughts? Is this to be expected? What can affect the Metal FX Temporal Upscaling frame time?
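One way to isolate the scaler's cost is to time a command buffer that encodes only the MetalFX work; a hedged Swift sketch (the scaler and queue are assumed to be set up elsewhere):

```swift
import Metal
import MetalFX

// Measures the GPU time spent on a command buffer that contains
// only the temporal scaler's work, in seconds.
func measureUpscaleTime(queue: MTLCommandQueue,
                        scaler: MTLFXTemporalScaler) -> Double? {
    guard let buffer = queue.makeCommandBuffer() else { return nil }
    scaler.encode(commandBuffer: buffer)
    buffer.commit()
    buffer.waitUntilCompleted()
    // gpuStartTime/gpuEndTime are host-clock timestamps in seconds.
    return buffer.gpuEndTime - buffer.gpuStartTime
}
```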
Post not yet marked as solved
It would be quite helpful if there was frame interpolation in MetalFX, similar to what NV and AMD are adding to their upscaling technologies. I have found MetalFX temporal upscaling critical to a real-time ray traced application, and frame interpolation could halve the compute cost, letting me try more expensive RT algorithms. I am hoping this technology will debut at WWDC 2023, if not then maybe during the winter round of OS updates.
Post not yet marked as solved
I just upgraded to Xcode 14.3.
I have started seeing the following debug message show up in my console and I am not sure why.
[GPUDebug] Null texture access executing kernel function "ColorCorrection" encoder: "Keyframing.SurfaceWarpFuser.InverseWarpKeyframe", dispatch: 2
It seems Metal-related, but I am very confused by it. My project uses a very minimal amount of Metal, only to get depth data from ARKit and to draw points, and I definitely do NOT have any kernel function named "ColorCorrection" or an encoder named "Keyframing.SurfaceWarpFuser.InverseWarpKeyframe".
I haven't changed any of my Metal code, so I don't know if this is something bigger I should be concerned about.
Sincerely,
Stan