With Xcode 16 I can't get the Final Cut Pro Workflow Extension to work.
I'm quite sure I was able to add the Workflow Extension target a couple of months ago in Xcode 15 and build it successfully, but in Xcode 16 it won't build; it fails with this error:
clang: error: unknown argument: '-e_ProExtensionMain'
I have installed Workflow Extensions SDK 1.0.2 multiple times and even disabled Library Validation as described in the release notes, but I don't think that's the problem here.
It seems clang doesn't recognize the argument. I assume the SDK is supposed to add it, but something got messed up, even though the template shows up fine in Xcode. I reinstalled Xcode and the command line tools, but that didn't help. Checking clang directly, it really doesn't have the argument. But where does the SDK add this? Xcode seems to be using its internal clang, and /usr/bin/clang doesn't have it either.
Any tips on what could be the problem here? I wasn't able to locate the SDK to remove it before reinstalling - any ideas?
Steps to reproduce
Create new project
Add FCP workflow extension target
Build -> it fails
Same behaviour with 16.1 Beta. I am running Sonoma 14.7.
Thanks in advance for any ideas!
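Not an answer, but a hedged observation: '-e_ProExtensionMain' is shaped like a linker flag (ld's -e option sets the entry-point symbol, here _ProExtensionMain), so clang reporting "unknown argument" suggests the flag ended up in a compiler-flags build setting rather than a linker one. A sketch of checking which setting carries it, using a made-up pbxproj-style fragment (point the same scan at your real project.pbxproj):

```python
# Sketch: '-e_ProExtensionMain' belongs with the linker flags (OTHER_LDFLAGS),
# not the compiler flags (OTHER_CFLAGS). Scan build-settings text for the flag
# to see where it landed. SAMPLE_PBXPROJ is an illustrative fragment, not a
# real project file.
SAMPLE_PBXPROJ = '''\
OTHER_CFLAGS = "";
OTHER_LDFLAGS = "-e_ProExtensionMain";
'''

def settings_with_flag(text: str, flag: str) -> list[str]:
    """Return the build-setting lines that mention the given flag."""
    return [line.strip() for line in text.splitlines() if flag in line]

print(settings_with_flag(SAMPLE_PBXPROJ, "ProExtensionMain"))
# → ['OTHER_LDFLAGS = "-e_ProExtensionMain";']
```

If the flag shows up under a C/C++ flags setting instead of Other Linker Flags in the target's build settings, moving it there would be the first thing I'd try.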
Professional Video Applications
Exchange data with Final Cut Pro X and create effects plugins for Final Cut Pro X and Motion using Professional Video Applications.
Posts under Professional Video Applications tag
17 Posts
We have a sandboxed Final Cut Pro (FCP) workflow extension that needs to control FCP to export the current video project.
When executing an AppleScript, we encounter the error: "System Events got an error: FCPExtension (Final Cut Pro) is not allowed assistive access."
This occurs despite the container app having been granted automation and accessibility permissions by the user.
What could be missing from the project to ensure the script runs without issues?
AppleScript:
shareDestination("Destination")

on shareDestination(_dest)
    tell application "Final Cut Pro"
        activate
    end tell
    tell application "System Events"
        set frontmost of process "Final Cut Pro" to true
        tell process "Final Cut Pro"
            perform action "AXRaise" of (first window whose name contains "Final Cut Pro")
            click menu bar 1
            tell menu bar 1
                tell item 3 of menu bar items
                    tell menu 1
                        tell menu item 12
                            tell menu 1
                                set menuItems to menu items whose title is (_dest & "…")
                                if length of menuItems > 0 then
                                    set targetMenuItem to item 1 of menuItems
                                    if enabled of targetMenuItem then
                                        try
                                            click targetMenuItem
                                        on error errMsg number errNum
                                            error errMsg
                                        end try
                                    else
                                        error "Share destination is not enabled." & return & "Please try selecting an item in Final Cut Pro."
                                    end if
                                else
                                    error "Share destination not found."
                                end if
                            end tell
                        end tell
                    end tell
                end tell
            end tell
        end tell
    end tell
end shareDestination
As soon as the script reaches set frontmost of process "Final Cut Pro" to true, it throws the error.
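Separate from the permissions question: walking the menu by numeric index (item 3 of menu bar items, menu item 12) breaks whenever Apple reorders menus. A sketch of the same navigation addressed by title instead, assuming an English localization and that the destination lives under File > Share (menu names vary by Final Cut Pro version; _dest is the handler parameter from the script above):

```applescript
-- Sketch only: address menus by name instead of index (English titles assumed).
tell application "System Events"
    tell process "Final Cut Pro"
        set frontmost to true
        tell menu bar 1
            tell menu bar item "File"
                tell menu 1
                    tell menu item "Share"
                        tell menu 1
                            click menu item (_dest & "…")
                        end tell
                    end tell
                end tell
            end tell
        end tell
    end tell
end tell
```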
I’m building an app extension for Final Cut Pro. It includes a main app that doesn't perform any actions, an extension that handles the code execution, and an export app responsible for uploading the exported file.
To assist the user, I’ve added an upload button that triggers an AppleScript. This script exports the current project and then uploads it.
The AppleScript simply selects the share option and the appropriate share destination.
However, the issue arises when I click the upload button: the app asks the user to grant Automation permission, allowing it to control Final Cut Pro and System Events. After the permission is granted, the script proceeds, but an error occurs, stating: System Events got an error: APP is not allowed assistive access.
Is there a permission I'm missing?
1. In the FxRemoteWindowAPI protocol, there is no way to set window.frame.origin.
2. When using NSWindow, I can't call [window setLevel:NSFloatingWindowLevel].
3. How can I keep my window in front of Final Cut Pro without interfering with Final Cut Pro's normal use?
I'm developing a workflow extension for Final Cut Pro, but I'm encountering a setback. For some reason, the extension crashes if I launch the app that contains the plugin.
I already added some logic to prevent the app from launching if one is already running, but that didn't fix the issue because the plugin crashes while the app is still loading.
It seems to me that the plugin process is being killed while the app is loading, causing the plugin to crash.
Do you know why this is happening and how to solve it?
I have an application that enables recording video from multiple iPhones through an iPad. It uses Multipeer Connectivity for all the device communication. When the user presses record on the iPad, it sends a command to each device in parallel and they start capturing video. But since network latency varies, I cannot guarantee that the recording start and stop times are consistent among all the iPhones. I need the frames to be exactly in sync.
I tried using the system clock on each device to synchronize the videos. If all the device system clocks were in sync to within one frame (about 33 ms at 30 frames per second), then it should be okay. But I tested, and the clocks vary quite a bit, by multiple seconds. So that won't work.
I ultimately solved the problem by having a countdown timer on the iPad. The user puts the iPad in view of each phone with the countdown. Then later I use a python script to cut all the videos when the countdown timer goes to 0. But that's more work for the end user and requires manual work on our end. With a little ML text recognition, this could get better.
Some people have suggested using a time server and syncing the clocks that way. I still haven't tried this out, and I'm not sure whether it's even possible to run an NTP server on an iPad, or whether NTP can keep the clocks within the roughly 33 ms frame duration.
I tried out Final Cut Camera and it has solved the synchronization problem. Each frame is in sync. The phones don't start and stop at exactly the same time, and they account for this by adding black frames to the front and/or back of videos to account for differences.
I've searched online and other people have the same problem. I'd love to know how Apple was able to solve the synchronization issue when recording video from multiple iPhones from an iPad over what I assume is Multipeer Connectivity.
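For the time-server idea, the classic NTP offset calculation is simple enough to sketch. The timestamps below are hypothetical and the helper names are mine; a real implementation would exchange these four timestamps over Multipeer Connectivity rather than NTP packets:

```python
# NTP-style clock offset estimation (a sketch, not Final Cut Camera's actual
# method). The iPad records t0 (request sent) and t3 (reply received) on its
# clock; the iPhone records t1 (request received) and t2 (reply sent) on its own.
def clock_offset(t0: float, t1: float, t2: float, t3: float) -> float:
    """Estimated offset of the iPhone clock relative to the iPad clock."""
    return ((t1 - t0) + (t2 - t3)) / 2.0

def round_trip_delay(t0: float, t1: float, t2: float, t3: float) -> float:
    """Network round-trip time, excluding the iPhone's processing time."""
    return (t3 - t0) - (t2 - t1)

# Hypothetical exchange where the iPhone clock runs 0.5 s ahead of the iPad's:
offset = clock_offset(t0=10.000, t1=10.520, t2=10.521, t3=10.040)  # ≈ 0.5005
delay = round_trip_delay(t0=10.000, t1=10.520, t2=10.521, t3=10.040)  # ≈ 0.039
```

Averaging the offset over several exchanges and discarding samples with a large round-trip delay is how NTP gets below network jitter; whether that reliably lands under a frame duration over Wi-Fi is exactly the open question here.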
We have been trying to programmatically send data to Final Cut Pro using Apple events, as described in Sending Data Programmatically to Final Cut Pro:
tell application "Final Cut Pro"
    activate
    open POSIX file "/Users/JohnDoe/Documents/UberMAM/MyEvents.fcpxml"
end tell
This works fine in Script Editor but we run into problems when trying to do the same in our macOS app.
We found interesting information in Workflow Extensions SDK 1.0.2 Release Notes.pdf.
A) Hardened runtime has "Apple Events Enabled" checked.
B) Info.plist contains NSAppleEventsUsageDescription:
<key>NSAppleEventsUsageDescription</key>
<string>Test string</string>
C) We added the following entitlements:
<key>com.apple.security.scripting-targets</key>
<dict>
    <key>com.apple.FinalCut</key>
    <array>
        <string>com.apple.FinalCut.library.inspection</string>
    </array>
    <key>com.apple.FinalCutTrial</key>
    <array>
        <string>com.apple.FinalCut.library.inspection</string>
    </array>
</dict>
<key>com.apple.security.automation.apple-events</key>
<true/>
With this configuration in place, our app is able to use AppleScript to activate the Final Cut Pro application, but it is unable to open the file. The following error is returned:
Error executing AppleScript: {
NSAppleScriptErrorAppName = "Final Cut Pro Trial";
NSAppleScriptErrorBriefMessage = "A privilege violation occurred.";
NSAppleScriptErrorMessage = "Final Cut Pro Trial got an error: A privilege violation occurred.";
NSAppleScriptErrorNumber = "-10004";
NSAppleScriptErrorRange = "NSRange: {56, 64}";
}
Also, there is no prompt asking the user to allow Automation from our app to Final Cut Pro. I am not sure whether the prompt is to be expected when developing an application in Xcode.
Our current workaround is to add (or even replace com.apple.security.scripting-targets with) the com.apple.security.temporary-exception.apple-events entitlement, like this:
<key>com.apple.security.temporary-exception.apple-events</key>
<array>
    <string>com.apple.FinalCutTrial</string>
</array>
However, while this approach might work in development, we know it would probably prevent us from publishing the app to the Mac App Store.
I think we are missing something obvious. Could you help? :-)
Hello everyone, I have been receiving the same crash report for the past month whenever I try to export a Final Cut Pro project. The FCP video will get to about 88% completion of the export, then the application crashes and I get the attached report. Any leads on how to fix this would be greatly appreciated! Thank you.
-Lauren
I have generated FCPXML, but I can't figure out the issue:
<?xml version="1.0"?>
<fcpxml version="1.11">
    <resources>
        <format id="r1" name="FFVideoFormat3840x2160p2997" frameDuration="1001/30000s" width="3840" height="2160" colorSpace="1-1-1 (Rec. 709)"/>
        <asset id="video0" name="11a(1-5).mp4" start="0s" hasVideo="1" videoSources="1" duration="6.81s">
            <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/11a(1-5).mp4"/>
        </asset>
        <asset id="video1" name="12(4)r8 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="9.94s">
            <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/12(4)r8 mute.mp4"/>
        </asset>
        <asset id="video2" name="13 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="6.51s">
            <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13 mute.mp4"/>
        </asset>
        <asset id="video3" name="13x (8,14,24,29,38).mp4" start="0s" hasVideo="1" videoSources="1" duration="45.55s">
            <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13x (8,14,24,29,38).mp4"/>
        </asset>
    </resources>
    <library>
        <event name="Untitled">
            <project name="Untitled Project" uid="28B2D4F3-05C4-44E7-8D0B-70A326135EDD" modDate="2024-04-17 15:44:26 -0400">
                <sequence format="r1" duration="4802798/30000s" tcStart="0s" tcFormat="NDF" audioLayout="stereo" audioRate="48k">
                    <spine>
                        <asset-clip ref="video0" offset="0/10000s" name="11a(1-5).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
                        <asset-clip ref="video1" offset="12119/10000s" name="12(4)r8 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
                        <asset-clip ref="video2" offset="22784/10000s" name="13 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
                        <asset-clip ref="video3" offset="34544/10000s" name="13x (8,14,24,29,38).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
                    </spine>
                </sequence>
            </project>
        </event>
    </library>
</fcpxml>
Any ideas?
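One thing that stands out in the XML above: every asset-clip has duration="0/10000s", and Final Cut Pro generally expects clip durations to be non-zero and a whole multiple of the format's frameDuration. A sketch of converting a duration in seconds into a frame-aligned rational string, assuming the r1 format's 1001/30000s frame duration (the helper name is mine, not part of any FCPXML API):

```python
from fractions import Fraction

def fcpxml_duration(seconds: float,
                    frame_duration: Fraction = Fraction(1001, 30000)) -> str:
    """Round a duration in seconds to a whole number of frames and return it
    as an FCPXML-style rational string like '17017/2500s'."""
    frames = round(Fraction(seconds).limit_denominator(1000) / frame_duration)
    value = frames * frame_duration  # exact rational, auto-reduced by Fraction
    return f"{value.numerator}/{value.denominator}s"

print(fcpxml_duration(6.81))  # 11a(1-5).mp4 → 17017/2500s (204 frames)
```

Whether the zero durations are what's actually breaking the import is a guess, but generating durations this way at least avoids values that don't sit on a frame boundary.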
Hello Apple Developer Community,
I'm encountering an issue with my macOS application where I'm receiving the following error message:
Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.FxPlugTestXPC was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.FxPlugTestXPC was invalidated: failed at lookup with error 159 - Sandbox restriction.}
This error occurs when my application tries to establish a connection to an XPC service named com.FxPlugTestXPC. It appears to be related to a sandbox restriction, but I'm unsure how to resolve it.
I've checked the sandboxing entitlements and ensured that the necessary permissions are in place. However, the issue persists.
Has anyone encountered a similar error before? If so, could you please provide guidance on how to troubleshoot and resolve this issue?
Any help or insights would be greatly appreciated.
Thank you.
Here are some screenshots of my entitlements:
Hi,
Could you please explain how to use SF Symbols animations in Final Cut? I would greatly appreciate your help.
Thank you!
When I use LiDAR, AVCaptureDeviceTypeBuiltInLiDARDepthCamera is used.
AVCaptureDeviceTypeBuiltInLiDARDepthCamera is a device that consists of two cameras: one LiDAR and one YUV.
I found that the LiDAR data is 30 fps, which limits the YUV data to 30 fps as well. But I really need the 240 fps YUV data.
Is there a way to use the 30 fps LiDAR together with a 240 fps YUV camera?
Any reply would be appreciated.
Is there a way for an FxPlug to access the source audio?
Or do we need to make an AU plugin, apply it to an audio source (either a video or an audio track), and feed the info to the FxPlug via shared memory?
Is there an AU plugin that lets external processes "listen" to the audio?
Namaste!
I'm putting together an FCPX effect that is supposed to increase the resolution with AI upscaling, but the only way to add resolution is by scaling. The problem is that scaling causes the video to clip.
I want to be able to give a 480p video this "Resolution Upscale" effect and have it output a 720p or 1080p AI-upscaled video; however, neither FxPlug nor Motion effects allow such a thing.
The FxPlug always gets 640x480 input (correct) but only 640x480 output.
What is the FxPlug code or Motion configuration/concept for upscaling the resolution without affecting the scale? Is there a way to do this in Motion/FxPlug?
Scaling up in the FxPlug effect and then scaling down in a parent Motion group doesn't do anything.
Setting the group's 2D Fixed Resolution doesn't output different dimensions; the debug output from the FxPlug continues to report input and output as 640x480, even when the group is set to a fixed resolution of 1920x1080.
Building a hierarchy of groups with different settings for 2D Fixed Resolution and 3D Flatten does not work either. In these cases, the debug output continues to report 640x480 for both input and output, so the plug-in isn't aware of the Fixed Resolution change.
Does there need to be a new FxPlug property, via [properties:...], like "kFxPropertyKey_ResolutionChange", and an API for changing the destination image resolution (without changing the dest rect size)?
How do we do this?
Hi,
I'm trying to wrap my head around Xcode's FxPlug SDK. I already sell Final Cut Pro titles for a company. These titles were built in Motion.
However, they want me to move them into an app, and I'm looking for any help on how to accomplish this.
What the app should do is:
Allow users with an active subscription to our website to access the titles within FCPX, and deny access to users who are not active subscribers.
Hi,
We're updating our plugins to offer native arm64 support in Final Cut Pro X and are thus porting our existing code over to the FxPlug 4 API.
Here's where we're running into issues:
Our plugin features a custom parameter (essentially a push button) that opens a configuration window on the UI thread. This window is supposed to display the frame at the current timeline cursor position. Since the FxTemporalImageAPI, which we had been using for that purpose, has been deprecated, how could something like that be implemented?
We tried adding a hidden int slider parameter that gets incremented once our selector is hit, in order to force a re-render and then grab the current frame in renderDestinationImage:... . However, the render pipeline seems to be stalled until our selector has exited, so our force-render mechanism seems to be ineffective. So how do we reliably filter out and grab the source image at the current time position in FxPlug 4?
Thanks!
Best,
Ray
I am trying to develop a plug-in for Motion in Objective-C using the FxPlug template, and so far I have been having issues getting NSOpenPanel to be called from a push button in order to show a file dialog, similar to the "File" generator present in Motion.
[paramAPI addPushButtonWithName:@"MIDI file"
                    parameterID:1
                       selector:@selector(MIDI_func)
                 parameterFlags:kFxParameterFlag_DEFAULT];
I have tried calling NSOpenPanel through dispatch_async so that it runs on the main thread (and thus doesn't crash), but when I press the button it appears not to work:
- (void)MIDI_func {
    NSLog(@"Button pressed");
    dispatch_async(dispatch_get_main_queue(), ^{
        NSOpenPanel *openDlg = [NSOpenPanel openPanel];
        [openDlg setCanChooseFiles:YES];
        [openDlg setAllowsMultipleSelection:NO];
        [openDlg setCanChooseDirectories:NO];
        if ([openDlg runModal] == NSModalResponseOK) {
            NSArray *urls = [openDlg URLs];
            for (NSInteger i = 0; i < [urls count]; i++) {
                NSURL *url = [urls objectAtIndex:i]; // URLs returns NSURL objects, not NSString
                NSLog(@"Url: %@", url);
            }
        }
    });
}
How can I achieve this, or is there a function in the FxPlug SDK that will let me open a file dialog from the host application?