Hello.
How do I write this command correctly on a MacBook, in Script Editor, so that I can click the Run button and the script reports the result:
if there is no folder, report that there is no folder;
if there is a folder, report that the folder exists.
do shell script "test -d 'Users/user/Desktop/New folder'"
At the moment, if the folder exists, an empty string ("") is returned; if the folder does not exist, Script Editor reports that an error has occurred.
In general, my task is to write a script that checks the existence of a folder.
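One way to do this (a minimal sketch, assuming the intended path is /Users/user/Desktop/New folder with a leading slash, and that a dialog is an acceptable way to report the result) is to trap the error that test -d raises when the folder is missing:

set folderPath to "/Users/user/Desktop/New folder"
try
	do shell script "test -d " & quoted form of folderPath
	display dialog "The folder exists."
on error
	display dialog "There is no folder."
end try

Using quoted form avoids problems with the space in the folder name; an equivalent pure-AppleScript check is to ask System Events whether folder folderPath exists.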
Automation & Scripting
Learn about scripting languages and automation frameworks available on the platform to automate repetitive tasks.
The Shortcut Automation Trigger Transaction frequently times out, ultimately causing the shortcut automation to fail. Please see the attached trace for details.
Additionally, the Trigger is activated even when the Transaction is declined.
Details
In the trace I see the error:
[WFWalletTransactionProvider observeForUpdatesWithInitialTransactionIfNeeded:transactionIdentifier:completion:]_block_invoke Hit timeout waiting for transaction with identifier: <private>, finishing.
Open bug report: FB14035016
On iOS 18, I'm trying to index documents in Spotlight using the new combination of AppIntents+IndexedEntity.
However, I don't seem to be able to index the textContent of the document. Only the displayName seems to be indexed.
As recommended, I start with the defaultAttributeSet:
/// I call this function to index in Spotlight
static func indexInSpotlight(document: Document) async {
    do {
        if let entity = document.toEntity {
            try await CSSearchableIndex.default().indexAppEntities([entity])
        }
    } catch {
        DLog("Spotlight: could not index document: \(document.name ?? "")")
    }
}
/// This is the corresponding IndexedEntity with the attributeSet
@available(iOS 18, *)
extension DocumentEntity {
    var attributeSet: CSSearchableItemAttributeSet {
        let attributeSet = defaultAttributeSet
        attributeSet.title = title
        attributeSet.displayName = title
        attributeSet.textContent = docContent
        attributeSet.thumbnailData = thumbnailData
        attributeSet.kind = "document"
        attributeSet.creator = Constants.APP_NAME
        return attributeSet
    }
}
How can I get more than just the displayName indexed? Thanks :-)
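For comparison, one alternative sometimes suggested (a sketch based on my recollection of the WWDC24 App Intents material, so treat the associateAppEntity call and the property names as assumptions to verify) is to build the CSSearchableItem yourself, so that textContent is explicitly part of the attribute set, and then associate the entity with that item:

// Sketch only: needs import CoreSpotlight (and UniformTypeIdentifiers for .text).
// Assumes DocumentEntity has an `id` usable as a unique identifier and the same
// title/docContent properties used above; adjust names to your model.
@available(iOS 18, *)
static func indexViaSearchableItem(entity: DocumentEntity) async {
    let attributes = CSSearchableItemAttributeSet(contentType: .text)
    attributes.title = entity.title
    attributes.textContent = entity.docContent
    let item = CSSearchableItem(uniqueIdentifier: String(describing: entity.id),
                                domainIdentifier: "documents",
                                attributeSet: attributes)
    item.associateAppEntity(entity, priority: 1)
    do {
        try await CSSearchableIndex.default().indexSearchableItems([item])
    } catch {
        DLog("Spotlight: could not index entity \(entity.title)")
    }
}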
AppleScript for the Music app no longer supports the current track event. Before macOS Tahoe, running the following script in Script Editor would return the current track information:
tell application "Music"
return name of current track
end tell
However, when I run this script on a device with macOS 26 Tahoe, I receive this error:
"Result: error "Music got an error: Can’t get name of current track." number -1728 from name of current track”
I've tested this extensively, and here are my findings:
Going to the “songs” tab and playing something from there makes everything work.
Playing any song directly will make it work with current track UNLESS this song is NOT in your Music library (either added through Apple Music or uploaded).
If you play a song not in your library, current track is not updated even if you clicked on it specifically.
Playing an album (in your library obviously) makes all the tracks within it appear in current track until autoplay takes over.
Any autoplayed track won’t appear in current track even if in your library (unless: see the last bulletpoint)
Music played through the “songs” tab all appear in current track even if autoplay kicks in. I assume this is because this tab is an iTunes legacy (visually and under the hood) and doesn’t use the modern autoplay. This tab also won’t play non-library songs unlike the “albums” tab which seems to use the correct autoplay and suffers the same symptoms as the “recently added”, “home”, “radio”, etc… tabs.
Is this a bug, or has Apple simply deprecated this functionality?
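For anyone hitting the same error, a minimal defensive sketch (it doesn't answer the bug-versus-deprecation question, it just keeps the script from failing outright) is to trap the -1728 error:

tell application "Music"
	try
		return name of current track
	on error errMsg number errNum
		return "No accessible current track (error " & errNum & ")"
	end try
end tell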
I wrote an AppleScript that takes a bunch of scanned jpegs with systematically named filenames and transfers information from the filename into the date and time fields. That all works fine, but I've got many more scans to do and I'd like to augment the script to include a progress meter because it takes a long time to run on e.g. 1000 photos.
I've found basic progress meter examples online that involve commands like:
set progress total steps to theImageCount
set progress completed steps to 0
set progress description to "Processing Images..."
set progress additional description to "Preparing to process."
and they run OK in a separate dummy test case. However, I'm getting syntax errors for such commands in my renaming script because (I think) they're inside a
tell application "Photos"
wrapper, and it looks like Photos doesn't like those commands.
A progress meter (in any AppleScript) should be a straightforward thing, i.e. I can clearly define a total number of steps and I can clearly define the step number I'm currently on. I just want to display a simple progress bar while the script runs.
I'd even be OK with just implementing something like:
display dialog "blah blah"
but that needs to be manually dismissed with each iteration of the loop, so that's no good. I also tried:
display notification "blah blah"
but that yields hundreds of notification boxes at the top right of my screen, so that's also impractical.
I was even thinking maybe I could call some generic system progress meter with all the right variables via a "do shell script" command (although I have no idea how to do that). Something surely must be possible, but I just can't figure it out :-(. Could some kind soul please help me out? Thanks.
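For what it's worth, here is a minimal sketch of one way this is often structured: the progress statements stay at the script's top level, outside the Photos tell block, and only the per-image work is wrapped in the tell (theImageList and the renaming work are placeholders for your existing code):

set theImageList to {} -- your images here
set theImageCount to count of theImageList
set progress total steps to theImageCount
set progress completed steps to 0
set progress description to "Processing Images..."
set progress additional description to "Preparing to process."
repeat with i from 1 to theImageCount
	set thisImage to item i of theImageList
	tell application "Photos"
		-- your existing date/time update for thisImage goes here
	end tell
	set progress completed steps to i
	set progress additional description to "Image " & i & " of " & theImageCount
end repeat

When the script is run in Script Editor or saved as an applet, these progress properties drive the built-in progress display, so nothing has to be dismissed by hand.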
I need my application to copy some files, but using Finder. Now, I know all different methods and options to programmatically copy files using various APIs, but that's not the point here. I specifically need to use Finder for the purpose, so please, let's avoid eventual suggestions mentioning other ways to copy files.
My first thought was to use the simplest approach: execute an AppleScript script using NSUserAppleScriptTask. That turned out not to be ideal, though. It works fine unless there are already files with the same names at the copying destination. In that case, either the script execution ends with an error, reporting already existing files at the destination, or the existing files can simply be overwritten by adding the "with overwrite" option to the duplicate command in the script.
What I need is behaviour just like when Finder is used from the UI (drag and drop, copy/paste…): if there are existing files with the same names at the destination, Finder should offer a "resolution panel", asking the user to "stop", "replace", "don't replace", "keep both" or "merge" (the latter in the case of conflicting folders). So I came to suspect that I could achieve such behaviour by using Apple Events directly and passing the kAEAlwaysInteract | kAECanSwitchLayer options to AESendMessage(). However, I can't figure out how to construct the appropriate NSAppleEventDescriptor (or old-style Carbon AppleEvent) objects and instruct Finder to copy the files.
This is how far I've come, where srcFiles are the URLs of the source files (to be copied) and dstFolder is the URL of the destination folder (to be copied into):
NSRunningApplication *finder = [[NSRunningApplication runningApplicationsWithBundleIdentifier:@"com.apple.finder"] firstObject];
if (!finder)
{
    NSLog(@"Finder is not running.");
    return;
}

NSAppleEventDescriptor *finderDescriptor = [NSAppleEventDescriptor descriptorWithBundleIdentifier:[finder bundleIdentifier]];
NSAppleEventDescriptor *dstDescriptor = [NSAppleEventDescriptor descriptorWithString:[dstFolder path]];

NSAppleEventDescriptor *srcDescriptor = [NSAppleEventDescriptor listDescriptor];
for (NSURL *url in srcFiles)
{
    NSAppleEventDescriptor *fileDescriptor = [NSAppleEventDescriptor descriptorWithString:[url path]];
    [srcDescriptor insertDescriptor:fileDescriptor atIndex:([srcDescriptor numberOfItems] + 1)];
}

NSAppleEventDescriptor *event = [NSAppleEventDescriptor appleEventWithEventClass:kAECoreSuite
                                                                          eventID:kAEClone
                                                                 targetDescriptor:finderDescriptor
                                                                         returnID:kAutoGenerateReturnID
                                                                    transactionID:kAnyTransactionID];
[event setParamDescriptor:srcDescriptor forKeyword:keyDirectObject];
[event setParamDescriptor:dstDescriptor forKeyword:keyAETarget];

NSError *error;
NSAppleEventDescriptor *result = [event sendEventWithOptions:(NSAppleEventSendAlwaysInteract | NSAppleEventSendCanSwitchLayer) timeout:10.0 error:&error];
The code above executes without any error. The final result descriptor is a null descriptor ([NSAppleEventDescriptor nullDescriptor]) and no error is returned (by reference). However, nothing happens: Finder remains silent, and the application never triggers the macOS/TCC prompt for permission to "automate Finder".
I wonder whether the approach above is correct and whether I'm passing the correct parameters in all of these calls. I'm especially interested in whether keyAETarget is the right keyword in [event setParamDescriptor:dstDescriptor forKeyword:keyAETarget], since that one looks the most suspicious to me. I'd really appreciate it if anyone could help me with this.
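For comparison, here is a minimal sketch of the same event built with file-URL descriptors instead of plain path strings, and with the destination passed under keyAEInsertHere rather than keyAETarget. Both of those choices are assumptions on my part rather than anything verified against Finder's scripting dictionary:

// Sketch only: assumes keyAEInsertHere ('insh') is the keyword Finder expects for the
// destination of a kAEClone event, and that typeFileURL descriptors are acceptable as
// the direct object. finderDescriptor, srcFiles and dstFolder are the same as above.
NSAppleEventDescriptor *srcList = [NSAppleEventDescriptor listDescriptor];
for (NSURL *url in srcFiles)
{
    [srcList insertDescriptor:[NSAppleEventDescriptor descriptorWithFileURL:url]
                      atIndex:([srcList numberOfItems] + 1)];
}
NSAppleEventDescriptor *dstDesc = [NSAppleEventDescriptor descriptorWithFileURL:dstFolder];

NSAppleEventDescriptor *cloneEvent = [NSAppleEventDescriptor appleEventWithEventClass:kAECoreSuite
                                                                              eventID:kAEClone
                                                                     targetDescriptor:finderDescriptor
                                                                             returnID:kAutoGenerateReturnID
                                                                        transactionID:kAnyTransactionID];
[cloneEvent setParamDescriptor:srcList forKeyword:keyDirectObject];
[cloneEvent setParamDescriptor:dstDesc forKeyword:keyAEInsertHere];

NSError *cloneError = nil;
NSAppleEventDescriptor *cloneResult = [cloneEvent sendEventWithOptions:(NSAppleEventSendAlwaysInteract | NSAppleEventSendCanSwitchLayer)
                                                               timeout:10.0
                                                                 error:&cloneError];

It may also be worth double-checking that the app's Info.plist contains an NSAppleEventsUsageDescription string and, if the app is sandboxed, that it carries the com.apple.security.automation.apple-events entitlement; without those, the event can be dropped silently and no TCC prompt ever appears.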
I'd also like to point out that I tried the same approach outlined above with the old-style Carbon AppleEvent API, using AECreateDesc(), AECreateAppleEvent(), AEPutParamDesc() and AESendMessage()… All API calls succeeded, returning noErr, but again nothing happened: Finder remained silent and there was no macOS/TCC prompt for permission to "automate Finder".
Any help is highly appreciated, thanks!
-- Dragan
Is VIP an object contained in the Mail or Contacts apps/objects?
Or is it just a property somewhere?
Or is it an independent object?
Where is the VIP object? Is it still managed in the Mail app, or is it just hidden because the design is changing at Apple?
The following AppleScript is failing:
tell application "Mail"
set theVIPs to VIPs
with the error:
error "Die Variable „VIPs“ ist nicht definiert." number -2753 from "VIPs" (The variable "VIPs" is not defined.".
In the Mail.sdef documentation and in the Contacts.sdef documentation I cannot find any mention of a VIP object or a VIP property.
Tags: Contacts, AppleScript, Mail Extensions
I'm looking for any method to quickly flatten a PDF without opening Preview and without installing third-party software. Any ideas?
Save as PDF in Preview works, but I don't want to have to open Preview each time I need to do this.
The Create PDF action that appears in Finder when you select 2 or more PDFs does flatten them, but it requires me to select 2 or more files, and I generally don't want to combine PDFs; I simply wish to flatten a single PDF.
Most Automator and Shortcuts options I am aware of do not flatten PDFs, and in some cases, strip out form field data from PDFs.
Hi all,
I'm developing a sandboxed macOS app that generates and compiles AppleScript files to automate tasks in Pages (and other iWork apps). The app creates an AppleScript file and writes it to NSApplicationScriptsDirectory (i.e., ~/Library/Application Scripts/com.example.app), then compiles and executes it via NSUserAppleScriptTask.
On macOS Ventura, however, I get the following error in the console when trying to write the file:
[PagesModifier] Error creating or compiling the script: You are not allowed to save the file "PagesModifier_...applescript" in the folder "com.example.app"
Here are my current entitlements:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.application-groups</key>
<array/>
<key>com.apple.security.automation.apple-events</key>
<array>
<string>com.apple.iWork.Pages</string>
<string>com.apple.iWork.Numbers</string>
<string>com.apple.iWork.Keynote</string>
</array>
<key>com.apple.security.files.user-selected.read-write</key>
<true/>
<key>com.apple.security.scripting-targets</key>
<dict>
<key>com.apple.iWork.Keynote</key>
<array>
<string>com.apple.iWork.Keynote</string>
</array>
<key>com.apple.iWork.Numbers</key>
<array>
<string>com.apple.iWork.Numbers</string>
</array>
<key>com.apple.iWork.Pages</key>
<array>
<string>com.apple.iWork.Pages</string>
</array>
</dict>
<key>com.apple.security.temporary-exception.apple-events</key>
<array>
<string>com.apple.iWork.Pages</string>
<string>com.apple.iWork.Numbers</string>
<string>com.apple.iWork.Keynote</string>
</array>
<key>com.apple.security.temporary-exception.files.home-relative-path.read-write</key>
<array>
<string>Library/Application Scripts/com.example.app</string>
</array>
</dict>
</plist>
I suspect the issue might be due to sandbox restrictions on dynamically creating or modifying the Application Scripts directory on Ventura. Has anyone experienced something similar or have any suggestions on how to work around this?
Thanks in advance for your help!
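One pattern that is sometimes suggested for this situation (a sketch, assuming a Swift code base and that asking the user once is acceptable): a sandboxed app only gets read access to its Application Scripts folder by default, so the app can present an NSOpenPanel pre-pointed at that folder and let the user grant write access through the user-selected read-write entitlement before installing the script:

// Sketch only: prompts the user to grant access to ~/Library/Application Scripts/<bundle id>,
// which a sandboxed app cannot write to on its own. The function name is hypothetical.
import AppKit

func promptForApplicationScriptsFolder() -> URL? {
    let scriptsDir = try? FileManager.default.url(for: .applicationScriptsDirectory,
                                                   in: .userDomainMask,
                                                   appropriateFor: nil,
                                                   create: false)
    let panel = NSOpenPanel()
    panel.directoryURL = scriptsDir
    panel.canChooseDirectories = true
    panel.canChooseFiles = false
    panel.allowsMultipleSelection = false
    panel.prompt = "Select Script Folder"
    panel.message = "Please select the Application Scripts folder so the helper script can be installed."
    return panel.runModal() == .OK ? panel.url : nil
}

// Usage: only write the generated script after the user has granted access, e.g.
// if let folder = promptForApplicationScriptsFolder() { /* write the .applescript file there */ }

A security-scoped bookmark can then be stored so the access survives relaunches.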
Tags: Entitlements, Scripting, AppleScript, App Sandbox
I am trying to set up a Folder Action in Automator so that screenshots on the Desktop are automatically imported into my Photos app.
Unfortunately, the automation is not working. When I have a screenshot on the Desktop and open Automator to trigger the workflow manually, I receive the error:
The action "Import Files into Photos" encountered an error: "Import Files into Photos script failed"
The log further says: The action "Import Files into Photos" was not supplied with the required data.
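That last message is consistent with the workflow receiving no input when it is run by hand inside Automator; a Folder Action only gets its input files when the attached folder actually receives new items. As an alternative sketch (assuming the Photos import command with its skip check duplicates parameter, and that the Desktop is the attached folder), the same thing can be done with a plain AppleScript Folder Action:

on adding folder items to thisFolder after receiving theNewItems
	tell application "Photos"
		import theNewItems skip check duplicates true
	end tell
end adding folder items to

Saved into ~/Library/Scripts/Folder Action Scripts and attached to the Desktop via Folder Actions Setup, this runs whenever new screenshots land in the folder.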
I'm developing an iOS app that uses Siri Shortcuts to enhance the user experience. Currently, I have implemented functionality that allows users to perform certain actions via Siri Shortcuts.
My team wants to improve the user experience by giving an instructional audio prompt (e.g., "say 'hey Siri [action name]' if you want to [perform action]") to users. However, we want to ensure this prompt is only played when the user has already enabled Siri Shortcuts.
The challenge is determining whether Siri Shortcuts are properly enabled before suggesting their use. We want to avoid situations where users follow our audio instructions to use Siri, only to discover that Siri Shortcuts aren't properly configured on their device.
Since we're using Siri Shortcuts for this feature, the standard requestSiriAuthorization(_:) method doesn't apply to our use case (the documentation at https://developer.apple.com/documentation/sirikit/requesting-authorization-to-use-siri says "You don't need to request authorization if your app only supports actions in the Shortcuts app.").
What is the recommended approach to verify that Siri Shortcuts are properly enabled before prompting users to interact with them? Is there a reliable way to check this status (essentially the Boolean value of the toggle shown in the screenshot below) programmatically?
Thank you for your assistance.
Hello, I made myself an app to track my expenses.
The most important event is when I make a purchase via Apple Wallet.
What happens is that sometimes the values for Merchant and Amount are:
Merchant = " "
Amount = 0.0
Has anyone experienced this, and is there something I can do about it? I was thinking that connection speed and service quality might sometimes have an impact.
Does anyone here know anything about this topic?
I need to have a dynamic parameter for Shortcuts, so a person can say something like
Hey Siri, order a pizza with
The parameter code in the appIntent is
@Parameter(title: "Title")
var itemName: String
In the Shortcut I use:
AppShortcut(
    intent: NewItemIntent(),
    phrases: [
        "order \(\.$itemName) using \(.applicationName)"
    ],
    shortTitle: "Order Item",
    systemImageName: "sparkles"
)
When I say "Hey Siri, order pizza using ***", pizza should be picked up by the parameter and handed off to the App Intent. However, Siri ignores the spoken parameter and instead puts up a dialog asking "What's the name?" I can then say "pizza" and it works. How can I pick up the parameter without having to go through that second step, as the code implies should be possible?
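As far as I understand it (treat this as an assumption to verify against the App Intents documentation), a parameter embedded in an App Shortcut phrase has to be an AppEnum or AppEntity so that Siri can match the spoken value at invocation time; a free-form String parameter can only be collected through a follow-up prompt. Here is a sketch of the enum-backed variant, where MenuItem is a hypothetical type invented for illustration:

// Sketch only: the fixed set of MenuItem cases is what lets Siri recognize the
// value inside the invocation phrase itself.
import AppIntents

enum MenuItem: String, AppEnum {
    case pizza, salad, pasta

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Menu Item")
    static var caseDisplayRepresentations: [MenuItem: DisplayRepresentation] = [
        .pizza: DisplayRepresentation(title: "Pizza"),
        .salad: DisplayRepresentation(title: "Salad"),
        .pasta: DisplayRepresentation(title: "Pasta")
    ]
}

struct NewItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Item"

    @Parameter(title: "Item")
    var item: MenuItem

    func perform() async throws -> some IntentResult {
        // place the order for `item` here
        return .result()
    }
}

The App Shortcut phrase would then reference the enum-typed parameter, e.g. "order \(\.$item) using \(.applicationName)".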
I added NSUserActivityTypes -> INSendMessageIntent to Info.plist. Why does starting the app on watchOS hand off to Outlook on the Mac? On iOS the behaviour is normal.
I'm making a PoC for Visual Intelligence integration in iOS. It's a very simple setup... the extension will always reply with a couple of static "results" just so I can verify that it's working and figure out how to handle receiving app activation from the Intents framework.
The app seems to be registering the VI intent correctly, because I see my app's name in the tab list of providers for search results, but when I select my app, I always get no results. I looked at the console at the moment I select my app and saw this error:
error 16:37:09.433057-0600 duetexpertd [com.hairlessape.VisualIntelligenceProvider.VIAppIntent] Unable to get connection interface: Error Domain=LNConnectionErrorDomain Code=1100 "Unable to locate `com.hairlessape.VisualIntelligenceProvider.VIAppIntent` for the `com.apple.appintents-extension` extension point"
No amount of web searching or AI interrogation has produced any headway here. I've checked the build product, and I can see the VIAppIntent.appex file in the Extensions folder of my app bundle.
I've triple checked the bundle identifiers, code file membership, installed the app from an IPA, restarted my phone, etc. I cannot get my intent to be queried and it's very frustrating.
I've put the PoC project on Github: https://github.com/JoshuaSullivan/VisualSearchForVI
I have an existing iOS app with shortcuts support, and I am trying to bring the same shortcuts to my Mac app on macOS Monterey. I have added the same intents definition file to my Mac target, added "Intents eligible for in-app handling" to my Info.plist file along with the intent names, and made sure all the intent handling code is part of both the iOS and Mac targets. Still, when I build and run the app on macOS Monterey, the new shortcuts don't show up in the shortcut editor at all. I've tried closing and restarting the Shortcuts app, but no luck. The build logs do show the intents being built, but they're just not showing up in the Shortcuts app.
I tried 'donating' one of the intents in my Mac app code, but got an error:
Cannot donate interaction with intent that has no valid shortcut types
Not sure what to try to make it work.
Thanks.
Tags: wwdc21-10232, App Intents, Shortcuts
Hi!
I have an AppShortcutsProvider that has been marked @available(iOS 17.0, *), because all Intents used within it are only available in iOS 17. So the static var appShortcuts should only be accessible in iOS 17, obviously.
Now this code has been released and works fine for iOS 17 users. But I just noticed that around 150 iOS 16 users have crashed in the static var appShortcuts, or more specifically, the defaultQuery of one of the Intents. This should not be possible, as neither the AppShortcutsProvider, nor the Intent is exposed to iOS 16, they are marked as @available(iOS 17.0, *). Any idea why this could happen?
Here is the crash log:
Crashed: com.apple.root.user-initiated-qos.cooperative
0 libswiftCore.dylib 0x3e178c swift::ResolveAsSymbolicReference::operator()(swift::Demangle::__runtime::SymbolicReferenceKind, swift::Demangle::__runtime::Directness, int, void const*)
1 libswiftCore.dylib 0x40bec4 swift::Demangle::__runtime::Demangler::demangleSymbolicReference(unsigned char)
2 libswiftCore.dylib 0x408254 swift::Demangle::__runtime::Demangler::demangleType(__swift::__runtime::llvm::StringRef, std::__1::function<swift::Demangle::__runtime::Node* (swift::Demangle::__runtime::SymbolicReferenceKind, swift::Demangle::__runtime::Directness, int, void const*)>)
3 libswiftCore.dylib 0x3e9680 swift_getTypeByMangledNameImpl(swift::MetadataRequest, __swift::__runtime::llvm::StringRef, void const* const*, std::__1::function<swift::TargetMetadata<swift::InProcess> const* (unsigned int, unsigned int)>, std::__1::function<swift::TargetWitnessTable<swift::InProcess> const* (swift::TargetMetadata<swift::InProcess> const*, unsigned int)>)
4 libswiftCore.dylib 0x3e4d9c swift_getTypeByMangledName
5 libswiftCore.dylib 0x3e50ac swift_getTypeByMangledNameInContext
6 MyApp 0x3f6b8c __swift_instantiateConcreteTypeFromMangledName (<compiler-generated>)
7 MyApp 0x640e3c one-time initialization function for defaultQuery + 146 (IntentAction.swift:146)
8 libdispatch.dylib 0x3eac _dispatch_client_callout
9 libdispatch.dylib 0x56ec _dispatch_once_callout
10 MyApp 0x64109c protocol witness for static AppEntity.defaultQuery.getter in conformance IntentAction + 124 (IntentAction.swift:124)
11 AppIntents 0x15c760 _swift_stdlib_malloc_size
12 AppIntents 0x19dfd4 __swift_destroy_boxed_opaque_existential_1Tm
13 MyApp 0x52dadc specialized ActionIntent.init(action:) + 41 (ActionIntent.swift:41)
14 MyApp 0x52df80 specialized static MyShortcuts.appShortcuts.getter + 17 (MyShortcuts.swift:17)
How can I register more than 10 shortcuts?
We created a .scpt script that is run before an operation. The script performs a login via Safari. It almost always works, but sometimes it doesn't. If we are logged on to the machine, the script works. Do you have any idea what the problem could be? Thanks.
I made a set of Siri Shortcuts in my app with the AppShortcutsProvider, and they each have a set of phrases.
I can activate the shortcuts via Siri phrases or Spotlight search on iOS 18+, but not on iOS 17 or earlier.
I've checked the documentation and see that AppShortcutsProvider is supported from iOS 16+, so I don't understand why I can't view the shortcuts in Spotlight or activate them with Siri unless it's at least iOS 18.
Any thoughts?
Tags: Spotlight, Siri and Voice, Shortcuts, App Intents