Hi folks,
I've got some music that I want playing in iTunes all the time on an older system, but it sometimes stops. I tried writing an AppleScript to check the player and start the music/playlist again when it stops, but I keep getting a timeout error.
This is the AppleScript:
repeat
    tell application "iTunes"
        if player state is paused then
            tell application "iTunes" to play
        end if
        delay 30
    end tell
end repeat
I get this error:
AppleEvent timed out.
iTunes got an error: AppleEvent timed out. (-1712)
I can't figure out why I'm getting a timeout error... anyone have any ideas?
Automation & Scripting
Learn about scripting languages and automation frameworks available on the platform to automate repetitive tasks.
Instead of a .sdef file (or one of the older formats) prepared in advance and stored in an app's bundle, I wonder whether the scripting dictionary can be set up shortly after the app launches. I've seen a related API, with a note that the scripting-file reader ultimately calls that API. I grok about 90% of that API, but I'm not sure whether this is possible.
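For what it's worth, here is a minimal sketch of what runtime registration might look like, assuming the API in question is NSScriptSuiteRegistry and that its Objective-C methods sharedScriptSuiteRegistry, loadSuitesFromBundle: and loadSuiteWithDictionary:bundle: import into Swift as written below; the method names and the suite dictionary are assumptions, not a confirmed recipe.

import Foundation

// Hedged sketch: register scripting suites programmatically shortly after launch
// instead of relying only on resources compiled into the bundle ahead of time.
// Assumes the Swift imports of the Objective-C selectors named above.
func registerScriptSuitesAtRuntime() {
    let registry = NSScriptSuiteRegistry.shared()

    // Load any .scriptSuite / .scriptTerminology resources found in a bundle.
    registry.loadSuites(from: Bundle.main)

    // Or hand the registry a suite dictionary built in code. The keys shown here
    // are illustrative only, not a complete or verified suite definition.
    registry.loadSuite(
        with: ["Name": "MyAppSuite", "AppleEventCode": "MyAp"],
        bundle: Bundle.main
    )
}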
Hi, I'm trying to create a simple app that has two pressable buttons that turn a spa mode on and off within another app.
So far I have the UI figured out and the app works. The only issue is that right now, when I press "spa on", my app opens Siri Shortcuts and enables the shortcut there. Is there a way to make it switch back to my app after the action is done? Or is there another way to open the other app, navigate to the button within that app, and enable it behind my app?
I want to create automations for when the first person arrives home.
When setting up the automation on the owner's device, everything seems to be correct, yet the automation doesn't work properly. My girlfriend is listed as an admin.
When she has a look at the automation, the location of my phone is unknown.
The issue is that this leads to the following behaviour: when I come home, the automation checks whether my girlfriend is home. All good. But when my girlfriend comes home, the automation doesn't check where I am and directly executes the automation.
My app monitors users' heartbeats, and if a critical reading is noticed it auto-dials 911 for emergency and ambulance help.
I was under the impression that auto-dialing may not be permitted or possible on the platform.
Can anyone confirm, and provide any additional guidance on whether it is possible in the newer SDK/API stack or using any third-party service?
Thank you in advance!!
Topic:
App & System Services
SubTopic:
Automation & Scripting
Tags:
Mobile Core Services
Watch Connectivity
WatchKit
How to access comments and their associated text in iWork documents via AppleScript/ScriptingBridge?
Hi all,
I'm developing a macOS application with Xcode that interacts with iWork documents (Pages, Numbers, Keynote) using ScriptingBridge (and possibly AppleScript). I started with Pages, on the assumption that if it works for Pages it will likely be similar for Numbers and Keynote.
While I can successfully access and modify the main body text (e.g. the “body text” property in a Pages document), I’m having major difficulties accessing the comments (or annotations) within these documents.
There are many aspects to this, but here is what I'm trying to achieve right now:
For a Pages document, I need to scan the document and extract, for each comment:
The content of the comment (i.e. the comment’s text).
The text that is being commented on.
For Numbers, similarly, I need to retrieve the commented cell’s content and the associated comment.
For Keynote, the same as for Pages, except that I do manage to get the presenter notes.
Once done, I could replace the content accordingly.
What I’ve tried:
Using AppleScript commands such as:
every comment of document "Test"
Accessing properties like content or range of a comment.
Attempting various syntaxes (including using class specifiers) to force AppleScript to recognize comments.
Using ScriptingBridge in my Swift code, but I couldn’t find any mapping for a “comment” object in the Pages dictionary.
However, all these attempts result in errors such as “cannot convert …” or “this class is not key value coding-compliant for the key …”, which leads me to believe that the iWork scripting dictionaries may not expose comments (or annotations) in a scriptable way.
Questions:
Is there a supported way to access the comments (and the associated commented text) in an iWork document via AppleScript or ScriptingBridge?
If so, what is the proper syntax or property name to use? (For example, should I be looking for a class named “comment”, “annotation”, or perhaps something else?)
If direct access via AppleScript/ScriptingBridge is not possible, what alternative approaches would you recommend for programmatically extracting comment data from iWork documents?
I apologize if my post isn't clear; it is translated from French. Any insights or examples would be greatly appreciated. Thank you!
Here is an AppleScript script by Normen Hansen that makes it possible to double-click files in Finder so that they open in Vim: https://github.com/normen/vim-macos-scripts/blob/master/open-file-in-vim.applescript (it must be used from Automator).
I'm trying to simplify it and also make it usable without Automator. To use my own version, you only need to save it as an application instead, like this:
osacompile -o open-with-vim.app open-with-vim.applescript
and then copy it to /Applications and set your .txt files to be opened using this app.
Here it is; it almost works:
-- https://github.com/normen/vim-macos-scripts/blob/master/open-file-in-vim.applescript
-- opens Vim with file or file list
-- sets current folder to parent folder of first file
on open theFiles
    set command to {}
    if input is null or input is {} or ((item 1 in input) as string) is "" then
        -- no files, open vim without parameters
        set end of command to "vim;exit"
    else
        set firstFile to (item 1 of theFiles) as string
        tell application "Finder" to set pathFolderParent to quoted form of (POSIX path of ((folder of item firstFile) as string))
        set end of command to "cd" & space & (pathFolderParent as string)
        set end of command to ";hx" -- e.g. vi or any other command-line text editor
        set fileList to {}
        repeat with i from 1 to (theFiles count)
            set end of fileList to space & quoted form of (POSIX path of (item i of theFiles as string))
        end repeat
        set end of command to (fileList as string)
        set end of command to ";exit" -- if Terminal > Settings > Profiles > Shell > When the shell exits != Don't close the window
    end if
    set command to command as string
    set myTab to null
    tell application "Terminal"
        if it is not running then
            set myTab to (do script command in window 1)
        else
            set myTab to (do script command)
        end if
        activate
    end tell
    return
end open
The only problem is the if block at the very beginning:
if input is null or input is {} or ((item 1 in input) as string) is "" then
What is wrong here? How can I make it work? (Assume it already works fine when used via Automator.)
I'll ask Siri: "What is the weather?"
and will get a valid response.
I'll ask Siri to execute a shortcut my app has created.
I get "What is the order?" (a phrase that appears nowhere in my app).
I'll repeat the question about the weather.
I now get "What is the order?"
***?
I've created an OpenIntent with an AppEntity as the target.
Now I want to receive this entity when the intent is executed and the app is opened. How can I do that?
I can't find any information about it, and there is no method for this in the AppDelegate.
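In case it helps to see the shape of one possible approach (not necessarily the intended one): below is a minimal sketch in which the intent's perform() records the opened entity in a shared observable store that the app's UI watches, instead of going through the AppDelegate. BookEntity, BookQuery, and OpenTargetStore are hypothetical stand-ins for the poster's own types.

import AppIntents
import Combine

// Hypothetical entity standing in for the poster's own AppEntity.
struct BookEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Book")
    static var defaultQuery = BookQuery()

    var id: String
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

struct BookQuery: EntityQuery {
    func entities(for identifiers: [BookEntity.ID]) async throws -> [BookEntity] {
        identifiers.map { BookEntity(id: $0, title: "Book \($0)") }
    }
    func suggestedEntities() async throws -> [BookEntity] {
        [BookEntity(id: "1", title: "Sample")]
    }
}

// One way to hand the opened entity to the running app: a shared store that the
// UI (or a coordinator) observes. The name and mechanism are hypothetical.
@MainActor
final class OpenTargetStore: ObservableObject {
    static let shared = OpenTargetStore()
    @Published var lastOpenedID: BookEntity.ID?
}

struct OpenBookIntent: OpenIntent {
    static var title: LocalizedStringResource = "Open Book"

    // OpenIntent expects the entity parameter to be named `target`.
    @Parameter(title: "Book")
    var target: BookEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        // The system opens the app for an OpenIntent; stash the target so the
        // app can navigate to it once it is frontmost.
        OpenTargetStore.shared.lastOpenedID = target.id
        return .result()
    }
}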
We have a watchOS app that provides many configurable widgets. Those widgets are configured and installed with the help of an AppIntent:
public struct RectComplAppIntent: AppIntent, WidgetConfigurationIntent, CustomIntentMigratedAppIntent {
    @Parameter(title: "Style")
    var style: String?
    ....
}
However, when I print the WidgetInfos from getCurrentConfigurations(), I sometimes get nil for the configuration. At the same time, the widgets are not loaded. Exact steps:
The user installs the pre-configured .watchface.
The complications are not loaded, since the configuration is missing. I print getCurrentConfigurations() and get entries like this:
WidgetInfo:
- configuration: nil
- widgetConfigurationIntent: nil
- family: accessoryRectangular
- kind: Rectangle
Then the user force-touches a face and opens editing mode. Back in the watch app, I print the infos again:
WidgetInfo:
- configuration: <INIntent: 0x780d290> {
style = vol1Logo;
}
- widgetConfigurationIntent: nil
- family: accessoryRectangular
- kind: Rectangle
Suddenly the intent appears with the correct style and the complications start to show up.
Why do you think this happens? Why, right after the .watchface install, do all the WidgetInfos have a nil intent (configuration)? What makes them load later?
You can try this face yourself: https://cdn.watchfaces.co/watchfaces/glance-minimalist.watchface
Topic:
App & System Services
SubTopic:
Automation & Scripting
Tags:
WatchKit
watchOS
WidgetKit
App Intents
In my case, when two functions that each start a Live Activity (not connected to each other) are called from a LiveActivityIntent's perform(), it seems that only one starts.
(The same happens if they are started independently in two Task {} blocks.)
Setting one as the opensIntent and splitting the work off by opening another LiveActivityIntent gives the same result.
Also, every time I tap the intent directly in the Shortcuts app, one activity ends within a matter of seconds, even if both exist for a while.
But if openAppWhenRun is set to true, it seems to work without any problems.
I would appreciate any tip on how to fix this problem.
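For clarity, here is a reduced sketch of the setup described above: a LiveActivityIntent whose perform() requests two unrelated Live Activities. The attribute types, names, and values are hypothetical stand-ins; the sketch only illustrates the arrangement being discussed, not a known-good or known-bad pattern.

import ActivityKit
import AppIntents

// Hypothetical attributes for two unrelated Live Activities.
struct TimerAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var secondsRemaining: Int
    }
    var name: String
}

struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var stopsRemaining: Int
    }
    var orderID: String
}

// A LiveActivityIntent whose perform() starts both activities.
struct StartBothActivitiesIntent: LiveActivityIntent {
    static var title: LocalizedStringResource = "Start Both Activities"

    func perform() async throws -> some IntentResult {
        // First Live Activity.
        _ = try Activity<TimerAttributes>.request(
            attributes: TimerAttributes(name: "Focus"),
            content: .init(state: .init(secondsRemaining: 1500), staleDate: nil)
        )
        // Second, unrelated Live Activity.
        _ = try Activity<DeliveryAttributes>.request(
            attributes: DeliveryAttributes(orderID: "42"),
            content: .init(state: .init(stopsRemaining: 3), staleDate: nil)
        )
        return .result()
    }
}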
Hi
Here is my problem. I have a large number of portrait pictures in my Adobe Lightroom catalogue which I need to email to the subjects of the portraits.
These people's names and email addresses are stored in the pictures' metadata, and I am able to export the pictures with accompanying text or CSV files containing the names and addresses.
What I want to do is export the pictures in bulk and automagically create and send individual emails for each picture, preferably with the person's name in the salutation at the start of the email.
I have done some googling trying to find a solution to this, and I think I need to use Automator. (The Lightroom option of mailing directly from within the programme isn't going to do the job; each mail would need to be addressed and written by hand.) I have been playing around with it, but can see no way to populate the email address field or add the attachment.
Can anyone help me out, with even just a basic outline of how this could be achieved, please?
I made a set of Siri Shortcuts in my app with AppShortcutsProvider, and they each have a set of phrases.
I can activate the shortcuts via Siri phrases or Spotlight search on iOS 18+, but not on iOS 17 or earlier.
I've checked the documentation and see that AppShortcutsProvider is supported from iOS 16+, so I don't understand why I can't view the shortcuts in Spotlight or activate them with Siri unless the device is on at least iOS 18.
Any thoughts?
Topic:
App & System Services
SubTopic:
Automation & Scripting
Tags:
Spotlight
Siri and Voice
Shortcuts
App Intents
Hi, I want to open an email message with AppleScript. Everything is working correctly, but in the Mail app, instead of focusing on targetMessage, it highlights the email after the target message.
When I use:
tell targetMessage to open
the correct email opens in a new window, but the wrong email is highlighted in the Mail app's message list.
tell application "Mail"
activate
set targetAccount to missing value
repeat with anAccount in every account
if name of anAccount is "AccountName" then
set targetAccount to anAccount
exit repeat
end if
end repeat
if targetAccount is not missing value then
set targetBox to missing value
repeat with aBox in mailboxes of targetAccount
if name of aBox is "MailboxName" then
set targetBox to aBox
exit repeat
end if
end repeat
if targetBox is not missing value then
set targetMessage to missing value
set oneWeekAgo to (current date) - (7 * days)
set filteredMessages to (every message of targetBox whose date received ≥ oneWeekAgo)
repeat with aMessage in filteredMessages
try
if message id of aMessage is "MessageID" then
set targetMessage to aMessage
exit repeat
end if
end try
end repeat
if targetMessage is not missing value then
if (count of message viewers) > 0 then
set mailViewer to message viewer 1
else
set mailViewer to make new message viewer
end if
tell mailViewer
set selected mailboxes to {targetBox}
delay 0.2
set selected messages to {targetMessage}
end tell
return "Found"
else
return "Message Not found"
end if
else
return "Folder Not found"
end if
else
return "Account Not found"
end if
end tell
Why is this behavior happening?
Hi,
In my application I am donating AppIntent instances to the system using the donate() API. I recently came across this article that talks about deleting donations, but it does not mention how to handle AppIntent instances.
I am wondering, when working with dynamic AppIntents (with properties that can change in the future), should I be worried about "outdated" donated AppIntent instances? And if so, how can I delete previously donated AppIntent instances?
I have an AppIntent in one of my applications, and it shows up in Shortcuts. I would like to run this intent/shortcut from my other application without jumping to Shortcuts via a deep link. Is this possible?
I’m trying to develop a widget with a button that triggers an app intent.
I integrated the app intent into my app within a separate app framework. I tested it with Shortcuts and Siri, and it works well: it opens the app on the required screen. However, when I added a button (Button(intent: MyIntent())) to my widget, it doesn't work at all.
The only clue I found is the following message in the Xcode debug console:
“No ConnectionContext found for (some big integer)” when I tap on the widget's button.
However, I see the same message when running it through the Shortcuts app, and in that case, it works fine.
Does anyone know what might be causing this issue?
My Intent:
public struct OpenTextInputIntent: AppIntent {
    public static var title: LocalizedStringResource = "Open text input"
    public static var openAppWhenRun: Bool = true

    @Parameter(title: "Predefined text")
    public var predefinedText: String

    @Dependency private var appCoordinator: AppCoordinatorProtocol

    public init() { }

    public func perform() async throws -> some IntentResult {
        appCoordinator.openAddMessage(predefinedText: predefinedText)
        return .result()
    }
}
My widget's view:
struct SimpleWidgetView: View {
    var entry: SimpleWidgetTimelineProvider.Entry

    var body: some View {
        ZStack(alignment: .leadingTop) {
            button
        }
    }

    private var button: some View {
        Button(intent: OpenTextInputIntent()) {
            Image(systemName: "mic.fill")
                .resizable()
                .aspectRatio(contentMode: .fit)
                .iconFrame()
        }
        .buttonStyle(PlainButtonStyle())
        .foregroundStyle(Color.white)
        .padding(10)
        .background(Circle().fill(Color.accent))
    }
}
Intents Registration in the app target:
struct MyAppPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [FrameworkIntentsPackage.self]
    }
}

struct MyAppShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenTextInputIntent(),
            phrases: ["Add message in \(.applicationName)"],
            shortTitle: "Message input",
            systemImageName: "pencil.circle.fill"
        )
    }
}
What am I missing?
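One detail that sometimes matters with intents defined in a framework (whether it is the cause here is not clear from the post): every target that uses them, including the widget extension, can declare its own AppIntentsPackage that lists the framework's package. A minimal sketch, with a hypothetical type name for the widget-extension side:

import AppIntents

// Hypothetical: declared in the widget-extension target, so the extension's
// App Intents metadata also includes the intents defined in the shared framework
// (mirroring the registration already shown above for the app target).
struct MyWidgetExtensionPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [FrameworkIntentsPackage.self]
    }
}

Separately, since the intent resolves an @Dependency, that dependency would also need to be registered in whichever process runs the intent; whether that is already set up isn't visible in the post.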
I’ve created several shortcuts that tell me the stock price of a given company. The shortcut queries Yahoo Finance using Get Contents of URL, with the URL
https://finance.yahoo.com/quote/TICKER SYMBOL/, for example
https://finance.yahoo.com/quote/PLTR/ for Palantir or
https://finance.yahoo.com/quote/AAPL/ for Apple, etc.
Then it uses a regex to parse out the numbers, which it formats and displays in a notification. Simple. It works great for several stocks, but for some reason it does not work correctly for Palantir: it shows an older “previous close” price. Oddly, when I go to the website myself, it shows me the current stock price.
So for today, Mar 21, https://finance.yahoo.com/quote/PLTR/ shows me $90.96 (correct), but the shortcut, via Get Contents of URL, shows $87.39 (incorrect). This $87.39 price is listed further down the page as the "previous close” price. I don’t get it.
Here is a link to my Palantir shortcut:
https://www.icloud.com/shortcuts/edea6ee0261245f49b078efc74d632dd
Here is a link to my Apple shortcut:
https://www.icloud.com/shortcuts/54a416393203432aa356fe76373e3f8b
So the question is: why does Get Contents of URL return an old stock price when visiting the site myself shows the correct one, and why only for Palantir? I have about six of these shortcuts running correctly; Palantir is the only one that does not work.
I've been banging my head on this one for weeks. Any advice would be much appreciated.
Thank you,
Rob
iOS can record a key, a key video, but there is no key-recording function, which is really bad 👎. I hope you guys will add one. 💪
Topic:
App & System Services
SubTopic:
Automation & Scripting
Hello, experts!
I'm working on a VoIP application that handles audio calls and integrates with CallKit. The problem occurs when attempting to redial a previously made audio call from the system's call history: when I handle the NSUserActivity in the application(_:continue:restorationHandler:) method, it carries an INStartAudioCallIntent instead of the expected INStartCallIntent.
Background
Deprecation Warnings: I'm encountering deprecation warnings when using INStartAudioCallIntent and INStartVideoCallIntent:
'INStartAudioCallIntent' was deprecated in iOS 13.0: INStartAudioCallIntent is deprecated. Please adopt INStartCallIntent instead.
'INStartVideoCallIntent' was deprecated in iOS 13.0: INStartVideoCallIntent is deprecated. Please adopt INStartCallIntent instead.
As a result, I need to migrate to INStartCallIntent, but the issue is that when redialing a call from the system's call history, INStartAudioCallIntent is still the intent that gets delivered.
Working with the deprecated intents: if I use INStartAudioCallIntent or INStartVideoCallIntent, everything works as expected, but I want to adopt INStartCallIntent to align with the current iOS recommendations.
Configuration:
CXProvider Configuration: The CXProvider is configured as follows:
let configuration = CXProviderConfiguration()
configuration.supportsVideo = true
configuration.maximumCallsPerCallGroup = 1
configuration.maximumCallGroups = 1
configuration.supportedHandleTypes = [.generic]
configuration.iconTemplateImageData = UIImage(asset: .callKitLogo)?.pngData()
let provider = CXProvider(configuration: configuration)
Outgoing Call Handle: When making an outgoing call, the CXHandle is created like this:
let handle = CXHandle(type: .generic, value: callId)
Info.plist Configuration: In the info.plist, the following key is defined:
<key>NSUserActivityTypes</key>
<array>
    <string>INStartCallIntent</string>
</array>
Problem:
When trying to redial the audio call from the system's call history, the NSUserActivity received in the application(_:continue:restorationHandler:) method contains an INStartAudioCallIntent instead of an INStartCallIntent. This happens even though INStartCallIntent is listed under NSUserActivityTypes in the Info.plist, and I want to migrate to the newer intent as recommended from iOS 13 on.
Device:
iPhone 13 mini
iOS version 17.6.1
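For reference, a hedged sketch of a continuation handler that accepts both the modern intent and the deprecated one while the system still delivers the old type from call history. The startCall(to:) helper and the use of the handle value are hypothetical; only the intent classes and the delegate signature come from the SDK.

import UIKit
import Intents

final class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        let intent = userActivity.interaction?.intent

        // Prefer the modern intent, but keep a fallback for the deprecated one,
        // which (as described above) is still observed for history redials.
        let handleValue: String?
        if let startCall = intent as? INStartCallIntent {
            handleValue = startCall.contacts?.first?.personHandle?.value
        } else if let audioCall = intent as? INStartAudioCallIntent {
            handleValue = audioCall.contacts?.first?.personHandle?.value
        } else {
            return false
        }

        guard let callID = handleValue else { return false }
        startCall(to: callID)
        return true
    }

    // Hypothetical helper that starts the outgoing call and reports it to CallKit.
    private func startCall(to callID: String) {
        // ... request a CXStartCallAction via CXCallController, etc.
    }
}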
Topic:
App & System Services
SubTopic:
Automation & Scripting
Tags:
Foundation
CallKit
Intents
App Intents