Hello,
I am experiencing an issue with the App Intents framework where a parameter of type [String] (String Array) fails to persist user input in the Shortcuts app action editor.
Issue Description: When adding an item to the String Array parameter in the Shortcuts app action editor, the typed text clears back to empty in under a second. This happens spontaneously while the keyboard is still active, or immediately after typing, making it impossible to enter any values.
Environment:
Xcode Version: 26.2 (17C52)
iOS Version: 26.2.1
Device: iPhone 17
Code Snippet:
import AppIntents
import SwiftUI

struct TestStringArrayIntent: AppIntent {
    static var title: LocalizedStringResource = "Test Array Input Bug"
    static var description: IntentDescription = "Reproduces the issue where String Array input clears automatically."

    // PROBLEM:
    // Input for this parameter vanishes automatically < 1s after typing.
    @Parameter(title: "Test Strings", default: [])
    var strings: [String]

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        return .result(value: "Count: \(strings.count)")
    }
}
Steps to Reproduce:
1. Build and install the app containing the code above.
2. Open the Shortcuts app and create a new shortcut.
3. Add the "Test Array Input Bug" action.
4. Tap the "Test Strings" parameter to add a new item.
5. Type any text (e.g., "Hi").
6. Wait about one second and observe the field.
Observed Behavior: The text field clears itself automatically. The array remains empty ([]).
Expected Behavior: The text should remain in the field and be successfully added to the array.
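As a temporary workaround I am considering the sketch below (my own assumption, not a confirmed fix): a plain String parameter does persist its input, so the intent can accept comma-separated text and split it in perform(). The intent name and the comma-separated convention are hypothetical.
import AppIntents
import Foundation

// Hedged workaround sketch: only the [String] parameter appears affected,
// so a single String parameter is used instead and split in perform().
struct TestStringArrayWorkaroundIntent: AppIntent {
    static var title: LocalizedStringResource = "Test Array Input Workaround"

    @Parameter(title: "Test Strings (comma-separated)", default: "")
    var joinedStrings: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let strings = joinedStrings
            .split(separator: ",")
            .map { $0.trimmingCharacters(in: .whitespaces) }
        return .result(value: "Count: \(strings.count)")
    }
}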
Filed as Feedback: FB21808619
Thank you.
Shortcuts
Help users quickly accomplish tasks related to your app with their voice or with a tap with the Shortcuts API.
Posts under the Shortcuts tag (75 posts):
I have a series of shortcuts that I’ve written that use the “Use Model” action to do various things. For example, I have a shortcut “Clipboard Markdown to Notes” that takes the content of the clipboard, creates a new note in Notes, converts the markdown content to rich text, adds it to the note etc.
One key step is to analyze the markdown content with “Use Model” and generate a short descriptive title for the note.
I use the on-device model for this, but sometimes the content and prompt exceed the context window size and the action fails with an error message to that effect.
In that case, I’d like to either repeat the action using the Cloud model, or, if the error was a refusal, to prompt the user to enter a title to use.
I've tried using an IF based on whether the response had any text in it, but that didn't work. No matter what I've tried, I can't seem to find a way to catch the error from Use Model, determine what the error was, and take appropriate action.
Is there a way to do this?
(And by the way, a huge "thank you" to whoever had the idea of making AppIntents visible in Shortcuts and adding the Use Model action — it has made a huge difference already, and it lets us see what Siri will be able to use as well.)
Hey there, I’m wondering if there is a CRUD API for Apple Notes?
I once saw that it's possible through AppleScript on the Mac. But the problem for me is that the app should also run on iOS.
Does someone have an idea how to approach this?
I have a Quick Action which flattens a PDF via AppleScript. The code works extremely well: I right-click in Finder and the PDF is flattened, annotations are burned in, no other applications are opened, and the action completes in less than 2 seconds.
Here is the AppleScript code:
I have a Shortcut which performs several operations on already-flattened PDFs. Presently, 1) I run my Quick Action by right-clicking a PDF in Finder to flatten it, and 2) I then right-click that saved PDF in Finder to run my Shortcut on the now-flattened PDF.
Ideally I'd like to add the AppleScript code which flattens the PDF to the beginning of my Shortcut, for convenience and because I sometimes run my Shortcut having forgotten to flatten the PDF first.
However, I'm finding that the AppleScript code, when placed into a Run AppleScript action in Shortcuts, gives this error message:
The exact same code, when placed into a Run AppleScript action in Automator and run as a Quick Action by right-clicking the PDF in Finder, gives no error and works perfectly.
Does anybody have an explanation (and possible solution) for why this is the case?
Topic: App & System Services > Automation & Scripting
Tags: Swift, Automator, Shortcuts, AppleScript
I am very new to the macOS Shortcuts application. In my opinion, the documentation is sparse and totally inadequate. The internet seems to be the only method of figuring out how to use this application.
I have recently created a shortcut that is working well for me. It has several steps and is too large to fit in the Shortcuts editor window, so I cannot grab a screenshot of it for documentation purposes. I also cannot copy the contents of the editor window and paste them anywhere, such as into a new Note or TextEdit document.
I do think Apple should add a means to create a PDF document of the Shortcuts editor window's contents. I went to the official Apple feedback page to leave comments but, irony of ironies, the Shortcuts app is not listed there!
I have no idea what I am doing at this point but am excited to learn how to use Shortcuts to automate and simplify tasks that I have to perform frequently. Here's hoping the documentation and features for this application will one day be comprehensive, comprehensible, and complete.
Currently, we are developing an all-in-one DualSense utility for macOS. We are exploring how to integrate shortcuts into our app. Our vision is to have the user use the native Shortcuts app to choose the controller buttons that should trigger the shortcut action, such as opening Steam, turning on audio haptics, and more.
As we explore this approach, we want to see whether we need to build the UI in our app to set the triggers or can we do this inside of Shortcuts? Can button presses recorded by our app trigger shortcuts? Can those button inputs be customized inside of Shortcuts or should we develop it into our app? And if we have it in our app, can our app see, select, and trigger shortcuts?
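For the "can our app trigger shortcuts" question, one direction we're considering is sketched below (an assumption on our part, relying on the macOS shortcuts command-line tool; the shortcut name is hypothetical):
import Foundation

// Hedged sketch: macOS ships a `shortcuts` CLI that can run (and list) the
// user's shortcuts, so a controller-button handler could invoke whichever
// shortcut the user selected in our app. `shortcuts list` output could
// similarly populate a picker UI.
func runShortcut(named name: String) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/shortcuts")
    process.arguments = ["run", name] // e.g. a user-created "Open Steam"
    try process.run()
    process.waitUntilExit()
}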
Topic: App & System Services > Automation & Scripting
Tags: macOS, Shortcuts, Intents, App Intents
The following code works perfectly fine in iOS 17, where I can retrieve the desired dependency value through @IntentParameterDependency as expected. However, in iOS 18, addTransaction is always nil.
struct CategoryEntityQuery: EntityStringQuery {
    @Dependency
    private var persistentController: PersistentController

    @IntentParameterDependency<AddTransactionIntent>(
        \.$categoryType
    )
    var addTransaction

    func entities(matching string: String) async throws -> [CategoryEnitity] {
        guard let addTransaction else {
            return []
        }
        // ...
    }

    func entities(for identifiers: [CategoryEnitity.ID]) async throws -> [CategoryEnitity] {
        guard let addTransaction else {
            return []
        }
        // ...
    }

    func suggestedEntities() async throws -> [CategoryEnitity] {
        guard let addTransaction else {
            return []
        }
        // ...
    }
}
Has anyone else encountered the same issue? Any insights or potential workarounds would be greatly appreciated.
iOS: 18.0 (22A3354)
Xcode 16.0 (16A242d)
Hi there,
Does anyone know how to modify this Image compressor Shortcut https://www.icloud.com/shortcuts/e13d8013598f4f33830386a956a163dd so that the image it creates has the original file name + “-pressed”?
E.g. “Image_123” becomes “Image_123-pressed”.
I know of the action ‘Rename file’ but can’t make it work. The shortcut does batch processing of images if that makes any difference.
Any help much appreciated:)
Greetings, and Happy Holidays,
I've been building an on-device AI safety layer called Newton Engine, designed to validate prompts before they reach FoundationModels (or any LLM). Wanted to share v1.3 and get feedback from the community.
The Problem
Current AI safety is post-training — baked into the model, probabilistic, not auditable. When Apple Intelligence ships with FoundationModels, developers will need a way to catch unsafe prompts before inference, with deterministic results they can log and explain.
What Newton Does
Newton validates every prompt pre-inference and returns:
Phase (0/1/7/8/9)
Shape classification
Confidence score
Full audit trace
If validation fails, generation is blocked. If it passes (Phase 9), the prompt proceeds to the model.
v1.3 Detection Categories (14 total)
Jailbreak / prompt injection
Corrosive self-negation ("I hate myself")
Hedged corrosive ("Not saying I'm worthless, but...")
Emotional dependency ("You're the only one who understands")
Third-person manipulation ("If you refuse, you're proving nobody cares")
Logical contradictions ("Prove truth doesn't exist")
Self-referential paradox ("Prove that proof is impossible")
Semantic inversion ("Explain how truth can be false")
Definitional impossibility ("Square circle")
Delegated agency ("Decide for me")
Hallucination-risk prompts ("Cite the 2025 CDC report")
Unbounded recursion ("Repeat forever")
Conditional unbounded ("Until you can't")
Nonsense / low semantic density
Test Results
94.3% catch rate on 35 adversarial test cases (33/35 passed).
Architecture
User Input
↓
[ Newton ] → Validates prompt, assigns Phase
↓
Phase 9? → [ FoundationModels ] → Response
Phase 1/7/8? → Blocked with explanation
Key Properties
Deterministic (same input → same output)
Fully auditable (ValidationTrace on every prompt)
On-device (no network required)
Native Swift / SwiftUI
String Catalog localization (EN/ES/FR)
FoundationModels-ready (#if canImport)
Code Sample — Validation
import Foundation
#if canImport(FoundationModels)
import FoundationModels
#endif

// NewtonGovernor, ValidationTrace, and the result fields below come from
// the Newton package linked at the end of this post.
let governor = NewtonGovernor()
let result = governor.validate(prompt: userInput)

if result.permitted {
    // Phase 9: proceed to FoundationModels (inside an async context).
    let session = LanguageModelSession()
    let response = try await session.respond(to: userInput)
} else {
    // Phase 1/7/8: blocked, with a full audit trace.
    print("Blocked: Phase \(result.phase.rawValue) — \(result.reasoning)")
    print(result.trace.summary) // Full audit trace
}
Questions for the Community
Anyone else building pre-inference validation for FoundationModels?
Thoughts on the Phase system (0/1/7/8/9) vs. simple pass/fail?
Interest in Shape Theory classification for prompt complexity?
Best practices for integrating with LanguageModelSession?
Links
GitHub: https://github.com/jaredlewiswechs/ada-newton
Technical overview: parcri.net
Happy to share more implementation details. Looking for feedback, collaborators, and anyone else thinking about deterministic AI safety on-device.
parcri.net has the link :)
Description
The Shortcut Automation Trigger Transaction frequently times out, ultimately causing the shortcut automation to fail. Please see the attached trace for details.
Additionally, the Trigger is activated even when the Transaction is declined.
Details
In the trace I see the error:
[WFWalletTransactionProvider observeForUpdatesWithInitialTransactionIfNeeded:transactionIdentifier:completion:]_block_invoke Hit timeout waiting for transaction with identifier: <private>, finishing.
Open bug report: FB14035016
Using the AppShortcutsProvider approach with App Intents, I can add at most 10 AppShortcuts; with more than 10, the code fails to compile.
struct MeditationShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartMeditationIntent(),
            phrases: [
                "Start a \(.applicationName)",
                "Begin \(.applicationName)",
                "Meditate with \(.applicationName)",
                "Start a \(.$session) session with \(.applicationName)",
                "Begin a \(.$session) session with \(.applicationName)",
                "Meditate on \(.$session) with \(.applicationName)"
            ]
        )
    }
}
How can I achieve something like the Tesla app does?
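One pattern that may help stay under the limit (a sketch with hypothetical names, not a confirmed solution): fold several related actions into a single parameterized shortcut, so one AppShortcut slot covers many utterances. The \(.$session) phrases above already imply a parameter like this:
import AppIntents

// Hedged sketch: SessionType and its cases are hypothetical. Each AppEnum
// case becomes its own set of utterances under a single AppShortcut slot,
// instead of consuming one of the 10 slots per session kind.
enum SessionType: String, AppEnum {
    case calm, focus, sleep

    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Session"
    static var caseDisplayRepresentations: [SessionType: DisplayRepresentation] = [
        .calm: "Calm", .focus: "Focus", .sleep: "Sleep"
    ]
}

struct StartMeditationIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Meditation"

    @Parameter(title: "Session")
    var session: SessionType

    func perform() async throws -> some IntentResult {
        // Start the selected session kind here.
        return .result()
    }
}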
Is there any way of creating complete Shortcuts automations and bundling them with my app? Specifically, I would like the user to be able to
Take a photo and open it with my app
Or take a screenshot and open it with my app
Of course I could offer a Share extension, but going through the Share menu and selecting my app there is time consuming for the user. I would like the user to be able to configure his or her action button such that it takes a new picture and opens it with my app right away.
I can, of course, offer the respective App Shortcuts and let the user combine them into a pipeline with the Take Screenshot or Take Photo system actions. However, only power users would do this. Hence, I would like to bundle this complete pipeline with my app, such that the user just has to assign his/her Action Button to this pipeline if he/she wants to use this feature.
How to go about this? I was thinking of exporting the shortcut into a file, bundling it with the app as a resource, and offering it via a Share action for the user to install it, or by sharing it on iCloud and adding the iCloud link to the UI of my app. What is the recommended approach?
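For the iCloud-link route, the in-app part could be as small as the sketch below (the URL is a hypothetical placeholder for wherever the shortcut ends up being shared):
import SwiftUI

// Hedged sketch: tapping the link hands off to the Shortcuts app, which
// shows its standard "Add Shortcut" sheet. The URL is a placeholder,
// not a real shared shortcut.
struct InstallShortcutView: View {
    private let shortcutURL = URL(string: "https://www.icloud.com/shortcuts/your-shared-shortcut-id")!

    var body: some View {
        Link("Install the camera-to-app shortcut", destination: shortcutURL)
    }
}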
Hi everyone,
I’m currently experimenting with App Intents and I’m trying to customize the section titles that appear at the top of groups of intents inside the Shortcuts app UI.
For example, in the Phone shortcut, there are built-in sections such as “Call Favorite Contacts” and “Call Recent Contacts” (see screenshot attached).
Apple's own system apps such as Phone, Notes, and FaceTime seem to have fully custom section headers inside Shortcuts, complete with icons.
My question is:
👉 Is there an API available that allows third-party apps to define these titles (or sections) programmatically?
I went through the AppIntents and Shortcuts documentation but couldn’t find anything.
From what I can tell, this might be private / Apple-only behavior, but I’d be happy to know if anybody has found a supported solution or a recommended alternative.
Has anyone dealt with this before?
Thanks!
Mickaël 🇫🇷
Topic: App & System Services > Automation & Scripting
Tags: Siri and Voice, Shortcuts, Intents, App Intents
I've been implementing App Shortcuts in my apps, which are localized for a variety of languages.
The WWDC23 session "Spotlight your app with App Shortcuts" has been extremely helpful in resolving my localized trigger phrase issues, but before I continue filling out all of the trigger phrases for my application, I am concerned about a limitation mentioned in the video and need some additional information about it.
The limitations noted in the video at minute mark 21:26 states that:
Maximum 10 App Shortcuts (OK)
Maximum 1000 trigger phrases...
If I have 1 app and 10 shortcuts, and each shortcut only uses \(.applicationName), that means I can have 100 trigger phrases per shortcut (for the sake of discussion).
What I'm unsure about is whether localized trigger phrases count toward the trigger phrase limit. Essentially, for every language I support, do I have to drop half of all of my trigger phrases to stay under the limit?
At the moment, my app is supporting 40 languages and I would like to know how localization affects the trigger phrase limit.
Thank you!
Topic: App & System Services > Widgets & Live Activities
Tags: Shortcuts, Intents, App Intents
We have a requirement to manage the shortcuts and hotkeys in our application, to make them intuitive, and to fully support multiple languages. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters, and when a purely foreign-language physical or virtual keyboard is the input device, we are unclear how the user would invoke such a hotkey.
Keyboards for some languages have no Latin characters at all, and in those environments managing shortcuts and hotkeys becomes rather difficult. To take a very simple example, the shortcut for printing a page is Command/Control + 'P'. This is an issue on non-English keyboards such as Arabic, where not only is there no letter P, there is also no equivalent phonetic character, since the language itself does not have one.
Also, when we want hotkeys to be customizable by the user, how would the user express which key combination they want for a given action?
Given these conditions, and in order to provide the most comprehensive and optimal experience for users in their own language, what does Apple recommend we do for hotkey/shortcut support in non-Latin languages?
Topic: Accessibility & Inclusion > General
Tags: InputMethodKit, Internationalization, Shortcuts, Localization
I am creating an AppIntent to be used with Shortcuts, and I would like to return a flexible dictionary of values with nested structures. As far as I understand, a custom AppEntity only uses its displayRepresentation to store a title and subtitle, which are LocalizedStringResource values. Although I can convert my dictionary into a string, I found no way in Shortcuts to retrieve its original structure and inspect individual elements in subsequent actions. Is there a way to do this? The direction I have been considering is sketched below.
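A minimal sketch of that direction (all names hypothetical): properties declared with @Property on an AppEntity surface as individual fields that later actions can inspect, unlike a stringified dictionary.
import AppIntents

// Hedged sketch: ServerStatusEntity and its fields are hypothetical.
struct ServerStatusEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Server Status"
    static var defaultQuery = ServerStatusQuery()

    var id: String

    @Property(title: "Host")
    var host: String

    @Property(title: "Latency (ms)")
    var latency: Double

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(host)")
    }
}

struct ServerStatusQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [ServerStatusEntity] { [] }
}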
Thank you in advance
Nick Karanatsios
Hi, new to this forum.
Recently discovered how to share a location in Maps app with my Tesla to automatically start navigating. How cool is that!
Being the nerd that I am, I wrote a shortcut to select a contact and share its address with my Tesla. That way, I don't leave the Maps app in memory using up my battery, and don't have to go to all the trouble of swiping Maps out of memory. JK.
Anyway, when I share the shortcut-selected address with the Tesla, it says "Error this content could not be shared". To me this means the address as shared by the shortcut is not in the same format as when you share it directly from Maps.
So the question is, how can I send a properly formatted location from my shortcut?
Thanks...
The question:
Is there any chance that Apple will integrate Intune SDK into Apple apps such as Mail or Calendar, or create Siri-compatible Intune SDK-integrated versions of Mail and Calendar?
The reason for the question:
My team has been asked by VIPs in our company (e.g. execs and board members) if Siri can be used with Outlook, and the only way is through Shortcuts or by adding the Outlook account to Mail.
Both of these options would violate our security policies for these reasons:
Since our company policy and federal regulations don't permit us to allow access to company resources on non-MAM-protected apps, we can't allow our users to login to the Mail app and make full use of Siri, due to the lack of MAM controls for Mail and Calendar.
We only allow users to transfer data between policy-managed apps which have integrated the Intune SDK allowing us to enforce DLP and other security measures. The only way to enable Shortcuts would be to disable these security measures.
Topic: Business & Education > General
Tags: Mobile Core Services, Enterprise, Siri and Voice, Shortcuts
I developed an Xcode project using Flutter (v3.35.6).
The application basically has an IntentExtension to handle intent donation and the related business logic. We decided to go with a ShortcutsExtension in place of AppIntents because it fits our app's use case, where we basically need to dynamically donate/remove intents.
We have an issue building the project, due to the presence of the IntentExtension .appex file in Build Phases > Embed Foundation Extensions.
If we remove it, the project builds; however, the intent handling is not invoked in the Shortcuts app.
Build issue:
Generated error in the console:
Cycle inside Runner; building could produce unreliable results.
Cycle details:
→ Target 'Runner' has copy command from '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/ShortcutsExtension.appex' to '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex'
○ That command depends on command in Target 'Runner': script phase “[CP] Copy Pods Resources”
○ That command depends on command in Target 'Runner': script phase “[CP] Embed Pods Frameworks”
○ That command depends on command in Target 'Runner': script phase “FlutterFire: "flutterfire upload-crashlytics-symbols"”
○ That command depends on command in Target 'Runner': script phase “FlutterFire: "flutterfire bundle-service-file"”
○ That command depends on command in Target 'Runner': script phase “Thin Binary”
○ Target 'Runner' has process command with output '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/Info.plist'
○ Target 'Runner' has copy command from '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/ShortcutsExtension.appex' to '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex'
Raw dependency cycle trace:
target: ->
node: ->
command: ->
node: /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Intermediates.noindex/Runner.build/Release-quality-iphoneos/Runner.build/Objects-normal/arm64/ExtractedAppShortcutsMetadata.stringsdata ->
command: P0:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:ExtractAppIntentsMetadata ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase10-copy-files ->
node: <Copy /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex> ->
CYCLE POINT ->
command: P0:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:Copy /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/ShortcutsExtension.appex ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase9--cp--copy-pods-resources ->
node: /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/GoogleMapsResources.bundle ->
command: P2:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:PhaseScriptExecution [CP] Copy Pods Resources /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Intermediates.noindex/Runner.build/Release-quality-iphoneos/Runner.build/Script-B728693F1F2684724A065652.sh ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase8--cp--embed-pods-frameworks ->
node: /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/Frameworks/Alamofire.framework ->
command: P2:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:PhaseScriptExecution [CP] Embed Pods Frameworks /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Intermediates.noindex/Runner.build/Release-quality-iphoneos/Runner.build/Script-1A1449CD6436E619E61D3E0D.sh ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase7-flutterfire---flutterfire-upload-crashlytics-symbols- ->
node: <execute-shell-script-18c1723432283e0cc55f10a6dcfd9e024008e7f13be1da4979f78de280354094-target-Runner-18c1723432283e