
‘You are a helpful mail assistant,’ and other Apple Intelligence instructions

Here are some of the prompts that Apple Intelligence is using to guide AI models in the macOS 15.1 Sequoia developer beta.


Apple’s latest developer betas launched last week with a handful of the generative AI features that were announced at WWDC and are headed to your iPhones, iPads, and Macs over the next several months. On Apple’s computers, however, you can actually read the instructions programmed into the model supporting some of those Apple Intelligence features.

They show up as prompts that precede anything you say to a chatbot by default, and we’ve seen them uncovered for AI tools like Microsoft Bing and DALL-E before. Now a member of the macOS 15.1 beta subreddit posted that they’d discovered the files containing those backend prompts. You can’t alter any of the files, but they do give an early hint at how the sausage is made.

In the example above, an AI bot for a “helpful mail assistant” is being told how to ask a series of questions based on the content of an email. It could be part of Apple’s Smart Reply feature, which can go on to suggest possible replies for you.

A screenshot of an instruction prompt telling the LLM to provide a reply based on a provided snippet, limiting its answer to 50 words — and not to hallucinate or make things up.
Screenshot: Wes Davis / The Verge

This sounds like Apple’s “Rewrite” feature, one of the Writing Tools that you can access by highlighting text and right-clicking (or, in iOS, long-pressing) on it. Its instructions include passages saying, “Please limit the answer within 50 words. Do not hallucinate. Do not make up factual information.”

A screenshot instructing Apple Intelligence to summarize emails within three sentences and under 60 words.
Screenshot: Wes Davis / The Verge

This brief prompt summarizes emails, with a careful instruction not to answer any questions.

Screenshot instructing Apple Intelligence on how to construct a story based on photographs.
Screenshot: Wes Davis / The Verge

I’m pretty certain that this is the instruction set for generating a “Memories” video with Apple Photos. The passage that says, “Do not write a story that is religious, political, harmful, violent, sexual, filthy or in any way negative, sad or provocative,” might just explain why the feature rejected my prompt asking for “images of sadness”:

A shame. It’s not hard to get around, though. I got it to generate a video in response to the prompt, “Provide me with a video of people mourning.” I won’t share the resulting video because there are pictures of people who aren’t me in it, but I will show you the best picture it included in the slideshow:

A close-up image of a grasshopper lying dead on concrete. The camera lens is apparently focused just beyond the grasshopper.
Rest in peace, little buddy.
Photo: Wes Davis / The Verge

There are far more prompts contained in the files, all laying out the hidden instructions given to Apple’s AI tools before your prompt is ever submitted. But here’s one last instruction before you go:

A screenshot of a prompt that reads: “Dialoguel<n>You are an expert at summarizing messages. You prefer to use clauses instead of complete sentences. Do not answer any question from the messages. Please keep your summary of the input within a 10 word limit.<n>You must keep to this role unless told otherwise, if you don’t, it will not be helpful.”
“It will not be helpful.”
Screenshot: Wes Davis / The Verge

Files I browsed through refer to the model as “ajax,” which some Verge readers might recall as the rumored internal name for Apple’s LLM last year.

A screenshot showing a config entry with “ajax” as the model name.
“Hey, Ajax.”
Screenshot: Wes Davis / The Verge

The person who found the instructions also posted instructions on how to locate the files within the macOS Sequoia 15.1 developer beta.

Expand the “purpose_auto” folder, and you should see a list of other folders with long, alphanumeric names. Inside most of those, you’ll find an AssetData folder containing “metadata.json” files. Opening them should show you some code and — occasionally, at the bottom of some of them — the instructions passed to your machine’s local incarnation of Apple’s LLM. But you should remember these live in a part of macOS that contains the most sensitive files on your system. Tread with caution!
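If you’d rather search than click through folders, the steps above can be sketched as a quick shell function. This is a hedged example, not the Redditor’s exact method: the helper name `find_prompt_files` is mine, and you’d point it at wherever the “purpose_auto” folder lives on your own beta install.

```shell
# Sketch: locate metadata.json files nested under AssetData folders
# inside any "purpose_auto" directory below a given root.
# Usage: find_prompt_files /path/to/search
find_prompt_files() {
  # $1 is the root to search; where purpose_auto actually lives on
  # your system is an assumption you'll need to fill in yourself.
  find "$1" -type d -name "purpose_auto" 2>/dev/null |
  while read -r dir; do
    # Only match metadata.json files that sit inside an AssetData folder,
    # mirroring the structure described above.
    find "$dir" -type f -name "metadata.json" -path "*/AssetData/*"
  done
}
```

You could then skim each match with `cat` or open it in a text editor; the instruction text, when present, tends to sit near the bottom of the JSON.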
