Fetching Prompts

Use getPrompt() to fetch a single prompt by ID:

const result = await promptly.getPrompt('review-prompt');

The result includes all prompt metadata from the CMS:

result.promptId; // 'review-prompt'
result.promptName; // 'Review Prompt'
result.systemMessage; // 'You are a helpful assistant...'
result.temperature; // 0.7
result.model; // LanguageModel (auto-resolved from CMS config)
result.version; // '2.0.0'
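
Taken together, the result's shape is roughly the following (an interface sketched from the fields above; the library's actual exported type may differ):

```typescript
// Assumed shape, inferred from the fields listed above; not the library's
// actual exported type.
interface FetchedPrompt {
  promptId: string;
  promptName: string;
  systemMessage: string;
  temperature: number;
  model: unknown; // LanguageModel from the AI SDK in the real client
  version: string;
  // Callable interpolator; String(userMessage) returns the raw template.
  userMessage: ((vars: Record<string, string>) => string) & { toString(): string };
}
```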

The userMessage property is a callable function that interpolates template variables:

const message = result.userMessage({
  pickupLocation: 'London',
  items: 'sofa',
});
// => 'Help with London moving sofa.'

If you’ve run codegen, the variables are fully typed.
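
The generated code isn't shown here, but the effect can be approximated in plain TypeScript with template-literal types (a sketch of the idea, not the library's actual codegen output):

```typescript
// Sketch: derive the variable names a template needs at the type level.
// This approximates what codegen provides; it is not the generated code.
type VarNames<T extends string> =
  T extends `${string}\${${infer Name}}${infer Rest}`
    ? Name | VarNames<Rest>
    : never;

// VarNames<'Help with ${pickupLocation} moving ${items}.'>
//   resolves to 'pickupLocation' | 'items'
type ReviewVars = Record<VarNames<'Help with ${pickupLocation} moving ${items}.'>, string>;

const vars: ReviewVars = { pickupLocation: 'London', items: 'sofa' }; // fully typed
```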

To get the raw template string with ${variable} placeholders intact, use String():

const template = String(result.userMessage);
// => 'Help with ${pickupLocation} moving ${items}.'
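
A minimal sketch of how such a callable-yet-stringifiable message can be built (illustrative names, not the library's internals): a function whose toString is overridden to return the raw template.

```typescript
// Sketch: a callable message function that also exposes its raw template.
// makeUserMessage is a hypothetical helper, not part of the library's API.
function makeUserMessage(template: string) {
  const fn = (vars: Record<string, string>) =>
    template.replace(/\$\{(\w+)\}/g, (_, name) => vars[name] ?? '');
  // String(fn) calls fn.toString(), so override it to return the raw template.
  fn.toString = () => template;
  return fn;
}

const userMessage = makeUserMessage('Help with ${pickupLocation} moving ${items}.');
userMessage({ pickupLocation: 'London', items: 'sofa' }); // 'Help with London moving sofa.'
String(userMessage); // 'Help with ${pickupLocation} moving ${items}.'
```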

By default, getPrompt() fetches the latest published version. To fetch a specific version:

const result = await promptly.getPrompt('review-prompt', {
  version: '2.0.0',
});

Use getPrompts() to fetch multiple prompts in parallel:

const [reviewPrompt, welcomePrompt] = await promptly.getPrompts([
  { promptId: 'review-prompt' },
  { promptId: 'welcome-email', version: '2.0.0' },
]);

Each result in the returned tuple is typed to its own prompt’s variables:

reviewPrompt.userMessage({
  pickupLocation: 'London',
  items: 'sofa',
});
welcomePrompt.userMessage({
  email: 'alice@example.com',
  subject: 'Welcome',
});

The array is typed as a tuple: the first element matches the first request, the second matches the second, and so on.
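
That tuple behavior can be modeled with a mapped tuple type. A minimal sketch, using a mocked fetch and a simplified result shape rather than the library's real internals:

```typescript
// Simplified request/result shapes; illustrative only.
type PromptRequest = { promptId: string; version?: string };
type PromptResult<R extends PromptRequest> = {
  promptId: R['promptId'];
  userMessage: (vars: Record<string, string>) => string;
};

// Hypothetical stand-in for the real per-prompt CMS fetch.
async function fetchOne(req: PromptRequest): Promise<PromptResult<PromptRequest>> {
  return { promptId: req.promptId, userMessage: (vars) => JSON.stringify(vars) };
}

// The mapped type { [K in keyof T]: PromptResult<T[K]> } keeps the result a
// tuple whose Kth element is typed by the Kth request.
async function getPrompts<T extends readonly PromptRequest[]>(
  requests: T,
): Promise<{ [K in keyof T]: PromptResult<T[K]> }> {
  const results = await Promise.all(requests.map(fetchOne));
  return results as { [K in keyof T]: PromptResult<T[K]> };
}

const [review, welcome] = await getPrompts([
  { promptId: 'review-prompt' },
  { promptId: 'welcome-email', version: '2.0.0' },
] as const);
```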

Here’s a pattern for using fetched prompts with the Vercel AI SDK, including cache control for Anthropic models:

import { createPromptlyClient } from '@promptlycms/prompts';
import { generateText } from 'ai';

const promptly = createPromptlyClient();

const result = await promptly.getPrompt('review-prompt', {
  version: '2.0.0',
});

const { text } = await generateText({
  model: result.model,
  system: result.systemMessage,
  temperature: result.temperature,
  messages: [
    {
      role: 'user',
      content: result.userMessage({
        pickupLocation: 'London',
        items: 'sofa',
      }),
      providerOptions: {
        anthropic: { cacheControl: { type: 'ephemeral' } },
      },
    },
  ],
});