# Fetching Prompts
## Single prompt

Use `getPrompt()` to fetch a single prompt by ID:

```ts
const result = await promptly.getPrompt('review-prompt');
```
## Accessing metadata

The result includes all prompt metadata from the CMS:

```ts
result.promptId; // 'review-prompt'
result.promptName; // 'Review Prompt'
result.systemMessage; // 'You are a helpful assistant...'
result.temperature; // 0.7
result.model; // LanguageModel (auto-resolved from CMS config)
result.version; // '2.0.0'
```
## Template variable interpolation

The `userMessage` property is a callable function that interpolates template variables:

```ts
const message = result.userMessage({
  pickupLocation: 'London',
  items: 'sofa',
});
// => 'Help with London moving sofa.'
```

If you've run codegen, the variables are fully typed.
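To see what "fully typed" means in practice, here is a hypothetical sketch of the kind of interface codegen might emit. The interface name, its fields, and the stand-in `userMessage` below are illustrative assumptions, not actual generated output:

```typescript
// Hypothetical codegen output (illustrative only — the real generated
// types may look different):
interface ReviewPromptVariables {
  pickupLocation: string;
  items: string;
}

// A typed stand-in for result.userMessage:
const userMessage: (vars: ReviewPromptVariables) => string = (vars) =>
  `Help with ${vars.pickupLocation} moving ${vars.items}.`;

userMessage({ pickupLocation: 'London', items: 'sofa' });
// => 'Help with London moving sofa.'
// userMessage({ pickUpLocation: 'London' }); // rejected at compile time
```

With the generated types in place, a misspelled or missing variable becomes a compile-time error rather than a malformed prompt at runtime.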
## Raw template access

To get the raw template string with `${variable}` placeholders intact, use `String()`:

```ts
const template = String(result.userMessage);
// => 'Help with ${pickupLocation} moving ${items}.'
```
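This dual behavior — callable for interpolation, yet stringifiable back to the raw template — can be modeled with a plain function whose `toString` is overridden. A minimal sketch of the pattern (not the library's actual implementation; `makeTemplate` is a hypothetical helper):

```typescript
// Sketch of a callable template: invoking it interpolates variables,
// while String() yields the raw template. Illustrative only.
type Vars = Record<string, string>;

function makeTemplate(raw: string) {
  const fn = (vars: Vars) =>
    raw.replace(/\$\{(\w+)\}/g, (_match, name: string) => vars[name] ?? '');
  fn.toString = () => raw; // String(fn) now returns the raw template
  return fn;
}

const userMessage = makeTemplate('Help with ${pickupLocation} moving ${items}.');
userMessage({ pickupLocation: 'London', items: 'sofa' });
// => 'Help with London moving sofa.'
String(userMessage);
// => 'Help with ${pickupLocation} moving ${items}.'
```

The raw template is useful when you want to log or cache the prompt definition itself rather than a rendered instance of it.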
## Version pinning

By default, `getPrompt()` fetches the latest published version. To fetch a specific version:

```ts
const result = await promptly.getPrompt('review-prompt', {
  version: '2.0.0',
});
```
## Batch fetch

Use `getPrompts()` to fetch multiple prompts in parallel:

```ts
const [reviewPrompt, welcomePrompt] = await promptly.getPrompts([
  { promptId: 'review-prompt' },
  { promptId: 'welcome-email', version: '2.0.0' },
]);
```

Each result in the returned tuple is typed to its own prompt's variables:

```ts
reviewPrompt.userMessage({
  pickupLocation: 'London',
  items: 'sofa',
});

welcomePrompt.userMessage({
  email: 'alice@example.com',
  subject: 'Welcome',
});
```

The array is typed as a tuple: the first element matches the first request, the second matches the second, and so on.
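This per-element typing mirrors what TypeScript already does for `Promise.all` over a tuple of promises; a standalone illustration of the same property:

```typescript
// Promise.all over a tuple literal preserves each element's type —
// the same property the batch-fetch results exhibit.
async function demo(): Promise<string> {
  const [count, label] = await Promise.all([
    Promise.resolve(42),        // Promise<number>
    Promise.resolve('prompts'), // Promise<string>
  ]);
  // count is inferred as number, label as string
  return `${label}: ${count}`;
}
```

Because the input is a tuple rather than a plain array, each destructured result keeps its own distinct type instead of collapsing to a union.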
## Real-world example

Here's a pattern for using fetched prompts with the Vercel AI SDK, including cache control for Anthropic models:

```ts
import { createPromptlyClient } from '@promptlycms/prompts';
import { generateText } from 'ai';

const promptly = createPromptlyClient();

const result = await promptly.getPrompt('review-prompt', {
  version: '2.0.0',
});

const { text } = await generateText({
  model: result.model,
  system: result.systemMessage,
  temperature: result.temperature,
  messages: [
    {
      role: 'user',
      content: result.userMessage({
        pickupLocation: 'London',
        items: 'sofa',
      }),
      providerOptions: {
        anthropic: { cacheControl: { type: 'ephemeral' } },
      },
    },
  ],
});
```
## Next steps

- Explore the AI SDK integration guide
- Learn about model resolution and custom resolvers
- Handle errors from the API
- Create and version your prompts in the Promptly CMS