# Quick Start
This guide walks you through the basic workflow: setting your API key, creating a client, fetching a prompt, and using it with the Vercel AI SDK. By the end, your prompts will be managed in the Promptly CMS - not hardcoded in your codebase.
## 1. Set your API key

Add your Promptly API key to your environment. Create a `.env` file in your project root:

```
PROMPTLY_API_KEY=pk_live_...
```
## 2. Create a client

```ts
// src/index.ts
import { createPromptlyClient } from '@promptlycms/prompts';

const promptly = createPromptlyClient();
```

You can omit the `apiKey` option if `PROMPTLY_API_KEY` is set in your environment; the SDK picks it up automatically.
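If you prefer to fail fast when the variable is missing, a small guard (plain Node, not part of the Promptly SDK) can check it before the client is created:

```ts
// Generic environment guard - not part of the Promptly SDK.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Calling `requireEnv('PROMPTLY_API_KEY')` once at startup surfaces a missing key immediately rather than on the first request.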
## 3. Fetch a prompt

```ts
// src/index.ts
const result = await promptly.getPrompt('JPxlUpstuhXB5OwOtKPpj');

// Prompt content - exactly what you wrote in the editor
result.systemMessage; // 'You are a helpful assistant.'
result.promptName; // 'Code Review Helper'

// Model config - from the sidebar settings in the editor
result.model; // LanguageModel (auto-resolved)
result.temperature; // 0.7

// Template variables - interpolate ${variables} from your user message
const message = result.userMessage({
  name: 'Alice',
  task: 'code review',
});
```
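To make the template step concrete: `${name}` and `${task}` placeholders in the user message are replaced by the values you pass in. A hypothetical re-implementation of that substitution (for illustration only; the real `userMessage()` is generated from your prompt template) might look like:

```ts
// Illustrative ${variable} substitution - NOT the SDK's internals.
function interpolate(template: string, vars: Record<string, string>): string {
  return template.replace(/\$\{(\w+)\}/g, (_match, key: string) => vars[key] ?? '');
}

const template = 'Hi ${name}, please help me with ${task}.';
console.log(interpolate(template, { name: 'Alice', task: 'code review' }));
// → 'Hi Alice, please help me with code review.'
```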
## 4. Use with the Vercel AI SDK

Destructure the result of `getPrompt()` and pass the properties directly to `generateText()` or `streamText()`:

```ts
// src/index.ts
import { generateText } from 'ai';

const { userMessage, systemMessage, temperature, model } =
  await promptly.getPrompt('JPxlUpstuhXB5OwOtKPpj');

const { text } = await generateText({
  model,
  system: systemMessage,
  prompt: userMessage({ name: 'Alice', task: 'code review' }),
  temperature,
});
```
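`getPrompt()` is a network call, so transient failures are possible. A generic retry wrapper with exponential backoff (a plain TypeScript sketch, not part of the Promptly SDK) could guard it:

```ts
// Generic retry with exponential backoff - a sketch, not part of the SDK.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 200,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off 200ms, 400ms, ... before the next attempt.
      if (i < attempts - 1) {
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

For example, `await withRetry(() => promptly.getPrompt('JPxlUpstuhXB5OwOtKPpj'))` retries twice before giving up.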
## Complete example

```ts
import { createPromptlyClient } from '@promptlycms/prompts';
import { generateText } from 'ai';

const { getPrompt } = createPromptlyClient();

const { userMessage, systemMessage, temperature, model } = await getPrompt(
  'JPxlUpstuhXB5OwOtKPpj',
);

const { text } = await generateText({
  model,
  system: systemMessage,
  prompt: userMessage({ name: 'Alice', task: 'code review' }),
  temperature,
});

console.log(text);
```
## Composers

Composers let you orchestrate multiple prompts into a single output. Fetch a composer with `getComposer()`, run each prompt through an AI model, and stitch the results together with `formatComposer()`:

```ts
import { createPromptlyClient } from '@promptlycms/prompts';
import { generateText } from 'ai';

const promptly = createPromptlyClient();

const composer = await promptly.getComposer('my-composer', {
  input: { text: 'Hello', targetLang: 'French' },
});

const { introPrompt, reviewPrompt, formatComposer } = composer;

const output = formatComposer({
  introPrompt: await generateText(introPrompt),
  reviewPrompt: await generateText(reviewPrompt),
});
```

Each named prompt on the composer result (like `introPrompt` and `reviewPrompt`) is a `ComposerPrompt` object with `model`, `system`, `prompt`, and `temperature` properties, ready to spread directly into `generateText()` or `streamText()`. The `formatComposer()` function reassembles the final output from each prompt's result, interleaving static content segments in the correct order.
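As a mental model for what `formatComposer()` does, here is a hypothetical sketch of interleaving static segments with prompt results (the segment shape and names are assumptions for illustration, not the SDK's internal types):

```ts
// Hypothetical illustration of formatComposer()'s interleaving - not SDK code.
type Segment =
  | { kind: 'static'; text: string }
  | { kind: 'prompt'; name: string };

function stitch(segments: Segment[], results: Record<string, { text: string }>): string {
  return segments
    .map((seg) => (seg.kind === 'static' ? seg.text : results[seg.name].text))
    .join('');
}
```

With segments like a static intro header, `introPrompt`, a static divider, then `reviewPrompt`, each prompt's generated text lands exactly where the prompt sits in the composer layout.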
## Next steps

- Generate types for autocomplete and type safety
- Learn about fetching prompts in detail
- Explore the AI SDK integration guide
- See the Client API reference for `getComposer()` and `getComposers()` details
- Manage your prompts, models, and versions from the Promptly CMS dashboard
- Need direct API access? See the REST API reference