
Craft language model prompts

You can build language model prompts by using string concatenation, but it's hard to compose features and make sure your prompts stay within the context window of language models. To overcome these limitations, you can use the @vscode/prompt-tsx library.

The @vscode/prompt-tsx library provides the following features:

  • TSX-based prompt rendering: Compose prompts using TSX components, making them more readable and maintainable
  • Priority-based pruning: Automatically prune less important parts of prompts to fit within the model's context window
  • Flexible token management: Use properties like flexGrow, flexReserve, and flexBasis to cooperatively use token budgets
  • Tool integration: Integrate with VS Code's language model tools API
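The priority-based pruning behavior can be pictured with a small TypeScript sketch. This is a simplified model for illustration, not the library's actual implementation: elements are dropped lowest-priority-first until the prompt fits the token budget.

```typescript
// Simplified model of priority-based pruning (not the library's real code):
// drop the lowest-priority elements until the total fits the token budget.
interface Element {
  text: string;
  priority: number; // higher = more important, like a zIndex
  tokens: number;   // pretend token count for this element
}

function prune(elements: Element[], budget: number): Element[] {
  // Sort a copy by ascending priority so we know what to drop first.
  const droppable = [...elements].sort((a, b) => a.priority - b.priority);
  const kept = new Set(elements);
  let total = elements.reduce((sum, e) => sum + e.tokens, 0);
  for (const e of droppable) {
    if (total <= budget) break;
    kept.delete(e);
    total -= e.tokens;
  }
  // Preserve the original document order of the survivors.
  return elements.filter(e => kept.has(e));
}

const elements: Element[] = [
  { text: 'base instructions', priority: 100, tokens: 50 },
  { text: 'old history', priority: 0, tokens: 400 },
  { text: 'user query', priority: 90, tokens: 20 },
];
console.log(prune(elements, 100).map(e => e.text));
// old history is dropped first: ['base instructions', 'user query']
```

The real library prunes at the granularity of text chunks within elements and respects the tree structure, but the lowest-priority-first intuition carries over.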

For a complete overview of all features and detailed usage instructions, refer to the full README.

This article describes practical examples of prompt design with the library. The complete code for these examples can be found in the prompt-tsx repository.

Manage priorities in the conversation history

Including conversation history in your prompt is important as it enables the user to ask follow-up questions to previous messages. However, you want to make sure its priority is treated appropriately because history can grow large over time. We've found that the pattern which makes the most sense is usually to prioritize, in order:

  1. The base prompt instructions
  2. The current user query
  3. The last couple of turns of chat history
  4. Any supporting data
  5. As much of the remaining history as you can fit

For this reason, split the history into two parts in the prompt, where recent prompt turns are prioritized over general contextual information.

In this library, each TSX node in the tree has a priority that is conceptually similar to a zIndex where a higher number means a higher priority.

Step 1: Define the HistoryMessages component

To list history messages, define a HistoryMessages component. This example provides a good starting point, but you might have to expand it if you deal with more complex data types.

This example uses the PrioritizedList helper component, which automatically assigns ascending or descending priorities to each of its children.
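Conceptually, PrioritizedList saves you from writing an explicit priority on every child. The following TypeScript sketch is a simplified, hypothetical model of what it does: with descending set to false, priorities ascend with the child index, so later children (the most recent messages) are pruned last.

```typescript
// Rough sketch of PrioritizedList's priority assignment (simplified model):
// with descending=false, priorities ascend, so the last (newest) child
// is the most important and survives pruning the longest.
function assignPriorities<T>(
  children: T[],
  basePriority: number,
  descending: boolean
): { child: T; priority: number }[] {
  return children.map((child, i) => ({
    child,
    priority: descending ? basePriority - i : basePriority + i,
  }));
}

const turns = ['turn 1 (oldest)', 'turn 2', 'turn 3 (newest)'];
console.log(assignPriorities(turns, 0, false).map(t => t.priority));
// [0, 1, 2] — the newest turn has the highest priority
```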

import {
	UserMessage,
	AssistantMessage,
	PromptElement,
	BasePromptElementProps,
	PrioritizedList,
	PromptPiece,
} from '@vscode/prompt-tsx';
import { ChatContext, ChatRequestTurn, ChatResponseTurn, ChatResponseMarkdownPart } from 'vscode';

interface IHistoryMessagesProps extends BasePromptElementProps {
	history: ChatContext['history'];
}

export class HistoryMessages extends PromptElement<IHistoryMessagesProps> {
	render(): PromptPiece {
		const history: (UserMessage | AssistantMessage)[] = [];
		for (const turn of this.props.history) {
			if (turn instanceof ChatRequestTurn) {
				history.push(<UserMessage>{turn.prompt}</UserMessage>);
			} else if (turn instanceof ChatResponseTurn) {
				history.push(
					<AssistantMessage name={turn.participant}>
						{chatResponseToMarkdown(turn)}
					</AssistantMessage>
				);
			}
		}
		return (
			<PrioritizedList priority={0} descending={false}>
				{history}
			</PrioritizedList>
		);
	}
}

Step 2: Define the Prompt component

Next, define a MyPrompt component that includes the base instructions, user query, and history messages with their appropriate priorities. Priority values are local among siblings. Remember that you might want to trim older messages in the history before touching anything else in the prompt, so you need to split the history into two <HistoryMessages> elements:

import {
	UserMessage,
	PromptElement,
	BasePromptElementProps,
} from '@vscode/prompt-tsx';
import { ChatContext } from 'vscode';
// Adjust this path to wherever your HistoryMessages component is defined
import { HistoryMessages } from './historyMessages';

interface IMyPromptProps extends BasePromptElementProps {
	history: ChatContext['history'];
	userQuery: string;
}

export class MyPrompt extends PromptElement<IMyPromptProps> {
	render() {
		return (
			<>
				<UserMessage priority={100}>
					Here are your base instructions. They have the highest priority because you want to make
					sure they're always included!
				</UserMessage>
				{/* Older messages in the history have the lowest priority since they're less relevant */}
				<HistoryMessages history={this.props.history.slice(0, -2)} priority={0} />
				{/* The last 2 history messages are preferred over any workspace context you have below */}
				<HistoryMessages history={this.props.history.slice(-2)} priority={80} />
				{/* The user query is right behind the base instructions in priority */}
				<UserMessage priority={90}>{this.props.userQuery}</UserMessage>
				<UserMessage priority={70}>
					With a slightly lower priority, you can include some contextual data about the workspace
					or files here...
				</UserMessage>
			</>
		);
	}
}

Now, all older history messages are pruned before the library tries to prune other elements of the prompt.
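Sorting the MyPrompt elements by their priority values makes the resulting pruning order explicit. This quick TypeScript sketch is again a simplified model of the real pruning, using the priorities from the example above:

```typescript
// Pruning order for the MyPrompt example (simplified model): elements
// become candidates for pruning from lowest to highest priority.
const promptParts = [
  { name: 'base instructions', priority: 100 },
  { name: 'older history', priority: 0 },
  { name: 'last 2 history turns', priority: 80 },
  { name: 'user query', priority: 90 },
  { name: 'workspace context', priority: 70 },
];

const pruneOrder = [...promptParts]
  .sort((a, b) => a.priority - b.priority)
  .map(p => p.name);
console.log(pruneOrder);
// ['older history', 'workspace context', 'last 2 history turns',
//  'user query', 'base instructions']
```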

Step 3: Define the History component

To make consumption a little easier, define a History component that wraps the history messages and uses the passPriority attribute to act as a pass-through container. With passPriority, its children are treated as if they are direct children of the containing element for prioritization purposes.
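A rough TypeScript sketch of the flattening that passPriority implies (a hypothetical, simplified model, not the library's code): the container contributes no priority of its own, and its children compete directly with the container's siblings.

```typescript
// Simplified model of passPriority: a pass-through container is replaced
// by its children, so those children are prioritized against the
// container's siblings rather than within a nested scope.
interface Node {
  name?: string;
  priority?: number;
  passPriority?: boolean;
  children?: Node[];
}

function flatten(nodes: Node[]): Node[] {
  return nodes.flatMap(n =>
    n.passPriority ? flatten(n.children ?? []) : [n]
  );
}

const tree: Node[] = [
  { name: 'base instructions', priority: 100 },
  {
    passPriority: true,
    children: [
      { name: 'older history', priority: 0 },
      { name: 'recent history', priority: 80 },
    ],
  },
  { name: 'user query', priority: 90 },
];
console.log(flatten(tree).map(n => n.name));
// ['base instructions', 'older history', 'recent history', 'user query']
```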

import { PromptElement, BasePromptElementProps, PromptPiece } from '@vscode/prompt-tsx';
import { ChatContext } from 'vscode';

interface IHistoryProps extends BasePromptElementProps {
	history: ChatContext['history'];
	newer: number; // last 2 message priority values
	older: number; // previous message priority values
	passPriority: true; // require this prop be set!
}

export class History extends PromptElement<IHistoryProps> {
	render(): PromptPiece {
		return (
			<>
				<HistoryMessages history={this.props.history.slice(0, -2)} priority={this.props.older} />
				<HistoryMessages history={this.props.history.slice(-2)} priority={this.props.newer} />
			</>
		);
	}
}