
Taki (Kieu Dang)

Understanding Prompt Templates in LangChain

Mastering prompt engineering and prompt templates in LangChain is essential for optimizing how your AI models interact with structured and unstructured data. This guide is tailored to a stack of NestJS, MongoDB Vector Search, OpenAI, and LangChain, with one running use case: generating test cases from specification documents.


1. Understanding Prompt Templates in LangChain

A prompt template is a structured way to define dynamic prompts with placeholders, ensuring consistency and flexibility. In LangChain, these templates allow us to:

  • Inject variables dynamically (e.g., feature descriptions or past test cases).
  • Provide a context window for better responses.
  • Optimize performance with concise yet informative prompts.

Example of a Basic Prompt Template

import { PromptTemplate } from "langchain/prompts";

const template = `You are an AI assistant that helps manual testers generate test cases.
Given the feature specification below, generate detailed test cases.

Feature Specification:
{specification}

Test Cases:
`;

const prompt = new PromptTemplate({
  template,
  inputVariables: ["specification"],
});

// Example usage
const formattedPrompt = await prompt.format({
  specification: "A user can reset their password via email verification.",
});

console.log(formattedPrompt);

Output

You are an AI assistant that helps manual testers generate test cases.
Given the feature specification below, generate detailed test cases.

Feature Specification:
A user can reset their password via email verification.

Test Cases:

2. Advanced Prompt Engineering for Test Case Generation

For your RAG-based AI system, you need a structured prompt that:

  1. Retrieves domain knowledge (from MongoDB Vector Search).
  2. Includes relevant historical test cases (if available).
  3. Requests well-structured test cases (title, steps, expected results).

Structured Prompt Example

const testCasePrompt = new PromptTemplate({
  template: `
You are a software testing assistant helping manual testers generate detailed test cases.

## Context:
- The application domain: {domain_knowledge}
- Similar past test cases: {past_test_cases}

## Task:
Generate detailed test cases for the given feature.

Feature Specification:
{specification}

## Expected Test Case Format:
1. **Test Case Title**: A concise description of the test case.
2. **Steps to Reproduce**: Clear, numbered steps.
3. **Expected Result**: The expected behavior.

Test Cases:
`,
  inputVariables: ["domain_knowledge", "past_test_cases", "specification"],
});

// Example usage
const formattedTestCasePrompt = await testCasePrompt.format({
  domain_knowledge: "E-commerce checkout system with Stripe integration.",
  past_test_cases: "- Verify discount calculation\n- Validate payment processing",
  specification: "A user should be able to apply a discount coupon during checkout.",
});

console.log(formattedTestCasePrompt);

Expected Output

You are a software testing assistant helping manual testers generate detailed test cases.

## Context:
- The application domain: E-commerce checkout system with Stripe integration.
- Similar past test cases: - Verify discount calculation
- Validate payment processing

## Task:
Generate detailed test cases for the given feature.

Feature Specification:
A user should be able to apply a discount coupon during checkout.

## Expected Test Case Format:
1. **Test Case Title**: A concise description of the test case.
2. **Steps to Reproduce**: Clear, numbered steps.
3. **Expected Result**: The expected behavior.

Test Cases:

3. Using Prompt Templates with OpenAI in LangChain

To integrate this into your NestJS app with OpenAI, use LangChain’s LLM chain.

Step 1: Install Dependencies

npm install langchain openai

Step 2: Use Prompt with OpenAI Model

import { OpenAI } from "langchain/llms/openai";
import { LLMChain } from "langchain/chains";

const model = new OpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY, 
  temperature: 0.2, 
});

const chain = new LLMChain({
  llm: model,
  prompt: testCasePrompt, 
});

const response = await chain.call({
  domain_knowledge: "E-commerce checkout system with Stripe integration.",
  past_test_cases: "- Verify discount calculation\n- Validate payment processing",
  specification: "A user should be able to apply a discount coupon during checkout.",
});

console.log(response.text);
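In a NestJS app you would typically wrap this chain in a service rather than calling it at the top level. Below is a minimal, dependency-free sketch of that shape: the `@Injectable()` decorator and module wiring are omitted so the snippet stands alone, and the `Chain` interface is a hypothetical stand-in for the `LLMChain` built above.

```typescript
// Hypothetical minimal interface matching the call() shape used above.
interface Chain {
  call(values: Record<string, string>): Promise<{ text: string }>;
}

// Sketch of a NestJS-style service (decorators and DI module omitted).
class TestCaseService {
  constructor(private readonly chain: Chain) {}

  async generate(
    specification: string,
    domainKnowledge: string,
    pastTestCases: string,
  ): Promise<string> {
    // Forward the three template variables to the underlying chain.
    const { text } = await this.chain.call({
      domain_knowledge: domainKnowledge,
      past_test_cases: pastTestCases,
      specification,
    });
    return text;
  }
}

// Usage with a stub chain (a real app would inject the LLMChain instead).
const stub: Chain = {
  call: async () => ({ text: "1. **Test Case Title**: Apply discount coupon" }),
};
new TestCaseService(stub)
  .generate("Apply a discount coupon", "E-commerce checkout", "- Verify discount calculation")
  .then(console.log); // prints the stubbed test case text
```

Keeping the chain behind an interface like this also makes the service easy to unit-test with a stub, as shown.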

4. Optimizing Prompt Performance

  • Use Examples (Few-Shot Learning): Instead of zero-shot prompting, include one or two worked examples so the model reliably mimics your output format.
  • Adjust Temperature: Lower values (0.1 - 0.3) produce more deterministic responses.
  • Chunk Large Text Inputs: If your specification documents are large, split them into smaller chunks before passing them to OpenAI.
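To make the chunking bullet concrete, here is a hand-rolled splitter using overlapping windows. This is a minimal sketch: LangChain also ships text splitters for this job, and the 1000/200 sizes below are illustrative rather than tuned.

```typescript
// Split a long specification into overlapping chunks so each piece fits
// comfortably in the model's context window. The overlap keeps sentences
// that straddle a boundary visible in both neighboring chunks.
function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap;
  }
  return chunks;
}

const spec = "A".repeat(2500); // stand-in for a long specification document
const chunks = chunkText(spec);
console.log(chunks.length); // → 3
```

Each chunk can then be formatted into the prompt template separately, with the results merged afterwards.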

Example of Few-Shot Prompting

const fewShotPrompt = new PromptTemplate({
  template: `
You are an AI that generates structured test cases for manual testers. Use the format below.

### Example 1:
Feature Specification:
A user can reset their password via email verification.

Test Case:
1. **Test Case Title**: Reset Password via Email
2. **Steps to Reproduce**:
   - Go to login page.
   - Click "Forgot Password".
   - Enter registered email.
   - Check email for verification link.
   - Click the link and set a new password.
3. **Expected Result**: User resets password successfully.

### Example 2:
Feature Specification:
{specification}

Test Case:
`,
  inputVariables: ["specification"],
});

5. Next Steps

  • Integrate MongoDB Vector Search: Retrieve relevant past test cases from MongoDB before generating new ones.
  • Use Memory in LangChain: Maintain chat memory for continuous context handling.
  • Fine-tune OpenAI Responses: Adjust your prompt templates based on real feedback from testers.
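To make the MongoDB Vector Search step concrete, here is a toy in-memory sketch of what the retrieval does: rank stored test-case embeddings by cosine similarity against a query embedding and keep the top-k. Atlas Vector Search performs this ranking server-side over real embeddings; the 3-dimensional vectors below are made up purely for illustration.

```typescript
// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

interface StoredCase {
  text: string;
  embedding: number[];
}

// Return the k most similar stored test cases for a query embedding.
function topK(query: number[], docs: StoredCase[], k = 2): string[] {
  return [...docs]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k)
    .map((d) => d.text);
}

// Toy 3-d embeddings; real ones would come from an embeddings model.
const past: StoredCase[] = [
  { text: "Verify discount calculation", embedding: [1, 0, 0] },
  { text: "Validate payment processing", embedding: [0, 1, 0] },
  { text: "Check login rate limiting", embedding: [0, 0, 1] },
];
console.log(topK([0.9, 0.4, 0], past));
// → [ "Verify discount calculation", "Validate payment processing" ]
```

The retrieved texts would then be joined and passed as the `past_test_cases` variable of the structured prompt above.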
