October 8, 2024
Language models have rapidly evolved to become a cornerstone of many AI-driven applications.
However, their power is rooted in their advanced architectures and their ability to effectively interpret and respond to user prompts. In this context, LangChain introduces a game-changing tool: PromptTemplates.
At first glance, one might perceive a prompt as a simple question or request.
Yet, in Language Models, prompts are the bridge that connects human intent to machine-generated responses. They guide the model, providing context, refining outputs, and modifying behaviours. And while crafting the perfect prompt might seem straightforward, the reality is that it’s both an art and a science.
Enter PromptTemplates in LangChain.
These aren’t just about sending a question to a model. They offer a structured, reusable, and dynamic way to interact with various language models. From setting the context and defining instructions to dynamically adjusting content based on user needs, PromptTemplates offer a versatile approach to language model interactions.
This guide will take you through the intricacies of PromptTemplates in LangChain, illuminating their significance, functionality, and the benefits they bring to the table. Whether you’re new to language models or a seasoned pro, understanding PromptTemplates is paramount to harnessing the full potential of LangChain and the models it interacts with.
Large language models (LLMs) require prompts to function. A prompt is a set of instructions or inputs that guides the model’s response; the output from a prompt can be an answer, a sentence completion, or a conversational reply. A well-constructed prompt is made up of a few key components.
Generating, sharing, and reusing prompts in a reproducible manner comes down to a few key components: a text string or template that takes inputs and produces a prompt for the LLM, instructions that tell the model how to behave, few-shot examples that steer the model’s response, and a question or task for the model to address. These pre-defined recipes combine instructions, context, few-shot examples, and questions appropriate for a particular task.
LangChain offers a set of tools for creating and working with prompt templates. These templates are designed to be model-agnostic, making them easier to reuse across different language models. Language models generally require prompts to be in the form of a string or a list of chat messages.
Why Use Prompt Templates?
Prompt templates are useful when multiple inputs are needed, making code cleaner and more manageable.
LangChain provides PromptTemplate to help create parametrized prompts for language models. A PromptTemplate lets you define a template string with placeholders, like {adjective} or {content}, that can be formatted with input values to produce the final prompt string.
Here’s a simple example:
from langchain.prompts import PromptTemplate
from langchain.llms import OpenAI
# Define a simple prompt template as a Python string
prompt_template = PromptTemplate.from_template("""
Human: What is the capital of {place}?
AI: The capital of {place} is {capital}
""")
prompt = prompt_template.format(place="California", capital="Sacramento")
print(prompt)
This will show the prompt as:
Human: What is the capital of California?
AI: The capital of California is Sacramento
You can take this prompt and pass it to an LLM:
prompt_template = PromptTemplate.from_template(
template="Write a {length} story about: {content}"
)
llm = OpenAI()
prompt = prompt_template.format(
length="2-sentence",
content="The hometown of the legendary data scientist, Harpreet Sahota"
)
response = llm.predict(
text=prompt
)
print(response)
Which outputs the following, almost true, tale of Harpreet Sahota:
Harpreet Sahota's small hometown was always proud of him, even before he became a household name as the legendary data scientist. His intelligence and dedication to the field has earned him recognition around the world.
You can instantiate a prompt template with no input variables, one input variable, or multiple input variables, like so:
# No Input Variable
no_input_prompt = PromptTemplate(input_variables=[], template="Tell me a joke.")
print(no_input_prompt.format())
# One Input Variable
one_input_prompt = PromptTemplate(input_variables=["adjective"], template="Tell me a {adjective} joke.")
print(one_input_prompt.format(adjective="funny"))
# Multiple Input Variables
multiple_input_prompt = PromptTemplate(
input_variables=["adjective", "content"],
template="Tell me a {adjective} joke about {content}."
)
multiple_input_prompt = multiple_input_prompt.format(adjective="funny", content="chickens")
print(multiple_input_prompt)
Which will output the following:
Tell me a joke.
Tell me a funny joke.
Tell me a funny joke about chickens.
And pass this to an LLM like so:
response = llm.predict(
text=multiple_input_prompt
)
print(response)
Q: What did the chicken do when he saw an earthquake?
A: He egg-scaped!
Here are some practical use cases for using a prompt template rather than passing a plain prompt to a language model:
Prompt templates allow you to define a template once and reuse it in multiple places. This avoids duplicating the same prompt logic over and over. For example, you could create a “summarize article” template and reuse it anytime you want a summary.
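The reuse pattern is easy to see even without LangChain. Below is a minimal sketch of a “summarize article” template defined once and reused at every call site, using only the standard library; the template wording and the helper name `build_summary_prompt` are invented for illustration, not LangChain APIs.

```python
# Hedged sketch, not LangChain code: a "summarize article" template
# defined once and reused everywhere it is needed.
SUMMARIZE_TEMPLATE = (
    "Summarize the following article in {num_sentences} sentences:\n\n{article}"
)

def build_summary_prompt(article: str, num_sentences: int = 3) -> str:
    """Fill the shared template; every call site reuses the same wording."""
    return SUMMARIZE_TEMPLATE.format(
        article=article, num_sentences=num_sentences
    )

# Two different call sites, one shared prompt definition.
print(build_summary_prompt("LangChain offers tools for prompt templates.", 2))
print(build_summary_prompt("Prompts guide model behaviour."))
```

If the summary wording ever needs to change, it changes in one place.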
Prompt templates separate the prompt formatting from the model invocation. This makes the code more modular — you can change the template or the model independently.
Templates allow you to dynamically generate prompts by filling in template variables. This is useful when you want to customize the prompt based on user input or other runtime factors.
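One common shape of that dynamism is binding some variables up front (say, from app configuration) and filling the rest at request time. Here is a hedged, standard-library sketch of the idea; the template text and the `render` helper are illustrative assumptions, not LangChain APIs.

```python
from functools import partial

# Hedged sketch of dynamic prompt generation: fix some variables once,
# fill the remaining ones per request.
TEMPLATE = "You are a {tone} assistant. Answer the user's question: {question}"

def render(template: str, **values: str) -> str:
    # str.format performs the same placeholder substitution a template does.
    return template.format(**values)

# Bind the tone once, e.g. from app config...
render_friendly = partial(render, TEMPLATE, tone="friendly")

# ...then fill the remaining variable at runtime, per user request.
print(render_friendly(question="What is a prompt template?"))
```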
Templates can improve readability by encapsulating complex prompt logic in a simple interface. Named variables are often clearer than trying to embed logic directly in strings.
Changes to shared prompt logic only need to happen in one place rather than everywhere a prompt is defined. This improves maintainability.
So in summary, prompt templates improve reusability, modularity and maintenance of prompt engineering code compared to using raw prompt strings directly.
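To make those benefits concrete, here is a hedged sketch of what a prompt-template class does under the hood: store the template once, discover its variables, and validate inputs before formatting. This is an illustration of the concept, not LangChain’s actual implementation.

```python
import re

class MiniPromptTemplate:
    """Toy stand-in for a prompt template class, for illustration only."""

    def __init__(self, template: str):
        self.template = template
        # Pull {placeholder} names out of the template string.
        self.input_variables = set(re.findall(r"{(\w+)}", template))

    def format(self, **kwargs: str) -> str:
        missing = self.input_variables - kwargs.keys()
        if missing:
            # Fail loudly instead of emitting a half-filled prompt.
            raise KeyError(f"Missing template variables: {sorted(missing)}")
        return self.template.format(**kwargs)

tmpl = MiniPromptTemplate("Tell me a {adjective} joke about {content}.")
print(tmpl.format(adjective="funny", content="chickens"))
```

The validation step is part of why templates beat raw f-strings for shared prompt logic: a forgotten variable surfaces as an error, not as a malformed prompt sent to the model.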
For chat models, LangChain provides ChatPromptTemplate, which allows creating a template for a list of chat messages.
You can use the provided chat message classes like AIMessage, HumanMessage, etc., or plain tuples to define the chat messages.
ChatPromptTemplate lets you format the messages with input values to create the final list of chat messages.
from langchain.prompts import ChatPromptTemplate
chat_template = ChatPromptTemplate.from_messages([
("human", "What is the capital of {country}?"),
("ai", "The capital of {country} is {capital}.")
])
messages = chat_template.format_messages(
country="Canada",
capital="Winnipeg"
)
print(messages)
Which will output the following:
[HumanMessage(content='What is the capital of Canada?', additional_kwargs={}, example=False), AIMessage(content='The capital of Canada is Winnipeg.', additional_kwargs={}, example=False)]
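Conceptually, format_messages is doing per-message substitution over role-tagged templates. The following hedged sketch shows that idea with plain tuples standing in for LangChain’s message classes; the `format_messages` helper here is an invented stand-in, not the LangChain function itself.

```python
# Hedged sketch of what a chat prompt template does conceptually:
# substitute values into each (role, template) pair and return
# role-tagged messages.
def format_messages(pairs, **values):
    return [(role, text.format(**values)) for role, text in pairs]

chat_template = [
    ("human", "What is the capital of {country}?"),
    ("ai", "The capital of {country} is {capital}."),
]

messages = format_messages(chat_template, country="Canada", capital="Winnipeg")
print(messages)
```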
Throughout our exploration of PromptTemplates in LangChain, one thing becomes undeniably clear: the true power of a language model isn’t just in its underlying architecture but in how we communicate with it.
PromptTemplates are not merely tools; they are the refined language through which we converse with sophisticated AI systems, ensuring precision, clarity, and adaptability in every interaction.
LangChain’s introduction of such a structured approach to prompts marks a significant step forward in the AI domain.
By emphasizing reusability, dynamism, and modularity, LangChain ensures that developers can maximize the efficacy of their language model interactions without getting bogged down by complexities.
As we move forward in this AI-driven era, tools like PromptTemplates will undoubtedly play a pivotal role in defining the boundaries of what’s possible. They stand as a testament to the fact that, while the evolution of AI is essential, the methods we employ to interact with it are equally crucial.
With LangChain and PromptTemplates at our disposal, the future of seamless, impactful, and meaningful AI interactions looks incredibly bright.