
langchain: Prompt in hand, the world is mine

Author: Flydean (Programs, Those Things)

Brief introduction

A prompt is the input to a large language model, and it is a powerful tool for putting those models to work. There are no bad large language models, only bad prompts.

A well-written prompt can bring out 300% of a large language model's power.

Writing good prompts is not easy in practice, but langchain gives us the tooling to make it manageable. Let's take a look.

Good prompt

Sometimes it's not that the language model we use isn't good enough; it's that the prompts we write aren't good enough.

Here are some principles for writing good prompts for large language models:

  1. Be specific and detailed: a prompt should pose a clear question or task while containing enough detail and context for the model to understand and answer it.
  2. Be clear and answerable: a prompt should be unambiguous so the model can understand and answer it. Avoid language that is overly abstract, vague, or offensive.
  3. Provide context: a prompt should contain enough background information for the model to grasp why the question matters and to give a meaningful response.
  4. Have a goal and direction: a prompt should make the goal of the question or task explicit, so the model can supply exactly the information needed.
  5. Be extensible and customizable: a prompt should be designed so it can easily be extended and customized for different application scenarios and user needs.

In many similar scenarios, the overall structure of our prompts is the same and only the details differ. That is when a prompt template comes in handy.

What is a prompt template?

A prompt template is a reusable skeleton for prompts: from one template we can quickly generate many concrete prompts.

Basically, the template already describes the scene and the task for us. We just need to fill in the specifics.

Here's a simple example of a prompt template:

from langchain import PromptTemplate


template = "Suppose you are a wealth manager at a financial firm. Please analyze the stock {stock}."

prompt = PromptTemplate.from_template(template)
prompt.format(stock="Tencent Holdings")

Suppose you are a wealth manager at a financial firm. Please analyze the stock Tencent Holdings.

This way, the user only needs to supply the stock name; the rest of the boilerplate text is filled in automatically, which greatly speeds up prompt construction.

Of course, this is a very simple example. You can also specify the answer format in the template, provide concrete examples, and so on, to get a better response.

Create a prompt template in Langchain

Simply put, a prompt template formats user input into a full prompt. In langchain, the corresponding utility class is called PromptTemplate.

In the simple example above, we have already seen the basic usage of PromptTemplate.

There, we called the PromptTemplate.from_template method, passing in a template string.

Inside the template string, we define variables with curly braces. Finally, we call the prompt.format method with a value for each variable to produce the finished prompt.

In addition, multiple variables can be specified in the prompt template:

template = "Tell me a {thingsB} about {personA}"

prompt_template = PromptTemplate.from_template(template)
prompt_template.format(personA="Xiao Zhang", thingsB="story")
           

Just pass each variable by name to format.

Besides the PromptTemplate.from_template method, we can also create a prompt template directly through the PromptTemplate constructor.

The constructor of PromptTemplate accepts two parameters: input_variables and template.

input_variables is an array of the variable names that appear in the template.

template is the template text itself, a string.

For example, we can construct a template with no variables:

no_input_prompt = PromptTemplate(input_variables=[], template="This is a template with no parameters.")
no_input_prompt.format()
           

We can also construct a template with parameters:

one_input_prompt = PromptTemplate(input_variables=["stock"], template="Suppose you are a wealth manager at a financial firm. Please analyze the stock {stock}.")
one_input_prompt.format(stock="Tencent Holdings")
           

There are also templates with multiple parameters:

multiple_input_prompt = PromptTemplate(
    input_variables=["personA", "thingsB"], 
    template="Tell me a {thingsB} about {personA}"
)
multiple_input_prompt.format(personA="Xiao Zhang", thingsB="story")
           

Chat-specific prompt templates

As mentioned when langchain was introduced earlier, although chat models are built on LLMs, they still differ from plain LLMs.

The main difference is that chat messages are attributed to roles. In OpenAI's API, for example, a chat message can come from the AI, the human, or the system.

This is a bit more complicated, but it allows for better categorization of messages.

Let's take a look at the prompt templates langchain provides for chat:

from langchain.prompts import (
    ChatPromptTemplate,
    PromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
           

As with ordinary prompt templates, we can call from_template on a MessagePromptTemplate to create the corresponding prompt:

template="Your role is now {role}. Please continue the conversation in that role."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template="{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
           

Of course, you can also create one through the constructor:

prompt=PromptTemplate(
    template="Your role is now {role}. Please continue the conversation in that role.",
    input_variables=["role"],
)

           

Once you have one or more MessagePromptTemplates, you can combine them into a ChatPromptTemplate:

chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])

chat_prompt.format_prompt(role="doctor", text="Can you tell me whether I look okay?").to_messages()
           

Summary

That covers the basics of prompt templates in langchain. Give them a try.