# Chatbot Example

This example shows how to build a chatbot with memory using LangChain 1.2.

## Features

- Multi-turn conversations
- Context retained across turns
- Support for different language models
- Extensible architecture
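The memory behavior listed above boils down to one pattern: each session id maps to its own message list, so concurrent conversations stay isolated. A framework-free sketch of that pattern (plain Python; all names here are illustrative — the actual example below uses LangChain's `ChatMessageHistory` for the same idea):

```python
# A minimal, framework-free sketch of per-session conversation memory.
# Names are illustrative; the real example uses LangChain's
# ChatMessageHistory instead of plain lists.

store = {}

def get_session_history(session_id: str) -> list:
    # Create the history list on first use, then reuse it
    return store.setdefault(session_id, [])

def remember(session_id: str, role: str, text: str) -> None:
    # Append one (role, text) message to the session's history
    get_session_history(session_id).append((role, text))

# Two sessions accumulate history independently
remember("alice", "human", "Hi!")
remember("alice", "ai", "Hello, Alice!")
remember("bob", "human", "What's LangChain?")

print(len(get_session_history("alice")))
print(len(get_session_history("bob")))
```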

## Implementation

### 1. Install dependencies

```bash
pip install langchain langchain-openai python-dotenv langchain-community
```

### 2. Create the configuration file

```bash
# .env
OPENAI_API_KEY=your-api-key
```

### 3. Implement the chatbot

```python
# chatbot.py
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_message_histories import ChatMessageHistory
from dotenv import load_dotenv
import os

# Load environment variables from .env
load_dotenv()

class Chatbot:
    def __init__(self, model_name="gpt-3.5-turbo", temperature=0.7):
        # Initialize the chat model
        self.llm = ChatOpenAI(
            api_key=os.environ.get("OPENAI_API_KEY"),
            model=model_name,
            temperature=temperature
        )

        # Build the chat prompt template
        self.prompt = ChatPromptTemplate.from_messages([
            ("system", "You are a helpful assistant."),
            MessagesPlaceholder(variable_name="history"),
            ("human", "{input}")
        ])

        # Output parser that extracts the response text
        self.output_parser = StrOutputParser()

        # Compose the base chain with LCEL
        self.chain = self.prompt | self.llm | self.output_parser

        # In-memory store of per-session message histories
        self.store = {}

        # Wrap the chain with message history once and reuse it
        self.chain_with_history = RunnableWithMessageHistory(
            self.chain,
            self.get_session_history,
            input_messages_key="input",
            history_messages_key="history"
        )

    def get_session_history(self, session_id: str):
        """Return the message history for a session, creating it on first use."""
        if session_id not in self.store:
            self.store[session_id] = ChatMessageHistory()
        return self.store[session_id]

    def chat(self, message, session_id="default"):
        """Send a user message and return the model's response."""
        response = self.chain_with_history.invoke(
            {"input": message},
            config={"configurable": {"session_id": session_id}}
        )
        return response

if __name__ == "__main__":
    # Initialize the chatbot
    chatbot = Chatbot()

    print("Chatbot: Hello! How can I help you today?")

    # Interactive loop
    while True:
        user_input = input("You: ")
        if user_input.lower() in ["exit", "quit", "bye"]:
            print("Chatbot: Goodbye!")
            break

        response = chatbot.chat(user_input)
        print(f"Chatbot: {response}")
```

### 4. Run the chatbot

```bash
python chatbot.py
```

## Sample output

Chatbot: Hello! How can I help you today?
You: What is LangChain?

Chatbot: LangChain is a framework for building applications powered by language models. It provides a set of tools, components, and interfaces that simplify the process of creating LLM-powered applications. LangChain makes it easier to connect language models to other data sources, enable context-aware interactions, and build complex workflows.

You: Can you give me an example of how to use it?

Chatbot: Sure! Here's a simple example of how to use LangChain 1.2 to create a chatbot:

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_community.chat_message_histories import ChatMessageHistory

# Initialize the chat model
chat_model = ChatOpenAI(api_key="your-api-key")

# Create a prompt template
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}")
])

# Create output parser
output_parser = StrOutputParser()

# Create base chain
chain = prompt | chat_model | output_parser

# Create message history store
store = {}

def get_session_history(session_id: str):
    if session_id not in store:
        store[session_id] = ChatMessageHistory()
    return store[session_id]

# Create chain with history
chain_with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="history"
)

# Start chatting
response = chain_with_history.invoke(
    {"input": "What is LangChain?"},
    config={"configurable": {"session_id": "user_123"}}
)
print(response)
```

This example creates a simple chatbot that can answer questions and remember previous interactions.

You: What are the core components of LangChain?

Chatbot: The core components of LangChain include:

  1. Models: Interfaces to various language models (LLMs, chat models, embedding models)
  2. Prompts: Templates for formatting input to language models
  3. Chains: Combinations of components to create workflows using LCEL
  4. Agents: Entities that can make decisions and use tools
  5. Tools: External services that agents can use
  6. Memory: Components for storing and managing conversation history
  7. Vector Stores: Databases for storing and retrieving embeddings
  8. Retrievers: Components for retrieving relevant information

These components work together to enable the creation of complex LLM applications.

You: exit
Chatbot: Goodbye!


## Extending the chatbot

### 1. Adding tools

```python
from langchain.tools import BaseTool
from langchain.memory import ConversationBufferMemory
from langchain.agents import AgentType, initialize_agent
from langchain_openai import ChatOpenAI
from typing import Type
from pydantic import BaseModel, Field

# Input schema for the tool
class WeatherInput(BaseModel):
    location: str = Field(description="Name of the city")

# A weather tool (the API call is mocked)
class WeatherTool(BaseTool):
    name: str = "get_weather"
    description: str = "Get the current weather for a given city"
    args_schema: Type[BaseModel] = WeatherInput

    def _run(self, location: str) -> str:
        # Simulate a weather API call
        return f"The current temperature in {location} is 25°C and it is sunny"

# Instantiate the tools and the model
tools = [WeatherTool()]
chat_model = ChatOpenAI(model="gpt-3.5-turbo")

# Conversation memory for the agent
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# Initialize the agent
agent = initialize_agent(
    tools=tools,
    llm=chat_model,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    memory=memory,
    verbose=True
)
```
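Stripped of the framework, a tool is just three things: a name the agent selects by, a description it can read, and a runner with validated input. The sketch below mirrors the weather tool in plain Python (illustrative names only, no LangChain APIs):

```python
# A framework-free sketch of the tool pattern: name + description +
# validated runner. All names here are illustrative.

def validate_weather_input(args: dict) -> str:
    # Minimal stand-in for the pydantic args_schema validation
    if "location" not in args or not isinstance(args["location"], str):
        raise ValueError("expected a 'location' string")
    return args["location"]

def get_weather(args: dict) -> str:
    # Mocked weather lookup, mirroring WeatherTool._run above
    location = validate_weather_input(args)
    return f"The current temperature in {location} is 25°C and it is sunny"

# A tool registry: an agent would pick a tool by name from such a table
TOOLS = {
    "get_weather": {
        "description": "Get the current weather for a given city",
        "run": get_weather,
    },
}

result = TOOLS["get_weather"]["run"]({"location": "Shanghai"})
print(result)
```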

### 2. Using a custom prompt

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.output_parsers import StrOutputParser

# Model and output parser (as in the main example)
llm = ChatOpenAI(model="gpt-3.5-turbo")
output_parser = StrOutputParser()

# Build a custom prompt template
custom_prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant specializing in {topic}."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}")
])

# Compose the chain with LCEL; history falls back to an empty list
chain = (
    {
        "input": lambda x: x["input"],
        "topic": lambda x: x["topic"],
        "history": lambda x: x.get("history", []),
    }
    | custom_prompt
    | llm
    | output_parser
)

# Run the chain
response = chain.invoke({"input": "What is LangChain?", "topic": "AI frameworks"})
```
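The `|` composition used above can be pictured as plain function chaining. The toy `Runnable` below is a hypothetical stand-in, not LangChain's class; it only illustrates how `prompt | llm | output_parser` threads data left to right:

```python
# A toy illustration of the composition idea behind LCEL's `|` operator:
# each stage is a callable, and piping chains them left to right.
# This Runnable is a hypothetical stand-in, not LangChain code.

class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Compose: feed this stage's output into the next stage
        return Runnable(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# Three stages mirroring prompt | llm | output_parser
prompt = Runnable(lambda d: f"You are an expert in {d['topic']}. Q: {d['input']}")
llm = Runnable(lambda text: {"content": f"Answer to: {text}"})
parser = Runnable(lambda msg: msg["content"])

chain = prompt | llm | parser
result = chain.invoke({"input": "What is LangChain?", "topic": "AI frameworks"})
print(result)
```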

### 3. Integrating a vector store

```python
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_community.document_loaders import TextLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain.chains import RetrievalQA
import os

# Load a document
loader = TextLoader("path/to/document.txt")
documents = loader.load()

# Split it into chunks
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
texts = text_splitter.split_documents(documents)

# Initialize the embedding model
embeddings = OpenAIEmbeddings(api_key=os.environ.get("OPENAI_API_KEY"))

# Build the vector store
vectorstore = FAISS.from_documents(texts, embeddings)

# Build the retrieval chain
llm = ChatOpenAI(model="gpt-3.5-turbo")
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=vectorstore.as_retriever()
)

# Run the retrieval chain
response = qa_chain.run("What is LangChain?")
```
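Conceptually, the "stuff" chain retrieves the best-matching chunks and stuffs them into a single prompt for the model. The sketch below imitates that flow with naive word-overlap scoring in place of embedding similarity (plain Python; all names are illustrative):

```python
# A framework-free sketch of the "stuff" retrieval flow: score stored
# chunks against the question, then "stuff" the best matches into one
# prompt. Real vector stores use embedding similarity, not word overlap.
import string

def score(chunk: str, query: str) -> int:
    # Count query words appearing in the chunk (case/punctuation-insensitive)
    strip = str.maketrans("", "", string.punctuation)
    chunk_words = set(chunk.lower().translate(strip).split())
    return sum(1 for w in query.lower().translate(strip).split() if w in chunk_words)

def retrieve(chunks: list, query: str, k: int = 2) -> list:
    # Return the k highest-scoring chunks
    return sorted(chunks, key=lambda c: score(c, query), reverse=True)[:k]

def build_prompt(chunks: list, query: str) -> str:
    # "Stuff" all retrieved chunks into a single prompt
    context = "\n".join(retrieve(chunks, query))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

chunks = [
    "LangChain is a framework for building LLM applications.",
    "FAISS is a library for efficient similarity search.",
    "Bananas are rich in potassium.",
]
prompt = build_prompt(chunks, "What is LangChain?")
print(prompt)
```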

## Summary

This example showed how to build a chatbot with memory using LangChain 1.2. Along the way you have seen:

1. How to initialize a chat model and prompt template using LCEL syntax
2. How to manage conversation history with RunnableWithMessageHistory
3. How to compose and run chains
4. How to extend the chatbot's functionality

You can extend and customize this chatbot further to build more complex LLM applications.