# Building an Intelligent Chatbot
This article walks through building an LLM chat application from scratch. Before starting, you should have completed the environment setup from part 02 and be comfortable with basic Python. If you use the OpenAI API you will need an API key; if you prefer a local model, you can run one with Ollama.
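A note on the local-model option: Ollama also exposes an OpenAI-compatible endpoint, so the same client code used throughout this article can target either backend just by changing the client's constructor arguments. The helper below is a hypothetical sketch, not part of the article's code — the function name and the dummy `"ollama"` key are my own; `http://localhost:11434/v1` is Ollama's default OpenAI-compatible address.

```python
import os

def make_client_config(use_local: bool) -> dict:
    """Return kwargs for openai.OpenAI().

    Local mode targets Ollama's OpenAI-compatible endpoint; the dummy
    "ollama" key exists only because the client requires a key string.
    """
    if use_local:
        return {"base_url": "http://localhost:11434/v1", "api_key": "ollama"}
    return {"api_key": os.getenv("OPENAI_API_KEY", "your-api-key")}

# e.g.  client = OpenAI(**make_client_config(use_local=True))
```

Switching backends this way keeps the rest of the application unchanged, which is handy when moving between development (local model) and production (hosted API).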
The application we are going to build includes basic chat, multi-turn conversation management, a web interface, and an API service.

## The Minimal Chatbot

Start with the simplest possible implementation. A few lines of code are enough to call a large model:

```python
# chatbot_v1.py - minimal chatbot
import os
from openai import OpenAI

# Shared client initialization
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY", "your-api-key"))

# Model used throughout this article
MODEL = "gpt-3.5-turbo"

def chat(message: str) -> str:
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": message}]
    )
    return response.choices[0].message.content

# Quick test
print(chat("What are decorators in Python?"))
```

If you want to give the bot a particular persona or role, add a system prompt:

```python
# chatbot_v2.py - chatbot with a persona
import os
from openai import OpenAI

class ChatBot:
    def __init__(self, system_prompt: str = "You are a helpful AI assistant"):
        self.client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
        self.model = "gpt-3.5-turbo"  # single model name used everywhere
        self.system_prompt = system_prompt

    def chat(self, message: str) -> str:
        try:
            response = self.client.chat.completions.create(
                model=self.model,
                messages=[
                    {"role": "system", "content": self.system_prompt},
                    {"role": "user", "content": message}
                ],
                temperature=0.7
            )
            return response.choices[0].message.content
        except Exception as e:
            return f"Error: {e}"

# Create a specialist bot
coder = ChatBot("You are a Python expert; keep answers concise and accurate")
print(coder.chat("How do I implement the singleton pattern?"))
```

## Multi-Turn Conversation Management

A single-turn chat rarely satisfies real needs. Most applications must remember the conversation history so the model can understand context.

### Managing conversation history

```python
# conversation.py - full conversation management
import os
import json
from datetime import datetime
from typing import List, Dict
from openai import OpenAI

class ConversationManager:
    def __init__(self, model="gpt-3.5-turbo", max_history=10):
        self.client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
        self.model = model
        self.max_history = max_history
        self.conversations: Dict[str, List[Dict]] = {}

    def create_session(self, session_id: str = None) -> str:
        """Create a new session."""
        if not session_id:
            session_id = datetime.now().strftime("%Y%m%d_%H%M%S")
        self.conversations[session_id] = []
        return session_id

    def add_message(self, session_id: str, role: str, content: str):
        """Append a message to the history."""
        if session_id not in self.conversations:
            self.create_session(session_id)
        self.conversations[session_id].append({
            "role": role,
            "content": content,
            "timestamp": datetime.now().isoformat()
        })
        # Cap the history length
        if len(self.conversations[session_id]) > self.max_history * 2:
            self.conversations[session_id] = \
                self.conversations[session_id][-self.max_history:]

    def chat(self, session_id: str, message: str) -> str:
        """Run one turn of conversation."""
        if session_id not in self.conversations:
            self.create_session(session_id)
        # Record the user message
        self.add_message(session_id, "user", message)
        # Build the API payload (without the timestamp field)
        messages = [
            {"role": m["role"], "content": m["content"]}
            for m in self.conversations[session_id]
        ]
        # Call the API
        response = self.client.chat.completions.create(
            model=self.model, messages=messages
        )
        # Record the assistant reply
        reply = response.choices[0].message.content
        self.add_message(session_id, "assistant", reply)
        return reply

    def save_session(self, session_id: str, filepath: str):
        """Persist a session to disk."""
        with open(filepath, "w", encoding="utf-8") as f:
            json.dump(self.conversations.get(session_id, []), f,
                      ensure_ascii=False, indent=2)

    def load_session(self, session_id: str, filepath: str):
        """Load a session from disk."""
        with open(filepath, "r", encoding="utf-8") as f:
            self.conversations[session_id] = json.load(f)

# Usage
manager = ConversationManager()
session = manager.create_session()

# Multi-turn conversation
print(manager.chat(session, "I want to learn machine learning"))
print(manager.chat(session, "Recommend some introductory books"))
print(manager.chat(session, "What is the first book about?"))  # remembers context
```

## Streaming Output

A traditional API call waits for the entire reply to finish generating. To improve the user experience, you can print the content incrementally as it is generated:

```python
# streaming.py - streaming chat
import sys
from openai import OpenAI

class StreamingChatBot:
    def __init__(self):
        self.client = OpenAI()

    def stream_chat(self, message: str):
        """Print the reply token by token as it is generated."""
        stream = self.client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": message}],
            stream=True
        )
        full_response = ""
        for chunk in stream:
            if chunk.choices[0].delta.content:
                content = chunk.choices[0].delta.content
                full_response += content
                # Print in real time
                sys.stdout.write(content)
                sys.stdout.flush()
        return full_response

# Try streaming output
bot = StreamingChatBot()
response = bot.stream_chat("Write a poem about spring")
```

## Building a Web Interface

A command-line interface alone is not very friendly. Below are two ways to build a web chat UI.

### Gradio: quick start

```python
# app_gradio.py - Gradio chat UI
import gradio as gr
from conversation import ConversationManager

manager = ConversationManager()
current_session = manager.create_session()

def respond(message, history):
    """Handle a message and return the updated history."""
    response = manager.chat(current_session, message)
    history.append((message, response))
    return "", history

def clear_history():
    """Reset the conversation."""
    global current_session
    current_session = manager.create_session()
    return []

# Build the UI
with gr.Blocks(title="LLM Chat") as demo:
    gr.Markdown("## Chat Assistant")
    chatbot = gr.Chatbot(height=400)
    msg = gr.Textbox(placeholder="Type a message...", label="Message", lines=2)
    with gr.Row():
        submit = gr.Button("Send", variant="primary")
        clear = gr.Button("Clear")

    # Wire up events
    msg.submit(respond, [msg, chatbot], [msg, chatbot])
    submit.click(respond, [msg, chatbot], [msg, chatbot])
    clear.click(clear_history, outputs=[chatbot])

# Launch
if __name__ == "__main__":
    demo.launch(share=True, server_port=7860)
```

Gradio is very concise and well suited to rapid prototyping. If you need more customization, use Streamlit.

### Streamlit: more customization

```python
# app_streamlit.py - Streamlit chat UI
import time
import streamlit as st
from conversation import ConversationManager

# Page configuration
st.set_page_config(page_title="LLM Chat", page_icon="💬", layout="wide")

# Initialize state
if "manager" not in st.session_state:
    st.session_state.manager = ConversationManager()
    st.session_state.session_id = st.session_state.manager.create_session()
if "messages" not in st.session_state:
    st.session_state.messages = []

# Sidebar
with st.sidebar:
    st.title("⚙️ Settings")
    model = st.selectbox("Model", ["gpt-3.5-turbo", "gpt-4", "gpt-4-turbo"])
    temperature = st.slider("Creativity", 0.0, 1.0, 0.7)
    max_tokens = st.slider("Max length", 50, 2000, 500)
    if st.button("Clear conversation"):
        st.session_state.messages = []
        st.session_state.session_id = st.session_state.manager.create_session()
        st.rerun()

# Main area
st.title("LLM Chat Assistant")

# Replay message history
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# Input box
if prompt := st.chat_input("Say something..."):
    # Show the user message
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # Generate the reply
    with st.chat_message("assistant"):
        message_placeholder = st.empty()
        full_response = ""
        # True streaming could be wired in here; below we simulate it
        response = st.session_state.manager.chat(
            st.session_state.session_id, prompt)
        for char in response:
            full_response += char
            message_placeholder.markdown(full_response + "▌")
            time.sleep(0.01)
        message_placeholder.markdown(full_response)
    st.session_state.messages.append(
        {"role": "assistant", "content": full_response})
```

## Advanced Features

Once the basics are stable, you can add more advanced capabilities.

### Multi-model support

```python
# multi_model.py - multi-model routing
from openai import OpenAI

class MultiModelChat:
    def __init__(self):
        self.models = {
            "openai": self._chat_openai,
            "anthropic": self._chat_anthropic,
            "local": self._chat_local
        }

    def _chat_openai(self, message: str) -> str:
        client = OpenAI()
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": message}]
        )
        return response.choices[0].message.content

    def _chat_anthropic(self, message: str) -> str:
        # Anthropic Claude
        from anthropic import Anthropic
        client = Anthropic()
        response = client.messages.create(
            model="claude-3-5-sonnet-20241022",
            max_tokens=1000,
            messages=[{"role": "user", "content": message}]
        )
        return response.content[0].text

    def _chat_local(self, message: str) -> str:
        # Local model via Ollama
        import requests
        response = requests.post(
            "http://localhost:11434/api/generate",
            # stream=False so the reply arrives as a single JSON object
            json={"model": "llama2", "prompt": message, "stream": False}
        )
        return response.json()["response"]

    def chat(self, message: str, model: str = "openai") -> str:
        if model not in self.models:
            raise ValueError(f"Unsupported model: {model}")
        return self.models[model](message)
```

### Plugin system

```python
# plugins.py - feature plugins
from typing import Dict
from openai import OpenAI

class ChatPlugin:
    """Base class for chat plugins."""
    def process(self, message: str, context: Dict) -> str:
        raise NotImplementedError

class CodeExecutor(ChatPlugin):
    """Code execution plugin."""
    def process(self, message: str, context: Dict) -> str:
        if "```python" in message:
            # Extract the code block
            code = message.split("```python")[1].split("```")[0]
            # Naive execution -- a real application needs a sandbox
            try:
                exec_globals = {}
                exec(code, exec_globals)
                return "Code executed successfully"
            except Exception as e:
                return f"Execution error: {e}"
        return message

class WebSearch(ChatPlugin):
    """Web search plugin."""
    def process(self, message: str, context: Dict) -> str:
        if "search:" in message:
            query = message.split("search:")[1].strip()
            # Call a search API here
            return "Search results: [relevant content]"
        return message

class PluggableChatBot:
    def __init__(self):
        self.client = OpenAI()
        self.plugins = []

    def add_plugin(self, plugin: ChatPlugin):
        """Register a plugin."""
        self.plugins.append(plugin)

    def chat(self, message: str) -> str:
        # Plugin pre-processing
        context = {}
        for plugin in self.plugins:
            message = plugin.process(message, context)
        # Call the LLM
        response = self.client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": message}]
        )
        result = response.choices[0].message.content
        # Plugin post-processing
        for plugin in reversed(self.plugins):
            result = plugin.process(result, context)
        return result
```

## Performance Optimization

Common ways to speed up the application include caching, batching, and asynchronous calls:

```python
# optimization.py - performance tricks
import asyncio
import hashlib
from functools import lru_cache
from openai import OpenAI

class OptimizedChatBot:
    def __init__(self):
        self.client = OpenAI()
        self.cache = {}

    @lru_cache(maxsize=100)
    def _get_embedding(self, text: str):
        """Cache embedding lookups."""
        response = self.client.embeddings.create(
            model="text-embedding-ada-002", input=text
        )
        return response.data[0].embedding

    def _cache_key(self, message: str, model: str) -> str:
        """Build a cache key."""
        return hashlib.md5(f"{model}:{message}".encode()).hexdigest()

    async def chat_async(self, message: str, model: str = "gpt-3.5-turbo"):
        """Asynchronous call with response caching."""
        # Check the cache first
        cache_key = self._cache_key(message, model)
        if cache_key in self.cache:
            return self.cache[cache_key]
        # Run the blocking API call in a worker thread
        response = await asyncio.to_thread(
            self.client.chat.completions.create,
            model=model,
            messages=[{"role": "user", "content": message}]
        )
        result = response.choices[0].message.content
        self.cache[cache_key] = result
        return result

    async def batch_chat(self, messages: list):
        """Process a batch of messages concurrently."""
        tasks = [self.chat_async(msg) for msg in messages]
        return await asyncio.gather(*tasks)

# Usage
async def main():
    bot = OptimizedChatBot()
    # Batch processing
    questions = ["What is Python?", "How do I learn programming?",
                 "Recommend some books"]
    responses = await bot.batch_chat(questions)
    for q, r in zip(questions, responses):
        print(f"Q: {q}\nA: {r}\n")

# asyncio.run(main())
```

## Deployment and Testing

Once development is done, the application needs to be deployed. Below are a few common approaches.

### Serving an API

```python
# api_server.py - FastAPI service
from typing import Optional

import uvicorn
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

from conversation import ConversationManager

app = FastAPI(title="LLM Chat API")

class ChatRequest(BaseModel):
    message: str
    session_id: Optional[str] = None
    model: str = "gpt-3.5-turbo"

class ChatResponse(BaseModel):
    response: str
    session_id: str

manager = ConversationManager()

@app.post("/chat", response_model=ChatResponse)
async def chat(request: ChatRequest):
    try:
        session_id = request.session_id or manager.create_session()
        response = manager.chat(session_id, request.message)
        return ChatResponse(response=response, session_id=session_id)
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@app.get("/health")
async def health():
    return {"status": "healthy"}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

### Docker deployment

```dockerfile
# Dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 7860
CMD ["python", "app_gradio.py"]
```

```yaml
# docker-compose.yml
version: "3.8"
services:
  chatbot:
    build: .
    ports:
      - "7860:7860"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - ./data:/app/data
    restart: unless-stopped
```

### Testing

Before shipping to production, test the key functionality:

```python
# test_chat.py - test cases
import unittest
from chatbot_v2 import ChatBot
from conversation import ConversationManager

class TestChatBot(unittest.TestCase):
    def setUp(self):
        self.bot = ChatBot()

    def test_basic_chat(self):
        response = self.bot.chat("Hello")
        self.assertIsNotNone(response)
        self.assertIsInstance(response, str)

    def test_context_preservation(self):
        manager = ConversationManager()
        session = manager.create_session()
        manager.chat(session, "My name is Xiaoming")
        response = manager.chat(session, "What is my name?")
        self.assertIn("Xiaoming", response)

if __name__ == "__main__":
    unittest.main()
```

## Next Steps

Once the basic application works, you can keep extending it. Common improvements include:

- adding voice input and output to make the app easier to use;
- integrating a RAG system for knowledge-base support;
- adding multi-language support for an international audience.

These features are discussed in more detail in later articles in this series.
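As a small taste of the RAG direction, retrieval at its core is just "rank documents by similarity to the query." The sketch below fakes the embedding step with a bag-of-words vector so it runs without any model or API; every name in it is illustrative, and a real system would use an embedding model plus a vector store instead.

```python
# rag_teaser.py - minimal retrieval sketch (illustrative only)
import math
from collections import Counter

def bow_vector(text: str) -> Counter:
    """Naive bag-of-words 'embedding' (a stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k documents most similar to the query."""
    qv = bow_vector(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, bow_vector(d)), reverse=True)
    return ranked[:k]

docs = [
    "Python decorators wrap functions",
    "Spring is a season of renewal",
    "Docker packages applications into containers",
]
print(retrieve("how do docker containers work", docs))
```

The retrieved passages would then be injected into the system prompt before calling the model, which is the part later articles cover properly.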