Messages: In the calls above, we invoked the LLM by passing a plain string to `invoke`, i.e. text in, text out. That is the traditional, older-style completion interface. The APIs that LLM providers now promote are the so-called Chat Completion Models, which work as messages in, message out. Let's look at an example 🌰:

```python
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage

load_dotenv()

# LLM style: text in, text out
model = ChatOpenAI(model="gpt-4o")
result = model.invoke("What is 81 divided by 9?")
print("\n== LLM style: text in, text out ==\n")
print(result)

# SystemMessage:
#   Message for priming AI behavior, usually passed in as the first of a
#   sequence of input messages.
# HumanMessage:
#   Message from a human to the AI model.
messages = [
    SystemMessage(content="Solve the following math problems"),
    HumanMessage(content="What is 81 divided by 9?"),
]

# Chat style: messages in, message out
result = model.invoke(messages)
print("\n== Chat style: messages in, message out ==\n")
print(result.content)
```
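Under the hood, message classes like `SystemMessage` and `HumanMessage` correspond to the role-tagged message list that chat-completion APIs expect on the wire. The sketch below illustrates that shape with plain dicts and no API call; `build_messages` is an illustrative helper, not part of LangChain:

```python
# Sketch of the chat-completions wire format that SystemMessage /
# HumanMessage map onto: a list of {"role": ..., "content": ...} dicts.
# build_messages is a hypothetical helper for illustration only.

def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Return a chat-completions style message list: system first, then user."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "Solve the following math problems",
    "What is 81 divided by 9?",
)
print(messages[0]["role"])  # system
print(messages[1]["role"])  # user
```

The model's reply comes back as one more message in the same format (role `"assistant"`), which is what "message out" refers to.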