In the Prompt-techniques part of my series 《一文探秘LLM应用开发》, I introduced Analogical Prompting, which improves zero-shot performance by having the model find similar examples for itself.
> In the zero-shot CoT space, a new study released just a few days ago (Oct 3), "Large Language Models as Analogical Reasoners", proposes Analogical Prompting: the model generates similar problems on its own to serve as examples, then builds a chain of thought from those examples' solution steps to solve the new problem.
> — ully, WeChat official account "AI工程化", 《一文探秘LLM应用开发(21)-Prompt(提示工程技术、重要性与挑战)》
Not long after that work was published, Google DeepMind proposed a new technique on October 9: "Step-Back Prompting" (STP). Instead of finding similar examples by analogy, it has the LLM abstract the question on its own, deriving higher-level concepts and principles, and then use that knowledge to reason out a solution. This thinking pattern closely resembles how humans solve problems, letting the model draw on established principles to tackle the task at hand.
The researchers ran experiments with the PaLM-2L model and found that this prompting technique significantly improves performance on reasoning tasks:
- STEM
- knowledge QA
- multi-step reasoning
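To make the two-stage flow concrete, here is a minimal sketch in plain Python. The `llm` function is a hypothetical stand-in for any chat-model call, and the templates are my own paraphrase rather than the exact prompts from the paper:

```python
# Minimal two-stage sketch of step-back prompting.
# `llm` is a stub standing in for a real chat-model call; the prompt
# templates are an illustrative paraphrase, not the paper's exact prompts.

STEP_BACK_TEMPLATE = (
    "You are an expert at world knowledge. Rewrite the question below into a "
    "more generic, higher-level step-back question that is easier to answer.\n"
    "Question: {question}\n"
    "Step-back question:"
)

FINAL_TEMPLATE = (
    "Answer the original question, grounding your reasoning in the background.\n"
    "Background: {background}\n"
    "Original question: {question}\n"
    "Answer:"
)

def llm(prompt: str) -> str:
    # Stub so the sketch runs end to end; swap in a real model call here.
    if prompt.endswith("Step-back question:"):
        return "What are the general principles behind this question?"
    return "stub answer"

def step_back_answer(question: str) -> str:
    # Stage 1: abstract the concrete question into a higher-level one.
    abstract_q = llm(STEP_BACK_TEMPLATE.format(question=question))
    # Answer the abstraction first to surface the relevant principles.
    background = llm("Answer concisely: " + abstract_q)
    # Stage 2: answer the original question grounded in those principles.
    return llm(FINAL_TEMPLATE.format(question=question, background=background))
```

The stub only illustrates control flow; with a real model, stage 1 produces the abstract question and stage 2 reasons from the principles surfaced by answering it.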
To use STP, you can follow the prompt format given in the paper.
To quickly try a Chinese version, you can directly use the bot a user shared on Poe (BotSTB). Its prompt reads:
You are an expert in world knowledge, skilled at using the step-back questioning strategy to think through problems carefully, step by step, and answer them.
Step-back questioning is a thinking strategy intended to understand and analyze a specific problem or situation from a more macro or more fundamental perspective.
When facing a concrete problem, this strategy asks us to first "step back" and pose questions from a broader or more fundamental angle. The goal is to understand the problem's background, causes, and related foundational knowledge more deeply, so as to answer the original question better.
Strategy
Core concept identification: first pin down the core concept of the question. For example, if the question involves force in physics, you may need to step back to the basic definition and principles of force.
Scope of the question: try to identify the question's scope and context. This helps determine how far to step back; some questions need only a small step back, while others require going down to first principles.
History and background: for some questions, understanding their history and development can help in posing an appropriate step-back question.
Principles and assumptions: make the underlying principles and assumptions of the current question explicit. This helps determine which directions to step back in.
Execution steps
Greet the user in Chinese and ask them to enter a question. Whenever the user enters a question, answer it according to the following flow:
1. Give at least 3 candidate <step-back questions> that fit the <Strategy>, and answer each of them.
2. Using those answers as evidence, give the user's question a final answer that is logical, well organized, and supported by visual aids.
Examples:
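If you want to wire a system prompt like this into your own application, a minimal sketch follows, assuming an OpenAI-style chat-messages schema; the English system prompt is my own condensed paraphrase of the bot prompt above, not its exact text:

```python
# Packaging a step-back system prompt into an OpenAI-style message list.
# SYSTEM_PROMPT is a condensed English paraphrase of the Poe bot prompt.

SYSTEM_PROMPT = """You are an expert in world knowledge, skilled at the \
step-back questioning strategy. For every user question:
1. Identify its core concept, scope, history, and underlying principles.
2. Pose at least 3 step-back questions that fit the strategy and answer each.
3. Use those answers as evidence to give a structured final answer."""

def build_messages(user_question: str) -> list[dict]:
    # This message shape is accepted by OpenAI-compatible chat endpoints.
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_question},
    ]
```

From here, pass the list to whatever chat-completion client you use; the step-back behavior comes entirely from the system prompt.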
For engineering, step-back prompting (STP) can be combined with RAG: use the abstract question obtained from step-back prompting to retrieve more of the context the final answer needs, then submit that context together with the original question to the LLM, yielding better answer quality.
As the results show, step-back prompting combined with RAG improves over the baseline by 39.9%, and over a plain RAG pipeline by 21.6%.
This technique is already supported in LangChain; its usage is shown in the example below:
```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate, FewShotChatMessagePromptTemplate
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnableLambda

# Few Shot Examples
examples = [
    {
        "input": "Could the members of The Police perform lawful arrests?",
        "output": "what can the members of The Police do?",
    },
    {
        "input": "Jan Sindel’s was born in what country?",
        "output": "what is Jan Sindel’s personal history?",
    },
]
# We now transform these to example messages
example_prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "{input}"),
        ("ai", "{output}"),
    ]
)
few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)
prompt = ChatPromptTemplate.from_messages([
    ("system", """You are an expert at world knowledge. Your task is to step back and paraphrase a question to a more generic step-back question, which is easier to answer. Here are a few examples:"""),
    # Few shot examples
    few_shot_prompt,
    # New question
    ("user", "{question}"),
])
question_gen = prompt | ChatOpenAI(temperature=0) | StrOutputParser()

question = "was chatgpt around while trump was president?"
question_gen.invoke({"question": question})
# => 'when was ChatGPT developed?'
```
```python
from langchain.utilities import DuckDuckGoSearchAPIWrapper

search = DuckDuckGoSearchAPIWrapper(max_results=4)

def retriever(query):
    return search.run(query)

# Retrieval with the original question
retriever(question)
# => 'This includes content about former President Donald Trump. According to further tests, ChatGPT successfully wrote poems admiring all recent U.S. presidents, but failed when we entered a query for ... On Wednesday, a Twitter user posted screenshots of him asking OpenAI's chatbot, ChatGPT, to write a positive poem about former President Donald Trump, to which the chatbot declined, citing it ... While impressive in many respects, ChatGPT also has some major flaws. ... [President's Name]," refused to write a poem about ex-President Trump, but wrote one about President Biden ... During the Trump administration, Altman gained new attention as a vocal critic of the president. It was against that backdrop that he was rumored to be considering a run for California governor.'

# Retrieval with the step-back question
retriever(question_gen.invoke({"question": question}))
# => "Will Douglas Heaven March 3, 2023 Stephanie Arnett/MITTR | Envato When OpenAI launched ChatGPT, with zero fanfare, in late November 2022, the San Francisco-based artificial-intelligence company... ChatGPT, which stands for Chat Generative Pre-trained Transformer, is a large language model -based chatbot developed by OpenAI and launched on November 30, 2022, which enables users to refine and steer a conversation towards a desired length, format, style, level of detail, and language. ChatGPT is an artificial intelligence (AI) chatbot built on top of OpenAI's foundational large language models (LLMs) like GPT-4 and its predecessors. This chatbot has redefined the standards of... June 4, 2023 ⋅ 4 min read 124 SHARES 13K At the end of 2022, OpenAI introduced the world to ChatGPT. Since its launch, ChatGPT hasn't shown significant signs of slowing down in developing new..."
```
```python
# response_prompt_template = """You are an expert of world knowledge. I am going to ask you a question. Your response should be comprehensive and not contradicted with the following context if they are relevant. Otherwise, ignore them if they are not relevant.
# {normal_context}
# {step_back_context}
# Original Question: {question}
# Answer:"""
# response_prompt = ChatPromptTemplate.from_template(response_prompt_template)
from langchain import hub

response_prompt = hub.pull("langchain-ai/stepback-answer")

chain = {
    # Retrieve context using the normal question
    "normal_context": RunnableLambda(lambda x: x["question"]) | retriever,
    # Retrieve context using the step-back question
    "step_back_context": question_gen | retriever,
    # Pass on the question
    "question": lambda x: x["question"],
} | response_prompt | ChatOpenAI(temperature=0) | StrOutputParser()

chain.invoke({"question": question})
# => "No, ChatGPT was not around while Donald Trump was president. ChatGPT was launched on November 30, 2022, which is after Donald Trump's presidency. The context provided mentions that during the Trump administration, Altman, the CEO of OpenAI, gained attention as a vocal critic of the president. This suggests that ChatGPT was not developed or available during that time."
```
You can try it on Colab: https://github.com/langchain-ai/langchain/blob/master/cookbook/stepback-qa.ipynb