Poster in Workshop: System-2 Reasoning at Scale
Can Stories Help LLMs Reason? Curating Information Space Through Narrative
Vahid Sadiri Javadi · Johanne Trippas · Lucie Flek
Narrative is widely recognized as a powerful tool for structuring information and facilitating comprehension of complex ideas in domains such as science communication. This paper investigates whether incorporating narrative elements can help Large Language Models (LLMs) solve complex tasks more effectively. We propose Story of Thought (SoT), a novel approach that integrates narrative structures into prompting for problem-solving tasks. The approach constructs a narrative around the problem statement, providing a framework for identifying and organizing relevant information. We hypothesize that this narrative-based information curation enhances problem comprehension by contextualizing critical information and highlighting causal relationships within the problem space. Our experimental results show that SoT consistently surpasses Chain of Thought (CoT) and Analogical Reasoning on GPQA tasks, achieving higher accuracy and better solutions on physics, chemistry, and biology problems across all tested OpenAI, Meta, and Mistral LLMs.
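The abstract gives no implementation details, but the described pipeline (build a narrative around the problem, then solve conditioned on it) can be sketched as a two-stage prompt. This is a minimal illustration, not the authors' method: `call_llm` is a placeholder for any chat-completion API, and the prompt wording is assumed.

```python
from typing import Callable


def call_llm(prompt: str) -> str:
    """Placeholder LLM call. In practice, route this to an OpenAI,
    Meta, or Mistral chat-completion endpoint."""
    return f"[model response to: {prompt[:40]}...]"


def story_of_thought(problem: str, llm: Callable[[str], str] = call_llm) -> str:
    """Hedged sketch of a two-stage narrative-then-solve prompt."""
    # Stage 1: wrap the problem in a narrative that surfaces relevant
    # concepts and their causal relationships (the "information curation" step).
    narrative = llm(
        "Construct a short narrative around the following problem, "
        "identifying and organizing the information needed to solve it:\n"
        + problem
    )
    # Stage 2: solve the original problem, conditioning on the curated narrative.
    return llm(
        "Using the narrative below as context, solve the problem step by step.\n\n"
        f"Narrative:\n{narrative}\n\nProblem:\n{problem}"
    )


answer = story_of_thought("Why does a photon's wavelength shift in a gravitational field?")
```

With a real model behind `llm`, the second call receives both the generated narrative and the problem statement, which is the contextualization effect the abstract hypothesizes about.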