View a PDF of the paper titled Talking with Tables for Better LLM Factual Data Interactions, by Jio Oh and 7 other authors
Abstract: Large Language Models (LLMs) often struggle with requests involving information retrieval and data manipulation, which frequently arise in real-world scenarios under multiple conditions. In this paper, we demonstrate that leveraging tabular structures in LLM interactions is more effective than using other structures for handling prevalent requests that operate over factual data. Through comprehensive evaluations across various scenarios and request types, we show that providing tabular structures yields a 40.29% average performance gain along with better robustness and token efficiency. Through attention-value analysis, we discover that tables help LLMs better locate relevant information, explaining these improvements. Beyond tables and text, we evaluate whether (1) blending structuredness within text, such as providing templates or fixing the order of attributes, and (2) other representative structures, such as knowledge graphs and JSON, are helpful. We observe that tables offer the best balance between efficiency and effectiveness. The approach remains robust to task complexity and adapts to unstructured sources through text-to-table conversion. Overall, we highlight the untapped potential of tabular representations for future LLM applications.
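The abstract does not specify the exact serialization the authors use, but the idea of "providing tabular structures" can be sketched as rendering factual records into a markdown table inside the prompt. The function name, the sample records, and the prompt wording below are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch: serialize a list of uniform records as a markdown
# table for an LLM prompt, in the spirit of tabular-input interactions.
# Names and data here are illustrative, not from the paper.

def records_to_markdown_table(records):
    """Render a list of dicts with identical keys as a markdown table."""
    if not records:
        return ""
    headers = list(records[0].keys())
    lines = [
        "| " + " | ".join(headers) + " |",
        "| " + " | ".join("---" for _ in headers) + " |",
    ]
    for row in records:
        lines.append("| " + " | ".join(str(row[h]) for h in headers) + " |")
    return "\n".join(lines)

facts = [
    {"city": "Paris", "country": "France", "population_m": 2.1},
    {"city": "Seoul", "country": "South Korea", "population_m": 9.4},
]
prompt = "Answer using only the table below.\n\n" + records_to_markdown_table(facts)
```

A fixed header row and one record per line is one way to give the model the consistent attribute ordering that the abstract contrasts with free-form text.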
Submission history
From: Geon Heo
[v1] Sun, 22 Dec 2024 23:31:03 UTC (907 KB)
[v2] Sun, 25 May 2025 13:48:58 UTC (1,658 KB)
[v3] Mon, 16 Jun 2025 12:00:53 UTC (352 KB)
[v4] Thu, 8 Jan 2026 15:45:17 UTC (375 KB)

