View a PDF of the paper titled Entropy-Guided Dynamic Tokens for Graph-LLM Alignment in Molecular Understanding, by Zihao Jing and 5 other authors
Abstract: Molecular understanding is central to advancing scientific discovery, yet Large Language Models (LLMs) struggle to interpret molecular graphs effectively. Existing graph-LLM bridges often adapt the Q-Former-style connector with fixed-length static tokens, a design originally intended for vision tasks. These designs overlook stereochemistry and substructural context and typically require costly fine-tuning of the LLM backbone, limiting efficiency and generalization. We introduce EDT-Former, an Entropy-guided Dynamic Token Transformer that generates tokens aligned with informative molecular patches, thereby preserving both local and global structural features for molecular graph understanding. Unlike prior approaches, EDT-Former aligns frozen graph encoders with LLMs without tuning the LLM backbone (beyond the embedding layer), yielding computationally efficient fine-tuning. It achieves state-of-the-art results on MoleculeQA, molecule-oriented Mol-Instructions, and property-prediction benchmarks (TDC, MoleculeNet), underscoring its effectiveness for scalable and generalizable multimodal molecular understanding.
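The abstract does not specify how patch informativeness is scored, so the following is only a minimal illustrative sketch of the general idea of entropy-guided selection: score each molecular patch by the Shannon entropy of its predicted distribution and keep the most informative ones as dynamic tokens. All names here (`entropy_guided_tokens`, `patch_logits`, `max_tokens`) are hypothetical and not taken from the paper.

```python
import numpy as np

def entropy_guided_tokens(patch_logits, max_tokens=8):
    """Illustrative sketch (not the paper's method): select the
    most informative patches by Shannon entropy of their score
    distributions.

    patch_logits: (num_patches, num_classes) array of unnormalized
    scores, one row per molecular patch (a stand-in for a frozen
    graph encoder's per-patch output).
    Returns indices of the selected patches, highest entropy first.
    """
    # Numerically stable softmax over each patch's scores
    z = patch_logits - patch_logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # Shannon entropy per patch (natural log; epsilon avoids log(0))
    h = -(p * np.log(p + 1e-12)).sum(axis=1)
    # Keep the top-k highest-entropy patches as dynamic tokens
    return np.argsort(h)[::-1][:max_tokens]

# A uniform distribution (high entropy) outranks a peaked one (low entropy)
logits = np.array([[0.0, 0.0, 0.0],    # uniform -> most informative
                   [10.0, 0.0, 0.0]])  # peaked  -> least informative
print(entropy_guided_tokens(logits, max_tokens=1))
```

Because the number of selected tokens can vary with the entropy profile of each molecule, this kind of selection yields a variable-length token set, in contrast to the fixed-length static tokens of Q-Former-style connectors.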
Submission history
From: Zihao Jing
[v1]
Mon, 2 Feb 2026 19:56:21 UTC (1,893 KB)
[v2]
Wed, 11 Feb 2026 08:00:07 UTC (1,892 KB)
[v3]
Mon, 2 Mar 2026 05:13:15 UTC (1,894 KB)