ExACT: Teaching AI Agents to Explore with Reflective-MCTS and Exploratory Learning
Xiao Yu, Baolin Peng, Vineeth Vajipey, Hao Cheng, Michel Galley, Jianfeng Gao, Zhou Yu
- 🏛 Institutions
- Columbia, MSR
- 📅 Date
- October 2, 2024
- 📑 Publisher
- ICLR 2025 (Poster)
- 💻 Env
- Web
- 🔑 Keywords
TLDR
ExACT combines Reflective-MCTS test-time search with Exploratory Learning to teach web agents to explore, evaluate states, and backtrack. On VisualWebArena, the GPT-4o-based search agent improves substantially over prior methods, and the fine-tuned model recovers 87% of the search agent's performance while using much less inference compute.
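To make the test-time search concrete, below is a minimal, generic MCTS loop in the spirit of what the TLDR describes (select, expand, evaluate, backpropagate). This is a hedged sketch, not the paper's R-MCTS: the `evaluate` callback stands in for the paper's LM-based state evaluation, and the toy `actions`/`transition` interfaces are illustrative assumptions, not the ExACT API.

```python
import math
import random

class Node:
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent
        self.children = []
        self.visits = 0
        self.value = 0.0

def ucb(node, c=1.4):
    # Upper-confidence bound: trade off exploitation vs. exploration.
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(
        math.log(node.parent.visits) / node.visits
    )

def mcts(root_state, actions, transition, evaluate, iterations=200):
    """Generic MCTS: select via UCB, expand, evaluate a state, backpropagate."""
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # Selection: descend to a leaf, following UCB at each level.
        while node.children:
            node = max(node.children, key=ucb)
        # Expansion: on a revisited leaf, add one child per available action.
        if node.visits > 0:
            for a in actions(node.state):
                node.children.append(Node(transition(node.state, a), parent=node))
            if node.children:
                node = random.choice(node.children)
        # Evaluation: score the reached state (an LM value function in the paper).
        reward = evaluate(node.state)
        # Backpropagation: credit flows back to the root, enabling backtracking.
        while node:
            node.visits += 1
            node.value += reward
            node = node.parent
    # Act greedily with respect to visit counts.
    return max(root.children, key=lambda n: n.visits).state
```

As a usage example, searching over integer states with `evaluate(s) = s` steers the agent toward the `+1` action; the same loop applies to web navigation once `actions`, `transition`, and `evaluate` are backed by a browser environment and a language model.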
Related papers
- WALT: Web Agents that Learn Tools · October 1, 2025 · ICLR 2026 (Poster)
- Attacking Vision-Language Computer Agents via Pop-ups · November 4, 2024 · ACL 2025
- Tree Search for Language Model Agents · July 1, 2024 · TMLR 2025
- Dissecting Adversarial Robustness of Multimodal LM Agents · June 18, 2024 · ICLR 2025 (Poster)
- VisualWebArena: Evaluating Multimodal Agents on Realistic Visual Web Tasks · January 24, 2024 · ACL 2024