arxiv:2507.06229

Agent KB: Leveraging Cross-Domain Experience for Agentic Problem Solving

Published on Jul 8, 2025 · Submitted by Xiangru Tang on Jul 9, 2025
Abstract

Agent KB, a hierarchical experience framework, enhances problem-solving success rates across different agents by enabling cross-agent knowledge transfer through a Reason-Retrieve-Refine pipeline.

AI-generated summary

As language agents tackle increasingly complex tasks, they struggle with effective error correction and experience reuse across domains. We introduce Agent KB, a hierarchical experience framework that enables complex agentic problem solving via a novel Reason-Retrieve-Refine pipeline. Agent KB addresses a core limitation: agents traditionally cannot learn from each other's experiences. By capturing both high-level strategies and detailed execution logs, Agent KB creates a shared knowledge base that enables cross-agent knowledge transfer. Evaluated on the GAIA benchmark, Agent KB improves success rates by up to 16.28 percentage points. On the most challenging tasks, Claude-3 improves from 38.46% to 57.69%, while GPT-4 improves from 53.49% to 73.26% on intermediate tasks. On SWE-bench code repair, Agent KB enables Claude-3 to improve from 41.33% to 53.33%. Our results suggest that Agent KB provides a modular, framework-agnostic infrastructure for enabling agents to learn from past experiences and generalize successful strategies to new tasks.
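The abstract describes a Reason-Retrieve-Refine pipeline backed by a shared knowledge base of agent experiences. The paper does not provide an implementation here, so the following is a minimal, illustrative sketch of that idea: the names `Experience`, `AgentKB`, and `reason_retrieve_refine`, the toy lexical retriever, and the callback-based solver/refiner are all assumptions for demonstration, not the authors' actual system.

```python
from dataclasses import dataclass, field


@dataclass
class Experience:
    """A stored trace: high-level strategy plus a detailed execution log."""
    task: str
    strategy: str
    log: list


@dataclass
class AgentKB:
    """Shared knowledge base enabling cross-agent experience reuse (toy version)."""
    experiences: list = field(default_factory=list)

    def add(self, exp: Experience) -> None:
        self.experiences.append(exp)

    def retrieve(self, query: str, k: int = 1) -> list:
        # Toy lexical similarity: Jaccard overlap between query and stored task.
        def score(exp: Experience) -> float:
            q, t = set(query.lower().split()), set(exp.task.lower().split())
            return len(q & t) / max(len(q | t), 1)
        return sorted(self.experiences, key=score, reverse=True)[:k]


def reason_retrieve_refine(kb: AgentKB, task: str, draft_solver, refiner) -> str:
    """Sketch of the three-stage pipeline described in the abstract."""
    # Reason: produce an initial draft solution for the task.
    draft = draft_solver(task)
    # Retrieve: fetch the most relevant prior experience from the shared KB.
    hits = kb.retrieve(task)
    # Refine: revise the draft using the retrieved high-level strategy, if any.
    return refiner(draft, hits[0].strategy) if hits else draft
```

In this reading, cross-agent transfer falls out naturally: any agent can `add` its experiences and any other agent can `retrieve` them, so the KB is the modular, framework-agnostic component the abstract emphasizes. A real system would replace the Jaccard retriever with semantic retrieval and the callbacks with LLM calls.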

Get this paper in your agent:

    hf papers read 2507.06229

If you don't have the latest CLI, install it with:

    curl -LsSf https://hf.co/cli/install.sh | bash

Models citing this paper: 0
Datasets citing this paper: 0
Spaces citing this paper: 0
Collections including this paper: 5

Cite arxiv.org/abs/2507.06229 in a model, dataset, or Space README.md to link it from this page.