We’ve seen LlamaIndex’s incredible power for building knowledge bases with Query and Chat Engines. Separately, in our LangChain projects, we’ve explored the world of agents—autonomous entities that can reason and use tools.
What happens when we merge these two worlds? We get an agent that not only acts but has a deep, queryable “long-term memory” powered by a LlamaIndex data index.
While LlamaIndex is famous for its data-first approach to RAG, it also has a robust framework for building agents that can reason and use tools, much like LangChain. The most powerful pattern is to give a LlamaIndex agent a tool that is, in fact, its own knowledge base.
This tutorial will guide you step-by-step through creating a ReAct agent within the LlamaIndex ecosystem and connecting it to the data index we built in our previous post.
Part 1: The Tools – The Agent’s Capabilities
To showcase the agent’s ability to choose the right tool for the job, we will give it two distinct capabilities.
Tool 1: The Knowledge Base (A QueryEngineTool)
This is the main event. We won’t give the agent our raw index. Instead, we’ll create a QueryEngine from our index and then wrap it in a QueryEngineTool. This abstraction creates a clean, well-described tool that the agent can easily understand and use.
```python
# tools.py
from llama_index.core import StorageContext, load_index_from_storage
from llama_index.core.tools import QueryEngineTool

# Load the index we built in the previous post
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)

# Create a query engine from the index
query_engine = index.as_query_engine()

# Wrap the query engine in a tool
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="khardaha_knowledge_base",
    description=(
        "Provides detailed information about the history and culture of "
        "Khardaha, West Bengal. Use this for specific questions about the city."
    ),
)
```
Tool 2: A Simple Function Tool
To prove the agent can make choices, let’s give it another simple tool. FunctionTool is LlamaIndex’s equivalent of LangChain’s @tool decorator.
```python
# tools.py (continued)
from datetime import datetime

from llama_index.core.tools import FunctionTool


def get_current_day() -> str:
    """Returns the current day of the week (e.g., 'Wednesday')."""
    # Use the real system clock, not a canned value
    return datetime.now().strftime("%A")


day_tool = FunctionTool.from_defaults(
    fn=get_current_day,
    name="get_current_day",
)
```
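Before handing the function to an agent, it's worth sanity-checking it on its own. Here is a quick standalone check (plain Python, no LlamaIndex required) confirming that the function always returns a valid English day name:

```python
from datetime import datetime


def get_current_day() -> str:
    """Returns the current day of the week (e.g., 'Wednesday')."""
    return datetime.now().strftime("%A")


VALID_DAYS = {
    "Monday", "Tuesday", "Wednesday", "Thursday",
    "Friday", "Saturday", "Sunday",
}

day = get_current_day()
print(day)  # e.g. 'Wednesday', depending on when you run it
assert day in VALID_DAYS
```

Because the tool reads the real system clock, its output varies from run to run; the agent will see whatever day it is when the tool executes.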
We now have a list of two tools: one for a simple, general-purpose task (get_current_day) and one for specialized, deep knowledge (khardaha_knowledge_base).
Part 2: Assembling the LlamaIndex Agent
LlamaIndex offers several types of agents. The most direct equivalent to the agents we’ve built in LangChain is the ReActAgent. It follows the same Thought -> Action -> Observation loop.
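To make that loop concrete, here is a minimal, framework-free sketch of the ReAct cycle. This is illustrative pseudologic only, not LlamaIndex internals: the tool results and the `fake_llm` responses are hard-coded stand-ins, but the control flow (model proposes an Action, runtime executes the tool, the Observation is fed back until the model emits a final answer) is the same pattern ReActAgent automates:

```python
# A toy ReAct loop: illustrative only, not how LlamaIndex implements it.
from typing import Callable

# The agent's available tools, keyed by name (results stubbed for the demo).
tools: dict[str, Callable[[str], str]] = {
    "get_current_day": lambda _: "Wednesday",
    "khardaha_knowledge_base": lambda q: f"Retrieved notes for: {q}",
}


def fake_llm(transcript: list[str]) -> str:
    """Scripted stand-in for the LLM: request a tool call, then answer."""
    if not any(line.startswith("Observation:") for line in transcript):
        return "Action: get_current_day[]"
    return "Answer: Today is Wednesday."


def react_loop(question: str, max_steps: int = 5) -> str:
    transcript = [f"Question: {question}"]
    for _ in range(max_steps):
        step = fake_llm(transcript)
        if step.startswith("Answer:"):  # the model is done reasoning
            return step.removeprefix("Answer:").strip()
        # Parse "Action: tool_name[input]" and execute the matching tool.
        name, _, arg = step.removeprefix("Action:").strip().partition("[")
        observation = tools[name](arg.rstrip("]"))
        transcript += [step, f"Observation: {observation}"]
    return "Gave up."


print(react_loop("What day is it?"))  # -> Today is Wednesday.
```

In a real agent, `fake_llm` is an actual LLM call whose prompt contains the tool descriptions and the transcript so far, which is exactly the prompt engineering ReActAgent.from_tools handles for you.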
The ReActAgent.from_tools constructor makes setup incredibly simple. It handles the complex prompt engineering needed to make the agent aware of its tools and to follow the ReAct response format.

Here is the complete main.py to set up and run our agent.
```python
# main.py
from dotenv import load_dotenv
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI

from tools import query_tool, day_tool  # Import our tools

# --- SETUP ---
load_dotenv()
tools = [query_tool, day_tool]
llm = OpenAI(model="gpt-4o")

# --- CREATE THE AGENT ---
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

# --- RUN THE AGENT IN A CONVERSATIONAL LOOP ---
if __name__ == "__main__":
    print("LlamaIndex agent is ready. Type 'exit' to end.")
    while True:
        user_input = input("You: ")
        if user_input.lower() == "exit":
            break
        response = agent.chat(user_input)
        print(f"Agent: {response}")
```
Part 3: Running the Agent and Seeing the Choice
Now, let’s have a conversation with our agent and watch its reasoning process, thanks to verbose=True.
Conversation 1: Triggering the simple tool
You: What day of the week is it today?
Verbose Output:
```
Thought: The user is asking for the current day of the week. The `get_current_day` tool can answer this.
Action: get_current_day
Observation: Wednesday
Thought: I have the answer. I will respond to the user.

Agent: Today is Wednesday.
```
The agent correctly identified the right tool for the job and ignored the knowledge base.
Conversation 2: Triggering the RAG tool
You: Tell me about the religious heritage of Khardaha.
Verbose Output:
```
Thought: The user is asking a specific question about Khardaha. I should use the `khardaha_knowledge_base` tool to find this information.
Action: khardaha_knowledge_base with input "What is the religious heritage of Khardaha?"
Observation: Khardaha is known for its rich religious and cultural heritage, particularly its connection to the Vaishnava tradition...
Thought: I have retrieved the information from the knowledge base. I will now synthesize it into a final answer.

Agent: Khardaha is well-known for its rich religious and cultural heritage, especially its ties to the Vaishnava tradition. A key landmark is the Shyamsundar Temple. The city also hosts the annual Ghoshpara fair, which is a major event for the Kartabhaja sect.
```
The agent correctly identified that this specific question required deep knowledge and used our RAG tool to get a factually grounded answer from our documents.
Conclusion
You have now successfully built a LlamaIndex agent and, more importantly, have given that agent a “long-term memory” by equipping it with a tool to query your custom knowledge base.
While the agent architectures in LangChain and LlamaIndex share the same core ReAct principles, LlamaIndex’s agent framework is, unsurprisingly, deeply integrated with its data structures: turning an existing index into a QueryEngineTool is a first-class operation.
This pattern of giving an agent a RAG tool is one of the most powerful in agentic application design. It allows your agent to be both a general-purpose reasoner capable of simple tasks, and a deep subject-matter expert with grounded recall of your specific domain knowledge. You can now build intelligent agents in both of the industry’s leading frameworks, choosing the right ecosystem for your project’s needs.
Author

Experienced Cloud & DevOps Engineer with hands-on experience in AWS, GCP, Terraform, Ansible, ELK, Docker, Git, GitLab, Python, PowerShell, Shell, and theoretical knowledge of Azure, Kubernetes & Jenkins. In my free time, I write blogs on ckdbtech.com