This article will show you how to build a simple AI agent using the ADK (Agent Development Kit) to fetch data from a FASTMCP server. We’ll use the fastmcp Client to connect to the server and an LlmAgent to intelligently route user requests to a custom tool.
The provided Python code demonstrates how to create this agent. Let’s break it down piece by piece.
Prerequisites and Setup
First, we handle our dependencies and configuration. The code starts with a warnings filter. This isn’t strictly necessary for the agent’s functionality; it just keeps the console output clean by suppressing a non-critical UserWarning raised by pydantic.
import warnings

warnings.filterwarnings(
    "ignore",
    message='Field name "config_type" in "SequentialAgent" shadows an attribute in parent "BaseAgent"',
    category=UserWarning,
    module="pydantic._internal._fields",
)

import asyncio
from typing import Any

from dotenv import load_dotenv
from fastmcp import Client
from google.adk.agents import Agent, LlmAgent
from google.adk.events import Event
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.genai import types
from pydantic import BaseModel, Field
We’re importing key components:
- fastmcp.Client: The client library that allows us to make remote procedure calls (RPCs) to our FASTMCP server.
- google.adk.agents.LlmAgent: The core class for our agent. It’s an agent powered by a Large Language Model (LLM).
- asyncio: Required for asynchronous operations, which are essential for network I/O.
- Other libraries, such as pydantic and typing, provide data validation and type hinting, respectively, and are common in modern Python development.
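One detail worth noting: load_dotenv is imported above but never called in the snippets shown. Typically you would call it once at startup so the ADK can pick up your Gemini API key from a .env file. A minimal sketch, assuming that is how this project manages credentials:

# Assumed setup: load GOOGLE_API_KEY (and any other settings) from a
# local .env file so the Gemini model can authenticate.
load_dotenv()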
Creating the Agent’s Tool 
The heart of our agent’s functionality is the tool. A tool is a standard Python function that an LLM-powered agent can “choose” to call to perform a specific action. Here, our tool is a function called get_mcp_data.
async def get_mcp_data(object_id: str) -> dict:
    """Fetches an object by its ID from the MCP server."""
    print(f"Tool 'get_mcp_data' called with object_id: {object_id}")
    # Connect to the local FASTMCP server; the async context manager
    # opens the connection and closes it cleanly when we're done.
    async with Client("http://127.0.0.1:8001/mcp") as client:
        # Remotely invoke the server-side "get_object_by_id" tool.
        single = await client.call_tool("get_object_by_id", {"object_id": object_id})
        print("Fetched single:", single)
        return single
This asynchronous function takes a single argument, object_id.

- It establishes a connection to our local FASTMCP server using the fastmcp.Client. The async with statement ensures the connection is properly managed.
- It then uses client.call_tool("get_object_by_id", {"object_id": object_id}) to remotely execute the get_object_by_id tool on the server.
- The result from the server is returned as a dictionary.
The agent’s LLM will be responsible for understanding when to call this function and what object_id to pass to it.
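Note that the tool assumes a FASTMCP server is already running at http://127.0.0.1:8001/mcp and exposing a get_object_by_id tool. The real server lives in the linked repository; purely for orientation, a minimal stand-in (hypothetical data and names, assuming fastmcp 2.x) might look like this:

from fastmcp import FastMCP

mcp = FastMCP("demo-data-server")

# Toy in-memory store standing in for whatever data source the real
# server uses; purely illustrative.
OBJECTS = {
    "1": {"id": "1", "name": "Widget"},
    "2": {"id": "2", "name": "Gadget"},
}

@mcp.tool()
def get_object_by_id(object_id: str) -> dict:
    """Return the object stored under the given ID."""
    return OBJECTS.get(object_id, {"error": f"no object with id {object_id}"})

if __name__ == "__main__":
    # Streamable HTTP serves at /mcp by default, which matches the URL
    # the agent's tool connects to.
    mcp.run(transport="streamable-http", host="127.0.0.1", port=8001)

With a server like this running, you can also exercise the tool on its own, e.g. asyncio.run(get_mcp_data("1")), before wiring it into the agent.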
Defining the LlmAgent
Now we put everything together by defining our agent. This is where we configure its personality and capabilities and, most importantly, tell it about the tool it can use.
call_mcp_server_agent = LlmAgent(
    model="gemini-2.0-flash",
    name="assistant",
    description="This agent is used to get data using the FASTMCP client by calling the FASTMCP server.",
    instruction="""Help the user fetch data from the FASTMCP server using the FASTMCP client.
    When the user asks to fetch data for a specific object ID, use the `get_mcp_data` tool and pass the ID to it.
    """,
    tools=[get_mcp_data],
)
Let’s look at the key parameters:
model="gemini-2.0-flash": We specify the LLM we want to use. This model will act as the “brain” of our agent.description: A brief summary of what the agent does.instruction: This is the most critical part. It’s the prompt that guides the LLM’s behavior. We explicitly tell the model: “When the user asks to fetch data for a specific object ID, use theget_mcp_datatool and pass the ID to it.” This instruction is how the agent knows to call our custom function.tools=[get_mcp_data]: This is a list of the callable functions the agent has access to. By including ourget_mcp_datafunction here, we are making it available for the LLM to use.
Finally, root_agent = call_mcp_server_agent simply makes this agent the entry point for our application. From here, you would typically use an ADK Runner to create an interactive session where users can chat with the agent and trigger tool calls, as sketched below.
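As a rough sketch of what that wiring might look like (the app, user, and session IDs here are placeholders, and the exact session-service API varies slightly across ADK versions):

import asyncio

from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService
from google.genai import types

root_agent = call_mcp_server_agent

async def main():
    # In-memory sessions are fine for local experimentation.
    session_service = InMemorySessionService()
    await session_service.create_session(
        app_name="mcp_demo", user_id="user_1", session_id="session_1"
    )
    runner = Runner(agent=root_agent, app_name="mcp_demo", session_service=session_service)

    message = types.Content(role="user", parts=[types.Part(text="Fetch the data for object ID 1")])

    # The runner streams events (tool calls, intermediate steps, model
    # responses); print the final text reply.
    async for event in runner.run_async(user_id="user_1", session_id="session_1", new_message=message):
        if event.is_final_response() and event.content and event.content.parts:
            print(event.content.parts[0].text)

if __name__ == "__main__":
    asyncio.run(main())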
The full source code for this project is available on GitHub:
https://github.com/shdhumale/app.git
