In this post, we’ll dive into how to use FastMCP to create and consume local servers. This is a common and powerful pattern for building AI agents that need to execute local tools without the complexity of a remote API call. We’ll start with a simple example and then expand to a more practical use case: wrapping a REST API.
A Simple FastMCP Server and Client
First, let’s create a straightforward server. This server defines a single tool called `hello` that takes a name as a string and returns a greeting.
my_server.py
```python
from fastmcp import FastMCP

mcp = FastMCP(name="MyServer")

@mcp.tool
def hello(name: str) -> str:
    return f"Hello, {name}!"

if __name__ == "__main__":
    mcp.run()
```
This server is self-contained. The `mcp.run()` call inside the `if __name__ == "__main__":` block starts the server. By default, it will use a `StdioTransport`, which means it communicates over standard input/output streams. This is the ideal transport for a client that runs the server as a subprocess.
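The same `run()` call also accepts an explicit `transport` argument, so you could expose the server over the network instead of standard I/O. A minimal sketch (the exact transport names and keyword arguments depend on your installed FastMCP version, so treat this as an illustration rather than the author's code):

```python
if __name__ == "__main__":
    # Equivalent to the default: communicate over standard input/output.
    mcp.run(transport="stdio")

    # Alternatively, FastMCP can serve over HTTP-based transports such as SSE;
    # the host/port keywords shown here are illustrative.
    # mcp.run(transport="sse", host="127.0.0.1", port=8000)
```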
Now, let’s create a client to consume this server. The beauty of FastMCP is that the client handles the server’s lifecycle automatically.
my_client.py
```python
import asyncio
from fastmcp import Client

async def main():
    # The Client will automatically infer the StdioTransport
    # and manage the server's subprocess lifecycle.
    async with Client("my_server.py") as client:
        # Call the 'hello' tool on the server with a dictionary of arguments.
        result = await client.call_tool("hello", {"name": "World"})
        print(result)

if __name__ == "__main__":
    asyncio.run(main())
```
When you run `my_client.py`, the FastMCP `Client` will:

- Start `my_server.py` as a subprocess.
- Send the `call_tool` request to the subprocess over standard I/O.
- Receive the response.
- Shut down the server subprocess when the `async with` block exits.
The final output demonstrates the successful communication:
```
(.venv) C:\vscode-python-workspace\adkagent>python my_client.py
CallToolResult(content=[TextContent(type='text', text='Hello, World!', annotations=None, meta=None)], structured_content={'result': 'Hello, World!'}, data='Hello, World!', is_error=False)
```
The output shows the raw `CallToolResult` object, which contains the greeting “Hello, World!” from the server.

Wrapping a REST API with a FastMCP Server
Now, let’s create a more advanced server that acts as a wrapper for a public REST API. This is a powerful technique for exposing external services to an AI agent in a clean, tool-based format. We’ll use the API at https://api.restful-api.dev/objects as our example.
restapi_mcp_server.py
```python
# restapi_mcp_server.py
from fastmcp import FastMCP, Context
import httpx

mcp = FastMCP(name="RESTful API Wrapper")

BASE_URL = "https://api.restful-api.dev/objects"

@mcp.tool()
async def get_all_objects(ctx: Context):
    # GET all objects from the public REST API.
    async with httpx.AsyncClient() as client:
        resp = await client.get(BASE_URL)
        return resp.json()

@mcp.tool()
async def get_object_by_id(object_id: str, ctx: Context):
    # GET a single object by its ID.
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"{BASE_URL}/{object_id}")
        return resp.json()

@mcp.tool()
async def add_object(data: dict, ctx: Context):
    # POST a new object.
    async with httpx.AsyncClient() as client:
        resp = await client.post(BASE_URL, json=data)
        return resp.json()

# ... other CRUD tools (update, patch, delete)
# Note: The full code is available in the provided source repository.

if __name__ == "__main__":
    mcp.run()
```
This script defines several tools (e.g., `get_object_by_id`, `add_object`) that correspond directly to standard REST API operations. An AI agent or client can now call these human-readable tools without needing to know anything about `httpx`, URLs, or HTTP methods.
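For completeness, here is a sketch of what the elided update and delete tools could look like if added to `restapi_mcp_server.py`, following the same pattern as the tools above. This is an illustration, not the author's exact code; the full version lives in the linked repository.

```python
@mcp.tool()
async def update_object(object_id: str, data: dict, ctx: Context):
    # Hypothetical example: replace an existing object via HTTP PUT.
    async with httpx.AsyncClient() as client:
        resp = await client.put(f"{BASE_URL}/{object_id}", json=data)
        return resp.json()

@mcp.tool()
async def delete_object(object_id: str, ctx: Context):
    # Hypothetical example: remove an object via HTTP DELETE.
    async with httpx.AsyncClient() as client:
        resp = await client.delete(f"{BASE_URL}/{object_id}")
        return resp.json()
```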
Using an AI Agent as a Client to the FastMCP Server
Finally, let’s build an AI agent client that uses the Google Agent Development Kit (ADK) to interact with our new REST API wrapper server. This demonstrates a complete workflow where a user’s natural language request triggers a tool call to our local FastMCP server.
restapi_mcp_client.py
```python
import asyncio
from fastmcp import Client
from google.genai import types
from google.adk.agents import LlmAgent
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService

# A simple tool function for the ADK agent.
# This function is the bridge between the agent and our MCP server.
async def get_mcp_data(object_id: str) -> dict:
    """Fetches an object by its ID from the MCP server."""
    # The Client starts restapi-mcp-server.py as a subprocess.
    async with Client("restapi-mcp-server.py") as client:
        single = await client.call_tool("get_object_by_id", {"object_id": object_id})
        return single.data

# The LLM Agent configured with our tool.
call_mcp_server_agent = LlmAgent(
    model="gemini-2.0-flash",
    name="assistant",
    instruction="""Help the user fetch data for a specific object ID.
    When a user asks to fetch data, use the `get_mcp_data` tool with the provided ID.
    """,
    tools=[get_mcp_data],
)

async def get_agent_async(query):
    # Boilerplate to set up the session and runner.
    session_service = InMemorySessionService()
    session = await session_service.create_session(
        app_name="CALL_MCP_SERVER", user_id="1234", session_id="session1234"
    )
    runner = Runner(
        agent=call_mcp_server_agent,
        app_name="CALL_MCP_SERVER",
        session_service=session_service,
    )

    # Run the agent with a user query.
    content = types.Content(role='user', parts=[types.Part(text=query)])
    events = runner.run_async(user_id="1234", session_id="session1234", new_message=content)

    final_response = "Agent did not produce a final response."
    async for event in events:
        if event.is_final_response() and event.content:
            final_response = event.content.parts[0].text
    return final_response

if __name__ == "__main__":
    final_result = asyncio.run(get_agent_async("Fetch the data for object_id 2"))
    print(f"\n--- Script Finished ---\nFinal returned value: {final_result}")
```
When this client runs, the AI agent’s instructions tell it to use the `get_mcp_data` tool. The tool function then uses the FastMCP `Client` to call our local `restapi_mcp_server.py`, which in turn fetches the data from the external REST API. The ADK agent then processes this data and provides a final response to the user.
Here’s the final output:
```
(.venv) C:\vscode-python-workspace\adkagent>python restapi-mcp-client.py
Tool 'get_mcp_data' called with object_id: 2
Fetched single: CallToolResult(...)
Potential final response from [assistant]: OK. I have fetched the data for object ID 2. The result is: ...

--- Script Finished ---
Final returned value: OK. I have fetched the data for object ID 2. The result is: ...
```
This full example showcases the power of combining an AI agent framework with a local FastMCP tool server. The FastMCP layer effectively abstracts away the complexity of communicating with local or remote services, providing a simple, consistent interface for your agents.
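If you want to sanity-check the wrapper server on its own, without involving the agent at all, the FastMCP `Client` can simply list the tools the server registers. A minimal sketch, assuming the same server script name used above:

```python
import asyncio
from fastmcp import Client

async def check_server():
    # Spawn the wrapper server as a subprocess and ask it which tools it exposes.
    async with Client("restapi-mcp-server.py") as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])
        # Expected something like: ['get_all_objects', 'get_object_by_id', 'add_object', ...]

if __name__ == "__main__":
    asyncio.run(check_server())
```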

You can find all the source code for these examples on GitHub.
Now, let’s consume the same REST FastMCP server using a Google ADK tool:
```python
# Connecting to the local MCP server from a Google ADK agent
import warnings

warnings.filterwarnings(
    "ignore",
    message='Field name "config_type" in "SequentialAgent" shadows an attribute in parent "BaseAgent"',
    category=UserWarning,
    module="pydantic._internal._fields",
)

import asyncio
from fastmcp import Client
from typing import Any
from google.genai import types
from dotenv import load_dotenv
from google.adk.agents import LlmAgent
from pydantic import BaseModel, Field
from google.adk.agents import Agent
from google.adk.events import Event
from google.adk.runners import Runner
from google.adk.sessions import InMemorySessionService

async def get_mcp_data(object_id: str) -> dict:
    """Fetches an object by its ID from the MCP server."""
    print(f"Tool 'get_mcp_data' called with object_id: {object_id}")
    # Start the local FastMCP server as a subprocess and call its tool.
    async with Client("restapi-mcp-server.py") as client:
        single = await client.call_tool("get_object_by_id", {"object_id": object_id})
        print("Fetched single:", single)
        return single

call_local_mcp_server_agent = LlmAgent(
    model="gemini-2.0-flash",
    name="assistant",
    description="This agent is used to get data using FASTMCP client by calling the FASTMCP server",
    instruction="""Help user to fetch the data from the FASTMCP Server using FASTMCP Client.
    When the user asks to fetch data for a specific object ID, use the `get_mcp_data` tool and pass the ID to it.
    """,
    tools=[get_mcp_data],
)

root_agent = call_local_mcp_server_agent
```
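Note that, unlike the previous client script, this module does not run anything itself: it only exposes `root_agent` at module level. By ADK convention, that is the variable the ADK command-line tools (such as `adk web` or `adk run`) look for, so this version is assumed to live inside a standard ADK agent package and be launched from the ADK CLI rather than via `asyncio.run()`.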
Source code: https://github.com/shdhumale/app.git