This content originally appeared on DEV Community and was authored by Seenivasa Ramadurai
The landscape of AI development is rapidly evolving beyond single-agent systems toward interconnected ecosystems where agents can communicate, collaborate, and share capabilities. Two emerging protocols are making this vision a reality: Agent-to-Agent (A2A) and Model Context Protocol (MCP). In this post, we’ll explore how these protocols work together to create powerful, modular AI systems through practical code examples.
Understanding the Protocol Landscape
Agent-to-Agent (A2A) Protocol
The A2A protocol enables direct communication between AI agents, allowing them to:
- Exchange messages and requests
- Share tasks and delegate work
- Coordinate complex multi-agent workflows
- Maintain consistent communication standards
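To make the communication standard concrete, here is an illustrative sketch of what an A2A task request might look like on the wire. The field names are modeled loosely on the A2A specification's message/parts structure and may differ across library versions; treat this as a shape illustration, not an exact wire format.

```python
# Illustrative shape of an A2A task request: a client sends a task whose
# message carries typed "parts" (here, plain text), and the receiving
# agent replies in the same envelope. Field names are an assumption
# based on the A2A spec's general structure.
task_request = {
    "id": "task-001",
    "message": {
        "role": "user",
        "parts": [{"type": "text", "text": "What time is it?"}],
    },
}

print(task_request["message"]["parts"][0]["text"])  # What time is it?
```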
Model Context Protocol (MCP)
MCP provides a standardized way for AI models to access external tools and resources:
- Tool integration and execution
- Resource access (files, databases, APIs)
- Standardized request/response formats
- Transport layer abstraction (HTTP, WebSocket, etc.)
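The "standardized request/response format" rides on JSON-RPC 2.0. The sketch below shows the approximate shape of an MCP `tools/call` exchange for the addition tool built later in this post; it is a simplified illustration of the framing, not a working client.

```python
# Simplified shape of an MCP tool call over JSON-RPC 2.0.
# "tools/call" with a name and arguments object is how MCP invokes a
# tool; the result carries a list of typed content blocks.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "add_numbers", "arguments": {"a": 2, "b": 3}},
}

response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "2.0 + 3.0 = 5.0"}]},
}

print(response["result"]["content"][0]["text"])  # 2.0 + 3.0 = 5.0
```

Because every tool call shares this envelope, clients can invoke any tool on any MCP server without tool-specific parsing logic.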
Architecture Overview
Our example demonstrates a practical implementation where:
- FastMCP Server exposes utility tools via MCP protocol
- A2A Agent Server handles user requests and coordinates tool usage
- OpenAI Integration provides natural language processing capabilities
```
User Request → A2A Agent → MCP Tools → Response
                   ↓
               OpenAI API
```
Building the FastMCP Server
The FastMCP server acts as our tool provider, exposing three key utilities:
Core Tools Implementation
Mathematical Operations
```python
@mcp.tool(name="add_numbers", description="Add two numbers")
def add_numbers(a: float, b: float):
    logging.info(f"add_numbers called with: a={a}, b={b}")
    try:
        result = a + b
        return text_response(f"{a} + {b} = {result}")
    except Exception as e:
        return text_response(f"Error adding numbers: {str(e)}")
```
File System Access
```python
@mcp.tool(name="read_file", description="Read the contents of a file")
def read_file(filename: str):
    logging.info(f"read_file called with: filename={filename}")
    try:
        with open(filename, 'r') as f:
            content = f.read()
        return text_response(content)
    except Exception as e:
        return text_response(f"Error reading file '{filename}': {str(e)}")
```
System Information
```python
@mcp.tool(name="get_time", description="Get the current date and time")
def get_time():
    now = datetime.now()
    response = (
        f"Current time: {now.strftime('%H:%M:%S')}\n"
        f"Today's date: {now.strftime('%A, %B %d, %Y')}"
    )
    return text_response(response)
```
Key Features of the MCP Implementation
Standardized Response Format: All tools return responses in a consistent MCP format using the text_response() helper function.
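The post doesn't show text_response() itself, but a plausible minimal implementation simply wraps a string in MCP's text-content envelope. This is an assumed reconstruction, not the author's actual helper:

```python
def text_response(text: str) -> dict:
    """Wrap a string in an MCP-style text-content envelope.

    Assumed minimal version of the post's helper (its implementation
    isn't shown); the shape mirrors MCP's text content blocks.
    """
    return {"content": [{"type": "text", "text": text}]}

print(text_response("2 + 3 = 5"))
```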
Comprehensive Logging: Every tool call is logged with parameters and results, enabling debugging and monitoring.
Error Handling: Robust exception handling ensures the server remains stable even when tools encounter errors.
Transport Flexibility: The server uses HTTP transport but can easily switch to other protocols.
Creating the A2A Agent
The A2A agent serves as the intelligent coordinator, making decisions about when to use tools versus when to handle requests directly.
Intelligent Request Routing
```python
def handle_task(self, task):
    message_data = task.message or {}
    content = message_data.get("content", {})
    text = content.get("text", "") if isinstance(content, dict) else ""
    text_lower = text.lower()

    # Pattern matching for different request types
    numbers = extract_addition_numbers(text)
    if numbers:
        # Route to MCP addition tool
        a, b = numbers
        tool_result = asyncio.run(call_tool(self.mcp_url, "add_numbers", {"a": a, "b": b}))
    elif text_lower.startswith("read file "):
        # Route to MCP file reading tool
        filename = extract_filename(text)
        tool_result = asyncio.run(call_tool(self.mcp_url, "read_file", {"filename": filename}))
    elif any(kw in text_lower for kw in ["time", "date", "today", "day"]):
        # Route to MCP time tool
        tool_result = asyncio.run(call_tool(self.mcp_url, "get_time", {}))
    else:
        # Route to OpenAI for general queries
        response = self.openai_client.handle_message(text)
```
Advanced Pattern Recognition
The agent uses sophisticated pattern matching to identify user intent:
```python
import re

def extract_addition_numbers(text):
    # Handles various natural language expressions of addition
    patterns = [
        r'add\s+(\d+(?:\.\d+)?)\s+(?:and|to)?\s*(\d+(?:\.\d+)?)',
        r'sum\s+(\d+(?:\.\d+)?)\s+(\d+(?:\.\d+)?)',
        r'(\d+(?:\.\d+)?)\s*\+\s*(\d+(?:\.\d+)?)',
        r'what is\s+(\d+(?:\.\d+)?)\s*\+\s*(\d+(?:\.\d+)?)',
        r'(\d+(?:\.\d+)?)\s+plus\s+(\d+(?:\.\d+)?)'
    ]
    for pattern in patterns:
        match = re.search(pattern, text.lower())
        if match:
            return float(match.group(1)), float(match.group(2))
    return None
```
Agent Capabilities and Skills
The A2A agent defines its capabilities through an AgentCard:
```python
agent_card = AgentCard(
    name="Simple Agent",
    description="Agent using OpenAI and MCP tools",
    skills=[
        AgentSkill(name="Add Numbers", description="Add two numbers using the MCP tool"),
        AgentSkill(name="Read File", description="Read a file using the MCP tool"),
        AgentSkill(name="General Q&A", description="Answer general questions using OpenAI"),
        AgentSkill(name="Get Time", description="Get current date and time using the MCP tool"),
    ]
)
```
This self-describing capability allows other agents to understand what services are available, enabling dynamic service discovery in multi-agent systems.
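In practice, discovery works because the card is serialized to JSON and served over HTTP (the A2A spec uses a well-known path for this). The sketch below shows the idea with plain dictionaries; the field names mirror the AgentCard above, but exact serialization details vary by A2A library version.

```python
import json

# A discovered agent card as a plain JSON-compatible dict. The fields
# mirror the AgentCard defined above; the exact wire format is an
# assumption and varies by A2A library version.
agent_card = {
    "name": "Simple Agent",
    "description": "Agent using OpenAI and MCP tools",
    "skills": [
        {"name": "Add Numbers", "description": "Add two numbers using the MCP tool"},
        {"name": "Read File", "description": "Read a file using the MCP tool"},
        {"name": "General Q&A", "description": "Answer general questions using OpenAI"},
        {"name": "Get Time", "description": "Get current date and time using the MCP tool"},
    ],
}

card_json = json.dumps(agent_card)  # what would travel over HTTP

# A client can filter agents by advertised skill before delegating work:
skill_names = {s["name"] for s in agent_card["skills"]}
print("Read File" in skill_names)  # True
```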
Integration Patterns
Asynchronous Tool Execution
The system uses async/await patterns for efficient tool execution:
```python
async def call_tool(mcp_url, tool_name, parameters):
    client = Client(mcp_url)
    async with client:
        result = await client.call_tool(tool_name, parameters)
        return result
```
Hybrid AI Processing
The agent combines multiple AI approaches:
- Rule-based routing for structured requests
- OpenAI processing for natural language understanding
- MCP tool execution for specific capabilities
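The hybrid idea can be sketched as a small routing function: structured requests resolve to a specific tool, and everything else falls through to the language model. The handler labels below are illustrative, not names from the post's code.

```python
import re

def route(text: str) -> str:
    """Return a label for the handler that should process this request.

    A minimal sketch of rule-based routing with an LLM fallback; the
    "mcp:*" / "openai:*" labels are made up for illustration.
    """
    t = text.lower()
    if re.search(r'\d+\s*\+\s*\d+', t) or t.startswith("add "):
        return "mcp:add_numbers"
    if t.startswith("read file "):
        return "mcp:read_file"
    if any(kw in t for kw in ("time", "date", "today")):
        return "mcp:get_time"
    return "openai:chat"  # fallback: general natural-language handling

print(route("read file notes.txt"))  # mcp:read_file
print(route("what is 2 + 2"))        # mcp:add_numbers
print(route("tell me a joke"))       # openai:chat
```

The ordering matters: cheap deterministic checks run first, so the LLM is only consulted when no rule matches.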
(Screenshots in the original post: the MCP server and A2A agent starting up, MCP server-side tool-call logs, a file being read via the MCP server tool, a task being sent to the agent, and the agent routing the task to the correct MCP tool.)
Error Resilience
Both protocols implement comprehensive error handling:
- Tool execution failures are gracefully handled
- Network issues don’t crash the system
- Error messages are user-friendly and informative
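The pattern behind all three points is the same: wrap every tool invocation so a failure becomes a readable message instead of a crash. This is a generic sketch of that idea, not the post's actual code (safe_call and the failing tool are hypothetical):

```python
def flaky_tool(params):
    # Stand-in for a tool whose backend is down.
    raise ConnectionError("MCP server unreachable")

def safe_call(tool, params):
    """Invoke a tool, converting any exception into a friendly message."""
    try:
        return tool(params)
    except Exception as e:
        # Surface a user-friendly message rather than a stack trace.
        return f"Sorry, that tool is unavailable right now ({e})."

print(safe_call(flaky_tool, {}))
```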
Real-World Applications
This architecture pattern enables numerous practical applications:
Enterprise AI Systems
- Document Processing: Agents that can read, analyze, and summarize documents
- Data Analysis: Specialized agents for different data types and analysis methods
- Workflow Automation: Orchestrating complex business processes across multiple systems
Development Tools
- Code Analysis: Agents that can read code files and provide insights
- Testing Automation: Coordinated testing across different environments
- Documentation Generation: Automated creation of technical documentation
Personal Productivity
- Task Management: Agents that can schedule, remind, and track tasks
- Information Synthesis: Combining data from multiple sources into coherent reports
- Smart Assistants: Context-aware helpers that understand user preferences and history
Performance and Scalability Considerations
Connection Management
The FastMCP client uses context managers for efficient connection handling:
```python
async with client:
    result = await client.call_tool(tool_name, parameters)
```
Threading Architecture
The A2A server uses threading to handle concurrent requests:
```python
server_thread = threading.Thread(target=run_server_thread, daemon=True)
server_thread.start()
```
Resource Optimization
- Tools are called only when needed based on request analysis
- OpenAI API calls are minimized through intelligent routing
- Logging provides visibility into system performance
Future Possibilities
The combination of A2A and MCP protocols opens exciting possibilities:
Multi-Agent Orchestration
- Agents that can discover and coordinate with other agents
- Dynamic load balancing across agent networks
- Specialized agents for different domains working together
Tool Ecosystem Growth
- Standardized tool sharing across organizations
- Community-driven tool libraries
- Dynamic tool discovery and integration
Enhanced AI Capabilities
- Agents that can learn from tool usage patterns
- Adaptive routing based on performance metrics
- Self-improving agent networks
Getting Started
To implement your own A2A/MCP system:
- Start with FastMCP: Create simple tools that expose your core capabilities
- Build an A2A Agent: Implement intelligent routing logic for your use cases
- Add AI Integration: Connect to language models for natural language processing
- Scale Gradually: Add more tools and agents as your needs grow
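The four steps above can be prototyped without any servers at all. The toy skeleton below stands in for the real stack: a dict of callables plays the FastMCP tool registry, a routing function plays the A2A agent, and a stub plays the LLM. Every name here is illustrative.

```python
import re

# Step 1: a tool "registry" (stand-in for a FastMCP server).
TOOLS = {
    "get_time": lambda args: "12:00:00",
    "add_numbers": lambda args: str(args["a"] + args["b"]),
}

# Step 3: a stub standing in for the OpenAI integration.
def llm_stub(text):
    return f"(LLM answer for: {text})"

# Step 2: the "agent" routes each request to a tool or the LLM fallback.
def agent(text):
    t = text.lower()
    if "time" in t:
        return TOOLS["get_time"]({})
    if "+" in t:
        a, b = map(float, re.findall(r"\d+(?:\.\d+)?", t)[:2])
        return TOOLS["add_numbers"]({"a": a, "b": b})
    return llm_stub(text)

print(agent("what time is it"))  # 12:00:00
print(agent("2 + 3"))            # 5.0
print(agent("hi"))               # (LLM answer for: hi)
```

Step 4 (scaling) then amounts to replacing the dict with a real MCP client and the stub with a real model call, while the routing shape stays the same.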
The protocols provide a solid foundation for building sophisticated AI systems that can grow and adapt with your requirements.
Conclusion
The A2A and MCP protocols represent a significant step forward in AI system architecture. By providing standardized ways for agents to communicate and access tools, they enable the creation of modular, scalable, and maintainable AI systems.
The example implementation demonstrates how these protocols work together to create intelligent systems that can:
- Route requests to appropriate handlers
- Execute specialized tools when needed
- Provide natural language interfaces
- Scale across multiple services
As AI systems become more complex, protocols like A2A and MCP will be essential for managing that complexity while maintaining flexibility and extensibility. The future of AI lies not in monolithic systems, but in collaborative networks of specialized agents working together to solve complex problems.
Whether you’re building enterprise AI solutions, development tools, or personal productivity systems, understanding and implementing these protocols will give you a significant advantage in creating robust, scalable AI applications.
Thanks
Sreeni Ramadorai