Python agents in 70 lines: Building with MCP

Hugging Face’s new MCP-powered Python framework allows developers to create sophisticated AI agents with minimal code, streamlining how language models interact with external tools. This implementation simplifies agent development by standardizing tool interactions through the Model Context Protocol (MCP), enabling Python developers to rapidly build conversational AI systems that can leverage a growing ecosystem of capabilities without complex custom integrations.

The big picture: Hugging Face has extended its hub client SDK to support the Model Context Protocol (MCP), allowing Python developers to create AI agents in roughly 70 lines of code.

  • MCP standardizes how large language models interact with external tools and APIs, eliminating the need for custom integrations for each tool.
  • This Python implementation was inspired by the existing “Tiny Agents in JS” project, bringing similar capabilities to the Python ecosystem.

Key technical details: An agent built with this framework functions essentially as a while loop built on top of an MCP client.

  • The MCPClient manages async connections to MCP servers, discovers available tools, formats them for LLM consumption, and routes tool execution to the appropriate server.
  • Developers can configure agents through a simple JSON file that specifies the model, provider, and connected tool servers.
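One step in that pipeline, turning discovered MCP tool descriptors into the function-calling schema most chat LLMs accept, can be sketched as follows. This is an illustrative stand-in rather than the actual MCPClient internals; the input field names mirror MCP's tools/list response, and the output follows the widely used OpenAI-compatible tool format.

```python
# Illustrative sketch: map MCP tool descriptors (name, description, inputSchema)
# to the OpenAI-compatible "function" tool definitions chat LLMs consume.
def format_tools_for_llm(mcp_tools):
    """Convert MCP tools/list entries into chat-completion tool definitions."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                # MCP exposes a JSON Schema for arguments as "inputSchema"
                "parameters": t.get("inputSchema", {"type": "object", "properties": {}}),
            },
        }
        for t in mcp_tools
    ]

tools = format_tools_for_llm([
    {
        "name": "browse",
        "description": "Open a web page",
        "inputSchema": {"type": "object", "properties": {"url": {"type": "string"}}},
    }
])
```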

How to get started: The implementation requires minimal setup to begin building functional AI agents.

  • Users need to install the latest huggingface_hub package with the MCP extra: pip install "huggingface_hub[mcp]>=0.32.0".
  • Agents can be run directly from the Hugging Face Hub tiny-agents Dataset or configured locally through an agent.json file.
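A local configuration might look like the sketch below. The model and provider values are illustrative placeholders, and the server entry follows the common MCP stdio convention for launching a tool server (here, the Playwright MCP server via npx); consult the project's documentation for the exact fields it expects.

```json
{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "stdio",
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  ]
}
```

Once saved as agent.json, the agent can then be launched from the command line with the package's tiny-agents CLI.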

Behind the framework: The agent operates as a conversational loop with clear functional stages.

  • It initializes with a system prompt, connects to configured MCP servers, processes user inputs through an LLM, executes tools as needed, and determines when to end the conversation.
  • The JSON configuration allows developers to specify which language model to use and which tool servers to connect to, including options for Playwright and other capabilities.
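The conversational loop described above can be sketched as a short, self-contained function. The LLM call and MCP client are stubbed out here, and the names (call_llm, mcp_client, task_complete) are illustrative, not the huggingface_hub API; the structure is the point: call the model, execute any requested tools through the MCP client, and stop when the model answers in plain text or signals completion.

```python
EXIT_TOOLS = {"task_complete"}  # sentinel tool the model can call to end its turn

def run_agent_turn(messages, call_llm, mcp_client, max_steps=10):
    """Drive one user turn: query the LLM, execute requested tools via the
    MCP client, and loop until the model produces a plain-text answer."""
    for _ in range(max_steps):
        reply = call_llm(messages)  # expected shape: {"content": ..., "tool_calls": [...]}
        messages.append(reply)
        if not reply.get("tool_calls"):
            return reply["content"]  # no tool requests: the turn is over
        for call in reply["tool_calls"]:
            if call["name"] in EXIT_TOOLS:
                return reply.get("content", "")
            # Route execution to whichever MCP server exposes this tool
            result = mcp_client.call_tool(call["name"], call["arguments"])
            messages.append({"role": "tool", "name": call["name"], "content": result})
    return "Stopped after max_steps tool rounds."
```

A test double for call_llm that first requests a tool and then answers in plain text is enough to exercise the whole loop, which is what makes the while-loop framing attractive: the agent's control flow is ordinary, inspectable Python.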

Where it’s headed: Hugging Face suggests several paths for expanding the project’s capabilities.

  • Developers can benchmark different LLMs and inference providers to optimize performance.
  • The framework supports running tiny agents against local LLM inference servers for improved privacy or lower latency.
  • The community is encouraged to contribute unique tiny agents to expand the ecosystem.
Source: "Tiny Agents in Python: a MCP-powered agent in ~70 lines of code" (Hugging Face blog)
