MCP Web Tools Server and Client

Developed a Model Context Protocol (MCP) server and client system that enables language models to interact with web content through standardized tools for web scraping, search, and content extraction.

View Code on GitHub


Introduction

I designed and developed a comprehensive Model Context Protocol (MCP) server and client system that enables large language models (LLMs) to interact with web content through standardized tools. This system facilitates seamless integration between AI models and external web resources, enhancing their capabilities for research, content analysis, and information retrieval.

System Architecture

The system is built on a modular architecture with several key components:

  1. MCP Server Core: Central component that implements the Model Context Protocol standard, handling tool registration and execution.
  2. Transport Layer: Supports multiple communication methods (stdio and SSE) for flexibility in different environments.
  3. Tool Modules: Specialized web interaction capabilities including web scraping, search, and advanced content extraction.
  4. Client Interface: Streamlit-based management UI for testing and configuring MCP servers.
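The registration flow at the heart of the MCP Server Core can be sketched with a minimal decorator-based tool registry. This is a simplified stand-in for the MCP SDK's registration mechanism, not the SDK itself; the `ToolRegistry` class and the stubbed `web_scrape` body are illustrative:

```python
import asyncio


class ToolRegistry:
    """Minimal stand-in for MCP-style tool registration and dispatch."""

    def __init__(self):
        self._tools = {}

    def tool(self, name=None):
        """Decorator that registers an async function as a callable tool."""
        def decorator(func):
            self._tools[name or func.__name__] = func
            return func
        return decorator

    async def call(self, name, **kwargs):
        """Execute a registered tool by name with keyword arguments."""
        if name not in self._tools:
            raise KeyError(f"unknown tool: {name}")
        return await self._tools[name](**kwargs)


registry = ToolRegistry()


@registry.tool()
async def web_scrape(url: str) -> str:
    # A real implementation would fetch the page and convert it to markdown.
    return f"# Scraped content of {url}"


result = asyncio.run(registry.call("web_scrape", url="example.com"))
```

The decorator pattern keeps each tool module self-contained: a module only needs to decorate its functions, and the server core discovers them at import time.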

Web Tools Implementation

The MCP server implements several powerful web tools that expand AI capabilities:

1. Web Scraping Tool

The web_scrape tool retrieves web content as markdown, making it readily consumable by language models. It automatically adds a URL scheme when one is missing, routes requests through r.jina.ai to convert pages to markdown, and handles errors comprehensively across failure scenarios.
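The URL handling can be sketched as a small helper (the function name is illustrative; r.jina.ai is the reader proxy named above, which returns page content as markdown):

```python
def to_reader_url(url: str) -> str:
    """Prefix bare URLs with a scheme, then route the request through the
    r.jina.ai reader proxy, which returns the page as markdown."""
    if not url.startswith(("http://", "https://")):
        url = "https://" + url  # assume HTTPS when no scheme is given
    return "https://r.jina.ai/" + url


print(to_reader_url("example.com/docs"))
# → https://r.jina.ai/https://example.com/docs
```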

2. Web Search Tool

The ddg_search tool enables comprehensive web searching with customizable parameters, including region settings, SafeSearch filtering, time limits for results, and maximum result counts. Search results are formatted consistently for easy consumption by language models.
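The consistent formatting step can be sketched as a helper that renders raw search hits as a numbered markdown list. The `title`/`href`/`body` keys follow the shape returned by the duckduckgo-search library; the helper itself is illustrative:

```python
def format_results(results: list[dict]) -> str:
    """Render raw search hits as a numbered markdown list an LLM can read."""
    lines = []
    for i, hit in enumerate(results, start=1):
        lines.append(f"{i}. [{hit['title']}]({hit['href']})\n   {hit['body']}")
    return "\n".join(lines)


sample = [{"title": "MCP spec", "href": "https://example.org",
           "body": "Protocol overview."}]
formatted = format_results(sample)
print(formatted)
```

Keeping the output format identical across tools means the model can parse results the same way regardless of which search backend produced them.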

3. Advanced Web Scraping

The advanced_scrape tool uses Crawl4AI to extract clean, structured content from complex web pages.
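Crawl4AI itself handles browser rendering and markdown conversion; as a rough stdlib-only illustration of the "clean content" idea (not the Crawl4AI API), a parser that keeps visible text while skipping boilerplate elements might look like:

```python
from html.parser import HTMLParser


class ContentExtractor(HTMLParser):
    """Collect visible text while skipping boilerplate elements."""

    SKIP = {"script", "style", "nav", "footer", "header"}

    def __init__(self):
        super().__init__()
        self._depth = 0          # nesting depth inside skipped elements
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth == 0 and data.strip():
            self.chunks.append(data.strip())


html = "<nav>menu</nav><h1>Title</h1><script>x()</script><p>Body text.</p>"
parser = ContentExtractor()
parser.feed(html)
clean = " ".join(parser.chunks)
print(clean)  # → Title Body text.
```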

Sequential Thinking Tool

This tool provides structured reasoning capabilities for complex problem-solving. It was integrated from arben-adm/mcp-sequential-thinking with enhancements for persistence and dataset generation.

Enhanced Features

  • Thought Persistence: All thinking steps are stored for future reference and analysis
  • Dataset Generation: Collected thought patterns can be used to train specialized models
  • Structured Reasoning: Formal framework for breaking down complex problems
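The persistence and dataset-generation features above can be sketched as a record-per-step JSON Lines log. The `ThoughtStep` fields and file name are assumptions for illustration, not the actual schema of the integrated tool:

```python
import json
from dataclasses import dataclass, asdict
from pathlib import Path


@dataclass
class ThoughtStep:
    """One step in a sequential-thinking chain (illustrative schema)."""
    number: int
    thought: str
    needs_more: bool  # whether further steps are expected


def persist(steps, path: Path) -> None:
    """Append thinking steps to a JSON Lines file for later dataset use."""
    with path.open("a", encoding="utf-8") as f:
        for step in steps:
            f.write(json.dumps(asdict(step)) + "\n")


steps = [ThoughtStep(1, "Break the problem into sub-questions.", True),
         ThoughtStep(2, "Answer each sub-question.", False)]
persist(steps, Path("thoughts.jsonl"))
records = [json.loads(line)
           for line in Path("thoughts.jsonl").read_text().splitlines()]
```

Appending one JSON object per line keeps the log crash-safe and makes it trivial to load the accumulated thought chains later as training data.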

MCP Server Management UI

The system includes a Streamlit-based management interface for testing tools and configuring MCP servers.

Integration with Claude

The system integrates with Claude and other LLMs through the Claude for Desktop configuration file. Setup is simple: the MCP server is registered by pointing the client at the appropriate Python scripts, and the configuration supports different tools with distinct initialization parameters.
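A Claude for Desktop registration of this kind typically looks like the following fragment of `claude_desktop_config.json` (the server name and script path here are placeholders, not the project's actual values):

```json
{
  "mcpServers": {
    "web-tools": {
      "command": "python",
      "args": ["/path/to/server.py"]
    }
  }
}
```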

Technical Implementation

Transport Layer

The system supports multiple transport mechanisms including stdio for command-line usage and SSE for web-based interfaces. The transport layer handles bidirectional communication between the LLM and the MCP server.
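MCP messages are framed as JSON-RPC 2.0; over the stdio transport each message travels as one newline-delimited JSON object. A minimal sketch of that framing (the `tools/call` method name follows the MCP specification; the helper functions are illustrative):

```python
import json


def encode_message(method: str, params: dict, msg_id: int) -> str:
    """Frame a request as a JSON-RPC 2.0 message for newline-delimited stdio."""
    return json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params})


def decode_message(line: str) -> dict:
    """Parse one newline-delimited JSON-RPC message."""
    return json.loads(line)


wire = encode_message("tools/call",
                      {"name": "web_scrape",
                       "arguments": {"url": "example.com"}},
                      msg_id=1)
msg = decode_message(wire)
```

Because both stdio and SSE carry the same JSON-RPC payloads, the tool modules stay transport-agnostic: only the framing layer changes between environments.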

Error Handling

The system implements comprehensive error handling to ensure robustness, including specific handling for HTTP status errors, request failures, and unexpected exceptions. Each error is formatted appropriately for user understanding.
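The three tiers of handling described above (HTTP status errors, request failures, unexpected exceptions) can be sketched as a single formatting function. The function name and the duck-typed `status_code` attribute are assumptions for illustration:

```python
def format_error(exc: Exception) -> str:
    """Map an exception to a user-facing message, mirroring the tiers of
    handling described above: HTTP status, request failure, unexpected."""
    status = getattr(exc, "status_code", None)  # e.g. set by an HTTP error type
    if status is not None:
        return f"Error: the server returned HTTP {status}."
    if isinstance(exc, (ConnectionError, TimeoutError)):
        return "Error: the request could not be completed (network failure)."
    return f"Error: unexpected failure ({type(exc).__name__}: {exc})."


print(format_error(TimeoutError("timed out")))
```

Converting exceptions to plain-language strings matters here because the consumer of the error is a language model, not a stack-trace-reading developer.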

Asynchronous Processing

All tools use asynchronous programming for optimal performance, allowing non-blocking execution of potentially slow operations like web requests or complex processing tasks.
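The non-blocking pattern can be sketched with `asyncio.gather`, which runs several slow operations concurrently; the `fetch` coroutine below sleeps as a stand-in for a real web request:

```python
import asyncio


async def fetch(url: str) -> str:
    """Stand-in for a slow web request; sleeps instead of hitting the network."""
    await asyncio.sleep(0.01)
    return f"content from {url}"


async def main():
    # Run all requests concurrently rather than one after another.
    return await asyncio.gather(*(fetch(u) for u in ["a.com", "b.com", "c.com"]))


pages = asyncio.run(main())
print(len(pages))  # → 3
```

With sequential awaits the total latency would be the sum of the individual requests; with `gather` it is roughly the slowest single request.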

Key Technologies

  • Python
  • MCP server-client architecture
  • FastAPI
  • Streamlit
  • Docker
  • Node.js
  • LangChain
  • LangGraph