MCP Server Trilogy

Live AI Infrastructure · Launched November 2025
TypeScript · MCP · Apify · DeepSeek · Qwen · Markitdown

MCP Server Trilogy: When Claude Needs Friends

The Model Context Protocol changed everything. Suddenly AI agents could use tools, access data, talk to services. But everyone was building MCP servers for the same things — file systems, databases, web scraping. What about access to other AI models?

I built three MCP servers that give Claude (and any MCP client) superpowers it doesn't have natively.

The Problem

Claude is brilliant. But Claude is also expensive. And Claude can't read Chinese as well as models trained specifically for it. And Claude can't convert your PDF to markdown.

What if Claude could delegate? "Hey DeepSeek, you're cheaper — handle this batch processing." "Hey Qwen, this is Chinese text — you're better at this." "Hey Markitdown, convert this document so I can actually read it."

That's the trilogy.
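The delegation idea can be sketched as a simple router. This is a hypothetical illustration, not code from any of the three servers; the `Task` shape, thresholds, and backend names are all assumptions for the sake of the example.

```typescript
// Hypothetical delegation router: decide which backend a primary agent
// should hand a task to. Illustrative only.
type Backend = "claude" | "deepseek" | "qwen" | "markitdown";

interface Task {
  text: string;
  isDocumentFile?: boolean; // PDF, DOCX, PPTX, ...
  isBatch?: boolean;        // large, cost-sensitive workload
}

const CJK = /[\u4e00-\u9fff]/; // rough check for Chinese characters

function routeTask(task: Task): Backend {
  if (task.isDocumentFile) return "markitdown"; // convert before reasoning
  if (CJK.test(task.text)) return "qwen";       // Chinese-language specialist
  if (task.isBatch) return "deepseek";          // cheaper bulk processing
  return "claude";                              // default: the primary agent
}
```

In practice the "router" is just the agent's own judgment — MCP exposes each server as a tool, and the model decides when to call it.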

DeepSeek MCP Server

DeepSeek R1 and V3 are scary good. Comparable to GPT-4 at a fraction of the cost — 60-90% savings depending on the task. The catch? No native MCP support.

deepseek-mcp-server bridges that gap. Deploy it on Apify, point your MCP client at it, and suddenly Claude can call DeepSeek for the heavy lifting. Batch processing. Long context tasks. Anything where cost matters more than brand loyalty.

The server handles authentication, rate limiting, streaming responses. Your agent just sees another tool.
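Of the three concerns above, rate limiting is the easiest to picture. A minimal sketch of the classic token-bucket pattern a server can put in front of an upstream LLM API — this is a generic illustration, not the actual implementation inside deepseek-mcp-server:

```typescript
// Minimal token-bucket rate limiter. Requests drain tokens; tokens
// refill continuously at a fixed rate, allowing short bursts up to
// `capacity` while capping the sustained request rate.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,     // max burst size
    private refillPerSec: number, // sustained requests per second
    now: number = Date.now(),
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  // Returns true if a request may proceed at time `now` (ms).
  tryAcquire(now: number = Date.now()): boolean {
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillPerSec);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

The point is that the MCP client never sees any of this: throttled calls are the server's problem, and the agent just sees a tool that responds.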

Qwen MCP Server

Alibaba's Qwen models dominate Chinese-language tasks. If your agent needs to process Mandarin text, analyze Chinese documents, or interact with Chinese users — Qwen is the right tool.

qwen-mcp-server exposes Qwen through MCP. Same pattern: deploy on Apify, connect via HTTP transport, let your primary agent delegate when appropriate.

The specialization matters. General models are generalists. Sometimes you need a specialist.

Markitdown MCP Server

This one solves a different problem. PDFs, Word docs, PowerPoints, images with text — all opaque to language models. You can't reason about what you can't read.

Microsoft's Markitdown library converts documents to clean markdown. markitdown-mcp-server wraps it for MCP. Your agent receives a file, calls the tool, gets back structured text it can actually process.

Essential for RAG pipelines. Essential for document analysis. Essential for any workflow touching real-world files.
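To make the RAG connection concrete, here is one hypothetical post-processing step: once a document has been converted to markdown, split it into heading-scoped chunks ready for embedding. This helper is illustrative and not part of markitdown-mcp-server.

```typescript
// Split converted markdown into heading-scoped chunks for a RAG index.
interface Chunk {
  heading: string;
  body: string;
}

function chunkByHeadings(markdown: string): Chunk[] {
  const chunks: Chunk[] = [];
  let current: Chunk = { heading: "", body: "" };
  for (const line of markdown.split("\n")) {
    if (/^#{1,6}\s/.test(line)) {
      // A new heading closes out the previous chunk.
      if (current.heading || current.body.trim()) chunks.push(current);
      current = { heading: line.replace(/^#{1,6}\s+/, ""), body: "" };
    } else {
      current.body += line + "\n";
    }
  }
  if (current.heading || current.body.trim()) chunks.push(current);
  return chunks;
}
```

Markdown is what makes this trivial: the same split against raw PDF bytes would require a layout parser.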

The Apify Pattern

All three servers run on Apify as standby Actors. The platform handles scaling, authentication, billing. Pay-per-event pricing means users only pay for actual tool calls.

The MCP transport is streamable HTTP. Connect from Claude Desktop, from custom agents, from any MCP client. Standard protocol, standard integration.
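"Standard protocol" means MCP messages are plain JSON-RPC 2.0 under the hood. A sketch of the envelope a client POSTs to a streamable-HTTP server when it invokes a tool — the tool name and arguments here are hypothetical, not the actual tool schema of these servers:

```typescript
// Shape of an MCP `tools/call` request as carried over the streamable
// HTTP transport: an ordinary JSON-RPC 2.0 message.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;
    arguments: Record<string, unknown>;
  };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Hypothetical example: asking a document-conversion tool for markdown.
const request = buildToolCall(1, "convert_to_markdown", {
  url: "https://example.com/report.pdf",
});
```

Because every MCP client speaks this same wire format, one deployed server works unchanged for Claude Desktop, custom agents, and anything else that implements the protocol.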

Why Build These?

AI infrastructure is still early. The tools that seem obvious in hindsight — they don't exist until someone builds them.

MCP gave us the protocol. These servers give us the endpoints. The ecosystem grows one tool at a time.


Servers: DeepSeek (cost-effective LLM) • Qwen (Chinese specialist) • Markitdown (document conversion)

Platform: Apify (standby Actors, pay-per-event billing)

Stack: TypeScript, MCP Protocol, HTTP Transport

Links: DeepSeek • Qwen • Markitdown