DeepL MCP Server: How to Build and Use Translation in LLM Applications
Efficiently translate content with AI-powered translation and enhance your interactions with large language models.
Large Language Models excel at many tasks but may not provide optimal translations for all languages. By combining the DeepL API with the Model Context Protocol (MCP), you can give Claude and other MCP-compatible clients access to DeepL's specialized translation capabilities across numerous languages.
In this cookbook, we'll explore how to create an MCP server that connects DeepL's translation API with clients like Claude Desktop, GitHub Copilot, and any other client that supports MCP. This allows you to seamlessly translate text between languages within your conversations. To look at the code and start using it, go to GitHub.
What is MCP?
The Model Context Protocol (MCP) is an open standard introduced by Anthropic that standardizes how AI applications connect with external tools, data sources, and systems. Think of it as a "USB for AI integrations" – it provides a universal adapter between AI applications and external data sources through a standardized interface.
This elegant design solves the integration problem: instead of building custom connectors between each AI application and each external tool, developers only need to implement the MCP standard once on each side, reducing integration complexity. In the context of the DeepL MCP Server, MCP allows AI assistants to seamlessly access DeepL's specialized translation capabilities while maintaining a consistent, secure communication protocol, combining the strengths of both systems to deliver a better user experience.
Setting Up Your DeepL MCP Server
Prerequisites
Before you begin, you'll need:
A DeepL API key (get one at DeepL API)
Node.js installed on your system
Basic familiarity with JavaScript/Node.js
Step 1: Initialize Your Project
First, let's set up a new Node.js project:
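A minimal setup might look like the following (the project directory name is just an example; the three dependencies are the ones described below):

```bash
# Create a new project directory and initialize a package.json
mkdir deepl-mcp-server && cd deepl-mcp-server
npm init -y

# Install the MCP SDK, the DeepL client, and Zod for parameter validation
npm install @modelcontextprotocol/sdk deepl-node zod
```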
Here's what each dependency does:
@modelcontextprotocol/sdk: The MCP SDK that allows our server to communicate with MCP clients like Claude Desktop
deepl-node: The official DeepL API client for Node.js, making it easy to interact with DeepL's translation services
zod: A TypeScript-first schema validation library that we'll use to define our tool parameters
The Model Context Protocol (MCP) enables AI systems to access external tools, providing them with specialized capabilities beyond their built-in functionality. For translation tasks, this is particularly valuable as it combines DeepL's translation expertise with Claude's conversational abilities.
Step 2: Create Your Server Implementation
The McpServer class is the core of our implementation. It handles all the protocol-specific details of communicating with MCP clients. The StdioServerTransport uses standard input/output streams for communication, which works well with Claude Desktop's execution model, where it spawns a separate process for each server.
We're using environment variables to pass the DeepL API key to our server, which is a secure way to handle sensitive credentials. Create a file named src/index.mjs with the following structure:
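Here's a sketch of that structure, assuming the API key is passed in an environment variable named DEEPL_API_KEY (the variable name and the server metadata are illustrative):

```js
// src/index.mjs -- skeleton of the DeepL MCP server
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import * as deepl from "deepl-node";
import { z } from "zod";

// Read the DeepL API key from the environment (variable name assumed here)
const authKey = process.env.DEEPL_API_KEY;
if (!authKey) {
  console.error("DEEPL_API_KEY environment variable is not set");
  process.exit(1);
}

// DeepL client used by the tools we register below
const translator = new deepl.Translator(authKey);

// The MCP server that exposes our tools to clients
const server = new McpServer({
  name: "deepl-mcp-server",
  version: "1.0.0",
});

// ... helper functions and tool definitions go here (Steps 3 and 4) ...

// Communicate over stdio so clients like Claude Desktop can spawn this server
const transport = new StdioServerTransport();
await server.connect(transport);
```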
Step 3: Implement DeepL API Helper Functions
Let's add some helper functions to manage language lists and validation. These helper functions serve several important purposes:
Caching: We cache the language lists to avoid unnecessary API calls, as these lists rarely change and DeepL has API usage limits.
Validation: Before sending requests to DeepL, we validate that the requested language codes are supported. This provides better error messages to users and prevents unnecessary API calls with invalid parameters.
Separation of concerns: By extracting these functions, we keep our tool implementations clean and focused on their primary purpose.
The DeepL API distinguishes between source languages (languages you can translate from) and target languages (languages you can translate to), with slightly different sets of supported languages for each direction. Our implementation respects this distinction.
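A sketch of what these helpers might look like (the function names here are illustrative, not necessarily the exact ones from the repository):

```js
// Cached language lists so we only hit the DeepL API once per process
let sourceLanguages = null;
let targetLanguages = null;

async function getSourceLanguages() {
  if (!sourceLanguages) {
    sourceLanguages = await translator.getSourceLanguages();
  }
  return sourceLanguages;
}

async function getTargetLanguages() {
  if (!targetLanguages) {
    targetLanguages = await translator.getTargetLanguages();
  }
  return targetLanguages;
}

// Validate a target language code before we call the translation endpoint
async function isValidTargetLanguage(code) {
  const languages = await getTargetLanguages();
  return languages.some(
    (lang) => lang.code.toLowerCase() === code.toLowerCase()
  );
}
```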
Step 4: Define MCP Tools
Let's define one of the most important tools our server will provide: the translation tool. For brevity, we'll only show one implementation, but the complete code includes additional tools like get-source-languages, get-target-languages, and rephrase-text.
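Here's a sketch of how the translation tool might be registered; the exact tool name, parameter names, and options in the published server may differ:

```js
// Register the translation tool with a name, description, Zod schema, and handler
server.tool(
  "translate-text",
  "Translate text into a target language using DeepL",
  {
    text: z.string().describe("The text to translate"),
    targetLang: z.string().describe("Target language code, e.g. 'de' or 'ja'"),
    formality: z
      .enum(["default", "more", "less", "prefer_more", "prefer_less"])
      .optional()
      .describe("Formality preference, where the target language supports it"),
  },
  async ({ text, targetLang, formality }) => {
    // Fail early with a readable message if the language code is unsupported
    if (!(await isValidTargetLanguage(targetLang))) {
      return {
        content: [
          { type: "text", text: `Unsupported target language: ${targetLang}` },
        ],
        isError: true,
      };
    }

    // Let DeepL auto-detect the source language by passing null
    const result = await translator.translateText(text, null, targetLang, {
      formality: formality ?? "default",
    });

    return {
      content: [{ type: "text", text: result.text }],
    };
  },
);
```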
For the complete implementation of all tools, including get-source-languages, get-target-languages, and rephrase-text, please refer to the GitHub repository.
Connecting to Claude Desktop
To use your DeepL MCP server with Claude Desktop:
Create or edit the Claude Desktop configuration file:
On macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
On Windows:
%AppData%\Claude\claude_desktop_config.json
On Linux:
~/.config/Claude/claude_desktop_config.json
Add your DeepL MCP server configuration:
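Your configuration might look roughly like this; the server name, the entry-point path, and the DEEPL_API_KEY variable name are assumptions based on the setup above:

```json
{
  "mcpServers": {
    "deepl": {
      "command": "node",
      "args": ["/path/to/deepl-mcp-server/src/index.mjs"],
      "env": {
        "DEEPL_API_KEY": "your-api-key-here"
      }
    }
  }
}
```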
Replace /path/to/deepl-mcp-server with the actual path to your server directory
Replace your-api-key-here with your actual DeepL API key
Restart Claude Desktop
If everything worked, deepl should show up as one of the available tools in the "Search and Tools" menu.

Testing Your Server
You can test your server's capabilities by asking Claude Desktop translation-related questions:
"Can you translate 'Hello, how are you?' to German using DeepL?"
"Please translate this paragraph to Japanese using DeepL: [your text here]"
"Can you rephrase this text in a more formal way using DeepL: [your text here]"
"What languages can you translate from and to using DeepL?"
Although it isn't an exact science, mentioning the keyword "DeepL" helps Claude understand that we want to use that tool during the interaction.

Understanding the Code
Let's break down the key components of our implementation:
Server Initialization
The McpServer class creates an MCP-compatible server that exposes tools to clients:
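In the sketch above, this amounts to (the server name and version are illustrative):

```js
// Create the MCP server that clients will discover tools on
const server = new McpServer({
  name: "deepl-mcp-server",
  version: "1.0.0",
});
```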
Tool Definition
Each tool follows a similar pattern:
Name of the tool
Description of what it does
Schema for parameters (using Zod)
Implementation function
For example, the translate-text tool:
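A condensed sketch of that pattern (the schema and handler in the repository are more complete):

```js
server.tool(
  "translate-text",                                  // 1. name of the tool
  "Translate text into a target language via DeepL", // 2. description
  {                                                   // 3. Zod parameter schema
    text: z.string(),
    targetLang: z.string(),
  },
  async ({ text, targetLang }) => {                   // 4. implementation function
    const result = await translator.translateText(text, null, targetLang);
    return { content: [{ type: "text", text: result.text }] };
  },
);
```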
Transport Setup
The StdioServerTransport enables communication between the MCP server and clients:
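In our sketch, this comes down to two lines at the end of src/index.mjs:

```js
// Use stdin/stdout for the MCP protocol and start serving requests
const transport = new StdioServerTransport();
await server.connect(transport);
```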
Wrapping Up
By following this cookbook, you've created an MCP server that enables Claude Desktop and other MCP-compatible clients to access DeepL's translation capabilities. This allows seamless translation within your conversations, improving the multilingual capabilities of your AI interactions.
As you expand your server, consider adding more features like:
Support for document translation
Custom glossaries for domain-specific terminology
Batch processing for multiple translations
Caching to improve performance and reduce API usage
The combination of specialized AI services (like DeepL) with general-purpose AI assistants (like Claude) through MCP creates powerful workflows that combine the strengths of different AI systems.