This document provides a technical overview of how openai-tool2mcp bridges the OpenAI Assistant API with Model Context Protocol (MCP) servers.
openai-tool2mcp is designed as a protocol translation layer that sits between MCP-compatible clients and the OpenAI API:
```mermaid
sequenceDiagram
    participant Client as "MCP Client<br>(e.g., Claude App)"
    participant Server as "openai-tool2mcp Server"
    participant OAIAPI as "OpenAI Assistant API"

    Client->>Server: MCP Tool Request
    Note over Server: Protocol Translation
    Server->>OAIAPI: OpenAI API Request
    OAIAPI->>Server: OpenAI Tool Response
    Note over Server: Response Translation
    Server->>Client: MCP Tool Response
```
MCP requests follow the Model Context Protocol specification, which defines a standard for tool usage. openai-tool2mcp maps these requests to OpenAI’s Assistant API format:
| MCP Component | OpenAI Equivalent |
|---|---|
| Tool ID | Tool type |
| Tool parameters | Tool parameters |
| Tool context | Thread context |
| Instructions | System prompt |
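The mapping above can be sketched as a small translation function. The class and field names here are illustrative, not the actual openai-tool2mcp API:

```python
from dataclasses import dataclass, field

# Illustrative request shape; the real openai-tool2mcp classes may differ.
@dataclass
class MCPToolRequest:
    tool_id: str                                 # e.g. "web_search"
    parameters: dict = field(default_factory=dict)
    context: dict = field(default_factory=dict)
    instructions: str = ""

def to_openai_request(req: MCPToolRequest) -> dict:
    """Map each MCP field onto its OpenAI Assistant API equivalent."""
    return {
        "tool": {"type": req.tool_id},     # Tool ID        -> Tool type
        "parameters": req.parameters,      # Tool parameters -> Tool parameters
        "thread": req.context,             # Tool context   -> Thread context
        "instructions": req.instructions,  # Instructions   -> System prompt
    }

req = MCPToolRequest("web_search", {"query": "MCP spec"}, instructions="Be concise.")
print(to_openai_request(req)["tool"])  # {'type': 'web_search'}
```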
Responses from OpenAI tools are translated back to MCP format:
| OpenAI Component | MCP Equivalent |
|---|---|
| Tool output | Tool response content |
| Error information | Error messages |
| Metadata | Tool context updates |
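The reverse mapping can be sketched the same way. Again, the field names are assumptions for illustration, not the actual wire format:

```python
def to_mcp_response(openai_result: dict) -> dict:
    """Translate an OpenAI tool result back into an MCP-style response."""
    response = {
        "content": openai_result.get("output", ""),            # Tool output -> response content
        "context_updates": openai_result.get("metadata", {}),  # Metadata -> context updates
    }
    if "error" in openai_result:
        response["error"] = openai_result["error"]             # Error info -> error messages
    return response
```

For example, `to_mcp_response({"output": "42", "metadata": {"k": "v"}})` yields a response whose content is `"42"` and whose context updates carry the metadata through.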
The system consists of three main components: an MCP-facing server, a protocol translator, and adapters for the individual OpenAI tools such as Web Search and Code Interpreter. Each request passes through three phases:

1. Request Phase: the incoming MCP tool request is received and validated.
2. Processing Phase: the request is translated and forwarded to the OpenAI Assistant API.
3. Response Phase: the OpenAI result is translated back into an MCP tool response.
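The three phases can be sketched as a single handler. The OpenAI call is stubbed out as a `call_openai` callable, and all names are illustrative rather than the project's actual API:

```python
def handle_tool_call(mcp_request: dict, call_openai) -> dict:
    """Three-phase lifecycle sketch: request, processing, response.

    `call_openai` stands in for the real Assistant API client."""
    # Request phase: validate the incoming MCP request.
    if "tool_id" not in mcp_request:
        return {"error": "missing tool_id"}

    # Processing phase: translate the request and forward it to OpenAI.
    openai_request = {
        "tool": {"type": mcp_request["tool_id"]},
        "parameters": mcp_request.get("parameters", {}),
    }
    openai_result = call_openai(openai_request)

    # Response phase: translate the result back into MCP shape.
    return {"content": openai_result.get("output", "")}
```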
The Web Search tool maps MCP search requests to OpenAI’s built-in web search capability:
```mermaid
flowchart TD
    A[MCP Search Request] --> B{Protocol Translator}
    B --> C[OpenAI Web Search]
    C --> D[Search Results]
    D --> B
    B --> E[MCP Search Response]
```
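A sketch of the two translation steps in this flow, with assumed parameter names (`query`, `max_results`) and an assumed result shape (`title`/`url` hits), neither of which is guaranteed to match the real implementation:

```python
def translate_search_request(mcp_params: dict) -> dict:
    # MCP search parameters -> OpenAI web search invocation.
    return {
        "tool": {"type": "web_search"},
        "query": mcp_params.get("query", ""),
        "max_results": mcp_params.get("max_results", 5),
    }

def translate_search_results(results: list) -> dict:
    # Flatten OpenAI search hits into a single MCP text response.
    lines = [f"{r['title']}: {r['url']}" for r in results]
    return {"content": "\n".join(lines)}
```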
The Code Interpreter tool maps MCP code execution requests to OpenAI’s code interpreter:
```mermaid
flowchart TD
    A[MCP Code Execution Request] --> B{Protocol Translator}
    B --> C[OpenAI Code Interpreter]
    C --> D[Execution Results]
    D --> B
    B --> E[MCP Code Execution Response]
```
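The Code Interpreter path follows the same pattern; here is a sketch with assumed field names (`code`, `stdout`, `stderr`) rather than the project's actual schema:

```python
def translate_execution_request(mcp_params: dict) -> dict:
    # MCP code execution parameters -> OpenAI code interpreter invocation.
    return {
        "tool": {"type": "code_interpreter"},
        "code": mcp_params.get("code", ""),
    }

def translate_execution_result(result: dict) -> dict:
    # Map stdout to MCP response content; surface stderr as an error.
    response = {"content": result.get("stdout", "")}
    if result.get("stderr"):
        response["error"] = result["stderr"]
    return response
```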
MCP and OpenAI have different approaches to maintaining state: MCP clients pass any needed context along with each tool call, while the OpenAI Assistant API keeps conversation state in server-side threads.

openai-tool2mcp handles this difference by maintaining a mapping between MCP tool contexts and OpenAI threads, routing each incoming context to its matching thread and reflecting thread updates back as MCP context updates.
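One way such a context-to-thread mapping could look, with thread creation stubbed out (the real server would create threads via the OpenAI API; the class and method names here are hypothetical):

```python
import uuid

class ThreadMap:
    """Sketch: map stateless MCP contexts onto persistent OpenAI threads."""

    def __init__(self):
        self._threads = {}

    def thread_for(self, mcp_context_id: str) -> str:
        # Reuse the existing thread for this context, or create a new one.
        if mcp_context_id not in self._threads:
            self._threads[mcp_context_id] = f"thread_{uuid.uuid4().hex[:8]}"
        return self._threads[mcp_context_id]
```

Repeated calls with the same context ID return the same thread, so multi-step tool interactions share OpenAI-side state.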
The architecture is designed for extensibility: support for additional OpenAI tools can be added by registering new request and response translators, without changes to the core server.
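A minimal sketch of what such a pluggable registry could look like; this is an assumption about the design, not the actual openai-tool2mcp interface:

```python
class ToolRegistry:
    """Hypothetical registry: each tool contributes its own translators."""

    def __init__(self):
        self._tools = {}

    def register(self, tool_id, to_openai, to_mcp):
        # to_openai: MCP params -> OpenAI request; to_mcp: OpenAI result -> MCP response.
        self._tools[tool_id] = (to_openai, to_mcp)

    def translate_request(self, tool_id, params):
        to_openai, _ = self._tools[tool_id]
        return to_openai(params)

registry = ToolRegistry()
registry.register(
    "web_search",
    lambda p: {"tool": {"type": "web_search"}, **p},
    lambda r: {"content": r.get("output", "")},
)
```

Adding a new tool then amounts to one `register` call with the two translation functions.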