
Build MCP Servers On Demand — No Code Required

Model Context Interface (MCI) is the fastest way to create and manage AI tools and MCP servers. Define tools in simple JSON or YAML files, use them programmatically in your code, or run them as MCP servers on the fly. MCP is now a part of MCI — enhanced with tagging, filtering, caching, and toolset management.

What’s New in MCI?

MCI builds on MCP: MCP is now integrated into MCI, with enhanced features like tagging, filtering, caching, and toolset management. Use it programmatically via adapters, or run it as an MCP server via the mcix CLI.

Run as MCP Server

Use the mcix CLI tool to run MCI as an MCP server. Connect to Claude Desktop, VSCode, Cursor, or any STDIO-based MCP client. Your tools become instantly available in AI applications.

Programmatic Usage

Use MCI adapters (Python is available now) to integrate tools directly into your applications. The adapter acts like an MCP client: it discovers tools defined in mci.json files and executes them programmatically.

Toolset Management

Works like npm for AI tools. Your main mci.json links to Toolsets and MCP servers stored in the ./mci directory. Organize, share, and reuse tool collections effortlessly.

MCP Server Integration

Add any MCP server (HTTP or STDIO) to your config. Tools are cached locally for configurable periods. Mix and match tools from multiple servers with filtering and tagging.

Smart Caching

MCP tools register statically from cached files. The upstream MCP server is only called during execution or when the cache expires, giving lightning-fast tool discovery with minimal overhead.

YAML Support

Write your schemas in JSON or YAML — whichever you prefer. Full YAML support is now integrated for more readable, human-friendly configurations.

Multiple Configurations

Run different MCI setups on demand with uvx mcix run --file ./mci/toolset-name.json. Create specialized servers for different use cases or agents.

Language Adapters

The mci-py adapter powers everything: MCP server mode, caching, toolsets, and programmatic usage. More language adapters (Node.js, Go) are coming soon.

Why Choose MCI?

Two Ways to Use

MCP server or programmatic. Run as an MCP server via the mcix CLI, or use the mci-py adapter programmatically in your Python code. Same features, different deployment options.

MCP Server Creation

Build MCP servers on demand. Create custom MCP servers in seconds by combining tools from multiple sources: other MCP servers, your own APIs, CLI tools, and shared community toolsets.

Simple & Declarative

JSON or YAML, your choice. Define tools declaratively in simple schema files. No complex server setup, no coding required. Just clean, readable configurations that anyone can understand and review.

Universal Compatibility

Works everywhere MCP works. Connect to Claude Desktop, VSCode, Cursor, or any MCP-compatible application. Or integrate directly into your Python apps using the adapter.

Enhanced MCP Integration

Best of both worlds. Use existing MCP servers with added benefits: tagging, filtering, caching, and smart tool registration. Only call upstream servers when needed.

Multiple Execution Types

HTTP • CLI • File • Text. Build your own tools that wrap REST APIs, command-line utilities, file operations, and text templates. Perfect for custom integrations.

Built-in Authentication

API Key • Bearer Token • Basic Auth • OAuth2. Comprehensive authentication support for your custom HTTP tools. Securely connect to any service without writing custom code.

Advanced Templating

Dynamic Values • Conditionals • Loops. A powerful built-in template engine with environment variables, conditional logic, and iteration for complex, dynamic tool execution.

Performance & Caching

Lightning-fast tool discovery. Tools from MCP servers are cached locally. Configure the cache duration per server. Instant startup with on-demand execution.

MCI makes MCP server creation and tool integration accessible to everyone. No programming required, just simple JSON or YAML schemas.

The Vision: Simplifying MCP Server Creation

Creating MCP servers traditionally requires significant development effort — setting up servers, handling protocols, managing connections, and maintaining infrastructure. MCI changes everything by offering two powerful approaches:
  1. MCP Server Mode (mcix CLI): Create and run MCP servers on demand using simple configuration files
  2. Programmatic Mode (mci-py adapter): Integrate tools directly into your Python applications with MCP-like capabilities
Both modes share the same powerful core features — toolsets, MCP server integration, caching, and more.

The Problem with Traditional MCP Servers

Many MCP servers are essentially wrappers around APIs or CLI tools. While MCP is powerful for complex logic, sometimes you just need:
  • A simple API wrapper
  • Access to a command-line tool
  • File reading with templating
  • A combination of tools from different sources
Building a full server for these use cases is overkill. And integrating MCP clients into your applications adds complexity.

How MCI Solves This

1. Configuration Over Code

Define, don’t develop. Write a simple JSON or YAML file instead of coding an entire server. MCI handles all the MCP protocol details, server lifecycle, and tool registration automatically. Use it as a server or programmatically.
2. Mix & Match Tools

Flexible tool sources. Combine tools from multiple MCP servers, your own custom HTTP/CLI tools, file operations, and community Toolsets, all in one configuration. Apply filters and tags to organize them.
3. Smart Caching & Performance

Fast and efficient. MCP server tools are cached locally, with cache expiration configurable per server. Tools register instantly from the cache; upstream servers are only called when tools execute. Works in both server and programmatic modes.
4. Choose Your Integration Method

Server or Programmatic:
  • MCP Server: Run uvx mcix run --file ./mci.json and connect to Claude, VSCode, or Cursor
  • Programmatic: Use mci-py adapter in your Python code to get tools and execute them directly
5. Toolset Ecosystem

Share and reuse. Package tools into Toolsets stored in the ./mci directory, then organize and share them. Your main mci.json links to the Toolsets and MCP servers you want to use. Works identically in both modes.
With MCI, you go from “I need tools” to “I have working tools” in minutes — whether you need an MCP server or direct programmatic integration.

Quick Example

See how simple it is to create an MCP server with MCI:
{
  "toolsets": [
    {
      "name": "my-tools.json"
    }
  ],
  "mcp_servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"],
      "config": {
        "expDays": 7,
        "filter": "only",
        "filterValue": "read_file,write_file,list_directory"
      }
    }
  }
}
That’s it! The same configuration works for both MCP server mode and programmatic usage. All features (toolsets, MCP integration, caching) work identically in both modes.

Key Features Explained

Works like npm for AI tools. Toolsets are reusable collections of tools stored in your ./mci directory. Your main mci.json references them:
{
  "toolsets": [
    { "name": "github-tools" },
    {
      "name": "slack-tools.json",
      "filter": "tags",
      "filterValue": "communication"
    }
  ]
}
  • Share toolsets across projects
  • Filter tools by tags
  • Version control your tool collections
  • Mix tools from different authors
Use any existing MCP server. Add HTTP or STDIO MCP servers to your configuration:
{
  "mcp_servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspace"],
      "config": {
        "expDays": 7,
        "filter": "only",
        "filterValue": "read_file,write_file,list_directory"
      }
    }
  }
}
Benefits:
  • Tools are cached locally for fast startup
  • Add tagging and filtering to any MCP server
  • Configurable cache expiration
  • Combine tools from multiple servers
Performance without complexity. When you add an MCP server to MCI:
  1. First run: MCI calls the server to discover tools and caches them
  2. Subsequent runs: Tools load instantly from cache
  3. Execution: Upstream server is called only when tools are used
  4. Cache refresh: Automatically updates when TTL expires
This means:
  • Lightning-fast MCP server startup
  • Minimal overhead for tool discovery
  • Configurable cache duration per server
  • Works offline after initial cache
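The cache lifecycle above can be sketched in a few lines of Python. This is an illustration of the pattern, not MCI's actual implementation; the cache file layout and the TTL constant (mirroring the `expDays` field from the config examples) are assumptions:

```python
import json
import time

CACHE_TTL_DAYS = 7  # corresponds to "expDays" in the server config examples


def load_tools(cache_path, discover):
    """Return tool definitions, calling the upstream server only on a cache miss or expiry."""
    now = time.time()
    try:
        with open(cache_path) as f:
            cached = json.load(f)
        if now - cached["fetched_at"] < CACHE_TTL_DAYS * 86400:
            return cached["tools"]  # fast path: no upstream call
    except (FileNotFoundError, KeyError, json.JSONDecodeError):
        pass
    tools = discover()  # slow path: ask the MCP server for its tool list
    with open(cache_path, "w") as f:
        json.dump({"fetched_at": now, "tools": tools}, f)
    return tools
```

Subsequent calls within the TTL read from the cached file and never touch the upstream server, which is what makes startup effectively instant.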
Write schemas your way. Use JSON for speed or YAML for readability:
{
  "tools": [{
    "name": "example",
    "description": "An example tool",
    "execution": { "type": "text", "text": "Hello" }
  }]
}
Both formats work identically. Choose what works best for your team.
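For comparison, the same tool as a YAML document (a direct translation of the JSON above):

```yaml
tools:
  - name: example
    description: An example tool
    execution:
      type: text
      text: Hello
```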
Different servers for different needs. Run specialized MCI configurations:
# Development environment
uvx mcix run --file ./dev-tools.json

# Production monitoring
uvx mcix run --file ./prod-monitoring.json

# Customer support agent
uvx mcix run --file ./support-agent-tools.json
Each configuration can include:
  • Different toolsets
  • Different MCP servers
  • Custom tags and filters
  • Environment-specific settings
Organize and control tool access. Add tags to tools:
./mci/admin-tools.yaml
tools:
  - name: deploy_service
    tags: [deployment, production, dangerous]
    # ... tool definition
./mci.yaml
toolsets:
  - name: admin-tools
    filter: "tags"
    filterValue: "deployment,production"
Filter tools when running:
# Only include tools with specific tags
uvx mcix run --file mci.json --filter tags:deployment,monitoring

# Exclude dangerous tools
uvx mcix run --file mci.json --filter withoutTags:dangerous
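The tags / withoutTags semantics shown above can be sketched as a simple predicate. This is an illustration under the assumption that `tags` keeps a tool carrying any listed tag and `withoutTags` drops a tool carrying any listed tag, not MCI's actual filtering code:

```python
def filter_tools(tools, mode, value):
    """Filter tool dicts by tag, mirroring the CLI's tags:/withoutTags: syntax."""
    wanted = set(value.split(","))
    if mode == "tags":  # keep tools carrying at least one wanted tag
        return [t for t in tools if wanted & set(t.get("tags", []))]
    if mode == "withoutTags":  # drop tools carrying any excluded tag
        return [t for t in tools if not (wanted & set(t.get("tags", [])))]
    return list(tools)


tools = [
    {"name": "deploy_service", "tags": ["deployment", "dangerous"]},
    {"name": "check_status", "tags": ["monitoring"]},
]
safe = filter_tools(tools, "withoutTags", "dangerous")  # drops deploy_service
```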

Sponsors & Support

Help Us Build the Future of MCP Tooling

MCI is built and maintained by individual developers passionate about making MCP accessible to everyone. We’re not backed by tech companies or VC funding, just developers who believe in simplifying AI tool creation.
Ways to support MCI:
  • 🐛 Report bugs and suggest features on GitHub
  • 💻 Contribute code, documentation, or toolset examples
  • 📢 Spread the word — share MCI with your community
  • ⭐ Star the repo to show your support
  • 💝 Become a sponsor to accelerate development

Sponsorship Benefits

🚀 Priority Support

Get direct help with your MCI implementations and use cases

📢 Visibility

Featured in our documentation, releases, and community channels

🎯 Custom Development

Request specific features or toolset implementations

🏆 Recognition

Logo placement and acknowledgment in our growing ecosystem
Interested in sponsoring? Contact us: revaz@usemci.dev
Every contribution helps us maintain the project, add new features, and support the growing MCI community. Thank you for your support! 🙏

Next steps

Common Use Cases

MCI excels at both creating custom MCP servers and providing programmatic tool integration. Choose the mode that fits your needs — or use both!
Transform any REST API into MCP tools. Create custom tools that wrap third-party APIs like weather services, payment processors, or internal microservices:
tools:
  - name: get_weather
    description: Get current weather
    execution:
      type: http
      method: GET
      url: https://api.weather.com/current
      auth:
        type: apiKey
        in: header
        name: X-API-Key
Use as MCP server: Connect to Claude Desktop, VSCode, or other MCP clients
Use programmatically: call tools directly from your Python application, much like an MCP client would
Perfect for:
  • SaaS API integrations
  • Internal microservice access
  • Third-party service wrappers
  • Custom authentication flows
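Executing an HTTP tool like get_weather boils down to building a request from the schema. A rough sketch of just the request construction with urllib (not MCI's actual execution engine; the WEATHER_API_KEY environment variable name is hypothetical):

```python
import os
import urllib.request


def build_request(tool):
    """Build an HTTP request from a tool's execution schema (apiKey-in-header auth)."""
    ex = tool["execution"]
    req = urllib.request.Request(ex["url"], method=ex.get("method", "GET"))
    auth = ex.get("auth", {})
    if auth.get("type") == "apiKey" and auth.get("in") == "header":
        # Hypothetical env var name, for illustration only
        req.add_header(auth["name"], os.environ.get("WEATHER_API_KEY", ""))
    return req


tool = {
    "name": "get_weather",
    "execution": {
        "type": "http",
        "method": "GET",
        "url": "https://api.weather.com/current",
        "auth": {"type": "apiKey", "in": "header", "name": "X-API-Key"},
    },
}
req = build_request(tool)  # ready to pass to urllib.request.urlopen(...)
```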
Build specialized tool collections for different agents. Create focused tool collections for specific agent roles:
# As MCP servers
uvx mcix run --file ./mci/support-agent.json
uvx mcix run --file ./mci/dev-assistant.json
# Or programmatically in your agent framework
from mcipy import MCIClient

support_tools = MCIClient(json_file_path="./mci/support-agent.json")
dev_tools = MCIClient(json_file_path="./mci/dev-assistant.json")

# Use with LangChain, CrewAI, or your custom framework
Each can combine:
  • Relevant MCP servers (filesystem, search, etc.)
  • Custom API tools
  • Role-specific CLI wrappers
  • Filtered tool access via tags
One MCP server, multiple tool sources. Combine tools from various sources into a single MCP endpoint:
{
  "toolsets": [
    "github-api.json",
    { "name": "slack-api.json" },
    { "name": "jira-api.json" }
  ],
  "mcp_servers": {
    "filesystem": { "type": "stdio" /* ... */ },
    "brave-search": { "type": "stdio" /* ... */ }
  }
}
Your AI application gets one unified interface to all tools — whether using MCP server mode or the programmatic adapter.

Manage complex prompts with File execution. Store prompts as files with templating:
tools:
  - name: code_review_prompt
    description: Generate code review instructions
    execution:
      type: file
      path: ./prompts/code-review.md
      enableTemplating: true
./prompts/code-review.md
# Code Review Instructions

Repository: {{props.repo_name}}
Language: {{props.language}}

Please review the code following these guidelines:

- Check for security vulnerabilities
- Verify error handling
- Assess code readability

{{env.CUSTOM_GUIDELINES}}
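The {{props.*}} and {{env.*}} placeholders above can be resolved with a small substitution routine. This sketches only the placeholder concept; MCI's engine also supports conditionals and loops, which are omitted here:

```python
import os
import re


def render(template, props):
    """Replace {{props.x}} and {{env.X}} placeholders with their values."""
    def resolve(match):
        scope, key = match.group(1), match.group(2)
        if scope == "props":
            return str(props.get(key, ""))
        return os.environ.get(key, "")  # scope == "env"
    return re.sub(r"\{\{(props|env)\.(\w+)\}\}", resolve, template)


text = render(
    "Repository: {{props.repo_name}}\nLanguage: {{props.language}}",
    {"repo_name": "mci", "language": "Python"},
)
```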

Getting Started

  • MCP Server Mode
  • Programmatic Mode (Python)
1. Install MCI

# Using uvx (recommended)
uvx mcix install
2. Create Your Configuration

Create an mci.json or mci.yaml file
3. Run as MCP Server

uvx mcix run
Your MCP server is now running! Tools are cached and ready to use.
4. Connect to Your Application

Add to Claude Desktop, VSCode, or Cursor config:
{
  "mcpServers": {
    "my-mci-server": {
      "command": "uvx",
      "args": ["mcix", "run"]
    }
  }
}