# Test Specification Examples
Real-world examples of test specifications for different types of MCP servers. All examples have been verified to work with the actual implementation.
## Available Examples
### Filesystem MCP Server
**Status:** ✅ 100% Verified Working
Comprehensive test specification for the `@modelcontextprotocol/server-filesystem` package, demonstrating:
- **File operations** - create, read, write, move, and delete operations
- **Directory management** - creation and listing operations
- **Error handling** - proper error responses for invalid operations
- **MCP compliance** - follows the MCP 2025-06-18 specification format
- **Performance validation** - response time requirements
name: "Filesystem MCP Server (MCP-Compliant)"
version: "1.0.0"
description: "Testing @modelcontextprotocol/server-filesystem according to MCP specification"
capabilities:
tools: true # Filesystem operations work
resources: false # Resources not used
prompts: false # Not supported
sampling: false # Not supported
logging: false # Not enabled
server:
command: "npx"
args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp/mcp-test-sandbox"]
transport: "stdio"
startup_timeout_seconds: 30
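The full spec also defines tool tests against this configuration. A minimal sketch of one such test is shown below, following the schema of the minimal template later on this page. The `write_file` tool and its `path`/`content` parameters come from the filesystem server's tool list, but verify them (and the response format) against your installed version before relying on this.

```yaml
tools:
  - name: "write_file"            # Exact tool name from the server's tools/list
    description: "Write a file inside the sandbox"
    tests:
      - name: "write_text_file"
        description: "Create a small text file in the sandbox directory"
        input:
          path: "/tmp/mcp-test-sandbox/hello.txt"
          content: "hello"
        expected:
          error: false
          fields:
            - path: "$[0].text"   # Tool results arrive as a content array
              field_type: "string"
              required: true
```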
### Everything MCP Server
**Status:** ✅ 100% Verified Working (8/8 tests passing)
Optimized test specification for the `@modelcontextprotocol/server-everything` package, featuring:
- **Mathematical operations** - addition with various number types
- **Text processing** - echo functionality with Unicode support
- **Environment access** - system environment variable debugging
- **Long-running operations** - progress notification testing
- **Resource management** - basic resource access and validation
- **Server-reality alignment** - only tests features that actually work
name: "Everything MCP Server (Working Tests)"
version: "2025.7.1"
description: "Only tests that are proven to work with the everything server"
capabilities:
tools: true # These specific tools work
resources: true # Basic resource access works
prompts: false # Not supported
sampling: false # Not supported - returns MCP error -32601
logging: true # Works
server:
command: "npx"
args: ["-y", "@modelcontextprotocol/server-everything"]
transport: "stdio"
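As a sketch of one of the eight passing tests, the following exercises the everything server's `add` tool and validates the `$[0].text` response format shown in the "Specification Evolution" section below. The input values and the `contains` check are illustrative; the server returns the sum embedded in a sentence rather than as raw JSON.

```yaml
tools:
  - name: "add"                  # Addition tool exposed by the everything server
    description: "Add two numbers"
    tests:
      - name: "add_integers"
        description: "Add two integers and check the textual result"
        input:
          a: 42
          b: 58
        expected:
          error: false
          fields:
            - path: "$[0].text"  # Text of the first content item
              contains: "100"    # The sum appears inside a sentence
              required: true
```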
### CodePrism MCP Server
Basic test specification for the CodePrism MCP server, demonstrating:
- **Tool testing** - repository stats, complexity analysis, symbol search
- **Performance requirements** - response time and memory constraints
- **Error handling** - validation patterns and expected behaviors
- **Configuration** - server setup and transport configuration
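Unlike the two examples above, no verified spec is reproduced here. The following is a hypothetical minimal sketch only: the binary name, tool name, and input are assumptions derived from the feature list above, so replace them with whatever your CodePrism build actually exposes via `tools/list`.

```yaml
name: "CodePrism MCP Server"
version: "1.0.0"
description: "Repository analysis tools (hypothetical sketch - verify names)"

capabilities:
  tools: true
  resources: false
  prompts: false
  sampling: false
  logging: false

server:
  command: "codeprism-mcp"      # Assumed binary name - adjust to your install
  args: ["stdio"]
  transport: "stdio"
  startup_timeout_seconds: 30

tools:
  - name: "repository_stats"    # Assumed tool name - check the server's tools/list
    description: "Basic repository statistics"
    tests:
      - name: "stats_basic"
        description: "Request stats for the current repository"
        input:
          path: "."
        expected:
          error: false
          fields:
            - path: "$[0].text"
              field_type: "string"
              required: true
```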
## Usage Examples
### Running Test Specifications
```bash
# Test the filesystem server
moth run codeprism-docs/docs/test-harness/examples/filesystem-server.yaml

# Test the everything server (verified 100% working)
moth run codeprism-docs/docs/test-harness/examples/everything-server.yaml

# Validate a specification before running
moth validate codeprism-docs/docs/test-harness/examples/filesystem-server.yaml

# Run with verbose output for debugging
moth -v run codeprism-docs/docs/test-harness/examples/everything-server.yaml
```
### Configuration Validation
```bash
# Validate specification syntax and structure
moth validate filesystem-server.yaml --detailed

# Check all validation aspects
moth validate everything-server.yaml --check-all

# Generate validation report
moth validate codeprism-mcp.yaml --formats html --output ./validation-reports
```
## Working Examples - Real Test Results
### Filesystem Server Results
```text
✅ Test Suite Finished
Suite: Filesystem MCP Server (MCP-Compliant)
Total Tests: 8, Passed: 8, Failed: 0
Duration: 2.3s
Success Rate: 100%
```
### Everything Server Results
```text
✅ Test Suite Finished
Suite: Everything MCP Server (Working Tests)
Total Tests: 8, Passed: 8, Failed: 0
Duration: 10.02s
Success Rate: 100%
```
## Creating Your Own Test Specifications
When creating test specifications for your MCP server:
- **Start with Working Examples** - Use our verified examples as templates
- **Test Server Reality** - Only claim capabilities your server actually supports
- **Use Correct Tool Names** - Verify tool names match the server implementation
- **Validate Output Formats** - Check the actual server response structure
- **Set Realistic Timeouts** - Base timeouts on actual server performance
- **Include Error Tests** - Test both success and failure scenarios (an error-test sketch follows the template below)
### Minimal Working Template
name: "My MCP Server"
version: "1.0.0"
description: "Basic test specification"
# Only claim capabilities your server actually supports
capabilities:
tools: true
resources: false
prompts: false
sampling: false
logging: false
server:
command: "my-server"
args: ["stdio"]
transport: "stdio"
startup_timeout_seconds: 30
tools:
- name: "my_tool" # Use exact tool name from server
description: "Test my tool functionality"
tests:
- name: "basic_test"
description: "Basic functionality test"
input:
param1: "value1"
expected:
error: false
fields:
- path: "$[0].text" # Use actual server response format
field_type: "string"
required: true
test_config:
timeout_seconds: 60
max_concurrency: 2
fail_fast: false
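The template covers only the success path. Below is a hedged sketch of a companion error test that would sit alongside `basic_test` in the same `tests` list; it assumes `error: true` mirrors the `error: false` expectation shown above and that your tool rejects a missing required parameter.

```yaml
      - name: "missing_param_test"
        description: "Tool should return an error when param1 is omitted"
        input: {}          # Deliberately invalid: required param1 is missing
        expected:
          error: true      # Assumed mirror of `error: false` - verify with your harness
```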
## Server-Reality Best Practices
Based on our testing experience:
### ✅ Do This
- **Verify tool names** against the actual server implementation
- **Test response formats** to understand the actual output structure
- **Set capabilities accurately** (`false` for unsupported features)
- **Use realistic timeouts** based on actual performance
- **Include both success and error test cases**
- **Validate with `moth validate`** before running
### ❌ Avoid This
- **Claiming false capabilities** (e.g., `sampling: true` when unsupported)
- **Using wrong tool names** (e.g., `longOperation` vs `longRunningOperation`)
- **Expecting wrong output formats** (e.g., `$.result` vs `$[0].text`)
- **Setting unrealistic timeouts** (too short for actual server performance)
- **Only testing success cases** (missing error scenario validation)
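The `$[0].text` paths above select into the content array that MCP tool calls return. As a rough sketch (the exact wording of the `text` value is server-specific), a successful `tools/call` result content array looks like this, which is why `$.result` matches nothing:

```yaml
# Typical shape of a tools/call result's content array (wording is illustrative).
# JSONPath $[0].text selects the `text` field of the first content item.
- type: "text"
  text: "The sum of 42 and 58 is 100."
```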
## Specification Evolution
### Version 1.0 vs 2.0 Patterns
**Version 1.0 (Documentation):**
```yaml
capabilities:
  sampling: true     # ❌ Often incorrect
  prompts: true      # ❌ Often unsupported

expected:
  path: "$.result"   # ❌ Wrong format
  value: 8           # ❌ Wrong expectations
```
**Version 2.0 (Server Reality):**
```yaml
capabilities:
  sampling: false         # ✅ Accurate to server
  prompts: false          # ✅ Tested and verified

expected:
  fields:
    - path: "$[0].text"   # ✅ Actual server format
      contains: "100"     # ✅ Realistic validation
```
## Testing Your Examples
Before sharing test specifications:
- **Run the specification** against the actual server
- **Achieve a 100% success rate** or document expected failures
- **Validate with all checks** using `moth validate --check-all`
- **Test on a clean environment** to ensure reproducibility
- **Document any setup requirements** (sandbox directories, etc.)
**Need help creating test specifications?**

- Check our Configuration Reference for complete YAML documentation
- Review our Working Examples for proven patterns
- Use the Quick Start Guide for step-by-step setup