User Guide

Complete guide to using the MCP Test Harness for testing Model Context Protocol (MCP) servers.

📋 Overview

The MCP Test Harness (moth) is a comprehensive tool for validating MCP server implementations. It provides:

  • Protocol Compliance Testing - Validate adherence to MCP specification
  • Performance Monitoring - Track response times and resource usage
  • Security Validation - Test authentication, authorization, and data protection
  • Custom Testing - Write your own validation logic
  • CI/CD Integration - Automated testing in your development pipeline

New to the Test Harness? Start with the Quick Start Guide for a 5-minute tutorial, or check Installation for setup instructions.

🔧 Command Overview

The moth binary provides several commands for testing MCP servers:

  • moth test - Execute test specifications against servers
  • moth validate - Validate YAML specification syntax
  • moth list - List available tests without running them
  • moth version - Show version information

Basic Usage:

# Run tests
moth test my-server.yaml

# Validate configuration
moth validate my-server.yaml --verbose

# List available tests
moth list my-server.yaml --detailed

Complete Command Reference: See CLI Reference for detailed command documentation, options, and examples.

🎯 Common Usage Patterns

Development Workflow

1. Validate Configuration First

# Always validate before running tests
moth validate my-server.yaml --verbose

# Check what tests will run
moth list my-server.yaml --detailed

2. Iterative Development

# Run specific tests during development
moth test my-server.yaml --filter "authentication.*" --fail-fast

# Use text output for immediate feedback
moth test my-server.yaml --output text --verbose

# Test single functionality quickly
moth test my-server.yaml --filter "specific_tool" --concurrency 1

3. Debug Test Failures

# Stop on first failure for debugging
moth test my-server.yaml --fail-fast --verbose

# Run problematic test in isolation
moth test my-server.yaml --filter "failing_test_name"

CI/CD Integration

1. Automated Testing Pipeline

# Validate all specs
moth validate test-specs/

# Run tests with CI-friendly output
moth test my-server.yaml --output junit --output-file results.xml

# Generate HTML reports for artifacts
moth test my-server.yaml --output html --output-file report.html
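
If you run the harness from GitHub Actions, a minimal job might look like the following sketch. It assumes moth is already installed on the runner and that your spec file is named my-server.yaml; adjust both to match your project.

# .github/workflows/mcp-tests.yml (illustrative sketch; assumes moth is on PATH)
name: MCP server tests

on: [push, pull_request]

jobs:
  mcp-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Fail fast on spec syntax errors
      - name: Validate spec
        run: moth validate my-server.yaml --verbose

      # JUnit output integrates with most CI result viewers
      - name: Run tests
        run: moth test my-server.yaml --output junit --output-file results.xml

      # Keep the report even if the test step fails
      - name: Upload results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: mcp-test-results
          path: results.xml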

2. Performance Monitoring

# Performance regression testing
moth test performance-spec.yaml --concurrency 8

# Compare against performance baselines
moth test my-server.yaml --filter "performance.*"

📊 Understanding Test Results

Test Output Format

🧪 Running Test Suite: core-protocol-tests
================================================================================

✅ PASS: initialize
   Duration: 150ms
   Server Response: {"protocolVersion": "2025-06-18", "capabilities": {...}}

❌ FAIL: list_resources
   Duration: 5000ms (TIMEOUT)
   Error: Server did not respond within timeout period
   Expected: Response with resource list
   Actual: No response received

⚠️ SKIP: optional_feature
   Reason: Server does not advertise this capability

================================================================================
Suite Summary: core-protocol-tests
- Total Tests: 3
- Passed: 1
- Failed: 1
- Skipped: 1
- Success Rate: 33.3%
- Total Duration: 5.15s
================================================================================

Result Status Codes

  • ✅ PASS - Test completed successfully, all validations passed
  • ❌ FAIL - Test failed validation or encountered an error
  • ⚠️ SKIP - Test was skipped (disabled, unsupported capability, etc.)
  • 🔄 RETRY - Test is being retried after failure
  • ⏱️ TIMEOUT - Test exceeded maximum execution time

Performance Metrics

Performance Summary:
- Average Response Time: 245ms
- 95th Percentile: 800ms
- 99th Percentile: 1.2s
- Fastest Test: initialize (98ms)
- Slowest Test: complex_analysis (1.8s)
- Memory Usage: Peak 156MB, Average 89MB
- Regression Status: ✅ No performance regressions detected

โš™๏ธ YAML Configuration Essentialsโ€‹

Complete Configuration Reference: See Configuration Reference for detailed documentation of all YAML options and advanced features.

Basic Server Configuration

# Minimal configuration for stdio server
name: "My MCP Server Test"
version: "1.0.0"

capabilities:
  tools: true
  resources: false

server:
  command: "cargo run --bin my-mcp-server"
  args: ["stdio"]
  transport: "stdio"
  startup_timeout_seconds: 30

tools:
  - name: "echo"
    description: "Test echo functionality"
    tests:
      - name: "basic_echo"
        input:
          message: "Hello, World!"
        expected:
          error: false
          fields:
            - path: "$.result"
              field_type: "string"
              required: true

HTTP Server Configuration

server:
  transport: "http"
  url: "http://localhost:3000"
  connection_timeout: 10
  headers:
    Authorization: "Bearer ${API_TOKEN}"
    Content-Type: "application/json"
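
The ${API_TOKEN} placeholder keeps the secret out of the spec file. Assuming the harness resolves such placeholders from the environment (as the example above suggests), export the variable before invoking the tests:

# Supply the token via the environment instead of hard-coding it in YAML
export API_TOKEN="your-token-here"
moth test my-server.yaml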

🚀 Advanced Testing Scenarios

Custom Validation Scripts

Use Python scripts for complex validation logic:

tools:
  - name: "complex_analysis"
    tests:
      - name: "custom_validation"
        input:
          project_path: "test-project"
        expected:
          error: false
          custom_validation:
            script: "scripts/validate_analysis.py"
            timeout_seconds: 30

Custom validation script example:

#!/usr/bin/env python3
# scripts/validate_analysis.py

import json
import sys

def validate_response(response_data):
    """Custom validation logic"""
    result = response_data.get('result', {})

    # Custom business logic validation
    if result.get('total_files', 0) < 1:
        return False, "No files analyzed"

    if 'languages' not in result:
        return False, "Missing language analysis"

    return True, "Validation passed"

if __name__ == "__main__":
    response = json.load(sys.stdin)
    success, message = validate_response(response)

    if success:
        print(f"✅ {message}")
        sys.exit(0)
    else:
        print(f"❌ {message}")
        sys.exit(1)
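
Because the script reads the server response as JSON from stdin, you can dry-run it against a captured or hand-written response before wiring it into a test:

# Exercise the validator on its own with a sample response
echo '{"result": {"total_files": 3, "languages": ["rust", "python"]}}' \
  | python3 scripts/validate_analysis.py
# Prints: ✅ Validation passed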

📈 Best Practices

1. Test Organization

  • Group related tests in logical tool/resource sections
  • Use descriptive test names that explain the scenario
  • Include both success and error test cases
  • Test edge cases and boundary conditions

2. Configuration Management

  • Start with simple configurations and add complexity gradually
  • Use environment variables for sensitive data
  • Validate configurations before running tests
  • Version your test specifications alongside your server code

3. Performance Considerations

  • Set appropriate timeouts based on expected response times
  • Use concurrency judiciously - don't overwhelm your server
  • Monitor resource usage during test execution
  • Include performance requirements in critical tests

4. Debugging and Troubleshooting

  • Use --verbose flag for detailed output during debugging
  • Run single tests with --filter to isolate issues
  • Check server logs when tests fail unexpectedly
  • Validate your YAML before running tests (see the one-liner below)
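
A convenient pattern is to chain validation and a fail-fast run, so a broken spec never reaches the test phase:

# Validate first, then stop on the first failing test with verbose output
moth validate my-server.yaml && moth test my-server.yaml --fail-fast --verbose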

🔗 Next Steps


Need Help? Check the Troubleshooting Guide or open an issue for assistance.