User Guide
Welcome to [dummy-responder]
[dummy-responder] is a configurable mock service that helps you test your applications by providing predictable responses across HTTP, WebSocket, and Server-Sent Events protocols. Whether you're developing a new API client, testing error handling, or simulating network conditions, [dummy-responder] makes it easy.
Quick Start
Option 1: Docker (Recommended)
The fastest way to get started is with Docker:
```bash
# Run with Docker
docker run -p 8080:8080 dummy-responder

# Or use Docker Compose
curl -O https://gitlab.bjoernbartels.earth/devops/dummy-responder/-/raw/master/docker-compose.yml
docker-compose up
```
Option 2: Pre-built Binary
Download the latest binary for your platform from the releases page:
```bash
# Download and run (example for Linux)
wget https://gitlab.bjoernbartels.earth/devops/dummy-responder/-/releases/latest/download/dummy-responder-linux-amd64
chmod +x dummy-responder-linux-amd64

# Run with default settings (port 8080)
./dummy-responder-linux-amd64

# Run with custom port via CLI flag
./dummy-responder-linux-amd64 -port 9090

# Run with custom port via environment variable
PORT=3000 ./dummy-responder-linux-amd64

# Show all available options
./dummy-responder-linux-amd64 -help
```
Option 3: Build from Source
If you have Go installed:
```bash
git clone https://gitlab.bjoernbartels.earth/devops/dummy-responder.git
cd dummy-responder

# Run with default port
go run ./cmd/dummy-responder

# Run with custom port
go run ./cmd/dummy-responder -port 8090

# Run with environment variable
PORT=3000 go run ./cmd/dummy-responder
```
Configuration Options
The service can be configured via command-line flags or environment variables:
| Option | CLI Flag | Environment Variable | Default | Description |
|---|---|---|---|---|
| Port | `-port 8080` | `PORT=8080` | `8080` | Port to listen on |
| Help | `-help` | - | - | Show help message and exit |
Priority Order: CLI flags override environment variables, which override defaults.
```bash
# Examples of different configuration methods
./dummy-responder -port 9090              # CLI flag
PORT=3000 ./dummy-responder               # Environment variable
PORT=3000 ./dummy-responder -port 8080    # CLI flag overrides env var (uses 8080)
```
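The override chain (flag, then environment variable, then default) can be sketched as follows. This is illustrative Python, not the service's actual resolution code (the service itself is written in Go):

```python
import os

def resolve_port(cli_port=None, env=None, default=8080):
    """Resolve the listen port: CLI flag > PORT env var > built-in default."""
    env = os.environ if env is None else env
    if cli_port is not None:      # highest priority: an explicit -port flag
        return int(cli_port)
    if "PORT" in env:             # next: the PORT environment variable
        return int(env["PORT"])
    return default                # fallback: the built-in default

# Mirrors the shell examples above:
print(resolve_port(cli_port=8080, env={"PORT": "3000"}))  # flag wins -> 8080
print(resolve_port(env={"PORT": "3000"}))                 # env var  -> 3000
print(resolve_port(env={}))                               # default  -> 8080
```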
Verify Installation
Once running, test the service:
```bash
curl http://localhost:8080/
# Response: Hello, World!

curl http://localhost:8080/health
# Response: {"status":"healthy","timestamp":"2024-12-19T10:30:00Z"}

# View current configuration
curl http://localhost:8080/config | jq .
# Response: Current service configuration as JSON
```
💡 Need detailed configuration options? See the Configuration Reference for comprehensive documentation of all configuration methods, options, and examples.
Basic Usage
Simple HTTP Requests
The most basic usage is sending HTTP requests and receiving responses:
```bash
# Simple GET request
curl http://localhost:8080/
# Returns: Hello, World! (HTTP 200)

# GET request with custom message
curl "http://localhost:8080/?message=Custom+Response"
# Returns: Custom Response (HTTP 200)
```
Configuring Response Status Codes
Use the `status` parameter to control the HTTP response code:
```bash
# Success responses
curl "http://localhost:8080/?status=200"  # OK
curl "http://localhost:8080/?status=201"  # Created
curl "http://localhost:8080/?status=202"  # Accepted

# Client error responses
curl "http://localhost:8080/?status=400"  # Bad Request
curl "http://localhost:8080/?status=401"  # Unauthorized
curl "http://localhost:8080/?status=404"  # Not Found

# Server error responses
curl "http://localhost:8080/?status=500"  # Internal Server Error
curl "http://localhost:8080/?status=503"  # Service Unavailable
```
Adding Response Delays
Simulate network latency or slow responses using the `delay` parameter:
```bash
# 2 second delay
curl "http://localhost:8080/?delay=2s"

# 500 millisecond delay
curl "http://localhost:8080/?delay=500ms"

# 1 minute delay (for timeout testing)
curl "http://localhost:8080/?delay=1m"
```
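The examples above suggest the `delay` parameter accepts Go-style duration strings (`2s`, `500ms`, `1m`). As a sketch of that format, here is a hypothetical Python parser; the actual service presumably uses Go's `time.ParseDuration`, so treat this as an approximation of the accepted syntax:

```python
import re

# Units implied by the examples above (Go-style duration suffixes).
_UNITS = {"ms": 0.001, "s": 1, "m": 60, "h": 3600}

def parse_delay(value):
    """Convert a duration string like '500ms', '2s', or '1m30s' to seconds."""
    matches = re.findall(r"(\d+(?:\.\d+)?)(ms|s|m|h)", value)
    if not matches:
        raise ValueError(f"invalid duration: {value!r}")
    return sum(float(amount) * _UNITS[unit] for amount, unit in matches)

print(parse_delay("2s"))     # 2.0
print(parse_delay("500ms"))  # 0.5
print(parse_delay("1m"))     # 60.0
```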
Different Content Types
Control the response content type and format:
```bash
# JSON response
curl "http://localhost:8080/?content-type=json"
# Returns: {"message":"Hello, World!","timestamp":"2024-12-19T10:30:00Z"}

# XML response
curl "http://localhost:8080/?content-type=xml"
# Returns: <?xml version="1.0"?><response><message>Hello, World!</message></response>

# HTML response
curl "http://localhost:8080/?content-type=html"
# Returns: <html><body><h1>Hello, World!</h1></body></html>

# Plain text (default)
curl "http://localhost:8080/?content-type=text"
# Returns: Hello, World!
```
Advanced HTTP Features
Simulating Failures
Test your error handling by simulating random failures:
```bash
# 20% chance of returning a 500 error
curl "http://localhost:8080/?failure-rate=20"

# 50% chance of failure with custom error status
curl "http://localhost:8080/?failure-rate=50&failure-status=503"
```
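The `failure-rate` value is a percentage, so each request fails independently with that probability; presumably the server draws a random number per request and fails when it falls below the threshold. A Python sketch of that sampling logic (an assumption for illustration, not the service's actual implementation):

```python
import random

def should_fail(failure_rate, rng=random):
    """Return True with probability failure_rate percent (0-100)."""
    return rng.uniform(0, 100) < failure_rate

# Rough expectation check: about 20% of 10,000 requests should fail.
rng = random.Random(42)  # fixed seed for reproducibility
failures = sum(should_fail(20, rng) for _ in range(10_000))
print(f"{failures} failures out of 10,000 (~{failures / 100:.1f}%)")
```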
Combining Parameters
You can combine multiple parameters for complex scenarios:
```bash
# JSON response with 2-second delay and 201 status
curl "http://localhost:8080/?content-type=json&delay=2s&status=201"

# Simulate slow, failing service
curl "http://localhost:8080/?delay=5s&failure-rate=30&status=502"

# Custom message with specific content type
curl "http://localhost:8080/?message=Order+Created&content-type=json&status=201"
```
Request Information Echo
The service can echo back information about your request:
```bash
# Echo request headers and details
curl "http://localhost:8080/?echo=true" -H "Authorization: Bearer token123"
# Response includes your request details in JSON format
```
WebSocket Communication
Basic WebSocket Connection
Test WebSocket connections and messaging:
```bash
# Using wscat (install with: npm install -g wscat)
wscat -c ws://localhost:8080/ws

# Send messages and receive echoes
> Hello WebSocket!
< Echo: Hello WebSocket!
< Broadcast: New client connected
```
WebSocket Test Page
Open the included test page in your browser:
```bash
# Open in your browser
open http://localhost:8080/test-ws.html
```
The test page provides:
- Connection status indicator
- Message sending interface
- Received message display
- Connection controls (connect/disconnect)
WebSocket with JavaScript
```javascript
// Connect to WebSocket
const ws = new WebSocket('ws://localhost:8080/ws');

// Handle connection open
ws.onopen = function(event) {
  console.log('Connected to WebSocket');
  ws.send('Hello from JavaScript!');
};

// Handle incoming messages
ws.onmessage = function(event) {
  console.log('Received:', event.data);
};

// Handle errors
ws.onerror = function(error) {
  console.error('WebSocket error:', error);
};

// Handle connection close
ws.onclose = function(event) {
  console.log('WebSocket connection closed');
};
```
Server-Sent Events (SSE)
Basic SSE Connection
Server-Sent Events provide a way to receive real-time updates from the server:
```bash
# Connect to SSE stream
curl -N -H "Accept: text/event-stream" http://localhost:8080/events

# You'll receive periodic events like:
# data: {"timestamp":"2024-12-19T10:30:00Z","message":"Server time update","counter":1}
# data: {"timestamp":"2024-12-19T10:30:05Z","message":"Server time update","counter":2}
```
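If you want to consume the stream outside a browser, the wire format shown above (`data:` lines separated by blank lines) can be split apart with a few lines of code. A minimal Python sketch, illustrative only; real clients should use a proper SSE library that also handles `event:`, `id:`, and reconnection:

```python
def parse_sse(stream_text):
    """Split a raw text/event-stream payload into per-event data strings.

    Events are separated by a blank line; each 'data:' line carries payload.
    """
    events = []
    for block in stream_text.split("\n\n"):
        data_lines = [line[5:].lstrip() for line in block.split("\n")
                      if line.startswith("data:")]
        if data_lines:
            events.append("\n".join(data_lines))
    return events

raw = (
    'data: {"counter":1,"message":"Server time update"}\n\n'
    'data: {"counter":2,"message":"Server time update"}\n\n'
)
for event in parse_sse(raw):
    print(event)
```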
SSE Test Page
Open the included SSE test page:
```bash
# Open in your browser
open http://localhost:8080/test-sse.html
```
SSE with JavaScript
```javascript
// Connect to SSE stream
const eventSource = new EventSource('http://localhost:8080/events');

// Handle incoming events
eventSource.onmessage = function(event) {
  const data = JSON.parse(event.data);
  console.log('SSE Event:', data);

  // Update your UI with the event data
  document.getElementById('events').innerHTML +=
    `<div>Time: ${data.timestamp}, Message: ${data.message}</div>`;
};

// Handle errors
eventSource.onerror = function(error) {
  console.error('SSE error:', error);
};

// Close the connection when done
// eventSource.close();
```
Health and Status Endpoints
Health Check
Monitor service health:
```bash
curl http://localhost:8080/health
# Response: {"status":"healthy","timestamp":"2024-12-19T10:30:00Z","uptime":"2h30m"}
```
API Documentation
Access interactive API documentation:
```bash
# Open Swagger UI in browser
open http://localhost:8080/swagger/

# Get OpenAPI spec as JSON
curl http://localhost:8080/swagger/doc.json

# Get OpenAPI spec as YAML
curl http://localhost:8080/swagger/doc.yaml
```
Testing Scenarios
API Client Testing
Test how your API client handles different scenarios:
```bash
# Test successful response
curl "http://localhost:8080/?status=200&content-type=json"

# Test client error handling
curl "http://localhost:8080/?status=400&message=Invalid+request"

# Test server error handling
curl "http://localhost:8080/?status=500&message=Database+unavailable"

# Test timeout handling
curl --max-time 2 "http://localhost:8080/?delay=5s"
```
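When testing timeout and failure handling, clients commonly pair these endpoints with a retry policy. A minimal, self-contained Python sketch of retry with exponential backoff; `request_fn` is a stand-in for your actual HTTP call (for example, a GET against `http://localhost:8080/?failure-rate=50`):

```python
import time

def call_with_retries(request_fn, max_attempts=3, base_delay=0.1):
    """Call request_fn, retrying on failure with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts:
                raise
            # Back off: base_delay, 2*base_delay, 4*base_delay, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulate a flaky endpoint that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("simulated 500")
    return "HTTP 200"

print(call_with_retries(flaky))  # succeeds on the third attempt
```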
Load Testing Setup
Prepare for load testing:
```bash
# Start multiple instances
docker run -d -p 8081:8080 dummy-responder
docker run -d -p 8082:8080 dummy-responder
docker run -d -p 8083:8080 dummy-responder

# Test with different response times
curl "http://localhost:8081/?delay=100ms"  # Fast
curl "http://localhost:8082/?delay=500ms"  # Medium
curl "http://localhost:8083/?delay=1s"     # Slow
```
Error Rate Testing
Test how your application handles various error rates:
```bash
# Low error rate (5%)
curl "http://localhost:8080/?failure-rate=5"

# Medium error rate (25%)
curl "http://localhost:8080/?failure-rate=25"

# High error rate (50%)
curl "http://localhost:8080/?failure-rate=50"
```
Configuration Options
Command Line Flags
The service supports various configuration options via command-line flags:
```bash
# Show all available options
./dummy-responder -help

# Set custom port
./dummy-responder -port 9090

# Simple usage - no additional flags needed for basic operation
```
Environment Variables
Configure the service behavior with environment variables:
```bash
# Change the port (same as -port flag)
PORT=9090 ./dummy-responder

# Configure timeouts (future feature)
READ_TIMEOUT=30s WRITE_TIMEOUT=30s ./dummy-responder
```
Configuration Priority
Configuration options are applied in this order (highest to lowest priority):
1. Command-line flags (e.g., `-port 8090`)
2. Environment variables (e.g., `PORT=8090`)
3. Default values (e.g., `8080`)
```bash
# Examples showing priority
PORT=3000 ./dummy-responder             # Uses port 3000 (env var)
PORT=3000 ./dummy-responder -port 8090  # Uses port 8090 (CLI flag overrides env var)
./dummy-responder                       # Uses port 8080 (default)
```
Docker Environment
```bash
# Using environment file
echo "PORT=8080" > .env
echo "MAX_CONNECTIONS=1000" >> .env
docker run --env-file .env -p 8080:8080 dummy-responder

# Using docker-compose with environment
cat > docker-compose.yml << EOF
version: '3.8'
services:
  dummy-responder:
    image: dummy-responder
    ports:
      - "8080:8080"
    environment:
      - PORT=8080
      - MAX_CONNECTIONS=1000
EOF
docker-compose up
```
Integration Examples
CI/CD Pipeline Testing
Use [dummy-responder] in your CI/CD pipelines:
```yaml
# GitHub Actions example
- name: Start mock service
  run: |
    docker run -d -p 8080:8080 --name mock-service dummy-responder

- name: Wait for service
  run: |
    timeout 30 bash -c 'until curl -f http://localhost:8080/health; do sleep 1; done'

- name: Run integration tests
  run: |
    # Your tests here, pointing to http://localhost:8080
    npm test

- name: Cleanup
  run: docker stop mock-service
```
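The same wait-for-service gate can be expressed in test code instead of shell. A minimal Python sketch; the `check` callable here is a stand-in for an HTTP GET against `/health`:

```python
import time

def wait_for_healthy(check, timeout=30.0, interval=1.0,
                     clock=time.monotonic, sleep=time.sleep):
    """Poll check() until it returns True or timeout seconds elapse."""
    deadline = clock() + timeout
    while clock() < deadline:
        if check():
            return True
        sleep(interval)
    return False

# Simulate a service that becomes healthy on the third probe.
probes = {"n": 0}
def probe():
    probes["n"] += 1
    return probes["n"] >= 3

print(wait_for_healthy(probe, timeout=5.0, interval=0.01))  # True
```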
Development Environment
Set up a development environment with multiple services:
```yaml
# docker-compose.dev.yml
version: '3.8'
services:
  # Main application
  my-app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - API_URL=http://api-mock:8080
      - WEBSOCKET_URL=ws://api-mock:8080/ws
    depends_on:
      - api-mock

  # Mock API service
  api-mock:
    image: dummy-responder
    ports:
      - "8080:8080"
    environment:
      - PORT=8080

  # Mock slow service
  slow-service:
    image: dummy-responder
    ports:
      - "8081:8080"
    environment:
      - DEFAULT_DELAY=2s
```
Testing Different Environments
Simulate different environment conditions:
```bash
# Production-like environment (fast, reliable)
docker run -d -p 8080:8080 \
  -e PORT=8080 \
  dummy-responder

# Staging environment (moderate delays, some failures)
docker run -d -p 8081:8080 \
  -e DEFAULT_DELAY=200ms \
  -e DEFAULT_FAILURE_RATE=5 \
  dummy-responder

# Development environment (slow, high failure rate)
docker run -d -p 8082:8080 \
  -e DEFAULT_DELAY=1s \
  -e DEFAULT_FAILURE_RATE=20 \
  dummy-responder
```
Monitoring and Basic Output
Limited Logging Implementation
The service provides minimal console output focused on essential operations. For production environments requiring detailed logging, consider integrating structured logging frameworks.
Viewing Basic Output
```bash
# Docker output
docker logs <container-name>

# Follow output in real-time
docker logs -f <container-name>

# Output with timestamps
docker logs -t <container-name>
```
Understanding Service Output
The service provides minimal, essential output:
```text
Starting dummy-responder on port 8080
Configuration loaded successfully
Server running at http://localhost:8080
```
What You'll See:
- Service startup messages
- Configuration loading status
- Critical errors (if any)
- Graceful shutdown notifications
For Debugging:
- Use the `/config` endpoint to inspect the current configuration
- Use the `/health` endpoint for service status
- Check the interactive test page at `/test` for WebSocket/SSE testing
Health Monitoring
```bash
# Basic health check
curl http://localhost:8080/health

# Configuration inspection
curl http://localhost:8080/config

# Using the health check in monitoring
while true; do
  if curl -f http://localhost:8080/health > /dev/null 2>&1; then
    echo "Service healthy"
  else
    echo "Service unhealthy"
  fi
  sleep 10
done
```
Troubleshooting
Common Issues
Service won't start on port 8080
```bash
# Check what's using the port
lsof -i :8080

# Use a different port (pass PORT into the container with -e)
docker run -e PORT=8090 -p 8090:8090 dummy-responder
```
WebSocket connections fail
```bash
# Check if WebSocket endpoint is reachable
curl -i -N \
  -H "Connection: Upgrade" \
  -H "Upgrade: websocket" \
  -H "Sec-WebSocket-Key: test" \
  -H "Sec-WebSocket-Version: 13" \
  http://localhost:8080/ws
```
SSE stream doesn't work
```bash
# Test SSE endpoint directly
curl -N -H "Accept: text/event-stream" http://localhost:8080/events

# Check if a proxy/firewall is interfering
curl -H "Cache-Control: no-cache" http://localhost:8080/events
```
High memory usage
```bash
# Check resource usage
docker stats <container-name>

# Restart with resource limits
docker run -p 8080:8080 --memory=128m --cpus=0.5 dummy-responder
```
Getting Help
- Check the logs for error messages
- Test with curl to isolate issues
- Verify connectivity with health checks
- Check the documentation at `/swagger/`
- Report issues on GitHub with:
  - Version information
  - Configuration used
  - Error messages
  - Steps to reproduce
Debug Mode
Enable debug logging for troubleshooting:
```bash
# Docker
docker run -e LOG_LEVEL=DEBUG -p 8080:8080 dummy-responder

# Binary
LOG_LEVEL=DEBUG ./dummy-responder
```

```yaml
# Docker Compose
environment:
  - LOG_LEVEL=DEBUG
```
Best Practices
Performance Testing
- Start with low concurrency and gradually increase
- Monitor both the service and your application
- Use realistic delays and failure rates
- Test timeout handling scenarios
Development Workflow
- Use Docker Compose for consistent environments
- Include health checks in your test scripts
- Test both success and failure scenarios
- Document your test scenarios
Production Considerations
- Don't use in production environments
- Use resource limits in container environments
- Monitor resource usage during extended testing
- Clean up containers after testing
Examples and Scripts
All examples and test scripts are available in the `examples/` directory:

- `curl-examples.sh` - Collection of curl commands
- `test-ws.html` - WebSocket testing page
- `test-sse.html` - Server-Sent Events testing page
- `load-test.js` - k6 load testing script
- `ci-integration.md` - CI/CD integration examples
What's Next?
- Explore the API documentation at `http://api.responder.bjoernbartels.tech/swagger/`
- Try the interactive examples in your browser
- Integrate with your testing pipeline
- Check the developer guide for advanced usage
- Contribute improvements via GitHub
Have questions? Check our FAQ or create an issue for support.