Documentation for Jetty

Testing and Simulation

Jetty includes built-in tools for testing APIs and simulating failure conditions without writing backend code. Use these features to mock endpoints, replay captured traffic, inject latency or errors, and return templated responses -- all from the dashboard or CLI.


Virtual Endpoints

Virtual endpoints let you define mock HTTP responses directly in Jetty, without running any local server. Requests to a virtual endpoint are handled entirely at the edge -- your machine does not need to be online.

When to use virtual endpoints

  • Prototype an API contract before writing backend code
  • Provide a stub for a service a teammate depends on
  • Return a fixed response to a webhook sender while you iterate on handler logic
  • Serve a health-check or status page without a running app

Creating a virtual endpoint

  1. Open Bridge and navigate to the tunnel you want to configure.
  2. Click Virtual Endpoints in the tunnel detail view.
  3. Click Add Endpoint.
  4. Configure the endpoint:
     Field         Description
     Method        HTTP method to match (GET, POST, PUT, DELETE, or ANY)
     Path          URL path to match (e.g. /api/status, /webhooks/ack)
     Status code   HTTP status code to return (default: 200)
     Headers       Response headers as key-value pairs
     Body          Response body (plain text, JSON, HTML, or XML)
  5. Click Save.
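To make the matching fields concrete, here is a minimal sketch (illustrative only, not Jetty's implementation) of how an endpoint definition could be checked against an incoming request, assuming ANY matches every method:

```python
from dataclasses import dataclass

@dataclass
class VirtualEndpoint:
    method: str       # "GET", "POST", ... or "ANY"
    path: str         # exact path to match, e.g. "/api/status"
    status: int = 200
    body: str = ""

def matches(ep: VirtualEndpoint, method: str, path: str) -> bool:
    """Return True when this endpoint handles the request."""
    method_ok = ep.method == "ANY" or ep.method == method.upper()
    return method_ok and ep.path == path

status_ep = VirtualEndpoint("GET", "/api/status", 200, '{"status": "ok"}')
print(matches(status_ep, "GET", "/api/status"))   # True
print(matches(status_ep, "POST", "/api/status"))  # False
```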

Example: mock a status endpoint

  Setting   Value
  Method    GET
  Path      /api/status
  Status    200
  Headers   Content-Type: application/json
  Body      {"status": "ok", "version": "1.0.0"}

Requests to https://your-tunnel.tunnels.usejetty.online/api/status now return the mock response, even if no local server is running.

CLI usage

Create virtual endpoints from the command line:

jetty mock /api/status --status=200 --body='{"status":"ok"}' --header="Content-Type: application/json"

List active mocks:

jetty mock --list

Remove a mock:

jetty mock /api/status --delete

Priority

Virtual endpoints take precedence over tunnel forwarding. If a virtual endpoint matches a request, the mock response is returned without forwarding to your local app. Remove the virtual endpoint to resume normal forwarding for that path.
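The precedence rule can be sketched as a simple routing decision. This is a hypothetical illustration of the behavior described above, not Jetty's actual code:

```python
# Sketch of mock-before-forward routing: if any virtual endpoint
# matches, the request never reaches the local upstream.
def route(request, mocks, forward):
    for ep in mocks:
        if (ep["method"] in ("ANY", request["method"])
                and ep["path"] == request["path"]):
            return {"status": ep["status"], "body": ep["body"], "mocked": True}
    return forward(request)  # normal tunnel forwarding

mocks = [{"method": "GET", "path": "/api/status",
          "status": 200, "body": '{"status": "ok"}'}]
upstream = lambda req: {"status": 200, "body": "from local app", "mocked": False}

print(route({"method": "GET", "path": "/api/status"}, mocks, upstream)["mocked"])  # True
print(route({"method": "GET", "path": "/other"}, mocks, upstream)["mocked"])       # False
```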


Record/Replay Mode

Record mode captures real API traffic flowing through your tunnel and stores it for later replay. This lets you develop against real data without hitting live services repeatedly.

Recording traffic

Enable recording on a tunnel:

  1. Open the tunnel in Bridge.
  2. Toggle Record Mode on.
  3. Send traffic through the tunnel normally (from a browser, webhook, or API client).
  4. Captured request/response pairs appear in the Recordings tab.

Or from the CLI:

jetty share 3000 --record

What gets recorded

Each recording captures:

  • Request method, path, query string, and headers
  • Request body
  • Response status code, headers, and body
  • Timestamp and round-trip duration

Recordings are stored per-tunnel, and sensitive headers are stripped according to your configured redaction tier.
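A sketch of what a captured request/response pair might hold, with sensitive headers redacted before storage. The header list here is an assumption for illustration, not Jetty's actual redaction-tier configuration:

```python
# Illustrative redaction sketch: strip a (hypothetical) set of
# sensitive header names from a recording before it is stored.
SENSITIVE = {"authorization", "cookie", "x-api-key"}

def redact(headers: dict) -> dict:
    return {k: ("[redacted]" if k.lower() in SENSITIVE else v)
            for k, v in headers.items()}

rec = {
    "method": "POST",
    "path": "/webhooks/github",
    "headers": redact({"Authorization": "Bearer s3cr3t",
                       "Content-Type": "application/json"}),
    "status": 200,
    "duration_ms": 42,
}
print(rec["headers"]["Authorization"])  # [redacted]
```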

Replaying recordings

Replay a recorded session to re-send the same requests to your local app:

jetty replay --session=latest

Or replay a specific recording by ID:

jetty replay --recording=abc123

Replay sends the original request to your local upstream and displays the new response alongside the recorded response, making it easy to spot regressions.
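The side-by-side comparison amounts to diffing the recorded response against the fresh one. A minimal sketch of that idea (hypothetical helper, not part of the jetty CLI):

```python
# Regression-spotting sketch: list the fields that differ between
# the recorded response and the one returned on replay.
def diff_responses(recorded: dict, new: dict) -> list:
    changes = []
    for key in ("status", "body"):
        if recorded.get(key) != new.get(key):
            changes.append((key, recorded.get(key), new.get(key)))
    return changes

recorded = {"status": 200, "body": '{"ok": true}'}
fresh = {"status": 500, "body": "Internal Server Error"}
for name, old, now in diff_responses(recorded, fresh):
    print(f"{name}: {old!r} -> {now!r}")
```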

Use cases

  • Offline development: Record webhook payloads from Stripe or GitHub, then replay them while developing without internet access.
  • Regression testing: Record a sequence of API calls, make code changes, replay, and compare responses.
  • Demo preparation: Capture a working flow and replay it during a live demo without depending on external services.

Latency and Error Injection

Inject artificial latency, errors, or connection failures into tunnel traffic to test how your application handles degraded conditions.

Configuring injection rules

  1. Open the tunnel in Bridge.
  2. Navigate to Fault Injection in the tunnel settings.
  3. Add a rule:
     Field            Description
     Path pattern     Which requests to affect (e.g. /api/*, * for all)
     Latency (ms)     Additional delay before forwarding (0 for none)
     Error rate (%)   Percentage of requests to fail (0--100)
     Error status     HTTP status code to return for failed requests (default: 500)
     Error body       Optional response body for injected errors
  5. Click Save.
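How a rule like this could be applied to a request can be sketched as follows. This is an assumption-laden illustration of the described behavior, not Jetty's implementation:

```python
import fnmatch
import random

# Hypothetical sketch: given a fault-injection rule, decide whether to
# inject an error, add latency, or leave the request alone.
def apply_rule(rule, path, rng=random.random):
    if not fnmatch.fnmatch(path, rule["pattern"]):
        return None                                 # rule does not apply
    if rng() * 100 < rule["error_rate"]:
        return {"status": rule["error_status"], "injected": True}
    return {"latency_ms": rule["latency_ms"]}       # delay, then forward

rule = {"pattern": "/api/*", "latency_ms": 2000,
        "error_rate": 100, "error_status": 503}
print(apply_rule(rule, "/api/payments"))  # always injects: error_rate is 100
print(apply_rule(rule, "/health"))        # None: pattern does not match
```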

CLI usage

Add latency to all requests:

jetty share 3000 --latency=500

Inject errors on 10% of requests:

jetty share 3000 --error-rate=10 --error-status=503

Target specific paths:

jetty share 3000 --fault="/api/payments*:latency=2000,error-rate=5"

Examples

Simulate a slow database:

Add 2 seconds of latency to API endpoints:

  Path     Latency   Error rate
  /api/*   2000ms    0%

Test timeout handling:

Add 30 seconds of latency to trigger client-side timeouts:

  Path                 Latency   Error rate
  /api/slow-endpoint   30000ms   0%

Simulate intermittent failures:

Return 503 on 20% of requests:

  Path   Latency   Error rate   Error status
  *      0ms       20%          503

Simulate a complete outage:

Return 500 on all requests:

  Path   Latency   Error rate   Error status
  *      0ms       100%         500

Tips

  • Injection rules are applied before forwarding to your local app. Requests that receive an injected error are never sent upstream.
  • Latency is additive -- it adds to the natural response time of your local app.
  • Use the traffic inspector to verify that injection is working as expected. Injected responses are tagged in the request log.
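Fault injection is most useful when your client code actually exercises the failure path. A self-contained sketch of a retry-with-backoff client, with the flaky upstream simulated in-process rather than calling a real tunnel (all names here are illustrative):

```python
import time

def call_with_retry(fn, attempts=3, backoff_s=0.0):
    """Retry a callable that may raise, with exponential backoff."""
    last = None
    for i in range(attempts):
        try:
            return fn()
        except RuntimeError as exc:
            last = exc
            time.sleep(backoff_s * (2 ** i))
    raise last

# Simulated upstream: fails twice (as injected 503s might), then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("503 Service Unavailable")
    return "ok"

print(call_with_retry(flaky))  # ok
```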

Response Templates

Response templates let you define dynamic mock responses using variables and simple logic. Templates are used with virtual endpoints to return responses that vary based on the incoming request.

Template variables

Use double-brace syntax in your response body:

  Variable                  Description
  {{request.method}}        HTTP method (GET, POST, etc.)
  {{request.path}}          Request path
  {{request.query.name}}    Query parameter by name
  {{request.header.name}}   Request header by name (case-insensitive)
  {{request.body}}          Raw request body
  {{timestamp}}             Current ISO 8601 timestamp
  {{uuid}}                  Random UUID
  {{random.int(min,max)}}   Random integer in range
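A minimal sketch of {{...}}-style substitution for a subset of these variables. The names follow the table above, but the real Jetty template engine may handle lookups and escaping differently:

```python
import re
import uuid
from datetime import datetime, timezone

# Illustrative renderer: substitute a few {{...}} variables from a
# request dict; unknown variables are left untouched.
def render(template: str, request: dict) -> str:
    def resolve(match):
        name = match.group(1).strip()
        if name == "timestamp":
            return datetime.now(timezone.utc).isoformat()
        if name == "uuid":
            return str(uuid.uuid4())
        if name == "request.method":
            return request.get("method", "")
        if name == "request.path":
            return request.get("path", "")
        if name.startswith("request.query."):
            return request.get("query", {}).get(name.split(".", 2)[2], "")
        return match.group(0)  # unknown variable: leave as-is
    return re.sub(r"\{\{(.*?)\}\}", resolve, template)

req = {"method": "GET", "path": "/api/users", "query": {"id": "42"}}
print(render('{"id": {{request.query.id}}, "path": "{{request.path}}"}', req))
# {"id": 42, "path": "/api/users"}
```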

Example: echo endpoint

Return a JSON response that mirrors the incoming request:

{
  "received_at": "{{timestamp}}",
  "method": "{{request.method}}",
  "path": "{{request.path}}",
  "id": "{{uuid}}"
}

Example: webhook acknowledgment

Return a provider-specific acknowledgment:

{
  "status": "received",
  "event_id": "{{request.header.X-GitHub-Delivery}}",
  "processed_at": "{{timestamp}}"
}

Example: mock user API

Return different data based on the requested user ID:

{
  "id": {{request.query.id}},
  "name": "Test User {{request.query.id}}",
  "email": "user{{request.query.id}}@example.com",
  "created_at": "{{timestamp}}"
}

A request to /api/users?id=42 returns:

{
  "id": 42,
  "name": "Test User 42",
  "email": "user42@example.com",
  "created_at": "2026-04-04T12:00:00Z"
}

Using templates with the CLI

jetty mock /api/echo --body='{"method":"{{request.method}}","ts":"{{timestamp}}"}'

Send feedback

Found an issue or have a suggestion? Let us know.