
Prompt Lab

Craft better prompts.

A multi-provider prompt engineering workbench. Enhance, A/B test, and compose prompts across Anthropic, OpenAI, Gemini, OpenRouter, and Ollama — on the web or as a Chrome extension.

Open Web App
Install Extension

Works with your stack

Anthropic
OpenAI
Google Gemini
OpenRouter
Ollama
// capabilities

Everything you need to engineer prompts

One-Click Enhance

Refine any prompt instantly using your active provider. Prompt Lab analyzes structure, specificity, and clarity — then rewrites it better.

A/B Testing

Run the same prompt against two models side-by-side. Compare output quality, tone, and completeness across providers in real time.
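Mechanically, an A/B run is one payload dispatched to two providers at once. A minimal sketch, where runAB and the stub providers are hypothetical illustrations rather than Prompt Lab's actual code:

```javascript
// Hypothetical helper: send the same prompt to two provider call functions
// concurrently and pair the results for side-by-side display.
async function runAB(prompt, callA, callB) {
  const [a, b] = await Promise.all([callA(prompt), callB(prompt)]);
  return { a, b };
}

// Stub providers standing in for real API calls:
const stub = (name) => async (p) => ({ provider: name, text: `${name}: ${p}` });

runAB('Summarize this README.', stub('anthropic'), stub('openai'))
  .then(({ a, b }) => console.log(a.provider, '|', b.provider));
```

Because both calls start before either resolves, the comparison costs only as much wall-clock time as the slower provider.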

Prompt Composer

Build complex system prompts with modular blocks. Chain instructions, personas, and constraints into structured templates.
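Under the hood, composition can be as simple as ordered blocks joined into one string. A sketch, assuming a hypothetical { title, body, enabled } block shape rather than Prompt Lab's real data model:

```javascript
// Hypothetical composer core: each block is { title, body, enabled }.
// Disabled blocks are skipped; the rest are joined in declaration order.
function composePrompt(blocks) {
  return blocks
    .filter((b) => b.enabled !== false)
    .map((b) => `## ${b.title}\n${b.body}`)
    .join('\n\n');
}

const system = composePrompt([
  { title: 'Persona', body: 'You are a careful technical editor.' },
  { title: 'Constraints', body: 'Answer in under 100 words.', enabled: true },
  { title: 'Debug notes', body: 'verbose', enabled: false }
]);
// "Debug notes" is dropped; Persona and Constraints remain, in order.
```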

Prompt Library

Save, tag, and organize your best prompts. Import and export your collection as JSON for portability across machines.
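The export shape below is illustrative (the real schema may differ), but the portability claim comes down to a tagged collection that round-trips losslessly through JSON:

```javascript
// Illustrative library entry; field names are assumptions, not the real schema.
const library = [
  {
    id: 1,
    title: 'Bug report triage',
    tags: ['support', 'triage'],
    prompt: 'Classify this bug report by severity and component.'
  }
];

// Export: serialize to pretty-printed JSON. Import: parse it back.
const exported = JSON.stringify(library, null, 2);
const imported = JSON.parse(exported);
```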

Credential Isolation

API keys stay in your browser. On the web app, keys live in localStorage and route through a CORS proxy. The Chrome extension isolates them in a Manifest V3 service worker.
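On the web side, key storage amounts to namespaced localStorage entries. A sketch with assumed key names; a Map-backed shim stands in for window.localStorage so the snippet runs anywhere:

```javascript
const KEY_PREFIX = 'promptlab.apiKey.'; // hypothetical namespace

// Save and load a provider's key through any localStorage-shaped object.
function saveKey(storage, provider, apiKey) {
  storage.setItem(KEY_PREFIX + provider, apiKey);
}
function loadKey(storage, provider) {
  return storage.getItem(KEY_PREFIX + provider) ?? null;
}

// Map-backed stand-in for window.localStorage (same setItem/getItem contract):
const m = new Map();
const shim = {
  setItem: (k, v) => m.set(k, String(v)),
  getItem: (k) => (m.has(k) ? m.get(k) : null)
};
saveKey(shim, 'openai', 'sk-test');
```

The point of the contract is that nothing here ever serializes a key into a request body; keys only leave storage when attached to an outbound API call's headers.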

Provider Hot-Swap

Switch providers without losing context. Your library, settings, and experiment history persist across every provider change.

Use Anywhere

Open the web app at promptlab.tools/app in any browser. Or install the Chrome extension for a side panel that stays open while you browse — zero tab-switching.
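In Manifest V3 terms, the side panel and the key-isolating service worker come from a couple of manifest entries. A plausible fragment; the file names and permission list are assumptions, not the shipped manifest:

```json
{
  "manifest_version": 3,
  "name": "Prompt Lab",
  "permissions": ["sidePanel", "storage"],
  "host_permissions": [
    "https://api.anthropic.com/*",
    "https://api.openai.com/*"
  ],
  "side_panel": { "default_path": "sidepanel.html" },
  "background": { "service_worker": "background.js" }
}
```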

// under the hood

One panel, five providers, zero config drift

Every API call routes through a single service worker that translates your prompt into each provider's native format. Responses are normalized so the UI never needs to know which model answered.

background.js
// Anthropic — Messages API (native format)
async function callAnthropic(payload) {
  const res = await fetch('https://api.anthropic.com/v1/messages', {
    method: 'POST',
    headers: {
      'x-api-key': key,
      'anthropic-version': '2023-06-01',
      'content-type': 'application/json'
    },
    body: JSON.stringify(payload)
  });
  return normalize('anthropic', await res.json());
}
// OpenAI — chat completions format
async function callOpenAI(payload) {
  const msgs = toChatMessages(payload);
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${key}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ model, messages: msgs })
  });
  return normalize('openai', await res.json());
}
// Ollama — local, no auth required
async function callOllama(payload) {
  const msgs = toChatMessages(payload);
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, messages: msgs, stream: false })
  });
  return normalize('ollama', await res.json());
}
// get started

Two ways to get started

A

Use the web app

Open promptlab.tools/app — no install, no setup. Works in any modern browser.

B

Install the Chrome extension

Download the release, open chrome://extensions, enable Developer mode, and click "Load unpacked" to get side-panel access.

2

Choose your provider

Open Settings, pick a provider, enter your API key, and select a model from the dropdown — or type a slug for OpenRouter and Ollama.

3

Start experimenting

Type a prompt, hit Enhance. A/B test variants. Save to your library. Switch providers and compare. Iterate until it's perfect.

Ready to experiment?

Stop guessing at prompts. Start engineering them.

Open Web App
View Source