No AI dependency

A locked architectural rule: Quay must work fully without any AI or LLM call. Any AI feature is opt-in, off by default, and has a deterministic non-AI fallback that’s complete on its own.

Three reasons:

  1. Trust. A database client is a trust surface — when you ask it to run a query you expect deterministic, auditable behaviour. “Quay decided to add a JOIN because the AI thought you needed it” is the kind of thing that breaks a tool’s trust permanently.
  2. Cost. Quay is a $15-30/mo product. Routing every EXPLAIN through OpenAI’s pricing tier would either kill the margin or force a price hike that compromises the value prop.
  3. Privacy. Some users connect to databases that the company’s security policy forbids exposing to a third-party API. Default-on AI breaks that without warning.

There’s a single switch in Settings → Pro Plus → AI assistant: “AI features enabled”. It’s off by default. Until it’s on, no AI surface anywhere in the app makes a network call to any AI provider — every AI button is hidden, every AI prompt is greyed out, every AI panel says “Enable AI to use this”.
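The gating rule can be sketched in a few lines. This is a hypothetical illustration, not Quay's actual code — `AISettings`, `explain`, and the fallback are invented names — but it shows the shape of the rule: the deterministic path is the default, and the AI path is unreachable until the user flips the switch.

```python
from dataclasses import dataclass


@dataclass
class AISettings:
    enabled: bool = False  # the single switch; off by default


def rule_based_explain(plan: dict) -> str:
    # Deterministic fallback, complete on its own (heavily simplified here).
    if plan.get("Node Type") == "Seq Scan":
        return "Seq Scan detected"
    return "No obvious issues"


def ai_explain(plan: dict) -> str:
    # Only reachable when the user has opted in.
    raise RuntimeError("would make a network call to an AI provider")


def explain(settings: AISettings, plan: dict) -> str:
    # The gate: until the switch is on, no AI surface makes a network call.
    if not settings.enabled:
        return rule_based_explain(plan)
    return ai_explain(plan)


print(explain(AISettings(), {"Node Type": "Seq Scan"}))  # prints "Seq Scan detected"
```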

When you enable it, you set:

  • Provider — Anthropic (Claude) or OpenAI
  • API key — your own; stored locally at ~/.config/quay/ai-settings.json, never synced, never sent to Quay’s servers
  • Model — defaults to a small fast one (claude-haiku-4-5 / gpt-4o-mini); free-form override
  • Schema-aware prompts — separate toggle, off by default. When off, Quay never sends your table/column names with the prompt; the AI gets the abstract question only.
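Putting the settings above together, the local file might look something like this. The exact field names are illustrative, not Quay's documented schema — only the path `~/.config/quay/ai-settings.json` and the four settings come from the text:

```json
{
  "enabled": true,
  "provider": "anthropic",
  "api_key": "…",
  "model": "claude-haiku-4-5",
  "schema_aware_prompts": false
}
```

Note that `schema_aware_prompts` is its own toggle: enabling AI does not imply sending table or column names.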

Every AI feature in the app has a deterministic non-AI implementation that’s:

  • The default (the AI version is opt-in)
  • Documented as a separate feature in its own right
  • Not crippled to make the AI look better
| AI feature | Non-AI fallback |
| --- | --- |
| AI EXPLAIN (“why is this query slow?”) | Rule-based EXPLAIN interpreter — parses the plan, surfaces seq scans on big tables, missing indexes, sort spilling, nested-loop blowups. See EXPLAIN tips |
| AI generate (“write me an UPDATE for…”) | Snippet expansion + autocomplete in the SQL editor |
| AI explain (column / table / migration) | Schema browser + inline comments column |
| AI suggestion (next migration) | Diff between two connections via Schema diff + the migration runner |
| AI translate query (PG → MySQL) | Cross-DB sync + Quay’s dialect-aware import |

The rule keeps the rest of the team honest — when someone proposes “this’d be great with AI”, the next question is “what’s the non-AI version?”. If there isn’t one, the feature doesn’t ship as AI-only. The AI version is added to a working non-AI base, never the other way around.

Pro Plus tier:

  • AI EXPLAIN — hover-over on plan node, get a plain-language interpretation. Falls back to the rule-based interpreter when AI is off.
  • AI generate — ⌘K in the editor opens a small prompt; the result is inserted as text, not executed. You still hit ⌘↵ to run.
  • AI explain table / column — right-click on a schema node, “Explain”. Reads from comments + recent query patterns; AI enabled or not, it’s surfaced as a help panel.

That’s it. We deliberately haven’t built “AI runs the migration”, “AI auto-fixes the schema”, or “AI watches your query and corrects it” — all of those are paths where confidence drops below the North Star.

Vector similarity search against Qdrant / Weaviate / Pinecone is a query feature, not an AI feature — Quay sends a vector you supply (or paste), the engine returns top-K. There’s no LLM in the loop on Quay’s side. See Vector DBs.
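To make the distinction concrete: a top-K similarity query is plain math plus an engine round-trip, with no model in the loop. A minimal local sketch (invented function names; real engines do this server-side over an index rather than brute-force):

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def top_k(query: list[float], rows: list[tuple[str, list[float]]], k: int = 3):
    """Rank stored (id, vector) rows by similarity to a user-supplied query vector."""
    return sorted(rows, key=lambda r: cosine(query, r[1]), reverse=True)[:k]
```

The query vector is supplied (or pasted) by the user; nothing here requires, or calls, an LLM.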

Enforcement works the same as zero-deps: at code review. A PR that introduces a network call to an AI provider needs to have the non-AI fallback in the same PR, with the AI path gated behind the user’s enable toggle. “AI-only feature” is a phrase that doesn’t ship.