We read every Flux (1 / 2 / Kontext) prompt guide so you don't have to.
Paste your prompt below - we'll rewrite it using Black Forest Labs's official best practices.
- Built on Black Forest Labs's official prompting guide
- Handles text rendering, Kontext natural-language editing, Fill inpainting, and Canny edge control automatically
- Free, instant, no signup
What Flux actually rewards
We pulled this from Black Forest Labs's official guidance and what works in production. The short version:
- Lead with the subject.
- Use comma separation between distinct concepts.
- Specify lighting explicitly (biggest impact).
- Describe foreground/midground/background hierarchically for layered scenes.
- Use Flux Kontext for image editing with natural language.
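To make those patterns concrete, here is a hypothetical before/after rewrite. The scene and wording are illustrative examples, not taken from Black Forest Labs's documentation:

```
Before: masterpiece, 8k, ultra-detailed, lighthouse, storm, dramatic

After:  A weathered stone lighthouse on a rocky headland, golden-hour
        sunlight breaking through storm clouds. Foreground: wet black
        rocks and sea spray. Midground: the lighthouse and a small
        keeper's cottage. Background: churning grey ocean under a
        bruised sky.
```

The rewrite leads with the subject, separates distinct concepts with commas, names the lighting explicitly, and layers the scene front to back.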
Before you hit send, check:
- ☐ Did you lead with the subject?
- ☐ Did you use comma separation between distinct concepts?
- ☐ Did you specify lighting explicitly (biggest impact)?
- ☐ Did you describe foreground/midground/background hierarchically for layered scenes?
- ☐ Did you use Flux Kontext for image editing with natural language?
Common mistakes we fix automatically
- Avoid keyword-stacking.
- Avoid SD-style quality boosters ('ultra-detailed, 8k, masterpiece'); they waste tokens.
Ready to rewrite for Flux?
Frequently asked questions
- Which versions of Flux (1 / 2 / Kontext) does this support?
- We support flux-1-schnell, flux-1-dev, flux-1-pro, flux-1-ultra, flux-2-klein, flux-2-max, flux-2-pro, flux-2-flex, flux-2-dev, flux-kontext. We apply the prompt patterns Black Forest Labs recommends for each, so the rewrite is tuned to the version you're using.
- Is my prompt stored or used for training?
- No. Prompts are sent to the rewriter, scored, returned, and discarded. We don't train on them and we don't keep them around.
- Do I need to know prompt engineering to use this?
- Nope. That's the point. Paste what you have, click Rewrite, get back a version that follows Black Forest Labs's official guidance.
- What makes this different from Flux (1 / 2 / Kontext)'s own "improve prompt" feature?
- Built-in optimizers use the model's own preferences. Ours is built on Black Forest Labs's official documentation and patterns that consistently produce better results in production. We keep rewrites inside the length window Flux (1 / 2 / Kontext) responds best to.
Optimizing for a different AI?