SEO Article Spec Builder

Paste competitor URLs that rank for your target keyword, and this tool extracts heading patterns, word counts, entities, and intent — then builds two copy-ready prompts for Claude: one for writing the article from scratch, one for patching an existing draft to spec. No AI calls. No paid APIs. Every output is derived from the pages you provide.

Use top organic results for your keyword. Avoid forums, YouTube, and social pages.

Frequently Asked Questions

Why does an article spec improve search rankings?

A spec forces your content to match what already ranks — correct heading structure, topical depth, and entity coverage. Without it, writers guess at structure and miss the signals search engines use to evaluate relevance and authority. Briefs produced from real competitor data consistently outperform those written from scratch.

Why do I need to provide competitor URLs?

Competitor pages that already rank contain the ground truth for a keyword — heading patterns, approximate length, and entity coverage. This tool extracts those patterns without a paid SERP API by fetching the pages you specify directly. You know your vertical, so you pick the most relevant pages.

How is search intent detected without a paid API?

Intent is inferred from two signals: keyword modifier words such as how, best, buy, and شرح (Arabic for "explanation"), and the title patterns of the competitor pages you provide. The rules-based engine scores each intent type and returns the highest-confidence classification.
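A minimal sketch of this kind of rules-based scoring, assuming hypothetical modifier lists and weights (the tool's actual rules, intent labels, and weighting are not published):

```python
import re
from collections import Counter

# Hypothetical modifier lists -- illustrative only, not the tool's real rules.
INTENT_MODIFIERS = {
    "informational": ["how", "what", "why", "guide", "tutorial"],
    "commercial": ["best", "top", "review", "vs", "compare"],
    "transactional": ["buy", "price", "discount", "cheap", "deal"],
}

def classify_intent(keyword: str, titles: list[str]) -> tuple[str, float]:
    """Score each intent from keyword modifiers and competitor titles,
    then return the top intent with a 0-1 confidence share."""
    scores = Counter()
    kw_words = set(re.findall(r"\w+", keyword.lower()))
    title_words = Counter(
        w for t in titles for w in re.findall(r"\w+", t.lower())
    )
    for intent, modifiers in INTENT_MODIFIERS.items():
        for m in modifiers:
            if m in kw_words:
                scores[intent] += 2           # keyword modifiers weigh more
            scores[intent] += title_words[m]  # each title occurrence adds one
    total = sum(scores.values())
    if total == 0:
        return "informational", 0.0           # default when nothing matches
    intent, top = scores.most_common(1)[0]
    return intent, top / total
```

For example, `classify_intent("best running shoes", ["Top 10 Running Shoes Reviewed", "Best Shoes to Buy in 2025"])` classifies as commercial, since "best" appears in the keyword and "top"/"best" appear in titles. Note that `\w+` matches Unicode word characters in Python, so non-Latin modifiers like شرح work without special handling.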

What are the Writing and Review prompts for?

The Writing Prompt contains everything Claude needs to write the article to spec from scratch — structure, entities, FAQs, tone rules, and formatting requirements. The Review Prompt instructs Claude to patch an existing draft to meet spec, returning only changed sections rather than rewriting the entire piece.

Is it safe to enter competitor URLs?

Yes. The server fetches pages using a standard browser user-agent. Competitor sites see a normal HTTP request; none of your credentials, cookies, or other identifying information is sent.

What SSRF protection is in place?

Only http and https schemes are permitted. The server resolves each hostname and rejects requests to private IP ranges (10.x, 192.168.x, 172.16–31.x, 127.x, ::1) and localhost before any network request is made.
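The check described above can be sketched as follows. This is an illustrative implementation, not the server's actual code; `ipaddress.is_private` in Python already covers the 10.x, 172.16–31.x, 192.168.x, 127.x, and ::1 ranges listed, and the loopback check is kept explicit for clarity:

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Allow only http/https URLs whose hostname resolves exclusively
    to public addresses. Returns False for private/loopback targets."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        return False                      # reject ftp:, file:, etc.
    host = parsed.hostname
    if not host:
        return False
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False                      # unresolvable hostname
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback:
            return False                  # 10.x, 172.16-31.x, 192.168.x,
                                          # 127.x, ::1, link-local, etc.
    return True
```

Resolving the hostname first (rather than string-matching on the URL) is what blocks tricks like a public domain whose DNS record points at 127.0.0.1.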

What are the limitations of this tool?

The tool cannot parse JavaScript-rendered pages — only static HTML is processed. Sites behind login walls, CAPTCHA, or aggressive bot protection will fail to fetch. It has no access to search volume or ranking-difficulty data. You need at least one URL to run, but 3–5 give significantly better heading-frequency data.

How long are competitor pages cached?

Each URL is cached server-side for 6 hours, keyed by the MD5 hash of the URL. This reduces repeated requests to competitor sites and speeds up re-runs on the same URL list.