Raptor
Autonomous SEO audit engine with 114+ analyzers, health scoring, AI-driven analysis, and Cloudflare Browser Rendering. Built in Rust for speed and reliability.
Duration
Ongoing
Team Size
1 developer
Industry
SEO & Developer Tools
Client
Open Source
Project Results
114+ SEO analyzers across technical, on-page, and structured data checks
Fast full-site audits on most small business websites
JSON, HTML, CSV, and SVG report output
The Challenge
Most SEO audit tools are slow, expensive, and treat their analysis like a black box. They spit out generic recommendations that don’t account for context, miss JavaScript-rendered content entirely, and charge hundreds per month for the privilege. I needed a tool that could crawl a site fast, render SPAs properly, and produce structured data that an AI could actually reason about, not just flag keywords.
Our Solution
I built Raptor from scratch in Rust. It’s an async concurrent crawler with 114+ SEO analyzers covering everything from title tags and schema markup to security headers and content readability. It integrates with Cloudflare Browser Rendering to handle JavaScript-heavy sites that traditional crawlers miss entirely. The output is structured JSON designed to feed directly into Claude for intelligent analysis, or exposed as an MCP server for agentic workflows where the AI crawls, analyzes, and acts autonomously.
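To make the "structured data an AI can reason about" point concrete, here is a minimal sketch of what one finding might look like as JSON. The field names and the hand-rolled serialization are illustrative assumptions, not Raptor's actual schema; a real implementation would use serde.

```rust
// Illustrative shape of a single analyzer finding. Field names are
// hypothetical; the real schema belongs to Raptor.
struct Finding {
    analyzer: String,
    severity: String,
    url: String,
    message: String,
}

impl Finding {
    // Hand-rolled JSON to keep the sketch dependency-free; a real
    // engine would derive serde::Serialize instead.
    fn to_json(&self) -> String {
        format!(
            "{{\"analyzer\":\"{}\",\"severity\":\"{}\",\"url\":\"{}\",\"message\":\"{}\"}}",
            self.analyzer, self.severity, self.url, self.message
        )
    }
}

fn main() {
    let finding = Finding {
        analyzer: "title_tag".into(),
        severity: "warning".into(),
        url: "/pricing".into(),
        message: "Title tag missing".into(),
    };
    println!("{}", finding.to_json());
}
```

Because every finding carries the analyzer name, severity, and URL as distinct fields, an LLM (or any downstream tool) can group, filter, and prioritize issues instead of parsing free-form report text.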
Key Features
114+ SEO analyzers with health scoring
Cloudflare Browser Rendering for JavaScript-heavy sites
AI-driven analysis via Claude, plus an MCP server for agentic workflows
JSON, HTML, CSV, and SVG report output
Technologies Used
Rust, Tokio, Minijinja, Cloudflare Browser Rendering, Claude API, MCP, Python (spaCy)
Technical Implementation
Frontend
Self-contained HTML reports built with Minijinja templates compiled directly into the binary. Light theme CSS, interactive tabs for Overview, Issues, Pages, and Structure views. SVG site structure visualization. A mascot whose mood changes based on the audit score.
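The score-driven mascot can be sketched as a simple mapping from the overall audit score to a mood rendered in the report. The thresholds and names below are illustrative assumptions, not Raptor's actual values:

```rust
// Map an overall audit score (0-100) to a mascot mood shown in the
// HTML report. Thresholds here are hypothetical.
#[derive(Debug, PartialEq)]
enum MascotMood {
    Happy,   // healthy site
    Neutral, // needs some work
    Worried, // significant issues
}

fn mood_for_score(score: u8) -> MascotMood {
    match score {
        80..=100 => MascotMood::Happy,
        50..=79 => MascotMood::Neutral,
        _ => MascotMood::Worried,
    }
}

fn main() {
    println!("{:?}", mood_for_score(92)); // Happy
}
```

In a Minijinja template compiled into the binary, a value like this would be passed into the render context so the report stays a single self-contained HTML file.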
Backend
Core engine written in Rust using Tokio for async I/O. BFS crawl queue with configurable depth and page limits. Event-driven architecture using CrawlEvent channels that decouple the engine from any UI, making it embeddable as a library.
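The BFS queue and event-channel decoupling can be sketched with only the standard library. Apart from CrawlEvent, every name here is hypothetical, the link extraction is mocked, and the real engine runs async on Tokio against live pages:

```rust
// Minimal synchronous sketch of a BFS crawl loop that reports progress
// over a channel, decoupling the engine from any UI.
use std::collections::{HashSet, VecDeque};
use std::sync::mpsc;

#[derive(Debug)]
enum CrawlEvent {
    PageCrawled { url: String, depth: usize },
    Finished { pages: usize },
}

// `links_for` stands in for fetching a page and extracting its links.
fn crawl<F>(
    start: &str,
    max_depth: usize,
    max_pages: usize,
    links_for: F,
    tx: mpsc::Sender<CrawlEvent>,
) where
    F: Fn(&str) -> Vec<String>,
{
    let mut queue = VecDeque::from([(start.to_string(), 0usize)]);
    let mut seen: HashSet<String> = HashSet::from([start.to_string()]);
    let mut crawled = 0;

    while let Some((url, depth)) = queue.pop_front() {
        if crawled >= max_pages {
            break;
        }
        crawled += 1;
        let _ = tx.send(CrawlEvent::PageCrawled { url: url.clone(), depth });

        if depth < max_depth {
            for link in links_for(&url) {
                // HashSet::insert returns false for already-seen URLs,
                // so each page is enqueued at most once.
                if seen.insert(link.clone()) {
                    queue.push_back((link, depth + 1));
                }
            }
        }
    }
    let _ = tx.send(CrawlEvent::Finished { pages: crawled });
}

fn main() {
    let (tx, rx) = mpsc::channel();
    // Tiny in-memory "site": / links to /a and /b; /a links back to /.
    let links = |url: &str| match url {
        "/" => vec!["/a".into(), "/b".into()],
        "/a" => vec!["/".into()],
        _ => vec![],
    };
    crawl("/", 2, 100, links, tx);
    for event in rx {
        println!("{event:?}");
    }
}
```

Because the engine only ever writes events to the channel, the same crawl loop can drive a CLI progress bar, an HTML report builder, or an MCP tool call without any changes.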
Integrations
Cloudflare Browser Rendering for JavaScript-heavy sites. Claude API integration for intelligent analysis. MCP server for agentic workflows. Python companion tools for NER (spaCy), internal link graph visualization, and historical SERP tracking via Wayback Machine.
Performance
Rust’s zero-cost abstractions and Tokio’s async runtime deliver high-throughput crawling. Responsible by default: honors robots.txt, polite delays, exponential backoff, rate limiting. Library-first architecture means zero CLI coupling.
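The "polite by default" retry policy can be sketched as capped exponential backoff. The base delay and cap below are illustrative assumptions, not Raptor's configured values:

```rust
// Exponential backoff with a hard cap: wait base * 2^attempt between
// retries, never exceeding `cap`. Saturating arithmetic avoids overflow
// for large attempt counts.
use std::time::Duration;

fn backoff_delay(attempt: u32, base: Duration, cap: Duration) -> Duration {
    let exp = base.saturating_mul(2u32.saturating_pow(attempt));
    exp.min(cap)
}

fn main() {
    let base = Duration::from_millis(250);
    let cap = Duration::from_secs(30);
    for attempt in 0..6 {
        println!("attempt {attempt}: wait {:?}", backoff_delay(attempt, base, cap));
    }
}
```

In an async engine the computed delay would feed a `tokio::time::sleep` between retries, and a per-host rate limiter would enforce the polite-delay floor even when no errors occur.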