# Crawler.sh

> A fast, local-first desktop app and CLI for crawling websites, analyzing SEO issues, extracting content as Markdown, and exporting results in multiple formats.

Crawler.sh is a fast, local-first web crawler that runs entirely on your machine, with no cloud setup required. It ships as both a CLI tool and a native desktop app, enabling developers, SEO professionals, and content teams to crawl entire websites, extract readable content as clean Markdown, run automated SEO audits, and export results in standard formats such as JSON, Sitemap XML, and CSV. Designed for privacy and offline use, crawler.sh processes everything locally and outputs data in open, standard formats.

## Highlights

- **Site Crawling** — *Crawl any website while staying within its domain, with configurable concurrency, depth limits, and polite request delays; handles thousands of pages in seconds.*
- **Content Extraction** — *Automatically converts page HTML to clean Markdown, including word count, author byline, and excerpt for every page crawled.*
- **SEO Analysis** — *Runs 16 automated checks per page, detecting missing titles, duplicate meta descriptions, noindex directives, thin content, long URLs, and more; exports findings as CSV or TXT.*
- **Multiple Output Formats** — *Stream results as NDJSON during the crawl, or export to JSON arrays, W3C-compliant Sitemap XML, CSV, and human-readable TXT.*
- **CLI Tool** — *Install via a single shell command; use the `crawl`, `info`, `export`, and `seo` subcommands for full automation and scripting workflows.*
- **Desktop App** — *Native macOS app with an interactive visual dashboard featuring 8 cards: live feed, SEO issues panel, status charts, content browser, and export controls.*
- **Local-First & Privacy-Friendly** — *All crawling and analysis runs on your own machine with no data sent to external servers, making it suitable for sensitive or offline environments.*
- **Upcoming Cloud API** — *A hosted crawling API with scheduled crawls, webhooks, and a web dashboard is currently in development.*

## Features

- Concurrent website crawling with configurable depth and page limits
- Content extraction to clean Markdown with word count and author byline
- 16-category automated SEO analysis per page
- Export to NDJSON, JSON, Sitemap XML, CSV, and TXT
- CLI with `crawl`, `info`, `export`, and `seo` subcommands
- Native desktop app with 8 interactive dashboard cards
- Real-time crawl feed with status badges
- Local-first and offline-capable
- W3C-compliant Sitemap XML generation
- Broken link and HTTP status code detection

## Platforms

macOS, Web, API

## Pricing

Paid

## Links

- Website: https://crawler.sh
- Documentation: https://crawler.sh/docs
- EveryDev.ai: https://www.everydev.ai/tools/crawler-sh
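
## Example: Consuming the NDJSON Stream

Because results stream as NDJSON (one JSON object per line) during a crawl, the output is easy to post-process with ordinary tooling. The sketch below shows one way to pull broken pages out of such a stream in Python; the field names (`url`, `status`, `word_count`) are illustrative assumptions, not crawler.sh's documented schema.

```python
import json

# Hypothetical NDJSON lines as a crawl might stream them; the field names
# here are assumptions for illustration, not the tool's documented schema.
ndjson_stream = """\
{"url": "https://example.com/", "status": 200, "title": "Example Domain", "word_count": 28}
{"url": "https://example.com/about", "status": 404, "title": null, "word_count": 0}
"""

def broken_pages(stream: str) -> list[str]:
    """Parse an NDJSON crawl stream and return URLs with non-2xx statuses."""
    pages = [json.loads(line) for line in stream.splitlines() if line.strip()]
    return [p["url"] for p in pages if not 200 <= p["status"] < 300]

print(broken_pages(ndjson_stream))  # prints ['https://example.com/about']
```

The same pattern works line-by-line over a pipe, so a long crawl can be filtered while it is still running rather than after export.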