Dyer is designed for reliable, flexible, and fast web crawling, providing high-level, comprehensive features without compromising speed.
Updated Jun 15, 2025 - Rust
Spider ported to Python
Rust Web Crawler saving pages on Redis
A small library for building fast and highly customizable web crawlers
A simple trap for web crawlers
LinkCollector is a web crawler that recursively collects links from a given host
🌊 ~ seaward is a grep-like tool for the web.
A web crawler using Rust.
Crawls websites recursively with high performance, seeding from a DB and storing results into an index. Written in Rust.
Web Crawler for internet graph data
A CLI tool for inspecting and analyzing web links.
A high-performance web content downloader and localizer built with Rust. Leverages Rust's powerful concurrency to efficiently batch download web pages and save them as local files.
Multi-threaded Web crawler with support for custom fetching and persisting logic
🕷️ Crawls websites for URLs and stores them in a text file.
Simple binary for recursively crawling a webpage while searching for a keyword. Multiple pages are crawled efficiently and concurrently.
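Most of the crawlers listed above share the same core loop: keep a frontier of unvisited URLs, fetch a page, extract its links, and enqueue any links not yet seen. A minimal sketch of that loop in Rust, using an in-memory map of pages in place of real HTTP fetches (the `crawl` function and the example URLs are illustrative; a real crawler would fetch and parse pages, e.g. with the `reqwest` and `scraper` crates):

```rust
use std::collections::{HashMap, HashSet, VecDeque};

// Breadth-first crawl over an in-memory "site": each URL maps to the
// links found on that page. Returns the URLs in the order visited.
fn crawl(site: &HashMap<&str, Vec<&str>>, start: &str) -> Vec<String> {
    let mut visited: HashSet<String> = HashSet::new();
    let mut frontier: VecDeque<String> = VecDeque::new();
    let mut order: Vec<String> = Vec::new();

    visited.insert(start.to_string());
    frontier.push_back(start.to_string());

    while let Some(url) = frontier.pop_front() {
        order.push(url.clone());
        // Enqueue only links we have not seen before; `insert` returns
        // false for duplicates, which prevents infinite loops on cycles.
        for link in site.get(url.as_str()).into_iter().flatten() {
            if visited.insert(link.to_string()) {
                frontier.push_back(link.to_string());
            }
        }
    }
    order
}

fn main() {
    // Hypothetical three-page site with a cycle (a -> b, b links nowhere).
    let mut site: HashMap<&str, Vec<&str>> = HashMap::new();
    site.insert("https://example.com/", vec!["https://example.com/a", "https://example.com/b"]);
    site.insert("https://example.com/a", vec!["https://example.com/b"]);
    site.insert("https://example.com/b", vec![]);

    let pages = crawl(&site, "https://example.com/");
    println!("visited {} pages: {:?}", pages.len(), pages);
}
```

The concurrent variants above typically run this same loop with the frontier shared across worker tasks (e.g. behind a channel or a mutex), but the visited-set check stays the deduplication point.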