ArshErgon / aws-crawler

Crawl websites and upload crawled data to S3


aws-crawler

A CLI program in Rust that crawls a website using the `spider` crate and uploads the crawled data to an S3 bucket using the `aws-sdk-s3` crate.
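To upload crawled pages, each page URL has to be mapped to an S3 object key. Below is a minimal sketch of one way such a mapping could look; the helper name `url_to_s3_key` and the key scheme are illustrative assumptions, not code taken from this repository:

```rust
/// Convert a crawled page URL into an S3 object key by stripping the
/// scheme and replacing characters that are awkward in object keys.
/// (Hypothetical helper for illustration; not this repo's actual code.)
fn url_to_s3_key(url: &str) -> String {
    let stripped = url
        .trim_start_matches("https://")
        .trim_start_matches("http://")
        .trim_end_matches('/');
    let key: String = stripped
        .chars()
        .map(|c| match c {
            // Replace path and query separators with underscores.
            '/' | '?' | '&' | '=' | '#' => '_',
            other => other,
        })
        .collect();
    format!("{key}.html")
}

fn main() {
    // "https://example.com/docs/page" becomes "example.com_docs_page.html".
    println!("{}", url_to_s3_key("https://example.com/docs/page"));
}
```

A flat scheme like this keeps one object per crawled page and makes the source URL recoverable from the key at a glance.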

How-to

To use aws-crawler:

```shell
cargo run <website_to_crawl> <bucket_name> <region>
```

Double-check that your AWS credentials are set up correctly and that you have permission to create and write to an S3 bucket in the specified region.
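One common way to supply credentials is through the standard AWS environment variables, which the AWS SDK for Rust picks up automatically (the values below are placeholders; substitute your own key pair and region):

```shell
# Standard AWS credential environment variables (placeholder values).
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_REGION="us-east-1"

# Then run the crawler, e.g.:
# cargo run https://example.com my-bucket us-east-1
```

Credentials configured via `aws configure` (in `~/.aws/credentials`) work as well.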
