crawler

Easily crawl websites in Go.

A simple package for quickly building programs that need to crawl websites.

go get github.com/ernesto-jimenez/crawler

Usage

package crawler_test

import (
	"fmt"

	"github.com/ernesto-jimenez/crawler"
)

func Example() {
	startURL := "https://godoc.org"

	cr, err := crawler.New()
	if err != nil {
		panic(err)
	}

	// The callback is invoked once per fetched page. Returning
	// crawler.ErrSkipURL keeps the crawler from following the links found
	// on that page, which is why only the start URL appears in the output.
	err = cr.Crawl(startURL, func(url string, res *crawler.Response, err error) error {
		if err != nil {
			fmt.Printf("error: %s", err.Error())
			return nil
		}
		fmt.Printf("%s - Links: %d Assets: %d\n", url, len(res.Links), len(res.Assets))
		return crawler.ErrSkipURL
	})
	if err != nil {
		panic(err)
	}
	// Output:
	// https://godoc.org/ - Links: 39 Assets: 5
}
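
The same callback-based API can also drive a standalone program. The sketch below is a minimal, hypothetical example assuming only the calls shown above (crawler.New, Crawl, and the Links/Assets fields on Response); it assumes that returning nil from the callback, rather than crawler.ErrSkipURL, lets the crawler keep following discovered links.

package main

import (
	"fmt"
	"log"

	"github.com/ernesto-jimenez/crawler"
)

func main() {
	cr, err := crawler.New()
	if err != nil {
		log.Fatal(err)
	}

	var pages int
	// Returning nil (instead of crawler.ErrSkipURL) from the callback is
	// assumed to let the crawler continue following the links it discovers.
	err = cr.Crawl("https://godoc.org", func(url string, res *crawler.Response, err error) error {
		if err != nil {
			log.Printf("error fetching %s: %v", url, err)
			return nil
		}
		pages++
		fmt.Printf("%s (%d links, %d assets)\n", url, len(res.Links), len(res.Assets))
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("crawled %d pages\n", pages)
}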

License

MIT License


Languages

Language: Go 100.0%