projectdiscovery / katana

A next-generation crawling and spidering framework.


Library: Silent flag doesn't work. Library too verbose

ca5ua1 opened this issue · comments

Essentially, it's the same as projectdiscovery/naabu#958.


Hi, I'm trying to use Katana as a library, and the problem is that Katana generates terminal output on its own, which I don't want without a verbose flag.

katana version: 1.0.5

Current Behavior:

When running a scan with the Katana library, it writes the following lines to stdout/stderr on its own:

[INF] Started standard crawling for => https://example.com #STDERR
https://example.com #STDOUT
...

This correlates with the output from the CLI:

pkg ➤ katana -u 'https://scanme.sh/'

   __        __
  / /_____ _/ /____ ____  ___ _
 /  '_/ _  / __/ _  / _ \/ _  /
/_/\_\\_,_/\__/\_,_/_//_/\_,_/

                projectdiscovery.io

[INF] Current katana version v1.0.5 (latest)
[INF] Started standard crawling for => https://scanme.sh/
https://scanme.sh/

pkg ➤ katana -u 'https://scanme.sh/' -silent
https://scanme.sh/

pkg ➤ katana -u 'https://scanme.sh/' -silent > /dev/null
*nothing*

The problem is that the Silent: true option, when running as a library, doesn't help at all and does not suppress either STDERR or STDOUT.
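A partial workaround for the [INF] lines, assuming they are emitted through gologger's default logger (shared across the projectdiscovery libraries), is to lower the logger's max level before crawling. This is only a sketch and does not touch the URLs written to STDOUT:

import (
	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/gologger/levels"
)

func init() {
	// Allow only fatal messages through, suppressing the [INF]/[WRN]
	// lines on stderr. The crawled URLs still appear on stdout.
	gologger.DefaultLogger.SetMaxLevel(levels.LevelFatal)
}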

Expected Behavior:

No output at all

Steps To Reproduce:

Slightly modified example code (add Silent: true and remove logging of the result):

package main

import (
	"math"

	"github.com/projectdiscovery/gologger"
	"github.com/projectdiscovery/katana/pkg/engine/standard"
	"github.com/projectdiscovery/katana/pkg/output"
	"github.com/projectdiscovery/katana/pkg/types"
)

func main() {
	options := &types.Options{
		MaxDepth:     3,             // Maximum depth to crawl
		FieldScope:   "rdn",         // Crawling Scope Field
		BodyReadSize: math.MaxInt,   // Maximum response size to read
		Timeout:      10,            // Timeout is the time to wait for request in seconds
		Concurrency:  10,            // Concurrency is the number of concurrent crawling goroutines
		Parallelism:  10,            // Parallelism is the number of urls processing goroutines
		Delay:        0,             // Delay is the delay between each crawl requests in seconds
		RateLimit:    150,           // Maximum requests to send per second
		Strategy:     "depth-first", // Visit strategy (depth-first, breadth-first)
		OnResult: func(result output.Result) { // Callback function to execute for result
			// gologger.Info().Msg(result.Request.URL)
		},
		Silent:       true,          // Expected to suppress all output
	}
	crawlerOptions, err := types.NewCrawlerOptions(options)
	if err != nil {
		gologger.Fatal().Msg(err.Error())
	}
	defer crawlerOptions.Close()
	crawler, err := standard.New(crawlerOptions)
	if err != nil {
		gologger.Fatal().Msg(err.Error())
	}
	defer crawler.Close()
	var input = "https://www.hackerone.com"
	err = crawler.Crawl(input)
	if err != nil {
		gologger.Warning().Msgf("Could not crawl %s: %s", input, err.Error())
	}
}
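Since OnResult receives every result, the caller can handle output itself; the issue is only the extra writes Katana performs on its own. As a sketch (found is a hypothetical caller-side slice, and this would replace the OnResult field in the options above), collecting URLs instead of logging them looks like:

// Sketch: collect crawled URLs in memory instead of printing them.
var found []string
options.OnResult = func(result output.Result) {
	found = append(found, result.Request.URL)
}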
commented

Headless and ShowBrowser: same question.
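For reference, a sketch of the configuration that question refers to, assuming the Headless and ShowBrowser field names in types.Options, and assuming headless crawling goes through the hybrid engine in pkg/engine/hybrid rather than standard:

options := &types.Options{
	// ...same fields as the example above...
	Headless:    true,  // crawl through a headless browser
	ShowBrowser: false, // keep the browser window hidden
	Silent:      true,  // reportedly ignored here as well
}
// crawler, err := hybrid.New(crawlerOptions) // instead of standard.New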