google / gonids

gonids is a library to parse IDS rules, with a primary focus on Suricata rule compatibility. There is a discussion forum available that you can join on Google Groups: https://groups.google.com/forum/#!topic/gonids/

design question: lexer in goroutine?

julienschmidt opened this issue · comments

While looking through the code, I noticed that the lexer runs its main loop in its own goroutine:

```go
func lex(input string) (*lexer, error) {
	...
	l := &lexer{
		input: input,
		items: make(chan item),
	}
	go l.run()
	return l, nil
}
```

Items are then received from the unbuffered channel `l.items`, which makes the code effectively run sequentially again.

Is there any other good reason to run it in its own goroutine? If not, the overhead of synchronization and context switches could be avoided.

commented

@clem1 Any thoughts here? I don't see a strong reason for this to be its own goroutine, but maybe you had given this some thought?

commented

So this code was copied from the official Golang text lexer [0]. I guess the main reason is so the parser doesn't have to wait for lexing to finish, e.g. the parser can exit early without the lexing being complete. Not sure it matters in our case since rules are relatively small and very fast to lex.

Happy to have the lex() being fully sequential.

[0] https://talks.golang.org/2011/lex.slide#1

commented

@julienschmidt do you want to take a look at making this change? I don't know that removing it gains us a lot, so I don't plan to tackle this in the short term (versus adding features, better parsing, etc.).
If not, I'll close this issue for now.

I'm actually curious if this would pay off. I'll write a PoC and benchmark as soon as I have time for it.