Seldaek / monolog

Sends your logs to files, sockets, inboxes, databases and various web services

Home Page: https://seldaek.github.io/monolog/


Is Monolog a good choice for logging to same file from multiple scripts?

c0dehunter opened this issue · comments

Will using Monolog remove the potential bottleneck as compared to logging with file_put_contents with LOCK_EX flag?

I have the following public-facing PHP script, which gets executed about 10x/s on average, and 100x/s at peak times. This has become a bottleneck, since concurrent scripts have to wait for each other's LOCK_EX to be released.

<?php
	// ... do some things
	// then log the result:
	file_put_contents("/home/logs/public.log", "\n" . date("d.m. G:i:s") . " (some log) $httpcode $retry_count", FILE_APPEND | LOCK_EX);
?>

I am considering rewriting it to use Monolog like this:

<?php
	require_once(__DIR__ . '/vendor/autoload.php');

	use Monolog\Logger;
	use Monolog\Handler\StreamHandler;

	$logger = new Logger('public-script');
	$logger->pushHandler(new StreamHandler('/home/logs/public.log', Logger::INFO));

	// ... do some things
	// then log the result:
	$logger->info("(some log) $httpcode $retry_count");
?>

Nope, it won't help. Either you enable locking on StreamHandler and end up in more or less the same situation (it does keep the file handle open and only locks/unlocks around each write, so it may be slightly faster than file_put_contents, which opens a new handle every time, but I doubt it will help that much), or you don't enable locking, and then with a very high write rate you risk race conditions producing interleaved lines in the log.
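For reference, locking is opt-in via StreamHandler's fifth constructor argument. A minimal sketch (path and channel name are taken from the question above, not prescribed):

```php
<?php
require_once __DIR__ . '/vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

$logger = new Logger('public-script');

// The fifth argument ($useLocking) makes StreamHandler take an exclusive
// flock() around each write -- same contention as file_put_contents with
// LOCK_EX, but the file handle stays open between writes.
$logger->pushHandler(new StreamHandler(
    '/home/logs/public.log', // stream
    Logger::INFO,            // minimum level
    true,                    // bubble to further handlers
    null,                    // file permission (default)
    true                     // use locking
));

$logger->info('locked write');
```
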

It may help if you use SyslogUdpHandler or similar, though, as an external daemon can accept messages in parallel and flush them all in sequence without blocking your FPM workers.
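A minimal sketch of that approach (the host and port assume a local syslog daemon such as rsyslog listening on UDP 514; configure it to route the messages to your target file):

```php
<?php
require_once __DIR__ . '/vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\SyslogUdpHandler;

$logger = new Logger('public-script');

// Each log record is sent as a UDP datagram -- fire-and-forget, no file
// lock in the PHP worker. The syslog daemon serializes the writes to
// disk on its side, so workers never wait on each other.
$logger->pushHandler(new SyslogUdpHandler('127.0.0.1', 514));

$logger->info('(some log) 200 0');
```
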