uniqueId causes the job to never finish...
vFire opened this issue
When I set the uniqueId in the queue job file, the job never finishes and the queue gets blocked.
Hello @vFire
could you provide an example of your code for this? Which versions are you running?
Cheers,
Peter
Hi Spekulatius,
Thanks for your quick response. It turned out to be caused by my local .env file and app config: running via the browser and via the local command line sometimes mixes up the keys in Redis.
I've fixed it. I realized that once the queue is running, even if I define it in a Laravel job, the queue keeps running independently, so to stop it I have to kill the worker directly. For now I simply have each queued job process 100 URLs and schedule that job to run every minute. That works for me in a real business setting, although I think it still needs some adjustment to improve efficiency as more and more sites need to be crawled in parallel.
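For anyone reading along, the workaround described above (a fixed-size batch dispatched on a one-minute schedule) could be sketched roughly like this with Laravel's scheduler. The job class `CrawlUrlBatchJob` and the 100-URL batching inside it are assumptions for illustration, not code from this issue:

```php
<?php

// app/Console/Kernel.php (sketch): dispatch the batch job every minute.
// CrawlUrlBatchJob is a hypothetical job that pulls the next ~100 pending
// URLs from storage and crawls them before finishing.

namespace App\Console;

use App\Jobs\CrawlUrlBatchJob;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule): void
    {
        // Each run picks up the next batch; the job itself decides
        // when there is nothing left to crawl and exits quickly.
        $schedule->job(new CrawlUrlBatchJob())->everyMinute();
    }
}
```

Keeping each job's batch small means a stuck or killed worker only loses one minute's batch rather than a long-running crawl.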
If you know of a better solution, please let me know, much appreciated!
Hey @vFire
sounds like you found a solution. Maybe you could also ensure the job isn't run twice? https://laravel.com/docs/8.x/queues#preventing-job-overlaps might help.
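The linked docs cover Laravel's `WithoutOverlapping` job middleware. A minimal sketch of how it could be applied here (the job name and the lock key are assumptions for illustration):

```php
<?php

// Hypothetical job showing the overlap-prevention middleware from the
// linked Laravel docs: a second instance with the same lock key will
// not run while the first is still processing.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;

class CrawlUrlBatchJob implements ShouldQueue
{
    use Queueable;

    public function middleware(): array
    {
        // 'crawl-batch' is an example lock key; releaseAfter() frees
        // the lock if the job dies without finishing.
        return [(new WithoutOverlapping('crawl-batch'))->releaseAfter(60)];
    }

    public function handle(): void
    {
        // crawl the current batch of URLs...
    }
}
```

With this in place, scheduling the job every minute is safe even if one batch occasionally takes longer than a minute.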
Cheers,
Peter