spekulatius / spatie-crawler-toolkit-for-laravel

A toolkit for Spatie's Crawler and Laravel.

Home Page: https://releasecandidate.dev


uniqueId causes the job to never finish...

vFire opened this issue

vFire commented:

When I set the uniqueId in the queue job file, the job never finishes... The queue gets blocked...
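For reference, a minimal sketch of the kind of unique job being described (not the reporter's actual code; `CrawlSiteJob` and the constructor argument are illustrative). `ShouldBeUnique`, `uniqueId()` and `$uniqueFor` are the standard Laravel 8+ mechanism; if the unique lock is never released (for example the worker dies, or the cache/Redis key differs between the browser and CLI environments), later dispatches of the "same" job are silently discarded, which can look like a blocked queue:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class CrawlSiteJob implements ShouldQueue, ShouldBeUnique
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Let the unique lock expire on its own after an hour, so a crashed
    // worker cannot block this job forever.
    public int $uniqueFor = 3600;

    public function __construct(public string $siteUrl)
    {
    }

    // The unique lock key. While a lock for this key exists in the cache,
    // further dispatches of this job are dropped.
    public function uniqueId(): string
    {
        return $this->siteUrl;
    }

    public function handle(): void
    {
        // Crawl the site here.
    }
}
```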

Hello @vFire

Could you provide an example of your code for this? Which versions are you running?

Cheers,
Peter

vFire commented:

Hi Spekulatius,

Thanks for your quick response. It was caused by the local .env file and app config: running in the browser and via the local command line can mix up the key in Redis.
I've fixed it. I realized that once the queue is running, even if I define it in a Laravel job, the queue keeps running independently, so to stop it I have to kill the worker directly. For now I simply have each queued job process 100 URLs and schedule the job to run every minute. That works for me in a real business setting, although I think it will still need some adjustment later to improve efficiency as more and more sites need to be crawled in parallel.
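A rough sketch of the batching setup described above, under assumptions that are not from the thread: pending URLs live in a hypothetical `crawl_urls` table (`CrawlUrl` model with `url` / `queued_at` columns), and `CrawlBatchJob` / `CrawlSiteJob` are illustrative names. Only the scheduler call (`$schedule->job(...)->everyMinute()`) and the Eloquent/dispatch calls are stock Laravel API:

```php
<?php

// Scheduled in app/Console/Kernel.php, e.g.:
//     $schedule->job(new \App\Jobs\CrawlBatchJob())->everyMinute();

namespace App\Jobs;

use App\Models\CrawlUrl;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

class CrawlBatchJob implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function handle(): void
    {
        // Pick up to 100 URLs that have not been queued yet, dispatch one
        // crawl job per URL, and mark them so the next run skips them.
        CrawlUrl::whereNull('queued_at')
            ->limit(100)
            ->get()
            ->each(function (CrawlUrl $row) {
                CrawlSiteJob::dispatch($row->url);
                $row->update(['queued_at' => now()]);
            });
    }
}
```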

If you know of any better solution, please let me know. Much appreciated!

Hey @vFire

Sounds like you found a solution. Maybe you could also ensure the job isn't run twice? https://laravel.com/docs/8.x/queues#preventing-job-overlaps might help.

Cheers,
Peter
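For completeness, a minimal sketch of the overlap-prevention approach that link describes, applied to the hypothetical batch job from the earlier sketch. The `WithoutOverlapping` middleware and its `releaseAfter()` / `expireAfter()` methods are standard Laravel 8+ API; the lock key `'crawl-batch'` and the class name are illustrative:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\Middleware\WithoutOverlapping;

class CrawlBatchJob implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function middleware(): array
    {
        return [
            // Only one CrawlBatchJob runs at a time. Overlapping jobs are
            // released back onto the queue and retried after 60 seconds;
            // expireAfter() guards against a stale lock if a worker crashes.
            (new WithoutOverlapping('crawl-batch'))
                ->releaseAfter(60)
                ->expireAfter(180),
        ];
    }

    public function handle(): void
    {
        // ... dispatch the per-URL crawl jobs as in the earlier sketch.
    }
}
```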