Handling Asynchronous Tasks with FastAPI and Celery
- Integrate Celery into a FastAPI app and create tasks.
- Containerize FastAPI, Celery, and Redis with Docker.
- Run processes in the background with a separate worker process.
- Save Celery logs to a file.
- Set up Flower to monitor and administer Celery jobs and workers.
- Test a Celery task with both unit and integration tests.
docker-compose up -d --build
Navigate to:
- http://localhost:8004 for the app
- http://localhost:5556/ for the Flower dashboard
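The ports above suggest a compose file roughly like the following. Service names, images, commands, and port mappings are assumptions, not the project's actual file; it only illustrates how the web app, worker, Redis, and Flower dashboard fit together, with the worker's logs written to a file.

```yaml
# docker-compose.yml -- an assumed sketch, not the project's actual file
version: '3.8'

services:
  web:
    build: .
    command: uvicorn main:app --host 0.0.0.0
    ports:
      - "8004:8000"
    environment:
      - CELERY_BROKER_URL=redis://redis:6379
      - CELERY_RESULT_BACKEND=redis://redis:6379
    depends_on:
      - redis

  worker:
    build: .
    # --logfile saves the Celery worker logs to a file
    command: celery --app=worker.celery worker --loglevel=info --logfile=logs/celery.log
    environment:
      - CELERY_BROKER_URL=redis://redis:6379
      - CELERY_RESULT_BACKEND=redis://redis:6379
    depends_on:
      - redis

  redis:
    image: redis:7

  dashboard:
    build: .
    command: celery --app=worker.celery flower --port=5555
    ports:
      - "5556:5555"
    depends_on:
      - redis
      - worker
```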
Run
curl http://localhost:8004/tasks -H "Content-Type: application/json" --data '{"type": 1}'
Grab the task_id from the response, then call the status endpoint to check on the task, for example:
curl http://localhost:8004/tasks/f3ae36f1-58b8-4c2b-bf5b-739c80e9d7ff
Test
docker-compose exec web python -m pytest
In conclusion, Celery can be used to execute repeatable tasks in the background and to break up complex, resource-intensive work, so that the computational load is distributed across a number of machines, reducing both the completion time and the load on the machine handling client requests.