Starter utilizing FastAPI and Celery, with RabbitMQ as the task queue, Redis as the Celery backend, and Flower for monitoring the Celery tasks; based on FastAPI with Celery.
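The moving parts fit together roughly as in the minimal sketch below. This is an illustration, not this project's actual code: the `celery_worker` object and `app.worker` module are inferred from the worker command further down, while the broker/backend URLs, task name and `/test/{word}` route are placeholders.

```python
# Minimal sketch of the FastAPI <-> Celery wiring (illustrative only).
# RabbitMQ is the message broker, Redis stores task results, and Flower
# watches them for monitoring.
from celery import Celery
from fastapi import FastAPI

# Celery application; the URLs are placeholders normally taken from the .env files.
celery_worker = Celery(
    "worker",
    broker="amqp://user:password@localhost:5672//",  # RabbitMQ
    backend="redis://localhost:6379/0",              # Redis
)

@celery_worker.task(name="test_celery")
def test_celery(word: str) -> str:
    # Runs inside the worker process that consumes "test-queue".
    return f"test task returned {word}"

app = FastAPI()

@app.get("/test/{word}")
def run_test(word: str):
    # Hand the work off to the queue the worker listens on (-Q test-queue).
    test_celery.apply_async(args=[word], queue="test-queue")
    return {"message": "task queued"}
```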
To run with Docker you need:

- Docker
Copy the sample `.env` files to the right location and modify values if needed:
- `cp ./docker/flower/.env.sample ./docker/flower/.env`
- `cp ./docker/rabbitmq/.env.sample ./docker/rabbitmq/.env`
- `cp ./docker/redis/.env.sample ./docker/redis/.env`

If running the application with Docker, also copy the two below:

- `cp ./docker/api/.env.sample ./docker/api/.env`
- `cp ./docker/worker/.env.sample ./docker/worker/.env`
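The values in these `.env` files typically end up as the broker and result backend URLs that the API and the worker use. The snippet below is only an illustration of that mapping; the variable names are assumptions, not the actual keys in the `.env.sample` files.

```python
# Illustrative only: the environment variable names are assumptions, not the
# actual keys defined in this repository's .env.sample files.
import os

broker_url = "amqp://{user}:{password}@{host}:5672//".format(
    user=os.getenv("RABBITMQ_DEFAULT_USER", "user"),
    password=os.getenv("RABBITMQ_DEFAULT_PASS", "password"),
    host=os.getenv("RABBITMQ_HOST", "localhost"),
)
result_backend = "redis://{host}:6379/0".format(host=os.getenv("REDIS_HOST", "localhost"))
```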
- Run command `docker-compose up` to start up the RabbitMQ, Redis, Flower and our application/worker instances.
- Navigate to http://localhost:8000/docs and execute the test API call. You can monitor the execution of the Celery tasks in the console logs or navigate to the Flower monitoring app at http://localhost:5555 (username: user, password: test).
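Besides the Swagger UI, the test call can also be made from a script. The route below is the hypothetical one from the sketch at the top of this README; substitute whatever route `/docs` actually lists.

```python
# Hypothetical request against the example route; check http://localhost:8000/docs
# for the real path exposed by this starter.
import httpx

response = httpx.get("http://localhost:8000/test/hello")
print(response.status_code, response.json())
```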
To run the application/worker without Docker you need:

- Python >= 3.7
- RabbitMQ instance
- Redis instance
The RabbitMQ, Redis and Flower services can be started with `docker-compose -f docker-compose-services.yml up`.
Install the dependencies by executing `poetry install` (development dependencies are installed by default).
- Start the FastAPI web application with `poetry run hypercorn app/main:app --reload`.
- Start the Celery worker with the command `poetry run celery worker -A app.worker.celery_worker -l info -Q test-queue -c 1`.
- Navigate to http://localhost:8000/docs and execute the test API call. You can monitor the execution of the Celery tasks in the console logs or navigate to the Flower monitoring app at http://localhost:5555 (username: user, password: test).
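In addition to the console logs and Flower, a task's state can be inspected programmatically through the Redis result backend. A minimal sketch, assuming a Celery app object like the `celery_worker` from the sketch at the top of this README and a task id returned by the API:

```python
# Hedged sketch: look up a task's state and result in the Redis backend.
from celery import Celery
from celery.result import AsyncResult

# Placeholder app configuration mirroring the sketch at the top of this README.
celery_worker = Celery(
    "worker",
    broker="amqp://user:password@localhost:5672//",
    backend="redis://localhost:6379/0",
)

task_id = "00000000-0000-0000-0000-000000000000"  # hypothetical id returned by the API
result = AsyncResult(task_id, app=celery_worker)
print(result.state)   # PENDING / STARTED / SUCCESS / FAILURE
print(result.result)  # the task's return value once it has finished
```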