A FinOps application, powered by TimeGPT, GPT-4, and Vantage. 100% open source.
⛅️💸🤖
Live Demo: TimeGPT + Vantage by Serpa Cloud
This project fetches historical cloud costs from Vantage's API. TimeGPT, Nixtla's generative pre-trained model for time series, analyzes past behavior, identifying trends and patterns to forecast future usage and detect anomalies. Finally, GPT-4 provides a possible explanation for the cause of any atypical spend.
The project includes a server component to manage the services and a web app based on React.
Before running this application, you'll need:
- An API token from Vantage
- An API token from Nixtla
- An API key from OpenAI
- Node.js v16+ installed
After cloning, install dependencies by running
yarn
To run the project, you need to execute both components in parallel:
export NIXTLA_TOKEN="MY_NIXTLA_TOKEN" && export OPENAI_API_KEY="MY_OPENAI_API_KEY" && yarn serve
This runs the backend server at http://localhost:7001.
yarn start
This runs the web app in development mode. Open http://localhost:3000 in your browser. The page will reload when you make changes.
For production, first build the web app:

yarn build

Then start the server:

export NIXTLA_TOKEN="MY_NIXTLA_TOKEN" && export OPENAI_API_KEY="MY_OPENAI_API_KEY" && yarn serve

This runs the backend server at http://localhost:7001 and serves the built web app as static files at the root.
To build and run the application with Docker:

docker build --pull --rm -f "Dockerfile" -t gpt-time-nixtla-vantage:latest "."
docker run --env OPENAI_API_KEY="MY_OPENAI_API_KEY" --env NIXTLA_TOKEN="MY_NIXTLA_TOKEN" -p 127.0.0.1:80:80/tcp gpt-time-nixtla-vantage:latest

This runs the backend server at http://localhost and serves the static files at the root.
Alternatively, run the prebuilt image from Docker Hub:

docker run --env OPENAI_API_KEY="MY_OPENAI_API_KEY" --env NIXTLA_TOKEN="MY_NIXTLA_TOKEN" -p 127.0.0.1:80:80/tcp serpacloud/gpt-time-nixtla-vantage:latest

This runs the backend server at http://localhost and serves the static files at the root.
You can modify this application and deploy it easily with Serpa Cloud: just fork it on GitHub or push it to a new repository.
When configuring the deployment, remember to add the OPENAI_API_KEY and NIXTLA_TOKEN environment variables with your API keys.
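Since the server depends on those two variables, it can help to fail fast when one is missing. This is a generic sketch of such a guard, not the project's actual startup code:

```javascript
// Generic sketch: read a required key from an env object and fail fast if absent.
function requireEnv(name, env = process.env) {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Demonstrated with an explicit object so the sketch runs anywhere;
// in the server you would call requireEnv('NIXTLA_TOKEN') directly.
console.log(requireEnv('NIXTLA_TOKEN', { NIXTLA_TOKEN: 'demo-token' })); // demo-token
```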
This is a demo of the capabilities of Nixtla, Vantage, and OpenAI for forecasting and analyzing cloud costs. It also aims to showcase how easy deployments are with Serpa. However, if the community shows interest, we are happy to include more features.
Top of mind, some next steps could be:
- Add support for Pump
- Add support for Cloudthread
- Update the front-end to show progress
If you're a developer who'd like to help with any of these, please open an issue to discuss the best way to tackle the challenge.
Made with love in 🇲🇽 by 🏳️⚧️.