# A Question Answering (QA) chatbot built with Vercel, OpenAI, and Steamship

Tech stack:
- Next.js
- OpenAI API (REST endpoint)
- API Routes (Edge runtime)
- Steamship API (AI orchestration stack)
Execute `create-next-app` with npm or Yarn to bootstrap the example:
```bash
npx create-next-app --example https://github.com/steamship-core/vercel-examples/tree/main/prompt-app
# or
yarn create next-app --example https://github.com/steamship-core/vercel-examples/tree/main/prompt-app
```
Steamship is an AI orchestration stack that auto-manages prompts, image generation, embeddings, vector search, and more. Think of it as a host for Vercel-style API functions, but with a managed, stateful, AI stack built-in.
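At runtime, the app's Edge API route relays each user question to the deployed Steamship package. The sketch below shows one way such a relay payload could be built; the `steamship.run` URL shape and the `/answer` path are illustrative assumptions for this example, not part of the official Steamship client:

```typescript
// Hypothetical helper: builds the request an Edge API route might forward
// to a deployed Steamship package instance. The URL shape and endpoint
// path below are assumptions for illustration only.
interface PromptRequest {
  url: string;
  body: { question: string };
}

function buildPromptRequest(packageHandle: string, question: string): PromptRequest {
  return {
    // Assumed (hypothetical) URL shape for a deployed package instance.
    url: `https://${packageHandle}.steamship.run/answer`,
    body: { question },
  };
}

// Inside an Edge route handler you might then do something like:
// const { url, body } = buildPromptRequest(process.env.STEAMSHIP_PACKAGE_HANDLE!, question);
// const res = await fetch(url, { method: "POST", body: JSON.stringify(body) });
```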
Deploy your Steamship stack from this project's root folder with:
```bash
pip install steamship
cd steamship
ship deploy
```
Rename `.env.example` to `.env.local`:

```bash
cp .env.example .env.local
```
Then:

- update `STEAMSHIP_API_KEY` with your Steamship API key
- update `STEAMSHIP_PACKAGE_HANDLE` with the package name you selected when deploying your Steamship stack
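A minimal `.env.local` might look like this (both values are placeholders, not real credentials):

```bash
# Placeholder values — substitute your own API key and package handle.
STEAMSHIP_API_KEY=sk-your-steamship-api-key
STEAMSHIP_PACKAGE_HANDLE=my-qa-bot
```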
Run your Next.js stack in development mode:
```bash
npm install
npm run dev
# or
yarn
yarn dev
```
The app should now be running at http://localhost:3000.
When you like what you see, deploy it to the cloud with Vercel (Documentation).