This project demonstrates how an MCP (Model Context Protocol) server can inject functionality into an application without requiring the app to take any dependencies on the resource type. It provides two example MCP servers and a sample OpenAI MCP client.
## 1. `yt.py` (YouTube Transcript MCP Server)

- Purpose: Extracts and formats transcripts from YouTube videos for LLM consumption.
- How it works:
  - Accepts a YouTube video URL.
  - Extracts the video ID and fetches the transcript using the `youtube-transcript-api` library.
  - Formats each transcript entry as `[MM:SS] Text` for easy reading and processing.
- Usage:
  - Run the server in debug mode: `uv run mcp dev yt.py`
  - Use the `fetch_youtube_transcript` tool to get a formatted transcript from a YouTube URL.
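The ID extraction and timestamp formatting described above can be sketched as two plain helpers. These names and signatures are illustrative, not necessarily what `yt.py` uses internally:

```python
from urllib.parse import urlparse, parse_qs

def extract_video_id(url: str) -> str:
    """Pull the video ID out of a YouTube URL.

    Handles both youtube.com/watch?v=... and youtu.be/... forms.
    """
    parsed = urlparse(url)
    if parsed.hostname == "youtu.be":
        return parsed.path.lstrip("/")
    return parse_qs(parsed.query)["v"][0]

def format_entry(start_seconds: float, text: str) -> str:
    """Render one transcript entry as '[MM:SS] Text'."""
    minutes, seconds = divmod(int(start_seconds), 60)
    return f"[{minutes:02d}:{seconds:02d}] {text}"
```

In the real server, helpers like these would feed each snippet returned by `youtube-transcript-api` into the single formatted string the tool hands back to the model.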
## 2. `sqlite_server.py` (SQLite Chinook DB MCP Server)

- Purpose: Provides tools to interact with the Chinook sample database, a SQLite music database.
- Features:
  - List all artists and genres.
  - Run arbitrary `SELECT` SQL queries (read-only).
  - Get all tracks for a given artist (using a prompt-generated SQL statement).
  - Get artist info (using the argument `name`).
- Usage:
  - Ensure `chinook.db` is present in the project directory.
  - Run the server: `uv run mcp dev sqlite_server.py`
  - Use the provided tools to query artists, genres, or tracks. For artist info and tracks, use the argument `name` (not `artist_name`).
When you run #4 (`client_agentsdk_mcp.py`), you'll see that you only really need two tools: `run_sql()` and `get_sqlite_schema()`. The `run_sql()` tool lets you execute any SQL query, while `get_sqlite_schema()` returns the schema of the Chinook database, which allows the LLM to understand the structure of the data and construct appropriate queries. The other tools are there for show and tell, but are not strictly necessary for the client to function.
## 3. `client.py` (OpenAI MCP Client)

- Purpose: Allows conversational interaction with both MCP servers via OpenAI, using tool schemas for argument validation.
- Usage:
  - Configure your `.env` file (see below).
  - Run: `python client.py`
  - Type prompts such as: `get me info about the artist "AC/DC" from sqlite`
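The "tool schemas for argument validation" part boils down to translating each MCP tool description into OpenAI's function-calling tool format. A hedged sketch of that mapping — the field names follow the MCP `tools/list` response shape, but the helper and the example tool name are hypothetical:

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Convert one MCP tool description (name, description, inputSchema)
    into an OpenAI chat-completions tool definition."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Hypothetical artist-info tool, as the model would see it
example = mcp_tool_to_openai({
    "name": "get_artist_info",
    "description": "Get artist info by name",
    "inputSchema": {
        "type": "object",
        "properties": {"name": {"type": "string"}},
        "required": ["name"],
    },
})
```

Because the JSON Schema is passed through unchanged, the model learns that the argument is `name` (not `artist_name`) directly from the server's own declaration.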
## 4. `client_agentsdk_mcp.py` (OpenAI Agent SDK MCP Client)

- Purpose: Provides an OpenAI MCP client using the Agent SDK, supporting multi-agent orchestration and tool usage.
- Features:
  - Uses Azure OpenAI via environment variables for endpoint, key, deployment, and API version.
  - Launches two MCP tool servers: `sqlite_server.py` and `yt.py`.
  - Defines three agents:
    - MCP Agent: Uses both MCP tool servers for database and YouTube operations.
    - Pirate Agent: Answers in pirate slang (for fun, no tools).
    - Orchestrator Agent: Routes user prompts to the appropriate agent (MCP, Pirate, or itself) based on the request.
  - Interactive prompt loop: type a prompt and get a response from the orchestrator or a specialized agent.
- Usage:
  - Configure your `.env` file as described below.
  - Run: `python client_agentsdk_mcp.py`
  - Type prompts such as:
    - `get me info about the artist "AC/DC" from sqlite`
    - `talk like a pirate`
    - `get the transcript for https://www.youtube.com/watch?v=dQw4w9WgXcQ&list=RDdQw4w9WgXcQ&start_radio=1`
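In the Agent SDK the orchestrator's routing is an LLM decision (a handoff), so it can't be reduced to code, but its shape can be sketched with a deterministic stand-in. Everything below is a toy illustration, not the SDK's API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    handle: Callable[[str], str]  # stand-in for an LLM-backed run

def route(prompt: str, mcp: Agent, pirate: Agent, orchestrator: Agent) -> Agent:
    """Toy keyword router standing in for the model-driven handoff decision."""
    p = prompt.lower()
    if "pirate" in p:
        return pirate
    if any(k in p for k in ("sql", "sqlite", "artist", "transcript", "youtube")):
        return mcp
    return orchestrator

# Hypothetical agents mirroring the three described above
mcp = Agent("MCP", lambda p: f"(would call an MCP tool for: {p})")
pirate = Agent("Pirate", lambda p: "Arr! " + p)
orch = Agent("Orchestrator", lambda p: f"(answers directly: {p})")
```

The real client replaces `route()` with the orchestrator agent's own reasoning, which hands off based on the prompt's intent rather than keywords.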
## Project Files

- `yt.py` — YouTube transcript MCP server
- `sqlite_server.py` — SQLite Chinook DB MCP server
- `client.py` — OpenAI MCP client for both servers
- `client_agentsdk_mcp.py` — OpenAI MCP client with OpenAI Agent SDK
- `chinook.db` — SQLite database file
- `pyproject.toml` — Project dependencies
- `.env.example` — Example environment configuration
## Requirements

- Python 3.13+
- Install dependencies:

  ```sh
  uv pip install -r requirements.txt
  # or, if using pyproject.toml:
  uv pip install .
  ```
## Environment Configuration

Copy `.env.example` to `.env` and fill in the required values:

```
AZURE_OPENAI_ENDPOINT=...
AZURE_OPENAI_KEY=...
AZURE_OPENAI_DEPLOYMENT=...
SQLITE_MCP_URL=http://localhost:8001
YT_MCP_URL=http://localhost:8002
```
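A small sketch of how a client might validate this configuration at startup, failing fast on missing Azure values and falling back to the documented defaults for the server URLs. The helper name and error behavior are assumptions, not the project's actual code:

```python
import os

REQUIRED = ("AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_KEY", "AZURE_OPENAI_DEPLOYMENT")

def load_config() -> dict:
    """Collect the expected settings from the environment."""
    missing = [k for k in REQUIRED if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"missing required environment variables: {missing}")
    cfg = {k: os.environ[k] for k in REQUIRED}
    cfg["SQLITE_MCP_URL"] = os.environ.get("SQLITE_MCP_URL", "http://localhost:8001")
    cfg["YT_MCP_URL"] = os.environ.get("YT_MCP_URL", "http://localhost:8002")
    return cfg
```

Validating once at startup gives a clear error message instead of an opaque authentication failure deep inside the OpenAI client.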
## Notes

- Make sure both MCP servers are running and accessible at the URLs above.
- This project is for demonstration purposes and is not production-hardened.
- The MCP servers use the `fastmcp` and `mcp` libraries.
- No direct dependency on the resource type is required in the consuming app.

Feel free to extend these servers or add new MCP tools for other resource types!