JetbrainsAI-OpenRouterProxy-plugin is a proxy plugin that connects the AI Assistant in JetBrains IDEs to the cloud-based AI models of OpenRouter.ai. This lets developers freely choose from over 100 of the latest models, such as GPT, Claude, and Gemini, and use them directly within the JetBrains AI Assistant during their coding workflow.
- OpenRouter Model Integration:
  - Seamlessly integrates over 100 cloud models from OpenRouter.ai into the AI Assistant's model list.
- Real-time Parameter Control:
  - Adjust AI response parameters such as Temperature and Top-P in real time from the IDE's tool window.
  - Customizable system prompts.
- Preset System:
  - Save optimized AI settings for different tasks, such as creative writing, coding, and analysis, and switch between them easily.
- Convenient Management:
  - Whitelist feature to selectively expose only the models you want to use.
  - Check the status of your OpenRouter API key.
- Local Model Support:
  - If you run a local Ollama server, you can use cloud and local models simultaneously.
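To make the parameter-control feature above concrete, here is a minimal sketch of how tool-window overrides like Temperature and Top-P could be merged into an outgoing request body. This is illustrative only, not the plugin's actual code: the class and method names are ours, and the field names assume the OpenAI-style request format that OpenRouter accepts.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: merge user overrides into a request body before
// it is forwarded to OpenRouter. Values are clamped to the ranges an
// OpenAI-style API accepts (temperature 0..2, top_p 0..1).
public class ParameterOverrides {
    public static Map<String, Object> apply(Map<String, Object> request,
                                            Double temperature, Double topP) {
        Map<String, Object> merged = new HashMap<>(request);
        if (temperature != null) {
            merged.put("temperature", Math.max(0.0, Math.min(2.0, temperature)));
        }
        if (topP != null) {
            merged.put("top_p", Math.max(0.0, Math.min(1.0, topP)));
        }
        return merged;
    }

    public static void main(String[] args) {
        // An out-of-range temperature is clamped rather than forwarded as-is.
        Map<String, Object> req = apply(new HashMap<>(), 3.0, 0.5);
        System.out.println(req);
    }
}
```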
```
JetbrainsAI-OpenRouterProxy-plugin/
├── src/main/kotlin/com/zxcizc/ollamaopenrouterproxyjetbrainsplugin/
│   ├── ProxyServer.kt                   # Core proxy server implementation
│   ├── PluginSettingsState.kt           # Manages plugin settings
│   ├── PluginSettingsComponent.kt       # UI components for settings
│   ├── PluginSettingsConfigurable.kt    # Integrates with the settings page
│   └── ProxyControlToolWindowFactory.kt # Factory for the tool window
├── src/main/resources/
│   └── META-INF/
│       └── plugin.xml                   # Plugin configuration
└── build.gradle.kts                     # Gradle build configuration
```
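The core of the layout above is `ProxyServer.kt`, which exposes an Ollama-compatible endpoint that the AI Assistant can talk to. As an illustrative sketch only (not the plugin's actual code, which is Kotlin), the same idea can be shown with the JDK's built-in `HttpServer`, which the plugin's tech stack lists. The class name, method name, and the hard-coded model entry are ours; `/api/tags` is the route Ollama clients poll for the model list.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of an Ollama-style /api/tags endpoint.
// In the real plugin the model list would be built from OpenRouter,
// filtered by the whitelist.
public class TagsServerSketch {
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/api/tags", exchange -> {
            byte[] body = "{\"models\":[{\"name\":\"openrouter/example-model\"}]}"
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        // Port 0 picks a free port for this demo; the real plugin listens on 11444.
        HttpServer server = start(0);
        System.out.println("Listening on port " + server.getAddress().getPort());
        server.stop(0);
    }
}
```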
- A JetBrains IDE, version 2024.1+ (IntelliJ IDEA, PyCharm, etc.)
- Java 21+
- OpenRouter.ai API Key
1. Build the Plugin

   ```bash
   git clone https://github.com/zxcizc/JetbrainsAI-OpenRouterProxy-plugin.git
   cd JetbrainsAI-OpenRouterProxy-plugin
   ./gradlew buildPlugin
   ```

2. Install the Plugin

   In your IDE, go to File → Settings → Plugins → Install Plugin from Disk and select the `.jar` file from the `build/libs/` directory.

3. Initial Setup

   - Navigate to Tools → JetbrainsAI OpenRouter Proxy Settings.
   - Enter your OpenRouter API Key.
   - (Optional) Add your desired models to the whitelist.

4. AI Assistant Integration

   Go to Tools → AI Assistant → Models and set the Ollama URL to `http://localhost:11444`. (This plugin acts as a proxy server at this address.)
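The optional whitelist step above suggests a simple filtering rule: only whitelisted models are exposed to the AI Assistant. The following is a hedged sketch of that idea, with names of our own choosing; the assumption that an empty whitelist means "expose everything" is ours, not documented behavior.

```java
import java.util.List;
import java.util.Set;

// Hypothetical sketch of whitelist filtering. Assumption: an empty
// whitelist exposes all available models.
public class ModelWhitelist {
    public static List<String> filter(List<String> available, Set<String> whitelist) {
        if (whitelist.isEmpty()) {
            return available;
        }
        return available.stream().filter(whitelist::contains).toList();
    }

    public static void main(String[] args) {
        // Only the whitelisted model survives the filter.
        System.out.println(filter(List.of("gpt-x", "claude-y"), Set.of("claude-y")));
    }
}
```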
- Core:
  - Kotlin (v2.1.0)
  - Java HttpServer
- Plugin Framework:
  - IntelliJ Platform SDK (v2025.1.3)
- Libraries:
  - Gson (v2.10.1)
  - Kotlin Coroutines
- Build Tool:
  - Gradle (v8.13)
Bug reports, feature suggestions, and pull requests are always welcome!
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is distributed under the GPL-3.0 License.
⭐ If you find this plugin helpful for your development workflow, please give it a star! ⭐