HardCodeDev777 / SimpleOllamaUnity

Communicate in Unity (also at runtime) with LLMs via Ollama in just two lines of code!

Repository on GitHub: https://github.com/HardCodeDev777/SimpleOllamaUnity


🦙 SimpleOllamaUnity — Unity Extension

Listed in Ollama Community Integrations

Listed in ai-gamedev-tools

 

Communicate with local LLMs in Unity using Ollama — in just two lines of code.


🚀 Overview

SimpleOllamaUnity is a Unity extension that lets you communicate with Ollama in just two lines of code! It also works at runtime, so you can use it in your games.

You can easily configure the following for a quick start:

  • 🤖 Model
  • 📃 System prompt
  • 🌐 Ollama URI
  • 👀 Reasoning (optional — can be disabled)

📦 Installation

  1. Download the latest .unitypackage file from the Releases page.
  2. Drag & drop it into your Unity project window.
  3. Unity will automatically compile the editor extension.

No additional setup required.

The Plugins folder includes several .dll files required for integration.


💻 Usage

var ollama = new Ollama(new OllamaConfig(
    modelName: "qwen2.5:3b",
    systemPrompt: "Your answer mustn't be more than 10 words"
));

var response = await ollama.SendMessage(new OllamaRequest(
    userPrompt: "When was GitHub created?"
));

Yes, that’s it — only two lines of code! 🎉

 

To use a custom server URI:

var ollama = new Ollama(new OllamaConfig(
    modelName: "qwen2.5:3b",
    systemPrompt: "Your answer mustn't be more than 10 words",
    uri: "http://my-custom-server.local:3000/api/process"
)); 

 

You can also strip the reasoning output from models that produce it:

var response = await ollama.SendMessage(new OllamaRequest(
    userPrompt: "When was GitHub created?",
    clearThinking: true
));

This will remove all reasoning (everything from `<think>` to `</think>`).
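In effect, this strips the tag pair from the raw response before it is returned. A minimal standalone sketch of the idea (plain .NET, no Unity required; `StripThinking` is a hypothetical helper for illustration, not part of the package):

```csharp
using System;
using System.Text.RegularExpressions;

class ThinkStripDemo
{
    // Hypothetical helper illustrating what clearThinking does:
    // deletes everything from <think> through </think>, then trims.
    public static string StripThinking(string raw) =>
        Regex.Replace(raw, @"<think>[\s\S]*?</think>", string.Empty).Trim();

    static void Main()
    {
        var raw = "<think>The user asks a history question...</think>\nGitHub launched in April 2008.";
        Console.WriteLine(StripThinking(raw)); // prints: GitHub launched in April 2008.
    }
}
```

The non-greedy `[\s\S]*?` pattern matches across newlines, so multi-line reasoning blocks are removed as well.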

   

🧪 Full Example

using UnityEngine;
using HardCodeDev.SimpleOllamaUnity;

public class Test : MonoBehaviour
{
    private async void Start()
    {
        var ollama = new Ollama(new OllamaConfig(
            modelName: "qwen2.5:3b",
            systemPrompt: "Your answer mustn't be more than 10 words"
        ));

        var response = await ollama.SendMessage(new OllamaRequest(
            userPrompt: "When was GitHub created?"
        ));

        Debug.Log(response); // Prints LLM response to the console
    }
}
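Because `Start` here is `async void`, any exception thrown by `SendMessage` (for example, when the Ollama server is not running) can be silently lost. A defensive variant, assuming `SendMessage` surfaces failures as exceptions (an assumption, not confirmed by the package docs), could look like:

```csharp
using System;
using UnityEngine;
using HardCodeDev.SimpleOllamaUnity;

public class SafeTest : MonoBehaviour
{
    private async void Start()
    {
        var ollama = new Ollama(new OllamaConfig(
            modelName: "qwen2.5:3b",
            systemPrompt: "Your answer mustn't be more than 10 words"
        ));

        try
        {
            var response = await ollama.SendMessage(new OllamaRequest(
                userPrompt: "When was GitHub created?"
            ));
            Debug.Log(response);
        }
        catch (Exception e)
        {
            // Reached if the local Ollama server is unreachable or the model is missing.
            Debug.LogError($"Ollama request failed: {e.Message}");
        }
    }
}
```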

🛠 TODO

  • Review which .dll files in the Plugins folder are actually required and remove the unnecessary ones.

📄 License

This project is licensed under the MIT License.
See the LICENSE file for full terms.


πŸ‘¨β€πŸ’» Author

HardCodeDev


💬 Got feedback, found a bug, or want to contribute? Open an issue or fork the repo on GitHub!
