Nutlope / aicommits

A CLI that writes your git commit messages for you with AI

Home Page: https://www.npmjs.com/package/aicommits

Add support for custom OpenAI base url (LocalAI integration)

0x326 opened this issue

Feature request

Adding an OpenAI base URL setting would allow integration with https://localai.io/, a locally run API server that is compatible with the OpenAI API specification.
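A sketch of how this could look, assuming the setting were exposed through the existing aicommits config command (OPENAI_BASE_URL is a hypothetical key name, not an existing option; 8080 is LocalAI's default port):

aicommits config set OPENAI_BASE_URL=http://localhost:8080/v1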

Why?

Running the AI server locally provides enhanced data privacy: prompts (including your git diffs) never leave your machine.

Alternatives

No response

Additional context

No response

maybe the most important feature request I've seen

In the meantime:

  1. Locate the npm package for aicommits. On Windows, with a standard Node.js installation, it's C:\Program Files\nodejs\node_modules\aicommits.
  2. Edit dist/cli.mjs: open this file, find api.openai.com, and replace it with your URL. There is only one occurrence of this string (a scripted version follows this list).
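
On Linux/macOS the same edit can be scripted. A minimal sketch, assuming GNU sed and using my-localai.example.com as a placeholder for your LocalAI endpoint:

# Jump into the globally installed package
cd "$(npm root -g)/aicommits"

# Swap the hard-coded API host for your own endpoint
# (in-place GNU sed edit; my-localai.example.com is a placeholder)
sed -i 's/api\.openai\.com/my-localai.example.com/' dist/cli.mjs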

Run npm root -g to get the global node_modules path
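
Typical locations (the exact path depends on how Node.js was installed):

npm root -g
# e.g. /usr/local/lib/node_modules                     (Linux/macOS)
# e.g. C:\Users\<you>\AppData\Roaming\npm\node_modules (Windows)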

Well, almost. I also had to set up nginx and tell Node to ignore my self-signed cert, and swap out gpt-3.5-turbo for a model I had already pulled with ollama. I did this on NixOS with something like this:

{ config, lib, pkgs, ... }:
{
  services = {
    nginx = {
      enable = true;
      virtualHosts = let
        # Helper: a TLS-terminating vhost with the given locations
        base = locations: {
          inherit locations;
          forceSSL = true;
          #enableACME = true;
        };
        # Helper: reverse-proxy everything to a local port
        proxy = port: base {
          "/".proxyPass = "http://127.0.0.1:" + toString port + "/";
        };
      in {
        # Serve https://localhost as a reverse proxy for the ollama API
        # on 127.0.0.1:11434, using a self-signed certificate
        "localhost" = proxy 11434 // {
          default = true;
          sslCertificate = /etc/localhost/localhost.pem;
          sslCertificateKey = /etc/localhost/localhost-key.pem;
        };
      };
    };
    ollama = {
      enable = true;
      acceleration = "cuda"; # GPU acceleration; drop on CPU-only machines
    };
  };
  systemd.services = {
    # Run ollama under a static user instead of systemd's DynamicUser
    ollama.serviceConfig.DynamicUser = lib.mkForce false;
  };
  environment.systemPackages = with pkgs; [
    ollama
  ];
}
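
The remaining steps from that comment, outside of Nix, look roughly like this. NODE_TLS_REJECT_UNAUTHORIZED=0 is Node's standard switch for skipping TLS certificate verification; it applies process-wide, so use it only for local traffic like this. llama2 is a placeholder for whatever ollama list shows on your machine:

# Let Node accept the self-signed localhost certificate
export NODE_TLS_REJECT_UNAUTHORIZED=0

# Point aicommits at the nginx proxy and swap in a locally pulled model
sed -i 's/api\.openai\.com/localhost/' "$(npm root -g)/aicommits/dist/cli.mjs"
sed -i 's/gpt-3.5-turbo/llama2/' "$(npm root -g)/aicommits/dist/cli.mjs"

aicommits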