fatihusta / spago

spaGO is a beautiful and maintainable machine learning library written in Go, designed to support relevant neural network architectures in natural language processing tasks.



If you like the project, please ★ star this repository to show your support! 🤩

If you're interested in NLP-related functionalities, be sure to explore the Cybertron package!

Spago is a Machine Learning library written in pure Go designed to support relevant neural architectures in Natural Language Processing.

Spago is self-contained: it uses its own lightweight computational graph for both training and inference, and is designed to be easy to understand from start to finish.

It provides:

  • Automatic differentiation via dynamic define-by-run execution (see the sketch after this list)
  • Feed-forward layers (Linear, Highway, Convolution...)
  • Recurrent layers (LSTM, GRU, BiLSTM...)
  • Attention layers (Self-Attention, Multi-Head Attention...)
  • Gradient descent optimizers (Adam, RAdam, RMSProp, AdaGrad, SGD)
  • Gob-compatible neural models for serialization
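
To make the define-by-run flavor concrete, here is a minimal, illustrative sketch that builds a tiny squared-error loss and computes the gradient a gradient descent optimizer would consume. It uses only the ag and mat functions shown in the examples below; in practice the optimizer packages listed above (Adam, SGD, ...) take care of the update step for you.

package main

import (
	"fmt"
	"log"

	"github.com/nlpodyssey/spago/ag"
	"github.com/nlpodyssey/spago/mat"
)

func main() {
	// a parameter tracked for gradients and a fixed input
	w := mat.Scalar(0.3, mat.WithGrad(true))
	x := mat.Scalar(2.0)

	// forward pass, built dynamically at run time: pred = w * x
	pred := ag.Mul(w, x)

	// squared error against a target of 1.0: loss = (pred - 1)^2
	diff := ag.Add(pred, mat.Scalar(-1.0))
	loss := ag.Mul(diff, diff)

	// seed the output gradient and backpropagate through the graph
	loss.AccGrad(mat.Scalar(1.0))
	if err := ag.Backward(loss); err != nil {
		log.Fatalf("error during Backward(): %v", err)
	}

	// a gradient descent optimizer would now update w using w.Grad()
	fmt.Printf("loss = %v, dloss/dw = %v\n", loss.Value(), w.Grad())
}

Here loss = (0.3*2 - 1)^2 = 0.16 and dloss/dw = 2*(w*x - 1)*x = -1.6, up to floating-point rounding.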

Usage

Requirements: Go (the minimum supported version is declared in the go directive of go.mod).

Clone this repo or get the library:

go get -u github.com/nlpodyssey/spago

Getting Started

A good place to start is by looking at the implementation of built-in neural models, such as the LSTM.

Example 1

Here is an example of how to calculate the sum of two variables:

package main

import (
	"fmt"
	"log"

	"github.com/nlpodyssey/spago/ag"
	"github.com/nlpodyssey/spago/mat"
)

func main() {
	// define the type of the elements in the tensors
	type T = float32

	// create a new node of type variable with a scalar
	a := mat.Scalar(T(2.0), mat.WithGrad(true))

	// create another node of type variable with a scalar
	b := mat.Scalar(T(5.0), mat.WithGrad(true))

	// create an addition operator (the calculation is actually performed here)
	c := ag.Add(a, b)

	// print the result
	fmt.Printf("c = %v (float%d)\n", c.Value(), c.Value().Item().BitSize())

	// seed the output gradient (here 0.5) before backpropagation
	c.AccGrad(mat.Scalar(T(0.5)))

	if err := ag.Backward(c); err != nil {
		log.Fatalf("error during Backward(): %v", err)
	}

	fmt.Printf("ga = %v\n", a.Grad())
	fmt.Printf("gb = %v\n", b.Grad())
}

Output:

c = [7] (float32)
ga = [0.5]
gb = [0.5]
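
Since c = a + b, the partial derivative of c with respect to each operand is 1, so the seeded output gradient of 0.5 flows back unchanged to both a and b.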

Example 2

Here is a simple implementation of the perceptron formula:

package main

import (
	"fmt"
	
	. "github.com/nlpodyssey/spago/ag"
	"github.com/nlpodyssey/spago/mat"
)

func main() {
	// input, weight, and bias as scalar values
	x := mat.Scalar(-0.8)
	w := mat.Scalar(0.4)
	b := mat.Scalar(-0.2)

	// y = sigmoid(w*x + b)
	y := Sigmoid(Add(Mul(w, x), b))

	fmt.Printf("y = %0.3f\n", y.Value().Item())
}
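
With these inputs, y = sigmoid(0.4 * -0.8 - 0.2) = sigmoid(-0.52), so the expected output is approximately:

y = 0.373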

Contributing

If you think something is missing or could be improved, please open issues and pull requests.

To start contributing, check the Contributing Guidelines.

Contact

We highly encourage you to create an issue as it will contribute to the growth of the community. However, if you prefer to communicate with us privately, please feel free to email Matteo Grella with any questions or comments you may have.

Acknowledgments

Spago is part of the open-source NLP Odyssey initiative initiated by members of the EXOP team (now part of Crisis24).

Sponsors

See our Open Collective page if you too are interested in becoming a sponsor.

License

BSD 2-Clause "Simplified" License

