Neural network in Go

Overview

network

Package network is a simple implementation of an unbiased neural network (one without bias terms).

The networks created by this package can be trained with backpropagation and use a variety of activation functions.

For example, the following code trains a simple 2x3x1 neural network on the XOR function:

config := []network.LayerConf{
	network.LayerConf{Inputs: 2},
	network.LayerConf{Inputs: 3, Activation: activation.LeakyReLU{Leak: 0.01}},
	network.LayerConf{Inputs: 1, Activation: activation.LeakyReLU{Leak: 0.01}},
}
net, err := network.NewNetwork(config)
if err != nil {
	log.Fatalln(`can't create network`, err)
}

// Training samples
samples := map[[2]float64][]float64{
	[2]float64{0, 0}: []float64{0},
	[2]float64{0, 1}: []float64{1},
	[2]float64{1, 0}: []float64{1},
	[2]float64{1, 1}: []float64{0},
}

targetMSE := 0.005  // Desired Mean Squared Error
learningRate := 0.1 // Learning rate for the network, larger is faster, smaller is more accurate

var iter int

for iter = 0; iter < 1000; iter++ {
	meanSquaredError := float64(0)

	for input, target := range samples {
		input := input[:]
		output := net.Forward(input)
		error := net.Error(output, target)
		net.Backprop(input, error, learningRate)

		for _, e := range error {
			meanSquaredError += math.Pow(e, 2)
		}
	}

	meanSquaredError /= float64(len(samples))

	if meanSquaredError <= targetMSE {
		break
	}
}

log.Println(`Took`, iter, `iterations to reach target MSE`, targetMSE)

for input, target := range samples {
	log.Println(`Input:`, input, `Target:`, target, `Output:`, net.Forward(input[:]))
}

Usage

type LayerConf

type LayerConf struct {
	Inputs     int
	Activation activation.Activation
}

LayerConf represents the configuration of a single layer in the network.

type Network

type Network struct {
}

Network is a structure that represents an unbiased neural network.

func NewNetwork

func NewNetwork(layerConfigs []LayerConf) (*Network, error)

NewNetwork creates a new neural network with the desired layer configurations. The activation of the first (input) layer is ignored and must be set to nil.

The following creates a fully connected 2x3x1 network with sigmoid activation between all layers:

config := []LayerConf{
  LayerConf{Inputs: 2, Activation: nil},
  LayerConf{Inputs: 3, Activation: SigmoidActivation{}},
  LayerConf{Inputs: 1, Activation: SigmoidActivation{}},
}
net, err := network.NewNetwork(config)
if err != nil {
	log.Fatalln(`can't create network`, err)
}

func (*Network) Backprop

func (n *Network) Backprop(inputs, error []float64, learningRate float64)

Backprop performs one pass of backpropagation through the network for the given input, error, and learning rate.

Before Backprop is called, you need to do one forward pass for the input with Forward. A typical usage looks like this:

input := []float64{0, 1.0, 2.0}
target := []float64{0, 1}
output := net.Forward(input)
error := net.Error(output, target)
net.Backprop(input, error, 0.1) // Perform back propagation with learning rate 0.1

func (*Network) Error

func (n *Network) Error(outputs, targets []float64) []float64

Error computes the error of the given outputs when compared to the given targets.

This is intended to be used during training. See the documentation for Backprop for an example usage.
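
For instance, here is a minimal sketch of folding the per-output errors into a mean squared error, as the training loop in the overview does (net, input and target are assumed to come from the surrounding training code):

output := net.Forward(input)
errs := net.Error(output, target)

// Sum the squared per-output errors for this sample; dividing by the
// number of outputs gives the sample's mean squared error.
var sumSquared float64
for _, e := range errs {
	sumSquared += e * e
}
mse := sumSquared / float64(len(errs))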

func (*Network) Forward

func (n *Network) Forward(inputs []float64) []float64

Forward performs a forward pass through the network for the given inputs. The returned value is the output of the final (output) layer of neurons.
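
Once a network has been trained, Forward can be called on its own for inference. A short sketch, assuming net was trained on XOR as in the overview:

input := []float64{1, 0}
output := net.Forward(input)
log.Println(`Input:`, input, `Output:`, output) // for XOR this output should be close to 1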

func (*Network) Restore

func (n *Network) Restore(prefix string) error

Restore restores a network that was previously saved with Snapshot.

The result is undefined if the network architecture differs.
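
A minimal sketch, assuming the network is rebuilt with the same layer configuration that was snapshotted (the prefix "xor-net" is illustrative):

net, err := network.NewNetwork(config) // config must describe the same architecture as the saved network
if err != nil {
	log.Fatalln(`can't create network`, err)
}
if err := net.Restore(`xor-net`); err != nil {
	log.Fatalln(`can't restore network`, err)
}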

func (*Network) Snapshot

func (n *Network) Snapshot(prefix string) error

Snapshot stores a snapshot of all layers to files prefixed with prefix. The files are suffixed with the layer number and the string .layer.
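
A minimal usage sketch; the prefix "xor-net" is illustrative, and per the description above each layer is written to its own file named xor-net<layer number>.layer:

if err := net.Snapshot(`xor-net`); err != nil {
	log.Fatalln(`can't snapshot network`, err)
}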
