Artificial Neural Network

Overview

go-deep

Feed forward/backpropagation neural network implementation. Currently supports:

  • Activation functions: sigmoid, hyperbolic tangent (tanh), ReLU
  • Solvers: SGD, SGD with momentum/nesterov, Adam
  • Classification modes: regression, multi-class, multi-label, binary
  • Parallel batch training
  • Bias nodes

Networks are modeled as a set of neurons connected through synapses. There is no GPU computation, so don't use this for large-scale applications.

Todo:

  • Dropout
  • Batch normalization

Install

go get -u github.com/patrikeh/go-deep

Usage

Import the go-deep package

import (
	"fmt"
	deep "github.com/patrikeh/go-deep"
	"github.com/patrikeh/go-deep/training"
)

Define some data...

var data = training.Examples{
	{[]float64{2.7810836, 2.550537003}, []float64{0}},
	{[]float64{1.465489372, 2.362125076}, []float64{0}},
	{[]float64{3.396561688, 4.400293529}, []float64{0}},
	{[]float64{1.38807019, 1.850220317}, []float64{0}},
	{[]float64{7.627531214, 2.759262235}, []float64{1}},
	{[]float64{5.332441248, 2.088626775}, []float64{1}},
	{[]float64{6.922596716, 1.77106367}, []float64{1}},
	{[]float64{8.675418651, -0.242068655}, []float64{1}},
}
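Each example pairs an input vector with a target vector. The same record can also be written with named fields; this sketch assumes the element type is training.Example with fields Input and Response (Input matches the data[0].Input usage below, Response is an assumption to verify against the package docs):

var example = training.Example{
	Input:    []float64{2.7810836, 2.550537003}, // feature vector
	Response: []float64{0},                      // target class
}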

Create a network with two hidden layers of size 2 and 2 respectively:

n := deep.NewNeural(&deep.Config{
	/* Input dimensionality */
	Inputs: 2,
	/* Two hidden layers consisting of two neurons each, and a single output */
	Layout: []int{2, 2, 1},
	/* Activation functions: Sigmoid, Tanh, ReLU, Linear */
	Activation: deep.ActivationSigmoid,
	/* Determines output layer activation & loss function: 
	ModeRegression: linear outputs with MSE loss
	ModeMultiClass: softmax output with Cross Entropy loss
	ModeMultiLabel: sigmoid output with Cross Entropy loss
	ModeBinary: sigmoid output with binary CE loss */
	Mode: deep.ModeBinary,
	/* Weight initializers: {deep.NewNormal(μ, σ), deep.NewUniform(μ, σ)} */
	Weight: deep.NewNormal(1.0, 0.0),
	/* Apply bias */
	Bias: true,
})
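The same topology can be reconfigured for other tasks. Here is a minimal regression sketch; deep.ModeRegression is documented in the comment above, while deep.ActivationTanh is an assumed constant name following the deep.ActivationSigmoid pattern, so verify it against the package docs:

reg := deep.NewNeural(&deep.Config{
	Inputs:     2,
	Layout:     []int{2, 2, 1},
	Activation: deep.ActivationTanh, // assumed name; see package docs
	Mode:       deep.ModeRegression, // linear outputs with MSE loss
	Weight:     deep.NewNormal(1.0, 0.0),
	Bias:       true,
})
fmt.Println(reg.Predict([]float64{1.0, 2.0})) // untrained output, for illustration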

Train:

// params: learning rate, momentum, alpha decay, nesterov
optimizer := training.NewSGD(0.05, 0.1, 1e-6, true)
// params: optimizer, verbosity (print stats at every 50th iteration)
trainer := training.NewTrainer(optimizer, 50)

train, heldout := data.Split(0.5) // avoid shadowing the imported training package
trainer.Train(n, train, heldout, 1000) // training set, validation set, iterations

resulting in:

Epochs        Elapsed       Error         
---           ---           ---           
5             12.938µs      0.36438       
10            125.691µs     0.02261       
15            177.194µs     0.00404       
...     
1000          10.703839ms   0.00000       

Finally, make some predictions:

fmt.Println(data[0].Input, "=>", n.Predict(data[0].Input))
fmt.Println(data[5].Input, "=>", n.Predict(data[5].Input))
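Since ModeBinary yields a single sigmoid output, thresholding at 0.5 recovers a class label. A minimal accuracy check (a sketch; it assumes the target field of the Example struct is named Response):

correct := 0
for _, e := range data {
	predicted := 0.0
	if n.Predict(e.Input)[0] > 0.5 {
		predicted = 1.0
	}
	if predicted == e.Response[0] { // Response is an assumed field name
		correct++
	}
}
fmt.Printf("accuracy: %d/%d\n", correct, len(data))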

Alternatively, batch training can be performed in parallel:

// params: learning rate, beta1, beta2, epsilon
optimizer := training.NewAdam(0.001, 0.9, 0.999, 1e-8)
// params: optimizer, verbosity (print stats at every nth iteration), batch size, number of workers
trainer := training.NewBatchTrainer(optimizer, 1, 200, 4)

train, heldout := data.Split(0.75)
trainer.Train(n, train, heldout, 1000) // training set, validation set, iterations
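Note that with this eight-example toy set, a batch size of 200 simply means each iteration processes the full dataset; presumably the final parameter controls how many worker goroutines each batch is split across, so larger batch sizes and worker counts only pay off on larger datasets.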

Examples

See training/trainer_test.go for a variety of toy examples of regression, multi-class classification, binary classification, etc.

See examples/ for more realistic examples:

Dataset   Topology   Epochs   Accuracy
wines     [5 5]      10000    ~98%
mnist     [50]       25       ~97%
Comments
  • Nesterov flag

    Why does the Nesterov flag not affect the result of the calculations? From /training/solver.go:

    // Update returns the update for a given weight
    func (o *SGD) Update(value, gradient float64, iteration, idx int) float64 {
    	lr := o.lr / (1 + o.decay*float64(iteration))
    
    	o.moments[idx] = o.momentum*o.moments[idx] - lr*gradient
    
    	if o.nesterov {
    		o.moments[idx] = o.momentum*o.moments[idx] - lr*gradient
    	}
    
    	return o.moments[idx]
    }
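
    The branch repeats the same assignment, so it mutates the stored moment a second time rather than computing a distinct look-ahead update. A conventional Nesterov formulation (a sketch, not the maintainer's actual fix) applies the look-ahead only to the returned value:

    func (o *SGD) Update(value, gradient float64, iteration, idx int) float64 {
    	lr := o.lr / (1 + o.decay*float64(iteration))
    
    	o.moments[idx] = o.momentum*o.moments[idx] - lr*gradient
    
    	if o.nesterov {
    		// look-ahead: momentum applied on top of the fresh moment,
    		// without overwriting the stored buffer
    		return o.momentum*o.moments[idx] - lr*gradient
    	}
    
    	return o.moments[idx]
    }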
    
    opened by rilshok 3
  • Using interfaces for activation and loss functions.

    Currently there are a few activation/loss functions provided in go-deep. While internally these functions follow an interface, it is not fully exposed to clients of the library. For example, if I wanted to experiment with a new activation function, I could not do so without modifying the code in go-deep.

    If we expose the ability for clients of this library to provide their own activation and loss functions, via types that implement the Differentiable and Loss interfaces, I think the flexibility of this library will be greatly increased.

    If this is something that sounds reasonable, I would be more than happy to take a crack at implementing this and submitting a merge request.

    opened by NDari 3
  • Save Neural struct to disk

    Hello,

    I am trying to save the Neural structure to disk using encoding/gob, but it does not seem to work. Am I doing something wrong, or should I use something else?

    Thank you!

    opened by cepalle 2
  • Support printer with interface

    It would be great to support passing an interface for the printer that the batch trainer uses, in order to implement custom stopping based on loss. Loss would then need to be part of the interface signature in numeric form.

    opened by yonderblue 2
  • Can we add user-defined loss functions to our model?

    Is there any way to use user-defined loss functions, so that the model can be trained on the loss they compute? Or am I missing something?

    opened by susanth19 1
  • ReLU is not correct.

    Hi!

    There is a code in the activation.go:

    func (a ReLU) F(x float64) float64 { return math.Max(x, 0) }
    
    func (a ReLU) Df(y float64) float64 {
    	if y > 0 {
    		return 1
    	}
    	return 0
    }
    

    This is not correct! The right option:

    func (a ReLU) F(x float64) float64 { return math.Max(x, 0) }
    
    func (a ReLU) Df(y float64) float64 {
    	if y < 0 {
    		return 1
    	}
    	return 0
    }
    

    For example: https://github.com/FluxML/NNlib.jl/blob/master/src/activations.jl
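
    For what it's worth, the textbook ReLU derivative is 1 for positive arguments and 0 otherwise, which is what the original Df implements; the proposed variant inverts the condition.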

    Q: Do you plan to develop the library further? Thanks!

    opened by michail-vestnik 1
  • Train history feature

    I'm using the library for a project, and I'm finding out how useful a training-history feature would be (as found in any state-of-the-art library in Python). Is there any plan to implement it?

    In any case, I'm starting a new one.

    opened by ave-nabulio 1
  • Example showing errors

    Hi, the example in the readme is giving errors:

    // params: learning rate, momentum, alpha decay, nesterov
    optimizer := training.NewSGD(0.05, 0.1, 1e-6, true)
    // params: optimizer, verbosity (print stats at every 50th iteration)
    trainer := training.NewTrainer(optimizer, 50)

    cannot use optimizer (type *SGD) as type Solver

    opened by domgolonka 1
  • Question about sparse networks

    Is connecting a sparse network similar to what is being done in NewNeural() supposed to work? I noticed the batch trainer uses NewNeural() internally.

    opened by yonderblue 0
  • Feature Request: Add Mish activation

    Mish is a novel activation function proposed in this paper. It has shown promising results so far and has been adopted in several packages.

    All benchmarks, analysis, and links to official package implementations can be found in this repository.

    It would be nice to have Mish as an option within the activation function group.

    [Figure: comparison of Mish with other conventional activation functions in a SEResNet-50 on CIFAR-10]

    opened by digantamisra98 0
  • feat: Add LeakyReLU activation function

    I added another activation function, LeakyReLU. This can increase numerical efficiency in cases where you have flat cost functions.

    Another implementation choice could be to add a slope parameter to ReLU, defaulting to 0, and then use that with other values for LeakyReLU.
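
    A minimal sketch of what such an activation could look like, following the F/Df shape of the ReLU code quoted earlier (the Alpha field and its default are illustrative, not the PR's actual API):

    type LeakyReLU struct{ Alpha float64 } // e.g. Alpha = 0.01
    
    func (a LeakyReLU) F(x float64) float64 {
    	if x > 0 {
    		return x
    	}
    	return a.Alpha * x
    }
    
    func (a LeakyReLU) Df(y float64) float64 {
    	if y > 0 {
    		return 1
    	}
    	return a.Alpha
    }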

    opened by oscar-defelice 0