Go binding for TensorFlow Lite

Overview

go-tflite

Go binding for TensorFlow Lite

Usage

package main

import (
	"log"
	"math"

	"github.com/mattn/go-tflite"
)

func main() {
	model := tflite.NewModelFromFile("sin_model.tflite")
	if model == nil {
		log.Fatal("cannot load model")
	}
	defer model.Delete()

	options := tflite.NewInterpreterOptions()
	defer options.Delete()

	interpreter := tflite.NewInterpreter(model, options)
	defer interpreter.Delete()

	// Allocate tensor buffers; AllocateTensors returns a Status (see the note below).
	interpreter.AllocateTensors()

	// Write the input, run inference, and read the output.
	v := float64(1.2) * math.Pi / 180.0
	input := interpreter.GetInputTensor(0)
	input.Float32s()[0] = float32(v)
	interpreter.Invoke()
	got := float64(interpreter.GetOutputTensor(0).Float32s()[0])
	log.Printf("sin(%f) = %f", v, got)
}
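
AllocateTensors and Invoke each return a Status value that the example above ignores. A minimal sketch of checking it, assuming the package exposes an OK status constant (as the _example programs do), looks like this:

// Assumes tflite.OK is the success Status value; adjust to the actual constant if it differs.
if status := interpreter.AllocateTensors(); status != tflite.OK {
	log.Fatal("allocate failed")
}
if status := interpreter.Invoke(); status != tflite.OK {
	log.Fatal("invoke failed")
}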

See the _example directory for more examples.

Requirements

  • TensorFlow Lite - This release requires 2.2.0-rc3

TensorFlow Installation

You must install the TensorFlow Lite C API. Assuming the TensorFlow source tree is under /source/directory/tensorflow:

$ cd /source/directory/tensorflow
$ bazel build --config opt --config monolithic tensorflow:libtensorflow_c.so

Or, to compile just the TensorFlow Lite libraries:

$ cd /some/path/tensorflow
$ bazel build --config opt --config monolithic //tensorflow/lite:libtensorflowlite.so
$ bazel build --config opt --config monolithic //tensorflow/lite/c:libtensorflowlite_c.so

For cgo to find the headers, you must set the CGO_CFLAGS environment variable to point at the TensorFlow source tree. If your libraries are not installed in a standard location, you must also give the linker the path to the shared libraries with the CGO_LDFLAGS environment variable. At run time the dynamic loader must also be able to find libtensorflowlite_c.so, so you may need to copy it into a standard library path or add its directory to LD_LIBRARY_PATH.

$ export CGO_CFLAGS=-I/source/directory/tensorflow
$ export CGO_LDFLAGS=-L/path/to/tensorflow/libraries

If you don't love bazel, you can try Makefile.tflite. Put this file in tensorflow/lite/c as Makefile and run make. Sorry, this has not been tested on Linux or Mac.

Then run go build on some of the examples.

Edge TPU

To be able to compile and use the EdgeTPU delegate, you need to install the libraries from here: https://github.com/google-coral/edgetpu

There is also a deb package here: https://coral.withgoogle.com/docs/accelerator/get-started/#1-install-the-edge-tpu-runtime

The libraries should be installed in a system-wide library path such as /usr/local/lib. The include files should be installed somewhere that is accessible from your CGO include path.

For x86:

cd /tmp && git clone https://github.com/google-coral/edgetpu.git && \
cp edgetpu/libedgetpu/direct/k8/libedgetpu.so.1.0 /usr/local/lib/libedgetpu.so.1.0 && \
ln -rs /usr/local/lib/libedgetpu.so.1.0 /usr/local/lib/libedgetpu.so.1 && \
ln -rs /usr/local/lib/libedgetpu.so.1.0 /usr/local/lib/libedgetpu.so && \
mkdir -p /usr/local/include/libedgetpu && \
cp edgetpu/libedgetpu/edgetpu.h /usr/local/include/edgetpu.h && \
cp edgetpu/libedgetpu/edgetpu_c.h /usr/local/include/edgetpu_c.h && \
rm -Rf edgetpu
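
With the library and headers installed, an Edge TPU delegate is attached to the interpreter options the same way as any other delegate, via options.AddDelegate. The sketch below is illustrative only: the delegates/edgetpu import path, the edgetpu.New constructor and its DelegateOptions, and the model filename are assumptions rather than confirmed API (the xnnpack delegate in this repository follows the same New(DelegateOptions{}) pattern); check the delegates directory and _example for the actual names.

// Sketch only: the edgetpu package path, constructor, and options are assumptions.
package main

import (
	"log"

	"github.com/mattn/go-tflite"
	"github.com/mattn/go-tflite/delegates/edgetpu" // assumed package path
)

func main() {
	model := tflite.NewModelFromFile("model_edgetpu.tflite") // illustrative filename
	if model == nil {
		log.Fatal("cannot load model")
	}
	defer model.Delete()

	options := tflite.NewInterpreterOptions()
	defer options.Delete()

	// Hypothetical constructor; consult the delegates directory for the real one.
	d := edgetpu.New(edgetpu.DelegateOptions{})
	if d == nil {
		log.Fatal("cannot create Edge TPU delegate")
	}
	// AddDelegate is the documented way to attach any delegate to the options.
	options.AddDelegate(d)

	interpreter := tflite.NewInterpreter(model, options)
	defer interpreter.Delete()
	interpreter.AllocateTensors()
	// ... fill input tensors and call interpreter.Invoke() as in the Usage example.
}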

License

MIT

Author

Yasuhiro Matsumoto (a.k.a. mattn)

Issues
  • Raspberry Pi 4 + edgetpu "Internal: Unsupported data type in custom op handler: 0"

    Hello

    I tried to run the example with the edgetpu, but it gives me:

    EdgeTPU Version: BuildLabel(COMPILER=6.3.0 20170516,DATE=redacted,TIME=redacted,CL_NUMBER=291256449), RuntimeVersion(13)
    Internal: Unsupported data type in custom op handler: 0
    Node number -1095957936 (8ɶ<­¾üÍ϶D) @2.
    
    2020/06/15 23:44:53 allocate failed
    

    What would be the cause?

    opened by KyungWonPark 22
  • Cannot open shared object file: No such file or directory

    Hey, I wanted to try your promising library so I went through all the steps to produce my libtensorflowlite_c.so :smiley:.

    I then copy/pasted your label_image example and ran it using go run main.go producing this error:

    /tmp/go-build075675325/b001/exe/main: error while loading shared libraries: libtensorflowlite_c.so: cannot open shared object file: No such file or directory
    exit status 127
    

    I understand my libtensorflowlite_c.so may not be placed in the correct location. However, in https://github.com/mattn/go-tflite/blob/master/tflite.go I see...

    #cgo LDFLAGS: -L${SRCDIR}/../../tensorflow/tensorflow/tensorflow/lite/experimental/c -ltensorflowlite_c
    

    ...which seems to refer to the actual path my file sits in. So I don't really get why it cannot find it. Did I miss something? Is there another step necessary to run go-tflite apart from placing the C library in the tensorflow repo?

    P.S: just to make it clear, here is an extract of my tree:

    $GOPATH/src/
        |_ github.com/
            |_ mattn/
            |       |_ go-tflite/
            |       |_ go-pointer/
            |_ Jonarod/
            |       |_ tests-labels-tflite/
            |           |_ main.go
            |_ tensorflow/
                    |_ tensorflow/
                        |_ tensorflow/
                            |_ lite/
                                |_ experimental / c / libtensorflowlite_c.so
    
    opened by Jonarod 6
  • Can you add a license?

    This is great work. I actually got this to work with r14 and have been playing around with it.

    I would like to open source my code, but would feel better if I either got your permission to release my code (which includes much of yours) or you added a license (preferably BSD). Feel free to copy this one and put your name in its place: https://github.com/snowzach/gorestapi/blob/master/LICENSE.md

    I have added support for the Coral edgetpu hardware accelerator.

    Thanks!

    opened by snowzach 5
  • ARM support

    Hey, I'm curious to know if go-tflite has been tried on devices like the Raspberry Pi?

    I'm currently rebuilding the libtensorflow_c.so file for ARM devices to use on my Raspberry Pi; if this has been done before, I would love to hear how that turned out.

    If not, once I make any headway, I can share what I did to get things right. Let me know :)

    opened by rushabhnagda11 5
  • Add cgo flags for android

    The pull request "remove -lrt for android build" #37 solved some of my issues. I used to change the cgo LDFLAGS every time I switched between testing amd64 / arm32 / arm64. The two build constraints cgo android and cgo linux,!android work like a charm!

    Commit changes

    • Adding two other library flags, -lm -llog, for android; these are needed when building an Android AAR library using gomobile bind.
    • Using -ltensorflowlite_c instead of -ltensorflowlite-delegate_xnnpack -lXNNPACK in delegates/xnnpack/xnnpack.go. I'm not sure if I'm right about this. In my experience, compiling the lib using
      $ bazel build --config opt --config monolithic  --define tflite_with_xnnpack=true //tensorflow/lite:libtensorflowlite.so
      $ bazel build --config opt --config monolithic  --define tflite_with_xnnpack=true //tensorflow/lite/c:libtensorflowlite_c.so
      

      will include xnnpack in libtensorflowlite_c.so. This seems an easier way to use xnnpack.

    • In my basic tests, -ldl -lrt are not required for either linux or android, but maybe that is specific to my testing scenarios.

    Testing Android flags

    • Project folder structure
      $ tree testTflite/
      # testTflite/
      # ├── main.go
      # └── runTest
      #     └── runTest.go
      
    • runTest/runTest.go
      package runTest
      
      import (
      	"fmt"
      	"github.com/mattn/go-tflite"
      	"github.com/mattn/go-tflite/delegates/xnnpack"
      )
      
      func Run(model_path string) {
      	model := tflite.NewModelFromFile(model_path)
      	options := tflite.NewInterpreterOptions()
      	options.AddDelegate(xnnpack.New(xnnpack.DelegateOptions{NumThreads: 2}))
      	defer options.Delete()
      
      	interp := tflite.NewInterpreter(model, options)
      	status := interp.AllocateTensors()
      	input := interp.GetInputTensor(0)
      	in := input.Float32s()
      	fmt.Println(status, len(in))
      	status = interp.Invoke()
      	output := interp.GetOutputTensor(0).Float32s()
      	fmt.Println(output)
      }
      
    • main.go
      package main
      
      import "testTflite/runTest"
      
      func main() {
      	runTest.Run("model.tflite")
      }
      
    • Test
      cd testTflite
      go mod init testTflite
      # Use my custom go-tflite
      go mod edit -replace=github.com/mattn/go-tflite=$HOME/workspace/go-tflite
      go mod tidy
      
      # Build test for amd64
      export CGO_LDFLAGS=-L$HOME/workspace/tensorflow/bazel-bin/tensorflow/lite/c
      export CGO_CFLAGS=-I$HOME/workspace/tensorflow/
      go build main.go
      
      # Build test for arm64
      export CGO_LDFLAGS="-L$HOME/workspace/tensorflow.arm64/bazel-bin/tensorflow/lite/c"
      export CGO_CFLAGS="-I$HOME/workspace/tensorflow.arm64/"
      SDK_HOME="$HOME/Android/Sdk/ndk/21.0.6113669/toolchains/llvm/prebuilt/linux-x86_64"
      ANDROID_CC="$SDK_HOME/bin/aarch64-linux-android29-clang -Wl,-rpath-link,$SDK_HOME/sysroot/usr/lib/aarch64-linux-android/29"
      ANDROID_CXX="$SDK_HOME/bin/aarch64-linux-android29-clang++ -Wl,-rpath-link,$SDK_HOME/sysroot/usr/lib/aarch64-linux-android/29"
      ANDROID_ARCH="arm64"
      CGO_ENABLED=1 GOOS=android GOARCH=$ANDROID_ARCH CC=$ANDROID_CC CXX=$ANDROID_CXX go build main.go
      
      # gomobile bind test for arm64, will throw error without `-llog -lm`
      go get golang.org/x/mobile/bind
      gomobile bind -v -o test.aar -target=android/arm64 testTflite/runTest
      
    opened by leondgarse 5
  • How does one install Tensorflow Lite C API

    Hi, I must be pretty ignorant, but I cannot figure out how to install the TensorFlow Lite C API. I just got my Coral TPU, and the sales team from Google told me that they worked with you @mattn to allow TensorFlow Lite to work in Go. But everywhere on the TensorFlow website, it shows installation with pip in Python. I got bazel working, but I don't know where /source/directory/tensorflow is supposed to be. I cloned https://github.com/tensorflow/tensorflow and checked out v2.2.0-rc3, but no dice.

    Could you enlighten me for the initial setup?

    My goal is to develop in Go on an Ubuntu Intel 64 machine and then, once everything is running, install everything on a Raspberry Pi and compile it over there. I'm saying this because the only instructions for building TensorFlow Lite for C are for ARM-based systems on their website: https://www.tensorflow.org/lite/guide/build_arm64

    opened by willmendil 4
  • Can I build tflite cpp using "official" scripts?

    I found that Tensorflow provides the build scripts for the tflite cpp lib in: https://www.tensorflow.org/lite/guide/build_rpi

    So I'm wondering if I can use it instead? Or if not, can I simply modify something to make it compatible? Because I got the following error running the example:

    [email protected]:~/go/src/github.com/mattn/go-tflite/_example/ssd $ go run main.go 
    # github.com/mattn/go-tflite
    ../../tflite_experimental.go: In function 'writeToTensorAsVector':
    ../../tflite_experimental.go:64:13: error: 'TfLiteTensor {aka struct <anonymous>}' has no member named 'quantization'; did you mean 'allocation'?
       if (tensor->quantization.type == kTfLiteAffineQuantization) {
                 ^~
    ../../tflite_experimental.go:64:36: error: 'kTfLiteAffineQuantization' undeclared (first use in this function)
       if (tensor->quantization.type == kTfLiteAffineQuantization) {
                                        ^~~~~~~~~~~~~~~~~~~~~~~~~
    ../../tflite_experimental.go:64:36: note: each undeclared identifier is reported only once for each function it appears in
    ../../tflite_experimental.go:65:5: error: unknown type name 'TfLiteAffineQuantization'
         TfLiteAffineQuantization* q_params =
         ^~~~~~~~~~~~~~~~~~~~~~~~
    ../../tflite_experimental.go:66:10: error: 'TfLiteAffineQuantization' undeclared (first use in this function)
             (TfLiteAffineQuantization*)(tensor->quantization.params);
              ^~~~~~~~~~~~~~~~~~~~~~~~
    ../../tflite_experimental.go:66:35: error: expected expression before ')' token
             (TfLiteAffineQuantization*)(tensor->quantization.params);
                                       ^
    ../../tflite_experimental.go:66:43: error: 'TfLiteTensor {aka struct <anonymous>}' has no member named 'quantization'; did you mean 'allocation'?
             (TfLiteAffineQuantization*)(tensor->quantization.params);
                                               ^~
    ../../tflite_experimental.go:67:17: error: request for member 'scale' in something not a structure or union
         if (q_params->scale) {
                     ^~
    ../../tflite_experimental.go:68:20: error: request for member 'scale' in something not a structure or union
           free(q_params->scale);
                        ^~
    ../../tflite_experimental.go:69:15: error: request for member 'scale' in something not a structure or union
           q_params->scale = NULL;
                   ^~
    ../../tflite_experimental.go:71:17: error: request for member 'zero_point' in something not a structure or union
         if (q_params->zero_point) {
                     ^~
    ../../tflite_experimental.go:72:20: error: request for member 'zero_point' in something not a structure or union
           free(q_params->zero_point);
                        ^~
    ../../tflite_experimental.go:73:15: error: request for member 'zero_point' in something not a structure or union
           q_params->zero_point = NULL;
                   ^~
    ../../tflite_experimental.go:82:9: error: 'TfLiteTensor {aka struct <anonymous>}' has no member named 'quantization'; did you mean 'allocation'?
       tensor->quantization.type = kTfLiteNoQuantization;
             ^~
    ../../tflite_experimental.go:82:31: error: 'kTfLiteNoQuantization' undeclared (first use in this function)
       tensor->quantization.type = kTfLiteNoQuantization;
                                   ^~~~~~~~~~~~~~~~~~~~~
    ../../tflite_experimental.go:83:9: error: 'TfLiteTensor {aka struct <anonymous>}' has no member named 'quantization'; did you mean 'allocation'?
       tensor->quantization.params = NULL;
             ^~
    

    Sorry, I have no idea about this C stuff! Could you please give me some advice?

    opened by imdatngo 3
  • Guide for Raspberry Pi? (cannot find -ltensorflow-lite)

    Hello

    Could you write a tutorial for setting this up on a Raspberry Pi?

    I'm trying to build it but I'm stuck at

    /usr/bin/ld: cannot find -ltensorflow-lite
    

    Thank you

    opened by KyungWonPark 3
  • GPU delegate

    Added a GPU delegate with a high-level API. Simply add

    delegate := tflite.NewGPUDelegate(nil)
    if delegate == nil {
    	log.Fatal("cannot create delegate")
    }
    defer delegate.Delete()
    

    and

    options := tflite.NewInterpreterOptions()
    ...
    options.AddDelegate(delegate)
    

    However, at the moment of writing, my GPU delegate throws

    INFO: Created TensorFlow Lite delegate for GPU. Next operations are not supported by GPU delegate: ADD: AVERAGE_POOL_2D: CONV_2D: DEPTHWISE_CONV_2D: RESHAPE: First 0 operations will run on the GPU, and the remaining 65 on the CPU.

    for a simple mobilenet_v2 model

    opened by nibeh 3
  • added new delegate, which could load a tflite delegate from a library, e.g. libedgetpu

    Tested and working with the Coral edgetpu USB accelerator.

    opened by nibeh 2
  • Any thoughts about how to possibly use tflite-micro?

    Hello, there

    I was looking at https://github.com/tensorflow/tflite-micro and wondering if it would be possible to use go-tflite with that and TinyGo.

    Any thoughts on this would be appreciated. Thank you.

    opened by deadprogram 0
  • use static library

    TensorFlow provides scripts to build static TF Lite libraries (the tflite make scripts).

    Are there plans to support .a files in addition to the .so files built with bazel?

    opened by rschwarz-sotec 1
  • Include path changed for tensorflow files

    You may or may not have seen the git issue where I was trying to get the edgetpu to work for DOODS.

    With the tensorflow commit that supports the edgetpu, they changed the include path from tensorflow/lite/experimental/c/c_api.h to tensorflow/lite/c/c_api.h.

    Not sure how you want to handle this. I was trying to base my DOODS code off of your repo.

    opened by snowzach 2
  • dockerfiles for cpu and gpu version

    Hi,

    Hope you are all well!

    It would be awesome to have a Dockerfile for CPU and GPU builds of go-tflite, for quick testing purposes.

    Do you think it is possible? Can we add it to the todo list?

    Cheers, X

    opened by x0rzkov 0
  • Compiling -ltensorflowlite_c_delegate_gpu

    Hello! Hope you are doing great!

    I found GPU delegates in the project, and they need the library tensorflowlite_c_delegate_gpu.

    Can you please explain how to compile it or where I can find it?

    Thanks!

    opened by scryptosan 0