Tensorflow + Go, the gopher way

Overview

tfgo: TensorFlow in Go


TensorFlow's Go bindings are hard to use: tfgo makes it easy!

No more problems like:

  • Scoping: each new node will have a new and unique name
  • Typing: attributes are automatically converted to a supported type instead of throwing errors at runtime

Also, it uses method chaining, making it possible to write pleasant Go code.

Dependencies

  1. The TensorFlow C library, version 2.4.0 (see the TensorFlow installation section below).
  2. The TensorFlow bindings github.com/galeone/tensorflow. To work correctly with TensorFlow 2.4 in Go, you have to use a fork I created with some fixes for the Go bindings. The bindings can be too large for the Go module proxy, so you may want to bypass the proxy by executing go env -w GONOSUMDB="github.com/galeone/tensorflow" to pull the code directly with the system-installed git. This changes nothing in the user interface: you can use Go modules as usual.

Installation

go get github.com/galeone/tfgo

Getting started

The core data structure of TensorFlow's Go bindings is the op.Scope struct. tfgo allows you to create new *op.Scope values in a way that solves the scoping issue mentioned above.
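
For comparison, here is a minimal sketch of what the raw bindings require (it uses only the op package from the fork above, not tfgo): every node added to a scope needs a unique name, so you end up creating sub-scopes by hand.

package main

import (
        "fmt"

        tf "github.com/galeone/tensorflow/tensorflow/go"
        "github.com/galeone/tensorflow/tensorflow/go/op"
)

func main() {
        s := op.NewScope()
        // With the raw bindings, two Const nodes added directly to the same scope
        // would both be named "Const" and graph construction would fail with a
        // duplicate-node-name error, so we create a sub-scope per node by hand.
        a := op.Const(s.SubScope("a"), int32(1))
        b := op.Const(s.SubScope("b"), int32(2))
        sum := op.Add(s.SubScope("sum"), a, b)

        graph, err := s.Finalize()
        if err != nil {
                panic(err)
        }
        sess, err := tf.NewSession(graph, nil)
        if err != nil {
                panic(err)
        }
        defer sess.Close()

        out, err := sess.Run(nil, []tf.Output{sum}, nil)
        if err != nil {
                panic(err)
        }
        fmt.Println(out[0].Value()) // 3
}

tfgo takes care of this sub-scoping for you: each chained operation is placed in its own unique scope, so node names never collide.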

Since we're defining a graph, let's start from its root (an empty graph):

root := tg.NewRoot()

We can now place nodes into this graph and connect them. Let's say we want to multiply a matrix by a column vector and then add another column vector to the result.

Here's the complete source code.

package main

import (
        "fmt"
        tg "github.com/galeone/tfgo"
        tf "github.com/galeone/tensorflow/tensorflow/go"
)

func main() {
        root := tg.NewRoot()
        A := tg.NewTensor(root, tg.Const(root, [2][2]int32{{1, 2}, {-1, -2}}))
        x := tg.NewTensor(root, tg.Const(root, [2][1]int64{{10}, {100}}))
        b := tg.NewTensor(root, tg.Const(root, [2][1]int32{{-10}, {10}}))
        Y := A.MatMul(x.Output).Add(b.Output)
        // Please note that Y is just a pointer to A!

        // If we want to create a different node in the graph, we have to clone Y
        // or equivalently A
        Z := A.Clone()
        results := tg.Exec(root, []tf.Output{Y.Output, Z.Output}, nil, &tf.SessionOptions{})
        fmt.Println("Y: ", results[0].Value(), "Z: ", results[1].Value())
        fmt.Println("Y == A", Y == A) // ==> true
        fmt.Println("Z == A", Z == A) // ==> false
}

that produces

Y:  [[200] [-200]] Z:  [[200] [-200]]
Y == A true
Z == A false

The list of the available methods is available on GoDoc: http://godoc.org/github.com/galeone/tfgo

Computer Vision using data flow graph

TensorFlow is rich in methods for performing operations on images. tfgo provides the image package, which allows you to use the Go bindings to perform computer vision tasks in an elegant way.

For instance, it's possible to read an image, compute its directional derivatives along the horizontal and vertical directions, compute the gradient magnitude, and save it.

The code below does that, showing the different results achieved using correlation and convolution operations.

package main

import (
        tg "github.com/galeone/tfgo"
        "github.com/galeone/tfgo/image"
        "github.com/galeone/tfgo/image/filter"
        "github.com/galeone/tfgo/image/padding"
        tf "github.com/galeone/tensorflow/tensorflow/go"
        "os"
)

func main() {
        root := tg.NewRoot()
        grayImg := image.Read(root, "/home/pgaleone/airplane.png", 1)
        grayImg = grayImg.Scale(0, 255)

        // Edge detection using sobel filter: convolution
        Gx := grayImg.Clone().Convolve(filter.SobelX(root), image.Stride{X: 1, Y: 1}, padding.SAME)
        Gy := grayImg.Clone().Convolve(filter.SobelY(root), image.Stride{X: 1, Y: 1}, padding.SAME)
        convoluteEdges := image.NewImage(root.SubScope("edge"), Gx.Square().Add(Gy.Square().Value()).Sqrt().Value()).EncodeJPEG()

        Gx = grayImg.Clone().Correlate(filter.SobelX(root), image.Stride{X: 1, Y: 1}, padding.SAME)
        Gy = grayImg.Clone().Correlate(filter.SobelY(root), image.Stride{X: 1, Y: 1}, padding.SAME)
        correlateEdges := image.NewImage(root.SubScope("edge"), Gx.Square().Add(Gy.Square().Value()).Sqrt().Value()).EncodeJPEG()

        results := tg.Exec(root, []tf.Output{convoluteEdges, correlateEdges}, nil, &tf.SessionOptions{})

        file, _ := os.Create("convolved.png")
        file.WriteString(results[0].Value().(string))
        file.Close()

        file, _ = os.Create("correlated.png")
        file.WriteString(results[1].Value().(string))
        file.Close()
}

[Images: airplane.png (the input image), convolved.png (edges via convolution), correlated.png (edges via correlation)]

The list of the available methods is on GoDoc: http://godoc.org/github.com/galeone/tfgo/image

Train in Python, Serve in Go

TensorFlow 2 comes with many easy ways to export a computational graph (e.g. a Keras model, or a function decorated with @tf.function) to the SavedModel serialization format (the only one officially supported).


Using TensorFlow 2 (with Keras or tf.function) + tfgo, exporting a trained model (or a generic computational graph) and using it in Go is straightforward.

Just dig into the example to understand how to serve a trained model with tfgo.

Python code

import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.layers.Conv2D(
            8,
            (3, 3),
            strides=(2, 2),
            padding="valid",
            input_shape=(28, 28, 1),
            activation=tf.nn.relu,
            name="inputs",
        ),  # 14x14x8
        tf.keras.layers.Conv2D(
            16, (3, 3), strides=(2, 2), padding="valid", activation=tf.nn.relu
        ),  # 7x7x16
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, name="logits"),  # linear
    ]
)

tf.saved_model.save(model, "output/keras")

Go code

package main

import (
        "fmt"
        tg "github.com/galeone/tfgo"
        tf "github.com/galeone/tensorflow/tensorflow/go"
)

func main() {
        // A model exported with tf.saved_model.save()
        // automatically comes with the "serve" tag because the SavedModel
        // file format is designed for serving.
        // This tag contains the various exported functions. Among these, the
        // "serving_default" signature_def is always present. This signature def
        // works exactly like a TF 1.x graph: get the input tensor and the output tensor,
        // and use them as the placeholder to feed and the output to fetch, respectively.

        // To get info inside a SavedModel the best tool is saved_model_cli
        // that comes with the TensorFlow Python package.

        // e.g. saved_model_cli show --all --dir output/keras
        // gives, among the others, this info:

        // signature_def['serving_default']:
        // The given SavedModel SignatureDef contains the following input(s):
        //   inputs['inputs_input'] tensor_info:
        //       dtype: DT_FLOAT
        //       shape: (-1, 28, 28, 1)
        //       name: serving_default_inputs_input:0
        // The given SavedModel SignatureDef contains the following output(s):
        //   outputs['logits'] tensor_info:
        //       dtype: DT_FLOAT
        //       shape: (-1, 10)
        //       name: StatefulPartitionedCall:0
        // Method name is: tensorflow/serving/predict

        model := tg.LoadModel("test_models/output/keras", []string{"serve"}, nil)

        fakeInput, _ := tf.NewTensor([1][28][28][1]float32{})
        results := model.Exec([]tf.Output{
                model.Op("StatefulPartitionedCall", 0),
        }, map[tf.Output]*tf.Tensor{
                model.Op("serving_default_inputs_input", 0): fakeInput,
        })

        predictions := results[0]
        fmt.Println(predictions.Value())
}
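
Building on the example above, here is a short sketch of how you could wrap real pixel data into the expected [1][28][28][1] input and turn the returned logits into a class prediction. The helpers buildInput and argmax are illustrative names of mine, not part of tfgo, and the pixel data is a stand-in.

package main

import (
        "fmt"

        tf "github.com/galeone/tensorflow/tensorflow/go"
)

// buildInput wraps a flat slice of 28*28 grayscale pixels into the
// [1][28][28][1]float32 layout the exported model expects.
func buildInput(pixels []float32) (*tf.Tensor, error) {
        var input [1][28][28][1]float32
        for i, p := range pixels {
                input[0][i/28][i%28][0] = p
        }
        return tf.NewTensor(input)
}

// argmax returns the index of the largest logit, i.e. the predicted class.
func argmax(logits []float32) int {
        best := 0
        for i, v := range logits {
                if v > logits[best] {
                        best = i
                }
        }
        return best
}

func main() {
        // Stand-in pixel data; in a real program this would come from a decoded image.
        pixels := make([]float32, 28*28)
        input, err := buildInput(pixels)
        if err != nil {
                panic(err)
        }
        fmt.Println(input.Shape()) // [1 28 28 1]

        // Plugged into the example above, after model.Exec you would write something like:
        //   logits := results[0].Value().([][]float32)[0]
        //   fmt.Println("predicted class:", argmax(logits))
        fmt.Println("predicted class (dummy logits):", argmax([]float32{0.1, 2.3, -1.0}))
}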

Why?

Thinking about computation as a graph, and describing computations in this way, is, in one word, challenging.

Also, tfgo brings GPU computations to Go and allows you to write parallel code without worrying about the device that executes it (just place the graph on the device you desire: that's it!).

Contribute

I love contributions. Seriously. Having people who share your interests and want to face the same challenges is awesome.

If you'd like to contribute, just dig into the code and see what can be added or improved. Start a discussion by opening an issue and let's talk about it.

Just follow the same design I use in the image package ("override" the same Tensor methods, document the methods, test your changes, ...).

There are a lot of packages that can be added, like the image package. Feel free to work on a brand new package: I'd love to see this kind of contribution!

TensorFlow installation

Manual

Download and install the C library from https://www.tensorflow.org/install/lang_c

curl -L "https://storage.googleapis.com/tensorflow/libtensorflow/libtensorflow-cpu-linux-x86_64-2.4.0.tar.gz" | sudo tar -C /usr/local -xz
sudo ldconfig

Docker

docker pull tensorflow/tensorflow:2.4.0

Or you can use your system package manager.
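
Whichever installation method you pick, a quick sanity check (just a sketch, not part of tfgo) is a tiny program that links against libtensorflow and prints its version:

package main

import (
        "fmt"

        tf "github.com/galeone/tensorflow/tensorflow/go"
)

func main() {
        // tf.Version() reports the version of the libtensorflow C library
        // the bindings are linked against, e.g. "2.4.0".
        fmt.Println("libtensorflow version:", tf.Version())
}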

Issues
  • Embedding a Python interpreter - Tensorflow Python API support


    I'm looking for collaborators to make tfgo a Tensorflow Go API that is 1:1 compatible with the Python API.

    A brief recap:

    What is tfgo

    tfgo is a wrapper built around the Tensorflow Go bindings that allows:

    • defining graphs using method chaining, in order to build graphs exactly as everyone is used to thinking about them (as a flow that goes from the input to the output)
    • importing a SavedModel and executing it
    • using some high-level operations built upon the low-level Go bindings (e.g. tfgo/image).

    It can successfully be used to run input pre-processing + inference, or to define dataflow graphs directly in Go. See the README.

    The problem:

    tfgo is not a 1:1 mapping with the Python API, since the tensorflow/op package only exposes some of the C++ primitives and the Tensorflow Python API is way more complex and complete than the C++ API.

    People are familiar with using Tensorflow in Python and, moreover, there are certain objects like the optimizers, the Variables, the Keras layers, the Keras losses, and so on that are defined in Python only.

    Therefore tfgo has limited utility. Several people have asked me to add tf.Variable support (the last one: https://github.com/galeone/tfgo/issues/13 ) since they would be interested in training models directly in Go and, more in general, in having a 1:1 mapping between the Python functionalities and the tfgo functionalities.

    The idea

    tfgo can execute SavedModel objects, and right now we can use Python to define a computational graph even to train a model, exporting a SavedModel that describes the whole training phase, and execute it in tfgo (just define the model + the optimizer + the training operation and export this graph).

    Wouldn't it be cool to make it possible to define the graph from a Go program, but with the complete support of the Python API?

    The idea is to embed a Python interpreter (ideally Python 3, Python 2 only if there are technological constraints) inside tfgo, create a Go API that correctly communicates to the interpreter what the user wants and builds a SavedModel. After that, tfgo reads and executes it (exactly like the tf.Session in Python).

    e.g. if the user asks for something the Go package can do, just build the Graph using the Go bindings/tfgo - otherwise, use the Python interpreter.

    Tensorflow 2.0

    The Tensorflow Python API contains the @tf.function decorator. Using this decorator, it is possible to convert every correctly defined function (one that uses the tf. primitives for everything) into its graph representation.

    This could help a lot in implementing the idea since we can just:

    1. build the Python function body from Go
    2. decorate it using @tf.function
    3. export its graph representation in a SavedModel
    4. load it in tfgo for execution (sketched below)

    Just as we were used to working in Tensorflow < 2, with the graph-definition + session-execution paradigm.
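
    To make step 4 concrete, here is a rough Go-side sketch. The model path, the input names x/y, and the output op name are all hypothetical; they depend entirely on how the @tf.function training step was exported.

    package main

    import (
        "fmt"

        tf "github.com/galeone/tensorflow/tensorflow/go"
        tg "github.com/galeone/tfgo"
    )

    func main() {
        // Hypothetical SavedModel that exports a @tf.function
        // train_step(x, y) -> loss, saved under output/trainable.
        model := tg.LoadModel("output/trainable", []string{"serve"}, nil)

        x, _ := tf.NewTensor([1][28][28][1]float32{})
        y, _ := tf.NewTensor([1]int64{3})

        for step := 0; step < 10; step++ {
            results := model.Exec([]tf.Output{
                model.Op("StatefulPartitionedCall", 0), // the exported loss (name is a guess)
            }, map[tf.Output]*tf.Tensor{
                model.Op("serving_default_x", 0): x, // hypothetical input names
                model.Op("serving_default_y", 0): y,
            })
            fmt.Println("step", step, "loss", results[0].Value())
        }
    }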

    Thoughts? Suggestions?

    help wanted 
    opened by galeone 34
  • How to use the model trained by tf.estimator


    For example, when I used tf.estimator to train a model, input_fn was used in training and feature_column was specified, so there are no names for the input and output nodes. Could you help me?

    est = tf.estimator.LinearClassifier(feature_columns=one_hot_feature_columns + crossed_columns,
                                                n_classes=2,
                                                model_dir=model_path,
                                                optimizer=tf.train.FtrlOptimizer(
                                                    learning_rate=0.01,
                                                    l2_regularization_strength=0.02)
                                                )
    
    opened by atlantiswqq 23
  • How to create models in golang?


    Is there a feasible way to create and train new models within golang? Either by tfgo, or other tools?

    opened by arnwas 18
  • Support for Keras/TF_2.0?


    Hello,

    I am working in Python and using TensorFlow's Keras API ( e.g. tf.keras.models.Sequential). Would I be able to export it (export_savedmodel) and then load it into Go with TFGO? Or do I need to stick to "pure" TensorFlow expressions?

    Also, if I use TensorFlow 2.0, will it work correctly with TFGO?

    Thanks!

    opened by ej-white 12
  • Slow inference


    First, thanks so much for working on this library! I'm using it for serving a couple of models with good success.

    I ran into something that surprised me today; it's not necessarily an issue with this library, but wanted to get your thoughts. I'm comparing execution of an SSD detector via a python script and a go executable. Just comparing the elapsed wall clock time with unix time command, the go executable is quite a bit faster -- around 4 seconds vs around 13 seconds for the python script. However, I wanted finer-grained timing data, so I measured just the model.Exec call in go vs the model.predict call in python, and I found that the go version is roughly twice as slow (3 sec vs 1.7 sec). My guess is that for a single run, parsing all the python code accounts for all of the extra time.

    Both of these are using the same model, although the python version is defining the model in code and loading weights from an hdf5 file, while the go version is loading model + weights from the SavedModel file (.pb format) -- not sure if that would make any difference.

    Do you have any ideas about why the graph execution would be slower under go, or how I could speed it up?

    Thanks!

    opened by rubysolo 11
  • Update both repos to TF2.4.1


    It seems tfgo and the fork of the official tensorflow repo are at 2.4.0-rc. @galeone, are you going to update to 2.4.1?

    opened by Dilshat 10
  • How to use TensorFlow's Universal Sentence Encoder


    How would I load in the universal-sentence-encoder-large embedding model?

    In Python

    embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
    embeddings = embed([
        "The quick brown fox jumps over the lazy dog.",
        "I am a sentence for which I would like to get its embedding"])
    
    print session.run(embeddings)
    

    In GO I've tried

    model, err := tf.LoadSavedModel("universal-sentence-encoder-large", []string{"serve"}, nil)
    
    if err != nil {
        fmt.Printf("Error loading saved model: %s\n", err.Error())
        return
    }
    

    but the program panics when trying to load the model 😕

    when I use the saved_model_cli I get empty results

    The given SavedModel contains the following tag-sets:
    
    

    How would I use the model? The directory looks like:

    ├── assets
    ├── saved_model.pb
    ├── tfhub_module.pb
    └── variables
    

    and the data was downloaded and unzipped from https://tfhub.dev/google/universal-sentence-encoder-large/3?tf-hub-format=compressed

    opened by drbh 10
  • Init golang module


    Init golang modules to pin dependencies

    opened by ninedraft 10
  • installation issue: can't find for_core_protos_go_proto


    go get github.com/galeone/tfgo
    go: finding module for package github.com/tensorflow/tensorflow/tensorflow/go/core/protobuf/for_core_protos_go_proto
    go: finding module for package github.com/tensorflow/tensorflow/tensorflow/go/core/protobuf/for_core_protos_go_proto
    ../../../../.gvm/pkgsets/go1.15/global/pkg/mod/github.com/tensorflow/[email protected]+incompatible/tensorflow/go/saved_model.go:25:2: module github.com/tensorflow/[email protected] found (v2.4.1+incompatible), but does not contain package github.com/tensorflow/tensorflow/tensorflow/go/core/protobuf/for_core_protos_go_proto

    opened by pa-m 10
  • ambiguous import: found package in multiple modules


    I am importing the packages as suggested in the examples:

    tf "github.com/galeone/tensorflow/tensorflow/go"
    tg "github.com/galeone/tfgo"
    

    and get the following error

    main.go:11:2: ambiguous import: found package github.com/galeone/tensorflow/tensorflow/go in multiple modules:
            github.com/galeone/tensorflow v2.3.1+incompatible (~/.gvm/pkgsets/go1.15.8/global/pkg/mod/github.com/galeone/[email protected]+incompatible/tensorflow/go)
            github.com/galeone/tensorflow/tensorflow/go v0.0.0-20210519172502-4018d721b591 (~/.gvm/pkgsets/go1.15.8/global/pkg/mod/github.com/galeone/tensorflow/tensorflow/[email protected])
    
    ../../.gvm/pkgsets/go1.15.8/global/pkg/mod/github.com/galeone/[email protected]/ops.go:24:2: ambiguous import: found package github.com/galeone/tensorflow/tensorflow/go/op in multiple modules:
            github.com/galeone/tensorflow v2.3.1+incompatible (~/.gvm/pkgsets/go1.15.8/global/pkg/mod/github.com/galeone/[email protected]+incompatible/tensorflow/go/op)
            github.com/galeone/tensorflow/tensorflow/go v0.0.0-20210519172502-4018d721b591 (~/.gvm/pkgsets/go1.15.8/global/pkg/mod/github.com/galeone/tensorflow/tensorflow/[email protected]/op)
    
    

    After running go build the relevant section of my go.sum looks like:

    github.com/galeone/tensorflow v1.15.4 h1:TQeJss9Aeipg2K6kNwfqKRNpDLLwBdCouRKdCfzQ2pg=
    github.com/galeone/tensorflow v2.3.1+incompatible h1:RRiPEbcVK2IghF7YFDDF33tx+XMr2NuCriDBlMWYm5s=
    github.com/galeone/tensorflow v2.3.1+incompatible/go.mod h1:tPYvIhe58Qvzh/hJfdy0881FcAnouYskaz5tNIDEeMA=
    github.com/galeone/tensorflow/tensorflow/go v0.0.0-20210519172502-4018d721b591 h1:1UOml7GsssubL3OW53W9+kBk5BQICiG95TNXAmTrrsM=
    github.com/galeone/tensorflow/tensorflow/go v0.0.0-20210519172502-4018d721b591/go.mod h1:0LCzFWUL71lYeHtxlL/15k/+5ZKVzJk6Z+hLX1UBoUQ=
    github.com/galeone/tfgo v0.0.0-20210519185601-7d7131a16882 h1:YrghpSKeSJYE24fn/NzuAVXukc9npefUV6j10sgnj8Y=
    github.com/galeone/tfgo v0.0.0-20210519185601-7d7131a16882/go.mod h1:05ASagqJQa1Xev+FhblKviD9OAbRUWN4XAN2A1+aTd0=
    

    and under github.com/galeone the following packages are installed:

    tensorflow          
    [email protected]+incompatible
    [email protected]  
    [email protected]
    
    opened by frankpacini 1
  • how to train model with tfgo?


    opened by ZonghaoWang 1
  • Bump urllib3 from 1.26.4 to 1.26.5 in /test_models


    Bumps urllib3 from 1.26.4 to 1.26.5.

    Release notes

    Sourced from urllib3's releases.

    1.26.5

    :warning: IMPORTANT: urllib3 v2.0 will drop support for Python 2: Read more in the v2.0 Roadmap

    • Fixed deprecation warnings emitted in Python 3.10.
    • Updated vendored six library to 1.16.0.
    • Improved performance of URL parser when splitting the authority component.

    If you or your organization rely on urllib3 consider supporting us via GitHub Sponsors

    Changelog

    Sourced from urllib3's changelog.

    1.26.5 (2021-05-26)

    • Fixed deprecation warnings emitted in Python 3.10.
    • Updated vendored six library to 1.16.0.
    • Improved performance of URL parser when splitting the authority component.
    Commits
    • d161647 Release 1.26.5
    • 2d4a3fe Improve performance of sub-authority splitting in URL
    • 2698537 Update vendored six to 1.16.0
    • 07bed79 Fix deprecation warnings for Python 3.10 ssl module
    • d725a9b Add Python 3.10 to GitHub Actions
    • 339ad34 Use pytest==6.2.4 on Python 3.10+
    • f271c9c Apply latest Black formatting
    • 1884878 [1.26] Properly proxy EOF on the SSLTransport test suite
    • See full diff in compare view


    dependencies 
    opened by dependabot[bot] 1
  • OCR example


    Can you use this package for OCR? Is there a simple example that can extract text from an image?

    opened by hajsf 0
  • model.Exec: panic: Cannot parse tensor from proto: dtype: DT_VARIANT


    Here's the example I'm trying: https://gist.github.com/9nut/f95bb4cbe9c223e9f73a9e06429f71ac

    I get a panic, an error message similar to this tensorflow issue that @galeone also commented on: https://github.com/tensorflow/tensorflow/issues/44428

    Here's the output from my attempt:

    2021-03-26 14:00:39.257859: I tensorflow/cc/saved_model/reader.cc:32] Reading SavedModel from: ./centernet_hourglass_512x512_kpts_1
    2021-03-26 14:00:39.657372: I tensorflow/cc/saved_model/reader.cc:55] Reading meta graph with tags { serve }
    2021-03-26 14:00:39.657419: I tensorflow/cc/saved_model/reader.cc:93] Reading SavedModel debug info (if present) from: ./centernet_hourglass_512x512_kpts_1
    2021-03-26 14:00:39.657502: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: SSE4.2 AVX AVX2 FMA
    To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
    2021-03-26 14:00:41.293101: I tensorflow/cc/saved_model/loader.cc:206] Restoring SavedModel bundle.
    2021-03-26 14:00:46.853281: I tensorflow/cc/saved_model/loader.cc:190] Running initialization op on SavedModel bundle at path: ./centernet_hourglass_512x512_kpts_1
    2021-03-26 14:00:48.256126: I tensorflow/cc/saved_model/loader.cc:277] SavedModel load for tags { serve }; Status: success: OK. Took 8998267 microseconds.
    2021-03-26 14:00:53.231174: W tensorflow/core/grappler/optimizers/loop_optimizer.cc:906] Skipping loop optimization for Merge node with control input: StatefulPartitionedCall/StatefulPartitionedCall/cond/then/_1776/cond/Assert_3/AssertGuard/branch_executed/_1989
    2021-03-26 14:00:54.407937: E tensorflow/core/framework/tensor.cc:555] Could not decode variant with type_name: "tensorflow::TensorList". Perhaps you forgot to register a decoder via REGISTER_UNARY_VARIANT_DECODE_FUNCTION?
    2021-03-26 14:00:54.407991: W tensorflow/core/framework/op_kernel.cc:1740] OP_REQUIRES failed at constant_op.cc:79 : Invalid argument: Cannot parse tensor from tensor_proto.
    2021-03-26 14:00:54.442763: E tensorflow/core/framework/tensor.cc:555] Could not decode variant with type_name: "tensorflow::TensorList". Perhaps you forgot to register a decoder via REGISTER_UNARY_VARIANT_DECODE_FUNCTION?
    2021-03-26 14:00:54.443281: W tensorflow/core/framework/op_kernel.cc:1740] OP_REQUIRES failed at constant_op.cc:79 : Invalid argument: Cannot parse tensor from proto: dtype: DT_VARIANT tensor_shape { } variant_val { type_name: "tensorflow::TensorList" metadata: "\000\001\377\377\377\377\377\377\377\377\377\001\022\002\010\004" tensors { dtype: DT_FLOAT tensor_shape { dim { size: 4 } } float_val: 0 float_val: 0 float_val: 1 float_val: 1 } }

    panic: Cannot parse tensor from proto: dtype: DT_VARIANT tensor_shape { } variant_val { type_name: "tensorflow::TensorList" metadata: "\000\001\377\377\377\377\377\377\377\377\377\001\022\002\010\004" tensors { dtype: DT_FLOAT tensor_shape { dim { size: 4 } } float_val: 0 float_val: 0 float_val: 1 float_val: 1 } }

         [[{{node StatefulPartitionedCall/StatefulPartitionedCall/map/TensorArrayUnstack_1/TensorListFromTensor/_0__cf__1}}]]
    

    goroutine 1 [running]:
    github.com/galeone/tfgo.(*Model).Exec(0xc00012c030, 0xc003f2fe20, 0x1, 0x1, 0xc003f2fe68, 0x0, 0xc013463ea8, 0x1)
            $GOPATH/GoPkgs/pkg/mod/github.com/galeone/[email protected]/model.go:73 +0xd7
    main.inferCenterNet()
            $HOME/src/savedmodelsrv/runinfer.go:80 +0x4c5
    main.main()
            $HOME/src/savedmodelsrv/main.go:60 +0x25

    opened by 9nut 1
  • installation issue: can't find for_core_protos_go_proto


    I'm opening up a new issue since none of the solutions given in the original issue fix the problem at all.

    C:\Users\User\go\src\testTensor>go get github.com/galeone/tfgo
    package github.com/tensorflow/tensorflow/tensorflow/go/core/protobuf/for_core_protos_go_proto: cannot find package "github.com/tensorflow/tensorflow/tensorflow/go/core/protobuf/for_core_protos_go_proto" in any
    of:
            C:\Go\src\github.com\tensorflow\tensorflow\tensorflow\go\core\protobuf\for_core_protos_go_proto (from $GOROOT)
            C:\Users\User\go\src\github.com\tensorflow\tensorflow\tensorflow\go\core\protobuf\for_core_protos_go_proto (from $GOPATH)
    

    I am attempting to run this code, but this repository in its current state is completely uninstallable.

    help wanted 
    opened by true-zero 8
  • Create mapping from model for model.Exec


    Hi,

    Is there a way to automatically create the mapping for the Placeholder (e.g. Placeholder_1) to the input name (e.g. Height) from the model?

    Also is there a way to automatically read the "prob_out" from the model?

    []tf.Output and map[tf.Output]*tf.Tensor are both used for model.Exec:

    height, _ := tf.NewTensor([1]int64{"Height"})
    width, _ := tf.NewTensor([1]int64{"Width"})
    
    results := model.Exec([]tf.Output{
        model.Op("prob_out", 0),
    }, map[tf.Output]*tf.Tensor{
       model.Op("Placeholder_1", 0): height,
       model.Op("Placeholder_2", 0): width,
    })
    

    Thanks.

    help wanted 
    opened by Arnold1 4
Owner
Paolo Galeone
Researcher in love with: Machine/Deep Learning, PostgreSQL, Tensorflow and Go. Owner of @nerdzeu
Go binding for TensorFlow Lite

go-tflite Go binding for TensorFlow Lite Usage model := tflite.NewModelFromFile("sin_model.tflite") if model == nil { log.Fatal("cannot load model")

mattn 215 Jul 22, 2021
Gorgonia is a library that helps facilitate machine learning in Go.

Gorgonia is a library that helps facilitate machine learning in Go. Write and evaluate mathematical equations involving multidimensional arrays easily

Gorgonia 4.1k Jul 27, 2021
A Go idiomatic binding to the C++ core of PyTorch

GoTorch GoTorch reimplements PyTorch high-level APIs, including modules and functionals, in idiomatic Go. Thus enables deep learning programming in Go

Yi Wang 53 Jul 17, 2021
A Kubernetes Native Batch System (Project under CNCF)

Volcano is a batch system built on Kubernetes. It provides a suite of mechanisms that are commonly required by many classes of batch & elastic workloa

Volcano 1.8k Jul 27, 2021
Standard machine learning models

Cog: Standard machine learning models Define your models in a standard format, store them in a central place, run them anywhere. Standard interface fo

Replicate 70 Jul 27, 2021
Prophecis is a one-stop machine learning platform developed by WeBank

Prophecis is a one-stop machine learning platform developed by WeBank. It integrates multiple open-source machine learning frameworks, has the multi tenant management capability of machine learning compute cluster, and provides full stack container deployment and management services for production environment.

WeBankFinTech 198 Jul 26, 2021
Deploy, manage, and scale machine learning models in production

Deploy, manage, and scale machine learning models in production. Cortex is a cloud native model serving platform for machine learning engineering teams.

Cortex Labs 7.6k Jul 21, 2021
A recommender system service based on collaborative filtering written in Go

Language: English | δΈ­ζ–‡ gorse: Go Recommender System Engine Build Coverage Report GoDoc RTD Demo gorse is an offline recommender system backend based o

Zhenghao Zhang 3.6k Jul 27, 2021
Golang Neural Network

Varis Neural Networks with GO About Package Some time ago I decided to learn Go language and neural networks. So it's my variation of Neural Networks

Artem Filippov 36 Jul 7, 2021
The Go kernel for Jupyter notebooks and nteract.

gophernotes - Use Go in Jupyter notebooks and nteract gophernotes is a Go kernel for Jupyter notebooks and nteract. It lets you use Go interactively i

GopherData 3k Jul 23, 2021
A High-level Machine Learning Library for Go

Overview Goro is a high-level machine learning library for Go built on Gorgonia. It aims to have the same feel as Keras. Usage import ( . "github.

AUNUM 282 Jul 10, 2021
Reinforcement Learning in Go

Overview Gold is a reinforcement learning library for Go. It provides a set of agents that can be used to solve challenges in various environments. Th

AUNUM 228 Jul 21, 2021
Ensembles of decision trees in go/golang.

CloudForest Google Group Fast, flexible, multi-threaded ensembles of decision trees for machine learning in pure Go (golang). CloudForest allows for a

Ryan Bressler 687 Jul 17, 2021