Go support for Google's protocol buffers

Overview

This module (github.com/golang/protobuf) contains Go bindings for protocol buffers.

It has been superseded by the google.golang.org/protobuf module, which contains an updated and simplified API, support for protobuf reflection, and many other improvements. We recommend that new code use the google.golang.org/protobuf module.

Versions v1.4 and later of github.com/golang/protobuf are implemented in terms of google.golang.org/protobuf. Programs which use both modules must use at least version v1.4 of this one.

See the developer guide for protocol buffers in Go for a general guide for how to get started using protobufs in Go.

See release note documentation for more information about individual releases of this project.

See documentation for the next major revision for more information about the purpose, usage, and history of this project.

Package index

Summary of the packages provided by this module (a brief usage sketch follows the list):

  • proto: Package proto provides functions operating on protobuf messages such as cloning, merging, and checking equality, as well as binary serialization and text serialization.
  • jsonpb: Package jsonpb serializes protobuf messages as JSON.
  • ptypes: Package ptypes provides helper functionality for protobuf well-known types.
  • ptypes/any: Package any is the generated package for google/protobuf/any.proto.
  • ptypes/empty: Package empty is the generated package for google/protobuf/empty.proto.
  • ptypes/timestamp: Package timestamp is the generated package for google/protobuf/timestamp.proto.
  • ptypes/duration: Package duration is the generated package for google/protobuf/duration.proto.
  • ptypes/wrappers: Package wrappers is the generated package for google/protobuf/wrappers.proto.
  • ptypes/struct: Package structpb is the generated package for google/protobuf/struct.proto.
  • protoc-gen-go/descriptor: Package descriptor is the generated package for google/protobuf/descriptor.proto.
  • protoc-gen-go/plugin: Package plugin is the generated package for google/protobuf/compiler/plugin.proto.
  • protoc-gen-go: The protoc-gen-go binary is a protoc plugin to generate a Go protocol buffer package.
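
As a quick orientation, here is a minimal sketch of the proto package in use. It assumes a hypothetical generated package examplepb, produced by protoc-gen-go from a proto3 file defining a message Person with a string name field; substitute your own generated types.

    package main

    import (
        "log"

        "github.com/golang/protobuf/proto"

        examplepb "example.com/project/examplepb" // hypothetical generated package
    )

    func main() {
        // Build a message, round-trip it through the binary wire format, and
        // verify the result with proto.Equal.
        in := &examplepb.Person{Name: "Ada"}

        data, err := proto.Marshal(in)
        if err != nil {
            log.Fatalf("marshal: %v", err)
        }

        out := &examplepb.Person{}
        if err := proto.Unmarshal(data, out); err != nil {
            log.Fatalf("unmarshal: %v", err)
        }

        if !proto.Equal(in, out) {
            log.Fatalf("round trip mismatch: %v != %v", in, out)
        }
        log.Printf("round trip ok: %v", proto.Clone(out))
    }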

Reporting issues

The issue tracker for this project is located here.

Please report any issues with a sufficient description of the bug or feature request. Bug reports should ideally be accompanied by a minimal reproduction of the issue. Irreproducible bugs are difficult to diagnose and fix (and likely to be closed after some period of time). Bug reports must specify the version of the Go protocol buffer module and also the version of the protocol buffer toolchain being used.

Contributing

This project is open-source and accepts contributions. See the contribution guide for more information.

Compatibility

This module and the generated code are expected to be stable over time. However, we reserve the right to make breaking changes without notice for the following reasons:

  • Security: A security issue in the specification or implementation may come to light whose resolution requires breaking compatibility. We reserve the right to address such issues.
  • Unspecified behavior: There are some aspects of the protocol buffer specification that are undefined. Programs that depend on unspecified behavior may break in future releases.
  • Specification changes: It may become necessary to address an inconsistency, incompleteness, or change in the protocol buffer specification, which may affect the behavior of existing programs. We reserve the right to address such changes.
  • Bugs: If a package has a bug that violates correctness, a program depending on the buggy behavior may break if the bug is fixed. We reserve the right to fix such bugs.
  • Generated additions: We reserve the right to add new declarations to generated Go packages of .proto files. This includes declared constants, variables, functions, types, fields in structs, and methods on types. This may break attempts at injecting additional code on top of what is generated by protoc-gen-go. Such practice is not supported by this project.
  • Internal changes: We reserve the right to add, modify, and remove internal code, which includes all unexported declarations, the generator package, and all packages under internal.

Any breaking changes outside of these will be announced 6 months in advance to [email protected].

Comments
  • Rev of proto-gen-go to ProtoPackageIsVersion3 causing breakage

    I am hopeful that there is a simple solution to get around this issue :)

    A few hours ago, this commit introduced proto.ProtoPackageIsVersion3

    and set const generatedCodeVersion = 3

    Our CI system (and I suspect many other people's) does a:

    go get github.com/golang/protobuf/protoc-gen-go
    

    as part of setting up the system, and also do a:

    go generate ./...

    and check to make sure that regenerated files match the files in the commit (basically, checking for regen on updating the .proto files).

    This has been going swimmingly until a few hours ago. In trying to debug this problem, I came to realize that the fact this worked at all is a testament to how good you guys are at backward compatibility 👍

    What I discovered digging in is that we have been vendoring github.com/golang/protobuf/ v1.2.0 (because of other dependencies we have), but of course go get pulls the latest from master.

    v1.2.0 lacks proto.ProtoPackageIsVersion3, so our generated code suddenly doesn't compile.

    So I fell back to using:

    go install ./vendor/github.com/golang/protobuf/protoc-gen-go/
    

    in our CI. Unfortunately, this seems to result in some generation differences. If I run it locally, push, and then our CI runs it, the generated code doesn't quite match.

    I get diffs like:

    -var fileDescriptor_networkservice_d4de7ff5f5769d53 = []byte{
    +var fileDescriptor_networkservice_5b085879d0d87bdf = []byte{
    

    I don't precisely consider this a bug on your end; you guys are super good about backward compatibility, and at some point you do have to make the change.

    Do you have any ideas about how I might navigate all of this? I suspect lots and lots of folks will shortly have similar issues...
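
    One commonly used workaround (not from this thread, and only a sketch) is to pin the plugin in go.mod with a build-tagged "tools" file, so that CI installs the protoc-gen-go release that matches the committed generated code instead of whatever is on master:

    // +build tools

    // Package tools records protoc-gen-go as a build-time dependency in go.mod,
    // so that running go install github.com/golang/protobuf/protoc-gen-go inside
    // this module builds the pinned version rather than latest.
    package tools

    import (
        _ "github.com/golang/protobuf/protoc-gen-go"
    )

    (On newer Go releases, go install with an explicit @version suffix is another way to pin the plugin.)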

    opened by edwarnicke 60
  • reflect/protoregistry: conflicts with same filename

    Hello,

    Sharing the same .proto filename across packages with different proto package names results in these errors:

    2020/05/10 15:03:12 WARNING: proto: file "common.proto" is already registered previously from: "git.example.com/proto/common" currently from: "git.example.com/proto/a" A future release will panic on registration conflicts. See: https://developers.google.com/protocol-buffers/docs/reference/go/faq#namespace-conflict

    2020/05/10 15:03:12 WARNING: proto: file "common.proto" is already registered previously from: "git.example.com/proto/common" currently from: "git.example.com/proto/b" A future release will panic on registration conflicts. See: https://developers.google.com/protocol-buffers/docs/reference/go/faq#namespace-conflict

    Here is my layout:

    proto/
      common/
        common.proto (package common)
      a/
        common.proto (package a and importing common/common.proto)
        foo.proto (importing common.proto)
        ...
      b/
        common.proto (package b and importing common/common.proto)
        bar.proto  (importing common.proto)
        ...
    

    I don't think my case matches either of the cases quoted in the documentation at https://developers.google.com/protocol-buffers/docs/reference/go/faq#namespace-conflict. I don't generate the same proto file multiple times (I generate common/common.proto once and import it from the other files), and I'm using different package names. It might be better to rename package common to something else, but I don't think that is my issue here.

    What's the best practice for solving this issue? Is it normal to get this warning in my case? It doesn't seem uncommon to have the same filename in different projects.

    Thanks
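
    Not an answer from the maintainers, but when diagnosing this kind of conflict it can help to dump what the global registry actually contains. A minimal sketch using the v2 API (blank-import your own generated packages to populate the registry):

    package main

    import (
        "fmt"

        "google.golang.org/protobuf/reflect/protoreflect"
        "google.golang.org/protobuf/reflect/protoregistry"
    )

    func main() {
        // Print every registered .proto file path and its proto package; two
        // registrations with the same path are what trigger the warning above.
        protoregistry.GlobalFiles.RangeFiles(func(fd protoreflect.FileDescriptor) bool {
            fmt.Println(fd.Path(), "=>", fd.Package())
            return true
        })
    }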

    opened by guyguy333 48
  • internal/impl: implement support for non-pointer message fields

    We’re in the process of migrating from github.com/golang/protobuf to the v2 API (google.golang.org/protobuf) and hit a panic when using proto.Equal to compare protos.

    We traced the issue to a use of (reflect.Value).IsNil() here: https://github.com/protocolbuffers/protobuf-go/blob/1f5b6fe64530cac2061a3d315b7e44966b1a200b/internal/impl/message_reflect_field.go#L386-L392

    (reflect.Value).IsNil() panics when the underlying value is not of a nilable kind (pointer, channel, function, interface, map, or slice).

    It looks like v2’s internal API assumes that all non-scalar values are pointers, which in this case is not true—we’re using GoGo Protobuf’s feature of non-nullable fields in most of our messages, primarily time.Time values and occasional struct value where we don’t want the additional GC load.

    We forked this and replaced the IsNil call with IsZero, which in this use case works correctly. Suggestion: swap out calls to IsNil with calls to IsZero to prevent this runtime panic and work correctly with legacy v1 protos.
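
    For readers unfamiliar with the distinction, here is a tiny standalone illustration (not from the report) of why IsNil panics on a value field such as a non-nullable time.Time while IsZero does not:

    package main

    import (
        "fmt"
        "reflect"
        "time"
    )

    func main() {
        // A field stored by value rather than by pointer, as with gogo's
        // nullable=false option.
        var ts time.Time
        v := reflect.ValueOf(ts)

        fmt.Println("IsZero:", v.IsZero()) // defined for every kind (Go 1.13+)

        // IsNil is only defined for chan, func, interface, map, pointer and
        // slice kinds; calling it on a struct value panics.
        defer func() { fmt.Println("recovered:", recover()) }()
        fmt.Println("IsNil:", v.IsNil())
    }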

    Additionally, it looks like this package reimplements IsZero to some extent here: https://github.com/protocolbuffers/protobuf-go/blob/1f5b6fe64530cac2061a3d315b7e44966b1a200b/internal/descfmt/stringer.go#L224-L243

    Thanks!

    opened by ydnar 36
  • proto: make the Message interface behaviorally complete

    Filing on behalf of @alandonovan.

    The proto.Message interface is unsatisfactory. A behavioral interface is an abstraction over the underlying data types that exposes just the operations necessary for their correct use. By contrast, proto.Message exposes essentially no useful functionality and serves only as a marker. But a marker of what? If one could assume all its implementations were protoc-generated struct types with field tags, then at least it would be possible to write reflection-based algorithms that do useful things. However, there are many concrete types satisfying Message that are not of this form.

    It's not only the set of implementations that is unbounded; the set of useful operations is also large and growing. The two most important, Marshal and Unmarshal, are handled quite cleanly since there are separate behavioral interfaces for Marshaler and Unmarshaler that allow each concrete type to implement these operations. But there are many functions in the proto API, for which no interface exists: proto.Merge, proto.Clone, the extensions API, and so on.

    The cross-product of concrete implementations and operations is growing, but the fraction of these combinations that actually work is diminishing.

    I think we should assess what it would take to change the proto.Message interface, and all its implementations, so that it is a true behavioral interface. This would require at a minimum that the interface include a new method that provides complete control over the abstract state of a message: accessing and updating its fields, inspecting any extensions or unrecognized fields, and so on, without revealing the concrete representation. It should be possible to implement all the major functions in the proto API, as well as most users' ad hoc functions, in terms of this interface so that they work with any concrete implementation. If an optimized version of a crucial operation is available, the generic implementation should dispatch to it, as it does today for Marshal and Unmarshal.

    We can't add methods to proto.Message without breaking backwards compatibility. One approach we can take is to define proto.MessageV2 that is a much more semantically complete interface that provides a form of "protobuf reflection". In Marshal, Unmarshal, Merge, Clone, and so on, we can type assert if it implements proto.MessageV2 and use that interface to implement generic versions of those functions. If proto.Message doesn't satisfy proto.MessageV2, then Merge can just fail (it already does on most third-party implementations of proto.Message).
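
    To make the proposed dispatch concrete, here is a rough self-contained sketch; the names (Message, MessageV2, ProtoReflect) are illustrative stand-ins, not the eventual API:

    package main

    import (
        "errors"
        "fmt"
    )

    // Message stands in for today's marker interface proto.Message.
    type Message interface {
        Reset()
        String() string
        ProtoMessage()
    }

    // MessageV2 is a hypothetical behaviorally complete interface of the kind
    // described above, exposing a reflection view of the message state.
    type MessageV2 interface {
        Message
        ProtoReflect() interface{} // placeholder for a protobuf reflection view
    }

    // Merge sketches the dispatch: use the richer interface when available and
    // fail cleanly when it is not.
    func Merge(dst, src Message) error {
        d, okDst := dst.(MessageV2)
        s, okSrc := src.(MessageV2)
        if !okDst || !okSrc {
            return errors.New("proto: message does not support protobuf reflection")
        }
        // ... merge field by field via d.ProtoReflect() and s.ProtoReflect() ...
        _, _ = d, s
        return nil
    }

    func main() {
        fmt.Println(Merge(nil, nil)) // neither side implements MessageV2
    }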

    breaking-change 
    opened by dsnet 33
  • APIv2: track breaking changes

    This is a meta-issue for tracking any breaking changes that occur before the final v2 release. Since v2 is not yet released, we reserve the right to make breaking changes to improve the API to be as good as it can be.

    The CL (changelist) that introduces a breaking change will be posted here as a comment. Subscribe to this issue to be notified of breakages. We try to batch multiple breaking changes close together in time to reduce user churn.

    breaking-change 
    opened by dsnet 29
  • protoc-gen-go: unexport XXX_ fields from generated types

    The exported XXX_ fields on generated message structs are awkward. They make "obvious" reflection code over proto messages easy to get wrong (by not skipping the XXX_ fields), and they pollute the documentation of the generated packages with irrelevant implementation details.

    I believe it is possible to unexport them. They appear to need to be exported only because reflect.Value.Interface refuses to operate on unexported fields, but we can observe that every nested proto.Message is reached through only exported fields or XXX_ fields.

    One technique we could use to access the fields after unexporting them is double-embedding. Start with an exported type in the proto package with an unexported (pointer) accessor method. Add one (unexported) struct type per generated proto package which embeds the exported struct. Embed this unexported struct in each message which needs the corresponding field.

    The unexported struct field is itself unexported, but since that embeds a struct with an unexported method defined in the proto package it is accessible within proto via type-assertion to an interface containing the method.

    A rough sketch: https://play.golang.org/p/yDjZZtGmB6
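
    A self-contained approximation of that sketch, with all hypothetical types collapsed into one file (the comments note which package each piece would live in):

    package main

    import "fmt"

    // InternalState would be exported from the proto package; its accessor
    // method is unexported, so only code inside proto can name it.
    type InternalState struct {
        unrecognized []byte
    }

    func (s *InternalState) internalState() *InternalState { return s }

    // internalStateGetter is the interface the proto package would assert against.
    type internalStateGetter interface {
        internalState() *InternalState
    }

    // messageState is the single unexported embedding struct per generated package.
    type messageState struct {
        InternalState
    }

    // MyMessage is a generated message whose former XXX_ fields are hidden
    // behind the double embedding.
    type MyMessage struct {
        Name string
        messageState
    }

    func main() {
        m := &MyMessage{Name: "x"}
        // From within the proto package the hidden state stays reachable via the
        // interface, even though the embedded fields are unexported.
        if g, ok := interface{}(m).(internalStateGetter); ok {
            g.internalState().unrecognized = []byte{0x08, 0x01}
            fmt.Println("unknown bytes:", g.internalState().unrecognized)
        }
    }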

    opened by bcmills 29
  •  protoc-gen-go: program not found or is not executable

    Hi, I correctly installed protobuf on my Ubuntu 18.04.01 Server Edition following the steps outlined here: https://github.com/protocolbuffers/protobuf/blob/master/src/README.md

    and without any problems compiled the example found here: https://developers.google.com/protocol-buffers/docs/cpptutorial

    protoc --cpp_out=. ./addressbook.proto
    -rw-rw-r-- 1 marco marco  49K Jan 30 11:26 addressbook.pb.cc
    -rw-rw-r-- 1 marco marco  27K Jan 30 11:26 addressbook.pb.h
    -rw-rw-r-- 1 marco marco  410 Jan 30 11:25 addressbook.proto
    

    But when trying to do the same in golang:

    protoc --go_out=. addressbook.proto
    protoc-gen-go: program not found or is not executable
    --go_out: protoc-gen-go: Plugin failed with status code 1.
    

    Other information:

    in my .bashrc I added the following lines:

    export GOPATH=${HOME}/go:$PATH
    export GOBIN=$HOME/go/bin:$PATH
    

    go version:

    go version go1.10.4 linux/amd64
    

    How can I make protoc-gen-go work? Looking forward to your kind help. Marco

    opened by marcoippolito 28
  • Proposal: Overloading option go_package to indicate Go import path

    Right now, a .proto file can have a line like

    option go_package = "foo";
    

    which means that the generated .pb.go file should have a package foo statement.

    I am proposing to overload this option with two new possible syntaxes:

    option go_package = "github.com/example/foo";
    

    This means that generated code should be dropped into the github.com/example/foo directory (relative to whatever is passed to protoc's --go_out flag), overriding the default behaviour of matching the path to the .proto file. It also means the .pb.go file should have a package foo statement.

    option go_package = "github.com/example/foo;bar";
    

    This means the same as the previous syntax, but additionally means that the .pb.go file should have a package bar statement, for the rare cases where people want the package name to differ from the final path component of the import path.

    This would hopefully mean that far fewer M parameters need to be passed through the --go_out flag. It would subsume #137 and maybe #63.

    Thoughts? Opinions?

    @robpike @zellyn @mwitkow @awalterschulze @peter-edge @tamird

    opened by dsymonds 28
  • proposal: protoc-gen-go should be able to compile multiple packages per run

    Passing multiple packages' worth of *.proto files currently fails when using protoc-gen-go (https://github.com/golang/protobuf/blob/master/protoc-gen-go/generator/generator.go#L612:L614).

    The same functionality is supported by other generators (at least the C++ generator shipped with protoc). protoc-gen-go should support this as well.

    proposal 
    opened by tamird 28
  • proto: cannot use m.ProtoMethods() (type *protoreflect.Message) as type *struct ...

    I am trying to build a project that I had previously built multiple times without problems. Recently I get this error while building grpcurl as part of the build process; were any incompatible changes made to your repo?

    # Build grpcurl
    go get github.com/fullstorydev/grpcurl
    # google.golang.org/protobuf/proto
    /root/go/src/google.golang.org/protobuf/proto/proto_methods.go:18:23: cannot use m.ProtoMethods() (type *protoreflect.Message) as type *struct ...

    needs-investigation 
    opened by m-liaee 27
  • protoc-gen-go: support go_name option to specify Go identifier

    Background

    Identifiers in proto are not directly translatable into Go for several reasons:

    1. By convention, naming styles are different:
      • Proto uses CamelCase for messages, groups, enums, services, and methods
      • Proto uses snake_case for fields and oneof fields
      • Proto uses SCREAMING_SNAKE_CASE for enum values
      • Go uses CamelCase for everything
    2. As a language rule, Go uses an uppercase letter to indicate visibility. Protobuf has no such visibility rules and everything is assumed to be publicly visible to everything else.
    3. Proto allows you to declare an enum or message within a message. Thus, a Child message defined in the scope of a Parent message is named Parent.Child. However, Go has no language equivalent for this (see golang/go#20467).

    Status quo

    When protoc-gen-go generates Go identifiers, it goes through some lexicographical transformation to convert proto naming conventions to Go naming conventions (see generator.CamelCase).

    This lexicographical re-write works well most of the time, but is sub-par in the following cases:

    • It cannot handle abbreviations well: enable_gps => EnableGps
    • It can result in name conflicts:
      • foo and Foo are distinct identifiers in proto, but both are named Foo in Go to work around visibility rules.
      • Parent_Child and Parent.Child are distinct identifiers in proto, but both are named Parent_Child in Go to work around Go's lack of nested declarations (see my concern #513).

    Proposal

    I propose that we add support for a go_name option. This is an option that all Go generators should be aware of and respect, with adoption starting with protoc-gen-go.

    Example:

    syntax = "proto3";
    
    package example.goname;
    
    message FooMessage {
        string enable_gps = 1 [go_name = "EnableGPS"];
    }
    

    The option can be applied to following: MessageOptions, FieldOptions, OneOfOptions, EnumOptions, and EnumValueOptions.

    • The go_name option for MessageOptions and EnumOptions must be supported by all generators since a proto file may specify a message or enum from another proto file by name. The generator must be able to reliably reference the target type.
      • Note that this bakes in the assumption that generated code will always have a top-level Go type for each message and enum proto type.
    • It is optional whether a generator respects the option for the other types, although protoc-gen-go will.
      • EnumValueOptions controls the enum variable name
      • FieldOptions and OneOfOptions control the struct field name
        • For a hypothetical generator that emits messages with no fields, only getters and setters, it is up to the implementation what to do. Using SetX and GetX is a reasonable approach.

    Optionally supported for: ServiceOptions and MethodOptions

    • protoc-gen-go does not directly support services. However, it is reasonable that grpc also respect these options.

    They do not apply to: FileOptions and ExtensionRangeOptions.

    • Do not apply because there is no sensible meaning for go_name here. go_package is the file-level equivalent that controls naming and import path information for the Go package.

    Secondly, I propose that we formalize the rules for how Go names are generated when the go_name option is lacking. Essentially, document the functionality of generator.CamelCase. This is one step in the direction of having a sensible ecosystem of various generators and/or plugins (#547).

    Use-cases

    • Provide an opt-out when making breaking changes to the generator. For example, suppose we dropped the enum type prefix to fix #513, we would need to provide a way for users to continue using the old name.
    • For situations where automatic CamelCasing is sub-par, it gives the user the ability to specify an idiomatic Go name (e.g., EnableGPS)
      • This supersedes the customname, enum_customname, and enumvalue_customname options in gogo/protobuf.
    • Satisfy the desire to generate unexported message and type names (see #501)
      • Intentionally choosing an unexported name means you are certain that nobody else imports your proto file.
    • Provides a workaround for rare situations of name conflicts

    Design specifics

    What is valid syntax for go_name? The string must be a valid Go identifier per the language specification. If it is invalid, then the generator reports an error. It is the user's responsibility to ensure that Go identifiers do not conflict. A generator implementation may choose to report an error at generation time if such a conflict exists.

    How does this work with nested messages and enums? Suppose you have a proto file with message Parent { message Child { option go_name = "Foo"; } }. Is the type name of the inner message going to be Foo or Parent_Foo? I argue that the name is "Foo", as that is strictly more expressive than simply prefixing the parent name.

    What happens when field names are unexported? The current implementation of the proto package will crash under this situation. However, I argue that this behavior is just an implementation detail of the current package.

    There are ways to work around this. For example, you could generate a Reference method that returns a pointer to any field with a given unexported name.

    func (m *FooMessage) Reference(name string) interface{} {
        switch name {
        case "field1": return &m.field1
        case "field2": return &m.field2
        case "field3": return &m.field3
        default:       panic(fmt.Sprintf("%T has no field named: %s", *m, name))
        }
    }
    

    The proto package could type-assert for this method and use it.

    However, I would prefer not polluting the public API of generated messages any further. Thus, for the initial support for go_name in protoc-gen-go, we can intentionally report an error until some future time we decide to fully support this. However, other generators like protoc-gen-gogo may support this. I'm comfortable with the fact that it is at least possible.

    I have several ideas for #276 that will be useful here.

    What does it mean for field extensions? TBD.

    What does it mean for enum value defaults? TBD.

    What does it mean for fields belonging to a oneof? TBD.

    \cc @neild @rogpeppe @awalterschulze @htuch

    proposal proposal-hold generator-proto-option 
    opened by dsnet 27
  • How to un/marshal custom types with Google?

    So this library now recommends switching to Google's implementation, which is OK with me. But the problem is that this library has

    type JSONPBMarshaler interface {
    	MarshalJSONPB(*Marshaler) ([]byte, error)
    }
    
    type JSONPBUnmarshaler interface {
    	UnmarshalJSONPB(*Unmarshaler, []byte) error
    }
    

    but the Google library does not. So what is the migration path in this case? The note // Deprecated: Custom types should implement protobuf reflection instead. is not much help, because it does not explain how to do that.

    opened by ivanjaros 0
  • UseProtoNames=false doesn't seem to use "lowerCamelCase name in JSON field names."

    What version of protobuf and what language are you using?
    protoc-gen-go v1.28.1, protoc v3.21.9

    What did you do? I'm marshaling a protobuf message using protojson

    The message User has an EmployeeNumber field:

    // (optional) employee number
    EmployeeNumber *string `protobuf:"bytes,2,opt,name=EmployeeNumber,proto3,oneof" json:"EmployeeNumber,omitempty"`
    

    I'm using the following go code to create such a User and marshal it:

    number := "number"
    user := User{
    	EmployeeNumber: &number,
    }
    options := protojson.MarshalOptions{
    	EmitUnpopulated: true,
    	UseProtoNames:   false,
    	UseEnumNumbers:  true,
    }
    json_bytes, err := options.Marshal(&user)
    if err != nil {
    	panic(err)
    }
    println(string(json_bytes))
    

    What did you expect to see? Some JSON containing "employeeNumber":"number", i.e. using lowerCamelCase, given that I opted for UseProtoNames: false. The description (from https://pkg.go.dev/google.golang.org/protobuf/encoding/protojson#MarshalOptions) is:

    // UseProtoNames uses proto field name instead of lowerCamelCase name in JSON
    // field names.
    UseProtoNames [bool]
    

    since I'm using false I expect lowerCamelCase.

    What did you see instead? Some JSON containing "EmployeeNumber":"number", i.e. not lowerCamelCase, even though I set UseProtoNames to false and the description of that option (as I understand it) indicates that lowerCamelCase names will be used for JSON field names.

    opened by horacimacias 3
  • protocmp: how to ignore field in one of?

    I have a message M with a oneof field O; one of the variants O can take, Ox, has a field F.

    I want to ignore F in a comparison.

    I'm unable to address the field without either getting a panic or a compilation error using either protocmp.IgnoreFields or protocmp.IgnoreOneofs.

    How can I ignore F in a cmp.Equal comparison?
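
    For context only (this is not a confirmed answer to the panic described above), the general shape of ignoring a field on the message type used inside the oneof looks roughly like this, where pb.M, pb.M_Ox, pb.Ox and the fields f and kept are all hypothetical generated names:

    package main

    import (
        "fmt"

        "github.com/google/go-cmp/cmp"
        "google.golang.org/protobuf/testing/protocmp"

        pb "example.com/project/pb" // hypothetical generated package
    )

    func main() {
        a := &pb.M{O: &pb.M_Ox{Ox: &pb.Ox{F: "one", Kept: 1}}}
        b := &pb.M{O: &pb.M_Ox{Ox: &pb.Ox{F: "two", Kept: 1}}}

        equal := cmp.Equal(a, b,
            protocmp.Transform(),
            // Ignore proto field "f" on message Ox wherever it appears.
            protocmp.IgnoreFields(&pb.Ox{}, "f"),
        )
        fmt.Println(equal)
    }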

    opened by HartBlanc 2
  • Should include the name of enum in error message when unmarshaling an invalid enum value

    Currently, when passing an invalid enum value, protobuf returns error saying invalid value for enum type: \"<value>\": Request:

    {
        "enumField": "invalid_value"
    }
    

    Response:

    {
        "code": 3,
        "message": "proto: (line X:Y): invalid value for enum type: \"invalid_value\"",
        "details": []
    }
    

    I expect it to return the name of the enum instead of the value. I checked the source code of the decoder: https://github.com/protocolbuffers/protobuf-go/blob/f0e23c7a8f8d55910f877091fdad71d3abb26d10/encoding/protojson/decode.go#L331-L340

    And I think it should have the commented line:

    case protoreflect.EnumKind:
    	if v, ok := unmarshalEnum(tok, fd); ok {
    		return v, nil
    	}
    	// return protoreflect.Value{}, d.newError(tok.Pos(), "invalid value %v for enum type: %v", tok.RawString(), fd.Enum().Name())
    default:
    	panic(fmt.Sprintf("unmarshalScalar: invalid scalar kind %v", kind))
    }
    return protoreflect.Value{}, d.newError(tok.Pos(), "invalid value for %v type: %v", kind, tok.RawString())
    

    so the error message would look like this:

    "message": "proto: (line X:Y): invalid value \"invalid_value\" for enum type: enum_name"`
    
    opened by nvatuan 6
  • Golang - oneof field creates m receiver instead of x

    What version of protobuf and what language are you using?
    go version go1.19 windows/amd64
    libprotoc 3.19.1
    google.golang.org/genproto v0.0.0-20220930163606-c98284e70a91
    google.golang.org/protobuf v1.28.1

    What did you do? I have added a oneof field to the proto definition.

    What did you expect to see? Oneof method to have x receiver.

    What did you see instead? Oneof method has m receiver.

    opened by ivanjaros 1
  • How to reuse []byte field when unmarshaling?

    I have a client/server app that uses gRPC to communicate. Profiler shows that up to 12.6% of memory is allocated in consumeBytesNoZero().

    The code is unmarshaling a message that has a bytes (i.e. []byte) field a few levels deep. That data is processed, and the message struct could then be reused to read the next message into, if that were possible, but I don't see how. I can reuse the struct, but looking at the code, a new byte slice will be allocated in any case. The bytes field is always at most N KB in size (the client ensures that), so it is perfect for reuse on the server side. Is there any way to reuse the allocated memory that I'm not seeing?

    I see two ways:

    • Add support for something like Unmarshal() to the message so that I could add custom unmarshaling logic or something like that. Ideally not a fixed method, actually, but a way to dynamically inject custom logic for a particular type.
    • Add an option to UnmarshalOptions specially for that case.

    Perhaps combining the two is the best way forward - ability to inject dynamic unmarshaling code for types via UnmarshalOptions?

    opened by ash2k 7
Releases (v1.5.2)
  • v1.5.2(Mar 29, 2021)

  • v1.5.1(Mar 18, 2021)

  • v1.5.0(Mar 18, 2021)

    Overview

    This marks the ptypes package as deprecated and upgrades the dependency on google.golang.org/protobuf to a pre-release version of v1.26.0. A subsequent patch release will update the dependency to v1.26.0 proper.

    Notable changes

    • (#1217) ptypes: deprecate the package
    • (#1214) rely on protodesc.ToFileDescriptorProto
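
    As a small illustration (not taken from the release notes) of what deprecating ptypes means in practice, the helper functionality for well-known types now lives on the generated packages themselves, e.g. timestamppb instead of ptypes.TimestampProto and ptypes.Timestamp:

    package main

    import (
        "fmt"
        "time"

        "google.golang.org/protobuf/types/known/timestamppb"
    )

    func main() {
        ts := timestamppb.New(time.Now()) // formerly ptypes.TimestampProto
        if err := ts.CheckValid(); err != nil {
            panic(err)
        }
        fmt.Println(ts.AsTime().UTC()) // formerly ptypes.Timestamp
    }
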
  • v1.4.3(Oct 15, 2020)

  • v1.4.2(May 14, 2020)

  • v1.4.1(May 4, 2020)

  • v1.4.0(Apr 13, 2020)

    Overview

    This release of the github.com/golang/protobuf module introduces a number of significant changes relative to the previous minor release. In particular, this module is now implemented in terms of the new google.golang.org/protobuf module, which is the next major revision of Go bindings for protocol buffers. From this point onwards, most of the development effort for Go protobufs will be dedicated to the new module, with minimal changes being made to this module.

    See the release notes for the new module for specific implementation details that may affect this release.

    Backwards compatibility

    This release maintains backwards compatibility with previous releases of this module. Any observable changes in behavior are to fix bugs, change unspecified behavior, or to make behavior more compliant with the protobuf specification. The compatibility document provides us the freedom to make changes in these areas.

    Notable changes

    Wire serialization

    Wire serialization is now implemented in terms of the new proto package by calling out to the relevant functionality in that package (e.g., proto.Marshal and proto.Unmarshal). There should be no observable changes in behavior other than what is mentioned elsewhere in the release notes (e.g., behavior around errors or nil values).

    JSON and text serialization

    The JSON and text format implementations have been ported to use protobuf reflection under the hood instead of relying on Go reflection. This provides flexibility as they can operate on any concrete message type that properly implements the new proto.Message interface.

    The implementations do not use the new protojson or prototext packages in order to maintain a higher degree of backwards compatibility. Our analysis unfortunately showed us that too many tests rely on their output being stable by performing byte-for-byte comparisons. Even though the compatibility promise gives us the freedom to change the output, we have chosen not to do so for pragmatic reasons. The implementations are now functionally frozen (bugs and all) and will not receive future improvements. Users are encouraged to migrate to the protojson or prototext packages instead.
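
    For users migrating, a minimal hedged example of the protojson package applied to a well-known type (not part of the release notes themselves):

    package main

    import (
        "fmt"
        "time"

        "google.golang.org/protobuf/encoding/protojson"
        "google.golang.org/protobuf/types/known/durationpb"
    )

    func main() {
        // protojson follows the canonical proto3 JSON mapping; unlike the
        // frozen jsonpb implementation, its output may evolve over time.
        d := durationpb.New(90 * time.Second)
        out, err := protojson.Marshal(d)
        if err != nil {
            panic(err)
        }
        fmt.Println(string(out)) // prints "90s" (as a JSON string)
    }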

    Well-known types

    The well-known types declared under ptypes are moved to the google.golang.org/protobuf module. The packages continue to exist, but all declarations forward to ones in the new module.

    For a period of time, it is expected that the protoc-gen-go plugin continues to generate code with dependencies on the well-known types in this module. The import paths for the well-known types are determined by the go_package option specified in the .proto files. Since these files are managed by the main protocol buffers project, it will require a new release of the protobuf toolchain before the new import paths take effect. Depending on this module’s packages for well-known types is fine since they simply forward over to the new packages.

    While descriptor and plugin are not packages for well-known types, they too have been moved to the new module.

    Types registry

    In order for dynamic usages of protobufs to operate correctly, there must be a unified registry between this module and the new google.golang.org/protobuf module. The protoregistry package is the primary registry for all protobuf files that are linked into a Go program. The registration functions (e.g., proto.RegisterType) in this package forward to the global registries in that module, and the lookup functions (e.g., proto.MessageType) in this package source from the global registries in that module.
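
    A small sketch of how the forwarded lookups behave, assuming only that the well-known Timestamp type is linked into the program:

    package main

    import (
        "fmt"

        "github.com/golang/protobuf/proto"

        // Linking in a generated package registers it in the unified registry.
        _ "github.com/golang/protobuf/ptypes/timestamp"
    )

    func main() {
        // proto.MessageType in this module now reads from the global registries
        // shared with google.golang.org/protobuf.
        fmt.Println(proto.MessageType("google.protobuf.Timestamp"))
    }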

    Nil values

    Use of the google.golang.org/protobuf module for the underlying implementation means that semantics for nil values may have changed. See that module’s release notes for details.

    Errors

    A consequence of using the google.golang.org/protobuf module as the underlying implementation is that the text of some errors may change in trivial ways. This may break brittle tests that depend on exact error messages. See that module’s release notes for details.

    Generated code

    The protoc-gen-go plugin in this module is now a thin wrapper over the protoc-gen-go plugin in the google.golang.org/protobuf module. As a result, there are many changes to the generated code. See that module’s release notes for details. Users should migrate to use the new protoc-gen-go plugin instead of the old one. Code generated by either plugin should be compatible with either module.

    For backward compatibility purposes, the protoc-gen-go plugin in this module continues to support generation of gRPC bindings, while the protoc-gen-go plugin in the new module does not.

    Upcoming breaking changes

    Per the compatibility agreement for Go protobufs, we promise to make announcements about deliberate or potentially breaking changes 6 months in advance.

    None of these changes are effective yet in this release.

    Deprecation of Descriptor methods

    The Descriptor methods on generated messages are deprecated and may be removed in the future.

    The descriptor.Message type is deprecated as fewer and fewer message types in the future will implement the descriptor.Message interface. Existing usages should migrate to the new proto.Message interface.

    The descriptor.ForMessage function is also deprecated. Users should migrate the code to use protobuf reflection to access descriptor information.
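
    A hedged sketch of the replacement pattern, using the well-known Timestamp message purely as an example:

    package main

    import (
        "fmt"

        "google.golang.org/protobuf/reflect/protodesc"
        "google.golang.org/protobuf/types/known/timestamppb"
    )

    func main() {
        // Instead of descriptor.ForMessage, obtain the descriptor through
        // protobuf reflection on any message that supports it.
        md := (&timestamppb.Timestamp{}).ProtoReflect().Descriptor()
        fmt.Println(md.FullName())

        // If the raw descriptor proto is still needed, it can be rebuilt.
        fmt.Println(protodesc.ToDescriptorProto(md).GetName())
    }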

    Removal of generator package

    The generator package is an internal implementation detail of protoc-gen-go and existed before the internal package mechanism. It has long been exempt from the compatibility promise. A future release of this module will remove the package. Existing users must migrate to the new compiler/protogen package, which is a stable API for implementing protoc plugins, or fork the existing generator package.
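
    For orientation, a minimal skeleton of a protoc plugin built on compiler/protogen; the output file name and contents here are placeholders, not a prescribed layout:

    package main

    import (
        "google.golang.org/protobuf/compiler/protogen"
    )

    func main() {
        protogen.Options{}.Run(func(gen *protogen.Plugin) error {
            for _, f := range gen.Files {
                if !f.Generate {
                    continue
                }
                // Emit one (placeholder) file per .proto file named on the command line.
                g := gen.NewGeneratedFile(f.GeneratedFilenamePrefix+".note.go", f.GoImportPath)
                g.P("// Code generated from ", f.Desc.Path(), ". DO NOT EDIT.")
                g.P("package ", f.GoPackageName)
            }
            return nil
        })
    }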

    Similarly, the grpc package is also an internal implementation detail of protoc-gen-go and will also be deleted in a future release of this module.

  • v1.3.5(Mar 12, 2020)

  • v1.4.0-rc.4(Mar 9, 2020)

  • v1.4.0-rc.3(Mar 2, 2020)

  • v1.4.0-rc.2(Feb 29, 2020)

  • v1.3.4(Feb 26, 2020)

    Notable changes:

    • Updated google/protobuf/*.proto to github.com/protocolbuffers/protobuf v3.11.4.
    • Minor change to protoc-gen-go output to avoid post-gofmt variation between go1.13 and go1.14.
  • v1.4.0-rc.1(Feb 13, 2020)

  • v1.3.3(Jan 29, 2020)

  • v1.3.2(Jul 9, 2019)

    Notable changes:

    • #785: grpc code generation: add an UnimplementedServer type implementing each server interface, returning an unimplemented error for each method
    • #851: convert prints to os.Stderr to use log.Printf
    • #883: jsonpb: fix marshaling of Duration with negative nanoseconds
  • v1.3.1(Mar 14, 2019)

  • v1.3.0(Feb 26, 2019)

    Notable changes:

    • #699: add a go.mod module file
    • #701: stop generating package "// import" comment
    • #741: deprecate {Unm,M}arshalMessageSet{JSON}
    • #760: different internal implementation of oneofs
    • various minor changes to code generation
  • v1.2.0(Aug 20, 2018)

    Notable changes:

    • #605 jsonpb.Unmarshal correctly handles escaped strings for some of the well-known types
    • #627 jsonpb.Unmarshal correctly returns an error when trying to unmarshal a scalar enum into repeated enums.
    • #630 jsonpb.Unmarshal correctly handles string versions of numeric primitives
    • #644 jsonpb.Unmarshal correctly handles enum values in maps
    • #624 fixes a performance regression where nested maps led to quadratic behavior
    • #628 reverts strict UTF-8 validation for proto2 string fields
    • #660 makes UTF-8 validation errors for proto3 string fields non-fatal
  • v1.1.0(Apr 30, 2018)

    Notable changes:

    • The serialization logic (for both Marshal and Unmarshal) has been optimized. Testing inside Google demonstrates that the new implementation is about 1.3x to 2.1x faster.
    • google/protobuf#272 Unknown fields in proto3 are now preserved.
    • #472 JSON marshal and unmarshal now check for proto2 required field.
    • #483 GetExtension returns the raw bytes if the ExtensionDesc does not have type information.
    • #499 Invalid UTF-8 in string types are rejected during marshal and unmarshal.
    • #511 Fields of bad wire types are treated as unknown fields instead of an error.
    • #544 The protoc-gen-go generator supports the paths=source_relative option.
    • #561 Unknown fields in map entries are now ignored instead of causing an unmarshal error.
    • #583 The "import public" feature is now implemented using Go type aliases.