
protokaf

Kafka producer and consumer tool in protobuf format.

Features

  • Consume and produce messages using Protobuf protocol
  • Trace messages with Jaeger
  • Create custom templates for one or multiple messages and produce them to Kafka

Install

go install github.com/SberMarket-Tech/protokaf@latest

Configuration

The configuration file is optional, so you can skip this section.

To work, Protokaf needs to know how to reach your Kafka broker. The first option is to pass --broker each time you invoke Protokaf. Another option is to use a configuration file: point Protokaf at one with the -F option on the command line, or rely on the defaults, where Protokaf looks for .protokaf.yaml in the current directory and then in $HOME/.protokaf.yaml.

Example of .protokaf.yaml

debug: true
broker: "<addr>:<port>"
kafka-auth-dsn: "SCRAM-SHA-512:<namespace>:<passwd>"
proto: "<dir>/<protofile>"
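
The kafka-auth-dsn value packs the SASL mechanism, user, and password into one colon-separated string. As a rough sketch of how such a DSN splits apart (parseDSN and its field names are illustrative assumptions, not protokaf's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// AuthDSN holds the pieces of a "<mechanism>:<user>:<password>" string.
// The type and field names are illustrative only.
type AuthDSN struct {
	Mechanism, User, Password string
}

// parseDSN splits the DSN into exactly three colon-separated parts.
// SplitN keeps any further colons inside the password intact.
func parseDSN(dsn string) (AuthDSN, error) {
	parts := strings.SplitN(dsn, ":", 3)
	if len(parts) != 3 {
		return AuthDSN{}, fmt.Errorf("want <mechanism>:<user>:<password>, got %q", dsn)
	}
	return AuthDSN{Mechanism: parts[0], User: parts[1], Password: parts[2]}, nil
}

func main() {
	a, err := parseDSN("SCRAM-SHA-512:admin:secret")
	if err != nil {
		panic(err)
	}
	fmt.Println(a.Mechanism, a.User)
}
```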

Help

$ protokaf help

List metadata

$ protokaf list [-t <topic>(,<topic>...)]
1 brokers:
 broker 1 "127.0.0.1:9093"
2 topics:
  topic "test-topic", partitions: 1
    partition 0, leader 1, replicas: [1] (offline: []), isrs: [1]
  topic "test", partitions: 1
    partition 0, leader 1, replicas: [1] (offline: []), isrs: [1]

Produce

Help

$ protokaf produce -h

Examples

This proto file will be used in the examples below.

api/example.proto

syntax = "proto3";

package example;

message HelloRequest {
  string name = 1;
  int32 age = 2;
}

A simple produce message

$ protokaf produce HelloRequest \
    --broker kafka:9092 \
    --proto api/example.proto \
    --topic test \
    --data '{"name": "Alice", "age": 11}'

Produce message with headers

$ protokaf produce HelloRequest \
    --broker kafka:9092 \
    --proto api/example.proto \
    --topic test \
    --header "priority=high" \
    --header "application=protokaf" \
    --data '{"name": "Alice", "age": 11}'

Produce message with template

$ protokaf produce HelloRequest \
    --broker kafka:9092 \
    --proto api/example.proto \
    --topic test \
    --data '{"name": {{randomFemaleName | quote}}, "age": {{randomNumber 10 20}}}' \
    --count 10 \
    --seed 42
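
With a fixed --seed, the template expands to the same pseudo-random sequence on every run. A minimal sketch of that idea with Go's text/template and a seeded source; randomNumber and quote here are stand-ins for protokaf's template functions, not its real implementation:

```go
package main

import (
	"bytes"
	"fmt"
	"math/rand"
	"strconv"
	"text/template"
)

// render expands the template once per message with a seeded random
// source, so the same seed always yields the same output sequence.
func render(tmpl string, count int, seed int64) []string {
	rng := rand.New(rand.NewSource(seed))
	funcs := template.FuncMap{
		// Stand-ins for protokaf's template helpers.
		"randomNumber": func(lo, hi int) int { return lo + rng.Intn(hi-lo) },
		"quote":        strconv.Quote,
	}
	t := template.Must(template.New("msg").Funcs(funcs).Parse(tmpl))
	out := make([]string, count)
	for i := range out {
		var buf bytes.Buffer
		if err := t.Execute(&buf, nil); err != nil {
			panic(err)
		}
		out[i] = buf.String()
	}
	return out
}

func main() {
	msgs := render(`{"age": {{randomNumber 10 20}}}`, 3, 42)
	fmt.Println(msgs)
}
```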

Produce message with Kafka auth

$ protokaf produce HelloRequest \
    --broker kafka:9093 \
    --kafka-auth-dsn "SCRAM-SHA-512:login:passwd" \
    --proto api/example.proto \
    --topic test \
    --data '{"name": "Alice", "age": 11}'

Read data from stdin or flag

Read a HelloRequest message from stdin and produce it to the test topic

$ echo '{"name": "Alice", "age": 11}' | protokaf produce HelloRequest -t test

Read a HelloRequest message from the -d flag value and produce it to the test topic

$ protokaf produce HelloRequest -t test -d '{"name": "Alice", "age": 11}'

Template

Template options

  • --seed <int> Set a number greater than zero to reproduce the same pseudo-random sequence of messages
  • --count <int> Number of messages to produce; useful for generating batches of random data
  • --concurrency <int> Number of message senders to run concurrently

Show all template functions

$ protokaf produce --template-functions-print

Consume

Help

$ protokaf help consume

Examples

$ protokaf consume HelloRequest \
    --broker kafka:9092 \
    --proto api/example.proto \
    --group mygroup \
    --topic test

Read messages from the Kafka test topic using group mygroup and print them to stdout

$ protokaf consume HelloRequest -G mygroup -t test

Read the last 10 messages from the test topic, then exit

$ protokaf consume HelloRequest -G mygroup -t test -c 10

Testing

Prepare test environment

make docker-dev-up
make kafka-users
make install # optional (you can use 'go run . <args> <flags>')

Issues

  • Parameters for producing from file

    Hey, thank you for your project! It would be nice to be able to keep the parameters in a file and produce from the saved file later.

    Save to file:

    $ protokaf generate HelloRequest \
        --data '{"name": {{randomFemaleName | quote}}, "age": {{randomNumber 10 20}}}' \
        --count 10 \
        --seed 42 \
        --save data.json
    

    The file could then be used to produce:

    $ protokaf produce HelloRequest \
        --load data.json
    
    opened by asusikov
  • Support Lua scripts to generate random data for producing

    To generate specific data, it would be great to support Lua scripts.

    For example, if my script is in funcs.lua:

    function randomCardNumber()
        return "4242424242424242"
    end
    

    It could then be passed in like this:

    $ protokaf produce CardRequest \
      --lua funcs.lua \
      --data '{"card": {{randomCardNumber | quote}}}' \
      ...
    
    opened by ArtyomKomarenko
  • Several configurations in one file

    I would like to be able to keep several configurations in .protokaf.yaml, something like this:

    profiles:
      local:
        debug: true
        broker: 0.0.0.0:9092
    
      stage:
        debug: true
        broker: stage.host:9093
        kafka-auth-dsn: "SCRAM-SHA-256:admin:secret"
    

    And select one like this:

    $ protokaf consume HelloRequest -P stage ...
    
    opened by Arkweid
Owner
SberMarket Tech
Open Source by RnD team