scany

Library for scanning data from a database into Go structs and more.

Overview

Go favors simplicity, and it's pretty common to work with a database via its driver directly, without any ORM. That gives you great control and efficiency in your queries, but there is a problem: you need to manually iterate over database rows and scan the data from all columns into the corresponding destinations. This is error-prone, verbose, and tedious. scany aims to solve this problem: it lets developers scan complex data from a database into Go structs and other composite types with a single function call, without manual rows iteration.
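
For contrast, here is a minimal sketch of the manual database/sql iteration that scany replaces (it assumes the User struct, db handle, and ctx from the examples below):

rows, err := db.QueryContext(ctx, `SELECT id, name, email, age FROM users`)
if err != nil {
	// Handle the query error.
}
defer rows.Close()

var users []*User
for rows.Next() {
	var u User
	// Every column must be scanned into its destination by hand, in the right order.
	if err := rows.Scan(&u.ID, &u.Name, &u.Email, &u.Age); err != nil {
		// Handle the scan error.
	}
	users = append(users, &u)
}
if err := rows.Err(); err != nil {
	// Handle the iteration error.
}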

scany isn't limited to any specific database. It integrates with database/sql, so any database with a database/sql driver is supported. It also works with the pgx library's native interface. Beyond the out-of-the-box support, scany can be easily extended to work with almost any database library.

Note that scany isn't an ORM. First, it works in only one direction: it scans data from the database into Go objects, but it can't build database queries based on those objects. Second, it doesn't know anything about relations between objects, e.g. one-to-many or many-to-many.

Features

  • Custom database column name via struct tag
  • Reusing structs via nesting or embedding
  • NULLs and custom types support
  • Omitted struct fields
  • Apart from structs, support for other destination types: maps, slices, etc. (see the sketch after this list)
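
A minimal sketch of these struct-mapping rules, following the conventions described in scany's documentation (the Comment type and all column names here are illustrative):

type Comment struct {
	ID      string `db:"comment_id"` // custom column name via the db struct tag
	Body    string                   // maps to the snake-case column "body" by default
	Deleted *string                  // NULL-friendly: a NULL column scans into a nil pointer
	Draft   string `db:"-"`          // omitted: scany never scans into this field
}

type CommentWithAuthor struct {
	Comment                   // embedding: Comment's columns map with no prefix
	Author User `db:"author"` // nesting: maps columns "author.id", "author.name", etc.
}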

Install

go get github.com/georgysavva/scany

How to use with database/sql

package main

import (
	"context"
	"database/sql"

	// Register a database/sql driver for Postgres; lib/pq is one common choice.
	_ "github.com/lib/pq"

	"github.com/georgysavva/scany/sqlscan"
)

type User struct {
	ID    string
	Name  string
	Email string
	Age   int
}

func main() {
	ctx := context.Background()
	db, err := sql.Open("postgres", "example-connection-url")
	if err != nil {
		// Handle the error.
	}

	var users []*User
	if err := sqlscan.Select(ctx, db, &users, `SELECT id, name, email, age FROM users`); err != nil {
		// Handle the error.
	}
	// users variable now contains data from all rows.
}

Use the sqlscan package to work with the database/sql standard library.
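
To fetch a single row, sqlscan also provides Get; a minimal sketch (userID is a hypothetical variable):

var user User
err := sqlscan.Get(ctx, db, &user, `SELECT id, name, email, age FROM users WHERE id = $1`, userID)
if err != nil {
	// sqlscan surfaces sql.ErrNoRows when no row matched (see the v0.2.6 release notes below).
}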

How to use with pgx native interface

package main

import (
	"context"

	"github.com/jackc/pgx/v4/pgxpool"

	"github.com/georgysavva/scany/pgxscan"
)

type User struct {
	ID    string
	Name  string
	Email string
	Age   int
}

func main() {
	ctx := context.Background()
	db, err := pgxpool.Connect(ctx, "example-connection-url")
	if err != nil {
		// Handle the error.
	}

	var users []*User
	if err := pgxscan.Select(ctx, db, &users, `SELECT id, name, email, age FROM users`); err != nil {
		// Handle the error.
	}
	// users variable now contains data from all rows.
}

Use the pgxscan package to work with the pgx library's native interface.
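
If you already have pgx.Rows in hand, pgxscan can scan them directly; a minimal sketch using ScanAll, which iterates the rows and closes them when done:

rows, err := db.Query(ctx, `SELECT id, name, email, age FROM users`)
if err != nil {
	// Handle the query error.
}

var users []*User
if err := pgxscan.ScanAll(&users, rows); err != nil {
	// Handle the scan error; rows have been closed by ScanAll.
}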

How to use with other database libraries

Use the dbscan package, which works with an abstract database and can be integrated with any library that has a concept of rows. This package implements the core scany features and contains all the logic; both sqlscan and pgxscan use dbscan internally.
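
As a sketch of what such an integration needs: dbscan scans from its own small rows abstraction, so adapting a new library mostly means wrapping that library's row type so it satisfies this interface (shape per the dbscan documentation for scany v1):

// dbscan.Rows is the abstraction that dbscan scans data from.
type Rows interface {
	Close() error
	Err() error
	Next() bool
	Columns() ([]string, error)
	Scan(dest ...interface{}) error
}

Once a wrapper satisfies dbscan.Rows, it can be passed to functions such as dbscan.ScanAll or dbscan.ScanOne.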

Comparison with sqlx

  • sqlx only works with the database/sql standard library. scany isn't limited to database/sql: it also supports the pgx native interface and can be extended to work with any database library independent of database/sql.
  • In terms of scanning and mapping abilities, scany provides all the features of sqlx.
  • scany has a simpler API and far fewer concepts, so it's easier to start working with.

Supported Go versions

scany supports Go 1.13 and higher.

Roadmap

  • Add the ability to set a custom function for translating a struct field name to a column name, instead of the default snake-case conversion
  • Allow using a custom separator for the embedded-structs prefix, instead of the default "."

Tests

The easiest way to run the tests is:

go test ./...

scany runs a CockroachDB server to execute its tests. It downloads, caches, and runs the CockroachDB binary for you, so the only requirement to run the tests is an internet connection. Alternatively, you can download the CockroachDB binary yourself and pass its path into the tests:

go test ./... -cockroach-binary cockroach

golangci-lint

This project uses golangci-lint v1.38.0.

To run the linter locally, do the following:

  1. Install the golangci-lint program
  2. In the project root, run: golangci-lint run

Contributing

Every feature request or question is appreciated. Don't hesitate. Just post an issue or PR.

License

This project is licensed under the terms of the MIT license.

Comments
  • Are there any plans to support pgx v5?

    In a month or so, pgx v5 will be officially released.

    • https://github.com/jackc/pgx/issues/1169
    • https://github.com/jackc/pgx/issues/1061#issuecomment-1186250809

      This is the last major change planned for v5. I plan to release the (hopefully) final alpha release within the next few days (maybe even tonight), then a beta release a couple weeks after that, and the general release a few weeks after that.

    Here are the commits where I added v5 support, for reference.

    • https://github.com/jnst/scany/commits/master

    pgx v5 requires Go 1.18 or higher. Do you plan to support this in scany?

    opened by jnst 18
  • When wrapping a single anonymous structure we lose its column mapping

    When you need to wrap a structure anonymously, simply to implement some interfaces or custom methods, scany loses the column mapping and throws the error: scany: scan row into struct fields: can't scan into dest[1]: cannot assign ...

    type UUID struct {
        pgtype.UUID
    }
    q := `SELECT id FROM ...`
    
    var rows []struct {
        UUID `db:"id"`
    }
    if err := pgxscan.Select(ctx, pg.Client(tx), &rows, q); err != nil {
        return nil, err
    }
    
    // listing uuids
    fmt.Println(rows)
    

    I propose the following fix: #38

    opened by idaunis 17
  • scanning to nested structs

    Hello, I was using sqlx and I moved to pgx with scany.

    I have an issue here that I couldn't resolve with sqlx either; I hope there is some sort of resolution other than manually scanning every row.

    I use PostgreSQL 12.4 and I'm executing a function that returns a table with the following:

    CREATE FUNCTION ... RETURNS TABLE (
        type text,
        name text,
        based_on text[],
        source text,
        ingredients json[],
        accessories json[],
        glasses json[],
        created_at timestamptz
    )
    ...
    

    I use sqlgen for GraphQL and it generated the following types:

    type SearchResultRow struct {
    	Name        string            `json:"name" db:"name"`
    	Type        string            `json:"type" db:"type"`
    	AddedBy     *string           `json:"added_by" db:"added_by"`
    	Source      *string           `json:"source" db:"source"`
    	Ratings     *int              `json:"ratings" db:"ratings"`
    	BasedOn     pq.StringArray    `json:"based_on" db:"based_on"`
    	Ingredients []*IngredientType `json:"ingredients" db:"ingredients"`
    	Accessories []*AccessoryType  `json:"accessories" db:"accessories"`
    	Glasses     []*GlassType      `json:"glasses" db:"glasses"`
    	CreatedAt   string            `json:"created_at" db:"created_at"`
    }
    
    type IngredientType struct {
    	Name       string         `json:"name" db:"name"`
    	Amount     float64        `json:"amount" db:"amount"`
    	AmountType pq.StringArray `json:"amount_type" db:"amount_type"`
    }
    
    type AccessoryType struct {
    	Name       string         `json:"name" db:"name"`
    	Amount     float64        `json:"amount" db:"amount"`
    	AmountType pq.StringArray `json:"amount_type" db:"amount_type"`
    }
    
    type GlassType struct {
    	Name       string         `json:"name" db:"name"`
    	Amount     float64        `json:"amount" db:"amount"`
    	AmountType pq.StringArray `json:"amount_type" db:"amount_type"`
    }
    

    I'm querying the database using pgxscan with the following code:

    var rows []*model.SearchResultRow
    err := pgxscan.Select(context.Background(), Connection, &rows, sqlQuery, query);
    

    and the error that I got is this: scany: scan row into struct fields: can't scan into dest[4]: unknown oid 199 cannot be scanned into *[]*model.IngredientType

    Is there anything I can do to overcome this? I wouldn't mind returning different types from the database, changing it from a json array to a single json object, anything that can help me resolve this.

    Thank you.

    opened by kfirufk 16
  • use fmt to wrap errors, not archived pkg/error

    Currently, github.com/pkg/errors is used for error handling; however, since Go 1.13 introduced error wrapping in the standard library, this is deprecated, and the repo has since been archived.

    help wanted 
    opened by mrkagelui 9
  • pgx module needs to be updated

    The version of pgx in use depends on an older version of pgtype, which does not support nullable pointers to custom types. Later versions of pgtype (>= 1.4.0) support this.

    opened by PaulForgey 8
  • Error handling

    Hi, thanks for the library.

    Can you explain the best way to handle pgx.ErrNoRows? It appears I can't just do something like:

    
    var dst []Whatever // hypothetical destination type; the original snippet omitted the required destination argument
    if err := pgxscan.Select(ctx, db, &dst, "select * from whatever"); err != nil {
        // note this is the standard import "errors"
        if errors.Is(err, pgx.ErrNoRows) {
            return fmt.Errorf("no results: %w", err)
        }
        return err
    }
    

    Does this mean I have to use "pkg/errors" to find the cause?

    Thanks

    opened by jfyne 8
  • Missing support for sql.Null types with aggregate functions like SUM

    I just tried plugging this library into my use case with pgx; for the most part, it works great. One bug I have run into is with a code snippet like this:

    	var mySum sql.NullInt64
    	query := "SELECT SUM(my_column) FROM my_table"
    	err := pgxscan.Get(ctx, p.pool, &mySum, query)
    	if err != nil {
    		logger.Errorw("Failed to fetch sum", "err", err)
    	}
    

    In this case, I would expect the SUM() aggregate to be parsed and then populate the sql.NullInt64 value. However, this query produces the error: scany: column: 'sum': no corresponding field found, or it's unexported in sql.NullInt64.

    I noticed that changing the declaration to var mySum int64 appeared to solve the problem.

    I am guessing the bug is that the sql.Null* types are not populated correctly when aggregate functions are at play; I would expect them to be populated appropriately rather than throwing an error.

    opened by evanfuller 6
  • ❓ Use array_agg

    Hey. I have a problem that I described earlier at this link.

    I updated my code a bit, and it now looks like the following.

    package sqlstore
    
    import (
    	"context"
    	"fmt"
    	"log"
    	"os"
    	"time"
    
    	"github.com/georgysavva/scany/pgxscan"
    	"github.com/jackc/pgx/v4"
    )
    
    // Service ...
    type Service struct {
    	ID   int    `db:"id"`
    	Name string `db:"name"`
    }
    
    // Price ...
    type Price struct {
    	ID        int        `db:"id"`
    	Value     int        `db:"value"`
    	Category  string     `db:"category"`
    	Available bool       `db:"available"`
    	Services  []Service  `db:"services"`
    	CreatedAt time.Time  `db:"created_at"`
    	DeletedAt *time.Time `db:"deleted_at"`
    }
    
    // Event ...
    type Event struct {
    	ID          int        `db:"id"`
    	Name        string     `db:"name"`
    	Description string     `db:"description"`
    	Address     string     `db:"address"`
    	StartDate   time.Time  `db:"start_date"`
    	Duration    int        `db:"duration"`
    	Prices      []Price    `db:"prices"`
    	CreatedAt   time.Time  `db:"created_at"`
    	DeletedAt   *time.Time `db:"deleted_at"`
    }
    
    // GetEvents ...
    func GetEvents(conn *pgx.Conn) ([]Event, error) {
    	var items []Event
    
    	err := pgxscan.Select(
    		context.Background(),
    		conn,
    		&items,
    		`
    			SELECT 
    				e.*,
    				array_agg(array[pr.*]) prices
    			FROM events e LEFT JOIN
    				(select
    					p2.*
    				from prices p2
    				group by p2.id) pr
    				ON e.id = pr.event_id
    			WHERE e.deleted_at IS NULL GROUP BY e.id ORDER BY e.id DESC
    		`,
    	)
    	if err != nil {
    		return nil, err
    	}
    
    	return items, nil
    }
    
    func main() {
    	conn, err := pgx.Connect(context.Background(), os.Getenv("DATABASE_URL"))
    	if err != nil {
    		log.Fatalln(err)
    	}
    	defer conn.Close(context.Background())

    	events, err := GetEvents(conn)
    	if err != nil {
    		log.Fatalln(err)
    	}
    	fmt.Printf("%+v\n", events)
    }
    

    The question is as follows: how do I use array_agg? I am unable to decode the array into []Price.

    scany: scan row into struct fields: can't scan into dest[20]: unknown oid 2287 cannot be scanned into *[]Price
    

    If possible, please show with this example how to decode an array from Postgres.

    Thanks

    opened by sxwebdev 6
  • hybrid mapping struct / map[string]float

    Hi,

    First of all, thank you very much for contributing this library to the community. It seems to be quite easy to use.

    I am trying to get arbitrary data out of a Timescale database, and I would like to mix struct and map[string] scanning.

    Given those structs:

    type Data struct {
    	PhConTot float64
    	PhGenTot float64
    }
    
    type Metric struct {
    	Time     time.Time
    	DeviceId string
    	Data     Data
    }
    

    I am correctly scanning data with

    	var metrics []*Metric
    	if err := pgxscan.Select(ctx, conn, &metrics, `
    		select time, device_id, ph_con_tot as "data.ph_con_tot", ph_gen_tot as "data.ph_gen_tot"
    		from metrics_computer_wide
    		  WHERE device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null and 
    		  time > '2022-04-01' and time <= '2022-04-05'
    		`); err != nil {
    		fmt.Fprintf(os.Stderr, "Unable to query database: %v\n", err)
    	}
    

    But since Data fields can be a huge list of options, I would like it to be a map[string]float64. For example

    type MetricMap struct {
    	Time     time.Time
    	DeviceId string
    	Data     map[string]float64
    }
    

    And querying like

    	var metricsMap []*MetricMap
    	if err := pgxscan.Select(ctx, conn, &metricsMap, `
    		select time, device_id, ph_con_tot as "data.ph_con_tot", ph_gen_tot as "data.ph_gen_tot"
    		from metrics_computer_wide
    		  WHERE device_id = '9d5eaae0-421b-11ec-9949-7f0fdad2c99c' and ph_con_tot is not null and 
    		  time > '2022-04-01' and time <= '2022-04-05' limit 10
    		`); err != nil {
    		fmt.Fprintf(os.Stderr, "Unable to query database: %v\n", err)
    	}
    

    Then I am getting

    Unable to query database: scany: column: 'data.ph_con_tot': no corresponding field found, or it's unexported in main.MetricMap

    Is there any way to mix structs and map[string] destinations, or can I extend the scanner?

    Thank you very much!

    opened by xorduna 5
  • Any examples showing how to handle NULL db entries?

    When running the Get() or ScanRow() methods of the pgxscan library, an error is returned if any of the fields are NULL in the database. Is there a way to scan while accounting for possible NULL values?

    Thanks, Chris

    opened by ccrotty 5
  • Scanning postgres array columns into map[string]interface{}

    Thanks a lot for this great library!

    I'm trying to scan a query where some of the columns are arrays, with the destination being a map[string]interface{}. This works fine for all columns, but the array columns don't turn into slices; instead, each becomes an object with a structure like this:

    "myArray": {
        "Elements": ["my first value"],
        "Dimensions": [
          {
            "Length": 1,
            "LowerBound": 1
          }
        ],
        "Status": 2
      }
    

    What I actually expected is:

    "myArray": ["my first value"]
    

    Is this intentional? Is there some way to get around this? If I scan into a struct with slice fields, it works fine. But for my use case, a map is needed.

    opened by andersarpi 5
  • Select with pgxpool Pool NewWithConfig

    I'm trying to figure out how to get scany to work with pgxpool, where I build my connection using NewWithConfig rather than the plain New function.

    pool, err = pgxpool.NewWithConfig(context.Background(), config)
    sqlscan.Select(r.Context(), pool, &wallet, `SELECT stripe_id FROM member_wallet WHERE member_uuid = ?`, "123")
    
    

    It complains that

    Cannot use 'pool' (type *pgxpool.Pool) as the type Querier Type does not implement 'Querier' as some methods are missing: QueryContext(ctx context.Context, query string, args ...interface{}) (*sql.Rows, error)
    

    I've looked at everything in the Pool object, and nothing would conform to that requirement. Any ideas?

    opened by trentmurray 0
  • Don't reset the destination slice in ScanAll method

    Hi, I found out that ScanAll resets the destination slice.

    Is there any way to not empty the slice and keep the slice elements' pointers, or can you explain why exactly it's required? Maybe you could add a new function or modify the existing one to solve this problem.

    Here's a simplified example of a real case that can occur - when some data references the slice elements:


    Code: https://go.dev/play/p/smM5FLMDrHA

    opened by CyganFx 2
  • FR: config to ignore columns not found in destination type

    Hi, thanks for building this!

    We have an SQL query that's part static (e.g. select * from users) and part dynamic (e.g. multiple joins with user-defined tables; the number of joins is decided at runtime). We have no problem scanning the "dynamic" columns ourselves, but we hoped to use scany to scan the static part, so for example the following would be great:

    type User struct {
        ...
    }
    
    type Result struct {
        User
        DynamicStuff []DynamicData
    }
    ...
    ...
    scanner := dbscan.NewRowScanner(...).IgnoreUnknownColumns()
    // we use per-row scanning interface so we can do our own scans for the dynamic part.
    for rows.Next() {
        var user User
        scanner.Scan(&user)
        // .. our own scanning for the dynamic part ..
        d := scanDynamicStuff(&rows)
        result := Result{user, d}
    }
    

    Please let me know if this makes sense; we can do a PR as well.

    help wanted 
    opened by yalon 3
  • Named query parameters implemented

    I've integrated my pgxe library into scany.

    ~~Note:~~ ~~What this PR still lacks, however, is the possibility to do named queries that do not return anything (NamedExec()).~~

    ~~Also, test cases for the new API methods need to be added and the documentation needs to be improved.~~

    ~~The query parser as of right now only supports : as the delimiter; if we want to support more delimiters I need to update the code.~~

    opened by anton7r 33
  • Scanning into a recursive struct

    Hi,

    I'd like to scan into a recursive struct:

    type Node struct {
    	ID       string
    	Parent   *Node
    }
    var nodes []*Node
    err := sqlscan.Select(ctx, conns.PgClient, &nodes, `select id, null as parent from nodes where parent_id is null`)
    

    This causes an infinite loop, as scany introspects the struct recursively.

    Any tips for achieving that? I might be missing something obvious since I'm new to Go and scany. Ideally I'd like not to touch the Node struct definition, since in my real-life code it is auto-generated. Also, I'd like to avoid using db:"-" on Parent since I might want to get the immediate parent with a join.

    Thanks!

    help wanted 
    opened by n-e 2
Releases(v2.0.0)
  • v2.0.0(Oct 30, 2022)

    What's Changed

    • support pgx v5 by @vadimi in https://github.com/georgysavva/scany/pull/94
    • replace pgtype with pgx/v5/pgtype by @vadimi in https://github.com/georgysavva/scany/pull/97

    New Contributors

    • @vadimi made their first contribution in https://github.com/georgysavva/scany/pull/94

    Full Changelog: https://github.com/georgysavva/scany/compare/v1.2.1...v2.0.0

  • v2.0.0-alpha.3(Oct 9, 2022)

  • v2.0.0-alpha.2(Oct 1, 2022)

  • v2.0.0-alpha.1(Oct 1, 2022)

  • v1.2.1(Oct 1, 2022)

    What's Changed

    • Fix Github Workflow by @georgysavva in https://github.com/georgysavva/scany/pull/95

    Full Changelog: https://github.com/georgysavva/scany/compare/v1.2.0...v1.2.1

  • v1.2.0(Sep 5, 2022)

    What's Changed

    • Replace archived pkg/error with std lib errors by @mrkagelui in https://github.com/georgysavva/scany/pull/90

    New Contributors

    • @mrkagelui made their first contribution in https://github.com/georgysavva/scany/pull/90

    Full Changelog: https://github.com/georgysavva/scany/compare/v1.1.0...v1.2.0

  • v1.1.0(Aug 1, 2022)

    What's Changed

    • FR-72 - Allow to ignore unknown db colums by @talbSE in https://github.com/georgysavva/scany/pull/85

    New Contributors

    • @talbSE made their first contribution in https://github.com/georgysavva/scany/pull/85

    Full Changelog: https://github.com/georgysavva/scany/compare/v0.3.0...v1.1.0

  • v1.0.0(May 11, 2022)

  • v0.3.0(Jan 11, 2022)

    What's Changed

    • Introduce API object to customize library behavior
    • Handle single-column case for custom scannable types

    New Contributors

    • @nopcoder made their first contribution in https://github.com/georgysavva/scany/pull/70

    Full Changelog: https://github.com/georgysavva/scany/compare/v0.2.9...v0.3.0

  • v0.2.9(Jul 1, 2021)

  • v0.2.8(Apr 1, 2021)

  • v0.2.7(Nov 1, 2020)

  • v0.2.6(Oct 16, 2020)

    • pgxscan and sqlscan now return pgx.ErrNoRows and sql.ErrNoRows, respectively
    • improved documentation for pgxscan and sqlscan about querying support
  • v0.2.5(Aug 25, 2020)

  • v0.2.4(Jul 29, 2020)

  • v0.2.3(Jul 28, 2020)

  • v0.2.2(Jul 16, 2020)

  • v0.2.1(Jul 15, 2020)

    Allow ambiguous struct fields. Now, dbscan doesn't return an error if multiple struct fields map to the same column; it scans data into the first such field.

  • v0.2.0(Jul 13, 2020)

  • v0.1.0(Jul 15, 2020)

Owner

Georgy Savva