Type safe SQL query builder and struct mapper for Go

Overview

sq (Structured Query)

🎯 🏆 sq is a code-generated, type safe query builder and struct mapper for Go. 🏆 🎯
Documentation   •   Reference   •   Examples

This package provides type safe querying on top of Go's database/sql. It is essentially a glorified string builder, but automates things in all the right places to make working with SQL queries pleasant and boilerplate-free.

  • Avoid magic strings. SQL queries written in Go are full of magic strings: strings specified directly within application code that have an impact on the application's behavior. Specifically, you have to hardcode table or column names over and over into your queries (even ORMs are guilty of this!). Such magic strings are prone to typos and hard to change as your database schema changes. sq generates table structs from your database and ensures that whatever query you write is always reflective of what's actually in your database. more info

  • Better null handling. Handling NULLs is a bit of a pain in the ass in Go. You have to either use pointers (cannot be used in HTML templates) or sql.NullXXX structs (extra layer of indirection). sq scans NULLs as zero values, while still offering you the ability to check if the column was NULL. more info

  • The mapper function is the SELECT clause.

    • database/sql requires you to repeat the list of columns twice in the exact same order, once for SELECT-ing and once for scanning. If you mess the order up, that's an error.
    • Reflection-based mapping (struct tags) has you defining a set of possible column names to map, then requires you to repeat those column names in your query. If you mistype a column name in the struct tag, that's an error. If you SELECT a column that's not present in the struct, that's an error.
    • In sq, whatever you SELECT is automatically mapped. This means you just write your query and execute it, and if there were no errors, the data is already in your Go variables. No iterating rows, no specifying column scan order, no error checking three times. Write your query, run it, you're done.
    • more info

Features

Getting started

go get github.com/bokwoon95/go-structured-query

You will also need the dialect-specific code generator:

# Postgres
go get github.com/bokwoon95/go-structured-query/cmd/sqgen-postgres

# MySQL
go get github.com/bokwoon95/go-structured-query/cmd/sqgen-mysql

Generate tables from your database

# for more options, check out --help

# Postgres
sqgen-postgres tables --database 'postgres://name:[email protected]:5432/dbname?sslmode=disable' --overwrite

# MySQL
sqgen-mysql tables --database 'name:[email protected](127.0.0.1:3306)/dbname' --schema dbname --overwrite

For an example of what the generated file looks like, check out postgres/devlab_tables_test.go.

Importing sq

Each SQL dialect has its own sq package. Import the sq package for the dialect you are using accordingly:

// Postgres
import (
    sq "github.com/bokwoon95/go-structured-query/postgres"
)

// MySQL
import (
    sq "github.com/bokwoon95/go-structured-query/mysql"
)

Examples

You just want to see code, right? Here's some.

SELECT

-- SQL
SELECT u.user_id, u.name, u.email, u.created_at
FROM public.users AS u
WHERE u.name = 'Bob';
// Go
u := tables.USERS().As("u") // table is code generated
var user User
var users []User
err := sq.
    From(u).
    Where(u.NAME.EqString("Bob")).
    Selectx(func(row *sq.Row) {
        user.UserID = row.Int(u.USER_ID)
        user.Name = row.String(u.NAME)
        user.Email = row.String(u.EMAIL)
        user.CreatedAt = row.Time(u.CREATED_AT)
    }, func() {
        users = append(users, user)
    }).
    Fetch(db)
if err != nil {
    // handle error
}

INSERT

-- SQL
INSERT INTO public.users (name, email)
VALUES ('Bob', '[email protected]'), ('Alice', '[email protected]'), ('Eve', '[email protected]');
// Go
u := tables.USERS().As("u") // table is code generated
users := []User{
    {Name: "Bob",   Email: "[email protected]"},
    {Name: "Alice", Email: "[email protected]"},
    {Name: "Eve",   Email: "[email protected]"},
}
rowsAffected, err := sq.
    InsertInto(u).
    Valuesx(func(col *sq.Column) {
        for _, user := range users {
            col.SetString(u.NAME, user.Name)
            col.SetString(u.EMAIL, user.Email)
        }
    }).
    Exec(db, sq.ErowsAffected)
if err != nil {
    // handle error
}

UPDATE

-- SQL
UPDATE public.users
SET name = 'Bob', password = 'qwertyuiop'
WHERE email = '[email protected]';
// Go
u := tables.USERS().As("u") // table is code generated
user := User{
    Name:     "Bob",
    Email:    "[email protected]",
    Password: "qwertyuiop",
}
rowsAffected, err := sq.
    Update(u).
    Setx(func(col *sq.Column) {
        col.SetString(u.NAME, user.Name)
        col.SetString(u.PASSWORD, user.Password)
    }).
    Where(u.EMAIL.EqString(user.Email)).
    Exec(db, sq.ErowsAffected)
if err != nil {
    // handle error
}

DELETE

-- SQL
DELETE FROM public.users AS u
USING public.user_roles AS ur
JOIN public.user_roles_students AS urs ON urs.user_role_id = ur.user_role_id
WHERE u.user_id = ur.user_id AND urs.team_id = 15;
// Go
u   := tables.USERS().As("u")                 // tables are code generated
ur  := tables.USER_ROLES().As("ur")           // tables are code generated
urs := tables.USER_ROLES_STUDENTS().As("urs") // tables are code generated
rowsAffected, err := sq.
    DeleteFrom(u).
    Using(ur).
    Join(urs, urs.USER_ROLE_ID.Eq(ur.USER_ROLE_ID)).
    Where(
        u.USER_ID.Eq(ur.USER_ID),
        urs.TEAM_ID.EqInt(15),
    ).
    Exec(db, sq.ErowsAffected)
if err != nil {
    // handle error
}

For more information, check out the Basics.

For a list of example queries, check out Query Building.

Project Status

The external API is considered stable. Any changes will only be additions to the API (such as support for custom loggers and structured logging). If you have any feature requests or find bugs, do open a new issue.

Contributing

See CONTRIBUTING.md

Appendix

Why this exists

I wrote this because I needed a more convenient way to scan database rows into nested structs, some of which exist twice in the same struct due to self-joined tables. That made sqlx's StructScan unsuitable (e.g. it cannot handle type Child struct { Father Person; Mother Person }). database/sql's way of scanning is really verbose, especially since I had about ~25 fields to scan into, some of which could be null. That's a lot of sql.Null structs needed! Because I had opted to -not- pollute my domain structs with sql.NullInt64/sql.NullString etc., I had to create a ton of intermediate Null structs just to contain the possible null fields, then transfer their zero values back into the domain struct. There had to be a better way. I just wanted the zero values, since everything in Go accommodates the zero value.

sq is therefore a data mapper first, and query builder second. I try my best to make the query builder as faithful to SQL as possible, but the main reason for its existence was always the struct mapping.

The case for ALL_CAPS

Here are the reasons why ALL_CAPS is used for table and column names over the idiomatic MixedCaps:

  1. jOOQ does it.
  2. It's SQL. It's fine if it doesn't follow Go convention, because it isn't Go.
    • Go requires exported fields to be capitalized.
    • SQL, being case insensitive, generally uses underscores as word delimiters.
    • ALL_CAPS is a blend that satisfies both Go's export rules and SQL's naming conventions.
    • In my opinion, it is also easier to read because table and column names visually stand out from application code.
  3. Avoids clashing with interface methods. For a struct to fit the Table interface, it has to possess the methods GetAlias() and GetName(). This means that no columns can be called 'GetAlias' or 'GetName' because it would clash with the interface methods. This is sidestepped by following an entirely different naming scheme for columns i.e. ALL_CAPS.

On SQL Type Safety

sq makes no effort to check the semantics of your SQL queries at runtime. Type checking is enforced entirely by which methods you can call and which argument types you can pass to them. For example, you can call Asc()/Desc() and NullsFirst()/NullsLast() on any selected field and it will pass the type checker, because Asc()/Desc()/NullsFirst()/NullsLast() still return a Field interface:

u := tables.USERS().As("u")
sq.Select(u.USER_ID, u.USERNAME.Asc().NullsLast()).From(u)

which would translate to

SELECT u.user_id, u.username ASC NULLS LAST FROM users AS u
-- obviously wrong, you can't use ASC NULLS LAST inside the SELECT clause

The above example passes the Go type checker, so sq will happily build the query string -- even though that SQL query is semantically wrong. In practice, as long as you aren't actively trying to do the wrong thing (as in the above example), the limited type safety will prevent you from making the most common kinds of errors.

It also means the query builder itself never fails: there's no boilerplate error checking required. Any semantic errors are deferred to the database, which will point them out to you.

Dialect agnostic query builder?

sq is not dialect agnostic. This means I can add your favorite dialect specific SQL features without the headache of cross-dialect compatibility. It also makes contributions easier, as you just have to focus on your own SQL dialect and not care about the others.

Issues
  • Add support for postgres UUID fields

    Adds support for the uuid postgres type, using the github.com/google/uuid package for storing UUIDs. I think in the future it might be worth adding configuration to sqgen-postgres to generate uuid fields as BinaryFields or StringFields (since they can be scanned into either data type just fine), but I'm keeping the initial PR small.

    Unfortunately, database/sql doesn't support a NullUUID type like it does NullString, etc., so I've used uuid.Nil (which is just a [16]byte with all 0's) to represent a "null" uuid. I'm happy to change the implementation of null UUIDs if there's a better way to do this.

    I've added a few test cases, and populated the media table in the postgres devlab database so we can query for a deterministic UUID in the tests.

    If this PR gets accepted, I can also add the implementation for mysql and the accompanying documentation; I just wanted to start small at first.

    I also moved the txdb registration into a func TestMain(m *testing.M) {} block - from what I can tell, that's a more common practice for test setup than init functions.

    opened by quentin-fox 11
  • Allow the `Subquery` struct to satisfy the `Field` interface

    It would be useful if the Subquery type could satisfy the Field interface, so that it could be used to embed subqueries in a select clause, without needing to add it to the from clause.

    e.g. it would make it possible to write the following query (pulled from a project I'm working on)

    SELECT
    	matches.id,
    	matches.date_created,
    	matches.num_players,
    	(select jsonb_agg(
    		jsonb_build_object(
    			'ID', match_users.id,
    			'MatchID', match_users.match_id,
    			'UserID', match_users.user_id,
    			'Order', match_users.turn_order,
    			'IsCreator', match_users.is_creator,
    			'IsWinner', match_users.is_winner
    		))
    		from public.match_users where match_users.match_id = matches.id
    	) as "match_users"
    	FROM public.matches
            WHERE matches.id = $1;
    

    Is there another way of doing this that I'm not aware of with this library? I don't mind reverting to plain SQL for cases like these, it would just be a nice addition to the query builder.

    opened by quentin-fox 9
  • Refactor sqgen-* to standalone library (decoupled from cobra), add lots of tests

    For one of my own projects, I wanted to directly call the sqgen-postgres logic, rather than invoke it through a shell command. To do this, I extracted almost all of the sqgen-postgres and sqgen-mysql logic into the respective sqgen/postgres and sqgen/mysql packages, with a shared sqgen package for simple shared utilities.

    As a result, the process of code generation is no longer tied to cobra - the cmd/sqgen-postgres/ and cmd/sqgen-mysql/ packages are now just a thin wrapper around the code generation library in the sqgen/* package.

    I also did a decent amount of refactoring within the logic for each of the code generators so that it was a bit easier to test. The few major changes I made were:

    1. No more custom String type, as the QuoteSpace and Export functions were moved into a template.FuncMap and injected into the template - this means that the fields that previously used String can use the string type instead, which simplifies a decent amount of the generation code
    2. The logic for populating the fields of a Table/Function/Field is now a method (called Populate) on the given object. If it needs information about the rest of the functions/tables (i.e. is it a duplicate, is it an overload, etc.), these are passed into the Populate method as arguments - this made the testing for various edge cases a bit simpler.
    3. The generated code is written to an io.Writer, so output can go to any destination. This is also used for dry-run generation, where os.Stdout is used as the writer instead of writing to a file.
    4. goimports is added as a dependency, and is used as a library (instead of through a shell command)
    5. For the sqgen-postgres functions command, the pg_catalog.pg_proc.prokind column is not available for Postgres versions lower than 11. I added a check to see which version of Postgres is being generated from (see the queryPgVersion and checkProkindSupport functions in sqgen/postgres/functions.go), and modify the query to use the legacy pg_proc columns to essentially do the same thing as p.prokind = 'f'.

    Of course this is a stupid number of changes to add all at once, so I've aimed to test as much of the code as possible, both through unit and integration tests.

    In the integration_test.go files, I check the output of the library against the output of sqgen-postgres and sqgen-mysql against the test databases (using the testate) prior to any of the changes that I've made, and there are no differences in the output.

    Please let me know if there is anything you'd like me to change/test more, etc. before accepting the PR! I know that this is a large amount of code to change, but the core logic is pretty much the same as before, just structured a bit differently, and with a bunch of tests added.

    opened by quentin-fox 3
  • Question: TimeField and sql.NullTime

    Does a TimeField accept an sql.NullTime value on its AssignmentField TimeField.Set?

    opened by mrg0lden 3
  • add cobra flag to allow tables to be excluded from generation

    I'm using the golang-migrate tool in order to create migrations for my tables, and that requires that there's a table called "schema_migrations" added to my database schema. I didn't want it to be included in the list of tables generated by this library, so I added the flags to support that.

    The filtering is done at the sql-level, so that all downstream commands don't have to know about any of the excluded tables.

    I was also playing around with having the exclude pattern be case-sensitive, but from what I can tell the table names in this library are case-insensitive anyways (for postgres, at least), so I decided not to add in case-sensitivity.

    I also split out part of the sql generation into separate commands so that they could be more easily tested, if you require that for the PR.

    Thanks for the great library! The DX is way better than a lot of the other alternatives.

    opened by quentin-fox 1
  • fix: sqgen-mysql has no commands

    sqgen-mysql was doing nothing (except printing "Code generation for the sq package"), because the tables command was never actually added.

    Unrelatedly, I think this is a really interesting take on a golang query builder, and I really like the idea! The docs are exceptionally good too, awesome work there.

    opened by kblomster 1
  • data mapping for pointer fields will overwrite every item in the slice

    sq's multi-row struct mapping works by repeatedly binding data to a single variable, then appending that variable to a slice. This works if the variable is a value type, since it is copied by value on every append: subsequent modifications to that variable will not affect what has already been appended to the slice. However, if any fields are pointer types, subsequent modifications to those fields will overwrite them for everything already appended to the slice (since they were copied by reference).

    There are workarounds for this (manually copying data before append), but it's not very user friendly as it requires understanding the inner workings of the library in order to fully grok.

    I'm honestly not sure if there's a way around it, since the current approach is the only way to have a fully 'generic' mapper function that works for any data type. By encapsulating stateful modifications in a closure function (or a pointer receiver method), all mapper functions can share the same type: func(*sq.Row).

    Even Go generics may not be sufficient here, as they do not permit type parameters on methods. The following code is ideal, but is not achievable with Go's generics:

    func (q SelectQuery) Get[Item any](db *DB, mapper func(*Row) Item) (Item, error)
    func (q SelectQuery) GetSlice[Item any](db *DB, mapper func(*Row) Item) ([]Item, error)
    func (q SelectQuery) GetAccumulate[Item, Items any](db *DB, initial Items, mapper func(*Row) Item, accumulator func(Item, Items) Items) (Items, error)
    

    The document states that 'a suitably parameterized top-level function' will have to be written. Perhaps that may be the workaround.

    func Get[Item any](db *DB, q Query, mapper func(*Row) Item) (Item, error)
    func GetSlice[Item any](db *DB, q Query, mapper func(*Row) Item) ([]Item, error)
    

    However, the invocation is so clunky compared to regular Selectx that I'm not sure it's a direct upgrade:

    // using Selectx
    var user User
    var users []User
    err := sq.
        From(u).
        Where(u.NAME.EqString("bob")).
        Selectx(user.RowMapper, func() { users = append(users, user) }).
        Fetch(db)
    if err != nil {
        // handle error
    }
    fmt.Println(users)

    // using GetSlice
    users, err := sq.GetSlice(db,
        sq.From(u).
            Where(u.NAME.EqString("bob")),
        User{}.RowMapper,
    )
    if err != nil {
        // handle error
    }
    fmt.Println(users)


    The only value Get brings over Selectx is that the mapper function can return a new value every time rather than performing side effects on a pointer receiver. My main issue is that it solves the problem of overwriting pointer fields, but in an entirely different way of doing things (converting the query from a method to a top-level function).

    opened by bokwoon95 3
  • Suggestion: Support Table/Field creation from Table structure

    When going to production you need to ensure the database has the same structure. It would make sense if we could run a function to create any missing tables/fields based on the generated structure.

    Just a suggestion.

    enhancement 
    opened by ocomsoft 5