GraphJin - Build APIs in 5 minutes with GraphQL. An instant GraphQL to SQL compiler.

Overview

GraphJin - Build APIs in 5 minutes


GraphJin gives you a high performance GraphQL API without you having to write any code. GraphQL is automagically compiled into an efficient SQL query. Use it either as a library or a standalone service.

1. Quick Install

Mac (Homebrew)

brew install dosco/graphjin/graphjin

Ubuntu (Snap)

sudo snap install --classic graphjin

Debian and Redhat (releases)

Download the .deb or .rpm from the releases page and install with dpkg -i and rpm -i respectively.

Go Install

go get github.com/dosco/graphjin

2. Create New API

graphjin new <app_name>

cd <app_name>
docker-compose run api db:setup
docker-compose up

About GraphJin

After working on several products over my career I found that we spend way too much time building API backends. Most APIs also need constant updating, and this costs time and money.

It's always the same thing: figure out what the UI needs, then build an endpoint for it. Most API code involves struggling with an ORM to query a database and mangle the data into the shape the UI expects to see.

I didn't want to write this code anymore; I wanted the computer to do it. Enter GraphQL. To me it sounded great, but it still required me to write all the same database query code.

Having worked with compilers before, I saw this as a compiler problem. Why not build a compiler that converts GraphQL into highly efficient SQL?

This compiler is what sits at the heart of GraphJin, with layers of useful functionality around it like authentication, remote joins, Rails integration, database migrations, and everything else you need to build production-ready apps.

Better APIs Faster!

Let's take a simple blog app as an example. You'll probably need APIs for user management, posts, comments, and votes. Each of these areas needs endpoints for listing, creating, updating, and deleting. Off the top of my head that's around 12 APIs, if not more, and that's just for managing things. For rendering the blog posts, the home page, and the profile page, you'll probably need many more view APIs that fetch a whole bunch of things at the same time. That's a lot, and we're still talking about something as simple as a basic blogging app. All these APIs have to be coded up by someone and then maintained, updated, secured, made fast, and so on. We're talking weeks to months of work, if not more. And remember, your mobile and web developers have to wait around until all of this is done.

With GraphJin your web and mobile developers can start building instantly. All they have to do is write the GraphQL queries they need and GraphJin fetches the data. There's no backend API code to maintain; it's secure, lightning fast, and has tons of useful features like subscriptions, rate limiting, etc. built in. With GraphJin you're building APIs in minutes, not days.

Features

  • Works with Postgres, MySQL 8, and YugabyteDB
  • Complex nested queries and mutations
  • Realtime updates with subscriptions
  • Build infinite scroll, feeds, nested comments, etc
  • Auto learns database tables and relationships
  • Role and Attribute-based access control
  • Opaque cursor-based efficient pagination
  • Full-text search and aggregations
  • JWT tokens supported (Auth0, JWKS, Firebase, etc)
  • Join database queries with remote REST APIs
  • Also works with existing Ruby-On-Rails apps
  • Rails authentication supported (Redis, Memcache, Cookie)
  • A simple config file
  • High performance Go codebase
  • Tiny docker image and low memory requirements
  • Fuzz tested for security
  • Database migrations tool
  • Database seeding tool
  • OpenCensus Support: Zipkin, Prometheus, X-Ray, Stackdriver
  • API Rate Limiting
  • Highly scalable and fast


Documentation

Quick Start

Documentation

Build APIs in 5 minutes with GraphJin

GraphQL vs REST

GraphQL Examples

Using it in your own code

go get github.com/dosco/graphjin/core

package main

import (
  "context"
  "database/sql"
  "fmt"
  "log"

  "github.com/dosco/graphjin/core"
  _ "github.com/jackc/pgx/v4/stdlib"
)

func main() {
  db, err := sql.Open("pgx", "postgres://postgres:@localhost:5432/example_db")
  if err != nil {
    log.Fatal(err)
  }

  sg, err := core.NewGraphJin(nil, db)
  if err != nil {
    log.Fatal(err)
  }

  query := `
    query {
      posts {
        id
        title
      }
    }`

  ctx := context.Background()
  ctx = context.WithValue(ctx, core.UserIDKey, 1)

  res, err := sg.GraphQL(ctx, query, nil)
  if err != nil {
    log.Fatal(err)
  }

  fmt.Println(string(res.Data))
}
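
To pass query variables, hand them to GraphQL as a json.RawMessage (add "encoding/json" to the imports above). Here is a minimal sketch continuing inside main(); the id argument and its value are only illustrative:

  vars := json.RawMessage(`{ "id": 1 }`)

  singlePost := `
    query {
      posts(id: $id) {
        id
        title
      }
    }`

  res, err = sg.GraphQL(ctx, singlePost, vars)
  if err != nil {
    log.Fatal(err)
  }

  fmt.Println(string(res.Data))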

Reach out

We're happy to help you leverage GraphJin. Reach out if you have questions:

twitter/dosco

discord/graphjin (Chat)

Production use

The popular 42papers.com site for discovering trending papers in AI and Computer Science uses GraphJin for its entire backend.

License

Apache Public License 2.0

Copyright (c) 2019-present Vikram Rangnekar

Issues
  • Super Graph as a library (aka. embedded mode)

    What would you like to be added:

    A clean API to integrate Super Graph into other Go apps. This could be done as an HTTP handler that can be plugged into an existing router, or at a much lower level where you provide a config object and create a new Super Graph instance to use in your code.

    Hooks could also be added for various events like onQuery, onMutation, onMutationComplete, etc. This would let code using Super Graph as a library provide its own behaviour to execute during request handling.

    The Super Graph GraphQL compilers (QCode and SQL) are already available as a library; this work would focus on moving more pieces of the serv package into a clean API.

    Why is this needed:

    Currently I run two services: one for custom APIs like authentication, and the other for Super Graph. Going forward, other custom endpoints like file upload would probably also be added to the first service. It would be great if I could instead bundle it all together into a single app and also be able to augment Super Graph with my own app code.
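
    A rough sketch of the HTTP-handler flavour of this, built only on the core API shown in the README example above (the three-argument GraphQL signature follows that example and may differ between versions; the request shape and route here are assumptions):

    package main

    import (
    	"database/sql"
    	"encoding/json"
    	"log"
    	"net/http"

    	"github.com/dosco/graphjin/core"
    	_ "github.com/jackc/pgx/v4/stdlib"
    )

    // graphqlHandler wraps a GraphJin instance in a plain net/http handler.
    func graphqlHandler(sg *core.GraphJin) http.HandlerFunc {
    	return func(w http.ResponseWriter, r *http.Request) {
    		var req struct {
    			Query     string          `json:"query"`
    			Variables json.RawMessage `json:"variables"`
    		}
    		if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
    			http.Error(w, err.Error(), http.StatusBadRequest)
    			return
    		}

    		// Execute the query; Variables may be nil.
    		res, err := sg.GraphQL(r.Context(), req.Query, req.Variables)
    		if err != nil {
    			http.Error(w, err.Error(), http.StatusBadRequest)
    			return
    		}

    		w.Header().Set("Content-Type", "application/json")
    		json.NewEncoder(w).Encode(res)
    	}
    }

    func main() {
    	db, err := sql.Open("pgx", "postgres://postgres:@localhost:5432/example_db")
    	if err != nil {
    		log.Fatal(err)
    	}

    	sg, err := core.NewGraphJin(nil, db)
    	if err != nil {
    		log.Fatal(err)
    	}

    	http.Handle("/api/v1/graphql", graphqlHandler(sg))
    	log.Fatal(http.ListenAndServe(":8080", nil))
    }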

    enhancement 
    opened by dosco 31
  • Cue validation for query and mutations variables

    Check out the tests for examples.

    Limitations:

    1. It only works with variables; I can't find a way to cast field args into JSON or a struct.
    2. You can't use a string inside a Cue schema (double-quote " problem) inside a plain GraphQL query, but it can be done by passing the validation schema through variables.
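
    For anyone curious what this can look like from the caller's side, here is a rough, self-contained sketch of validating query variables against a Cue schema before handing them to GraphJin. It only illustrates the idea using the public cuelang.org/go API; the schema and variable names are made up and this is not the actual integration:

    package main

    import (
    	"fmt"
    	"log"

    	"cuelang.org/go/cue"
    	"cuelang.org/go/cue/cuecontext"
    )

    func main() {
    	cctx := cuecontext.New()

    	// Cue schema describing the expected mutation variables (illustrative).
    	schema := cctx.CompileString(`{
    		data: {
    			id:         string
    			first_name: string
    			email:      string
    		}
    	}`)

    	// Raw GraphQL variables as they would arrive in the request.
    	vars := []byte(`{"data": {"id": "1", "first_name": "Tim", "email": "tim@example.com"}}`)

    	// Unify the data with the schema and require concrete, valid values.
    	if err := schema.Unify(cctx.CompileBytes(vars)).Validate(cue.Concrete(true)); err != nil {
    		log.Fatalf("variable validation failed: %v", err)
    	}

    	fmt.Println("variables are valid; safe to pass on to GraphJin")
    }
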
    opened by ehsanonline 20
  • RLS security support via session variables

    Hello,

    I'd like to suggest a feature: setting the current role from the session. That would allow RLS policies to run using the session's user name, while not needing to set any security models in the config file.

    My implementation suggestion would be hooking into SQL generation and prepending the session user name, like this:

    conn.Query(ctx, `SET session authorization howe; SELECT * FROM...`)

    This would allow reusing connections from the pool (as they already are) without more worries.

    While we're at it, why not support session variables as well? They could be used in RLS policies, functions, triggers, etc. That would be just as easy to implement: conn.Query(ctx, `SET supergraph.company_id=1234; SELECT * FROM...`)

    Of course the '1234' parameter above should be sent as a parameter in the prepared query; that was just an easier-to-read example.

    Let's not forget to reset all those variables before running new queries; the final command should look something like this: conn.Query(ctx, `RESET ALL; SET supergraph.myvar1='xxx'; SET supergraph.myvar2='yyy'; SELECT * FROM...`)
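
    For reference, one way to get parameter binding together with session variables on Postgres is set_config() inside a transaction, since a plain SET cannot take bind parameters. A minimal database/sql sketch of the idea; the setting name, value, and query below are purely illustrative:

    package main

    import (
    	"context"
    	"database/sql"
    	"log"

    	_ "github.com/jackc/pgx/v4/stdlib"
    )

    func main() {
    	db, err := sql.Open("pgx", "postgres://postgres:@localhost:5432/example_db")
    	if err != nil {
    		log.Fatal(err)
    	}
    	ctx := context.Background()

    	tx, err := db.BeginTx(ctx, nil)
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer tx.Rollback()

    	// set_config(name, value, is_local=true) scopes the setting to this
    	// transaction, so pooled connections are left clean afterwards.
    	if _, err := tx.ExecContext(ctx,
    		`SELECT set_config('supergraph.company_id', $1, true)`, "1234"); err != nil {
    		log.Fatal(err)
    	}

    	// RLS policies, functions, and triggers can now read
    	// current_setting('supergraph.company_id') for queries in this tx.
    	rows, err := tx.QueryContext(ctx, `SELECT id FROM players`)
    	if err != nil {
    		log.Fatal(err)
    	}
    	rows.Close()

    	if err := tx.Commit(); err != nil {
    		log.Fatal(err)
    	}
    }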

    I think there should be an internal hook that runs before queries are sent to the server. The standalone server should have a default implementation that gets the vars from the JWT, etc., but when running in embedded mode the caller should be able to customize it. In the JWT, the "sub" claim should probably be the role name, and the payload could set the session variables.

    I didn't have a deep look into the source, but it should be easy enough to implement. I could look into it if you're not willing to implement it; either way, I think it's a good idea.

    Any comments?

    Thanks, Howe

    enhancement 
    opened by howesteve 17
  • Incorrect SQL query using variables for filters

    I think we have a problem with filters.

    Inspired by this: https://github.com/dosco/super-graph/issues/1#issuecomment-477236321

    I used this in my dev.yml:

    variables:
      account_id: "select account_id from users where id = $user_id"

    roles:
      - name: user
        tables:
          - name: players
            query:
              filters: ["{ account_id: { _eq: $account_id } }"]


    And I get this error:

    ERR C:/super-graph/serv/http.go:104 > failed to handle request error="ERROR: invalid input syntax for type bigint: \"select account_id from users where id = 2\" (SQLSTATE 22P02)"
    
    SELECT "_sg_auth_info"."role", (CASE "_sg_auth_info"."role" WHEN 'user' THEN (SELECT json_build_object('player', "__sel_0"."json") as "__root" FROM (SELECT json_build_object('id', "players_0"."id", 'created_at', "players_0"."created_at", 'account_id', "players_0"."account_id", 'amount', "players_0"."amount", 'note', "players_0"."note") AS "json" FROM (SELECT "players"."id", "players"."created_at", "players"."account_id", "players"."amount", "players"."note" FROM "players" WHERE (((("players"."account_id") = 'select account_id from users where id = 2' :: bigint) AND (("players"."id") =  '2' :: bigint))) LIMIT ('1') :: integer) AS "players_0") AS "__sel_0") WHEN 'admin' THEN (SELECT json_build_object('player', "__sel_0"."json") as "__root" FROM (SELECT json_build_object('id', "players_0"."id", 'created_at', "players_0"."created_at", 'account_id', "players_0"."account_id", 'amount', "players_0"."amount", 'note', "players_0"."note") AS "json" FROM (SELECT "players"."id", "players"."created_at", "players"."account_id", "players"."amount", "players"."note" FROM "players" WHERE ((("players"."id") =  '2' :: bigint)) LIMIT ('1') :: integer) AS "players_0") AS "__sel_0") END) FROM (SELECT (CASE WHEN EXISTS (SELECT * FROM users WHERE id = 2) THEN (SELECT (CASE WHEN id = 1000 THEN 'admin' ELSE 'user' END) FROM (SELECT * FROM users WHERE id = 2) AS "_sg_auth_roles_query" LIMIT 1) ELSE 'anon' END) FROM (VALUES (1)) AS "_sg_auth_filler") AS "_sg_auth_info"(role) LIMIT 1
    

    The query is visibly incorrect. Is this a bug or my fault?

    bug 
    opened by frederikhors 14
  • Does not resolve queries through a linking table (Many-To-Many)

    I have three tables: film, actor, and film_actor (details below). The following GraphQL query fails:

    query {
      actor {
        actor_id
        first_name
        last_name
        film {
          film_id
        }
      }
    }
    
    # Output 
    
    {
      "error": {
        "error": "something wrong no remote ids found in db response",
        "data": null
      }
    }
    
    
                                                  Table "public.film"
          Column      |            Type             | Collation | Nullable |                Default                
    ------------------+-----------------------------+-----------+----------+---------------------------------------
     film_id          | integer                     |           | not null | nextval('film_film_id_seq'::regclass)
     title            | character varying(255)      |           | not null | 
     description      | text                        |           |          | 
     release_year     | year                        |           |          | 
     language_id      | smallint                    |           | not null | 
     rental_duration  | smallint                    |           | not null | 3
     rental_rate      | numeric(4,2)                |           | not null | 4.99
     length           | smallint                    |           |          | 
     replacement_cost | numeric(5,2)                |           | not null | 19.99
     rating           | mpaa_rating                 |           |          | 'G'::mpaa_rating
     last_update      | timestamp without time zone |           | not null | now()
     special_features | text[]                      |           |          | 
     fulltext         | tsvector                    |           | not null | 
    Indexes:
        "film_pkey" PRIMARY KEY, btree (film_id)
        "film_fulltext_idx" gist (fulltext)
        "idx_fk_language_id" btree (language_id)
        "idx_title" btree (title)
    Foreign-key constraints:
        "film_language_id_fkey" FOREIGN KEY (language_id) REFERENCES language(language_id) ON UPDATE CASCADE ON DELETE RESTRICT
    Referenced by:
        TABLE "film_actor" CONSTRAINT "film_actor_film_id_fkey" FOREIGN KEY (film_id) REFERENCES film(film_id) ON UPDATE CASCADE ON DELETE RESTRICT
        TABLE "film_category" CONSTRAINT "film_category_film_id_fkey" FOREIGN KEY (film_id) REFERENCES film(film_id) ON UPDATE CASCADE ON DELETE RESTRICT
        TABLE "inventory" CONSTRAINT "inventory_film_id_fkey" FOREIGN KEY (film_id) REFERENCES film(film_id) ON UPDATE CASCADE ON DELETE RESTRICT
    Triggers:
        film_fulltext_trigger BEFORE INSERT OR UPDATE ON film FOR EACH ROW EXECUTE PROCEDURE tsvector_update_trigger('fulltext', 'pg_catalog.english', 'title', 'description')
        last_updated BEFORE UPDATE ON film FOR EACH ROW EXECUTE PROCEDURE last_updated()
    
                                                Table "public.actor"
       Column    |            Type             | Collation | Nullable |                 Default                 
    -------------+-----------------------------+-----------+----------+-----------------------------------------
     actor_id    | integer                     |           | not null | nextval('actor_actor_id_seq'::regclass)
     first_name  | character varying(45)       |           | not null | 
     last_name   | character varying(45)       |           | not null | 
     last_update | timestamp without time zone |           | not null | now()
    Indexes:
        "actor_pkey" PRIMARY KEY, btree (actor_id)
        "idx_actor_last_name" btree (last_name)
    Referenced by:
        TABLE "film_actor" CONSTRAINT "film_actor_actor_id_fkey" FOREIGN KEY (actor_id) REFERENCES actor(actor_id) ON UPDATE CASCADE ON DELETE RESTRICT
    Triggers:
        last_updated BEFORE UPDATE ON actor FOR EACH ROW EXECUTE PROCEDURE last_updated()
    
                             Table "public.film_actor"
       Column    |            Type             | Collation | Nullable | Default 
    -------------+-----------------------------+-----------+----------+---------
     actor_id    | smallint                    |           | not null | 
     film_id     | smallint                    |           | not null | 
     last_update | timestamp without time zone |           | not null | now()
    Indexes:
        "film_actor_pkey" PRIMARY KEY, btree (actor_id, film_id)
        "idx_fk_film_id" btree (film_id)
    Foreign-key constraints:
        "film_actor_actor_id_fkey" FOREIGN KEY (actor_id) REFERENCES actor(actor_id) ON UPDATE CASCADE ON DELETE RESTRICT
        "film_actor_film_id_fkey" FOREIGN KEY (film_id) REFERENCES film(film_id) ON UPDATE CASCADE ON DELETE RESTRICT
    Triggers:
        last_updated BEFORE UPDATE ON film_actor FOR EACH ROW EXECUTE PROCEDURE last_updated()
    
    bug 
    opened by FourSigma 13
  • Camel Case Support Extend To Mutation Variables

    What would you like to be added:

    A few months ago, we discussed adding a feature for Camel Case and there is a setting to control this behavior.

    I'd like to see this extended to mutation variables as well.

    { "insert": { "id": "01FVPABVWGDG4VBD76N9XCBWwg", "first_name": "Elliot", "last_name": "something", "email": "fragile_something", "policy_group_id": "01FVPABVWFS242TE0FCX5J7RHH", "phone": "something", "dob": "2022-01-01" } }

    I'd prefer to be able to provide camel case variables and have them converted into the snake case vars.

    Why is this needed: Well, I can't recall the term for it, but if I'm querying the database and get a user, for example, I should be able to copy the returned dataset and paste it as update variables and it should just work.

    Which means the system should accept camel case variables when mutating the object.
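
    Until something like this is built in, a caller could rewrite the variable keys before sending them. A small illustrative helper (not part of GraphJin) that converts top-level camelCase keys to snake_case:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"strings"
    	"unicode"
    )

    // camelToSnake turns "policyGroupId" into "policy_group_id".
    func camelToSnake(s string) string {
    	var b strings.Builder
    	for i, r := range s {
    		if unicode.IsUpper(r) {
    			if i > 0 {
    				b.WriteByte('_')
    			}
    			b.WriteRune(unicode.ToLower(r))
    		} else {
    			b.WriteRune(r)
    		}
    	}
    	return b.String()
    }

    // snakeCaseKeys rewrites the top-level keys of a JSON object; nested
    // objects (like the "insert" payload) would need the same treatment.
    func snakeCaseKeys(raw json.RawMessage) (json.RawMessage, error) {
    	var m map[string]json.RawMessage
    	if err := json.Unmarshal(raw, &m); err != nil {
    		return nil, err
    	}
    	out := make(map[string]json.RawMessage, len(m))
    	for k, v := range m {
    		out[camelToSnake(k)] = v
    	}
    	return json.Marshal(out)
    }

    func main() {
    	vars := json.RawMessage(`{"firstName": "Elliot", "policyGroupId": "42"}`)
    	fixed, _ := snakeCaseKeys(vars)
    	fmt.Println(string(fixed)) // {"first_name":"Elliot","policy_group_id":"42"}
    }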

    enhancement 
    opened by magicMustard 10
  • When no row is found return an object, do not throw an error

    What would you like to be added:

    Maybe we should not throw an error when no row is found.

    Why is this needed:

    Today SuperGraph as a library throws in this case with:

    error="sql: no rows in result set"
    

    Maybe we should answer with this instead:

    {
      "errors": [
        {
          "message": "not found",
          "path": [
            "player"
          ]
        }
      ],
      "data": null
    }
    

    I think this is more in line with what all the other GraphQL backends return and what GraphQL clients expect in response.

    And even if I'm not an expert in the GraphQL spec, I think this is what it prescribes.

    What do you think?
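
    Until a built-in behaviour exists, a caller using the library can translate the error itself. A rough sketch that just mirrors the JSON proposed above (the helper is not part of SuperGraph/GraphJin, and whether the returned error actually wraps sql.ErrNoRows or only contains its text is version-dependent):

    package main

    import (
    	"database/sql"
    	"encoding/json"
    	"errors"
    	"fmt"
    )

    type gqlError struct {
    	Message string   `json:"message"`
    	Path    []string `json:"path"`
    }

    type gqlResponse struct {
    	Errors []gqlError      `json:"errors,omitempty"`
    	Data   json.RawMessage `json:"data"`
    }

    // notFoundResponse maps a "no rows" error to a GraphQL-style response.
    func notFoundResponse(err error, path string) ([]byte, bool) {
    	if !errors.Is(err, sql.ErrNoRows) {
    		return nil, false
    	}
    	out, _ := json.Marshal(gqlResponse{
    		Errors: []gqlError{{Message: "not found", Path: []string{path}}},
    		Data:   json.RawMessage("null"),
    	})
    	return out, true
    }

    func main() {
    	// Simulate the error returned when no row matches the query.
    	err := fmt.Errorf("query failed: %w", sql.ErrNoRows)
    	if body, ok := notFoundResponse(err, "player"); ok {
    		fmt.Println(string(body))
    	}
    }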

    enhancement 
    opened by frederikhors 10
  • Error when trying to start supergraph via docker-compose which generated using go.

    Hi,

    I was able to generate the files using the commands below:

    go get github.com/dosco/super-graph
    super-graph new testapp

    After that, I tried to start the services using the docker-compose file:

    $ docker logs -f testapp_testapp_api_1
    restarting "./super-graph" when it changes (additional dirs: ./config)
    INF roles_query not defined: attribute based access control disabled
    ERR failed to initialize Super Graph: error fetching version: failed to connect to host=db user=postgres database=testapp_development: server error (FATAL: database "testapp_development" does not exist (SQLSTATE 3D000))

    Upon searching I found that the command below should fix the issue, but I am also getting some query errors:

    $ docker-compose run testapp_api ./super-graph db:setup
    Creating network "testapp_default" with the default driver
    Creating testapp_db_1 ... done
    INF created database 'testapp_development'
    INF 2020-06-16 10:15:01 executing 0_init.sql up
    -- Write your migrate up statements here

    CREATE TABLE public.users (
      id bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
      full_name text,
      email text UNIQUE NOT NULL CHECK (length(email) < 255),
      created_at timestamptz NOT NULL NOT NULL DEFAULT NOW(),
      updated_at timestamptz NOT NULL NOT NULL DEFAULT NOW()
    );

    ERR ERROR: syntax error at or near ";" (SQLSTATE 42601)

    Please let me know what is wrong.

    opened by asharabu 10
  • SG-as-a-library: can't read Vars during inserts

    I'm using this code:

    func NewConfig() core.Config {
    	return core.Config{
    		DefaultBlock: true,
    
    		Blocklist: []string{
    			"password",
    			"account_id",
        },
    
    		Vars: map[string]string{
    			"account_id": "sql:select account_id from users where id = $user_id",
        },
    
    		Roles: []core.Role{
    			{
    				Name: "user",
    				Tables: []core.RoleTable{
    					{
    						Name: "players",
    						Query: &core.Query{
    							Filters: []string{
    								"{ account_id: { _eq: $account_id } }",
    							},
    						},
    						Insert: &core.Insert{
    							Presets: map[string]string{
    								"account_id": "$account_id",
    							},
    						},
    					},
    				},
    			},
    		},
    	}
    }
    

    What is not working here is:

    Insert: &core.Insert{
    	Presets: map[string]string{
    		"account_id": "$account_id",
    	},
    },
    

    The error is:

    error="required variable 'account_id' of type 'bigint' must be set"

    It works if I use this:

    Insert: &core.Insert{
    	Presets: map[string]string{
    		"account_id": "123",
    	},
    },
    

    and it works if I use the "global" $user_id var, but of course that's wrong.

    Should I conclude that SG can't read the Vars map[string]string for Insert presets?

    bug 
    opened by frederikhors 9
  • Mutation not creating query yaml file

    What version of GraphJin are you using? graphjin version

    v0.20.4

    Have you tried reproducing the issue with the latest release?

    Yes

    What is the hardware spec (RAM, OS)?

    Linux, Intel 6core, 16 gb.

    Steps to reproduce the issue (config used to run GraphJin).

    db, err := sql.Open("pgx", DsnStrBuilder())
    if err != nil {
    	log.Fatal(err)
    }

    conf := &core.Config{
    	DisableAllowList: false,
    	Production:       false,
    	Roles: []core.Role{
    		{
    			Name: "anon",
    			Tables: []core.RoleTable{
    				{
    					Name:   "users",
    					Schema: "rover",
    					Query: &core.Query{
    						Columns: []string{
    							"id",
    						},
    					},
    					Insert: &core.Insert{
    						Columns: []string{
    							"id",
    							"first_name",
    						},
    					},
    				},
    			},
    		},
    	},
    }
    
    sg, err := core.NewGraphJin(conf, db)
    
    if err != nil {
    	log.Fatal(err)
    }
    
    //query := `query findUsers{}`
    
    query := `mutation createUser {
    	users(insert: $data) {}
    }`
    
    vars := json.RawMessage(`{
    	"data": {
    		"id": "01G6GZ1MN8EQ9PAQHR0KR809CV",
    		"first_name": "Tim",
    		"last_name": "somethign",
    		"email": "woiefe",
    		"phone": "04506gerge"
    	}
    }`)
    
    ctx := context.Background()
    
    res, err := sg.GraphQL(ctx, query, vars, nil)
    
    if err != nil {
    	log.Fatal(err)
    }
    
    var data interface{}
    
    if err = jsoniter.Unmarshal(res.Data, &data); err != nil {
    	log.Println(err)
    }
    
    spew.Dump(data)
    

    I'm just doing some crude testing and I've created a mutation called "createUser"; however, it does not seem to be creating a query yaml file for said mutation.

    When I change the config to production to force use of the allow list, it complains that the mutation does not exist.

    Am I doing something wrong?

    Expected behaviour and actual result.

    I suspect it should be creating a mutation query file?

    Correct me if I'm wrong.

    bug 
    opened by magicMustard 8
  • A "special function" for dynamic/custom/variables/on-the-fly queries

    What would you like to be added:

    A "special feature" for fully custom and dynamic queries using GraphQL variables.

    Why is this needed:

    A really incredible peculiarity of SuperGraph is that it creates very fast pre-compiled queries.

    This clashes with the needs that emerged from issues #114 and #115: having the ability to create dynamic and custom queries.

    It would be great to be able to use GraphQL's intrinsic dynamism (expressed by the variables in the payload) to create queries on the fly (obviously not pre-compiled), maybe slower but certainly more useful in some applications.

    Example:

    {
      "query": "query players($order_by: playerOrderBy!) {
        players(order_by: $order_by) {
          id
          created_at
        }
      }",
      "operationName": "players",
      "variables": {
        "order_by": {
          "id": "desc"
        }
      }
    }
    

    or

    {
      "query": "query players($where: playerWhere!) {
        players(where: $where) {
          id
          created_at
        }
      }",
      "operationName": "players",
      "variables": {
        "where": {
          "and": {
            "price": {
              "gt": "10"
            },
            "not": {
              "id": {
                "eq": "5"
              }
            }
          }
        }
      }
    }
    

    What you can do today

    I think one of the ways to work around this today with SuperGraph is to create many queries in the clients (JavaScript or whatever it is), one for every need.

    Example:

    1. Sorting by different columns; by ID DESC:

      export const all_players_by_id_desc = `
        query players {
          players(order_by: {id: desc}) {
            id
            created_at
          }
        }
      `
      
    2. Sorting by different columns; by created_at DESC:

      export const all_players_by_created_at_desc = `
        query players {
          players(order_by: {created_at: desc}) {
            id
            created_at
          }
        }
      `
      

    And so on...

    As you can understand, this is really expensive.

    With SuperGraph I increase backend productivity but I decrease frontend productivity.

    Can you suggest something different, @dosco? Where do you think I'm wrong?

    enhancement 
    opened by frederikhors 8
  • Build(deps): Bump undici from 5.4.0 to 5.8.0 in /serv/web

    Bumps undici from 5.4.0 to 5.8.0.

    Release notes

    Sourced from undici's releases.

    v5.8.0

    ⚠️ Security Fixes ⚠️

    What's Changed

    New Contributors

    Full Changelog: https://github.com/nodejs/undici/compare/v5.7.0...v5.7.1

    v5.7.0

    What's Changed

    New Contributors

    Full Changelog: https://github.com/nodejs/undici/compare/v5.6.1...v5.7.0

    v5.6.1

    What's Changed

    New Contributors

    Full Changelog: https://github.com/nodejs/undici/compare/v5.6.0...v5.6.1

    v5.6.0

    What's Changed

    ... (truncated)

    Commits
    • 26f60b7 Bumped v5.8.0
    • 0a5bee9 Merge pull request from GHSA-q768-x9m6-m9qp
    • a29a151 Merge pull request from GHSA-3cvr-822r-rqcc
    • 722976c docs: updated proxy docs - renamed already used const proxy to proxyServer (#...
    • b6af4e6 fix(body mixin): only allow Uint8Array chunks (#1550)
    • 6c9e634 fix(mock utils): set Readable.abort (#1549)
    • 22e2f39 ci: fix up exclude (#1544)
    • 99205ec feat: use weighted round robin in balancedPool (#1069)
    • 5b57e8c chore: exclude windows node 16 (#1542)
    • 93e31a2 Drop PR title validation (#1543)
    • Additional commits viewable in compare view

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies javascript 
    opened by dependabot[bot] 1
  • Build(deps): Bump terser from 5.14.0 to 5.14.2 in /serv/web

    Bumps terser from 5.14.0 to 5.14.2.

    Changelog

    Sourced from terser's changelog.

    v5.14.2

    • Security fix for RegExps that should not be evaluated (regexp DDOS)
    • Source maps improvements (#1211)
    • Performance improvements in long property access evaluation (#1213)

    v5.14.1

    • keep_numbers option added to TypeScript defs (#1208)
    • Fixed parsing of nested template strings (#1204)
    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies javascript 
    opened by dependabot[bot] 1
  • Distinct argument not listed in schema

    What version of GraphJin are you using? graphjin version

    Latest - v0.20.16

    Steps to reproduce the issue (config used to run GraphJin).

    Any config. In the GraphQL web UI, run a query with a distinct argument; it should run successfully. Then open the schema panel and find the arguments for the resolver you just queried. The distinct argument won't be listed there, even though it's actually accepted by GraphJin.

    bug 
    opened by hi019 0
  • Unnamed queries cause subsequent queries to hang indefinitely.

    What version of GraphJin are you using? graphjin version

    0.20.14 via homebrew

    Have you tried reproducing the issue with the latest release?

    Yes

    What is the hardware spec (RAM, OS)?

    MacBook Pro M1 Max 32gb

    Steps to reproduce the issue (config used to run GraphJin).

    Run any unnamed query. Eg.

    query {
      users {
        name
      }
    }
    

    Expected behaviour and actual result.

    Expected: The query returns results and the server continues to run.

    Actual: The query returns results, but in the log we get WRN allow list save: no name defined for query and subsequent queries hang indefinitely.

    Named queries work fine. E.g.

    query GetUsers {
      users {
        name
      }
    }
    
    bug 
    opened by akhudek 2
  • RBAC Filters Cause Intermittant Crashes

    I have a possible reason for this failure after diffing through the src for a few days, but I can't quite isolate it, so I figured I'd create an issue. I'll add detail as I continue to investigate (I might fork to add additional logging).

    What version of GraphJin are you using? 0.20.0

    Have you tried reproducing the issue with the latest release? No

    What is the hardware spec (RAM, OS)? Mac 16gb/ default golang aws lambda

    Steps to reproduce the issue (config used to run GraphJin).

    Essentially, when I add a filter to a role (using the webshop example, a filter by user id on the product query), I occasionally get an error something like "column 'products.user_id' not found".

    However this only happens when I have loads of filters (~15) and LOADS of tables across a lot of schemas in my db (around 300 total tables across 20 schemas).

    Which leads me to think the filter validation might be called before the database discovery has fully finished?

    It's happening 95% of the time in the lambda we've set up (using GraphJin as a service), but it occasionally happens in the local version I've got running.

    I'm actively investigating so will add more detail as I find it.

    bug 
    opened by VandalSausage 6
Releases (v0.20.22)
Owner
Vikram Rangnekar
Founder of 42papers.com. Also building Super Graph, the automagical GraphQL to SQL compiler in Go.
Go fearless SQL. Sqlvet performs static analysis on raw SQL queries in your Go code base.

Sqlvet Sqlvet performs static analysis on raw SQL queries in your Go code base to surface potential runtime errors at build time. Feature highlights:

QP Hou 451 Aug 8, 2022
RbacCustom - Create Tables Before executing the APIs

rbacCas Create Tables Before executing the APIs Members Table CREATE TABLE publi

Akash Desale 0 Feb 8, 2022
A Golang library for using SQL.

dotsql A Golang library for using SQL. It is not an ORM, it is not a query builder. Dotsql is a library that helps you keep sql files in one place and

Gustavo Chaín 643 Aug 1, 2022
a golang library for sql builder

Gendry gendry is a Go library that helps you operate database. Based on go-sql-driver/mysql, it provides a series of simple but useful tools to prepar

DiDi 1.4k Aug 4, 2022
Database Abstraction Layer (dbal) for Go. Support SQL builder and get result easily (now only support mysql)

godbal Database Abstraction Layer (dbal) for go (now only support mysql) Motivation I wanted a DBAL that No ORM、No Reflect、Concurrency Save, support S

徐佳军 53 Jan 23, 2022
SQL builder and query library for golang

goqu is an expressive SQL bu

Doug Martin 1.6k Aug 5, 2022
SQL query builder for Go

GoSQL Query builder with some handy utility functions. Documentation For full documentation see the pkg.go.dev or GitBook. Examples // Open database a

Travis Harmon 22 Jul 14, 2022
Type safe SQL builder with code generation and automatic query result data mapping

Jet Jet is a complete solution for efficient and high performance database access, consisting of type-safe SQL builder with code generation and automa

null 769 Aug 2, 2022
A Go (golang) package that enhances the standard database/sql package by providing powerful data retrieval methods as well as DB-agnostic query building capabilities.

ozzo-dbx Summary Description Requirements Installation Supported Databases Getting Started Connecting to Database Executing Queries Binding Parameters

Ozzo Framework 565 Jul 28, 2022
Write your SQL queries in raw files with all benefits of modern IDEs, use them in an easy way inside your application with all the profit of compile time constants

About qry is a general purpose library for storing your raw database queries in .sql files with all benefits of modern IDEs, instead of strings and co

Sergey Treinis 22 Jul 4, 2022
Type safe SQL query builder and struct mapper for Go

sq (Structured Query) sq is a code-generated, type safe query builder and struct mapper for Go. Documentation • Reference • Examples This

null 164 Jul 22, 2022
Fast SQL query builder for Go

sqlf A fast SQL query builder for Go. sqlf statement builder provides a way to: Combine SQL statements from fragments of raw SQL and arguments that ma

Vlad Glushchuk 77 Jul 21, 2022
💥 A lightweight DSL & ORM which helps you to write SQL in Go.

sqlingo is a SQL DSL (a.k.a. SQL Builder or ORM) library in Go. It generates code from the database and lets you write SQL queries in an elegant way.

Qishuai Liu 248 Aug 2, 2022
Fluent SQL generation for golang

sqrl - fat-free version of squirrel - fluent SQL generator for Go Non thread safe fork of squirrel. The same handy fluffy helper, but with extra lette

Ivan Kirichenko 242 Jul 26, 2022
Fluent SQL generation for golang

Squirrel is "complete". Bug fixes will still be merged (slowly). Bug reports are welcome, but I will not necessarily respond to them. If another fork

null 5.1k Aug 6, 2022
golang orm and sql builder

gosql gosql is a easy ORM library for Golang. Style: var userList []UserModel err := db.FetchAll(&userList, gosql.Columns("id","name"), gosql.

RushTeam 161 Jun 24, 2022
Analyzer: helps uncover bugs by reporting a diagnostic for mistakes of *sql.Rows usage.

sqlrows sqlrows is a static code analyzer which helps uncover bugs by reporting a diagnostic for mistakes of sql.Rows usage. Install You can get sqlro

GoStaticAnalysis 86 Mar 24, 2022
LBADD: An experimental, distributed SQL database

LBADD Let's build a distributed database. LBADD is an experimental distributed SQL database, written in Go. The goal of this project is to build a dat

Tom Arrell 378 Jul 2, 2022
A Go library for collecting sql.DBStats in Prometheus format

sqlstats A Go library for collecting sql.DBStats and exporting them in Prometheus format. A sql.DB object represents a pool of zero or more underlying

Daniel Middlecote 147 Aug 6, 2022