CLI tool that can execute SQL queries on CSV, LTSV, JSON and TBLN. Can output to various formats.

Overview

trdsql


CLI tool that can execute SQL queries on CSV, LTSV, JSON and TBLN.

It is a tool like q, textql and others.

The difference from these tools is that the syntax of PostgreSQL or MySQL can be used.

Please refer to godoc and _example for usage as a library.

trdsql.gif

1. INSTALL

1.1. go get

go get -d github.com/noborus/trdsql
cd $GOPATH/src/github.com/noborus/trdsql
make
make install

1.1.1. Requirements

go 1.13 or higher.

1.2. Download binary

Download binary from the releases page(Linux/Windows/macOS).

1.3. Homebrew

brew install noborus/tap/trdsql

1.4. MacPorts

sudo port selfupdate
sudo port install trdsql

2. Docker

2.1. Docker pull

Pull the latest image from the Docker hub.

docker pull noborus/trdsql

2.2. image build

Or build it yourself.

docker build -t trdsql .

2.3. Docker Run

Docker run.

docker run --rm -it -v $(pwd):/tmp trdsql [options and commands]

3. Usage

trdsql [options] SQL

3.1. global options

  • -a filename analyze the file and suggest SQL.
  • -A filename analyze the file and output only the suggested SQL.
  • -config filename configuration file location.
  • -db db name specify the database name defined in the configuration file.
  • -dblist display the database list from the configuration file.
  • -driver string database driver. [ mysql | postgres | sqlite3 ]
  • -dsn string database-driver-specific data source name.
  • -debug print debug information.
  • -help display usage information.
  • -version display version information.
  • -q filename read the query from the specified file.

3.2. Input formats

  • -ig guess format from extension. (default)
  • -icsv CSV format for input.
  • -ijson JSON format for input.
  • -iltsv LTSV format for input.
  • -itbln TBLN format for input.

3.2.1. Input options

  • -ih interpret the first line as column names (CSV only).
  • -id character field delimiter for input. (default ",") (CSV only)
  • -ipath string PATH string for input (JSON/JSONL only).
  • -is int number of header rows to skip.
  • -ir int number of rows to pre-read for column determination. (default 1)

3.3. Output formats

  • -ocsv CSV format for output. (default)
  • -ojson JSON format for output.
  • -ojsonl JSONL(JSON Lines) format for output.
  • -oltsv LTSV format for output.
  • -oat ASCII Table format for output.
  • -omd Markdown format for output.
  • -oraw Raw format for output.
  • -ovf Vertical format for output.
  • -otbln TBLN format for output.

Alternatively, the output format can be guessed from the output file name.

3.3.1. Output options

  • -out filename output file name.
  • -out-without-guess output without guessing the format (when using -out).
  • -oh output column names as a header.
  • -od character field delimiter for output. (default ",") (CSV and RAW only)
  • -oq character quote character for output. (default """) (CSV only)
  • -oaq enclose all fields in quotes for output (CSV only).
  • -ocrlf end each output line with '\r\n' instead of '\n' (CSV only).
  • -onowrap do not wrap long columns (AT and MD only).

4. Example

test.csv file.

1,Orange
2,Melon
3,Apple

Use the file name as the table name.

trdsql "SELECT * FROM test.csv"

The -q filename option executes SQL from a file.

trdsql -q test.sql

4.1. STDIN input

Use "-" or "stdin" as the table name to read from standard input instead of a file.

ps | trdsql -id " " "SELECT * FROM -"

or

ps | trdsql -id " " "SELECT * FROM stdin"

4.2. Multiple files

Multiple files matched by a wildcard can be queried as a single table.

$ trdsql -ih "SELECT * FROM tt*.csv"
1,test1
2,test2
3,test3

Note: It is not possible to mix different formats (ex: CSV and LTSV).

4.3. Compressed files

If the file is compressed with gzip, bz2, zstd, lz4, xz, it will be automatically uncompressed.

trdsql "SELECT * FROM testdata/test.csv.gz"
trdsql "SELECT * FROM testdata/test.csv.zst"

It is possible to mix uncompressed and compressed files using wildcards.

trdsql "SELECT * FROM testdata/test.csv*"

4.4. Output file

Use the -out filename option to write the result to a file.

trdsql -out result.csv "SELECT * FROM testdata/test.csv ORDER BY c1"

4.5. Output compression

Use -oz compression type to compress the output.

trdsql -oz gz "SELECT * FROM testdata/test.csv ORDER BY c1" > result.csv.gz

4.6. Guess by output file name

The -out filename option guesses the output format (csv, ltsv, json, tbln, raw, md, at, vf, jsonl) and the compression format (gzip, bz2, zstd, lz4, xz) from the file name.

The extension encodes output format plus compression (e.g. .csv.gz, .ltsv.lz4, .md.zst ...).

The following writes an LTSV file compressed with zstd.

trdsql -out result.ltsv.zst "SELECT * FROM testdata/test.csv"

4.7. Columns are not constant

If the number of columns is not constant, pre-read multiple rows to determine the columns.

$ trdsql -ir 3 -iltsv "SELECT * FROM test_indefinite.ltsv"
1,Orange,50,,
2,Melon,500,ibaraki,
3,Apple,100,aomori,red

4.8. TSV (Tab Separated Value)

-id "\t" reads TSV (Tab Separated Values) input.

1	Orange
2	Melon
3	Apple
trdsql -id "\t" "SELECT * FROM test-tab.csv"

-od "\t" writes TSV (Tab Separated Values) output.

trdsql -od "\t" "SELECT * FROM test.csv"
1	Orange
2	Melon
3	Apple

4.9. LTSV (Labeled Tab-separated Values)

-iltsv reads LTSV (Labeled Tab-separated Values) input.

sample.ltsv

id:1	name:Orange	price:50
id:2	name:Melon	price:500
id:3	name:Apple	price:100
$ trdsql -iltsv "SELECT * FROM sample.ltsv"
1,Orange,50
2,Melon,500
3,Apple,100

Note: Only the columns in the first row are targeted.

-oltsv writes LTSV (Labeled Tab-separated Values) output.

$ trdsql -iltsv -oltsv "SELECT * FROM sample.ltsv"
id:1	name:Orange	price:50
id:2	name:Melon	price:500
id:3	name:Apple	price:100

4.10. JSON

-ijson reads JSON input.

sample.json

[
  {
    "id": "1",
    "name": "Orange",
    "price": "50"
  },
  {
    "id": "2",
    "name": "Melon",
    "price": "500"
  },
  {
    "id": "3",
    "name": "Apple",
    "price": "100"
  }
]
$ trdsql -ijson "SELECT * FROM sample.json"
1,Orange,50
2,Melon,500
3,Apple,100

JSON can contain structured values, but trdsql stores them as-is, as JSON strings.

sample2.json

[
    {
      "id": 1,
      "name": "Drolet",
      "attribute": { "country": "Maldives", "color": "burlywood" }
    },
    {
      "id": 2,
      "name": "Shelly",
      "attribute": { "country": "Yemen", "color": "plum" }
    },
    {
      "id": 3,
      "name": "Tuck",
      "attribute": { "country": "Mayotte", "color": "antiquewhite" }
    }
]
$ trdsql -ijson "SELECT * FROM sample2.json"
1,Drolet,"{""color"":""burlywood"",""country"":""Maldives""}"
2,Shelly,"{""color"":""plum"",""country"":""Yemen""}"
3,Tuck,"{""color"":""antiquewhite"",""country"":""Mayotte""}"

Use SQL functions to access the nested values.

$ trdsql -ijson "SELECT id, name, JSON_EXTRACT(attribute,'$country'), JSON_EXTRACT(attribute,'$color') FROM sample2.json"
1,Drolet,Maldives,burlywood
2,Shelly,Yemen,plum
3,Tuck,Mayotte,antiquewhite

Another JSON format, JSONL (JSON Lines), is also supported.

sample2.json

{"id": "1","name": "Orange","price": "50"}
{"id": "2","name": "Melon","price": "500"}
{"id": "3","name": "Apple","price": "100"}

-ojson writes JSON output.

trdsql -ojson "SELECT * FROM test.csv"
[
  {
    "c1": "1",
    "c2": "Orange"
  },
  {
    "c1": "2",
    "c2": "Melon"
  },
  {
    "c1": "3",
    "c2": "Apple"
  }
]

4.11. JSONL

To output in JSONL, specify -ojsonl.

trdsql -ojsonl "SELECT * FROM test.csv"
{"c1":"1","c2":"Orange"}
{"c1":"2","c2":"Melon"}
{"c1":"3","c2":"Apple"}

4.12. TBLN

-itbln reads TBLN input.

sample.tbln

; name: | id | name |
; type: | int | text |
| 1 | Bob |
| 2 | Alice |
$ trdsql -itbln "SELECT * FROM sample.tbln"
1,Bob
2,Alice

The TBLN file's name and type extras are reflected in the columns.

-otbln writes TBLN output.

$ trdsql -otbln "SELECT c1::int as id, c2::text as name FROM test.csv"
; created_at: 2019-03-22T13:20:31+09:00
; name: | id | name |
; type: | int | text |
| 1 | Orange |
| 2 | Melon |
| 3 | Apple |

TBLN can contain column names and type definitions. Please refer to https://tbln.dev/ for details of TBLN.

4.13. Raw output

-oraw produces raw output. It is used when escape processing is unnecessary in CSV-style output (for example, when the database returns JSON).

$ trdsql -oraw "SELECT row_to_json(t,TRUE) FROM test.csv AS t"
{"c1":"1",
 "c2":"Orange"}
{"c1":"2",
 "c2":"Melon"}
{"c1":"3",
 "c2":"Apple"}

Raw output allows a multi-character delimiter.

trdsql -oraw -od "\t|\t" -db pdb "SELECT * FROM test.csv"
1	|	Orange
2	|	Melon
3	|	Apple

4.14. ASCII Table & MarkDown output

-oat is ASCII table output.

$ trdsql -oat "SELECT * FROM test.csv"
+----+--------+
| C1 |   C2   |
+----+--------+
|  1 | Orange |
|  2 | Melon  |
|  3 | Apple  |
+----+--------+

-omd is Markdown output.

$ trdsql -omd "SELECT * FROM test.csv"
| C1 |   C2   |
|----|--------|
|  1 | Orange |
|  2 | Melon  |
|  3 | Apple  |

The -onowrap option does not wrap long columns in at or md output.

4.15. Vertical format output

-ovf is Vertical format output("column name | value" vertically).

$ trdsql -ovf "SELECT * FROM test.csv"
---[ 1]--------------------------------------------------------
  c1 | 1
  c2 | Orange
---[ 2]--------------------------------------------------------
  c1 | 2
  c2 | Melon
---[ 3]--------------------------------------------------------
  c1 | 3
  c2 | Apple

4.16. SQL function

$ trdsql "SELECT count(*) FROM test.csv"
3

The default column names are c1, c2,...

$ trdsql "SELECT c2,c1 FROM test.csv"
Orange,1
Melon,2
Apple,3

-ih interprets the first line as column names.

$ ps |trdsql -ih -oh -id " " "SELECT \`PID\`, \`TTY\`, \`TIME\`, \`CMD\` FROM -"
TIME,TTY,PID,CMD
00:00:00,pts/20,3452,ps
00:00:00,pts/20,3453,trdsql
00:00:05,pts/20,15576,zsh

4.17. JOIN

SQL JOINs can be used.

user.csv

1,userA
2,userB

hist.csv

1,2017-7-10
2,2017-7-10
2,2017-7-11
$ trdsql "SELECT u.c1,u.c2,h.c2 FROM user.csv as u LEFT JOIN hist.csv as h ON(u.c1=h.c1)"
1,userA,2017-7-10
2,userB,2017-7-10
2,userB,2017-7-11

4.18. PostgreSQL

When using PostgreSQL, specify postgres as the driver and a driver-specific data source name as the dsn.

trdsql -driver postgres -dsn "dbname=test" "SELECT count(*) FROM test.csv "

4.18.1. Function

The PostgreSQL driver supports window functions.

$ trdsql -driver postgres -dsn "dbname=test" "SELECT row_number() OVER (ORDER BY c2),c1,c2 FROM test.csv"
1,3,Apple
2,2,Melon
3,1,Orange

For example, the generate_series function can be used.

$ trdsql -driver postgres -dsn "dbname=test" "SELECT generate_series(1,3);"
1
2
3

4.18.2. Joining a table and a CSV file is possible

The test database has a colors table.

$ psql test -c "SELECT * FROM colors"
 id |  name  
----+--------
  1 | orange
  2 | green
  3 | red
(3 rows)

Join table and CSV file.

$ trdsql -driver postgres -dsn "dbname=test" "SELECT t.c1,t.c2,c.name FROM test.csv AS t LEFT JOIN colors AS c ON (t.c1::int = c.id)"
1,Orange,orange
2,Melon,green
3,Apple,red

To create a table from a file, use "CREATE TABLE ... AS SELECT...".

trdsql -driver postgres -dsn "dbname=test" "CREATE TABLE fruits (id, name) AS SELECT c1::int, c2 FROM fruits.csv "
$ psql -c "SELECT * FROM fruits;"
 id |  name  
----+--------
  1 | Orange
  2 | Melon
  3 | Apple
(3 rows)

4.19. MySQL

When using MySQL, specify mysql for driver and connection information for dsn.

$ trdsql -driver mysql -dsn "user:password@/test" "SELECT GROUP_CONCAT(c2 ORDER BY c2 DESC) FROM testdata/test.csv"
"g,d,a"
$ trdsql -driver mysql -dsn "user:password@/test" "SELECT c1, SHA2(c2,224) FROM test.csv"
1,a063876767f00792bac16d0dac57457fc88863709361a1bb33f13dfb
2,2e7906d37e9523efeefb6fd2bc3be6b3f2991678427bedc296f9ddb6
3,d0b8d1d417a45c7c58202f55cbb617865f1ef72c606f9bce54322802

MySQL can join tables and CSV files as well as PostgreSQL.

4.20. Analyze

The -a filename option parses the file and outputs table information and SQL examples.

trdsql -a testdata/header.csv
The table name is testdata/header.csv.
The file type is CSV.

Data types:
+-------------+------+
| column name | type |
+-------------+------+
| id          | text |
| `name`      | text |
+-------------+------+

Data samples:
+----+----------+
| id | `name`   |
+----+----------+
|  1 | Orange   |
+----+----------+

Examples:
trdsql -db sdb -ih "SELECT id, \`name\` FROM testdata/header.csv"
trdsql -db sdb -ih "SELECT id, \`name\` FROM testdata/header.csv WHERE id = '1'"
trdsql -db sdb -ih "SELECT id, count(id) FROM testdata/header.csv GROUP BY id"
trdsql -db sdb -ih "SELECT id, \`name\` FROM testdata/header.csv ORDER BY id LIMIT 10"

Other options (-id, -ih, -ir, -is, -icsv, -iltsv, -ijson, -itbln, ...) are available.

trdsql -ih -a testdata/header.csv

Similarly, the -A filename option outputs only the example SQL.

trdsql -ih -A testdata/header.csv
trdsql -ih "SELECT id, \`name\` FROM testdata/header.csv"
trdsql -ih "SELECT id, \`name\` FROM testdata/header.csv WHERE id = '1'"
trdsql -ih "SELECT id, count(id) FROM testdata/header.csv GROUP BY id"
trdsql -ih "SELECT id, \`name\` FROM testdata/header.csv ORDER BY id LIMIT 10"

4.21. configuration

You can specify driver and dsn in the configuration file.

Unix-like systems:

${HOME}/.config/trdsql/config.json

Windows (example):

C:\Users\{"User"}\AppData\Roaming\trdsql\config.json

Or use the -config file option.

trdsql -config config.json "SELECT * FROM test.csv"

sample: config.json

{
  "db": "pdb",
  "database": {
    "sdb": {
      "driver": "sqlite3",
      "dsn": ""
    },
    "pdb": {
      "driver": "postgres",
      "dsn": "user=test dbname=test"
    },
    "mdb": {
      "driver": "mysql",
      "dsn": "user:password@/dbname"
    }
  }
}

The default database is the entry named by "db".

Once an entry exists in the configuration file, you can select it by name with -db.

$ trdsql -debug -db pdb "SELECT * FROM test.csv"
2017/07/18 02:27:47 driver: postgres, dsn: user=test dbname=test
2017/07/18 02:27:47 CREATE TEMPORARY TABLE "test.csv" ( c1 text,c2 text );
2017/07/18 02:27:47 INSERT INTO "test.csv" (c1,c2) VALUES ($1,$2);
2017/07/18 02:27:47 SELECT * FROM "test.csv"
1,Orange
2,Melon
3,Apple

5. Library

Example of use as a library.

package main

import (
        "log"

        "github.com/noborus/trdsql"
)

func main() {
        trd := trdsql.NewTRDSQL(
                trdsql.NewImporter(trdsql.InDelimiter(":")),
                trdsql.NewExporter(trdsql.NewWriter()),
        )
        err := trd.Exec("SELECT c1 FROM /etc/passwd")
        if err != nil {
                log.Fatal(err)
        }
}

Please refer to godoc and _example for usage as a library.

See also psutilsql, which uses trdsql as a library.
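To control the output format as a library (for example, writing CSV to a chosen stream), NewWriter accepts functional options. A sketch, assuming the OutFormat and OutStream options as documented in the godoc (verify against the current API before relying on them):

```go
package main

import (
	"log"
	"os"

	"github.com/noborus/trdsql"
)

func main() {
	// OutFormat selects the output format and OutStream the
	// destination writer (both assumed from the godoc).
	w := trdsql.NewWriter(
		trdsql.OutFormat(trdsql.CSV),
		trdsql.OutStream(os.Stdout),
	)
	trd := trdsql.NewTRDSQL(
		trdsql.NewImporter(trdsql.InHeader(true)),
		trdsql.NewExporter(w),
	)
	if err := trd.Exec("SELECT c1, c2 FROM test.csv"); err != nil {
		log.Fatal(err)
	}
}
```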

6. License

MIT

Please check the license of each SQL driver.

Comments
  • How would I use as Library to output to CSV  file?


    I'm trying to use the library. I can select the fields but I can't export it to the CSV file or I do not understand it correctly.

    func main() {
    	trd := trdsql.NewTRDSQL(
    		trdsql.NewImporter(trdsql.InDelimiter(",")),
    		trdsql.NewExporter(trdsql.NewWriter()),
    	)
    	err := trd.Exec("SELECT COUNT(*) FROM sample.csv")
    	if err != nil {
    		log.Fatal(err)
    	}
    }
    

    How can I output the result to CSV?

    question 
    opened by sujit-baniya 15
  • Improve CSV output formatting to include quotation characters


    When for example you have a source file like this

    "col1","col2" "dat1","dat2"

    And output it as -ocsv the results are

    col1,col2 dat1,dat2

    You can right now configure the separator character with -od A similar one for quotes could be added

    enhancement question 
    opened by elvarb 11
  • SQLite "cannot commit - no transaction is active" on large data sets

    When a sufficiently large amount of data is passed to trdsql configured to use SQLite, an error is thrown indicating that there is no active transaction to commit.

    Here's a demonstration. Although pulling from /dev/urandom, the behavior was consistent. Worked fine for any number less than or equal to 78768, and threw the error for larger numbers. (I would not be surprised if the cutoff number were different in every environment though.)

    for byteCnt in 78768 78769
    do
    	dd if=/dev/urandom bs=$byteCnt count=1 |od -tx1 |sed 's/ /\x0a/g;' |
    	trdsql -oat 'select count(*) as RecCnt, c1 as Val from - group by c1 order by count(*) desc limit 5';
    done;
    
    1+0 records in
    1+0 records out
    78768 bytes (79 kB, 77 KiB) copied, 0.0055778 s, 14.1 MB/s
    +--------+-----+
    | RecCnt | Val |
    +--------+-----+
    |    353 | a1  |
    |    348 |  52 |
    |    348 | d8  |
    |    347 | 8a  |
    |    345 |  81 |
    +--------+-----+
    1+0 records in
    1+0 records out
    78769 bytes (79 kB, 77 KiB) copied, 0.0052549 s, 15.0 MB/s
    +--------+-----+
    | RecCnt | Val |
    +--------+-----+
    +--------+-----+
    2019/10/09 16:26:31 ERROR(COMMIT):cannot commit - no transaction is active
    

    I looked briefly at the relevant code but didn't see anything alarming...

    • trdsql.go: Exec()
    • importer.go: ImportFile()
    • database.go: Import(), insertImport()
    bug 
    opened by jetzerb 7
  • Querying set of CSVs which a lot of entries does not output any json


    I am working on a medium-large open data set of Belgian companies, which is freely downloadable at https://kbopub.economie.fgov.be/kbo-open-data/login

    I am trying to execute following query to denormalize data into 1 file and it works fine when the output target is CSV, but nothing happens when outputting JSON:

    docker run --rm -it -v $(pwd):/tmp noborus/trdsql -ojson -ih " \
    SELECT * FROM /tmp/enterprise.csv as e \
    LEFT JOIN /tmp/address.csv as a ON (e.EnterpriseNumber=a.EntityNumber) \
    LEFT JOIN /tmp/contact.csv as c ON (e.EnterpriseNumber=c.EntityNumber) \
    LEFT JOIN /tmp/denomination.csv as d ON (e.EnterpriseNumber=d.EntityNumber) \
    ";
    

    Any ideas?

    question 
    opened by sbeaupre 6
  • Can't you use the WHERE statement?


    func main() {
    	jsonString := `
    {
      "deviceInfo": {
        "deviceName": "设备3",
        "edgeDeviceName": "设备three",
        "topic": "GG2",
        "encryption": false,
        "compression": false
      },
      "telemetry": {
        "ts": 1585194439000,
        "aa": 6,
        "bb": 10,
        "OPC温度": 10370,
        "OPC湿度": "86",
        "OPC电量": true,
    	"staType":0
      }
    }
    `
    	trdsql.EnableDebug()
    	r := bytes.NewBufferString(jsonString)
    	importer, err := trdsql.NewBufferImporter("test", r, trdsql.InFormat(trdsql.JSON))
    	if err != nil {
    		log.Fatal(err)
    	}
    	writer := trdsql.NewWriter(trdsql.OutFormat(trdsql.VF))
    	trd := trdsql.NewTRDSQL(importer, trdsql.NewExporter(writer))
    
    	err = trd.Exec("SELECT telemetry FROM test where telemetry.aa > 3")
    	if err != nil {
    		log.Fatal(err)
    	}
    }
    

    Out :

    2021/10/20 10:54:13 driver: sqlite3, dsn: 
    2021/10/20 10:54:13 CREATE TEMPORARY TABLE test ( `deviceInfo` text, `telemetry` text );
    2021/10/20 10:54:13 INSERT INTO test (`deviceInfo`, `telemetry`) VALUES (?,?);
    2021/10/20 10:54:13 SELECT telemetry FROM test where telemetry.aa > 3
    2021/10/20 10:54:13 export: no such column: telemetry.aa [SELECT telemetry FROM test where telemetry.aa > 3]
    exit status 1
    
    opened by shumintao 5
  • SQL joins are not reflected in json output


    When using the join from the example, the json output is different than the csv output (only the latter is correct)

    ❯ cat hist.csv
    1,2017-7-10
    2,2017-7-10
    2,2017-7-11%
    
    ❯ cat user.csv
    1,userA
    2,userB%
    
    ❯ docker run --rm -it -v $(pwd):/tmp noborus/trdsql "SELECT u.c1,u.c2,h.c2 FROM /tmp/user.csv as u LEFT JOIN /tmp/hist.csv as h ON(u.c1=h.c1)"
    1,userA,2017-7-10
    2,userB,2017-7-10
    2,userB,2017-7-11
    
    
    ❯ docker run --rm -it -v $(pwd):/tmp noborus/trdsql -ojson "SELECT u.c1,u.c2,h.c2 FROM /tmp/user.csv as u LEFT JOIN /tmp/hist.csv as h ON(u.c1=h.c1)"
    [
      {
        "c1": "1",
        "c2": "2017-7-10"
      },
      {
        "c1": "2",
        "c2": "2017-7-10"
      },
      {
        "c1": "2",
        "c2": "2017-7-11"
      }
    ]
    
    question good first issue 
    opened by sbeaupre 5
  • Backquotes in output when selecting from table containing dashes


    Hardcoded values containing a dash are unexpectedly output with backquotes when selecting from

    • stdin
    • a filename containing one or more dashes

    Table data containing dashes is correctly output without any added backquotes.

    I can work around it by enclosing my filenames in quotes when selecting directly from them, but that same method does not work to quote stdin.

    Some examples:

    $ FILE=/tmp/foo-bar.csv  # filename containing a dash
    $ echo "x,y-z" > $FILE
    $ cat $FILE | trdsql -oat "select '$FILE' as stdin_quoted_dashes, * from -"                                                                                                                                                             
    +---------------------+----+-----+
    | stdin_quoted_dashes | c1 | c2  |
    +---------------------+----+-----+
    | /tmp/foo`-`bar.csv  | x  | y-z |
    +---------------------+----+-----+
    
    $ trdsql -oat "select '$FILE' as unquoted_file_full_string_quoted, * from $FILE"                                                                                                                                                        
    +----------------------------------+----+-----+
    | unquoted_file_full_string_quoted | c1 | c2  |
    +----------------------------------+----+-----+
    | `/tmp/foo-bar.csv`               | x  | y-z |
    +----------------------------------+----+-----+
    
    $ trdsql -oat "select '$FILE' as quoted_file_ok, * from \"$FILE\""                                                                                                                                                                      
    +------------------+----+-----+
    |  quoted_file_ok  | c1 | c2  |
    +------------------+----+-----+
    | /tmp/foo-bar.csv | x  | y-z |
    +------------------+----+-----+
    
    $ # but can't use the same workaround for stdin:(
    $ cat $FILE | trdsql -oat "select '$FILE' as quoted_stdin, * from \"-\""                                                                                                                                                                
    2019/08/26 08:40:46 ERROR(EXPORT):SQL:no such table: -
    [select '/tmp/foo-bar.csv' as quoted_stdin, * from "-"]
    
    $ uname -srvmpio
    Linux 4.9.184-linuxkit #1 SMP Tue Jul 2 22:58:16 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
    
    $ trdsql --version
    0.6.3
    
    opened by jetzerb 4
  • Make this project dependency ready


    I would like to use this as a dependency of my own application, however I cannot use it as such because all the code is in the main package.

    Please move supporting code to it's own package and reference from your main.go so this can be used from other go programs.

    Moreover, structs appear to use all private fields, so those would need to be updated as well.

    Please also provide an option to supply an input stream to the TRDSQL struct, instead of assuming a file.

    opened by stipo42 4
  • Consecutive Tab Delimiters Collapsed To A Single Character


    When querying a tab-delimited file, if there's an empty column, the rest of the row shifts over one position. Works fine as long as there is data in every column.

    echo 'a,b,c,d 1,2,3,4 5,,,8 9,A,B,C' | sed 's/,/\t/g; s/ /\n/g;' |trdsql -ih -id '\t' -oat 'select * from -'
    
    +---+---+---+---+
    | a | b | c | d |
    +---+---+---+---+
    | 1 | 2 | 3 | 4 |
    | 5 | 8 |   |   | <-- value from column "d" is output under column "b"
    | 9 | A | B | C |
    +---+---+---+---+
    
    

    For completeness, I tested all the ASCII values 1-126 as delimiters. The problem only occurs for a few values:

    • 9 (Horizontal Tab)
    • 11 (Vertical Tab)
    • 12 (Form Feed)
    • 32 (Space)
    for ord in $(seq 1 126)
    do
    	[ $ord -eq 10 ] && continue; # linefeed
    	[ $ord -eq 13 ] && continue; # carriage return
    	[ $ord -eq 34 ] && continue; # double quote
    	[ $ord -eq 39 ] && continue; # single quote
    	[ $ord -eq 49 ] && continue; # the number 1 (our data below)
    
    	delim=$(printf \\$(printf '%03o' $ord));
    	[ $ord -eq 92 ] && delim="\\$delim";
    
    	sedDelim=$delim;
    	[ "$delim" = "/" ] && sedDelim="\/";
    	[ "$delim" = "&" ] && sedDelim="\&";
    
    	output=$(
    		echo "1,,1" | sed "s/,/$sedDelim/g;" |
    		trdsql -id "$delim" 'select * from -';
    	);
    
    	[ "$output" = "1,,1" ] || echo "$output for ord $ord";
    done;
    
    1,1 for ord 9
    1,1 for ord 11
    1,1 for ord 12
    1,1 for ord 32
    

    While I've never encountered a file delimited by VT, FF or Space, I regularly encounter tab delimited files, so this is a fairly important issue.

    bug 
    opened by jetzerb 3
  • build Dockerfile with multi-stage ? It also supports ARM64


    this is my dockerfile for test.

    But the mirror image is still huge, about 800M

    FROM --platform=$BUILDPLATFORM golang:1.16 as builder
    ENV TZ=Asia/Shanghai LANG="C.UTF-8"
    ARG TARGETARCH
    ARG TARGETOS
    
    WORKDIR /workspace
    # Copy the Go Modules manifests
    COPY go.mod go.mod
    COPY go.sum go.sum
    ENV GO111MODULE=on
    ENV GOPROXY=https://goproxy.io,direct
    
    # Copy the go source
    COPY query query
    
    # Build
    RUN CGO_ENABLED=1 GOOS=$TARGETOS GOARCH=$TARGETARCH go build --tags='json1' -a -o queryd query/cmd/main.go
    
    FROM golang:1.16-buster
    WORKDIR /
    COPY --from=builder /workspace/queryd .
    ENTRYPOINT ["/queryd"]
    
    help wanted 
    opened by shumintao 2
  • Not able to use JSON_EXTRACT


    Hi, first of all thank you for this great tool.

    I'm trying to use the documentation page to reply JSON_EXTRACT example, My input is

    [
      {
        "id": 1,
        "name": "Drolet",
        "attribute": { "country": "Maldives", "color": "burlywood" }
      },
      {
        "id": 2,
        "name": "Shelly",
        "attribute": { "country": "Yemen", "color": "plum" }
      },
      {
        "id": 3,
        "name": "Tuck",
        "attribute": { "country": "Mayotte", "color": "antiquewhite" }
      }
    ]
    

    If I run the below command I have nothing.

    trdsql -ijson "SELECT id, name, JSON_EXTRACT(attribute,'$country'), JSON_EXTRACT(attribute,'$color') FROM sample2.json"
    

    I'm using trdsql version v0.9.0.

    What's wrong in my command?

    Thank you

    opened by aborruso 2
  • Cannot use the tool, process gets instantly killed with 137 exit code


    Even if I run just trdsql -version it exists with a 137 exit code instantly. Any ideas? I'm using 0.10.0 and have installed (and tried to reinstall it already) via Homebrew.

    I have used an older version in the past without issues. It's not a memory problem on my computer as I have plenty of RAM available.

    ~ ❯ trdsql -version                                                                                                                                                                                                        ✘ KILL 05:30:11 PM
    [1]    16749 killed     trdsql -version
    
    help wanted 
    opened by rdsedmundo 7
  • Support go1.18 or later


    Add https://github.com/multiprocessio/go-sqlite3-stdlib to the driver with the name "sqlite3_ext". The following examples are possible:

    trdsql -driver sqlite3_ext "SELECT repeat('x', 2)"
    

    Resolved #140.

    opened by noborus 1
  • Support for DuckDB


    Support for DuckDB (https://duckdb.org/) would be great. A quick search shows that there is a Go SQL driver for it: https://github.com/marcboeker/go-duckdb (not sure if it is compatible with trdsql).

    enhancement 
    opened by agguser 1
  • Multiple queries


    Please support multiple queries. Currently, it seems that only last query is output, e.g.

    $ trdsql 'select 1; select 2,3'
    2,3
    

    Note: union is not used here because of different number of columns.

    A workaround is to run multiple commands (e.g. trdsql 'select 1'; trdsql 'select 2,3'), but database will be created multiple times. And if input is from stdin, you have to save it first to share same input for each command.

    enhancement 
    opened by agguser 1
  • Specifying column types


    Please add an option to auto-detect column types (e.g. -itype), or specify numeric column types (e.g. -inum c2,c3).

    Currently, it seems that column types are not auto-detected, which may cause wrong order, e.g.

    $ d='a,1
    b,2
    c,11'
    
    $ <<<$d trdsql -a -
    The table name is -.
    The file type is CSV.
    
    Data types:
    +-------------+------+
    | column name | type |
    +-------------+------+
    | c1          | text |
    | c2          | text |
    +-------------+------+
    …
    
    $ <<<$d trdsql 'select * from - order by c2'
    a,1
    c,11
    b,2
    
    $ <<<$d trdsql 'select * from - order by cast(c2 as number)'
    a,1
    b,2
    c,11
    
    
    enhancement 
    opened by agguser 3
  • binary is not completely statically compiled


    $ trdsql -ijson "select verbose_name from fudaneducn.json group by verbose_name"
    trdsql: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by trdsql)
    trdsql: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.32' not found (required by trdsql)
    trdsql: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.33' not found (required by trdsql)
    

    It depends on a particular version of glibc

    my glibc info

    strings /lib/x86_64-linux-gnu/libc.so.6 | grep GLIBC
    GLIBC_2.2.5
    GLIBC_2.2.6
    GLIBC_2.3
    GLIBC_2.3.2
    GLIBC_2.3.3
    GLIBC_2.3.4
    GLIBC_2.4
    GLIBC_2.5
    GLIBC_2.6
    GLIBC_2.7
    GLIBC_2.8
    GLIBC_2.9
    GLIBC_2.10
    GLIBC_2.11
    GLIBC_2.12
    GLIBC_2.13
    GLIBC_2.14
    GLIBC_2.15
    GLIBC_2.16
    GLIBC_2.17
    GLIBC_2.18
    GLIBC_2.22
    GLIBC_2.23
    GLIBC_2.24
    GLIBC_2.25
    GLIBC_2.26
    GLIBC_2.27
    GLIBC_PRIVATE
    GNU C Library (Ubuntu GLIBC 2.27-3ubuntu1) stable release version 2.27.
    
    opened by akkuman 3
Releases(v0.10.1)
Owner: Noboru Saito

Felix Geisendörfer 361 Nov 17, 2022
Querycrate - A simple library that loads and keeps SQL queries from files

QueryCrate A simple library for loading & getting string queries from files. How

A Really Fake Floordiv Since 4th January, 2022 1 Feb 15, 2022
pggen - generate type safe Go methods from Postgres SQL queries

pggen - generate type safe Go methods from Postgres SQL queries pggen is a tool that generates Go code to provide a typesafe wrapper around Postgres q

Joe Schafer 202 Nov 21, 2022
SQL API is designed to be able to run queries on databases without any configuration by simple HTTP call.

SQL API SQL API is designed to be able to run queries on databases without any configuration by simple HTTP call. The request contains the DB credenti

Çiçeksepeti Tech 25 Nov 15, 2022
Opionated sql formatter for use with .go files containing backticked queries

fumpt-the-sql Opionated sql formatter for use with .go files containing backticked queries. Uses https://sqlformat.darold.net/ for the actual sql form

Geoff Cowan 1 Dec 10, 2021
Single binary CLI for generating structured JSON, CSV, Excel, etc.

fakegen: Single binary CLI for generating a random schema of M columns to populate N rows of JSON, CSV, Excel, etc. This program generates a random sc

Multiprocess Labs LLC 56 Aug 30, 2022
Go-sql-reader - Go utility to read the externalised sql with predefined tags

go-sql-reader go utility to read the externalised sql with predefined tags Usage

null 0 Jan 25, 2022
This Service collects slow queries and returns them in response.

pgsql-api-go This Service collects slow queries and returns them in response. Status This service is the very first version of the project. App is up

ngh 0 Dec 30, 2021
Simple pgx wrapper to execute and scan query results

pig Simple pgx wrapper to execute and scan query results. Features All-in-one tool; Simple transactions management: You can set idle_in_transaction_se

Alexey Popov 11 Nov 14, 2022