The Snowflake Grafana data source plugin allows Snowflake data to be visualized in Grafana dashboards.

Overview

Snowflake Grafana Data Source


With the Snowflake plugin, you can visualize your Snowflake data in Grafana and build charts.

Get started with the plugin

Set up the Snowflake Data Source

Install the Data Source

  1. Install the plugin into the Grafana plugin directory:
grafana-cli --pluginUrl https://github.com/michelin/snowflake-grafana-datasource/releases/latest/download/snowflake-grafana-datasource.zip plugins install michelin-snowflake-datasource

or

cd /var/lib/grafana/plugins/
wget https://github.com/michelin/snowflake-grafana-datasource/releases/latest/download/snowflake-grafana-datasource.zip
unzip snowflake-grafana-datasource.zip
  2. Edit the Grafana configuration file to allow unsigned plugins:
  • Linux: /etc/grafana/grafana.ini
  • macOS: /usr/local/etc/grafana/grafana.ini
[plugins]
allow_loading_unsigned_plugins = michelin-snowflake-datasource

Or with Docker:

docker run -d \
-p 3000:3000 \
-v "$(pwd)"/grafana-plugins:/var/lib/grafana/plugins \
--name=grafana \
-e "GF_PLUGINS_ALLOW_LOADING_UNSIGNED_PLUGINS=michelin-snowflake-datasource" \
grafana/grafana
  3. Restart Grafana.

Configure the Datasource

  • Open the side menu by clicking the Grafana icon in the top header.
  • In the side menu under the Configuration icon you should find a link named Data Sources.
  • Click the + Add data source button in the top header.
  • Select Snowflake.

Add your authentication and configuration details.


Available configuration fields are as follows:

  • Account Name: Specifies the full name of your account (provided by Snowflake).
  • Username: Specifies the login name of the user for the connection.
  • Password: Specifies the password for the specified user.
  • Role (optional): Specifies the default access control role to use in the Snowflake session initiated by Grafana.
  • Warehouse (optional): Specifies the virtual warehouse to use once connected.
  • Database (optional): Specifies the default database to use once connected.
  • Schema (optional): Specifies the default schema to use for the specified database once connected.
  • Extra Options (optional): Specifies one or more parameters in the form <param>=<value>, separated by the ampersand character (&), with no spaces anywhere in the connection string.

Supported Macros

Macros can be used within a query to simplify syntax and allow for dynamic parts.

  • $__time(dateColumn): Will be replaced by an expression that converts the column to a UNIX timestamp and renames it to time. For example, TRY_TO_TIMESTAMP(dateColumn) as time
  • $__timeEpoch(dateColumn): Will be replaced by an expression to convert to a UNIX timestamp and rename the column to time.
  • $__timeFilter(dateColumn): Will be replaced by a time range filter using the specified column name. For example, dateColumn BETWEEN 1494410783 AND 1494410983
  • $__timeFrom(): Will be replaced by the start of the currently active time selection. For example, 1494410783
  • $__timeTo(): Will be replaced by the end of the currently active time selection. For example, 1494410983
  • $__timeGroup(dateColumn,'5m'): Will be replaced by an expression usable in a GROUP BY clause. For example, floor(extract(epoch from dateColumn)/300)*300
  • $__timeGroup(dateColumn,'5m', 0): Same as above, but with a fill parameter so missing points in that series will be added by Grafana, with 0 used as the value.
  • $__timeGroup(dateColumn,'5m', NULL): Same as above, but NULL will be used as the value for missing points.
  • $__timeGroup(dateColumn,'5m', previous): Same as above, but the previous value in that series will be used as the fill value; if no value has been seen yet, NULL will be used (only available in Grafana 5.3+).
  • $__timeGroupAlias(dateColumn,'5m'): Will be replaced identically to $__timeGroup, but with an added column alias time (only available in Grafana 5.3+).
  • $__unixEpochFilter(dateColumn): Will be replaced by a time range filter using the specified column name, with times represented as Unix timestamps. For example, dateColumn > 1494410783 AND dateColumn < 1494497183
  • $__unixEpochNanoFilter(dateColumn): Will be replaced by a time range filter using the specified column name, with times represented as nanosecond timestamps. For example, dateColumn > 1494410783152415214 AND dateColumn < 1494497183142514872
  • $__unixEpochNanoFrom(): Will be replaced by the start of the currently active time selection as a nanosecond timestamp. For example, 1494410783152415214
  • $__unixEpochNanoTo(): Will be replaced by the end of the currently active time selection as a nanosecond timestamp. For example, 1494497183142514872
  • $__unixEpochGroup(dateColumn,'5m', [fillmode]): Same as $__timeGroup but for times stored as Unix timestamps (only available in Grafana 5.3+).
  • $__unixEpochGroupAlias(dateColumn,'5m', [fillmode]): Same as above, but also adds a column alias (only available in Grafana 5.3+).
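To make the substitutions concrete, here is a sketch of how a macro-based query might expand. The table and column names (metrics, created_at, cpu_usage) are hypothetical, the timestamp values are taken from the examples above, and the exact SQL the plugin generates may differ slightly.

-- Query as written in the query editor
SELECT
  $__timeGroup(created_at, '5m') as time,
  avg(cpu_usage)
FROM metrics
WHERE $__timeFilter(created_at)
GROUP BY 1

-- Roughly what is sent to Snowflake for the active time range
SELECT
  floor(extract(epoch from created_at)/300)*300 as time,
  avg(cpu_usage)
FROM metrics
WHERE created_at BETWEEN 1494410783 AND 1494410983
GROUP BY 1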

Write Queries

Create a panel in a dashboard and select a Snowflake Data Source to start using the query editor.

Select a query type: 'Time Series' or 'Table'.

For a Time Series query:

  • A date/time column must be included; it can appear anywhere in the query (define its name in the time formatted columns field).
  • A numerical column must be included.

Query editor


Simple query

SELECT
  <time_column> as time,
  <numerical_column>
FROM
  <table>
WHERE
  $__timeFilter(<time_column>) 
  AND $<variable> = 'xxxxx' -- custom grafana variables start with dollar sign

SQL Query Group By Interval

SELECT
  $__timeGroup(<time_column>, $__interval), -- group time by interval
  <numerical_column>
FROM
  <table>
WHERE
  $__timeFilter(<time_column>)
  AND $<variable> = 'xxxxx'
GROUP BY 
  1 -- group by the time bucket (the first selected column), not the raw time column
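The same query can be written with $__timeGroupAlias and a fill mode, so Grafana fills missing buckets. A sketch: as above, <time_column>, <numerical_column>, and <table> are placeholders, and the three-argument alias form is assumed to accept the same fill modes as $__timeGroup.

SELECT
  $__timeGroupAlias(<time_column>, '5m', 0), -- 5-minute buckets, gaps filled with 0
  avg(<numerical_column>)
FROM
  <table>
WHERE
  $__timeFilter(<time_column>)
GROUP BY
  1
ORDER BY
  1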

Create an annotation

Annotations allow you to overlay events on a graph. To create an annotation, open the dashboard settings, click "Annotations", then "New".
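An annotation query typically returns a time column and a text column. A minimal sketch, assuming a hypothetical deployments table with event_time and event_name columns (the exact column names the plugin expects may differ):

SELECT
  event_time as time,
  event_name as text
FROM
  deployments
WHERE
  $__timeFilter(event_time)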

Development

The Snowflake data source is a Grafana backend plugin composed of frontend and backend components.

Frontend

  1. Install dependencies

    yarn install
  2. Build plugin in development mode or run in watch mode

    yarn dev

    or

    yarn watch
  3. Build plugin in production mode

    yarn build

Backend

  1. Build backend plugin binaries for Linux, Windows and Darwin:

    mage -v
  2. List all available Mage targets for additional commands:

    mage -l

License

The Snowflake Grafana plugin is released under the Apache License 2.0. Please refer to the LICENSE file for further information.

Comments
  • No data in response.

    Hi,

    I installed this on a test grafana instance at work and hooked it up to snowflake.

    I can see the query going out to snowflake and returning results, but there's no data being plotted on the graph.

    I've attached the panel JSON and the query JSON from the query inspector.

    panel.txt query.txt

    needs investigation 
    opened by norganna 8
  • Add support for variables

    Hello!

    I'm adding support for variables.

    https://grafana.com/docs/grafana/latest/developers/plugins/add-support-for-variables/

    To fix issue #3

    Thanks!

    opened by inacionery 3
  • queryresultmeta object comes as custom object

    When I run timeseries query to fetch server memory utilized in group or hosts or all hosts, grafana queryresult meta always show as custom object with rowCount: 1; Because of this flowcharting-panel couldn't pickup the right metric; It always returns "Apply to column" as return table metadata instead of host column values.

    [screenshots]

    Whereas using the mysql connector, it shows as timeseries-wide.

    bug 
    opened by kranthikirang 2
  • %*p characters in password causing problem

    Our Snowflake password looks like xxxxxxx%*pxxxxxxxxx

    If we try to authenticate using this password - we get an error: Validation query error : invalid URL escape "%*p"

    bug 
    opened by vladimirvilinski 2
  • Dashboard variables and annotations lack ability to enter a Query

    Both dashboard variables and the annotations are missing the option to add a Query. Unable to add variables and annotations to the dashboards. Is this something meant to be supported by the plugin?

    opened by mshirman 2
  • Convert long frame to wide frame

    This PR adds the support to convert a Long formatted time series Frame to a Wide format. The main functionality is done by calling function LongToWide() from "github.com/grafana/grafana-plugin-sdk-go/data" package. For the full description of this function, check the function description in the package source code.

    One of the major purpose of this PR is to adding field labels to frame, which is required when apply "Labels to fields" transformation.

    Fixes: #15, fixes: #19

    opened by heanlan 1
  • Bump terser from 4.8.0 to 4.8.1

    Bumps terser from 4.8.0 to 4.8.1.

    Changelog

    Sourced from terser's changelog.

    v4.8.1 (backport)

    • Security fix for RegExps that should not be evaluated (regexp DDOS)
    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language

    You can disable automated security fix PRs for this repo from the Security Alerts page.

    dependencies javascript 
    opened by dependabot[bot] 1
  • Bump postcss from 7.0.35 to 7.0.39

    Bumps postcss from 7.0.35 to 7.0.39.

    Release notes

    Sourced from postcss's releases.

    7.0.39

    • Reduce package size.
    • Backport nanocolors to picocolors migration.

    7.0.38

    • Update Processor#version.

    7.0.37

    • Backport chalk to nanocolors migration.

    7.0.36

    • Backport ReDoS vulnerabilities from PostCSS 8.
    Commits


    dependencies 
    opened by dependabot[bot] 1
  • Timeseries-wide group by 2 columns

    Grafana Version: 9.1.1

    Executed Query:

    SELECT $__timeGroup(DATETIME,$__interval) as Time, HOST as metric, (max(USED/1024*1024)/max(TOTAL/1024*1024)*100) AS used_percent FROM SERVER_MEMORY WHERE $__timeFilter(DATETIMEUTC,$__interval) AND SERVICE='service1' GROUP BY 1,2 ORDER BY 1

    Using mysql: [screenshot]

    Whereas using the snowflake plugin: [screenshot]

    Expected result: We should see a separate column for each HOST(metric) in a given time series

    opened by kranthikirang 0
  • Time-series unable to add "Labels to fields" transformation

    I'm trying to add a "Labels to fields" transformation to the time-series panel, so that it will show the "group by" column instead of aggregated metric column in the legend. But it tells me there is no available label.

    Executed query:

    SELECT timeField as time,
    CONCAT(source, '->', destination) as pair,
    AVG(throughput)
    FROM table
    WHERE $__timeFilter(time)
    GROUP BY time, pair
    HAVING AVG(throughput) > 0
    ORDER BY time
    

    Expected (snowflake plugin by Grafana Labs): [screenshot]

    Actual graph view: [screenshot]

    Actual table view: [screenshot]

    Grafana: v8.3.3

    enhancement 
    opened by heanlan 0
  • Time formatted Columns don't seem to work

    When I add a field with a timestamp in it to the Time Formatted Columns, I get a 500 error. Also, normally when you have two dimensions and set Stacked Series normal, I expect to see a stacked bar chart, but I don't, instead it stays unstacked. This works fine on the Postgres connector, and the premium snowflake connector that comes with Grafana premium. Without these two being fixed, we are going to have to resort to using grafana cloud, which I'm not keen to do. Are these known issues?

    bug 
    opened by jamesadinmo 0
  • query rows hard limit 10000

    Is there any reason to keep 10000 as a hard limit for query rows?

    https://github.com/michelin/snowflake-grafana-datasource/blob/master/pkg/query.go#L19

    In a time series with 5 minute samples 20 days worth for 3 nodes would be 17,280 rows as an example

    opened by kranthikirang 0
  • GROUP BY 2 columns results column name with field name + label

    Grafana Version: 9.1.1 Plugin version: v1.2.0

    Executed Query:

    SELECT $__timeGroup(DATETIME,$__interval) as Time, HOST as metric, (max(USED/1024*1024)/max(TOTAL/1024*1024)*100) AS used_percent FROM SERVER_MEMORY WHERE $__timeFilter(DATETIMEUTC,$__interval) GROUP BY 1,2 ORDER BY 1

    Everything looks good except data comes like below

    [screenshots]

    USED_PERCENT is appended with HOST (included with a space). Using flowchart plugin which grabs the field name always displays as "USED_PERCENT" for all the hosts

    [screenshot]

    Expected Behavior:

    We should see labels as field names, i.e., each HOST should be displayed as the column name; instead of "USED_PERCENT {HOST}" it should be just "HOST".

    opened by kranthikirang 0
  • Can we sign the Snowflake plug-in

    This is not necessarily an issue, but I wasn't sure how to reach out to the contributors here...

    Is it possible to sign this data source?

    Thanks,

    Kurt

    enhancement 
    opened by anybodysguest 0
Releases: v1.2.0
Owner: Michelin (Michelin Open Source Platform)