DBT Cloud Terraform Provider

Overview

terraform-provider-dbt-cloud

Terraform Provider for DBT Cloud

Primarily focused on managing jobs in DBT Cloud, given what is available via the API; data sources for other concepts are added for convenience. To use this provider, add the following to your Terraform required_providers block, substituting the latest version number.

terraform {
  required_providers {
    dbt = {
      source  = "GtheSheep/dbt-cloud"
      version = "<version>"
    }
  }
}

Authentication

If you want to set the authentication variables explicitly on the provider, you can do so as below, though you will likely want to supply them via a variables.tf file or via configuration in your CI/CD pipeline to keep these credentials safe.

provider "dbt" {
  // required
  account_id = ...
  token      = "..."
}

You can also set them via the environment variables DBT_CLOUD_ACCOUNT_ID and DBT_CLOUD_TOKEN, for account_id and token respectively.
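
For example, in a shell (the values shown are placeholders):

export DBT_CLOUD_ACCOUNT_ID=12345
export DBT_CLOUD_TOKEN="<token>"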

Examples

Check out the examples/ folder for some usage options; these are intended simply to showcase what this provider can do, rather than to demonstrate best practices for any given use case.

Comments
  • Error: Unable to login to DBT Cloud

    Hi, and thanks for this cool project! I'm unable to authenticate, either with the DBT_CLOUD_ACCOUNT_ID and DBT_CLOUD_TOKEN env vars or by explicitly configuring the provider as follows:

    # main.tf
    terraform {
      required_providers {
        dbt = {
          source  = "GtheSheep/dbt-cloud"
          version = "0.0.67"
        }
      }
    }
    
    provider "dbt" {
      account_id = <account_id>
      token      = "<token>"
    }
    
    data "dbt_cloud_project" "dbt_cloud_project" {
      project_id = var.dbt_cloud_project_id
    }
    

    I'm getting an "Error: Unable to login to DBT Cloud" when running terraform apply. I've verified that the account ID and token work with the dbt Cloud API v2:

    curl --request GET \
      --url https://cloud.getdbt.com/api/v2/accounts/$DBT_CLOUD_ACCOUNT_ID/projects/ \
      --header "Content-Type: application/json" \
      --header "Authorization: Token $DBT_CLOUD_TOKEN" \
      | jq .
    

    Debug logs

    2021-10-27T12:49:05.649+0300 [DEBUG] created provider logger: level=debug
    2021-10-27T12:49:05.649+0300 [INFO]  provider: configuring client automatic mTLS
    2021-10-27T12:49:05.735+0300 [DEBUG] provider: starting plugin: path=.terraform/providers/registry.terraform.io/gthesheep/dbt-cloud/0.0.67/linux_amd64/terraform-provider-dbt-cloud_v0.0.67 args=[.terraform/providers/registry.terraform.io/gthesheep/dbt-cloud/0.0.67/linux_amd64/terraform-provider-dbt-cloud_v0.0.67]
    2021-10-27T12:49:05.736+0300 [DEBUG] provider: plugin started: path=.terraform/providers/registry.terraform.io/gthesheep/dbt-cloud/0.0.67/linux_amd64/terraform-provider-dbt-cloud_v0.0.67 pid=22052
    2021-10-27T12:49:05.736+0300 [DEBUG] provider: waiting for RPC address: path=.terraform/providers/registry.terraform.io/gthesheep/dbt-cloud/0.0.67/linux_amd64/terraform-provider-dbt-cloud_v0.0.67
    2021-10-27T12:49:05.746+0300 [INFO]  provider.terraform-provider-dbt-cloud_v0.0.67: configuring server automatic mTLS: timestamp=2021-10-27T12:49:05.746+0300
    2021-10-27T12:49:05.787+0300 [DEBUG] provider: using plugin: version=5
    2021-10-27T12:49:05.787+0300 [DEBUG] provider.terraform-provider-dbt-cloud_v0.0.67: plugin address: address=/tmp/plugin595756387 network=unix timestamp=2021-10-27T12:49:05.787+0300
    2021-10-27T12:49:15.899+0300 [INFO]  backend/local: plan operation completed
    ╷
    │ Error: Unable to login to DBT Cloud
    │ 
    │   with provider["registry.terraform.io/gthesheep/dbt-cloud"],
    │   on main.tf line 10, in provider "dbt":
    │   10: provider "dbt" {
    │ 
    │ unexpected end of JSON input
    ╵
    2021-10-27T12:49:15.901+0300 [DEBUG] provider.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = transport is closing"
    2021-10-27T12:49:15.902+0300 [DEBUG] provider: plugin process exited: path=.terraform/providers/registry.terraform.io/gthesheep/dbt-cloud/0.0.67/linux_amd64/terraform-provider-dbt-cloud_v0.0.67 pid=22052
    2021-10-27T12:49:15.902+0300 [DEBUG] provider: plugin exited
    
    opened by stumelius 25
  • Possibility to create database connection and repository

    Hi,

    First of all, thanks for the repo; it's pretty helpful. I am wondering whether it is also possible to create a database connection and a repository. Let me explain:

    Looking at your Project struct:

    type Project struct {
    	ID                     *int    `json:"id,omitempty"`
    	Name                   string  `json:"name"`
    	DbtProjectSubdirectory *string `json:"dbt_project_subdirectory,omitempty"`
    	ConnectionID           *int    `json:"connection_id,integer,omitempty"`
    	RepositoryID           *int    `json:"repository_id,integer,omitempty"`
    	State                  int     `json:"state"`
    	AccountID              int     `json:"account_id"`
    }
    

    I see that it is possible to pass a repository ID and a connection ID, but I guess those must already exist. Any chance of creating those through this Terraform provider?
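
    For reference, a sketch of what such resources might look like, using the dbt_cloud_connection and dbt_cloud_repository resource names that appear in configurations later on this page (the arguments shown are illustrative):

    resource "dbt_cloud_connection" "example" {
      project_id = dbt_cloud_project.example.id
      type       = "snowflake"
      name       = "example-connection"
      account    = var.snowflake_account
      database   = var.snowflake_database
      warehouse  = var.snowflake_warehouse
      role       = var.snowflake_role
    }

    resource "dbt_cloud_repository" "example" {
      project_id = dbt_cloud_project.example.id
      remote_url = "<git_remote_url>"
    }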

    Thanks for any clarification

    stale 
    opened by edge7 9
  • Fetch `deploy_key` on repository creation

    For one of my projects, I can more fully automate my infrastructure if I can retrieve the public key from dbt. Through a bit of snooping into how dbt builds its own pages, I found that you can do this by passing ?include_related=['deploy_key'] to the repositories endpoint.
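
    A sketch of the call, reusing the auth pattern from the curl example earlier on this page (the exact repositories path is an assumption on my part):

    curl --request GET \
      --url "https://cloud.getdbt.com/api/v2/accounts/$DBT_CLOUD_ACCOUNT_ID/repositories/?include_related=['deploy_key']" \
      --header "Authorization: Token $DBT_CLOUD_TOKEN" \
      | jq .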

    I've not worked on a Terraform provider before, so this was a bit of guesswork on my part. Ideally, I don't think you'd want to include this new query parameter all of the time.

    Happy to clean this up & write some tests if you want to give some feedback.

    opened by mittonface 7
  • Circular dependency: project + connection + repository

    Hey @GtheSheep - thank you very much, this repository is very helpful.

    I'm trying to create a project, a connection and a repository using the resources from this provider, without having to take manual steps via the UI. My setup currently looks like this:

    data "dbt_cloud_project" "default_project" {
      project_id = dbt_cloud_project.project.id
    }
    
    resource "dbt_cloud_project" "project" {
      name          = "My project"

      connection_id = dbt_cloud_connection.snowflake_connection.connection_id
      repository_id = dbt_cloud_repository.project_vcs.repository_id
    }
    
    resource "dbt_cloud_connection" "snowflake_connection" {
      project_id    = data.dbt_cloud_project.default_project.project_id
    
      type        = "snowflake"
      name        = "snowflake_connection_${var.environment}"
      role        = "n/a"
    
      account           = var.snowflake_account
      warehouse         = var.snowflake_warehouse
      database          = var.snowflake_dbt_database
    
      allow_keep_alive  = true
      is_active         = true
    }
    
    resource "dbt_cloud_repository" "project_vcs" {
      project_id    = data.dbt_cloud_project.default_project.project_id
    
      remote_url = "<redacted>"
      is_active = true
    }
    

    When I plan, unfortunately, I get this error:

    ╷
    │ Error: Cycle: module.dbt_cloud_staging.dbt_cloud_repository.project_vcs, module.dbt_cloud_staging.dbt_cloud_connection.snowflake_connection, module.dbt_cloud_staging.dbt_cloud_project.project, module.dbt_cloud_staging.data.dbt_cloud_project.default_project
    │ 
    │ 
    ╵
    

    Am I right in understanding that even though the project has support for a repository and a connection, we cannot actually use them?
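
    One way to break the cycle, sketched below, is to drop the data source, reference dbt_cloud_project.project.id directly from the connection and repository, and attach them via the dbt_cloud_project_connection and dbt_cloud_project_repository link resources that appear in configurations later on this page:

    resource "dbt_cloud_project" "project" {
      name = "My project"
    }

    # dbt_cloud_connection and dbt_cloud_repository as above, but with
    # project_id = dbt_cloud_project.project.id instead of the data source.

    resource "dbt_cloud_project_connection" "link" {
      project_id    = dbt_cloud_project.project.id
      connection_id = dbt_cloud_connection.snowflake_connection.connection_id
    }

    resource "dbt_cloud_project_repository" "link" {
      project_id    = dbt_cloud_project.project.id
      repository_id = dbt_cloud_repository.project_vcs.repository_id
    }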

    opened by felipefrancisco 7
  • Adding environment variables

    I have a question regarding imports. I don't see anything in the documentation about importing existing resources. Has import been implemented and just not added to the docs?
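
    For reference, if import is supported for a resource, the standard Terraform pattern would apply. A sketch, where the composite project_id:resource_id ID format is inferred from IDs like 31429:50092 seen in logs elsewhere on this page rather than from any documentation:

    terraform import dbt_cloud_environment.development 31429:50092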

    opened by Bryan-Meier 7
  • Adding Snowflake token auth

    @GtheSheep, I am curious whether you will be implementing other authentication types besides "password" for credentials. I am asking because we use keypair auth in Snowflake and thus use that in our dbt Cloud credentials.
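
    For illustration, a hypothetical shape if a keypair auth type were added; the "keypair" value and the private-key argument below are speculative, since only auth_type = "password" appears elsewhere on this page:

    resource "dbt_cloud_snowflake_credential" "keypair_credential" {
      project_id  = dbt_cloud_project.dbt_cloud_project.id
      auth_type   = "keypair"                 # hypothetical value
      num_threads = 4
      schema      = "analytics"
      user        = var.snowflake_username
      private_key = var.snowflake_private_key # hypothetical argument
    }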

    opened by Bryan-Meier 6
  • Environment name is not changed

    Hi. The provider successfully created a new development environment named "Development new2", but it fails to update the name to "Development" on a subsequent plan & apply.

    Plan:

    Terraform v1.0.11
    on linux_amd64
    Configuring remote state backend...
    Initializing Terraform configuration...
    dbt_cloud_environment.development: Refreshing state... [id=31429:50092]
    
    Terraform used the selected providers to generate the following execution
    plan. Resource actions are indicated with the following symbols:
      ~ update in-place
    
    Terraform will perform the following actions:
    
      # dbt_cloud_environment.development will be updated in-place
      ~ resource "dbt_cloud_environment" "development" {
            id                = "31429:50092"
          ~ name              = "Development new2" -> "Development"
            # (6 unchanged attributes hidden)
        }
    
    Plan: 0 to add, 1 to change, 0 to destroy.
    

    Apply:

    Terraform v1.0.11
    on linux_amd64
    Initializing plugins and modules...
    dbt_cloud_environment.development: Modifying... [id=31429:50092]
    dbt_cloud_environment.development: Modifications complete after 1s [id=31429:50092]
    
    Apply complete! Resources: 0 added, 1 changed, 0 destroyed.
    
    Outputs:
    
    dbt_cloud_project_name = "Mono Data Platform"
    

    But in the end, the name stays "Development new2".

    opened by lizdeika 5
  • Add self deferring functionality on dbt_cloud_job

    The logic is a little more complicated than for other configurations; it is summarized below:

    Create Job:
    if self_deferring:
      create a POST request to create the job, then send a subsequent POST request to update it with its own ID
    
    Update Job:
    if self_deferring changed:
      if self_deferring == true:
        set deferring_job_id to itself
      else:
        set deferring_job_id to the actual deferring_job_id value in the resource
    
    Reading from remote resource:
    if deferring_job_id == job.ID:
      set self_deferring in ResourceSchema to true
    

    Tested with make test (though I noticed there were no unit tests for jobs, only acceptance tests, which I will hand over to you).
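
    A minimal sketch of that update rule in Go, assuming an illustrative Job struct with ID and DeferringJobId fields (the names here are not necessarily the provider's actual API):

    package jobs

    // Job is an illustrative stand-in for the provider's job model.
    type Job struct {
    	ID             *int
    	DeferringJobId *int
    }

    // applySelfDeferring implements the rule above: when self_deferring is
    // set, the job defers to its own ID; otherwise the explicitly configured
    // deferring job ID is used.
    func applySelfDeferring(job *Job, selfDeferring bool, explicitDeferringJobID *int) {
    	if selfDeferring {
    		job.DeferringJobId = job.ID
    	} else {
    		job.DeferringJobId = explicitDeferringJobID
    	}
    }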

    opened by mwufigma 4
  • package not available for darwin_arm64 on registry.terraform.io

    First I'd just like to say that I'm super excited about the work being done here, thanks a ton for putting in the effort.

    I think it's possible to get around the Terraform registry issue by cloning and building locally, but I'm hoping to wrap this in cdktf and am not quite sure the fix will work there. Do you have plans to release the provider for other platforms?

    opened by bcgalvin 4
  • Resource "dbt_cloud_job" bug

    Resource "dbt_cloud_job" bug

    Hi, it looks like there is a bug when I try to modify a job; the error I receive is below. If I do a destroy and create, everything runs as expected. It is only on update that this happens.

    dbt_cloud_job.job_develop: Modifying... [id=30]
    ╷
    │ Error: Plugin did not respond
    │ 
    │   with dbt_cloud_job.job_develop,
    │   on main.tf line 44, in resource "dbt_cloud_job" "job_develop":
    │   44: resource "dbt_cloud_job" "job_develop" {
    │ 
    │ The plugin encountered an error, and failed to respond to the plugin.(*GRPCProvider).ApplyResourceChange call. The plugin logs may contain more details.
    ╵
    
    Stack trace from the terraform-provider-dbt-cloud_v0.1.0 plugin:
    
    panic: interface conversion: interface {} is nil, not bool
    
    goroutine 9 [running]:
    github.com/gthesheep/terraform-provider-dbt-cloud/pkg/resources.resourceJobUpdate({0x103314f68, 0x14000414d20}, 0x1400030f600, {0x1032e4240, 0x140002335c0})
            github.com/gthesheep/terraform-provider-dbt-cloud/pkg/resources/job.go:327 +0x12d4
    github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).update(0x140003ae8c0, {0x103314fa0, 0x140000936e0}, 0x1400030f600, {0x1032e4240, 0x140002335c0})
            github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:376 +0x118
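
    The panic is the classic unchecked type assertion on a nil interface{} value. A minimal sketch of the defensive comma-ok form (illustrative, not the provider's actual fix):

    package main

    import "fmt"

    // readBool shows the comma-ok form of the type assertion that panics in
    // the stack trace above: it never panics when the value is nil.
    func readBool(raw interface{}) bool {
    	b, ok := raw.(bool) // ok is false when raw is nil or not a bool
    	if !ok {
    		return false
    	}
    	return b
    }

    func main() {
    	fmt.Println(readBool(nil))  // false, instead of panicking
    	fmt.Println(readBool(true)) // true
    }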
    
    
    opened by hadiel05 3
  • Add support for custom host_url

    Thanks very much for this provider; it is highly useful. I would like to suggest adding a third option, host_url, to the client so that users with a custom URL (https://cloud.<company>.getdbt.com) can also use it. Thanks and regards!
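
    For illustration, a hypothetical provider block if such an option were added (the host_url argument below is the suggestion, not an existing provider argument):

    provider "dbt" {
      account_id = var.dbt_cloud_account_id
      token      = var.dbt_cloud_token
      host_url   = "https://cloud.<company>.getdbt.com" # hypothetical argument
    }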

    opened by octavio-melo 3
  • Feature - Add support for dbt cloud artefacts (source freshness and docs)

    Closes #98

    This is my attempt at adding links to source freshness and docs jobs (which I called artefacts) at the project level. I also updated the data source to contain the job ids for docs and source freshness.

    Behind the scenes, the implementation is quite close to the project_connection one, calling the xxxProject() methods.

    One thing that I wasn't sure how to manage properly was getting the IDs set to nil instead of 0 when there is no job_id defined.
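
    A minimal sketch of the nil-vs-0 distinction, with illustrative types: with a *int field and omitempty, as in the Project struct shown earlier on this page, an unset ID is omitted from the JSON payload entirely, while a plain int would serialize as 0:

    package main

    import (
    	"encoding/json"
    	"fmt"
    )

    // ProjectArtefacts and its field names are illustrative only.
    type ProjectArtefacts struct {
    	DocsJobID      *int `json:"docs_job_id,omitempty"`
    	FreshnessJobID *int `json:"freshness_job_id,omitempty"`
    }

    func main() {
    	docs := 123
    	withDocs, _ := json.Marshal(ProjectArtefacts{DocsJobID: &docs})
    	without, _ := json.Marshal(ProjectArtefacts{})
    	fmt.Println(string(withDocs)) // {"docs_job_id":123}
    	fmt.Println(string(without))  // {}
    }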

    This is still WIP, but I'd be keen for any feedback so far.

    Still to be done:

    • [x] add tests (and figure out how those work in Go/Terraform)
    • [x] update docs
    opened by b-per 4
  • Feature - Add ability to override env var values at the job level

    dbt Cloud allows overriding the environment variable values at the job level. Adding this capability would require modifying the existing dbt_cloud_job resource (or creating new resources?).

    Endpoints and payloads

    I have some working code in Python but not in Go. Here are the specs for the CRUD operations; a sketch of the Create call in Go follows them.

    Get

    • HTTP Method: GET
    • url: "{base_url}/api/v3/accounts/{account_id}/projects/{project_id}/environment-variables/job/?job_definition_id={job_id}"

    Create

    • HTTP Method: POST
    • url: "{base_url}/api/v3/accounts/{account_id}/projects/{project_id}/environment-variables/"
    • payload:
    {
        "name": "<env_var_name>",
        "type": "job",
        "display_value": null,
        "job_definition_id": <job_id>,
        "id": null,
        "project_id":  <project_id>,
        "account_id":  <account_id>,
        "raw_value": "<value_for_the_override>"
    }
    

    Update

    • HTTP Method: POST
    • url: "{base_url}/api/v3/accounts/{account_id}/projects/{project_id}/environment-variables/{env_var_override_id}/"
    • payload:
    {
        "name": "<env_var_name>",
        "type": "job",
        "display_value": null,
        "job_definition_id": <job_id>,
        "id": <env_var_override_id>,
        "project_id": <project_id>,
        "account_id": <account_id>,
        "raw_value": "<value_for_the_override>"
    }
    

    Delete

    • HTTP Method: DELETE
    • url: "{base_url}/api/v3/accounts/{account_id}/projects/{project_id}/environment-variables/{env_var_override_id}/"
      • where env_var_override_id is the ID of the existing override (obtained from the GET)
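
    A sketch of the Create call in Go, assuming the endpoint and payload above are accurate; the type and function names here are illustrative, not the provider's actual client code:

    package dbtcloud

    import (
    	"bytes"
    	"encoding/json"
    	"fmt"
    	"net/http"
    )

    // jobEnvVarOverride mirrors the Create payload documented above; the nil
    // pointer fields marshal as JSON null.
    type jobEnvVarOverride struct {
    	Name            string  `json:"name"`
    	Type            string  `json:"type"` // always "job" for job-level overrides
    	DisplayValue    *string `json:"display_value"`
    	JobDefinitionID int     `json:"job_definition_id"`
    	ID              *int    `json:"id"`
    	ProjectID       int     `json:"project_id"`
    	AccountID       int     `json:"account_id"`
    	RawValue        string  `json:"raw_value"`
    }

    // CreateJobOverride POSTs a job-level environment variable override.
    func CreateJobOverride(baseURL, token string, accountID, projectID, jobID int, name, value string) error {
    	payload := jobEnvVarOverride{
    		Name:            name,
    		Type:            "job",
    		JobDefinitionID: jobID,
    		ProjectID:       projectID,
    		AccountID:       accountID,
    		RawValue:        value,
    	}
    	body, err := json.Marshal(payload)
    	if err != nil {
    		return err
    	}
    	url := fmt.Sprintf("%s/api/v3/accounts/%d/projects/%d/environment-variables/", baseURL, accountID, projectID)
    	req, err := http.NewRequest(http.MethodPost, url, bytes.NewReader(body))
    	if err != nil {
    		return err
    	}
    	req.Header.Set("Content-Type", "application/json")
    	req.Header.Set("Authorization", "Token "+token)
    	resp, err := http.DefaultClient.Do(req)
    	if err != nil {
    		return err
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode >= 300 {
    		return fmt.Errorf("unexpected status: %s", resp.Status)
    	}
    	return nil
    }
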
    opened by b-per 1
  • Databricks connection type

    Is there a way to use this provider with a Databricks connection on a project? Looking through the code, it looks like only Snowflake and BigQuery are supported: https://github.com/GtheSheep/terraform-provider-dbt-cloud/blob/86c89283157915277916ae315beb11fe5ea9db7a/pkg/resources/connection.go#L14

    opened by hadiel05 9
  • [Crash] Plugin error when creating dbt_cloud_snowflake_credential

    Context: I am trying to define the following dbt Cloud resources:

    1. project
    2. database (snowflake) connection
    3. environment
    4. project repo (connected to github)
    5. credentials (snowflake)
    6. job

    terraform apply threw an error when I provided the password parameter to dbt_cloud_snowflake_credential, and it ran fine without it.

    Code:

    resource "dbt_cloud_project" "dbt_cloud_project" {
        name = "test-project"
        dbt_project_subdirectory = "dbt"
    }
    
    resource "dbt_cloud_connection" "dbt_cloud_connection" {
        name = "snowflake-sandbox-connection"
        account = "${var.snowflake_account_id}.${var.snowflake_region}"
        database = snowflake_database.database.name
        project_id = dbt_cloud_project.dbt_cloud_project.id
        role = snowflake_role.data_scientist.name
        type = "snowflake"
        warehouse = local.workspace.warehouse_size
    }
    
    resource "dbt_cloud_project_connection" "dbt_cloud_project_connection" {
        connection_id = dbt_cloud_connection.dbt_cloud_connection.connection_id
        project_id = dbt_cloud_project.dbt_cloud_project.id
    }
    
    resource "dbt_cloud_environment" "dev_environment" {
        dbt_version = "1.0.0"
        name = "Development"
        project_id = dbt_cloud_project.dbt_cloud_project.id
        type = "development"
    }
    
    resource "dbt_cloud_repository" "github_repo" {
        project_id = dbt_cloud_project.dbt_cloud_project.id
        remote_url = "[email protected]:xxx/xxx.git"
    }
    
    resource "dbt_cloud_project_repository" "github_repo" {
        project_id = dbt_cloud_project.dbt_cloud_project.id
        repository_id = dbt_cloud_repository.github_repo.repository_id
    }
    
    resource "dbt_cloud_snowflake_credential" "snowflake_credential" {
        auth_type = "password"
        num_threads = 4
        project_id = dbt_cloud_project.dbt_cloud_project.id
        schema = snowflake_schema.schema.name
        user = var.snowflake_username
        password = var.snowflake_password
    }
    
    resource "dbt_cloud_job" "test_job" {
        environment_id = dbt_cloud_environment.dev_environment.environment_id
        execute_steps = ["dbt compile"]
        name = "test_job"
        project_id = dbt_cloud_project.dbt_cloud_project.id
        triggers = {schedule: true}
        schedule_days = [1]
        schedule_hours = [0]
    }
    

    Terraform plan:

      # dbt_cloud_snowflake_credential.snowflake_credential will be updated in-place
      ~ resource "dbt_cloud_snowflake_credential" "snowflake_credential" {
            id            = "146395:154576"
          + password      = (sensitive value)
            # (7 unchanged attributes hidden)
        }
    

    Error log:

    dbt_cloud_snowflake_credential.snowflake_credential: Modifying... [id=146395:154576]
    dbt_cloud_job.test_job: Modifying... [id=117792]
    ╷
    │ Error: Plugin did not respond
    │ 
    │   with dbt_cloud_snowflake_credential.snowflake_credential,
    │   on dbt-cloud.tf line 38, in resource "dbt_cloud_snowflake_credential" "snowflake_credential":
    │   38: resource "dbt_cloud_snowflake_credential" "snowflake_credential" {
    │ 
    │ The plugin encountered an error, and failed to respond to the plugin.(*GRPCProvider).ApplyResourceChange call. The plugin logs may contain more
    │ details.
    ╵
    ╷
    │ Error: Plugin did not respond
    │ 
    │   with dbt_cloud_job.test_job,
    │   on dbt-cloud.tf line 47, in resource "dbt_cloud_job" "test_job":
    │   47: resource "dbt_cloud_job" "test_job" {
    │ 
    │ The plugin encountered an error, and failed to respond to the plugin.(*GRPCProvider).ApplyResourceChange call. The plugin logs may contain more
    │ details.
    ╵
    
    Stack trace from the terraform-provider-dbt-cloud_v0.1.0 plugin:
    
    panic: interface conversion: interface {} is nil, not bool
    
    goroutine 39 [running]:
    github.com/gthesheep/terraform-provider-dbt-cloud/pkg/resources.resourceJobUpdate({0x18b3b08, 0xc0005431a0}, 0xc00040ce00, {0x179c200, 0xc00042a900})
            github.com/gthesheep/terraform-provider-dbt-cloud/pkg/resources/job.go:326 +0x1045
    github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).update(0xc000186fc0, {0x18b3b40, 0xc000606a80}, 0xd, {0x179c200, 0xc00042a900})
            github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:376 +0x12e
    github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).Apply(0xc000186fc0, {0x18b3b40, 0xc000606a80}, 0xc000544f70, 0xc00040cc80, {0x179c200, 0xc00042a900})
            github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:482 +0x785
    github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*GRPCProviderServer).ApplyResourceChange(0xc0003923f0, {0x18b3a98, 0xc000308000}, 0xc00043a370)
            github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/grpc_provider.go:1021 +0xdaa
    github.com/hashicorp/terraform-plugin-go/tfprotov5/tf5server.(*server).ApplyResourceChange(0xc000240640, {0x18b3b40, 0xc000606390}, 0xc0003da150)
            github.com/hashicorp/[email protected]/tfprotov5/tf5server/server.go:812 +0x56b
    github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/tfplugin5._Provider_ApplyResourceChange_Handler({0x1787be0, 0xc000240640}, {0x18b3b40, 0xc000606390}, 0xc000542180, 0x0)
            github.com/hashicorp/[email protected]/tfprotov5/internal/tfplugin5/tfplugin5_grpc.pb.go:385 +0x170
    google.golang.org/grpc.(*Server).processUnaryRPC(0xc0002aa8c0, {0x18c0818, 0xc0004024e0}, 0xc00060c480, 0xc0003a2150, 0x1d7ba60, 0x0)
            google.golang.org/[email protected]/server.go:1282 +0xccf
    google.golang.org/grpc.(*Server).handleStream(0xc0002aa8c0, {0x18c0818, 0xc0004024e0}, 0xc00060c480, 0x0)
            google.golang.org/[email protected]/server.go:1619 +0xa2a
    google.golang.org/grpc.(*Server).serveStreams.func1.2()
            google.golang.org/[email protected]/server.go:921 +0x98
    created by google.golang.org/grpc.(*Server).serveStreams.func1
            google.golang.org/[email protected]/server.go:919 +0x294
    
    Error: The terraform-provider-dbt-cloud_v0.1.0 plugin crashed!
    
    This is always indicative of a bug within the plugin. It would be immensely
    helpful if you could report the crash with the plugin's maintainers so that it
    can be fixed. The output above should help diagnose the issue.
    
    stale 
    opened by alex-ld 14