A library for writing backup programs in Golang


Barkup


godoc.org/github.com/keighl/barkup

Barkup is a library for backing things up. It provides tools for writing bare-bones backup programs in Go. The library is broken out into exporters and storers. Currently, those are:

Exporters: MySQL, Postgres, RethinkDB

Storers: S3

Quick Example (mysql to s3)

Here's a Go program that backs up a MySQL database (exporter) to an S3 bucket (storer) using barkup. The resulting executable is dropped on a server somewhere and scheduled to run via cron.

package main

import "github.com/keighl/barkup"

func main() {

    // Configure a MySQL exporter
    mysql := &barkup.MySQL{
        Host:     "localhost",
        Port:     "3306",
        DB:       "production_db",
        User:     "root",
        Password: "cheese",
    }

    // Configure a S3 storer
    s3 := &barkup.S3{
        Region:       "us-east-1",
        Bucket:       "backups",
        AccessKey:    "*************",
        ClientSecret: "**********************",
    }

    // Export the database, and send it to the
    // bucket in the `db_backups` folder
    err := mysql.Export().To("db_backups/", s3)
    if err != nil {
        panic(err)
    }
}
Build the program, then schedule the binary via crontab:

$ go build

@hourly /path/to/backup-program

Cron

Because barkup executes system commands like tar, mysqldump, etc., you may run into issues where crontab can't find them. The cron shell environment is stripped down, but you can extend the PATH environment variable in your crontab so those commands can be found.

PATH=$PATH:/usr/local/bin:/usr/bin:/bin

@hourly /path/to/backup-program

This is especially the case for the RethinkDB exporter, whose rethinkdb dump command executes system commands itself ("exec inception").


Exporters

Exporters provide a common interface for backing things up via the Export() method, which writes an export file to the local disk and returns an ExportResult that can be passed on to a storer, or moved to another location on the disk.

// Exporter
mysql := &barkup.MySQL{...}

// Export Result
result := mysql.Export()
if result.Error != nil { panic(result.Error) }

// Send it to a directory path on a storer
err := result.To("backups/", storer)

// OR just move it somewhere on the local disk
err := result.To("~/backups/", nil)

MySQL

The MySQL exporter uses mysqldump to produce a gzipped archive of your database. mysqldump must be installed on your system (it probably is if you're using MySQL), and accessible to the user running the final program (again, it probably is).

Usage

mysql := &barkup.MySQL{
  Host: "127.0.0.1",
  Port: "3306",
  DB: "XXXXX",
  User: "XXXXX",
  Password: "XXXXX",
  // Any extra mysqldump options
  Options: []string{"--skip-extended-insert"},
}

// Writes a file `./bu_DBNAME_TIMESTAMP.sql.tar.gz`
result := mysql.Export()

if result.Error != nil { panic(result.Error) }

Postgres

The postgres exporter uses pg_dump to make a gzipped archive of your database. pg_dump must be installed on your system (it probably is if you're using postgres).

Usage

postgres := &barkup.Postgres{
  Host: "127.0.0.1",
  Port: "5432",
  DB: "XXXXXXXX",

  // Not necessary if the program runs as an authorized pg user/role
  Username: "XXXXXXXX",

  // Any extra pg_dump options
  Options: []string{"--no-owner"},
}

// Writes a file `./bu_DBNAME_TIMESTAMP.sql.tar.gz`
result := postgres.Export()

if result.Error != nil { panic(result.Error) }

Connection credentials

You have two options for allowing barkup to connect to your DB.

Add a ~/.pgpass file for the account that will run the backup program.

Or, run the backup program from an authenticated user, like postgres:

$ sudo -i -u postgres
$ ./backup-program
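A minimal sketch of the ~/.pgpass option. The database and user names here (production_db, backup_user) are hypothetical; pg_dump reads the file automatically, but only when it is restricted to the owner:

```shell
# One line per connection: hostname:port:database:username:password
echo '127.0.0.1:5432:production_db:backup_user:s3cret' >> ~/.pgpass
# pg_dump ignores the file unless only the owner can read/write it
chmod 600 ~/.pgpass
```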

RethinkDB

The RethinkDB exporter uses rethinkdb dump to make a gzipped archive of your cluster. The rethinkdb dump tool must be installed on your system:

$ sudo pip install rethinkdb

Usage

rethink := &barkup.RethinkDB{
  Name: "nightly",
  Connection: "0.0.0.0:28015",
  // You can specify specific databases and/or tables to dump (by default it dumps your whole cluster)
  Targets: []string{"site", "leads.contacts"},
}

// Writes a file `./bu_nightly_TIMESTAMP.tar.gz`
result := rethink.Export()
if result.Error != nil { panic(result.Error) }

Storers

Storers take an ExportResult object and provide a common interface for moving a backup to someplace safe.

// For chaining an ExportResult
err := someExportResult.To("backups/", someStorer)

// OR
err := someStorer.Store(someExportResult, "backups/")

S3

The S3 storer puts the exported file into a bucket at a specified directory. Note: you shouldn't use your global AWS credentials for this. Instead, create bucket-specific credentials via IAM.

Usage

s3 := &barkup.S3{
  Region: "us-east-1",
  Bucket: "backups",
  AccessKey: "XXXXXXXXXXXXX",
  ClientSecret: "XXXXXXXXXXXXXXXXXXXXX",
}

err := someExportResult.To("data/", s3)
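The bucket-specific credentials mentioned above can be granted with an IAM policy along these lines. This is a sketch, not a tested policy; the bucket name matches the example above, and a write-only backup user needs little more than PutObject:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::backups/*"
    }
  ]
}
```

Attach the policy to a dedicated IAM user and use that user's access key and secret in the barkup.S3 config.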

Region IDs

  • us-east-1
  • us-west-1
  • us-west-2
  • eu-west-1
  • ap-southeast-1
  • ap-southeast-2
  • ap-northeast-1
  • sa-east-1

Issues
  • Add support for softlayer, rackspace, openstack, and swift backups

    Tested it out with Softlayer backups; the external dependency being used is generic enough to support the others, so they should work fine.

    @keighl let me know what you think

    opened by saifabid 3
  • [WIP] Make dbname optional

    Hello, I decided to write an improved version of the mysql exporter.

    What: Made the mysql exporter able to back up all databases into separate sql files.
    Why: When I back up mysql, I find it very convenient to have one file per database. The current exporter only supported one file through the --all-databases flag.
    How: Connect to the database and retrieve the database names, then loop through them calling mysqldump for each.

    This is WIP, I didn't write tests and I am new to Go. I would like to hear your thoughts. :)

    opened by blacksails 1
  • Package goamz/s3 was moved, so barkup is not compiling

    Hi, I just did a go get github.com/keighl/barkup and got this:

    go get -u -v github.com/keighl/barkup
    github.com/keighl/barkup (download)
    Fetching https://launchpad.net/goamz/aws?go-get=1
    Parsing meta tags from https://launchpad.net/goamz/aws?go-get=1 (status code 404)
    launchpad.net/goamz (download)
    go: missing Bazaar command. See https://golang.org/s/gogetcmd
    package launchpad.net/goamz/aws: exec: "bzr": executable file not found in $PATH
    Fetching https://launchpad.net/goamz/s3?go-get=1
    Parsing meta tags from https://launchpad.net/goamz/s3?go-get=1 (status code 404)
    package launchpad.net/goamz/s3: cannot find package "launchpad.net/goamz/s3"

    It looks like the goamz/s3 library was moved from launchpad to github, so barkup is not compiling correctly when fetched by go.

    opened by afboteros 0
  • Log CombinedOutput instead of just Output

    Output() only shows the messages written to stdout but not stderr. Switching to CombinedOutput() would log both stdout and stderr to result.Error.CmdOutput.

    opened by maddie 1
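The CombinedOutput issue is easy to demonstrate. The sketch below shows the behavior the issue asks for: exec.Command's CombinedOutput() captures stdout and stderr together, which is what you want logged when a dump command fails. The runBackupCmd helper and the simulated mysqldump error are illustrations, not barkup code:

```go
package main

import (
	"fmt"
	"os/exec"
)

// runBackupCmd runs a command and returns everything it printed on
// stdout *and* stderr -- what result.Error.CmdOutput would contain if
// barkup used CombinedOutput() instead of Output().
func runBackupCmd(name string, args ...string) (string, error) {
	out, err := exec.Command(name, args...).CombinedOutput()
	return string(out), err
}

func main() {
	// Simulate a dump tool that reports progress on stdout and an
	// error on stderr; Output() alone would drop the second line.
	out, err := runBackupCmd("sh", "-c",
		"echo dumping; echo 'ERROR 1045: access denied' >&2")
	fmt.Printf("err=%v\n%s", err, out)
}
```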
Owner: Kyle Truscott