
gpbackup-s3-plugin

Using the S3 Storage Plugin with gpbackup and gprestore

The S3 plugin lets you use an Amazon Simple Storage Service (Amazon S3) location to store and retrieve backups when you run gpbackup and gprestore.

To use the S3 plugin, you specify the location of the plugin executable, the AWS connection credentials, and the backup location in a configuration file. When you run gpbackup or gprestore, you specify the configuration file with the --plugin-config option.

If you perform a backup operation with the gpbackup option --plugin-config, you must also specify the --plugin-config option when you restore the backup with gprestore.

The S3 plugin supports both AWS and custom storage servers that implement the S3 interface.
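For a custom storage server, you point the plugin at the server with the endpoint option described in the configuration section below; region is ignored when an endpoint is given. A minimal sketch of the relevant options, assuming a hypothetical on-premises endpoint URL:

options:
  endpoint: https://s3.storage.example.com:9000
  aws_access_key_id: <access-key>
  aws_secret_access_key: <secret-key>
  bucket: <s3-bucket>
  folder: <s3-location>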

Prerequisites

The project requires the Go programming language, version 1.13 or higher. Follow the Go documentation for installation, usage, and configuration instructions.

Downloading

go get github.com/greenplum-db/gpbackup-s3-plugin/...

Building and installing binaries

Change your current working directory to the gpbackup-s3-plugin source directory downloaded above.
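With the default GOPATH layout used by go get, the source lands under $HOME/go/src; a minimal sketch, assuming that default:

cd $HOME/go/src/github.com/greenplum-db/gpbackup-s3-plugin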

Build

make build

This will build the gpbackup_s3_plugin binary in $HOME/go/bin.
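To sanity-check the build, you can invoke the binary directly. gpbackup storage plugins implement a plugin_api_version command, so the following should print the supported plugin API version (a quick check, assuming the $HOME/go/bin path from the build step):

$HOME/go/bin/gpbackup_s3_plugin plugin_api_version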

Install

make install

This will install the gpbackup_s3_plugin binary on all the segment hosts. Note that the Greenplum Database environment must be sourced (via greenplum_path.sh) for this to work.
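A minimal sketch of that sequence, assuming greenplum_path.sh is in the typical /usr/local/greenplum-db install location:

source /usr/local/greenplum-db/greenplum_path.sh
make install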

Test

make test

Runs the unit tests

S3 Storage Plugin Configuration File Format

The configuration file specifies the absolute path to the gpbackup_s3_plugin executable, AWS connection credentials, and S3 location.

The configuration file must be a valid YAML document in the following format:

executablepath: <absolute-path-to-gpbackup_s3_plugin>
options: 
  region: <aws-region>
  endpoint: <s3-endpoint>
  aws_access_key_id: <aws-user-id>
  aws_secret_access_key: <aws-user-id-key>
  bucket: <s3-bucket>
  folder: <s3-location>
  encryption: [on|off]
  http_proxy: <http-proxy>

executablepath is the absolute path to the plugin executable (e.g., use the fully expanded path of $GPHOME/bin/gpbackup_s3_plugin).

The s3 plugin supports the following options:

| Option Name | Description |
| --- | --- |
| region | AWS region (ignored if endpoint is specified) |
| endpoint | Endpoint of a server implementing the S3 interface |
| aws_access_key_id | AWS S3 ID to access the S3 bucket location that stores backup files |
| aws_secret_access_key | AWS S3 passcode for the S3 ID to access the S3 bucket location |
| bucket | Name of the S3 bucket. The bucket must exist with the necessary permissions |
| folder | S3 location for backups. During a backup operation, the plugin creates the S3 location if it does not exist in the S3 bucket |
| encryption | Enable or disable SSL encryption to connect to S3. Valid values are on and off. Default is on |
| http_proxy | Your HTTP proxy URL |
| backup_max_concurrent_requests | Concurrency level for each file's backup request |
| backup_multipart_chunksize | Maximum buffer/chunk size for multipart transfers during backup |
| restore_max_concurrent_requests | Concurrency level for each file's restore request |
| restore_multipart_chunksize | Maximum buffer/chunk size for multipart transfers during restore |
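The four transfer-tuning options go in the same options block as the connection settings. A minimal sketch with illustrative values (these are examples, not documented defaults):

options:
  # region, credentials, bucket, and folder as shown above
  backup_max_concurrent_requests: 6
  backup_multipart_chunksize: 100MB
  restore_max_concurrent_requests: 6
  restore_multipart_chunksize: 100MB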

Example

This is an example S3 storage plugin configuration file, named s3-test-config.yaml, that is used in the gpbackup example command below. As noted above, replace $GPHOME in executablepath with its fully expanded value.

executablepath: $GPHOME/bin/gpbackup_s3_plugin
options: 
  region: us-west-2
  aws_access_key_id: test-s3-user
  aws_secret_access_key: asdf1234asdf
  bucket: gpdb-backup
  folder: test/backup3

This gpbackup example backs up the database demo using the S3 storage plugin. The absolute path to the S3 storage plugin configuration file is /home/gpadmin/s3-test-config.yaml.

gpbackup --dbname demo --single-data-file --plugin-config /home/gpadmin/s3-test-config.yaml

The S3 storage plugin writes the backup files to this S3 location in the AWS region us-west-2.

gpdb-backup/test/backup3/backups/YYYYMMDD/YYYYMMDDHHMMSS/
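Because the backup was taken with --plugin-config, gprestore must be given the same configuration file (see the note at the top of this page). A minimal sketch, with the timestamp placeholder standing in for the actual backup timestamp:

gprestore --timestamp YYYYMMDDHHMMSS --plugin-config /home/gpadmin/s3-test-config.yaml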

Notes

The S3 storage plugin application must be in the same location on every Greenplum Database host. The configuration file is required only on the coordinator host.

Using Amazon S3 to back up and restore data requires an Amazon AWS account with access to the Amazon S3 bucket. The Amazon S3 bucket permissions required are Upload/Delete for the S3 user ID that uploads the files and Open/Download and View for the S3 user ID that accesses the files.

