WIP: Add locking of borgmatic config file #254

.drone.yml (70 changed lines)

```diff
@@ -2,52 +2,112 @@
 kind: pipeline
 name: python-3-5-alpine-3-10

+services:
+  - name: postgresql
+    image: postgres:11.6-alpine
+    environment:
+      POSTGRES_PASSWORD: test
+      POSTGRES_DB: test
+  - name: mysql
+    image: mariadb:10.3
+    environment:
+      MYSQL_ROOT_PASSWORD: test
+      MYSQL_DATABASE: test
+
 steps:
   - name: build
     image: python:3.5-alpine3.10
     pull: always
     commands:
-      - scripts/run-tests
+      - scripts/run-full-tests
 ---
 kind: pipeline
 name: python-3-6-alpine-3-10

+services:
+  - name: postgresql
+    image: postgres:11.6-alpine
+    environment:
+      POSTGRES_PASSWORD: test
+      POSTGRES_DB: test
+  - name: mysql
+    image: mariadb:10.3
+    environment:
+      MYSQL_ROOT_PASSWORD: test
+      MYSQL_DATABASE: test
+
 steps:
   - name: build
     image: python:3.6-alpine3.10
     pull: always
     commands:
-      - scripts/run-tests
+      - scripts/run-full-tests
 ---
 kind: pipeline
 name: python-3-7-alpine-3-10

+services:
+  - name: postgresql
+    image: postgres:11.6-alpine
+    environment:
+      POSTGRES_PASSWORD: test
+      POSTGRES_DB: test
+  - name: mysql
+    image: mariadb:10.3
+    environment:
+      MYSQL_ROOT_PASSWORD: test
+      MYSQL_DATABASE: test
+
 steps:
   - name: build
     image: python:3.7-alpine3.10
     pull: always
     commands:
-      - scripts/run-tests
+      - scripts/run-full-tests
 ---
 kind: pipeline
 name: python-3-7-alpine-3-7

+services:
+  - name: postgresql
+    image: postgres:10.11-alpine
+    environment:
+      POSTGRES_PASSWORD: test
+      POSTGRES_DB: test
+  - name: mysql
+    image: mariadb:10.1
+    environment:
+      MYSQL_ROOT_PASSWORD: test
+      MYSQL_DATABASE: test
+
 steps:
   - name: build
     image: python:3.7-alpine3.7
     pull: always
     commands:
-      - scripts/run-tests
+      - scripts/run-full-tests
 ---
 kind: pipeline
 name: python-3-8-alpine-3-10

+services:
+  - name: postgresql
+    image: postgres:11.6-alpine
+    environment:
+      POSTGRES_PASSWORD: test
+      POSTGRES_DB: test
+  - name: mysql
+    image: mariadb:10.3
+    environment:
+      MYSQL_ROOT_PASSWORD: test
+      MYSQL_DATABASE: test
+
 steps:
   - name: build
     image: python:3.8-alpine3.10
     pull: always
     commands:
-      - scripts/run-tests
+      - scripts/run-full-tests
 ---
 kind: pipeline
 name: documentation
```
NEWS (45 changed lines)

```diff
@@ -1,3 +1,48 @@
+1.4.21.dev0
+ * #268: Override particular configuration options from the command-line via "--override" flag. See
+   the documentation for more information:
+   https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/#configuration-overrides
+ * #270: Only trigger "on_error" hooks and monitoring failures for "prune", "create", and "check"
+   actions, and not for other actions.
+ * When pruning with verbosity level 1, list pruned and kept archives. Previously, this information
+   was only shown at verbosity level 2.
+
+1.4.20
+ * Fix repository probing during "borgmatic init" to respect verbosity flag and remote_path option.
+ * #249: Update Healthchecks/Cronitor/Cronhub monitoring integrations to fire for "check" and
+   "prune" actions, not just "create".
+
+1.4.19
+ * #259: Optionally change the internal database dump path via "borgmatic_source_directory" option
+   in location configuration section.
+ * #271: Support piping "borgmatic list" output to grep by logging certain log levels to console
+   stdout and others to stderr.
+ * Retain colored output when piping or redirecting in an interactive terminal.
+ * Add end-to-end tests for database dump and restore. These are run on developer machines with
+   Docker Compose for approximate parity with continuous integration tests.
+
+1.4.18
+ * Fix "--repository" flag to accept relative paths.
+ * Fix "borgmatic umount" so it only runs Borg once instead of once per repository / configuration
+   file.
+ * #253: Mount whole repositories via "borgmatic mount" without any "--archive" flag.
+ * #269: Filter listed paths via "borgmatic list --path" flag.
+
+1.4.17
+ * #235: Pass extra options directly to particular Borg commands, handy for Borg options that
+   borgmatic does not yet support natively. Use "extra_borg_options" in the storage configuration
+   section.
+ * #266: Attempt to repair any inconsistencies found during a consistency check via
+   "borgmatic check --repair" flag.
+
+1.4.16
+ * #256: Fix for "before_backup" hook not triggering an error when the command contains "borg" and
+   has an exit code of 1.
+ * #257: Fix for garbled Borg file listing when using "borgmatic create --progress" with
+   verbosity level 1 or 2.
+ * #260: Fix for missing Healthchecks monitoring payload or HTTP 500 due to incorrect unicode
+   encoding.
+
 1.4.15
  * Fix for database dump removal incorrectly skipping some database dumps.
  * #123: Support for mounting an archive as a FUSE filesystem via "borgmatic mount" action, and
```
```diff
@@ -1,7 +1,7 @@
 import logging

 from borgmatic.borg import extract
-from borgmatic.execute import execute_command
+from borgmatic.execute import execute_command, execute_command_without_capture

 DEFAULT_CHECKS = ('repository', 'archives')
 DEFAULT_PREFIX = '{hostname}-'

@@ -91,23 +91,23 @@ def check_archives(
     consistency_config,
     local_path='borg',
     remote_path=None,
+    repair=None,
     only_checks=None,
 ):
     '''
     Given a local or remote repository path, a storage config dict, a consistency config dict,
-    local/remote commands to run, and an optional list of checks to use instead of configured
-    checks, check the contained Borg archives for consistency.
+    local/remote commands to run, whether to attempt a repair, and an optional list of checks
+    to use instead of configured checks, check the contained Borg archives for consistency.

     If there are no consistency checks to run, skip running them.
     '''
     checks = _parse_checks(consistency_config, only_checks)
     check_last = consistency_config.get('check_last', None)
     lock_wait = None
     extra_borg_options = storage_config.get('extra_borg_options', {}).get('check', '')

     if set(checks).intersection(set(DEFAULT_CHECKS + ('data',))):
         lock_wait = storage_config.get('lock_wait', None)

         verbosity_flags = ()
         if logger.isEnabledFor(logging.INFO):

@@ -119,13 +119,21 @@ def check_archives(
         full_command = (
             (local_path, 'check')
+            + (('--repair',) if repair else ())
            + _make_check_flags(checks, check_last, prefix)
-            + remote_path_flags
-            + lock_wait_flags
+            + (('--remote-path', remote_path) if remote_path else ())
+            + (('--lock-wait', str(lock_wait)) if lock_wait else ())
            + verbosity_flags
            + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
            + (repository,)
        )

+        # The Borg repair option triggers an interactive prompt, which won't work when output is
+        # captured.
+        if repair:
+            execute_command_without_capture(full_command, error_on_warnings=True)
+            return
+
        execute_command(full_command, error_on_warnings=True)

    if 'extract' in checks:
```
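The conditional-tuple style used above to assemble the `borg check` invocation can be sketched in isolation. This is a simplified stand-in for `check_archives` (verbosity, prefix, and extra-option handling omitted), not the real function:

```python
def make_check_command(repository, repair=False, remote_path=None, lock_wait=None,
                       local_path='borg'):
    # Compose the command as a tuple, contributing each flag group only
    # when its corresponding option is set; unset options contribute ().
    return (
        (local_path, 'check')
        + (('--repair',) if repair else ())
        + (('--remote-path', remote_path) if remote_path else ())
        + (('--lock-wait', str(lock_wait)) if lock_wait else ())
        + (repository,)
    )
```

Because each flag group is an independent tuple expression, adding a new flag like `--repair` is a one-line change, which is exactly what this hunk does.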
```diff
@@ -104,16 +104,19 @@ def _make_exclude_flags(location_config, exclude_filename=None):
     )


-BORGMATIC_SOURCE_DIRECTORY = '~/.borgmatic'
+DEFAULT_BORGMATIC_SOURCE_DIRECTORY = '~/.borgmatic'


-def borgmatic_source_directories():
+def borgmatic_source_directories(borgmatic_source_directory):
     '''
     Return a list of borgmatic-specific source directories used for state like database backups.
     '''
+    if not borgmatic_source_directory:
+        borgmatic_source_directory = DEFAULT_BORGMATIC_SOURCE_DIRECTORY
+
     return (
-        [BORGMATIC_SOURCE_DIRECTORY]
-        if os.path.exists(os.path.expanduser(BORGMATIC_SOURCE_DIRECTORY))
+        [borgmatic_source_directory]
+        if os.path.exists(os.path.expanduser(borgmatic_source_directory))
         else []
     )

@@ -134,7 +137,8 @@ def create_archive(
     storage config dict, create a Borg archive and return Borg's JSON output (if any).
     '''
     sources = _expand_directories(
-        location_config['source_directories'] + borgmatic_source_directories()
+        location_config['source_directories']
+        + borgmatic_source_directories(location_config.get('borgmatic_source_directory'))
     )

     pattern_file = _write_pattern_file(location_config.get('patterns'))

@@ -150,6 +154,7 @@ def create_archive(
     files_cache = location_config.get('files_cache')
     default_archive_name_format = '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}'
     archive_name_format = storage_config.get('archive_name_format', default_archive_name_format)
     extra_borg_options = storage_config.get('extra_borg_options', {}).get('create', '')

     full_command = (
         (local_path, 'create')

@@ -170,7 +175,11 @@ def create_archive(
         + (('--remote-path', remote_path) if remote_path else ())
         + (('--umask', str(umask)) if umask else ())
         + (('--lock-wait', str(lock_wait)) if lock_wait else ())
-        + (('--list', '--filter', 'AME-') if logger.isEnabledFor(logging.INFO) and not json else ())
+        + (
+            ('--list', '--filter', 'AME-')
+            if logger.isEnabledFor(logging.INFO) and not json and not progress
+            else ()
+        )
         + (('--info',) if logger.getEffectiveLevel() == logging.INFO and not json else ())
         + (
             ('--stats',)

@@ -181,6 +190,7 @@ def create_archive(
         + (('--dry-run',) if dry_run else ())
         + (('--progress',) if progress else ())
         + (('--json',) if json else ())
         + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
         + (
             '{repository}::{archive_name_format}'.format(
                 repository=repository, archive_name_format=archive_name_format

@@ -192,7 +202,7 @@ def create_archive(
     # The progress output isn't compatible with captured and logged output, as progress messes with
     # the terminal directly.
     if progress:
-        execute_command_without_capture(full_command)
+        execute_command_without_capture(full_command, error_on_warnings=False)
         return

     if json:

@@ -202,4 +212,4 @@ def create_archive(
     else:
         output_log_level = logging.INFO

-    return execute_command(full_command, output_log_level)
+    return execute_command(full_command, output_log_level, error_on_warnings=False)
```
```diff
@@ -27,7 +27,7 @@ def extract_last_archive_dry_run(repository, lock_wait=None, local_path='borg',
         + (repository,)
     )

-    list_output = execute_command(full_list_command, output_log_level=None)
+    list_output = execute_command(full_list_command, output_log_level=None, error_on_warnings=False)

     try:
         last_archive_name = list_output.strip().splitlines()[-1]
```
```diff
@@ -39,5 +39,7 @@ def display_archives_info(
     )

     return execute_command(
-        full_command, output_log_level=None if info_arguments.json else logging.WARNING
+        full_command,
+        output_log_level=None if info_arguments.json else logging.WARNING,
+        error_on_warnings=False,
     )
```
```diff
@@ -11,6 +11,7 @@ INFO_REPOSITORY_NOT_FOUND_EXIT_CODE = 2

 def initialize_repository(
     repository,
+    storage_config,
     encryption_mode,
     append_only=None,
     storage_quota=None,

@@ -18,11 +19,17 @@ def initialize_repository(
     remote_path=None,
 ):
     '''
-    Given a local or remote repository path, a Borg encryption mode, whether the repository should
-    be append-only, and the storage quota to use, initialize the repository. If the repository
-    already exists, then log and skip initialization.
+    Given a local or remote repository path, a storage configuration dict, a Borg encryption mode,
+    whether the repository should be append-only, and the storage quota to use, initialize the
+    repository. If the repository already exists, then log and skip initialization.
     '''
-    info_command = (local_path, 'info', repository)
+    info_command = (
+        (local_path, 'info')
+        + (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+        + (('--debug',) if logger.isEnabledFor(logging.DEBUG) else ())
+        + (('--remote-path', remote_path) if remote_path else ())
+        + (repository,)
+    )
     logger.debug(' '.join(info_command))

     try:

@@ -33,6 +40,8 @@ def initialize_repository(
         if error.returncode != INFO_REPOSITORY_NOT_FOUND_EXIT_CODE:
             raise

     extra_borg_options = storage_config.get('extra_borg_options', {}).get('init', '')

     init_command = (
         (local_path, 'init')
         + (('--encryption', encryption_mode) if encryption_mode else ())

@@ -41,8 +50,9 @@ def initialize_repository(
         + (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
         + (('--debug',) if logger.isEnabledFor(logging.DEBUG) else ())
         + (('--remote-path', remote_path) if remote_path else ())
         + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
         + (repository,)
     )

     # Don't use execute_command() here because it doesn't support interactive prompts.
-    execute_command_without_capture(init_command)
+    execute_command_without_capture(init_command, error_on_warnings=False)
```
```diff
@@ -36,15 +36,18 @@ def list_archives(repository, storage_config, list_arguments, local_path='borg',
         + make_flags('remote-path', remote_path)
         + make_flags('lock-wait', lock_wait)
         + make_flags_from_arguments(
-            list_arguments, excludes=('repository', 'archive', 'successful')
+            list_arguments, excludes=('repository', 'archive', 'paths', 'successful')
         )
         + (
             '::'.join((repository, list_arguments.archive))
             if list_arguments.archive
             else repository,
         )
+        + (tuple(list_arguments.paths) if list_arguments.paths else ())
     )

     return execute_command(
-        full_command, output_log_level=None if list_arguments.json else logging.WARNING
+        full_command,
+        output_log_level=None if list_arguments.json else logging.WARNING,
+        error_on_warnings=False,
     )
```
```diff
@@ -17,9 +17,9 @@ def mount_archive(
     remote_path=None,
 ):
     '''
-    Given a local or remote repository path, an archive name, a filesystem mount point, zero or more
-    paths to mount from the archive, extra Borg mount options, a storage configuration dict, and
-    optional local and remote Borg paths, mount the archive onto the mount point.
+    Given a local or remote repository path, an optional archive name, a filesystem mount point,
+    zero or more paths to mount from the archive, extra Borg mount options, a storage configuration
+    dict, and optional local and remote Borg paths, mount the archive onto the mount point.
     '''
     umask = storage_config.get('umask', None)
     lock_wait = storage_config.get('lock_wait', None)

@@ -33,14 +33,14 @@ def mount_archive(
         + (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
         + (('--foreground',) if foreground else ())
         + (('-o', options) if options else ())
-        + ('::'.join((repository, archive)),)
+        + (('::'.join((repository, archive)),) if archive else (repository,))
         + (mount_point,)
         + (tuple(paths) if paths else ())
     )

     # Don't capture the output when foreground mode is used so that ctrl-C can work properly.
     if foreground:
-        execute_command_without_capture(full_command)
+        execute_command_without_capture(full_command, error_on_warnings=False)
         return

-    execute_command(full_command)
+    execute_command(full_command, error_on_warnings=False)
```
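The `execute_command_without_capture` calls appearing throughout these hunks follow one pattern: commands that need the terminal directly (interactive prompts, progress bars, FUSE `--foreground` mode) must not have their output captured. A minimal sketch of that pattern with a hypothetical helper, not borgmatic's actual `execute` module:

```python
import subprocess

def run_command(full_command, capture=True):
    # Without capture, the child process owns the terminal, so prompts,
    # progress output, and ctrl-C behave normally.
    if not capture:
        subprocess.check_call(full_command)
        return None
    # With capture, output is collected so the caller can log it at a
    # chosen log level (as borgmatic does with output_log_level).
    return subprocess.check_output(full_command).decode()
```

This is why `mount --foreground`, `check --repair`, `create --progress`, and `init` all route through the no-capture path in this diff.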
```diff
@@ -49,6 +49,7 @@ def prune_archives(
     '''
     umask = storage_config.get('umask', None)
     lock_wait = storage_config.get('lock_wait', None)
     extra_borg_options = storage_config.get('extra_borg_options', {}).get('prune', '')

     full_command = (
         (local_path, 'prune')

@@ -57,11 +58,16 @@ def prune_archives(
         + (('--umask', str(umask)) if umask else ())
         + (('--lock-wait', str(lock_wait)) if lock_wait else ())
-        + (('--stats',) if not dry_run and logger.isEnabledFor(logging.INFO) else ())
-        + (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+        + (('--info', '--list') if logger.getEffectiveLevel() == logging.INFO else ())
         + (('--debug', '--list', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
         + (('--dry-run',) if dry_run else ())
+        + (('--stats',) if stats else ())
         + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
         + (repository,)
     )

-    execute_command(full_command, output_log_level=logging.WARNING if stats else logging.INFO)
+    execute_command(
+        full_command,
+        output_log_level=logging.WARNING if stats else logging.INFO,
+        error_on_warnings=False,
+    )
```
```diff
@@ -164,6 +164,13 @@ def parse_arguments(*unparsed_arguments):
         default=None,
         help='Write log messages to this file instead of syslog',
     )
+    global_group.add_argument(
+        '--override',
+        metavar='SECTION.OPTION=VALUE',
+        nargs='+',
+        dest='overrides',
+        help='One or more configuration file options to override with specified values',
+    )
     global_group.add_argument(
         '--version',
         dest='version',

@@ -266,6 +273,13 @@ def parse_arguments(*unparsed_arguments):
         add_help=False,
     )
     check_group = check_parser.add_argument_group('check arguments')
+    check_group.add_argument(
+        '--repair',
+        dest='repair',
+        default=False,
+        action='store_true',
+        help='Attempt to repair any inconsistencies found (experimental and only for interactive use)',
+    )
     check_group.add_argument(
         '--only',
         metavar='CHECK',

@@ -326,7 +340,7 @@ def parse_arguments(*unparsed_arguments):
         '--repository',
         help='Path of repository to use, defaults to the configured repository if there is only one',
     )
-    mount_group.add_argument('--archive', help='Name of archive to mount', required=True)
+    mount_group.add_argument('--archive', help='Name of archive to mount')
     mount_group.add_argument(
         '--mount-point',
         metavar='PATH',

@@ -412,6 +426,13 @@ def parse_arguments(*unparsed_arguments):
         help='Path of repository to list, defaults to the configured repository if there is only one',
     )
     list_group.add_argument('--archive', help='Name of archive to list')
+    list_group.add_argument(
+        '--path',
+        metavar='PATH',
+        nargs='+',
+        dest='paths',
+        help='Paths to list from archive, defaults to the entire archive',
+    )
     list_group.add_argument(
         '--short', default=False, action='store_true', help='Output only archive or path names'
    )
```
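The new flags can be exercised with a stripped-down parser containing only the arguments added in this hunk (the real borgmatic parser defines many more):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument(
    '--override',
    metavar='SECTION.OPTION=VALUE',
    nargs='+',  # collect one or more overrides after a single flag
    dest='overrides',
    help='One or more configuration file options to override with specified values',
)
# No required=True anymore, so mounting a whole repository works without it.
parser.add_argument('--archive', help='Name of archive to mount')

args = parser.parse_args(
    ['--override', 'storage.lock_wait=5', 'location.one_file_system=true']
)
print(args.overrides)  # ['storage.lock_wait=5', 'location.one_file_system=true']
print(args.archive)    # None
```

With `nargs='+'`, a single `--override` flag accepts several `SECTION.OPTION=VALUE` pairs, which is what makes multiple overrides per invocation possible.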
```diff
@@ -53,6 +53,7 @@ def run_configuration(config_filename, config, arguments):
     borg_environment.initialize(storage)
     encountered_error = None
     error_repository = ''
+    prune_create_or_check = {'prune', 'create', 'check'}.intersection(arguments)
```

witten commented:

Given that this is in `run_configuration()` now, you can just:

```
except IOError as error:
    yield from make_error_log_records(
        '{}: Failed to acquire lock'.format(config_filename), error
    )
    return
```

... and then this would go through the rest of the borgmatic logging/error/summary machinery.

Another option would be this:

```
except IOError as error:
    encountered_error = error
    yield from make_error_log_records(
        '{}: Failed to acquire lock'.format(config_filename), error
    )
```

Which would fall through and trigger the `on_error` hook, if that's what you want to do in this case.

```diff
+    if location.get("lock_client", False):
```

witten commented:

Make sure you run tests with `tox`, as the code formatter will probably complain about this. To have it reformat things for you: `tox -e black` is all you need to do.

```diff
+        lock_f = open(config_filename)

@@ -64,30 +65,33 @@ def run_configuration(config_filename, config, arguments):
                 '{}: Failed to acquire lock'.format(config_filename), error
             )

-    if not encountered_error and 'create' in arguments:
+    if not encountered_error:
         try:
```

witten commented:

These can be combined into a single `if` statement.

```diff
-            dispatch.call_hooks(
-                'ping_monitor',
-                hooks,
-                config_filename,
-                monitor.MONITOR_HOOK_NAMES,
-                monitor.State.START,
-                global_arguments.dry_run,
-            )
-            command.execute_hook(
-                hooks.get('before_backup'),
-                hooks.get('umask'),
-                config_filename,
-                'pre-backup',
-                global_arguments.dry_run,
-            )
-            dispatch.call_hooks(
-                'dump_databases',
-                hooks,
-                config_filename,
-                dump.DATABASE_HOOK_NAMES,
-                global_arguments.dry_run,
-            )
+            if prune_create_or_check:
+                dispatch.call_hooks(
+                    'ping_monitor',
+                    hooks,
+                    config_filename,
+                    monitor.MONITOR_HOOK_NAMES,
+                    monitor.State.START,
+                    global_arguments.dry_run,
+                )
+            if 'create' in arguments:
+                command.execute_hook(
+                    hooks.get('before_backup'),
+                    hooks.get('umask'),
+                    config_filename,
+                    'pre-backup',
+                    global_arguments.dry_run,
+                )
+                dispatch.call_hooks(
+                    'dump_databases',
+                    hooks,
+                    config_filename,
+                    dump.DATABASE_HOOK_NAMES,
+                    location,
+                    global_arguments.dry_run,
+                )
         except (OSError, CalledProcessError) as error:
             encountered_error = error
             yield from make_error_log_records(

@@ -115,37 +119,40 @@ def run_configuration(config_filename, config, arguments):
                 '{}: Error running actions for repository'.format(repository_path), error
             )

-    if 'create' in arguments and not encountered_error:
+    if not encountered_error:
         try:
-            dispatch.call_hooks(
-                'remove_database_dumps',
-                hooks,
-                config_filename,
-                dump.DATABASE_HOOK_NAMES,
-                global_arguments.dry_run,
-            )
-            command.execute_hook(
-                hooks.get('after_backup'),
-                hooks.get('umask'),
-                config_filename,
-                'post-backup',
-                global_arguments.dry_run,
-            )
-            dispatch.call_hooks(
-                'ping_monitor',
-                hooks,
-                config_filename,
-                monitor.MONITOR_HOOK_NAMES,
-                monitor.State.FINISH,
-                global_arguments.dry_run,
-            )
+            if 'create' in arguments:
+                dispatch.call_hooks(
+                    'remove_database_dumps',
+                    hooks,
+                    config_filename,
+                    dump.DATABASE_HOOK_NAMES,
+                    location,
+                    global_arguments.dry_run,
+                )
+                command.execute_hook(
+                    hooks.get('after_backup'),
+                    hooks.get('umask'),
+                    config_filename,
+                    'post-backup',
+                    global_arguments.dry_run,
+                )
+            if {'prune', 'create', 'check'}.intersection(arguments):
+                dispatch.call_hooks(
+                    'ping_monitor',
+                    hooks,
+                    config_filename,
+                    monitor.MONITOR_HOOK_NAMES,
+                    monitor.State.FINISH,
+                    global_arguments.dry_run,
+                )
         except (OSError, CalledProcessError) as error:
             encountered_error = error
             yield from make_error_log_records(
                 '{}: Error running post-backup hook'.format(config_filename), error
             )

-    if encountered_error:
+    if encountered_error and prune_create_or_check:
         try:
             command.execute_hook(
                 hooks.get('on_error'),

@@ -200,6 +207,7 @@ def run_actions(
         logger.info('{}: Initializing repository'.format(repository))
         borg_init.initialize_repository(
             repository,
+            storage,
             arguments['init'].encryption_mode,
             arguments['init'].append_only,
             arguments['init'].storage_quota,

@@ -240,10 +248,13 @@ def run_actions(
             consistency,
             local_path=local_path,
             remote_path=remote_path,
+            repair=arguments['check'].repair,
             only_checks=arguments['check'].only,
         )
     if 'extract' in arguments:
-        if arguments['extract'].repository is None or repository == arguments['extract'].repository:
+        if arguments['extract'].repository is None or validate.repositories_match(
+            repository, arguments['extract'].repository
+        ):
             logger.info(
                 '{}: Extracting archive {}'.format(repository, arguments['extract'].archive)
             )

@@ -260,8 +271,16 @@ def run_actions(
                 progress=arguments['extract'].progress,
             )
     if 'mount' in arguments:
-        if arguments['mount'].repository is None or repository == arguments['mount'].repository:
-            logger.info('{}: Mounting archive {}'.format(repository, arguments['mount'].archive))
+        if arguments['mount'].repository is None or validate.repositories_match(
+            repository, arguments['mount'].repository
+        ):
+            if arguments['mount'].archive:
+                logger.info(
+                    '{}: Mounting archive {}'.format(repository, arguments['mount'].archive)
+                )
+            else:
+                logger.info('{}: Mounting repository'.format(repository))

             borg_mount.mount_archive(
                 repository,
                 arguments['mount'].archive,

@@ -273,15 +292,10 @@ def run_actions(
                 local_path=local_path,
                 remote_path=remote_path,
             )
-    if 'umount' in arguments:
-        logger.info(
-            '{}: Unmounting mount point {}'.format(repository, arguments['umount'].mount_point)
-        )
-        borg_umount.unmount_archive(
-            mount_point=arguments['umount'].mount_point, local_path=local_path
-        )
     if 'restore' in arguments:
-        if arguments['restore'].repository is None or repository == arguments['restore'].repository:
+        if arguments['restore'].repository is None or validate.repositories_match(
+            repository, arguments['restore'].repository
+        ):
             logger.info(
                 '{}: Restoring databases from archive {}'.format(
                     repository, arguments['restore'].archive

@@ -298,6 +312,7 @@ def run_actions(
                 hooks,
                 repository,
                 dump.DATABASE_HOOK_NAMES,
+                location,
                 restore_names,
             )

@@ -329,6 +344,7 @@ def run_actions(
                 restore_databases,
                 repository,
                 dump.DATABASE_HOOK_NAMES,
+                location,
                 global_arguments.dry_run,
             )
             dispatch.call_hooks(

@@ -336,10 +352,13 @@ def run_actions(
                 restore_databases,
                 repository,
                 dump.DATABASE_HOOK_NAMES,
+                location,
                 global_arguments.dry_run,
             )
     if 'list' in arguments:
-        if arguments['list'].repository is None or repository == arguments['list'].repository:
+        if arguments['list'].repository is None or validate.repositories_match(
+            repository, arguments['list'].repository
+        ):
             logger.info('{}: Listing archives'.format(repository))
             json_output = borg_list.list_archives(
                 repository,

@@ -351,7 +370,9 @@ def run_actions(
             if json_output:
                 yield json.loads(json_output)
     if 'info' in arguments:
-        if arguments['info'].repository is None or repository == arguments['info'].repository:
+        if arguments['info'].repository is None or validate.repositories_match(
+            repository, arguments['info'].repository
+        ):
             logger.info('{}: Displaying summary info for archives'.format(repository))
             json_output = borg_info.display_archives_info(
                 repository,

@@ -364,7 +385,7 @@ def run_actions(
             yield json.loads(json_output)


-def load_configurations(config_filenames):
+def load_configurations(config_filenames, overrides=None):
     '''
     Given a sequence of configuration filenames, load and validate each configuration file. Return
     the results as a tuple of: dict of configuration filename to corresponding parsed configuration,

@@ -378,7 +399,7 @@ def load_configurations(config_filenames):
     for config_filename in config_filenames:
         try:
             configs[config_filename] = validate.parse_configuration(
-                config_filename, validate.schema_filename()
+                config_filename, validate.schema_filename(), overrides
             )
         except (ValueError, OSError, validate.Validation_error) as error:
             logs.extend(

@@ -440,6 +461,14 @@ def make_error_log_records(message, error=None):
         pass


+def get_local_path(configs):
+    '''
+    Arbitrarily return the local path from the first configuration dict. Default to "borg" if not
+    set.
+    '''
+    return next(iter(configs.values())).get('location', {}).get('local_path', 'borg')
+
+
 def collect_configuration_run_summary_logs(configs, arguments):
     '''
     Given a dict of configuration filename to corresponding parsed configuration, and parsed

@@ -510,6 +539,15 @@ def collect_configuration_run_summary_logs(configs, arguments):
         if results:
             json_results.extend(results)

+    if 'umount' in arguments:
+        logger.info('Unmounting mount point {}'.format(arguments['umount'].mount_point))
+        try:
+            borg_umount.unmount_archive(
+                mount_point=arguments['umount'].mount_point, local_path=get_local_path(configs)
+            )
+        except (CalledProcessError, OSError) as error:
+            yield from make_error_log_records('Error unmounting mount point', error)
+
     if json_results:
         sys.stdout.write(json.dumps(json_results))

@@ -559,7 +597,7 @@ def main():  # pragma: no cover
         sys.exit(0)

     config_filenames = tuple(collect.collect_config_filenames(global_arguments.config_paths))
-    configs, parse_logs = load_configurations(config_filenames)
+    configs, parse_logs = load_configurations(config_filenames, global_arguments.overrides)

     colorama.init(autoreset=True, strip=not should_do_markup(global_arguments.no_color, configs))
     try:
```
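The config-file locking this PR introduces (the `lock_client` option and `lock_f = open(config_filename)` lines above) amounts to taking an advisory lock on the configuration file so that concurrent borgmatic runs against the same config fail fast. A sketch of how that could look with `fcntl.flock` (an assumption about the eventual shape of this WIP branch, not code from it):

```python
import fcntl

def acquire_config_lock(config_filename):
    # Take a non-blocking exclusive advisory lock on the config file.
    # Raises OSError (EWOULDBLOCK) if another process already holds the
    # lock; per the review above, the caller can turn that into a
    # "Failed to acquire lock" log record via make_error_log_records().
    lock_file = open(config_filename)
    fcntl.flock(lock_file.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
    return lock_file  # the lock releases when this file object is closed
```

Using `LOCK_NB` makes the second borgmatic instance error out immediately instead of blocking behind the first one, which matches the `except IOError` handling suggested in the review.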
@@ -0,0 +1,71 @@
+import io
+
+import ruamel.yaml
+
+
+def set_values(config, keys, value):
+    '''
+    Given a hierarchy of configuration dicts, a sequence of parsed key strings, and a string value,
+    descend into the hierarchy based on the keys to set the value into the right place.
+    '''
+    if not keys:
+        return
+
+    first_key = keys[0]
+    if len(keys) == 1:
+        config[first_key] = value
+        return
+
+    if first_key not in config:
+        config[first_key] = {}
+
+    set_values(config[first_key], keys[1:], value)
+
+
+def convert_value_type(value):
+    '''
+    Given a string value, determine its logical type (string, boolean, integer, etc.), and return it
+    converted to that type.
+    '''
+    return ruamel.yaml.YAML(typ='safe').load(io.StringIO(value))
+
+
+def parse_overrides(raw_overrides):
+    '''
+    Given a sequence of configuration file override strings in the form of "section.option=value",
+    parse and return a sequence of tuples (keys, values), where keys is a sequence of strings. For
+    instance, given the following raw overrides:
+
+        ['section.my_option=value1', 'section.other_option=value2']
+
+    ... return this:
+
+        (
+            (('section', 'my_option'), 'value1'),
+            (('section', 'other_option'), 'value2'),
+        )
+
+    Raise ValueError if an override can't be parsed.
+    '''
+    if not raw_overrides:
+        return ()
+
+    try:
+        return tuple(
+            (tuple(raw_keys.split('.')), convert_value_type(value))
+            for raw_override in raw_overrides
+            for raw_keys, value in (raw_override.split('=', 1),)
+        )
+    except ValueError:
+        raise ValueError('Invalid override. Make sure you use the form: SECTION.OPTION=VALUE')
+
+
+def apply_overrides(config, raw_overrides):
+    '''
+    Given a sequence of configuration file override strings in the form of "section.option=value"
+    and a configuration dict, parse each override and set it into the configuration dict.
+    '''
+    overrides = parse_overrides(raw_overrides)
+
+    for (keys, value) in overrides:
+        set_values(config, keys, value)
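To see how the new override module's pieces compose, here is a minimal stdlib-only sketch of the same parse-and-descend logic. (The real module above uses ruamel.yaml to convert value types; this sketch keeps values as raw strings for simplicity, so it is an illustration, not the shipped code.)

```python
def set_values(config, keys, value):
    # Recursively descend into nested dicts, creating levels as needed.
    if not keys:
        return
    first_key = keys[0]
    if len(keys) == 1:
        config[first_key] = value
        return
    config.setdefault(first_key, {})
    set_values(config[first_key], keys[1:], value)


def parse_overrides(raw_overrides):
    # Parse "section.option=value" strings into ((keys...), value) tuples.
    parsed = []
    for raw_override in raw_overrides or ():
        try:
            raw_keys, value = raw_override.split('=', 1)
        except ValueError:
            raise ValueError('Invalid override. Make sure you use the form: SECTION.OPTION=VALUE')
        parsed.append((tuple(raw_keys.split('.')), value))
    return tuple(parsed)


config = {'location': {'repositories': ['original.borg']}}
for keys, value in parse_overrides(['location.remote_path=borg1']):
    set_values(config, keys, value)
print(config['location']['remote_path'])  # → borg1
```

Note the recursion creates intermediate dicts, so an override can target a section the config file never mentions.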
@@ -141,6 +141,14 @@ map:
                 desc: |
                     Exclude files with the NODUMP flag. Defaults to false.
                 example: true
+            borgmatic_source_directory:
+                type: str
+                desc: |
+                    Path for additional source files used for temporary internal state like
+                    borgmatic database dumps. Note that changing this path prevents "borgmatic
+                    restore" from finding any database dumps created before the change. Defaults
+                    to ~/.borgmatic
+                example: /tmp/borgmatic
     storage:
         desc: |
             Repository storage options. See
@@ -249,6 +257,29 @@ map:
                     Bypass Borg error about a previously unknown unencrypted repository. Defaults to
                     false.
                 example: true
+            extra_borg_options:
+                map:
+                    init:
+                        type: str
+                        desc: Extra command-line options to pass to "borg init".
+                        example: "--make-parent-dirs"
+                    prune:
+                        type: str
+                        desc: Extra command-line options to pass to "borg prune".
+                        example: "--save-space"
+                    create:
+                        type: str
+                        desc: Extra command-line options to pass to "borg create".
+                        example: "--no-files-cache"
+                    check:
+                        type: str
+                        desc: Extra command-line options to pass to "borg check".
+                        example: "--save-space"
+                desc: |
+                    Additional options to pass directly to particular Borg commands, handy for Borg
+                    options that borgmatic does not yet support natively. Note that borgmatic does
+                    not perform any validation on these options. Running borgmatic with
+                    "--verbosity 2" shows the exact Borg command-line invocation.
     retention:
         desc: |
             Retention policy for how many backups to keep in each category. See
@@ -1,11 +1,12 @@
 import logging
+import os

 import pkg_resources
 import pykwalify.core
 import pykwalify.errors
 import ruamel.yaml

-from borgmatic.config import load
+from borgmatic.config import load, override


 def schema_filename():
@@ -81,11 +82,12 @@ def remove_examples(schema):
     return schema


-def parse_configuration(config_filename, schema_filename):
+def parse_configuration(config_filename, schema_filename, overrides=None):
     '''
-    Given the path to a config filename in YAML format and the path to a schema filename in
-    pykwalify YAML schema format, return the parsed configuration as a data structure of nested
-    dicts and lists corresponding to the schema. Example return value:
+    Given the path to a config filename in YAML format, the path to a schema filename in pykwalify
+    YAML schema format, a sequence of configuration file override strings in the form of
+    "section.option=value", return the parsed configuration as a data structure of nested dicts and
+    lists corresponding to the schema. Example return value:

         {'location': {'source_directories': ['/home', '/etc'], 'repository': 'hostname.borg'},
          'retention': {'keep_daily': 7}, 'consistency': {'checks': ['repository', 'archives']}}
@@ -101,6 +103,8 @@ def parse_configuration(config_filename, schema_filename):
     except (ruamel.yaml.error.YAMLError, RecursionError) as error:
         raise Validation_error(config_filename, (str(error),))

+    override.apply_overrides(config, overrides)
+
     validator = pykwalify.core.Core(source_data=config, schema_data=remove_examples(schema))
     parsed_result = validator.validate(raise_exception=False)
@@ -112,6 +116,24 @@ def parse_configuration(config_filename, schema_filename):
     return parsed_result


+def normalize_repository_path(repository):
+    '''
+    Given a repository path, return the absolute path of it (for local repositories).
+    '''
+    # A colon in the repository indicates it's a remote repository. Bail.
+    if ':' in repository:
+        return repository
+
+    return os.path.abspath(repository)
+
+
+def repositories_match(first, second):
+    '''
+    Given two repository paths (relative and/or absolute), return whether they match.
+    '''
+    return normalize_repository_path(first) == normalize_repository_path(second)
+
+
 def guard_configuration_contains_repository(repository, configurations):
     '''
     Given a repository path and a dict mapping from config filename to corresponding parsed config
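The matching helpers above make `--repository test.borg` equivalent to passing the same path in absolute form. A stand-alone sketch of that logic, runnable on its own:

```python
import os


def normalize_repository_path(repository):
    # A colon indicates a remote repository (e.g. "user@host:repo"); leave it alone.
    if ':' in repository:
        return repository
    return os.path.abspath(repository)


def repositories_match(first, second):
    return normalize_repository_path(first) == normalize_repository_path(second)


print(repositories_match('test.borg', os.path.join(os.getcwd(), 'test.borg')))  # → True
print(repositories_match('user@host:test.borg', 'user@host:test.borg'))  # → True
```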
@@ -133,9 +155,7 @@ def guard_configuration_contains_repository(repository, configurations):

     if count > 1:
         raise ValueError(
-            'Can\'t determine which repository to use. Use --repository option to disambiguate'.format(
-                repository
-            )
+            'Can\'t determine which repository to use. Use --repository option to disambiguate'
         )

     return
@@ -145,7 +165,7 @@ def guard_configuration_contains_repository(repository, configurations):
             config_repository
             for config in configurations.values()
             for config_repository in config['location']['repositories']
-            if repository == config_repository
+            if repositories_match(repository, config_repository)
         )
     )
@@ -9,15 +9,15 @@ ERROR_OUTPUT_MAX_LINE_COUNT = 25
 BORG_ERROR_EXIT_CODE = 2


-def exit_code_indicates_error(command, exit_code, error_on_warnings=False):
+def exit_code_indicates_error(command, exit_code, error_on_warnings=True):
     '''
     Return True if the given exit code from running the command corresponds to an error.
+    If error on warnings is False, then treat exit code 1 as a warning instead of an error.
     '''
-    # If we're running something other than Borg, treat all non-zero exit codes as errors.
-    if 'borg' in command[0] and not error_on_warnings:
-        return bool(exit_code >= BORG_ERROR_EXIT_CODE)
+    if error_on_warnings:
+        return bool(exit_code != 0)

-    return bool(exit_code != 0)
+    return bool(exit_code >= BORG_ERROR_EXIT_CODE)


 def log_output(command, process, output_buffer, output_log_level, error_on_warnings):
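The reworked logic above inverts the old default: with `error_on_warnings=True` (now the default), any non-zero exit code is an error; only when a caller opts out does Borg's exit code 1 count as a mere warning. A stand-alone sketch:

```python
# Borg reserves exit code 1 for warnings; 2 and above indicate errors.
BORG_ERROR_EXIT_CODE = 2


def exit_code_indicates_error(command, exit_code, error_on_warnings=True):
    if error_on_warnings:
        return bool(exit_code != 0)
    return bool(exit_code >= BORG_ERROR_EXIT_CODE)


print(exit_code_indicates_error(('borg',), 1))  # → True
print(exit_code_indicates_error(('borg',), 1, error_on_warnings=False))  # → False
print(exit_code_indicates_error(('borg',), 2, error_on_warnings=False))  # → True
```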
@@ -65,7 +65,7 @@ def execute_command(
     shell=False,
     extra_environment=None,
     working_directory=None,
-    error_on_warnings=False,
+    error_on_warnings=True,
 ):
     '''
     Execute the given command (a sequence of command/argument strings) and log its output at the
@@ -75,7 +75,7 @@ def execute_command(
     file. If shell is True, execute the command within a shell. If an extra environment dict is
     given, then use it to augment the current environment, and pass the result into the command. If
     a working directory is given, use that as the present working directory when running the
-    command.
+    command. If error on warnings is False, then treat exit code 1 as a warning instead of an error.

     Raise subprocess.CalledProcessError if an error occurs while running the command.
     '''
@@ -110,14 +110,14 @@ def execute_command(
     )


-def execute_command_without_capture(full_command, working_directory=None, error_on_warnings=False):
+def execute_command_without_capture(full_command, working_directory=None, error_on_warnings=True):
     '''
     Execute the given command (a sequence of command/argument strings), but don't capture or log its
     output in any way. This is necessary for commands that monkey with the terminal (e.g. progress
     display) or provide interactive prompts.

     If a working directory is given, use that as the present working directory when running the
-    command.
+    command. If error on warnings is False, then treat exit code 1 as a warning instead of an error.
     '''
     logger.debug(' '.join(full_command))
@@ -2,11 +2,24 @@ import glob
 import logging
 import os

+from borgmatic.borg.create import DEFAULT_BORGMATIC_SOURCE_DIRECTORY
+
 logger = logging.getLogger(__name__)

 DATABASE_HOOK_NAMES = ('postgresql_databases', 'mysql_databases')


+def make_database_dump_path(borgmatic_source_directory, database_hook_name):
+    '''
+    Given a borgmatic source directory (or None) and a database hook name, construct a database dump
+    path.
+    '''
+    if not borgmatic_source_directory:
+        borgmatic_source_directory = DEFAULT_BORGMATIC_SOURCE_DIRECTORY
+
+    return os.path.join(borgmatic_source_directory, database_hook_name)
+
+
 def make_database_dump_filename(dump_path, name, hostname=None):
     '''
     Based on the given dump directory path, database name, and hostname, return a filename to use
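The new `make_database_dump_path` helper is what lets the hooks honor `borgmatic_source_directory` while falling back to the old location. A stand-alone sketch (the default value mirrors `DEFAULT_BORGMATIC_SOURCE_DIRECTORY` from `borgmatic.borg.create`):

```python
import os

DEFAULT_BORGMATIC_SOURCE_DIRECTORY = '~/.borgmatic'


def make_database_dump_path(borgmatic_source_directory, database_hook_name):
    # Fall back to the default when the location config doesn't set a source directory.
    if not borgmatic_source_directory:
        borgmatic_source_directory = DEFAULT_BORGMATIC_SOURCE_DIRECTORY
    return os.path.join(borgmatic_source_directory, database_hook_name)


print(make_database_dump_path(None, 'postgresql_databases'))
print(make_database_dump_path('/tmp/borgmatic', 'mysql_databases'))
```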
@@ -97,4 +97,4 @@ def ping_monitor(ping_url_or_uuid, config_filename, state, dry_run):

     if not dry_run:
         logging.getLogger('urllib3').setLevel(logging.ERROR)
-        requests.post(ping_url, data=payload)
+        requests.post(ping_url, data=payload.encode('utf-8'))
@@ -4,15 +4,24 @@ import os
 from borgmatic.execute import execute_command
 from borgmatic.hooks import dump

-DUMP_PATH = '~/.borgmatic/mysql_databases'
 logger = logging.getLogger(__name__)


-def dump_databases(databases, log_prefix, dry_run):
+def make_dump_path(location_config):  # pragma: no cover
+    '''
+    Make the dump path from the given location configuration and the name of this hook.
+    '''
+    return dump.make_database_dump_path(
+        location_config.get('borgmatic_source_directory'), 'mysql_databases'
+    )
+
+
+def dump_databases(databases, log_prefix, location_config, dry_run):
     '''
     Dump the given MySQL/MariaDB databases to disk. The databases are supplied as a sequence of
     dicts, one dict describing each database as per the configuration schema. Use the given log
-    prefix in any log entries. If this is a dry run, then don't actually dump anything.
+    prefix in any log entries. Use the given location configuration dict to construct the
+    destination path. If this is a dry run, then don't actually dump anything.
     '''
     dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
@@ -20,7 +29,9 @@ def dump_databases(databases, log_prefix, dry_run):

     for database in databases:
         name = database['name']
-        dump_filename = dump.make_database_dump_filename(DUMP_PATH, name, database.get('hostname'))
+        dump_filename = dump.make_database_dump_filename(
+            make_dump_path(location_config), name, database.get('hostname')
+        )
         command = (
             ('mysqldump', '--add-drop-database')
             + (('--host', database['hostname']) if 'hostname' in database else ())
@@ -44,37 +55,43 @@ def dump_databases(databases, log_prefix, dry_run):
     )


-def remove_database_dumps(databases, log_prefix, dry_run):  # pragma: no cover
+def remove_database_dumps(databases, log_prefix, location_config, dry_run):  # pragma: no cover
     '''
     Remove the database dumps for the given databases. The databases are supplied as a sequence of
     dicts, one dict describing each database as per the configuration schema. Use the log prefix in
-    any log entries. If this is a dry run, then don't actually remove anything.
+    any log entries. Use the given location configuration dict to construct the destination path. If
+    this is a dry run, then don't actually remove anything.
     '''
-    dump.remove_database_dumps(DUMP_PATH, databases, 'MySQL', log_prefix, dry_run)
+    dump.remove_database_dumps(
+        make_dump_path(location_config), databases, 'MySQL', log_prefix, dry_run
+    )


-def make_database_dump_patterns(databases, log_prefix, names):
+def make_database_dump_patterns(databases, log_prefix, location_config, names):
     '''
-    Given a sequence of configurations dicts, a prefix to log with, and a sequence of database
-    names to match, return the corresponding glob patterns to match the database dumps in an
-    archive. An empty sequence of names indicates that the patterns should match all dumps.
+    Given a sequence of configurations dicts, a prefix to log with, a location configuration dict,
+    and a sequence of database names to match, return the corresponding glob patterns to match the
+    database dumps in an archive. An empty sequence of names indicates that the patterns should
+    match all dumps.
     '''
     return [
-        dump.make_database_dump_filename(DUMP_PATH, name, hostname='*') for name in (names or ['*'])
+        dump.make_database_dump_filename(make_dump_path(location_config), name, hostname='*')
+        for name in (names or ['*'])
     ]


-def restore_database_dumps(databases, log_prefix, dry_run):
+def restore_database_dumps(databases, log_prefix, location_config, dry_run):
     '''
     Restore the given MySQL/MariaDB databases from disk. The databases are supplied as a sequence of
     dicts, one dict describing each database as per the configuration schema. Use the given log
-    prefix in any log entries. If this is a dry run, then don't actually restore anything.
+    prefix in any log entries. Use the given location configuration dict to construct the
+    destination path. If this is a dry run, then don't actually restore anything.
     '''
     dry_run_label = ' (dry run; not actually restoring anything)' if dry_run else ''

     for database in databases:
         dump_filename = dump.make_database_dump_filename(
-            DUMP_PATH, database['name'], database.get('hostname')
+            make_dump_path(location_config), database['name'], database.get('hostname')
         )
         restore_command = (
             ('mysql', '--batch')
@@ -4,15 +4,24 @@ import os
 from borgmatic.execute import execute_command
 from borgmatic.hooks import dump

-DUMP_PATH = '~/.borgmatic/postgresql_databases'
 logger = logging.getLogger(__name__)


-def dump_databases(databases, log_prefix, dry_run):
+def make_dump_path(location_config):  # pragma: no cover
+    '''
+    Make the dump path from the given location configuration and the name of this hook.
+    '''
+    return dump.make_database_dump_path(
+        location_config.get('borgmatic_source_directory'), 'postgresql_databases'
+    )
+
+
+def dump_databases(databases, log_prefix, location_config, dry_run):
     '''
     Dump the given PostgreSQL databases to disk. The databases are supplied as a sequence of dicts,
     one dict describing each database as per the configuration schema. Use the given log prefix in
-    any log entries. If this is a dry run, then don't actually dump anything.
+    any log entries. Use the given location configuration dict to construct the destination path. If
+    this is a dry run, then don't actually dump anything.
     '''
     dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
@@ -20,7 +29,9 @@ def dump_databases(databases, log_prefix, dry_run):

     for database in databases:
         name = database['name']
-        dump_filename = dump.make_database_dump_filename(DUMP_PATH, name, database.get('hostname'))
+        dump_filename = dump.make_database_dump_filename(
+            make_dump_path(location_config), name, database.get('hostname')
+        )
         all_databases = bool(name == 'all')
         command = (
             ('pg_dumpall' if all_databases else 'pg_dump', '--no-password', '--clean')
@@ -44,37 +55,43 @@ def dump_databases(databases, log_prefix, dry_run):
         execute_command(command, extra_environment=extra_environment)


-def remove_database_dumps(databases, log_prefix, dry_run):  # pragma: no cover
+def remove_database_dumps(databases, log_prefix, location_config, dry_run):  # pragma: no cover
     '''
     Remove the database dumps for the given databases. The databases are supplied as a sequence of
     dicts, one dict describing each database as per the configuration schema. Use the log prefix in
-    any log entries. If this is a dry run, then don't actually remove anything.
+    any log entries. Use the given location configuration dict to construct the destination path. If
+    this is a dry run, then don't actually remove anything.
     '''
-    dump.remove_database_dumps(DUMP_PATH, databases, 'PostgreSQL', log_prefix, dry_run)
+    dump.remove_database_dumps(
+        make_dump_path(location_config), databases, 'PostgreSQL', log_prefix, dry_run
+    )


-def make_database_dump_patterns(databases, log_prefix, names):
+def make_database_dump_patterns(databases, log_prefix, location_config, names):
     '''
-    Given a sequence of configurations dicts, a prefix to log with, and a sequence of database
-    names to match, return the corresponding glob patterns to match the database dumps in an
-    archive. An empty sequence of names indicates that the patterns should match all dumps.
+    Given a sequence of configurations dicts, a prefix to log with, a location configuration dict,
+    and a sequence of database names to match, return the corresponding glob patterns to match the
+    database dumps in an archive. An empty sequence of names indicates that the patterns should
+    match all dumps.
     '''
     return [
-        dump.make_database_dump_filename(DUMP_PATH, name, hostname='*') for name in (names or ['*'])
+        dump.make_database_dump_filename(make_dump_path(location_config), name, hostname='*')
+        for name in (names or ['*'])
     ]


-def restore_database_dumps(databases, log_prefix, dry_run):
+def restore_database_dumps(databases, log_prefix, location_config, dry_run):
     '''
     Restore the given PostgreSQL databases from disk. The databases are supplied as a sequence of
     dicts, one dict describing each database as per the configuration schema. Use the given log
-    prefix in any log entries. If this is a dry run, then don't actually restore anything.
+    prefix in any log entries. Use the given location configuration dict to construct the
+    destination path. If this is a dry run, then don't actually restore anything.
     '''
     dry_run_label = ' (dry run; not actually restoring anything)' if dry_run else ''

     for database in databases:
         dump_filename = dump.make_database_dump_filename(
-            DUMP_PATH, database['name'], database.get('hostname')
+            make_dump_path(location_config), database['name'], database.get('hostname')
         )
         restore_command = (
             ('pg_restore', '--no-password', '--clean', '--if-exists', '--exit-on-error')
@@ -26,7 +26,7 @@ def interactive_console():
     Return whether the current console is "interactive". Meaning: Capable of
     user input and not just something like a cron job.
     '''
-    return sys.stdout.isatty() and os.environ.get('TERM') != 'dumb'
+    return sys.stderr.isatty() and os.environ.get('TERM') != 'dumb'


 def should_do_markup(no_color, configs):
@@ -48,6 +48,42 @@ def should_do_markup(no_color, configs):
     return interactive_console()


+class Multi_stream_handler(logging.Handler):
+    '''
+    A logging handler that dispatches each log record to one of multiple stream handlers depending
+    on the record's log level.
+    '''
+
+    def __init__(self, log_level_to_stream_handler):
+        super(Multi_stream_handler, self).__init__()
+        self.log_level_to_handler = log_level_to_stream_handler
+        self.handlers = set(self.log_level_to_handler.values())
+
+    def flush(self):  # pragma: no cover
+        super(Multi_stream_handler, self).flush()
+
+        for handler in self.handlers:
+            handler.flush()
+
+    def emit(self, record):
+        '''
+        Dispatch the log record to the appropriate stream handler for the record's log level.
+        '''
+        self.log_level_to_handler[record.levelno].emit(record)
+
+    def setFormatter(self, formatter):  # pragma: no cover
+        super(Multi_stream_handler, self).setFormatter(formatter)
+
+        for handler in self.handlers:
+            handler.setFormatter(formatter)
+
+    def setLevel(self, level):  # pragma: no cover
+        super(Multi_stream_handler, self).setLevel(level)
+
+        for handler in self.handlers:
+            handler.setLevel(level)
+
+
 LOG_LEVEL_TO_COLOR = {
     logging.CRITICAL: colorama.Fore.RED,
     logging.ERROR: colorama.Fore.RED,
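The dispatch idea above, routing records to different streams by level so that non-error output stays greppable, can be tried out with a minimal stand-alone sketch (a simplified re-implementation, not the class from the diff):

```python
import logging
import sys


class MultiStreamHandler(logging.Handler):
    '''Route each record to a per-level stream handler.'''

    def __init__(self, level_to_handler):
        super().__init__()
        self.level_to_handler = level_to_handler

    def emit(self, record):
        # Delegate to the handler registered for this record's exact level.
        self.level_to_handler[record.levelno].emit(record)


handler = MultiStreamHandler(
    {
        logging.ERROR: logging.StreamHandler(sys.stderr),
        logging.INFO: logging.StreamHandler(sys.stdout),
    }
)
logger = logging.getLogger('demo')
logger.addHandler(handler)
logger.propagate = False
logger.setLevel(logging.INFO)
logger.info('goes to stdout')
logger.error('goes to stderr')
```

Note the real class also fans out `flush`, `setFormatter`, and `setLevel` to every underlying handler, which this sketch omits.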
@@ -87,7 +123,19 @@ def configure_logging(
     if log_file_log_level is None:
         log_file_log_level = console_log_level

-    console_handler = logging.StreamHandler()
+    # Log certain log levels to console stderr and others to stdout. This supports use cases like
+    # grepping (non-error) output.
+    console_error_handler = logging.StreamHandler(sys.stderr)
+    console_standard_handler = logging.StreamHandler(sys.stdout)
+    console_handler = Multi_stream_handler(
+        {
+            logging.CRITICAL: console_error_handler,
+            logging.ERROR: console_error_handler,
+            logging.WARN: console_standard_handler,
+            logging.INFO: console_standard_handler,
+            logging.DEBUG: console_standard_handler,
+        }
+    )
     console_handler.setFormatter(Console_color_formatter())
     console_handler.setLevel(console_log_level)
@@ -1,12 +1,12 @@
 <h2>Improve this documentation</h2>

 <p>Have an idea on how to make this documentation even better? Send your
-feedback below! (But if you need help installing or using borgmatic, please
-use our <a href="https://torsion.org/borgmatic/#issues">issue tracker</a>
-instead.)</p>
+feedback below! But if you need help with borgmatic, or have an idea for a
+borgmatic feature, please use our <a href="https://torsion.org/borgmatic/#issues">issue
+tracker</a> instead.</p>

 <form id="suggestion-form">
-  <div><label for="suggestion">Suggestion</label></div>
+  <div><label for="suggestion">Documentation suggestion</label></div>
   <textarea id="suggestion" rows="8" cols="60" name="suggestion"></textarea>
   <div data-sk-error="suggestion" class="form-error"></div>
   <input id="_page" type="hidden" name="_page">
@@ -23,8 +23,12 @@ hooks:
 ```

 Prior to each backup, borgmatic dumps each configured database to a file
-(located in `~/.borgmatic/`) and includes it in the backup. After the backup
-completes, borgmatic removes the database dump files to recover disk space.
+and includes it in the backup. After the backup completes, borgmatic removes
+the database dump files to recover disk space.
+
+borgmatic creates these temporary dump files in `~/.borgmatic` by default. To
+customize this path, set the `borgmatic_source_directory` option in the
+`location` section of borgmatic's configuration.

 Here's a more involved example that connects to remote databases:
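To make the new documentation concrete, a `location` section that points the temporary dump files at a custom directory might look like this (the paths here are illustrative values, not defaults from the diff):

```yaml
location:
    source_directories:
        - /home
    repositories:
        - user@backupserver:sourcehostname.borg
    borgmatic_source_directory: /var/lib/borgmatic
```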
@@ -75,14 +75,22 @@ tox -e isort
 ### End-to-end tests

 borgmatic additionally includes some end-to-end tests that integration test
-with Borg for a few representative scenarios. These tests don't run by default
-because they're relatively slow and depend on Borg. If you would like to run
-them:
+with Borg and supported databases for a few representative scenarios. These
+tests don't run by default when running `tox`, because they're relatively slow
+and depend on Docker containers for runtime dependencies. These tests do run
+on the continuous integration (CI) server, and running them on your developer
+machine is the closest thing to CI test parity.
+
+If you would like to run the full test suite, first install Docker and [Docker
+Compose](https://docs.docker.com/compose/install/). Then run:

 ```bash
-tox -e end-to-end
+scripts/run-full-dev-tests
 ```

+Note that this script assumes you have permission to run Docker. If you
+don't, then you may need to run with `sudo`.
+
 ## Code style

 Start with [PEP 8](https://www.python.org/dev/peps/pep-0008/). But then, apply
@@ -100,6 +100,12 @@ borgmatic mount --archive host-2019-... --mount-point /mnt
 This mounts the entire archive on the given mount point `/mnt`, so that you
 can look in there for your files.

+Omit the `--archive` flag to mount all archives (lazy-loaded):
+
+```bash
+borgmatic mount --mount-point /mnt
+```
+
 If you'd like to restrict the mounted filesystem to only particular paths from
 your archive, use the `--path` flag, similar to the `extract` action above.
 For instance:
@@ -22,6 +22,11 @@ When you set up multiple configuration files like this, borgmatic will run
 each one in turn from a single borgmatic invocation. This includes, by
 default, the traditional `/etc/borgmatic/config.yaml` as well.

+Each configuration file is interpreted independently, as if you ran borgmatic
+for each configuration file one at a time. In other words, borgmatic does not
+perform any merging of configuration files by default. If you'd like borgmatic
+to merge your configuration files, see below about configuration includes.
+
 And if you need even more customizability, you can specify alternate
 configuration paths on the command-line with borgmatic's `--config` option.
 See `borgmatic --help` for more information.
@@ -110,6 +115,40 @@ Note that this `<<` include merging syntax is only for merging in mappings
 directly, please see the section above about standard includes.


+## Configuration overrides
+
+In more complex multi-application setups, you may want to override particular
+borgmatic configuration file options at the time you run borgmatic. For
+instance, you could reuse a common configuration file for multiple
+applications, but then set the repository for each application at runtime. Or
+you might want to try a variant of an option for testing purposes without
+actually touching your configuration file.
+
+Whatever the reason, you can override borgmatic configuration options at the
+command-line via the `--override` flag. Here's an example:
+
+```bash
+borgmatic create --override location.remote_path=borg1
+```
+
+What this does is load your configuration files, and for each one, disregard
+the configured value for the `remote_path` option in the `location` section,
+and use the value of `borg1` instead.
+
+Note that the value is parsed as an actual YAML string, so you can even set
+list values by using brackets. For instance:
+
+```bash
+borgmatic create --override location.repositories=[test1.borg,test2.borg]
+```
+
+There is not currently a way to override a single element of a list without
+replacing the whole list.
+
+Be sure to quote your overrides if they contain spaces or other characters
+that your shell may interpret.
+
+
 ## Related documentation

 * [Set up backups with borgmatic](https://torsion.org/borgmatic/docs/how-to/set-up-backups/)
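Because the documented `--override` values are parsed as YAML (the diff's `convert_value_type` uses ruamel.yaml's safe loader for this), bracketed values become real lists rather than strings. This tiny stand-in parser, not the shipped code, just illustrates the resulting data types for the two documented examples:

```python
def parse_value(value):
    # Simplified stand-in for YAML parsing: recognize the bracketed list form.
    if value.startswith('[') and value.endswith(']'):
        return [item.strip() for item in value[1:-1].split(',')]
    return value


print(parse_value('borg1'))  # → borg1
print(parse_value('[test1.borg,test2.borg]'))  # → ['test1.borg', 'test2.borg']
```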
@@ -57,10 +57,10 @@ tests](https://torsion.org/borgmatic/docs/how-to/extract-a-backup/).

 ## Error hooks

-When an error occurs during a backup or another action, borgmatic can run
-configurable shell commands to fire off custom error notifications or take
-other actions, so you can get alerted as soon as something goes wrong. Here's
-a not-so-useful example:
+When an error occurs during a `prune`, `create`, or `check` action, borgmatic
+can run configurable shell commands to fire off custom error notifications or
+take other actions, so you can get alerted as soon as something goes wrong.
+Here's a not-so-useful example:

 ```yaml
 hooks:
@@ -91,10 +91,11 @@ here:
 * `output`: output of the command that failed (may be blank if an error
   occurred without running a command)

-Note that borgmatic runs the `on_error` hooks for any action in which an error
-occurs, not just the `create` action. But borgmatic does not run `on_error`
-hooks if an error occurs within a `before_everything` or `after_everything`
-hook. For more about hooks, see the [borgmatic hooks
+Note that borgmatic runs the `on_error` hooks only for `prune`, `create`, or
+`check` actions or hooks in which an error occurs, and not other actions.
+borgmatic does not run `on_error` hooks if an error occurs within a
+`before_everything` or `after_everything` hook. For more about hooks, see the
+[borgmatic hooks
 documentation](https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/),
 especially the security information.
@@ -116,21 +117,22 @@ hooks:
With this hook in place, borgmatic pings your Healthchecks project when a
backup begins, ends, or errors. Specifically, before the <a
href="https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/">`before_backup`
-hooks</a> run, borgmatic lets Healthchecks know that a backup has started.
+hooks</a> run, borgmatic lets Healthchecks know that it has started if any of
+the `prune`, `create`, or `check` actions are run.

-Then, if the backup completes successfully, borgmatic notifies Healthchecks of
+Then, if the actions complete successfully, borgmatic notifies Healthchecks of
the success after the `after_backup` hooks run, and includes borgmatic logs in
the payload data sent to Healthchecks. This means that borgmatic logs show up
in the Healthchecks UI, although be aware that Healthchecks currently has a
10-kilobyte limit for the logs in each ping.

-If an error occurs during the backup, borgmatic notifies Healthchecks after
+If an error occurs during any action, borgmatic notifies Healthchecks after
the `on_error` hooks run, also tacking on logs including the error itself. But
-the logs are only included for errors that occur within the borgmatic `create`
-action (and not other actions).
+the logs are only included for errors that occur when a `prune`, `create`, or
+`check` action is run.

Note that borgmatic sends logs to Healthchecks by applying the maximum of any
-other borgmatic verbosity level (`--verbosity`, `--syslog-verbosity`, etc.),
+other borgmatic verbosity levels (`--verbosity`, `--syslog-verbosity`, etc.),
as there is not currently a dedicated Healthchecks verbosity setting.
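As context for the prose above, a Healthchecks hook is configured with a single ping URL (an illustrative sketch, not part of this diff; the UUID is a placeholder):

```yaml
hooks:
    healthchecks: https://hc-ping.com/your-uuid-here
```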

You can configure Healthchecks to notify you by a [variety of
@@ -155,10 +157,11 @@ hooks:
With this hook in place, borgmatic pings your Cronitor monitor when a backup
begins, ends, or errors. Specifically, before the <a
href="https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/">`before_backup`
-hooks</a> run, borgmatic lets Cronitor know that a backup has started. Then,
-if the backup completes successfully, borgmatic notifies Cronitor of the
-success after the `after_backup` hooks run. And if an error occurs during the
-backup, borgmatic notifies Cronitor after the `on_error` hooks run.
+hooks</a> run, borgmatic lets Cronitor know that it has started if any of the
+`prune`, `create`, or `check` actions are run. Then, if the actions complete
+successfully, borgmatic notifies Cronitor of the success after the
+`after_backup` hooks run. And if an error occurs during any action, borgmatic
+notifies Cronitor after the `on_error` hooks run.
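For reference (an illustrative sketch, not part of this diff; the trailing monitor code is a placeholder), a Cronitor hook is configured with the monitor's ping URL:

```yaml
hooks:
    cronitor: https://cronitor.link/your-monitor-code
```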

You can configure Cronitor to notify you by a [variety of
mechanisms](https://cronitor.io/docs/cron-job-notifications) when backups fail
@@ -182,10 +185,11 @@ hooks:
With this hook in place, borgmatic pings your Cronhub monitor when a backup
begins, ends, or errors. Specifically, before the <a
href="https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/">`before_backup`
-hooks</a> run, borgmatic lets Cronhub know that a backup has started. Then,
-if the backup completes successfully, borgmatic notifies Cronhub of the
-success after the `after_backup` hooks run. And if an error occurs during the
-backup, borgmatic notifies Cronhub after the `on_error` hooks run.
+hooks</a> run, borgmatic lets Cronhub know that it has started if any of the
+`prune`, `create`, or `check` actions are run. Then, if the actions complete
+successfully, borgmatic notifies Cronhub of the success after the
+`after_backup` hooks run. And if an error occurs during any action, borgmatic
+notifies Cronhub after the `on_error` hooks run.
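Similarly to the other monitoring hooks, a Cronhub hook is configured with the monitor's "start" ping URL (an illustrative sketch, not part of this diff; the UUID is a placeholder):

```yaml
hooks:
    cronhub: https://cronhub.io/start/your-ping-uuid
```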

Note that even though you configure borgmatic with the "start" variant of the
ping URL, borgmatic substitutes the correct state into the URL when pinging
@@ -3,15 +3,11 @@ title: How to set up backups with borgmatic
---
## Installation

-To get up and running, first [install
-Borg](https://borgbackup.readthedocs.io/en/stable/installation.html), at
-least version 1.1.
+First, [install
+Borg](https://borgbackup.readthedocs.io/en/stable/installation.html), at least
+version 1.1.

-By default, borgmatic looks for its configuration files in `/etc/borgmatic/`
-and `/etc/borgmatic.d/`, where the root user typically has read access.
-
-So, to download and install borgmatic as the root user, run the following
-commands:
+Then, download and install borgmatic by running the following command:

```bash
sudo pip3 install --user --upgrade borgmatic
@@ -23,10 +23,11 @@ git push github $version
rm -fr dist
python3 setup.py bdist_wheel
python3 setup.py sdist
gpg --detach-sign --armor dist/*
twine upload -r pypi dist/borgmatic-*.tar.gz
twine upload -r pypi dist/borgmatic-*-py3-none-any.whl

-# Set release changelogs on projects.evoworx.org and GitHub.
+# Set release changelogs on projects.torsion.org and GitHub.
release_changelog="$(cat NEWS | sed '/^$/q' | grep -v '^\S')"
escaped_release_changelog="$(echo "$release_changelog" | sed -z 's/\n/\\n/g' | sed -z 's/\"/\\"/g')"
curl --silent --request POST \
@@ -0,0 +1,14 @@
#!/bin/sh

# This script is for running all tests, including end-to-end tests, on a developer machine. It sets
# up database containers to run tests against, runs the tests, and then tears down the containers.
#
# Run this script from the root directory of the borgmatic source.
#
# For more information, see:
# https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/

set -e

docker-compose --file tests/end-to-end/docker-compose.yaml up --force-recreate \
    --abort-on-container-exit
@@ -0,0 +1,18 @@
#!/bin/sh

# This script installs test dependencies and runs all tests, including end-to-end tests. It
# is designed to run inside a test container, and presumes that other test infrastructure like
# databases are already running. Therefore, on a developer machine, you should not run this script
# directly. Instead, run scripts/run-full-dev-tests.
#
# For more information, see:
# https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/

set -e

python -m pip install --upgrade pip==19.3.1
pip install tox==3.14.1
export COVERAGE_FILE=/tmp/.coverage
tox --workdir /tmp/.tox
apk add --no-cache borgbackup postgresql-client mariadb-client
tox --workdir /tmp/.tox -e end-to-end
@@ -1,13 +0,0 @@
-#!/bin/sh
-
-# This script is intended to be run from the continuous integration build
-# server, and not on a developer machine. For that, see:
-# https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/
-
-set -e
-
-python -m pip install --upgrade pip==19.3.1
-pip install tox==3.14.0
-tox
-apk add --no-cache borgbackup
-tox -e end-to-end
setup.py
@@ -1,6 +1,6 @@
from setuptools import find_packages, setup

-VERSION = '1.4.15'
+VERSION = '1.4.21.dev0'


setup(
@@ -0,0 +1,25 @@
version: '3'
services:
  postgresql:
    image: postgres:11.6-alpine
    environment:
      POSTGRES_PASSWORD: test
      POSTGRES_DB: test
  mysql:
    image: mariadb:10.4
    environment:
      MYSQL_ROOT_PASSWORD: test
      MYSQL_DATABASE: test
  tests:
    image: python:3.7-alpine3.10
    volumes:
      - "../..:/app:ro"
    tmpfs:
      - "/app/borgmatic.egg-info"
    tty: true
    working_dir: /app
    command:
      - /app/scripts/run-full-tests
    depends_on:
      - postgresql
      - mysql
@@ -44,13 +44,13 @@ def test_borgmatic_command():
    generate_configuration(config_path, repository_path)

    subprocess.check_call(
-        'borgmatic -v 2 --config {} --init --encryption repokey'.format(config_path).split(' ')
+        'borgmatic -v 2 --config {} init --encryption repokey'.format(config_path).split(' ')
    )

    # Run borgmatic to generate a backup archive, and then list it to make sure it exists.
    subprocess.check_call('borgmatic --config {}'.format(config_path).split(' '))
    output = subprocess.check_output(
-        'borgmatic --config {} --list --json'.format(config_path).split(' ')
+        'borgmatic --config {} list --json'.format(config_path).split(' ')
    ).decode(sys.stdout.encoding)
    parsed_output = json.loads(output)
@@ -61,7 +61,7 @@ def test_borgmatic_command():
    # Extract the created archive into the current (temporary) directory, and confirm that the
    # extracted file looks right.
    output = subprocess.check_output(
-        'borgmatic --config {} --extract --archive {}'.format(config_path, archive_name).split(
+        'borgmatic --config {} extract --archive {}'.format(config_path, archive_name).split(
            ' '
        )
    ).decode(sys.stdout.encoding)
@@ -70,7 +70,7 @@ def test_borgmatic_command():

    # Exercise the info flag.
    output = subprocess.check_output(
-        'borgmatic --config {} --info --json'.format(config_path).split(' ')
+        'borgmatic --config {} info --json'.format(config_path).split(' ')
    ).decode(sys.stdout.encoding)
    parsed_output = json.loads(output)
@@ -0,0 +1,83 @@
import json
import os
import shutil
import subprocess
import sys
import tempfile


def write_configuration(config_path, repository_path, borgmatic_source_directory):
    '''
    Write out borgmatic configuration into a file at the config path. Set the options so as to work
    for testing. This includes injecting the given repository path, borgmatic source directory for
    storing database dumps, and encryption passphrase.
    '''
    config = '''
location:
    source_directories:
        - {}
    repositories:
        - {}
    borgmatic_source_directory: {}

storage:
    encryption_passphrase: "test"

hooks:
    postgresql_databases:
        - name: test
          hostname: postgresql
          username: postgres
          password: test
    mysql_databases:
        - name: test
          hostname: mysql
          username: root
          password: test
'''.format(
        config_path, repository_path, borgmatic_source_directory
    )

    config_file = open(config_path, 'w')
    config_file.write(config)
    config_file.close()


def test_database_dump_and_restore():
    # Create a Borg repository.
    temporary_directory = tempfile.mkdtemp()
    repository_path = os.path.join(temporary_directory, 'test.borg')
    borgmatic_source_directory = os.path.join(temporary_directory, '.borgmatic')

    original_working_directory = os.getcwd()

    try:
        config_path = os.path.join(temporary_directory, 'test.yaml')
        write_configuration(config_path, repository_path, borgmatic_source_directory)

        subprocess.check_call(
            'borgmatic -v 2 --config {} init --encryption repokey'.format(config_path).split(' ')
        )

        # Run borgmatic to generate a backup archive including a database dump
        subprocess.check_call('borgmatic create --config {} -v 2'.format(config_path).split(' '))

        # Get the created archive name.
        output = subprocess.check_output(
            'borgmatic --config {} list --json'.format(config_path).split(' ')
        ).decode(sys.stdout.encoding)
        parsed_output = json.loads(output)

        assert len(parsed_output) == 1
        assert len(parsed_output[0]['archives']) == 1
        archive_name = parsed_output[0]['archives'][0]['archive']

        # Restore the database from the archive.
        subprocess.check_call(
            'borgmatic --config {} restore --archive {}'.format(config_path, archive_name).split(
                ' '
            )
        )
    finally:
        os.chdir(original_working_directory)
        shutil.rmtree(temporary_directory)
@@ -352,13 +352,6 @@ def test_parse_arguments_requires_archive_with_extract():
    module.parse_arguments('--config', 'myconfig', 'extract')


-def test_parse_arguments_requires_archive_with_mount():
-    flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
-
-    with pytest.raises(SystemExit):
-        module.parse_arguments('--config', 'myconfig', 'mount', '--mount-point', '/mnt')
-
-
def test_parse_arguments_requires_archive_with_restore():
    flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@@ -0,0 +1,40 @@
import pytest

from borgmatic.config import override as module


@pytest.mark.parametrize(
    'value,expected_result',
    (
        ('thing', 'thing'),
        ('33', 33),
        ('33b', '33b'),
        ('true', True),
        ('false', False),
        ('[foo]', ['foo']),
        ('[foo, bar]', ['foo', 'bar']),
    ),
)
def test_convert_value_type_coerces_values(value, expected_result):
    assert module.convert_value_type(value) == expected_result


def test_apply_overrides_updates_config():
    raw_overrides = [
        'section.key=value1',
        'other_section.thing=value2',
        'section.nested.key=value3',
        'new.foo=bar',
    ]
    config = {
        'section': {'key': 'value', 'other': 'other_value'},
        'other_section': {'thing': 'thing_value'},
    }

    module.apply_overrides(config, raw_overrides)

    assert config == {
        'section': {'key': 'value1', 'other': 'other_value', 'nested': {'key': 'value3'}},
        'other_section': {'thing': 'value2'},
        'new': {'foo': 'bar'},
    }
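The override behavior these tests exercise can be sketched as follows: parse `section.key=value` strings, coerce scalar values, and deep-merge them into a configuration dict. This is an illustrative standalone reimplementation mirroring the shape of `borgmatic.config.override`, not the PR's actual code:

```python
def convert_value_type(value):
    '''Coerce a string value to a list, boolean, integer, or plain string.'''
    if value.startswith('[') and value.endswith(']'):
        return [convert_value_type(item.strip()) for item in value[1:-1].split(',')]
    if value == 'true':
        return True
    if value == 'false':
        return False
    try:
        return int(value)
    except ValueError:
        return value


def set_values(config, keys, value):
    '''Descend into config along keys, creating nested dicts as needed, and set value.'''
    first_key, *remaining_keys = keys
    if not remaining_keys:
        config[first_key] = value
        return
    set_values(config.setdefault(first_key, {}), remaining_keys, value)


def apply_overrides(config, raw_overrides):
    '''Apply "section.key=value" overrides to the given configuration dict in place.'''
    for raw_override in raw_overrides or ():
        raw_keys, value = raw_override.split('=', 1)
        set_values(config, raw_keys.split('.'), convert_value_type(value))


config = {'section': {'key': 'value'}}
apply_overrides(config, ['section.key=33', 'new.nested.flag=true'])
print(config)  # {'section': {'key': 33}, 'new': {'nested': {'flag': True}}}
```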
@@ -212,3 +212,30 @@ def test_parse_configuration_raises_for_validation_error():
    with pytest.raises(module.Validation_error):
        module.parse_configuration('config.yaml', 'schema.yaml')


def test_parse_configuration_applies_overrides():
    mock_config_and_schema(
        '''
        location:
            source_directories:
                - /home

            repositories:
                - hostname.borg

            local_path: borg1
        '''
    )

    result = module.parse_configuration(
        'config.yaml', 'schema.yaml', overrides=['location.local_path=borg2']
    )

    assert result == {
        'location': {
            'source_directories': ['/home'],
            'repositories': ['hostname.borg'],
            'local_path': 'borg2',
        }
    }
@@ -158,6 +158,21 @@ def test_make_check_flags_with_default_checks_and_prefix_includes_prefix_flag():
    assert flags == ('--prefix', 'foo-')


def test_check_archives_with_repair_calls_borg_with_repair_parameter():
    checks = ('repository',)
    consistency_config = {'check_last': None}
    flexmock(module).should_receive('_parse_checks').and_return(checks)
    flexmock(module).should_receive('_make_check_flags').and_return(())
    flexmock(module).should_receive('execute_command').never()
    flexmock(module).should_receive('execute_command_without_capture').with_args(
        ('borg', 'check', '--repair', 'repo'), error_on_warnings=True
    ).once()

    module.check_archives(
        repository='repo', storage_config={}, consistency_config=consistency_config, repair=True
    )


@pytest.mark.parametrize(
    'checks',
    (
@@ -296,3 +311,17 @@ def test_check_archives_with_retention_prefix():
    module.check_archives(
        repository='repo', storage_config={}, consistency_config=consistency_config
    )


def test_check_archives_with_extra_borg_options_calls_borg_with_extra_options():
    checks = ('repository',)
    consistency_config = {'check_last': None}
    flexmock(module).should_receive('_parse_checks').and_return(checks)
    flexmock(module).should_receive('_make_check_flags').and_return(())
    insert_execute_command_mock(('borg', 'check', '--extra', '--options', 'repo'))

    module.check_archives(
        repository='repo',
        storage_config={'extra_borg_options': {'check': '--extra --options'}},
        consistency_config=consistency_config,
    )
@@ -184,14 +184,21 @@ def test_borgmatic_source_directories_set_when_directory_exists():
    flexmock(module.os.path).should_receive('exists').and_return(True)
    flexmock(module.os.path).should_receive('expanduser')

-    assert module.borgmatic_source_directories() == [module.BORGMATIC_SOURCE_DIRECTORY]
+    assert module.borgmatic_source_directories('/tmp') == ['/tmp']


def test_borgmatic_source_directories_empty_when_directory_does_not_exist():
    flexmock(module.os.path).should_receive('exists').and_return(False)
    flexmock(module.os.path).should_receive('expanduser')

-    assert module.borgmatic_source_directories() == []
+    assert module.borgmatic_source_directories('/tmp') == []


+def test_borgmatic_source_directories_defaults_when_directory_not_given():
+    flexmock(module.os.path).should_receive('exists').and_return(True)
+    flexmock(module.os.path).should_receive('expanduser')
+
+    assert module.borgmatic_source_directories(None) == [module.DEFAULT_BORGMATIC_SOURCE_DIRECTORY]


DEFAULT_ARCHIVE_NAME = '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}'
@@ -206,7 +213,9 @@ def test_create_archive_calls_borg_with_parameters():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -232,7 +241,9 @@ def test_create_archive_with_patterns_calls_borg_with_patterns():
    flexmock(module).should_receive('_make_pattern_flags').and_return(pattern_flags)
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create') + pattern_flags + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create') + pattern_flags + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -258,7 +269,9 @@ def test_create_archive_with_exclude_patterns_calls_borg_with_excludes():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(exclude_flags)
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create') + exclude_flags + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create') + exclude_flags + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -284,6 +297,7 @@ def test_create_archive_with_log_info_calls_borg_with_info_parameter():
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--list', '--filter', 'AME-', '--info', '--stats') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.INFO)
@@ -308,7 +322,9 @@ def test_create_archive_with_log_info_and_json_suppresses_most_borg_output():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--json') + ARCHIVE_WITH_PATHS, output_log_level=None
+        ('borg', 'create', '--json') + ARCHIVE_WITH_PATHS,
+        output_log_level=None,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.INFO)
@@ -336,6 +352,7 @@ def test_create_archive_with_log_debug_calls_borg_with_debug_parameter():
        ('borg', 'create', '--list', '--filter', 'AME-', '--stats', '--debug', '--show-rc')
        + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.DEBUG)
@@ -359,7 +376,9 @@ def test_create_archive_with_log_debug_and_json_suppresses_most_borg_output():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--json') + ARCHIVE_WITH_PATHS, output_log_level=None
+        ('borg', 'create', '--json') + ARCHIVE_WITH_PATHS,
+        output_log_level=None,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.DEBUG)
@@ -385,7 +404,9 @@ def test_create_archive_with_dry_run_calls_borg_with_dry_run_parameter():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--dry-run') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create', '--dry-run') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -414,6 +435,7 @@ def test_create_archive_with_dry_run_and_log_info_calls_borg_without_stats_param
        ('borg', 'create', '--list', '--filter', 'AME-', '--info', '--dry-run')
        + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.INFO)
@@ -443,6 +465,7 @@ def test_create_archive_with_dry_run_and_log_debug_calls_borg_without_stats_para
        ('borg', 'create', '--list', '--filter', 'AME-', '--debug', '--show-rc', '--dry-run')
        + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.DEBUG)
@@ -468,6 +491,7 @@ def test_create_archive_with_checkpoint_interval_calls_borg_with_checkpoint_inte
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--checkpoint-interval', '600') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -492,6 +516,7 @@ def test_create_archive_with_chunker_params_calls_borg_with_chunker_params_param
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--chunker-params', '1,2,3,4') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -516,6 +541,7 @@ def test_create_archive_with_compression_calls_borg_with_compression_parameters(
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--compression', 'rle') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -540,6 +566,7 @@ def test_create_archive_with_remote_rate_limit_calls_borg_with_remote_ratelimit_
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--remote-ratelimit', '100') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -562,7 +589,9 @@ def test_create_archive_with_one_file_system_calls_borg_with_one_file_system_par
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--one-file-system') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create', '--one-file-system') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -586,7 +615,9 @@ def test_create_archive_with_numeric_owner_calls_borg_with_numeric_owner_paramet
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--numeric-owner') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create', '--numeric-owner') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -610,7 +641,9 @@ def test_create_archive_with_read_special_calls_borg_with_read_special_parameter
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--read-special') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create', '--read-special') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -635,7 +668,9 @@ def test_create_archive_with_option_true_calls_borg_without_corresponding_parame
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -662,6 +697,7 @@ def test_create_archive_with_option_false_calls_borg_with_corresponding_paramete
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--no' + option_name.replace('_', '')) + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -687,6 +723,7 @@ def test_create_archive_with_files_cache_calls_borg_with_files_cache_parameters(
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--files-cache', 'ctime,size') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -710,7 +747,9 @@ def test_create_archive_with_local_path_calls_borg_via_local_path():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg1', 'create') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg1', 'create') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -736,6 +775,7 @@ def test_create_archive_with_remote_path_calls_borg_with_remote_path_parameters(
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'create', '--remote-path', 'borg1') + ARCHIVE_WITH_PATHS,
        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -759,7 +799,9 @@ def test_create_archive_with_umask_calls_borg_with_umask_parameters():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--umask', '740') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create', '--umask', '740') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -782,7 +824,9 @@ def test_create_archive_with_lock_wait_calls_borg_with_lock_wait_parameters():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--lock-wait', '5') + ARCHIVE_WITH_PATHS, output_log_level=logging.INFO
+        ('borg', 'create', '--lock-wait', '5') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.INFO,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -805,7 +849,9 @@ def test_create_archive_with_stats_calls_borg_with_stats_parameter():
    flexmock(module).should_receive('_make_pattern_flags').and_return(())
    flexmock(module).should_receive('_make_exclude_flags').and_return(())
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'create', '--stats') + ARCHIVE_WITH_PATHS, output_log_level=logging.WARNING
+        ('borg', 'create', '--stats') + ARCHIVE_WITH_PATHS,
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )

    module.create_archive(
@@ -821,6 +867,32 @@ def test_create_archive_with_stats_calls_borg_with_stats_parameter():
    )


+def test_create_archive_with_progress_and_log_info_calls_borg_with_progress_parameter_and_no_list():
+    flexmock(module).should_receive('borgmatic_source_directories').and_return([])
+    flexmock(module).should_receive('_expand_directories').and_return(('foo', 'bar'))
+    flexmock(module).should_receive('_expand_home_directories').and_return(())
+    flexmock(module).should_receive('_write_pattern_file').and_return(None)
+    flexmock(module).should_receive('_make_pattern_flags').and_return(())
+    flexmock(module).should_receive('_make_exclude_flags').and_return(())
+    flexmock(module).should_receive('execute_command_without_capture').with_args(
+        ('borg', 'create', '--info', '--stats', '--progress') + ARCHIVE_WITH_PATHS,
+        error_on_warnings=False,
+    )
+    insert_logging_mock(logging.INFO)
+
+    module.create_archive(
+        dry_run=False,
+        repository='repo',
+        location_config={
+            'source_directories': ['foo', 'bar'],
+            'repositories': ['repo'],
+            'exclude_patterns': None,
+        },
+        storage_config={},
+        progress=True,
+    )
+
+
def test_create_archive_with_progress_calls_borg_with_progress_parameter():
    flexmock(module).should_receive('borgmatic_source_directories').and_return([])
    flexmock(module).should_receive('_expand_directories').and_return(('foo', 'bar'))
@ -829,7 +901,7 @@ def test_create_archive_with_progress_calls_borg_with_progress_parameter():
|
|||
flexmock(module).should_receive('_make_pattern_flags').and_return(())
|
||||
flexmock(module).should_receive('_make_exclude_flags').and_return(())
|
||||
flexmock(module).should_receive('execute_command_without_capture').with_args(
|
||||
('borg', 'create', '--progress') + ARCHIVE_WITH_PATHS
|
||||
('borg', 'create', '--progress') + ARCHIVE_WITH_PATHS, error_on_warnings=False
|
||||
)
|
||||
|
||||
module.create_archive(
|
||||
|
@ -853,7 +925,9 @@ def test_create_archive_with_json_calls_borg_with_json_parameter():
|
|||
flexmock(module).should_receive('_make_pattern_flags').and_return(())
|
||||
flexmock(module).should_receive('_make_exclude_flags').and_return(())
|
||||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', '--json') + ARCHIVE_WITH_PATHS, output_log_level=None
|
||||
('borg', 'create', '--json') + ARCHIVE_WITH_PATHS,
|
||||
output_log_level=None,
|
||||
error_on_warnings=False,
|
||||
).and_return('[]')
|
||||
|
||||
json_output = module.create_archive(
|
||||
|
@ -879,7 +953,9 @@ def test_create_archive_with_stats_and_json_calls_borg_without_stats_parameter()
|
|||
flexmock(module).should_receive('_make_pattern_flags').and_return(())
|
||||
flexmock(module).should_receive('_make_exclude_flags').and_return(())
|
||||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', '--json') + ARCHIVE_WITH_PATHS, output_log_level=None
|
||||
('borg', 'create', '--json') + ARCHIVE_WITH_PATHS,
|
||||
output_log_level=None,
|
||||
error_on_warnings=False,
|
||||
).and_return('[]')
|
||||
|
||||
json_output = module.create_archive(
|
||||
|
@ -908,6 +984,7 @@ def test_create_archive_with_source_directories_glob_expands():
|
|||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', 'repo::{}'.format(DEFAULT_ARCHIVE_NAME), 'foo', 'food'),
|
||||
output_log_level=logging.INFO,
|
||||
error_on_warnings=False,
|
||||
)
|
||||
flexmock(module.glob).should_receive('glob').with_args('foo*').and_return(['foo', 'food'])
|
||||
|
||||
|
@ -933,6 +1010,7 @@ def test_create_archive_with_non_matching_source_directories_glob_passes_through
|
|||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', 'repo::{}'.format(DEFAULT_ARCHIVE_NAME), 'foo*'),
|
||||
output_log_level=logging.INFO,
|
||||
error_on_warnings=False,
|
||||
)
|
||||
flexmock(module.glob).should_receive('glob').with_args('foo*').and_return([])
|
||||
|
||||
|
@ -958,6 +1036,7 @@ def test_create_archive_with_glob_calls_borg_with_expanded_directories():
|
|||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', 'repo::{}'.format(DEFAULT_ARCHIVE_NAME), 'foo', 'food'),
|
||||
output_log_level=logging.INFO,
|
||||
error_on_warnings=False,
|
||||
)
|
||||
|
||||
module.create_archive(
|
||||
|
@ -980,7 +1059,9 @@ def test_create_archive_with_archive_name_format_calls_borg_with_archive_name():
|
|||
flexmock(module).should_receive('_make_pattern_flags').and_return(())
|
||||
flexmock(module).should_receive('_make_exclude_flags').and_return(())
|
||||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', 'repo::ARCHIVE_NAME', 'foo', 'bar'), output_log_level=logging.INFO
|
||||
('borg', 'create', 'repo::ARCHIVE_NAME', 'foo', 'bar'),
|
||||
output_log_level=logging.INFO,
|
||||
error_on_warnings=False,
|
||||
)
|
||||
|
||||
module.create_archive(
|
||||
|
@ -1005,6 +1086,7 @@ def test_create_archive_with_archive_name_format_accepts_borg_placeholders():
|
|||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', 'repo::Documents_{hostname}-{now}', 'foo', 'bar'),
|
||||
output_log_level=logging.INFO,
|
||||
error_on_warnings=False,
|
||||
)
|
||||
|
||||
module.create_archive(
|
||||
|
@ -1017,3 +1099,28 @@ def test_create_archive_with_archive_name_format_accepts_borg_placeholders():
|
|||
},
|
||||
storage_config={'archive_name_format': 'Documents_{hostname}-{now}'},
|
||||
)
|
||||
|
||||
|
||||
def test_create_archive_with_extra_borg_options_calls_borg_with_extra_options():
|
||||
flexmock(module).should_receive('borgmatic_source_directories').and_return([])
|
||||
flexmock(module).should_receive('_expand_directories').and_return(('foo', 'bar'))
|
||||
flexmock(module).should_receive('_expand_home_directories').and_return(())
|
||||
flexmock(module).should_receive('_write_pattern_file').and_return(None)
|
||||
flexmock(module).should_receive('_make_pattern_flags').and_return(())
|
||||
flexmock(module).should_receive('_make_exclude_flags').and_return(())
|
||||
flexmock(module).should_receive('execute_command').with_args(
|
||||
('borg', 'create', '--extra', '--options') + ARCHIVE_WITH_PATHS,
|
||||
output_log_level=logging.INFO,
|
||||
error_on_warnings=False,
|
||||
)
|
||||
|
||||
module.create_archive(
|
||||
dry_run=False,
|
||||
repository='repo',
|
||||
location_config={
|
||||
'source_directories': ['foo', 'bar'],
|
||||
'repositories': ['repo'],
|
||||
'exclude_patterns': None,
|
||||
},
|
||||
storage_config={'extra_borg_options': {'create': '--extra --options'}},
|
||||
)
|
||||
|
|
|
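Nearly every mocked `execute_command` expectation above gains `error_on_warnings=False`, which reflects Borg's exit-code convention: 0 for success, 1 for warnings, and greater than 1 for real errors. As a minimal sketch of the policy these tests pin down (the helper name `exit_code_indicates_error` is hypothetical, not necessarily borgmatic's actual implementation):

```python
# Hypothetical sketch of the warning-versus-error policy exercised by the
# error_on_warnings keyword in the mocked execute_command calls. Borg exits
# with 0 on success, 1 on warnings, and greater than 1 on errors, so with
# error_on_warnings=False only exit codes above 1 count as failures.
def exit_code_indicates_error(exit_code, error_on_warnings=True):
    if error_on_warnings:
        # Strict mode: any non-zero exit code, including a warning, is an error.
        return exit_code != 0

    # Lenient mode: tolerate warnings (exit code 1), fail only above that.
    return exit_code > 1
```

Under this sketch, a `borg create` run that merely warned (for example, about a changed file) would not abort the backup when warnings are tolerated.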
@@ -15,7 +15,7 @@ def insert_execute_command_mock(command, working_directory=None, error_on_warnin

def insert_execute_command_output_mock(command, result):
    flexmock(module).should_receive('execute_command').with_args(
-        command, output_log_level=None
+        command, output_log_level=None, error_on_warnings=False
    ).and_return(result).once()
@@ -10,7 +10,7 @@ from ..test_verbosity import insert_logging_mock

def test_display_archives_info_calls_borg_with_parameters():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'info', 'repo'), output_log_level=logging.WARNING, error_on_warnings=False
    )

    module.display_archives_info(
@@ -20,7 +20,9 @@ def test_display_archives_info_calls_borg_with_parameters():

def test_display_archives_info_with_log_info_calls_borg_with_info_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--info', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'info', '--info', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.INFO)
    module.display_archives_info(
@@ -30,7 +32,7 @@ def test_display_archives_info_with_log_info_calls_borg_with_info_parameter():

def test_display_archives_info_with_log_info_and_json_suppresses_most_borg_output():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--json', 'repo'), output_log_level=None
+        ('borg', 'info', '--json', 'repo'), output_log_level=None, error_on_warnings=False
    ).and_return('[]')

    insert_logging_mock(logging.INFO)
@@ -43,7 +45,9 @@ def test_display_archives_info_with_log_info_and_json_suppresses_most_borg_outpu

def test_display_archives_info_with_log_debug_calls_borg_with_debug_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--debug', '--show-rc', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'info', '--debug', '--show-rc', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.DEBUG)

@@ -54,7 +58,7 @@ def test_display_archives_info_with_log_debug_calls_borg_with_debug_parameter():

def test_display_archives_info_with_log_debug_and_json_suppresses_most_borg_output():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--json', 'repo'), output_log_level=None
+        ('borg', 'info', '--json', 'repo'), output_log_level=None, error_on_warnings=False
    ).and_return('[]')

    insert_logging_mock(logging.DEBUG)
@@ -67,7 +71,7 @@ def test_display_archives_info_with_log_debug_and_json_suppresses_most_borg_outp

def test_display_archives_info_with_json_calls_borg_with_json_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--json', 'repo'), output_log_level=None
+        ('borg', 'info', '--json', 'repo'), output_log_level=None, error_on_warnings=False
    ).and_return('[]')

    json_output = module.display_archives_info(
@@ -79,7 +83,7 @@ def test_display_archives_info_with_json_calls_borg_with_json_parameter():

def test_display_archives_info_with_archive_calls_borg_with_archive_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', 'repo::archive'), output_log_level=logging.WARNING
+        ('borg', 'info', 'repo::archive'), output_log_level=logging.WARNING, error_on_warnings=False
    )

    module.display_archives_info(
@@ -89,7 +93,7 @@ def test_display_archives_info_with_archive_calls_borg_with_archive_parameter():

def test_display_archives_info_with_local_path_calls_borg_via_local_path():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg1', 'info', 'repo'), output_log_level=logging.WARNING
+        ('borg1', 'info', 'repo'), output_log_level=logging.WARNING, error_on_warnings=False
    )

    module.display_archives_info(
@@ -102,7 +106,9 @@ def test_display_archives_info_with_local_path_calls_borg_via_local_path():

def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_parameters():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--remote-path', 'borg1', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'info', '--remote-path', 'borg1', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )

    module.display_archives_info(
@@ -116,7 +122,9 @@ def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_para
def test_display_archives_info_with_lock_wait_calls_borg_with_lock_wait_parameters():
    storage_config = {'lock_wait': 5}
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'info', '--lock-wait', '5', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'info', '--lock-wait', '5', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )

    module.display_archives_info(
@@ -131,6 +139,7 @@ def test_display_archives_info_passes_through_arguments_to_borg(argument_name):
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'info', '--' + argument_name.replace('_', '-'), 'value', 'repo'),
        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )

    module.display_archives_info(
@@ -24,7 +24,7 @@ def insert_info_command_not_found_mock():

def insert_init_command_mock(init_command, **kwargs):
    flexmock(module).should_receive('execute_command_without_capture').with_args(
-        init_command
+        init_command, error_on_warnings=False
    ).once()


@@ -32,7 +32,7 @@ def test_initialize_repository_calls_borg_with_parameters():
    insert_info_command_not_found_mock()
    insert_init_command_mock(INIT_COMMAND + ('repo',))

-    module.initialize_repository(repository='repo', encryption_mode='repokey')
+    module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')


def test_initialize_repository_raises_for_borg_init_error():

@@ -42,14 +42,16 @@ def test_initialize_repository_raises_for_borg_init_error():
    )

    with pytest.raises(subprocess.CalledProcessError):
-        module.initialize_repository(repository='repo', encryption_mode='repokey')
+        module.initialize_repository(
+            repository='repo', storage_config={}, encryption_mode='repokey'
+        )


def test_initialize_repository_skips_initialization_when_repository_already_exists():
    insert_info_command_found_mock()
    flexmock(module).should_receive('execute_command_without_capture').never()

-    module.initialize_repository(repository='repo', encryption_mode='repokey')
+    module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')


def test_initialize_repository_raises_for_unknown_info_command_error():

@@ -58,21 +60,27 @@ def test_initialize_repository_raises_for_unknown_info_command_error():
    )

    with pytest.raises(subprocess.CalledProcessError):
-        module.initialize_repository(repository='repo', encryption_mode='repokey')
+        module.initialize_repository(
+            repository='repo', storage_config={}, encryption_mode='repokey'
+        )


def test_initialize_repository_with_append_only_calls_borg_with_append_only_parameter():
    insert_info_command_not_found_mock()
    insert_init_command_mock(INIT_COMMAND + ('--append-only', 'repo'))

-    module.initialize_repository(repository='repo', encryption_mode='repokey', append_only=True)
+    module.initialize_repository(
+        repository='repo', storage_config={}, encryption_mode='repokey', append_only=True
+    )


def test_initialize_repository_with_storage_quota_calls_borg_with_storage_quota_parameter():
    insert_info_command_not_found_mock()
    insert_init_command_mock(INIT_COMMAND + ('--storage-quota', '5G', 'repo'))

-    module.initialize_repository(repository='repo', encryption_mode='repokey', storage_quota='5G')
+    module.initialize_repository(
+        repository='repo', storage_config={}, encryption_mode='repokey', storage_quota='5G'
+    )


def test_initialize_repository_with_log_info_calls_borg_with_info_parameter():

@@ -80,7 +88,7 @@ def test_initialize_repository_with_log_info_calls_borg_with_info_parameter():
    insert_init_command_mock(INIT_COMMAND + ('--info', 'repo'))
    insert_logging_mock(logging.INFO)

-    module.initialize_repository(repository='repo', encryption_mode='repokey')
+    module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')


def test_initialize_repository_with_log_debug_calls_borg_with_debug_parameter():

@@ -88,18 +96,33 @@ def test_initialize_repository_with_log_debug_calls_borg_with_debug_parameter():
    insert_init_command_mock(INIT_COMMAND + ('--debug', 'repo'))
    insert_logging_mock(logging.DEBUG)

-    module.initialize_repository(repository='repo', encryption_mode='repokey')
+    module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')


def test_initialize_repository_with_local_path_calls_borg_via_local_path():
    insert_info_command_not_found_mock()
    insert_init_command_mock(('borg1',) + INIT_COMMAND[1:] + ('repo',))

-    module.initialize_repository(repository='repo', encryption_mode='repokey', local_path='borg1')
+    module.initialize_repository(
+        repository='repo', storage_config={}, encryption_mode='repokey', local_path='borg1'
+    )


def test_initialize_repository_with_remote_path_calls_borg_with_remote_path_parameter():
    insert_info_command_not_found_mock()
    insert_init_command_mock(INIT_COMMAND + ('--remote-path', 'borg1', 'repo'))

-    module.initialize_repository(repository='repo', encryption_mode='repokey', remote_path='borg1')
+    module.initialize_repository(
+        repository='repo', storage_config={}, encryption_mode='repokey', remote_path='borg1'
+    )


+def test_initialize_repository_with_extra_borg_options_calls_borg_with_extra_options():
+    insert_info_command_not_found_mock()
+    insert_init_command_mock(INIT_COMMAND + ('--extra', '--options', 'repo'))
+
+    module.initialize_repository(
+        repository='repo',
+        storage_config={'extra_borg_options': {'init': '--extra --options'}},
+        encryption_mode='repokey',
+    )
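The `extra_borg_options` tests expect the configured option string to be split on whitespace and spliced into the command tuple ahead of the repository argument. A minimal sketch of that splitting, with a made-up helper name and a stand-in `INIT_COMMAND` value (borgmatic's real code and the test module's real constant may differ):

```python
# Hypothetical helper showing how an 'extra_borg_options' string such as
# '--extra --options' becomes the ('--extra', '--options') flag tuple the
# mocked init command expects. INIT_COMMAND here is an illustrative stand-in,
# not necessarily the test module's actual constant.
INIT_COMMAND = ('borg', 'init', '--encryption', 'repokey')


def make_extra_options_flags(storage_config, command_name):
    # Split the configured option string on whitespace into a flag tuple;
    # a missing key yields an empty tuple, adding no flags.
    return tuple(
        storage_config.get('extra_borg_options', {}).get(command_name, '').split()
    )


storage_config = {'extra_borg_options': {'init': '--extra --options'}}
full_command = INIT_COMMAND + make_extra_options_flags(storage_config, 'init') + ('repo',)
```

Splitting on whitespace (rather than passing the raw string through) matters because each flag must be a separate argv element when the command is executed without a shell.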
@@ -10,129 +10,154 @@ from ..test_verbosity import insert_logging_mock

def test_list_archives_calls_borg_with_parameters():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'list', 'repo'), output_log_level=logging.WARNING, error_on_warnings=False
    )

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False),
    )


def test_list_archives_with_log_info_calls_borg_with_info_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--info', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'list', '--info', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.INFO)

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False),
    )


def test_list_archives_with_log_info_and_json_suppresses_most_borg_output():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--json', 'repo'), output_log_level=None
+        ('borg', 'list', '--json', 'repo'), output_log_level=None, error_on_warnings=False
    )
    insert_logging_mock(logging.INFO)

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=True, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=True, successful=False),
    )


def test_list_archives_with_log_debug_calls_borg_with_debug_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--debug', '--show-rc', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'list', '--debug', '--show-rc', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )
    insert_logging_mock(logging.DEBUG)

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False),
    )


def test_list_archives_with_log_debug_and_json_suppresses_most_borg_output():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--json', 'repo'), output_log_level=None
+        ('borg', 'list', '--json', 'repo'), output_log_level=None, error_on_warnings=False
    )
    insert_logging_mock(logging.DEBUG)

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=True, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=True, successful=False),
    )


def test_list_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
    storage_config = {'lock_wait': 5}
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--lock-wait', '5', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'list', '--lock-wait', '5', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )

    module.list_archives(
        repository='repo',
        storage_config=storage_config,
-        list_arguments=flexmock(archive=None, json=False, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False),
    )


def test_list_archives_with_archive_calls_borg_with_archive_parameter():
    storage_config = {}
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', 'repo::archive'), output_log_level=logging.WARNING
+        ('borg', 'list', 'repo::archive'), output_log_level=logging.WARNING, error_on_warnings=False
    )

    module.list_archives(
        repository='repo',
        storage_config=storage_config,
-        list_arguments=flexmock(archive='archive', json=False, successful=False),
+        list_arguments=flexmock(archive='archive', paths=None, json=False, successful=False),
    )


+def test_list_archives_with_path_calls_borg_with_path_parameter():
+    storage_config = {}
+    flexmock(module).should_receive('execute_command').with_args(
+        ('borg', 'list', 'repo::archive', 'var/lib'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
+    )
+
+    module.list_archives(
+        repository='repo',
+        storage_config=storage_config,
+        list_arguments=flexmock(archive='archive', paths=['var/lib'], json=False, successful=False),
+    )
+
+
def test_list_archives_with_local_path_calls_borg_via_local_path():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg1', 'list', 'repo'), output_log_level=logging.WARNING
+        ('borg1', 'list', 'repo'), output_log_level=logging.WARNING, error_on_warnings=False
    )

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False),
        local_path='borg1',
    )


def test_list_archives_with_remote_path_calls_borg_with_remote_path_parameters():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--remote-path', 'borg1', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'list', '--remote-path', 'borg1', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    )

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False),
        remote_path='borg1',
    )


def test_list_archives_with_short_calls_borg_with_short_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--short', 'repo'), output_log_level=logging.WARNING
+        ('borg', 'list', '--short', 'repo'),
+        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    ).and_return('[]')

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=False, short=True),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=False, short=True),
    )


@@ -154,13 +179,14 @@ def test_list_archives_passes_through_arguments_to_borg(argument_name):
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'list', '--' + argument_name.replace('_', '-'), 'value', 'repo'),
        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    ).and_return('[]')

    module.list_archives(
        repository='repo',
        storage_config={},
        list_arguments=flexmock(
-            archive=None, json=False, successful=False, **{argument_name: 'value'}
+            archive=None, paths=None, json=False, successful=False, **{argument_name: 'value'}
        ),
    )

@@ -169,24 +195,25 @@ def test_list_archives_with_successful_calls_borg_to_exclude_checkpoints():
    flexmock(module).should_receive('execute_command').with_args(
        ('borg', 'list', '--glob-archives', module.BORG_EXCLUDE_CHECKPOINTS_GLOB, 'repo'),
        output_log_level=logging.WARNING,
+        error_on_warnings=False,
    ).and_return('[]')

    module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=False, successful=True),
+        list_arguments=flexmock(archive=None, paths=None, json=False, successful=True),
    )


def test_list_archives_with_json_calls_borg_with_json_parameter():
    flexmock(module).should_receive('execute_command').with_args(
-        ('borg', 'list', '--json', 'repo'), output_log_level=None
+        ('borg', 'list', '--json', 'repo'), output_log_level=None, error_on_warnings=False
    ).and_return('[]')

    json_output = module.list_archives(
        repository='repo',
        storage_config={},
-        list_arguments=flexmock(archive=None, json=True, successful=False),
+        list_arguments=flexmock(archive=None, paths=None, json=True, successful=False),
    )

    assert json_output == '[]'
@@ -8,7 +8,9 @@ from ..test_verbosity import insert_logging_mock


def insert_execute_command_mock(command):
-    flexmock(module).should_receive('execute_command').with_args(command).once()
+    flexmock(module).should_receive('execute_command').with_args(
+        command, error_on_warnings=False
+    ).once()


def test_mount_archive_calls_borg_with_required_parameters():

@@ -116,7 +118,7 @@ def test_mount_archive_with_log_debug_calls_borg_with_debug_parameters():

def test_mount_archive_calls_borg_with_foreground_parameter():
    flexmock(module).should_receive('execute_command_without_capture').with_args(
-        ('borg', 'mount', '--foreground', 'repo::archive', '/mnt')
+        ('borg', 'mount', '--foreground', 'repo::archive', '/mnt'), error_on_warnings=False
    ).once()

    module.mount_archive(
@@ -10,7 +10,7 @@ from ..test_verbosity import insert_logging_mock

def insert_execute_command_mock(prune_command, output_log_level):
    flexmock(module).should_receive('execute_command').with_args(
-        prune_command, output_log_level=output_log_level
+        prune_command, output_log_level=output_log_level, error_on_warnings=False
    ).once()


@@ -75,7 +75,9 @@ def test_prune_archives_with_log_info_calls_borg_with_info_parameter():
    flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
        BASE_PRUNE_FLAGS
    )
-    insert_execute_command_mock(PRUNE_COMMAND + ('--stats', '--info', 'repo'), logging.INFO)
+    insert_execute_command_mock(
+        PRUNE_COMMAND + ('--stats', '--info', '--list', 'repo'), logging.INFO
+    )
    insert_logging_mock(logging.INFO)

    module.prune_archives(
@@ -188,3 +190,18 @@ def test_prune_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
        storage_config=storage_config,
        retention_config=retention_config,
    )
+
+
+def test_prune_archives_with_extra_borg_options_calls_borg_with_extra_options():
+    retention_config = flexmock()
+    flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
+        BASE_PRUNE_FLAGS
+    )
+    insert_execute_command_mock(PRUNE_COMMAND + ('--extra', '--options', 'repo'), logging.INFO)
+
+    module.prune_archives(
+        dry_run=False,
+        repository='repo',
+        storage_config={'extra_borg_options': {'prune': '--extra --options'}},
+        retention_config=retention_config,
+    )
@@ -20,7 +20,18 @@ def test_run_configuration_runs_actions_for_each_repository():
    assert results == expected_results


-def test_run_configuration_executes_hooks_for_create_action():
+def test_run_configuration_calls_hooks_for_prune_action():
+    flexmock(module.borg_environment).should_receive('initialize')
+    flexmock(module.command).should_receive('execute_hook').never()
+    flexmock(module.dispatch).should_receive('call_hooks').at_least().twice()
+    flexmock(module).should_receive('run_actions').and_return([])
+    config = {'location': {'repositories': ['foo']}}
+    arguments = {'global': flexmock(dry_run=False), 'prune': flexmock()}
+
+    list(module.run_configuration('test.yaml', config, arguments))
+
+
+def test_run_configuration_executes_and_calls_hooks_for_create_action():
    flexmock(module.borg_environment).should_receive('initialize')
    flexmock(module.command).should_receive('execute_hook').twice()
    flexmock(module.dispatch).should_receive('call_hooks').at_least().twice()
@@ -31,6 +42,28 @@ def test_run_configuration_executes_hooks_for_create_action():
    list(module.run_configuration('test.yaml', config, arguments))


+def test_run_configuration_calls_hooks_for_check_action():
+    flexmock(module.borg_environment).should_receive('initialize')
+    flexmock(module.command).should_receive('execute_hook').never()
+    flexmock(module.dispatch).should_receive('call_hooks').at_least().twice()
+    flexmock(module).should_receive('run_actions').and_return([])
+    config = {'location': {'repositories': ['foo']}}
+    arguments = {'global': flexmock(dry_run=False), 'check': flexmock()}
+
+    list(module.run_configuration('test.yaml', config, arguments))
+
+
+def test_run_configuration_does_not_trigger_hooks_for_list_action():
+    flexmock(module.borg_environment).should_receive('initialize')
+    flexmock(module.command).should_receive('execute_hook').never()
+    flexmock(module.dispatch).should_receive('call_hooks').never()
+    flexmock(module).should_receive('run_actions').and_return([])
+    config = {'location': {'repositories': ['foo']}}
+    arguments = {'global': flexmock(dry_run=False), 'list': flexmock()}
+
+    list(module.run_configuration('test.yaml', config, arguments))
+
+
def test_run_configuration_logs_actions_error():
    flexmock(module.borg_environment).should_receive('initialize')
    flexmock(module.command).should_receive('execute_hook')

@@ -86,7 +119,7 @@ def test_run_configuration_logs_on_error_hook_error():
    ).and_return(expected_results[1:])
    flexmock(module).should_receive('run_actions').and_raise(OSError)
    config = {'location': {'repositories': ['foo']}}
-    arguments = {'global': flexmock(dry_run=False)}
+    arguments = {'global': flexmock(dry_run=False), 'create': flexmock()}

    results = list(module.run_configuration('test.yaml', config, arguments))


@@ -169,6 +202,18 @@ def test_make_error_log_records_generates_nothing_for_other_error():
    assert logs == ()


+def test_get_local_path_uses_configuration_value():
+    assert module.get_local_path({'test.yaml': {'location': {'local_path': 'borg1'}}}) == 'borg1'
+
+
+def test_get_local_path_without_location_defaults_to_borg():
+    assert module.get_local_path({'test.yaml': {}}) == 'borg'
+
+
+def test_get_local_path_without_local_path_defaults_to_borg():
+    assert module.get_local_path({'test.yaml': {'location': {}}}) == 'borg'
+
+
def test_collect_configuration_run_summary_logs_info_for_success():
    flexmock(module.command).should_receive('execute_hook').never()
    flexmock(module).should_receive('run_configuration').and_return([])

@@ -324,6 +369,22 @@ def test_collect_configuration_run_summary_logs_run_configuration_error():
    assert {log.levelno for log in logs} == {logging.CRITICAL}


+def test_collect_configuration_run_summary_logs_run_umount_error():
+    flexmock(module.validate).should_receive('guard_configuration_contains_repository')
+    flexmock(module).should_receive('run_configuration').and_return([])
+    flexmock(module.borg_umount).should_receive('unmount_archive').and_raise(OSError)
+    flexmock(module).should_receive('make_error_log_records').and_return(
+        [logging.makeLogRecord(dict(levelno=logging.CRITICAL, levelname='CRITICAL', msg='Error'))]
+    )
+    arguments = {'umount': flexmock(mount_point='/mnt')}
+
+    logs = tuple(
+        module.collect_configuration_run_summary_logs({'test.yaml': {}}, arguments=arguments)
+    )
+
+    assert {log.levelno for log in logs} == {logging.INFO, logging.CRITICAL}
+
+
def test_collect_configuration_run_summary_logs_outputs_merged_json_results():
    flexmock(module).should_receive('run_configuration').and_return(['foo', 'bar']).and_return(
        ['baz']
@@ -0,0 +1,82 @@
import pytest
from flexmock import flexmock

from borgmatic.config import override as module


def test_set_values_with_empty_keys_bails():
    config = {}

    module.set_values(config, keys=(), value='value')

    assert config == {}


def test_set_values_with_one_key_sets_it_into_config():
    config = {}

    module.set_values(config, keys=('key',), value='value')

    assert config == {'key': 'value'}


def test_set_values_with_one_key_overwrites_existing_key():
    config = {'key': 'old_value', 'other': 'other_value'}

    module.set_values(config, keys=('key',), value='value')

    assert config == {'key': 'value', 'other': 'other_value'}


def test_set_values_with_multiple_keys_creates_hierarchy():
    config = {}

    module.set_values(config, ('section', 'key'), 'value')

    assert config == {'section': {'key': 'value'}}


def test_set_values_with_multiple_keys_updates_hierarchy():
    config = {'section': {'other': 'other_value'}}
    module.set_values(config, ('section', 'key'), 'value')

    assert config == {'section': {'key': 'value', 'other': 'other_value'}}


def test_parse_overrides_splits_keys_and_values():
    flexmock(module).should_receive('convert_value_type').replace_with(lambda value: value)
    raw_overrides = ['section.my_option=value1', 'section.other_option=value2']
    expected_result = (
        (('section', 'my_option'), 'value1'),
        (('section', 'other_option'), 'value2'),
    )

    assert module.parse_overrides(raw_overrides) == expected_result


def test_parse_overrides_allows_value_with_equal_sign():
    flexmock(module).should_receive('convert_value_type').replace_with(lambda value: value)
    raw_overrides = ['section.option=this===value']
    expected_result = ((('section', 'option'), 'this===value'),)

    assert module.parse_overrides(raw_overrides) == expected_result


def test_parse_overrides_raises_on_missing_equal_sign():
    flexmock(module).should_receive('convert_value_type').replace_with(lambda value: value)
    raw_overrides = ['section.option']

    with pytest.raises(ValueError):
        module.parse_overrides(raw_overrides)


def test_parse_overrides_allows_value_with_single_key():
    flexmock(module).should_receive('convert_value_type').replace_with(lambda value: value)
    raw_overrides = ['option=value']
    expected_result = ((('option',), 'value'),)

    assert module.parse_overrides(raw_overrides) == expected_result


def test_parse_overrides_handles_empty_overrides():
    assert module.parse_overrides(raw_overrides=None) == ()
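The new override tests above fully constrain the two helpers' behavior. A sketch of what `set_values` and `parse_overrides` might look like — a hypothetical reconstruction that omits the `convert_value_type` step the tests mock out, not necessarily the actual borgmatic code:

```python
def set_values(config, keys, value):
    # Recursively descend into the config dict, creating intermediate dicts
    # as needed, and set the final key to the given value. Empty keys bail.
    if not keys:
        return

    first_key = keys[0]
    if len(keys) == 1:
        config[first_key] = value
        return

    if first_key not in config:
        config[first_key] = {}
    set_values(config[first_key], keys[1:], value)


def parse_overrides(raw_overrides):
    # Split each "section.option=value" string into a (keys tuple, value)
    # pair. Only the first "=" splits, so values may contain "=" themselves.
    if not raw_overrides:
        return ()

    parsed = []
    for raw_override in raw_overrides:
        try:
            raw_keys, value = raw_override.split('=', 1)
        except ValueError:
            raise ValueError(
                "Invalid override '{}'; expected OPTION=VALUE".format(raw_override)
            )
        parsed.append((tuple(raw_keys.split('.')), value))
    return tuple(parsed)
```

Note how splitting with `split('=', 1)` is what makes the `this===value` test case pass: only the first equal sign separates keys from the value.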
@@ -1,4 +1,5 @@
import pytest
from flexmock import flexmock

from borgmatic.config import validate as module

@@ -95,7 +96,38 @@ def test_remove_examples_strips_examples_from_sequence_of_maps():
    assert schema == {'seq': [{'map': {'foo': {'desc': 'thing'}}}]}


def test_normalize_repository_path_passes_through_remote_repository():
    repository = 'example.org:test.borg'

    assert module.normalize_repository_path(repository) == repository


def test_normalize_repository_path_passes_through_absolute_repository():
    repository = '/foo/bar/test.borg'
    flexmock(module.os.path).should_receive('abspath').and_return(repository)

    assert module.normalize_repository_path(repository) == repository


def test_normalize_repository_path_resolves_relative_repository():
    repository = 'test.borg'
    absolute = '/foo/bar/test.borg'
    flexmock(module.os.path).should_receive('abspath').and_return(absolute)

    assert module.normalize_repository_path(repository) == absolute
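The three `normalize_repository_path` tests above suggest a simple dispatch on repository form. One way to satisfy them — a hedged sketch, assuming a colon is what distinguishes a remote repository, which may not match borgmatic's real heuristic:

```python
import os


def normalize_repository_path(repository):
    # Remote repositories ("user@host:path" or "host:path") contain a colon
    # and pass through untouched; local paths are resolved to absolute paths
    # so that comparisons against configured repositories are stable.
    return repository if ':' in repository else os.path.abspath(repository)


assert normalize_repository_path('example.org:test.borg') == 'example.org:test.borg'
assert normalize_repository_path('/foo/bar/test.borg') == '/foo/bar/test.borg'
```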


def test_repositories_match_does_not_raise():
    flexmock(module).should_receive('normalize_repository_path')

    module.repositories_match('foo', 'bar')


def test_guard_configuration_contains_repository_does_not_raise_when_repository_in_config():
    flexmock(module).should_receive('repositories_match').replace_with(
        lambda first, second: first == second
    )

    module.guard_configuration_contains_repository(
        repository='repo', configurations={'config.yaml': {'location': {'repositories': ['repo']}}}
    )

@@ -116,6 +148,10 @@ def test_guard_configuration_contains_repository_errors_when_repository_assumed_


def test_guard_configuration_contains_repository_errors_when_repository_missing_from_config():
    flexmock(module).should_receive('repositories_match').replace_with(
        lambda first, second: first == second
    )

    with pytest.raises(ValueError):
        module.guard_configuration_contains_repository(
            repository='nope',

@@ -124,6 +160,10 @@ def test_guard_configuration_contains_repository_errors_when_repository_missing_


def test_guard_configuration_contains_repository_errors_when_repository_matches_config_twice():
    flexmock(module).should_receive('repositories_match').replace_with(
        lambda first, second: first == second
    )

    with pytest.raises(ValueError):
        module.guard_configuration_contains_repository(
            repository='repo',
@@ -4,6 +4,14 @@ from flexmock import flexmock
from borgmatic.hooks import dump as module


def test_make_database_dump_path_joins_arguments():
    assert module.make_database_dump_path('/tmp', 'super_databases') == '/tmp/super_databases'


def test_make_database_dump_path_defaults_without_source_directory():
    assert module.make_database_dump_path(None, 'super_databases') == '~/.borgmatic/super_databases'


def test_make_database_dump_filename_uses_name_and_hostname():
    flexmock(module.os.path).should_receive('expanduser').and_return('databases')
@@ -60,7 +60,7 @@ def test_ping_monitor_hits_ping_url_for_start_state():
     flexmock(module).should_receive('Forgetful_buffering_handler')
     ping_url = 'https://example.com'
     flexmock(module.requests).should_receive('post').with_args(
-        '{}/{}'.format(ping_url, 'start'), data=''
+        '{}/{}'.format(ping_url, 'start'), data=''.encode('utf-8')
     )

     module.ping_monitor(ping_url, 'config.yaml', state=module.monitor.State.START, dry_run=False)

@@ -68,19 +68,21 @@ def test_ping_monitor_hits_ping_url_for_start_state():

 def test_ping_monitor_hits_ping_url_for_finish_state():
     ping_url = 'https://example.com'
-    payload = flexmock()
+    payload = 'data'
     flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
-    flexmock(module.requests).should_receive('post').with_args(ping_url, data=payload)
+    flexmock(module.requests).should_receive('post').with_args(
+        ping_url, data=payload.encode('utf-8')
+    )

     module.ping_monitor(ping_url, 'config.yaml', state=module.monitor.State.FINISH, dry_run=False)


 def test_ping_monitor_hits_ping_url_for_fail_state():
     ping_url = 'https://example.com'
-    payload = flexmock()
+    payload = 'data'
     flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
     flexmock(module.requests).should_receive('post').with_args(
-        '{}/{}'.format(ping_url, 'fail'), data=payload
+        '{}/{}'.format(ping_url, 'fail'), data=payload.encode('utf-8')
     )

     module.ping_monitor(ping_url, 'config.yaml', state=module.monitor.State.FAIL, dry_run=False)

@@ -88,10 +90,10 @@ def test_ping_monitor_hits_ping_url_for_fail_state():

 def test_ping_monitor_with_ping_uuid_hits_corresponding_url():
     ping_uuid = 'abcd-efgh-ijkl-mnop'
-    payload = flexmock()
+    payload = 'data'
     flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
     flexmock(module.requests).should_receive('post').with_args(
-        'https://hc-ping.com/{}'.format(ping_uuid), data=payload
+        'https://hc-ping.com/{}'.format(ping_uuid), data=payload.encode('utf-8')
     )

     module.ping_monitor(ping_uuid, 'config.yaml', state=module.monitor.State.FINISH, dry_run=False)
@@ -8,6 +8,7 @@ from borgmatic.hooks import mysql as module
def test_dump_databases_runs_mysqldump_for_each_database():
    databases = [{'name': 'foo'}, {'name': 'bar'}]
    output_file = flexmock()
    flexmock(module).should_receive('make_dump_path').and_return('')
    flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
        'databases/localhost/foo'
    ).and_return('databases/localhost/bar')

@@ -21,23 +22,25 @@ def test_dump_databases_runs_mysqldump_for_each_database():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_with_dry_run_skips_mysqldump():
     databases = [{'name': 'foo'}, {'name': 'bar'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     ).and_return('databases/localhost/bar')
     flexmock(module.os).should_receive('makedirs').never()
     flexmock(module).should_receive('execute_command').never()

-    module.dump_databases(databases, 'test.yaml', dry_run=True)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=True)


 def test_dump_databases_runs_mysqldump_with_hostname_and_port():
     databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
     output_file = flexmock()
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/database.example.org/foo'
     )

@@ -61,12 +64,13 @@ def test_dump_databases_runs_mysqldump_with_hostname_and_port():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_mysqldump_with_username_and_password():
     databases = [{'name': 'foo', 'username': 'root', 'password': 'trustsome1'}]
     output_file = flexmock()
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -79,12 +83,13 @@ def test_dump_databases_runs_mysqldump_with_username_and_password():
         extra_environment={'MYSQL_PWD': 'trustsome1'},
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_mysqldump_with_options():
     databases = [{'name': 'foo', 'options': '--stuff=such'}]
     output_file = flexmock()
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -97,12 +102,13 @@ def test_dump_databases_runs_mysqldump_with_options():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_mysqldump_for_all_databases():
     databases = [{'name': 'all'}]
     output_file = flexmock()
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/all'
     )

@@ -115,30 +121,33 @@ def test_dump_databases_runs_mysqldump_for_all_databases():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_make_database_dump_patterns_converts_names_to_glob_paths():
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/*/foo'
     ).and_return('databases/*/bar')

-    assert module.make_database_dump_patterns(flexmock(), flexmock(), ('foo', 'bar')) == [
+    assert module.make_database_dump_patterns(flexmock(), flexmock(), {}, ('foo', 'bar')) == [
         'databases/*/foo',
         'databases/*/bar',
     ]


 def test_make_database_dump_patterns_treats_empty_names_as_matching_all_databases():
     flexmock(module).should_receive('make_dump_path').and_return('/dump/path')
     flexmock(module.dump).should_receive('make_database_dump_filename').with_args(
-        module.DUMP_PATH, '*', '*'
+        '/dump/path', '*', '*'
     ).and_return('databases/*/*')

-    assert module.make_database_dump_patterns(flexmock(), flexmock(), ()) == ['databases/*/*']
+    assert module.make_database_dump_patterns(flexmock(), flexmock(), {}, ()) == ['databases/*/*']


 def test_restore_database_dumps_restores_each_database():
     databases = [{'name': 'foo'}, {'name': 'bar'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     ).and_return('databases/localhost/bar')

@@ -153,11 +162,12 @@ def test_restore_database_dumps_restores_each_database():
         ('mysql', '--batch'), input_file=input_file, extra_environment=None
     ).once()

-    module.restore_database_dumps(databases, 'test.yaml', dry_run=False)
+    module.restore_database_dumps(databases, 'test.yaml', {}, dry_run=False)


 def test_restore_database_dumps_runs_mysql_with_hostname_and_port():
     databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -182,11 +192,12 @@ def test_restore_database_dumps_runs_mysql_with_hostname_and_port():
         extra_environment=None,
     ).once()

-    module.restore_database_dumps(databases, 'test.yaml', dry_run=False)
+    module.restore_database_dumps(databases, 'test.yaml', {}, dry_run=False)


 def test_restore_database_dumps_runs_mysql_with_username_and_password():
     databases = [{'name': 'foo', 'username': 'root', 'password': 'trustsome1'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -202,4 +213,4 @@ def test_restore_database_dumps_runs_mysql_with_username_and_password():
         extra_environment={'MYSQL_PWD': 'trustsome1'},
     ).once()

-    module.restore_database_dumps(databases, 'test.yaml', dry_run=False)
+    module.restore_database_dumps(databases, 'test.yaml', {}, dry_run=False)
@@ -5,6 +5,7 @@ from borgmatic.hooks import postgresql as module

def test_dump_databases_runs_pg_dump_for_each_database():
    databases = [{'name': 'foo'}, {'name': 'bar'}]
    flexmock(module).should_receive('make_dump_path').and_return('')
    flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
        'databases/localhost/foo'
    ).and_return('databases/localhost/bar')

@@ -25,22 +26,24 @@ def test_dump_databases_runs_pg_dump_for_each_database():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_with_dry_run_skips_pg_dump():
     databases = [{'name': 'foo'}, {'name': 'bar'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     ).and_return('databases/localhost/bar')
     flexmock(module.os).should_receive('makedirs').never()
     flexmock(module).should_receive('execute_command').never()

-    module.dump_databases(databases, 'test.yaml', dry_run=True)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=True)


 def test_dump_databases_runs_pg_dump_with_hostname_and_port():
     databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/database.example.org/foo'
     )

@@ -64,11 +67,12 @@ def test_dump_databases_runs_pg_dump_with_hostname_and_port():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_pg_dump_with_username_and_password():
     databases = [{'name': 'foo', 'username': 'postgres', 'password': 'trustsome1'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -90,11 +94,12 @@ def test_dump_databases_runs_pg_dump_with_username_and_password():
         extra_environment={'PGPASSWORD': 'trustsome1'},
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_pg_dump_with_format():
     databases = [{'name': 'foo', 'format': 'tar'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -114,11 +119,12 @@ def test_dump_databases_runs_pg_dump_with_format():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_pg_dump_with_options():
     databases = [{'name': 'foo', 'options': '--stuff=such'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -139,11 +145,12 @@ def test_dump_databases_runs_pg_dump_with_options():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_dump_databases_runs_pg_dumpall_for_all_databases():
     databases = [{'name': 'all'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/all'
     )

@@ -154,30 +161,33 @@ def test_dump_databases_runs_pg_dumpall_for_all_databases():
         extra_environment=None,
     ).once()

-    module.dump_databases(databases, 'test.yaml', dry_run=False)
+    module.dump_databases(databases, 'test.yaml', {}, dry_run=False)


 def test_make_database_dump_patterns_converts_names_to_glob_paths():
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/*/foo'
     ).and_return('databases/*/bar')

-    assert module.make_database_dump_patterns(flexmock(), flexmock(), ('foo', 'bar')) == [
+    assert module.make_database_dump_patterns(flexmock(), flexmock(), {}, ('foo', 'bar')) == [
         'databases/*/foo',
         'databases/*/bar',
     ]


 def test_make_database_dump_patterns_treats_empty_names_as_matching_all_databases():
     flexmock(module).should_receive('make_dump_path').and_return('/dump/path')
     flexmock(module.dump).should_receive('make_database_dump_filename').with_args(
-        module.DUMP_PATH, '*', '*'
+        '/dump/path', '*', '*'
     ).and_return('databases/*/*')

-    assert module.make_database_dump_patterns(flexmock(), flexmock(), ()) == ['databases/*/*']
+    assert module.make_database_dump_patterns(flexmock(), flexmock(), {}, ()) == ['databases/*/*']


 def test_restore_database_dumps_restores_each_database():
     databases = [{'name': 'foo'}, {'name': 'bar'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     ).and_return('databases/localhost/bar')

@@ -201,11 +211,12 @@ def test_restore_database_dumps_restores_each_database():
         extra_environment=None,
     ).once()

-    module.restore_database_dumps(databases, 'test.yaml', dry_run=False)
+    module.restore_database_dumps(databases, 'test.yaml', {}, dry_run=False)


 def test_restore_database_dumps_runs_pg_restore_with_hostname_and_port():
     databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -244,11 +255,12 @@ def test_restore_database_dumps_runs_pg_restore_with_hostname_and_port():
         extra_environment=None,
     ).once()

-    module.restore_database_dumps(databases, 'test.yaml', dry_run=False)
+    module.restore_database_dumps(databases, 'test.yaml', {}, dry_run=False)


 def test_restore_database_dumps_runs_pg_restore_with_username_and_password():
     databases = [{'name': 'foo', 'username': 'postgres', 'password': 'trustsome1'}]
     flexmock(module).should_receive('make_dump_path').and_return('')
     flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
         'databases/localhost/foo'
     )

@@ -283,4 +295,4 @@ def test_restore_database_dumps_runs_pg_restore_with_username_and_password():
         extra_environment={'PGPASSWORD': 'trustsome1'},
     ).once()

-    module.restore_database_dumps(databases, 'test.yaml', dry_run=False)
+    module.restore_database_dumps(databases, 'test.yaml', {}, dry_run=False)
@@ -4,44 +4,28 @@ from flexmock import flexmock
 from borgmatic import execute as module


-def test_exit_code_indicates_error_with_borg_error_is_true():
-    assert module.exit_code_indicates_error(('/usr/bin/borg1', 'init'), 2)
-
-
-def test_exit_code_indicates_error_with_borg_warning_is_false():
-    assert not module.exit_code_indicates_error(('/usr/bin/borg1', 'init'), 1)
-
-
-def test_exit_code_indicates_error_with_borg_success_is_false():
-    assert not module.exit_code_indicates_error(('/usr/bin/borg1', 'init'), 0)
-
-
-def test_exit_code_indicates_error_with_borg_error_and_error_on_warnings_is_true():
-    assert module.exit_code_indicates_error(('/usr/bin/borg1', 'init'), 2, error_on_warnings=True)
-
-
-def test_exit_code_indicates_error_with_borg_warning_and_error_on_warnings_is_true():
-    assert module.exit_code_indicates_error(('/usr/bin/borg1', 'init'), 1, error_on_warnings=True)
-
-
-def test_exit_code_indicates_error_with_borg_success_and_error_on_warnings_is_false():
-    assert not module.exit_code_indicates_error(
-        ('/usr/bin/borg1', 'init'), 0, error_on_warnings=True
-    )
-
-
-def test_exit_code_indicates_error_with_non_borg_error_is_true():
-    assert module.exit_code_indicates_error(('/usr/bin/command',), 2)
-
-
-def test_exit_code_indicates_error_with_non_borg_warning_is_true():
-    assert module.exit_code_indicates_error(('/usr/bin/command',), 1)
-
-
-def test_exit_code_indicates_error_with_non_borg_success_is_false():
-    assert not module.exit_code_indicates_error(('/usr/bin/command',), 0)
+@pytest.mark.parametrize(
+    'exit_code,error_on_warnings,expected_result',
+    (
+        (2, True, True),
+        (2, False, True),
+        (1, True, True),
+        (1, False, False),
+        (0, True, False),
+        (0, False, False),
+    ),
+)
+def test_exit_code_indicates_error_respects_exit_code_and_error_on_warnings(
+    exit_code, error_on_warnings, expected_result
+):
+    assert (
+        module.exit_code_indicates_error(
+            ('command',), exit_code, error_on_warnings=error_on_warnings
+        )
+        is expected_result
+    )


 def test_execute_command_calls_full_command():
     full_command = ['foo', 'bar']
     flexmock(module.os, environ={'a': 'b'})
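The parametrized table above fully determines `exit_code_indicates_error`'s truth table. One implementation that satisfies it — a hypothetical sketch, not necessarily borgmatic's actual function body:

```python
def exit_code_indicates_error(command, exit_code, error_on_warnings=False):
    # Exit codes of 2 and above are always errors; exit code 1 (a Borg
    # warning) counts as an error only when error_on_warnings is set.
    # The command argument is unused in this sketch.
    if error_on_warnings:
        return exit_code != 0
    return exit_code >= 2


assert exit_code_indicates_error(('command',), 2) is True
assert exit_code_indicates_error(('command',), 1) is False
```

Collapsing the command-specific tests into one parametrized test reflects that the new logic no longer distinguishes borg from non-borg commands.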
|
@ -22,14 +22,14 @@ def test_to_bool_passes_none_through():
|
|||
|
||||
def test_interactive_console_false_when_not_isatty(capsys):
|
||||
with capsys.disabled():
|
||||
flexmock(module.sys.stdout).should_receive('isatty').and_return(False)
|
||||
flexmock(module.sys.stderr).should_receive('isatty').and_return(False)
|
||||
|
||||
assert module.interactive_console() is False
|
||||
|
||||
|
||||
def test_interactive_console_false_when_TERM_is_dumb(capsys):
|
||||
with capsys.disabled():
|
||||
flexmock(module.sys.stdout).should_receive('isatty').and_return(True)
|
||||
flexmock(module.sys.stderr).should_receive('isatty').and_return(True)
|
||||
flexmock(module.os.environ).should_receive('get').with_args('TERM').and_return('dumb')
|
||||
|
||||
assert module.interactive_console() is False
|
||||
|
@ -37,7 +37,7 @@ def test_interactive_console_false_when_TERM_is_dumb(capsys):
|
|||
|
||||
def test_interactive_console_true_when_isatty_and_TERM_is_not_dumb(capsys):
|
||||
with capsys.disabled():
|
||||
flexmock(module.sys.stdout).should_receive('isatty').and_return(True)
|
||||
flexmock(module.sys.stderr).should_receive('isatty').and_return(True)
|
||||
flexmock(module.os.environ).should_receive('get').with_args('TERM').and_return('smart')
|
||||
|
||||
assert module.interactive_console() is True
|
||||
|
@ -113,6 +113,17 @@ def test_should_do_markup_prefers_PY_COLORS_to_interactive_console_value():
|
|||
assert module.should_do_markup(no_color=False, configs={}) is True
|
||||
|
||||
|
||||
def test_multi_stream_handler_logs_to_handler_for_log_level():
|
||||
error_handler = flexmock()
|
||||
error_handler.should_receive('emit').once()
|
||||
info_handler = flexmock()
|
||||
|
||||
multi_handler = module.Multi_stream_handler(
|
||||
{module.logging.ERROR: error_handler, module.logging.INFO: info_handler}
|
||||
)
|
||||
multi_handler.emit(flexmock(levelno=module.logging.ERROR))
|
||||
|
||||
|
||||
def test_console_color_formatter_format_includes_log_message():
|
||||
plain_message = 'uh oh'
|
||||
record = flexmock(levelno=logging.CRITICAL, msg=plain_message)
|
||||
|
@ -132,6 +143,9 @@ def test_color_text_without_color_does_not_raise():
|
|||
|
||||
|
||||
def test_configure_logging_probes_for_log_socket_on_linux():
|
||||
flexmock(module).should_receive('Multi_stream_handler').and_return(
|
||||
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
|
||||
)
|
||||
flexmock(module).should_receive('Console_color_formatter')
|
||||
flexmock(module).should_receive('interactive_console').and_return(False)
|
||||
    flexmock(module.logging).should_receive('basicConfig').with_args(

@@ -147,6 +161,9 @@ def test_configure_logging_probes_for_log_socket_on_linux():

def test_configure_logging_probes_for_log_socket_on_macos():
    flexmock(module).should_receive('Multi_stream_handler').and_return(
        flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
    )
    flexmock(module).should_receive('Console_color_formatter')
    flexmock(module).should_receive('interactive_console').and_return(False)
    flexmock(module.logging).should_receive('basicConfig').with_args(

@@ -163,6 +180,9 @@ def test_configure_logging_probes_for_log_socket_on_macos():

def test_configure_logging_sets_global_logger_to_most_verbose_log_level():
    flexmock(module).should_receive('Multi_stream_handler').and_return(
        flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
    )
    flexmock(module).should_receive('Console_color_formatter')
    flexmock(module.logging).should_receive('basicConfig').with_args(
        level=logging.DEBUG, handlers=tuple

@@ -173,6 +193,9 @@ def test_configure_logging_sets_global_logger_to_most_verbose_log_level():

def test_configure_logging_skips_syslog_if_not_found():
    flexmock(module).should_receive('Multi_stream_handler').and_return(
        flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
    )
    flexmock(module).should_receive('Console_color_formatter')
    flexmock(module.logging).should_receive('basicConfig').with_args(
        level=logging.INFO, handlers=tuple

@@ -184,6 +207,9 @@ def test_configure_logging_skips_syslog_if_not_found():

def test_configure_logging_skips_syslog_if_interactive_console():
    flexmock(module).should_receive('Multi_stream_handler').and_return(
        flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
    )
    flexmock(module).should_receive('Console_color_formatter')
    flexmock(module).should_receive('interactive_console').and_return(True)
    flexmock(module.logging).should_receive('basicConfig').with_args(

@@ -196,6 +222,10 @@ def test_configure_logging_skips_syslog_if_interactive_console():

def test_configure_logging_to_logfile_instead_of_syslog():
    flexmock(module).should_receive('Multi_stream_handler').and_return(
        flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
    )

    # syslog skipped in non-interactive console if --log-file argument provided
    flexmock(module).should_receive('interactive_console').and_return(False)
    flexmock(module.logging).should_receive('basicConfig').with_args(

@@ -214,6 +244,10 @@ def test_configure_logging_to_logfile_instead_of_syslog():

def test_configure_logging_skips_logfile_if_argument_is_none():
    flexmock(module).should_receive('Multi_stream_handler').and_return(
        flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
    )

    # No WatchedFileHandler added if argument --log-file is None
    flexmock(module).should_receive('interactive_console').and_return(False)
    flexmock(module.logging).should_receive('basicConfig').with_args(
6
tox.ini
@@ -2,7 +2,7 @@
envlist = py35,py36,py37,py38
skip_missing_interpreters = True
skipsdist = True
-minversion = 3.14.0
+minversion = 3.14.1

[testenv]
usedevelop = True

@@ -10,8 +10,7 @@ deps = -rtest_requirements.txt
whitelist_externals =
    find
    sh
commands_pre =
    find {toxinidir} -type f -not -path '{toxinidir}/.tox/*' -path '*/__pycache__/*' -name '*.py[c|o]' -delete
passenv = COVERAGE_FILE
commands =
    pytest {posargs}
    py36,py37,py38: black --check .

@@ -28,6 +27,7 @@ commands =
[testenv:end-to-end]
deps = -rtest_requirements.txt
passenv = COVERAGE_FILE
commands =
    pytest {posargs} --no-cov tests/end-to-end
Forgive me if this is a duplicate comment. Is there a reason this is non-blocking? Seems like we'd want it to synchronously lock to be absolutely sure that no other borgmatic is running.
Non-blocking here means that we don't want borgmatic to block and wait for the lock to be freed if the file is currently locked. I think you can do blocking with a timeout, but that doesn't seem that beneficial to me, since borgmatic is typically a long-running operation, so the lock is unlikely to be freed within a short amount of time. I think Borg uses a timeout, so I guess we could replicate that behavior.
Oh, gotcha. I originally misunderstood what LOCK_NB did. I see now that you're using it here to error immediately if a lock is in place, which makes sense. Carry on!
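The non-blocking locking behavior discussed above can be sketched with Python's standard `fcntl` module. This is a minimal illustration of the LOCK_NB idea, not the PR's actual code; the `acquire_lock` helper name is made up for the example:

```python
import fcntl
import sys


def acquire_lock(lock_path):
    """Try to take an exclusive lock on the given file, failing immediately
    (instead of blocking) if another process already holds it."""
    lock_file = open(lock_path, 'w')
    try:
        # LOCK_NB makes flock() raise OSError right away when the lock is
        # already held, rather than waiting for it to be released.
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        print('Another instance appears to be running.', file=sys.stderr)
        sys.exit(1)
    # The lock is held for as long as this file object stays open;
    # closing it (or process exit) releases the lock.
    return lock_file
```

Omitting LOCK_NB would instead block the second invocation until the first one finishes, which, as noted in the discussion, is rarely useful for a long-running backup.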