Compare commits

...

35 Commits

Author SHA1 Message Date
Dan Helfman 8cec7c74d8 Add "--strip-components all" on the "extract" action to remove leading path components (#647). 2023-03-09 10:09:16 -08:00
Dan Helfman d3086788eb Document how to list database dumps in an archive. 2023-03-08 16:09:41 -08:00
Dan Helfman 8d860ea02c Enhanced docs with info on fetching mysql database size (Merge pull request #46 from Jelle-SamsonIT/patch-3) 2023-03-08 15:52:28 -08:00
Dan Helfman b343363bb8 Change the default action order to: "create", "prune", "compact", "check" (#304). 2023-03-08 14:05:06 -08:00
Dan Helfman 9db31bd1e9 Run any command-line actions in the order specified instead of using a fixed ordering (#304). 2023-03-08 13:19:41 -08:00
Dan Helfman d88bcc8be9 Add Healthchecks "log" state feature to NEWS. 2023-03-07 15:45:23 -08:00
Dan Helfman 332f7c4bb6 Add support for healthchecks "log" feature (#628). (Reviewed-on: borgmatic-collective/borgmatic#645) 2023-03-07 22:21:30 +00:00
Dan Helfman 5d19d86e4a Add flake8-quotes to complain about incorrect quoting so I don't have to! 2023-03-07 14:08:35 -08:00
Soumik Dutta 044ae7869a fix tests (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-08 03:30:12 +05:30
Dan Helfman 62ae82f2c0 Mention searching for files in the extract a backup guide. 2023-03-06 22:59:34 -08:00
Dan Helfman 66194b7304 Update dates in documentation examples. 2023-03-06 22:41:43 -08:00
Soumik Dutta 98e429594e added tests to make sure unsupported log states are detected (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-06 20:31:00 +05:30
Soumik Dutta 4fcfddbe08 return early if unsupported state is passed (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-06 19:58:57 +05:30
Soumik Dutta f442aeae9c fix logs_monitor_start_error() (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-06 05:21:56 +05:30
Soumik Dutta e211863cba update test_borgmatic.py (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-06 05:12:24 +05:30
Soumik Dutta 45256ae33f add test for healthchecks (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-06 03:38:08 +05:30
Soumik Dutta 1573d68fe2 update schema.yaml description; also add monitor.State.LOG to cronitor (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-05 21:57:13 +05:30
Soumik Dutta 69f6695253 Add support for healthchecks "log" feature #628 (Signed-off-by: Soumik Dutta <shalearkane@gmail.com>) 2023-03-05 19:27:32 +05:30
Dan Helfman a7c055264d Fix incorrect documentation TOC background by removing extra dark mode styles (Merge pull request #52 from diivi/fix/remove-special-dark-mode-attributes) 2023-03-04 16:18:04 -08:00
Divyansh Singh db18364a73 fix: remove extra dark mode styles 2023-03-05 03:16:46 +05:30
Dan Helfman 22498ebd4c In the documentation, mention what version of borgmatic introduced SQLite support. 2023-03-04 10:50:28 -08:00
Dan Helfman e1f02d9fa5 Add SQLite feature to NEWS and also integrations. 2023-03-04 09:59:16 -08:00
Dan Helfman 9ec220c600 Add SQLite database dump/restore hook (#295). (feat: add dump-restore support for sqlite databases) 2023-03-04 09:47:21 -08:00
Divyansh Singh cf0275a3ed remove test path 2023-03-04 23:00:57 +05:30
Divyansh Singh c71eb60cd2 mock os.remove instead of actually removing a file 2023-03-04 13:08:30 +05:30
Divyansh Singh 675e54ba9f use os.remove and improve tests 2023-03-04 12:43:07 +05:30
Divyansh Singh 1793ad74bd add sqlite for e2e tests 2023-03-04 02:41:14 +05:30
Divyansh Singh 767a7d900b e2e tests schema update 2023-03-04 01:29:01 +05:30
Divyansh Singh 903507bd03 code review 2023-03-04 01:27:07 +05:30
Dan Helfman b6cf7d2adc Bump version for release. 2023-03-02 15:34:22 -08:00
Dan Helfman a071e02d20 With the "create" action and the "--list" ("--files") flag, only show excluded files at verbosity 2 (#620). 2023-03-02 15:33:42 -08:00
Divyansh Singh 3aa88085ed formatting fix 2023-03-03 00:01:52 +05:30
Divyansh Singh af1cc27988 feat: add dump-restore support for sqlite databases 2023-03-02 23:55:16 +05:30
Jelle @ Samson-IT 3720f22234 reworded and added 'all' caveat 2022-07-13 22:03:51 +02:00
Jelle @ Samson-IT 1fdec480d6 Added some info about fetching mysql database size 2022-07-13 13:29:45 +02:00
46 changed files with 988 additions and 348 deletions

1
.flake8 Normal file
View File

@ -0,0 +1 @@
select = Q0

15
NEWS
View File

@ -1,4 +1,17 @@
1.7.8.dev0
1.7.9.dev0
* #295: Add a SQLite database dump/restore hook.
* #304: Change the default action order when no actions are specified on the command-line to:
"create", "prune", "compact", "check". If you'd like to retain the old ordering ("prune" and
"compact" first), then specify actions explicitly on the command-line.
* #304: Run any command-line actions in the order specified instead of using a fixed ordering.
* #628: Add a Healthchecks "log" state to send borgmatic logs to Healthchecks without signalling
success or failure.
* #647: Add "--strip-components all" feature on the "extract" action to remove leading path
components of files you extract. Must be used with the "--path" flag.
1.7.8
* #620: With the "create" action and the "--list" ("--files") flag, only show excluded files at
verbosity 2.
* #621: Add optional authentication to the ntfy monitoring hook.
* With the "create" action, only one of "--list" ("--files") and "--progress" flags can be used.
This lines up with the new behavior in Borg 2.0.0b5.
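To illustrate the #647 entry above: "--strip-components all" only makes sense together with "--path", since the number of components to strip is computed from the paths you ask for. A minimal sketch (the archive name and path are hypothetical):

```bash
# Extract just one path from the most recent archive, stripping its leading components.
borgmatic extract --archive latest --path var/www/html --strip-components all
```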

View File

@ -67,6 +67,7 @@ borgmatic is powered by [Borg Backup](https://www.borgbackup.org/).
<a href="https://www.mysql.com/"><img src="docs/static/mysql.png" alt="MySQL" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
<a href="https://mariadb.com/"><img src="docs/static/mariadb.png" alt="MariaDB" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
<a href="https://www.mongodb.com/"><img src="docs/static/mongodb.png" alt="MongoDB" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
<a href="https://sqlite.org/"><img src="docs/static/sqlite.png" alt="SQLite" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
<a href="https://healthchecks.io/"><img src="docs/static/healthchecks.png" alt="Healthchecks" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
<a href="https://cronitor.io/"><img src="docs/static/cronitor.png" alt="Cronitor" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
<a href="https://cronhub.io/"><img src="docs/static/cronhub.png" alt="Cronhub" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;

View File

@ -139,7 +139,7 @@ def filter_checks_on_frequency(
if datetime.datetime.now() < check_time + frequency_delta:
remaining = check_time + frequency_delta - datetime.datetime.now()
logger.info(
f"Skipping {check} check due to configured frequency; {remaining} until next check"
f'Skipping {check} check due to configured frequency; {remaining} until next check'
)
filtered_checks.remove(check)

View File

@ -196,6 +196,27 @@ def make_exclude_flags(location_config, exclude_filename=None):
)
def make_list_filter_flags(local_borg_version, dry_run):
'''
Given the local Borg version and whether this is a dry run, return the corresponding flags for
passing to "--list --filter". The general idea is that excludes are shown for a dry run or when
the verbosity is debug.
'''
base_flags = 'AME'
show_excludes = logger.isEnabledFor(logging.DEBUG)
if feature.available(feature.Feature.EXCLUDED_FILES_MINUS, local_borg_version):
if show_excludes or dry_run:
return f'{base_flags}+-'
else:
return base_flags
if show_excludes:
return f'{base_flags}x-'
else:
return f'{base_flags}-'
DEFAULT_ARCHIVE_NAME_FORMAT = '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}'
@ -343,6 +364,7 @@ def create_archive(
upload_rate_limit = storage_config.get('upload_rate_limit', None)
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
list_filter_flags = make_list_filter_flags(local_borg_version, dry_run)
files_cache = location_config.get('files_cache')
archive_name_format = storage_config.get('archive_name_format', DEFAULT_ARCHIVE_NAME_FORMAT)
extra_borg_options = storage_config.get('extra_borg_options', {}).get('create', '')
@ -401,7 +423,11 @@ def create_archive(
+ (('--remote-path', remote_path) if remote_path else ())
+ (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ (('--list', '--filter', 'AMEx+-') if list_files and not json and not progress else ())
+ (
('--list', '--filter', list_filter_flags)
if list_files and not json and not progress
else ()
)
+ (('--dry-run',) if dry_run else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ flags.make_repository_archive_flags(repository, archive_name_format, local_borg_version)
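Related to the #620 behavior implemented above, per-file "--list" ("--files") output only includes excluded files when borgmatic runs at debug verbosity. A hedged usage sketch (the verbosity level is an assumption based on the NEWS entry):

```bash
# Show per-file status, including excluded files, by running at verbosity 2 (debug).
borgmatic create --files --verbosity 2
```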

View File

@ -87,6 +87,13 @@ def extract_archive(
else:
numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_ids') else ()
if strip_components == 'all':
if not paths:
raise ValueError('The --strip-components flag with "all" requires at least one --path')
# Calculate the maximum number of leading path components of the given paths.
strip_components = max(0, *(len(path.split(os.path.sep)) - 1 for path in paths))
full_command = (
(local_path, 'extract')
+ (('--remote-path', remote_path) if remote_path else ())

View File

@ -14,6 +14,7 @@ class Feature(Enum):
RLIST = 8
RINFO = 9
MATCH_ARCHIVES = 10
EXCLUDED_FILES_MINUS = 11
FEATURE_TO_MINIMUM_BORG_VERSION = {
@ -27,6 +28,7 @@ FEATURE_TO_MINIMUM_BORG_VERSION = {
Feature.RLIST: parse_version('2.0.0a2'), # borg rlist
Feature.RINFO: parse_version('2.0.0a2'), # borg rinfo
Feature.MATCH_ARCHIVES: parse_version('2.0.0b3'), # borg --match-archives
Feature.EXCLUDED_FILES_MINUS: parse_version('2.0.0b5'), # --list --filter uses "-" for excludes
}

View File

@ -17,7 +17,7 @@ def resolve_archive_name(
Raise ValueError if "latest" is given but there are no archives in the repository.
'''
if archive != "latest":
if archive != 'latest':
return archive
lock_wait = storage_config.get('lock_wait', None)

View File

@ -46,11 +46,12 @@ def parse_subparser_arguments(unparsed_arguments, subparsers):
if 'borg' in unparsed_arguments:
subparsers = {'borg': subparsers['borg']}
for subparser_name, subparser in subparsers.items():
if subparser_name not in remaining_arguments:
continue
for argument in remaining_arguments:
canonical_name = alias_to_subparser_name.get(argument, argument)
subparser = subparsers.get(canonical_name)
canonical_name = alias_to_subparser_name.get(subparser_name, subparser_name)
if not subparser:
continue
# If a parsed value happens to be the same as the name of a subparser, remove it from the
# remaining arguments. This prevents, for instance, "check --only extract" from triggering
@ -67,9 +68,9 @@ def parse_subparser_arguments(unparsed_arguments, subparsers):
arguments[canonical_name] = parsed
# If no actions are explicitly requested, assume defaults: prune, compact, create, and check.
# If no actions are explicitly requested, assume defaults.
if not arguments and '--help' not in unparsed_arguments and '-h' not in unparsed_arguments:
for subparser_name in ('prune', 'compact', 'create', 'check'):
for subparser_name in ('create', 'prune', 'compact', 'check'):
subparser = subparsers[subparser_name]
parsed, unused_remaining = subparser.parse_known_args(unparsed_arguments)
arguments[subparser_name] = parsed
@ -215,7 +216,7 @@ def make_parsers():
top_level_parser = ArgumentParser(
description='''
Simple, configuration-driven backup software for servers and workstations. If none of
the action options are given, then borgmatic defaults to: prune, compact, create, and
the action options are given, then borgmatic defaults to: create, prune, compact, and
check.
''',
parents=[global_parser],
@ -224,7 +225,7 @@ def make_parsers():
subparsers = top_level_parser.add_subparsers(
title='actions',
metavar='',
help='Specify zero or more actions. Defaults to prune, compact, create, and check. Use --help with action for details:',
help='Specify zero or more actions. Defaults to create, prune, compact, and check. Use --help with action for details:',
)
rcreate_parser = subparsers.add_parser(
'rcreate',
@ -475,10 +476,9 @@ def make_parsers():
)
extract_group.add_argument(
'--strip-components',
type=int,
type=lambda number: number if number == 'all' else int(number),
metavar='NUMBER',
dest='strip_components',
help='Number of leading path components to remove from each extracted path. Skip paths with fewer elements',
help='Number of leading path components to remove from each extracted path or "all" to strip all leading path components. Skip paths with fewer elements',
)
extract_group.add_argument(
'--progress',
@ -611,7 +611,7 @@ def make_parsers():
metavar='NAME',
nargs='+',
dest='databases',
help='Names of databases to restore from archive, defaults to all databases. Note that any databases to restore must be defined in borgmatic\'s configuration',
help="Names of databases to restore from archive, defaults to all databases. Note that any databases to restore must be defined in borgmatic's configuration",
)
restore_group.add_argument(
'-h', '--help', action='help', help='Show this help message and exit'
@ -805,7 +805,7 @@ def make_parsers():
'borg',
aliases=SUBPARSER_ALIASES['borg'],
help='Run an arbitrary Borg command',
description='Run an arbitrary Borg command based on borgmatic\'s configuration',
description="Run an arbitrary Borg command based on borgmatic's configuration",
add_help=False,
)
borg_group = borg_parser.add_argument_group('borg arguments')

View File

@ -44,8 +44,8 @@ LEGACY_CONFIG_PATH = '/etc/borgmatic/config'
def run_configuration(config_filename, config, arguments):
'''
Given a config filename, the corresponding parsed config dict, and command-line arguments as a
dict from subparser name to a namespace of parsed arguments, execute the defined prune, compact,
create, check, and/or other actions.
dict from subparser name to a namespace of parsed arguments, execute the defined create, prune,
compact, check, and/or other actions.
Yield a combination of:
@ -64,7 +64,7 @@ def run_configuration(config_filename, config, arguments):
retry_wait = storage.get('retry_wait', 0)
encountered_error = None
error_repository = ''
using_primary_action = {'prune', 'compact', 'create', 'check'}.intersection(arguments)
using_primary_action = {'create', 'prune', 'compact', 'check'}.intersection(arguments)
monitoring_log_level = verbosity_to_log_level(global_arguments.monitoring_verbosity)
try:
@ -152,6 +152,25 @@ def run_configuration(config_filename, config, arguments):
encountered_error = error
error_repository = repository_path
try:
if using_primary_action:
# send logs irrespective of error
dispatch.call_hooks(
'ping_monitor',
hooks,
config_filename,
monitor.MONITOR_HOOK_NAMES,
monitor.State.LOG,
monitoring_log_level,
global_arguments.dry_run,
)
except (OSError, CalledProcessError) as error:
if command.considered_soft_failure(config_filename, error):
return
encountered_error = error
yield from log_error_records('{}: Error pinging monitor'.format(config_filename), error)
if not encountered_error:
try:
if using_primary_action:
@ -262,155 +281,162 @@ def run_actions(
**hook_context,
)
if 'rcreate' in arguments:
borgmatic.actions.rcreate.run_rcreate(
repository,
storage,
local_borg_version,
arguments['rcreate'],
global_arguments,
local_path,
remote_path,
)
if 'transfer' in arguments:
borgmatic.actions.transfer.run_transfer(
repository,
storage,
local_borg_version,
arguments['transfer'],
global_arguments,
local_path,
remote_path,
)
if 'prune' in arguments:
borgmatic.actions.prune.run_prune(
config_filename,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
arguments['prune'],
global_arguments,
dry_run_label,
local_path,
remote_path,
)
if 'compact' in arguments:
borgmatic.actions.compact.run_compact(
config_filename,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
arguments['compact'],
global_arguments,
dry_run_label,
local_path,
remote_path,
)
if 'create' in arguments:
yield from borgmatic.actions.create.run_create(
config_filename,
repository,
location,
storage,
hooks,
hook_context,
local_borg_version,
arguments['create'],
global_arguments,
dry_run_label,
local_path,
remote_path,
)
if 'check' in arguments and checks.repository_enabled_for_checks(repository, consistency):
borgmatic.actions.check.run_check(
config_filename,
repository,
location,
storage,
consistency,
hooks,
hook_context,
local_borg_version,
arguments['check'],
global_arguments,
local_path,
remote_path,
)
if 'extract' in arguments:
borgmatic.actions.extract.run_extract(
config_filename,
repository,
location,
storage,
hooks,
hook_context,
local_borg_version,
arguments['extract'],
global_arguments,
local_path,
remote_path,
)
if 'export-tar' in arguments:
borgmatic.actions.export_tar.run_export_tar(
repository,
storage,
local_borg_version,
arguments['export-tar'],
global_arguments,
local_path,
remote_path,
)
if 'mount' in arguments:
borgmatic.actions.mount.run_mount(
repository, storage, local_borg_version, arguments['mount'], local_path, remote_path,
)
if 'restore' in arguments:
borgmatic.actions.restore.run_restore(
repository,
location,
storage,
hooks,
local_borg_version,
arguments['restore'],
global_arguments,
local_path,
remote_path,
)
if 'rlist' in arguments:
yield from borgmatic.actions.rlist.run_rlist(
repository, storage, local_borg_version, arguments['rlist'], local_path, remote_path,
)
if 'list' in arguments:
yield from borgmatic.actions.list.run_list(
repository, storage, local_borg_version, arguments['list'], local_path, remote_path,
)
if 'rinfo' in arguments:
yield from borgmatic.actions.rinfo.run_rinfo(
repository, storage, local_borg_version, arguments['rinfo'], local_path, remote_path,
)
if 'info' in arguments:
yield from borgmatic.actions.info.run_info(
repository, storage, local_borg_version, arguments['info'], local_path, remote_path,
)
if 'break-lock' in arguments:
borgmatic.actions.break_lock.run_break_lock(
repository,
storage,
local_borg_version,
arguments['break-lock'],
local_path,
remote_path,
)
if 'borg' in arguments:
borgmatic.actions.borg.run_borg(
repository, storage, local_borg_version, arguments['borg'], local_path, remote_path,
)
for (action_name, action_arguments) in arguments.items():
if action_name == 'rcreate':
borgmatic.actions.rcreate.run_rcreate(
repository,
storage,
local_borg_version,
action_arguments,
global_arguments,
local_path,
remote_path,
)
elif action_name == 'transfer':
borgmatic.actions.transfer.run_transfer(
repository,
storage,
local_borg_version,
action_arguments,
global_arguments,
local_path,
remote_path,
)
elif action_name == 'create':
yield from borgmatic.actions.create.run_create(
config_filename,
repository,
location,
storage,
hooks,
hook_context,
local_borg_version,
action_arguments,
global_arguments,
dry_run_label,
local_path,
remote_path,
)
elif action_name == 'prune':
borgmatic.actions.prune.run_prune(
config_filename,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
action_arguments,
global_arguments,
dry_run_label,
local_path,
remote_path,
)
elif action_name == 'compact':
borgmatic.actions.compact.run_compact(
config_filename,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
action_arguments,
global_arguments,
dry_run_label,
local_path,
remote_path,
)
elif action_name == 'check':
if checks.repository_enabled_for_checks(repository, consistency):
borgmatic.actions.check.run_check(
config_filename,
repository,
location,
storage,
consistency,
hooks,
hook_context,
local_borg_version,
action_arguments,
global_arguments,
local_path,
remote_path,
)
elif action_name == 'extract':
borgmatic.actions.extract.run_extract(
config_filename,
repository,
location,
storage,
hooks,
hook_context,
local_borg_version,
action_arguments,
global_arguments,
local_path,
remote_path,
)
elif action_name == 'export-tar':
borgmatic.actions.export_tar.run_export_tar(
repository,
storage,
local_borg_version,
action_arguments,
global_arguments,
local_path,
remote_path,
)
elif action_name == 'mount':
borgmatic.actions.mount.run_mount(
repository,
storage,
local_borg_version,
arguments['mount'],
local_path,
remote_path,
)
elif action_name == 'restore':
borgmatic.actions.restore.run_restore(
repository,
location,
storage,
hooks,
local_borg_version,
action_arguments,
global_arguments,
local_path,
remote_path,
)
elif action_name == 'rlist':
yield from borgmatic.actions.rlist.run_rlist(
repository, storage, local_borg_version, action_arguments, local_path, remote_path,
)
elif action_name == 'list':
yield from borgmatic.actions.list.run_list(
repository, storage, local_borg_version, action_arguments, local_path, remote_path,
)
elif action_name == 'rinfo':
yield from borgmatic.actions.rinfo.run_rinfo(
repository, storage, local_borg_version, action_arguments, local_path, remote_path,
)
elif action_name == 'info':
yield from borgmatic.actions.info.run_info(
repository, storage, local_borg_version, action_arguments, local_path, remote_path,
)
elif action_name == 'break-lock':
borgmatic.actions.break_lock.run_break_lock(
repository,
storage,
local_borg_version,
arguments['break-lock'],
local_path,
remote_path,
)
elif action_name == 'borg':
borgmatic.actions.borg.run_borg(
repository, storage, local_borg_version, action_arguments, local_path, remote_path,
)
command.execute_hook(
hooks.get('after_actions'),

View File

@ -369,6 +369,11 @@ properties:
description: |
Extra command-line options to pass to "borg init".
example: "--extra-option"
create:
type: string
description: |
Extra command-line options to pass to "borg create".
example: "--extra-option"
prune:
type: string
description: |
@ -379,11 +384,6 @@ properties:
description: |
Extra command-line options to pass to "borg compact".
example: "--extra-option"
create:
type: string
description: |
Extra command-line options to pass to "borg create".
example: "--extra-option"
check:
type: string
description: |
@ -663,11 +663,11 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
when an exception occurs during a "prune", "compact",
"create", or "check" action or an associated before/after
when an exception occurs during a "create", "prune",
"compact", or "check" action or an associated before/after
hook.
example:
- echo "Error during prune/compact/create/check."
- echo "Error during create/prune/compact/check."
before_everything:
type: array
items:
@ -941,6 +941,31 @@ properties:
mysqldump/mysql commands (from either MySQL or MariaDB). See
https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html or
https://mariadb.com/kb/en/library/mysqldump/ for details.
sqlite_databases:
type: array
items:
type: object
required: ['path','name']
additionalProperties: false
properties:
name:
type: string
description: |
This is used to tag the database dump file
with a name. It is not the path to the database
file itself. The name "all" has no special
meaning for SQLite databases.
example: users
path:
type: string
description: |
Path to the SQLite database file to dump. If
relative, it is relative to the current working
directory. Note that using this
database hook implicitly enables both
read_special and one_file_system (see above) to
support dump and restore streaming.
example: /var/lib/sqlite/users.db
mongodb_databases:
type: array
items:
@ -1143,7 +1168,7 @@ properties:
type: string
description: |
Healthchecks ping URL or UUID to notify when a
backup begins, ends, or errors.
backup begins, ends, errors, or just to send logs.
example: https://hc-ping.com/your-uuid-here
verify_tls:
type: boolean
@ -1155,7 +1180,8 @@ properties:
type: boolean
description: |
Send borgmatic logs to Healthchecks as part of the
"finish" state. Defaults to true.
"finish", "fail", and "log" states. Defaults to
true.
example: false
ping_body_limit:
type: integer
@ -1174,10 +1200,11 @@ properties:
- start
- finish
- fail
- log
uniqueItems: true
description: |
List of one or more monitoring states to ping for:
"start", "finish", and/or "fail". Defaults to
"start", "finish", "fail", and/or "log". Defaults to
pinging for all states.
example:
- finish
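Putting the schema changes above together, a Healthchecks hook that opts into the new "log" state might look like the following hedged sketch (the ping URL is the placeholder from the schema example):

```yaml
hooks:
  healthchecks:
    ping_url: https://hc-ping.com/your-uuid-here
    states:
      - start
      - finish
      - fail
      - log
```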

View File

@ -186,5 +186,5 @@ def guard_single_repository_selected(repository, configurations):
if count != 1:
raise ValueError(
'Can\'t determine which repository to use. Use --repository to disambiguate'
"Can't determine which repository to use. Use --repository to disambiguate"
)

View File

@ -27,6 +27,12 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_
Ping the configured Cronhub URL, modified with the monitor.State. Use the given configuration
filename in any log entries. If this is a dry run, then don't actually ping anything.
'''
if state not in MONITOR_STATE_TO_CRONHUB:
logger.debug(
f'{config_filename}: Ignoring unsupported monitoring {state.name.lower()} in Cronhub hook'
)
return
dry_run_label = ' (dry run; not actually pinging)' if dry_run else ''
formatted_state = '/{}/'.format(MONITOR_STATE_TO_CRONHUB[state])
ping_url = (

View File

@ -27,6 +27,12 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_
Ping the configured Cronitor URL, modified with the monitor.State. Use the given configuration
filename in any log entries. If this is a dry run, then don't actually ping anything.
'''
if state not in MONITOR_STATE_TO_CRONITOR:
logger.debug(
f'{config_filename}: Ignoring unsupported monitoring {state.name.lower()} in Cronitor hook'
)
return
dry_run_label = ' (dry run; not actually pinging)' if dry_run else ''
ping_url = '{}/{}'.format(hook_config['ping_url'], MONITOR_STATE_TO_CRONITOR[state])

View File

@ -9,6 +9,7 @@ from borgmatic.hooks import (
ntfy,
pagerduty,
postgresql,
sqlite,
)
logger = logging.getLogger(__name__)
@ -22,6 +23,7 @@ HOOK_NAME_TO_MODULE = {
'ntfy': ntfy,
'pagerduty': pagerduty,
'postgresql_databases': postgresql,
'sqlite_databases': sqlite,
}

View File

@ -6,7 +6,12 @@ from borgmatic.borg.state import DEFAULT_BORGMATIC_SOURCE_DIRECTORY
logger = logging.getLogger(__name__)
DATABASE_HOOK_NAMES = ('postgresql_databases', 'mysql_databases', 'mongodb_databases')
DATABASE_HOOK_NAMES = (
'postgresql_databases',
'mysql_databases',
'mongodb_databases',
'sqlite_databases',
)
def make_database_dump_path(borgmatic_source_directory, database_hook_name):

View File

@ -10,6 +10,7 @@ MONITOR_STATE_TO_HEALTHCHECKS = {
monitor.State.START: 'start',
monitor.State.FINISH: None, # Healthchecks doesn't append to the URL for the finished state.
monitor.State.FAIL: 'fail',
monitor.State.LOG: 'log',
}
PAYLOAD_TRUNCATION_INDICATOR = '...\n'
@ -117,7 +118,7 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_
)
logger.debug('{}: Using Healthchecks ping URL {}'.format(config_filename, ping_url))
if state in (monitor.State.FINISH, monitor.State.FAIL):
if state in (monitor.State.FINISH, monitor.State.FAIL, monitor.State.LOG):
payload = format_buffered_logs_for_payload()
else:
payload = ''

View File

@ -7,3 +7,4 @@ class State(Enum):
START = 1
FINISH = 2
FAIL = 3
LOG = 4

View File

@ -2,16 +2,8 @@ import logging
import requests
from borgmatic.hooks import monitor
logger = logging.getLogger(__name__)
MONITOR_STATE_TO_NTFY = {
monitor.State.START: None,
monitor.State.FINISH: None,
monitor.State.FAIL: None,
}
def initialize_monitor(
ping_url, config_filename, monitoring_log_level, dry_run

125
borgmatic/hooks/sqlite.py Normal file
View File

@ -0,0 +1,125 @@
import logging
import os
from borgmatic.execute import execute_command, execute_command_with_processes
from borgmatic.hooks import dump
logger = logging.getLogger(__name__)
def make_dump_path(location_config): # pragma: no cover
'''
Make the dump path from the given location configuration and the name of this hook.
'''
return dump.make_database_dump_path(
location_config.get('borgmatic_source_directory'), 'sqlite_databases'
)
def dump_databases(databases, log_prefix, location_config, dry_run):
'''
Dump the given SQLite3 databases to a file. The databases are supplied as a sequence of
configuration dicts, as per the configuration schema. Use the given log prefix in any log
entries. Use the given location configuration dict to construct the destination path. If this
is a dry run, then don't actually dump anything.
'''
dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
processes = []
logger.info('{}: Dumping SQLite databases{}'.format(log_prefix, dry_run_label))
for database in databases:
database_path = database['path']
if database['name'] == 'all':
logger.warning('The "all" database name has no meaning for SQLite3 databases')
if not os.path.exists(database_path):
logger.warning(
f'{log_prefix}: No SQLite database at {database_path}; An empty database will be created and dumped'
)
dump_path = make_dump_path(location_config)
dump_filename = dump.make_database_dump_filename(dump_path, database['name'])
if os.path.exists(dump_filename):
logger.warning(
f'{log_prefix}: Skipping duplicate dump of SQLite database at {database_path} to {dump_filename}'
)
continue
command = (
'sqlite3',
database_path,
'.dump',
'>',
dump_filename,
)
logger.debug(
f'{log_prefix}: Dumping SQLite database at {database_path} to {dump_filename}{dry_run_label}'
)
if dry_run:
continue
dump.create_parent_directory_for_dump(dump_filename)
processes.append(execute_command(command, shell=True, run_to_completion=False))
return processes
def remove_database_dumps(databases, log_prefix, location_config, dry_run): # pragma: no cover
'''
Remove the given SQLite3 database dumps from the filesystem. The databases are supplied as a
sequence of configuration dicts, as per the configuration schema. Use the given log prefix in
any log entries. Use the given location configuration dict to construct the destination path.
If this is a dry run, then don't actually remove anything.
'''
dump.remove_database_dumps(make_dump_path(location_config), 'SQLite', log_prefix, dry_run)
def make_database_dump_pattern(
databases, log_prefix, location_config, name=None
): # pragma: no cover
'''
Make a pattern that matches the given SQLite3 databases. The databases are supplied as a
sequence of configuration dicts, as per the configuration schema.
'''
return dump.make_database_dump_filename(make_dump_path(location_config), name)
def restore_database_dump(database_config, log_prefix, location_config, dry_run, extract_process):
'''
Restore the given SQLite3 database from an extract stream. The database is supplied as a
one-element sequence containing a dict describing the database, as per the configuration schema.
Use the given log prefix in any log entries. If this is a dry run, then don't actually restore
anything. Trigger the given active extract process (an instance of subprocess.Popen) to produce
output to consume.
'''
dry_run_label = ' (dry run; not actually restoring anything)' if dry_run else ''
if len(database_config) != 1:
raise ValueError('The database configuration value is invalid')
database_path = database_config[0]['path']
logger.debug(f'{log_prefix}: Restoring SQLite database at {database_path}{dry_run_label}')
if dry_run:
return
try:
os.remove(database_path)
logger.warning(f'{log_prefix}: Removed existing SQLite database at {database_path}')
except FileNotFoundError: # pragma: no cover
pass
restore_command = (
'sqlite3',
database_path,
)
# Don't give Borg local path so as to error on warnings, as "borg extract" only gives a warning
# if the restore paths don't exist in the archive.
execute_command_with_processes(
restore_command,
[extract_process],
output_log_level=logging.DEBUG,
input_file=extract_process.stdout,
)
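For reference, the commands assembled in this hook run through a shell (for the dump) and a piped process (for the restore), so they are roughly equivalent to invoking the `sqlite3` CLI by hand like this (paths are hypothetical):

```bash
# Dump: the '>' redirect is what the shell=True call above performs.
sqlite3 /path/to/database.db .dump > /path/to/dump/file

# Restore: the extracted dump stream is fed to sqlite3 on standard input.
sqlite3 /path/to/database.db < /path/to/dump/file
```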

View File

@ -4,7 +4,7 @@ COPY . /app
RUN apk add --no-cache py3-pip py3-ruamel.yaml py3-ruamel.yaml.clib
RUN pip install --no-cache /app && generate-borgmatic-config && chmod +r /etc/borgmatic/config.yaml
RUN borgmatic --help > /command-line.txt \
&& for action in rcreate transfer prune compact create check extract export-tar mount umount restore rlist list rinfo info break-lock borg; do \
&& for action in rcreate transfer create prune compact check extract export-tar mount umount restore rlist list rinfo info break-lock borg; do \
echo -e "\n--------------------------------------------------------------------------------\n" >> /command-line.txt \
&& borgmatic "$action" --help >> /command-line.txt; done

View File

@ -63,11 +63,6 @@
top: -2px;
bottom: 2px;
}
@media (prefers-color-scheme: dark) {
.inlinelist .inlinelist-item code:before {
border-left-color: rgba(0,0,0,.8);
}
}
}
a.buzzword {
text-decoration: underline;
@ -91,26 +86,9 @@ a.buzzword {
.buzzword {
background-color: #f7f7f7;
}
@media (prefers-color-scheme: dark) {
.buzzword-list li,
.buzzword {
background-color: #080808;
}
}
.inlinelist .inlinelist-item {
background-color: #e9e9e9;
}
@media (prefers-color-scheme: dark) {
.inlinelist .inlinelist-item {
background-color: #000;
}
.inlinelist .inlinelist-item a {
color: #fff;
}
.inlinelist .inlinelist-item code {
color: inherit;
}
}
.inlinelist .inlinelist-item:hover,
.inlinelist .inlinelist-item:focus,
.buzzword-list li:hover,
@ -217,12 +195,6 @@ main p a.buzzword {
height: 1.75em;
font-weight: 600;
}
@media (prefers-color-scheme: dark) {
.numberflag {
background-color: #00bcd4;
color: #222;
}
}
h1 .numberflag,
h2 .numberflag,
h3 .numberflag,
@ -244,11 +216,6 @@ h2 .numberflag:after {
background-color: #fff;
width: calc(100% + 0.4em); /* 16px /40 */
}
@media (prefers-color-scheme: dark) {
h2 .numberflag:after {
background-color: #222;
}
}
/* Super featured list on home page */
.list-superfeatured .avatar {

View File

@ -12,16 +12,6 @@
line-height: 1.285714285714; /* 18px /14 */
font-family: system-ui, -apple-system, sans-serif;
}
@media (prefers-color-scheme: dark) {
.minilink {
background-color: #222;
/*
!important to override .elv-callout a
see _includes/components/callout.css
*/
color: #fff !important;
}
}
table .minilink {
margin-top: 6px;
}
@ -32,12 +22,6 @@ table .minilink {
.minilink[href]:focus {
background-color: #bbb;
}
@media (prefers-color-scheme: dark) {
.minilink[href]:hover,
.minilink[href]:focus {
background-color: #444;
}
}
pre + .minilink {
color: #fff;
border-radius: 0 0 0.2857142857143em 0.2857142857143em; /* 4px /14 */
@ -74,11 +58,6 @@ h4 .minilink {
text-transform: none;
box-shadow: 0 0 0 1px rgba(0,0,0,0.3);
}
@media (prefers-color-scheme: dark) {
.minilink-addedin {
box-shadow: 0 0 0 1px rgba(255,255,255,0.3);
}
}
.minilink-addedin:not(:first-child) {
margin-left: .5em;
}

View File

@ -79,22 +79,11 @@
border-bottom: 1px solid #ddd;
margin-bottom: 0.25em; /* 4px /16 */
}
@media (prefers-color-scheme: dark) {
.elv-toc-list > li > a {
color: #fff;
border-color: #444;
}
}
/* Active links */
.elv-toc-list li.elv-toc-active > a {
background-color: #dff7ff;
}
@media (prefers-color-scheme: dark) {
.elv-toc-list li.elv-toc-active > a {
background-color: #353535;
}
}
.elv-toc-list ul .elv-toc-active > a:after {
content: "";
}

View File

@ -285,11 +285,6 @@ footer.elv-layout {
.elv-hero {
background-color: #222;
}
@media (prefers-color-scheme: dark) {
.elv-hero {
background-color: #292929;
}
}
.elv-hero img,
.elv-hero svg {
width: 42.95774646vh;

View File

@ -15,8 +15,7 @@ consistent snapshot that is more suited for backups.
Fortunately, borgmatic includes built-in support for creating database dumps
prior to running backups. For example, here is everything you need to dump and
backup a couple of local PostgreSQL databases, a MySQL/MariaDB database, and a
MongoDB database:
backup a couple of local PostgreSQL databases and a MySQL/MariaDB database.
```yaml
hooks:
@ -25,10 +24,27 @@ hooks:
- name: orders
mysql_databases:
- name: posts
```
<span class="minilink minilink-addedin">New in version 1.5.22</span> You can
also dump MongoDB databases. For example:
```yaml
hooks:
mongodb_databases:
- name: messages
```
<span class="minilink minilink-addedin">New in version 1.7.9</span>
Additionally, you can dump SQLite databases. For example:
```yaml
hooks:
sqlite_databases:
- name: mydb
path: /var/lib/sqlite3/mydb.sqlite
```
As part of each backup, borgmatic streams a database dump for each configured
database directly to Borg, so it's included in the backup without consuming
additional disk space. (The exceptions are the PostgreSQL/MongoDB "directory"
@ -74,6 +90,9 @@ hooks:
password: trustsome1
authentication_database: mongousers
options: "--ssl"
sqlite_databases:
- name: mydb
path: /var/lib/sqlite3/mydb.sqlite
```
See your [borgmatic configuration
@ -99,6 +118,8 @@ hooks:
Note that you may need to use a `username` of the `postgres` superuser for
this to work with PostgreSQL.
The SQLite hook in particular does not consider "all" a special database name.
<span class="minilink minilink-addedin">New in version 1.7.6</span> With
PostgreSQL and MySQL, you can optionally dump "all" databases to separate
files instead of one combined dump file, allowing more convenient restores of
@ -154,11 +175,11 @@ bring back any missing configuration files in order to restore a database.
## Supported databases
As of now, borgmatic supports PostgreSQL, MySQL/MariaDB, and MongoDB databases
directly. But see below about general-purpose preparation and cleanup hooks as
a work-around with other database systems. Also, please [file a
ticket](https://torsion.org/borgmatic/#issues) for additional database systems
that you'd like supported.
As of now, borgmatic supports PostgreSQL, MySQL/MariaDB, MongoDB, and SQLite
databases directly. But see below about general-purpose preparation and
cleanup hooks as a work-around with other database systems. Also, please [file
a ticket](https://torsion.org/borgmatic/#issues) for additional database
systems that you'd like supported.
## Database restoration
@ -295,7 +316,10 @@ user and you're extracting to `/tmp`, then the dump will be in
`/tmp/root/.borgmatic`.
After extraction, you can manually restore the dump file using native database
commands like `pg_restore`, `mysql`, `mongorestore` or similar.
commands like `pg_restore`, `mysql`, `mongorestore`, `sqlite3`, or similar.
Also see the documentation on [listing database
dumps](https://torsion.org/borgmatic/docs/how-to/inspect-your-backups/#listing-database-dumps).
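A hedged sketch of that manual SQLite restore, assuming the extraction example above and a dump located under the extracted `.borgmatic` directory (the exact subpath may differ in your archive):

```bash
sqlite3 /var/lib/sqlite3/mydb.sqlite < /tmp/root/.borgmatic/sqlite_databases/localhost/mydb
```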
## Preparation and cleanup hooks

View File

@ -9,37 +9,47 @@ eleventyNavigation:
Borg itself is great for efficiently de-duplicating data across successive
backup archives, even when dealing with very large repositories. But you may
find that while borgmatic's default mode of `prune`, `compact`, `create`, and
`check` works well on small repositories, it's not so great on larger ones.
That's because running the default pruning, compact, and consistency checks
take a long time on large repositories.
find that while borgmatic's default actions of `create`, `prune`, `compact`,
and `check` work well on small repositories, they're not so great on larger
ones. That's because the default pruning, compaction, and consistency checks
take a long time on large repositories.
<span class="minilink minilink-addedin">Prior to version 1.7.9</span> The
default action ordering was `prune`, `compact`, `create`, and `check`.
### A la carte actions
If you find yourself in this situation, you have some options. First, you can
run borgmatic's `prune`, `compact`, `create`, or `check` actions separately.
For instance, the following optional actions are available:
If you find yourself wanting to customize the actions, you have some options.
First, you can run borgmatic's `prune`, `compact`, `create`, or `check`
actions separately. For instance, the following optional actions are
available (among others):
```bash
borgmatic create
borgmatic prune
borgmatic compact
borgmatic create
borgmatic check
```
You can run with only one of these actions provided, or you can mix and match
any number of them in a single borgmatic run. This supports approaches like
skipping certain actions while running others. For instance, this skips
`prune` and `compact` and only runs `create` and `check`:
You can run borgmatic with only one of these actions provided, or you can mix
and match any number of them in a single borgmatic run. This supports
approaches like skipping certain actions while running others. For instance,
this skips `prune` and `compact` and only runs `create` and `check`:
```bash
borgmatic create check
```
Or, you can make backups with `create` on a frequent schedule (e.g. with
`borgmatic create` called from one cron job), while only running expensive
consistency checks with `check` on a much less frequent basis (e.g. with
`borgmatic check` called from a separate cron job).
<span class="minilink minilink-addedin">New in version 1.7.9</span> borgmatic
now respects your specified command-line action order, running actions in the
order you specify. In previous versions, borgmatic ran your specified actions
in a fixed ordering regardless of the order they appeared on the command-line.
But instead of running actions together, another option is to run backups with
`create` on a frequent schedule (e.g. with `borgmatic create` called from one
cron job), while only running expensive consistency checks with `check` on a
much less frequent basis (e.g. with `borgmatic check` called from a separate
cron job).
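A minimal sketch of that split schedule, assuming a system crontab in /etc/cron.d (times, user, and file name are hypothetical):

```text
# /etc/cron.d/borgmatic (illustrative only)
0 3 * * *   root   borgmatic create
0 5 * * 0   root   borgmatic check
```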
### Consistency check configuration

View File

@ -20,15 +20,15 @@ borgmatic rlist
That should yield output looking something like:
```text
host-2019-01-01T04:05:06.070809 Tue, 2019-01-01 04:05:06 [...]
host-2019-01-02T04:06:07.080910 Wed, 2019-01-02 04:06:07 [...]
host-2023-01-01T04:05:06.070809 Tue, 2023-01-01 04:05:06 [...]
host-2023-01-02T04:06:07.080910 Wed, 2023-01-02 04:06:07 [...]
```
Assuming that you want to extract the archive with the most up-to-date files
and therefore the latest timestamp, run a command like:
```bash
borgmatic extract --archive host-2019-01-02T04:06:07.080910
borgmatic extract --archive host-2023-01-02T04:06:07.080910
```
(No borgmatic `extract` action? Upgrade borgmatic!)
@ -54,7 +54,7 @@ But if you have multiple repositories configured, then you'll need to specify
the repository path containing the archive to extract. Here's an example:
```bash
borgmatic extract --repository repo.borg --archive host-2019-...
borgmatic extract --repository repo.borg --archive host-2023-...
```
## Extract particular files
@ -74,6 +74,13 @@ run the `extract` command above, borgmatic will extract `/var/path/1` and
`/var/path/2`.
### Searching for files
If you're not sure which archive contains the files you're looking for, you
can [search across
archives](https://torsion.org/borgmatic/docs/how-to/inspect-your-backups/#searching-for-a-file).
## Extract to a particular destination
By default, borgmatic extracts files into the current directory. To instead

View File

@ -91,6 +91,19 @@ example, to search only the last five archives:
borgmatic list --find foo.txt --last 5
```
## Listing database dumps
If you have enabled borgmatic's [database
hooks](https://torsion.org/borgmatic/docs/how-to/backup-your-databases/), you
can list backed up database dumps via borgmatic. For example:
```bash
borgmatic list --archive latest --find .borgmatic/*_databases
```
This gives you a listing of all database dump files contained in the latest
archive, complete with file sizes.
## Logging

View File

@ -83,7 +83,7 @@ tests](https://torsion.org/borgmatic/docs/how-to/extract-a-backup/).
## Error hooks
When an error occurs during a `prune`, `compact`, `create`, or `check` action,
When an error occurs during a `create`, `prune`, `compact`, or `check` action,
borgmatic can run configurable shell commands to fire off custom error
notifications or take other actions, so you can get alerted as soon as
something goes wrong. Here's a not-so-useful example:
@ -116,8 +116,8 @@ the repository. Here's the full set of supported variables you can use here:
* `output`: output of the command that failed (may be blank if an error
occurred without running a command)
Note that borgmatic runs the `on_error` hooks only for `prune`, `compact`,
`create`, or `check` actions or hooks in which an error occurs, and not other
Note that borgmatic runs the `on_error` hooks only for `create`, `prune`,
`compact`, or `check` actions or hooks in which an error occurs, and not other
actions. borgmatic does not run `on_error` hooks if an error occurs within a
`before_everything` or `after_everything` hook. For more about hooks, see the
[borgmatic hooks
@ -144,7 +144,7 @@ With this hook in place, borgmatic pings your Healthchecks project when a
backup begins, ends, or errors. Specifically, after the <a
href="https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/">`before_backup`
hooks</a> run, borgmatic lets Healthchecks know that it has started if any of
the `prune`, `compact`, `create`, or `check` actions are run.
the `create`, `prune`, `compact`, or `check` actions are run.
Then, if the actions complete successfully, borgmatic notifies Healthchecks of
the success after the `after_backup` hooks run, and includes borgmatic logs in
@ -154,8 +154,8 @@ in the Healthchecks UI, although be aware that Healthchecks currently has a
If an error occurs during any action or hook, borgmatic notifies Healthchecks
after the `on_error` hooks run, also tacking on logs including the error
itself. But the logs are only included for errors that occur when a `prune`,
`compact`, `create`, or `check` action is run.
itself. But the logs are only included for errors that occur when a `create`,
`prune`, `compact`, or `check` action is run.
You can customize the verbosity of the logs that are sent to Healthchecks with
borgmatic's `--monitoring-verbosity` flag. The `--list` and `--stats` flags

BIN
docs/static/sqlite.png vendored Normal file

Binary file not shown (new image; 4.6 KiB).

View File

@ -11,7 +11,7 @@
set -e
apk add --no-cache python3 py3-pip borgbackup postgresql-client mariadb-client mongodb-tools \
py3-ruamel.yaml py3-ruamel.yaml.clib bash
py3-ruamel.yaml py3-ruamel.yaml.clib bash sqlite
# If certain dependencies of black are available in this version of Alpine, install them.
apk add --no-cache py3-typed-ast py3-regex || true
python3 -m pip install --no-cache --upgrade pip==22.2.2 setuptools==64.0.1

View File

@ -10,6 +10,8 @@ filterwarnings =
[flake8]
ignore = E501,W503
exclude = *.*/*
multiline-quotes = '''
docstring-quotes = '''
[tool:isort]
force_single_line = False

View File

@ -1,6 +1,6 @@
from setuptools import find_packages, setup
VERSION = '1.7.8.dev0'
VERSION = '1.7.9.dev0'
setup(

View File

@ -5,6 +5,7 @@ click==7.1.2; python_version >= '3.8'
colorama==0.4.4
coverage==5.3
flake8==4.0.1
flake8-quotes==3.3.2
flexmock==0.10.4
isort==5.9.1
mccabe==0.6.1

View File

@ -73,6 +73,9 @@ hooks:
hostname: mongodb
username: root
password: test
sqlite_databases:
- name: sqlite_test
path: /tmp/sqlite_test.db
'''
with open(config_path, 'w') as config_file:

View File

@ -284,6 +284,48 @@ def test_make_exclude_flags_is_empty_when_config_has_no_excludes():
assert exclude_flags == ()
def test_make_list_filter_flags_with_debug_and_feature_available_includes_plus_and_minus():
flexmock(module.logger).should_receive('isEnabledFor').and_return(True)
flexmock(module.feature).should_receive('available').and_return(True)
assert module.make_list_filter_flags(local_borg_version=flexmock(), dry_run=False) == 'AME+-'
def test_make_list_filter_flags_with_info_and_feature_available_omits_plus_and_minus():
flexmock(module.logger).should_receive('isEnabledFor').and_return(False)
flexmock(module.feature).should_receive('available').and_return(True)
assert module.make_list_filter_flags(local_borg_version=flexmock(), dry_run=False) == 'AME'
def test_make_list_filter_flags_with_debug_and_feature_available_and_dry_run_includes_plus_and_minus():
flexmock(module.logger).should_receive('isEnabledFor').and_return(True)
flexmock(module.feature).should_receive('available').and_return(True)
assert module.make_list_filter_flags(local_borg_version=flexmock(), dry_run=True) == 'AME+-'
def test_make_list_filter_flags_with_info_and_feature_available_and_dry_run_includes_plus_and_minus():
flexmock(module.logger).should_receive('isEnabledFor').and_return(False)
flexmock(module.feature).should_receive('available').and_return(True)
assert module.make_list_filter_flags(local_borg_version=flexmock(), dry_run=True) == 'AME+-'
def test_make_list_filter_flags_with_debug_and_feature_not_available_includes_x():
flexmock(module.logger).should_receive('isEnabledFor').and_return(True)
flexmock(module.feature).should_receive('available').and_return(False)
assert module.make_list_filter_flags(local_borg_version=flexmock(), dry_run=False) == 'AMEx-'
def test_make_list_filter_flags_with_info_and_feature_not_available_omits_x():
flexmock(module.logger).should_receive('isEnabledFor').and_return(False)
flexmock(module.feature).should_receive('available').and_return(False)
assert module.make_list_filter_flags(local_borg_version=flexmock(), dry_run=False) == 'AME-'
def test_collect_borgmatic_source_directories_set_when_directory_exists():
flexmock(module.os.path).should_receive('exists').and_return(True)
flexmock(module.os.path).should_receive('expanduser')
@ -423,6 +465,7 @@ def test_create_archive_calls_borg_with_parameters():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -464,6 +507,7 @@ def test_create_archive_calls_borg_with_environment():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -509,6 +553,7 @@ def test_create_archive_with_patterns_calls_borg_with_patterns_including_convert
flexmock(module).should_receive('write_pattern_file').and_return(
flexmock(name='/tmp/patterns')
).and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(pattern_flags)
@ -553,6 +598,7 @@ def test_create_archive_with_exclude_patterns_calls_borg_with_excludes():
flexmock(module).should_receive('write_pattern_file').and_return(None).and_return(
flexmock(name='/tmp/excludes')
)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -594,6 +640,7 @@ def test_create_archive_with_log_info_calls_borg_with_info_parameter():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -636,6 +683,7 @@ def test_create_archive_with_log_info_and_json_suppresses_most_borg_output():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -676,6 +724,7 @@ def test_create_archive_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -718,6 +767,7 @@ def test_create_archive_with_log_debug_and_json_suppresses_most_borg_output():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -758,6 +808,7 @@ def test_create_archive_with_dry_run_calls_borg_with_dry_run_parameter():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -801,6 +852,7 @@ def test_create_archive_with_stats_and_dry_run_calls_borg_without_stats_paramete
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -844,6 +896,7 @@ def test_create_archive_with_checkpoint_interval_calls_borg_with_checkpoint_inte
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -885,6 +938,7 @@ def test_create_archive_with_checkpoint_volume_calls_borg_with_checkpoint_volume
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -926,6 +980,7 @@ def test_create_archive_with_chunker_params_calls_borg_with_chunker_params_param
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -967,6 +1022,7 @@ def test_create_archive_with_compression_calls_borg_with_compression_parameters(
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1013,6 +1069,7 @@ def test_create_archive_with_upload_rate_limit_calls_borg_with_upload_ratelimit_
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(feature_available)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1056,6 +1113,7 @@ def test_create_archive_with_working_directory_calls_borg_with_working_directory
)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1098,6 +1156,7 @@ def test_create_archive_with_one_file_system_calls_borg_with_one_file_system_par
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1145,6 +1204,7 @@ def test_create_archive_with_numeric_ids_calls_borg_with_numeric_ids_parameter(
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(feature_available)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1187,6 +1247,7 @@ def test_create_archive_with_read_special_calls_borg_with_read_special_parameter
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1246,6 +1307,7 @@ def test_create_archive_with_basic_option_calls_borg_with_corresponding_paramete
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1299,6 +1361,7 @@ def test_create_archive_with_atime_option_calls_borg_with_corresponding_paramete
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(feature_available)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1352,6 +1415,7 @@ def test_create_archive_with_flags_option_calls_borg_with_corresponding_paramete
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(feature_available)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1394,6 +1458,7 @@ def test_create_archive_with_files_cache_calls_borg_with_files_cache_parameters(
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1436,6 +1501,7 @@ def test_create_archive_with_local_path_calls_borg_via_local_path():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1478,6 +1544,7 @@ def test_create_archive_with_remote_path_calls_borg_with_remote_path_parameters(
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1520,6 +1587,7 @@ def test_create_archive_with_umask_calls_borg_with_umask_parameters():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1561,6 +1629,7 @@ def test_create_archive_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1602,6 +1671,7 @@ def test_create_archive_with_stats_calls_borg_with_stats_parameter_and_answer_ou
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1644,6 +1714,7 @@ def test_create_archive_with_files_calls_borg_with_list_parameter_and_answer_out
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1653,7 +1724,7 @@ def test_create_archive_with_files_calls_borg_with_list_parameter_and_answer_out
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'create', '--list', '--filter', 'AMEx+-') + REPO_ARCHIVE_WITH_PATHS,
('borg', 'create', '--list', '--filter', 'FOO') + REPO_ARCHIVE_WITH_PATHS,
output_log_level=module.borgmatic.logger.ANSWER,
output_file=None,
borg_local_path='borg',
@ -1686,6 +1757,7 @@ def test_create_archive_with_progress_and_log_info_calls_borg_with_progress_para
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1729,6 +1801,7 @@ def test_create_archive_with_progress_calls_borg_with_progress_parameter():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1772,6 +1845,7 @@ def test_create_archive_with_progress_and_stream_processes_calls_borg_with_progr
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1832,6 +1906,7 @@ def test_create_archive_with_stream_processes_ignores_read_special_false_and_log
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(flexmock(name='/tmp/excludes'))
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module.logger).should_receive('warning').twice()
@ -1841,7 +1916,7 @@ def test_create_archive_with_stream_processes_ignores_read_special_false_and_log
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('collect_special_file_paths').and_return(("/dev/null",))
flexmock(module).should_receive('collect_special_file_paths').and_return(('/dev/null',))
create_command = (
'borg',
'create',
@ -1898,6 +1973,7 @@ def test_create_archive_with_stream_processes_adds_special_files_to_excludes():
flexmock(module).should_receive('write_pattern_file').and_return(None).and_return(
flexmock(name='/excludes')
)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -1962,6 +2038,7 @@ def test_create_archive_with_stream_processes_and_read_special_does_not_add_spec
('special',)
)
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2022,6 +2099,7 @@ def test_create_archive_with_json_calls_borg_with_json_parameter():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2063,6 +2141,7 @@ def test_create_archive_with_stats_and_json_calls_borg_without_stats_parameter()
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2105,6 +2184,7 @@ def test_create_archive_with_source_directories_glob_expands():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2147,6 +2227,7 @@ def test_create_archive_with_non_matching_source_directories_glob_passes_through
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2189,6 +2270,7 @@ def test_create_archive_with_glob_calls_borg_with_expanded_directories():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2230,6 +2312,7 @@ def test_create_archive_with_archive_name_format_calls_borg_with_archive_name():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2272,6 +2355,7 @@ def test_create_archive_with_archive_name_format_accepts_borg_placeholders():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2314,6 +2398,7 @@ def test_create_archive_with_repository_accepts_borg_placeholders():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2355,6 +2440,7 @@ def test_create_archive_with_extra_borg_options_calls_borg_with_extra_options():
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
@ -2397,6 +2483,7 @@ def test_create_archive_with_stream_processes_calls_borg_with_processes_and_read
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('make_list_filter_flags').and_return('FOO')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
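
The test_create.py hunks above all stub out a `make_list_filter_flags` helper and change the command expectation so that whatever the helper returns (the placeholder `'FOO'` in the mocks) is what lands after `borg create --list --filter`. As a rough, hypothetical sketch of what such a helper might compute (the flag letters and decision logic here are illustrative assumptions, not taken from this diff):

```python
import logging

logger = logging.getLogger(__name__)


def make_list_filter_flags(dry_run, excluded_minus_supported):
    '''
    Hypothetical sketch: build the value for "borg create --filter" from the
    logging level and Borg feature availability. The letters follow Borg's
    documented item flags (A=added, M=modified, E=error, x/- for excludes).
    '''
    base_flags = 'AME'
    show_excludes = logger.isEnabledFor(logging.DEBUG)

    if excluded_minus_supported:
        # Newer Borg versions can report excluded files with "+-".
        return f'{base_flags}+-' if (show_excludes or dry_run) else base_flags

    # Older Borg versions fall back to the "x-" style exclude markers.
    return f'{base_flags}x-' if show_excludes else f'{base_flags}-'
```

Because the tests mock the helper to return `'FOO'`, they only assert that the helper's return value is threaded through to the `--filter` argument, not how it is computed.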


@ -312,6 +312,57 @@ def test_extract_archive_calls_borg_with_strip_components():
)
def test_extract_archive_calls_borg_with_strip_components_calculated_from_all():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
(
'borg',
'extract',
'--strip-components',
'2',
'repo::archive',
'foo/bar/baz.txt',
'foo/bar.txt',
)
)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
repository='repo',
archive='archive',
paths=['foo/bar/baz.txt', 'foo/bar.txt'],
location_config={},
storage_config={},
local_borg_version='1.2.3',
strip_components='all',
)
def test_extract_archive_with_strip_components_all_and_no_paths_raises():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module).should_receive('execute_command').never()
with pytest.raises(ValueError):
module.extract_archive(
dry_run=False,
repository='repo',
archive='archive',
paths=None,
location_config={},
storage_config={},
local_borg_version='1.2.3',
strip_components='all',
)
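
The new extract tests above expect `--strip-components 2` when `strip_components='all'` is requested for the paths `foo/bar/baz.txt` and `foo/bar.txt`, and a `ValueError` when no paths are given. A plausible way to derive that number (an illustrative sketch, not necessarily the exact implementation) is to take the deepest requested path and strip everything but its final component:

```python
def strip_components_for_all(paths):
    '''
    Sketch: compute the value for "--strip-components" when the caller asks
    for "all". With no paths there is nothing to measure, which is why the
    test above expects a ValueError in that case.
    '''
    if not paths:
        raise ValueError('Extracting with --strip-components "all" requires at least one path')

    # Deepest path wins: "foo/bar/baz.txt" has 3 components, so strip 2.
    return max(len(path.split('/')) for path in paths) - 1


assert strip_components_for_all(['foo/bar/baz.txt', 'foo/bar.txt']) == 2
```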
def test_extract_archive_calls_borg_with_progress_parameter():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
flexmock(module.environment).should_receive('make_environment')


@ -1,3 +1,5 @@
import collections
from flexmock import flexmock
from borgmatic.commands import arguments as module
@ -70,6 +72,26 @@ def test_parse_subparser_arguments_consumes_multiple_subparser_arguments():
assert remaining_arguments == []
def test_parse_subparser_arguments_respects_command_line_action_ordering():
other_namespace = flexmock()
action_namespace = flexmock(foo=True)
subparsers = {
'action': flexmock(
parse_known_args=lambda arguments: (action_namespace, ['action', '--foo', 'true'])
),
'other': flexmock(parse_known_args=lambda arguments: (other_namespace, ['other'])),
}
arguments, remaining_arguments = module.parse_subparser_arguments(
('other', '--foo', 'true', 'action'), subparsers
)
assert arguments == collections.OrderedDict(
[('other', other_namespace), ('action', action_namespace)]
)
assert remaining_arguments == []
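
The ordering test above asserts that `parse_subparser_arguments` returns an `OrderedDict` keyed in the order the actions appeared on the command line ('other' before 'action') rather than in a fixed canonical order. A stripped-down sketch of that idea, compatible with the fake subparsers used in the test (the real parser also handles aliases, defaults, and consumed arguments, none of which are shown here):

```python
import collections


def parse_in_command_line_order(unparsed_arguments, subparsers):
    '''
    Sketch: walk the raw arguments left to right and parse each action with
    its subparser the first time it appears, so the iteration order of the
    result matches the order the user typed the actions.
    '''
    arguments = collections.OrderedDict()

    for argument in unparsed_arguments:
        if argument in subparsers and argument not in arguments:
            parsed, _ = subparsers[argument].parse_known_args(unparsed_arguments)
            arguments[argument] = parsed

    return arguments
```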
def test_parse_subparser_arguments_applies_default_subparsers():
prune_namespace = flexmock()
compact_namespace = flexmock()


@ -40,7 +40,7 @@ def test_run_configuration_logs_monitor_start_error():
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_raise(OSError).and_return(
None
).and_return(None)
).and_return(None).and_return(None)
expected_results = [flexmock()]
flexmock(module).should_receive('log_error_records').and_return(expected_results)
flexmock(module).should_receive('run_actions').never()
@ -99,7 +99,7 @@ def test_run_configuration_bails_for_actions_soft_failure():
assert results == []
def test_run_configuration_logs_monitor_finish_error():
def test_run_configuration_logs_monitor_log_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
@ -116,13 +116,48 @@ def test_run_configuration_logs_monitor_finish_error():
assert results == expected_results
def test_run_configuration_bails_for_monitor_log_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
None
).and_raise(error)
flexmock(module).should_receive('log_error_records').never()
flexmock(module).should_receive('run_actions').and_return([])
flexmock(module.command).should_receive('considered_soft_failure').and_return(True)
config = {'location': {'repositories': ['foo']}}
arguments = {'global': flexmock(monitoring_verbosity=1, dry_run=False), 'create': flexmock()}
results = list(module.run_configuration('test.yaml', config, arguments))
assert results == []
def test_run_configuration_logs_monitor_finish_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
None
).and_return(None).and_raise(OSError)
expected_results = [flexmock()]
flexmock(module).should_receive('log_error_records').and_return(expected_results)
flexmock(module).should_receive('run_actions').and_return([])
config = {'location': {'repositories': ['foo']}}
arguments = {'global': flexmock(monitoring_verbosity=1, dry_run=False), 'create': flexmock()}
results = list(module.run_configuration('test.yaml', config, arguments))
assert results == expected_results
def test_run_configuration_bails_for_monitor_finish_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
None
).and_raise(error)
).and_raise(None).and_raise(error)
flexmock(module).should_receive('log_error_records').never()
flexmock(module).should_receive('run_actions').and_return([])
flexmock(module.command).should_receive('considered_soft_failure').and_return(True)
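
These test_borgmatic.py hunks add coverage for a monitoring "log" ping that happens between the start and finish pings, including error and soft-failure handling for it; that is why the `call_hooks` mocks now chain an extra `and_return(None)` before raising the error under test. A minimal sketch of the states involved (the enum values here are assumptions, mirroring what `borgmatic.hooks.monitor.State` plus the new LOG member might look like):

```python
from enum import Enum


class State(Enum):
    # Assumed members and values; only the presence of LOG is implied by the tests.
    START = 1
    FINISH = 2
    FAIL = 3
    LOG = 4


# Rough ping order the tests above rely on for a successful run: START before
# the actions, then LOG, then FINISH. A soft-failure exit code from any ping
# makes run_configuration bail without recording an error.
SUCCESSFUL_RUN_PINGS = (State.START, State.LOG, State.FINISH)
```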
@ -401,6 +436,30 @@ def test_run_actions_runs_transfer():
)
def test_run_actions_runs_create():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.create).should_receive('run_create').and_yield(expected).once()
result = tuple(
module.run_actions(
arguments={'global': flexmock(dry_run=False), 'create': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_runs_prune():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
@ -445,30 +504,6 @@ def test_run_actions_runs_compact():
)
def test_run_actions_runs_create():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.create).should_receive('run_create').and_yield(expected).once()
result = tuple(
module.run_actions(
arguments={'global': flexmock(dry_run=False), 'create': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_runs_check_when_repository_enabled_for_checks():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
@ -743,6 +778,33 @@ def test_run_actions_runs_borg():
)
def test_run_actions_runs_multiple_actions_in_argument_order():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.borg).should_receive('run_borg').once().ordered()
flexmock(borgmatic.actions.restore).should_receive('run_restore').once().ordered()
tuple(
module.run_actions(
arguments={
'global': flexmock(dry_run=False),
'borg': flexmock(),
'restore': flexmock(),
},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
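
The relocated create test and the new multi-action test above pin down that `run_actions` now executes actions in the order given in the parsed arguments rather than a hard-coded order. A minimal sketch of that dispatch loop (the runner mapping is a stand-in for the real `borgmatic.actions` modules):

```python
def run_actions_in_argument_order(arguments, action_runners):
    '''
    Sketch: walk the parsed-arguments dict (already ordered by the command
    line) and invoke the matching runner for each action, yielding results.
    '''
    for action_name, action_arguments in arguments.items():
        if action_name == 'global':
            continue

        runner = action_runners.get(action_name)
        if runner:
            yield from runner(action_arguments) or ()
```

Because the dict preserves insertion order, the 'borg' runner executes before 'restore' in the test above, matching its `.ordered()` expectations.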
def test_load_configurations_collects_parsed_configurations_and_logs():
configuration = flexmock()
other_configuration = flexmock()


@ -102,3 +102,11 @@ def test_ping_monitor_with_other_error_logs_warning():
monitoring_log_level=1,
dry_run=False,
)
def test_ping_monitor_with_unsupported_monitoring_state():
hook_config = {'ping_url': 'https://example.com'}
flexmock(module.requests).should_receive('get').never()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.LOG, monitoring_log_level=1, dry_run=False,
)


@ -87,3 +87,11 @@ def test_ping_monitor_with_other_error_logs_warning():
monitoring_log_level=1,
dry_run=False,
)
def test_ping_monitor_with_unsupported_monitoring_state():
hook_config = {'ping_url': 'https://example.com'}
flexmock(module.requests).should_receive('get').never()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.LOG, monitoring_log_level=1, dry_run=False,
)
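
The two hooks above (the ones that ping via `requests.get`) only gain a negative test: when handed the new LOG state they must not ping at all. A hedged sketch of the early-return guard those tests imply (`PINGABLE_STATES` is a stand-in name):

```python
import logging

logger = logging.getLogger(__name__)

PINGABLE_STATES = {'start', 'finish', 'fail'}  # assumed: this service has no "log" endpoint


def should_ping(state_name):
    '''
    Sketch: refuse to build a URL or call requests.get() for states the
    service does not support, which is what should_receive('get').never()
    asserts above.
    '''
    if state_name not in PINGABLE_STATES:
        logger.debug(f'Ignoring unsupported monitoring state {state_name}')
        return False

    return True
```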


@ -184,6 +184,23 @@ def test_ping_monitor_hits_ping_url_for_fail_state():
)
def test_ping_monitor_hits_ping_url_for_log_state():
hook_config = {'ping_url': 'https://example.com'}
payload = 'data'
flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
flexmock(module.requests).should_receive('post').with_args(
'https://example.com/log', data=payload.encode('utf'), verify=True
).and_return(flexmock(ok=True))
module.ping_monitor(
hook_config,
'config.yaml',
state=module.monitor.State.LOG,
monitoring_log_level=1,
dry_run=False,
)
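
The Healthchecks test above expects the log-state ping to POST the buffered logs to the base ping URL with `/log` appended. A sketch of URL construction consistent with that expectation (the mapping dict is an assumption about how states translate to URL suffixes):

```python
STATE_TO_URL_SUFFIX = {
    'start': 'start',
    'log': 'log',
    'fail': 'fail',
    'finish': None,  # assumed: the finish ping hits the bare ping URL
}


def make_healthchecks_ping_url(base_ping_url, state_name):
    '''
    Sketch: append the state-specific suffix (if any) to the configured ping
    URL, so a "log" ping to https://example.com becomes https://example.com/log.
    '''
    suffix = STATE_TO_URL_SUFFIX.get(state_name)

    return f'{base_ping_url}/{suffix}' if suffix else base_ping_url
```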
def test_ping_monitor_with_ping_uuid_hits_corresponding_url():
hook_config = {'ping_url': 'abcd-efgh-ijkl-mnop'}
payload = 'data'


@ -72,7 +72,7 @@ def test_dump_databases_runs_mongodump_with_username_and_password():
'name': 'foo',
'username': 'mongo',
'password': 'trustsome1',
'authentication_database': "admin",
'authentication_database': 'admin',
}
]
process = flexmock()


@ -2,6 +2,7 @@ from enum import Enum
from flexmock import flexmock
import borgmatic.hooks.monitor
from borgmatic.hooks import ntfy as module
default_base_url = 'https://ntfy.sh'
@ -37,12 +38,16 @@ def test_ping_monitor_minimal_config_hits_hosted_ntfy_on_fail():
hook_config = {'topic': topic}
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=None,
).and_return(flexmock(ok=True)).once()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=False
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -54,12 +59,16 @@ def test_ping_monitor_with_auth_hits_hosted_ntfy_on_fail():
}
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=module.requests.auth.HTTPBasicAuth('testuser', 'fakepassword'),
).and_return(flexmock(ok=True)).once()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=False
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -67,13 +76,17 @@ def test_ping_monitor_auth_with_no_username_warning():
hook_config = {'topic': topic, 'password': 'fakepassword'}
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=None,
).and_return(flexmock(ok=True)).once()
flexmock(module.logger).should_receive('warning').once()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=False
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -81,13 +94,17 @@ def test_ping_monitor_auth_with_no_password_warning():
hook_config = {'topic': topic, 'username': 'testuser'}
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=None,
).and_return(flexmock(ok=True)).once()
flexmock(module.logger).should_receive('warning').once()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=False
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -98,7 +115,7 @@ def test_ping_monitor_minimal_config_does_not_hit_hosted_ntfy_on_start():
module.ping_monitor(
hook_config,
'config.yaml',
module.monitor.State.START,
borgmatic.hooks.monitor.State.START,
monitoring_log_level=1,
dry_run=False,
)
@ -111,7 +128,7 @@ def test_ping_monitor_minimal_config_does_not_hit_hosted_ntfy_on_finish():
module.ping_monitor(
hook_config,
'config.yaml',
module.monitor.State.FINISH,
borgmatic.hooks.monitor.State.FINISH,
monitoring_log_level=1,
dry_run=False,
)
@ -121,12 +138,16 @@ def test_ping_monitor_minimal_config_hits_selfhosted_ntfy_on_fail():
hook_config = {'topic': topic, 'server': custom_base_url}
flexmock(module.requests).should_receive('post').with_args(
f'{custom_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=None,
).and_return(flexmock(ok=True)).once()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=False
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -135,7 +156,11 @@ def test_ping_monitor_minimal_config_does_not_hit_hosted_ntfy_on_fail_dry_run():
flexmock(module.requests).should_receive('post').never()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=True
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=True,
)
@ -146,7 +171,11 @@ def test_ping_monitor_custom_message_hits_hosted_ntfy_on_fail():
).and_return(flexmock(ok=True)).once()
module.ping_monitor(
hook_config, 'config.yaml', module.monitor.State.FAIL, monitoring_log_level=1, dry_run=False
hook_config,
'config.yaml',
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -154,14 +183,14 @@ def test_ping_monitor_custom_state_hits_hosted_ntfy_on_start():
hook_config = {'topic': topic, 'states': ['start', 'fail']}
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.START),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.START),
auth=None,
).and_return(flexmock(ok=True)).once()
module.ping_monitor(
hook_config,
'config.yaml',
module.monitor.State.START,
borgmatic.hooks.monitor.State.START,
monitoring_log_level=1,
dry_run=False,
)
@ -171,7 +200,7 @@ def test_ping_monitor_with_connection_error_logs_warning():
hook_config = {'topic': topic}
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=None,
).and_raise(module.requests.exceptions.ConnectionError)
flexmock(module.logger).should_receive('warning').once()
@ -179,7 +208,7 @@ def test_ping_monitor_with_connection_error_logs_warning():
module.ping_monitor(
hook_config,
'config.yaml',
module.monitor.State.FAIL,
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)
@ -193,7 +222,7 @@ def test_ping_monitor_with_other_error_logs_warning():
)
flexmock(module.requests).should_receive('post').with_args(
f'{default_base_url}/{topic}',
headers=return_default_message_headers(module.monitor.State.FAIL),
headers=return_default_message_headers(borgmatic.hooks.monitor.State.FAIL),
auth=None,
).and_return(response)
flexmock(module.logger).should_receive('warning').once()
@ -201,7 +230,7 @@ def test_ping_monitor_with_other_error_logs_warning():
module.ping_monitor(
hook_config,
'config.yaml',
module.monitor.State.FAIL,
borgmatic.hooks.monitor.State.FAIL,
monitoring_log_level=1,
dry_run=False,
)


@ -0,0 +1,125 @@
import pytest
from flexmock import flexmock
from borgmatic.hooks import sqlite as module
def test_dump_databases_logs_and_skips_if_dump_already_exists():
databases = [{'path': '/path/to/database', 'name': 'database'}]
flexmock(module).should_receive('make_dump_path').and_return('/path/to/dump')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'/path/to/dump/database'
)
flexmock(module.os.path).should_receive('exists').and_return(True)
flexmock(module.dump).should_receive('create_parent_directory_for_dump').never()
flexmock(module).should_receive('execute_command').never()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == []
def test_dump_databases_dumps_each_database():
databases = [
{'path': '/path/to/database1', 'name': 'database1'},
{'path': '/path/to/database2', 'name': 'database2'},
]
processes = [flexmock(), flexmock()]
flexmock(module).should_receive('make_dump_path').and_return('/path/to/dump')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'/path/to/dump/database'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_parent_directory_for_dump')
flexmock(module).should_receive('execute_command').and_return(processes[0]).and_return(
processes[1]
)
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
def test_dumping_database_with_non_existent_path_warns_and_dumps_database():
databases = [
{'path': '/path/to/database1', 'name': 'database1'},
]
processes = [flexmock()]
flexmock(module).should_receive('make_dump_path').and_return('/path/to/dump')
flexmock(module.logger).should_receive('warning').once()
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'/path/to/dump/database'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_parent_directory_for_dump')
flexmock(module).should_receive('execute_command').and_return(processes[0])
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
def test_dumping_database_with_name_all_warns_and_dumps_all_databases():
databases = [
{'path': '/path/to/database1', 'name': 'all'},
]
processes = [flexmock()]
flexmock(module).should_receive('make_dump_path').and_return('/path/to/dump')
flexmock(module.logger).should_receive(
'warning'
).twice() # once for the name=all, once for the non-existent path
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'/path/to/dump/database'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_parent_directory_for_dump')
flexmock(module).should_receive('execute_command').and_return(processes[0])
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
def test_dump_databases_does_not_dump_if_dry_run():
databases = [{'path': '/path/to/database', 'name': 'database'}]
flexmock(module).should_receive('make_dump_path').and_return('/path/to/dump')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'/path/to/dump/database'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_parent_directory_for_dump').never()
flexmock(module).should_receive('execute_command').never()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=True) == []
def test_restore_database_dump_restores_database():
database_config = [{'path': '/path/to/database', 'name': 'database'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('execute_command_with_processes').once()
flexmock(module.os).should_receive('remove').once()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
def test_restore_database_dump_does_not_restore_database_if_dry_run():
database_config = [{'path': '/path/to/database', 'name': 'database'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('execute_command_with_processes').never()
flexmock(module.os).should_receive('remove').never()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=True, extract_process=extract_process
)
def test_restore_database_dump_raises_error_if_database_config_is_invalid():
database_config = []
extract_process = flexmock(stdout=flexmock())
with pytest.raises(ValueError):
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
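
Finally, the new test_sqlite.py module above exercises a dump/restore hook for SQLite databases. The tests only pin down the control flow (skip a dump that already exists, warn on a missing source path or the special name "all", do nothing on a dry run, and remove the old database file before restoring from the extract process), so the following is just a compatible sketch of the dump side; the actual sqlite3 invocation and helper names are assumptions rather than taken from the diff:

```python
import logging
import os
import subprocess

logger = logging.getLogger(__name__)


def dump_sqlite_databases(databases, log_prefix, dump_directory, dry_run):
    '''
    Sketch of the flow the tests above assert: one background dump process per
    database, with warnings and skips where the tests expect them.
    '''
    processes = []

    for database in databases:
        database_path = database['path']
        dump_filename = os.path.join(dump_directory, database['name'])

        if database['name'] == 'all':
            logger.warning('The "all" database name has no special meaning for SQLite')
        if not os.path.exists(database_path):
            logger.warning(f'{log_prefix}: No SQLite database found at {database_path}')

        if os.path.exists(dump_filename):
            logger.warning(f'{log_prefix}: Skipping duplicate dump to {dump_filename}')
            continue

        if dry_run:
            continue

        os.makedirs(dump_directory, exist_ok=True)
        # Stream ".dump" output into the dump file; runs in the background so
        # the backup can consume it as it is produced. Command shape assumed.
        processes.append(
            subprocess.Popen(f'sqlite3 {database_path} .dump > {dump_filename}', shell=True)
        )

    return processes
```

The restore tests similarly imply that an existing database file is removed (the mocked `os.remove`) before the dump captured by the extract process is fed back through sqlite3, and that nothing is executed on a dry run.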