Compare commits

...

39 Commits

Author SHA1 Message Date
Dan Helfman b811f125b2 Clarify "checks" configuration documentation for older versions of borgmatic (#639). 2023-02-12 21:42:43 -08:00
Dan Helfman 061f3e7917 Remove related documentation links. 2023-01-26 16:12:01 -08:00
Dan Helfman 6055918907 Upgrade documentation image dependencies. 2023-01-26 16:11:41 -08:00
Dan Helfman 4a90e090ad Clarify NEWS on database "all" dump feature applying to MySQL as well. 2023-01-26 15:28:17 -08:00
Dan Helfman 301b29ee11 Bump version for release. 2023-01-26 15:17:19 -08:00
Dan Helfman c1eb210253 Fix code style flake issue. 2023-01-26 15:09:35 -08:00
Dan Helfman 30cca62d09 Add configuration options for database command customization (#630). 2023-01-26 14:59:17 -08:00
Dan Helfman 113c0e7616 Update documentation about changes to "all" database restores (#438, #560). 2023-01-26 10:53:58 -08:00
Dan Helfman 0e6b2c6773 Optionally dump "all" PostgreSQL databases to separate files instead of one combined dump file (#438, #560). 2023-01-25 23:31:07 -08:00
Dan Helfman 22c750b949 Mention "before_actions" command hook in soft failure documentation (#631). 2023-01-25 13:01:52 -08:00
Dan Helfman 504cce39a1 Add NEWS entry for #629. 2023-01-14 09:17:27 -08:00
Dan Helfman 6c4abb6803 Merge pull request 'Log warning for excluding special files only if list is not empty' (#629) from palto42/borgmatic:special_files_warn into master (Reviewed-on: borgmatic-collective/borgmatic#629) 2023-01-14 17:15:01 +00:00
palto42 fd7ad86daa conditional warning for excluding special files 2023-01-03 21:53:51 +01:00
Dan Helfman 6f3b23c79d Lowercase borgmatic in documentation. 2022-12-23 14:12:48 -08:00
Dan Helfman 4838f5e810 Add borgmatic minimum version to compact docs (#624). (Reviewed-on: borgmatic-collective/borgmatic#625) 2022-12-23 22:11:45 +00:00
Macguire Rintoul 116f1ab989 add borgmatic minimum version to compact docs 2022-12-23 13:32:01 -08:00
Dan Helfman 5e15c9f2bc Fix traceback when include merging on ARM64 (#622). 2022-12-23 10:07:53 -08:00
Dan Helfman 442641f9f6 Update borgmatic social links. 2022-12-16 11:39:05 -08:00
Dan Helfman f67c544be6 Optionally dump "all" PostgreSQL databases to separate files instead of one combined dump file (#438, #560). 2022-12-15 22:59:42 -08:00
Dan Helfman 437fd4dbae Update developer contributing instructions as well. 2022-12-13 23:56:32 -08:00
Dan Helfman 36873252d6 Update developer instructions. 2022-12-13 23:44:27 -08:00
Dan Helfman 1ef82a27fa Clarify data/archives check implicit enabling. 2022-12-12 16:03:05 -08:00
Dan Helfman 6837dcbf42 Clarify documentation about transferring archives between related repositories. 2022-12-10 12:59:44 -08:00
Dan Helfman c657764367 Fix logs that interfere with JSON output by making warnings go to stderr instead of stdout (#602). 2022-12-02 12:12:10 -08:00
Dan Helfman f79286fc91 Bump version for release. 2022-11-27 09:00:40 -08:00
Dan Helfman 694d376d15 Clarify documentation about multiple repositories and separate configuration files (#613). 2022-11-21 13:33:01 -08:00
Dan Helfman ab4c08019c Upgrade pytest test dependency (security). 2022-11-18 11:13:51 -08:00
Dan Helfman fd39f54df7 Code formatting. 2022-11-18 08:35:01 -08:00
Dan Helfman ca7e18bb29 Override PostgreSQL dump/restore commands via configuration options (#311). Merge pull request #49 from jpaniagualaconich/specify-pg-dump-restore-commands 2022-11-18 08:33:14 -08:00
Dan Helfman 6975a5b155 Fix "data" consistency check to support "check_last" and consistency "prefix" options (#611). 2022-11-17 10:19:48 -08:00
Dan Helfman b627d00595 More consistency checks documentation edits. 2022-11-14 15:13:47 -08:00
Dan Helfman 9bd8f1a6df Clarify consistency check configuration. 2022-11-14 14:58:42 -08:00
Javier Paniagua faf682ca35 specify pg dump/restore commands (#311) 2022-11-06 11:12:53 +01:00
Dan Helfman 6aeb74550d Clarify examples in include merging and deep merging documentation (#607). 2022-10-28 19:33:19 -07:00
Dan Helfman 89500df429 Fix traceback when a configuration section is present but lacking any options (#604). 2022-10-23 13:56:03 -07:00
Dan Helfman 82b072d0b7 Update documentation to mention using blake2 with "transfer" action. 2022-10-17 15:04:30 -07:00
Dan Helfman 018c0296fd Document that special file exclusion also excludes symlinks to special files (#596). 2022-10-15 10:14:46 -07:00
Dan Helfman 9c42e7e817 Fix regression in which "check" action errored on certain systems (#597, #598). 2022-10-14 16:19:26 -07:00
Dan Helfman 953277a066 Fix special file detection when broken symlinks are encountered (#596). 2022-10-14 09:41:08 -07:00
91 changed files with 4220 additions and 1612 deletions

NEWS (23 additions)
View File

@ -1,3 +1,26 @@
1.7.6
* #393, #438, #560: Optionally dump "all" PostgreSQL/MySQL databases to separate files instead of
one combined dump file, allowing more convenient restores of individual databases. You can enable
this by specifying the database dump "format" option when the database is named "all".
* #602: Fix logs that interfere with JSON output by making warnings go to stderr instead of stdout.
* #622: Fix traceback when include merging configuration files on ARM64.
* #629: Skip warning about excluded special files when no special files have been excluded.
* #630: Add configuration options for database command customization: "list_options",
"restore_options", and "analyze_options" for PostgreSQL, "restore_options" for MySQL, and
"restore_options" for MongoDB.
1.7.5
* #311: Override PostgreSQL dump/restore commands via configuration options.
* #604: Fix traceback when a configuration section is present but lacking any options.
* #607: Clarify documentation examples for include merging and deep merging.
* #611: Fix "data" consistency check to support "check_last" and consistency "prefix" options.
* #613: Clarify documentation about multiple repositories and separate configuration files.
1.7.4
* #596: Fix special file detection erroring when broken symlinks are encountered.
* #597, #598: Fix regression in which "check" action errored on certain systems ("Cannot determine
Borg repository ID").
1.7.3
* #357: Add "break-lock" action for removing any repository and cache locks leftover from Borg
aborting.
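
As a rough illustration of the 1.7.6 "all" databases entry above, here is a hypothetical sketch of the hooks configuration structure that the new action code in this change set receives. The hook name, database name, and format value are illustrative assumptions, not taken from a real configuration.

# A hypothetical hooks dict, as parsed from a borgmatic configuration file.
hooks = {
    'postgresql_databases': [
        {
            # Naming the database "all" dumps every database on the server.
            'name': 'all',
            # Specifying a dump "format" here is what switches borgmatic to
            # dumping each database to its own file instead of one combined
            # pg_dumpall file (per the 1.7.6 NEWS entry above).
            'format': 'custom',
        },
    ],
}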

View File

@ -104,23 +104,38 @@ offerings, but do not currently fund borgmatic development or hosting.
### Issues
You've got issues? Or an idea for a feature enhancement? We've got an [issue
tracker](https://projects.torsion.org/borgmatic-collective/borgmatic/issues). In order to
create a new issue or comment on an issue, you'll need to [login
first](https://projects.torsion.org/user/login). Note that you can login with
an existing GitHub account if you prefer.
If you'd like to chat with borgmatic developers or users, head on over to the
`#borgmatic` IRC channel on Libera Chat, either via <a
href="https://web.libera.chat/#borgmatic">web chat</a> or a
native <a href="ircs://irc.libera.chat:6697">IRC client</a>. If you
don't get a response right away, please hang around a while—or file a ticket
instead.
Are you experiencing an issue with borgmatic? Or do you have an idea for a
feature enhancement? Head on over to our [issue
tracker](https://projects.torsion.org/borgmatic-collective/borgmatic/issues).
In order to create a new issue or add a comment, you'll need to
[register](https://projects.torsion.org/user/sign_up?invite_code=borgmatic)
first. If you prefer to use an existing GitHub account, you can skip account
creation and [login directly](https://projects.torsion.org/user/login).
Also see the [security
policy](https://torsion.org/borgmatic/docs/security-policy/) for any security
issues.
### Social
Check out the [Borg subreddit](https://www.reddit.com/r/BorgBackup/) for
general Borg and borgmatic discussion and support.
Also follow [borgmatic on Mastodon](https://fosstodon.org/@borgmatic).
### Chat
To chat with borgmatic developers or users, check out the `#borgmatic`
IRC channel on Libera Chat, either via <a
href="https://web.libera.chat/#borgmatic">web chat</a> or a native <a
href="ircs://irc.libera.chat:6697">IRC client</a>. If you don't get a response
right away, please hang around a while—or file a ticket instead.
### Other
Other questions or comments? Contact
[witten@torsion.org](mailto:witten@torsion.org).
@ -135,10 +150,14 @@ borgmatic is licensed under the GNU General Public License version 3 or any
later version.
If you'd like to contribute to borgmatic development, please feel free to
submit a [Pull Request](https://projects.torsion.org/borgmatic-collective/borgmatic/pulls)
or open an [issue](https://projects.torsion.org/borgmatic-collective/borgmatic/issues) first
to discuss your idea. We also accept Pull Requests on GitHub, if that's more
your thing. In general, contributions are very welcome. We don't bite!
submit a [Pull
Request](https://projects.torsion.org/borgmatic-collective/borgmatic/pulls) or
open an
[issue](https://projects.torsion.org/borgmatic-collective/borgmatic/issues) to
discuss your idea. Note that you'll need to
[register](https://projects.torsion.org/user/sign_up?invite_code=borgmatic)
first. We also accept Pull Requests on GitHub, if that's more your thing. In
general, contributions are very welcome. We don't bite!
Also, please check out the [borgmatic development
how-to](https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/) for

View File

borgmatic/actions/borg.py (new file, 36 additions)
View File

@ -0,0 +1,36 @@
import logging
import borgmatic.borg.borg
import borgmatic.borg.rlist
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_borg(
repository, storage, local_borg_version, borg_arguments, local_path, remote_path,
):
'''
Run the "borg" action for the given repository.
'''
if borg_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, borg_arguments.repository
):
logger.info('{}: Running arbitrary Borg command'.format(repository))
archive_name = borgmatic.borg.rlist.resolve_archive_name(
repository,
borg_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
borgmatic.borg.borg.run_arbitrary_borg(
repository,
storage,
local_borg_version,
options=borg_arguments.options,
archive=archive_name,
local_path=local_path,
remote_path=remote_path,
)
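
A minimal sketch of how this new action module might be invoked directly, assuming borgmatic is importable and a repository exists at the hypothetical path below; in practice the borgmatic command dispatcher supplies these arguments from parsed configuration and command-line flags.

import argparse

import borgmatic.actions.borg

# Stand-in for the parsed "borg" subcommand arguments (values are assumptions).
borg_arguments = argparse.Namespace(
    repository=None,     # None matches every configured repository
    archive=None,        # no archive, so the Borg command targets the repository
    options=['list'],    # arbitrary Borg arguments passed through verbatim
)

borgmatic.actions.borg.run_borg(
    '/path/to/repo.borg',   # hypothetical repository path
    {},                     # storage configuration dict
    '1.2.3',                # local Borg version
    borg_arguments,
    'borg',                 # local Borg executable
    None,                   # remote Borg path
)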

View File

@ -0,0 +1,21 @@
import logging
import borgmatic.borg.break_lock
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_break_lock(
repository, storage, local_borg_version, break_lock_arguments, local_path, remote_path,
):
'''
Run the "break-lock" action for the given repository.
'''
if break_lock_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, break_lock_arguments.repository
):
logger.info(f'{repository}: Breaking repository and cache locks')
borgmatic.borg.break_lock.break_lock(
repository, storage, local_borg_version, local_path=local_path, remote_path=remote_path,
)

View File

@ -0,0 +1,55 @@
import logging
import borgmatic.borg.check
import borgmatic.hooks.command
logger = logging.getLogger(__name__)
def run_check(
config_filename,
repository,
location,
storage,
consistency,
hooks,
hook_context,
local_borg_version,
check_arguments,
global_arguments,
local_path,
remote_path,
):
'''
Run the "check" action for the given repository.
'''
borgmatic.hooks.command.execute_hook(
hooks.get('before_check'),
hooks.get('umask'),
config_filename,
'pre-check',
global_arguments.dry_run,
**hook_context,
)
logger.info('{}: Running consistency checks'.format(repository))
borgmatic.borg.check.check_archives(
repository,
location,
storage,
consistency,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
progress=check_arguments.progress,
repair=check_arguments.repair,
only_checks=check_arguments.only,
force=check_arguments.force,
)
borgmatic.hooks.command.execute_hook(
hooks.get('after_check'),
hooks.get('umask'),
config_filename,
'post-check',
global_arguments.dry_run,
**hook_context,
)
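
For completeness, a hedged sketch of calling run_check() directly with empty hook configuration, so the before_check/after_check hooks are no-ops. The same config_filename/hooks/hook_context calling convention recurs in the compact, prune, create, and extract modules below; all values here are illustrative assumptions.

import argparse

import borgmatic.actions.check

check_arguments = argparse.Namespace(progress=False, repair=False, only=None, force=False)
global_arguments = argparse.Namespace(dry_run=False)

borgmatic.actions.check.run_check(
    '/etc/borgmatic/config.yaml',          # config_filename (hypothetical)
    '/path/to/repo.borg',                  # repository
    {},                                    # location configuration dict
    {},                                    # storage configuration dict
    {'checks': [{'name': 'repository'}]},  # consistency configuration dict (assumed shape)
    {},                                    # hooks dict: no before_check/after_check commands
    {},                                    # hook_context
    '1.2.3',                               # local Borg version
    check_arguments,
    global_arguments,
    'borg',
    None,
)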

View File

@ -0,0 +1,57 @@
import logging
import borgmatic.borg.compact
import borgmatic.borg.feature
import borgmatic.hooks.command
logger = logging.getLogger(__name__)
def run_compact(
config_filename,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
compact_arguments,
global_arguments,
dry_run_label,
local_path,
remote_path,
):
'''
Run the "compact" action for the given repository.
'''
borgmatic.hooks.command.execute_hook(
hooks.get('before_compact'),
hooks.get('umask'),
config_filename,
'pre-compact',
global_arguments.dry_run,
**hook_context,
)
if borgmatic.borg.feature.available(borgmatic.borg.feature.Feature.COMPACT, local_borg_version):
logger.info('{}: Compacting segments{}'.format(repository, dry_run_label))
borgmatic.borg.compact.compact_segments(
global_arguments.dry_run,
repository,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
progress=compact_arguments.progress,
cleanup_commits=compact_arguments.cleanup_commits,
threshold=compact_arguments.threshold,
)
else: # pragma: nocover
logger.info('{}: Skipping compact (only available/needed in Borg 1.2+)'.format(repository))
borgmatic.hooks.command.execute_hook(
hooks.get('after_compact'),
hooks.get('umask'),
config_filename,
'post-compact',
global_arguments.dry_run,
**hook_context,
)

View File

@ -0,0 +1,90 @@
import json
import logging
import borgmatic.borg.create
import borgmatic.hooks.command
import borgmatic.hooks.dispatch
import borgmatic.hooks.dump
logger = logging.getLogger(__name__)
def run_create(
config_filename,
repository,
location,
storage,
hooks,
hook_context,
local_borg_version,
create_arguments,
global_arguments,
dry_run_label,
local_path,
remote_path,
):
'''
Run the "create" action for the given repository.
If create_arguments.json is True, yield the JSON output from creating the archive.
'''
borgmatic.hooks.command.execute_hook(
hooks.get('before_backup'),
hooks.get('umask'),
config_filename,
'pre-backup',
global_arguments.dry_run,
**hook_context,
)
logger.info('{}: Creating archive{}'.format(repository, dry_run_label))
borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
active_dumps = borgmatic.hooks.dispatch.call_hooks(
'dump_databases',
hooks,
repository,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
stream_processes = [process for processes in active_dumps.values() for process in processes]
json_output = borgmatic.borg.create.create_archive(
global_arguments.dry_run,
repository,
location,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
progress=create_arguments.progress,
stats=create_arguments.stats,
json=create_arguments.json,
list_files=create_arguments.list_files,
stream_processes=stream_processes,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
config_filename,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
borgmatic.hooks.command.execute_hook(
hooks.get('after_backup'),
hooks.get('umask'),
config_filename,
'post-backup',
global_arguments.dry_run,
**hook_context,
)
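
Because run_create() is a generator that only yields parsed JSON when --json was requested, a caller iterates over it regardless. A hedged sketch, with hypothetical paths and empty hook/database configuration:

import argparse

import borgmatic.actions.create

create_arguments = argparse.Namespace(progress=False, stats=False, json=True, list_files=False)
global_arguments = argparse.Namespace(dry_run=False)

for parsed_json in borgmatic.actions.create.run_create(
    '/etc/borgmatic/config.yaml',         # config_filename (hypothetical)
    '/path/to/repo.borg',                 # repository
    {'source_directories': ['/home']},    # location configuration dict
    {},                                   # storage configuration dict
    {},                                   # hooks dict: no command or database hooks
    {},                                   # hook_context
    '1.2.3',                              # local Borg version
    create_arguments,
    global_arguments,
    '',                                   # dry_run_label
    'borg',
    None,
):
    print(parsed_json)   # Borg's "create --json" output, already parsed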

View File

@ -0,0 +1,48 @@
import logging
import borgmatic.borg.export_tar
import borgmatic.borg.rlist
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_export_tar(
repository,
storage,
local_borg_version,
export_tar_arguments,
global_arguments,
local_path,
remote_path,
):
'''
Run the "export-tar" action for the given repository.
'''
if export_tar_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, export_tar_arguments.repository
):
logger.info(
'{}: Exporting archive {} as tar file'.format(repository, export_tar_arguments.archive)
)
borgmatic.borg.export_tar.export_tar_archive(
global_arguments.dry_run,
repository,
borgmatic.borg.rlist.resolve_archive_name(
repository,
export_tar_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
),
export_tar_arguments.paths,
export_tar_arguments.destination,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
tar_filter=export_tar_arguments.tar_filter,
list_files=export_tar_arguments.list_files,
strip_components=export_tar_arguments.strip_components,
)

View File

@ -0,0 +1,67 @@
import logging
import borgmatic.borg.extract
import borgmatic.borg.rlist
import borgmatic.config.validate
import borgmatic.hooks.command
logger = logging.getLogger(__name__)
def run_extract(
config_filename,
repository,
location,
storage,
hooks,
hook_context,
local_borg_version,
extract_arguments,
global_arguments,
local_path,
remote_path,
):
'''
Run the "extract" action for the given repository.
'''
borgmatic.hooks.command.execute_hook(
hooks.get('before_extract'),
hooks.get('umask'),
config_filename,
'pre-extract',
global_arguments.dry_run,
**hook_context,
)
if extract_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, extract_arguments.repository
):
logger.info('{}: Extracting archive {}'.format(repository, extract_arguments.archive))
borgmatic.borg.extract.extract_archive(
global_arguments.dry_run,
repository,
borgmatic.borg.rlist.resolve_archive_name(
repository,
extract_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
),
extract_arguments.paths,
location,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
destination_path=extract_arguments.destination,
strip_components=extract_arguments.strip_components,
progress=extract_arguments.progress,
)
borgmatic.hooks.command.execute_hook(
hooks.get('after_extract'),
hooks.get('umask'),
config_filename,
'post-extract',
global_arguments.dry_run,
**hook_context,
)

borgmatic/actions/info.py (new file, 41 additions)
View File

@ -0,0 +1,41 @@
import json
import logging
import borgmatic.borg.info
import borgmatic.borg.rlist
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_info(
repository, storage, local_borg_version, info_arguments, local_path, remote_path,
):
'''
Run the "info" action for the given repository and archive.
If info_arguments.json is True, yield the JSON output from the info for the archive.
'''
if info_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, info_arguments.repository
):
if not info_arguments.json: # pragma: nocover
logger.answer(f'{repository}: Displaying archive summary information')
info_arguments.archive = borgmatic.borg.rlist.resolve_archive_name(
repository,
info_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
json_output = borgmatic.borg.info.display_archives_info(
repository,
storage,
local_borg_version,
info_arguments=info_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)

borgmatic/actions/list.py (new file, 43 additions)
View File

@ -0,0 +1,43 @@
import json
import logging
import borgmatic.borg.list
import borgmatic.borg.rlist
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_list(
repository, storage, local_borg_version, list_arguments, local_path, remote_path,
):
'''
Run the "list" action for the given repository and archive.
If list_arguments.json is True, yield the JSON output from listing the archive.
'''
if list_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, list_arguments.repository
):
if not list_arguments.json: # pragma: nocover
if list_arguments.find_paths:
logger.answer(f'{repository}: Searching archives')
elif not list_arguments.archive:
logger.answer(f'{repository}: Listing archives')
list_arguments.archive = borgmatic.borg.rlist.resolve_archive_name(
repository,
list_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
json_output = borgmatic.borg.list.list_archive(
repository,
storage,
local_borg_version,
list_arguments=list_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)

View File

@ -0,0 +1,42 @@
import logging
import borgmatic.borg.mount
import borgmatic.borg.rlist
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_mount(
repository, storage, local_borg_version, mount_arguments, local_path, remote_path,
):
'''
Run the "mount" action for the given repository.
'''
if mount_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, mount_arguments.repository
):
if mount_arguments.archive:
logger.info('{}: Mounting archive {}'.format(repository, mount_arguments.archive))
else: # pragma: nocover
logger.info('{}: Mounting repository'.format(repository))
borgmatic.borg.mount.mount_archive(
repository,
borgmatic.borg.rlist.resolve_archive_name(
repository,
mount_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
),
mount_arguments.mount_point,
mount_arguments.paths,
mount_arguments.foreground,
mount_arguments.options,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
)

View File

@ -0,0 +1,53 @@
import logging
import borgmatic.borg.prune
import borgmatic.hooks.command
logger = logging.getLogger(__name__)
def run_prune(
config_filename,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
prune_arguments,
global_arguments,
dry_run_label,
local_path,
remote_path,
):
'''
Run the "prune" action for the given repository.
'''
borgmatic.hooks.command.execute_hook(
hooks.get('before_prune'),
hooks.get('umask'),
config_filename,
'pre-prune',
global_arguments.dry_run,
**hook_context,
)
logger.info('{}: Pruning archives{}'.format(repository, dry_run_label))
borgmatic.borg.prune.prune_archives(
global_arguments.dry_run,
repository,
storage,
retention,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
stats=prune_arguments.stats,
list_archives=prune_arguments.list_archives,
)
borgmatic.hooks.command.execute_hook(
hooks.get('after_prune'),
hooks.get('umask'),
config_filename,
'post-prune',
global_arguments.dry_run,
**hook_context,
)

View File

@ -0,0 +1,34 @@
import logging
import borgmatic.borg.rcreate
logger = logging.getLogger(__name__)
def run_rcreate(
repository,
storage,
local_borg_version,
rcreate_arguments,
global_arguments,
local_path,
remote_path,
):
'''
Run the "rcreate" action for the given repository.
'''
logger.info('{}: Creating repository'.format(repository))
borgmatic.borg.rcreate.create_repository(
global_arguments.dry_run,
repository,
storage,
local_borg_version,
rcreate_arguments.encryption_mode,
rcreate_arguments.source_repository,
rcreate_arguments.copy_crypt_key,
rcreate_arguments.append_only,
rcreate_arguments.storage_quota,
rcreate_arguments.make_parent_dirs,
local_path=local_path,
remote_path=remote_path,
)

View File

@ -0,0 +1,345 @@
import copy
import logging
import os
import borgmatic.borg.extract
import borgmatic.borg.list
import borgmatic.borg.mount
import borgmatic.borg.rlist
import borgmatic.borg.state
import borgmatic.config.validate
import borgmatic.hooks.dispatch
import borgmatic.hooks.dump
logger = logging.getLogger(__name__)
UNSPECIFIED_HOOK = object()
def get_configured_database(
hooks, archive_database_names, hook_name, database_name, configuration_database_name=None
):
'''
Find the first database with the given hook name and database name in the configured hooks
dict and the given archive database names dict (from hook name to database names contained in
a particular backup archive). If UNSPECIFIED_HOOK is given as the hook name, search all database
hooks for the named database. If a configuration database name is given, use that instead of the
database name to look up the database in the given hooks configuration.
Return the found database as a tuple of (found hook name, database configuration dict).
'''
if not configuration_database_name:
configuration_database_name = database_name
if hook_name == UNSPECIFIED_HOOK:
hooks_to_search = hooks
else:
hooks_to_search = {hook_name: hooks[hook_name]}
return next(
(
(name, hook_database)
for (name, hook) in hooks_to_search.items()
for hook_database in hook
if hook_database['name'] == configuration_database_name
and database_name in archive_database_names.get(name, [])
),
(None, None),
)
def get_configured_hook_name_and_database(hooks, database_name):
'''
Find the hook name and first database dict with the given database name in the configured hooks
dict. This searches across all database hooks.
'''
def restore_single_database(
repository,
location,
storage,
hooks,
local_borg_version,
global_arguments,
local_path,
remote_path,
archive_name,
hook_name,
database,
): # pragma: no cover
'''
Given (among other things) an archive name, a database hook name, and a configured database
configuration dict, restore that database from the archive.
'''
logger.info(f'{repository}: Restoring database {database["name"]}')
dump_pattern = borgmatic.hooks.dispatch.call_hooks(
'make_database_dump_pattern',
hooks,
repository,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
database['name'],
)[hook_name]
# Kick off a single database extract to stdout.
extract_process = borgmatic.borg.extract.extract_archive(
dry_run=global_arguments.dry_run,
repository=repository,
archive=archive_name,
paths=borgmatic.hooks.dump.convert_glob_patterns_to_borg_patterns([dump_pattern]),
location_config=location,
storage_config=storage,
local_borg_version=local_borg_version,
local_path=local_path,
remote_path=remote_path,
destination_path='/',
# A directory format dump isn't a single file, and therefore can't extract
# to stdout. In this case, the extract_process return value is None.
extract_to_stdout=bool(database.get('format') != 'directory'),
)
# Run a single database restore, consuming the extract stdout (if any).
borgmatic.hooks.dispatch.call_hooks(
'restore_database_dump',
{hook_name: [database]},
repository,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
extract_process,
)
def collect_archive_database_names(
repository, archive, location, storage, local_borg_version, local_path, remote_path,
):
'''
Given a local or remote repository path, a resolved archive name, a location configuration dict,
a storage configuration dict, the local Borg version, and local and remote Borg paths, query the
archive for the names of databases it contains and return them as a dict from hook name to a
sequence of database names.
'''
borgmatic_source_directory = os.path.expanduser(
location.get(
'borgmatic_source_directory', borgmatic.borg.state.DEFAULT_BORGMATIC_SOURCE_DIRECTORY
)
).lstrip('/')
parent_dump_path = os.path.expanduser(
borgmatic.hooks.dump.make_database_dump_path(borgmatic_source_directory, '*_databases/*/*')
)
dump_paths = borgmatic.borg.list.capture_archive_listing(
repository,
archive,
storage,
local_borg_version,
list_path=parent_dump_path,
local_path=local_path,
remote_path=remote_path,
)
# Determine the database names corresponding to the dumps found in the archive and
# add them to restore_names.
archive_database_names = {}
for dump_path in dump_paths:
try:
(hook_name, _, database_name) = dump_path.split(
borgmatic_source_directory + os.path.sep, 1
)[1].split(os.path.sep)[0:3]
except (ValueError, IndexError):
logger.warning(
f'{repository}: Ignoring invalid database dump path "{dump_path}" in archive {archive}'
)
else:
if database_name not in archive_database_names.get(hook_name, []):
archive_database_names.setdefault(hook_name, []).extend([database_name])
return archive_database_names
def find_databases_to_restore(requested_database_names, archive_database_names):
'''
Given a sequence of requested database names to restore and a dict of hook name to the names of
databases found in an archive, return an expanded sequence of database names to restore,
replacing "all" with actual database names as appropriate.
Raise ValueError if any of the requested database names cannot be found in the archive.
'''
# A map from database hook name to the database names to restore for that hook.
restore_names = (
{UNSPECIFIED_HOOK: requested_database_names}
if requested_database_names
else {UNSPECIFIED_HOOK: ['all']}
)
# If "all" is in restore_names, then replace it with the names of dumps found within the
# archive.
if 'all' in restore_names[UNSPECIFIED_HOOK]:
restore_names[UNSPECIFIED_HOOK].remove('all')
for (hook_name, database_names) in archive_database_names.items():
restore_names.setdefault(hook_name, []).extend(database_names)
# If a database is to be restored as part of "all", then remove it from restore names so
# it doesn't get restored twice.
for database_name in database_names:
if database_name in restore_names[UNSPECIFIED_HOOK]:
restore_names[UNSPECIFIED_HOOK].remove(database_name)
if not restore_names[UNSPECIFIED_HOOK]:
restore_names.pop(UNSPECIFIED_HOOK)
combined_restore_names = set(
name for database_names in restore_names.values() for name in database_names
)
combined_archive_database_names = set(
name for database_names in archive_database_names.values() for name in database_names
)
missing_names = sorted(set(combined_restore_names) - combined_archive_database_names)
if missing_names:
joined_names = ', '.join(f'"{name}"' for name in missing_names)
raise ValueError(
f"Cannot restore database{'s' if len(missing_names) > 1 else ''} {joined_names} missing from archive"
)
return restore_names
def ensure_databases_found(restore_names, remaining_restore_names, found_names):
'''
Given a dict from hook name to database names to restore, a dict from hook name to remaining
database names to restore, and a sequence of found (actually restored) database names, raise
ValueError if requested databases to restore were missing from the archive and/or configuration.
'''
combined_restore_names = set(
name
for database_names in tuple(restore_names.values())
+ tuple(remaining_restore_names.values())
for name in database_names
)
if not combined_restore_names and not found_names:
raise ValueError('No databases were found to restore')
missing_names = sorted(set(combined_restore_names) - set(found_names))
if missing_names:
joined_names = ', '.join(f'"{name}"' for name in missing_names)
raise ValueError(
f"Cannot restore database{'s' if len(missing_names) > 1 else ''} {joined_names} missing from borgmatic's configuration"
)
def run_restore(
repository,
location,
storage,
hooks,
local_borg_version,
restore_arguments,
global_arguments,
local_path,
remote_path,
):
'''
Run the "restore" action for the given repository, but only if the repository matches the
requested repository in restore arguments.
Raise ValueError if a configured database could not be found to restore.
'''
if restore_arguments.repository and not borgmatic.config.validate.repositories_match(
repository, restore_arguments.repository
):
return
logger.info(
'{}: Restoring databases from archive {}'.format(repository, restore_arguments.archive)
)
borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
archive_name = borgmatic.borg.rlist.resolve_archive_name(
repository, restore_arguments.archive, storage, local_borg_version, local_path, remote_path,
)
archive_database_names = collect_archive_database_names(
repository, archive_name, location, storage, local_borg_version, local_path, remote_path,
)
restore_names = find_databases_to_restore(restore_arguments.databases, archive_database_names)
found_names = set()
remaining_restore_names = {}
for hook_name, database_names in restore_names.items():
for database_name in database_names:
found_hook_name, found_database = get_configured_database(
hooks, archive_database_names, hook_name, database_name
)
if not found_database:
remaining_restore_names.setdefault(found_hook_name or hook_name, []).append(
database_name
)
continue
found_names.add(database_name)
restore_single_database(
repository,
location,
storage,
hooks,
local_borg_version,
global_arguments,
local_path,
remote_path,
archive_name,
found_hook_name or hook_name,
found_database,
)
# For any databases that weren't found via exact matches in the hooks configuration, try to
# fall back to "all" entries.
for hook_name, database_names in remaining_restore_names.items():
for database_name in database_names:
found_hook_name, found_database = get_configured_database(
hooks, archive_database_names, hook_name, database_name, 'all'
)
if not found_database:
continue
found_names.add(database_name)
database = copy.copy(found_database)
database['name'] = database_name
restore_single_database(
repository,
location,
storage,
hooks,
local_borg_version,
global_arguments,
local_path,
remote_path,
archive_name,
found_hook_name or hook_name,
database,
)
borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
ensure_databases_found(restore_names, remaining_restore_names, found_names)
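
find_databases_to_restore() and ensure_databases_found() are pure functions, so their behavior can be shown without touching a repository. A small sketch with made-up hook and database names:

import borgmatic.actions.restore as restore

# Databases found in an archive, keyed by hook name (the shape returned by
# collect_archive_database_names() above); names are made up.
archive_database_names = {
    'postgresql_databases': ['users', 'orders'],
    'mysql_databases': ['analytics'],
}

# Requesting "all" expands to every database found in the archive:
# {'postgresql_databases': ['users', 'orders'], 'mysql_databases': ['analytics']}
print(restore.find_databases_to_restore(['all'], archive_database_names))

# Requesting a database that the archive doesn't contain raises ValueError.
try:
    restore.find_databases_to_restore(['missing'], archive_database_names)
except ValueError as error:
    print(error)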

View File

@ -0,0 +1,32 @@
import json
import logging
import borgmatic.borg.rinfo
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_rinfo(
repository, storage, local_borg_version, rinfo_arguments, local_path, remote_path,
):
'''
Run the "rinfo" action for the given repository.
If rinfo_arguments.json is True, yield the JSON output from the info for the repository.
'''
if rinfo_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, rinfo_arguments.repository
):
if not rinfo_arguments.json: # pragma: nocover
logger.answer('{}: Displaying repository summary information'.format(repository))
json_output = borgmatic.borg.rinfo.display_repository_info(
repository,
storage,
local_borg_version,
rinfo_arguments=rinfo_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)

View File

@ -0,0 +1,32 @@
import json
import logging
import borgmatic.borg.rlist
import borgmatic.config.validate
logger = logging.getLogger(__name__)
def run_rlist(
repository, storage, local_borg_version, rlist_arguments, local_path, remote_path,
):
'''
Run the "rlist" action for the given repository.
If rlist_arguments.json is True, yield the JSON output from listing the repository.
'''
if rlist_arguments.repository is None or borgmatic.config.validate.repositories_match(
repository, rlist_arguments.repository
):
if not rlist_arguments.json: # pragma: nocover
logger.answer('{}: Listing repository'.format(repository))
json_output = borgmatic.borg.rlist.list_repository(
repository,
storage,
local_borg_version,
rlist_arguments=rlist_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)

View File

@ -0,0 +1,29 @@
import logging
import borgmatic.borg.transfer
logger = logging.getLogger(__name__)
def run_transfer(
repository,
storage,
local_borg_version,
transfer_arguments,
global_arguments,
local_path,
remote_path,
):
'''
Run the "transfer" action for the given repository.
'''
logger.info(f'{repository}: Transferring archives to repository')
borgmatic.borg.transfer.transfer_archives(
global_arguments.dry_run,
repository,
storage,
local_borg_version,
transfer_arguments,
local_path=local_path,
remote_path=remote_path,
)

View File

@ -1,5 +1,6 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, flags
from borgmatic.execute import execute_command
@ -25,6 +26,7 @@ def run_arbitrary_borg(
sequence of arbitrary command-line Borg options, and an optional archive name, run an arbitrary
Borg command on the given repository/archive.
'''
borgmatic.logger.add_custom_log_levels()
lock_wait = storage_config.get('lock_wait', None)
try:
@ -60,7 +62,7 @@ def run_arbitrary_borg(
return execute_command(
full_command,
output_log_level=logging.WARNING,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
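
The logging.ANSWER level used here is a custom level registered by borgmatic.logger.add_custom_log_levels(); it carries user-facing output, as opposed to warnings, which this change set routes to stderr. A tiny sketch of using it outside borgmatic's own modules:

import logging

import borgmatic.logger

# Registers logging.ANSWER and an answer() method on loggers.
borgmatic.logger.add_custom_log_levels()

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('example')
logger.answer('archive listings and similar output now use the ANSWER level')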

View File

@ -166,6 +166,12 @@ def make_check_flags(local_borg_version, checks, check_last=None, prefix=None):
"--last" flag. And if a prefix value is given and "archives" is in checks, then include a
"--match-archives" flag.
'''
if 'data' in checks:
data_flags = ('--verify-data',)
checks += ('archives',)
else:
data_flags = ()
if 'archives' in checks:
last_flags = ('--last', str(check_last)) if check_last else ()
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
@ -176,17 +182,13 @@ def make_check_flags(local_borg_version, checks, check_last=None, prefix=None):
last_flags = ()
match_archives_flags = ()
if check_last:
logger.info('Ignoring check_last option, as "archives" is not in consistency checks')
if prefix:
logger.info(
'Ignoring consistency prefix option, as "archives" is not in consistency checks'
logger.warning(
'Ignoring check_last option, as "archives" or "data" are not in consistency checks'
)
if prefix:
logger.warning(
'Ignoring consistency prefix option, as "archives" or "data" are not in consistency checks'
)
if 'data' in checks:
data_flags = ('--verify-data',)
checks += ('archives',)
else:
data_flags = ()
common_flags = last_flags + match_archives_flags + data_flags
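
To make the behavior change concrete: with the reworked make_check_flags() above, requesting the "data" check implies "archives", so --verify-data is combined with the archive-related flags and check_last is honored rather than ignored with a warning. A hedged sketch, assuming the surrounding (unchanged) parts of the function:

from borgmatic.borg import check

flags = check.make_check_flags(
    '1.2.3',      # local Borg version (hypothetical)
    ('data',),    # requested consistency checks; "archives" is now implied
    check_last=3,
)
# Expect the result to include ('--verify-data',) and ('--last', '3') along
# with whatever archive/repository flags the rest of the function adds.
print(flags)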

View File

@ -6,8 +6,14 @@ import pathlib
import stat
import tempfile
import borgmatic.logger
from borgmatic.borg import environment, feature, flags, state
from borgmatic.execute import DO_NOT_CAPTURE, execute_command, execute_command_with_processes
from borgmatic.execute import (
DO_NOT_CAPTURE,
execute_command,
execute_command_and_capture_output,
execute_command_with_processes,
)
logger = logging.getLogger(__name__)
@ -229,7 +235,11 @@ def special_file(path):
Return whether the given path is a special file (character device, block device, or named pipe
/ FIFO).
'''
mode = os.stat(path).st_mode
try:
mode = os.stat(path).st_mode
except (FileNotFoundError, OSError):
return False
return stat.S_ISCHR(mode) or stat.S_ISBLK(mode) or stat.S_ISFIFO(mode)
@ -255,10 +265,9 @@ def collect_special_file_paths(
Borg would encounter during a create. These are all paths that could cause Borg to hang if its
--read-special flag is used.
'''
paths_output = execute_command(
paths_output = execute_command_and_capture_output(
create_command + ('--dry-run', '--list'),
output_log_level=None,
borg_local_path=local_path,
capture_stderr=True,
working_directory=working_directory,
extra_environment=borg_environment,
)
@ -297,6 +306,7 @@ def create_archive(
If a sequence of stream processes is given (instances of subprocess.Popen), then execute the
create command while also triggering the given processes to produce output.
'''
borgmatic.logger.add_custom_log_levels()
borgmatic_source_directories = expand_directories(
collect_borgmatic_source_directories(location_config.get('borgmatic_source_directory'))
)
@ -398,8 +408,8 @@ def create_archive(
if json:
output_log_level = None
elif (stats or list_files) and logger.getEffectiveLevel() == logging.WARNING:
output_log_level = logging.WARNING
elif list_files or (stats and not dry_run):
output_log_level = logging.ANSWER
else:
output_log_level = logging.INFO
@ -420,18 +430,17 @@ def create_archive(
borg_environment,
skip_directories=borgmatic_source_directories,
)
logger.warning(
f'{repository}: Excluding special files to prevent Borg from hanging: {", ".join(special_file_paths)}'
)
exclude_file = write_pattern_file(
expand_home_directories(
tuple(location_config.get('exclude_patterns') or ()) + special_file_paths
),
pattern_file=exclude_file,
)
if exclude_file:
if special_file_paths:
logger.warning(
f'{repository}: Excluding special files to prevent Borg from hanging: {", ".join(special_file_paths)}'
)
exclude_file = write_pattern_file(
expand_home_directories(
tuple(location_config.get('exclude_patterns') or ()) + special_file_paths
),
pattern_file=exclude_file,
)
create_command += make_exclude_flags(location_config, exclude_file.name)
create_command += (
@ -452,12 +461,16 @@ def create_archive(
working_directory=working_directory,
extra_environment=borg_environment,
)
return execute_command(
create_command,
output_log_level,
output_file,
borg_local_path=local_path,
working_directory=working_directory,
extra_environment=borg_environment,
)
elif output_log_level is None:
return execute_command_and_capture_output(
create_command, working_directory=working_directory, extra_environment=borg_environment,
)
else:
execute_command(
create_command,
output_log_level,
output_file,
borg_local_path=local_path,
working_directory=working_directory,
extra_environment=borg_environment,
)
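
The hardened special_file() above now returns False instead of letting os.stat() raise when it meets a broken symlink (#596). A small self-contained sketch on a typical Linux system:

import os
import tempfile

from borgmatic.borg import create

with tempfile.TemporaryDirectory() as directory:
    dangling = os.path.join(directory, 'dangling')
    os.symlink(os.path.join(directory, 'missing-target'), dangling)

    print(create.special_file(dangling))     # False: stat fails, so not treated as special
    print(create.special_file('/dev/null'))  # True: character device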

View File

@ -1,6 +1,7 @@
import logging
import os
import borgmatic.logger
from borgmatic.borg import environment, flags
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
@ -30,6 +31,7 @@ def export_tar_archive(
If the destination path is "-", then stream the output to stdout instead of to a file.
'''
borgmatic.logger.add_custom_log_levels()
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
@ -53,8 +55,8 @@ def export_tar_archive(
+ (tuple(paths) if paths else ())
)
if list_files and logger.getEffectiveLevel() == logging.WARNING:
output_log_level = logging.WARNING
if list_files:
output_log_level = logging.ANSWER
else:
output_log_level = logging.INFO

View File

@ -1,7 +1,8 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
@ -19,6 +20,7 @@ def display_archives_info(
arguments to the info action, display summary information for Borg archives in the repository or
return JSON summary information.
'''
borgmatic.logger.add_custom_log_levels()
lock_wait = storage_config.get('lock_wait', None)
full_command = (
@ -55,9 +57,14 @@ def display_archives_info(
)
)
return execute_command(
full_command,
output_log_level=None if info_arguments.json else logging.WARNING,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
if info_arguments.json:
return execute_command_and_capture_output(
full_command, extra_environment=environment.make_environment(storage_config),
)
else:
execute_command(
full_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)

View File

@ -3,8 +3,9 @@ import copy
import logging
import re
import borgmatic.logger
from borgmatic.borg import environment, feature, flags, rlist
from borgmatic.execute import execute_command
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
@ -84,6 +85,46 @@ def make_find_paths(find_paths):
)
def capture_archive_listing(
repository,
archive,
storage_config,
local_borg_version,
list_path=None,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, an archive name, a storage config dict, the local Borg
version, the archive path in which to list files, and local and remote Borg paths, capture the
output of listing that archive and return it as a list of file paths.
'''
borg_environment = environment.make_environment(storage_config)
return tuple(
execute_command_and_capture_output(
make_list_command(
repository,
storage_config,
local_borg_version,
argparse.Namespace(
repository=repository,
archive=archive,
paths=[f'sh:{list_path}'],
find_paths=None,
json=None,
format='{path}{NL}',
),
local_path,
remote_path,
),
extra_environment=borg_environment,
)
.strip('\n')
.split('\n')
)
def list_archive(
repository,
storage_config,
@ -99,6 +140,8 @@ def list_archive(
list the files by searching across multiple archives. If neither find_paths nor archive name
are given, instead list the archives in the given repository.
'''
borgmatic.logger.add_custom_log_levels()
if not list_arguments.archive and not list_arguments.find_paths:
if feature.available(feature.Feature.RLIST, local_borg_version):
logger.warning(
@ -151,7 +194,7 @@ def list_archive(
# Ask Borg to list archives. Capture its output for use below.
archive_lines = tuple(
execute_command(
execute_command_and_capture_output(
rlist.make_rlist_command(
repository,
storage_config,
@ -160,8 +203,6 @@ def list_archive(
local_path,
remote_path,
),
output_log_level=None,
borg_local_path=local_path,
extra_environment=borg_environment,
)
.strip('\n')
@ -172,7 +213,7 @@ def list_archive(
# For each archive listed by Borg, run list on the contents of that archive.
for archive in archive_lines:
logger.warning(f'{repository}: Listing archive {archive}')
logger.answer(f'{repository}: Listing archive {archive}')
archive_arguments = copy.copy(list_arguments)
archive_arguments.archive = archive
@ -193,7 +234,7 @@ def list_archive(
execute_command(
main_command,
output_log_level=logging.WARNING,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=borg_environment,
)
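
A hedged sketch of the new capture_archive_listing() helper defined above, assuming a reachable repository and archive; it returns the matching archive paths as a tuple of strings rather than logging them. The repository, archive, and shell pattern below are hypothetical.

from borgmatic.borg import list as borg_list

paths = borg_list.capture_archive_listing(
    '/path/to/repo.borg',    # repository (hypothetical)
    'my-archive',            # archive name (hypothetical)
    {},                      # storage configuration dict
    '1.2.3',                 # local Borg version
    list_path='root/.borgmatic/*_databases/*/*',
)
print(paths)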

View File

@ -1,5 +1,6 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command
@ -52,6 +53,7 @@ def prune_archives(
retention config dict, prune Borg archives according to the retention policy specified in that
configuration.
'''
borgmatic.logger.add_custom_log_levels()
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
extra_borg_options = storage_config.get('extra_borg_options', {}).get('prune', '')
@ -75,8 +77,8 @@ def prune_archives(
+ flags.make_repository_flags(repository, local_borg_version)
)
if (stats or list_archives) and logger.getEffectiveLevel() == logging.WARNING:
output_log_level = logging.WARNING
if stats or list_archives:
output_log_level = logging.ANSWER
else:
output_log_level = logging.INFO

View File

@ -1,7 +1,8 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
@ -19,6 +20,7 @@ def display_repository_info(
arguments to the rinfo action, display summary information for the Borg repository or return
JSON summary information.
'''
borgmatic.logger.add_custom_log_levels()
lock_wait = storage_config.get('lock_wait', None)
full_command = (
@ -44,9 +46,16 @@ def display_repository_info(
+ flags.make_repository_flags(repository, local_borg_version)
)
return execute_command(
full_command,
output_log_level=None if rinfo_arguments.json else logging.WARNING,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
extra_environment = environment.make_environment(storage_config)
if rinfo_arguments.json:
return execute_command_and_capture_output(
full_command, extra_environment=extra_environment,
)
else:
execute_command(
full_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=extra_environment,
)

View File

@ -1,7 +1,8 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
@ -33,11 +34,8 @@ def resolve_archive_name(
+ flags.make_repository_flags(repository, local_borg_version)
)
output = execute_command(
full_command,
output_log_level=None,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
output = execute_command_and_capture_output(
full_command, extra_environment=environment.make_environment(storage_config),
)
try:
latest_archive = output.strip().splitlines()[-1]
@ -111,18 +109,19 @@ def list_repository(
arguments to the list action, and local and remote Borg paths, display the output of listing
Borg archives in the given repository (or return JSON output).
'''
borgmatic.logger.add_custom_log_levels()
borg_environment = environment.make_environment(storage_config)
main_command = make_rlist_command(
repository, storage_config, local_borg_version, rlist_arguments, local_path, remote_path
)
output = execute_command(
main_command,
output_log_level=None if rlist_arguments.json else logging.WARNING,
borg_local_path=local_path,
extra_environment=borg_environment,
)
if rlist_arguments.json:
return output
return execute_command_and_capture_output(main_command, extra_environment=borg_environment,)
else:
execute_command(
main_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=borg_environment,
)

View File

@ -1,5 +1,6 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, flags
from borgmatic.execute import execute_command
@ -19,6 +20,8 @@ def transfer_archives(
Given a dry-run flag, a local or remote repository path, a storage config dict, the local Borg
version, and the arguments to the transfer action, transfer archives to the given repository.
'''
borgmatic.logger.add_custom_log_levels()
full_command = (
(local_path, 'transfer')
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
@ -41,7 +44,7 @@ def transfer_archives(
return execute_command(
full_command,
output_log_level=logging.WARNING,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)

View File

@ -1,7 +1,7 @@
import logging
from borgmatic.borg import environment
from borgmatic.execute import execute_command
from borgmatic.execute import execute_command_and_capture_output
logger = logging.getLogger(__name__)
@ -18,11 +18,8 @@ def local_borg_version(storage_config, local_path='borg'):
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
)
output = execute_command(
full_command,
output_log_level=None,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
output = execute_command_and_capture_output(
full_command, extra_environment=environment.make_environment(storage_config),
)
try:
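
execute_command_and_capture_output() is the helper this change set swaps in wherever the old execute_command(..., output_log_level=None) pattern captured output. A minimal sketch, assuming Borg is on the PATH and that the keyword arguments seen at the call sites above have sensible defaults:

from borgmatic.execute import execute_command_and_capture_output

# Run a command and get its stdout back as a string instead of logging it.
output = execute_command_and_capture_output(('borg', '--version'))
print(output.strip())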

View File

@ -337,8 +337,8 @@ def make_parsers():
compact_parser = subparsers.add_parser(
'compact',
aliases=SUBPARSER_ALIASES['compact'],
help='Compact segments to free space (Borg 1.2+ only)',
description='Compact segments to free space (Borg 1.2+ only)',
help='Compact segments to free space (Borg 1.2+, borgmatic 1.5.23+ only)',
description='Compact segments to free space (Borg 1.2+, borgmatic 1.5.23+ only)',
add_help=False,
)
compact_group = compact_parser.add_argument_group('compact arguments')

View File

@ -1,5 +1,4 @@
import collections
import copy
import json
import logging
import os
@ -11,29 +10,29 @@ from subprocess import CalledProcessError
import colorama
import pkg_resources
import borgmatic.actions.borg
import borgmatic.actions.break_lock
import borgmatic.actions.check
import borgmatic.actions.compact
import borgmatic.actions.create
import borgmatic.actions.export_tar
import borgmatic.actions.extract
import borgmatic.actions.info
import borgmatic.actions.list
import borgmatic.actions.mount
import borgmatic.actions.prune
import borgmatic.actions.rcreate
import borgmatic.actions.restore
import borgmatic.actions.rinfo
import borgmatic.actions.rlist
import borgmatic.actions.transfer
import borgmatic.commands.completion
from borgmatic.borg import borg as borg_borg
from borgmatic.borg import break_lock as borg_break_lock
from borgmatic.borg import check as borg_check
from borgmatic.borg import compact as borg_compact
from borgmatic.borg import create as borg_create
from borgmatic.borg import export_tar as borg_export_tar
from borgmatic.borg import extract as borg_extract
from borgmatic.borg import feature as borg_feature
from borgmatic.borg import info as borg_info
from borgmatic.borg import list as borg_list
from borgmatic.borg import mount as borg_mount
from borgmatic.borg import prune as borg_prune
from borgmatic.borg import rcreate as borg_rcreate
from borgmatic.borg import rinfo as borg_rinfo
from borgmatic.borg import rlist as borg_rlist
from borgmatic.borg import transfer as borg_transfer
from borgmatic.borg import umount as borg_umount
from borgmatic.borg import version as borg_version
from borgmatic.commands.arguments import parse_arguments
from borgmatic.config import checks, collect, convert, validate
from borgmatic.hooks import command, dispatch, dump, monitor
from borgmatic.logger import configure_logging, should_do_markup
from borgmatic.hooks import command, dispatch, monitor
from borgmatic.logger import add_custom_log_levels, configure_logging, should_do_markup
from borgmatic.signals import configure_signals
from borgmatic.verbosity import verbosity_to_log_level
@ -244,6 +243,7 @@ def run_actions(
action or a hook. Raise ValueError if the arguments or configuration passed to action are
invalid.
'''
add_custom_log_levels()
repository = os.path.expanduser(repository_path)
global_arguments = arguments['global']
dry_run_label = ' (dry run; not making any changes)' if global_arguments.dry_run else ''
@ -263,509 +263,154 @@ def run_actions(
)
if 'rcreate' in arguments:
logger.info('{}: Creating repository'.format(repository))
borg_rcreate.create_repository(
global_arguments.dry_run,
borgmatic.actions.rcreate.run_rcreate(
repository,
storage,
local_borg_version,
arguments['rcreate'].encryption_mode,
arguments['rcreate'].source_repository,
arguments['rcreate'].copy_crypt_key,
arguments['rcreate'].append_only,
arguments['rcreate'].storage_quota,
arguments['rcreate'].make_parent_dirs,
local_path=local_path,
remote_path=remote_path,
arguments['rcreate'],
global_arguments,
local_path,
remote_path,
)
if 'transfer' in arguments:
logger.info(f'{repository}: Transferring archives to repository')
borg_transfer.transfer_archives(
global_arguments.dry_run,
borgmatic.actions.transfer.run_transfer(
repository,
storage,
local_borg_version,
transfer_arguments=arguments['transfer'],
local_path=local_path,
remote_path=remote_path,
arguments['transfer'],
global_arguments,
local_path,
remote_path,
)
if 'prune' in arguments:
command.execute_hook(
hooks.get('before_prune'),
hooks.get('umask'),
borgmatic.actions.prune.run_prune(
config_filename,
'pre-prune',
global_arguments.dry_run,
**hook_context,
)
logger.info('{}: Pruning archives{}'.format(repository, dry_run_label))
borg_prune.prune_archives(
global_arguments.dry_run,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
stats=arguments['prune'].stats,
list_archives=arguments['prune'].list_archives,
)
command.execute_hook(
hooks.get('after_prune'),
hooks.get('umask'),
config_filename,
'post-prune',
global_arguments.dry_run,
**hook_context,
arguments['prune'],
global_arguments,
dry_run_label,
local_path,
remote_path,
)
if 'compact' in arguments:
command.execute_hook(
hooks.get('before_compact'),
hooks.get('umask'),
borgmatic.actions.compact.run_compact(
config_filename,
'pre-compact',
global_arguments.dry_run,
)
if borg_feature.available(borg_feature.Feature.COMPACT, local_borg_version):
logger.info('{}: Compacting segments{}'.format(repository, dry_run_label))
borg_compact.compact_segments(
global_arguments.dry_run,
repository,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
progress=arguments['compact'].progress,
cleanup_commits=arguments['compact'].cleanup_commits,
threshold=arguments['compact'].threshold,
)
else: # pragma: nocover
logger.info(
'{}: Skipping compact (only available/needed in Borg 1.2+)'.format(repository)
)
command.execute_hook(
hooks.get('after_compact'),
hooks.get('umask'),
config_filename,
'post-compact',
global_arguments.dry_run,
repository,
storage,
retention,
hooks,
hook_context,
local_borg_version,
arguments['compact'],
global_arguments,
dry_run_label,
local_path,
remote_path,
)
if 'create' in arguments:
    command.execute_hook(
        hooks.get('before_backup'),
        hooks.get('umask'),
        config_filename,
        'pre-backup',
        global_arguments.dry_run,
        **hook_context,
    )
    logger.info('{}: Creating archive{}'.format(repository, dry_run_label))
    dispatch.call_hooks_even_if_unconfigured(
        'remove_database_dumps',
        hooks,
        repository,
        dump.DATABASE_HOOK_NAMES,
        location,
        global_arguments.dry_run,
    )
    active_dumps = dispatch.call_hooks(
        'dump_databases',
        hooks,
        repository,
        dump.DATABASE_HOOK_NAMES,
        location,
        global_arguments.dry_run,
    )
    stream_processes = [process for processes in active_dumps.values() for process in processes]
    json_output = borg_create.create_archive(
        global_arguments.dry_run,
        repository,
        location,
        storage,
        local_borg_version,
        local_path=local_path,
        remote_path=remote_path,
        progress=arguments['create'].progress,
        stats=arguments['create'].stats,
        json=arguments['create'].json,
        list_files=arguments['create'].list_files,
        stream_processes=stream_processes,
    )
    if json_output:  # pragma: nocover
        yield json.loads(json_output)
    dispatch.call_hooks_even_if_unconfigured(
        'remove_database_dumps',
        hooks,
        config_filename,
        dump.DATABASE_HOOK_NAMES,
        location,
        global_arguments.dry_run,
    )
    command.execute_hook(
        hooks.get('after_backup'),
        hooks.get('umask'),
        config_filename,
        'post-backup',
        global_arguments.dry_run,
        **hook_context,
    )
    yield from borgmatic.actions.create.run_create(
        config_filename,
        repository,
        location,
        storage,
        hooks,
        hook_context,
        local_borg_version,
        arguments['create'],
        global_arguments,
        dry_run_label,
        local_path,
        remote_path,
    )
if 'check' in arguments and checks.repository_enabled_for_checks(repository, consistency):
    command.execute_hook(
        hooks.get('before_check'),
        hooks.get('umask'),
        config_filename,
        'pre-check',
        global_arguments.dry_run,
        **hook_context,
    )
    logger.info('{}: Running consistency checks'.format(repository))
    borg_check.check_archives(
        repository,
        location,
        storage,
        consistency,
        local_borg_version,
        local_path=local_path,
        remote_path=remote_path,
        progress=arguments['check'].progress,
        repair=arguments['check'].repair,
        only_checks=arguments['check'].only,
        force=arguments['check'].force,
    )
    command.execute_hook(
        hooks.get('after_check'),
        hooks.get('umask'),
        config_filename,
        'post-check',
        global_arguments.dry_run,
        **hook_context,
    )
    borgmatic.actions.check.run_check(
        config_filename,
        repository,
        location,
        storage,
        consistency,
        hooks,
        hook_context,
        local_borg_version,
        arguments['check'],
        global_arguments,
        local_path,
        remote_path,
    )
if 'extract' in arguments:
    command.execute_hook(
        hooks.get('before_extract'),
        hooks.get('umask'),
        config_filename,
        'pre-extract',
        global_arguments.dry_run,
        **hook_context,
    )
    if arguments['extract'].repository is None or validate.repositories_match(
        repository, arguments['extract'].repository
    ):
        logger.info(
            '{}: Extracting archive {}'.format(repository, arguments['extract'].archive)
        )
        borg_extract.extract_archive(
            global_arguments.dry_run,
            repository,
            borg_rlist.resolve_archive_name(
                repository,
                arguments['extract'].archive,
                storage,
                local_borg_version,
                local_path,
                remote_path,
            ),
            arguments['extract'].paths,
            location,
            storage,
            local_borg_version,
            local_path=local_path,
            remote_path=remote_path,
            destination_path=arguments['extract'].destination,
            strip_components=arguments['extract'].strip_components,
            progress=arguments['extract'].progress,
        )
    command.execute_hook(
        hooks.get('after_extract'),
        hooks.get('umask'),
        config_filename,
        'post-extract',
        global_arguments.dry_run,
        **hook_context,
    )
    borgmatic.actions.extract.run_extract(
        config_filename,
        repository,
        location,
        storage,
        hooks,
        hook_context,
        local_borg_version,
        arguments['extract'],
        global_arguments,
        local_path,
        remote_path,
    )
if 'export-tar' in arguments:
if arguments['export-tar'].repository is None or validate.repositories_match(
repository, arguments['export-tar'].repository
):
logger.info(
'{}: Exporting archive {} as tar file'.format(
repository, arguments['export-tar'].archive
)
)
borg_export_tar.export_tar_archive(
global_arguments.dry_run,
repository,
borg_rlist.resolve_archive_name(
repository,
arguments['export-tar'].archive,
storage,
local_borg_version,
local_path,
remote_path,
),
arguments['export-tar'].paths,
arguments['export-tar'].destination,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
tar_filter=arguments['export-tar'].tar_filter,
list_files=arguments['export-tar'].list_files,
strip_components=arguments['export-tar'].strip_components,
)
borgmatic.actions.export_tar.run_export_tar(
repository,
storage,
local_borg_version,
arguments['export-tar'],
global_arguments,
local_path,
remote_path,
)
if 'mount' in arguments:
if arguments['mount'].repository is None or validate.repositories_match(
repository, arguments['mount'].repository
):
if arguments['mount'].archive:
logger.info(
'{}: Mounting archive {}'.format(repository, arguments['mount'].archive)
)
else: # pragma: nocover
logger.info('{}: Mounting repository'.format(repository))
borg_mount.mount_archive(
repository,
borg_rlist.resolve_archive_name(
repository,
arguments['mount'].archive,
storage,
local_borg_version,
local_path,
remote_path,
),
arguments['mount'].mount_point,
arguments['mount'].paths,
arguments['mount'].foreground,
arguments['mount'].options,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
)
if 'restore' in arguments: # pragma: nocover
if arguments['restore'].repository is None or validate.repositories_match(
repository, arguments['restore'].repository
):
logger.info(
'{}: Restoring databases from archive {}'.format(
repository, arguments['restore'].archive
)
)
dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
restore_names = arguments['restore'].databases or []
if 'all' in restore_names:
restore_names = []
archive_name = borg_rlist.resolve_archive_name(
repository,
arguments['restore'].archive,
storage,
local_borg_version,
local_path,
remote_path,
)
found_names = set()
for hook_name, per_hook_restore_databases in hooks.items():
if hook_name not in dump.DATABASE_HOOK_NAMES:
continue
for restore_database in per_hook_restore_databases:
database_name = restore_database['name']
if restore_names and database_name not in restore_names:
continue
found_names.add(database_name)
dump_pattern = dispatch.call_hooks(
'make_database_dump_pattern',
hooks,
repository,
dump.DATABASE_HOOK_NAMES,
location,
database_name,
)[hook_name]
# Kick off a single database extract to stdout.
extract_process = borg_extract.extract_archive(
dry_run=global_arguments.dry_run,
repository=repository,
archive=archive_name,
paths=dump.convert_glob_patterns_to_borg_patterns([dump_pattern]),
location_config=location,
storage_config=storage,
local_borg_version=local_borg_version,
local_path=local_path,
remote_path=remote_path,
destination_path='/',
# A directory format dump isn't a single file, and therefore can't extract
# to stdout. In this case, the extract_process return value is None.
extract_to_stdout=bool(restore_database.get('format') != 'directory'),
)
# Run a single database restore, consuming the extract stdout (if any).
dispatch.call_hooks(
'restore_database_dump',
{hook_name: [restore_database]},
repository,
dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
extract_process,
)
dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
dump.DATABASE_HOOK_NAMES,
location,
global_arguments.dry_run,
)
if not restore_names and not found_names:
raise ValueError('No databases were found to restore')
missing_names = sorted(set(restore_names) - found_names)
if missing_names:
raise ValueError(
'Cannot restore database(s) {} missing from borgmatic\'s configuration'.format(
', '.join(missing_names)
)
)
borgmatic.actions.mount.run_mount(
repository, storage, local_borg_version, arguments['mount'], local_path, remote_path,
)
if 'restore' in arguments:
borgmatic.actions.restore.run_restore(
repository,
location,
storage,
hooks,
local_borg_version,
arguments['restore'],
global_arguments,
local_path,
remote_path,
)
if 'rlist' in arguments:
if arguments['rlist'].repository is None or validate.repositories_match(
repository, arguments['rlist'].repository
):
rlist_arguments = copy.copy(arguments['rlist'])
if not rlist_arguments.json: # pragma: nocover
logger.warning('{}: Listing repository'.format(repository))
json_output = borg_rlist.list_repository(
repository,
storage,
local_borg_version,
rlist_arguments=rlist_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
yield from borgmatic.actions.rlist.run_rlist(
repository, storage, local_borg_version, arguments['rlist'], local_path, remote_path,
)
if 'list' in arguments:
if arguments['list'].repository is None or validate.repositories_match(
repository, arguments['list'].repository
):
list_arguments = copy.copy(arguments['list'])
if not list_arguments.json: # pragma: nocover
if list_arguments.find_paths:
logger.warning('{}: Searching archives'.format(repository))
elif not list_arguments.archive:
logger.warning('{}: Listing archives'.format(repository))
list_arguments.archive = borg_rlist.resolve_archive_name(
repository,
list_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
json_output = borg_list.list_archive(
repository,
storage,
local_borg_version,
list_arguments=list_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
yield from borgmatic.actions.list.run_list(
repository, storage, local_borg_version, arguments['list'], local_path, remote_path,
)
if 'rinfo' in arguments:
if arguments['rinfo'].repository is None or validate.repositories_match(
repository, arguments['rinfo'].repository
):
rinfo_arguments = copy.copy(arguments['rinfo'])
if not rinfo_arguments.json: # pragma: nocover
logger.warning('{}: Displaying repository summary information'.format(repository))
json_output = borg_rinfo.display_repository_info(
repository,
storage,
local_borg_version,
rinfo_arguments=rinfo_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
yield from borgmatic.actions.rinfo.run_rinfo(
repository, storage, local_borg_version, arguments['rinfo'], local_path, remote_path,
)
if 'info' in arguments:
if arguments['info'].repository is None or validate.repositories_match(
repository, arguments['info'].repository
):
info_arguments = copy.copy(arguments['info'])
if not info_arguments.json: # pragma: nocover
logger.warning('{}: Displaying archive summary information'.format(repository))
info_arguments.archive = borg_rlist.resolve_archive_name(
repository,
info_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
json_output = borg_info.display_archives_info(
repository,
storage,
local_borg_version,
info_arguments=info_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
yield from borgmatic.actions.info.run_info(
repository, storage, local_borg_version, arguments['info'], local_path, remote_path,
)
if 'break-lock' in arguments:
if arguments['break-lock'].repository is None or validate.repositories_match(
repository, arguments['break-lock'].repository
):
logger.warning(f'{repository}: Breaking repository and cache locks')
borg_break_lock.break_lock(
repository,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
)
borgmatic.actions.break_lock.run_break_lock(
repository,
storage,
local_borg_version,
arguments['break-lock'],
local_path,
remote_path,
)
if 'borg' in arguments:
if arguments['borg'].repository is None or validate.repositories_match(
repository, arguments['borg'].repository
):
logger.warning('{}: Running arbitrary Borg command'.format(repository))
archive_name = borg_rlist.resolve_archive_name(
repository,
arguments['borg'].archive,
storage,
local_borg_version,
local_path,
remote_path,
)
borg_borg.run_arbitrary_borg(
repository,
storage,
local_borg_version,
options=arguments['borg'].options,
archive=archive_name,
local_path=local_path,
remote_path=remote_path,
)
borgmatic.actions.borg.run_borg(
repository, storage, local_borg_version, arguments['borg'], local_path, remote_path,
)
command.execute_hook(
hooks.get('after_actions'),


@ -1,3 +1,4 @@
import functools
import logging
import os
@ -6,43 +7,17 @@ import ruamel.yaml
logger = logging.getLogger(__name__)
class Yaml_with_loader_stream(ruamel.yaml.YAML):
    '''
    A derived class of ruamel.yaml.YAML that simply tacks the loaded stream (file object) onto the
    loader class so that it's available anywhere that's passed a loader (in this case,
    include_configuration() below).
    '''
    def get_constructor_parser(self, stream):
        constructor, parser = super(Yaml_with_loader_stream, self).get_constructor_parser(stream)
        constructor.loader.stream = stream
        return constructor, parser
def load_configuration(filename):
    '''
    Load the given configuration file and return its contents as a data structure of nested dicts
    and lists.
    Raise ruamel.yaml.error.YAMLError if something goes wrong parsing the YAML, or RecursionError
    if there are too many recursive includes.
    '''
    yaml = Yaml_with_loader_stream(typ='safe')
    yaml.Constructor = Include_constructor
    return yaml.load(open(filename))
def include_configuration(loader, filename_node):
    '''
    Load the given YAML filename (ignoring the given loader so we can use our own) and return its
    contents as a data structure of nested dicts and lists. If the filename is relative, probe for
    it within 1. the current working directory and 2. the directory containing the YAML file doing
    the including.
def include_configuration(loader, filename_node, include_directory):
    '''
    Given a ruamel.yaml.loader.Loader, a ruamel.yaml.serializer.ScalarNode containing the included
    filename, and an include directory path to search for matching files, load the given YAML
    filename (ignoring the given loader so we can use our own) and return its contents as a data
    structure of nested dicts and lists. If the filename is relative, probe for it within 1. the
    current working directory and 2. the given include directory.
Raise FileNotFoundError if an included file was not found.
'''
include_directories = [os.getcwd(), os.path.abspath(os.path.dirname(loader.stream.name))]
include_directories = [os.getcwd(), os.path.abspath(include_directory)]
include_filename = os.path.expanduser(filename_node.value)
if not os.path.isabs(include_filename):
@ -62,6 +37,70 @@ def include_configuration(loader, filename_node):
return load_configuration(include_filename)
class Include_constructor(ruamel.yaml.SafeConstructor):
'''
A YAML "constructor" (a ruamel.yaml concept) that supports a custom "!include" tag for including
separate YAML configuration files. Example syntax: `retention: !include common.yaml`
'''
def __init__(self, preserve_quotes=None, loader=None, include_directory=None):
super(Include_constructor, self).__init__(preserve_quotes, loader)
self.add_constructor(
'!include',
functools.partial(include_configuration, include_directory=include_directory),
)
def flatten_mapping(self, node):
'''
Support the special case of deep merging included configuration into an existing mapping
using the YAML '<<' merge key. Example syntax:
```
retention:
keep_daily: 1
<<: !include common.yaml
```
These includes are deep merged into the current configuration file. For instance, in this
example, any "retention" options in common.yaml will get merged into the "retention" section
in the example configuration file.
'''
representer = ruamel.yaml.representer.SafeRepresenter()
for index, (key_node, value_node) in enumerate(node.value):
if key_node.tag == u'tag:yaml.org,2002:merge' and value_node.tag == '!include':
included_value = representer.represent_data(self.construct_object(value_node))
node.value[index] = (key_node, included_value)
super(Include_constructor, self).flatten_mapping(node)
node.value = deep_merge_nodes(node.value)
def load_configuration(filename):
'''
Load the given configuration file and return its contents as a data structure of nested dicts
and lists.
Raise ruamel.yaml.error.YAMLError if something goes wrong parsing the YAML, or RecursionError
if there are too many recursive includes.
'''
# Use an embedded derived class for the include constructor so as to capture the filename
# value. (functools.partial doesn't work for this use case because yaml.Constructor has to be
# an actual class.)
class Include_constructor_with_include_directory(Include_constructor):
def __init__(self, preserve_quotes=None, loader=None):
super(Include_constructor_with_include_directory, self).__init__(
preserve_quotes, loader, include_directory=os.path.dirname(filename)
)
yaml = ruamel.yaml.YAML(typ='safe')
yaml.Constructor = Include_constructor_with_include_directory
return yaml.load(open(filename))
DELETED_NODE = object()
@ -175,41 +214,3 @@ def deep_merge_nodes(nodes):
return [
replaced_nodes.get(node, node) for node in nodes if replaced_nodes.get(node) != DELETED_NODE
]
class Include_constructor(ruamel.yaml.SafeConstructor):
'''
A YAML "constructor" (a ruamel.yaml concept) that supports a custom "!include" tag for including
separate YAML configuration files. Example syntax: `retention: !include common.yaml`
'''
def __init__(self, preserve_quotes=None, loader=None):
super(Include_constructor, self).__init__(preserve_quotes, loader)
self.add_constructor('!include', include_configuration)
def flatten_mapping(self, node):
'''
Support the special case of deep merging included configuration into an existing mapping
using the YAML '<<' merge key. Example syntax:
```
retention:
keep_daily: 1
<<: !include common.yaml
```
These includes are deep merged into the current configuration file. For instance, in this
example, any "retention" options in common.yaml will get merged into the "retention" section
in the example configuration file.
'''
representer = ruamel.yaml.representer.SafeRepresenter()
for index, (key_node, value_node) in enumerate(node.value):
if key_node.tag == u'tag:yaml.org,2002:merge' and value_node.tag == '!include':
included_value = representer.represent_data(self.construct_object(value_node))
node.value[index] = (key_node, included_value)
super(Include_constructor, self).flatten_mapping(node)
node.value = deep_merge_nodes(node.value)


@ -8,49 +8,53 @@ def normalize(config_filename, config):
message warnings produced based on the normalization performed.
'''
logs = []
location = config.get('location') or {}
storage = config.get('storage') or {}
consistency = config.get('consistency') or {}
hooks = config.get('hooks') or {}
# Upgrade exclude_if_present from a string to a list.
exclude_if_present = config.get('location', {}).get('exclude_if_present')
exclude_if_present = location.get('exclude_if_present')
if isinstance(exclude_if_present, str):
config['location']['exclude_if_present'] = [exclude_if_present]
# Upgrade various monitoring hooks from a string to a dict.
healthchecks = config.get('hooks', {}).get('healthchecks')
healthchecks = hooks.get('healthchecks')
if isinstance(healthchecks, str):
config['hooks']['healthchecks'] = {'ping_url': healthchecks}
cronitor = config.get('hooks', {}).get('cronitor')
cronitor = hooks.get('cronitor')
if isinstance(cronitor, str):
config['hooks']['cronitor'] = {'ping_url': cronitor}
pagerduty = config.get('hooks', {}).get('pagerduty')
pagerduty = hooks.get('pagerduty')
if isinstance(pagerduty, str):
config['hooks']['pagerduty'] = {'integration_key': pagerduty}
cronhub = config.get('hooks', {}).get('cronhub')
cronhub = hooks.get('cronhub')
if isinstance(cronhub, str):
config['hooks']['cronhub'] = {'ping_url': cronhub}
# Upgrade consistency checks from a list of strings to a list of dicts.
checks = config.get('consistency', {}).get('checks')
checks = consistency.get('checks')
if isinstance(checks, list) and len(checks) and isinstance(checks[0], str):
config['consistency']['checks'] = [{'name': check_type} for check_type in checks]
# Rename various configuration options.
numeric_owner = config.get('location', {}).pop('numeric_owner', None)
numeric_owner = location.pop('numeric_owner', None)
if numeric_owner is not None:
config['location']['numeric_ids'] = numeric_owner
bsd_flags = config.get('location', {}).pop('bsd_flags', None)
bsd_flags = location.pop('bsd_flags', None)
if bsd_flags is not None:
config['location']['flags'] = bsd_flags
remote_rate_limit = config.get('storage', {}).pop('remote_rate_limit', None)
remote_rate_limit = storage.pop('remote_rate_limit', None)
if remote_rate_limit is not None:
config['storage']['upload_rate_limit'] = remote_rate_limit
# Upgrade remote repositories to ssh:// syntax, required in Borg 2.
repositories = config.get('location', {}).get('repositories')
repositories = location.get('repositories')
if repositories:
config['location']['repositories'] = []
for repository in repositories:


@ -691,10 +691,13 @@ properties:
type: string
description: |
Database name (required if using this hook). Or
"all" to dump all databases on the host. Note
that using this database hook implicitly enables
both read_special and one_file_system (see
above) to support dump and restore streaming.
"all" to dump all databases on the host. (Also
set the "format" to dump each database to a
separate file instead of one combined file.)
Note that using this database hook implicitly
enables both read_special and one_file_system
(see above) to support dump and restore
streaming.
example: users
hostname:
type: string
@ -729,9 +732,14 @@ properties:
description: |
Database dump output format. One of "plain",
"custom", "directory", or "tar". Defaults to
"custom" (unlike raw pg_dump). See pg_dump
documentation for details. Note that format is
ignored when the database name is "all".
"custom" (unlike raw pg_dump) for a single
database. Or, when database name is "all" and
format is blank, dumps all databases to a single
file. But if a format is specified with an "all"
database name, dumps each database to a separate
file of that format, allowing more convenient
restores of individual databases. See the
pg_dump documentation for more about formats.
example: directory
ssl_mode:
type: string
@ -764,6 +772,32 @@ properties:
description: |
Path to a certificate revocation list.
example: "/root/.postgresql/root.crl"
pg_dump_command:
type: string
description: |
Command to use instead of "pg_dump" or
"pg_dumpall". This can be used to run a specific
pg_dump version (e.g., one inside a running
docker container). Defaults to "pg_dump" for
single database dump or "pg_dumpall" to dump
all databases.
example: docker exec my_pg_container pg_dump
pg_restore_command:
type: string
description: |
Command to use instead of "pg_restore". This
can be used to run a specific pg_restore
version (e.g., one inside a running docker
container). Defaults to "pg_restore".
example: docker exec my_pg_container pg_restore
psql_command:
type: string
description: |
Command to use instead of "psql". This can be
used to run a specific psql version (e.g.,
one inside a running docker container).
Defaults to "psql".
example: docker exec my_pg_container psql
options:
type: string
description: |
@ -772,6 +806,30 @@ properties:
any validation on them. See pg_dump
documentation for details.
example: --role=someone
list_options:
type: string
description: |
Additional psql options to pass directly to the
psql command that lists available databases,
without performing any validation on them. See
psql documentation for details.
example: --role=someone
restore_options:
type: string
description: |
Additional pg_restore/psql options to pass
directly to the restore command, without
performing any validation on them. See
pg_restore/psql documentation for details.
example: --role=someone
analyze_options:
type: string
description: |
Additional psql options to pass directly to the
analyze command run after a restore, without
performing any validation on them. See psql
documentation for details.
example: --role=someone
description: |
List of one or more PostgreSQL databases to dump before
creating a backup, run once per configuration file. The
@ -821,14 +879,19 @@ properties:
configured to trust the configured username
without a password.
example: trustsome1
list_options:
format:
type: string
enum: ['sql']
description: |
Additional mysql options to pass directly to
the mysql command that lists available
databases, without performing any validation on
them. See mysql documentation for details.
example: --defaults-extra-file=my.cnf
Database dump output format. Currently only "sql"
is supported. Defaults to "sql" for a single
database. Or, when database name is "all" and
format is blank, dumps all databases to a single
file. But if a format is specified with an "all"
database name, dumps each database to a separate
file of that format, allowing more convenient
restores of individual databases.
example: sql
options:
type: string
description: |
@ -837,6 +900,22 @@ properties:
validation on them. See mysqldump documentation
for details.
example: --skip-comments
list_options:
type: string
description: |
Additional mysql options to pass directly to
the mysql command that lists available
databases, without performing any validation on
them. See mysql documentation for details.
example: --defaults-extra-file=my.cnf
restore_options:
type: string
description: |
Additional mysql options to pass directly to
the mysql command that restores database dumps,
without performing any validation on them. See
mysql documentation for details.
example: --defaults-extra-file=my.cnf
description: |
List of one or more MySQL/MariaDB databases to dump before
creating a backup, run once per configuration file. The
@ -909,7 +988,15 @@ properties:
directly to the dump command, without performing
any validation on them. See mongodump
documentation for details.
example: --role=someone
example: --dumpDbUsersAndRoles
restore_options:
type: string
description: |
Additional mongorestore options to pass
directly to the restore command, without performing
any validation on them. See mongorestore
documentation for details.
example: --restoreDbUsersAndRoles
description: |
List of one or more MongoDB databases to dump before
creating a backup, run once per configuration file. The


@ -49,7 +49,8 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
'''
Given a sequence of subprocess.Popen() instances for multiple processes, log the output for each
process with the requested log level. Additionally, raise a CalledProcessError if a process
exits with an error (or a warning for exit code 1, if that process matches the Borg local path).
exits with an error (or a warning for exit code 1, if that process does not match the Borg local
path).
If output log level is None, then instead of logging, capture output for each process and return
it as a dict from the process to its output.
@ -147,7 +148,7 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
}
def log_command(full_command, input_file, output_file):
def log_command(full_command, input_file=None, output_file=None):
'''
Log the given command (a sequence of command/argument strings), along with its input/output file
paths.
@ -178,15 +179,14 @@ def execute_command(
):
'''
Execute the given command (a sequence of command/argument strings) and log its output at the
given log level. If output log level is None, instead capture and return the output. (Implies
run_to_completion.) If an open output file object is given, then write stdout to the file and
only log stderr (but only if an output log level is set). If an open input file object is given,
then read stdin from the file. If shell is True, execute the command within a shell. If an extra
environment dict is given, then use it to augment the current environment, and pass the result
into the command. If a working directory is given, use that as the present working directory
when running the command. If a Borg local path is given, and the command matches it (regardless
of arguments), treat exit code 1 as a warning instead of an error. If run to completion is
False, then return the process for the command without executing it to completion.
given log level. If an open output file object is given, then write stdout to the file and only
log stderr. If an open input file object is given, then read stdin from the file. If shell is
True, execute the command within a shell. If an extra environment dict is given, then use it to
augment the current environment, and pass the result into the command. If a working directory is
given, use that as the present working directory when running the command. If a Borg local path
is given, and the command matches it (regardless of arguments), treat exit code 1 as a warning
instead of an error. If run to completion is False, then return the process for the command
without executing it to completion.
Raise subprocess.CalledProcessError if an error occurs while running the command.
'''
@ -195,12 +195,6 @@ def execute_command(
do_not_capture = bool(output_file is DO_NOT_CAPTURE)
command = ' '.join(full_command) if shell else full_command
if output_log_level is None:
output = subprocess.check_output(
command, stderr=subprocess.STDOUT, shell=shell, env=environment, cwd=working_directory
)
return output.decode() if output is not None else None
process = subprocess.Popen(
command,
stdin=input_file,
@ -218,6 +212,33 @@ def execute_command(
)
def execute_command_and_capture_output(
full_command, capture_stderr=False, shell=False, extra_environment=None, working_directory=None,
):
'''
Execute the given command (a sequence of command/argument strings), capturing and returning its
output (stdout). If capture stderr is True, then capture and return stderr in addition to
stdout. If shell is True, execute the command within a shell. If an extra environment dict is
given, then use it to augment the current environment, and pass the result into the command. If
a working directory is given, use that as the present working directory when running the command.
Raise subprocess.CalledProcessError if an error occurs while running the command.
'''
log_command(full_command)
environment = {**os.environ, **extra_environment} if extra_environment else None
command = ' '.join(full_command) if shell else full_command
output = subprocess.check_output(
command,
stderr=subprocess.STDOUT if capture_stderr else None,
shell=shell,
env=environment,
cwd=working_directory,
)
return output.decode() if output is not None else None
def execute_command_with_processes(
full_command,
processes,


@ -160,4 +160,6 @@ def build_restore_command(extract_process, database, dump_filename):
command.extend(('--password', database['password']))
if 'authentication_database' in database:
command.extend(('--authenticationDatabase', database['authentication_database']))
if 'restore_options' in database:
command.extend(database['restore_options'].split(' '))
return command


@ -1,6 +1,12 @@
import copy
import logging
import os
from borgmatic.execute import execute_command, execute_command_with_processes
from borgmatic.execute import (
execute_command,
execute_command_and_capture_output,
execute_command_with_processes,
)
from borgmatic.hooks import dump
logger = logging.getLogger(__name__)
@ -20,14 +26,12 @@ SYSTEM_DATABASE_NAMES = ('information_schema', 'mysql', 'performance_schema', 's
def database_names_to_dump(database, extra_environment, log_prefix, dry_run_label):
'''
Given a requested database name, return the corresponding sequence of database names to dump.
Given a requested database config, return the corresponding sequence of database names to dump.
In the case of "all", query for the names of databases on the configured host and return them,
excluding any system databases that will cause problems during restore.
'''
requested_name = database['name']
if requested_name != 'all':
return (requested_name,)
if database['name'] != 'all':
return (database['name'],)
show_command = (
('mysql',)
@ -42,8 +46,8 @@ def database_names_to_dump(database, extra_environment, log_prefix, dry_run_labe
logger.debug(
'{}: Querying for "all" MySQL databases to dump{}'.format(log_prefix, dry_run_label)
)
show_output = execute_command(
show_command, output_log_level=None, extra_environment=extra_environment
show_output = execute_command_and_capture_output(
show_command, extra_environment=extra_environment
)
return tuple(
@ -53,6 +57,55 @@ def database_names_to_dump(database, extra_environment, log_prefix, dry_run_labe
)
def execute_dump_command(
database, log_prefix, dump_path, database_names, extra_environment, dry_run, dry_run_label
):
'''
Kick off a dump for the given MySQL/MariaDB database (provided as a configuration dict) to a
named pipe constructed from the given dump path and database names. Use the given log prefix in
any log entries.
Return a subprocess.Popen instance for the dump process ready to spew to a named pipe. But if
this is a dry run, then don't actually dump anything and return None.
'''
database_name = database['name']
dump_filename = dump.make_database_dump_filename(
dump_path, database['name'], database.get('hostname')
)
if os.path.exists(dump_filename):
logger.warning(
f'{log_prefix}: Skipping duplicate dump of MySQL database "{database_name}" to {dump_filename}'
)
return None
dump_command = (
('mysqldump',)
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
+ ('--add-drop-database',)
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--protocol', 'tcp') if 'hostname' in database or 'port' in database else ())
+ (('--user', database['username']) if 'username' in database else ())
+ ('--databases',)
+ database_names
# Use shell redirection rather than execute_command(output_file=open(...)) to prevent
# the open() call on a named pipe from hanging the main borgmatic process.
+ ('>', dump_filename)
)
logger.debug(
f'{log_prefix}: Dumping MySQL database "{database_name}" to {dump_filename}{dry_run_label}'
)
if dry_run:
return None
dump.create_named_pipe_for_dump(dump_filename)
return execute_command(
dump_command, shell=True, extra_environment=extra_environment, run_to_completion=False,
)
def dump_databases(databases, log_prefix, location_config, dry_run):
'''
Dump the given MySQL/MariaDB databases to a named pipe. The databases are supplied as a sequence
@ -69,10 +122,7 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
logger.info('{}: Dumping MySQL databases{}'.format(log_prefix, dry_run_label))
for database in databases:
requested_name = database['name']
dump_filename = dump.make_database_dump_filename(
make_dump_path(location_config), requested_name, database.get('hostname')
)
dump_path = make_dump_path(location_config)
extra_environment = {'MYSQL_PWD': database['password']} if 'password' in database else None
dump_database_names = database_names_to_dump(
database, extra_environment, log_prefix, dry_run_label
@ -80,41 +130,35 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
if not dump_database_names:
raise ValueError('Cannot find any MySQL databases to dump.')
dump_command = (
('mysqldump',)
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
+ ('--add-drop-database',)
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--protocol', 'tcp') if 'hostname' in database or 'port' in database else ())
+ (('--user', database['username']) if 'username' in database else ())
+ ('--databases',)
+ dump_database_names
# Use shell redirection rather than execute_command(output_file=open(...)) to prevent
# the open() call on a named pipe from hanging the main borgmatic process.
+ ('>', dump_filename)
)
        logger.debug(
            '{}: Dumping MySQL database {} to {}{}'.format(
                log_prefix, requested_name, dump_filename, dry_run_label
            )
        )
        if dry_run:
            continue
        dump.create_named_pipe_for_dump(dump_filename)
        processes.append(
            execute_command(
                dump_command,
                shell=True,
                extra_environment=extra_environment,
                run_to_completion=False,
            )
        )
        if database['name'] == 'all' and database.get('format'):
            for dump_name in dump_database_names:
                renamed_database = copy.copy(database)
                renamed_database['name'] = dump_name
                processes.append(
                    execute_dump_command(
                        renamed_database,
                        log_prefix,
                        dump_path,
                        (dump_name,),
                        extra_environment,
                        dry_run,
                        dry_run_label,
                    )
                )
        else:
            processes.append(
                execute_dump_command(
                    database,
                    log_prefix,
                    dump_path,
                    dump_database_names,
                    extra_environment,
                    dry_run,
                    dry_run_label,
                )
            )
    return processes
    return [process for process in processes if process]
def remove_database_dumps(databases, log_prefix, location_config, dry_run): # pragma: no cover
@ -153,6 +197,7 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run,
database = database_config[0]
restore_command = (
('mysql', '--batch')
+ (tuple(database['restore_options'].split(' ')) if 'restore_options' in database else ())
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--protocol', 'tcp') if 'hostname' in database or 'port' in database else ())


@ -1,6 +1,12 @@
import csv
import logging
import os
from borgmatic.execute import execute_command, execute_command_with_processes
from borgmatic.execute import (
execute_command,
execute_command_and_capture_output,
execute_command_with_processes,
)
from borgmatic.hooks import dump
logger = logging.getLogger(__name__)
@ -34,6 +40,44 @@ def make_extra_environment(database):
return extra
EXCLUDED_DATABASE_NAMES = ('template0', 'template1')
def database_names_to_dump(database, extra_environment, log_prefix, dry_run_label):
'''
Given a requested database config, return the corresponding sequence of database names to dump.
In the case of "all" when a database format is given, query for the names of databases on the
configured host and return them. For "all" without a database format, just return a sequence
containing "all".
'''
requested_name = database['name']
if requested_name != 'all':
return (requested_name,)
if not database.get('format'):
return ('all',)
list_command = (
('psql', '--list', '--no-password', '--csv', '--tuples-only')
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (tuple(database['list_options'].split(' ')) if 'list_options' in database else ())
)
logger.debug(
'{}: Querying for "all" PostgreSQL databases to dump{}'.format(log_prefix, dry_run_label)
)
list_output = execute_command_and_capture_output(
list_command, extra_environment=extra_environment
)
return tuple(
row[0]
for row in csv.reader(list_output.splitlines(), delimiter=',', quotechar='"')
if row[0] not in EXCLUDED_DATABASE_NAMES
)
def dump_databases(databases, log_prefix, location_config, dry_run):
'''
Dump the given PostgreSQL databases to a named pipe. The databases are supplied as a sequence of
@ -43,6 +87,8 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
Return a sequence of subprocess.Popen instances for the dump processes ready to spew to a named
pipe. But if this is a dry run, then don't actually dump anything and return an empty sequence.
Raise ValueError if the databases to dump cannot be determined.
'''
dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
processes = []
@ -50,51 +96,62 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
logger.info('{}: Dumping PostgreSQL databases{}'.format(log_prefix, dry_run_label))
for database in databases:
name = database['name']
dump_filename = dump.make_database_dump_filename(
make_dump_path(location_config), name, database.get('hostname')
)
all_databases = bool(name == 'all')
dump_format = database.get('format', 'custom')
command = (
(
'pg_dumpall' if all_databases else 'pg_dump',
'--no-password',
'--clean',
'--if-exists',
)
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (() if all_databases else ('--format', dump_format))
+ (('--file', dump_filename) if dump_format == 'directory' else ())
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
+ (() if all_databases else (name,))
# Use shell redirection rather than the --file flag to sidestep synchronization issues
# when pg_dump/pg_dumpall tries to write to a named pipe. But for the directory dump
# format in particular, a named destination is required, and redirection doesn't work.
+ (('>', dump_filename) if dump_format != 'directory' else ())
)
extra_environment = make_extra_environment(database)
        logger.debug(
            '{}: Dumping PostgreSQL database {} to {}{}'.format(
                log_prefix, name, dump_filename, dry_run_label
            )
        )
        if dry_run:
            continue
        if dump_format == 'directory':
            dump.create_parent_directory_for_dump(dump_filename)
        else:
            dump.create_named_pipe_for_dump(dump_filename)
        processes.append(
            execute_command(
                command, shell=True, extra_environment=extra_environment, run_to_completion=False
            )
        )
        dump_path = make_dump_path(location_config)
        dump_database_names = database_names_to_dump(
            database, extra_environment, log_prefix, dry_run_label
        )
        if not dump_database_names:
            raise ValueError('Cannot find any PostgreSQL databases to dump.')
for database_name in dump_database_names:
dump_format = database.get('format', None if database_name == 'all' else 'custom')
default_dump_command = 'pg_dumpall' if database_name == 'all' else 'pg_dump'
dump_command = database.get('pg_dump_command') or default_dump_command
dump_filename = dump.make_database_dump_filename(
dump_path, database_name, database.get('hostname')
)
if os.path.exists(dump_filename):
logger.warning(
f'{log_prefix}: Skipping duplicate dump of PostgreSQL database "{database_name}" to {dump_filename}'
)
continue
command = (
(dump_command, '--no-password', '--clean', '--if-exists',)
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (('--format', dump_format) if dump_format else ())
+ (('--file', dump_filename) if dump_format == 'directory' else ())
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
+ (() if database_name == 'all' else (database_name,))
# Use shell redirection rather than the --file flag to sidestep synchronization issues
# when pg_dump/pg_dumpall tries to write to a named pipe. But for the directory dump
# format in particular, a named destination is required, and redirection doesn't work.
+ (('>', dump_filename) if dump_format != 'directory' else ())
)
logger.debug(
f'{log_prefix}: Dumping PostgreSQL database "{database_name}" to {dump_filename}{dry_run_label}'
)
if dry_run:
continue
if dump_format == 'directory':
dump.create_parent_directory_for_dump(dump_filename)
else:
dump.create_named_pipe_for_dump(dump_filename)
processes.append(
execute_command(
command,
shell=True,
extra_environment=extra_environment,
run_to_completion=False,
)
)
return processes
@ -140,16 +197,19 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run,
dump_filename = dump.make_database_dump_filename(
make_dump_path(location_config), database['name'], database.get('hostname')
)
psql_command = database.get('psql_command') or 'psql'
analyze_command = (
('psql', '--no-password', '--quiet')
(psql_command, '--no-password', '--quiet')
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (('--dbname', database['name']) if not all_databases else ())
+ (tuple(database['analyze_options'].split(' ')) if 'analyze_options' in database else ())
+ ('--command', 'ANALYZE')
)
pg_restore_command = database.get('pg_restore_command') or 'pg_restore'
restore_command = (
('psql' if all_databases else 'pg_restore', '--no-password')
(psql_command if all_databases else pg_restore_command, '--no-password')
+ (
('--if-exists', '--exit-on-error', '--clean', '--dbname', database['name'])
if not all_databases
@ -158,6 +218,7 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run,
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (tuple(database['restore_options'].split(' ')) if 'restore_options' in database else ())
+ (() if extract_process else (dump_filename,))
)
extra_environment = make_extra_environment(database)


@ -85,18 +85,19 @@ class Multi_stream_handler(logging.Handler):
handler.setLevel(level)
LOG_LEVEL_TO_COLOR = {
logging.CRITICAL: colorama.Fore.RED,
logging.ERROR: colorama.Fore.RED,
logging.WARN: colorama.Fore.YELLOW,
logging.INFO: colorama.Fore.GREEN,
logging.DEBUG: colorama.Fore.CYAN,
}
class Console_color_formatter(logging.Formatter):
def format(self, record):
color = LOG_LEVEL_TO_COLOR.get(record.levelno)
add_custom_log_levels()
color = {
logging.CRITICAL: colorama.Fore.RED,
logging.ERROR: colorama.Fore.RED,
logging.WARN: colorama.Fore.YELLOW,
logging.ANSWER: colorama.Fore.MAGENTA,
logging.INFO: colorama.Fore.GREEN,
logging.DEBUG: colorama.Fore.CYAN,
}.get(record.levelno)
return color_text(color, record.msg)
@ -110,6 +111,45 @@ def color_text(color, message):
return '{}{}{}'.format(color, message, colorama.Style.RESET_ALL)
def add_logging_level(level_name, level_number):
'''
Globally add a custom logging level based on the given (all uppercase) level name and number.
Do this idempotently.
Inspired by https://stackoverflow.com/questions/2183233/how-to-add-a-custom-loglevel-to-pythons-logging-facility/35804945#35804945
'''
method_name = level_name.lower()
if not hasattr(logging, level_name):
logging.addLevelName(level_number, level_name)
setattr(logging, level_name, level_number)
if not hasattr(logging, method_name):
def log_for_level(self, message, *args, **kwargs): # pragma: no cover
if self.isEnabledFor(level_number):
self._log(level_number, message, args, **kwargs)
setattr(logging.getLoggerClass(), method_name, log_for_level)
if not hasattr(logging.getLoggerClass(), method_name):
def log_to_root(message, *args, **kwargs): # pragma: no cover
logging.log(level_number, message, *args, **kwargs)
setattr(logging, method_name, log_to_root)
ANSWER = logging.WARN - 5
def add_custom_log_levels(): # pragma: no cover
'''
Add a custom log level between WARN and INFO for user-requested answers.
'''
add_logging_level('ANSWER', ANSWER)
def configure_logging(
console_log_level,
syslog_log_level=None,
@ -130,6 +170,8 @@ def configure_logging(
if monitoring_log_level is None:
monitoring_log_level = console_log_level
add_custom_log_levels()
# Log certain log levels to console stderr and others to stdout. This supports use cases like
# grepping (non-error) output.
console_error_handler = logging.StreamHandler(sys.stderr)
@ -138,7 +180,8 @@ def configure_logging(
{
logging.CRITICAL: console_error_handler,
logging.ERROR: console_error_handler,
logging.WARN: console_standard_handler,
logging.WARN: console_error_handler,
logging.ANSWER: console_standard_handler,
logging.INFO: console_standard_handler,
logging.DEBUG: console_standard_handler,
}


@ -1,7 +1,9 @@
import logging
import borgmatic.logger
VERBOSITY_ERROR = -1
VERBOSITY_WARNING = 0
VERBOSITY_ANSWER = 0
VERBOSITY_SOME = 1
VERBOSITY_LOTS = 2
@ -10,9 +12,11 @@ def verbosity_to_log_level(verbosity):
'''
Given a borgmatic verbosity value, return the corresponding Python log level.
'''
borgmatic.logger.add_custom_log_levels()
return {
VERBOSITY_ERROR: logging.ERROR,
VERBOSITY_WARNING: logging.WARNING,
VERBOSITY_ANSWER: logging.ANSWER,
VERBOSITY_SOME: logging.INFO,
VERBOSITY_LOTS: logging.DEBUG,
}.get(verbosity, logging.WARNING)


@ -1,4 +1,4 @@
FROM alpine:3.16.0 as borgmatic
FROM alpine:3.17.1 as borgmatic
COPY . /app
RUN apk add --no-cache py3-pip py3-ruamel.yaml py3-ruamel.yaml.clib
@ -8,7 +8,7 @@ RUN borgmatic --help > /command-line.txt \
echo -e "\n--------------------------------------------------------------------------------\n" >> /command-line.txt \
&& borgmatic "$action" --help >> /command-line.txt; done
FROM node:18.4.0-alpine as html
FROM node:19.5.0-alpine as html
ARG ENVIRONMENT=production
@ -27,7 +27,7 @@ COPY . /source
RUN NODE_ENV=${ENVIRONMENT} npx eleventy --input=/source/docs --output=/output/docs \
&& mv /output/docs/index.html /output/index.html
FROM nginx:1.22.0-alpine
FROM nginx:1.22.1-alpine
COPY --from=html /output /usr/share/nginx/html
COPY --from=borgmatic /etc/borgmatic/config.yaml /usr/share/nginx/html/docs/reference/config.yaml


@ -68,6 +68,9 @@ borgmatic. borgmatic logs the soft failure, skips all further actions in that
configurable file, and proceeds onward to any other borgmatic configuration
files you may have.
Note that `before_backup` only runs on the `create` action. See below about
optionally using `before_actions` instead.
You can imagine a similar check for the sometimes-online server case:
```yaml
@ -93,6 +96,12 @@ hooks:
(Writing the battery script is left as an exercise to the reader.)
<span class="minilink minilink-addedin">New in version 1.7.0</span> The
`before_actions` and `after_actions` hooks run before/after all the actions
(like `create`, `prune`, etc.) for each repository. So if you'd like your soft
failure command hook to run regardless of action, consider using
`before_actions` instead of `before_backup`.
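As a minimal sketch of that, assuming a removable drive mounted at `/mnt/removable` (an illustrative path, not something borgmatic requires) and borgmatic's soft failure convention of exit code 75:

```yaml
hooks:
    before_actions:
        # Soft-fail (exit 75) if the removable drive isn't mounted, skipping all actions.
        - findmnt /mnt/removable > /dev/null || exit 75
```

With this in place, the check runs once per repository before any configured action rather than only before `create`.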
## Caveats and details


@ -76,6 +76,14 @@ hooks:
options: "--ssl"
```
See your [borgmatic configuration
file](https://torsion.org/borgmatic/docs/reference/configuration/) for
additional customization of the options passed to database commands (when
listing databases, restoring databases, etc.).
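For instance, here's a rough sketch of that customization for PostgreSQL using the `pg_dump_command`, `pg_restore_command`, `psql_command`, and `restore_options` options; the container name `my_pg_container` is purely illustrative:

```yaml
hooks:
    postgresql_databases:
        - name: users
          # Run the PostgreSQL client tools inside a container instead of on the host.
          pg_dump_command: docker exec my_pg_container pg_dump
          pg_restore_command: docker exec my_pg_container pg_restore
          psql_command: docker exec my_pg_container psql
          # Extra options passed through to pg_restore/psql during restores.
          restore_options: --role=someone
```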
### All databases
If you want to dump all databases on a host, use `all` for the database name:
```yaml
@ -91,9 +99,30 @@ hooks:
Note that you may need to use a `username` of the `postgres` superuser for
this to work with PostgreSQL.
If you would like to backup databases only and not source directories, you can
specify an empty `source_directories` value (as it is a mandatory field prior
to borgmatic 1.7.1):
<span class="minilink minilink-addedin">New in version 1.7.6</span> With
PostgreSQL and MySQL, you can optionally dump "all" databases to separate
files instead of one combined dump file, allowing more convenient restores of
individual databases. Enable this by specifying your desired database dump
`format`:
```yaml
hooks:
postgresql_databases:
- name: all
format: custom
mysql_databases:
- name: all
format: sql
```
### No source directories
<span class="minilink minilink-addedin">New in version 1.7.1</span> If you
would like to back up databases only and not source directories, you can omit
`source_directories` entirely.
In older versions of borgmatic, instead specify an empty `source_directories`
value, as it is a mandatory option prior to version 1.7.1:
```yaml
location:
@ -103,8 +132,7 @@ hooks:
- name: all
```
<span class="minilink minilink-addedin">New in version 1.7.1</span> You can
omit `source_directories` entirely.
### External passwords
@ -148,15 +176,15 @@ borgmatic rlist
That should yield output looking something like:
```text
host-2019-01-01T04:05:06.070809 Tue, 2019-01-01 04:05:06 [...]
host-2019-01-02T04:06:07.080910 Wed, 2019-01-02 04:06:07 [...]
host-2023-01-01T04:05:06.070809 Tue, 2023-01-01 04:05:06 [...]
host-2023-01-02T04:06:07.080910 Wed, 2023-01-02 04:06:07 [...]
```
Assuming that you want to restore all database dumps from the archive with the
most up-to-date files and therefore the latest timestamp, run a command like:
```bash
borgmatic restore --archive host-2019-01-02T04:06:07.080910
borgmatic restore --archive host-2023-01-02T04:06:07.080910
```
(No borgmatic `restore` action? Upgrade borgmatic!)
@ -185,7 +213,7 @@ But if you have multiple repositories configured, then you'll need to specify
the repository path containing the archive to restore. Here's an example:
```bash
borgmatic restore --repository repo.borg --archive host-2019-...
borgmatic restore --repository repo.borg --archive host-2023-...
```
### Restore particular databases
@ -195,9 +223,39 @@ restore one of them, use the `--database` flag to select one or more
databases. For instance:
```bash
borgmatic restore --archive host-2019-... --database users
borgmatic restore --archive host-2023-... --database users
```
<span class="minilink minilink-addedin">New in version 1.7.6</span> You can
also restore individual databases even if you dumped them as "all"—as long as
you dumped them into separate files via use of the "format" option. See above
for more information.
### Restore all databases
To restore all databases:
```bash
borgmatic restore --archive host-2023-... --database all
```
Or omit the `--database` flag entirely:
```bash
borgmatic restore --archive host-2023-...
```
Prior to borgmatic version 1.7.6, this restores a combined "all" database
dump from the archive.
<span class="minilink minilink-addedin">New in version 1.7.6</span> Restoring
"all" databases restores each database found in the selected archive. That
includes any combined dump file named "all" and any other individual database
dumps found in the archive.
### Limitations
There are a few important limitations with borgmatic's current database
@ -220,8 +278,8 @@ such files. Common directories to exclude are `/dev` and `/run`, but that may
not be exhaustive. <span class="minilink minilink-addedin">New in version
1.7.3</span> When database hooks are enabled, borgmatic automatically excludes
special files that may cause Borg to hang, so you no longer need to manually
exclude them. You can override/prevent this behavior by explicitly setting
`read_special` to true.
exclude them. (This includes symlinks with special files as a destination.) You
can override/prevent this behavior by explicitly setting `read_special` to true.
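If you do want to opt out of that automatic exclusion, a minimal sketch of the override looks like this; use it with care, since reading special files is exactly what can cause Borg to hang:

```yaml
location:
    # Force Borg to read special files even when database hooks are enabled.
    read_special: true
```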
### Manual restoration


@ -44,9 +44,11 @@ consistency checks with `check` on a much less frequent basis (e.g. with
### Consistency check configuration
Another option is to customize your consistency checks. The default
consistency checks run both full-repository checks and per-archive checks
within each repository no more than once a month.
Another option is to customize your consistency checks. By default, if you
omit consistency checks from configuration, borgmatic runs full-repository
checks (`repository`) and per-archive checks (`archives`) within each
repository, no more than once a month. This is equivalent to what `borg check`
does if run without options.
But if you find that archive checks are too slow, for example, you can
configure borgmatic to run repository checks only. Configure this in the
@ -58,14 +60,25 @@ consistency:
- name: repository
```
(Prior to borgmatic 1.6.2, `checks` was a plain list of strings without the `name:` part.)
<span class="minilink minilink-addedin">Prior to version 1.6.2</span> `checks`
was a plain list of strings without the `name:` part. For example:
```yaml
consistency:
checks:
- repository
```
Here are the available checks from fastest to slowest:
* `repository`: Checks the consistency of the repository itself.
* `archives`: Checks all of the archives in the repository.
* `extract`: Performs an extraction dry-run of the most recent archive.
* `data`: Verifies the data integrity of all archives contents, decrypting and decompressing all data (implies `archives` as well).
* `data`: Verifies the data integrity of all archives contents, decrypting and decompressing all data.
Note that the `data` check is a more thorough version of the `archives` check,
so enabling the `data` check implicitly enables the `archives` check as well.
See [Borg's check
documentation](https://borgbackup.readthedocs.io/en/stable/usage/check.html)
@ -120,7 +133,16 @@ consistency:
- name: disabled
```
Or, if you have multiple repositories in your borgmatic configuration file,
<span class="minilink minilink-addedin">Prior to version 1.6.2</span> `checks`
was a plain list of strings without the `name:` part. For instance:
```yaml
consistency:
checks:
- disabled
```
If you have multiple repositories in your borgmatic configuration file,
you can keep running consistency checks, but only against a subset of the
repositories:
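For instance, a sketch of that configuration, assuming the `check_repositories` option and an illustrative repository path:

```yaml
consistency:
    # Only run consistency checks against this repository, even if others are configured.
    check_repositories:
        - user@backupserver:sourcehostname.borg
```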


@ -42,3 +42,13 @@ potentially across providers.
See [Borg repository URLs
documentation](https://borgbackup.readthedocs.io/en/stable/usage/general.html#repository-urls)
for more information on how to specify local and remote repository paths.
### Different options per repository
What if you want borgmatic to back up to multiple repositories—while also
setting different options for each one? In that case, you'll need to use
[a separate borgmatic configuration file for each
repository](https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/)
instead of the multiple repositories in one configuration file as described
above. That's because all of the repositories in a particular configuration
file get the same options applied.
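As a rough sketch (the file names, paths, and option values here are illustrative rather than anything borgmatic mandates), that might mean two files in `/etc/borgmatic.d/`, each with its own repository and options:

```yaml
# /etc/borgmatic.d/home.yaml
location:
    source_directories:
        - /home
    repositories:
        - /mnt/backup/home.borg
storage:
    compression: lz4
```

```yaml
# /etc/borgmatic.d/databases.yaml
location:
    source_directories:
        - /var/lib/postgresql
    repositories:
        - ssh://user@backupserver/./databases.borg
storage:
    compression: zstd
```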


@ -106,11 +106,60 @@ But if you do want to merge in a YAML key *and* its values, keep reading!
## Include merging
If you need to get even fancier and pull in common configuration options while
potentially overriding individual options, you can perform a YAML merge of
included configuration using the YAML `<<` key. For instance, here's an
example of a main configuration file that pulls in two retention options via
an include and then overrides one of them locally:
If you need to get even fancier and merge in common configuration options, you
can perform a YAML merge of included configuration using the YAML `<<` key.
For instance, here's an example of a main configuration file that pulls in
retention and consistency options via a single include:
```yaml
<<: !include /etc/borgmatic/common.yaml
location:
...
```
This is what `common.yaml` might look like:
```yaml
retention:
keep_hourly: 24
keep_daily: 7
consistency:
checks:
- name: repository
```
Once this include gets merged in, the resulting configuration would have all
of the `location` options from the original configuration file *and* the
`retention` and `consistency` options from the include.
Prior to borgmatic version 1.6.0, when there was a section collision between
the local file and the merged include, the local file's section took
precedence. So if the `retention` section appeared in both the local file and
the include file, the included `retention` was ignored in favor of the local
`retention`. But see below about deep merge in version 1.6.0+.
Note that this `<<` include merging syntax is only for merging in mappings
(configuration options and their values). But if you'd like to include a
single value directly, please see the section above about standard includes.
Additionally, YAML limits you to a single `<<` include merge per section. So,
for instance, you can do one `<<` merge at the global level, another `<<`
within each configuration section, and so on, but not two `<<` merges in the
same section.
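To illustrate (the include paths here are hypothetical), one `<<` merge at the
global level can coexist with another `<<` inside a particular section:
```yaml
<<: !include /etc/borgmatic/common.yaml

retention:
    <<: !include /etc/borgmatic/common_retention.yaml
    keep_daily: 7
```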
### Deep merge
<span class="minilink minilink-addedin">New in version 1.6.0</span> borgmatic
performs a deep merge of merged include files, meaning that values are merged
at all levels in the two configuration files. This allows you to include
common configuration—up to full borgmatic configuration files—while overriding
only the parts you want to customize.
For instance, here's an example of a main configuration file that pulls in two
retention options via an include and then overrides one of them locally:
```yaml
<<: !include /etc/borgmatic/common.yaml
@ -136,24 +185,8 @@ Once this include gets merged in, the resulting configuration would have a
When there's an option collision between the local file and the merged
include, the local file's option takes precedence.
Note that this `<<` include merging syntax is only for merging in mappings
(configuration options and their values). But if you'd like to include a
single value directly, please see the section above about standard includes.
Additionally, there is a limitation preventing multiple `<<` include merges
per section. So for instance, that means you can do one `<<` merge at the
global level, another `<<` within each configuration section, etc. (This is a
YAML limitation.)
### Deep merge
<span class="minilink minilink-addedin">New in version 1.6.0</span> borgmatic
performs a deep merge of merged include files, meaning that values are merged
at all levels in the two configuration files. Colliding list values are
appended together. This allows you to include common configuration—up to full
borgmatic configuration files—while overriding only the parts you want to
customize.
<span class="minilink minilink-addedin">New in version 1.6.1</span> Colliding
list values are appended together.
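Here's a brief sketch of that list appending (paths are illustrative): if the
include and the local file each contain an `exclude_patterns` list, the
deep-merged configuration ends up with the entries from both.
```yaml
# /etc/borgmatic/common.yaml (hypothetical include)
location:
    exclude_patterns:
        - /var/cache
```
```yaml
# Local configuration file that merges in the include.
<<: !include /etc/borgmatic/common.yaml

location:
    exclude_patterns:
        - /home/*/.cache
```
With borgmatic 1.6.1+, the resulting `exclude_patterns` contains both
`/var/cache` and `/home/*/.cache`.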
## Configuration overrides

View File

@ -257,9 +257,9 @@ See `borgmatic --help` and `borgmatic create --help` for more information.
If you omit `create` and other actions, borgmatic runs through a set of
default actions: `prune` any old backups as per the configured retention
policy, `compact` segments to free up space (with Borg 1.2+), `create` a
backup, *and* `check` backups for consistency problems due to things like file
damage. For instance:
policy, `compact` segments to free up space (with Borg 1.2+, borgmatic
1.5.23+), `create` a backup, *and* `check` backups for consistency problems
due to things like file damage. For instance:
```bash
sudo borgmatic --verbosity 1 --list --stats

View File

@ -160,17 +160,22 @@ Then, run the `rcreate` action (formerly `init`) to create that new Borg 2
repository:
```bash
borgmatic rcreate --verbosity 1 --encryption repokey-aes-ocb \
borgmatic rcreate --verbosity 1 --encryption repokey-blake2-aes-ocb \
--source-repository original.borg --repository upgraded.borg
```
(Note that `repokey-chacha20-poly1305` may be faster than `repokey-aes-ocb` on
certain platforms like ARM64.)
This creates an empty repository and doesn't actually transfer any data yet.
The `--source-repository` flag is necessary to reuse key material from your
Borg 1 repository so that the subsequent data transfer can work.
The `--encryption` value above selects the same chunk ID algorithm (`blake2`)
used in Borg 1, thereby making deduplication work across transferred archives
and new archives. Note that `repokey-blake2-chacha20-poly1305` may be faster
than `repokey-blake2-aes-ocb` on certain platforms like ARM64. Read about
[Borg encryption
modes](https://borgbackup.readthedocs.io/en/2.0.0b4/usage/rcreate.html#encryption-mode-tldr)
for the menu of available encryption modes.
To transfer data from your original Borg 1 repository to your newly created
Borg 2 repository:
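The transfer command looks roughly like the following sketch; the `From12To20`
upgrader name is an assumption based on Borg 2's upgrade workflow, so
double-check `borgmatic transfer --help` on your version:
```bash
# Transfer archives from the Borg 1 repository into the new Borg 2 one.
borgmatic transfer --verbosity 1 --upgrader From12To20
```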
@ -189,9 +194,9 @@ might take a while), and the final command with `--dry-run` again provides
confirmation of success—or tells you if something hasn't been transferred yet.
Note that by omitting the `--upgrader` flag, you can also do archive transfers
between Borg 2 repositories without upgrading, even down to individual
between related Borg 2 repositories without upgrading, even down to individual
archives. For more on that functionality, see the [Borg transfer
documentation](https://borgbackup.readthedocs.io/en/2.0.0b1/usage/transfer.html).
documentation](https://borgbackup.readthedocs.io/en/2.0.0b4/usage/transfer.html).
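For instance, here's a hypothetical sketch of copying a single archive between
two related Borg 2 repositories (flag names may vary by version, so consult
`borgmatic transfer --help`):
```bash
borgmatic transfer --verbosity 1 \
    --source-repository original.borg --repository other.borg \
    --archive my-archive
```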
That's it! Now you can use your new Borg 2 repository as normal with
borgmatic. If you've got multiple repositories, repeat the above process for

View File

@ -13,9 +13,3 @@ each action sub-command:
```
{% include borgmatic/command-line.txt %}
```
## Related documentation
* [Set up backups with borgmatic](https://torsion.org/borgmatic/docs/how-to/set-up-backups/)
* [borgmatic configuration reference](https://torsion.org/borgmatic/docs/reference/configuration/)

View File

@ -15,9 +15,3 @@ Here is a full sample borgmatic configuration file including all available optio
Note that you can also [download this configuration
file](https://torsion.org/borgmatic/docs/reference/config.yaml) for use locally.
## Related documentation
* [Set up backups with borgmatic](https://torsion.org/borgmatic/docs/how-to/set-up-backups/)
* [borgmatic command-line reference](https://torsion.org/borgmatic/docs/reference/command-line/)

View File

@ -1,6 +1,6 @@
from setuptools import find_packages, setup
VERSION = '1.7.3'
VERSION = '1.7.6'
setup(

View File

@ -14,8 +14,8 @@ py==1.10.0
pycodestyle==2.8.0
pyflakes==2.4.0
jsonschema==3.2.0
pytest==6.2.5
pytest-cov==3.0.0
pytest==7.2.0
pytest-cov==4.0.0
regex; python_version >= '3.8'
requests==2.25.0
ruamel.yaml>0.15.0,<0.18.0

View File

@ -42,6 +42,11 @@ hooks:
hostname: postgresql
username: postgres
password: test
- name: all
format: custom
hostname: postgresql
username: postgres
password: test
mysql_databases:
- name: test
hostname: mysql
@ -51,6 +56,11 @@ hooks:
hostname: mysql
username: root
password: test
- name: all
format: sql
hostname: mysql
username: root
password: test
mongodb_databases:
- name: test
hostname: mongodb

View File

View File

@ -0,0 +1,22 @@
from flexmock import flexmock
from borgmatic.actions import borg as module
def test_run_borg_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module.borgmatic.borg.borg).should_receive('run_arbitrary_borg')
borg_arguments = flexmock(repository=flexmock(), archive=flexmock(), options=flexmock())
module.run_borg(
repository='repo',
storage={},
local_borg_version=None,
borg_arguments=borg_arguments,
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,19 @@
from flexmock import flexmock
from borgmatic.actions import break_lock as module
def test_run_break_lock_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.break_lock).should_receive('break_lock')
break_lock_arguments = flexmock(repository=flexmock())
module.run_break_lock(
repository='repo',
storage={},
local_borg_version=None,
break_lock_arguments=break_lock_arguments,
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,31 @@
from flexmock import flexmock
from borgmatic.actions import check as module
def test_run_check_calls_hooks():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.checks).should_receive(
'repository_enabled_for_checks'
).and_return(True)
flexmock(module.borgmatic.borg.check).should_receive('check_archives')
flexmock(module.borgmatic.hooks.command).should_receive('execute_hook').times(2)
check_arguments = flexmock(
progress=flexmock(), repair=flexmock(), only=flexmock(), force=flexmock()
)
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
module.run_check(
config_filename='test.yaml',
repository='repo',
location={'repositories': ['repo']},
storage={},
consistency={},
hooks={},
hook_context={},
local_borg_version=None,
check_arguments=check_arguments,
global_arguments=global_arguments,
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,29 @@
from flexmock import flexmock
from borgmatic.actions import compact as module
def test_compact_actions_calls_hooks():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.borg.feature).should_receive('available').and_return(True)
flexmock(module.borgmatic.borg.compact).should_receive('compact_segments')
flexmock(module.borgmatic.hooks.command).should_receive('execute_hook').times(2)
compact_arguments = flexmock(
progress=flexmock(), cleanup_commits=flexmock(), threshold=flexmock()
)
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
module.run_compact(
config_filename='test.yaml',
repository='repo',
storage={},
retention={},
hooks={},
hook_context={},
local_borg_version=None,
compact_arguments=compact_arguments,
global_arguments=global_arguments,
dry_run_label='',
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,34 @@
from flexmock import flexmock
from borgmatic.actions import create as module
def test_run_create_executes_and_calls_hooks():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.borg.create).should_receive('create_archive')
flexmock(module.borgmatic.hooks.command).should_receive('execute_hook').times(2)
flexmock(module.borgmatic.hooks.dispatch).should_receive('call_hooks').and_return({})
flexmock(module.borgmatic.hooks.dispatch).should_receive(
'call_hooks_even_if_unconfigured'
).and_return({})
create_arguments = flexmock(
progress=flexmock(), stats=flexmock(), json=flexmock(), list_files=flexmock()
)
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
list(
module.run_create(
config_filename='test.yaml',
repository='repo',
location={},
storage={},
hooks={},
hook_context={},
local_borg_version=None,
create_arguments=create_arguments,
global_arguments=global_arguments,
dry_run_label='',
local_path=None,
remote_path=None,
)
)

View File

@ -0,0 +1,29 @@
from flexmock import flexmock
from borgmatic.actions import export_tar as module
def test_run_export_tar_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.export_tar).should_receive('export_tar_archive')
export_tar_arguments = flexmock(
repository=flexmock(),
archive=flexmock(),
paths=flexmock(),
destination=flexmock(),
tar_filter=flexmock(),
list_files=flexmock(),
strip_components=flexmock(),
)
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
module.run_export_tar(
repository='repo',
storage={},
local_borg_version=None,
export_tar_arguments=export_tar_arguments,
global_arguments=global_arguments,
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,33 @@
from flexmock import flexmock
from borgmatic.actions import extract as module
def test_run_extract_calls_hooks():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.extract).should_receive('extract_archive')
flexmock(module.borgmatic.hooks.command).should_receive('execute_hook').times(2)
extract_arguments = flexmock(
paths=flexmock(),
progress=flexmock(),
destination=flexmock(),
strip_components=flexmock(),
archive=flexmock(),
repository='repo',
)
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
module.run_extract(
config_filename='test.yaml',
repository='repo',
location={'repositories': ['repo']},
storage={},
hooks={},
hook_context={},
local_borg_version=None,
extract_arguments=extract_arguments,
global_arguments=global_arguments,
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,24 @@
from flexmock import flexmock
from borgmatic.actions import info as module
def test_run_info_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module.borgmatic.borg.info).should_receive('display_archives_info')
info_arguments = flexmock(repository=flexmock(), archive=flexmock(), json=flexmock())
list(
module.run_info(
repository='repo',
storage={},
local_borg_version=None,
info_arguments=info_arguments,
local_path=None,
remote_path=None,
)
)

View File

@ -0,0 +1,24 @@
from flexmock import flexmock
from borgmatic.actions import list as module
def test_run_list_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module.borgmatic.borg.list).should_receive('list_archive')
list_arguments = flexmock(repository=flexmock(), archive=flexmock(), json=flexmock())
list(
module.run_list(
repository='repo',
storage={},
local_borg_version=None,
list_arguments=list_arguments,
local_path=None,
remote_path=None,
)
)

View File

@ -0,0 +1,26 @@
from flexmock import flexmock
from borgmatic.actions import mount as module
def test_run_mount_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.mount).should_receive('mount_archive')
mount_arguments = flexmock(
repository=flexmock(),
archive=flexmock(),
mount_point=flexmock(),
paths=flexmock(),
foreground=flexmock(),
options=flexmock(),
)
module.run_mount(
repository='repo',
storage={},
local_borg_version=None,
mount_arguments=mount_arguments,
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,26 @@
from flexmock import flexmock
from borgmatic.actions import prune as module
def test_run_prune_calls_hooks():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.borg.prune).should_receive('prune_archives')
flexmock(module.borgmatic.hooks.command).should_receive('execute_hook').times(2)
prune_arguments = flexmock(stats=flexmock(), list_archives=flexmock())
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
module.run_prune(
config_filename='test.yaml',
repository='repo',
storage={},
retention={},
hooks={},
hook_context={},
local_borg_version=None,
prune_arguments=prune_arguments,
global_arguments=global_arguments,
dry_run_label='',
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,26 @@
from flexmock import flexmock
from borgmatic.actions import rcreate as module
def test_run_rcreate_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.borg.rcreate).should_receive('create_repository')
arguments = flexmock(
encryption_mode=flexmock(),
source_repository=flexmock(),
copy_crypt_key=flexmock(),
append_only=flexmock(),
storage_quota=flexmock(),
make_parent_dirs=flexmock(),
)
module.run_rcreate(
repository='repo',
storage={},
local_borg_version=None,
rcreate_arguments=arguments,
global_arguments=flexmock(dry_run=False),
local_path=None,
remote_path=None,
)

View File

@ -0,0 +1,495 @@
import pytest
from flexmock import flexmock
import borgmatic.actions.restore as module
def test_get_configured_database_matches_database_by_name():
assert module.get_configured_database(
hooks={
'other_databases': [{'name': 'other'}],
'postgresql_databases': [{'name': 'foo'}, {'name': 'bar'}],
},
archive_database_names={'postgresql_databases': ['other', 'foo', 'bar']},
hook_name='postgresql_databases',
database_name='bar',
) == ('postgresql_databases', {'name': 'bar'})
def test_get_configured_database_matches_nothing_when_database_name_not_configured():
assert module.get_configured_database(
hooks={'postgresql_databases': [{'name': 'foo'}, {'name': 'bar'}]},
archive_database_names={'postgresql_databases': ['foo']},
hook_name='postgresql_databases',
database_name='quux',
) == (None, None)
def test_get_configured_database_matches_nothing_when_database_name_not_in_archive():
assert module.get_configured_database(
hooks={'postgresql_databases': [{'name': 'foo'}, {'name': 'bar'}]},
archive_database_names={'postgresql_databases': ['bar']},
hook_name='postgresql_databases',
database_name='foo',
) == (None, None)
def test_get_configured_database_matches_database_by_configuration_database_name():
assert module.get_configured_database(
hooks={'postgresql_databases': [{'name': 'all'}, {'name': 'bar'}]},
archive_database_names={'postgresql_databases': ['foo']},
hook_name='postgresql_databases',
database_name='foo',
configuration_database_name='all',
) == ('postgresql_databases', {'name': 'all'})
def test_get_configured_database_with_unspecified_hook_matches_database_by_name():
assert module.get_configured_database(
hooks={
'other_databases': [{'name': 'other'}],
'postgresql_databases': [{'name': 'foo'}, {'name': 'bar'}],
},
archive_database_names={'postgresql_databases': ['other', 'foo', 'bar']},
hook_name=module.UNSPECIFIED_HOOK,
database_name='bar',
) == ('postgresql_databases', {'name': 'bar'})
def test_collect_archive_database_names_parses_archive_paths():
flexmock(module.borgmatic.hooks.dump).should_receive('make_database_dump_path').and_return('')
flexmock(module.borgmatic.borg.list).should_receive('capture_archive_listing').and_return(
[
'.borgmatic/postgresql_databases/localhost/foo',
'.borgmatic/postgresql_databases/localhost/bar',
'.borgmatic/mysql_databases/localhost/quux',
]
)
archive_database_names = module.collect_archive_database_names(
repository='repo',
archive='archive',
location={'borgmatic_source_directory': '.borgmatic'},
storage=flexmock(),
local_borg_version=flexmock(),
local_path=flexmock(),
remote_path=flexmock(),
)
assert archive_database_names == {
'postgresql_databases': ['foo', 'bar'],
'mysql_databases': ['quux'],
}
def test_collect_archive_database_names_parses_directory_format_archive_paths():
flexmock(module.borgmatic.hooks.dump).should_receive('make_database_dump_path').and_return('')
flexmock(module.borgmatic.borg.list).should_receive('capture_archive_listing').and_return(
[
'.borgmatic/postgresql_databases/localhost/foo/table1',
'.borgmatic/postgresql_databases/localhost/foo/table2',
]
)
archive_database_names = module.collect_archive_database_names(
repository='repo',
archive='archive',
location={'borgmatic_source_directory': '.borgmatic'},
storage=flexmock(),
local_borg_version=flexmock(),
local_path=flexmock(),
remote_path=flexmock(),
)
assert archive_database_names == {
'postgresql_databases': ['foo'],
}
def test_collect_archive_database_names_skips_bad_archive_paths():
flexmock(module.borgmatic.hooks.dump).should_receive('make_database_dump_path').and_return('')
flexmock(module.borgmatic.borg.list).should_receive('capture_archive_listing').and_return(
['.borgmatic/postgresql_databases/localhost/foo', '.borgmatic/invalid', 'invalid/as/well']
)
archive_database_names = module.collect_archive_database_names(
repository='repo',
archive='archive',
location={'borgmatic_source_directory': '.borgmatic'},
storage=flexmock(),
local_borg_version=flexmock(),
local_path=flexmock(),
remote_path=flexmock(),
)
assert archive_database_names == {
'postgresql_databases': ['foo'],
}
def test_find_databases_to_restore_passes_through_requested_names_found_in_archive():
restore_names = module.find_databases_to_restore(
requested_database_names=['foo', 'bar'],
archive_database_names={'postresql_databases': ['foo', 'bar', 'baz']},
)
assert restore_names == {module.UNSPECIFIED_HOOK: ['foo', 'bar']}
def test_find_databases_to_restore_raises_for_requested_names_missing_from_archive():
with pytest.raises(ValueError):
module.find_databases_to_restore(
requested_database_names=['foo', 'bar'],
archive_database_names={'postresql_databases': ['foo']},
)
def test_find_databases_to_restore_without_requested_names_finds_all_archive_databases():
archive_database_names = {'postresql_databases': ['foo', 'bar']}
restore_names = module.find_databases_to_restore(
requested_database_names=[], archive_database_names=archive_database_names,
)
assert restore_names == archive_database_names
def test_find_databases_to_restore_with_all_in_requested_names_finds_all_archive_databases():
archive_database_names = {'postresql_databases': ['foo', 'bar']}
restore_names = module.find_databases_to_restore(
requested_database_names=['all'], archive_database_names=archive_database_names,
)
assert restore_names == archive_database_names
def test_find_databases_to_restore_with_all_in_requested_names_plus_additional_requested_names_omits_duplicates():
archive_database_names = {'postresql_databases': ['foo', 'bar']}
restore_names = module.find_databases_to_restore(
requested_database_names=['all', 'foo', 'bar'],
archive_database_names=archive_database_names,
)
assert restore_names == archive_database_names
def test_find_databases_to_restore_raises_for_all_in_requested_names_and_requested_named_missing_from_archives():
with pytest.raises(ValueError):
module.find_databases_to_restore(
requested_database_names=['all', 'foo', 'bar'],
archive_database_names={'postresql_databases': ['foo']},
)
def test_ensure_databases_found_with_all_databases_found_does_not_raise():
module.ensure_databases_found(
restore_names={'postgresql_databases': ['foo']},
remaining_restore_names={'postgresql_databases': ['bar']},
found_names=['foo', 'bar'],
)
def test_ensure_databases_found_with_no_databases_raises():
with pytest.raises(ValueError):
module.ensure_databases_found(
restore_names={'postgresql_databases': []}, remaining_restore_names={}, found_names=[],
)
def test_ensure_databases_found_with_missing_databases_raises():
with pytest.raises(ValueError):
module.ensure_databases_found(
restore_names={'postgresql_databases': ['foo']},
remaining_restore_names={'postgresql_databases': ['bar']},
found_names=['foo'],
)
def test_run_restore_restores_each_database():
restore_names = {
'postgresql_databases': ['foo', 'bar'],
}
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.hooks.dispatch).should_receive('call_hooks_even_if_unconfigured')
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module).should_receive('collect_archive_database_names').and_return(flexmock())
flexmock(module).should_receive('find_databases_to_restore').and_return(restore_names)
flexmock(module).should_receive('get_configured_database').and_return(
('postgresql_databases', {'name': 'foo'})
).and_return(('postgresql_databases', {'name': 'bar'}))
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'foo'},
).once()
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'bar'},
).once()
flexmock(module).should_receive('ensure_databases_found')
module.run_restore(
repository='repo',
location=flexmock(),
storage=flexmock(),
hooks=flexmock(),
local_borg_version=flexmock(),
restore_arguments=flexmock(repository='repo', archive='archive', databases=flexmock()),
global_arguments=flexmock(dry_run=False),
local_path=flexmock(),
remote_path=flexmock(),
)
def test_run_restore_bails_for_non_matching_repository():
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(
False
)
flexmock(module.borgmatic.hooks.dispatch).should_receive(
'call_hooks_even_if_unconfigured'
).never()
flexmock(module).should_receive('restore_single_database').never()
module.run_restore(
repository='repo',
location=flexmock(),
storage=flexmock(),
hooks=flexmock(),
local_borg_version=flexmock(),
restore_arguments=flexmock(repository='repo', archive='archive', databases=flexmock()),
global_arguments=flexmock(dry_run=False),
local_path=flexmock(),
remote_path=flexmock(),
)
def test_run_restore_restores_database_configured_with_all_name():
restore_names = {
'postgresql_databases': ['foo', 'bar'],
}
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.hooks.dispatch).should_receive('call_hooks_even_if_unconfigured')
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module).should_receive('collect_archive_database_names').and_return(flexmock())
flexmock(module).should_receive('find_databases_to_restore').and_return(restore_names)
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='foo',
).and_return(('postgresql_databases', {'name': 'foo'}))
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='bar',
).and_return((None, None))
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='bar',
configuration_database_name='all',
).and_return(('postgresql_databases', {'name': 'bar'}))
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'foo'},
).once()
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'bar'},
).once()
flexmock(module).should_receive('ensure_databases_found')
module.run_restore(
repository='repo',
location=flexmock(),
storage=flexmock(),
hooks=flexmock(),
local_borg_version=flexmock(),
restore_arguments=flexmock(repository='repo', archive='archive', databases=flexmock()),
global_arguments=flexmock(dry_run=False),
local_path=flexmock(),
remote_path=flexmock(),
)
def test_run_restore_skips_missing_database():
restore_names = {
'postgresql_databases': ['foo', 'bar'],
}
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.hooks.dispatch).should_receive('call_hooks_even_if_unconfigured')
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module).should_receive('collect_archive_database_names').and_return(flexmock())
flexmock(module).should_receive('find_databases_to_restore').and_return(restore_names)
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='foo',
).and_return(('postgresql_databases', {'name': 'foo'}))
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='bar',
).and_return((None, None))
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='bar',
configuration_database_name='all',
).and_return((None, None))
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'foo'},
).once()
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'bar'},
).never()
flexmock(module).should_receive('ensure_databases_found')
module.run_restore(
repository='repo',
location=flexmock(),
storage=flexmock(),
hooks=flexmock(),
local_borg_version=flexmock(),
restore_arguments=flexmock(repository='repo', archive='archive', databases=flexmock()),
global_arguments=flexmock(dry_run=False),
local_path=flexmock(),
remote_path=flexmock(),
)
def test_run_restore_restores_databases_from_different_hooks():
restore_names = {
'postgresql_databases': ['foo'],
'mysql_databases': ['bar'],
}
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.hooks.dispatch).should_receive('call_hooks_even_if_unconfigured')
flexmock(module.borgmatic.borg.rlist).should_receive('resolve_archive_name').and_return(
flexmock()
)
flexmock(module).should_receive('collect_archive_database_names').and_return(flexmock())
flexmock(module).should_receive('find_databases_to_restore').and_return(restore_names)
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='postgresql_databases',
database_name='foo',
).and_return(('postgresql_databases', {'name': 'foo'}))
flexmock(module).should_receive('get_configured_database').with_args(
hooks=object,
archive_database_names=object,
hook_name='mysql_databases',
database_name='bar',
).and_return(('mysql_databases', {'name': 'bar'}))
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='postgresql_databases',
database={'name': 'foo'},
).once()
flexmock(module).should_receive('restore_single_database').with_args(
repository=object,
location=object,
storage=object,
hooks=object,
local_borg_version=object,
global_arguments=object,
local_path=object,
remote_path=object,
archive_name=object,
hook_name='mysql_databases',
database={'name': 'bar'},
).once()
flexmock(module).should_receive('ensure_databases_found')
module.run_restore(
repository='repo',
location=flexmock(),
storage=flexmock(),
hooks=flexmock(),
local_borg_version=flexmock(),
restore_arguments=flexmock(repository='repo', archive='archive', databases=flexmock()),
global_arguments=flexmock(dry_run=False),
local_path=flexmock(),
remote_path=flexmock(),
)

View File

@ -0,0 +1,21 @@
from flexmock import flexmock
from borgmatic.actions import rinfo as module
def test_run_rinfo_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.rinfo).should_receive('display_repository_info')
rinfo_arguments = flexmock(repository=flexmock(), json=flexmock())
list(
module.run_rinfo(
repository='repo',
storage={},
local_borg_version=None,
rinfo_arguments=rinfo_arguments,
local_path=None,
remote_path=None,
)
)

View File

@ -0,0 +1,21 @@
from flexmock import flexmock
from borgmatic.actions import rlist as module
def test_run_rlist_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.config.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borgmatic.borg.rlist).should_receive('list_repository')
rlist_arguments = flexmock(repository=flexmock(), json=flexmock())
list(
module.run_rlist(
repository='repo',
storage={},
local_borg_version=None,
rlist_arguments=rlist_arguments,
local_path=None,
remote_path=None,
)
)

View File

@ -0,0 +1,20 @@
from flexmock import flexmock
from borgmatic.actions import transfer as module
def test_run_transfer_does_not_raise():
flexmock(module.logger).answer = lambda message: None
flexmock(module.borgmatic.borg.transfer).should_receive('transfer_archives')
transfer_arguments = flexmock()
global_arguments = flexmock(monitoring_verbosity=1, dry_run=False)
module.run_transfer(
repository='repo',
storage={},
local_borg_version=None,
transfer_arguments=transfer_arguments,
global_arguments=global_arguments,
local_path=None,
remote_path=None,
)

View File

@ -8,12 +8,14 @@ from ..test_verbosity import insert_logging_mock
def test_run_arbitrary_borg_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -24,12 +26,14 @@ def test_run_arbitrary_borg_calls_borg_with_parameters():
def test_run_arbitrary_borg_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--info'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -41,12 +45,14 @@ def test_run_arbitrary_borg_with_log_info_calls_borg_with_info_parameter():
def test_run_arbitrary_borg_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--debug', '--show-rc'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -58,6 +64,8 @@ def test_run_arbitrary_borg_with_log_debug_calls_borg_with_debug_parameter():
def test_run_arbitrary_borg_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'lock_wait': 5}
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(()).and_return(
@ -66,7 +74,7 @@ def test_run_arbitrary_borg_with_lock_wait_calls_borg_with_lock_wait_parameters(
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--lock-wait', '5'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -80,6 +88,8 @@ def test_run_arbitrary_borg_with_lock_wait_calls_borg_with_lock_wait_parameters(
def test_run_arbitrary_borg_with_archive_calls_borg_with_archive_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -87,7 +97,7 @@ def test_run_arbitrary_borg_with_archive_calls_borg_with_archive_parameter():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -102,12 +112,14 @@ def test_run_arbitrary_borg_with_archive_calls_borg_with_archive_parameter():
def test_run_arbitrary_borg_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'break-lock', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg1',
extra_environment=None,
)
@ -122,6 +134,8 @@ def test_run_arbitrary_borg_with_local_path_calls_borg_via_local_path():
def test_run_arbitrary_borg_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(
('--remote-path', 'borg1')
@ -129,7 +143,7 @@ def test_run_arbitrary_borg_with_remote_path_calls_borg_with_remote_path_paramet
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--remote-path', 'borg1'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -144,12 +158,14 @@ def test_run_arbitrary_borg_with_remote_path_calls_borg_with_remote_path_paramet
def test_run_arbitrary_borg_passes_borg_specific_parameters_to_borg():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo', '--progress'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -163,12 +179,14 @@ def test_run_arbitrary_borg_passes_borg_specific_parameters_to_borg():
def test_run_arbitrary_borg_omits_dash_dash_in_parameters_passed_to_borg():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -182,11 +200,16 @@ def test_run_arbitrary_borg_omits_dash_dash_in_parameters_passed_to_borg():
def test_run_arbitrary_borg_without_borg_specific_parameters_does_not_raise():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').never()
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg',), output_log_level=logging.WARNING, borg_local_path='borg', extra_environment=None,
('borg',),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
@ -195,12 +218,14 @@ def test_run_arbitrary_borg_without_borg_specific_parameters_does_not_raise():
def test_run_arbitrary_borg_passes_key_sub_command_to_borg_before_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'key', 'export', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -211,12 +236,14 @@ def test_run_arbitrary_borg_passes_key_sub_command_to_borg_before_repository():
def test_run_arbitrary_borg_passes_debug_sub_command_to_borg_before_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'debug', 'dump-manifest', 'repo', 'path'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -230,12 +257,14 @@ def test_run_arbitrary_borg_passes_debug_sub_command_to_borg_before_repository()
def test_run_arbitrary_borg_with_debug_info_command_does_not_pass_borg_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').never()
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'debug', 'info'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -246,12 +275,14 @@ def test_run_arbitrary_borg_with_debug_info_command_does_not_pass_borg_repositor
def test_run_arbitrary_borg_with_debug_convert_profile_command_does_not_pass_borg_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').never()
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'debug', 'convert-profile', 'in', 'out'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)

View File

@ -265,6 +265,14 @@ def test_make_check_flags_with_archives_check_and_last_includes_last_flag():
assert flags == ('--archives-only', '--last', '3')
def test_make_check_flags_with_data_check_and_last_includes_last_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('data',), check_last=3)
assert flags == ('--archives-only', '--last', '3', '--verify-data')
def test_make_check_flags_with_repository_check_and_last_omits_last_flag():
flexmock(module.feature).should_receive('available').and_return(True)
@ -289,6 +297,14 @@ def test_make_check_flags_with_archives_check_and_prefix_includes_match_archives
assert flags == ('--archives-only', '--match-archives', 'sh:foo-*')
def test_make_check_flags_with_data_check_and_prefix_includes_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('data',), prefix='foo-')
assert flags == ('--archives-only', '--match-archives', 'sh:foo-*', '--verify-data')
def test_make_check_flags_with_archives_check_and_empty_prefix_omits_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)

View File

@ -338,6 +338,12 @@ def test_special_file_looks_at_file_type(character_device, block_device, fifo, e
assert module.special_file('/dev/special') == expected_result
def test_special_file_treats_broken_symlink_as_non_special():
flexmock(module.os).should_receive('stat').and_raise(FileNotFoundError)
assert module.special_file('/broken/symlink') is False
def test_any_parent_directories_treats_parents_as_match():
module.any_parent_directories('/foo/bar.txt', ('/foo', '/etc'))
@ -351,7 +357,7 @@ def test_any_parent_directories_treats_unrelated_paths_as_non_match():
def test_collect_special_file_paths_parses_special_files_from_borg_dry_run_file_list():
flexmock(module).should_receive('execute_command').and_return(
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'Processing files ...\n- /foo\n- /bar\n- /baz'
)
flexmock(module).should_receive('special_file').and_return(True)
@ -367,7 +373,9 @@ def test_collect_special_file_paths_parses_special_files_from_borg_dry_run_file_
def test_collect_special_file_paths_excludes_requested_directories():
flexmock(module).should_receive('execute_command').and_return('- /foo\n- /bar\n- /baz')
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'- /foo\n- /bar\n- /baz'
)
flexmock(module).should_receive('special_file').and_return(True)
flexmock(module).should_receive('any_parent_directories').and_return(False).and_return(
True
@ -383,7 +391,9 @@ def test_collect_special_file_paths_excludes_requested_directories():
def test_collect_special_file_paths_excludes_non_special_files():
flexmock(module).should_receive('execute_command').and_return('- /foo\n- /bar\n- /baz')
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'- /foo\n- /bar\n- /baz'
)
flexmock(module).should_receive('special_file').and_return(True).and_return(False).and_return(
True
)
@ -403,6 +413,8 @@ REPO_ARCHIVE_WITH_PATHS = (f'repo::{DEFAULT_ARCHIVE_NAME}', 'foo', 'bar')
def test_create_archive_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -442,6 +454,8 @@ def test_create_archive_calls_borg_with_parameters():
def test_create_archive_calls_borg_with_environment():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -482,6 +496,8 @@ def test_create_archive_calls_borg_with_environment():
def test_create_archive_with_patterns_calls_borg_with_patterns_including_converted_source_directories():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
pattern_flags = ('--patterns-from', 'patterns')
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -524,6 +540,8 @@ def test_create_archive_with_patterns_calls_borg_with_patterns_including_convert
def test_create_archive_with_exclude_patterns_calls_borg_with_excludes():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
exclude_flags = ('--exclude-from', 'excludes')
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -566,6 +584,8 @@ def test_create_archive_with_exclude_patterns_calls_borg_with_excludes():
def test_create_archive_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -606,6 +626,8 @@ def test_create_archive_with_log_info_calls_borg_with_info_parameter():
def test_create_archive_with_log_info_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -622,11 +644,8 @@ def test_create_archive_with_log_info_and_json_suppresses_most_borg_output():
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'create') + REPO_ARCHIVE_WITH_PATHS + ('--json',),
output_log_level=None,
output_file=None,
borg_local_path='borg',
working_directory=None,
extra_environment=None,
)
@ -647,6 +666,8 @@ def test_create_archive_with_log_info_and_json_suppresses_most_borg_output():
def test_create_archive_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -687,6 +708,8 @@ def test_create_archive_with_log_debug_calls_borg_with_debug_parameter():
def test_create_archive_with_log_debug_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -703,11 +726,8 @@ def test_create_archive_with_log_debug_and_json_suppresses_most_borg_output():
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'create') + REPO_ARCHIVE_WITH_PATHS + ('--json',),
output_log_level=None,
output_file=None,
borg_local_path='borg',
working_directory=None,
extra_environment=None,
)
@ -728,6 +748,8 @@ def test_create_archive_with_log_debug_and_json_suppresses_most_borg_output():
def test_create_archive_with_dry_run_calls_borg_with_dry_run_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -769,6 +791,8 @@ def test_create_archive_with_dry_run_calls_borg_with_dry_run_parameter():
def test_create_archive_with_stats_and_dry_run_calls_borg_without_stats_parameter():
# --dry-run and --stats are mutually exclusive, see:
# https://borgbackup.readthedocs.io/en/stable/usage/create.html#description
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -810,6 +834,8 @@ def test_create_archive_with_stats_and_dry_run_calls_borg_without_stats_paramete
def test_create_archive_with_checkpoint_interval_calls_borg_with_checkpoint_interval_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -849,6 +875,8 @@ def test_create_archive_with_checkpoint_interval_calls_borg_with_checkpoint_inte
def test_create_archive_with_chunker_params_calls_borg_with_chunker_params_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -888,6 +916,8 @@ def test_create_archive_with_chunker_params_calls_borg_with_chunker_params_param
def test_create_archive_with_compression_calls_borg_with_compression_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -932,6 +962,8 @@ def test_create_archive_with_compression_calls_borg_with_compression_parameters(
def test_create_archive_with_upload_rate_limit_calls_borg_with_upload_ratelimit_parameters(
feature_available, option_flag
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -971,6 +1003,8 @@ def test_create_archive_with_upload_rate_limit_calls_borg_with_upload_ratelimit_
def test_create_archive_with_working_directory_calls_borg_with_working_directory():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1013,6 +1047,8 @@ def test_create_archive_with_working_directory_calls_borg_with_working_directory
def test_create_archive_with_one_file_system_calls_borg_with_one_file_system_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1058,6 +1094,8 @@ def test_create_archive_with_one_file_system_calls_borg_with_one_file_system_par
def test_create_archive_with_numeric_ids_calls_borg_with_numeric_ids_parameter(
feature_available, option_flag
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1098,6 +1136,8 @@ def test_create_archive_with_numeric_ids_calls_borg_with_numeric_ids_parameter(
def test_create_archive_with_read_special_calls_borg_with_read_special_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1154,6 +1194,8 @@ def test_create_archive_with_read_special_calls_borg_with_read_special_parameter
def test_create_archive_with_basic_option_calls_borg_with_corresponding_parameter(
option_name, option_value
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
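# A False option value maps to Borg's negated flag: a hypothetical 'foo_bar' option set to
# False would become '--nofoobar'.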
option_flag = '--no' + option_name.replace('_', '') if option_value is False else None
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -1206,6 +1248,8 @@ def test_create_archive_with_basic_option_calls_borg_with_corresponding_paramete
def test_create_archive_with_atime_option_calls_borg_with_corresponding_parameter(
option_value, feature_available, option_flag
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1257,6 +1301,8 @@ def test_create_archive_with_atime_option_calls_borg_with_corresponding_paramete
def test_create_archive_with_flags_option_calls_borg_with_corresponding_parameter(
option_value, feature_available, option_flag
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1297,6 +1343,8 @@ def test_create_archive_with_flags_option_calls_borg_with_corresponding_paramete
def test_create_archive_with_files_cache_calls_borg_with_files_cache_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1337,6 +1385,8 @@ def test_create_archive_with_files_cache_calls_borg_with_files_cache_parameters(
def test_create_archive_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1377,6 +1427,8 @@ def test_create_archive_with_local_path_calls_borg_via_local_path():
def test_create_archive_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1417,6 +1469,8 @@ def test_create_archive_with_remote_path_calls_borg_with_remote_path_parameters(
def test_create_archive_with_umask_calls_borg_with_umask_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1456,6 +1510,8 @@ def test_create_archive_with_umask_calls_borg_with_umask_parameters():
def test_create_archive_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1494,7 +1550,9 @@ def test_create_archive_with_lock_wait_calls_borg_with_lock_wait_parameters():
)
def test_create_archive_with_stats_calls_borg_with_stats_parameter_and_warning_output_log_level():
def test_create_archive_with_stats_calls_borg_with_stats_parameter_and_answer_output_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1513,7 +1571,7 @@ def test_create_archive_with_stats_calls_borg_with_stats_parameter_and_warning_o
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'create') + REPO_ARCHIVE_WITH_PATHS + ('--stats',),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
output_file=None,
borg_local_path='borg',
working_directory=None,
@ -1534,48 +1592,9 @@ def test_create_archive_with_stats_calls_borg_with_stats_parameter_and_warning_o
)
def test_create_archive_with_stats_and_log_info_calls_borg_with_stats_parameter_and_info_output_log_level():
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
flexmock(module).should_receive('expand_directories').and_return(())
flexmock(module).should_receive('pattern_root_directories').and_return([])
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
flexmock(module).should_receive('make_exclude_flags').and_return(())
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'create') + REPO_ARCHIVE_WITH_PATHS + ('--info', '--stats'),
output_log_level=logging.INFO,
output_file=None,
borg_local_path='borg',
working_directory=None,
extra_environment=None,
)
insert_logging_mock(logging.INFO)
module.create_archive(
dry_run=False,
repository='repo',
location_config={
'source_directories': ['foo', 'bar'],
'repositories': ['repo'],
'exclude_patterns': None,
},
storage_config={},
local_borg_version='1.2.3',
stats=True,
)
def test_create_archive_with_files_calls_borg_with_list_parameter_and_warning_output_log_level():
def test_create_archive_with_files_calls_borg_with_list_parameter_and_answer_output_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1594,7 +1613,7 @@ def test_create_archive_with_files_calls_borg_with_list_parameter_and_warning_ou
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'create', '--list', '--filter', 'AMEx-') + REPO_ARCHIVE_WITH_PATHS,
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
output_file=None,
borg_local_path='borg',
working_directory=None,
@ -1615,48 +1634,9 @@ def test_create_archive_with_files_calls_borg_with_list_parameter_and_warning_ou
)
def test_create_archive_with_files_and_log_info_calls_borg_with_list_parameter_and_info_output_log_level():
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
flexmock(module).should_receive('expand_directories').and_return(())
flexmock(module).should_receive('pattern_root_directories').and_return([])
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module).should_receive('make_pattern_flags').and_return(())
flexmock(module).should_receive('make_exclude_flags').and_return(())
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'create', '--list', '--filter', 'AMEx-') + REPO_ARCHIVE_WITH_PATHS + ('--info',),
output_log_level=logging.INFO,
output_file=None,
borg_local_path='borg',
working_directory=None,
extra_environment=None,
)
insert_logging_mock(logging.INFO)
module.create_archive(
dry_run=False,
repository='repo',
location_config={
'source_directories': ['foo', 'bar'],
'repositories': ['repo'],
'exclude_patterns': None,
},
storage_config={},
local_borg_version='1.2.3',
list_files=True,
)
def test_create_archive_with_progress_and_log_info_calls_borg_with_progress_parameter_and_no_list():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1698,6 +1678,8 @@ def test_create_archive_with_progress_and_log_info_calls_borg_with_progress_para
def test_create_archive_with_progress_calls_borg_with_progress_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1738,6 +1720,8 @@ def test_create_archive_with_progress_calls_borg_with_progress_parameter():
def test_create_archive_with_progress_and_stream_processes_calls_borg_with_progress_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
processes = flexmock()
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -1796,6 +1780,8 @@ def test_create_archive_with_progress_and_stream_processes_calls_borg_with_progr
def test_create_archive_with_stream_processes_ignores_read_special_false_and_logs_warnings():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
processes = flexmock()
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -1804,7 +1790,7 @@ def test_create_archive_with_stream_processes_ignores_read_special_false_and_log
flexmock(module).should_receive('pattern_root_directories').and_return([])
flexmock(module.os.path).should_receive('expanduser').and_raise(TypeError)
flexmock(module).should_receive('expand_home_directories').and_return(())
flexmock(module).should_receive('write_pattern_file').and_return(None)
flexmock(module).should_receive('write_pattern_file').and_return(flexmock(name='/tmp/excludes'))
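# A fake file object (only a name attribute is needed) stands in for the pattern/exclude
# file that borgmatic writes out.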
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module).should_receive('ensure_files_readable')
flexmock(module.logger).should_receive('warning').twice()
@ -1814,7 +1800,7 @@ def test_create_archive_with_stream_processes_ignores_read_special_false_and_log
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('collect_special_file_paths').and_return(())
flexmock(module).should_receive('collect_special_file_paths').and_return(("/dev/null",))
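# A non-empty result means special files were detected, so the exclusion warning still
# fires; that is what satisfies the two warning calls expected above.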
create_command = (
'borg',
'create',
@ -1856,6 +1842,8 @@ def test_create_archive_with_stream_processes_ignores_read_special_false_and_log
def test_create_archive_with_stream_processes_adds_special_files_to_excludes():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
processes = flexmock()
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -1920,6 +1908,8 @@ def test_create_archive_with_stream_processes_adds_special_files_to_excludes():
def test_create_archive_with_stream_processes_and_read_special_does_not_add_special_files_to_excludes():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
processes = flexmock()
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -1981,6 +1971,8 @@ def test_create_archive_with_stream_processes_and_read_special_does_not_add_spec
def test_create_archive_with_json_calls_borg_with_json_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -1997,11 +1989,8 @@ def test_create_archive_with_json_calls_borg_with_json_parameter():
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'create') + REPO_ARCHIVE_WITH_PATHS + ('--json',),
output_log_level=None,
output_file=None,
borg_local_path='borg',
working_directory=None,
extra_environment=None,
).and_return('[]')
@ -2023,6 +2012,8 @@ def test_create_archive_with_json_calls_borg_with_json_parameter():
def test_create_archive_with_stats_and_json_calls_borg_without_stats_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -2039,11 +2030,8 @@ def test_create_archive_with_stats_and_json_calls_borg_without_stats_parameter()
(f'repo::{DEFAULT_ARCHIVE_NAME}',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'create') + REPO_ARCHIVE_WITH_PATHS + ('--json',),
output_log_level=None,
output_file=None,
borg_local_path='borg',
working_directory=None,
extra_environment=None,
).and_return('[]')
@ -2066,6 +2054,8 @@ def test_create_archive_with_stats_and_json_calls_borg_without_stats_parameter()
def test_create_archive_with_source_directories_glob_expands():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'food'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -2106,6 +2096,8 @@ def test_create_archive_with_source_directories_glob_expands():
def test_create_archive_with_non_matching_source_directories_glob_passes_through():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo*',))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -2146,6 +2138,8 @@ def test_create_archive_with_non_matching_source_directories_glob_passes_through
def test_create_archive_with_glob_calls_borg_with_expanded_directories():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'food'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -2185,6 +2179,8 @@ def test_create_archive_with_glob_calls_borg_with_expanded_directories():
def test_create_archive_with_archive_name_format_calls_borg_with_archive_name():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -2224,6 +2220,8 @@ def test_create_archive_with_archive_name_format_calls_borg_with_archive_name():
def test_create_archive_with_archive_name_format_accepts_borg_placeholders():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
repository_archive_pattern = 'repo::Documents_{hostname}-{now}'
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -2264,6 +2262,8 @@ def test_create_archive_with_archive_name_format_accepts_borg_placeholders():
def test_create_archive_with_repository_accepts_borg_placeholders():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
repository_archive_pattern = '{fqdn}::Documents_{hostname}-{now}'
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
@ -2304,6 +2304,8 @@ def test_create_archive_with_repository_accepts_borg_placeholders():
def test_create_archive_with_extra_borg_options_calls_borg_with_extra_options():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
flexmock(module).should_receive('map_directories_to_devices').and_return({})
@ -2343,6 +2345,8 @@ def test_create_archive_with_extra_borg_options_calls_borg_with_extra_options():
def test_create_archive_with_stream_processes_calls_borg_with_processes_and_read_special():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
processes = flexmock()
flexmock(module).should_receive('collect_borgmatic_source_directories').and_return([])
flexmock(module).should_receive('deduplicate_directories').and_return(('foo', 'bar'))
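
Most of the hunks above stub borgmatic.logger.add_custom_log_levels and an ANSWER attribute on the logging module, then assert output_log_level=module.borgmatic.logger.ANSWER. As a rough sketch of what registering such a custom level involves, using only the standard library (the level name, numeric value, and final call are assumptions for illustration, not borgmatic's actual implementation):

import logging

ANSWER = 25  # assumed numeric value for illustration; anything between INFO (20) and WARNING (30)

def add_custom_log_levels():
    # Register the level name so log records and handlers can display it.
    logging.addLevelName(ANSWER, 'ANSWER')

    # Give Logger instances an answer() convenience method, mirroring info()/warning().
    def answer(self, message, *args, **kwargs):
        if self.isEnabledFor(ANSWER):
            self._log(ANSWER, message, args, **kwargs)

    logging.Logger.answer = answer

add_custom_log_levels()
logging.getLogger(__name__).answer('borg --stats output would be logged at this level')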


@ -21,6 +21,8 @@ def insert_execute_command_mock(
def test_export_tar_archive_calls_borg_with_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -41,6 +43,8 @@ def test_export_tar_archive_calls_borg_with_path_parameters():
def test_export_tar_archive_calls_borg_with_local_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -62,6 +66,8 @@ def test_export_tar_archive_calls_borg_with_local_path_parameters():
def test_export_tar_archive_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -83,6 +89,8 @@ def test_export_tar_archive_calls_borg_with_remote_path_parameters():
def test_export_tar_archive_calls_borg_with_umask_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -103,6 +111,8 @@ def test_export_tar_archive_calls_borg_with_umask_parameters():
def test_export_tar_archive_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -123,6 +133,8 @@ def test_export_tar_archive_calls_borg_with_lock_wait_parameters():
def test_export_tar_archive_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -142,6 +154,8 @@ def test_export_tar_archive_with_log_info_calls_borg_with_info_parameter():
def test_export_tar_archive_with_log_debug_calls_borg_with_debug_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -163,6 +177,8 @@ def test_export_tar_archive_with_log_debug_calls_borg_with_debug_parameters():
def test_export_tar_archive_calls_borg_with_dry_run_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -181,6 +197,8 @@ def test_export_tar_archive_calls_borg_with_dry_run_parameter():
def test_export_tar_archive_calls_borg_with_tar_filter_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -202,13 +220,15 @@ def test_export_tar_archive_calls_borg_with_tar_filter_parameters():
def test_export_tar_archive_calls_borg_with_list_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--list', 'repo::archive', 'test.tar'),
output_log_level=logging.WARNING,
output_log_level=logging.ANSWER,
)
module.export_tar_archive(
@ -224,6 +244,8 @@ def test_export_tar_archive_calls_borg_with_list_parameter():
def test_export_tar_archive_calls_borg_with_strip_components_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
@ -245,6 +267,8 @@ def test_export_tar_archive_calls_borg_with_strip_components_parameter():
def test_export_tar_archive_skips_abspath_for_remote_repository_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('server:repo::archive',)
)
@ -263,6 +287,8 @@ def test_export_tar_archive_skips_abspath_for_remote_repository_parameter():
def test_export_tar_archive_calls_borg_with_stdout_destination_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)


@ -15,13 +15,6 @@ def insert_execute_command_mock(command, working_directory=None):
).once()
def insert_execute_command_output_mock(command, result):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
command, output_log_level=None, borg_local_path=command[0], extra_environment=None,
).and_return(result).once()
def test_extract_last_archive_dry_run_calls_borg_with_last_archive():
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(('borg', 'extract', '--dry-run', 'repo::archive'))


@ -9,13 +9,15 @@ from ..test_verbosity import insert_logging_mock
def test_display_archives_info_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -29,13 +31,15 @@ def test_display_archives_info_calls_borg_with_parameters():
def test_display_archives_info_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--info', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -49,15 +53,14 @@ def test_display_archives_info_with_log_info_calls_borg_with_info_parameter():
def test_display_archives_info_with_log_info_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--json', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'info', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.INFO)
@ -72,13 +75,15 @@ def test_display_archives_info_with_log_info_and_json_suppresses_most_borg_outpu
def test_display_archives_info_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--debug', '--show-rc', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -93,15 +98,14 @@ def test_display_archives_info_with_log_debug_calls_borg_with_debug_parameter():
def test_display_archives_info_with_log_debug_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--json', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'info', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.DEBUG)
@ -116,15 +120,14 @@ def test_display_archives_info_with_log_debug_and_json_suppresses_most_borg_outp
def test_display_archives_info_with_json_calls_borg_with_json_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--json', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'info', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
json_output = module.display_archives_info(
@ -138,6 +141,8 @@ def test_display_archives_info_with_json_calls_borg_with_json_parameter():
def test_display_archives_info_with_archive_calls_borg_with_match_archives_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'archive'
@ -147,7 +152,7 @@ def test_display_archives_info_with_archive_calls_borg_with_match_archives_param
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--repo', 'repo', '--match-archives', 'archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -161,13 +166,15 @@ def test_display_archives_info_with_archive_calls_borg_with_match_archives_param
def test_display_archives_info_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'info', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg1',
extra_environment=None,
)
@ -182,6 +189,8 @@ def test_display_archives_info_with_local_path_calls_borg_via_local_path():
def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'remote-path', 'borg1'
@ -191,7 +200,7 @@ def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_para
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--remote-path', 'borg1', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -206,6 +215,8 @@ def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_para
def test_display_archives_info_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('lock-wait', 5).and_return(
('--lock-wait', '5')
@ -216,7 +227,7 @@ def test_display_archives_info_with_lock_wait_calls_borg_with_lock_wait_paramete
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--lock-wait', '5', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -230,6 +241,8 @@ def test_display_archives_info_with_lock_wait_calls_borg_with_lock_wait_paramete
def test_display_archives_info_with_prefix_calls_borg_with_match_archives_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'sh:foo*'
@ -239,7 +252,7 @@ def test_display_archives_info_with_prefix_calls_borg_with_match_archives_parame
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--match-archives', 'sh:foo*', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -254,6 +267,8 @@ def test_display_archives_info_with_prefix_calls_borg_with_match_archives_parame
@pytest.mark.parametrize('argument_name', ('match_archives', 'sort_by', 'first', 'last'))
def test_display_archives_info_passes_through_arguments_to_borg(argument_name):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
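# Each parametrized argument name translates to its CLI flag, e.g. 'sort_by' becomes '--sort-by'.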
flag_name = f"--{argument_name.replace('_', ' ')}"
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(
@ -263,7 +278,7 @@ def test_display_archives_info_passes_through_arguments_to_borg(argument_name):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', flag_name, 'value', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)


@ -253,7 +253,23 @@ def test_make_find_paths_adds_globs_to_path_fragments():
assert module.make_find_paths(('foo.txt',)) == ('sh:**/*foo.txt*/**',)
def test_capture_archive_listing_does_not_raise():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').and_return('')
flexmock(module).should_receive('make_list_command')
module.capture_archive_listing(
repository='repo',
archive='archive',
storage_config=flexmock(),
local_borg_version=flexmock(),
)
def test_list_archive_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
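# Replace the logger's answer() method (presumably the convenience method added alongside
# the custom ANSWER level) with a no-op so these list tests need no real log-level setup.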
list_arguments = argparse.Namespace(
archive='archive',
paths=None,
@ -279,7 +295,7 @@ def test_list_archive_calls_borg_with_parameters():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
@ -293,6 +309,9 @@ def test_list_archive_calls_borg_with_parameters():
def test_list_archive_with_archive_and_json_errors():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(archive='archive', paths=None, json=True, find_paths=None)
flexmock(module.feature).should_receive('available').and_return(False)
@ -307,6 +326,9 @@ def test_list_archive_with_archive_and_json_errors():
def test_list_archive_calls_borg_with_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive='archive',
paths=None,
@ -332,7 +354,7 @@ def test_list_archive_calls_borg_with_local_path():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg2', 'list', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg2',
extra_environment=None,
).once()
@ -347,6 +369,9 @@ def test_list_archive_calls_borg_with_local_path():
def test_list_archive_calls_borg_multiple_times_with_find_paths():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
glob_paths = ('**/*foo.txt*/**',)
list_arguments = argparse.Namespace(
archive=None,
@ -361,11 +386,8 @@ def test_list_archive_calls_borg_multiple_times_with_find_paths():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.rlist).should_receive('make_rlist_command').and_return(('borg', 'list', 'repo'))
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list', 'repo'), extra_environment=None,
).and_return('archive1\narchive2').once()
flexmock(module).should_receive('make_list_command').and_return(
('borg', 'list', 'repo::archive1')
@ -374,13 +396,13 @@ def test_list_archive_calls_borg_multiple_times_with_find_paths():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive1') + glob_paths,
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive2') + glob_paths,
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
@ -394,6 +416,9 @@ def test_list_archive_calls_borg_multiple_times_with_find_paths():
def test_list_archive_calls_borg_with_archive():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive='archive',
paths=None,
@ -419,7 +444,7 @@ def test_list_archive_calls_borg_with_archive():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
@ -433,6 +458,9 @@ def test_list_archive_calls_borg_with_archive():
def test_list_archive_without_archive_delegates_to_list_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive=None,
short=None,
@ -460,6 +488,9 @@ def test_list_archive_without_archive_delegates_to_list_repository():
def test_list_archive_with_borg_features_without_archive_delegates_to_list_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive=None,
short=None,
@ -490,6 +521,9 @@ def test_list_archive_with_borg_features_without_archive_delegates_to_list_repos
'archive_filter_flag', ('prefix', 'match_archives', 'sort_by', 'first', 'last',),
)
def test_list_archive_with_archive_ignores_archive_filter_flag(archive_filter_flag,):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
default_filter_flags = {
'prefix': None,
'match_archives': None,
@ -516,7 +550,7 @@ def test_list_archive_with_archive_ignores_archive_filter_flag(archive_filter_fl
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
@ -537,6 +571,9 @@ def test_list_archive_with_archive_ignores_archive_filter_flag(archive_filter_fl
def test_list_archive_with_find_paths_allows_archive_filter_flag_but_only_passes_it_to_rlist(
archive_filter_flag,
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
default_filter_flags = {
'prefix': None,
'match_archives': None,
@ -559,11 +596,8 @@ def test_list_archive_with_find_paths_allows_archive_filter_flag_but_only_passes
remote_path=None,
).and_return(('borg', 'rlist', '--repo', 'repo'))
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rlist', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rlist', '--repo', 'repo'), extra_environment=None,
).and_return('archive1\narchive2').once()
flexmock(module).should_receive('make_list_command').with_args(
@ -606,13 +640,13 @@ def test_list_archive_with_find_paths_allows_archive_filter_flag_but_only_passes
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--repo', 'repo', 'archive1') + glob_paths,
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--repo', 'repo', 'archive2') + glob_paths,
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()


@ -78,6 +78,8 @@ PRUNE_COMMAND = ('borg', 'prune', '--keep-daily', '1', '--keep-weekly', '2', '--
def test_prune_archives_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('repo',), logging.INFO)
@ -92,6 +94,8 @@ def test_prune_archives_calls_borg_with_parameters():
def test_prune_archives_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--info', 'repo'), logging.INFO)
@ -107,6 +111,8 @@ def test_prune_archives_with_log_info_calls_borg_with_info_parameter():
def test_prune_archives_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--debug', '--show-rc', 'repo'), logging.INFO)
@ -122,6 +128,8 @@ def test_prune_archives_with_log_debug_calls_borg_with_debug_parameter():
def test_prune_archives_with_dry_run_calls_borg_with_dry_run_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--dry-run', 'repo'), logging.INFO)
@ -136,6 +144,8 @@ def test_prune_archives_with_dry_run_calls_borg_with_dry_run_parameter():
def test_prune_archives_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg1',) + PRUNE_COMMAND[1:] + ('repo',), logging.INFO)
@ -151,6 +161,8 @@ def test_prune_archives_with_local_path_calls_borg_via_local_path():
def test_prune_archives_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--remote-path', 'borg1', 'repo'), logging.INFO)
@ -165,10 +177,12 @@ def test_prune_archives_with_remote_path_calls_borg_with_remote_path_parameters(
)
def test_prune_archives_with_stats_calls_borg_with_stats_parameter_and_warning_output_log_level():
def test_prune_archives_with_stats_calls_borg_with_stats_parameter_and_answer_output_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--stats', 'repo'), logging.WARNING)
insert_execute_command_mock(PRUNE_COMMAND + ('--stats', 'repo'), module.borgmatic.logger.ANSWER)
module.prune_archives(
dry_run=False,
@ -180,42 +194,12 @@ def test_prune_archives_with_stats_calls_borg_with_stats_parameter_and_warning_o
)
def test_prune_archives_with_stats_and_log_info_calls_borg_with_stats_parameter_and_info_output_log_level():
def test_prune_archives_with_files_calls_borg_with_list_parameter_and_answer_output_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_logging_mock(logging.INFO)
insert_execute_command_mock(PRUNE_COMMAND + ('--stats', '--info', 'repo'), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=flexmock(),
local_borg_version='1.2.3',
stats=True,
)
def test_prune_archives_with_files_calls_borg_with_list_parameter_and_warning_output_log_level():
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--list', 'repo'), logging.WARNING)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=flexmock(),
local_borg_version='1.2.3',
list_archives=True,
)
def test_prune_archives_with_files_and_log_info_calls_borg_with_list_parameter_and_info_output_log_level():
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_logging_mock(logging.INFO)
insert_execute_command_mock(PRUNE_COMMAND + ('--info', '--list', 'repo'), logging.INFO)
insert_execute_command_mock(PRUNE_COMMAND + ('--list', 'repo'), module.borgmatic.logger.ANSWER)
module.prune_archives(
dry_run=False,
@ -228,6 +212,8 @@ def test_prune_archives_with_files_and_log_info_calls_borg_with_list_parameter_a
def test_prune_archives_with_umask_calls_borg_with_umask_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'umask': '077'}
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
@ -243,6 +229,8 @@ def test_prune_archives_with_umask_calls_borg_with_umask_parameters():
def test_prune_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'lock_wait': 5}
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
@ -258,6 +246,8 @@ def test_prune_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
def test_prune_archives_with_extra_borg_options_calls_borg_with_extra_options():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--extra', '--options', 'repo'), logging.INFO)
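The prune tests above expect --stats and --list output at a custom ANSWER log level rather than WARNING. As a rough sketch, assuming a level value between INFO and WARNING (the exact number is a guess, not borgmatic's verbatim code), the add_custom_log_levels()/logger.ANSWER pattern these tests mock could be registered with the standard logging module like this:

import logging

# Assumed value: one notch below WARNING, so ANSWER output is shown by default
# but can still be told apart from real warnings.
ANSWER = logging.WARNING - 5

def add_custom_log_levels():
    # Register the level name and expose it as logging.ANSWER, matching how the
    # tests reference module.borgmatic.logger.ANSWER and patch logging.ANSWER.
    logging.addLevelName(ANSWER, 'ANSWER')
    setattr(logging, 'ANSWER', ANSWER)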

View File

@ -8,12 +8,14 @@ from ..test_verbosity import insert_logging_mock
def test_display_repository_info_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -27,12 +29,14 @@ def test_display_repository_info_calls_borg_with_parameters():
def test_display_repository_info_without_borg_features_calls_borg_with_info_sub_command():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -46,12 +50,14 @@ def test_display_repository_info_without_borg_features_calls_borg_with_info_sub_
def test_display_repository_info_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--info', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -65,14 +71,13 @@ def test_display_repository_info_with_log_info_calls_borg_with_info_parameter():
def test_display_repository_info_with_log_info_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.INFO)
@ -87,12 +92,14 @@ def test_display_repository_info_with_log_info_and_json_suppresses_most_borg_out
def test_display_repository_info_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--debug', '--show-rc', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -107,14 +114,13 @@ def test_display_repository_info_with_log_debug_calls_borg_with_debug_parameter(
def test_display_repository_info_with_log_debug_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.DEBUG)
@ -129,14 +135,13 @@ def test_display_repository_info_with_log_debug_and_json_suppresses_most_borg_ou
def test_display_repository_info_with_json_calls_borg_with_json_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
json_output = module.display_repository_info(
@ -150,12 +155,14 @@ def test_display_repository_info_with_json_calls_borg_with_json_parameter():
def test_display_repository_info_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'rinfo', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg1',
extra_environment=None,
)
@ -170,12 +177,14 @@ def test_display_repository_info_with_local_path_calls_borg_via_local_path():
def test_display_repository_info_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--remote-path', 'borg1', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -190,13 +199,15 @@ def test_display_repository_info_with_remote_path_calls_borg_with_remote_path_pa
def test_display_repository_info_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'lock_wait': 5}
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--lock-wait', '5', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
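Several of the rinfo tests above switch the JSON code paths from execute_command(..., output_log_level=None) to execute_command_and_capture_output(). A minimal sketch of what such a helper could look like, assuming plain subprocess semantics (the real borgmatic helper presumably adds error handling and logging), is:

import os
import subprocess

def execute_command_and_capture_output(full_command, extra_environment=None):
    # Run the command and return its stdout as text, layering any extra environment
    # variables over the current environment. A simplification for illustration only.
    environment = {**os.environ, **extra_environment} if extra_environment else None
    return subprocess.check_output(full_command, env=environment).decode()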

View File

@ -28,11 +28,8 @@ def test_resolve_archive_name_passes_through_non_latest_archive_name():
def test_resolve_archive_name_calls_borg_with_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
assert (
@ -44,11 +41,8 @@ def test_resolve_archive_name_calls_borg_with_parameters():
def test_resolve_archive_name_with_log_info_calls_borg_without_info_parameter():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
insert_logging_mock(logging.INFO)
@ -61,11 +55,8 @@ def test_resolve_archive_name_with_log_info_calls_borg_without_info_parameter():
def test_resolve_archive_name_with_log_debug_calls_borg_without_debug_parameter():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
insert_logging_mock(logging.DEBUG)
@ -78,11 +69,8 @@ def test_resolve_archive_name_with_log_debug_calls_borg_without_debug_parameter(
def test_resolve_archive_name_with_local_path_calls_borg_via_local_path():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg1',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg1', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
assert (
@ -96,10 +84,8 @@ def test_resolve_archive_name_with_local_path_calls_borg_via_local_path():
def test_resolve_archive_name_with_remote_path_calls_borg_with_remote_path_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list', '--remote-path', 'borg1') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
@ -113,11 +99,8 @@ def test_resolve_archive_name_with_remote_path_calls_borg_with_remote_path_param
def test_resolve_archive_name_without_archives_raises():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return('')
with pytest.raises(ValueError):
@ -128,10 +111,8 @@ def test_resolve_archive_name_with_lock_wait_calls_borg_with_lock_wait_parameter
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list', '--lock-wait', 'okay') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
@ -344,6 +325,8 @@ def test_make_rlist_command_includes_additional_flags(argument_name):
def test_list_repository_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
rlist_arguments = argparse.Namespace(json=False)
flexmock(module.feature).should_receive('available').and_return(False)
@ -358,7 +341,7 @@ def test_list_repository_calls_borg_with_parameters():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rlist', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
@ -372,6 +355,8 @@ def test_list_repository_calls_borg_with_parameters():
def test_list_repository_with_json_returns_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
rlist_arguments = argparse.Namespace(json=True)
json_output = flexmock()
@ -385,7 +370,7 @@ def test_list_repository_with_json_returns_borg_output():
remote_path=None,
).and_return(('borg', 'rlist', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').and_return(json_output)
flexmock(module).should_receive('execute_command_and_capture_output').and_return(json_output)
assert (
module.list_repository(
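The resolve_archive_name tests above now capture the borg list output directly and expect the trailing newline to be stripped, raising ValueError when nothing comes back. A small illustrative helper for that post-processing step (hypothetical, not borgmatic's actual function) could be:

def archive_name_from_list_output(output):
    # The mocked capture call returns 'archive-name\n'; an empty capture means the
    # repository has no archives, which the tests expect to raise.
    archive_name = output.strip('\n')
    if not archive_name:
        raise ValueError('No archives found in the repository')
    return archive_name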

View File

@ -9,13 +9,15 @@ from ..test_verbosity import insert_logging_mock
def test_transfer_archives_calls_borg_with_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -30,6 +32,8 @@ def test_transfer_archives_calls_borg_with_flags():
def test_transfer_archives_with_dry_run_calls_borg_with_dry_run_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('dry-run', True).and_return(
('--dry-run',)
@ -39,7 +43,7 @@ def test_transfer_archives_with_dry_run_calls_borg_with_dry_run_flag():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--repo', 'repo', '--dry-run'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -54,13 +58,15 @@ def test_transfer_archives_with_dry_run_calls_borg_with_dry_run_flag():
def test_transfer_archives_with_log_info_calls_borg_with_info_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--info', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -75,13 +81,15 @@ def test_transfer_archives_with_log_info_calls_borg_with_info_flag():
def test_transfer_archives_with_log_debug_calls_borg_with_debug_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--debug', '--show-rc', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -97,6 +105,8 @@ def test_transfer_archives_with_log_debug_calls_borg_with_debug_flag():
def test_transfer_archives_with_archive_calls_borg_with_match_archives_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'archive'
@ -106,7 +116,7 @@ def test_transfer_archives_with_archive_calls_borg_with_match_archives_flag():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--match-archives', 'archive', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -121,6 +131,8 @@ def test_transfer_archives_with_archive_calls_borg_with_match_archives_flag():
def test_transfer_archives_with_match_archives_calls_borg_with_match_archives_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'sh:foo*'
@ -130,7 +142,7 @@ def test_transfer_archives_with_match_archives_calls_borg_with_match_archives_fl
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--match-archives', 'sh:foo*', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -145,13 +157,15 @@ def test_transfer_archives_with_match_archives_calls_borg_with_match_archives_fl
def test_transfer_archives_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg2', 'transfer', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg2',
extra_environment=None,
)
@ -167,6 +181,8 @@ def test_transfer_archives_with_local_path_calls_borg_via_local_path():
def test_transfer_archives_with_remote_path_calls_borg_with_remote_path_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'remote-path', 'borg2'
@ -176,7 +192,7 @@ def test_transfer_archives_with_remote_path_calls_borg_with_remote_path_flags():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--remote-path', 'borg2', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -192,6 +208,8 @@ def test_transfer_archives_with_remote_path_calls_borg_with_remote_path_flags():
def test_transfer_archives_with_lock_wait_calls_borg_with_lock_wait_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('lock-wait', 5).and_return(
('--lock-wait', '5')
@ -202,7 +220,7 @@ def test_transfer_archives_with_lock_wait_calls_borg_with_lock_wait_flags():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--lock-wait', '5', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -218,6 +236,8 @@ def test_transfer_archives_with_lock_wait_calls_borg_with_lock_wait_flags():
@pytest.mark.parametrize('argument_name', ('upgrader', 'sort_by', 'first', 'last'))
def test_transfer_archives_passes_through_arguments_to_borg(argument_name):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flag_name = f"--{argument_name.replace('_', ' ')}"
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(
@ -227,7 +247,7 @@ def test_transfer_archives_passes_through_arguments_to_borg(argument_name):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', flag_name, 'value', '--repo', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -244,6 +264,7 @@ def test_transfer_archives_passes_through_arguments_to_borg(argument_name):
def test_transfer_archives_with_source_repository_calls_borg_with_other_repo_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('other-repo', 'other').and_return(
('--other-repo', 'other')
@ -253,7 +274,7 @@ def test_transfer_archives_with_source_repository_calls_borg_with_other_repo_fla
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--repo', 'repo', '--other-repo', 'other'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
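The transfer tests stub flags.make_flags() with pairs such as ('dry-run', True) -> ('--dry-run',) and ('lock-wait', 5) -> ('--lock-wait', '5'). A minimal helper consistent with those stubbed return values (a sketch, not necessarily borgmatic's real implementation) might look like:

def make_flags(name, value):
    # No flag for falsy values, a bare flag for booleans, and a (--name, value) pair
    # otherwise, converting any underscores in the name to dashes.
    if not value:
        return ()
    flag = f"--{name.replace('_', '-')}"
    return (flag,) if value is True else (flag, str(value))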

View File

@ -10,22 +10,24 @@ from ..test_verbosity import insert_logging_mock
VERSION = '1.2.3'
def insert_execute_command_mock(command, borg_local_path='borg', version_output=f'borg {VERSION}'):
def insert_execute_command_and_capture_output_mock(
command, borg_local_path='borg', version_output=f'borg {VERSION}'
):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
command, output_log_level=None, borg_local_path=borg_local_path, extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
command, extra_environment=None,
).once().and_return(version_output)
def test_local_borg_version_calls_borg_with_required_parameters():
insert_execute_command_mock(('borg', '--version'))
insert_execute_command_and_capture_output_mock(('borg', '--version'))
flexmock(module.environment).should_receive('make_environment')
assert module.local_borg_version({}) == VERSION
def test_local_borg_version_with_log_info_calls_borg_with_info_parameter():
insert_execute_command_mock(('borg', '--version', '--info'))
insert_execute_command_and_capture_output_mock(('borg', '--version', '--info'))
insert_logging_mock(logging.INFO)
flexmock(module.environment).should_receive('make_environment')
@ -33,7 +35,7 @@ def test_local_borg_version_with_log_info_calls_borg_with_info_parameter():
def test_local_borg_version_with_log_debug_calls_borg_with_debug_parameters():
insert_execute_command_mock(('borg', '--version', '--debug', '--show-rc'))
insert_execute_command_and_capture_output_mock(('borg', '--version', '--debug', '--show-rc'))
insert_logging_mock(logging.DEBUG)
flexmock(module.environment).should_receive('make_environment')
@ -41,14 +43,14 @@ def test_local_borg_version_with_log_debug_calls_borg_with_debug_parameters():
def test_local_borg_version_with_local_borg_path_calls_borg_with_it():
insert_execute_command_mock(('borg1', '--version'), borg_local_path='borg1')
insert_execute_command_and_capture_output_mock(('borg1', '--version'), borg_local_path='borg1')
flexmock(module.environment).should_receive('make_environment')
assert module.local_borg_version({}, 'borg1') == VERSION
def test_local_borg_version_with_invalid_version_raises():
insert_execute_command_mock(('borg', '--version'), version_output='wtf')
insert_execute_command_and_capture_output_mock(('borg', '--version'), version_output='wtf')
flexmock(module.environment).should_receive('make_environment')
with pytest.raises(ValueError):
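The version tests above feed 'borg 1.2.3' into the capture mock and expect '1.2.3' back, with a ValueError for unparseable output like 'wtf'. A sketch of that parsing step (hypothetical helper name) could be:

import re

def parse_version_output(version_output):
    # Pull the first dotted version number out of output like 'borg 1.2.3'.
    match = re.search(r'\d+\.\d+\.\d+\S*', version_output)
    if not match:
        raise ValueError(f'Could not parse Borg version from: {version_output!r}')
    return match.group(0)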

View File

@ -9,6 +9,7 @@ from borgmatic.commands import borgmatic as module
def test_run_configuration_runs_actions_for_each_repository():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
expected_results = [flexmock(), flexmock()]
flexmock(module).should_receive('run_actions').and_return(expected_results[:1]).and_return(
@ -23,6 +24,7 @@ def test_run_configuration_runs_actions_for_each_repository():
def test_run_configuration_with_invalid_borg_version_errors():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_raise(ValueError)
flexmock(module.command).should_receive('execute_hook').never()
flexmock(module.dispatch).should_receive('call_hooks').never()
@ -34,6 +36,7 @@ def test_run_configuration_with_invalid_borg_version_errors():
def test_run_configuration_logs_monitor_start_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_raise(OSError).and_return(
None
@ -50,6 +53,7 @@ def test_run_configuration_logs_monitor_start_error():
def test_run_configuration_bails_for_monitor_start_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.dispatch).should_receive('call_hooks').and_raise(error)
@ -64,6 +68,7 @@ def test_run_configuration_bails_for_monitor_start_soft_failure():
def test_run_configuration_logs_actions_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module.dispatch).should_receive('call_hooks')
@ -79,6 +84,7 @@ def test_run_configuration_logs_actions_error():
def test_run_configuration_bails_for_actions_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks')
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
@ -94,6 +100,7 @@ def test_run_configuration_bails_for_actions_soft_failure():
def test_run_configuration_logs_monitor_finish_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
None
@ -110,6 +117,7 @@ def test_run_configuration_logs_monitor_finish_error():
def test_run_configuration_bails_for_monitor_finish_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
@ -127,6 +135,7 @@ def test_run_configuration_bails_for_monitor_finish_soft_failure():
def test_run_configuration_logs_on_error_hook_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook').and_raise(OSError)
expected_results = [flexmock(), flexmock()]
@ -143,6 +152,7 @@ def test_run_configuration_logs_on_error_hook_error():
def test_run_configuration_bails_for_on_error_hook_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.command).should_receive('execute_hook').and_raise(error)
@ -159,6 +169,7 @@ def test_run_configuration_bails_for_on_error_hook_soft_failure():
def test_run_configuration_retries_soft_error():
# Run action first fails, second passes
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).and_return([])
@ -171,6 +182,7 @@ def test_run_configuration_retries_soft_error():
def test_run_configuration_retries_hard_error():
# Run action fails twice
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(2)
@ -190,7 +202,8 @@ def test_run_configuration_retries_hard_error():
assert results == error_logs
def test_run_repos_ordered():
def test_run_configuration_repos_ordered():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(2)
@ -208,6 +221,7 @@ def test_run_repos_ordered():
def test_run_configuration_retries_round_robbin():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(4)
@ -238,6 +252,7 @@ def test_run_configuration_retries_round_robbin():
def test_run_configuration_retries_one_passes():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).and_raise(OSError).and_return(
@ -266,6 +281,7 @@ def test_run_configuration_retries_one_passes():
def test_run_configuration_retry_wait():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(4)
@ -304,6 +320,7 @@ def test_run_configuration_retry_wait():
def test_run_configuration_retries_timeout_multiple_repos():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).and_raise(OSError).and_return(
@ -340,425 +357,387 @@ def test_run_configuration_retries_timeout_multiple_repos():
assert results == error_logs
def test_run_actions_does_not_raise_for_rcreate_action():
flexmock(module.borg_rcreate).should_receive('create_repository')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'rcreate': flexmock(
encryption_mode=flexmock(),
source_repository=flexmock(),
copy_crypt_key=flexmock(),
append_only=flexmock(),
storage_quota=flexmock(),
make_parent_dirs=flexmock(),
),
}
def test_run_actions_runs_rcreate():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.rcreate).should_receive('run_rcreate').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'rcreate': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_transfer_action():
flexmock(module.borg_transfer).should_receive('transfer_archives')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'transfer': flexmock(),
}
def test_run_actions_runs_transfer():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.transfer).should_receive('run_transfer').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'transfer': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_calls_hooks_for_prune_action():
flexmock(module.borg_prune).should_receive('prune_archives')
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'prune': flexmock(stats=flexmock(), list_archives=flexmock()),
}
def test_run_actions_runs_prune():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.prune).should_receive('run_prune').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'prune': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_calls_hooks_for_compact_action():
flexmock(module.borg_feature).should_receive('available').and_return(True)
flexmock(module.borg_compact).should_receive('compact_segments')
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'compact': flexmock(progress=flexmock(), cleanup_commits=flexmock(), threshold=flexmock()),
}
def test_run_actions_runs_compact():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.compact).should_receive('run_compact').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'compact': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_executes_and_calls_hooks_for_create_action():
flexmock(module.borg_create).should_receive('create_archive')
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
flexmock(module.dispatch).should_receive('call_hooks').and_return({})
flexmock(module.dispatch).should_receive('call_hooks_even_if_unconfigured').and_return({})
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'create': flexmock(
progress=flexmock(), stats=flexmock(), json=flexmock(), list_files=flexmock()
),
}
def test_run_actions_runs_create():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.create).should_receive('run_create').and_yield(expected).once()
list(
result = tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'create': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_calls_hooks_for_check_action():
def test_run_actions_runs_check_when_repository_enabled_for_checks():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(module.checks).should_receive('repository_enabled_for_checks').and_return(True)
flexmock(module.borg_check).should_receive('check_archives')
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'check': flexmock(
progress=flexmock(), repair=flexmock(), only=flexmock(), force=flexmock()
),
}
flexmock(borgmatic.actions.check).should_receive('run_check').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'check': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_calls_hooks_for_extract_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_extract).should_receive('extract_archive')
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'extract': flexmock(
paths=flexmock(),
progress=flexmock(),
destination=flexmock(),
strip_components=flexmock(),
archive=flexmock(),
repository='repo',
),
}
def test_run_actions_skips_check_when_repository_not_enabled_for_checks():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(module.checks).should_receive('repository_enabled_for_checks').and_return(False)
flexmock(borgmatic.actions.check).should_receive('run_check').never()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'check': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_export_tar_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_export_tar).should_receive('export_tar_archive')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'export-tar': flexmock(
repository=flexmock(),
archive=flexmock(),
paths=flexmock(),
destination=flexmock(),
tar_filter=flexmock(),
list_files=flexmock(),
strip_components=flexmock(),
),
}
def test_run_actions_runs_extract():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.extract).should_receive('run_extract').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'extract': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_mount_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_mount).should_receive('mount_archive')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'mount': flexmock(
repository=flexmock(),
archive=flexmock(),
mount_point=flexmock(),
paths=flexmock(),
foreground=flexmock(),
options=flexmock(),
),
}
def test_run_actions_runs_export_tar():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.export_tar).should_receive('run_export_tar').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'export-tar': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_rlist_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('list_repository')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'rlist': flexmock(repository=flexmock(), json=flexmock()),
}
def test_run_actions_runs_mount():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.mount).should_receive('run_mount').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'mount': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_list_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_list).should_receive('list_archive')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'list': flexmock(repository=flexmock(), archive=flexmock(), json=flexmock()),
}
def test_run_actions_runs_restore():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.restore).should_receive('run_restore').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'restore': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_rinfo_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rinfo).should_receive('display_repository_info')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'rinfo': flexmock(repository=flexmock(), json=flexmock()),
}
def test_run_actions_runs_rlist():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.rlist).should_receive('run_rlist').and_yield(expected).once()
list(
result = tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'rlist': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_runs_list():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.list).should_receive('run_list').and_yield(expected).once()
result = tuple(
module.run_actions(
arguments={'global': flexmock(dry_run=False), 'list': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_runs_rinfo():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.rinfo).should_receive('run_rinfo').and_yield(expected).once()
result = tuple(
module.run_actions(
arguments={'global': flexmock(dry_run=False), 'rinfo': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_runs_info():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
expected = flexmock()
flexmock(borgmatic.actions.info).should_receive('run_info').and_yield(expected).once()
result = tuple(
module.run_actions(
arguments={'global': flexmock(dry_run=False), 'info': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
assert result == (expected,)
def test_run_actions_runs_break_lock():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.break_lock).should_receive('run_break_lock').once()
tuple(
module.run_actions(
arguments={'global': flexmock(dry_run=False), 'break-lock': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_info_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_info).should_receive('display_archives_info')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'info': flexmock(repository=flexmock(), archive=flexmock(), json=flexmock()),
}
def test_run_actions_runs_borg():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.command).should_receive('execute_hook')
flexmock(borgmatic.actions.borg).should_receive('run_borg').once()
list(
tuple(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
arguments={'global': flexmock(dry_run=False), 'borg': flexmock()},
config_filename=flexmock(),
location={'repositories': []},
storage=flexmock(),
retention=flexmock(),
consistency=flexmock(),
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_break_lock_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_break_lock).should_receive('break_lock')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'break-lock': flexmock(repository=flexmock()),
}
list(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_borg_action():
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_borg).should_receive('run_arbitrary_borg')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'borg': flexmock(repository=flexmock(), archive=flexmock(), options=flexmock()),
}
list(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
local_path=flexmock(),
remote_path=flexmock(),
local_borg_version=flexmock(),
repository_path='repo',
)
)
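The rewritten run_actions tests above delegate each action to a borgmatic.actions.<name>.run_<name>() function and collect whatever the generator-style runners yield (rcreate and friends run for side effects; create, rlist, list, rinfo, and info yield results). An illustrative, self-contained toy of that dispatch shape, not borgmatic's actual run_actions(), is:

def run_actions(arguments, action_runners):
    # Walk the parsed action arguments, hand each one to its runner, and yield any
    # results the runner produces; side-effect-only actions return None.
    for action_name, action_arguments in arguments.items():
        if action_name == 'global':
            continue
        runner = action_runners.get(action_name)
        if runner is None:
            continue
        results = runner(action_arguments)
        if results is not None:
            yield from results

For example, tuple(run_actions({'global': None, 'rlist': {}}, {'rlist': lambda args: ['{}']})) produces ('{}',).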

View File

@ -21,11 +21,13 @@ from borgmatic.config import normalize as module
{'location': {'source_directories': ['foo', 'bar']}},
False,
),
({'location': None}, {'location': None}, False,),
(
{'storage': {'compression': 'yes_please'}},
{'storage': {'compression': 'yes_please'}},
False,
),
({'storage': None}, {'storage': None}, False,),
(
{'hooks': {'healthchecks': 'https://example.com'}},
{'hooks': {'healthchecks': {'ping_url': 'https://example.com'}}},
@ -46,11 +48,18 @@ from borgmatic.config import normalize as module
{'hooks': {'cronhub': {'ping_url': 'https://example.com'}}},
False,
),
({'hooks': None}, {'hooks': None}, False,),
(
{'consistency': {'checks': ['archives']}},
{'consistency': {'checks': [{'name': 'archives'}]}},
False,
),
(
{'consistency': {'checks': ['archives']}},
{'consistency': {'checks': [{'name': 'archives'}]}},
False,
),
({'consistency': None}, {'consistency': None}, False,),
({'location': {'numeric_owner': False}}, {'location': {'numeric_ids': False}}, False,),
({'location': {'bsd_flags': False}}, {'location': {'flags': False}}, False,),
(
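The added parametrize cases in this normalize test (the None-valued sections and the legacy option renames just above) suggest normalization that leaves empty sections alone and maps old option names to new ones. A hypothetical sketch of that guard, not the real normalize() code:

def normalize_location(config):
    # Pass a missing or None 'location' section through untouched; otherwise rename
    # legacy options, e.g. numeric_owner -> numeric_ids and bsd_flags -> flags.
    location = config.get('location')
    if not location:
        return config
    if 'numeric_owner' in location:
        location['numeric_ids'] = location.pop('numeric_owner')
    if 'bsd_flags' in location:
        location['flags'] = location.pop('bsd_flags')
    return config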

View File

@ -256,6 +256,24 @@ def test_restore_database_dump_runs_mongorestore_with_username_and_password():
)
def test_restore_database_dump_runs_mongorestore_with_options():
database_config = [{'name': 'foo', 'restore_options': '--harder'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('execute_command_with_processes').with_args(
['mongorestore', '--archive', '--drop', '--db', 'foo', '--harder'],
processes=[extract_process],
output_log_level=logging.DEBUG,
input_file=extract_process.stdout,
).once()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
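The restore_options test above expects the configured '--harder' to be appended to the base mongorestore invocation. One way to build such a command (the shlex tokenization is an assumption for illustration, not necessarily how borgmatic does it):

import shlex

def build_mongorestore_command(database):
    # Start from the fixed invocation the test expects, then append any configured
    # restore_options split into individual arguments.
    command = ['mongorestore', '--archive', '--drop', '--db', database['name']]
    command.extend(shlex.split(database.get('restore_options', '')))
    return command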
def test_restore_database_dump_runs_psql_for_all_database_dump():
database_config = [{'name': 'all'}]
extract_process = flexmock(stdout=flexmock())

View File

@ -22,9 +22,8 @@ def test_database_names_to_dump_queries_mysql_for_database_names():
extra_environment = flexmock()
log_prefix = ''
dry_run_label = ''
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('mysql', '--skip-column-names', '--batch', '--execute', 'show schemas'),
output_log_level=None,
extra_environment=extra_environment,
).and_return('foo\nbar\nmysql\n').once()
@ -35,59 +34,135 @@ def test_database_names_to_dump_queries_mysql_for_database_names():
assert names == ('foo', 'bar')
def test_dump_databases_runs_mysqldump_for_each_database():
def test_dump_databases_dumps_each_database():
databases = [{'name': 'foo'}, {'name': 'bar'}]
processes = [flexmock(), flexmock()]
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
for name, process in zip(('foo', 'bar'), processes):
flexmock(module).should_receive('execute_command').with_args(
(
'mysqldump',
'--add-drop-database',
'--databases',
name,
'>',
'databases/localhost/{}'.format(name),
),
shell=True,
extra_environment=None,
run_to_completion=False,
flexmock(module).should_receive('execute_dump_command').with_args(
database={'name': name},
log_prefix=object,
dump_path=object,
database_names=(name,),
extra_environment=object,
dry_run=object,
dry_run_label=object,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
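These expectations describe dump_databases as a loop that resolves the names to dump for each configured database and delegates each one to execute_dump_command. Below is a self-contained sketch of that shape, with stand-in helpers defined locally so the example runs; none of this is borgmatic's code.

# Sketch of the per-database delegation the test asserts on.
def database_names_to_dump(database):
    return (database['name'],)


def execute_dump_command(database, database_names):
    return f'process for {database_names}'


def dump_databases(databases, dry_run=False):
    processes = []
    for database in databases:
        names = database_names_to_dump(database)
        if dry_run:
            continue  # a dry run resolves names but starts no dump processes
        processes.append(execute_dump_command(database, names))
    return processes


print(dump_databases([{'name': 'foo'}, {'name': 'bar'}]))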
def test_dump_databases_with_dry_run_skips_mysqldump():
databases = [{'name': 'foo'}, {'name': 'bar'}]
def test_dump_databases_dumps_with_password():
database = {'name': 'foo', 'username': 'root', 'password': 'trustsome1'}
process = flexmock()
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('execute_command').never()
module.dump_databases(databases, 'test.yaml', {}, dry_run=True)
flexmock(module).should_receive('execute_dump_command').with_args(
database=database,
log_prefix=object,
dump_path=object,
database_names=('foo',),
extra_environment={'MYSQL_PWD': 'trustsome1'},
dry_run=object,
dry_run_label=object,
).and_return(process).once()
assert module.dump_databases([database], 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_runs_mysqldump_with_hostname_and_port():
databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
def test_dump_databases_dumps_all_databases_at_once():
databases = [{'name': 'all'}]
process = flexmock()
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/database.example.org/foo'
flexmock(module).should_receive('database_names_to_dump').and_return(('foo', 'bar'))
flexmock(module).should_receive('execute_dump_command').with_args(
database={'name': 'all'},
log_prefix=object,
dump_path=object,
database_names=('foo', 'bar'),
extra_environment=object,
dry_run=object,
dry_run_label=object,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_dumps_all_databases_separately_when_format_configured():
databases = [{'name': 'all', 'format': 'sql'}]
processes = [flexmock(), flexmock()]
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo', 'bar'))
for name, process in zip(('foo', 'bar'), processes):
flexmock(module).should_receive('execute_dump_command').with_args(
database={'name': name, 'format': 'sql'},
log_prefix=object,
dump_path=object,
database_names=(name,),
extra_environment=object,
dry_run=object,
dry_run_label=object,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
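Taken together, the two "all" cases pin down the behavior added for #438/#560: without a format, "all" is dumped in one pass; with a format configured, the server's database list is expanded and each database is dumped separately. A hedged sketch of that branching follows; the helper name and shape are assumptions.

# Assumed branching for the "all" case, mirroring the two tests above.
def expand_all_databases(database, names_on_server):
    # names_on_server stands in for what database_names_to_dump() would return.
    if database['name'] == 'all' and 'format' in database:
        # Dump each database separately so the configured format can apply.
        return [dict(database, name=name) for name in names_on_server]
    # Otherwise keep a single entry (a named database, or "all" at once).
    return [database]


print(expand_all_databases({'name': 'all', 'format': 'sql'}, ('foo', 'bar')))
print(expand_all_databases({'name': 'all'}, ('foo', 'bar')))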
def test_database_names_to_dump_runs_mysql_with_list_options():
database = {'name': 'all', 'list_options': '--defaults-extra-file=my.cnf'}
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
(
'mysql',
'--defaults-extra-file=my.cnf',
'--skip-column-names',
'--batch',
'--execute',
'show schemas',
),
extra_environment=None,
).and_return(('foo\nbar')).once()
assert module.database_names_to_dump(database, None, 'test.yaml', '') == ('foo', 'bar')
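This case shows list_options being spliced into the mysql invocation used to enumerate schemas. A rough sketch of just that command assembly; splitting the option string on spaces is an assumption.

# Sketch of the "show schemas" command assembly implied by the expectation above.
def build_list_command(database):
    list_options = database.get('list_options')
    return (
        ('mysql',)
        + (tuple(list_options.split(' ')) if list_options else ())
        + ('--skip-column-names', '--batch', '--execute', 'show schemas')
    )


assert build_list_command({'name': 'all', 'list_options': '--defaults-extra-file=my.cnf'}) == (
    'mysql',
    '--defaults-extra-file=my.cnf',
    '--skip-column-names',
    '--batch',
    '--execute',
    'show schemas',
)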
def test_execute_dump_command_runs_mysqldump():
process = flexmock()
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('dump')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
('mysqldump', '--add-drop-database', '--databases', 'foo', '>', 'dump',),
shell=True,
extra_environment=None,
run_to_completion=False,
).and_return(process).once()
assert (
module.execute_dump_command(
database={'name': 'foo'},
log_prefix='log',
dump_path=flexmock(),
database_names=('foo',),
extra_environment=None,
dry_run=False,
dry_run_label='',
)
== process
)
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
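Across these execute_dump_command tests, the expected mysqldump tuple follows one pattern: optional extra options, --add-drop-database, any user or host flags, then --databases with the names and a shell redirect to the dump path. The following is a hedged reconstruction of that command assembly, not a copy of borgmatic's implementation.

# Assumed mysqldump command builder consistent with the with_args() tuples in
# the surrounding tests; flag ordering and defaults are assumptions.
def build_mysqldump_command(database, database_names, dump_filename):
    options = database.get('options')
    command = (
        ('mysqldump',)
        + (tuple(options.split(' ')) if options else ())
        + ('--add-drop-database',)
    )
    if 'hostname' in database:
        command += ('--host', database['hostname'], '--port', str(database.get('port', 3306)))
    if 'username' in database:
        command += ('--user', database['username'])
    return command + ('--databases',) + tuple(database_names) + ('>', dump_filename)


print(build_mysqldump_command({'name': 'foo', 'username': 'root'}, ('foo',), 'dump'))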
def test_execute_dump_command_runs_mysqldump_with_hostname_and_port():
process = flexmock()
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('dump')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
@@ -103,117 +178,120 @@ def test_dump_databases_runs_mysqldump_with_hostname_and_port():
'--databases',
'foo',
'>',
'databases/database.example.org/foo',
'dump',
),
shell=True,
extra_environment=None,
run_to_completion=False,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_runs_mysqldump_with_username_and_password():
databases = [{'name': 'foo', 'username': 'root', 'password': 'trustsome1'}]
process = flexmock()
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
assert (
module.execute_dump_command(
database={'name': 'foo', 'hostname': 'database.example.org', 'port': 5433},
log_prefix='log',
dump_path=flexmock(),
database_names=('foo',),
extra_environment=None,
dry_run=False,
dry_run_label='',
)
== process
)
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
def test_execute_dump_command_runs_mysqldump_with_username_and_password():
process = flexmock()
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('dump')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
(
'mysqldump',
'--add-drop-database',
'--user',
'root',
'--databases',
'foo',
'>',
'databases/localhost/foo',
),
('mysqldump', '--add-drop-database', '--user', 'root', '--databases', 'foo', '>', 'dump',),
shell=True,
extra_environment={'MYSQL_PWD': 'trustsome1'},
run_to_completion=False,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_runs_mysqldump_with_options():
databases = [{'name': 'foo', 'options': '--stuff=such'}]
process = flexmock()
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
assert (
module.execute_dump_command(
database={'name': 'foo', 'username': 'root', 'password': 'trustsome1'},
log_prefix='log',
dump_path=flexmock(),
database_names=('foo',),
extra_environment={'MYSQL_PWD': 'trustsome1'},
dry_run=False,
dry_run_label='',
)
== process
)
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
def test_execute_dump_command_runs_mysqldump_with_options():
process = flexmock()
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('dump')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
(
'mysqldump',
'--stuff=such',
'--add-drop-database',
'--databases',
'foo',
'>',
'databases/localhost/foo',
),
('mysqldump', '--stuff=such', '--add-drop-database', '--databases', 'foo', '>', 'dump',),
shell=True,
extra_environment=None,
run_to_completion=False,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_runs_mysqldump_for_all_databases():
databases = [{'name': 'all'}]
process = flexmock()
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/all'
assert (
module.execute_dump_command(
database={'name': 'foo', 'options': '--stuff=such'},
log_prefix='log',
dump_path=flexmock(),
database_names=('foo',),
extra_environment=None,
dry_run=False,
dry_run_label='',
)
== process
)
flexmock(module).should_receive('database_names_to_dump').and_return(('foo', 'bar'))
def test_execute_dump_command_with_duplicate_dump_skips_mysqldump():
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('dump')
flexmock(module.os.path).should_receive('exists').and_return(True)
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('execute_command').never()
assert (
module.execute_dump_command(
database={'name': 'foo'},
log_prefix='log',
dump_path=flexmock(),
database_names=('foo',),
extra_environment=None,
dry_run=True,
dry_run_label='SO DRY',
)
is None
)
def test_execute_dump_command_with_dry_run_skips_mysqldump():
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('dump')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
(
'mysqldump',
'--add-drop-database',
'--databases',
'foo',
'bar',
'>',
'databases/localhost/all',
),
shell=True,
extra_environment=None,
run_to_completion=False,
).and_return(process).once()
flexmock(module).should_receive('execute_command').never()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_database_names_to_dump_runs_mysql_with_list_options():
database = {'name': 'all', 'list_options': '--defaults-extra-file=my.cnf'}
flexmock(module).should_receive('execute_command').with_args(
(
'mysql',
'--defaults-extra-file=my.cnf',
'--skip-column-names',
'--batch',
'--execute',
'show schemas',
),
output_log_level=None,
extra_environment=None,
).and_return(('foo\nbar')).once()
assert module.database_names_to_dump(database, None, 'test.yaml', '') == ('foo', 'bar')
assert (
module.execute_dump_command(
database={'name': 'foo'},
log_prefix='log',
dump_path=flexmock(),
database_names=('foo',),
extra_environment=None,
dry_run=True,
dry_run_label='SO DRY',
)
is None
)
def test_dump_databases_errors_for_missing_all_databases():
@@ -258,6 +336,23 @@ def test_restore_database_dump_errors_on_multiple_database_config():
)
def test_restore_database_dump_runs_mysql_with_options():
database_config = [{'name': 'foo', 'restore_options': '--harder'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('execute_command_with_processes').with_args(
('mysql', '--batch', '--harder'),
processes=[extract_process],
output_log_level=logging.DEBUG,
input_file=extract_process.stdout,
extra_environment=None,
).once()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
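As with the mongodb hook above, restore_options is appended to the base restore command, here ('mysql', '--batch'). An illustrative builder matching that expectation; space-splitting of the option string is an assumption.

# Illustrative restore command assembly matching the expected call above.
def build_mysql_restore_command(database):
    restore_options = database.get('restore_options')
    return ('mysql', '--batch') + (
        tuple(restore_options.split(' ')) if restore_options else ()
    )


assert build_mysql_restore_command({'name': 'foo', 'restore_options': '--harder'}) == (
    'mysql',
    '--batch',
    '--harder',
)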
def test_restore_database_dump_runs_mysql_with_hostname_and_port():
database_config = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
extract_process = flexmock(stdout=flexmock())

View File

@@ -6,15 +6,107 @@ from flexmock import flexmock
from borgmatic.hooks import postgresql as module
def test_database_names_to_dump_passes_through_individual_database_name():
database = {'name': 'foo'}
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('foo',)
def test_database_names_to_dump_passes_through_individual_database_name_with_format():
database = {'name': 'foo', 'format': 'custom'}
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('foo',)
def test_database_names_to_dump_passes_through_all_without_format():
database = {'name': 'all'}
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('all',)
def test_database_names_to_dump_with_all_and_format_lists_databases():
database = {'name': 'all', 'format': 'custom'}
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'foo,test,\nbar,test,"stuff and such"'
)
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == (
'foo',
'bar',
)
def test_database_names_to_dump_with_all_and_format_lists_databases_with_hostname_and_port():
database = {'name': 'all', 'format': 'custom', 'hostname': 'localhost', 'port': 1234}
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
(
'psql',
'--list',
'--no-password',
'--csv',
'--tuples-only',
'--host',
'localhost',
'--port',
'1234',
),
extra_environment=object,
).and_return('foo,test,\nbar,test,"stuff and such"')
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == (
'foo',
'bar',
)
def test_database_names_to_dump_with_all_and_format_lists_databases_with_username():
database = {'name': 'all', 'format': 'custom', 'username': 'postgres'}
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('psql', '--list', '--no-password', '--csv', '--tuples-only', '--username', 'postgres'),
extra_environment=object,
).and_return('foo,test,\nbar,test,"stuff and such"')
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == (
'foo',
'bar',
)
def test_database_names_to_dump_with_all_and_format_lists_databases_with_options():
database = {'name': 'all', 'format': 'custom', 'list_options': '--harder'}
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('psql', '--list', '--no-password', '--csv', '--tuples-only', '--harder'),
extra_environment=object,
).and_return('foo,test,\nbar,test,"stuff and such"')
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == (
'foo',
'bar',
)
def test_database_names_to_dump_with_all_and_format_excludes_particular_databases():
database = {'name': 'all', 'format': 'custom'}
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'foo,test,\ntemplate0,test,blah'
)
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('foo',)
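These cases fix the contract of the PostgreSQL database_names_to_dump: single names pass through, "all" passes through unless a format is configured, and otherwise the CSV output of psql --list is parsed with template databases dropped. Below is a self-contained approximation that takes the psql output as a parameter instead of running psql; the exclusion rule and parsing details are assumptions based on the test data.

import csv
import io

# Approximation of the name-listing logic the tests above describe.
def database_names_to_dump(database, list_output=''):
    if database['name'] != 'all':
        return (database['name'],)
    if 'format' not in database:
        return ('all',)
    # psql --list --csv --tuples-only output: the database name is column one.
    rows = csv.reader(io.StringIO(list_output))
    return tuple(row[0] for row in rows if row and not row[0].startswith('template'))


print(database_names_to_dump(
    {'name': 'all', 'format': 'custom'}, 'foo,test,\ntemplate0,test,blah'
))  # -> ('foo',)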
def test_dump_databases_runs_pg_dump_for_each_database():
databases = [{'name': 'foo'}, {'name': 'bar'}]
processes = [flexmock(), flexmock()]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
for name, process in zip(('foo', 'bar'), processes):
flexmock(module).should_receive('execute_command').with_args(
@@ -37,14 +129,45 @@ def test_dump_databases_runs_pg_dump_for_each_database():
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
def test_dump_databases_with_dry_run_skips_pg_dump():
def test_dump_databases_raises_when_no_database_names_to_dump():
databases = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(())
with pytest.raises(ValueError):
module.dump_databases(databases, 'test.yaml', {}, dry_run=False)
def test_dump_databases_with_duplicate_dump_skips_pg_dump():
databases = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module.os.path).should_receive('exists').and_return(True)
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('execute_command').never()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == []
def test_dump_databases_with_dry_run_skips_pg_dump():
databases = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('execute_command').never()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=True) == []
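The dry-run and duplicate-dump cases both end with no pg_dump processes. Here is a combined sketch of those two skip conditions; the loop shape is assumed and process spawning is faked with a string.

import os

# Sketch of the skip conditions exercised above: a dry run never spawns
# pg_dump, and an already-existing dump file is not re-dumped.
def dump_databases(databases, dump_filenames, dry_run=False):
    processes = []
    for database, dump_filename in zip(databases, dump_filenames):
        if os.path.exists(dump_filename):
            continue  # duplicate dump: this database is already covered
        if dry_run:
            continue  # dry run: log what would happen, start nothing
        processes.append(f'pg_dump process for {database["name"]}')
    return processes


print(dump_databases([{'name': 'foo'}], ['/nonexistent/foo'], dry_run=True))  # -> []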
@@ -53,12 +176,14 @@ def test_dump_databases_with_dry_run_skips_pg_dump():
def test_dump_databases_runs_pg_dump_with_hostname_and_port():
databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/database.example.org/foo'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
(
@@ -87,14 +212,16 @@ def test_dump_databases_runs_pg_dump_with_hostname_and_port():
def test_dump_databases_runs_pg_dump_with_username_and_password():
databases = [{'name': 'foo', 'username': 'postgres', 'password': 'trustsome1'}]
process = flexmock()
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return(
{'PGPASSWORD': 'trustsome1', 'PGSSLMODE': 'disable'}
)
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
(
@@ -144,13 +271,15 @@ def test_make_extra_environment_maps_options_to_environment():
def test_dump_databases_runs_pg_dump_with_directory_format():
databases = [{'name': 'foo', 'format': 'directory'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_parent_directory_for_dump')
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
(
@@ -175,12 +304,14 @@ def test_dump_databases_runs_pg_dump_with_options():
def test_dump_databases_runs_pg_dump_with_options():
databases = [{'name': 'foo', 'options': '--stuff=such'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
(
@@ -206,12 +337,14 @@ def test_dump_databases_runs_pg_dump_with_options():
def test_dump_databases_runs_pg_dumpall_for_all_databases():
databases = [{'name': 'all'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('all',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/all'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
('pg_dumpall', '--no-password', '--clean', '--if-exists', '>', 'databases/localhost/all'),
@@ -223,13 +356,45 @@ def test_dump_databases_runs_pg_dumpall_for_all_databases():
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_runs_non_default_pg_dump():
databases = [{'name': 'foo', 'pg_dump_command': 'special_pg_dump'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.os.path).should_receive('exists').and_return(False)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
(
'special_pg_dump',
'--no-password',
'--clean',
'--if-exists',
'--format',
'custom',
'foo',
'>',
'databases/localhost/foo',
),
shell=True,
extra_environment={'PGSSLMODE': 'disable'},
run_to_completion=False,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
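This test exercises the command-customization option from #630. A minimal sketch of the selection logic it implies follows; how the override interacts with the pg_dumpall path for "all" is not shown here and is left open.

# Sketch of the command-customization option exercised above: a configured
# pg_dump_command is used in place of the default pg_dump binary.
def choose_dump_command(database):
    return database.get('pg_dump_command', 'pg_dump')


print(choose_dump_command({'name': 'foo'}))                                        # pg_dump
print(choose_dump_command({'name': 'foo', 'pg_dump_command': 'special_pg_dump'}))  # special_pg_dump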
def test_restore_database_dump_runs_pg_restore():
database_config = [{'name': 'foo'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
@@ -258,9 +423,9 @@ def test_restore_database_dump_runs_pg_restore():
def test_restore_database_dump_errors_on_multiple_database_config():
database_config = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').never()
flexmock(module).should_receive('execute_command').never()
@@ -274,9 +439,9 @@ def test_restore_database_dump_runs_pg_restore_with_hostname_and_port():
database_config = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
@@ -322,11 +487,11 @@ def test_restore_database_dump_runs_pg_restore_with_username_and_password():
database_config = [{'name': 'foo', 'username': 'postgres', 'password': 'trustsome1'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return(
{'PGPASSWORD': 'trustsome1', 'PGSSLMODE': 'disable'}
)
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
@@ -364,13 +529,57 @@ def test_restore_database_dump_runs_pg_restore_with_username_and_password():
)
def test_restore_database_dump_runs_pg_restore_with_options():
database_config = [
{'name': 'foo', 'restore_options': '--harder', 'analyze_options': '--smarter'}
]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
'--no-password',
'--if-exists',
'--exit-on-error',
'--clean',
'--dbname',
'foo',
'--harder',
),
processes=[extract_process],
output_log_level=logging.DEBUG,
input_file=extract_process.stdout,
extra_environment={'PGSSLMODE': 'disable'},
).once()
flexmock(module).should_receive('execute_command').with_args(
(
'psql',
'--no-password',
'--quiet',
'--dbname',
'foo',
'--smarter',
'--command',
'ANALYZE',
),
extra_environment={'PGSSLMODE': 'disable'},
).once()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
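The two expected calls here describe a two-step restore: pg_restore extended with restore_options, followed by a psql ANALYZE that honors analyze_options. Below is a hedged reconstruction of just the command assembly; space-splitting of the option strings is an assumption.

# Reconstructed command pair consistent with the expectations above; not the
# hook's actual implementation.
def build_restore_commands(database):
    restore_options = database.get('restore_options', '')
    analyze_options = database.get('analyze_options', '')
    restore = (
        'pg_restore', '--no-password', '--if-exists', '--exit-on-error',
        '--clean', '--dbname', database['name'],
    ) + (tuple(restore_options.split(' ')) if restore_options else ())
    analyze = (
        'psql', '--no-password', '--quiet', '--dbname', database['name'],
    ) + (tuple(analyze_options.split(' ')) if analyze_options else ()) + ('--command', 'ANALYZE')
    return restore, analyze


for command in build_restore_commands(
    {'name': 'foo', 'restore_options': '--harder', 'analyze_options': '--smarter'}
):
    print(command)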
def test_restore_database_dump_runs_psql_for_all_database_dump():
database_config = [{'name': 'all'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
('psql', '--no-password'),
processes=[extract_process],
@@ -388,12 +597,46 @@ def test_restore_database_dump_runs_psql_for_all_database_dump():
)
def test_restore_database_dump_runs_non_default_pg_restore_and_psql():
database_config = [
{'name': 'foo', 'pg_restore_command': 'special_pg_restore', 'psql_command': 'special_psql'}
]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'special_pg_restore',
'--no-password',
'--if-exists',
'--exit-on-error',
'--clean',
'--dbname',
'foo',
),
processes=[extract_process],
output_log_level=logging.DEBUG,
input_file=extract_process.stdout,
extra_environment={'PGSSLMODE': 'disable'},
).once()
flexmock(module).should_receive('execute_command').with_args(
('special_psql', '--no-password', '--quiet', '--dbname', 'foo', '--command', 'ANALYZE'),
extra_environment={'PGSSLMODE': 'disable'},
).once()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
def test_restore_database_dump_with_dry_run_skips_restore():
database_config = [{'name': 'foo'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').never()
module.restore_database_dump(
@@ -404,9 +647,9 @@ def test_restore_database_dump_with_dry_run_skips_restore():
def test_restore_database_dump_without_extract_process_restores_from_disk():
database_config = [{'name': 'foo'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('/dump/path')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',

View File

@@ -213,7 +213,20 @@ def test_execute_command_without_run_to_completion_returns_process():
assert module.execute_command(full_command, run_to_completion=False) == process
def test_execute_command_captures_output():
def test_execute_command_and_capture_output_returns_stdout():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command, stderr=None, shell=False, env=None, cwd=None
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command_and_capture_output(full_command)
assert output == expected_output
def test_execute_command_and_capture_output_with_capture_stderr_returns_stderr():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
@@ -221,53 +234,49 @@ def test_execute_command_captures_output():
full_command, stderr=module.subprocess.STDOUT, shell=False, env=None, cwd=None
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(full_command, output_log_level=None)
output = module.execute_command_and_capture_output(full_command, capture_stderr=True)
assert output == expected_output
def test_execute_command_captures_output_with_shell():
def test_execute_command_and_capture_output_returns_output_with_shell():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
'foo bar', stderr=module.subprocess.STDOUT, shell=True, env=None, cwd=None
'foo bar', stderr=None, shell=True, env=None, cwd=None
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(full_command, output_log_level=None, shell=True)
output = module.execute_command_and_capture_output(full_command, shell=True)
assert output == expected_output
def test_execute_command_captures_output_with_extra_environment():
def test_execute_command_and_capture_output_returns_output_with_extra_environment():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command,
stderr=module.subprocess.STDOUT,
shell=False,
env={'a': 'b', 'c': 'd'},
cwd=None,
full_command, stderr=None, shell=False, env={'a': 'b', 'c': 'd'}, cwd=None,
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(
full_command, output_log_level=None, shell=False, extra_environment={'c': 'd'}
output = module.execute_command_and_capture_output(
full_command, shell=False, extra_environment={'c': 'd'}
)
assert output == expected_output
def test_execute_command_captures_output_with_working_directory():
def test_execute_command_and_capture_output_returns_output_with_working_directory():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command, stderr=module.subprocess.STDOUT, shell=False, env=None, cwd='/working'
full_command, stderr=None, shell=False, env=None, cwd='/working'
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(
full_command, output_log_level=None, shell=False, working_directory='/working'
output = module.execute_command_and_capture_output(
full_command, shell=False, working_directory='/working'
)
assert output == expected_output
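The renamed tests above agree on the contract of execute_command_and_capture_output: run the command, optionally through a shell, with optional extra environment and working directory, and return decoded stdout, capturing stderr only when asked. A minimal sketch consistent with those expectations, not borgmatic's execute module itself:

import os
import subprocess

# Minimal capture helper mirroring the with_args() expectations above.
def execute_command_and_capture_output(
    full_command, capture_stderr=False, shell=False, extra_environment=None, working_directory=None
):
    environment = {**os.environ, **extra_environment} if extra_environment else None
    command = ' '.join(full_command) if shell else full_command
    output = subprocess.check_output(
        command,
        stderr=subprocess.STDOUT if capture_stderr else None,
        shell=shell,
        env=environment,
        cwd=working_directory,
    )
    return output.decode()


print(execute_command_and_capture_output(['echo', 'hello']))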

View File

@@ -1,4 +1,5 @@
import logging
import sys
import pytest
from flexmock import flexmock
@@ -125,6 +126,8 @@ def test_multi_stream_handler_logs_to_handler_for_log_level():
def test_console_color_formatter_format_includes_log_message():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
plain_message = 'uh oh'
record = flexmock(levelno=logging.CRITICAL, msg=plain_message)
@@ -142,7 +145,38 @@ def test_color_text_without_color_does_not_raise():
module.color_text(None, 'hi')
def test_add_logging_level_adds_level_name_and_sets_global_attributes_and_methods():
logger = flexmock()
flexmock(module.logging).should_receive('getLoggerClass').and_return(logger)
flexmock(module.logging).should_receive('addLevelName').with_args(99, 'PLAID')
builtins = flexmock(sys.modules['builtins'])
builtins.should_call('setattr')
builtins.should_receive('setattr').with_args(module.logging, 'PLAID', 99).once()
builtins.should_receive('setattr').with_args(logger, 'plaid', object).once()
builtins.should_receive('setattr').with_args(logging, 'plaid', object).once()
module.add_logging_level('PLAID', 99)
def test_add_logging_level_skips_global_setting_if_already_set():
logger = flexmock()
flexmock(module.logging).should_receive('getLoggerClass').and_return(logger)
flexmock(module.logging).PLAID = 99
flexmock(logger).plaid = flexmock()
flexmock(logging).plaid = flexmock()
flexmock(module.logging).should_receive('addLevelName').never()
builtins = flexmock(sys.modules['builtins'])
builtins.should_call('setattr')
builtins.should_receive('setattr').with_args(module.logging, 'PLAID', 99).never()
builtins.should_receive('setattr').with_args(logger, 'plaid', object).never()
builtins.should_receive('setattr').with_args(logging, 'plaid', object).never()
module.add_logging_level('PLAID', 99)
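The two add_logging_level tests appear to cover the common "addLoggingLevel" recipe: register the level name, then set module-, logger-class-, and root-level attributes only when they are not already present. Here is a condensed, self-contained version of that recipe; it is a sketch, not borgmatic's logger module.

import logging

# Condensed custom-log-level recipe consistent with the tests above.
def add_logging_level(level_name, level_number):
    method_name = level_name.lower()
    if not hasattr(logging, level_name):
        logging.addLevelName(level_number, level_name)
        setattr(logging, level_name, level_number)

    logger_class = logging.getLoggerClass()
    if not hasattr(logger_class, method_name):
        def log_for_level(self, message, *args, **kwargs):
            if self.isEnabledFor(level_number):
                self._log(level_number, message, args, **kwargs)

        setattr(logger_class, method_name, log_for_level)

    if not hasattr(logging, method_name):
        def log_to_root(message, *args, **kwargs):
            logging.log(level_number, message, *args, **kwargs)

        setattr(logging, method_name, log_to_root)


add_logging_level('PLAID', 99)
logging.getLogger(__name__).plaid('so very plaid')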
def test_configure_logging_probes_for_log_socket_on_linux():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -161,6 +195,8 @@ def test_configure_logging_probes_for_log_socket_on_linux():
def test_configure_logging_probes_for_log_socket_on_macos():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -180,6 +216,8 @@ def test_configure_logging_probes_for_log_socket_on_macos():
def test_configure_logging_probes_for_log_socket_on_freebsd():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -200,6 +238,8 @@ def test_configure_logging_probes_for_log_socket_on_freebsd():
def test_configure_logging_sets_global_logger_to_most_verbose_log_level():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -213,6 +253,8 @@ def test_configure_logging_sets_global_logger_to_most_verbose_log_level():
def test_configure_logging_skips_syslog_if_not_found():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -227,6 +269,8 @@ def test_configure_logging_skips_syslog_if_not_found():
def test_configure_logging_skips_syslog_if_interactive_console():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -242,6 +286,8 @@ def test_configure_logging_skips_syslog_if_interactive_console():
def test_configure_logging_to_logfile_instead_of_syslog():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -264,6 +310,8 @@ def test_configure_logging_to_logfile_instead_of_syslog():
def test_configure_logging_skips_logfile_if_argument_is_none():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)

View File

@@ -15,10 +15,17 @@ def insert_logging_mock(log_level):
def test_verbosity_to_log_level_maps_known_verbosity_to_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
assert module.verbosity_to_log_level(module.VERBOSITY_ERROR) == logging.ERROR
assert module.verbosity_to_log_level(module.VERBOSITY_ANSWER) == module.borgmatic.logger.ANSWER
assert module.verbosity_to_log_level(module.VERBOSITY_SOME) == logging.INFO
assert module.verbosity_to_log_level(module.VERBOSITY_LOTS) == logging.DEBUG
assert module.verbosity_to_log_level(module.VERBOSITY_ERROR) == logging.ERROR
def test_verbosity_to_log_level_maps_unknown_verbosity_to_warning_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
assert module.verbosity_to_log_level('my pants') == logging.WARNING
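A sketch of the verbosity-to-log-level mapping these assertions describe; the numeric values of the verbosity constants and of ANSWER below are assumptions made so the example runs.

import logging

# Assumed constant values for illustration only.
VERBOSITY_ERROR, VERBOSITY_ANSWER, VERBOSITY_SOME, VERBOSITY_LOTS = -1, 0, 1, 2
ANSWER = 25  # assumed custom level between INFO and WARNING


def verbosity_to_log_level(verbosity):
    # Unknown verbosities fall back to WARNING, per the second test above.
    return {
        VERBOSITY_ERROR: logging.ERROR,
        VERBOSITY_ANSWER: ANSWER,
        VERBOSITY_SOME: logging.INFO,
        VERBOSITY_LOTS: logging.DEBUG,
    }.get(verbosity, logging.WARNING)


print(verbosity_to_log_level(VERBOSITY_LOTS) == logging.DEBUG)       # True
print(verbosity_to_log_level('my pants') == logging.WARNING)         # True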