Compare commits

...

99 Commits

Author SHA1 Message Date
Dan Helfman 5e15c9f2bc Fix traceback when include merging on ARM64 (#622). 2022-12-23 10:07:53 -08:00
Dan Helfman 442641f9f6 Update borgmatic social links. 2022-12-16 11:39:05 -08:00
Dan Helfman f67c544be6 Optionally dump "all" PostgreSQL databases to separate files instead of one combined dump file (#438, #560). 2022-12-15 22:59:42 -08:00
Dan Helfman 437fd4dbae Update developer contributing instructions as well. 2022-12-13 23:56:32 -08:00
Dan Helfman 36873252d6 Update developer instructions. 2022-12-13 23:44:27 -08:00
Dan Helfman 1ef82a27fa Clarify data/archives check implicit enabling. 2022-12-12 16:03:05 -08:00
Dan Helfman 6837dcbf42 Clarify documentation about transferring archives between related repositories. 2022-12-10 12:59:44 -08:00
Dan Helfman c657764367 Fix logs that interfere with JSON output by making warnings go to stderr instead of stdout (#602). 2022-12-02 12:12:10 -08:00
Dan Helfman f79286fc91 Bump version for release. 2022-11-27 09:00:40 -08:00
Dan Helfman 694d376d15 Clarify documentation about multiple repositories and separate configuration files (#613). 2022-11-21 13:33:01 -08:00
Dan Helfman ab4c08019c Upgrade pytest test dependency (security). 2022-11-18 11:13:51 -08:00
Dan Helfman fd39f54df7 Code formatting. 2022-11-18 08:35:01 -08:00
Dan Helfman ca7e18bb29 Override PostgreSQL dump/restore commands via configuration options (#311).
Merge pull request #49 from jpaniagualaconich/specify-pg-dump-restore-commands
2022-11-18 08:33:14 -08:00
Dan Helfman 6975a5b155 Fix "data" consistency check to support "check_last" and consistency "prefix" options (#611). 2022-11-17 10:19:48 -08:00
Dan Helfman b627d00595 More consistency checks documentation edits. 2022-11-14 15:13:47 -08:00
Dan Helfman 9bd8f1a6df Clarify consistency check configuration. 2022-11-14 14:58:42 -08:00
Javier Paniagua faf682ca35 specify pg dump/restore commands (#311) 2022-11-06 11:12:53 +01:00
Dan Helfman 6aeb74550d Clarify examples in include merging and deep merging documentation (#607). 2022-10-28 19:33:19 -07:00
Dan Helfman 89500df429 Fix traceback when a configuration section is present but lacking any options (#604). 2022-10-23 13:56:03 -07:00
Dan Helfman 82b072d0b7 Update documentation to mention using blake2 with "transfer" action. 2022-10-17 15:04:30 -07:00
Dan Helfman 018c0296fd Document that special file exclusion also excludes symlinks to special files (#596). 2022-10-15 10:14:46 -07:00
Dan Helfman 9c42e7e817 Fix regression in which "check" action errored on certain systems (#597, #598). 2022-10-14 16:19:26 -07:00
Dan Helfman 953277a066 Fix special file detection when broken symlinks are encountered (#596). 2022-10-14 09:41:08 -07:00
Dan Helfman e2002b5488 Bump version for release. 2022-10-12 10:59:54 -07:00
Dan Helfman c9742e1d04 Code formatting. 2022-10-12 10:52:32 -07:00
Dan Helfman 906da838ef Add missing break-lock action command-line help (#357). 2022-10-12 10:48:10 -07:00
Dan Helfman d7f1c10c8c To prevent Borg hangs, unconditionally delete stale named pipes before dumping databases (#360). 2022-10-12 10:26:09 -07:00
Dan Helfman e8e4d17168 Clean up changelog for the current dev release. 2022-10-06 22:06:03 -07:00
Dan Helfman a31ce337e9 Skip auto-exclusion of special files when user explicitly sets read_special to true (#587). 2022-10-06 11:07:43 -07:00
Dan Helfman 902730df46 Update sample systemd file to allow system idle (#589). 2022-10-05 10:20:25 -07:00
Dan Helfman c969c822ee Do not inhibit idle in borgmatic.service (#589).
Reviewed-on: borgmatic-collective/borgmatic#589
2022-10-05 17:14:19 +00:00
Dan Helfman c31702d092 Fix for potential data loss with "patterns_from". Also, display excluded files (#590). 2022-10-04 22:57:18 -07:00
Dan Helfman ba8fbe7a44 Add "break-lock" action for removing any repository and cache locks leftover from Borg aborting (#357). 2022-10-04 13:42:18 -07:00
Dan Helfman 2774c2e4c0 Add support for Borg 2's "--match-archives" flag (replaces "--glob-archives") (#591). 2022-10-03 22:50:37 -07:00
Dan Helfman ae036aebd7 When the "read_special" option is true or database hooks are enabled, auto-exclude special files for a "create" action to prevent Borg from hanging (#587). 2022-10-03 12:58:13 -07:00
LaserEyess 2e9f70d496 Do not inhibit idle in borgmatic.service
When backing up a machine with a monitor using logind to control idle
timeout and things like DPMS, borgmatic can block the screen from
turning on/off with systemd-inhibit. This is because by default
systemd-inhibit will block "idle:sleep:shutdown". Borgmatic does not
need to care about idle, only about suspend and shutdown. So, add an
explicit `--what` flag for what borgmatic should inhibit.

For more information see systemd-inhibit(1).
2022-10-01 09:33:38 -04:00
Dan Helfman 90be5b84b1 Fix changelog development version. 2022-09-20 14:02:48 -07:00
Dan Helfman 80e95f20a3 Add "borgmatic borg" documentation note about interactive prompts. 2022-09-20 14:01:47 -07:00
Dan Helfman ac7c7d4036 Warn when ignoring a configured "read_special" value of false, as true is needed when database hooks are enabled (#587). 2022-09-20 13:52:13 -07:00
Dan Helfman 858b0b9fbe Note version of borgmatic needed for "borgmatic borg" action (#586). 2022-09-13 09:05:18 -07:00
Dan Helfman 9cc043f60e Update "find" command in documentation to work on BSDs and not just Linux (#583). 2022-09-11 20:02:30 -07:00
Dan Helfman 276a27d485 Bump version for release. 2022-09-08 10:29:44 -07:00
Dan Helfman 679bb839d7 Fix hang when database hooks are enabled and "patterns" contains a parent directory of "~/.borgmatic" (#582). 2022-09-08 10:16:42 -07:00
Dan Helfman 9e64d847ef Fix regression in which "borgmatic info --archive ..." showed repository info instead of archive info with Borg 1 (#577). 2022-08-30 20:42:42 -07:00
Dan Helfman 61fb275896 Fix duplicate-appearing log entries for "list" action. 2022-08-30 20:29:26 -07:00
Dan Helfman ca0c79c93c Fix duplicate bind path in sample systemd service. 2022-08-28 14:49:23 -07:00
Dan Helfman 87c97b7568 Fixed spurious, intermittent test failures related to command execution and logging. 2022-08-28 09:06:06 -07:00
Dan Helfman 80b8c25bba Update docs about "source_directories" being optional. 2022-08-25 13:24:26 -07:00
Dan Helfman d1837cd1d3 Bump version for release. 2022-08-25 11:58:06 -07:00
Dan Helfman c46f2b8508 Fix conflict between "patterns" and "source_directories" (#574), make "source_directories" optional (#542). 2022-08-25 11:55:34 -07:00
Dan Helfman a274c0dbf7 In generate-borgmatic-config, indicate that the example options are exhaustive. 2022-08-24 09:53:54 -07:00
Dan Helfman ef7e95e22a Fix end-to-end tests. 2022-08-21 23:29:13 -07:00
Dan Helfman 3be99de5b1 Update "repositories" examples in configuration to use ssh:// style syntax. 2022-08-21 22:40:31 -07:00
Dan Helfman e7b7560477 Bump version for release. 2022-08-21 21:54:13 -07:00
Dan Helfman 317dc7fbce Add "before_actions" and "after_actions" command hooks that run before/after all the actions for each repository, update docs to cover per-repository configurations (#463). 2022-08-21 21:48:37 -07:00
Dan Helfman 97fad15009 Switch to more accessible header permalink anchors in documentation. 2022-08-21 21:48:07 -07:00
Dan Helfman 462326406e Drop only-style actions like "--create", rename "prune --files" to "prune --list", and add "--list" alias to "create" and "export-tar" (#571). 2022-08-21 14:25:16 -07:00
Dan Helfman bbdf4893d1 Clarify --format flag in documentation. 2022-08-19 15:27:03 -07:00
Dan Helfman ef6617cfe6 Add link to Borg list --format documentation. 2022-08-19 15:16:56 -07:00
Dan Helfman dbef0a440f Merge branch 'master' into patch-2 2022-08-19 15:16:17 -07:00
Dan Helfman 22628ba5d4 Update ssh:// examples in documentation to use relative paths on the remote machine (#557). 2022-08-19 12:00:40 -07:00
Dan Helfman 8576ac86b9 Fix incorrect version in documentation (#557). 2022-08-19 09:44:31 -07:00
Dan Helfman 540f9f6b72 Add missing test for "transfer" action (#557). 2022-08-19 09:40:29 -07:00
Dan Helfman f9d7faf884 Fix mount action to work without archive again (#557). 2022-08-18 23:33:05 -07:00
Dan Helfman 7dee6194a2 Add new "transfer" action for Borg 2 (#557). 2022-08-18 23:06:51 -07:00
Dan Helfman 68f9c1b950 Add generate-borgmatic-config end-to-end test. 2022-08-18 14:28:46 -07:00
Dan Helfman 43d711463c Add additional command-line flags to rcreate action (#557). 2022-08-18 14:28:12 -07:00
Dan Helfman 00255a2437 Various documentation edits for Borg 2 (#557). 2022-08-18 10:19:11 -07:00
Dan Helfman b40e9b7da2 Ignore archive filter parameters passed to list action when --archive is given (#557). 2022-08-18 09:59:48 -07:00
Dan Helfman 89d201c8ff Fleshing out NEWS for the Borg 2 changes. 2022-08-17 21:54:00 -07:00
Dan Helfman f47c98c4a5 Rename several configuration options to match Borg 2 (#557). 2022-08-17 21:14:58 -07:00
Dan Helfman 3b6ed06686 Add --other-repo flag to rcreate action (#557). 2022-08-17 17:33:09 -07:00
Dan Helfman 57009e22b5 Use flag-related utility functions in info action (#557). 2022-08-17 17:11:02 -07:00
Dan Helfman 3ab7a3b64a Replace use of --prefix with --glob-archives in info action (#557). 2022-08-17 15:36:19 -07:00
Dan Helfman 596dd49cf5 Use --glob-archives instead of --prefix on rlist command (#557). 2022-08-17 14:26:35 -07:00
Dan Helfman 28d847b8b1 Warn and transform on non-ssh://-style repositories (#557). 2022-08-17 10:13:11 -07:00
Dan Helfman 2a1c6b1477 Update documentation with newly required ssh:// repository syntax for Borg 2 (#557). 2022-08-16 11:41:35 -07:00
Dan Helfman 30abd0e3de Update borg action for Borg 2 support (#557). 2022-08-16 09:30:00 -07:00
Dan Helfman f36e38ec20 Update mount action for Borg 2 support (#557). 2022-08-15 19:32:37 -07:00
Dan Helfman d807ce095e Update export-tar action for Borg 2 support (#557). 2022-08-15 17:34:12 -07:00
Dan Helfman 7626fe1189 Disallow borg list --json with --archive or --find (#557). 2022-08-15 15:40:28 -07:00
Dan Helfman cc04bf57df Update list action for Borg 2 support, add rinfo action, and update extract consistency check for Borg 2. 2022-08-15 15:04:40 -07:00
Dan Helfman cce6d56661 Update extract action for Borg 2 support (#557). 2022-08-13 23:07:29 -07:00
Dan Helfman a05d0f378e Factor out repository/archive flags formatting code from create action (#557). 2022-08-13 22:50:14 -07:00
Dan Helfman 94321aec7a Update compact action for Borg 2 support (#557). 2022-08-13 22:07:15 -07:00
Dan Helfman 4a55749bd2 Update prune action for Borg 2 support (#557). 2022-08-13 17:26:51 -07:00
Dan Helfman 2898e63166 Update create action for Borg 2 support (#557). 2022-08-12 23:54:13 -07:00
Dan Helfman c7176bd00a Add rinfo action for Borg 2 support (#557). 2022-08-12 23:06:56 -07:00
Dan Helfman 647ecdac29 Borg 2 support in borgmatic check action (#557). 2022-08-12 15:46:33 -07:00
Dan Helfman e7a8acfb96 Add missing rinfo action source files (#557). 2022-08-12 14:59:03 -07:00
Dan Helfman 622caa0c21 Support for Borg 2's rcreate and rinfo sub-commands (#557). 2022-08-12 14:53:20 -07:00
Dan Helfman 22149c6401 Switch to self-hosted container registry for borgmatic documentation image. 2022-08-01 21:17:59 -07:00
Dan Helfman 9aece3936a Modify "mount" and "extract" actions to require the "--repository" flag when multiple repositories are configured (#566). 2022-07-25 11:30:02 -07:00
Dan Helfman c7e4e6f6c9 Add Healthchecks "verify_tls" option to NEWS. 2022-07-23 23:16:06 -07:00
Dan Helfman bcad0de1a4 Add verify_tls option for Healthchecks to optionally disable TLS verification. 2022-07-23 23:11:41 -07:00
Uli 5c6407047f feat: add verify_tls flag for Healthchecks 2022-07-24 07:37:00 +02:00
Dan Helfman 6ddae20fa1 Fix handling of "repository" and "data" consistency checks to prevent invalid Borg flags (#565). 2022-07-23 21:02:21 -07:00
Jelle @ Samson-IT f7c8e89a9f update format specifier syntax link to use anchor 2022-07-13 21:52:21 +02:00
Jelle @ Samson-IT ba377952fd Added link to borgbackup list --format docs
I kept searching for this link, so it's time to add it to official docs.
2022-07-13 13:52:48 +02:00
95 changed files with 6240 additions and 1869 deletions

View File

@ -42,7 +42,9 @@ steps:
from_secret: docker_username
password:
from_secret: docker_password
repo: witten/borgmatic-docs
registry: projects.torsion.org
repo: projects.torsion.org/borgmatic-collective/borgmatic
tags: docs
dockerfile: docs/Dockerfile
trigger:

View File

@ -23,8 +23,7 @@ module.exports = function(eleventyConfig) {
}
};
let markdownItAnchorOptions = {
permalink: true,
permalinkClass: "direct-link"
permalink: markdownItAnchor.permalink.headerLink()
};
eleventyConfig.setLibrary(

94
NEWS
View File

@ -1,3 +1,97 @@
1.7.6.dev0
* #438, #560: Optionally dump "all" PostgreSQL databases to separate files instead of one combined
dump file, allowing more convenient restores of individual databases. You can enable this by
specifying the database dump "format" option when the database is named "all".
* #602: Fix logs that interfere with JSON output by making warnings go to stderr instead of stdout.
* #622: Fix traceback when include merging on ARM64.
1.7.5
* #311: Override PostgreSQL dump/restore commands via configuration options.
* #604: Fix traceback when a configuration section is present but lacking any options.
* #607: Clarify documentation examples for include merging and deep merging.
* #611: Fix "data" consistency check to support "check_last" and consistency "prefix" options.
* #613: Clarify documentation about multiple repositories and separate configuration files.
1.7.4
* #596: Fix special file detection erroring when broken symlinks are encountered.
* #597, #598: Fix regression in which "check" action errored on certain systems ("Cannot determine
Borg repository ID").
1.7.3
* #357: Add "break-lock" action for removing any repository and cache locks leftover from Borg
aborting.
* #360: To prevent Borg hangs, unconditionally delete stale named pipes before dumping databases.
* #587: When database hooks are enabled, auto-exclude special files from a "create" action to
prevent Borg from hanging. You can override/prevent this behavior by explicitly setting the
"read_special" option to true.
* #587: Warn when ignoring a configured "read_special" value of false, as true is needed when
database hooks are enabled.
* #589: Update sample systemd service file to allow system "idle" (e.g. a video monitor turning
off) while borgmatic is running.
* #590: Fix for potential data loss (data not getting backed up) when the "patterns_from" option
was used with "source_directories" (or the "~/.borgmatic" path existed, which got injected into
"source_directories" implicitly). The fix is for borgmatic to convert "source_directories" into
patterns whenever "patterns_from" is used, working around a Borg bug:
https://github.com/borgbackup/borg/issues/6994
* #590: In "borgmatic create --list" output, display which files get excluded from the backup due
to patterns or excludes.
* #591: Add support for Borg 2's "--match-archives" flag. This replaces "--glob-archives", which
borgmatic now treats as an alias for "--match-archives". But note that the two flags have
slightly different syntax. See the Borg 2 changelog for more information:
https://borgbackup.readthedocs.io/en/2.0.0b3/changes.html#version-2-0-0b3-2022-10-02
* Fix for "borgmatic --archive latest" not finding the latest archive when a verbosity is set.
1.7.2
* #577: Fix regression in which "borgmatic info --archive ..." showed repository info instead of
archive info with Borg 1.
* #582: Fix hang when database hooks are enabled and "patterns" contains a parent directory of
"~/.borgmatic".
1.7.1
* #542: Make the "source_directories" option optional. This is useful for "check"-only setups or
using "patterns" exclusively.
* #574: Fix for potential data loss (data not getting backed up) when the "patterns" option was
used with "source_directories" (or the "~/.borgmatic" path existed, which got injected into
"source_directories" implicitly). The fix is for borgmatic to convert "source_directories" into
patterns whenever "patterns" is used, working around a Borg bug:
https://github.com/borgbackup/borg/issues/6994
1.7.0
* #463: Add "before_actions" and "after_actions" command hooks that run before/after all the
actions for each repository. These new hooks are a good place to run per-repository steps like
mounting/unmounting a remote filesystem.
* #463: Update documentation to cover per-repository configurations:
https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/
* #557: Support for Borg 2 while still working with Borg 1. This includes new borgmatic actions
like "rcreate" (replaces "init"), "rlist" (list archives in repository), "rinfo" (show repository
info), and "transfer" (for upgrading Borg repositories). For the most part, borgmatic tries to
smooth over differences between Borg 1 and 2 to make your upgrade process easier. However, there
are still a few cases where Borg made breaking changes. See the Borg 2.0 changelog for more
information: https://www.borgbackup.org/releases/borg-2.0.html
* #557: If you install Borg 2, you'll need to manually upgrade your existing Borg 1 repositories
before use. Note that Borg 2 stable is not yet released as of this borgmatic release, so don't
use Borg 2 for production until it is! See the documentation for more information:
https://torsion.org/borgmatic/docs/how-to/upgrade/#upgrading-borg
* #557: Rename several configuration options to match Borg 2: "remote_rate_limit" is now
"upload_rate_limit", "numeric_owner" is "numeric_ids", and "bsd_flags" is "flags". borgmatic
still works with the old options.
* #557: Remote repository paths without the "ssh://" syntax are deprecated but still supported for
now. Remote repository paths containing "~" are deprecated in borgmatic and no longer work in
Borg 2.
* #557: Omitting the "--archive" flag on the "list" action is deprecated when using Borg 2. Use
the new "rlist" action instead.
* #557: The "--dry-run" flag can now be used with the "rcreate"/"init" action.
* #565: Fix handling of "repository" and "data" consistency checks to prevent invalid Borg flags.
* #566: Modify "mount" and "extract" actions to require the "--repository" flag when multiple
repositories are configured.
* #571: BREAKING: Remove old-style command-line action flags like "--create", "--list", etc. If
you're already using actions like "create" and "list" instead, this change should not affect you.
* #571: BREAKING: Rename "--files" flag on "prune" action to "--list", as it lists archives, not
files.
* #571: Add "--list" as alias for "--files" flag on "create" and "export-tar" actions.
* Add support for disabling TLS verification in Healthchecks monitoring hook with "verify_tls"
option.
1.6.6
* #559: Update documentation about configuring multiple consistency checks or multiple databases.
* #560: Fix all database hooks to error when the requested database to restore isn't present in the
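
The #557 entry above notes that remote repository paths without the "ssh://" syntax are deprecated but still get rewritten. A minimal sketch of that rewrite, based on the before/after repository examples in the configuration diff further down ("user@host:path" becoming "ssh://user@host/./path"); this is illustrative only, not borgmatic's actual implementation:

def normalize_repository_path(repository):
    # Leave local paths and already-normalized ssh:// URLs alone.
    if repository.startswith('ssh://') or ':' not in repository:
        return repository
    host, _, path = repository.partition(':')
    # A relative path on the remote machine gets the "/./" prefix.
    return f'ssh://{host}/./{path}'

normalize_repository_path('1234@usw-s001.rsync.net:backups.borg')
# -> 'ssh://1234@usw-s001.rsync.net/./backups.borg'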

View File

@ -24,8 +24,8 @@ location:
# Paths of local or remote repositories to backup to.
repositories:
- 1234@usw-s001.rsync.net:backups.borg
- k8pDxu32@k8pDxu32.repo.borgbase.com:repo
- ssh://1234@usw-s001.rsync.net/./backups.borg
- ssh://k8pDxu32@k8pDxu32.repo.borgbase.com/./repo
- /var/lib/backups/local.borg
retention:
@ -104,23 +104,38 @@ offerings, but do not currently fund borgmatic development or hosting.
### Issues
You've got issues? Or an idea for a feature enhancement? We've got an [issue
tracker](https://projects.torsion.org/borgmatic-collective/borgmatic/issues). In order to
create a new issue or comment on an issue, you'll need to [login
first](https://projects.torsion.org/user/login). Note that you can login with
an existing GitHub account if you prefer.
If you'd like to chat with borgmatic developers or users, head on over to the
`#borgmatic` IRC channel on Libera Chat, either via <a
href="https://web.libera.chat/#borgmatic">web chat</a> or a
native <a href="ircs://irc.libera.chat:6697">IRC client</a>. If you
don't get a response right away, please hang around a while—or file a ticket
instead.
Are you experiencing an issue with borgmatic? Or do you have an idea for a
feature enhancement? Head on over to our [issue
tracker](https://projects.torsion.org/borgmatic-collective/borgmatic/issues).
In order to create a new issue or add a comment, you'll need to
[register](https://projects.torsion.org/user/sign_up?invite_code=borgmatic)
first. If you prefer to use an existing GitHub account, you can skip account
creation and [login directly](https://projects.torsion.org/user/login).
Also see the [security
policy](https://torsion.org/borgmatic/docs/security-policy/) for any security
issues.
### Social
Check out the [Borg subreddit](https://www.reddit.com/r/BorgBackup/) for
general Borg and borgmatic discussion and support.
Also follow [borgmatic on Mastodon](https://fosstodon.org/@borgmatic).
### Chat
To chat with borgmatic developers or users, check out the `#borgmatic`
IRC channel on Libera Chat, either via <a
href="https://web.libera.chat/#borgmatic">web chat</a> or a native <a
href="ircs://irc.libera.chat:6697">IRC client</a>. If you don't get a response
right away, please hang around a while—or file a ticket instead.
### Other
Other questions or comments? Contact
[witten@torsion.org](mailto:witten@torsion.org).
@ -135,10 +150,14 @@ borgmatic is licensed under the GNU General Public License version 3 or any
later version.
If you'd like to contribute to borgmatic development, please feel free to
submit a [Pull Request](https://projects.torsion.org/borgmatic-collective/borgmatic/pulls)
or open an [issue](https://projects.torsion.org/borgmatic-collective/borgmatic/issues) first
to discuss your idea. We also accept Pull Requests on GitHub, if that's more
your thing. In general, contributions are very welcome. We don't bite!
submit a [Pull
Request](https://projects.torsion.org/borgmatic-collective/borgmatic/pulls) or
open an
[issue](https://projects.torsion.org/borgmatic-collective/borgmatic/issues) to
discuss your idea. Note that you'll need to
[register](https://projects.torsion.org/user/sign_up?invite_code=borgmatic)
first. We also accept Pull Requests on GitHub, if that's more your thing. In
general, contributions are very welcome. We don't bite!
Also, please check out the [borgmatic development
how-to](https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/) for

View File

@ -1,32 +1,39 @@
import logging
from borgmatic.borg import environment
from borgmatic.borg.flags import make_flags
import borgmatic.logger
from borgmatic.borg import environment, flags
from borgmatic.execute import execute_command
logger = logging.getLogger(__name__)
REPOSITORYLESS_BORG_COMMANDS = {'serve', None}
BORG_COMMANDS_WITH_SUBCOMMANDS = {'key', 'debug'}
BORG_SUBCOMMANDS_WITHOUT_REPOSITORY = (('debug', 'info'), ('debug', 'convert-profile'))
BORG_SUBCOMMANDS_WITH_SUBCOMMANDS = {'key', 'debug'}
BORG_SUBCOMMANDS_WITHOUT_REPOSITORY = (('debug', 'info'), ('debug', 'convert-profile'), ())
def run_arbitrary_borg(
repository, storage_config, options, archive=None, local_path='borg', remote_path=None
repository,
storage_config,
local_borg_version,
options,
archive=None,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, a sequence of arbitrary
command-line Borg options, and an optional archive name, run an arbitrary Borg command on the
given repository/archive.
Given a local or remote repository path, a storage config dict, the local Borg version, a
sequence of arbitrary command-line Borg options, and an optional archive name, run an arbitrary
Borg command on the given repository/archive.
'''
borgmatic.logger.add_custom_log_levels()
lock_wait = storage_config.get('lock_wait', None)
try:
options = options[1:] if options[0] == '--' else options
# Borg commands like "key" have a sub-command ("export", etc.) that must follow it.
command_options_start_index = 2 if options[0] in BORG_COMMANDS_WITH_SUBCOMMANDS else 1
command_options_start_index = 2 if options[0] in BORG_SUBCOMMANDS_WITH_SUBCOMMANDS else 1
borg_command = tuple(options[:command_options_start_index])
command_options = tuple(options[command_options_start_index:])
except IndexError:
@ -34,26 +41,28 @@ def run_arbitrary_borg(
command_options = ()
if borg_command in BORG_SUBCOMMANDS_WITHOUT_REPOSITORY:
repository_archive = None
else:
repository_archive = (
'::'.join((repository, archive)) if repository and archive else repository
repository_archive_flags = ()
elif archive:
repository_archive_flags = flags.make_repository_archive_flags(
repository, archive, local_borg_version
)
else:
repository_archive_flags = flags.make_repository_flags(repository, local_borg_version)
full_command = (
(local_path,)
+ borg_command
+ ((repository_archive,) if borg_command and repository_archive else ())
+ repository_archive_flags
+ command_options
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ make_flags('remote-path', remote_path)
+ make_flags('lock-wait', lock_wait)
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait)
)
return execute_command(
full_command,
output_log_level=logging.WARNING,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
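
As a rough illustration of the refactored run_arbitrary_borg() above: with Borg 2, the repository is now selected via flags rather than the positional "repo::archive" syntax. The repository path and version string below are made-up values, and calling the function really does execute Borg via execute_command().

# Hypothetical call; '2.0.0b3' stands in for the detected local Borg version.
run_arbitrary_borg(
    '/srv/backups/repo.borg',
    storage_config={},
    local_borg_version='2.0.0b3',
    options=['rlist'],
)
# Builds and, when run, executes roughly: borg rlist --repo /srv/backups/repo.borg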

View File

@ -0,0 +1,31 @@
import logging
from borgmatic.borg import environment, flags
from borgmatic.execute import execute_command
logger = logging.getLogger(__name__)
def break_lock(
repository, storage_config, local_borg_version, local_path='borg', remote_path=None,
):
'''
Given a local or remote repository path, a storage configuration dict, the local Borg version,
and optional local and remote Borg paths, break any repository and cache locks leftover from Borg
aborting.
'''
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
full_command = (
(local_path, 'break-lock')
+ (('--remote-path', remote_path) if remote_path else ())
+ (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ flags.make_repository_flags(repository, local_borg_version)
)
borg_environment = environment.make_environment(storage_config)
execute_command(full_command, borg_local_path=local_path, extra_environment=borg_environment)
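
For reference, with Borg 2 (where make_repository_flags() returns a "--repo" flag) and no umask, lock_wait, or remote_path configured, the command assembled above comes out to roughly the following; the path and version string are illustrative:

break_lock('/srv/backups/repo.borg', storage_config={}, local_borg_version='2.0.0b3')
# Executes approximately: borg break-lock --repo /srv/backups/repo.borg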

View File

@ -5,7 +5,7 @@ import logging
import os
import pathlib
from borgmatic.borg import environment, extract, info, state
from borgmatic.borg import environment, extract, feature, flags, rinfo, state
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
DEFAULT_CHECKS = (
@ -33,8 +33,6 @@ def parse_checks(consistency_config, only_checks=None):
If no "checks" option is present in the config, return the DEFAULT_CHECKS. If a checks value
has a name of "disabled", return an empty tuple, meaning that no checks should be run.
If the "data" check is present, then make sure the "archives" check is included as well.
'''
checks = only_checks or tuple(
check_config['name']
@ -48,9 +46,6 @@ def parse_checks(consistency_config, only_checks=None):
)
return ()
if 'data' in checks and 'archives' not in checks:
return checks + ('archives',)
return checks
@ -151,9 +146,10 @@ def filter_checks_on_frequency(
return tuple(filtered_checks)
def make_check_flags(checks, check_last=None, prefix=None):
def make_check_flags(local_borg_version, checks, check_last=None, prefix=None):
'''
Given a parsed sequence of checks, transform it into tuple of command-line flags.
Given the local Borg version and a parsed sequence of checks, transform the checks into tuple of
command-line flags.
For example, given parsed checks of:
@ -164,26 +160,37 @@ def make_check_flags(checks, check_last=None, prefix=None):
('--repository-only',)
However, if both "repository" and "archives" are in checks, then omit them from the returned
flags because Borg does both checks by default.
flags because Borg does both checks by default. If "data" is in checks, that implies "archives".
Additionally, if a check_last value is given and "archives" is in checks, then include a
"--last" flag. And if a prefix value is given and "archives" is in checks, then include a
"--prefix" flag.
"--match-archives" flag.
'''
if 'data' in checks:
data_flags = ('--verify-data',)
checks += ('archives',)
else:
data_flags = ()
if 'archives' in checks:
last_flags = ('--last', str(check_last)) if check_last else ()
prefix_flags = ('--prefix', prefix) if prefix else ()
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
match_archives_flags = ('--match-archives', f'sh:{prefix}*') if prefix else ()
else:
match_archives_flags = ('--glob-archives', f'{prefix}*') if prefix else ()
else:
last_flags = ()
prefix_flags = ()
match_archives_flags = ()
if check_last:
logger.info('Ignoring check_last option, as "archives" is not in consistency checks')
logger.warning(
'Ignoring check_last option, as "archives" or "data" are not in consistency checks'
)
if prefix:
logger.info(
'Ignoring consistency prefix option, as "archives" is not in consistency checks'
logger.warning(
'Ignoring consistency prefix option, as "archives" or "data" are not in consistency checks'
)
common_flags = last_flags + prefix_flags + (('--verify-data',) if 'data' in checks else ())
common_flags = last_flags + match_archives_flags + data_flags
if {'repository', 'archives'}.issubset(set(checks)):
return common_flags
@ -240,6 +247,7 @@ def check_archives(
location_config,
storage_config,
consistency_config,
local_borg_version,
local_path='borg',
remote_path=None,
progress=None,
@ -259,10 +267,11 @@ def check_archives(
'''
try:
borg_repository_id = json.loads(
info.display_archives_info(
rinfo.display_repository_info(
repository,
storage_config,
argparse.Namespace(json=True, archive=None),
local_borg_version,
argparse.Namespace(json=True),
local_path,
remote_path,
)
@ -295,13 +304,13 @@ def check_archives(
full_command = (
(local_path, 'check')
+ (('--repair',) if repair else ())
+ make_check_flags(checks, check_last, prefix)
+ make_check_flags(local_borg_version, checks, check_last, prefix)
+ (('--remote-path', remote_path) if remote_path else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ verbosity_flags
+ (('--progress',) if progress else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ (repository,)
+ flags.make_repository_flags(repository, local_borg_version)
)
borg_environment = environment.make_environment(storage_config)
@ -320,6 +329,6 @@ def check_archives(
if 'extract' in checks:
extract.extract_last_archive_dry_run(
storage_config, repository, lock_wait, local_path, remote_path
storage_config, local_borg_version, repository, lock_wait, local_path, remote_path
)
write_check_time(make_check_time_path(location_config, borg_repository_id, 'extract'))
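
To make the new version-dependent behavior of make_check_flags() concrete, here are two illustrative calls; the Borg version strings and prefix are assumptions, chosen to fall on either side of the MATCH_ARCHIVES feature cutoff:

# Borg 1.x: no --match-archives support, so a prefix becomes --glob-archives.
make_check_flags('1.2.2', ('repository', 'archives', 'data'), prefix='myhost-')
# -> ('--glob-archives', 'myhost-*', '--verify-data')

# Borg 2.x: the prefix is translated into a shell-style --match-archives pattern.
make_check_flags('2.0.0b3', ('repository', 'archives', 'data'), prefix='myhost-')
# -> ('--match-archives', 'sh:myhost-*', '--verify-data')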

View File

@ -1,6 +1,6 @@
import logging
from borgmatic.borg import environment
from borgmatic.borg import environment, flags
from borgmatic.execute import execute_command
logger = logging.getLogger(__name__)
@ -10,6 +10,7 @@ def compact_segments(
dry_run,
repository,
storage_config,
local_borg_version,
local_path='borg',
remote_path=None,
progress=False,
@ -17,8 +18,8 @@ def compact_segments(
threshold=None,
):
'''
Given dry-run flag, a local or remote repository path, and a storage config dict, compact Borg
segments in a repository.
Given dry-run flag, a local or remote repository path, a storage config dict, and the local
Borg version, compact the segments in a repository.
'''
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
@ -35,13 +36,16 @@ def compact_segments(
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ (repository,)
+ flags.make_repository_flags(repository, local_borg_version)
)
if not dry_run:
execute_command(
full_command,
output_log_level=logging.INFO,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
if dry_run:
logging.info(f'{repository}: Skipping compact (dry run)')
return
execute_command(
full_command,
output_log_level=logging.INFO,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)

View File

@ -3,10 +3,17 @@ import itertools
import logging
import os
import pathlib
import stat
import tempfile
from borgmatic.borg import environment, feature, state
from borgmatic.execute import DO_NOT_CAPTURE, execute_command, execute_command_with_processes
import borgmatic.logger
from borgmatic.borg import environment, feature, flags, state
from borgmatic.execute import (
DO_NOT_CAPTURE,
execute_command,
execute_command_and_capture_output,
execute_command_with_processes,
)
logger = logging.getLogger(__name__)
@ -59,7 +66,7 @@ def map_directories_to_devices(directories):
}
def deduplicate_directories(directory_devices):
def deduplicate_directories(directory_devices, additional_directory_devices):
'''
Given a map from directory to the identifier for the device on which that directory resides,
return the directories as a sorted tuple with all duplicate child directories removed. For
@ -74,22 +81,28 @@ def deduplicate_directories(directory_devices):
there are cases where Borg coming across the same file twice will result in duplicate reads and
even hangs, e.g. when a database hook is using a named pipe for streaming database dumps to
Borg.
If any additional directory devices are given, also deduplicate against them, but don't include
them in the returned directories.
'''
deduplicated = set()
directories = sorted(directory_devices.keys())
additional_directories = sorted(additional_directory_devices.keys())
all_devices = {**directory_devices, **additional_directory_devices}
for directory in directories:
deduplicated.add(directory)
parents = pathlib.PurePath(directory).parents
# If another directory in the given list is a parent of current directory (even n levels
# up) and both are on the same filesystem, then the current directory is a duplicate.
for other_directory in directories:
# If another directory in the given list (or the additional list) is a parent of current
# directory (even n levels up) and both are on the same filesystem, then the current
# directory is a duplicate.
for other_directory in directories + additional_directories:
for parent in parents:
if (
pathlib.PurePath(other_directory) == parent
and directory_devices[directory] is not None
and directory_devices[other_directory] == directory_devices[directory]
and all_devices[directory] is not None
and all_devices[other_directory] == all_devices[directory]
):
if directory in deduplicated:
deduplicated.remove(directory)
@ -98,16 +111,24 @@ def deduplicate_directories(directory_devices):
return tuple(sorted(deduplicated))
def write_pattern_file(patterns=None):
def write_pattern_file(patterns=None, sources=None, pattern_file=None):
'''
Given a sequence of patterns, write them to a named temporary file and return it. Return None
if no patterns are provided.
Given a sequence of patterns and an optional sequence of source directories, write them to a
named temporary file (with the source directories as additional roots) and return the file.
If an optional open pattern file is given, overwrite it instead of making a new temporary file.
Return None if no patterns are provided.
'''
if not patterns:
if not patterns and not sources:
return None
pattern_file = tempfile.NamedTemporaryFile('w')
pattern_file.write('\n'.join(patterns))
if pattern_file is None:
pattern_file = tempfile.NamedTemporaryFile('w')
else:
pattern_file.seek(0)
pattern_file.write(
'\n'.join(tuple(patterns or ()) + tuple(f'R {source}' for source in (sources or [])))
)
pattern_file.flush()
return pattern_file
@ -178,7 +199,7 @@ def make_exclude_flags(location_config, exclude_filename=None):
DEFAULT_ARCHIVE_NAME_FORMAT = '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}'
def borgmatic_source_directories(borgmatic_source_directory):
def collect_borgmatic_source_directories(borgmatic_source_directory):
'''
Return a list of borgmatic-specific source directories used for state like database backups.
'''
@ -192,6 +213,78 @@ def borgmatic_source_directories(borgmatic_source_directory):
)
ROOT_PATTERN_PREFIX = 'R '
def pattern_root_directories(patterns=None):
'''
Given a sequence of patterns, parse out and return just the root directories.
'''
if not patterns:
return []
return [
pattern.split(ROOT_PATTERN_PREFIX, maxsplit=1)[1]
for pattern in patterns
if pattern.startswith(ROOT_PATTERN_PREFIX)
]
def special_file(path):
'''
Return whether the given path is a special file (character device, block device, or named pipe
/ FIFO).
'''
try:
mode = os.stat(path).st_mode
except (FileNotFoundError, OSError):
return False
return stat.S_ISCHR(mode) or stat.S_ISBLK(mode) or stat.S_ISFIFO(mode)
def any_parent_directories(path, candidate_parents):
'''
Return whether any of the given candidate parent directories are an actual parent of the given
path. This includes grandparents, etc.
'''
for parent in candidate_parents:
if pathlib.PurePosixPath(parent) in pathlib.PurePath(path).parents:
return True
return False
def collect_special_file_paths(
create_command, local_path, working_directory, borg_environment, skip_directories
):
'''
Given a Borg create command as a tuple, a local Borg path, a working directory, and a dict of
environment variables to pass to Borg, and a sequence of parent directories to skip, collect the
paths for any special files (character devices, block devices, and named pipes / FIFOs) that
Borg would encounter during a create. These are all paths that could cause Borg to hang if its
--read-special flag is used.
'''
paths_output = execute_command_and_capture_output(
create_command + ('--dry-run', '--list'),
capture_stderr=True,
working_directory=working_directory,
extra_environment=borg_environment,
)
paths = tuple(
path_line.split(' ', 1)[1]
for path_line in paths_output.split('\n')
if path_line and path_line.startswith('- ')
)
return tuple(
path
for path in paths
if special_file(path) and not any_parent_directories(path, skip_directories)
)
def create_archive(
dry_run,
repository,
@ -203,7 +296,7 @@ def create_archive(
progress=False,
stats=False,
json=False,
files=False,
list_files=False,
stream_processes=None,
):
'''
@ -213,27 +306,40 @@ def create_archive(
If a sequence of stream processes is given (instances of subprocess.Popen), then execute the
create command while also triggering the given processes to produce output.
'''
borgmatic.logger.add_custom_log_levels()
borgmatic_source_directories = expand_directories(
collect_borgmatic_source_directories(location_config.get('borgmatic_source_directory'))
)
sources = deduplicate_directories(
map_directories_to_devices(
expand_directories(
location_config['source_directories']
+ borgmatic_source_directories(location_config.get('borgmatic_source_directory'))
tuple(location_config.get('source_directories', ())) + borgmatic_source_directories
)
)
),
additional_directory_devices=map_directories_to_devices(
expand_directories(pattern_root_directories(location_config.get('patterns')))
),
)
ensure_files_readable(location_config.get('patterns_from'), location_config.get('exclude_from'))
try:
working_directory = os.path.expanduser(location_config.get('working_directory'))
except TypeError:
working_directory = None
pattern_file = write_pattern_file(location_config.get('patterns'))
pattern_file = (
write_pattern_file(location_config.get('patterns'), sources)
if location_config.get('patterns') or location_config.get('patterns_from')
else None
)
exclude_file = write_pattern_file(
expand_home_directories(location_config.get('exclude_patterns'))
)
checkpoint_interval = storage_config.get('checkpoint_interval', None)
chunker_params = storage_config.get('chunker_params', None)
compression = storage_config.get('compression', None)
remote_rate_limit = storage_config.get('remote_rate_limit', None)
upload_rate_limit = storage_config.get('upload_rate_limit', None)
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
files_cache = location_config.get('files_cache')
@ -246,27 +352,30 @@ def create_archive(
atime_flags = ('--noatime',) if location_config.get('atime') is False else ()
if feature.available(feature.Feature.NOFLAGS, local_borg_version):
noflags_flags = ('--noflags',) if location_config.get('bsd_flags') is False else ()
noflags_flags = ('--noflags',) if location_config.get('flags') is False else ()
else:
noflags_flags = ('--nobsdflags',) if location_config.get('bsd_flags') is False else ()
noflags_flags = ('--nobsdflags',) if location_config.get('flags') is False else ()
if feature.available(feature.Feature.NUMERIC_IDS, local_borg_version):
numeric_ids_flags = ('--numeric-ids',) if location_config.get('numeric_owner') else ()
numeric_ids_flags = ('--numeric-ids',) if location_config.get('numeric_ids') else ()
else:
numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_owner') else ()
numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_ids') else ()
if feature.available(feature.Feature.UPLOAD_RATELIMIT, local_borg_version):
upload_ratelimit_flags = (
('--upload-ratelimit', str(remote_rate_limit)) if remote_rate_limit else ()
('--upload-ratelimit', str(upload_rate_limit)) if upload_rate_limit else ()
)
else:
upload_ratelimit_flags = (
('--remote-ratelimit', str(remote_rate_limit)) if remote_rate_limit else ()
('--remote-ratelimit', str(upload_rate_limit)) if upload_rate_limit else ()
)
ensure_files_readable(location_config.get('patterns_from'), location_config.get('exclude_from'))
if stream_processes and location_config.get('read_special') is False:
logger.warning(
f'{repository}: Ignoring configured "read_special" value of false, as true is needed for database hooks.'
)
full_command = (
create_command = (
tuple(local_path.split(' '))
+ ('create',)
+ make_pattern_flags(location_config, pattern_file.name if pattern_file else None)
@ -284,32 +393,23 @@ def create_archive(
+ atime_flags
+ (('--noctime',) if location_config.get('ctime') is False else ())
+ (('--nobirthtime',) if location_config.get('birthtime') is False else ())
+ (('--read-special',) if (location_config.get('read_special') or stream_processes) else ())
+ (('--read-special',) if location_config.get('read_special') or stream_processes else ())
+ noflags_flags
+ (('--files-cache', files_cache) if files_cache else ())
+ (('--remote-path', remote_path) if remote_path else ())
+ (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ (('--list', '--filter', 'AME-') if files and not json and not progress else ())
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO and not json else ())
+ (('--stats',) if stats and not json and not dry_run else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) and not json else ())
+ (('--list', '--filter', 'AMEx-') if list_files and not json and not progress else ())
+ (('--dry-run',) if dry_run else ())
+ (('--progress',) if progress else ())
+ (('--json',) if json else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ (
'{repository}::{archive_name_format}'.format(
repository=repository, archive_name_format=archive_name_format
),
)
+ sources
+ flags.make_repository_archive_flags(repository, archive_name_format, local_borg_version)
+ (sources if not pattern_file else ())
)
if json:
output_log_level = None
elif (stats or files) and logger.getEffectiveLevel() == logging.WARNING:
output_log_level = logging.WARNING
elif list_files or (stats and not dry_run):
output_log_level = logging.ANSWER
else:
output_log_level = logging.INFO
@ -319,9 +419,42 @@ def create_archive(
borg_environment = environment.make_environment(storage_config)
# If database hooks are enabled (as indicated by streaming processes), exclude files that might
# cause Borg to hang. But skip this if the user has explicitly set the "read_special" to True.
if stream_processes and not location_config.get('read_special'):
logger.debug(f'{repository}: Collecting special file paths')
special_file_paths = collect_special_file_paths(
create_command,
local_path,
working_directory,
borg_environment,
skip_directories=borgmatic_source_directories,
)
logger.warning(
f'{repository}: Excluding special files to prevent Borg from hanging: {", ".join(special_file_paths)}'
)
exclude_file = write_pattern_file(
expand_home_directories(
tuple(location_config.get('exclude_patterns') or ()) + special_file_paths
),
pattern_file=exclude_file,
)
if exclude_file:
create_command += make_exclude_flags(location_config, exclude_file.name)
create_command += (
(('--info',) if logger.getEffectiveLevel() == logging.INFO and not json else ())
+ (('--stats',) if stats and not json and not dry_run else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) and not json else ())
+ (('--progress',) if progress else ())
+ (('--json',) if json else ())
)
if stream_processes:
return execute_command_with_processes(
full_command,
create_command,
stream_processes,
output_log_level,
output_file,
@ -329,12 +462,16 @@ def create_archive(
working_directory=working_directory,
extra_environment=borg_environment,
)
return execute_command(
full_command,
output_log_level,
output_file,
borg_local_path=local_path,
working_directory=working_directory,
extra_environment=borg_environment,
)
elif output_log_level is None:
return execute_command_and_capture_output(
create_command, working_directory=working_directory, extra_environment=borg_environment,
)
else:
execute_command(
create_command,
output_log_level,
output_file,
borg_local_path=local_path,
working_directory=working_directory,
extra_environment=borg_environment,
)
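
A small illustration of the special-file helpers added above, assuming a named pipe has been created at /tmp/db.dump with os.mkfifo(); the paths are made up:

special_file('/etc/hostname')   # False: regular file
special_file('/tmp/db.dump')    # True: named pipe (FIFO), the kind of file that can hang Borg with --read-special
any_parent_directories('/root/.borgmatic/postgresql_databases/localhost/all', ('/root/.borgmatic',))
# True: the borgmatic source directory is a parent, so this dump path is skipped rather than auto-excluded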

View File

@ -1,7 +1,8 @@
import logging
import os
from borgmatic.borg import environment
import borgmatic.logger
from borgmatic.borg import environment, flags
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
logger = logging.getLogger(__name__)
@ -14,21 +15,23 @@ def export_tar_archive(
paths,
destination_path,
storage_config,
local_borg_version,
local_path='borg',
remote_path=None,
tar_filter=None,
files=False,
list_files=False,
strip_components=None,
):
'''
Given a dry-run flag, a local or remote repository path, an archive name, zero or more paths to
export from the archive, a destination path to export to, a storage configuration dict, optional
local and remote Borg paths, an optional filter program, whether to include per-file details,
and an optional number of path components to strip, export the archive into the given
destination path as a tar-formatted file.
export from the archive, a destination path to export to, a storage configuration dict, the
local Borg version, optional local and remote Borg paths, an optional filter program, whether to
include per-file details, and an optional number of path components to strip, export the archive
into the given destination path as a tar-formatted file.
If the destination path is "-", then stream the output to stdout instead of to a file.
'''
borgmatic.logger.add_custom_log_levels()
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
@ -38,18 +41,22 @@ def export_tar_archive(
+ (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--list',) if files else ())
+ (('--list',) if list_files else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--dry-run',) if dry_run else ())
+ (('--tar-filter', tar_filter) if tar_filter else ())
+ (('--strip-components', str(strip_components)) if strip_components else ())
+ ('::'.join((repository if ':' in repository else os.path.abspath(repository), archive)),)
+ flags.make_repository_archive_flags(
repository if ':' in repository else os.path.abspath(repository),
archive,
local_borg_version,
)
+ (destination_path,)
+ (tuple(paths) if paths else ())
)
if files and logger.getEffectiveLevel() == logging.WARNING:
output_log_level = logging.WARNING
if list_files:
output_log_level = logging.ANSWER
else:
output_log_level = logging.INFO

View File

@ -2,14 +2,19 @@ import logging
import os
import subprocess
from borgmatic.borg import environment, feature
from borgmatic.borg import environment, feature, flags, rlist
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
logger = logging.getLogger(__name__)
def extract_last_archive_dry_run(
storage_config, repository, lock_wait=None, local_path='borg', remote_path=None
storage_config,
local_borg_version,
repository,
lock_wait=None,
local_path='borg',
remote_path=None,
):
'''
Perform an extraction dry-run of the most recent archive. If there are no archives, skip the
@ -23,40 +28,23 @@ def extract_last_archive_dry_run(
elif logger.isEnabledFor(logging.INFO):
verbosity_flags = ('--info',)
full_list_command = (
(local_path, 'list', '--short')
+ remote_path_flags
+ lock_wait_flags
+ verbosity_flags
+ (repository,)
)
borg_environment = environment.make_environment(storage_config)
list_output = execute_command(
full_list_command,
output_log_level=None,
borg_local_path=local_path,
extra_environment=borg_environment,
)
try:
last_archive_name = list_output.strip().splitlines()[-1]
except IndexError:
last_archive_name = rlist.resolve_archive_name(
repository, 'latest', storage_config, local_borg_version, local_path, remote_path
)
except ValueError:
logger.warning('No archives found. Skipping extract consistency check.')
return
list_flag = ('--list',) if logger.isEnabledFor(logging.DEBUG) else ()
borg_environment = environment.make_environment(storage_config)
full_extract_command = (
(local_path, 'extract', '--dry-run')
+ remote_path_flags
+ lock_wait_flags
+ verbosity_flags
+ list_flag
+ (
'{repository}::{last_archive_name}'.format(
repository=repository, last_archive_name=last_archive_name
),
)
+ flags.make_repository_archive_flags(repository, last_archive_name, local_borg_version)
)
execute_command(
@ -95,9 +83,9 @@ def extract_archive(
raise ValueError('progress and extract_to_stdout cannot both be set')
if feature.available(feature.Feature.NUMERIC_IDS, local_borg_version):
numeric_ids_flags = ('--numeric-ids',) if location_config.get('numeric_owner') else ()
numeric_ids_flags = ('--numeric-ids',) if location_config.get('numeric_ids') else ()
else:
numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_owner') else ()
numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_ids') else ()
full_command = (
(local_path, 'extract')
@ -111,7 +99,11 @@ def extract_archive(
+ (('--strip-components', str(strip_components)) if strip_components else ())
+ (('--progress',) if progress else ())
+ (('--stdout',) if extract_to_stdout else ())
+ ('::'.join((repository if ':' in repository else os.path.abspath(repository), archive)),)
+ flags.make_repository_archive_flags(
repository if ':' in repository else os.path.abspath(repository),
archive,
local_borg_version,
)
+ (tuple(paths) if paths else ())
)

View File

@ -9,6 +9,11 @@ class Feature(Enum):
NOFLAGS = 3
NUMERIC_IDS = 4
UPLOAD_RATELIMIT = 5
SEPARATE_REPOSITORY_ARCHIVE = 6
RCREATE = 7
RLIST = 8
RINFO = 9
MATCH_ARCHIVES = 10
FEATURE_TO_MINIMUM_BORG_VERSION = {
@ -17,6 +22,11 @@ FEATURE_TO_MINIMUM_BORG_VERSION = {
Feature.NOFLAGS: parse_version('1.2.0a8'), # borg create --noflags
Feature.NUMERIC_IDS: parse_version('1.2.0b3'), # borg create/extract/mount --numeric-ids
Feature.UPLOAD_RATELIMIT: parse_version('1.2.0b3'), # borg create --upload-ratelimit
Feature.SEPARATE_REPOSITORY_ARCHIVE: parse_version('2.0.0a2'), # --repo with separate archive
Feature.RCREATE: parse_version('2.0.0a2'), # borg rcreate
Feature.RLIST: parse_version('2.0.0a2'), # borg rlist
Feature.RINFO: parse_version('2.0.0a2'), # borg rinfo
Feature.MATCH_ARCHIVES: parse_version('2.0.0b3'), # borg --match-archives
}
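
This feature map drives version checks like feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version) seen throughout the diff. A sketch of how it behaves, assuming available() simply compares the feature's minimum version against the parsed local Borg version string:

feature.available(feature.Feature.MATCH_ARCHIVES, '1.2.2')    # False: Borg 1 has no --match-archives
feature.available(feature.Feature.MATCH_ARCHIVES, '2.0.0b3')  # True
feature.available(feature.Feature.RCREATE, '2.0.0b1')         # True: rcreate exists from 2.0.0a2 on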

View File

@ -1,5 +1,7 @@
import itertools
from borgmatic.borg import feature
def make_flags(name, value):
'''
@ -29,3 +31,28 @@ def make_flags_from_arguments(arguments, excludes=()):
if name not in excludes and not name.startswith('_')
)
)
def make_repository_flags(repository, local_borg_version):
'''
Given the path of a Borg repository and the local Borg version, return Borg-version-appropriate
command-line flags (as a tuple) for selecting that repository.
'''
return (
('--repo',)
if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version)
else ()
) + (repository,)
def make_repository_archive_flags(repository, archive, local_borg_version):
'''
Given the path of a Borg repository, an archive name or pattern, and the local Borg version,
return Borg-version-appropriate command-line flags (as a tuple) for selecting that repository
and archive.
'''
return (
('--repo', repository, archive)
if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version)
else (f'{repository}::{archive}',)
)
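
A quick illustration of the two new helpers above (repository path, archive name, and version strings are made up): Borg 1 keeps the positional "repo::archive" form, while Borg 2 switches to separate --repo flags.

make_repository_flags('/srv/backups/repo.borg', '1.2.2')
# -> ('/srv/backups/repo.borg',)
make_repository_flags('/srv/backups/repo.borg', '2.0.0b3')
# -> ('--repo', '/srv/backups/repo.borg')

make_repository_archive_flags('/srv/backups/repo.borg', 'myhost-2022-10-01', '1.2.2')
# -> ('/srv/backups/repo.borg::myhost-2022-10-01',)
make_repository_archive_flags('/srv/backups/repo.borg', 'myhost-2022-10-01', '2.0.0b3')
# -> ('--repo', '/srv/backups/repo.borg', 'myhost-2022-10-01')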

View File

@ -1,20 +1,26 @@
import logging
from borgmatic.borg import environment
from borgmatic.borg.flags import make_flags, make_flags_from_arguments
from borgmatic.execute import execute_command
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
def display_archives_info(
repository, storage_config, info_arguments, local_path='borg', remote_path=None
repository,
storage_config,
local_borg_version,
info_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, and the arguments to the info
action, display summary information for Borg archives in the repository or return JSON summary
information.
Given a local or remote repository path, a storage config dict, the local Borg version, and the
arguments to the info action, display summary information for Borg archives in the repository or
return JSON summary information.
'''
borgmatic.logger.add_custom_log_levels()
lock_wait = storage_config.get('lock_wait', None)
full_command = (
@ -29,19 +35,36 @@ def display_archives_info(
if logger.isEnabledFor(logging.DEBUG) and not info_arguments.json
else ()
)
+ make_flags('remote-path', remote_path)
+ make_flags('lock-wait', lock_wait)
+ make_flags_from_arguments(info_arguments, excludes=('repository', 'archive'))
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait)
+ (
'::'.join((repository, info_arguments.archive))
if info_arguments.archive
else repository,
(
flags.make_flags('match-archives', f'sh:{info_arguments.prefix}*')
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
else flags.make_flags('glob-archives', f'{info_arguments.prefix}*')
)
if info_arguments.prefix
else ()
)
+ flags.make_flags_from_arguments(
info_arguments, excludes=('repository', 'archive', 'prefix')
)
+ flags.make_repository_flags(repository, local_borg_version)
+ (
flags.make_flags('match-archives', info_arguments.archive)
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
else flags.make_flags('glob-archives', info_arguments.archive)
)
)
return execute_command(
full_command,
output_log_level=None if info_arguments.json else logging.WARNING,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
if info_arguments.json:
return execute_command_and_capture_output(
full_command, extra_environment=environment.make_environment(storage_config),
)
else:
execute_command(
full_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)

View File

@ -1,62 +0,0 @@
import argparse
import logging
import subprocess
from borgmatic.borg import environment, info
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
logger = logging.getLogger(__name__)
INFO_REPOSITORY_NOT_FOUND_EXIT_CODE = 2
def initialize_repository(
repository,
storage_config,
encryption_mode,
append_only=None,
storage_quota=None,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage configuration dict, a Borg encryption mode,
whether the repository should be append-only, and the storage quota to use, initialize the
repository. If the repository already exists, then log and skip initialization.
'''
try:
info.display_archives_info(
repository,
storage_config,
argparse.Namespace(json=True, archive=None),
local_path,
remote_path,
)
logger.info('Repository already exists. Skipping initialization.')
return
except subprocess.CalledProcessError as error:
if error.returncode != INFO_REPOSITORY_NOT_FOUND_EXIT_CODE:
raise
extra_borg_options = storage_config.get('extra_borg_options', {}).get('init', '')
init_command = (
(local_path, 'init')
+ (('--encryption', encryption_mode) if encryption_mode else ())
+ (('--append-only',) if append_only else ())
+ (('--storage-quota', storage_quota) if storage_quota else ())
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug',) if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--remote-path', remote_path) if remote_path else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ (repository,)
)
# Do not capture output here, so as to support interactive prompts.
execute_command(
init_command,
output_file=DO_NOT_CAPTURE,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)

View File

@ -1,58 +1,32 @@
import argparse
import copy
import logging
import re
from borgmatic.borg import environment
from borgmatic.borg.flags import make_flags, make_flags_from_arguments
from borgmatic.execute import execute_command
import borgmatic.logger
from borgmatic.borg import environment, feature, flags, rlist
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
def resolve_archive_name(repository, archive, storage_config, local_path='borg', remote_path=None):
'''
Given a local or remote repository path, an archive name, a storage config dict, a local Borg
path, and a remote Borg path, simply return the archive name. But if the archive name is
"latest", then instead introspect the repository for the latest archive and return its name.
Raise ValueError if "latest" is given but there are no archives in the repository.
'''
if archive != "latest":
return archive
lock_wait = storage_config.get('lock_wait', None)
full_command = (
(local_path, 'list')
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ make_flags('remote-path', remote_path)
+ make_flags('lock-wait', lock_wait)
+ make_flags('last', 1)
+ ('--short', repository)
)
output = execute_command(
full_command,
output_log_level=None,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
try:
latest_archive = output.strip().splitlines()[-1]
except IndexError:
raise ValueError('No archives found in the repository')
logger.debug('{}: Latest archive is {}'.format(repository, latest_archive))
return latest_archive
MAKE_FLAGS_EXCLUDES = ('repository', 'archive', 'successful', 'paths', 'find_paths')
ARCHIVE_FILTER_FLAGS_MOVED_TO_RLIST = ('prefix', 'match_archives', 'sort_by', 'first', 'last')
MAKE_FLAGS_EXCLUDES = (
'repository',
'archive',
'successful',
'paths',
'find_paths',
) + ARCHIVE_FILTER_FLAGS_MOVED_TO_RLIST
def make_list_command(
repository, storage_config, list_arguments, local_path='borg', remote_path=None
repository,
storage_config,
local_borg_version,
list_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, the arguments to the list
@ -73,13 +47,15 @@ def make_list_command(
if logger.isEnabledFor(logging.DEBUG) and not list_arguments.json
else ()
)
+ make_flags('remote-path', remote_path)
+ make_flags('lock-wait', lock_wait)
+ make_flags_from_arguments(list_arguments, excludes=MAKE_FLAGS_EXCLUDES,)
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait)
+ flags.make_flags_from_arguments(list_arguments, excludes=MAKE_FLAGS_EXCLUDES)
+ (
('::'.join((repository, list_arguments.archive)),)
flags.make_repository_archive_flags(
repository, list_arguments.archive, local_borg_version
)
if list_arguments.archive
else (repository,)
else flags.make_repository_flags(repository, local_borg_version)
)
+ (tuple(list_arguments.paths) if list_arguments.paths else ())
)
@ -109,32 +85,84 @@ def make_find_paths(find_paths):
)
def list_archives(repository, storage_config, list_arguments, local_path='borg', remote_path=None):
def list_archive(
repository,
storage_config,
local_borg_version,
list_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, the arguments to the list
action, and local and remote Borg paths, display the output of listing Borg archives in the
repository or return JSON output. Or, if an archive name is given, list the files in that
archive. Or, if list_arguments.find_paths are given, list the files by searching across multiple
archives.
Given a local or remote repository path, a storage config dict, the local Borg version, the
arguments to the list action, and local and remote Borg paths, display the output of listing
the files of a Borg archive (or return JSON output). If list_arguments.find_paths are given,
list the files by searching across multiple archives. If neither find_paths nor an archive name
is given, instead list the archives in the given repository.
'''
borgmatic.logger.add_custom_log_levels()
if not list_arguments.archive and not list_arguments.find_paths:
if feature.available(feature.Feature.RLIST, local_borg_version):
logger.warning(
'Omitting the --archive flag on the list action is deprecated when using Borg 2.x+. Use the rlist action instead.'
)
rlist_arguments = argparse.Namespace(
repository=repository,
short=list_arguments.short,
format=list_arguments.format,
json=list_arguments.json,
prefix=list_arguments.prefix,
match_archives=list_arguments.match_archives,
sort_by=list_arguments.sort_by,
first=list_arguments.first,
last=list_arguments.last,
)
return rlist.list_repository(
repository, storage_config, local_borg_version, rlist_arguments, local_path, remote_path
)
if list_arguments.archive:
for name in ARCHIVE_FILTER_FLAGS_MOVED_TO_RLIST:
if getattr(list_arguments, name, None):
logger.warning(
f"The --{name.replace('_', '-')} flag on the list action is ignored when using the --archive flag."
)
if list_arguments.json:
raise ValueError(
'The --json flag on the list action is not supported when using the --archive/--find flags.'
)
borg_environment = environment.make_environment(storage_config)
# If there are any paths to find (and there's not a single archive already selected), start by
# getting a list of archives to search.
if list_arguments.find_paths and not list_arguments.archive:
repository_arguments = copy.copy(list_arguments)
repository_arguments.archive = None
repository_arguments.json = False
repository_arguments.format = None
rlist_arguments = argparse.Namespace(
repository=repository,
short=True,
format=None,
json=None,
prefix=list_arguments.prefix,
match_archives=list_arguments.match_archives,
sort_by=list_arguments.sort_by,
first=list_arguments.first,
last=list_arguments.last,
)
# Ask Borg to list archives. Capture its output for use below.
archive_lines = tuple(
execute_command(
make_list_command(
repository, storage_config, repository_arguments, local_path, remote_path
execute_command_and_capture_output(
rlist.make_rlist_command(
repository,
storage_config,
local_borg_version,
rlist_arguments,
local_path,
remote_path,
),
output_log_level=None,
borg_local_path=local_path,
extra_environment=borg_environment,
)
.strip('\n')
@ -144,27 +172,29 @@ def list_archives(repository, storage_config, list_arguments, local_path='borg',
archive_lines = (list_arguments.archive,)
# For each archive listed by Borg, run list on the contents of that archive.
for archive_line in archive_lines:
try:
archive = archive_line.split()[0]
except (AttributeError, IndexError):
archive = None
if archive:
logger.warning(archive_line)
for archive in archive_lines:
logger.answer(f'{repository}: Listing archive {archive}')
archive_arguments = copy.copy(list_arguments)
archive_arguments.archive = archive
# This list call is to show the files in a single archive, not list multiple archives. So
# blank out any archive filtering flags. They'll break anyway in Borg 2.
for name in ARCHIVE_FILTER_FLAGS_MOVED_TO_RLIST:
setattr(archive_arguments, name, None)
main_command = make_list_command(
repository, storage_config, archive_arguments, local_path, remote_path
repository,
storage_config,
local_borg_version,
archive_arguments,
local_path,
remote_path,
) + make_find_paths(list_arguments.find_paths)
output = execute_command(
execute_command(
main_command,
output_log_level=None if list_arguments.json else logging.WARNING,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=borg_environment,
)
if list_arguments.json:
return output
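
With this rework, list_archive() only lists files within archives (delegating repository-level listing to rlist) and logs at the ANSWER level rather than returning captured output. A rough sketch of listing one archive's files follows (not part of this diff); the repository path, archive name, and version string are hypothetical, and the Namespace only carries the list-action attributes this code path reads.

import argparse

from borgmatic.borg import list as borg_list

borg_list.list_archive(
    repository='/var/lib/backups/repo.borg',
    storage_config={},
    local_borg_version='1.2.2',
    list_arguments=argparse.Namespace(
        archive='myhost-2022-10-01T03:00:00', paths=None, find_paths=None, json=False
    ),
)
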

View File

@ -1,6 +1,6 @@
import logging
from borgmatic.borg import environment
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
logger = logging.getLogger(__name__)
@ -14,13 +14,15 @@ def mount_archive(
foreground,
options,
storage_config,
local_borg_version,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, an optional archive name, a filesystem mount point,
zero or more paths to mount from the archive, extra Borg mount options, a storage configuration
dict, and optional local and remote Borg paths, mount the archive onto the mount point.
dict, the local Borg version, and optional local and remote Borg paths, mount the archive onto
the mount point.
'''
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
@ -34,7 +36,22 @@ def mount_archive(
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--foreground',) if foreground else ())
+ (('-o', options) if options else ())
+ (('::'.join((repository, archive)),) if archive else (repository,))
+ (
(
flags.make_repository_flags(repository, local_borg_version)
+ (
('--match-archives', archive)
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
else ('--glob-archives', archive)
)
)
if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version)
else (
flags.make_repository_archive_flags(repository, archive, local_borg_version)
if archive
else flags.make_repository_flags(repository, local_borg_version)
)
)
+ (mount_point,)
+ (tuple(paths) if paths else ())
)
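
The net effect of the new flag logic: on Borg 2.x the archive is selected with --repo plus --match-archives, while on Borg 1.x it stays in the repository::archive form. A rough invocation sketch follows (not part of this diff); the names of the leading parameters are assumed from the docstring, the paths and version string are hypothetical, and an installed Borg with a working FUSE setup is assumed.

from borgmatic.borg import mount

mount.mount_archive(
    repository='/var/lib/backups/repo.borg',   # parameter names assumed, not shown in this hunk
    archive='myhost-2022-10-01T03:00:00',
    mount_point='/mnt/borg',
    paths=None,
    foreground=False,
    options=None,
    storage_config={},
    local_borg_version='1.2.2',
)
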

View File

@ -1,12 +1,13 @@
import logging
from borgmatic.borg import environment
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command
logger = logging.getLogger(__name__)
def _make_prune_flags(retention_config):
def make_prune_flags(retention_config, local_borg_version):
'''
Given a retention config dict mapping from option name to value, transform it into an iterable of
command-line name-value flag pairs.
@ -23,11 +24,13 @@ def _make_prune_flags(retention_config):
)
'''
config = retention_config.copy()
prefix = config.pop('prefix', '{hostname}-')
if 'prefix' not in config:
config['prefix'] = '{hostname}-'
elif not config['prefix']:
config.pop('prefix')
if prefix:
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
config['match_archives'] = f'sh:{prefix}*'
else:
config['glob_archives'] = f'{prefix}*'
return (
('--' + option_name.replace('_', '-'), str(value)) for option_name, value in config.items()
@ -39,37 +42,43 @@ def prune_archives(
repository,
storage_config,
retention_config,
local_borg_version,
local_path='borg',
remote_path=None,
stats=False,
files=False,
list_archives=False,
):
'''
Given a dry-run flag, a local or remote repository path, a storage config dict, a retention
config dict, and the local Borg version, prune Borg archives according to the retention policy
specified in that configuration.
'''
borgmatic.logger.add_custom_log_levels()
umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None)
extra_borg_options = storage_config.get('extra_borg_options', {}).get('prune', '')
full_command = (
(local_path, 'prune')
+ tuple(element for pair in _make_prune_flags(retention_config) for element in pair)
+ tuple(
element
for pair in make_prune_flags(retention_config, local_borg_version)
for element in pair
)
+ (('--remote-path', remote_path) if remote_path else ())
+ (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ (('--stats',) if stats and not dry_run else ())
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--list',) if files else ())
+ (('--list',) if list_archives else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--dry-run',) if dry_run else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ (repository,)
+ flags.make_repository_flags(repository, local_borg_version)
)
if (stats or files) and logger.getEffectiveLevel() == logging.WARNING:
output_log_level = logging.WARNING
if stats or list_archives:
output_log_level = logging.ANSWER
else:
output_log_level = logging.INFO
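
Since make_prune_flags() is now public and version-aware, its behavior is easy to see in isolation: the retention prefix turns into --match-archives on Borg versions with that feature and --glob-archives otherwise. An illustrative sketch (not part of this diff); the retention values and the Borg 1.1 version string are hypothetical.

from borgmatic.borg import prune

# With a Borg version lacking the MATCH_ARCHIVES feature, the prefix becomes --glob-archives.
flag_pairs = list(prune.make_prune_flags({'keep_daily': 7, 'prefix': 'myhost-'}, '1.1.18'))
print(flag_pairs)
# Expected along the lines of: [('--keep-daily', '7'), ('--glob-archives', 'myhost-*')]
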

81
borgmatic/borg/rcreate.py Normal file
View File

@ -0,0 +1,81 @@
import argparse
import logging
import subprocess
from borgmatic.borg import environment, feature, flags, rinfo
from borgmatic.execute import DO_NOT_CAPTURE, execute_command
logger = logging.getLogger(__name__)
RINFO_REPOSITORY_NOT_FOUND_EXIT_CODE = 2
def create_repository(
dry_run,
repository,
storage_config,
local_borg_version,
encryption_mode,
source_repository=None,
copy_crypt_key=False,
append_only=None,
storage_quota=None,
make_parent_dirs=False,
local_path='borg',
remote_path=None,
):
'''
Given a dry-run flag, a local or remote repository path, a storage configuration dict, the local
Borg version, a Borg encryption mode, the path to another repo whose key material should be
reused, whether the repository should be append-only, and the storage quota to use, create the
repository. If the repository already exists, then log and skip creation.
'''
try:
rinfo.display_repository_info(
repository,
storage_config,
local_borg_version,
argparse.Namespace(json=True),
local_path,
remote_path,
)
logger.info(f'{repository}: Repository already exists. Skipping creation.')
return
except subprocess.CalledProcessError as error:
if error.returncode != RINFO_REPOSITORY_NOT_FOUND_EXIT_CODE:
raise
extra_borg_options = storage_config.get('extra_borg_options', {}).get('rcreate', '')
rcreate_command = (
(local_path,)
+ (
('rcreate',)
if feature.available(feature.Feature.RCREATE, local_borg_version)
else ('init',)
)
+ (('--encryption', encryption_mode) if encryption_mode else ())
+ (('--other-repo', source_repository) if source_repository else ())
+ (('--copy-crypt-key',) if copy_crypt_key else ())
+ (('--append-only',) if append_only else ())
+ (('--storage-quota', storage_quota) if storage_quota else ())
+ (('--make-parent-dirs',) if make_parent_dirs else ())
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug',) if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--remote-path', remote_path) if remote_path else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ flags.make_repository_flags(repository, local_borg_version)
)
if dry_run:
logger.info(f'{repository}: Skipping repository creation (dry run)')
return
# Do not capture output here, so as to support interactive prompts.
execute_command(
rcreate_command,
output_file=DO_NOT_CAPTURE,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
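
A rough dry-run sketch of the new action (not part of this diff): the repository path, encryption mode, and version string are hypothetical, and an installed Borg is assumed because the function first probes the repository with rinfo before deciding whether to create it.

from borgmatic.borg import rcreate

# With dry_run=True, the rcreate/init command is built but never executed.
rcreate.create_repository(
    dry_run=True,
    repository='/var/lib/backups/new-repo.borg',
    storage_config={},
    local_borg_version='2.0.0b4',
    encryption_mode='repokey',
)
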

61
borgmatic/borg/rinfo.py Normal file
View File

@ -0,0 +1,61 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
def display_repository_info(
repository,
storage_config,
local_borg_version,
rinfo_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, the local Borg version, and the
arguments to the rinfo action, display summary information for the Borg repository or return
JSON summary information.
'''
borgmatic.logger.add_custom_log_levels()
lock_wait = storage_config.get('lock_wait', None)
full_command = (
(local_path,)
+ (
('rinfo',)
if feature.available(feature.Feature.RINFO, local_borg_version)
else ('info',)
)
+ (
('--info',)
if logger.getEffectiveLevel() == logging.INFO and not rinfo_arguments.json
else ()
)
+ (
('--debug', '--show-rc')
if logger.isEnabledFor(logging.DEBUG) and not rinfo_arguments.json
else ()
)
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait)
+ (('--json',) if rinfo_arguments.json else ())
+ flags.make_repository_flags(repository, local_borg_version)
)
extra_environment = environment.make_environment(storage_config)
if rinfo_arguments.json:
return execute_command_and_capture_output(
full_command, extra_environment=extra_environment,
)
else:
execute_command(
full_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=extra_environment,
)
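
The same JSON-versus-logging split applies here. A sketch of grabbing repository-level JSON (not part of this diff); the repository path and version string are hypothetical, and an existing repository plus an installed Borg are assumed. On Borg 1.x the function falls back to the plain info subcommand.

import argparse
import json

from borgmatic.borg import rinfo

repository_info = json.loads(
    rinfo.display_repository_info(
        repository='/var/lib/backups/repo.borg',
        storage_config={},
        local_borg_version='1.2.2',
        rinfo_arguments=argparse.Namespace(json=True),
    )
)
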

127
borgmatic/borg/rlist.py Normal file
View File

@ -0,0 +1,127 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, feature, flags
from borgmatic.execute import execute_command, execute_command_and_capture_output
logger = logging.getLogger(__name__)
def resolve_archive_name(
repository, archive, storage_config, local_borg_version, local_path='borg', remote_path=None
):
'''
Given a local or remote repository path, an archive name, a storage config dict, the local Borg
version, a local Borg path, and a remote Borg path, simply return the archive name. But if the
archive name is
"latest", then instead introspect the repository for the latest archive and return its name.
Raise ValueError if "latest" is given but there are no archives in the repository.
'''
if archive != "latest":
return archive
lock_wait = storage_config.get('lock_wait', None)
full_command = (
(
local_path,
'rlist' if feature.available(feature.Feature.RLIST, local_borg_version) else 'list',
)
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait)
+ flags.make_flags('last', 1)
+ ('--short',)
+ flags.make_repository_flags(repository, local_borg_version)
)
output = execute_command_and_capture_output(
full_command, extra_environment=environment.make_environment(storage_config),
)
try:
latest_archive = output.strip().splitlines()[-1]
except IndexError:
raise ValueError('No archives found in the repository')
logger.debug('{}: Latest archive is {}'.format(repository, latest_archive))
return latest_archive
MAKE_FLAGS_EXCLUDES = ('repository', 'prefix')
def make_rlist_command(
repository,
storage_config,
local_borg_version,
rlist_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, the local Borg version, the
arguments to the rlist action, and local and remote Borg paths, return a command as a tuple to
list archives within a repository.
'''
lock_wait = storage_config.get('lock_wait', None)
return (
(
local_path,
'rlist' if feature.available(feature.Feature.RLIST, local_borg_version) else 'list',
)
+ (
('--info',)
if logger.getEffectiveLevel() == logging.INFO and not rlist_arguments.json
else ()
)
+ (
('--debug', '--show-rc')
if logger.isEnabledFor(logging.DEBUG) and not rlist_arguments.json
else ()
)
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait)
+ (
(
flags.make_flags('match-archives', f'sh:{rlist_arguments.prefix}*')
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
else flags.make_flags('glob-archives', f'{rlist_arguments.prefix}*')
)
if rlist_arguments.prefix
else ()
)
+ flags.make_flags_from_arguments(rlist_arguments, excludes=MAKE_FLAGS_EXCLUDES)
+ flags.make_repository_flags(repository, local_borg_version)
)
def list_repository(
repository,
storage_config,
local_borg_version,
rlist_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a local or remote repository path, a storage config dict, the local Borg version, the
arguments to the rlist action, and local and remote Borg paths, display the output of listing
Borg archives in the given repository (or return JSON output).
'''
borgmatic.logger.add_custom_log_levels()
borg_environment = environment.make_environment(storage_config)
main_command = make_rlist_command(
repository, storage_config, local_borg_version, rlist_arguments, local_path, remote_path
)
if rlist_arguments.json:
return execute_command_and_capture_output(main_command, extra_environment=borg_environment,)
else:
execute_command(
main_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=borg_environment,
)
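
resolve_archive_name() now lives here and takes the local Borg version so it can pick rlist over list. A short sketch (not part of this diff); the repository path and version string are hypothetical, and a non-empty repository plus an installed Borg are assumed.

from borgmatic.borg import rlist

# Anything other than "latest" is returned unchanged; "latest" asks Borg for the newest archive.
archive = rlist.resolve_archive_name(
    '/var/lib/backups/repo.borg', 'latest', storage_config={}, local_borg_version='1.2.2'
)
print(archive)
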

View File

@ -0,0 +1,50 @@
import logging
import borgmatic.logger
from borgmatic.borg import environment, flags
from borgmatic.execute import execute_command
logger = logging.getLogger(__name__)
def transfer_archives(
dry_run,
repository,
storage_config,
local_borg_version,
transfer_arguments,
local_path='borg',
remote_path=None,
):
'''
Given a dry-run flag, a local or remote repository path, a storage config dict, the local Borg
version, and the arguments to the transfer action, transfer archives to the given repository.
'''
borgmatic.logger.add_custom_log_levels()
full_command = (
(local_path, 'transfer')
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', storage_config.get('lock_wait', None))
+ (
flags.make_flags(
'match-archives', transfer_arguments.match_archives or transfer_arguments.archive
)
)
+ flags.make_flags_from_arguments(
transfer_arguments,
excludes=('repository', 'source_repository', 'archive', 'match_archives'),
)
+ flags.make_repository_flags(repository, local_borg_version)
+ flags.make_flags('other-repo', transfer_arguments.source_repository)
+ flags.make_flags('dry-run', dry_run)
)
return execute_command(
full_command,
output_log_level=logging.ANSWER,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
)
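
A rough dry-run sketch of the new transfer action (not part of this diff): the repository paths and the Borg 2 version string are hypothetical, and a Borg 2.x binary is assumed since transfer does not exist in Borg 1.x.

import argparse

from borgmatic.borg import transfer

transfer.transfer_archives(
    dry_run=True,
    repository='/var/lib/backups/new-repo.borg',
    storage_config={},
    local_borg_version='2.0.0b4',
    transfer_arguments=argparse.Namespace(
        archive=None,
        match_archives=None,
        source_repository='/var/lib/backups/old-repo.borg',
    ),
)
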

View File

@ -1,7 +1,7 @@
import logging
from borgmatic.borg import environment
from borgmatic.execute import execute_command
from borgmatic.execute import execute_command_and_capture_output
logger = logging.getLogger(__name__)
@ -18,11 +18,8 @@ def local_borg_version(storage_config, local_path='borg'):
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
)
output = execute_command(
full_command,
output_log_level=None,
borg_local_path=local_path,
extra_environment=environment.make_environment(storage_config),
output = execute_command_and_capture_output(
full_command, extra_environment=environment.make_environment(storage_config),
)
try:

View File

@ -4,18 +4,22 @@ from argparse import Action, ArgumentParser
from borgmatic.config import collect
SUBPARSER_ALIASES = {
'init': ['--init', '-I'],
'prune': ['--prune', '-p'],
'rcreate': ['init', '-I'],
'prune': ['-p'],
'compact': [],
'create': ['--create', '-C'],
'check': ['--check', '-k'],
'extract': ['--extract', '-x'],
'export-tar': ['--export-tar'],
'mount': ['--mount', '-m'],
'umount': ['--umount', '-u'],
'restore': ['--restore', '-r'],
'list': ['--list', '-l'],
'info': ['--info', '-i'],
'create': ['-C'],
'check': ['-k'],
'extract': ['-x'],
'export-tar': [],
'mount': ['-m'],
'umount': ['-u'],
'restore': ['-r'],
'rlist': [],
'list': ['-l'],
'rinfo': [],
'info': ['-i'],
'transfer': [],
'break-lock': [],
'borg': [],
}
@ -222,33 +226,93 @@ def make_parsers():
metavar='',
help='Specify zero or more actions. Defaults to prune, compact, create, and check. Use --help with action for details:',
)
init_parser = subparsers.add_parser(
'init',
aliases=SUBPARSER_ALIASES['init'],
help='Initialize an empty Borg repository',
description='Initialize an empty Borg repository',
rcreate_parser = subparsers.add_parser(
'rcreate',
aliases=SUBPARSER_ALIASES['rcreate'],
help='Create a new, empty Borg repository',
description='Create a new, empty Borg repository',
add_help=False,
)
init_group = init_parser.add_argument_group('init arguments')
init_group.add_argument(
rcreate_group = rcreate_parser.add_argument_group('rcreate arguments')
rcreate_group.add_argument(
'-e',
'--encryption',
dest='encryption_mode',
help='Borg repository encryption mode',
required=True,
)
init_group.add_argument(
'--append-only',
dest='append_only',
rcreate_group.add_argument(
'--source-repository',
'--other-repo',
metavar='KEY_REPOSITORY',
help='Path to an existing Borg repository whose key material should be reused (Borg 2.x+ only)',
)
rcreate_group.add_argument(
'--copy-crypt-key',
action='store_true',
help='Create an append-only repository',
help='Copy the crypt key used for authenticated encryption from the source repository, defaults to a new random key (Borg 2.x+ only)',
)
init_group.add_argument(
'--storage-quota',
dest='storage_quota',
help='Create a repository with a fixed storage quota',
rcreate_group.add_argument(
'--append-only', action='store_true', help='Create an append-only repository',
)
rcreate_group.add_argument(
'--storage-quota', help='Create a repository with a fixed storage quota',
)
rcreate_group.add_argument(
'--make-parent-dirs',
action='store_true',
help='Create any missing parent directories of the repository directory',
)
rcreate_group.add_argument(
'-h', '--help', action='help', help='Show this help message and exit'
)
transfer_parser = subparsers.add_parser(
'transfer',
aliases=SUBPARSER_ALIASES['transfer'],
help='Transfer archives from one repository to another, optionally upgrading the transferred data (Borg 2.0+ only)',
description='Transfer archives from one repository to another, optionally upgrading the transferred data (Borg 2.0+ only)',
add_help=False,
)
transfer_group = transfer_parser.add_argument_group('transfer arguments')
transfer_group.add_argument(
'--repository',
help='Path of existing destination repository to transfer archives to, defaults to the configured repository if there is only one',
)
transfer_group.add_argument(
'--source-repository',
help='Path of existing source repository to transfer archives from',
required=True,
)
transfer_group.add_argument(
'--archive',
help='Name of single archive to transfer (or "latest"), defaults to transferring all archives',
)
transfer_group.add_argument(
'--upgrader',
help='Upgrader type used to convert the transferred data, e.g. "From12To20" to upgrade data from Borg 1.2 to 2.0 format, defaults to no conversion',
)
transfer_group.add_argument(
'-a',
'--match-archives',
'--glob-archives',
metavar='PATTERN',
help='Only transfer archives with names matching this pattern',
)
transfer_group.add_argument(
'--sort-by', metavar='KEYS', help='Comma-separated list of sorting keys'
)
transfer_group.add_argument(
'--first',
metavar='N',
help='Only transfer first N archives after other filters are applied',
)
transfer_group.add_argument(
'--last', metavar='N', help='Only transfer last N archives after other filters are applied'
)
transfer_group.add_argument(
'-h', '--help', action='help', help='Show this help message and exit'
)
init_group.add_argument('-h', '--help', action='help', help='Show this help message and exit')
prune_parser = subparsers.add_parser(
'prune',
@ -266,7 +330,7 @@ def make_parsers():
help='Display statistics of archive',
)
prune_group.add_argument(
'--files', dest='files', default=False, action='store_true', help='Show per-file details'
'--list', dest='list_archives', action='store_true', help='List archives kept/pruned'
)
prune_group.add_argument('-h', '--help', action='help', help='Show this help message and exit')
@ -290,7 +354,7 @@ def make_parsers():
dest='cleanup_commits',
default=False,
action='store_true',
help='Cleanup commit-only 17-byte segment files left behind by Borg 1.1',
help='Clean up commit-only 17-byte segment files left behind by Borg 1.1 (flag in Borg 1.2 only)',
)
compact_group.add_argument(
'--threshold',
@ -305,8 +369,8 @@ def make_parsers():
create_parser = subparsers.add_parser(
'create',
aliases=SUBPARSER_ALIASES['create'],
help='Create archives (actually perform backups)',
description='Create archives (actually perform backups)',
help='Create an archive (actually perform a backup)',
description='Create an archive (actually perform a backup)',
add_help=False,
)
create_group = create_parser.add_argument_group('create arguments')
@ -325,7 +389,7 @@ def make_parsers():
help='Display statistics of archive',
)
create_group.add_argument(
'--files', dest='files', default=False, action='store_true', help='Show per-file details'
'--list', '--files', dest='list_files', action='store_true', help='Show per-file details'
)
create_group.add_argument(
'--json', dest='json', default=False, action='store_true', help='Output results as JSON'
@ -443,14 +507,14 @@ def make_parsers():
'--destination',
metavar='PATH',
dest='destination',
help='Path to destination export tar file, or "-" for stdout (but be careful about dirtying output with --verbosity or --files)',
help='Path to destination export tar file, or "-" for stdout (but be careful about dirtying output with --verbosity or --list)',
required=True,
)
export_tar_group.add_argument(
'--tar-filter', help='Name of filter program to pipe data through'
)
export_tar_group.add_argument(
'--files', default=False, action='store_true', help='Show per-file details'
'--list', '--files', dest='list_files', action='store_true', help='Show per-file details'
)
export_tar_group.add_argument(
'--strip-components',
@ -543,18 +607,58 @@ def make_parsers():
'-h', '--help', action='help', help='Show this help message and exit'
)
rlist_parser = subparsers.add_parser(
'rlist',
aliases=SUBPARSER_ALIASES['rlist'],
help='List repository',
description='List the archives in a repository',
add_help=False,
)
rlist_group = rlist_parser.add_argument_group('rlist arguments')
rlist_group.add_argument(
'--repository', help='Path of repository to list, defaults to the configured repositories',
)
rlist_group.add_argument(
'--short', default=False, action='store_true', help='Output only archive names'
)
rlist_group.add_argument('--format', help='Format for archive listing')
rlist_group.add_argument(
'--json', default=False, action='store_true', help='Output results as JSON'
)
rlist_group.add_argument(
'-P', '--prefix', help='Only list archive names starting with this prefix'
)
rlist_group.add_argument(
'-a',
'--match-archives',
'--glob-archives',
metavar='PATTERN',
help='Only list archive names matching this pattern',
)
rlist_group.add_argument(
'--sort-by', metavar='KEYS', help='Comma-separated list of sorting keys'
)
rlist_group.add_argument(
'--first', metavar='N', help='List first N archives after other filters are applied'
)
rlist_group.add_argument(
'--last', metavar='N', help='List last N archives after other filters are applied'
)
rlist_group.add_argument('-h', '--help', action='help', help='Show this help message and exit')
list_parser = subparsers.add_parser(
'list',
aliases=SUBPARSER_ALIASES['list'],
help='List archives',
description='List archives or the contents of an archive',
help='List archive',
description='List the files in an archive or search for a file across archives',
add_help=False,
)
list_group = list_parser.add_argument_group('list arguments')
list_group.add_argument(
'--repository', help='Path of repository to list, defaults to the configured repositories',
'--repository',
help='Path of repository containing archive to list, defaults to the configured repositories',
)
list_group.add_argument('--archive', help='Name of archive to list (or "latest")')
list_group.add_argument('--archive', help='Name of the archive to list (or "latest")')
list_group.add_argument(
'--path',
metavar='PATH',
@ -570,7 +674,7 @@ def make_parsers():
help='Partial paths or patterns to search for and list across multiple archives',
)
list_group.add_argument(
'--short', default=False, action='store_true', help='Output only archive or path names'
'--short', default=False, action='store_true', help='Output only path names'
)
list_group.add_argument('--format', help='Format for file listing')
list_group.add_argument(
@ -580,13 +684,17 @@ def make_parsers():
'-P', '--prefix', help='Only list archive names starting with this prefix'
)
list_group.add_argument(
'-a', '--glob-archives', metavar='GLOB', help='Only list archive names matching this glob'
'-a',
'--match-archives',
'--glob-archives',
metavar='PATTERN',
help='Only list archive names matching this pattern',
)
list_group.add_argument(
'--successful',
default=True,
action='store_true',
help='Deprecated in favor of listing successful (non-checkpoint) backups by default in newer versions of Borg',
help='Deprecated; no effect. Newer versions of Borg show successful (non-checkpoint) archives by default.',
)
list_group.add_argument(
'--sort-by', metavar='KEYS', help='Comma-separated list of sorting keys'
@ -611,17 +719,34 @@ def make_parsers():
)
list_group.add_argument('-h', '--help', action='help', help='Show this help message and exit')
rinfo_parser = subparsers.add_parser(
'rinfo',
aliases=SUBPARSER_ALIASES['rinfo'],
help='Show repository summary information such as disk space used',
description='Show repository summary information such as disk space used',
add_help=False,
)
rinfo_group = rinfo_parser.add_argument_group('rinfo arguments')
rinfo_group.add_argument(
'--repository',
help='Path of repository to show info for, defaults to the configured repository if there is only one',
)
rinfo_group.add_argument(
'--json', dest='json', default=False, action='store_true', help='Output results as JSON'
)
rinfo_group.add_argument('-h', '--help', action='help', help='Show this help message and exit')
info_parser = subparsers.add_parser(
'info',
aliases=SUBPARSER_ALIASES['info'],
help='Display summary information on archives',
description='Display summary information on archives',
help='Show archive summary information such as disk space used',
description='Show archive summary information such as disk space used',
add_help=False,
)
info_group = info_parser.add_argument_group('info arguments')
info_group.add_argument(
'--repository',
help='Path of repository to show info for, defaults to the configured repository if there is only one',
help='Path of repository containing archive to show info for, defaults to the configured repository if there is only one',
)
info_group.add_argument('--archive', help='Name of archive to show info for (or "latest")')
info_group.add_argument(
@ -632,9 +757,10 @@ def make_parsers():
)
info_group.add_argument(
'-a',
'--match-archives',
'--glob-archives',
metavar='GLOB',
help='Only show info for archive names matching this glob',
metavar='PATTERN',
help='Only show info for archive names matching this pattern',
)
info_group.add_argument(
'--sort-by', metavar='KEYS', help='Comma-separated list of sorting keys'
@ -649,6 +775,22 @@ def make_parsers():
)
info_group.add_argument('-h', '--help', action='help', help='Show this help message and exit')
break_lock_parser = subparsers.add_parser(
'break-lock',
aliases=SUBPARSER_ALIASES['break-lock'],
help='Break the repository and cache locks left behind by Borg aborting',
description='Break Borg repository and cache locks left behind by Borg aborting',
add_help=False,
)
break_lock_group = break_lock_parser.add_argument_group('break-lock arguments')
break_lock_group.add_argument(
'--repository',
help='Path of repository to break the lock for, defaults to the configured repository if there is only one',
)
break_lock_group.add_argument(
'-h', '--help', action='help', help='Show this help message and exit'
)
borg_parser = subparsers.add_parser(
'borg',
aliases=SUBPARSER_ALIASES['borg'],
@ -688,18 +830,32 @@ def parse_arguments(*unparsed_arguments):
if arguments['global'].excludes_filename:
raise ValueError(
'The --excludes option has been replaced with exclude_patterns in configuration'
'The --excludes flag has been replaced with exclude_patterns in configuration.'
)
if 'init' in arguments and arguments['global'].dry_run:
raise ValueError('The init action cannot be used with the --dry-run option')
if (
('list' in arguments and 'rinfo' in arguments and arguments['list'].json)
or ('list' in arguments and 'info' in arguments and arguments['list'].json)
or ('rinfo' in arguments and 'info' in arguments and arguments['rinfo'].json)
):
raise ValueError('With the --json flag, multiple actions cannot be used together.')
if (
'list' in arguments
and 'info' in arguments
and arguments['list'].json
and arguments['info'].json
'transfer' in arguments
and arguments['transfer'].archive
and arguments['transfer'].match_archives
):
raise ValueError('With the --json option, list and info actions cannot be used together')
raise ValueError(
'With the transfer action, only one of --archive and --glob-archives flags can be used.'
)
if 'info' in arguments and (
(arguments['info'].archive and arguments['info'].prefix)
or (arguments['info'].archive and arguments['info'].match_archives)
or (arguments['info'].prefix and arguments['info'].match_archives)
):
raise ValueError(
'With the info action, only one of --archive, --prefix, or --match-archives flags can be used.'
)
return arguments
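
A quick sketch of the resulting parsing behavior (not part of this diff): the renamed rcreate action and its required --encryption flag land under the 'rcreate' key of the returned arguments dict, alongside the usual 'global' entry.

from borgmatic.commands.arguments import parse_arguments

arguments = parse_arguments('rcreate', '--encryption', 'repokey')
print(arguments['rcreate'].encryption_mode)  # repokey
print(sorted(arguments.keys()))              # includes 'global' and 'rcreate'
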

View File

@ -13,6 +13,7 @@ import pkg_resources
import borgmatic.commands.completion
from borgmatic.borg import borg as borg_borg
from borgmatic.borg import break_lock as borg_break_lock
from borgmatic.borg import check as borg_check
from borgmatic.borg import compact as borg_compact
from borgmatic.borg import create as borg_create
@ -20,16 +21,19 @@ from borgmatic.borg import export_tar as borg_export_tar
from borgmatic.borg import extract as borg_extract
from borgmatic.borg import feature as borg_feature
from borgmatic.borg import info as borg_info
from borgmatic.borg import init as borg_init
from borgmatic.borg import list as borg_list
from borgmatic.borg import mount as borg_mount
from borgmatic.borg import prune as borg_prune
from borgmatic.borg import rcreate as borg_rcreate
from borgmatic.borg import rinfo as borg_rinfo
from borgmatic.borg import rlist as borg_rlist
from borgmatic.borg import transfer as borg_transfer
from borgmatic.borg import umount as borg_umount
from borgmatic.borg import version as borg_version
from borgmatic.commands.arguments import parse_arguments
from borgmatic.config import checks, collect, convert, validate
from borgmatic.hooks import command, dispatch, dump, monitor
from borgmatic.logger import configure_logging, should_do_markup
from borgmatic.logger import add_custom_log_levels, configure_logging, should_do_markup
from borgmatic.signals import configure_signals
from borgmatic.verbosity import verbosity_to_log_level
@ -240,6 +244,7 @@ def run_actions(
action or a hook. Raise ValueError if the arguments or configuration passed to action are
invalid.
'''
add_custom_log_levels()
repository = os.path.expanduser(repository_path)
global_arguments = arguments['global']
dry_run_label = ' (dry run; not making any changes)' if global_arguments.dry_run else ''
@ -249,14 +254,39 @@ def run_actions(
'repositories': ','.join(location['repositories']),
}
if 'init' in arguments:
logger.info('{}: Initializing repository'.format(repository))
borg_init.initialize_repository(
command.execute_hook(
hooks.get('before_actions'),
hooks.get('umask'),
config_filename,
'pre-actions',
global_arguments.dry_run,
**hook_context,
)
if 'rcreate' in arguments:
logger.info('{}: Creating repository'.format(repository))
borg_rcreate.create_repository(
global_arguments.dry_run,
repository,
storage,
arguments['init'].encryption_mode,
arguments['init'].append_only,
arguments['init'].storage_quota,
local_borg_version,
arguments['rcreate'].encryption_mode,
arguments['rcreate'].source_repository,
arguments['rcreate'].copy_crypt_key,
arguments['rcreate'].append_only,
arguments['rcreate'].storage_quota,
arguments['rcreate'].make_parent_dirs,
local_path=local_path,
remote_path=remote_path,
)
if 'transfer' in arguments:
logger.info(f'{repository}: Transferring archives to repository')
borg_transfer.transfer_archives(
global_arguments.dry_run,
repository,
storage,
local_borg_version,
transfer_arguments=arguments['transfer'],
local_path=local_path,
remote_path=remote_path,
)
@ -275,10 +305,11 @@ def run_actions(
repository,
storage,
retention,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
stats=arguments['prune'].stats,
files=arguments['prune'].files,
list_archives=arguments['prune'].list_archives,
)
command.execute_hook(
hooks.get('after_prune'),
@ -302,6 +333,7 @@ def run_actions(
global_arguments.dry_run,
repository,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
progress=arguments['compact'].progress,
@ -329,7 +361,7 @@ def run_actions(
**hook_context,
)
logger.info('{}: Creating archive{}'.format(repository, dry_run_label))
dispatch.call_hooks(
dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
@ -358,13 +390,13 @@ def run_actions(
progress=arguments['create'].progress,
stats=arguments['create'].stats,
json=arguments['create'].json,
files=arguments['create'].files,
list_files=arguments['create'].list_files,
stream_processes=stream_processes,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
dispatch.call_hooks(
dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
config_filename,
@ -396,6 +428,7 @@ def run_actions(
location,
storage,
consistency,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
progress=arguments['check'].progress,
@ -429,8 +462,13 @@ def run_actions(
borg_extract.extract_archive(
global_arguments.dry_run,
repository,
borg_list.resolve_archive_name(
repository, arguments['extract'].archive, storage, local_path, remote_path
borg_rlist.resolve_archive_name(
repository,
arguments['extract'].archive,
storage,
local_borg_version,
local_path,
remote_path,
),
arguments['extract'].paths,
location,
@ -462,16 +500,22 @@ def run_actions(
borg_export_tar.export_tar_archive(
global_arguments.dry_run,
repository,
borg_list.resolve_archive_name(
repository, arguments['export-tar'].archive, storage, local_path, remote_path
borg_rlist.resolve_archive_name(
repository,
arguments['export-tar'].archive,
storage,
local_borg_version,
local_path,
remote_path,
),
arguments['export-tar'].paths,
arguments['export-tar'].destination,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
tar_filter=arguments['export-tar'].tar_filter,
files=arguments['export-tar'].files,
list_files=arguments['export-tar'].list_files,
strip_components=arguments['export-tar'].strip_components,
)
if 'mount' in arguments:
@ -487,14 +531,20 @@ def run_actions(
borg_mount.mount_archive(
repository,
borg_list.resolve_archive_name(
repository, arguments['mount'].archive, storage, local_path, remote_path
borg_rlist.resolve_archive_name(
repository,
arguments['mount'].archive,
storage,
local_borg_version,
local_path,
remote_path,
),
arguments['mount'].mount_point,
arguments['mount'].paths,
arguments['mount'].foreground,
arguments['mount'].options,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
)
@ -507,7 +557,7 @@ def run_actions(
repository, arguments['restore'].archive
)
)
dispatch.call_hooks(
dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
@ -520,8 +570,13 @@ def run_actions(
if 'all' in restore_names:
restore_names = []
archive_name = borg_list.resolve_archive_name(
repository, arguments['restore'].archive, storage, local_path, remote_path
archive_name = borg_rlist.resolve_archive_name(
repository,
arguments['restore'].archive,
storage,
local_borg_version,
local_path,
remote_path,
)
found_names = set()
@ -572,7 +627,7 @@ def run_actions(
extract_process,
)
dispatch.call_hooks(
dispatch.call_hooks_even_if_unconfigured(
'remove_database_dumps',
hooks,
repository,
@ -591,62 +646,137 @@ def run_actions(
', '.join(missing_names)
)
)
if 'rlist' in arguments:
if arguments['rlist'].repository is None or validate.repositories_match(
repository, arguments['rlist'].repository
):
rlist_arguments = copy.copy(arguments['rlist'])
if not rlist_arguments.json: # pragma: nocover
logger.answer('{}: Listing repository'.format(repository))
json_output = borg_rlist.list_repository(
repository,
storage,
local_borg_version,
rlist_arguments=rlist_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
if 'list' in arguments:
if arguments['list'].repository is None or validate.repositories_match(
repository, arguments['list'].repository
):
list_arguments = copy.copy(arguments['list'])
if not list_arguments.json: # pragma: nocover
logger.warning('{}: Listing archives'.format(repository))
list_arguments.archive = borg_list.resolve_archive_name(
repository, list_arguments.archive, storage, local_path, remote_path
if list_arguments.find_paths:
logger.answer('{}: Searching archives'.format(repository))
elif not list_arguments.archive:
logger.answer('{}: Listing archives'.format(repository))
list_arguments.archive = borg_rlist.resolve_archive_name(
repository,
list_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
json_output = borg_list.list_archives(
json_output = borg_list.list_archive(
repository,
storage,
local_borg_version,
list_arguments=list_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
if 'rinfo' in arguments:
if arguments['rinfo'].repository is None or validate.repositories_match(
repository, arguments['rinfo'].repository
):
rinfo_arguments = copy.copy(arguments['rinfo'])
if not rinfo_arguments.json: # pragma: nocover
logger.answer('{}: Displaying repository summary information'.format(repository))
json_output = borg_rinfo.display_repository_info(
repository,
storage,
local_borg_version,
rinfo_arguments=rinfo_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
if 'info' in arguments:
if arguments['info'].repository is None or validate.repositories_match(
repository, arguments['info'].repository
):
info_arguments = copy.copy(arguments['info'])
if not info_arguments.json: # pragma: nocover
logger.warning('{}: Displaying summary info for archives'.format(repository))
info_arguments.archive = borg_list.resolve_archive_name(
repository, info_arguments.archive, storage, local_path, remote_path
logger.answer('{}: Displaying archive summary information'.format(repository))
info_arguments.archive = borg_rlist.resolve_archive_name(
repository,
info_arguments.archive,
storage,
local_borg_version,
local_path,
remote_path,
)
json_output = borg_info.display_archives_info(
repository,
storage,
local_borg_version,
info_arguments=info_arguments,
local_path=local_path,
remote_path=remote_path,
)
if json_output: # pragma: nocover
yield json.loads(json_output)
if 'break-lock' in arguments:
if arguments['break-lock'].repository is None or validate.repositories_match(
repository, arguments['break-lock'].repository
):
logger.info(f'{repository}: Breaking repository and cache locks')
borg_break_lock.break_lock(
repository,
storage,
local_borg_version,
local_path=local_path,
remote_path=remote_path,
)
if 'borg' in arguments:
if arguments['borg'].repository is None or validate.repositories_match(
repository, arguments['borg'].repository
):
logger.warning('{}: Running arbitrary Borg command'.format(repository))
archive_name = borg_list.resolve_archive_name(
repository, arguments['borg'].archive, storage, local_path, remote_path
logger.info('{}: Running arbitrary Borg command'.format(repository))
archive_name = borg_rlist.resolve_archive_name(
repository,
arguments['borg'].archive,
storage,
local_borg_version,
local_path,
remote_path,
)
borg_borg.run_arbitrary_borg(
repository,
storage,
local_borg_version,
options=arguments['borg'].options,
archive=archive_name,
local_path=local_path,
remote_path=remote_path,
)
command.execute_hook(
hooks.get('after_actions'),
hooks.get('umask'),
config_filename,
'post-actions',
global_arguments.dry_run,
**hook_context,
)
def load_configurations(config_filenames, overrides=None, resolve_env=True):
'''
@ -661,9 +791,10 @@ def load_configurations(config_filenames, overrides=None, resolve_env=True):
# Parse and load each configuration file.
for config_filename in config_filenames:
try:
configs[config_filename] = validate.parse_configuration(
configs[config_filename], parse_logs = validate.parse_configuration(
config_filename, validate.schema_filename(), overrides, resolve_env
)
logs.extend(parse_logs)
except PermissionError:
logs.extend(
[
@ -768,21 +899,21 @@ def collect_configuration_run_summary_logs(configs, arguments):
any, to stdout.
'''
# Run cross-file validation checks.
if 'extract' in arguments:
repository = arguments['extract'].repository
elif 'list' in arguments and arguments['list'].archive:
repository = arguments['list'].repository
elif 'mount' in arguments:
repository = arguments['mount'].repository
else:
repository = None
repository = None
if repository:
try:
validate.guard_configuration_contains_repository(repository, configs)
except ValueError as error:
yield from log_error_records(str(error))
return
for action_name, action_arguments in arguments.items():
if hasattr(action_arguments, 'repository'):
repository = getattr(action_arguments, 'repository')
break
try:
if 'extract' in arguments or 'mount' in arguments:
validate.guard_single_repository_selected(repository, configs)
validate.guard_configuration_contains_repository(repository, configs)
except ValueError as error:
yield from log_error_records(str(error))
return
if not configs:
yield from log_error_records(

View File

@ -60,8 +60,8 @@ def main(): # pragma: no cover
' diff --unified {} {}'.format(args.source_filename, args.destination_filename)
)
print()
print('Please edit the file to suit your needs. The values are representative.')
print('All fields are optional except where indicated.')
print('This includes all available configuration options with example values. The few')
print('required options are indicated. Please edit the file to suit your needs.')
print()
print('If you ever need help: https://torsion.org/borgmatic/#issues')
except (ValueError, OSError) as error:

View File

@ -283,7 +283,7 @@ def generate_sample_configuration(
if source_filename:
source_config = load.load_configuration(source_filename)
normalize.normalize(source_config)
normalize.normalize(source_filename, source_config)
destination_config = merge_source_configuration_into_destination(
_schema_to_sample_configuration(schema), source_config

View File

@ -1,3 +1,4 @@
import functools
import logging
import os
@ -6,43 +7,17 @@ import ruamel.yaml
logger = logging.getLogger(__name__)
class Yaml_with_loader_stream(ruamel.yaml.YAML):
def include_configuration(loader, filename_node, include_directory):
'''
A derived class of ruamel.yaml.YAML that simply tacks the loaded stream (file object) onto the
loader class so that it's available anywhere that's passed a loader (in this case,
include_configuration() below).
'''
def get_constructor_parser(self, stream):
constructor, parser = super(Yaml_with_loader_stream, self).get_constructor_parser(stream)
constructor.loader.stream = stream
return constructor, parser
def load_configuration(filename):
'''
Load the given configuration file and return its contents as a data structure of nested dicts
and lists.
Raise ruamel.yaml.error.YAMLError if something goes wrong parsing the YAML, or RecursionError
if there are too many recursive includes.
'''
yaml = Yaml_with_loader_stream(typ='safe')
yaml.Constructor = Include_constructor
return yaml.load(open(filename))
def include_configuration(loader, filename_node):
'''
Load the given YAML filename (ignoring the given loader so we can use our own) and return its
contents as a data structure of nested dicts and lists. If the filename is relative, probe for
it within 1. the current working directory and 2. the directory containing the YAML file doing
the including.
Given a ruamel.yaml.loader.Loader, a ruamel.yaml.serializer.ScalarNode containing the included
filename, and an include directory path to search for matching files, load the given YAML
filename (ignoring the given loader so we can use our own) and return its contents as a data
structure of nested dicts and lists. If the filename is relative, probe for it within 1. the
current working directory and 2. the given include directory.
Raise FileNotFoundError if an included file was not found.
'''
include_directories = [os.getcwd(), os.path.abspath(os.path.dirname(loader.stream.name))]
include_directories = [os.getcwd(), os.path.abspath(include_directory)]
include_filename = os.path.expanduser(filename_node.value)
if not os.path.isabs(include_filename):
@ -62,6 +37,70 @@ def include_configuration(loader, filename_node):
return load_configuration(include_filename)
class Include_constructor(ruamel.yaml.SafeConstructor):
'''
A YAML "constructor" (a ruamel.yaml concept) that supports a custom "!include" tag for including
separate YAML configuration files. Example syntax: `retention: !include common.yaml`
'''
def __init__(self, preserve_quotes=None, loader=None, include_directory=None):
super(Include_constructor, self).__init__(preserve_quotes, loader)
self.add_constructor(
'!include',
functools.partial(include_configuration, include_directory=include_directory),
)
def flatten_mapping(self, node):
'''
Support the special case of deep merging included configuration into an existing mapping
using the YAML '<<' merge key. Example syntax:
```
retention:
keep_daily: 1
<<: !include common.yaml
```
These includes are deep merged into the current configuration file. For instance, in this
example, any "retention" options in common.yaml will get merged into the "retention" section
in the example configuration file.
'''
representer = ruamel.yaml.representer.SafeRepresenter()
for index, (key_node, value_node) in enumerate(node.value):
if key_node.tag == u'tag:yaml.org,2002:merge' and value_node.tag == '!include':
included_value = representer.represent_data(self.construct_object(value_node))
node.value[index] = (key_node, included_value)
super(Include_constructor, self).flatten_mapping(node)
node.value = deep_merge_nodes(node.value)
def load_configuration(filename):
'''
Load the given configuration file and return its contents as a data structure of nested dicts
and lists.
Raise ruamel.yaml.error.YAMLError if something goes wrong parsing the YAML, or RecursionError
if there are too many recursive includes.
'''
# Use an embedded derived class for the include constructor so as to capture the filename
# value. (functools.partial doesn't work for this use case because yaml.Constructor has to be
# an actual class.)
class Include_constructor_with_include_directory(Include_constructor):
def __init__(self, preserve_quotes=None, loader=None):
super(Include_constructor_with_include_directory, self).__init__(
preserve_quotes, loader, include_directory=os.path.dirname(filename)
)
yaml = ruamel.yaml.YAML(typ='safe')
yaml.Constructor = Include_constructor_with_include_directory
return yaml.load(open(filename))
DELETED_NODE = object()
@ -175,41 +214,3 @@ def deep_merge_nodes(nodes):
return [
replaced_nodes.get(node, node) for node in nodes if replaced_nodes.get(node) != DELETED_NODE
]
class Include_constructor(ruamel.yaml.SafeConstructor):
'''
A YAML "constructor" (a ruamel.yaml concept) that supports a custom "!include" tag for including
separate YAML configuration files. Example syntax: `retention: !include common.yaml`
'''
def __init__(self, preserve_quotes=None, loader=None):
super(Include_constructor, self).__init__(preserve_quotes, loader)
self.add_constructor('!include', include_configuration)
def flatten_mapping(self, node):
'''
Support the special case of deep merging included configuration into an existing mapping
using the YAML '<<' merge key. Example syntax:
```
retention:
keep_daily: 1
<<: !include common.yaml
```
These includes are deep merged into the current configuration file. For instance, in this
example, any "retention" options in common.yaml will get merged into the "retention" section
in the example configuration file.
'''
representer = ruamel.yaml.representer.SafeRepresenter()
for index, (key_node, value_node) in enumerate(node.value):
if key_node.tag == u'tag:yaml.org,2002:merge' and value_node.tag == '!include':
included_value = representer.represent_data(self.construct_object(value_node))
node.value[index] = (key_node, included_value)
super(Include_constructor, self).flatten_mapping(node)
node.value = deep_merge_nodes(node.value)
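
The practical upshot of passing an include directory: an !include is now resolved relative to the including file, not only the current working directory. A small self-contained sketch (not part of this diff); the file names are hypothetical.

import os
import tempfile

from borgmatic.config import load

directory = tempfile.mkdtemp()

with open(os.path.join(directory, 'common.yaml'), 'w') as common_file:
    common_file.write('keep_daily: 7\n')

with open(os.path.join(directory, 'config.yaml'), 'w') as config_file:
    config_file.write('retention: !include common.yaml\n')

# The include is found via the including file's directory, even from another working directory.
print(load.load_configuration(os.path.join(directory, 'config.yaml')))
# Expected along the lines of: {'retention': {'keep_daily': 7}}
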

View File

@ -1,31 +1,88 @@
def normalize(config):
import logging
def normalize(config_filename, config):
'''
Given a configuration dict, apply particular hard-coded rules to normalize its contents to
adhere to the configuration schema.
Given a configuration filename and a configuration dict of its loaded contents, apply particular
hard-coded rules to normalize the configuration to adhere to the current schema. Return any log
message warnings produced based on the normalization performed.
'''
logs = []
location = config.get('location') or {}
storage = config.get('storage') or {}
consistency = config.get('consistency') or {}
hooks = config.get('hooks') or {}
# Upgrade exclude_if_present from a string to a list.
exclude_if_present = config.get('location', {}).get('exclude_if_present')
exclude_if_present = location.get('exclude_if_present')
if isinstance(exclude_if_present, str):
config['location']['exclude_if_present'] = [exclude_if_present]
# Upgrade various monitoring hooks from a string to a dict.
healthchecks = config.get('hooks', {}).get('healthchecks')
healthchecks = hooks.get('healthchecks')
if isinstance(healthchecks, str):
config['hooks']['healthchecks'] = {'ping_url': healthchecks}
cronitor = config.get('hooks', {}).get('cronitor')
cronitor = hooks.get('cronitor')
if isinstance(cronitor, str):
config['hooks']['cronitor'] = {'ping_url': cronitor}
pagerduty = config.get('hooks', {}).get('pagerduty')
pagerduty = hooks.get('pagerduty')
if isinstance(pagerduty, str):
config['hooks']['pagerduty'] = {'integration_key': pagerduty}
cronhub = config.get('hooks', {}).get('cronhub')
cronhub = hooks.get('cronhub')
if isinstance(cronhub, str):
config['hooks']['cronhub'] = {'ping_url': cronhub}
# Upgrade consistency checks from a list of strings to a list of dicts.
checks = config.get('consistency', {}).get('checks')
checks = consistency.get('checks')
if isinstance(checks, list) and len(checks) and isinstance(checks[0], str):
config['consistency']['checks'] = [{'name': check_type} for check_type in checks]
# Rename various configuration options.
numeric_owner = location.pop('numeric_owner', None)
if numeric_owner is not None:
config['location']['numeric_ids'] = numeric_owner
bsd_flags = location.pop('bsd_flags', None)
if bsd_flags is not None:
config['location']['flags'] = bsd_flags
remote_rate_limit = storage.pop('remote_rate_limit', None)
if remote_rate_limit is not None:
config['storage']['upload_rate_limit'] = remote_rate_limit
# Upgrade remote repositories to ssh:// syntax, required in Borg 2.
repositories = location.get('repositories')
if repositories:
config['location']['repositories'] = []
for repository in repositories:
if '~' in repository:
logs.append(
logging.makeLogRecord(
dict(
levelno=logging.WARNING,
levelname='WARNING',
msg=f'{config_filename}: Repository paths containing "~" are deprecated in borgmatic and no longer work in Borg 2.x+.',
)
)
)
if ':' in repository and not repository.startswith('ssh://'):
rewritten_repository = (
f"ssh://{repository.replace(':~', '/~').replace(':/', '/').replace(':', '/./')}"
)
logs.append(
logging.makeLogRecord(
dict(
levelno=logging.WARNING,
levelname='WARNING',
msg=f'{config_filename}: Remote repository paths without ssh:// syntax are deprecated. Interpreting "{repository}" as "{rewritten_repository}"',
)
)
)
config['location']['repositories'].append(rewritten_repository)
else:
config['location']['repositories'].append(repository)
return logs
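# Illustrative sketch (not part of the diff): the rewriting above upgrades old-style remote
# repository paths to the ssh:// syntax that Borg 2 requires, for example:
#
#     config = {'location': {'repositories': ['user@backupserver:sourcehostname.borg']}}
#     logs = normalize('config.yaml', config)
#     # config['location']['repositories'] == ['ssh://user@backupserver/./sourcehostname.borg']
#     # and logs contains a WARNING record about the deprecated syntax.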

View File

@ -70,8 +70,8 @@ def parse_overrides(raw_overrides):
def apply_overrides(config, raw_overrides):
'''
Given a sequence of configuration file override strings in the form of "section.option=value"
and a configuration dict, parse each override and set it the configuration dict.
Given a configuration dict and a sequence of configuration file override strings in the form of
"section.option=value", parse each override and set it the configuration dict.
'''
overrides = parse_overrides(raw_overrides)
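# Illustrative sketch (not part of the diff): override strings use the dotted
# "section.option=value" form, so a call along these lines would set a retention option in the
# parsed configuration:
#
#     apply_overrides(config, ['retention.keep_daily=5'])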

View File

@ -11,7 +11,6 @@ properties:
https://borgbackup.readthedocs.io/en/stable/usage/create.html
for details.
required:
- source_directories
- repositories
additionalProperties: false
properties:
@ -20,8 +19,8 @@ properties:
items:
type: string
description: |
List of source directories to backup (required). Globs and
tildes are expanded. Do not backslash spaces in path names.
List of source directories to back up. Globs and tildes are
expanded. Do not backslash spaces in path names.
example:
- /home
- /etc
@ -40,8 +39,9 @@ properties:
is used, then add local repository paths in the systemd
service file to the ReadWritePaths list.
example:
- user@backupserver:sourcehostname.borg
- "user@backupserver:{fqdn}"
- ssh://user@backupserver/./sourcehostname.borg
- ssh://user@backupserver/./{fqdn}
- /var/local/backups/local.borg
working_directory:
type: string
description: |
@ -58,7 +58,7 @@ properties:
database hook is used, the setting here is ignored and
one_file_system is considered true.
example: true
numeric_owner:
numeric_ids:
type: boolean
description: |
Only store/extract numeric user and group identifiers.
@ -90,10 +90,10 @@ properties:
used, the setting here is ignored and read_special is
considered true.
example: false
bsd_flags:
flags:
type: boolean
description: |
Record bsdflags (e.g. NODUMP, IMMUTABLE) in archive.
Record filesystem flags (e.g. NODUMP, IMMUTABLE) in archive.
Defaults to true.
example: true
files_cache:
@ -122,7 +122,8 @@ properties:
backups. Globs are expanded. (Tildes are not.) See the
output of "borg help patterns" for more details. Quote any
value if it contains leading punctuation, so it parses
correctly.
correctly. Note that only one of "patterns" and
"source_directories" may be used.
example:
- 'R /'
- '- /home/*/.cache'
@ -145,10 +146,10 @@ properties:
type: string
description: |
Any paths matching these patterns are excluded from backups.
Globs and tildes are expanded. (Note however that a glob
pattern must either start with a glob or be an absolute
path.) Do not backslash spaces in path names. See the output
of "borg help patterns" for more details.
Globs and tildes are expanded. Note that a glob pattern must
either start with a glob or be an absolute path. Do not
backslash spaces in path names. See the output of "borg help
patterns" for more details.
example:
- '*.pyc'
- /home/*/.cache
@ -255,7 +256,7 @@ properties:
http://borgbackup.readthedocs.io/en/stable/usage/create.html
for details. Defaults to "lz4".
example: lz4
remote_rate_limit:
upload_rate_limit:
type: integer
description: |
Remote network upload rate limit in kiBytes/second. Defaults
@ -538,13 +539,22 @@ properties:
prevent potential shell injection or privilege escalation.
additionalProperties: false
properties:
before_actions:
type: array
items:
type: string
description: |
List of one or more shell commands or scripts to execute
before all the actions for each repository.
example:
- echo "Starting actions."
before_backup:
type: array
items:
type: string
description: |
List of one or more shell commands or scripts to execute
before creating a backup, run once per configuration file.
before creating a backup, run once per repository.
example:
- echo "Starting a backup."
before_prune:
@ -553,7 +563,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
before pruning, run once per configuration file.
before pruning, run once per repository.
example:
- echo "Starting pruning."
before_compact:
@ -562,7 +572,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
before compaction, run once per configuration file.
before compaction, run once per repository.
example:
- echo "Starting compaction."
before_check:
@ -571,7 +581,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
before consistency checks, run once per configuration file.
before consistency checks, run once per repository.
example:
- echo "Starting checks."
before_extract:
@ -580,7 +590,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
before extracting a backup, run once per configuration file.
before extracting a backup, run once per repository.
example:
- echo "Starting extracting."
after_backup:
@ -589,7 +599,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
after creating a backup, run once per configuration file.
after creating a backup, run once per repository.
example:
- echo "Finished a backup."
after_compact:
@ -598,7 +608,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
after compaction, run once per configuration file.
after compaction, run once per repository.
example:
- echo "Finished compaction."
after_prune:
@ -607,7 +617,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
after pruning, run once per configuration file.
after pruning, run once per repository.
example:
- echo "Finished pruning."
after_check:
@ -616,7 +626,7 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
after consistency checks, run once per configuration file.
after consistency checks, run once per repository.
example:
- echo "Finished checks."
after_extract:
@ -625,9 +635,18 @@ properties:
type: string
description: |
List of one or more shell commands or scripts to execute
after extracting a backup, run once per configuration file.
after extracting a backup, run once per repository.
example:
- echo "Finished extracting."
after_actions:
type: array
items:
type: string
description: |
List of one or more shell commands or scripts to execute
after all actions for each repository.
example:
- echo "Finished actions."
on_error:
type: array
items:
@ -672,10 +691,13 @@ properties:
type: string
description: |
Database name (required if using this hook). Or
"all" to dump all databases on the host. Note
that using this database hook implicitly enables
both read_special and one_file_system (see
above) to support dump and restore streaming.
"all" to dump all databases on the host. (Also
set the "format" to dump each database to a
separate file instead of one combined file.)
Note that using this database hook implicitly
enables both read_special and one_file_system
(see above) to support dump and restore
streaming.
example: users
hostname:
type: string
@ -710,9 +732,14 @@ properties:
description: |
Database dump output format. One of "plain",
"custom", "directory", or "tar". Defaults to
"custom" (unlike raw pg_dump). See pg_dump
documentation for details. Note that format is
ignored when the database name is "all".
"custom" (unlike raw pg_dump) for a single
database. Or, when database name is "all" and
format is blank, dumps all databases to a single
file. But if a format is specified with an "all"
database name, dumps each database to a separate
file of that format, allowing more convenient
restores of individual databases. See the
pg_dump documentation for more about formats.
example: directory
ssl_mode:
type: string
@ -745,6 +772,32 @@ properties:
description: |
Path to a certificate revocation list.
example: "/root/.postgresql/root.crl"
pg_dump_command:
type: string
description: |
Command to use instead of "pg_dump" or
"pg_dumpall". This can be used to run a specific
pg_dump version (e.g., one inside a running
docker container). Defaults to "pg_dump" for
single database dump or "pg_dumpall" to dump
all databases.
example: docker exec my_pg_container pg_dump
pg_restore_command:
type: string
description: |
Command to use instead of "pg_restore". This
can be used to run a specific pg_restore
version (e.g., one inside a running docker
container). Defaults to "pg_restore".
example: docker exec my_pg_container pg_restore
psql_command:
type: string
description: |
Command to use instead of "psql". This can be
used to run a specific psql version (e.g.,
one inside a running docker container).
Defaults to "psql".
example: docker exec my_pg_container psql
options:
type: string
description: |
@ -1012,6 +1065,12 @@ properties:
Healthchecks ping URL or UUID to notify when a
backup begins, ends, or errors.
example: https://hc-ping.com/your-uuid-here
verify_tls:
type: boolean
description: |
Verify the TLS certificate of the ping URL host.
Defaults to true.
example: false
send_logs:
type: boolean
description: |

View File

@ -72,7 +72,7 @@ def apply_logical_validation(config_filename, parsed_configuration):
raise Validation_error(
config_filename,
(
'Unknown repository in the consistency section\'s check_repositories: {}'.format(
'Unknown repository in the "consistency" section\'s "check_repositories": {}'.format(
repository
),
),
@ -89,6 +89,9 @@ def parse_configuration(config_filename, schema_filename, overrides=None, resolv
{'location': {'source_directories': ['/home', '/etc'], 'repository': 'hostname.borg'},
'retention': {'keep_daily': 7}, 'consistency': {'checks': ['repository', 'archives']}}
Also return a sequence of logging.LogRecord instances containing any warnings about the
configuration.
Raise FileNotFoundError if the file does not exist, PermissionError if the user does not
have permissions to read the file, or Validation_error if the config does not match the schema.
'''
@ -99,7 +102,7 @@ def parse_configuration(config_filename, schema_filename, overrides=None, resolv
raise Validation_error(config_filename, (str(error),))
override.apply_overrides(config, overrides)
normalize.normalize(config)
logs = normalize.normalize(config_filename, config)
if resolve_env:
environment.resolve_env_variables(config)
@ -116,7 +119,7 @@ def parse_configuration(config_filename, schema_filename, overrides=None, resolv
apply_logical_validation(config_filename, config)
return config
return config, logs
def normalize_repository_path(repository):
@ -140,27 +143,13 @@ def repositories_match(first, second):
def guard_configuration_contains_repository(repository, configurations):
'''
Given a repository path and a dict mapping from config filename to corresponding parsed config
dict, ensure that the repository is declared exactly once in all of the configurations.
If no repository is given, then error if there are multiple configured repositories.
dict, ensure that the repository is declared exactly once in all of the configurations. If no
repository is given, skip this check.
Raise ValueError if the repository is not found in a configuration, or is declared multiple
times.
'''
if not repository:
count = len(
tuple(
config_repository
for config in configurations.values()
for config_repository in config['location']['repositories']
)
)
if count > 1:
raise ValueError(
'Can\'t determine which repository to use. Use --repository option to disambiguate'
)
return
count = len(
@ -176,3 +165,26 @@ def guard_configuration_contains_repository(repository, configurations):
raise ValueError('Repository {} not found in configuration files'.format(repository))
if count > 1:
raise ValueError('Repository {} found in multiple configuration files'.format(repository))
def guard_single_repository_selected(repository, configurations):
'''
Given a repository path and a dict mapping from config filename to corresponding parsed config
dict, ensure either a single repository exists across all configuration files or a repository
path was given.
'''
if repository:
return
count = len(
tuple(
config_repository
for config in configurations.values()
for config_repository in config['location']['repositories']
)
)
if count != 1:
raise ValueError(
'Can\'t determine which repository to use. Use --repository to disambiguate'
)

View File

@ -69,6 +69,7 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
}
output_buffers = list(process_for_output_buffer.keys())
captured_outputs = collections.defaultdict(list)
still_running = True
# Log output for each process until they all exit.
while True:
@ -108,6 +109,9 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
else:
logger.log(output_log_level, line)
if not still_running:
break
still_running = False
for process in processes:
@ -137,16 +141,13 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
exit_code, command_for_process(process), '\n'.join(last_lines)
)
if not still_running:
break
if captured_outputs:
return {
process: '\n'.join(output_lines) for process, output_lines in captured_outputs.items()
}
def log_command(full_command, input_file, output_file):
def log_command(full_command, input_file=None, output_file=None):
'''
Log the given command (a sequence of command/argument strings), along with its input/output file
paths.
@ -177,15 +178,14 @@ def execute_command(
):
'''
Execute the given command (a sequence of command/argument strings) and log its output at the
given log level. If output log level is None, instead capture and return the output. (Implies
run_to_completion.) If an open output file object is given, then write stdout to the file and
only log stderr (but only if an output log level is set). If an open input file object is given,
then read stdin from the file. If shell is True, execute the command within a shell. If an extra
environment dict is given, then use it to augment the current environment, and pass the result
into the command. If a working directory is given, use that as the present working directory
when running the command. If a Borg local path is given, and the command matches it (regardless
of arguments), treat exit code 1 as a warning instead of an error. If run to completion is
False, then return the process for the command without executing it to completion.
given log level. If an open output file object is given, then write stdout to the file and only
log stderr. If an open input file object is given, then read stdin from the file. If shell is
True, execute the command within a shell. If an extra environment dict is given, then use it to
augment the current environment, and pass the result into the command. If a working directory is
given, use that as the present working directory when running the command. If a Borg local path
is given, and the command matches it (regardless of arguments), treat exit code 1 as a warning
instead of an error. If run to completion is False, then return the process for the command
without executing it to completion.
Raise subprocess.CalledProcessError if an error occurs while running the command.
'''
@ -194,12 +194,6 @@ def execute_command(
do_not_capture = bool(output_file is DO_NOT_CAPTURE)
command = ' '.join(full_command) if shell else full_command
if output_log_level is None:
output = subprocess.check_output(
command, shell=shell, env=environment, cwd=working_directory
)
return output.decode() if output is not None else None
process = subprocess.Popen(
command,
stdin=input_file,
@ -217,6 +211,33 @@ def execute_command(
)
def execute_command_and_capture_output(
full_command, capture_stderr=False, shell=False, extra_environment=None, working_directory=None,
):
'''
Execute the given command (a sequence of command/argument strings), capturing and returning its
output (stdout). If capture stderr is True, then capture and return stderr in addition to
stdout. If shell is True, execute the command within a shell. If an extra environment dict is
given, then use it to augment the current environment, and pass the result into the command. If
a working directory is given, use that as the present working directory when running the command.
Raise subprocess.CalledProcessError if an error occurs while running the command.
'''
log_command(full_command)
environment = {**os.environ, **extra_environment} if extra_environment else None
command = ' '.join(full_command) if shell else full_command
output = subprocess.check_output(
command,
stderr=subprocess.STDOUT if capture_stderr else None,
shell=shell,
env=environment,
cwd=working_directory,
)
return output.decode() if output is not None else None
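# Illustrative sketch (not part of the diff): unlike execute_command(), this helper returns the
# command's stdout instead of logging it, for example:
#
#     output = execute_command_and_capture_output(('echo', 'hello'))
#     # output == 'hello\n'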
def execute_command_with_processes(
full_command,
processes,

View File

@ -29,19 +29,14 @@ def call_hook(function_name, hooks, log_prefix, hook_name, *args, **kwargs):
'''
Given the hooks configuration dict and a prefix to use in log entries, call the requested
function of the Python module corresponding to the given hook name. Supply that call with the
configuration for this hook, the log prefix, and any given args and kwargs. Return any return
value.
If the hook name is not present in the hooks configuration, then bail without calling anything.
configuration for this hook (if any), the log prefix, and any given args and kwargs. Return any
return value.
Raise ValueError if the hook name is unknown.
Raise AttributeError if the function name is not found in the module.
Raise anything else that the called function raises.
'''
config = hooks.get(hook_name)
if not config:
logger.debug('{}: No {} hook configured.'.format(log_prefix, hook_name))
return
config = hooks.get(hook_name, {})
try:
module = HOOK_NAME_TO_MODULE[hook_name]
@ -59,7 +54,7 @@ def call_hooks(function_name, hooks, log_prefix, hook_names, *args, **kwargs):
configuration for that hook, the log prefix, and any given args and kwargs. Collect any return
values into a dict from hook name to return value.
If the hook name is not present in the hooks configuration, then don't call the function for it,
If the hook name is not present in the hooks configuration, then don't call the function for it
and omit it from the return values.
Raise ValueError if the hook name is unknown.
@ -71,3 +66,19 @@ def call_hooks(function_name, hooks, log_prefix, hook_names, *args, **kwargs):
for hook_name in hook_names
if hooks.get(hook_name)
}
def call_hooks_even_if_unconfigured(function_name, hooks, log_prefix, hook_names, *args, **kwargs):
'''
Given the hooks configuration dict and a prefix to use in log entries, call the requested
function of the Python module corresponding to each given hook name. Supply each call with the
configuration for that hook, the log prefix, and any given args and kwargs. Collect any return
values into a dict from hook name to return value.
Raise AttributeError if the function name is not found in the module.
Raise anything else that a called function raises. An error stops calls to subsequent functions.
'''
return {
hook_name: call_hook(function_name, hooks, log_prefix, hook_name, *args, **kwargs)
for hook_name in hook_names
}

View File

@ -55,7 +55,7 @@ def remove_database_dumps(dump_path, database_type_name, log_prefix, dry_run):
'''
dry_run_label = ' (dry run; not actually removing anything)' if dry_run else ''
logger.info(
logger.debug(
'{}: Removing {} database dumps{}'.format(log_prefix, database_type_name, dry_run_label)
)

View File

@ -125,7 +125,9 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_
if not dry_run:
logging.getLogger('urllib3').setLevel(logging.ERROR)
try:
response = requests.post(ping_url, data=payload.encode('utf-8'))
response = requests.post(
ping_url, data=payload.encode('utf-8'), verify=hook_config.get('verify_tls', True)
)
if not response.ok:
response.raise_for_status()
except requests.exceptions.RequestException as error:

View File

@ -1,6 +1,10 @@
import logging
from borgmatic.execute import execute_command, execute_command_with_processes
from borgmatic.execute import (
execute_command,
execute_command_and_capture_output,
execute_command_with_processes,
)
from borgmatic.hooks import dump
logger = logging.getLogger(__name__)
@ -20,7 +24,7 @@ SYSTEM_DATABASE_NAMES = ('information_schema', 'mysql', 'performance_schema', 's
def database_names_to_dump(database, extra_environment, log_prefix, dry_run_label):
'''
Given a requested database name, return the corresponding sequence of database names to dump.
Given a requested database config, return the corresponding sequence of database names to dump.
In the case of "all", query for the names of databases on the configured host and return them,
excluding any system databases that will cause problems during restore.
'''
@ -42,8 +46,8 @@ def database_names_to_dump(database, extra_environment, log_prefix, dry_run_labe
logger.debug(
'{}: Querying for "all" MySQL databases to dump{}'.format(log_prefix, dry_run_label)
)
show_output = execute_command(
show_command, output_log_level=None, extra_environment=extra_environment
show_output = execute_command_and_capture_output(
show_command, extra_environment=extra_environment
)
return tuple(

View File

@ -1,6 +1,11 @@
import csv
import logging
from borgmatic.execute import execute_command, execute_command_with_processes
from borgmatic.execute import (
execute_command,
execute_command_and_capture_output,
execute_command_with_processes,
)
from borgmatic.hooks import dump
logger = logging.getLogger(__name__)
@ -34,6 +39,44 @@ def make_extra_environment(database):
return extra
EXCLUDED_DATABASE_NAMES = ('template0', 'template1')
def database_names_to_dump(database, extra_environment, log_prefix, dry_run_label):
'''
Given a requested database config, return the corresponding sequence of database names to dump.
In the case of "all" when a database format is given, query for the names of databases on the
configured host and return them. For "all" without a database format, just return a sequence
containing "all".
'''
requested_name = database['name']
if requested_name != 'all':
return (requested_name,)
if not database.get('format'):
return ('all',)
list_command = (
('psql', '--list', '--no-password', '--csv', '--tuples-only')
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
)
logger.debug(
'{}: Querying for "all" PostgreSQL databases to dump{}'.format(log_prefix, dry_run_label)
)
list_output = execute_command_and_capture_output(
list_command, extra_environment=extra_environment
)
return tuple(
row[0]
for row in csv.reader(list_output.splitlines(), delimiter=',', quotechar='"')
if row[0] not in EXCLUDED_DATABASE_NAMES
)
def dump_databases(databases, log_prefix, location_config, dry_run):
'''
Dump the given PostgreSQL databases to a named pipe. The databases are supplied as a sequence of
@ -43,6 +86,8 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
Return a sequence of subprocess.Popen instances for the dump processes ready to spew to a named
pipe. But if this is a dry run, then don't actually dump anything and return an empty sequence.
Raise ValueError if the databases to dump cannot be determined.
'''
dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
processes = []
@ -50,51 +95,59 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
logger.info('{}: Dumping PostgreSQL databases{}'.format(log_prefix, dry_run_label))
for database in databases:
name = database['name']
dump_filename = dump.make_database_dump_filename(
make_dump_path(location_config), name, database.get('hostname')
)
all_databases = bool(name == 'all')
dump_format = database.get('format', 'custom')
command = (
(
'pg_dumpall' if all_databases else 'pg_dump',
'--no-password',
'--clean',
'--if-exists',
)
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (() if all_databases else ('--format', dump_format))
+ (('--file', dump_filename) if dump_format == 'directory' else ())
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
+ (() if all_databases else (name,))
# Use shell redirection rather than the --file flag to sidestep synchronization issues
# when pg_dump/pg_dumpall tries to write to a named pipe. But for the directory dump
# format in a particular, a named destination is required, and redirection doesn't work.
+ (('>', dump_filename) if dump_format != 'directory' else ())
)
extra_environment = make_extra_environment(database)
logger.debug(
'{}: Dumping PostgreSQL database {} to {}{}'.format(
log_prefix, name, dump_filename, dry_run_label
)
dump_path = make_dump_path(location_config)
dump_database_names = database_names_to_dump(
database, extra_environment, log_prefix, dry_run_label
)
if dry_run:
continue
if dump_format == 'directory':
dump.create_parent_directory_for_dump(dump_filename)
else:
dump.create_named_pipe_for_dump(dump_filename)
if not dump_database_names:
raise ValueError('Cannot find any PostgreSQL databases to dump.')
processes.append(
execute_command(
command, shell=True, extra_environment=extra_environment, run_to_completion=False
for database_name in dump_database_names:
dump_format = database.get('format', None if database_name == 'all' else 'custom')
default_dump_command = 'pg_dumpall' if database_name == 'all' else 'pg_dump'
dump_command = database.get('pg_dump_command') or default_dump_command
dump_filename = dump.make_database_dump_filename(
dump_path, database_name, database.get('hostname')
)
command = (
(dump_command, '--no-password', '--clean', '--if-exists',)
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (('--format', dump_format) if dump_format else ())
+ (('--file', dump_filename) if dump_format == 'directory' else ())
+ (tuple(database['options'].split(' ')) if 'options' in database else ())
+ (() if database_name == 'all' else (database_name,))
# Use shell redirection rather than the --file flag to sidestep synchronization issues
# when pg_dump/pg_dumpall tries to write to a named pipe. But for the directory dump
# format in particular, a named destination is required, and redirection doesn't work.
+ (('>', dump_filename) if dump_format != 'directory' else ())
)
logger.debug(
'{}: Dumping PostgreSQL database "{}" to {}{}'.format(
log_prefix, database_name, dump_filename, dry_run_label
)
)
if dry_run:
continue
if dump_format == 'directory':
dump.create_parent_directory_for_dump(dump_filename)
else:
dump.create_named_pipe_for_dump(dump_filename)
processes.append(
execute_command(
command,
shell=True,
extra_environment=extra_environment,
run_to_completion=False,
)
)
)
return processes
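# Illustrative sketch (not part of the diff): with a hypothetical database config of
# {'name': 'all', 'format': 'custom'}, database_names_to_dump() above asks psql for the
# individual database names and each one gets its own pg_dump invocation and dump file,
# while {'name': 'all'} without a format falls back to a single combined pg_dumpall dump.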
@ -140,16 +193,18 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run,
dump_filename = dump.make_database_dump_filename(
make_dump_path(location_config), database['name'], database.get('hostname')
)
psql_command = database.get('psql_command') or 'psql'
analyze_command = (
('psql', '--no-password', '--quiet')
(psql_command, '--no-password', '--quiet')
+ (('--host', database['hostname']) if 'hostname' in database else ())
+ (('--port', str(database['port'])) if 'port' in database else ())
+ (('--username', database['username']) if 'username' in database else ())
+ (('--dbname', database['name']) if not all_databases else ())
+ ('--command', 'ANALYZE')
)
pg_restore_command = database.get('pg_restore_command') or 'pg_restore'
restore_command = (
('psql' if all_databases else 'pg_restore', '--no-password')
(psql_command if all_databases else pg_restore_command, '--no-password')
+ (
('--if-exists', '--exit-on-error', '--clean', '--dbname', database['name'])
if not all_databases

View File

@ -85,18 +85,19 @@ class Multi_stream_handler(logging.Handler):
handler.setLevel(level)
LOG_LEVEL_TO_COLOR = {
logging.CRITICAL: colorama.Fore.RED,
logging.ERROR: colorama.Fore.RED,
logging.WARN: colorama.Fore.YELLOW,
logging.INFO: colorama.Fore.GREEN,
logging.DEBUG: colorama.Fore.CYAN,
}
class Console_color_formatter(logging.Formatter):
def format(self, record):
color = LOG_LEVEL_TO_COLOR.get(record.levelno)
add_custom_log_levels()
color = {
logging.CRITICAL: colorama.Fore.RED,
logging.ERROR: colorama.Fore.RED,
logging.WARN: colorama.Fore.YELLOW,
logging.ANSWER: colorama.Fore.MAGENTA,
logging.INFO: colorama.Fore.GREEN,
logging.DEBUG: colorama.Fore.CYAN,
}.get(record.levelno)
return color_text(color, record.msg)
@ -110,6 +111,45 @@ def color_text(color, message):
return '{}{}{}'.format(color, message, colorama.Style.RESET_ALL)
def add_logging_level(level_name, level_number):
'''
Globally add a custom logging level based on the given (all uppercase) level name and number.
Do this idempotently.
Inspired by https://stackoverflow.com/questions/2183233/how-to-add-a-custom-loglevel-to-pythons-logging-facility/35804945#35804945
'''
method_name = level_name.lower()
if not hasattr(logging, level_name):
logging.addLevelName(level_number, level_name)
setattr(logging, level_name, level_number)
if not hasattr(logging, method_name):
def log_for_level(self, message, *args, **kwargs): # pragma: no cover
if self.isEnabledFor(level_number):
self._log(level_number, message, args, **kwargs)
setattr(logging.getLoggerClass(), method_name, log_for_level)
if not hasattr(logging.getLoggerClass(), method_name):
def log_to_root(message, *args, **kwargs): # pragma: no cover
logging.log(level_number, message, *args, **kwargs)
setattr(logging, method_name, log_to_root)
ANSWER = logging.WARN - 5
def add_custom_log_levels(): # pragma: no cover
'''
Add a custom log level between WARN and INFO for user-requested answers.
'''
add_logging_level('ANSWER', ANSWER)
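# Illustrative sketch (not part of the diff): once add_custom_log_levels() has run, the ANSWER
# level (numerically between INFO and WARN) can be used like any other level, for example:
#
#     add_custom_log_levels()
#     logging.getLogger(__name__).log(logging.ANSWER, 'some user-requested output')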
def configure_logging(
console_log_level,
syslog_log_level=None,
@ -130,6 +170,8 @@ def configure_logging(
if monitoring_log_level is None:
monitoring_log_level = console_log_level
add_custom_log_levels()
# Log certain log levels to console stderr and others to stdout. This supports use cases like
# grepping (non-error) output.
console_error_handler = logging.StreamHandler(sys.stderr)
@ -138,7 +180,8 @@ def configure_logging(
{
logging.CRITICAL: console_error_handler,
logging.ERROR: console_error_handler,
logging.WARN: console_standard_handler,
logging.WARN: console_error_handler,
logging.ANSWER: console_standard_handler,
logging.INFO: console_standard_handler,
logging.DEBUG: console_standard_handler,
}

View File

@ -1,7 +1,9 @@
import logging
import borgmatic.logger
VERBOSITY_ERROR = -1
VERBOSITY_WARNING = 0
VERBOSITY_ANSWER = 0
VERBOSITY_SOME = 1
VERBOSITY_LOTS = 2
@ -10,9 +12,11 @@ def verbosity_to_log_level(verbosity):
'''
Given a borgmatic verbosity value, return the corresponding Python log level.
'''
borgmatic.logger.add_custom_log_levels()
return {
VERBOSITY_ERROR: logging.ERROR,
VERBOSITY_WARNING: logging.WARNING,
VERBOSITY_ANSWER: logging.ANSWER,
VERBOSITY_SOME: logging.INFO,
VERBOSITY_LOTS: logging.DEBUG,
}.get(verbosity, logging.WARNING)

View File

@ -4,7 +4,7 @@ COPY . /app
RUN apk add --no-cache py3-pip py3-ruamel.yaml py3-ruamel.yaml.clib
RUN pip install --no-cache /app && generate-borgmatic-config && chmod +r /etc/borgmatic/config.yaml
RUN borgmatic --help > /command-line.txt \
&& for action in init prune compact create check extract export-tar mount umount restore list info borg; do \
&& for action in rcreate transfer prune compact create check extract export-tar mount umount restore rlist list rinfo info break-lock borg; do \
echo -e "\n--------------------------------------------------------------------------------\n" >> /command-line.txt \
&& borgmatic "$action" --help >> /command-line.txt; done

View File

@ -530,3 +530,11 @@ main .elv-toc + h1 .direct-link {
display: none ;
}
}
.header-anchor {
text-decoration: none;
}
.header-anchor:hover::after {
content: " đź”—";
}

View File

@ -40,6 +40,13 @@ There are additional hooks that run before/after other actions as well. For
instance, `before_prune` runs before a `prune` action for a repository, while
`after_prune` runs after it.
<span class="minilink minilink-addedin">New in version 1.7.0</span> The
`before_actions` and `after_actions` hooks run before/after all the actions
(like `create`, `prune`, etc.) for each repository. These hooks are a good
place to run per-repository steps like mounting/unmounting a remote
filesystem.
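For example, a configuration sketch along these lines (the mount point and commands are purely
illustrative) would mount a remote filesystem before each repository's actions and unmount it
afterward:

```yaml
hooks:
    before_actions:
        - mount /mnt/backup-destination
    after_actions:
        - umount /mnt/backup-destination
```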
## Variable interpolation
The before and after action hooks support interpolating particular runtime

View File

@ -76,7 +76,7 @@ location:
- /home
repositories:
- me@buddys-server.org:backup.borg
- ssh://me@buddys-server.org/./backup.borg
hooks:
before_backup:

View File

@ -76,6 +76,9 @@ hooks:
options: "--ssl"
```
### All databases
If you want to dump all databases on a host, use `all` for the database name:
```yaml
@ -91,8 +94,30 @@ hooks:
Note that you may need to use a `username` of the `postgres` superuser for
this to work with PostgreSQL.
If you would like to backup databases only and not source directories, you can
specify an empty `source_directories` value (as it is a mandatory field):
<span class="minilink minilink-addedin">New in version 1.7.6</span> With
PostgreSQL and MySQL, you can optionally dump "all" databases to separate
files instead of one combined dump file, allowing more convenient restores of
individual databases. Enable this by specifying your desired database dump
`format`:
```yaml
hooks:
postgresql_databases:
- name: all
format: custom
mysql_databases:
- name: all
format: sql
```
### No source directories
<span class="minilink minilink-addedin">New in version 1.7.1</span> If you
would like to back up databases only and not source directories, you can omit
`source_directories` entirely.
In older versions of borgmatic, instead specify an empty `source_directories`
value, as it is a mandatory option prior to version 1.7.1:
```yaml
location:
@ -102,6 +127,8 @@ hooks:
- name: all
```
### External passwords
If you don't want to keep your database passwords in your borgmatic
@ -133,14 +160,13 @@ that you'd like supported.
To restore a database dump from an archive, use the `borgmatic restore`
action. But the first step is to figure out which archive to restore from. A
good way to do that is to use the `list` action:
good way to do that is to use the `rlist` action:
```bash
borgmatic list
borgmatic rlist
```
(No borgmatic `list` action? Try the old-style `--list`, or upgrade
borgmatic!)
(No borgmatic `rlist` action? Try `list` instead or upgrade borgmatic!)
That should yield output looking something like:
@ -212,8 +238,13 @@ databases that share the exact same name on different hosts.
setting to support dump and restore streaming, you'll need to ensure that any
special files are excluded from backups (named pipes, block devices,
character devices, and sockets) to prevent hanging. Try a command like
`find /your/source/path -type c,b,p,s` to find such files. Common directories
to exclude are `/dev` and `/run`, but that may not be exhaustive.
`find /your/source/path -type b -or -type c -or -type p -or -type s` to find
such files. Common directories to exclude are `/dev` and `/run`, but that may
not be exhaustive. <span class="minilink minilink-addedin">New in version
1.7.3</span> When database hooks are enabled, borgmatic automatically excludes
special files that may cause Borg to hang, so you no longer need to manually
exclude them. (This includes symlinks with special files as a destination.) You
can override/prevent this behavior by explicitly setting `read_special` to true.
### Manual restoration
@ -269,3 +300,7 @@ Alternatively, if excluding special files is too onerous, you can create two
separate borgmatic configuration files—one for your source files and a
separate one for backing up databases. That way, the database `read_special`
option will not be active when backing up special files.
<span class="minilink minilink-addedin">New in version 1.7.3</span> See
Limitations above about borgmatic's automatic exclusion of special files to
prevent Borg hangs.

View File

@ -27,9 +27,6 @@ borgmatic create
borgmatic check
```
(No borgmatic `prune`, `create`, or `check` actions? Try the old-style
`--prune`, `--create`, or `--check`. Or upgrade borgmatic!)
You can run with only one of these actions provided, or you can mix and match
any number of them in a single borgmatic run. This supports approaches like
skipping certain actions while running others. For instance, this skips
@ -47,9 +44,11 @@ consistency checks with `check` on a much less frequent basis (e.g. with
### Consistency check configuration
Another option is to customize your consistency checks. The default
consistency checks run both full-repository checks and per-archive checks
within each repository no more than once a month.
Another option is to customize your consistency checks. By default, if you
omit consistency checks from configuration, borgmatic runs full-repository
checks (`repository`) and per-archive checks (`archives`) within each
repository, no more than once a month. This is equivalent to what `borg check`
does if run without options.
But if you find that archive checks are too slow, for example, you can
configure borgmatic to run repository checks only. Configure this in the
@ -68,9 +67,14 @@ Here are the available checks from fastest to slowest:
* `repository`: Checks the consistency of the repository itself.
* `archives`: Checks all of the archives in the repository.
* `extract`: Performs an extraction dry-run of the most recent archive.
* `data`: Verifies the data integrity of all archives contents, decrypting and decompressing all data (implies `archives` as well).
* `data`: Verifies the data integrity of all archives' contents, decrypting and decompressing all data.
See [Borg's check documentation](https://borgbackup.readthedocs.io/en/stable/usage/check.html) for more information.
Note that the `data` check is a more thorough version of the `archives` check,
so enabling the `data` check implicitly enables the `archives` check as well.
See [Borg's check
documentation](https://borgbackup.readthedocs.io/en/stable/usage/check.html)
for more information.
### Check frequency
@ -97,12 +101,12 @@ Unlike a real scheduler like cron, borgmatic only makes a best effort to run
checks on the configured frequency. It compares that frequency with how long
it's been since the last check for a given repository (as recorded in a file
within `~/.borgmatic/checks`). If it hasn't been long enough, the check is
skipped. And you still have to run `borgmatic check` (or just `borgmatic`) in
order for checks to run, even when a `frequency` is configured!
skipped. And you still have to run `borgmatic check` (or `borgmatic` without
actions) in order for checks to run, even when a `frequency` is configured!
This also applies *across* configuration files that have the same repository
configured. Just make sure you have the same check frequency configured in
each—or the most frequently configured check will apply.
configured. Make sure you have the same check frequency configured in each
though—or the most frequently configured check will apply.
If you want to temporarily ignore your configured frequencies, you can invoke
`borgmatic check --force` to run checks unconditionally.

View File

@ -9,14 +9,13 @@ eleventyNavigation:
When the worst happens—or you want to test your backups—the first step is
to figure out which archive to extract. A good way to do that is to use the
`list` action:
`rlist` action:
```bash
borgmatic list
borgmatic rlist
```
(No borgmatic `list` action? Try the old-style `--list`, or upgrade
borgmatic!)
(No borgmatic `rlist` action? Try `list` instead or upgrade borgmatic!)
That should yield output looking something like:
@ -32,10 +31,9 @@ and therefore the latest timestamp, run a command like:
borgmatic extract --archive host-2019-01-02T04:06:07.080910
```
(No borgmatic `extract` action? Try the old-style `--extract`, or upgrade
borgmatic!)
(No borgmatic `extract` action? Upgrade borgmatic!)
With newer versions of borgmatic, you can simplify this to:
Or simplify this to:
```bash
borgmatic extract --archive latest
@ -43,7 +41,8 @@ borgmatic extract --archive latest
The `--archive` value is the name of the archive to extract. This extracts the
entire contents of the archive to the current directory, so make sure you're
in the right place before running the command.
in the right place before running the command—or see below about the
`--destination` flag.
## Repository selection
@ -65,13 +64,15 @@ everything from an archive. To do that, tack on one or more `--path` values.
For instance:
```bash
borgmatic extract --archive host-2019-... --path path/1 path/2
borgmatic extract --archive latest --path path/1 path/2
```
Note that the specified restore paths should not have a leading slash. Like a
whole-archive extract, this also extracts into the current directory. So for
example, if you happen to be in the directory `/var` and you run the `extract`
command above, borgmatic will extract `/var/path/1` and `/var/path/2`.
whole-archive extract, this also extracts into the current directory by
default. So for example, if you happen to be in the directory `/var` and you
run the `extract` command above, borgmatic will extract `/var/path/1` and
`/var/path/2`.
## Extract to a particular destination
@ -80,7 +81,7 @@ extract files to a particular destination directory, use the `--destination`
flag:
```bash
borgmatic extract --archive host-2019-... --destination /tmp
borgmatic extract --archive latest --destination /tmp
```
When using the `--destination` flag, be careful not to overwrite your system's
@ -104,7 +105,7 @@ archive as a [FUSE](https://en.wikipedia.org/wiki/Filesystem_in_Userspace)
filesystem, you can use the `borgmatic mount` action. Here's an example:
```bash
borgmatic mount --archive host-2019-... --mount-point /mnt
borgmatic mount --archive latest --mount-point /mnt
```
This mounts the entire archive on the given mount point `/mnt`, so that you
@ -127,7 +128,7 @@ your archive, use the `--path` flag, similar to the `extract` action above.
For instance:
```bash
borgmatic mount --archive host-2019-... --mount-point /mnt --path var/lib
borgmatic mount --archive latest --mount-point /mnt --path var/lib
```
When you're all done exploring your files, unmount your mount point. No

View File

@ -37,18 +37,34 @@ borgmatic --stats
## Existing backups
borgmatic provides convenient actions for Borg's
[list](https://borgbackup.readthedocs.io/en/stable/usage/list.html) and
[info](https://borgbackup.readthedocs.io/en/stable/usage/info.html)
[`list`](https://borgbackup.readthedocs.io/en/stable/usage/list.html) and
[`info`](https://borgbackup.readthedocs.io/en/stable/usage/info.html)
functionality:
```bash
borgmatic list
borgmatic info
```
(No borgmatic `list` or `info` actions? Try the old-style `--list` or
`--info`. Or upgrade borgmatic!)
You can change the output format of `borgmatic list` by specifying your own
with `--format`. Refer to the [borg list --format
documentation](https://borgbackup.readthedocs.io/en/stable/usage/list.html#the-format-specifier-syntax)
for available values.
*(No borgmatic `list` or `info` actions? Upgrade borgmatic!)*
<span class="minilink minilink-addedin">New in borgmatic version 1.7.0</span>
There are also `rlist` and `rinfo` actions for displaying repository
information with Borg 2.x:
```bash
borgmatic rlist
borgmatic rinfo
```
See the [borgmatic command-line
reference](https://torsion.org/borgmatic/docs/reference/command-line/) for
more information.
### Searching for a file
@ -68,7 +84,7 @@ be a [Borg
pattern](https://borgbackup.readthedocs.io/en/stable/usage/help.html#borg-patterns).
To limit the archives searched, use the standard `list` parameters for
filtering archives such as `--last`, `--archive`, `--glob-archives`, etc. For
filtering archives such as `--last`, `--archive`, `--match-archives`, etc. For
example, to search only the last five archives:
```bash

View File

@ -20,8 +20,8 @@ location:
# Paths of local or remote repositories to backup to.
repositories:
- 1234@usw-s001.rsync.net:backups.borg
- k8pDxu32@k8pDxu32.repo.borgbase.com:repo
- ssh://1234@usw-s001.rsync.net/./backups.borg
- ssh://k8pDxu32@k8pDxu32.repo.borgbase.com/./repo
- /var/lib/backups/local.borg
```
@ -42,3 +42,13 @@ potentially across providers.
See [Borg repository URLs
documentation](https://borgbackup.readthedocs.io/en/stable/usage/general.html#repository-urls)
for more information on how to specify local and remote repository paths.
### Different options per repository
What if you want borgmatic to back up to multiple repositories—while also
setting different options for each one? In that case, you'll need to use
[a separate borgmatic configuration file for each
repository](https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/)
instead of the multiple repositories in one configuration file as described
above. That's because all of the repositories in a particular configuration
file get the same options applied.

View File

@ -8,13 +8,15 @@ eleventyNavigation:
## Multiple backup configurations
You may find yourself wanting to create different backup policies for
different applications on your system. For instance, you may want one backup
configuration for your database data directory, and a different configuration
for your user home directories.
different applications on your system or even for different backup
repositories. For instance, you might want one backup configuration for your
database data directory and a different configuration for your user home
directories. Or one backup configuration for your local backups with a
different configuration for your remote repository.
The way to accomplish that is pretty simple: Create multiple separate
configuration files and place each one in a `/etc/borgmatic.d/` directory. For
instance:
instance, for applications:
```bash
sudo mkdir /etc/borgmatic.d
@ -22,6 +24,14 @@ sudo generate-borgmatic-config --destination /etc/borgmatic.d/app1.yaml
sudo generate-borgmatic-config --destination /etc/borgmatic.d/app2.yaml
```
Or, for repositories:
```bash
sudo mkdir /etc/borgmatic.d
sudo generate-borgmatic-config --destination /etc/borgmatic.d/repo1.yaml
sudo generate-borgmatic-config --destination /etc/borgmatic.d/repo2.yaml
```
When you set up multiple configuration files like this, borgmatic will run
each one in turn from a single borgmatic invocation. This includes, by
default, the traditional `/etc/borgmatic/config.yaml` as well.
@ -29,7 +39,8 @@ default, the traditional `/etc/borgmatic/config.yaml` as well.
Each configuration file is interpreted independently, as if you ran borgmatic
for each configuration file one at a time. In other words, borgmatic does not
perform any merging of configuration files by default. If you'd like borgmatic
to merge your configuration files, see below about configuration includes.
to merge your configuration files, for instance to avoid duplication of
settings, see below about configuration includes.
Additionally, the `~/.config/borgmatic.d/` directory works the same way as
`/etc/borgmatic.d`.
@ -95,11 +106,60 @@ But if you do want to merge in a YAML key *and* its values, keep reading!
## Include merging
If you need to get even fancier and pull in common configuration options while
potentially overriding individual options, you can perform a YAML merge of
included configuration using the YAML `<<` key. For instance, here's an
example of a main configuration file that pulls in two retention options via
an include and then overrides one of them locally:
If you need to get even fancier and merge in common configuration options, you
can perform a YAML merge of included configuration using the YAML `<<` key.
For instance, here's an example of a main configuration file that pulls in
retention and consistency options via a single include:
```yaml
<<: !include /etc/borgmatic/common.yaml
location:
...
```
This is what `common.yaml` might look like:
```yaml
retention:
keep_hourly: 24
keep_daily: 7
consistency:
checks:
- name: repository
```
Once this include gets merged in, the resulting configuration would have all
of the `location` options from the original configuration file *and* the
`retention` and `consistency` options from the include.
Prior to borgmatic version 1.6.0, when there's a section collision between the
local file and the merged include, the local file's section takes precedence.
So if the `retention` section appears in both the local file and the include
file, the included `retention` is ignored in favor of the local `retention`.
But see below about deep merge in version 1.6.0+.
Note that this `<<` include merging syntax is only for merging in mappings
(configuration options and their values). But if you'd like to include a
single value directly, please see the section above about standard includes.
Additionally, there is a limitation preventing multiple `<<` include merges
per section. So for instance, that means you can do one `<<` merge at the
global level, another `<<` within each configuration section, etc. (This is a
YAML limitation.)
### Deep merge
<span class="minilink minilink-addedin">New in version 1.6.0</span> borgmatic
performs a deep merge of merged include files, meaning that values are merged
at all levels in the two configuration files. This allows you to include
common configuration—up to full borgmatic configuration files—while overriding
only the parts you want to customize.
For instance, here's an example of a main configuration file that pulls in two
retention options via an include and then overrides one of them locally:
```yaml
<<: !include /etc/borgmatic/common.yaml
@ -125,24 +185,8 @@ Once this include gets merged in, the resulting configuration would have a
When there's an option collision between the local file and the merged
include, the local file's option takes precedence.
Note that this `<<` include merging syntax is only for merging in mappings
(configuration options and their values). But if you'd like to include a
single value directly, please see the section above about standard includes.
Additionally, there is a limitation preventing multiple `<<` include merges
per section. So for instance, that means you can do one `<<` merge at the
global level, another `<<` within each configuration section, etc. (This is a
YAML limitation.)
### Deep merge
<span class="minilink minilink-addedin">New in version 1.6.0</span> borgmatic
performs a deep merge of merged include files, meaning that values are merged
at all levels in the two configuration files. Colliding list values are
appended together. This allows you to include common configuration—up to full
borgmatic configuration files—while overriding only the parts you want to
customize.
<span class="minilink minilink-addedin">New in version 1.6.1</span> Colliding
list values are appended together.
## Configuration overrides

View File

@ -158,9 +158,9 @@ itself. But the logs are only included for errors that occur when a `prune`,
`compact`, `create`, or `check` action is run.
You can customize the verbosity of the logs that are sent to Healthchecks with
borgmatic's `--monitoring-verbosity` flag. The `--files` and `--stats` flags
may also be of use. See `borgmatic --help` for more information. Additionally,
see the [borgmatic configuration
borgmatic's `--monitoring-verbosity` flag. The `--list` and `--stats` flags
may also be of use. See `borgmatic create --help` for more information.
Additionally, see the [borgmatic configuration
file](https://torsion.org/borgmatic/docs/reference/configuration/) for
additional Healthchecks options.
@ -319,8 +319,8 @@ hooks:
## Scripting borgmatic
To consume the output of borgmatic in other software, you can include an
optional `--json` flag with `create`, `list`, or `info` to get the output
formatted as JSON.
optional `--json` flag with `create`, `rlist`, `rinfo`, or `info` to get the
output formatted as JSON.
Note that when you specify the `--json` flag, Borg's other non-JSON output is
suppressed so as not to interfere with the captured JSON. Also note that JSON
@ -329,9 +329,9 @@ output only shows up at the console, and not in syslog.
### Latest backups
All borgmatic actions that accept an "--archive" flag allow you to specify an
archive name of "latest". This lets you get the latest archive without having
to first run "borgmatic list" manually, which can be handy in automated
All borgmatic actions that accept an `--archive` flag allow you to specify an
archive name of `latest`. This lets you get the latest archive without having
to first run `borgmatic rlist` manually, which can be handy in automated
scripts. Here's an example:
```bash

View File

@ -25,8 +25,8 @@ storage:
```
This uses the `MY_PASSPHRASE` environment variable as your encryption
passphrase. Note that the `{` `}` brackets are required. Just `$MY_PASSPHRASE`
will not work.
passphrase. Note that the `{` `}` brackets are required. `$MY_PASSPHRASE` by
itself will not work.
In the case of `encryption_passphrase` in particular, an alternate approach
is to use Borg's `BORG_PASSPHRASE` environment variable, which doesn't even

View File

@ -30,8 +30,8 @@ based on your borgmatic configuration files or command-line arguments:
### borg action
The way you run Borg with borgmatic is via the `borg` action. Here's a simple
example:
<span class="minilink minilink-addedin">New in version 1.5.15</span> The way
you run Borg with borgmatic is via the `borg` action. Here's a simple example:
```bash
borgmatic borg break-lock
@ -46,12 +46,11 @@ options, as that part is provided by borgmatic.
You can also specify Borg options for relevant commands:
```bash
borgmatic borg list --progress
borgmatic borg rlist --short
```
This runs Borg's `list` command once on each configured borgmatic
repository. However, the native `borgmatic list` action should be preferred
for most use.
This runs Borg's `rlist` command once on each configured borgmatic repository.
(The native `borgmatic rlist` action should be preferred for most use.)
What if you only want to run Borg on a single configured borgmatic repository
when you've got several configured? Not a problem.
@ -63,7 +62,7 @@ borgmatic borg --repository repo.borg break-lock
And what about a single archive?
```bash
borgmatic borg --archive your-archive-name list
borgmatic borg --archive your-archive-name rlist
```
### Limitations
@ -88,6 +87,9 @@ borgmatic's `borg` action is not without limitations:
borgmatic action. In this case, only the Borg command is run.
* Unlike normal borgmatic actions that support JSON, the `borg` action will
not disable certain borgmatic logs to avoid interfering with JSON output.
* Unlike other borgmatic actions, the `borg` action captures (and logs) all
output, so interactive prompts or flags like `--progress` will not work as
expected.
In general, this `borgmatic borg` feature should be considered an escape
valve—a feature of second resort. In the long run, it's preferable to wrap

View File

@ -186,32 +186,39 @@ files via configuration management, or you want to double check that your hand
edits are valid.
## Initialization
## Repository creation
Before you can create backups with borgmatic, you first need to initialize a
Borg repository so you have a destination for your backup archives. (But skip
this step if you already have a Borg repository.) To create a repository, run
a command like the following:
Before you can create backups with borgmatic, you first need to create a Borg
repository so you have a destination for your backup archives. (But skip this
step if you already have a Borg repository.) To create a repository, run a
command like the following with Borg 1.x:
```bash
sudo borgmatic init --encryption repokey
```
(No borgmatic `init` action? Try the old-style `--init` flag, or upgrade
borgmatic!)
<span class="minilink minilink-addedin">New in borgmatic version 1.7.0</span>
Or, with Borg 2.x:
```bash
sudo borgmatic rcreate --encryption repokey-aes-ocb
```
(Note that `repokey-chacha20-poly1305` may be faster than `repokey-aes-ocb` on
certain platforms like ARM64.)
This uses the borgmatic configuration file you created above to determine
which local or remote repository to create, and encrypts it with the
encryption passphrase specified there if one is provided. Read about [Borg
encryption
modes](https://borgbackup.readthedocs.io/en/stable/usage/init.html#encryption-modes)
modes](https://borgbackup.readthedocs.io/en/stable/usage/init.html#encryption-mode-tldr)
for the menu of available encryption modes.
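For reference, a minimal configuration sketch that such a creation command
might read looks like this (the repository path and passphrase are
placeholders):

```yaml
location:
    repositories:
        - ssh://user@backupserver/./sourcehostname.borg

storage:
    encryption_passphrase: "yourpassphrase"
```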
Also, optionally check out the [Borg Quick
Start](https://borgbackup.readthedocs.org/en/stable/quickstart.html) for more
background about repository initialization.
background about repository creation.
Note that borgmatic skips repository initialization if the repository already
Note that borgmatic skips repository creation if the repository already
exists. This supports use cases like ensuring a repository exists prior to
performing a backup.
@ -221,21 +228,21 @@ key-based SSH access to the desired user account on the remote host.
## Backups
Now that you've configured borgmatic and initialized a repository, it's a
good idea to test that borgmatic is working. So to run borgmatic and start a
Now that you've configured borgmatic and created a repository, it's a good
idea to test that borgmatic is working. So to run borgmatic and start a
backup, you can invoke it like this:
```bash
sudo borgmatic create --verbosity 1 --files --stats
sudo borgmatic create --verbosity 1 --list --stats
```
(No borgmatic `--files` flag? It's only present in newer versions of
borgmatic. So try leaving it out, or upgrade borgmatic!)
(No borgmatic `--list` flag? Try `--files` instead, leave it out, or upgrade
borgmatic!)
The `--verbosity` flag makes borgmatic show the steps it's performing. The
`--files` flag lists each file that's new or changed since the last backup.
And `--stats` shows summary information about the created archive. All of
these flags are optional.
`--list` flag lists each file that's new or changed since the last backup. And
`--stats` shows summary information about the created archive. All of these
flags are optional.
As the command runs, you should eyeball the output to see if it matches your
expectations based on your configuration.
@ -255,7 +262,7 @@ backup, *and* `check` backups for consistency problems due to things like file
damage. For instance:
```bash
sudo borgmatic --verbosity 1 --files --stats
sudo borgmatic --verbosity 1 --list --stats
```
## Autopilot
@ -340,7 +347,7 @@ instead:
sudo su -c "borgmatic --bash-completion > /usr/share/bash-completion/completions/borgmatic"
```
Or, if you'd like to install the script for just the current user:
Or, if you'd like to install the script for only the current user:
```bash
mkdir --parents ~/.local/share/bash-completion/completions

View File

@ -1,11 +1,11 @@
---
title: How to upgrade borgmatic
title: How to upgrade borgmatic and Borg
eleventyNavigation:
key: 📦 Upgrade borgmatic
key: 📦 Upgrade borgmatic/Borg
parent: How-to guides
order: 12
---
## Upgrading
## Upgrading borgmatic
In general, all you should need to do to upgrade borgmatic is run the
following:
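For instance, here's a sketch assuming borgmatic was installed via pip as
described in the installation documentation:

```bash
sudo pip3 install --user --upgrade borgmatic
```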
@ -115,3 +115,89 @@ sudo pip3 install --user borgmatic
That's it! borgmatic will continue using your /etc/borgmatic configuration
files.
## Upgrading Borg
To upgrade to a new version of Borg, you can generally install a new version
the same way you installed the previous version, paying attention to any
instructions included with each Borg release changelog linked from the
[releases page](https://github.com/borgbackup/borg/releases). However, some
major Borg releases require additional steps that borgmatic can help with.
### Borg 1.2 to 2.0
<span class="minilink minilink-addedin">New in borgmatic version 1.7.0</span>
Upgrading Borg from 1.2 to 2.0 requires manually upgrading your existing Borg
1 repositories before use with Borg or borgmatic. Here's how you can
accomplish that.
Start by upgrading borgmatic as described above to at least version 1.7.0 and
Borg to 2.0. Then, rename your repository in borgmatic's configuration file to
a new repository path. The repository upgrade process does not occur
in-place; you'll create a new repository with a copy of your old repository's
data.
Let's say your original borgmatic repository configuration file looks something
like this:
```yaml
location:
repositories:
- original.borg
```
Change it to a new (not yet created) repository path:
```yaml
location:
repositories:
- upgraded.borg
```
Then, run the `rcreate` action (formerly `init`) to create that new Borg 2
repository:
```bash
borgmatic rcreate --verbosity 1 --encryption repokey-blake2-aes-ocb \
--source-repository original.borg --repository upgraded.borg
```
This creates an empty repository and doesn't actually transfer any data yet.
The `--source-repository` flag is necessary to reuse key material from your
Borg 1 repository so that the subsequent data transfer can work.
The `--encryption` value above selects the same chunk ID algorithm (`blake2`)
used in Borg 1, thereby making deduplication work across transferred archives
and new archives. Note that `repokey-blake2-chacha20-poly1305` may be faster
than `repokey-blake2-aes-ocb` on certain platforms like ARM64. Read about
[Borg encryption
modes](https://borgbackup.readthedocs.io/en/2.0.0b4/usage/rcreate.html#encryption-mode-tldr)
for the menu of available encryption modes.
To transfer data from your original Borg 1 repository to your newly created
Borg 2 repository:
```bash
borgmatic transfer --verbosity 1 --upgrader From12To20 --source-repository \
original.borg --repository upgraded.borg --dry-run
borgmatic transfer --verbosity 1 --upgrader From12To20 --source-repository \
original.borg --repository upgraded.borg
borgmatic transfer --verbosity 1 --upgrader From12To20 --source-repository \
original.borg --repository upgraded.borg --dry-run
```
The first command with `--dry-run` tells you what Borg is going to do during
the transfer, the second command actually performs the transfer/upgrade (this
might take a while), and the final command with `--dry-run` again provides
confirmation of success—or tells you if something hasn't been transferred yet.
Note that by omitting the `--upgrader` flag, you can also do archive transfers
between related Borg 2 repositories without upgrading, even down to individual
archives. For more on that functionality, see the [Borg transfer
documentation](https://borgbackup.readthedocs.io/en/2.0.0b4/usage/transfer.html).
That's it! Now you can use your new Borg 2 repository as normal with
borgmatic. If you've got multiple repositories, repeat the above process for
each.

View File

@ -41,7 +41,7 @@ ProtectSystem=full
# ReadOnlyPaths=-/var/lib/my_backup_source
# This will mount a tmpfs on top of /root and pass through needed paths
# ProtectHome=tmpfs
# BindPaths=-/root/.cache/borg -/root/.cache/borg -/root/.borgmatic
# BindPaths=-/root/.cache/borg -/root/.config/borg -/root/.borgmatic
# May interfere with running external programs within borgmatic hooks.
CapabilityBoundingSet=CAP_DAC_READ_SEARCH CAP_NET_RAW
@ -61,4 +61,4 @@ LogRateLimitIntervalSec=0
# Delay start to prevent backups running during boot. Note that systemd-inhibit requires dbus and
# dbus-user-session to be installed.
ExecStartPre=sleep 1m
ExecStart=systemd-inhibit --who="borgmatic" --why="Prevent interrupting scheduled backup" /root/.local/bin/borgmatic --verbosity -1 --syslog-verbosity 1
ExecStart=systemd-inhibit --who="borgmatic" --what="sleep:shutdown" --why="Prevent interrupting scheduled backup" /root/.local/bin/borgmatic --verbosity -1 --syslog-verbosity 1

View File

@ -53,6 +53,7 @@ for sub_command in prune create check list info; do
| grep -v '^--first' \
| grep -v '^--format' \
| grep -v '^--glob-archives' \
| grep -v '^--match-archives' \
| grep -v '^--last' \
| grep -v '^--format' \
| grep -v '^--patterns-from' \

View File

@ -14,8 +14,8 @@ apk add --no-cache python3 py3-pip borgbackup postgresql-client mariadb-client m
py3-ruamel.yaml py3-ruamel.yaml.clib bash
# If certain dependencies of black are available in this version of Alpine, install them.
apk add --no-cache py3-typed-ast py3-regex || true
python3 -m pip install --no-cache --upgrade pip==22.0.3 setuptools==60.8.1
pip3 install tox==3.24.5
python3 -m pip install --no-cache --upgrade pip==22.2.2 setuptools==64.0.1
pip3 install --ignore-installed tox==3.25.1
export COVERAGE_FILE=/tmp/.coverage
tox --workdir /tmp/.tox --sitepackages
tox --workdir /tmp/.tox --sitepackages -e end-to-end

View File

@ -1,6 +1,6 @@
from setuptools import find_packages, setup
VERSION = '1.6.6'
VERSION = '1.7.6.dev0'
setup(

View File

@ -14,8 +14,8 @@ py==1.10.0
pycodestyle==2.8.0
pyflakes==2.4.0
jsonschema==3.2.0
pytest==6.2.5
pytest-cov==3.0.0
pytest==7.2.0
pytest-cov==4.0.0
regex; python_version >= '3.8'
requests==2.25.0
ruamel.yaml>0.15.0,<0.18.0

View File

@ -18,8 +18,9 @@ def generate_configuration(config_path, repository_path):
config = (
open(config_path)
.read()
.replace('user@backupserver:sourcehostname.borg', repository_path)
.replace('- user@backupserver:{fqdn}', '')
.replace('ssh://user@backupserver/./sourcehostname.borg', repository_path)
.replace('- ssh://user@backupserver/./{fqdn}', '')
.replace('- /var/local/backups/local.borg', '')
.replace('- /home/user/path with spaces', '')
.replace('- /home', '- {}'.format(config_path))
.replace('- /etc', '')

View File

@ -9,20 +9,24 @@ import pytest
def write_configuration(
config_path, repository_path, borgmatic_source_directory, postgresql_dump_format='custom'
source_directory,
config_path,
repository_path,
borgmatic_source_directory,
postgresql_dump_format='custom',
):
'''
Write out borgmatic configuration into a file at the config path. Set the options so as to work
for testing. This includes injecting the given repository path, borgmatic source directory for
storing database dumps, dump format (for PostgreSQL), and encryption passphrase.
'''
config = '''
config = f'''
location:
source_directories:
- {}
- {source_directory}
repositories:
- {}
borgmatic_source_directory: {}
- {repository_path}
borgmatic_source_directory: {borgmatic_source_directory}
storage:
encryption_passphrase: "test"
@ -33,7 +37,7 @@ hooks:
hostname: postgresql
username: postgres
password: test
format: {}
format: {postgresql_dump_format}
- name: all
hostname: postgresql
username: postgres
@ -57,9 +61,7 @@ hooks:
hostname: mongodb
username: root
password: test
'''.format(
config_path, repository_path, borgmatic_source_directory, postgresql_dump_format
)
'''
with open(config_path, 'w') as config_file:
config_file.write(config)
@ -71,11 +73,16 @@ def test_database_dump_and_restore():
repository_path = os.path.join(temporary_directory, 'test.borg')
borgmatic_source_directory = os.path.join(temporary_directory, '.borgmatic')
# Write out a special file to ensure that it gets properly excluded and Borg doesn't hang on it.
os.mkfifo(os.path.join(temporary_directory, 'special_file'))
original_working_directory = os.getcwd()
try:
config_path = os.path.join(temporary_directory, 'test.yaml')
write_configuration(config_path, repository_path, borgmatic_source_directory)
write_configuration(
temporary_directory, config_path, repository_path, borgmatic_source_directory
)
subprocess.check_call(
['borgmatic', '-v', '2', '--config', config_path, 'init', '--encryption', 'repokey']
@ -114,6 +121,7 @@ def test_database_dump_and_restore_with_directory_format():
try:
config_path = os.path.join(temporary_directory, 'test.yaml')
write_configuration(
temporary_directory,
config_path,
repository_path,
borgmatic_source_directory,
@ -146,7 +154,9 @@ def test_database_dump_with_error_causes_borgmatic_to_exit():
try:
config_path = os.path.join(temporary_directory, 'test.yaml')
write_configuration(config_path, repository_path, borgmatic_source_directory)
write_configuration(
temporary_directory, config_path, repository_path, borgmatic_source_directory
)
subprocess.check_call(
['borgmatic', '-v', '2', '--config', config_path, 'init', '--encryption', 'repokey']

View File

@ -0,0 +1,16 @@
import os
import subprocess
import tempfile
def test_generate_borgmatic_config_with_merging_succeeds():
with tempfile.TemporaryDirectory() as temporary_directory:
config_path = os.path.join(temporary_directory, 'test.yaml')
new_config_path = os.path.join(temporary_directory, 'new.yaml')
subprocess.check_call(f'generate-borgmatic-config --destination {config_path}'.split(' '))
subprocess.check_call(
f'generate-borgmatic-config --source {config_path} --destination {new_config_path}'.split(
' '
)
)

View File

@ -16,8 +16,9 @@ def generate_configuration(config_path, repository_path):
config = (
open(config_path)
.read()
.replace('user@backupserver:sourcehostname.borg', repository_path)
.replace('- user@backupserver:{fqdn}', '')
.replace('ssh://user@backupserver/./sourcehostname.borg', repository_path)
.replace('- ssh://user@backupserver/./{fqdn}', '')
.replace('- /var/local/backups/local.borg', '')
.replace('- /home/user/path with spaces', '')
.replace('- /home', '- {}'.format(config_path))
.replace('- /etc', '')
@ -32,11 +33,8 @@ def generate_configuration(config_path, repository_path):
def test_override_get_normalized():
temporary_directory = tempfile.mkdtemp()
repository_path = os.path.join(temporary_directory, 'test.borg')
extract_path = os.path.join(temporary_directory, 'extract')
original_working_directory = os.getcwd()
os.mkdir(extract_path)
os.chdir(extract_path)
try:
config_path = os.path.join(temporary_directory, 'test.yaml')

View File

@ -107,13 +107,6 @@ def test_parse_arguments_with_list_json_overrides_default():
assert arguments['list'].json is True
def test_parse_arguments_with_dashed_list_json_overrides_default():
arguments = module.parse_arguments('--list', '--json')
assert 'list' in arguments
assert arguments['list'].json is True
def test_parse_arguments_with_no_actions_defaults_to_all_actions_enabled():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -127,14 +120,14 @@ def test_parse_arguments_with_no_actions_defaults_to_all_actions_enabled():
def test_parse_arguments_with_no_actions_passes_argument_to_relevant_actions():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
arguments = module.parse_arguments('--stats', '--files')
arguments = module.parse_arguments('--stats', '--list')
assert 'prune' in arguments
assert arguments['prune'].stats
assert arguments['prune'].files
assert arguments['prune'].list_archives
assert 'create' in arguments
assert arguments['create'].stats
assert arguments['create'].files
assert arguments['create'].list_files
assert 'check' in arguments
@ -191,16 +184,6 @@ def test_parse_arguments_with_prune_action_leaves_other_actions_disabled():
assert 'check' not in arguments
def test_parse_arguments_with_dashed_prune_action_leaves_other_actions_disabled():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
arguments = module.parse_arguments('--prune')
assert 'prune' in arguments
assert 'create' not in arguments
assert 'check' not in arguments
def test_parse_arguments_with_multiple_actions_leaves_other_action_disabled():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -211,16 +194,6 @@ def test_parse_arguments_with_multiple_actions_leaves_other_action_disabled():
assert 'check' in arguments
def test_parse_arguments_with_multiple_dashed_actions_leaves_other_action_disabled():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
arguments = module.parse_arguments('--create', '--check')
assert 'prune' not in arguments
assert 'create' in arguments
assert 'check' in arguments
def test_parse_arguments_with_invalid_arguments_exits():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -248,12 +221,6 @@ def test_parse_arguments_allows_encryption_mode_with_init():
module.parse_arguments('--config', 'myconfig', 'init', '--encryption', 'repokey')
def test_parse_arguments_allows_encryption_mode_with_dashed_init():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--config', 'myconfig', '--init', '--encryption', 'repokey')
def test_parse_arguments_requires_encryption_mode_with_init():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -287,15 +254,6 @@ def test_parse_arguments_allows_init_and_create():
module.parse_arguments('--config', 'myconfig', 'init', '--encryption', 'repokey', 'create')
def test_parse_arguments_disallows_init_and_dry_run():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments(
'--config', 'myconfig', 'init', '--encryption', 'repokey', '--dry-run'
)
def test_parse_arguments_disallows_repository_unless_action_consumes_it():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -361,24 +319,12 @@ def test_parse_arguments_allows_archive_with_mount():
)
def test_parse_arguments_allows_archive_with_dashed_extract():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--config', 'myconfig', '--extract', '--archive', 'test')
def test_parse_arguments_allows_archive_with_restore():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--config', 'myconfig', 'restore', '--archive', 'test')
def test_parse_arguments_allows_archive_with_dashed_restore():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--config', 'myconfig', '--restore', '--archive', 'test')
def test_parse_arguments_allows_archive_with_list():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -457,23 +403,23 @@ def test_parse_arguments_with_stats_flag_but_no_create_or_prune_flag_raises_valu
module.parse_arguments('--stats', 'list')
def test_parse_arguments_with_files_and_create_flags_does_not_raise():
def test_parse_arguments_with_list_and_create_flags_does_not_raise():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--files', 'create', 'list')
module.parse_arguments('--list', 'create')
def test_parse_arguments_with_files_and_prune_flags_does_not_raise():
def test_parse_arguments_with_list_and_prune_flags_does_not_raise():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--files', 'prune', 'list')
module.parse_arguments('--list', 'prune')
def test_parse_arguments_with_files_flag_but_no_create_or_prune_or_restore_flag_raises_value_error():
def test_parse_arguments_with_list_flag_but_no_relevant_action_raises_value_error():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(SystemExit):
module.parse_arguments('--files', 'list')
module.parse_arguments('--list', 'rcreate')
def test_parse_arguments_allows_json_with_list_or_info():
@ -483,12 +429,6 @@ def test_parse_arguments_allows_json_with_list_or_info():
module.parse_arguments('info', '--json')
def test_parse_arguments_allows_json_with_dashed_info():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
module.parse_arguments('--info', '--json')
def test_parse_arguments_disallows_json_with_both_list_and_info():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
@ -496,6 +436,56 @@ def test_parse_arguments_disallows_json_with_both_list_and_info():
module.parse_arguments('list', 'info', '--json')
def test_parse_arguments_disallows_json_with_both_list_and_rinfo():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments('list', 'rinfo', '--json')
def test_parse_arguments_disallows_json_with_both_rinfo_and_info():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments('rinfo', 'info', '--json')
def test_parse_arguments_disallows_transfer_with_both_archive_and_match_archives():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments(
'transfer',
'--source-repository',
'source.borg',
'--archive',
'foo',
'--match-archives',
'sh:*bar',
)
def test_parse_arguments_disallows_info_with_both_archive_and_match_archives():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments('info', '--archive', 'foo', '--match-archives', 'sh:*bar')
def test_parse_arguments_disallows_info_with_both_archive_and_prefix():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments('info', '--archive', 'foo', '--prefix', 'bar')
def test_parse_arguments_disallows_info_with_both_prefix_and_match_archives():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])
with pytest.raises(ValueError):
module.parse_arguments('info', '--prefix', 'foo', '--match-archives', 'sh:*bar')
def test_parse_arguments_check_only_extract_does_not_raise_extract_subparser_error():
flexmock(module.collect).should_receive('get_default_config_paths').and_return(['default'])

View File

@ -60,39 +60,39 @@ def test_parse_configuration_transforms_file_into_mapping():
'''
)
result = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
config, logs = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
assert result == {
assert config == {
'location': {'source_directories': ['/home', '/etc'], 'repositories': ['hostname.borg']},
'retention': {'keep_daily': 7, 'keep_hourly': 24, 'keep_minutely': 60},
'consistency': {'checks': [{'name': 'repository'}, {'name': 'archives'}]},
}
assert logs == []
def test_parse_configuration_passes_through_quoted_punctuation():
escaped_punctuation = string.punctuation.replace('\\', r'\\').replace('"', r'\"')
mock_config_and_schema(
'''
f'''
location:
source_directories:
- /home
- "/home/{escaped_punctuation}"
repositories:
- "{}.borg"
'''.format(
escaped_punctuation
)
- test.borg
'''
)
result = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
config, logs = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
assert result == {
assert config == {
'location': {
'source_directories': ['/home'],
'repositories': ['{}.borg'.format(string.punctuation)],
'source_directories': [f'/home/{string.punctuation}'],
'repositories': ['test.borg'],
}
}
assert logs == []
def test_parse_configuration_with_schema_lacking_examples_does_not_raise():
@ -148,12 +148,13 @@ def test_parse_configuration_inlines_include():
include_file.name = 'include.yaml'
builtins.should_receive('open').with_args('/tmp/include.yaml').and_return(include_file)
result = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
config, logs = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
assert result == {
assert config == {
'location': {'source_directories': ['/home'], 'repositories': ['hostname.borg']},
'retention': {'keep_daily': 7, 'keep_hourly': 24},
}
assert logs == []
def test_parse_configuration_merges_include():
@ -181,12 +182,13 @@ def test_parse_configuration_merges_include():
include_file.name = 'include.yaml'
builtins.should_receive('open').with_args('/tmp/include.yaml').and_return(include_file)
result = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
config, logs = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
assert result == {
assert config == {
'location': {'source_directories': ['/home'], 'repositories': ['hostname.borg']},
'retention': {'keep_daily': 1, 'keep_hourly': 24},
}
assert logs == []
def test_parse_configuration_raises_for_missing_config_file():
@ -238,17 +240,18 @@ def test_parse_configuration_applies_overrides():
'''
)
result = module.parse_configuration(
config, logs = module.parse_configuration(
'/tmp/config.yaml', '/tmp/schema.yaml', overrides=['location.local_path=borg2']
)
assert result == {
assert config == {
'location': {
'source_directories': ['/home'],
'repositories': ['hostname.borg'],
'local_path': 'borg2',
}
}
assert logs == []
def test_parse_configuration_applies_normalization():
@ -265,12 +268,13 @@ def test_parse_configuration_applies_normalization():
'''
)
result = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
config, logs = module.parse_configuration('/tmp/config.yaml', '/tmp/schema.yaml')
assert result == {
assert config == {
'location': {
'source_directories': ['/home'],
'repositories': ['hostname.borg'],
'exclude_if_present': ['.nobackup'],
}
}
assert logs == []

View File

@ -278,7 +278,7 @@ def test_log_outputs_with_unfinished_process_re_polls():
flexmock(module).should_receive('exit_code_indicates_error').and_return(False)
process = subprocess.Popen(['true'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
flexmock(process).should_receive('poll').and_return(None).and_return(0).twice()
flexmock(process).should_receive('poll').and_return(None).and_return(0).times(3)
flexmock(module).should_receive('output_buffer_for_process').and_return(process.stdout)
module.log_outputs(

View File

@ -8,197 +8,288 @@ from ..test_verbosity import insert_logging_mock
def test_run_arbitrary_borg_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['break-lock'],
repository='repo', storage_config={}, local_borg_version='1.2.3', options=['break-lock'],
)
def test_run_arbitrary_borg_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--info'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.INFO)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['break-lock'],
repository='repo', storage_config={}, local_borg_version='1.2.3', options=['break-lock'],
)
def test_run_arbitrary_borg_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--debug', '--show-rc'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.DEBUG)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['break-lock'],
repository='repo', storage_config={}, local_borg_version='1.2.3', options=['break-lock'],
)
def test_run_arbitrary_borg_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'lock_wait': 5}
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(()).and_return(
('--lock-wait', '5')
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--lock-wait', '5'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config=storage_config, options=['break-lock'],
repository='repo',
storage_config=storage_config,
local_borg_version='1.2.3',
options=['break-lock'],
)
def test_run_arbitrary_borg_with_archive_calls_borg_with_archive_parameter():
storage_config = {}
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config=storage_config, options=['break-lock'], archive='archive',
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['break-lock'],
archive='archive',
)
def test_run_arbitrary_borg_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'break-lock', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg1',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['break-lock'], local_path='borg1',
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['break-lock'],
local_path='borg1',
)
def test_run_arbitrary_borg_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(
('--remote-path', 'borg1')
).and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo', '--remote-path', 'borg1'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['break-lock'], remote_path='borg1',
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['break-lock'],
remote_path='borg1',
)
def test_run_arbitrary_borg_passes_borg_specific_parameters_to_borg():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo', '--progress'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['list', '--progress'],
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['list', '--progress'],
)
def test_run_arbitrary_borg_omits_dash_dash_in_parameters_passed_to_borg():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'break-lock', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['--', 'break-lock'],
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['--', 'break-lock'],
)
def test_run_arbitrary_borg_without_borg_specific_parameters_does_not_raise():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').never()
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg',), output_log_level=logging.WARNING, borg_local_path='borg', extra_environment=None,
('borg',),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=[],
repository='repo', storage_config={}, local_borg_version='1.2.3', options=[],
)
def test_run_arbitrary_borg_passes_key_sub_command_to_borg_before_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'key', 'export', 'repo'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['key', 'export'],
repository='repo', storage_config={}, local_borg_version='1.2.3', options=['key', 'export'],
)
def test_run_arbitrary_borg_passes_debug_sub_command_to_borg_before_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'debug', 'dump-manifest', 'repo', 'path'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['debug', 'dump-manifest', 'path'],
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['debug', 'dump-manifest', 'path'],
)
def test_run_arbitrary_borg_with_debug_info_command_does_not_pass_borg_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').never()
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'debug', 'info'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['debug', 'info'],
repository='repo', storage_config={}, local_borg_version='1.2.3', options=['debug', 'info'],
)
def test_run_arbitrary_borg_with_debug_convert_profile_command_does_not_pass_borg_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_flags').never()
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'debug', 'convert-profile', 'in', 'out'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.run_arbitrary_borg(
repository='repo', storage_config={}, options=['debug', 'convert-profile', 'in', 'out'],
repository='repo',
storage_config={},
local_borg_version='1.2.3',
options=['debug', 'convert-profile', 'in', 'out'],
)

View File

@ -0,0 +1,70 @@
import logging
from flexmock import flexmock
from borgmatic.borg import break_lock as module
from ..test_verbosity import insert_logging_mock
def insert_execute_command_mock(command):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
command, borg_local_path='borg', extra_environment=None,
).once()
def test_break_lock_calls_borg_with_required_flags():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'break-lock', 'repo'))
module.break_lock(
repository='repo', storage_config={}, local_borg_version='1.2.3',
)
def test_break_lock_calls_borg_with_remote_path_flags():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'break-lock', '--remote-path', 'borg1', 'repo'))
module.break_lock(
repository='repo', storage_config={}, local_borg_version='1.2.3', remote_path='borg1',
)
def test_break_lock_calls_borg_with_umask_flags():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'break-lock', '--umask', '0770', 'repo'))
module.break_lock(
repository='repo', storage_config={'umask': '0770'}, local_borg_version='1.2.3',
)
def test_break_lock_calls_borg_with_lock_wait_flags():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'break-lock', '--lock-wait', '5', 'repo'))
module.break_lock(
repository='repo', storage_config={'lock_wait': '5'}, local_borg_version='1.2.3',
)
def test_break_lock_with_log_info_calls_borg_with_info_parameter():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'break-lock', '--info', 'repo'))
insert_logging_mock(logging.INFO)
module.break_lock(
repository='repo', storage_config={}, local_borg_version='1.2.3',
)
def test_break_lock_with_log_debug_calls_borg_with_debug_flags():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'break-lock', '--debug', '--show-rc', 'repo'))
insert_logging_mock(logging.DEBUG)
module.break_lock(
repository='repo', storage_config={}, local_borg_version='1.2.3',
)

View File

@ -49,18 +49,6 @@ def test_parse_checks_with_disabled_returns_no_checks():
assert checks == ()
def test_parse_checks_with_data_check_also_injects_archives():
checks = module.parse_checks({'checks': [{'name': 'data'}]})
assert checks == ('data', 'archives')
def test_parse_checks_with_data_check_passes_through_archives():
checks = module.parse_checks({'checks': [{'name': 'data'}, {'name': 'archives'}]})
assert checks == ('data', 'archives')
def test_parse_checks_prefers_override_checks_to_configured_checks():
checks = module.parse_checks(
{'checks': [{'name': 'archives'}]}, only_checks=['repository', 'extract']
@ -69,12 +57,6 @@ def test_parse_checks_prefers_override_checks_to_configured_checks():
assert checks == ('repository', 'extract')
def test_parse_checks_with_override_data_check_also_injects_archives():
checks = module.parse_checks({'checks': [{'name': 'extract'}]}, only_checks=['data'])
assert checks == ('data', 'archives')
@pytest.mark.parametrize(
'frequency,expected_result',
(
@ -206,89 +188,153 @@ def test_filter_checks_on_frequency_restains_check_with_unelapsed_frequency_and_
def test_make_check_flags_with_repository_check_returns_flag():
flags = module.make_check_flags(('repository',))
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('repository',))
assert flags == ('--repository-only',)
def test_make_check_flags_with_archives_check_returns_flag():
flags = module.make_check_flags(('archives',))
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('archives',))
assert flags == ('--archives-only',)
def test_make_check_flags_with_data_check_returns_flag():
flags = module.make_check_flags(('data',))
def test_make_check_flags_with_data_check_returns_flag_and_implies_archives():
flexmock(module.feature).should_receive('available').and_return(True)
assert flags == ('--verify-data',)
flags = module.make_check_flags('1.2.3', ('data',))
assert flags == ('--archives-only', '--verify-data',)
def test_make_check_flags_with_extract_omits_extract_flag():
flags = module.make_check_flags(('extract',))
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('extract',))
assert flags == ()
def test_make_check_flags_with_default_checks_and_default_prefix_returns_default_flags():
flags = module.make_check_flags(('repository', 'archives'), prefix=module.DEFAULT_PREFIX)
def test_make_check_flags_with_repository_and_data_checks_does_not_return_repository_only():
flexmock(module.feature).should_receive('available').and_return(True)
assert flags == ('--prefix', module.DEFAULT_PREFIX)
flags = module.make_check_flags('1.2.3', ('repository', 'data',))
assert flags == ('--verify-data',)
def test_make_check_flags_with_default_checks_and_default_prefix_returns_default_flags():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags(
'1.2.3', ('repository', 'archives'), prefix=module.DEFAULT_PREFIX
)
assert flags == ('--match-archives', f'sh:{module.DEFAULT_PREFIX}*')
def test_make_check_flags_with_all_checks_and_default_prefix_returns_default_flags():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags(
('repository', 'archives', 'extract'), prefix=module.DEFAULT_PREFIX
'1.2.3', ('repository', 'archives', 'extract'), prefix=module.DEFAULT_PREFIX
)
assert flags == ('--prefix', module.DEFAULT_PREFIX)
assert flags == ('--match-archives', f'sh:{module.DEFAULT_PREFIX}*')
def test_make_check_flags_with_all_checks_and_default_prefix_without_borg_features_returns_glob_archives_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flags = module.make_check_flags(
'1.2.3', ('repository', 'archives', 'extract'), prefix=module.DEFAULT_PREFIX
)
assert flags == ('--glob-archives', f'{module.DEFAULT_PREFIX}*')
def test_make_check_flags_with_archives_check_and_last_includes_last_flag():
flags = module.make_check_flags(('archives',), check_last=3)
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('archives',), check_last=3)
assert flags == ('--archives-only', '--last', '3')
def test_make_check_flags_with_data_check_and_last_includes_last_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('data',), check_last=3)
assert flags == ('--archives-only', '--last', '3', '--verify-data')
def test_make_check_flags_with_repository_check_and_last_omits_last_flag():
flags = module.make_check_flags(('repository',), check_last=3)
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('repository',), check_last=3)
assert flags == ('--repository-only',)
def test_make_check_flags_with_default_checks_and_last_includes_last_flag():
flags = module.make_check_flags(('repository', 'archives'), check_last=3)
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('repository', 'archives'), check_last=3)
assert flags == ('--last', '3')
def test_make_check_flags_with_archives_check_and_prefix_includes_prefix_flag():
flags = module.make_check_flags(('archives',), prefix='foo-')
def test_make_check_flags_with_archives_check_and_prefix_includes_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
assert flags == ('--archives-only', '--prefix', 'foo-')
flags = module.make_check_flags('1.2.3', ('archives',), prefix='foo-')
assert flags == ('--archives-only', '--match-archives', 'sh:foo-*')
def test_make_check_flags_with_archives_check_and_empty_prefix_omits_prefix_flag():
flags = module.make_check_flags(('archives',), prefix='')
def test_make_check_flags_with_data_check_and_prefix_includes_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('data',), prefix='foo-')
assert flags == ('--archives-only', '--match-archives', 'sh:foo-*', '--verify-data')
def test_make_check_flags_with_archives_check_and_empty_prefix_omits_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('archives',), prefix='')
assert flags == ('--archives-only',)
def test_make_check_flags_with_archives_check_and_none_prefix_omits_prefix_flag():
flags = module.make_check_flags(('archives',), prefix=None)
def test_make_check_flags_with_archives_check_and_none_prefix_omits_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('archives',), prefix=None)
assert flags == ('--archives-only',)
def test_make_check_flags_with_repository_check_and_prefix_omits_prefix_flag():
flags = module.make_check_flags(('repository',), prefix='foo-')
def test_make_check_flags_with_repository_check_and_prefix_omits_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
flags = module.make_check_flags('1.2.3', ('repository',), prefix='foo-')
assert flags == ('--repository-only',)
def test_make_check_flags_with_default_checks_and_prefix_includes_prefix_flag():
flags = module.make_check_flags(('repository', 'archives'), prefix='foo-')
def test_make_check_flags_with_default_checks_and_prefix_includes_match_archives_flag():
flexmock(module.feature).should_receive('available').and_return(True)
assert flags == ('--prefix', 'foo-')
flags = module.make_check_flags('1.2.3', ('repository', 'archives'), prefix='foo-')
assert flags == ('--match-archives', 'sh:foo-*')
def test_read_check_time_does_not_raise():
@ -308,11 +354,12 @@ def test_check_archives_with_progress_calls_borg_with_progress_parameter():
consistency_config = {'check_last': None}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').and_return(())
flexmock(module).should_receive('execute_command').never()
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'check', '--progress', 'repo'),
@ -327,6 +374,7 @@ def test_check_archives_with_progress_calls_borg_with_progress_parameter():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
progress=True,
)
@ -336,11 +384,12 @@ def test_check_archives_with_repair_calls_borg_with_repair_parameter():
consistency_config = {'check_last': None}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').and_return(())
flexmock(module).should_receive('execute_command').never()
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'check', '--repair', 'repo'),
@ -355,6 +404,7 @@ def test_check_archives_with_repair_calls_borg_with_repair_parameter():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
repair=True,
)
@ -373,12 +423,13 @@ def test_check_archives_calls_borg_with_parameters(checks):
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').with_args(
checks, check_last, module.DEFAULT_PREFIX
'1.2.3', checks, check_last, module.DEFAULT_PREFIX
).and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'check', 'repo'))
flexmock(module).should_receive('make_check_time_path')
flexmock(module).should_receive('write_check_time')
@ -388,6 +439,7 @@ def test_check_archives_calls_borg_with_parameters(checks):
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -397,7 +449,7 @@ def test_check_archives_with_json_error_raises():
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"unexpected": {"id": "repo"}}'
)
@ -407,6 +459,7 @@ def test_check_archives_with_json_error_raises():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -416,7 +469,7 @@ def test_check_archives_with_missing_json_keys_raises():
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return('{invalid JSON')
flexmock(module.rinfo).should_receive('display_repository_info').and_return('{invalid JSON')
with pytest.raises(ValueError):
module.check_archives(
@ -424,6 +477,7 @@ def test_check_archives_with_missing_json_keys_raises():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -433,10 +487,11 @@ def test_check_archives_with_extract_check_calls_extract_only():
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').never()
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.extract).should_receive('extract_last_archive_dry_run').once()
flexmock(module).should_receive('write_check_time')
insert_execute_command_never()
@ -446,6 +501,7 @@ def test_check_archives_with_extract_check_calls_extract_only():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -454,10 +510,11 @@ def test_check_archives_with_log_info_calls_borg_with_info_parameter():
consistency_config = {'check_last': None}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_logging_mock(logging.INFO)
insert_execute_command_mock(('borg', 'check', '--info', 'repo'))
flexmock(module).should_receive('make_check_time_path')
@ -468,6 +525,7 @@ def test_check_archives_with_log_info_calls_borg_with_info_parameter():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -476,10 +534,11 @@ def test_check_archives_with_log_debug_calls_borg_with_debug_parameter():
consistency_config = {'check_last': None}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_logging_mock(logging.DEBUG)
insert_execute_command_mock(('borg', 'check', '--debug', '--show-rc', 'repo'))
flexmock(module).should_receive('make_check_time_path')
@ -490,6 +549,7 @@ def test_check_archives_with_log_debug_calls_borg_with_debug_parameter():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -497,7 +557,7 @@ def test_check_archives_without_any_checks_bails():
consistency_config = {'check_last': None}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(())
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
insert_execute_command_never()
@ -507,6 +567,7 @@ def test_check_archives_without_any_checks_bails():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -516,12 +577,13 @@ def test_check_archives_with_local_path_calls_borg_via_local_path():
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').with_args(
checks, check_last, module.DEFAULT_PREFIX
'1.2.3', checks, check_last, module.DEFAULT_PREFIX
).and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg1', 'check', 'repo'))
flexmock(module).should_receive('make_check_time_path')
flexmock(module).should_receive('write_check_time')
@ -531,6 +593,7 @@ def test_check_archives_with_local_path_calls_borg_via_local_path():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
local_path='borg1',
)
@ -541,12 +604,13 @@ def test_check_archives_with_remote_path_calls_borg_with_remote_path_parameters(
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').with_args(
checks, check_last, module.DEFAULT_PREFIX
'1.2.3', checks, check_last, module.DEFAULT_PREFIX
).and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'check', '--remote-path', 'borg1', 'repo'))
flexmock(module).should_receive('make_check_time_path')
flexmock(module).should_receive('write_check_time')
@ -556,6 +620,7 @@ def test_check_archives_with_remote_path_calls_borg_with_remote_path_parameters(
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
remote_path='borg1',
)
@ -566,12 +631,13 @@ def test_check_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
consistency_config = {'check_last': check_last}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').with_args(
checks, check_last, module.DEFAULT_PREFIX
'1.2.3', checks, check_last, module.DEFAULT_PREFIX
).and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'check', '--lock-wait', '5', 'repo'))
flexmock(module).should_receive('make_check_time_path')
flexmock(module).should_receive('write_check_time')
@ -581,6 +647,7 @@ def test_check_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
location_config={},
storage_config={'lock_wait': 5},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -591,12 +658,13 @@ def test_check_archives_with_retention_prefix():
consistency_config = {'check_last': check_last, 'prefix': prefix}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').with_args(
checks, check_last, prefix
'1.2.3', checks, check_last, prefix
).and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'check', 'repo'))
flexmock(module).should_receive('make_check_time_path')
flexmock(module).should_receive('write_check_time')
@ -606,6 +674,7 @@ def test_check_archives_with_retention_prefix():
location_config={},
storage_config={},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)
@ -614,10 +683,11 @@ def test_check_archives_with_extra_borg_options_calls_borg_with_extra_options():
consistency_config = {'check_last': None}
flexmock(module).should_receive('parse_checks')
flexmock(module).should_receive('filter_checks_on_frequency').and_return(checks)
flexmock(module.info).should_receive('display_archives_info').and_return(
flexmock(module.rinfo).should_receive('display_repository_info').and_return(
'{"repository": {"id": "repo"}}'
)
flexmock(module).should_receive('make_check_flags').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'check', '--extra', '--options', 'repo'))
flexmock(module).should_receive('make_check_time_path')
flexmock(module).should_receive('write_check_time')
@ -627,4 +697,5 @@ def test_check_archives_with_extra_borg_options_calls_borg_with_extra_options():
location_config={},
storage_config={'extra_borg_options': {'check': '--extra --options'}},
consistency_config=consistency_config,
local_borg_version='1.2.3',
)

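Taken together, the check hunks above make one consistent change: check_archives now requires the local Borg version and forwards it to make_check_flags, so consistency-check flags can be built per Borg version. A minimal sketch of the updated call shape, assuming the borgmatic.borg.check module path these tests exercise and an existing repository named 'repo':

from borgmatic.borg import check

# Sketch only: the keyword arguments below are those the tests above pass,
# with local_borg_version newly required.
check.check_archives(
    repository='repo',
    location_config={},
    storage_config={},
    consistency_config={'check_last': 3},
    local_borg_version='1.2.3',
)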

@ -21,94 +21,134 @@ COMPACT_COMMAND = ('borg', 'compact')
def test_compact_segments_calls_borg_with_parameters():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('repo',), logging.INFO)
module.compact_segments(dry_run=False, repository='repo', storage_config={})
module.compact_segments(
dry_run=False, repository='repo', storage_config={}, local_borg_version='1.2.3'
)
def test_compact_segments_with_log_info_calls_borg_with_info_parameter():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--info', 'repo'), logging.INFO)
insert_logging_mock(logging.INFO)
module.compact_segments(repository='repo', storage_config={}, dry_run=False)
module.compact_segments(
repository='repo', storage_config={}, local_borg_version='1.2.3', dry_run=False
)
def test_compact_segments_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--debug', '--show-rc', 'repo'), logging.INFO)
insert_logging_mock(logging.DEBUG)
module.compact_segments(repository='repo', storage_config={}, dry_run=False)
module.compact_segments(
repository='repo', storage_config={}, local_borg_version='1.2.3', dry_run=False
)
def test_compact_segments_with_dry_run_skips_borg_call():
flexmock(module).should_receive('execute_command').never()
module.compact_segments(repository='repo', storage_config={}, dry_run=True)
module.compact_segments(
repository='repo', storage_config={}, local_borg_version='1.2.3', dry_run=True
)
def test_compact_segments_with_local_path_calls_borg_via_local_path():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg1',) + COMPACT_COMMAND[1:] + ('repo',), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config={}, local_path='borg1',
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='1.2.3',
local_path='borg1',
)
def test_compact_segments_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--remote-path', 'borg1', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config={}, remote_path='borg1',
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='1.2.3',
remote_path='borg1',
)
def test_compact_segments_with_progress_calls_borg_with_progress_parameter():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--progress', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config={}, progress=True,
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='1.2.3',
progress=True,
)
def test_compact_segments_with_cleanup_commits_calls_borg_with_cleanup_commits_parameter():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--cleanup-commits', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config={}, cleanup_commits=True,
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='1.2.3',
cleanup_commits=True,
)
def test_compact_segments_with_threshold_calls_borg_with_threshold_parameter():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--threshold', '20', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config={}, threshold=20,
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='1.2.3',
threshold=20,
)
def test_compact_segments_with_umask_calls_borg_with_umask_parameters():
storage_config = {'umask': '077'}
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--umask', '077', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config=storage_config,
dry_run=False, repository='repo', storage_config=storage_config, local_borg_version='1.2.3'
)
def test_compact_segments_with_lock_wait_calls_borg_with_lock_wait_parameters():
storage_config = {'lock_wait': 5}
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--lock-wait', '5', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False, repository='repo', storage_config=storage_config,
dry_run=False, repository='repo', storage_config=storage_config, local_borg_version='1.2.3'
)
def test_compact_segments_with_extra_borg_options_calls_borg_with_extra_options():
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(COMPACT_COMMAND + ('--extra', '--options', 'repo'), logging.INFO)
module.compact_segments(
dry_run=False,
repository='repo',
storage_config={'extra_borg_options': {'compact': '--extra --options'}},
local_borg_version='1.2.3',
)

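The compact hunks are the same mechanical change: every compact_segments call gains a local_borg_version argument. A sketch of a fuller call combining the optional keywords the tests cover, each of which maps onto a flag such as --progress, --cleanup-commits, or --threshold appended to 'borg compact'; the borgmatic.borg.compact module path is assumed from the test file:

from borgmatic.borg import compact

# Sketch only, assuming Borg is installed and a repository 'repo' exists.
compact.compact_segments(
    dry_run=False,
    repository='repo',
    storage_config={'lock_wait': 5},
    local_borg_version='1.2.3',
    progress=True,
    cleanup_commits=True,
    threshold=20,
)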
File diff suppressed because it is too large.


@ -21,6 +21,11 @@ def insert_execute_command_mock(
def test_export_tar_archive_calls_borg_with_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', 'repo::archive', 'test.tar', 'path1', 'path2')
@ -33,10 +38,16 @@ def test_export_tar_archive_calls_borg_with_path_parameters():
paths=['path1', 'path2'],
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
)
def test_export_tar_archive_calls_borg_with_local_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg1', 'export-tar', 'repo::archive', 'test.tar'), borg_local_path='borg1'
@ -49,11 +60,17 @@ def test_export_tar_archive_calls_borg_with_local_path_parameters():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
local_path='borg1',
)
def test_export_tar_archive_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--remote-path', 'borg1', 'repo::archive', 'test.tar')
@ -66,11 +83,17 @@ def test_export_tar_archive_calls_borg_with_remote_path_parameters():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
remote_path='borg1',
)
def test_export_tar_archive_calls_borg_with_umask_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--umask', '0770', 'repo::archive', 'test.tar')
@ -83,10 +106,16 @@ def test_export_tar_archive_calls_borg_with_umask_parameters():
paths=None,
destination_path='test.tar',
storage_config={'umask': '0770'},
local_borg_version='1.2.3',
)
def test_export_tar_archive_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--lock-wait', '5', 'repo::archive', 'test.tar')
@ -99,10 +128,16 @@ def test_export_tar_archive_calls_borg_with_lock_wait_parameters():
paths=None,
destination_path='test.tar',
storage_config={'lock_wait': '5'},
local_borg_version='1.2.3',
)
def test_export_tar_archive_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'export-tar', '--info', 'repo::archive', 'test.tar'))
insert_logging_mock(logging.INFO)
@ -114,10 +149,16 @@ def test_export_tar_archive_with_log_info_calls_borg_with_info_parameter():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
)
def test_export_tar_archive_with_log_debug_calls_borg_with_debug_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--debug', '--show-rc', 'repo::archive', 'test.tar')
@ -131,10 +172,16 @@ def test_export_tar_archive_with_log_debug_calls_borg_with_debug_parameters():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
)
def test_export_tar_archive_calls_borg_with_dry_run_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
flexmock(module).should_receive('execute_command').never()
@ -145,10 +192,16 @@ def test_export_tar_archive_calls_borg_with_dry_run_parameter():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
)
def test_export_tar_archive_calls_borg_with_tar_filter_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--tar-filter', 'bzip2', 'repo::archive', 'test.tar')
@ -161,15 +214,21 @@ def test_export_tar_archive_calls_borg_with_tar_filter_parameters():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
tar_filter='bzip2',
)
def test_export_tar_archive_calls_borg_with_list_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--list', 'repo::archive', 'test.tar'),
output_log_level=logging.WARNING,
output_log_level=logging.ANSWER,
)
module.export_tar_archive(
@ -179,11 +238,17 @@ def test_export_tar_archive_calls_borg_with_list_parameter():
paths=None,
destination_path='test.tar',
storage_config={},
files=True,
local_borg_version='1.2.3',
list_files=True,
)
def test_export_tar_archive_calls_borg_with_strip_components_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(
('borg', 'export-tar', '--strip-components', '5', 'repo::archive', 'test.tar')
@ -196,11 +261,17 @@ def test_export_tar_archive_calls_borg_with_strip_components_parameter():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
strip_components=5,
)
def test_export_tar_archive_skips_abspath_for_remote_repository_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('server:repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').never()
insert_execute_command_mock(('borg', 'export-tar', 'server:repo::archive', 'test.tar'))
@ -211,10 +282,16 @@ def test_export_tar_archive_skips_abspath_for_remote_repository_parameter():
paths=None,
destination_path='test.tar',
storage_config={},
local_borg_version='1.2.3',
)
def test_export_tar_archive_calls_borg_with_stdout_destination_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'export-tar', 'repo::archive', '-'), capture=False)
@ -225,4 +302,5 @@ def test_export_tar_archive_calls_borg_with_stdout_destination_path():
paths=None,
destination_path='-',
storage_config={},
local_borg_version='1.2.3',
)

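Besides threading local_borg_version through, the export-tar tests switch to the shared flags.make_repository_archive_flags helper for the 'repo::archive' argument, log file listings at the custom ANSWER level instead of WARNING, and rename the files parameter to list_files. A sketch of the updated call; the leading keywords (dry_run, repository, archive) sit above the hunk cut-offs and are assumed:

from borgmatic.borg import export_tar

# Sketch only: keyword names match those visible in the hunks above.
export_tar.export_tar_archive(
    dry_run=False,
    repository='repo',
    archive='archive',
    paths=['path1', 'path2'],
    destination_path='test.tar',
    storage_config={},
    local_borg_version='1.2.3',
    list_files=True,  # formerly 'files'
    tar_filter='bzip2',
    strip_components=5,
)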

@ -15,96 +15,110 @@ def insert_execute_command_mock(command, working_directory=None):
).once()
def insert_execute_command_output_mock(command, result):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
command, output_log_level=None, borg_local_path=command[0], extra_environment=None,
).and_return(result).once()
def test_extract_last_archive_dry_run_calls_borg_with_last_archive():
insert_execute_command_output_mock(
('borg', 'list', '--short', 'repo'), result='archive1\narchive2\n'
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(('borg', 'extract', '--dry-run', 'repo::archive'))
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'extract', '--dry-run', 'repo::archive2'))
flexmock(module.feature).should_receive('available').and_return(True)
module.extract_last_archive_dry_run(storage_config={}, repository='repo', lock_wait=None)
module.extract_last_archive_dry_run(
storage_config={}, local_borg_version='1.2.3', repository='repo', lock_wait=None
)
def test_extract_last_archive_dry_run_without_any_archives_should_not_raise():
insert_execute_command_output_mock(('borg', 'list', '--short', 'repo'), result='\n')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.rlist).should_receive('resolve_archive_name').and_raise(ValueError)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(('repo',))
module.extract_last_archive_dry_run(storage_config={}, repository='repo', lock_wait=None)
module.extract_last_archive_dry_run(
storage_config={}, local_borg_version='1.2.3', repository='repo', lock_wait=None
)
def test_extract_last_archive_dry_run_with_log_info_calls_borg_with_info_parameter():
insert_execute_command_output_mock(
('borg', 'list', '--short', '--info', 'repo'), result='archive1\narchive2\n'
)
insert_execute_command_mock(('borg', 'extract', '--dry-run', '--info', 'repo::archive2'))
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(('borg', 'extract', '--dry-run', '--info', 'repo::archive'))
insert_logging_mock(logging.INFO)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_last_archive_dry_run(storage_config={}, repository='repo', lock_wait=None)
module.extract_last_archive_dry_run(
storage_config={}, local_borg_version='1.2.3', repository='repo', lock_wait=None
)
def test_extract_last_archive_dry_run_with_log_debug_calls_borg_with_debug_parameter():
insert_execute_command_output_mock(
('borg', 'list', '--short', '--debug', '--show-rc', 'repo'), result='archive1\narchive2\n'
)
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(
('borg', 'extract', '--dry-run', '--debug', '--show-rc', '--list', 'repo::archive2')
('borg', 'extract', '--dry-run', '--debug', '--show-rc', '--list', 'repo::archive')
)
insert_logging_mock(logging.DEBUG)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_last_archive_dry_run(storage_config={}, repository='repo', lock_wait=None)
module.extract_last_archive_dry_run(
storage_config={}, local_borg_version='1.2.3', repository='repo', lock_wait=None
)
def test_extract_last_archive_dry_run_calls_borg_via_local_path():
insert_execute_command_output_mock(
('borg1', 'list', '--short', 'repo'), result='archive1\narchive2\n'
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(('borg1', 'extract', '--dry-run', 'repo::archive'))
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg1', 'extract', '--dry-run', 'repo::archive2'))
flexmock(module.feature).should_receive('available').and_return(True)
module.extract_last_archive_dry_run(
storage_config={}, repository='repo', lock_wait=None, local_path='borg1'
storage_config={},
local_borg_version='1.2.3',
repository='repo',
lock_wait=None,
local_path='borg1',
)
def test_extract_last_archive_dry_run_calls_borg_with_remote_path_parameters():
insert_execute_command_output_mock(
('borg', 'list', '--short', '--remote-path', 'borg1', 'repo'), result='archive1\narchive2\n'
)
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(
('borg', 'extract', '--dry-run', '--remote-path', 'borg1', 'repo::archive2')
('borg', 'extract', '--dry-run', '--remote-path', 'borg1', 'repo::archive')
)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.feature).should_receive('available').and_return(True)
module.extract_last_archive_dry_run(
storage_config={}, repository='repo', lock_wait=None, remote_path='borg1'
storage_config={},
local_borg_version='1.2.3',
repository='repo',
lock_wait=None,
remote_path='borg1',
)
def test_extract_last_archive_dry_run_calls_borg_with_lock_wait_parameters():
insert_execute_command_output_mock(
('borg', 'list', '--short', '--lock-wait', '5', 'repo'), result='archive1\narchive2\n'
)
flexmock(module.rlist).should_receive('resolve_archive_name').and_return('archive')
insert_execute_command_mock(
('borg', 'extract', '--dry-run', '--lock-wait', '5', 'repo::archive2')
('borg', 'extract', '--dry-run', '--lock-wait', '5', 'repo::archive')
)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.feature).should_receive('available').and_return(True)
module.extract_last_archive_dry_run(storage_config={}, repository='repo', lock_wait=5)
module.extract_last_archive_dry_run(
storage_config={}, local_borg_version='1.2.3', repository='repo', lock_wait=5
)
def test_extract_archive_calls_borg_with_path_parameters():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', 'repo::archive', 'path1', 'path2'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -121,6 +135,9 @@ def test_extract_archive_calls_borg_with_remote_path_parameters():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', '--remote-path', 'borg1', 'repo::archive'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -141,13 +158,16 @@ def test_extract_archive_calls_borg_with_numeric_ids_parameter(feature_available
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', option_flag, 'repo::archive'))
flexmock(module.feature).should_receive('available').and_return(feature_available)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
repository='repo',
archive='archive',
paths=None,
location_config={'numeric_owner': True},
location_config={'numeric_ids': True},
storage_config={},
local_borg_version='1.2.3',
)
@ -157,6 +177,9 @@ def test_extract_archive_calls_borg_with_umask_parameters():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', '--umask', '0770', 'repo::archive'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -173,6 +196,9 @@ def test_extract_archive_calls_borg_with_lock_wait_parameters():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', '--lock-wait', '5', 'repo::archive'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -190,6 +216,9 @@ def test_extract_archive_with_log_info_calls_borg_with_info_parameter():
insert_execute_command_mock(('borg', 'extract', '--info', 'repo::archive'))
insert_logging_mock(logging.INFO)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -209,6 +238,9 @@ def test_extract_archive_with_log_debug_calls_borg_with_debug_parameters():
)
insert_logging_mock(logging.DEBUG)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -225,6 +257,9 @@ def test_extract_archive_calls_borg_with_dry_run_parameter():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', '--dry-run', 'repo::archive'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=True,
@ -241,6 +276,9 @@ def test_extract_archive_calls_borg_with_destination_path():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', 'repo::archive'), working_directory='/dest')
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -258,6 +296,9 @@ def test_extract_archive_calls_borg_with_strip_components():
flexmock(module.os.path).should_receive('abspath').and_return('repo')
insert_execute_command_mock(('borg', 'extract', '--strip-components', '5', 'repo::archive'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -281,6 +322,9 @@ def test_extract_archive_calls_borg_with_progress_parameter():
extra_environment=None,
).once()
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
module.extract_archive(
dry_run=False,
@ -323,6 +367,9 @@ def test_extract_archive_calls_borg_with_stdout_parameter_and_returns_process():
extra_environment=None,
).and_return(process).once()
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
assert (
module.extract_archive(
@ -346,6 +393,9 @@ def test_extract_archive_skips_abspath_for_remote_repository():
('borg', 'extract', 'server:repo::archive'), working_directory=None, extra_environment=None,
).once()
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('server:repo::archive',)
)
module.extract_archive(
dry_run=False,

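In the extract tests, the dry-run helper no longer shells out to 'borg list --short' to find the last archive; per the new mocks it asks rlist.resolve_archive_name for the 'latest' archive and builds the repository::archive argument via flags.make_repository_archive_flags. A purely illustrative sketch of that flow, using local stand-ins rather than borgmatic's real helpers:

# Hypothetical stand-ins for the mocked collaborators (not borgmatic's API).
def resolve_archive_name_stub(repository, requested_archive):
    return 'archive' if requested_archive == 'latest' else requested_archive

def make_repository_archive_flags_stub(repository, archive):
    return (f'{repository}::{archive}',)  # legacy Borg 1.x syntax, as in the mocks above

def extract_last_archive_dry_run_sketch(repository):
    archive = resolve_archive_name_stub(repository, 'latest')
    return ('borg', 'extract', '--dry-run') + make_repository_archive_flags_stub(repository, archive)

assert extract_last_archive_dry_run_sketch('repo') == ('borg', 'extract', '--dry-run', 'repo::archive')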

@ -45,3 +45,34 @@ def test_make_flags_from_arguments_omits_excludes():
arguments = flexmock(foo='bar', baz='quux')
assert module.make_flags_from_arguments(arguments, excludes=('baz', 'other')) == ('foo', 'bar')
def test_make_repository_flags_with_borg_features_includes_repo_flag():
flexmock(module.feature).should_receive('available').and_return(True)
assert module.make_repository_flags(repository='repo', local_borg_version='1.2.3') == (
'--repo',
'repo',
)
def test_make_repository_flags_without_borg_features_includes_omits_flag():
flexmock(module.feature).should_receive('available').and_return(False)
assert module.make_repository_flags(repository='repo', local_borg_version='1.2.3') == ('repo',)
def test_make_repository_archive_flags_with_borg_features_separates_repository_and_archive():
flexmock(module.feature).should_receive('available').and_return(True)
assert module.make_repository_archive_flags(
repository='repo', archive='archive', local_borg_version='1.2.3'
) == ('--repo', 'repo', 'archive',)
def test_make_repository_archive_flags_with_borg_features_joins_repository_and_archive():
flexmock(module.feature).should_receive('available').and_return(False)
assert module.make_repository_archive_flags(
repository='repo', archive='archive', local_borg_version='1.2.3'
) == ('repo::archive',)

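The new flags tests describe feature-gated flag construction: when the running Borg supports it, the repository is passed as a separate --repo flag; otherwise the legacy bare path or repo::archive syntax is used. A stand-alone sketch of that contract (hypothetical names, not the real implementation):

def make_repository_flags_sketch(repository, repo_flag_supported):
    return ('--repo', repository) if repo_flag_supported else (repository,)

def make_repository_archive_flags_sketch(repository, archive, repo_flag_supported):
    if repo_flag_supported:
        return ('--repo', repository, archive)
    return (f'{repository}::{archive}',)

assert make_repository_flags_sketch('repo', True) == ('--repo', 'repo')
assert make_repository_flags_sketch('repo', False) == ('repo',)
assert make_repository_archive_flags_sketch('repo', 'archive', True) == ('--repo', 'repo', 'archive')
assert make_repository_archive_flags_sketch('repo', 'archive', False) == ('repo::archive',)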

@ -9,117 +9,172 @@ from ..test_verbosity import insert_logging_mock
def test_display_archives_info_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', 'repo'),
output_log_level=logging.WARNING,
('borg', 'info', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive=None, json=False)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None),
)
def test_display_archives_info_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--info', 'repo'),
output_log_level=logging.WARNING,
('borg', 'info', '--info', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.INFO)
module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive=None, json=False)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None),
)
def test_display_archives_info_with_log_info_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--json', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'info', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.INFO)
json_output = module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive=None, json=True)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=True, prefix=None),
)
assert json_output == '[]'
def test_display_archives_info_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--debug', '--show-rc', 'repo'),
output_log_level=logging.WARNING,
('borg', 'info', '--debug', '--show-rc', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.DEBUG)
module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive=None, json=False)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None),
)
def test_display_archives_info_with_log_debug_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--json', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'info', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.DEBUG)
json_output = module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive=None, json=True)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=True, prefix=None),
)
assert json_output == '[]'
def test_display_archives_info_with_json_calls_borg_with_json_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--json', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'info', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
json_output = module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive=None, json=True)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=True, prefix=None),
)
assert json_output == '[]'
def test_display_archives_info_with_archive_calls_borg_with_archive_parameter():
def test_display_archives_info_with_archive_calls_borg_with_match_archives_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'archive'
).and_return(('--match-archives', 'archive'))
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', 'repo::archive'),
output_log_level=logging.WARNING,
('borg', 'info', '--repo', 'repo', '--match-archives', 'archive'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_archives_info(
repository='repo', storage_config={}, info_arguments=flexmock(archive='archive', json=False)
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive='archive', json=False, prefix=None),
)
def test_display_archives_info_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'info', 'repo'),
output_log_level=logging.WARNING,
('borg1', 'info', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg1',
extra_environment=None,
)
@ -127,16 +182,25 @@ def test_display_archives_info_with_local_path_calls_borg_via_local_path():
module.display_archives_info(
repository='repo',
storage_config={},
info_arguments=flexmock(archive=None, json=False),
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None),
local_path='borg1',
)
def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'remote-path', 'borg1'
).and_return(('--remote-path', 'borg1'))
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--remote-path', 'borg1', 'repo'),
output_log_level=logging.WARNING,
('borg', 'info', '--remote-path', 'borg1', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -144,17 +208,26 @@ def test_display_archives_info_with_remote_path_calls_borg_with_remote_path_para
module.display_archives_info(
repository='repo',
storage_config={},
info_arguments=flexmock(archive=None, json=False),
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None),
remote_path='borg1',
)
def test_display_archives_info_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('lock-wait', 5).and_return(
('--lock-wait', '5')
)
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
storage_config = {'lock_wait': 5}
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--lock-wait', '5', 'repo'),
output_log_level=logging.WARNING,
('borg', 'info', '--lock-wait', '5', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -162,16 +235,24 @@ def test_display_archives_info_with_lock_wait_calls_borg_with_lock_wait_paramete
module.display_archives_info(
repository='repo',
storage_config=storage_config,
info_arguments=flexmock(archive=None, json=False),
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None),
)
@pytest.mark.parametrize('argument_name', ('prefix', 'glob_archives', 'sort_by', 'first', 'last'))
def test_display_archives_info_passes_through_arguments_to_borg(argument_name):
def test_display_archives_info_with_prefix_calls_borg_with_match_archives_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'sh:foo*'
).and_return(('--match-archives', 'sh:foo*'))
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', '--' + argument_name.replace('_', '-'), 'value', 'repo'),
output_log_level=logging.WARNING,
('borg', 'info', '--match-archives', 'sh:foo*', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
@ -179,5 +260,32 @@ def test_display_archives_info_passes_through_arguments_to_borg(argument_name):
module.display_archives_info(
repository='repo',
storage_config={},
info_arguments=flexmock(archive=None, json=False, **{argument_name: 'value'}),
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix='foo'),
)
@pytest.mark.parametrize('argument_name', ('match_archives', 'sort_by', 'first', 'last'))
def test_display_archives_info_passes_through_arguments_to_borg(argument_name):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flag_name = f"--{argument_name.replace('_', ' ')}"
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(
(flag_name, 'value')
)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', flag_name, 'value', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_archives_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
info_arguments=flexmock(archive=None, json=False, prefix=None, **{argument_name: 'value'}),
)

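The info tests now build all arguments through the flags helpers, log non-JSON output at the custom ANSWER level, capture JSON via execute_command_and_capture_output, and translate both the archive argument and the prefix into Borg 2 --match-archives patterns. A small sketch of the prefix translation these tests expect (illustrative only):

def prefix_to_match_archives_flags(prefix):
    return ('--match-archives', f'sh:{prefix}*') if prefix else ()

assert prefix_to_match_archives_flags('foo') == ('--match-archives', 'sh:foo*')
assert prefix_to_match_archives_flags(None) == ()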

@ -1,132 +0,0 @@
import logging
import subprocess
import pytest
from flexmock import flexmock
from borgmatic.borg import init as module
from ..test_verbosity import insert_logging_mock
INFO_SOME_UNKNOWN_EXIT_CODE = -999
INIT_COMMAND = ('borg', 'init', '--encryption', 'repokey')
def insert_info_command_found_mock():
flexmock(module.info).should_receive('display_archives_info')
def insert_info_command_not_found_mock():
flexmock(module.info).should_receive('display_archives_info').and_raise(
subprocess.CalledProcessError(module.INFO_REPOSITORY_NOT_FOUND_EXIT_CODE, [])
)
def insert_init_command_mock(init_command, **kwargs):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
init_command,
output_file=module.DO_NOT_CAPTURE,
borg_local_path=init_command[0],
extra_environment=None,
).once()
def test_initialize_repository_calls_borg_with_parameters():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('repo',))
module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')
def test_initialize_repository_raises_for_borg_init_error():
insert_info_command_not_found_mock()
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').and_raise(
module.subprocess.CalledProcessError(2, 'borg init')
)
with pytest.raises(subprocess.CalledProcessError):
module.initialize_repository(
repository='repo', storage_config={}, encryption_mode='repokey'
)
def test_initialize_repository_skips_initialization_when_repository_already_exists():
insert_info_command_found_mock()
module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')
def test_initialize_repository_raises_for_unknown_info_command_error():
flexmock(module.info).should_receive('display_archives_info').and_raise(
subprocess.CalledProcessError(INFO_SOME_UNKNOWN_EXIT_CODE, [])
)
with pytest.raises(subprocess.CalledProcessError):
module.initialize_repository(
repository='repo', storage_config={}, encryption_mode='repokey'
)
def test_initialize_repository_with_append_only_calls_borg_with_append_only_parameter():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('--append-only', 'repo'))
module.initialize_repository(
repository='repo', storage_config={}, encryption_mode='repokey', append_only=True
)
def test_initialize_repository_with_storage_quota_calls_borg_with_storage_quota_parameter():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('--storage-quota', '5G', 'repo'))
module.initialize_repository(
repository='repo', storage_config={}, encryption_mode='repokey', storage_quota='5G'
)
def test_initialize_repository_with_log_info_calls_borg_with_info_parameter():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('--info', 'repo'))
insert_logging_mock(logging.INFO)
module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')
def test_initialize_repository_with_log_debug_calls_borg_with_debug_parameter():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('--debug', 'repo'))
insert_logging_mock(logging.DEBUG)
module.initialize_repository(repository='repo', storage_config={}, encryption_mode='repokey')
def test_initialize_repository_with_local_path_calls_borg_via_local_path():
insert_info_command_not_found_mock()
insert_init_command_mock(('borg1',) + INIT_COMMAND[1:] + ('repo',))
module.initialize_repository(
repository='repo', storage_config={}, encryption_mode='repokey', local_path='borg1'
)
def test_initialize_repository_with_remote_path_calls_borg_with_remote_path_parameter():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('--remote-path', 'borg1', 'repo'))
module.initialize_repository(
repository='repo', storage_config={}, encryption_mode='repokey', remote_path='borg1'
)
def test_initialize_repository_with_extra_borg_options_calls_borg_with_extra_options():
insert_info_command_not_found_mock()
insert_init_command_mock(INIT_COMMAND + ('--extra', '--options', 'repo'))
module.initialize_repository(
repository='repo',
storage_config={'extra_borg_options': {'init': '--extra --options'}},
encryption_mode='repokey',
)


@ -8,129 +8,17 @@ from borgmatic.borg import list as module
from ..test_verbosity import insert_logging_mock
BORG_LIST_LATEST_ARGUMENTS = (
'--last',
'1',
'--short',
'repo',
)
def test_resolve_archive_name_passes_through_non_latest_archive_name():
archive = 'myhost-2030-01-01T14:41:17.647620'
assert module.resolve_archive_name('repo', archive, storage_config={}) == archive
def test_resolve_archive_name_calls_borg_with_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
assert module.resolve_archive_name('repo', 'latest', storage_config={}) == expected_archive
def test_resolve_archive_name_with_log_info_calls_borg_with_info_parameter():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--info') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
insert_logging_mock(logging.INFO)
assert module.resolve_archive_name('repo', 'latest', storage_config={}) == expected_archive
def test_resolve_archive_name_with_log_debug_calls_borg_with_debug_parameter():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--debug', '--show-rc') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
insert_logging_mock(logging.DEBUG)
assert module.resolve_archive_name('repo', 'latest', storage_config={}) == expected_archive
def test_resolve_archive_name_with_local_path_calls_borg_via_local_path():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg1',
extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name('repo', 'latest', storage_config={}, local_path='borg1')
== expected_archive
)
def test_resolve_archive_name_with_remote_path_calls_borg_with_remote_path_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--remote-path', 'borg1') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name('repo', 'latest', storage_config={}, remote_path='borg1')
== expected_archive
)
def test_resolve_archive_name_without_archives_raises():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return('')
with pytest.raises(ValueError):
module.resolve_archive_name('repo', 'latest', storage_config={})
def test_resolve_archive_name_with_lock_wait_calls_borg_with_lock_wait_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--lock-wait', 'okay') + BORG_LIST_LATEST_ARGUMENTS,
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name('repo', 'latest', storage_config={'lock_wait': 'okay'})
== expected_archive
)
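
Read together, the resolve_archive_name tests above pin down the behavior: any name other than "latest" passes straight through, while "latest" triggers a "borg list --last 1 --short" call whose trimmed output becomes the archive name, and an empty result raises an error. A rough standalone sketch of that logic; borgmatic's real function goes through its own execute_command() wrapper and environment handling rather than subprocess, and the relative ordering of the optional flags is an assumption here:

import subprocess

def resolve_archive_name(repository, archive, storage_config, local_path='borg', remote_path=None):
    # Sketch inferred from the tests above; logging flags are omitted.
    if archive != 'latest':
        return archive
    lock_wait = storage_config.get('lock_wait')
    command = (
        (local_path, 'list')
        + (('--remote-path', remote_path) if remote_path else ())
        + (('--lock-wait', str(lock_wait)) if lock_wait else ())
        + ('--last', '1', '--short', repository)
    )
    output = subprocess.check_output(command, text=True)
    latest_archive = output.strip('\n')
    if not latest_archive:
        raise ValueError('No archives found in the repository')
    return latest_archive
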
def test_make_list_command_includes_log_info():
insert_logging_mock(logging.INFO)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=False),
)
@@ -139,10 +27,14 @@ def test_make_list_command_includes_log_info():
def test_make_list_command_includes_json_but_not_info():
insert_logging_mock(logging.INFO)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=True),
)
@@ -151,10 +43,14 @@ def test_make_list_command_includes_json_but_not_info():
def test_make_list_command_includes_log_debug():
insert_logging_mock(logging.DEBUG)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=False),
)
@@ -163,10 +59,14 @@ def test_make_list_command_includes_log_debug():
def test_make_list_command_includes_json_but_not_debug():
insert_logging_mock(logging.DEBUG)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=True),
)
@@ -174,9 +74,14 @@ def test_make_list_command_includes_json_but_not_debug():
def test_make_list_command_includes_json():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=True),
)
@@ -184,9 +89,16 @@ def test_make_list_command_includes_json():
def test_make_list_command_includes_lock_wait():
flexmock(module.flags).should_receive('make_flags').and_return(()).and_return(
('--lock-wait', '5')
)
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={'lock_wait': 5},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=False),
)
@@ -194,9 +106,16 @@ def test_make_list_command_includes_lock_wait():
def test_make_list_command_includes_archive():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive='archive', paths=None, json=False),
)
@@ -204,9 +123,16 @@ def test_make_list_command_includes_archive():
def test_make_list_command_includes_archive_and_path():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive='archive', paths=['var/lib'], json=False),
)
@@ -214,9 +140,14 @@ def test_make_list_command_includes_archive_and_path():
def test_make_list_command_includes_local_path():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=False),
local_path='borg2',
)
@@ -225,9 +156,16 @@ def test_make_list_command_includes_local_path():
def test_make_list_command_includes_remote_path():
flexmock(module.flags).should_receive('make_flags').and_return(
('--remote-path', 'borg2')
).and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=False),
remote_path='borg2',
)
@@ -236,9 +174,14 @@ def test_make_list_command_includes_remote_path():
def test_make_list_command_includes_short():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--short',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(archive=None, paths=None, json=False, short=True),
)
@@ -249,7 +192,7 @@ def test_make_list_command_includes_short():
'argument_name',
(
'prefix',
'glob_archives',
'match_archives',
'sort_by',
'first',
'last',
@@ -260,16 +203,23 @@ def test_make_list_command_includes_short():
),
)
def test_make_list_command_includes_additional_flags(argument_name):
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(
(f"--{argument_name.replace('_', '-')}", 'value')
)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_list_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=flexmock(
archive=None,
paths=None,
json=False,
find_paths=None,
format=None,
**{argument_name: 'value'}
**{argument_name: 'value'},
),
)
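
The parametrized test above encodes a simple naming rule: each list argument maps to a Borg flag by swapping underscores for dashes. A one-line sketch of that conversion, with a hypothetical helper name:

def make_flag_pair(argument_name, value):
    # Underscores in the Python-side argument name become dashes in the flag.
    return (f"--{argument_name.replace('_', '-')}", str(value))

# make_flag_pair('match_archives', 'value') -> ('--match-archives', 'value')
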
@@ -303,122 +253,27 @@ def test_make_find_paths_adds_globs_to_path_fragments():
assert module.make_find_paths(('foo.txt',)) == ('sh:**/*foo.txt*/**',)
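
Per the assertion above, plain path fragments are wrapped in a shell-pattern glob so Borg matches them anywhere in an archive. A minimal sketch of that wrapping, assuming every fragment gets the same treatment (the real function may pass through fragments that already look like patterns):

def make_find_paths(find_paths):
    # Wrap each fragment so it matches at any depth, with any surrounding text.
    if not find_paths:
        return ()
    return tuple(f'sh:**/*{fragment}*/**' for fragment in find_paths)

# make_find_paths(('foo.txt',)) -> ('sh:**/*foo.txt*/**',)
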
def test_list_archives_calls_borg_with_parameters():
list_arguments = argparse.Namespace(archive=None, paths=None, json=False, find_paths=None)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
list_arguments=list_arguments,
local_path='borg',
remote_path=None,
).and_return(('borg', 'list', 'repo'))
flexmock(module).should_receive('make_find_paths').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo'),
output_log_level=logging.WARNING,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archives(
repository='repo', storage_config={}, list_arguments=list_arguments,
)
def test_list_archives_with_json_suppresses_most_borg_output():
list_arguments = argparse.Namespace(archive=None, paths=None, json=True, find_paths=None)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
list_arguments=list_arguments,
local_path='borg',
remote_path=None,
).and_return(('borg', 'list', 'repo'))
flexmock(module).should_receive('make_find_paths').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archives(
repository='repo', storage_config={}, list_arguments=list_arguments,
)
def test_list_archives_calls_borg_with_local_path():
list_arguments = argparse.Namespace(archive=None, paths=None, json=False, find_paths=None)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
list_arguments=list_arguments,
local_path='borg2',
remote_path=None,
).and_return(('borg2', 'list', 'repo'))
flexmock(module).should_receive('make_find_paths').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg2', 'list', 'repo'),
output_log_level=logging.WARNING,
borg_local_path='borg2',
extra_environment=None,
).once()
module.list_archives(
repository='repo', storage_config={}, list_arguments=list_arguments, local_path='borg2',
)
def test_list_archives_calls_borg_multiple_times_with_find_paths():
glob_paths = ('**/*foo.txt*/**',)
def test_list_archive_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive=None, paths=None, json=False, find_paths=['foo.txt'], format=None
archive='archive',
paths=None,
json=False,
find_paths=None,
prefix=None,
match_archives=None,
sort_by=None,
first=None,
last=None,
)
flexmock(module).should_receive('make_list_command').and_return(
('borg', 'list', 'repo')
).and_return(('borg', 'list', 'repo::archive1')).and_return(('borg', 'list', 'repo::archive2'))
flexmock(module).should_receive('make_find_paths').and_return(glob_paths)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo'),
output_log_level=None,
borg_local_path='borg',
extra_environment=None,
).and_return(
'archive1 Sun, 2022-05-29 15:27:04 [abc]\narchive2 Mon, 2022-05-30 19:47:15 [xyz]'
).once()
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive1') + glob_paths,
output_log_level=logging.WARNING,
borg_local_path='borg',
extra_environment=None,
).once()
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive2') + glob_paths,
output_log_level=logging.WARNING,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archives(
repository='repo', storage_config={}, list_arguments=list_arguments,
)
def test_list_archives_calls_borg_with_archive():
list_arguments = argparse.Namespace(archive='archive', paths=None, json=False, find_paths=None)
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
local_path='borg',
remote_path=None,
@@ -427,11 +282,374 @@ def test_list_archives_calls_borg_with_archive():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive'),
output_log_level=logging.WARNING,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archives(
repository='repo', storage_config={}, list_arguments=list_arguments,
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
)
def test_list_archive_with_archive_and_json_errors():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(archive='archive', paths=None, json=True, find_paths=None)
flexmock(module.feature).should_receive('available').and_return(False)
with pytest.raises(ValueError):
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
)
def test_list_archive_calls_borg_with_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive='archive',
paths=None,
json=False,
find_paths=None,
prefix=None,
match_archives=None,
sort_by=None,
first=None,
last=None,
)
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
local_path='borg2',
remote_path=None,
).and_return(('borg2', 'list', 'repo::archive'))
flexmock(module).should_receive('make_find_paths').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg2', 'list', 'repo::archive'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg2',
extra_environment=None,
).once()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
local_path='borg2',
)
def test_list_archive_calls_borg_multiple_times_with_find_paths():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
glob_paths = ('**/*foo.txt*/**',)
list_arguments = argparse.Namespace(
archive=None,
json=False,
find_paths=['foo.txt'],
prefix=None,
match_archives=None,
sort_by=None,
first=None,
last=None,
)
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.rlist).should_receive('make_rlist_command').and_return(('borg', 'list', 'repo'))
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list', 'repo'), extra_environment=None,
).and_return('archive1\narchive2').once()
flexmock(module).should_receive('make_list_command').and_return(
('borg', 'list', 'repo::archive1')
).and_return(('borg', 'list', 'repo::archive2'))
flexmock(module).should_receive('make_find_paths').and_return(glob_paths)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive1') + glob_paths,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive2') + glob_paths,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
)
def test_list_archive_calls_borg_with_archive():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive='archive',
paths=None,
json=False,
find_paths=None,
prefix=None,
match_archives=None,
sort_by=None,
first=None,
last=None,
)
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
local_path='borg',
remote_path=None,
).and_return(('borg', 'list', 'repo::archive'))
flexmock(module).should_receive('make_find_paths').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
)
def test_list_archive_without_archive_delegates_to_list_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive=None,
short=None,
format=None,
json=None,
prefix=None,
match_archives=None,
sort_by=None,
first=None,
last=None,
find_paths=None,
)
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.rlist).should_receive('list_repository')
flexmock(module.environment).should_receive('make_environment').never()
flexmock(module).should_receive('execute_command').never()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
)
def test_list_archive_with_borg_features_without_archive_delegates_to_list_repository():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
list_arguments = argparse.Namespace(
archive=None,
short=None,
format=None,
json=None,
prefix=None,
match_archives=None,
sort_by=None,
first=None,
last=None,
find_paths=None,
)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.rlist).should_receive('list_repository')
flexmock(module.environment).should_receive('make_environment').never()
flexmock(module).should_receive('execute_command').never()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=list_arguments,
)
@pytest.mark.parametrize(
'archive_filter_flag', ('prefix', 'match_archives', 'sort_by', 'first', 'last',),
)
def test_list_archive_with_archive_ignores_archive_filter_flag(archive_filter_flag,):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
default_filter_flags = {
'prefix': None,
'match_archives': None,
'sort_by': None,
'first': None,
'last': None,
}
altered_filter_flags = {**default_filter_flags, **{archive_filter_flag: 'foo'}}
flexmock(module.feature).should_receive('available').with_args(
module.feature.Feature.RLIST, '1.2.3'
).and_return(False)
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=argparse.Namespace(
archive='archive', paths=None, json=False, find_paths=None, **default_filter_flags
),
local_path='borg',
remote_path=None,
).and_return(('borg', 'list', 'repo::archive'))
flexmock(module).should_receive('make_find_paths').and_return(())
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', 'repo::archive'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=argparse.Namespace(
archive='archive', paths=None, json=False, find_paths=None, **altered_filter_flags
),
)
@pytest.mark.parametrize(
'archive_filter_flag', ('prefix', 'match_archives', 'sort_by', 'first', 'last',),
)
def test_list_archive_with_find_paths_allows_archive_filter_flag_but_only_passes_it_to_rlist(
archive_filter_flag,
):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.logger).answer = lambda message: None
default_filter_flags = {
'prefix': None,
'match_archives': None,
'sort_by': None,
'first': None,
'last': None,
}
altered_filter_flags = {**default_filter_flags, **{archive_filter_flag: 'foo'}}
glob_paths = ('**/*foo.txt*/**',)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.rlist).should_receive('make_rlist_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=argparse.Namespace(
repository='repo', short=True, format=None, json=None, **altered_filter_flags
),
local_path='borg',
remote_path=None,
).and_return(('borg', 'rlist', '--repo', 'repo'))
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rlist', '--repo', 'repo'), extra_environment=None,
).and_return('archive1\narchive2').once()
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=argparse.Namespace(
repository='repo',
archive='archive1',
paths=None,
short=True,
format=None,
json=None,
find_paths=['foo.txt'],
**default_filter_flags,
),
local_path='borg',
remote_path=None,
).and_return(('borg', 'list', '--repo', 'repo', 'archive1'))
flexmock(module).should_receive('make_list_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=argparse.Namespace(
repository='repo',
archive='archive2',
paths=None,
short=True,
format=None,
json=None,
find_paths=['foo.txt'],
**default_filter_flags,
),
local_path='borg',
remote_path=None,
).and_return(('borg', 'list', '--repo', 'repo', 'archive2'))
flexmock(module).should_receive('make_find_paths').and_return(glob_paths)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--repo', 'repo', 'archive1') + glob_paths,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
flexmock(module).should_receive('execute_command').with_args(
('borg', 'list', '--repo', 'repo', 'archive2') + glob_paths,
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_archive(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
list_arguments=argparse.Namespace(
repository='repo',
archive=None,
paths=None,
short=True,
format=None,
json=None,
find_paths=['foo.txt'],
**altered_filter_flags,
),
)
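
Taken together, the list_archive tests above describe a three-way dispatch: with no archive and no find_paths the action delegates to rlist.list_repository, JSON output combined with an archive is rejected, and find_paths first resolves matching archive names (via rlist) and then runs one borg list per archive. A condensed, self-contained sketch of just that decision logic, with a hypothetical function name and a generic error message:

def choose_list_strategy(archive, json, find_paths):
    # Condensed decision logic inferred from the tests above; the real
    # list_archive() performs the chosen action instead of returning a label.
    if archive and json:
        raise ValueError('JSON output is only supported when listing a whole repository')
    if not archive and not find_paths:
        return 'delegate to rlist.list_repository'
    if find_paths and not archive:
        return 'resolve archive names via rlist, then run borg list per archive'
    return 'run borg list for the single named archive'

# choose_list_strategy(archive='archive', json=False, find_paths=None)
# -> 'run borg list for the single named archive'
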

View File

@@ -14,7 +14,47 @@ def insert_execute_command_mock(command):
).once()
def test_mount_archive_calls_borg_with_required_parameters():
def test_mount_archive_calls_borg_with_required_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg', 'mount', 'repo', '/mnt'))
module.mount_archive(
repository='repo',
archive=None,
mount_point='/mnt',
paths=None,
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_with_borg_features_calls_borg_with_repository_and_match_archives_flags():
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
insert_execute_command_mock(
('borg', 'mount', '--repo', 'repo', '--match-archives', 'archive', '/mnt')
)
module.mount_archive(
repository='repo',
archive='archive',
mount_point='/mnt',
paths=None,
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_without_archive_calls_borg_with_repository_flags_only():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', 'repo::archive', '/mnt'))
module.mount_archive(
@@ -25,10 +65,15 @@ def test_mount_archive_calls_borg_with_required_parameters():
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_calls_borg_with_path_parameters():
def test_mount_archive_calls_borg_with_path_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', 'repo::archive', '/mnt', 'path1', 'path2'))
module.mount_archive(
@@ -39,10 +84,15 @@ def test_mount_archive_calls_borg_with_path_parameters():
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_calls_borg_with_remote_path_parameters():
def test_mount_archive_calls_borg_with_remote_path_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(
('borg', 'mount', '--remote-path', 'borg1', 'repo::archive', '/mnt')
)
@@ -55,11 +105,16 @@ def test_mount_archive_calls_borg_with_remote_path_parameters():
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
remote_path='borg1',
)
def test_mount_archive_calls_borg_with_umask_parameters():
def test_mount_archive_calls_borg_with_umask_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', '--umask', '0770', 'repo::archive', '/mnt'))
module.mount_archive(
@@ -70,10 +125,15 @@ def test_mount_archive_calls_borg_with_umask_parameters():
foreground=False,
options=None,
storage_config={'umask': '0770'},
local_borg_version='1.2.3',
)
def test_mount_archive_calls_borg_with_lock_wait_parameters():
def test_mount_archive_calls_borg_with_lock_wait_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', '--lock-wait', '5', 'repo::archive', '/mnt'))
module.mount_archive(
@@ -84,10 +144,15 @@ def test_mount_archive_calls_borg_with_lock_wait_parameters():
foreground=False,
options=None,
storage_config={'lock_wait': '5'},
local_borg_version='1.2.3',
)
def test_mount_archive_with_log_info_calls_borg_with_info_parameter():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', '--info', 'repo::archive', '/mnt'))
insert_logging_mock(logging.INFO)
@@ -99,10 +164,15 @@ def test_mount_archive_with_log_info_calls_borg_with_info_parameter():
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_with_log_debug_calls_borg_with_debug_parameters():
def test_mount_archive_with_log_debug_calls_borg_with_debug_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', '--debug', '--show-rc', 'repo::archive', '/mnt'))
insert_logging_mock(logging.DEBUG)
@@ -114,10 +184,15 @@ def test_mount_archive_with_log_debug_calls_borg_with_debug_parameters():
foreground=False,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_calls_borg_with_foreground_parameter():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'mount', '--foreground', 'repo::archive', '/mnt'),
@@ -134,10 +209,15 @@ def test_mount_archive_calls_borg_with_foreground_parameter():
foreground=True,
options=None,
storage_config={},
local_borg_version='1.2.3',
)
def test_mount_archive_calls_borg_with_options_parameters():
def test_mount_archive_calls_borg_with_options_flags():
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_archive_flags').and_return(
('repo::archive',)
)
insert_execute_command_mock(('borg', 'mount', '-o', 'super_mount', 'repo::archive', '/mnt'))
module.mount_archive(
@@ -148,4 +228,5 @@ def test_mount_archive_calls_borg_with_options_parameters():
foreground=False,
options='super_mount',
storage_config={},
local_borg_version='1.2.3',
)
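
Across the mount tests above, the command takes two shapes: newer Borg gets --repo plus a separate --match-archives flag, while older Borg gets the combined repo::archive form, with umask, lock-wait, foreground, -o options, and paths layered on top. A rough command-assembly sketch under those assumptions; logging flags are omitted, and the ordering of optional flags is only pinned down one flag at a time by the tests:

def build_mount_command(repository_flags, mount_point, archive=None, paths=None,
                        separate_archive_flag=False, foreground=False, options=None,
                        umask=None, lock_wait=None, remote_path=None, local_path='borg'):
    # Sketch only: the real mount_archive() builds repository_flags via
    # borgmatic's flags helpers and threads the storage configuration through.
    command = (local_path, 'mount')
    if remote_path:
        command += ('--remote-path', remote_path)
    if umask:
        command += ('--umask', str(umask))
    if lock_wait:
        command += ('--lock-wait', str(lock_wait))
    if foreground:
        command += ('--foreground',)
    if options:
        command += ('-o', options)
    command += repository_flags
    if separate_archive_flag and archive:
        command += ('--match-archives', archive)
    command += (mount_point,)
    if paths:
        command += tuple(paths)
    return command

# build_mount_command(('repo::archive',), '/mnt', umask='0770')
# -> ('borg', 'mount', '--umask', '0770', 'repo::archive', '/mnt')
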

View File

@@ -21,28 +21,42 @@ def insert_execute_command_mock(prune_command, output_log_level):
BASE_PRUNE_FLAGS = (('--keep-daily', '1'), ('--keep-weekly', '2'), ('--keep-monthly', '3'))
def test_make_prune_flags_returns_flags_from_config_plus_default_prefix():
def test_make_prune_flags_returns_flags_from_config_plus_default_prefix_glob():
retention_config = OrderedDict((('keep_daily', 1), ('keep_weekly', 2), ('keep_monthly', 3)))
flexmock(module.feature).should_receive('available').and_return(True)
result = module._make_prune_flags(retention_config)
result = module.make_prune_flags(retention_config, local_borg_version='1.2.3')
assert tuple(result) == BASE_PRUNE_FLAGS + (('--prefix', '{hostname}-'),)
assert tuple(result) == BASE_PRUNE_FLAGS + (('--match-archives', 'sh:{hostname}-*'),)
def test_make_prune_flags_accepts_prefix_with_placeholders():
retention_config = OrderedDict((('keep_daily', 1), ('prefix', 'Documents_{hostname}-{now}')))
flexmock(module.feature).should_receive('available').and_return(True)
result = module._make_prune_flags(retention_config)
result = module.make_prune_flags(retention_config, local_borg_version='1.2.3')
expected = (('--keep-daily', '1'), ('--prefix', 'Documents_{hostname}-{now}'))
expected = (('--keep-daily', '1'), ('--match-archives', 'sh:Documents_{hostname}-{now}*'))
assert tuple(result) == expected
def test_make_prune_flags_with_prefix_without_borg_features_uses_glob_archives():
retention_config = OrderedDict((('keep_daily', 1), ('prefix', 'Documents_{hostname}-{now}')))
flexmock(module.feature).should_receive('available').and_return(False)
result = module.make_prune_flags(retention_config, local_borg_version='1.2.3')
expected = (('--keep-daily', '1'), ('--glob-archives', 'Documents_{hostname}-{now}*'))
assert tuple(result) == expected
def test_make_prune_flags_treats_empty_prefix_as_no_prefix():
retention_config = OrderedDict((('keep_daily', 1), ('prefix', '')))
flexmock(module.feature).should_receive('available').and_return(True)
result = module._make_prune_flags(retention_config)
result = module.make_prune_flags(retention_config, local_borg_version='1.2.3')
expected = (('--keep-daily', '1'),)
@@ -51,8 +65,9 @@ def test_make_prune_flags_treats_empty_prefix_as_no_prefix():
def test_make_prune_flags_treats_none_prefix_as_no_prefix():
retention_config = OrderedDict((('keep_daily', 1), ('prefix', None)))
flexmock(module.feature).should_receive('available').and_return(True)
result = module._make_prune_flags(retention_config)
result = module.make_prune_flags(retention_config, local_borg_version='1.2.3')
expected = (('--keep-daily', '1'),)
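
Together these make_prune_flags tests describe the prefix translation: with the newer archive-matching feature a prefix becomes --match-archives sh:<prefix>*, older Borg gets --glob-archives <prefix>*, and an empty or None prefix yields no archive filter at all, while the keep_* options map straight to their flags. A small sketch of that translation; the real function derives feature availability from the local Borg version rather than taking a boolean:

def make_prune_flags(retention_config, match_archives_available):
    # Sketch of the flag generation described by the tests above.
    config = dict(retention_config)
    prefix = config.pop('prefix', '{hostname}-')
    flags = tuple(
        (f"--{option.replace('_', '-')}", str(value)) for option, value in config.items()
    )
    if prefix:
        if match_archives_available:
            flags += (('--match-archives', f'sh:{prefix}*'),)
        else:
            flags += (('--glob-archives', f'{prefix}*'),)
    return flags

# make_prune_flags({'keep_daily': 1}, match_archives_available=True)
# -> (('--keep-daily', '1'), ('--match-archives', 'sh:{hostname}-*'))
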
@@ -63,195 +78,184 @@ PRUNE_COMMAND = ('borg', 'prune', '--keep-daily', '1', '--keep-weekly', '2', '--
def test_prune_archives_calls_borg_with_parameters():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('repo',), logging.INFO)
module.prune_archives(
dry_run=False, repository='repo', storage_config={}, retention_config=retention_config
dry_run=False,
repository='repo',
storage_config={},
retention_config=flexmock(),
local_borg_version='1.2.3',
)
def test_prune_archives_with_log_info_calls_borg_with_info_parameter():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--info', 'repo'), logging.INFO)
insert_logging_mock(logging.INFO)
module.prune_archives(
repository='repo', storage_config={}, dry_run=False, retention_config=retention_config
repository='repo',
storage_config={},
dry_run=False,
retention_config=flexmock(),
local_borg_version='1.2.3',
)
def test_prune_archives_with_log_debug_calls_borg_with_debug_parameter():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--debug', '--show-rc', 'repo'), logging.INFO)
insert_logging_mock(logging.DEBUG)
module.prune_archives(
repository='repo', storage_config={}, dry_run=False, retention_config=retention_config
repository='repo',
storage_config={},
dry_run=False,
retention_config=flexmock(),
local_borg_version='1.2.3',
)
def test_prune_archives_with_dry_run_calls_borg_with_dry_run_parameter():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--dry-run', 'repo'), logging.INFO)
module.prune_archives(
repository='repo', storage_config={}, dry_run=True, retention_config=retention_config
repository='repo',
storage_config={},
dry_run=True,
retention_config=flexmock(),
local_borg_version='1.2.3',
)
def test_prune_archives_with_local_path_calls_borg_via_local_path():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(('borg1',) + PRUNE_COMMAND[1:] + ('repo',), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=retention_config,
retention_config=flexmock(),
local_borg_version='1.2.3',
local_path='borg1',
)
def test_prune_archives_with_remote_path_calls_borg_with_remote_path_parameters():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--remote-path', 'borg1', 'repo'), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=retention_config,
retention_config=flexmock(),
local_borg_version='1.2.3',
remote_path='borg1',
)
def test_prune_archives_with_stats_calls_borg_with_stats_parameter_and_warning_output_log_level():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
insert_execute_command_mock(PRUNE_COMMAND + ('--stats', 'repo'), logging.WARNING)
def test_prune_archives_with_stats_calls_borg_with_stats_parameter_and_answer_output_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--stats', 'repo'), module.borgmatic.logger.ANSWER)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=retention_config,
retention_config=flexmock(),
local_borg_version='1.2.3',
stats=True,
)
def test_prune_archives_with_stats_and_log_info_calls_borg_with_stats_parameter_and_info_output_log_level():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
insert_logging_mock(logging.INFO)
insert_execute_command_mock(PRUNE_COMMAND + ('--stats', '--info', 'repo'), logging.INFO)
def test_prune_archives_with_files_calls_borg_with_list_parameter_and_answer_output_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--list', 'repo'), module.borgmatic.logger.ANSWER)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=retention_config,
stats=True,
)
def test_prune_archives_with_files_calls_borg_with_list_parameter_and_warning_output_log_level():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
insert_execute_command_mock(PRUNE_COMMAND + ('--list', 'repo'), logging.WARNING)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=retention_config,
files=True,
)
def test_prune_archives_with_files_and_log_info_calls_borg_with_list_parameter_and_info_output_log_level():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
insert_logging_mock(logging.INFO)
insert_execute_command_mock(PRUNE_COMMAND + ('--info', '--list', 'repo'), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={},
retention_config=retention_config,
files=True,
retention_config=flexmock(),
local_borg_version='1.2.3',
list_archives=True,
)
def test_prune_archives_with_umask_calls_borg_with_umask_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'umask': '077'}
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--umask', '077', 'repo'), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config=storage_config,
retention_config=retention_config,
retention_config=flexmock(),
local_borg_version='1.2.3',
)
def test_prune_archives_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'lock_wait': 5}
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--lock-wait', '5', 'repo'), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config=storage_config,
retention_config=retention_config,
retention_config=flexmock(),
local_borg_version='1.2.3',
)
def test_prune_archives_with_extra_borg_options_calls_borg_with_extra_options():
retention_config = flexmock()
flexmock(module).should_receive('_make_prune_flags').with_args(retention_config).and_return(
BASE_PRUNE_FLAGS
)
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module).should_receive('make_prune_flags').and_return(BASE_PRUNE_FLAGS)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
insert_execute_command_mock(PRUNE_COMMAND + ('--extra', '--options', 'repo'), logging.INFO)
module.prune_archives(
dry_run=False,
repository='repo',
storage_config={'extra_borg_options': {'prune': '--extra --options'}},
retention_config=retention_config,
retention_config=flexmock(),
local_borg_version='1.2.3',
)
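
One detail worth calling out from the prune tests above: --stats and --list output is asserted at borgmatic's custom ANSWER log level, while plain runs stay at INFO. A tiny sketch of that selection, with a placeholder numeric level standing in for the one borgmatic registers via add_custom_log_levels:

import logging

ANSWER = logging.WARNING - 5  # placeholder value, not borgmatic's actual constant

def prune_output_log_level(stats, list_archives):
    # Surface Borg's --stats / --list output at ANSWER, everything else at INFO.
    return ANSWER if stats or list_archives else logging.INFO
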

View File

@@ -0,0 +1,269 @@
import logging
import subprocess
import pytest
from flexmock import flexmock
from borgmatic.borg import rcreate as module
from ..test_verbosity import insert_logging_mock
RINFO_SOME_UNKNOWN_EXIT_CODE = -999
RCREATE_COMMAND = ('borg', 'rcreate', '--encryption', 'repokey')
def insert_rinfo_command_found_mock():
flexmock(module.rinfo).should_receive('display_repository_info')
def insert_rinfo_command_not_found_mock():
flexmock(module.rinfo).should_receive('display_repository_info').and_raise(
subprocess.CalledProcessError(module.RINFO_REPOSITORY_NOT_FOUND_EXIT_CODE, [])
)
def insert_rcreate_command_mock(rcreate_command, **kwargs):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
rcreate_command,
output_file=module.DO_NOT_CAPTURE,
borg_local_path=rcreate_command[0],
extra_environment=None,
).once()
def test_create_repository_calls_borg_with_flags():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_with_dry_run_skips_borg_call():
insert_rinfo_command_not_found_mock()
flexmock(module).should_receive('execute_command').never()
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=True,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_raises_for_borg_rcreate_error():
insert_rinfo_command_not_found_mock()
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').and_raise(
module.subprocess.CalledProcessError(2, 'borg rcreate')
)
with pytest.raises(subprocess.CalledProcessError):
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_skips_creation_when_repository_already_exists():
insert_rinfo_command_found_mock()
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_raises_for_unknown_rinfo_command_error():
flexmock(module.rinfo).should_receive('display_repository_info').and_raise(
subprocess.CalledProcessError(RINFO_SOME_UNKNOWN_EXIT_CODE, [])
)
with pytest.raises(subprocess.CalledProcessError):
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_with_source_repository_calls_borg_with_other_repo_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--other-repo', 'other.borg', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
source_repository='other.borg',
)
def test_create_repository_with_copy_crypt_key_calls_borg_with_copy_crypt_key_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--copy-crypt-key', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
copy_crypt_key=True,
)
def test_create_repository_with_append_only_calls_borg_with_append_only_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--append-only', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
append_only=True,
)
def test_create_repository_with_storage_quota_calls_borg_with_storage_quota_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--storage-quota', '5G', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
storage_quota='5G',
)
def test_create_repository_with_make_parent_dirs_calls_borg_with_make_parent_dirs_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--make-parent-dirs', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
make_parent_dirs=True,
)
def test_create_repository_with_log_info_calls_borg_with_info_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--info', '--repo', 'repo'))
insert_logging_mock(logging.INFO)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_with_log_debug_calls_borg_with_debug_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--debug', '--repo', 'repo'))
insert_logging_mock(logging.DEBUG)
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
def test_create_repository_with_local_path_calls_borg_via_local_path():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(('borg1',) + RCREATE_COMMAND[1:] + ('--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
local_path='borg1',
)
def test_create_repository_with_remote_path_calls_borg_with_remote_path_flag():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--remote-path', 'borg1', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
encryption_mode='repokey',
remote_path='borg1',
)
def test_create_repository_with_extra_borg_options_calls_borg_with_extra_options():
insert_rinfo_command_not_found_mock()
insert_rcreate_command_mock(RCREATE_COMMAND + ('--extra', '--options', '--repo', 'repo'))
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
module.create_repository(
dry_run=False,
repository='repo',
storage_config={'extra_borg_options': {'rcreate': '--extra --options'}},
local_borg_version='2.3.4',
encryption_mode='repokey',
)
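
The rcreate tests above sketch create_repository's control flow: probe the repository with rinfo first, skip creation if the probe succeeds, create only when the probe fails with the repository-not-found exit code, re-raise any other probe failure, and skip the Borg call entirely on a dry run. A condensed version of that flow with the probe and creation passed in as callables; the exit-code constant here is a placeholder for the one borgmatic takes from Borg:

import subprocess

REPOSITORY_NOT_FOUND_EXIT_CODE = 2  # placeholder; borgmatic uses Borg's actual exit code

def ensure_repository(probe, create, dry_run=False):
    # probe() stands in for rinfo.display_repository_info(); create() stands in
    # for executing the assembled "borg rcreate" command.
    try:
        probe()
        return  # repository already exists, so there's nothing to create
    except subprocess.CalledProcessError as error:
        if error.returncode != REPOSITORY_NOT_FOUND_EXIT_CODE:
            raise
    if dry_run:
        return
    create()
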

View File

@@ -0,0 +1,220 @@
import logging
from flexmock import flexmock
from borgmatic.borg import rinfo as module
from ..test_verbosity import insert_logging_mock
def test_display_repository_info_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
)
def test_display_repository_info_without_borg_features_calls_borg_with_info_sub_command():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'info', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
)
def test_display_repository_info_with_log_info_calls_borg_with_info_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--info', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.INFO)
module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
)
def test_display_repository_info_with_log_info_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.INFO)
json_output = module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=True),
)
assert json_output == '[]'
def test_display_repository_info_with_log_debug_calls_borg_with_debug_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--debug', '--show-rc', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.DEBUG)
module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
)
def test_display_repository_info_with_log_debug_and_json_suppresses_most_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
insert_logging_mock(logging.DEBUG)
json_output = module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=True),
)
assert json_output == '[]'
def test_display_repository_info_with_json_calls_borg_with_json_parameter():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'rinfo', '--json', '--repo', 'repo'), extra_environment=None,
).and_return('[]')
json_output = module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=True),
)
assert json_output == '[]'
def test_display_repository_info_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg1', 'rinfo', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg1',
extra_environment=None,
)
module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
local_path='borg1',
)
def test_display_repository_info_with_remote_path_calls_borg_with_remote_path_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--remote-path', 'borg1', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_repository_info(
repository='repo',
storage_config={},
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
remote_path='borg1',
)
def test_display_repository_info_with_lock_wait_calls_borg_with_lock_wait_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
storage_config = {'lock_wait': 5}
flexmock(module.feature).should_receive('available').and_return(True)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo',))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rinfo', '--lock-wait', '5', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.display_repository_info(
repository='repo',
storage_config=storage_config,
local_borg_version='2.3.4',
rinfo_arguments=flexmock(json=False),
)

@ -0,0 +1,383 @@
import argparse
import logging
import pytest
from flexmock import flexmock
from borgmatic.borg import rlist as module
from ..test_verbosity import insert_logging_mock
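# Expected tail of the "borg list" command that resolve_archive_name() runs when resolving
# the special 'latest' archive name (used by the tests below).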
BORG_LIST_LATEST_ARGUMENTS = (
'--last',
'1',
'--short',
'repo',
)
def test_resolve_archive_name_passes_through_non_latest_archive_name():
archive = 'myhost-2030-01-01T14:41:17.647620'
assert (
module.resolve_archive_name('repo', archive, storage_config={}, local_borg_version='1.2.3')
== archive
)
def test_resolve_archive_name_calls_borg_with_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name('repo', 'latest', storage_config={}, local_borg_version='1.2.3')
== expected_archive
)
def test_resolve_archive_name_with_log_info_calls_borg_without_info_parameter():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
insert_logging_mock(logging.INFO)
assert (
module.resolve_archive_name('repo', 'latest', storage_config={}, local_borg_version='1.2.3')
== expected_archive
)
def test_resolve_archive_name_with_log_debug_calls_borg_without_debug_parameter():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
insert_logging_mock(logging.DEBUG)
assert (
module.resolve_archive_name('repo', 'latest', storage_config={}, local_borg_version='1.2.3')
== expected_archive
)
def test_resolve_archive_name_with_local_path_calls_borg_via_local_path():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg1', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name(
'repo', 'latest', storage_config={}, local_borg_version='1.2.3', local_path='borg1'
)
== expected_archive
)
def test_resolve_archive_name_with_remote_path_calls_borg_with_remote_path_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list', '--remote-path', 'borg1') + BORG_LIST_LATEST_ARGUMENTS,
extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name(
'repo', 'latest', storage_config={}, local_borg_version='1.2.3', remote_path='borg1'
)
== expected_archive
)
def test_resolve_archive_name_without_archives_raises():
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list') + BORG_LIST_LATEST_ARGUMENTS, extra_environment=None,
).and_return('')
with pytest.raises(ValueError):
module.resolve_archive_name('repo', 'latest', storage_config={}, local_borg_version='1.2.3')
def test_resolve_archive_name_with_lock_wait_calls_borg_with_lock_wait_parameters():
expected_archive = 'archive-name'
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('borg', 'list', '--lock-wait', 'okay') + BORG_LIST_LATEST_ARGUMENTS,
extra_environment=None,
).and_return(expected_archive + '\n')
assert (
module.resolve_archive_name(
'repo', 'latest', storage_config={'lock_wait': 'okay'}, local_borg_version='1.2.3'
)
== expected_archive
)
def test_make_rlist_command_includes_log_info():
insert_logging_mock(logging.INFO)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix=None),
)
assert command == ('borg', 'list', '--info', 'repo')
def test_make_rlist_command_includes_json_but_not_info():
insert_logging_mock(logging.INFO)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=True, prefix=None),
)
assert command == ('borg', 'list', '--json', 'repo')
def test_make_rlist_command_includes_log_debug():
insert_logging_mock(logging.DEBUG)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix=None),
)
assert command == ('borg', 'list', '--debug', '--show-rc', 'repo')
def test_make_rlist_command_includes_json_but_not_debug():
insert_logging_mock(logging.DEBUG)
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=True, prefix=None),
)
assert command == ('borg', 'list', '--json', 'repo')
def test_make_rlist_command_includes_json():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--json',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=True, prefix=None),
)
assert command == ('borg', 'list', '--json', 'repo')
def test_make_rlist_command_includes_lock_wait():
flexmock(module.flags).should_receive('make_flags').and_return(()).and_return(
('--lock-wait', '5')
).and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={'lock_wait': 5},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix=None),
)
assert command == ('borg', 'list', '--lock-wait', '5', 'repo')
def test_make_rlist_command_includes_local_path():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix=None),
local_path='borg2',
)
assert command == ('borg2', 'list', 'repo')
def test_make_rlist_command_includes_remote_path():
flexmock(module.flags).should_receive('make_flags').and_return(
('--remote-path', 'borg2')
).and_return(()).and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix=None),
remote_path='borg2',
)
assert command == ('borg', 'list', '--remote-path', 'borg2', 'repo')
def test_make_rlist_command_transforms_prefix_into_match_archives():
flexmock(module.flags).should_receive('make_flags').and_return(()).and_return(()).and_return(
('--match-archives', 'sh:foo*')
)
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix='foo'),
)
assert command == ('borg', 'list', '--match-archives', 'sh:foo*', 'repo')
def test_make_rlist_command_includes_short():
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(('--short',))
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(archive=None, paths=None, json=False, prefix=None, short=True),
)
assert command == ('borg', 'list', '--short', 'repo')
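# Each parametrized argument below should surface in the generated command as the
# corresponding --flag; make_flags_from_arguments() is mocked to return that flag and value.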
@pytest.mark.parametrize(
'argument_name',
(
'match_archives',
'sort_by',
'first',
'last',
'exclude',
'exclude_from',
'pattern',
'patterns_from',
),
)
def test_make_rlist_command_includes_additional_flags(argument_name):
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(
(f"--{argument_name.replace('_', '-')}", 'value')
)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('repo',))
command = module.make_rlist_command(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=flexmock(
archive=None,
paths=None,
json=False,
prefix=None,
find_paths=None,
format=None,
**{argument_name: 'value'},
),
)
assert command == ('borg', 'list', '--' + argument_name.replace('_', '-'), 'value', 'repo')
def test_list_repository_calls_borg_with_parameters():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
rlist_arguments = argparse.Namespace(json=False)
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module).should_receive('make_rlist_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=rlist_arguments,
local_path='borg',
remote_path=None,
).and_return(('borg', 'rlist', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'rlist', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
).once()
module.list_repository(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=rlist_arguments,
)
def test_list_repository_with_json_returns_borg_output():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
rlist_arguments = argparse.Namespace(json=True)
json_output = flexmock()
flexmock(module.feature).should_receive('available').and_return(False)
flexmock(module).should_receive('make_rlist_command').with_args(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=rlist_arguments,
local_path='borg',
remote_path=None,
).and_return(('borg', 'rlist', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command_and_capture_output').and_return(json_output)
assert (
module.list_repository(
repository='repo',
storage_config={},
local_borg_version='1.2.3',
rlist_arguments=rlist_arguments,
)
== json_output
)

@ -0,0 +1,288 @@
import logging
import pytest
from flexmock import flexmock
from borgmatic.borg import transfer as module
from ..test_verbosity import insert_logging_mock
def test_transfer_archives_calls_borg_with_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
)
def test_transfer_archives_with_dry_run_calls_borg_with_dry_run_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('dry-run', True).and_return(
('--dry-run',)
)
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--repo', 'repo', '--dry-run'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=True,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
)
def test_transfer_archives_with_log_info_calls_borg_with_info_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--info', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.INFO)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
)
def test_transfer_archives_with_log_debug_calls_borg_with_debug_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--debug', '--show-rc', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
insert_logging_mock(logging.DEBUG)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
)
def test_transfer_archives_with_archive_calls_borg_with_match_archives_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'archive'
).and_return(('--match-archives', 'archive'))
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--match-archives', 'archive', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive='archive', match_archives=None, source_repository=None),
)
def test_transfer_archives_with_match_archives_calls_borg_with_match_archives_flag():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'match-archives', 'sh:foo*'
).and_return(('--match-archives', 'sh:foo*'))
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--match-archives', 'sh:foo*', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives='sh:foo*', source_repository=None),
)
def test_transfer_archives_with_local_path_calls_borg_via_local_path():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg2', 'transfer', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg2',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
local_path='borg2',
)
def test_transfer_archives_with_remote_path_calls_borg_with_remote_path_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args(
'remote-path', 'borg2'
).and_return(('--remote-path', 'borg2'))
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--remote-path', 'borg2', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
remote_path='borg2',
)
def test_transfer_archives_with_lock_wait_calls_borg_with_lock_wait_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('lock-wait', 5).and_return(
('--lock-wait', '5')
)
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
storage_config = {'lock_wait': 5}
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--lock-wait', '5', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config=storage_config,
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository=None),
)
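# These transfer arguments are expected to pass through to borg unchanged;
# make_flags_from_arguments() is mocked to emit the matching flag and value.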
@pytest.mark.parametrize('argument_name', ('upgrader', 'sort_by', 'first', 'last'))
def test_transfer_archives_passes_through_arguments_to_borg(argument_name):
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
flag_name = f"--{argument_name.replace('_', ' ')}"
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(
(flag_name, 'value')
)
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', flag_name, 'value', '--repo', 'repo'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(
archive=None, match_archives=None, source_repository=None, **{argument_name: 'value'}
),
)
def test_transfer_archives_with_source_repository_calls_borg_with_other_repo_flags():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.flags).should_receive('make_flags').and_return(())
flexmock(module.flags).should_receive('make_flags').with_args('other-repo', 'other').and_return(
('--other-repo', 'other')
)
flexmock(module.flags).should_receive('make_flags_from_arguments').and_return(())
flexmock(module.flags).should_receive('make_repository_flags').and_return(('--repo', 'repo'))
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
('borg', 'transfer', '--repo', 'repo', '--other-repo', 'other'),
output_log_level=module.borgmatic.logger.ANSWER,
borg_local_path='borg',
extra_environment=None,
)
module.transfer_archives(
dry_run=False,
repository='repo',
storage_config={},
local_borg_version='2.3.4',
transfer_arguments=flexmock(archive=None, match_archives=None, source_repository='other'),
)

@ -10,22 +10,24 @@ from ..test_verbosity import insert_logging_mock
VERSION = '1.2.3'
def insert_execute_command_mock(command, borg_local_path='borg', version_output=f'borg {VERSION}'):
def insert_execute_command_and_capture_output_mock(
command, borg_local_path='borg', version_output=f'borg {VERSION}'
):
flexmock(module.environment).should_receive('make_environment')
flexmock(module).should_receive('execute_command').with_args(
command, output_log_level=None, borg_local_path=borg_local_path, extra_environment=None,
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
command, extra_environment=None,
).once().and_return(version_output)
def test_local_borg_version_calls_borg_with_required_parameters():
insert_execute_command_mock(('borg', '--version'))
insert_execute_command_and_capture_output_mock(('borg', '--version'))
flexmock(module.environment).should_receive('make_environment')
assert module.local_borg_version({}) == VERSION
def test_local_borg_version_with_log_info_calls_borg_with_info_parameter():
insert_execute_command_mock(('borg', '--version', '--info'))
insert_execute_command_and_capture_output_mock(('borg', '--version', '--info'))
insert_logging_mock(logging.INFO)
flexmock(module.environment).should_receive('make_environment')
@ -33,7 +35,7 @@ def test_local_borg_version_with_log_info_calls_borg_with_info_parameter():
def test_local_borg_version_with_log_debug_calls_borg_with_debug_parameters():
insert_execute_command_mock(('borg', '--version', '--debug', '--show-rc'))
insert_execute_command_and_capture_output_mock(('borg', '--version', '--debug', '--show-rc'))
insert_logging_mock(logging.DEBUG)
flexmock(module.environment).should_receive('make_environment')
@ -41,14 +43,14 @@ def test_local_borg_version_with_log_debug_calls_borg_with_debug_parameters():
def test_local_borg_version_with_local_borg_path_calls_borg_with_it():
insert_execute_command_mock(('borg1', '--version'), borg_local_path='borg1')
insert_execute_command_and_capture_output_mock(('borg1', '--version'), borg_local_path='borg1')
flexmock(module.environment).should_receive('make_environment')
assert module.local_borg_version({}, 'borg1') == VERSION
def test_local_borg_version_with_invalid_version_raises():
insert_execute_command_mock(('borg', '--version'), version_output='wtf')
insert_execute_command_and_capture_output_mock(('borg', '--version'), version_output='wtf')
flexmock(module.environment).should_receive('make_environment')
with pytest.raises(ValueError):

@ -9,6 +9,7 @@ from borgmatic.commands import borgmatic as module
def test_run_configuration_runs_actions_for_each_repository():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
expected_results = [flexmock(), flexmock()]
flexmock(module).should_receive('run_actions').and_return(expected_results[:1]).and_return(
@ -23,6 +24,7 @@ def test_run_configuration_runs_actions_for_each_repository():
def test_run_configuration_with_invalid_borg_version_errors():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_raise(ValueError)
flexmock(module.command).should_receive('execute_hook').never()
flexmock(module.dispatch).should_receive('call_hooks').never()
@ -34,6 +36,7 @@ def test_run_configuration_with_invalid_borg_version_errors():
def test_run_configuration_logs_monitor_start_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_raise(OSError).and_return(
None
@ -50,6 +53,7 @@ def test_run_configuration_logs_monitor_start_error():
def test_run_configuration_bails_for_monitor_start_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.dispatch).should_receive('call_hooks').and_raise(error)
@ -64,6 +68,7 @@ def test_run_configuration_bails_for_monitor_start_soft_failure():
def test_run_configuration_logs_actions_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module.dispatch).should_receive('call_hooks')
@ -79,6 +84,7 @@ def test_run_configuration_logs_actions_error():
def test_run_configuration_bails_for_actions_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks')
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
@ -94,6 +100,7 @@ def test_run_configuration_bails_for_actions_soft_failure():
def test_run_configuration_logs_monitor_finish_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
None
@ -110,6 +117,7 @@ def test_run_configuration_logs_monitor_finish_error():
def test_run_configuration_bails_for_monitor_finish_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.dispatch).should_receive('call_hooks').and_return(None).and_return(
@ -127,6 +135,7 @@ def test_run_configuration_bails_for_monitor_finish_soft_failure():
def test_run_configuration_logs_on_error_hook_error():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook').and_raise(OSError)
expected_results = [flexmock(), flexmock()]
@ -143,6 +152,7 @@ def test_run_configuration_logs_on_error_hook_error():
def test_run_configuration_bails_for_on_error_hook_soft_failure():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
error = subprocess.CalledProcessError(borgmatic.hooks.command.SOFT_FAIL_EXIT_CODE, 'try again')
flexmock(module.command).should_receive('execute_hook').and_raise(error)
@ -159,6 +169,7 @@ def test_run_configuration_bails_for_on_error_hook_soft_failure():
def test_run_configuration_retries_soft_error():
# Run action first fails, second passes
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).and_return([])
@ -171,6 +182,7 @@ def test_run_configuration_retries_soft_error():
def test_run_configuration_retries_hard_error():
# Run action fails twice
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(2)
@ -190,7 +202,8 @@ def test_run_configuration_retries_hard_error():
assert results == error_logs
def test_run_repos_ordered():
def test_run_configuration_repos_ordered():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(2)
@ -208,6 +221,7 @@ def test_run_repos_ordered():
def test_run_configuration_retries_round_robbin():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(4)
@ -238,6 +252,7 @@ def test_run_configuration_retries_round_robbin():
def test_run_configuration_retries_one_passes():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).and_raise(OSError).and_return(
@ -266,6 +281,7 @@ def test_run_configuration_retries_one_passes():
def test_run_configuration_retry_wait():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).times(4)
@ -304,6 +320,7 @@ def test_run_configuration_retry_wait():
def test_run_configuration_retries_timeout_multiple_repos():
flexmock(module).should_receive('verbosity_to_log_level').and_return(logging.INFO)
flexmock(module.borg_version).should_receive('local_borg_version').and_return(flexmock())
flexmock(module.command).should_receive('execute_hook')
flexmock(module).should_receive('run_actions').and_raise(OSError).and_raise(OSError).and_return(
@ -340,12 +357,19 @@ def test_run_configuration_retries_timeout_multiple_repos():
assert results == error_logs
def test_run_actions_does_not_raise_for_init_action():
flexmock(module.borg_init).should_receive('initialize_repository')
def test_run_actions_does_not_raise_for_rcreate_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.borg_rcreate).should_receive('create_repository')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'init': flexmock(
encryption_mode=flexmock(), append_only=flexmock(), storage_quota=flexmock()
'rcreate': flexmock(
encryption_mode=flexmock(),
source_repository=flexmock(),
copy_crypt_key=flexmock(),
append_only=flexmock(),
storage_quota=flexmock(),
make_parent_dirs=flexmock(),
),
}
@ -366,12 +390,42 @@ def test_run_actions_does_not_raise_for_init_action():
)
def test_run_actions_calls_hooks_for_prune_action():
flexmock(module.borg_prune).should_receive('prune_archives')
flexmock(module.command).should_receive('execute_hook').twice()
def test_run_actions_does_not_raise_for_transfer_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.borg_transfer).should_receive('transfer_archives')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'prune': flexmock(stats=flexmock(), files=flexmock()),
'transfer': flexmock(),
}
list(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
repository_path='repo',
)
)
def test_run_actions_calls_hooks_for_prune_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.borg_prune).should_receive('prune_archives')
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'prune': flexmock(stats=flexmock(), list_archives=flexmock()),
}
list(
@ -392,9 +446,13 @@ def test_run_actions_calls_hooks_for_prune_action():
def test_run_actions_calls_hooks_for_compact_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.borg_feature).should_receive('available').and_return(True)
flexmock(module.borg_compact).should_receive('compact_segments')
flexmock(module.command).should_receive('execute_hook').twice()
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'compact': flexmock(progress=flexmock(), cleanup_commits=flexmock(), threshold=flexmock()),
@ -418,13 +476,18 @@ def test_run_actions_calls_hooks_for_compact_action():
def test_run_actions_executes_and_calls_hooks_for_create_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.borg_create).should_receive('create_archive')
flexmock(module.command).should_receive('execute_hook').twice()
flexmock(module.dispatch).should_receive('call_hooks').and_return({}).times(3)
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
flexmock(module.dispatch).should_receive('call_hooks').and_return({})
flexmock(module.dispatch).should_receive('call_hooks_even_if_unconfigured').and_return({})
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'create': flexmock(
progress=flexmock(), stats=flexmock(), json=flexmock(), files=flexmock()
progress=flexmock(), stats=flexmock(), json=flexmock(), list_files=flexmock()
),
}
@ -446,9 +509,13 @@ def test_run_actions_executes_and_calls_hooks_for_create_action():
def test_run_actions_calls_hooks_for_check_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.checks).should_receive('repository_enabled_for_checks').and_return(True)
flexmock(module.borg_check).should_receive('check_archives')
flexmock(module.command).should_receive('execute_hook').twice()
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'check': flexmock(
@ -474,9 +541,13 @@ def test_run_actions_calls_hooks_for_check_action():
def test_run_actions_calls_hooks_for_extract_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_extract).should_receive('extract_archive')
flexmock(module.command).should_receive('execute_hook').twice()
flexmock(module.command).should_receive('execute_hook').times(
4
) # Before/after extract and before/after actions.
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'extract': flexmock(
@ -507,6 +578,8 @@ def test_run_actions_calls_hooks_for_extract_action():
def test_run_actions_does_not_raise_for_export_tar_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_export_tar).should_receive('export_tar_archive')
arguments = {
@ -517,7 +590,7 @@ def test_run_actions_does_not_raise_for_export_tar_action():
paths=flexmock(),
destination=flexmock(),
tar_filter=flexmock(),
files=flexmock(),
list_files=flexmock(),
strip_components=flexmock(),
),
}
@ -540,6 +613,8 @@ def test_run_actions_does_not_raise_for_export_tar_action():
def test_run_actions_does_not_raise_for_mount_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_mount).should_receive('mount_archive')
arguments = {
@ -571,10 +646,39 @@ def test_run_actions_does_not_raise_for_mount_action():
)
def test_run_actions_does_not_raise_for_list_action():
def test_run_actions_does_not_raise_for_rlist_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_list).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_list).should_receive('list_archives')
flexmock(module.borg_rlist).should_receive('list_repository')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'rlist': flexmock(repository=flexmock(), json=flexmock()),
}
list(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_list_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_list).should_receive('list_archive')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'list': flexmock(repository=flexmock(), archive=flexmock(), json=flexmock()),
@ -597,9 +701,38 @@ def test_run_actions_does_not_raise_for_list_action():
)
def test_run_actions_does_not_raise_for_info_action():
def test_run_actions_does_not_raise_for_rinfo_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_list).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_rinfo).should_receive('display_repository_info')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'rinfo': flexmock(repository=flexmock(), json=flexmock()),
}
list(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_info_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_info).should_receive('display_archives_info')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
@ -623,9 +756,38 @@ def test_run_actions_does_not_raise_for_info_action():
)
def test_run_actions_does_not_raise_for_borg_action():
def test_run_actions_does_not_raise_for_break_lock_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_list).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_break_lock).should_receive('break_lock')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
'break-lock': flexmock(repository=flexmock()),
}
list(
module.run_actions(
arguments=arguments,
config_filename='test.yaml',
location={'repositories': ['repo']},
storage={},
retention={},
consistency={},
hooks={},
local_path=None,
remote_path=None,
local_borg_version=None,
repository_path='repo',
)
)
def test_run_actions_does_not_raise_for_borg_action():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logger).answer = lambda message: None
flexmock(module.validate).should_receive('repositories_match').and_return(True)
flexmock(module.borg_rlist).should_receive('resolve_archive_name').and_return(flexmock())
flexmock(module.borg_borg).should_receive('run_arbitrary_borg')
arguments = {
'global': flexmock(monitoring_verbosity=1, dry_run=False),
@ -649,17 +811,19 @@ def test_run_actions_does_not_raise_for_borg_action():
)
def test_load_configurations_collects_parsed_configurations():
def test_load_configurations_collects_parsed_configurations_and_logs():
configuration = flexmock()
other_configuration = flexmock()
test_expected_logs = [flexmock(), flexmock()]
other_expected_logs = [flexmock(), flexmock()]
flexmock(module.validate).should_receive('parse_configuration').and_return(
configuration
).and_return(other_configuration)
configuration, test_expected_logs
).and_return(other_configuration, other_expected_logs)
configs, logs = tuple(module.load_configurations(('test.yaml', 'other.yaml')))
assert configs == {'test.yaml': configuration, 'other.yaml': other_configuration}
assert logs == []
assert logs == test_expected_logs + other_expected_logs
def test_load_configurations_logs_warning_for_permission_error():
@ -746,6 +910,7 @@ def test_get_local_path_without_local_path_defaults_to_borg():
def test_collect_configuration_run_summary_logs_info_for_success():
flexmock(module.command).should_receive('execute_hook').never()
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return([])
arguments = {}
@ -757,6 +922,7 @@ def test_collect_configuration_run_summary_logs_info_for_success():
def test_collect_configuration_run_summary_executes_hooks_for_create():
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return([])
arguments = {'create': flexmock(), 'global': flexmock(monitoring_verbosity=1, dry_run=False)}
@ -768,6 +934,7 @@ def test_collect_configuration_run_summary_executes_hooks_for_create():
def test_collect_configuration_run_summary_logs_info_for_success_with_extract():
flexmock(module.validate).should_receive('guard_single_repository_selected')
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return([])
arguments = {'extract': flexmock(repository='repo')}
@ -795,6 +962,7 @@ def test_collect_configuration_run_summary_logs_extract_with_repository_error():
def test_collect_configuration_run_summary_logs_info_for_success_with_mount():
flexmock(module.validate).should_receive('guard_single_repository_selected')
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return([])
arguments = {'mount': flexmock(repository='repo')}
@ -846,6 +1014,7 @@ def test_collect_configuration_run_summary_logs_pre_hook_error():
def test_collect_configuration_run_summary_logs_post_hook_error():
flexmock(module.command).should_receive('execute_hook').and_return(None).and_raise(ValueError)
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return([])
expected_logs = (flexmock(),)
flexmock(module).should_receive('log_error_records').and_return(expected_logs)
@ -874,6 +1043,7 @@ def test_collect_configuration_run_summary_logs_for_list_with_archive_and_reposi
def test_collect_configuration_run_summary_logs_info_for_success_with_list():
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return([])
arguments = {'list': flexmock(repository='repo', archive=None)}
@ -916,6 +1086,7 @@ def test_collect_configuration_run_summary_logs_run_umount_error():
def test_collect_configuration_run_summary_logs_outputs_merged_json_results():
flexmock(module.validate).should_receive('guard_configuration_contains_repository')
flexmock(module).should_receive('run_configuration').and_return(['foo', 'bar']).and_return(
['baz']
)

@ -4,44 +4,99 @@ from borgmatic.config import normalize as module
@pytest.mark.parametrize(
'config,expected_config',
'config,expected_config,produces_logs',
(
(
{'location': {'exclude_if_present': '.nobackup'}},
{'location': {'exclude_if_present': ['.nobackup']}},
False,
),
(
{'location': {'exclude_if_present': ['.nobackup']}},
{'location': {'exclude_if_present': ['.nobackup']}},
False,
),
(
{'location': {'source_directories': ['foo', 'bar']}},
{'location': {'source_directories': ['foo', 'bar']}},
False,
),
({'storage': {'compression': 'yes_please'}}, {'storage': {'compression': 'yes_please'}}),
({'location': None}, {'location': None}, False,),
(
{'storage': {'compression': 'yes_please'}},
{'storage': {'compression': 'yes_please'}},
False,
),
({'storage': None}, {'storage': None}, False,),
(
{'hooks': {'healthchecks': 'https://example.com'}},
{'hooks': {'healthchecks': {'ping_url': 'https://example.com'}}},
False,
),
(
{'hooks': {'cronitor': 'https://example.com'}},
{'hooks': {'cronitor': {'ping_url': 'https://example.com'}}},
False,
),
(
{'hooks': {'pagerduty': 'https://example.com'}},
{'hooks': {'pagerduty': {'integration_key': 'https://example.com'}}},
False,
),
(
{'hooks': {'cronhub': 'https://example.com'}},
{'hooks': {'cronhub': {'ping_url': 'https://example.com'}}},
False,
),
({'hooks': None}, {'hooks': None}, False,),
(
{'consistency': {'checks': ['archives']}},
{'consistency': {'checks': [{'name': 'archives'}]}},
False,
),
(
{'consistency': {'checks': ['archives']}},
{'consistency': {'checks': [{'name': 'archives'}]}},
False,
),
({'consistency': None}, {'consistency': None}, False,),
({'location': {'numeric_owner': False}}, {'location': {'numeric_ids': False}}, False,),
({'location': {'bsd_flags': False}}, {'location': {'flags': False}}, False,),
(
{'storage': {'remote_rate_limit': False}},
{'storage': {'upload_rate_limit': False}},
False,
),
(
{'location': {'repositories': ['foo@bar:/repo']}},
{'location': {'repositories': ['ssh://foo@bar/repo']}},
True,
),
(
{'location': {'repositories': ['foo@bar:repo']}},
{'location': {'repositories': ['ssh://foo@bar/./repo']}},
True,
),
(
{'location': {'repositories': ['foo@bar:~/repo']}},
{'location': {'repositories': ['ssh://foo@bar/~/repo']}},
True,
),
(
{'location': {'repositories': ['ssh://foo@bar:1234/repo']}},
{'location': {'repositories': ['ssh://foo@bar:1234/repo']}},
False,
),
),
)
def test_normalize_applies_hard_coded_normalization_to_config(config, expected_config):
module.normalize(config)
def test_normalize_applies_hard_coded_normalization_to_config(
config, expected_config, produces_logs
):
logs = module.normalize('test.yaml', config)
assert config == expected_config
if produces_logs:
assert logs
else:
assert logs == []

@ -120,14 +120,6 @@ def test_guard_configuration_contains_repository_does_not_raise_when_repository_
)
def test_guard_configuration_contains_repository_errors_when_repository_assumed_to_match_config_twice():
with pytest.raises(ValueError):
module.guard_configuration_contains_repository(
repository=None,
configurations={'config.yaml': {'location': {'repositories': ['repo', 'repo2']}}},
)
def test_guard_configuration_contains_repository_errors_when_repository_missing_from_config():
flexmock(module).should_receive('repositories_match').replace_with(
lambda first, second: first == second
@ -153,3 +145,30 @@ def test_guard_configuration_contains_repository_errors_when_repository_matches_
'other.yaml': {'location': {'repositories': ['repo']}},
},
)
def test_guard_single_repository_selected_raises_when_multiple_repositories_configured_and_none_selected():
with pytest.raises(ValueError):
module.guard_single_repository_selected(
repository=None,
configurations={'config.yaml': {'location': {'repositories': ['repo', 'repo2']}}},
)
def test_guard_single_repository_selected_does_not_raise_when_single_repository_configured_and_none_selected():
module.guard_single_repository_selected(
repository=None, configurations={'config.yaml': {'location': {'repositories': ['repo']}}},
)
def test_guard_single_repository_selected_does_not_raise_when_no_repositories_configured_and_one_selected():
module.guard_single_repository_selected(
repository='repo', configurations={'config.yaml': {'location': {'repositories': []}}},
)
def test_guard_single_repository_selected_does_not_raise_when_repositories_configured_and_one_selected():
module.guard_single_repository_selected(
repository='repo',
configurations={'config.yaml': {'location': {'repositories': ['repo', 'repo2']}}},
)

@ -27,13 +27,18 @@ def test_call_hook_invokes_module_function_with_arguments_and_returns_value():
assert return_value == expected_return_value
def test_call_hook_without_hook_config_skips_call():
def test_call_hook_without_hook_config_invokes_module_function_with_arguments_and_returns_value():
hooks = {'other_hook': flexmock()}
expected_return_value = flexmock()
test_module = sys.modules[__name__]
flexmock(module).HOOK_NAME_TO_MODULE = {'super_hook': test_module}
flexmock(test_module).should_receive('hook_function').never()
flexmock(test_module).should_receive('hook_function').with_args(
{}, 'prefix', 55, value=66
).and_return(expected_return_value).once()
module.call_hook('hook_function', hooks, 'prefix', 'super_hook', 55, value=66)
return_value = module.call_hook('hook_function', hooks, 'prefix', 'super_hook', 55, value=66)
assert return_value == expected_return_value
def test_call_hook_without_corresponding_module_raises():
@@ -76,3 +81,31 @@ def test_call_hooks_calls_skips_return_values_for_null_hooks():
return_values = module.call_hooks('do_stuff', hooks, 'prefix', ('super_hook', 'other_hook'), 55)
assert return_values == expected_return_values
def test_call_hooks_even_if_unconfigured_calls_each_hook_and_collects_return_values():
hooks = {'super_hook': flexmock(), 'other_hook': flexmock()}
expected_return_values = {'super_hook': flexmock(), 'other_hook': flexmock()}
flexmock(module).should_receive('call_hook').and_return(
expected_return_values['super_hook']
).and_return(expected_return_values['other_hook'])
return_values = module.call_hooks_even_if_unconfigured(
'do_stuff', hooks, 'prefix', ('super_hook', 'other_hook'), 55
)
assert return_values == expected_return_values
def test_call_hooks_even_if_unconfigured_calls_each_hook_configured_or_not_and_collects_return_values():
hooks = {'other_hook': flexmock()}
expected_return_values = {'super_hook': flexmock(), 'other_hook': flexmock()}
flexmock(module).should_receive('call_hook').and_return(
expected_return_values['super_hook']
).and_return(expected_return_values['other_hook'])
return_values = module.call_hooks_even_if_unconfigured(
'do_stuff', hooks, 'prefix', ('super_hook', 'other_hook'), 55
)
assert return_values == expected_return_values
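
These dispatch tests capture a behavior change: a hook that isn't configured is no longer skipped; call_hook() invokes the hook module with an empty config dict, and call_hooks_even_if_unconfigured() fans out to every named hook. A rough sketch under assumed names (in the real dispatcher, HOOK_NAME_TO_MODULE maps hook names to actual hook modules):

HOOK_NAME_TO_MODULE = {}  # hook name -> hook module; populated by the real dispatcher

def call_hook(function_name, hooks, log_prefix, hook_name, *args, **kwargs):
    # Fall back to an empty config so unconfigured hooks still get called.
    hook_config = hooks.get(hook_name) or {}

    try:
        module = HOOK_NAME_TO_MODULE[hook_name]
    except KeyError:
        raise ValueError(f'Unknown hook name: {hook_name}')

    return getattr(module, function_name)(hook_config, log_prefix, *args, **kwargs)

def call_hooks_even_if_unconfigured(function_name, hooks, log_prefix, hook_names, *args, **kwargs):
    # Collect each hook's return value keyed by hook name, configured or not.
    return {
        hook_name: call_hook(function_name, hooks, log_prefix, hook_name, *args, **kwargs)
        for hook_name in hook_names
    }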

View File

@@ -138,7 +138,7 @@ def test_ping_monitor_hits_ping_url_for_start_state():
flexmock(module).should_receive('Forgetful_buffering_handler')
hook_config = {'ping_url': 'https://example.com'}
flexmock(module.requests).should_receive('post').with_args(
'https://example.com/start', data=''.encode('utf-8')
'https://example.com/start', data=''.encode('utf-8'), verify=True
).and_return(flexmock(ok=True))
module.ping_monitor(
@@ -155,7 +155,7 @@ def test_ping_monitor_hits_ping_url_for_finish_state():
payload = 'data'
flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
flexmock(module.requests).should_receive('post').with_args(
'https://example.com', data=payload.encode('utf-8')
'https://example.com', data=payload.encode('utf-8'), verify=True
).and_return(flexmock(ok=True))
module.ping_monitor(
@@ -172,7 +172,7 @@ def test_ping_monitor_hits_ping_url_for_fail_state():
payload = 'data'
flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
flexmock(module.requests).should_receive('post').with_args(
'https://example.com/fail', data=payload.encode('utf')
'https://example.com/fail', data=payload.encode('utf'), verify=True
).and_return(flexmock(ok=True))
module.ping_monitor(
@@ -189,7 +189,43 @@ def test_ping_monitor_with_ping_uuid_hits_corresponding_url():
payload = 'data'
flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
flexmock(module.requests).should_receive('post').with_args(
'https://hc-ping.com/{}'.format(hook_config['ping_url']), data=payload.encode('utf-8')
'https://hc-ping.com/{}'.format(hook_config['ping_url']),
data=payload.encode('utf-8'),
verify=True,
).and_return(flexmock(ok=True))
module.ping_monitor(
hook_config,
'config.yaml',
state=module.monitor.State.FINISH,
monitoring_log_level=1,
dry_run=False,
)
def test_ping_monitor_skips_ssl_verification_when_verify_tls_false():
hook_config = {'ping_url': 'https://example.com', 'verify_tls': False}
payload = 'data'
flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
flexmock(module.requests).should_receive('post').with_args(
'https://example.com', data=payload.encode('utf-8'), verify=False
).and_return(flexmock(ok=True))
module.ping_monitor(
hook_config,
'config.yaml',
state=module.monitor.State.FINISH,
monitoring_log_level=1,
dry_run=False,
)
def test_ping_monitor_executes_ssl_verification_when_verify_tls_true():
hook_config = {'ping_url': 'https://example.com', 'verify_tls': True}
payload = 'data'
flexmock(module).should_receive('format_buffered_logs_for_payload').and_return(payload)
flexmock(module.requests).should_receive('post').with_args(
'https://example.com', data=payload.encode('utf-8'), verify=True
).and_return(flexmock(ok=True))
module.ping_monitor(
@@ -233,7 +269,7 @@ def test_ping_monitor_hits_ping_url_when_states_matching():
flexmock(module).should_receive('Forgetful_buffering_handler')
hook_config = {'ping_url': 'https://example.com', 'states': ['start', 'finish']}
flexmock(module.requests).should_receive('post').with_args(
'https://example.com/start', data=''.encode('utf-8')
'https://example.com/start', data=''.encode('utf-8'), verify=True
).and_return(flexmock(ok=True))
module.ping_monitor(
@@ -249,7 +285,7 @@ def test_ping_monitor_with_connection_error_logs_warning():
flexmock(module).should_receive('Forgetful_buffering_handler')
hook_config = {'ping_url': 'https://example.com'}
flexmock(module.requests).should_receive('post').with_args(
'https://example.com/start', data=''.encode('utf-8')
'https://example.com/start', data=''.encode('utf-8'), verify=True
).and_raise(module.requests.exceptions.ConnectionError)
flexmock(module.logger).should_receive('warning').once()
@@ -270,7 +306,7 @@ def test_ping_monitor_with_other_error_logs_warning():
module.requests.exceptions.RequestException
)
flexmock(module.requests).should_receive('post').with_args(
'https://example.com/start', data=''.encode('utf-8')
'https://example.com/start', data=''.encode('utf-8'), verify=True
).and_return(response)
flexmock(module.logger).should_receive('warning').once()
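
Every post() expectation in these Healthchecks tests now includes a verify argument, driven by the new verify_tls option (defaulting to true). A simplified sketch of how the ping might pass it through; the signature and URL handling here are condensed relative to the real hook and are assumptions, not its literal code.

import logging
import requests

def ping_monitor(hook_config, config_filename, state_name, payload, dry_run=False):
    ping_url = hook_config['ping_url']
    if not ping_url.startswith('http'):
        # A bare UUID is expanded to the hosted Healthchecks endpoint.
        ping_url = f'https://hc-ping.com/{ping_url}'
    if state_name != 'finish':
        ping_url = f'{ping_url}/{state_name}'

    if dry_run:
        return

    response = requests.post(
        ping_url,
        data=payload.encode('utf-8'),
        # Skip TLS verification only when the user explicitly sets verify_tls: false.
        verify=hook_config.get('verify_tls', True),
    )
    if not response.ok:
        logging.getLogger(__name__).warning(
            f'{config_filename}: Healthchecks error: {response.status_code}'
        )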

View File

@@ -22,9 +22,8 @@ def test_database_names_to_dump_queries_mysql_for_database_names():
extra_environment = flexmock()
log_prefix = ''
dry_run_label = ''
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
('mysql', '--skip-column-names', '--batch', '--execute', 'show schemas'),
output_log_level=None,
extra_environment=extra_environment,
).and_return('foo\nbar\nmysql\n').once()
@@ -200,7 +199,7 @@ def test_dump_databases_runs_mysqldump_for_all_databases():
def test_database_names_to_dump_runs_mysql_with_list_options():
database = {'name': 'all', 'list_options': '--defaults-extra-file=my.cnf'}
flexmock(module).should_receive('execute_command').with_args(
flexmock(module).should_receive('execute_command_and_capture_output').with_args(
(
'mysql',
'--defaults-extra-file=my.cnf',
@@ -209,7 +208,6 @@ def test_database_names_to_dump_runs_mysql_with_list_options():
'--execute',
'show schemas',
),
output_log_level=None,
extra_environment=None,
).and_return(('foo\nbar')).once()
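
These MySQL hunks track the execute changes further down: listing database names now goes through execute_command_and_capture_output(), which returns stdout directly, instead of execute_command() with output_log_level=None. Approximately as follows, assuming the helper lives in borgmatic.execute as the execute tests below suggest; the system-schema filtering is an assumption on my part.

from borgmatic.execute import execute_command_and_capture_output

SYSTEM_DATABASE_NAMES = ('information_schema', 'mysql', 'performance_schema', 'sys')

def database_names_to_dump(database, extra_environment, log_prefix, dry_run_label):
    if database['name'] != 'all':
        return (database['name'],)

    # Ask the server for all schema names, honoring any extra list options.
    show_command = (
        ('mysql',)
        + (tuple(database['list_options'].split(' ')) if 'list_options' in database else ())
        + ('--skip-column-names', '--batch', '--execute', 'show schemas')
    )
    output = execute_command_and_capture_output(
        show_command, extra_environment=extra_environment
    )

    return tuple(
        name for name in output.splitlines() if name and name not in SYSTEM_DATABASE_NAMES
    )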

View File

@@ -6,15 +6,57 @@ from flexmock import flexmock
from borgmatic.hooks import postgresql as module
def test_database_names_to_dump_passes_through_individual_database_name():
database = {'name': 'foo'}
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('foo',)
def test_database_names_to_dump_passes_through_individual_database_name_with_format():
database = {'name': 'foo', 'format': 'custom'}
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('foo',)
def test_database_names_to_dump_passes_through_all_without_format():
database = {'name': 'all'}
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('all',)
def test_database_names_to_dump_with_all_and_format_lists_databases():
database = {'name': 'all', 'format': 'custom'}
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'foo,test,\nbar,test,"stuff and such"'
)
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == (
'foo',
'bar',
)
def test_database_names_to_dump_with_all_and_format_excludes_particular_databases():
database = {'name': 'all', 'format': 'custom'}
flexmock(module).should_receive('execute_command_and_capture_output').and_return(
'foo,test,\ntemplate0,test,blah'
)
assert module.database_names_to_dump(database, flexmock(), flexmock(), flexmock()) == ('foo',)
def test_dump_databases_runs_pg_dump_for_each_database():
databases = [{'name': 'foo'}, {'name': 'bar'}]
processes = [flexmock(), flexmock()]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
for name, process in zip(('foo', 'bar'), processes):
flexmock(module).should_receive('execute_command').with_args(
@@ -37,14 +79,27 @@ def test_dump_databases_runs_pg_dump_for_each_database():
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == processes
def test_dump_databases_runs_raises_when_no_database_names_to_dump():
databases = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(())
with pytest.raises(ValueError):
module.dump_databases(databases, 'test.yaml', {}, dry_run=False)
def test_dump_databases_with_dry_run_skips_pg_dump():
databases = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',)).and_return(
('bar',)
)
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
).and_return('databases/localhost/bar')
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').never()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=True) == []
@@ -53,12 +108,13 @@ def test_dump_databases_with_dry_run_skips_pg_dump():
def test_dump_databases_runs_pg_dump_with_hostname_and_port():
databases = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/database.example.org/foo'
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
(
@@ -87,14 +143,15 @@ def test_dump_databases_runs_pg_dump_with_hostname_and_port():
def test_dump_databases_runs_pg_dump_with_username_and_password():
databases = [{'name': 'foo', 'username': 'postgres', 'password': 'trustsome1'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return(
{'PGPASSWORD': 'trustsome1', 'PGSSLMODE': 'disable'}
)
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return(
{'PGPASSWORD': 'trustsome1', 'PGSSLMODE': 'disable'}
)
flexmock(module).should_receive('execute_command').with_args(
(
@@ -144,13 +201,14 @@ def test_make_extra_environment_maps_options_to_environment():
def test_dump_databases_runs_pg_dump_with_directory_format():
databases = [{'name': 'foo', 'format': 'directory'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.dump).should_receive('create_parent_directory_for_dump')
flexmock(module.dump).should_receive('create_named_pipe_for_dump').never()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
(
@@ -175,12 +233,13 @@ def test_dump_databases_runs_pg_dump_with_directory_format():
def test_dump_databases_runs_pg_dump_with_options():
databases = [{'name': 'foo', 'options': '--stuff=such'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
(
@@ -206,12 +265,13 @@ def test_dump_databases_runs_pg_dump_with_options():
def test_dump_databases_runs_pg_dumpall_for_all_databases():
databases = [{'name': 'all'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('all',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/all'
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command').with_args(
('pg_dumpall', '--no-password', '--clean', '--if-exists', '>', 'databases/localhost/all'),
@@ -223,13 +283,44 @@ def test_dump_databases_runs_pg_dumpall_for_all_databases():
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_dump_databases_runs_non_default_pg_dump():
databases = [{'name': 'foo', 'pg_dump_command': 'special_pg_dump'}]
process = flexmock()
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path').and_return('')
flexmock(module).should_receive('database_names_to_dump').and_return(('foo',))
flexmock(module.dump).should_receive('make_database_dump_filename').and_return(
'databases/localhost/foo'
)
flexmock(module.dump).should_receive('create_named_pipe_for_dump')
flexmock(module).should_receive('execute_command').with_args(
(
'special_pg_dump',
'--no-password',
'--clean',
'--if-exists',
'--format',
'custom',
'foo',
'>',
'databases/localhost/foo',
),
shell=True,
extra_environment={'PGSSLMODE': 'disable'},
run_to_completion=False,
).and_return(process).once()
assert module.dump_databases(databases, 'test.yaml', {}, dry_run=False) == [process]
def test_restore_database_dump_runs_pg_restore():
database_config = [{'name': 'foo'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
@@ -258,9 +349,9 @@ def test_restore_database_dump_runs_pg_restore():
def test_restore_database_dump_errors_on_multiple_database_config():
database_config = [{'name': 'foo'}, {'name': 'bar'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').never()
flexmock(module).should_receive('execute_command').never()
@@ -274,9 +365,9 @@ def test_restore_database_dump_runs_pg_restore_with_hostname_and_port():
database_config = [{'name': 'foo', 'hostname': 'database.example.org', 'port': 5433}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
@@ -322,11 +413,11 @@ def test_restore_database_dump_runs_pg_restore_with_username_and_password():
database_config = [{'name': 'foo', 'username': 'postgres', 'password': 'trustsome1'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return(
{'PGPASSWORD': 'trustsome1', 'PGSSLMODE': 'disable'}
)
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
@@ -368,9 +459,9 @@ def test_restore_database_dump_runs_psql_for_all_database_dump():
database_config = [{'name': 'all'}]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
('psql', '--no-password'),
processes=[extract_process],
@@ -388,12 +479,46 @@ def test_restore_database_dump_runs_psql_for_all_database_dump():
)
def test_restore_database_dump_runs_non_default_pg_restore_and_psql():
database_config = [
{'name': 'foo', 'pg_restore_command': 'special_pg_restore', 'psql_command': 'special_psql'}
]
extract_process = flexmock(stdout=flexmock())
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'special_pg_restore',
'--no-password',
'--if-exists',
'--exit-on-error',
'--clean',
'--dbname',
'foo',
),
processes=[extract_process],
output_log_level=logging.DEBUG,
input_file=extract_process.stdout,
extra_environment={'PGSSLMODE': 'disable'},
).once()
flexmock(module).should_receive('execute_command').with_args(
('special_psql', '--no-password', '--quiet', '--dbname', 'foo', '--command', 'ANALYZE'),
extra_environment={'PGSSLMODE': 'disable'},
).once()
module.restore_database_dump(
database_config, 'test.yaml', {}, dry_run=False, extract_process=extract_process
)
def test_restore_database_dump_with_dry_run_skips_restore():
database_config = [{'name': 'foo'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').never()
module.restore_database_dump(
@@ -404,9 +529,9 @@ def test_restore_database_dump_with_dry_run_skips_restore():
def test_restore_database_dump_without_extract_process_restores_from_disk():
database_config = [{'name': 'foo'}]
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('make_dump_path')
flexmock(module.dump).should_receive('make_database_dump_filename').and_return('/dump/path')
flexmock(module).should_receive('make_extra_environment').and_return({'PGSSLMODE': 'disable'})
flexmock(module).should_receive('execute_command_with_processes').with_args(
(
'pg_restore',
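
The big addition in these PostgreSQL hunks is dumping "all" databases to separate files: when a non-default format is requested for "all", database_names_to_dump() asks psql for the database names and skips template databases, and dump_databases() raises when nothing remains to dump. A loose sketch of the name-listing part follows; the psql flags and CSV parsing are assumptions that merely match the mocked output above, not the exact borgmatic command, and the helper is assumed to come from borgmatic.execute.

from borgmatic.execute import execute_command_and_capture_output

def database_names_to_dump(database, extra_environment, log_prefix, dry_run_label):
    requested_name = database['name']

    if requested_name != 'all':
        return (requested_name,)
    if 'format' not in database:
        return ('all',)  # single combined dump via pg_dumpall, as before

    # List databases as CSV so each one can get its own per-database dump file.
    list_command = (
        ('psql', '--list', '--no-password', '--csv', '--tuples-only')
        + (('--host', database['hostname']) if 'hostname' in database else ())
        + (('--port', str(database['port'])) if 'port' in database else ())
        + (('--username', database['username']) if 'username' in database else ())
    )
    csv_output = execute_command_and_capture_output(
        list_command, extra_environment=extra_environment
    )

    # Each CSV row starts with the database name; skip template databases.
    return tuple(
        row.split(',')[0]
        for row in csv_output.splitlines()
        if row and not row.startswith('template')
    )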

View File

@@ -213,57 +213,70 @@ def test_execute_command_without_run_to_completion_returns_process():
assert module.execute_command(full_command, run_to_completion=False) == process
def test_execute_command_captures_output():
def test_execute_command_and_capture_output_returns_stdout():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command, shell=False, env=None, cwd=None
full_command, stderr=None, shell=False, env=None, cwd=None
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(full_command, output_log_level=None)
output = module.execute_command_and_capture_output(full_command)
assert output == expected_output
def test_execute_command_captures_output_with_shell():
def test_execute_command_and_capture_output_with_capture_stderr_returns_stderr():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
'foo bar', shell=True, env=None, cwd=None
full_command, stderr=module.subprocess.STDOUT, shell=False, env=None, cwd=None
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(full_command, output_log_level=None, shell=True)
output = module.execute_command_and_capture_output(full_command, capture_stderr=True)
assert output == expected_output
def test_execute_command_captures_output_with_extra_environment():
def test_execute_command_and_capture_output_returns_output_with_shell():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command, shell=False, env={'a': 'b', 'c': 'd'}, cwd=None
'foo bar', stderr=None, shell=True, env=None, cwd=None
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(
full_command, output_log_level=None, shell=False, extra_environment={'c': 'd'}
output = module.execute_command_and_capture_output(full_command, shell=True)
assert output == expected_output
def test_execute_command_and_capture_output_returns_output_with_extra_environment():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command, stderr=None, shell=False, env={'a': 'b', 'c': 'd'}, cwd=None,
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command_and_capture_output(
full_command, shell=False, extra_environment={'c': 'd'}
)
assert output == expected_output
def test_execute_command_captures_output_with_working_directory():
def test_execute_command_and_capture_output_returns_output_with_working_directory():
full_command = ['foo', 'bar']
expected_output = '[]'
flexmock(module.os, environ={'a': 'b'})
flexmock(module.subprocess).should_receive('check_output').with_args(
full_command, shell=False, env=None, cwd='/working'
full_command, stderr=None, shell=False, env=None, cwd='/working'
).and_return(flexmock(decode=lambda: expected_output)).once()
output = module.execute_command(
full_command, output_log_level=None, shell=False, working_directory='/working'
output = module.execute_command_and_capture_output(
full_command, shell=False, working_directory='/working'
)
assert output == expected_output
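
These execute tests spell out the new helper's contract: capture stdout (and optionally stderr), honor shell, extra_environment, and working_directory, and return the decoded output. Reading the expectations yields roughly the following shape, which is an approximation rather than the literal borgmatic code.

import os
import subprocess

def execute_command_and_capture_output(
    full_command,
    capture_stderr=False,
    shell=False,
    extra_environment=None,
    working_directory=None,
):
    # Only build a merged environment when extra variables were given; otherwise
    # pass env=None, as the expectations above require.
    environment = {**os.environ, **extra_environment} if extra_environment else None
    command = ' '.join(full_command) if shell else full_command

    output = subprocess.check_output(
        command,
        stderr=subprocess.STDOUT if capture_stderr else None,
        shell=shell,
        env=environment,
        cwd=working_directory,
    )

    return output.decode()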

View File

@@ -1,4 +1,5 @@
import logging
import sys
import pytest
from flexmock import flexmock
@@ -125,6 +126,8 @@ def test_multi_stream_handler_logs_to_handler_for_log_level():
def test_console_color_formatter_format_includes_log_message():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
plain_message = 'uh oh'
record = flexmock(levelno=logging.CRITICAL, msg=plain_message)
@@ -142,7 +145,38 @@ def test_color_text_without_color_does_not_raise():
module.color_text(None, 'hi')
def test_add_logging_level_adds_level_name_and_sets_global_attributes_and_methods():
logger = flexmock()
flexmock(module.logging).should_receive('getLoggerClass').and_return(logger)
flexmock(module.logging).should_receive('addLevelName').with_args(99, 'PLAID')
builtins = flexmock(sys.modules['builtins'])
builtins.should_call('setattr')
builtins.should_receive('setattr').with_args(module.logging, 'PLAID', 99).once()
builtins.should_receive('setattr').with_args(logger, 'plaid', object).once()
builtins.should_receive('setattr').with_args(logging, 'plaid', object).once()
module.add_logging_level('PLAID', 99)
def test_add_logging_level_skips_global_setting_if_already_set():
logger = flexmock()
flexmock(module.logging).should_receive('getLoggerClass').and_return(logger)
flexmock(module.logging).PLAID = 99
flexmock(logger).plaid = flexmock()
flexmock(logging).plaid = flexmock()
flexmock(module.logging).should_receive('addLevelName').never()
builtins = flexmock(sys.modules['builtins'])
builtins.should_call('setattr')
builtins.should_receive('setattr').with_args(module.logging, 'PLAID', 99).never()
builtins.should_receive('setattr').with_args(logger, 'plaid', object).never()
builtins.should_receive('setattr').with_args(logging, 'plaid', object).never()
module.add_logging_level('PLAID', 99)
def test_configure_logging_probes_for_log_socket_on_linux():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -161,6 +195,8 @@ def test_configure_logging_probes_for_log_socket_on_linux():
def test_configure_logging_probes_for_log_socket_on_macos():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -180,6 +216,8 @@ def test_configure_logging_probes_for_log_socket_on_macos():
def test_configure_logging_probes_for_log_socket_on_freebsd():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -200,6 +238,8 @@ def test_configure_logging_probes_for_log_socket_on_freebsd():
def test_configure_logging_sets_global_logger_to_most_verbose_log_level():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -213,6 +253,8 @@ def test_configure_logging_sets_global_logger_to_most_verbose_log_level():
def test_configure_logging_skips_syslog_if_not_found():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -227,6 +269,8 @@ def test_configure_logging_skips_syslog_if_not_found():
def test_configure_logging_skips_syslog_if_interactive_console():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -242,6 +286,8 @@ def test_configure_logging_skips_syslog_if_interactive_console():
def test_configure_logging_to_logfile_instead_of_syslog():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
@@ -264,6 +310,8 @@ def test_configure_logging_to_logfile_instead_of_syslog():
def test_configure_logging_skips_logfile_if_argument_is_none():
flexmock(module).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.ANSWER
flexmock(module).should_receive('Multi_stream_handler').and_return(
flexmock(setFormatter=lambda formatter: None, setLevel=lambda level: None)
)
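
The new logger tests revolve around add_logging_level(), which registers the custom ANSWER level used by configure_logging(). The standard recipe below is consistent with the setattr expectations above (add the level name, a logger method, and a module-level function, skipping anything already set), though it is a sketch rather than a guaranteed copy of the code.

import logging

def add_logging_level(level_name, level_number):
    method_name = level_name.lower()

    def log_for_level(self, message, *args, **kwargs):
        if self.isEnabledFor(level_number):
            self._log(level_number, message, args, **kwargs)

    def log_to_root(message, *args, **kwargs):
        logging.log(level_number, message, *args, **kwargs)

    if not hasattr(logging, level_name):
        logging.addLevelName(level_number, level_name)
        setattr(logging, level_name, level_number)

    if not hasattr(logging.getLoggerClass(), method_name):
        setattr(logging.getLoggerClass(), method_name, log_for_level)

    if not hasattr(logging, method_name):
        setattr(logging, method_name, log_to_root)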

View File

@@ -15,10 +15,17 @@ def insert_logging_mock(log_level):
def test_verbosity_to_log_level_maps_known_verbosity_to_log_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
assert module.verbosity_to_log_level(module.VERBOSITY_ERROR) == logging.ERROR
assert module.verbosity_to_log_level(module.VERBOSITY_ANSWER) == module.borgmatic.logger.ANSWER
assert module.verbosity_to_log_level(module.VERBOSITY_SOME) == logging.INFO
assert module.verbosity_to_log_level(module.VERBOSITY_LOTS) == logging.DEBUG
assert module.verbosity_to_log_level(module.VERBOSITY_ERROR) == logging.ERROR
def test_verbosity_to_log_level_maps_unknown_verbosity_to_warning_level():
flexmock(module.borgmatic.logger).should_receive('add_custom_log_levels')
flexmock(module.logging).ANSWER = module.borgmatic.logger.ANSWER
assert module.verbosity_to_log_level('my pants') == logging.WARNING
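
The verbosity mapping now includes the ANSWER level between WARNING and INFO, with unknown values falling back to WARNING. A sketch of the mapping these assertions describe; the constant values (and ANSWER's numeric slot) are assumptions.

import logging

VERBOSITY_ERROR = -1   # assumed values for the verbosity constants
VERBOSITY_ANSWER = 0
VERBOSITY_SOME = 1
VERBOSITY_LOTS = 2

ANSWER = logging.WARNING - 5  # assumed numeric value for the custom ANSWER level

def verbosity_to_log_level(verbosity):
    return {
        VERBOSITY_ERROR: logging.ERROR,
        VERBOSITY_ANSWER: ANSWER,
        VERBOSITY_SOME: logging.INFO,
        VERBOSITY_LOTS: logging.DEBUG,
    }.get(verbosity, logging.WARNING)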