Compare commits

...

166 Commits

Author SHA1 Message Date
Dan Helfman 08843d51d9 Replace "sequence" with "list" in docs for consistency. 2023-04-12 10:30:23 -07:00
Dan Helfman ea9213cb03 Spelling. 2023-04-11 22:12:57 -07:00
Dan Helfman 1ea4433aa9 Selectively shallow merge certain mappings or sequences when including configuration files (#672). 2023-04-11 21:49:10 -07:00
Dan Helfman 4c0e2cab78 View the results of configuration file merging via "validate-borgmatic-config --show" flag (#673). 2023-04-11 10:49:09 -07:00
Dan Helfman 31a2ac914a Add optional support for running end-to-end tests and building documentation with rootless Podman instead of Docker. 2023-04-10 14:26:54 -07:00
Dan Helfman d6ef0df50d Mention #670 being fixed in NEWS. 2023-04-09 10:01:08 -07:00
Dan Helfman cc60a71210 Clarify "log_file" NEWS (#413). 2023-04-06 14:12:12 -07:00
Dan Helfman 4cd7556a34 Add "log_file" command hook context to NEWS and docs (#413). 2023-04-06 13:58:37 -07:00
Dan Helfman b4b1fa939d
feat: add logfile name to hook context for interpolation
Merge pull request #68 from diivi/feat/add-log-filename-to-hook-context
2023-04-06 13:46:45 -07:00
Divyansh Singh 16d7131fb7 refactor tests 2023-04-07 01:00:38 +05:30
Divyansh Singh 091d60c226 refactor and improve tests 2023-04-06 12:36:10 +05:30
Divyansh Singh 0fbdf8d860 feat: add logfile name to hook context for interpolation 2023-04-06 09:31:24 +05:30
Dan Helfman 192bfe46a9 Fix error when running the "prune" action with both "archive_name_format" and "prefix" options set (#668). 2023-04-05 14:58:05 -07:00
Dan Helfman 080c3afa0d Fix documentation referring to "archive_name_format" in wrong configuration section. 2023-04-05 14:00:21 -07:00
Dan Helfman a9a65ebe54 Fix integration tests to actually assert (#666). 2023-04-04 22:11:36 -07:00
Dan Helfman 616eb6b6da Fix error with "info --match-archives" and fix "--match-archives" overriding logic (#666). 2023-04-04 21:25:10 -07:00
Dan Helfman 00d1dea94e Bump version for release. 2023-04-03 16:11:25 -07:00
Dan Helfman 127ad1dd1f
Add favicon to documentation.
Merge pull request #66 from diivi/add-favicon
2023-04-03 10:22:12 -07:00
Divyansh Singh fc58ba5763 add favicon to documentation 2023-04-03 17:36:24 +05:30
Dan Helfman 7e6bee84b0 Add "--log-file-format" flag for customizing the log message format (#658). 2023-04-02 23:06:36 -07:00
Dan Helfman 01811e03ba Tagged the auto-matching archive behavior as breaking in NEWS. 2023-04-02 14:38:35 -07:00
Dan Helfman 9712d00680 Add "match_archives" option (#588). 2023-04-01 23:57:55 -07:00
Dan Helfman 275e99d0b9 Add codespell link to documentation. 2023-04-01 14:38:52 -07:00
Dan Helfman b9328e6d42 Add spellchecking of source code to NEWS. 2023-04-01 14:09:48 -07:00
Dan Helfman 2934d0902c Code spell checking on every test run! 2023-04-01 11:03:59 -07:00
Dan Helfman 1ad43ad4b5
Fix: run typos to fix various typos in source code.
Merge pull request #65 from diivi/fix/run-typos
2023-04-01 10:44:11 -07:00
Divyansh Singh 32ab17fa46 merge 2023-04-01 22:12:41 +05:30
Divyansh Singh 6054ced931 fix: run typos 2023-04-01 22:10:32 +05:30
Dan Helfman 1412038ed3
Fix randomly failing test: test_log_outputs_kills_other_processes_when_one_errors (#635).
Merge pull request #64 from kxxt/master
2023-03-31 23:19:57 -07:00
kxxt fa8bc285c8 Fix randomly failing test. 2023-04-01 14:02:30 +08:00
Dan Helfman f256908b27 Document wording tweaks (#479). 2023-03-31 15:36:59 -07:00
Dan Helfman 3f78ac4085 Automatically use the "archive_name_format" option to filter which archives get used for borgmatic actions that operate on multiple archives (#479). 2023-03-31 15:21:08 -07:00
Dan Helfman 5f595f7ac3 Fix regression in which the "transfer" action produced a traceback (#663). 2023-03-30 23:21:20 -07:00
Dan Helfman b27e625a77 Update schema comment for check_repositories to mention labels (#635). 2023-03-28 15:44:38 -07:00
Dan Helfman fc2c181b74 Add missing Docker Compose depends. 2023-03-28 15:31:37 -07:00
Dan Helfman 010b82d6d8 Remove unnecessary cd in dev documentation. 2023-03-28 12:45:39 -07:00
Dan Helfman aaf3462d17 Fix Drone indentation. 2023-03-28 12:03:12 -07:00
Dan Helfman f709125110 Error out if run-full-tests is run not inside a test container. 2023-03-28 12:02:07 -07:00
Dan Helfman 3512191f3e Add check_repositories regression fix to NEWS (#662). 2023-03-28 11:45:55 -07:00
Dan Helfman 06b5d81baa Merge branch 'master' of github.com:borgmatic-collective/borgmatic 2023-03-28 11:15:31 -07:00
Dan Helfman 9d71bf916e
fix: make check repositories work with dict and str repositories (#662).
Merge pull request #63 from diivi/fix/check-repositories-by-label
2023-03-28 11:15:01 -07:00
Dan Helfman 59fe01b56d Update script comment. 2023-03-28 11:09:25 -07:00
Divyansh Singh 08e358e27f add and update tests 2023-03-28 22:51:35 +05:30
Divyansh Singh ce22d2d302 reformat 2023-03-28 22:29:21 +05:30
Divyansh Singh 2d08a63e60 fix: make check repositories work with dict and str repositories 2023-03-28 22:14:50 +05:30
Dan Helfman d96f2239c1 Update OpenBSD borgmatic link. 2023-03-27 23:43:39 -07:00
Dan Helfman 67a349ae44 I had one job... (#461). 2023-03-27 23:28:36 -07:00
Dan Helfman dcefded0fa Document that most command-line flags are not config-file-able (#461). 2023-03-27 23:21:14 -07:00
Dan Helfman 1bcdebd1cc Fix multiple repositories example. 2023-03-27 23:16:44 -07:00
Dan Helfman 7a8e0e89dd Mention prior versions of borgmatic in repositories schema. 2023-03-27 21:54:01 -07:00
Dan Helfman 489ae080e5 Update docs with a few more "path:" repositories references (#635). 2023-03-27 21:49:31 -07:00
Dan Helfman 0e3da7be63 Fix repository schema description. 2023-03-27 16:15:24 -07:00
Dan Helfman c5ffb76dfa Bump version for release. 2023-03-27 15:56:49 -07:00
Dan Helfman 61c7b8f13c Add optional repository labels so you can select a repository via "--repository yourlabel" at the command-line (#635). 2023-03-27 15:54:55 -07:00
Dan Helfman 3e8e38011b
Labels for repositories (#635).
Merge pull request #57 from diivi/feat/tag-repos
2023-03-27 15:46:22 -07:00
Dan Helfman d0d3a39833 When a database command errors, display and log the error message instead of swallowing it (#396). 2023-03-27 10:36:39 -07:00
Divyansh Singh 8bef1c698b add feature to docs 2023-03-27 22:16:39 +05:30
Dan Helfman acbbd6670a Removing debugging command output. 2023-03-26 21:26:35 -07:00
Divyansh Singh b336b9bedf add tests for repo labels 2023-03-27 00:19:23 +05:30
Divyansh Singh ec9def4e71 rename repository arg to repository_path in all borg actions 2023-03-26 23:52:25 +05:30
Divyansh Singh a136fda92d check all tests 2023-03-26 23:35:47 +05:30
Divyansh Singh b511e679ae remove optional label for repos from tests 2023-03-26 16:59:29 +05:30
Dan Helfman f56fdab7a9 Add troubleshooting documentation on PostgreSQL/MySQL authentication errors. 2023-03-25 17:08:17 -07:00
Dan Helfman 8c0eea7229 Add additional documentation link to environment variable feature. Rename constants section. 2023-03-25 08:56:25 -07:00
Dan Helfman 19e95628c3 Add documentation and NEWS for custom constants feature (#612). 2023-03-24 23:47:05 -07:00
Dan Helfman 4d01e53414
Fix: replace primitive values in config without quotes (#612).
Merge pull request #62 from diivi/fix/config-json-replacement
2023-03-24 23:45:36 -07:00
Divyansh Singh a082cb87cb fix: replace primitive values in config without quotes 2023-03-25 12:12:56 +05:30
Dan Helfman 1c51a8e229
Allow defining custom variables in config file (#612).
Merge pull request #60 from diivi/feat/constants-support
2023-03-24 22:50:57 -07:00
Dan Helfman d14a8df71a Hide obnoxious ruamel.yaml warnings during test runs. 2023-03-24 22:43:10 -07:00
Dan Helfman 739a58fe47 Rename scripts/run-full-dev-tests to scripts/run-end-to-end-dev-tests and make it run end-to-end tests only. 2023-03-24 16:24:00 -07:00
Dan Helfman af3431d6ae
fix: docs cli reference create spelling
Merge pull request #61 from diivi/docs/cli-reference
2023-03-24 16:09:50 -07:00
Dan Helfman 9851abc2e1 Add documentation on backing up a database running in a container (#649). 2023-03-24 15:18:49 -07:00
Divyansh Singh 61ce6f0473 fix: docs cli reference create spelling 2023-03-25 02:44:56 +05:30
Divyansh Singh 78e8bb6c8c reformat 2023-03-25 02:08:52 +05:30
Divyansh Singh af95134cd2 add test for complex constant 2023-03-25 02:03:36 +05:30
Divyansh Singh d6dfb8753a reformat 2023-03-25 01:50:47 +05:30
Divyansh Singh 1bc003560a Merge branch 'master' of https://github.com/diivi/borgmatic into feat/tag-repos 2023-03-25 01:39:26 +05:30
Divyansh Singh aeaf69f49e pass all tests 2023-03-25 01:34:03 +05:30
Divyansh Singh e83ad9e1e4 use repository["path"] instead of repository 2023-03-25 01:04:57 +05:30
Dan Helfman f42890430c Add code style plugins to enforce use of Python f-strings and prevent single-letter variables. 2023-03-23 23:11:14 -07:00
Divyansh Singh 6f300b0079 feat: constants support 2023-03-24 02:39:37 +05:30
Dan Helfman 9bec029b4f
Fix: remove extra links from docs css.
Merge pull request #59 from diivi/fix/remove-extra-links-from-css
2023-03-23 12:57:55 -07:00
Divyansh Singh 08afad5d81 end with newline 2023-03-24 01:25:15 +05:30
Divyansh Singh a01dc62468 fix: remove extra links from docs css 2023-03-24 01:23:40 +05:30
Dan Helfman 8b61225b13
Copy to clipboard support in documentation.
Merge pull request #58 from diivi/docs/copy-to-clipboard-support
2023-03-23 12:39:41 -07:00
Divyansh Singh 66d2f49f18 docs: copy to clipboard support 2023-03-23 14:45:23 +05:30
Dan Helfman 0a72c67c6c Add missing source directory error fix to NEWS (#655). 2023-03-22 13:02:22 -07:00
Dan Helfman ab64b7ef67
Fix error when a source directory doesn't exist and databases are configured (#655).
Merge pull request #56 from diivi/fix/no-error-on-database-backup-without-source-dirs
2023-03-22 12:59:01 -07:00
Divyansh Singh 1e3a3bf1e7 review 2023-03-23 01:18:06 +05:30
Divyansh Singh 7a2f287918 reformat base 2023-03-23 01:08:30 +05:30
Divyansh Singh 8a63c49498 feat: tag repos 2023-03-23 01:01:26 +05:30
Divyansh Singh 3b5ede8044 remove extra parameter from function call 2023-03-22 23:11:44 +05:30
Divyansh Singh bd235f0426 use exit_code_indicates_error and modify it to accept a command 2023-03-22 16:23:53 +05:30
Divyansh Singh 09183464cd fix: no error on database backups without source dirs 2023-03-22 09:41:39 +05:30
Dan Helfman ca6fd6b061 Add confusing error message fix to NEWS (#623). 2023-03-21 14:25:20 -07:00
Dan Helfman dd9a64f4b6
Fix confusing message when an error occurs running actions for a configuration file (#623).
Merge pull request #55 from diivi/fix/rephrase-error-message
2023-03-21 14:23:09 -07:00
Divyansh Singh 23e7f27ee4 fix: rephrase error when running from config
to avoid confusion, as the user might think the problem is with their config file
2023-03-22 02:22:43 +05:30
Dan Helfman f9ef52f9a5 Remove unused module and outdated test expectations (#576). 2023-03-21 10:29:17 -07:00
Dan Helfman 3f17c355ca Add "file://" paths to NEWS (#576). 2023-03-21 10:24:51 -07:00
Dan Helfman c83fae5e5b
Support file:// paths for repositories (#576).
Merge pull request #54 from diivi/feat/file-urls-support
2023-03-21 10:22:39 -07:00
Divyansh Singh 39ad8f64c4 add tests and remove magic number 2023-03-21 17:06:03 +05:30
Divyansh Singh e86d223bbf Merge branch 'master' of https://github.com/diivi/borgmatic into feat/file-urls-support 2023-03-21 16:55:05 +05:30
Divyansh Singh 86587ab2dc send repo directly to extract and export_tar 2023-03-20 21:51:45 +05:30
Divyansh Singh 58c95d8015 feat: file:// URLs support 2023-03-20 02:43:23 +05:30
Dan Helfman 6351747da5 Add NixOS package link to installation docs. 2023-03-19 09:02:47 -07:00
Dan Helfman 55c153409e Add "source_directories_must_exist" option to NEWS (#501). 2023-03-18 14:07:38 -07:00
Dan Helfman b115fb2fbe Merge branch 'master' of github.com:borgmatic-collective/borgmatic 2023-03-18 14:01:52 -07:00
Dan Helfman 31d04d9ee3
Optionally error if a source directory does not exist.
feat: add optional check for existence of source directories
2023-03-18 13:59:20 -07:00
Divyansh Singh f803836416 reformat 2023-03-18 17:27:33 +05:30
Divyansh Singh 997f60b3e6 add tests 2023-03-18 17:24:21 +05:30
Dan Helfman c84b26499b Add "borg_files_cache_ttl" option to NEWS. 2023-03-17 19:29:10 -07:00
Dan Helfman 214ae81cbb Add option to set borg_files_cache_ttl in config (#618).
Reviewed-on: borgmatic-collective/borgmatic#654
2023-03-18 02:24:41 +00:00
Divyansh Singh d17b2c74db feat: add optional check for existence of source directories 2023-03-18 04:35:55 +05:30
Soumik Dutta fb9677230b add test to ensure integers are converted to string
before setting them up to be environment variable values

Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-18 02:57:56 +05:30
Soumik Dutta 0db137efdf add option to set borg_files_cache_ttl in config
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-18 01:48:24 +05:30
Dan Helfman e6605c868d Clarify check frequency default behavior (#653). 2023-03-17 10:09:36 -07:00
Dan Helfman bdfe4b61eb Bump version for release. 2023-03-16 13:42:15 -07:00
Dan Helfman ca4461820d Add support for Python 3.11. 2023-03-16 13:29:37 -07:00
Dan Helfman 7605838bfe Add "--repository" flag to all actions where it makes sense (#564). 2023-03-16 13:27:08 -07:00
Dan Helfman 7a784b8eba Add "--repository" flag to common actions (where it makes sense) (#652).
Reviewed-on: borgmatic-collective/borgmatic#652
2023-03-16 20:21:40 +00:00
Nain 3e22414613 Update tests
Make them more explicit. Also formatting.
2023-03-16 14:01:29 -04:00
Nain 5f87ea3ec5 Add "--repository" flag to the "create" action 2023-03-16 13:15:49 -04:00
Nain a8aeace5b5 Add "--repository" flag to the "compact" action 2023-03-16 11:13:45 -04:00
Nain 480addd7ce Add "--repository" flag to the "check" action 2023-03-16 10:41:13 -04:00
Nain ce0ce4cd1c Merge mostly repetitive tests 2023-03-16 08:23:21 -04:00
Nain 7de9260b0d Remove test now that --repository isn't expected to error
As discussed #652#issuecomment-5579
2023-03-15 14:59:12 -04:00
Nain cdbe6cdf3a Add "--repository" flag to the "prune" action
part of ticket #564
2023-03-15 14:43:17 -04:00
Dan Helfman 95dcc20d5f Better indicate position of additional docs on page (#651).
Reviewed-on: borgmatic-collective/borgmatic#651
2023-03-15 18:13:27 +00:00
Dan Helfman 49e0494924 Fix --editable (mode) option given --user as arg (#648).
Reviewed-on: borgmatic-collective/borgmatic#650
2023-03-15 18:06:46 +00:00
Nain 5fad2bd408 Better indicate position of additional docs on page
On wide screens, the position of the documentation (how-to and reference guides)
is at the same level as #it's-your-data.-keep-it-that-way.

So the jump due to the anchor link makes it seem like we're taken to the top, aka the
main content. Indicate that the links are to the left so the reader doesn't recurse.
2023-03-15 07:54:49 -04:00
Nain c6829782a3 Fix --editable (mode) option given --user as arg
The --user option should go before or after `--editable .`, not in between.
Before seems better.
2023-03-15 06:50:47 -04:00
Dan Helfman 8cec7c74d8 Add "--strip-components all" on the "extract" action to remove leading path components (#647). 2023-03-09 10:09:16 -08:00
Dan Helfman d3086788eb Document how to list database dumps in an archive. 2023-03-08 16:09:41 -08:00
Dan Helfman 8d860ea02c
Enhanced docs with info on fetching mysql database size
Merge pull request #46 from Jelle-SamsonIT/patch-3
2023-03-08 15:52:28 -08:00
Dan Helfman b343363bb8 Change the default action order to: "create", "prune", "compact", "check" (#304). 2023-03-08 14:05:06 -08:00
Dan Helfman 9db31bd1e9 Run any command-line actions in the order specified instead of using a fixed ordering (#304). 2023-03-08 13:19:41 -08:00
Dan Helfman d88bcc8be9 Add Healthchecks "log" state feature to NEWS. 2023-03-07 15:45:23 -08:00
Dan Helfman 332f7c4bb6 Add support for healthchecks "log" feature (#628).
Reviewed-on: borgmatic-collective/borgmatic#645
2023-03-07 22:21:30 +00:00
Dan Helfman 5d19d86e4a Add flake8-quotes to complain about incorrect quoting so I don't have to! 2023-03-07 14:08:35 -08:00
Soumik Dutta 044ae7869a fix tests
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-08 03:30:12 +05:30
Dan Helfman 62ae82f2c0 Mention searching for files in the extract a backup guide. 2023-03-06 22:59:34 -08:00
Dan Helfman 66194b7304 Update dates in documentation examples. 2023-03-06 22:41:43 -08:00
Soumik Dutta 98e429594e added tests to make sure unsupported log states are detected
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-06 20:31:00 +05:30
Soumik Dutta 4fcfddbe08 return early if unsupported state is passed
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-06 19:58:57 +05:30
Soumik Dutta f442aeae9c fix logs_monitor_start_error()
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-06 05:21:56 +05:30
Soumik Dutta e211863cba update test_borgmatic.py
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-06 05:12:24 +05:30
Soumik Dutta 45256ae33f add test for healthchecks
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-06 03:38:08 +05:30
Soumik Dutta 1573d68fe2 update schema.yaml description
also add monitor.State.LOG to cronitor.

Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-05 21:57:13 +05:30
Soumik Dutta 69f6695253 Add support for healthchecks "log" feature #628
Signed-off-by: Soumik Dutta <shalearkane@gmail.com>
2023-03-05 19:27:32 +05:30
Dan Helfman a7c055264d
Fix incorrect documentation TOC background by removing extra dark mode styles.
Merge pull request #52 from diivi/fix/remove-special-dark-mode-attributes
2023-03-04 16:18:04 -08:00
Divyansh Singh db18364a73 fix: remove extra dark mode styles 2023-03-05 03:16:46 +05:30
Dan Helfman 22498ebd4c In the documentation, mention what version of borgmatic introduced SQLite support. 2023-03-04 10:50:28 -08:00
Dan Helfman e1f02d9fa5 Add SQLite feature to NEWS and also integrations. 2023-03-04 09:59:16 -08:00
Dan Helfman 9ec220c600
Add SQLite database dump/restore hook (#295).
feat: add dump-restore support for sqlite databases
2023-03-04 09:47:21 -08:00
Divyansh Singh cf0275a3ed remove test path 2023-03-04 23:00:57 +05:30
Divyansh Singh c71eb60cd2 mock os.remove instead of actually removing a file 2023-03-04 13:08:30 +05:30
Divyansh Singh 675e54ba9f use os.remove and improve tests 2023-03-04 12:43:07 +05:30
Divyansh Singh 1793ad74bd add sqlite for e2e tests 2023-03-04 02:41:14 +05:30
Divyansh Singh 767a7d900b e2e tests schema update 2023-03-04 01:29:01 +05:30
Divyansh Singh 903507bd03 code review 2023-03-04 01:27:07 +05:30
Dan Helfman b6cf7d2adc Bump version for release. 2023-03-02 15:34:22 -08:00
Dan Helfman a071e02d20 With the "create" action and the "--list" ("--files") flag, only show excluded files at verbosity 2 (#620). 2023-03-02 15:33:42 -08:00
Divyansh Singh 3aa88085ed formatting fix 2023-03-03 00:01:52 +05:30
Divyansh Singh af1cc27988 feat: add dump-restore support for sqlite databases 2023-03-02 23:55:16 +05:30
Jelle @ Samson-IT 3720f22234
reworded and added 'all' caveat 2022-07-13 22:03:51 +02:00
Jelle @ Samson-IT 1fdec480d6
Added some info about fetching mysql database size 2022-07-13 13:29:45 +02:00
158 changed files with 4162 additions and 1512 deletions

View File

@@ -24,6 +24,8 @@ clone:
 steps:
   - name: build
     image: alpine:3.13
+    environment:
+      TEST_CONTAINER: true
     pull: always
     commands:
       - scripts/run-full-tests
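
This new TEST_CONTAINER variable appears to be what commit f709125110 above checks for, so that scripts/run-full-tests errors out when invoked outside a test container.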

View File

@@ -1,4 +1,5 @@
 const pluginSyntaxHighlight = require("@11ty/eleventy-plugin-syntaxhighlight");
+const codeClipboard = require("eleventy-plugin-code-clipboard");
 const inclusiveLangPlugin = require("@11ty/eleventy-plugin-inclusive-language");
 const navigationPlugin = require("@11ty/eleventy-navigation");
@@ -6,6 +7,7 @@ module.exports = function(eleventyConfig) {
 eleventyConfig.addPlugin(pluginSyntaxHighlight);
 eleventyConfig.addPlugin(inclusiveLangPlugin);
 eleventyConfig.addPlugin(navigationPlugin);
+eleventyConfig.addPlugin(codeClipboard);
 let markdownIt = require("markdown-it");
 let markdownItAnchor = require("markdown-it-anchor");
@@ -31,6 +33,7 @@ module.exports = function(eleventyConfig) {
 markdownIt(markdownItOptions)
 .use(markdownItAnchor, markdownItAnchorOptions)
 .use(markdownItReplaceLink)
+.use(codeClipboard.markdownItCopyButton)
 );
 eleventyConfig.addPassthroughCopy({"docs/static": "static"});
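
These three additions wire up the eleventy-plugin-code-clipboard package behind the documentation's new copy-to-clipboard buttons (commit 66d2f49f18 above).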

.flake8 (new file)
View File

@@ -0,0 +1 @@
+select = Q0
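
This new .flake8 selection of the Q0 codes enables the quote-style checks from the flake8-quotes plugin mentioned in commit 5d19d86e4a above.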

NEWS
View File

@@ -1,4 +1,76 @@
-1.7.8.dev0
+1.7.12.dev0
+* #413: Add "log_file" context to command hooks so your scripts can consume the borgmatic log file.
+  See the documentation for more information:
+  https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/
+* #666, #670: Fix error when running the "info" action with the "--match-archives" or "--archive"
+  flags. Also fix the "--match-archives"/"--archive" flags to correctly override the
+  "match_archives" configuration option for the "transfer", "list", "rlist", and "info" actions.
+* #668: Fix error when running the "prune" action with both "archive_name_format" and "prefix"
+  options set.
+* #672: Selectively shallow merge certain mappings or sequences when including configuration files.
+  See the documentation for more information:
+  https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/#shallow-merge
+* #673: View the results of configuration file merging via "validate-borgmatic-config --show" flag.
+  See the documentation for more information:
+  https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/#debugging-includes
+* Add optional support for running end-to-end tests and building documentation with rootless Podman
+  instead of Docker.
+
+1.7.11
+* #479, #588: BREAKING: Automatically use the "archive_name_format" option to filter which archives
+  get used for borgmatic actions that operate on multiple archives. Override this behavior with the
+  new "match_archives" option in the storage section. This change is "breaking" in that it silently
+  changes which archives get considered for "rlist", "prune", "check", etc. See the documentation
+  for more information:
+  https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/#archive-naming
+* #479, #588: The "prefix" options have been deprecated in favor of the new "archive_name_format"
+  auto-matching behavior and the "match_archives" option.
+* #658: Add "--log-file-format" flag for customizing the log message format. See the documentation
+  for more information:
+  https://torsion.org/borgmatic/docs/how-to/inspect-your-backups/#logging-to-file
+* #662: Fix regression in which the "check_repositories" option failed to match repositories.
+* #663: Fix regression in which the "transfer" action produced a traceback.
+* Add spellchecking of source code during test runs.
+
+1.7.10
+* #396: When a database command errors, display and log the error message instead of swallowing it.
+* #501: Optionally error if a source directory does not exist via "source_directories_must_exist"
+  option in borgmatic's location configuration.
+* #576: Add support for "file://" paths within "repositories" option.
+* #612: Define and use custom constants in borgmatic configuration files. See the documentation for
+  more information:
+  https://torsion.org/borgmatic/docs/how-to/make-per-application-backups/#constant-interpolation
+* #618: Add support for BORG_FILES_CACHE_TTL environment variable via "borg_files_cache_ttl" option
+  in borgmatic's storage configuration.
+* #623: Fix confusing message when an error occurs running actions for a configuration file.
+* #635: Add optional repository labels so you can select a repository via "--repository yourlabel"
+  at the command-line. See the configuration reference for more information:
+  https://torsion.org/borgmatic/docs/reference/configuration/
+* #649: Add documentation on backing up a database running in a container:
+  https://torsion.org/borgmatic/docs/how-to/backup-your-databases/#containers
+* #655: Fix error when databases are configured and a source directory doesn't exist.
+* Add code style plugins to enforce use of Python f-strings and prevent single-letter variables.
+  To join in the pedantry, refresh your test environment with "tox --recreate".
+* Rename scripts/run-full-dev-tests to scripts/run-end-to-end-dev-tests and make it run end-to-end
+  tests only. Continue using tox to run unit and integration tests.
+
+1.7.9
+* #295: Add a SQLite database dump/restore hook.
+* #304: Change the default action order when no actions are specified on the command-line to:
+  "create", "prune", "compact", "check". If you'd like to retain the old ordering ("prune" and
+  "compact" first), then specify actions explicitly on the command-line.
+* #304: Run any command-line actions in the order specified instead of using a fixed ordering.
+* #564: Add "--repository" flag to all actions where it makes sense, so you can run borgmatic on
+  a single configured repository instead of all of them.
+* #628: Add a Healthchecks "log" state to send borgmatic logs to Healthchecks without signalling
+  success or failure.
+* #647: Add "--strip-components all" feature on the "extract" action to remove leading path
+  components of files you extract. Must be used with the "--path" flag.
+* Add support for Python 3.11.
+
+1.7.8
+* #620: With the "create" action and the "--list" ("--files") flag, only show excluded files at
+  verbosity 2.
 * #621: Add optional authentication to the ntfy monitoring hook.
 * With the "create" action, only one of "--list" ("--files") and "--progress" flags can be used.
   This lines up with the new behavior in Borg 2.0.0b5.
@@ -350,7 +422,7 @@
   configuration schema descriptions.
 1.5.6
-* #292: Allow before_backup and similiar hooks to exit with a soft failure without altering the
+* #292: Allow before_backup and similar hooks to exit with a soft failure without altering the
   monitoring status on Healthchecks or other providers. Support this by waiting to ping monitoring
   services with a "start" status until after before_* hooks finish. Failures in before_* hooks
   still trigger a monitoring "fail" status.
@@ -419,7 +491,7 @@
 * For "list" and "info" actions, show repository names even at verbosity 0.
 1.4.22
-* #276, #285: Disable colored output when "--json" flag is used, so as to produce valid JSON ouput.
+* #276, #285: Disable colored output when "--json" flag is used, so as to produce valid JSON output.
 * After a backup of a database dump in directory format, properly remove the dump directory.
 * In "borgmatic --help", don't expand $HOME in listing of default "--config" paths.
@@ -791,7 +863,7 @@
 * #77: Skip non-"*.yaml" config filenames in /etc/borgmatic.d/ so as not to parse backup files,
   editor swap files, etc.
 * #81: Document user-defined hooks run before/after backup, or on error.
-* Add code style guidelines to the documention.
+* Add code style guidelines to the documentation.
 1.2.0
 * #61: Support for Borg --list option via borgmatic command-line to list all archives.
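
Several of the options called out in the 1.7.10 and 1.7.11 entries above can be combined in one configuration file. Here is a minimal, illustrative sketch in the pre-1.8 sectioned format; the option names come from the NEWS entries and the README diff below, while the {app_name} constant and all values are invented for the example:

    constants:
        app_name: myapp

    location:
        source_directories:
            - /home/{app_name}                # {app_name} interpolated via constants (#612)
        source_directories_must_exist: true   # error out on missing source directories (#501)
        repositories:
            - path: file:///var/lib/backups/{app_name}.borg   # "file://" paths (#576)
              label: local                                    # enables "--repository local" (#635)

    storage:
        archive_name_format: '{app_name}-{now}'   # now also filters which archives get used (#479)
        match_archives: 'sh:{app_name}-*'         # explicit override of that filtering (#588)
        borg_files_cache_ttl: 20                  # sets BORG_FILES_CACHE_TTL for Borg (#618)

With a configuration like this, "borgmatic create --repository local" should run the action against just the labeled repository.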

View File

@@ -24,9 +24,10 @@ location:
     # Paths of local or remote repositories to backup to.
     repositories:
-        - ssh://1234@usw-s001.rsync.net/./backups.borg
-        - ssh://k8pDxu32@k8pDxu32.repo.borgbase.com/./repo
-        - /var/lib/backups/local.borg
+        - path: ssh://k8pDxu32@k8pDxu32.repo.borgbase.com/./repo
+          label: borgbase
+        - path: /var/lib/backups/local.borg
+          label: local

 retention:
     # Retention policy for how many backups to keep.
@@ -67,6 +68,7 @@ borgmatic is powered by [Borg Backup](https://www.borgbackup.org/).
 <a href="https://www.mysql.com/"><img src="docs/static/mysql.png" alt="MySQL" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 <a href="https://mariadb.com/"><img src="docs/static/mariadb.png" alt="MariaDB" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 <a href="https://www.mongodb.com/"><img src="docs/static/mongodb.png" alt="MongoDB" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
+<a href="https://sqlite.org/"><img src="docs/static/sqlite.png" alt="SQLite" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 <a href="https://healthchecks.io/"><img src="docs/static/healthchecks.png" alt="Healthchecks" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 <a href="https://cronitor.io/"><img src="docs/static/cronitor.png" alt="Cronitor" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
 <a href="https://cronhub.io/"><img src="docs/static/cronhub.png" alt="Cronhub" height="60px" style="margin-bottom:20px;"></a>&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
@@ -80,8 +82,8 @@ borgmatic is powered by [Borg Backup](https://www.borgbackup.org/).
 Your first step is to [install and configure
 borgmatic](https://torsion.org/borgmatic/docs/how-to/set-up-backups/).
-For additional documentation, check out the links above for <a
-href="https://torsion.org/borgmatic/#documentation">borgmatic how-to and
+For additional documentation, check out the links above (left panel on wide screens)
+for <a href="https://torsion.org/borgmatic/#documentation">borgmatic how-to and
 reference guides</a>.
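
The SQLite dump/restore hook (#295) and the Healthchecks integration shown in the logo row above are both configured under the hooks section. A rough sketch with invented database name, path, and ping URL; once a Healthchecks URL is configured, the new "log" state (#628) lets borgmatic ship its logs there without signalling success or failure:

    hooks:
        sqlite_databases:
            - name: myapp
              path: /var/lib/myapp/app.db
        healthchecks: https://hc-ping.com/your-uuid-here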

View File

@@ -16,9 +16,9 @@ def run_borg(
     if borg_arguments.repository is None or borgmatic.config.validate.repositories_match(
         repository, borg_arguments.repository
     ):
-        logger.info('{}: Running arbitrary Borg command'.format(repository))
+        logger.info(f'{repository["path"]}: Running arbitrary Borg command')
         archive_name = borgmatic.borg.rlist.resolve_archive_name(
-            repository,
+            repository['path'],
             borg_arguments.archive,
             storage,
             local_borg_version,
@@ -26,7 +26,7 @@ def run_borg(
             remote_path,
         )
         borgmatic.borg.borg.run_arbitrary_borg(
-            repository,
+            repository['path'],
             storage,
             local_borg_version,
             options=borg_arguments.options,
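
The same mechanical change repeats through the action modules below: since configured repositories now parse as mappings with a "path" (and optional "label") key rather than bare strings, each action passes repository['path'] down to the Borg layer.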

View File

@@ -15,7 +15,11 @@ def run_break_lock(
     if break_lock_arguments.repository is None or borgmatic.config.validate.repositories_match(
         repository, break_lock_arguments.repository
     ):
-        logger.info(f'{repository}: Breaking repository and cache locks')
+        logger.info(f'{repository["path"]}: Breaking repository and cache locks')
         borgmatic.borg.break_lock.break_lock(
-            repository, storage, local_borg_version, local_path=local_path, remote_path=remote_path,
+            repository['path'],
+            storage,
+            local_borg_version,
+            local_path=local_path,
+            remote_path=remote_path,
         )

View File

@@ -1,6 +1,7 @@
 import logging

 import borgmatic.borg.check
+import borgmatic.config.validate
 import borgmatic.hooks.command

 logger = logging.getLogger(__name__)
@@ -23,6 +24,11 @@ def run_check(
     '''
     Run the "check" action for the given repository.
     '''
+    if check_arguments.repository and not borgmatic.config.validate.repositories_match(
+        repository, check_arguments.repository
+    ):
+        return
+
     borgmatic.hooks.command.execute_hook(
         hooks.get('before_check'),
         hooks.get('umask'),
@@ -31,9 +37,9 @@ def run_check(
         global_arguments.dry_run,
         **hook_context,
     )
-    logger.info('{}: Running consistency checks'.format(repository))
+    logger.info(f'{repository["path"]}: Running consistency checks')
     borgmatic.borg.check.check_archives(
-        repository,
+        repository['path'],
         location,
         storage,
         consistency,

View File

@@ -2,6 +2,7 @@ import logging

 import borgmatic.borg.compact
 import borgmatic.borg.feature
+import borgmatic.config.validate
 import borgmatic.hooks.command

 logger = logging.getLogger(__name__)
@@ -24,6 +25,11 @@ def run_compact(
     '''
     Run the "compact" action for the given repository.
     '''
+    if compact_arguments.repository and not borgmatic.config.validate.repositories_match(
+        repository, compact_arguments.repository
+    ):
+        return
+
     borgmatic.hooks.command.execute_hook(
         hooks.get('before_compact'),
         hooks.get('umask'),
@@ -33,10 +39,10 @@ def run_compact(
         **hook_context,
     )
     if borgmatic.borg.feature.available(borgmatic.borg.feature.Feature.COMPACT, local_borg_version):
-        logger.info('{}: Compacting segments{}'.format(repository, dry_run_label))
+        logger.info(f'{repository["path"]}: Compacting segments{dry_run_label}')
         borgmatic.borg.compact.compact_segments(
             global_arguments.dry_run,
-            repository,
+            repository['path'],
             storage,
             local_borg_version,
             local_path=local_path,
@@ -46,7 +52,7 @@ def run_compact(
             threshold=compact_arguments.threshold,
         )
     else:  # pragma: nocover
-        logger.info('{}: Skipping compact (only available/needed in Borg 1.2+)'.format(repository))
+        logger.info(f'{repository["path"]}: Skipping compact (only available/needed in Borg 1.2+)')
     borgmatic.hooks.command.execute_hook(
         hooks.get('after_compact'),
         hooks.get('umask'),

View File

@@ -2,6 +2,7 @@ import json
 import logging

 import borgmatic.borg.create
+import borgmatic.config.validate
 import borgmatic.hooks.command
 import borgmatic.hooks.dispatch
 import borgmatic.hooks.dump
@@ -28,6 +29,11 @@ def run_create(
     If create_arguments.json is True, yield the JSON output from creating the archive.
     '''
+    if create_arguments.repository and not borgmatic.config.validate.repositories_match(
+        repository, create_arguments.repository
+    ):
+        return
+
     borgmatic.hooks.command.execute_hook(
         hooks.get('before_backup'),
         hooks.get('umask'),
@@ -36,11 +42,11 @@ def run_create(
         global_arguments.dry_run,
         **hook_context,
     )
-    logger.info('{}: Creating archive{}'.format(repository, dry_run_label))
+    logger.info(f'{repository["path"]}: Creating archive{dry_run_label}')
     borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
         'remove_database_dumps',
         hooks,
-        repository,
+        repository['path'],
         borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
         location,
         global_arguments.dry_run,
@@ -48,7 +54,7 @@ def run_create(
     active_dumps = borgmatic.hooks.dispatch.call_hooks(
         'dump_databases',
         hooks,
-        repository,
+        repository['path'],
         borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
         location,
         global_arguments.dry_run,
@@ -57,7 +63,7 @@ def run_create(
     json_output = borgmatic.borg.create.create_archive(
         global_arguments.dry_run,
-        repository,
+        repository['path'],
         location,
         storage,
         local_borg_version,

View File

@@ -23,13 +23,13 @@ def run_export_tar(
         repository, export_tar_arguments.repository
     ):
         logger.info(
-            '{}: Exporting archive {} as tar file'.format(repository, export_tar_arguments.archive)
+            f'{repository["path"]}: Exporting archive {export_tar_arguments.archive} as tar file'
         )
         borgmatic.borg.export_tar.export_tar_archive(
             global_arguments.dry_run,
-            repository,
+            repository['path'],
             borgmatic.borg.rlist.resolve_archive_name(
-                repository,
+                repository['path'],
                 export_tar_arguments.archive,
                 storage,
                 local_borg_version,

View File

@@ -35,12 +35,12 @@ def run_extract(
     if extract_arguments.repository is None or borgmatic.config.validate.repositories_match(
         repository, extract_arguments.repository
     ):
-        logger.info('{}: Extracting archive {}'.format(repository, extract_arguments.archive))
+        logger.info(f'{repository["path"]}: Extracting archive {extract_arguments.archive}')
         borgmatic.borg.extract.extract_archive(
             global_arguments.dry_run,
-            repository,
+            repository['path'],
             borgmatic.borg.rlist.resolve_archive_name(
-                repository,
+                repository['path'],
                 extract_arguments.archive,
                 storage,
                 local_borg_version,

View File

@@ -20,9 +20,9 @@ def run_info(
         repository, info_arguments.repository
     ):
         if not info_arguments.json:  # pragma: nocover
-            logger.answer(f'{repository}: Displaying archive summary information')
+            logger.answer(f'{repository["path"]}: Displaying archive summary information')
         info_arguments.archive = borgmatic.borg.rlist.resolve_archive_name(
-            repository,
+            repository['path'],
             info_arguments.archive,
             storage,
             local_borg_version,
@@ -30,7 +30,7 @@ def run_info(
             remote_path,
         )
         json_output = borgmatic.borg.info.display_archives_info(
-            repository,
+            repository['path'],
             storage,
             local_borg_version,
             info_arguments=info_arguments,

View File

@@ -20,11 +20,11 @@ def run_list(
     ):
         if not list_arguments.json:  # pragma: nocover
             if list_arguments.find_paths:
-                logger.answer(f'{repository}: Searching archives')
+                logger.answer(f'{repository["path"]}: Searching archives')
             elif not list_arguments.archive:
-                logger.answer(f'{repository}: Listing archives')
+                logger.answer(f'{repository["path"]}: Listing archives')
         list_arguments.archive = borgmatic.borg.rlist.resolve_archive_name(
-            repository,
+            repository['path'],
             list_arguments.archive,
             storage,
             local_borg_version,
@@ -32,7 +32,7 @@ def run_list(
             remote_path,
         )
         json_output = borgmatic.borg.list.list_archive(
-            repository,
+            repository['path'],
             storage,
             local_borg_version,
             list_arguments=list_arguments,

View File

@@ -17,14 +17,14 @@ def run_mount(
         repository, mount_arguments.repository
     ):
         if mount_arguments.archive:
-            logger.info('{}: Mounting archive {}'.format(repository, mount_arguments.archive))
+            logger.info(f'{repository["path"]}: Mounting archive {mount_arguments.archive}')
         else:  # pragma: nocover
-            logger.info('{}: Mounting repository'.format(repository))
+            logger.info(f'{repository["path"]}: Mounting repository')
         borgmatic.borg.mount.mount_archive(
-            repository,
+            repository['path'],
             borgmatic.borg.rlist.resolve_archive_name(
-                repository,
+                repository['path'],
                 mount_arguments.archive,
                 storage,
                 local_borg_version,

View File

@@ -1,6 +1,7 @@
 import logging

 import borgmatic.borg.prune
+import borgmatic.config.validate
 import borgmatic.hooks.command

 logger = logging.getLogger(__name__)
@@ -23,6 +24,11 @@ def run_prune(
     '''
     Run the "prune" action for the given repository.
     '''
+    if prune_arguments.repository and not borgmatic.config.validate.repositories_match(
+        repository, prune_arguments.repository
+    ):
+        return
+
     borgmatic.hooks.command.execute_hook(
         hooks.get('before_prune'),
         hooks.get('umask'),
@@ -31,10 +37,10 @@ def run_prune(
         global_arguments.dry_run,
         **hook_context,
     )
-    logger.info('{}: Pruning archives{}'.format(repository, dry_run_label))
+    logger.info(f'{repository["path"]}: Pruning archives{dry_run_label}')
     borgmatic.borg.prune.prune_archives(
         global_arguments.dry_run,
-        repository,
+        repository['path'],
         storage,
         retention,
         local_borg_version,

View File

@@ -23,10 +23,10 @@ def run_rcreate(
     ):
         return

-    logger.info('{}: Creating repository'.format(repository))
+    logger.info(f'{repository["path"]}: Creating repository')
     borgmatic.borg.rcreate.create_repository(
         global_arguments.dry_run,
-        repository,
+        repository['path'],
         storage,
         local_borg_version,
         rcreate_arguments.encryption_mode,

View File

@@ -256,22 +256,34 @@ def run_restore(
         return

     logger.info(
-        '{}: Restoring databases from archive {}'.format(repository, restore_arguments.archive)
+        f'{repository["path"]}: Restoring databases from archive {restore_arguments.archive}'
     )
     borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
         'remove_database_dumps',
         hooks,
-        repository,
+        repository['path'],
         borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
         location,
         global_arguments.dry_run,
     )
     archive_name = borgmatic.borg.rlist.resolve_archive_name(
-        repository, restore_arguments.archive, storage, local_borg_version, local_path, remote_path,
+        repository['path'],
+        restore_arguments.archive,
+        storage,
+        local_borg_version,
+        local_path,
+        remote_path,
     )
     archive_database_names = collect_archive_database_names(
-        repository, archive_name, location, storage, local_borg_version, local_path, remote_path,
+        repository['path'],
+        archive_name,
+        location,
+        storage,
+        local_borg_version,
+        local_path,
+        remote_path,
     )
     restore_names = find_databases_to_restore(restore_arguments.databases, archive_database_names)
     found_names = set()
@@ -291,7 +303,7 @@ def run_restore(
             found_names.add(database_name)
             restore_single_database(
-                repository,
+                repository['path'],
                 location,
                 storage,
                 hooks,
@@ -320,7 +332,7 @@ def run_restore(
             database['name'] = database_name
             restore_single_database(
-                repository,
+                repository['path'],
                 location,
                 storage,
                 hooks,
@@ -336,7 +348,7 @@ def run_restore(
     borgmatic.hooks.dispatch.call_hooks_even_if_unconfigured(
         'remove_database_dumps',
         hooks,
-        repository,
+        repository['path'],
         borgmatic.hooks.dump.DATABASE_HOOK_NAMES,
         location,
         global_arguments.dry_run,

View File

@@ -19,9 +19,10 @@ def run_rinfo(
         repository, rinfo_arguments.repository
     ):
         if not rinfo_arguments.json:  # pragma: nocover
-            logger.answer('{}: Displaying repository summary information'.format(repository))
+            logger.answer(f'{repository["path"]}: Displaying repository summary information')
+
         json_output = borgmatic.borg.rinfo.display_repository_info(
-            repository,
+            repository['path'],
             storage,
             local_borg_version,
             rinfo_arguments=rinfo_arguments,

View File

@@ -19,9 +19,10 @@ def run_rlist(
         repository, rlist_arguments.repository
     ):
         if not rlist_arguments.json:  # pragma: nocover
-            logger.answer('{}: Listing repository'.format(repository))
+            logger.answer(f'{repository["path"]}: Listing repository')
+
         json_output = borgmatic.borg.rlist.list_repository(
-            repository,
+            repository['path'],
             storage,
             local_borg_version,
             rlist_arguments=rlist_arguments,

View File

@@ -17,10 +17,10 @@ def run_transfer(
     '''
     Run the "transfer" action for the given repository.
     '''
-    logger.info(f'{repository}: Transferring archives to repository')
+    logger.info(f'{repository["path"]}: Transferring archives to repository')
     borgmatic.borg.transfer.transfer_archives(
         global_arguments.dry_run,
-        repository,
+        repository['path'],
         storage,
         local_borg_version,
         transfer_arguments,

View File

@@ -13,7 +13,7 @@ BORG_SUBCOMMANDS_WITHOUT_REPOSITORY = (('debug', 'info'), ('debug', 'convert-pro

 def run_arbitrary_borg(
-    repository,
+    repository_path,
     storage_config,
     local_borg_version,
     options,
@@ -44,10 +44,10 @@ def run_arbitrary_borg(
         repository_archive_flags = ()
     elif archive:
         repository_archive_flags = flags.make_repository_archive_flags(
-            repository, archive, local_borg_version
+            repository_path, archive, local_borg_version
         )
     else:
-        repository_archive_flags = flags.make_repository_flags(repository, local_borg_version)
+        repository_archive_flags = flags.make_repository_flags(repository_path, local_borg_version)

     full_command = (
         (local_path,)

View File

@@ -7,7 +7,7 @@ logger = logging.getLogger(__name__)

 def break_lock(
-    repository, storage_config, local_borg_version, local_path='borg', remote_path=None,
+    repository_path, storage_config, local_borg_version, local_path='borg', remote_path=None,
 ):
     '''
     Given a local or remote repository path, a storage configuration dict, the local Borg version,
@@ -24,7 +24,7 @@ def break_lock(
         + (('--lock-wait', str(lock_wait)) if lock_wait else ())
         + (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
         + (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
-        + flags.make_repository_flags(repository, local_borg_version)
+        + flags.make_repository_flags(repository_path, local_borg_version)
     )

     borg_environment = environment.make_environment(storage_config)

View File

@@ -12,7 +12,6 @@ DEFAULT_CHECKS = (
     {'name': 'repository', 'frequency': '1 month'},
     {'name': 'archives', 'frequency': '1 month'},
 )
-DEFAULT_PREFIX = '{hostname}-'

 logger = logging.getLogger(__name__)
@@ -139,16 +138,17 @@ def filter_checks_on_frequency(
         if datetime.datetime.now() < check_time + frequency_delta:
             remaining = check_time + frequency_delta - datetime.datetime.now()
             logger.info(
-                f"Skipping {check} check due to configured frequency; {remaining} until next check"
+                f'Skipping {check} check due to configured frequency; {remaining} until next check'
             )
             filtered_checks.remove(check)

     return tuple(filtered_checks)


-def make_check_flags(local_borg_version, checks, check_last=None, prefix=None):
+def make_check_flags(local_borg_version, storage_config, checks, check_last=None, prefix=None):
     '''
-    Given the local Borg version and a parsed sequence of checks, transform the checks into tuple of
+    Given the local Borg version, a storage configuration dict, a parsed sequence of checks, the
+    check last value, and a consistency check prefix, transform the checks into tuple of
     command-line flags.

     For example, given parsed checks of:
@@ -174,10 +174,21 @@ def make_check_flags(local_borg_version, checks, check_last=None, prefix=None):
     if 'archives' in checks:
         last_flags = ('--last', str(check_last)) if check_last else ()
-        if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
-            match_archives_flags = ('--match-archives', f'sh:{prefix}*') if prefix else ()
-        else:
-            match_archives_flags = ('--glob-archives', f'{prefix}*') if prefix else ()
+        match_archives_flags = (
+            (
+                ('--match-archives', f'sh:{prefix}*')
+                if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
+                else ('--glob-archives', f'{prefix}*')
+            )
+            if prefix
+            else (
+                flags.make_match_archives_flags(
+                    storage_config.get('match_archives'),
+                    storage_config.get('archive_name_format'),
+                    local_borg_version,
+                )
+            )
+        )
     else:
         last_flags = ()
         match_archives_flags = ()
@@ -196,7 +207,7 @@
         return common_flags

     return (
-        tuple('--{}-only'.format(check) for check in checks if check in ('repository', 'archives'))
+        tuple(f'--{check}-only' for check in checks if check in ('repository', 'archives'))
         + common_flags
     )
@@ -243,7 +254,7 @@ def read_check_time(path):

 def check_archives(
-    repository,
+    repository_path,
     location_config,
     storage_config,
     consistency_config,
@@ -268,7 +279,7 @@ def check_archives(
     try:
         borg_repository_id = json.loads(
             rinfo.display_repository_info(
-                repository,
+                repository_path,
                 storage_config,
                 local_borg_version,
                 argparse.Namespace(json=True),
@@ -277,7 +288,7 @@ def check_archives(
             )
         )['repository']['id']
     except (json.JSONDecodeError, KeyError):
-        raise ValueError(f'Cannot determine Borg repository ID for {repository}')
+        raise ValueError(f'Cannot determine Borg repository ID for {repository_path}')

     checks = filter_checks_on_frequency(
         location_config,
@@ -291,7 +302,7 @@ def check_archives(
     extra_borg_options = storage_config.get('extra_borg_options', {}).get('check', '')

     if set(checks).intersection({'repository', 'archives', 'data'}):
-        lock_wait = storage_config.get('lock_wait', None)
+        lock_wait = storage_config.get('lock_wait')

         verbosity_flags = ()
         if logger.isEnabledFor(logging.INFO):
@@ -299,18 +310,18 @@ def check_archives(
         if logger.isEnabledFor(logging.DEBUG):
             verbosity_flags = ('--debug', '--show-rc')

-        prefix = consistency_config.get('prefix', DEFAULT_PREFIX)
+        prefix = consistency_config.get('prefix')

         full_command = (
             (local_path, 'check')
             + (('--repair',) if repair else ())
-            + make_check_flags(local_borg_version, checks, check_last, prefix)
+            + make_check_flags(local_borg_version, storage_config, checks, check_last, prefix)
             + (('--remote-path', remote_path) if remote_path else ())
             + (('--lock-wait', str(lock_wait)) if lock_wait else ())
             + verbosity_flags
             + (('--progress',) if progress else ())
             + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
-            + flags.make_repository_flags(repository, local_borg_version)
+            + flags.make_repository_flags(repository_path, local_borg_version)
         )

         borg_environment = environment.make_environment(storage_config)
@@ -329,6 +340,6 @@ def check_archives(
     if 'extract' in checks:
         extract.extract_last_archive_dry_run(
-            storage_config, local_borg_version, repository, lock_wait, local_path, remote_path
+            storage_config, local_borg_version, repository_path, lock_wait, local_path, remote_path
         )
         write_check_time(make_check_time_path(location_config, borg_repository_id, 'extract'))

View File

@ -8,7 +8,7 @@ logger = logging.getLogger(__name__)
def compact_segments( def compact_segments(
dry_run, dry_run,
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
local_path='borg', local_path='borg',
@ -36,11 +36,11 @@ def compact_segments(
+ (('--info',) if logger.getEffectiveLevel() == logging.INFO else ()) + (('--info',) if logger.getEffectiveLevel() == logging.INFO else ())
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ()) + (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ()) + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ flags.make_repository_flags(repository, local_borg_version) + flags.make_repository_flags(repository_path, local_borg_version)
) )
if dry_run: if dry_run:
logging.info(f'{repository}: Skipping compact (dry run)') logging.info(f'{repository_path}: Skipping compact (dry run)')
return return
execute_command( execute_command(

View File

@ -196,7 +196,28 @@ def make_exclude_flags(location_config, exclude_filename=None):
) )
DEFAULT_ARCHIVE_NAME_FORMAT = '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}' def make_list_filter_flags(local_borg_version, dry_run):
'''
Given the local Borg version and whether this is a dry run, return the corresponding flags for
passing to "--list --filter". The general idea is that excludes are shown for a dry run or when
the verbosity is debug.
'''
base_flags = 'AME'
show_excludes = logger.isEnabledFor(logging.DEBUG)
if feature.available(feature.Feature.EXCLUDED_FILES_MINUS, local_borg_version):
if show_excludes or dry_run:
return f'{base_flags}+-'
else:
return base_flags
if show_excludes:
return f'{base_flags}x-'
else:
return f'{base_flags}-'
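A summary of the four possible return values, read straight off the code above (show_excludes is true at debug verbosity):
# Borg with EXCLUDED_FILES_MINUS (>= 2.0.0b5), debug or dry run:  'AME+-'
# Borg with EXCLUDED_FILES_MINUS, otherwise:                      'AME'
# Older Borg, debug:                                              'AMEx-'
# Older Borg, otherwise:                                          'AME-'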
DEFAULT_ARCHIVE_NAME_FORMAT = '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}' # noqa: FS003
def collect_borgmatic_source_directories(borgmatic_source_directory): def collect_borgmatic_source_directories(borgmatic_source_directory):
@ -285,9 +306,23 @@ def collect_special_file_paths(
) )
def check_all_source_directories_exist(source_directories):
'''
Given a sequence of source directories, check that they all exist. If any do not, raise an
exception.
'''
missing_directories = [
source_directory
for source_directory in source_directories
if not os.path.exists(source_directory)
]
if missing_directories:
raise ValueError(f"Source directories do not exist: {', '.join(missing_directories)}")
def create_archive( def create_archive(
dry_run, dry_run,
repository, repository_path,
location_config, location_config,
storage_config, storage_config,
local_borg_version, local_borg_version,
@ -310,6 +345,8 @@ def create_archive(
borgmatic_source_directories = expand_directories( borgmatic_source_directories = expand_directories(
collect_borgmatic_source_directories(location_config.get('borgmatic_source_directory')) collect_borgmatic_source_directories(location_config.get('borgmatic_source_directory'))
) )
if location_config.get('source_directories_must_exist', False):
check_all_source_directories_exist(location_config.get('source_directories'))
sources = deduplicate_directories( sources = deduplicate_directories(
map_directories_to_devices( map_directories_to_devices(
expand_directories( expand_directories(
@ -343,6 +380,7 @@ def create_archive(
upload_rate_limit = storage_config.get('upload_rate_limit', None) upload_rate_limit = storage_config.get('upload_rate_limit', None)
umask = storage_config.get('umask', None) umask = storage_config.get('umask', None)
lock_wait = storage_config.get('lock_wait', None) lock_wait = storage_config.get('lock_wait', None)
list_filter_flags = make_list_filter_flags(local_borg_version, dry_run)
files_cache = location_config.get('files_cache') files_cache = location_config.get('files_cache')
archive_name_format = storage_config.get('archive_name_format', DEFAULT_ARCHIVE_NAME_FORMAT) archive_name_format = storage_config.get('archive_name_format', DEFAULT_ARCHIVE_NAME_FORMAT)
extra_borg_options = storage_config.get('extra_borg_options', {}).get('create', '') extra_borg_options = storage_config.get('extra_borg_options', {}).get('create', '')
@ -373,7 +411,7 @@ def create_archive(
if stream_processes and location_config.get('read_special') is False: if stream_processes and location_config.get('read_special') is False:
logger.warning( logger.warning(
f'{repository}: Ignoring configured "read_special" value of false, as true is needed for database hooks.' f'{repository_path}: Ignoring configured "read_special" value of false, as true is needed for database hooks.'
) )
create_command = ( create_command = (
@ -401,10 +439,16 @@ def create_archive(
+ (('--remote-path', remote_path) if remote_path else ()) + (('--remote-path', remote_path) if remote_path else ())
+ (('--umask', str(umask)) if umask else ()) + (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ()) + (('--lock-wait', str(lock_wait)) if lock_wait else ())
+ (('--list', '--filter', 'AMEx+-') if list_files and not json and not progress else ()) + (
('--list', '--filter', list_filter_flags)
if list_files and not json and not progress
else ()
)
+ (('--dry-run',) if dry_run else ()) + (('--dry-run',) if dry_run else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ()) + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ flags.make_repository_archive_flags(repository, archive_name_format, local_borg_version) + flags.make_repository_archive_flags(
repository_path, archive_name_format, local_borg_version
)
+ (sources if not pattern_file else ()) + (sources if not pattern_file else ())
) )
@ -424,7 +468,7 @@ def create_archive(
# If database hooks are enabled (as indicated by streaming processes), exclude files that might # If database hooks are enabled (as indicated by streaming processes), exclude files that might
# cause Borg to hang. But skip this if the user has explicitly set the "read_special" to True. # cause Borg to hang. But skip this if the user has explicitly set the "read_special" to True.
if stream_processes and not location_config.get('read_special'): if stream_processes and not location_config.get('read_special'):
logger.debug(f'{repository}: Collecting special file paths') logger.debug(f'{repository_path}: Collecting special file paths')
special_file_paths = collect_special_file_paths( special_file_paths = collect_special_file_paths(
create_command, create_command,
local_path, local_path,
@ -435,7 +479,7 @@ def create_archive(
if special_file_paths: if special_file_paths:
logger.warning( logger.warning(
f'{repository}: Excluding special files to prevent Borg from hanging: {", ".join(special_file_paths)}' f'{repository_path}: Excluding special files to prevent Borg from hanging: {", ".join(special_file_paths)}'
) )
exclude_file = write_pattern_file( exclude_file = write_pattern_file(
expand_home_directories( expand_home_directories(

View File

@ -2,6 +2,7 @@ OPTION_TO_ENVIRONMENT_VARIABLE = {
'borg_base_directory': 'BORG_BASE_DIR', 'borg_base_directory': 'BORG_BASE_DIR',
'borg_config_directory': 'BORG_CONFIG_DIR', 'borg_config_directory': 'BORG_CONFIG_DIR',
'borg_cache_directory': 'BORG_CACHE_DIR', 'borg_cache_directory': 'BORG_CACHE_DIR',
'borg_files_cache_ttl': 'BORG_FILES_CACHE_TTL',
'borg_security_directory': 'BORG_SECURITY_DIR', 'borg_security_directory': 'BORG_SECURITY_DIR',
'borg_keys_directory': 'BORG_KEYS_DIR', 'borg_keys_directory': 'BORG_KEYS_DIR',
'encryption_passcommand': 'BORG_PASSCOMMAND', 'encryption_passcommand': 'BORG_PASSCOMMAND',
@ -27,7 +28,7 @@ def make_environment(storage_config):
value = storage_config.get(option_name) value = storage_config.get(option_name)
if value: if value:
environment[environment_variable_name] = value environment[environment_variable_name] = str(value)
for ( for (
option_name, option_name,
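The new str() call matters for the newly mapped option: "borg_files_cache_ttl" is normally an integer in configuration, while environment variable values must be strings. A sketch with an assumed config value:
# make_environment({'borg_files_cache_ttl': 20})
# -> resulting environment includes {'BORG_FILES_CACHE_TTL': '20'}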

View File

@ -1,5 +1,4 @@
import logging import logging
import os
import borgmatic.logger import borgmatic.logger
from borgmatic.borg import environment, flags from borgmatic.borg import environment, flags
@ -10,7 +9,7 @@ logger = logging.getLogger(__name__)
def export_tar_archive( def export_tar_archive(
dry_run, dry_run,
repository, repository_path,
archive, archive,
paths, paths,
destination_path, destination_path,
@ -46,11 +45,7 @@ def export_tar_archive(
+ (('--dry-run',) if dry_run else ()) + (('--dry-run',) if dry_run else ())
+ (('--tar-filter', tar_filter) if tar_filter else ()) + (('--tar-filter', tar_filter) if tar_filter else ())
+ (('--strip-components', str(strip_components)) if strip_components else ()) + (('--strip-components', str(strip_components)) if strip_components else ())
+ flags.make_repository_archive_flags( + flags.make_repository_archive_flags(repository_path, archive, local_borg_version,)
repository if ':' in repository else os.path.abspath(repository),
archive,
local_borg_version,
)
+ (destination_path,) + (destination_path,)
+ (tuple(paths) if paths else ()) + (tuple(paths) if paths else ())
) )
@ -61,7 +56,7 @@ def export_tar_archive(
output_log_level = logging.INFO output_log_level = logging.INFO
if dry_run: if dry_run:
logging.info('{}: Skipping export to tar file (dry run)'.format(repository)) logging.info(f'{repository_path}: Skipping export to tar file (dry run)')
return return
execute_command( execute_command(

View File

@ -11,7 +11,7 @@ logger = logging.getLogger(__name__)
def extract_last_archive_dry_run( def extract_last_archive_dry_run(
storage_config, storage_config,
local_borg_version, local_borg_version,
repository, repository_path,
lock_wait=None, lock_wait=None,
local_path='borg', local_path='borg',
remote_path=None, remote_path=None,
@ -30,7 +30,7 @@ def extract_last_archive_dry_run(
try: try:
last_archive_name = rlist.resolve_archive_name( last_archive_name = rlist.resolve_archive_name(
repository, 'latest', storage_config, local_borg_version, local_path, remote_path repository_path, 'latest', storage_config, local_borg_version, local_path, remote_path
) )
except ValueError: except ValueError:
logger.warning('No archives found. Skipping extract consistency check.') logger.warning('No archives found. Skipping extract consistency check.')
@ -44,7 +44,9 @@ def extract_last_archive_dry_run(
+ lock_wait_flags + lock_wait_flags
+ verbosity_flags + verbosity_flags
+ list_flag + list_flag
+ flags.make_repository_archive_flags(repository, last_archive_name, local_borg_version) + flags.make_repository_archive_flags(
repository_path, last_archive_name, local_borg_version
)
) )
execute_command( execute_command(
@ -87,6 +89,13 @@ def extract_archive(
else: else:
numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_ids') else () numeric_ids_flags = ('--numeric-owner',) if location_config.get('numeric_ids') else ()
if strip_components == 'all':
if not paths:
raise ValueError('The --strip-components flag with "all" requires at least one --path')
# Calculate the maximum number of leading path components of the given paths.
strip_components = max(0, *(len(path.split(os.path.sep)) - 1 for path in paths))
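Worked example of that calculation with hypothetical paths: 'usr/local/bin/borg' has four components (three leading ones) and 'etc/passwd' has two (one leading), so:
# strip_components = max(0, 3, 1) == 3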
full_command = ( full_command = (
(local_path, 'extract') (local_path, 'extract')
+ (('--remote-path', remote_path) if remote_path else ()) + (('--remote-path', remote_path) if remote_path else ())
@ -99,11 +108,7 @@ def extract_archive(
+ (('--strip-components', str(strip_components)) if strip_components else ()) + (('--strip-components', str(strip_components)) if strip_components else ())
+ (('--progress',) if progress else ()) + (('--progress',) if progress else ())
+ (('--stdout',) if extract_to_stdout else ()) + (('--stdout',) if extract_to_stdout else ())
+ flags.make_repository_archive_flags( + flags.make_repository_archive_flags(repository, archive, local_borg_version,)
repository if ':' in repository else os.path.abspath(repository),
archive,
local_borg_version,
)
+ (tuple(paths) if paths else ()) + (tuple(paths) if paths else ())
) )

View File

@ -14,6 +14,7 @@ class Feature(Enum):
RLIST = 8 RLIST = 8
RINFO = 9 RINFO = 9
MATCH_ARCHIVES = 10 MATCH_ARCHIVES = 10
EXCLUDED_FILES_MINUS = 11
FEATURE_TO_MINIMUM_BORG_VERSION = { FEATURE_TO_MINIMUM_BORG_VERSION = {
@ -27,6 +28,7 @@ FEATURE_TO_MINIMUM_BORG_VERSION = {
Feature.RLIST: parse_version('2.0.0a2'), # borg rlist Feature.RLIST: parse_version('2.0.0a2'), # borg rlist
Feature.RINFO: parse_version('2.0.0a2'), # borg rinfo Feature.RINFO: parse_version('2.0.0a2'), # borg rinfo
Feature.MATCH_ARCHIVES: parse_version('2.0.0b3'), # borg --match-archives Feature.MATCH_ARCHIVES: parse_version('2.0.0b3'), # borg --match-archives
Feature.EXCLUDED_FILES_MINUS: parse_version('2.0.0b5'), # --list --filter uses "-" for excludes
} }

View File

@ -1,4 +1,5 @@
import itertools import itertools
import re
from borgmatic.borg import feature from borgmatic.borg import feature
@ -10,7 +11,7 @@ def make_flags(name, value):
if not value: if not value:
return () return ()
flag = '--{}'.format(name.replace('_', '-')) flag = f"--{name.replace('_', '-')}"
if value is True: if value is True:
return (flag,) return (flag,)
@ -33,7 +34,7 @@ def make_flags_from_arguments(arguments, excludes=()):
) )
def make_repository_flags(repository, local_borg_version): def make_repository_flags(repository_path, local_borg_version):
''' '''
Given the path of a Borg repository and the local Borg version, return Borg-version-appropriate Given the path of a Borg repository and the local Borg version, return Borg-version-appropriate
command-line flags (as a tuple) for selecting that repository. command-line flags (as a tuple) for selecting that repository.
@ -42,17 +43,41 @@ def make_repository_flags(repository, local_borg_version):
('--repo',) ('--repo',)
if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version) if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version)
else () else ()
) + (repository,) ) + (repository_path,)
def make_repository_archive_flags(repository, archive, local_borg_version): def make_repository_archive_flags(repository_path, archive, local_borg_version):
''' '''
Given the path of a Borg repository, an archive name or pattern, and the local Borg version, Given the path of a Borg repository, an archive name or pattern, and the local Borg version,
return Borg-version-appropriate command-line flags (as a tuple) for selecting that repository return Borg-version-appropriate command-line flags (as a tuple) for selecting that repository
and archive. and archive.
''' '''
return ( return (
('--repo', repository, archive) ('--repo', repository_path, archive)
if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version) if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version)
else (f'{repository}::{archive}',) else (f'{repository_path}::{archive}',)
) )
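Concretely, with an illustrative repository path and archive name:
# make_repository_flags('/repo.borg', '2.0.0b3')               -> ('--repo', '/repo.borg')
# make_repository_flags('/repo.borg', '1.2.3')                 -> ('/repo.borg',)
# make_repository_archive_flags('/repo.borg', 'foo', '1.2.3')  -> ('/repo.borg::foo',)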
def make_match_archives_flags(match_archives, archive_name_format, local_borg_version):
'''
Return match archives flags based on the given match archives value, if any. If it isn't set,
return match archives flags to match archives created with the given archive name format, if
any. This is done by replacing certain archive name format placeholders for ephemeral data (like
"{now}") with globs.
'''
if match_archives:
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
return ('--match-archives', match_archives)
else:
return ('--glob-archives', re.sub(r'^sh:', '', match_archives))
if not archive_name_format:
return ()
derived_match_archives = re.sub(r'\{(now|utcnow|pid)([:%\w\.-]*)\}', '*', archive_name_format)
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
return ('--match-archives', f'sh:{derived_match_archives}')
else:
return ('--glob-archives', f'{derived_match_archives}')
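For example (a sketch; "{hostname}" is not an ephemeral placeholder, so it is left for Borg to expand):
# make_match_archives_flags(None, '{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}', '2.0.0b3')
# -> ('--match-archives', 'sh:{hostname}-*')
# make_match_archives_flags('sh:myhost-*', None, '1.2.3')
# -> ('--glob-archives', 'myhost-*'), the "sh:" prefix being stripped for --glob-archives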

View File

@ -8,7 +8,7 @@ logger = logging.getLogger(__name__)
def display_archives_info( def display_archives_info(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
info_arguments, info_arguments,
@ -44,17 +44,20 @@ def display_archives_info(
else flags.make_flags('glob-archives', f'{info_arguments.prefix}*') else flags.make_flags('glob-archives', f'{info_arguments.prefix}*')
) )
if info_arguments.prefix if info_arguments.prefix
else () else (
flags.make_match_archives_flags(
info_arguments.match_archives
or info_arguments.archive
or storage_config.get('match_archives'),
storage_config.get('archive_name_format'),
local_borg_version,
)
)
) )
+ flags.make_flags_from_arguments( + flags.make_flags_from_arguments(
info_arguments, excludes=('repository', 'archive', 'prefix') info_arguments, excludes=('repository', 'archive', 'prefix', 'match_archives')
)
+ flags.make_repository_flags(repository, local_borg_version)
+ (
flags.make_flags('match-archives', info_arguments.archive)
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
else flags.make_flags('glob-archives', info_arguments.archive)
) )
+ flags.make_repository_flags(repository_path, local_borg_version)
) )
if info_arguments.json: if info_arguments.json:

View File

@ -21,7 +21,7 @@ MAKE_FLAGS_EXCLUDES = (
def make_list_command( def make_list_command(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
list_arguments, list_arguments,
@ -52,10 +52,10 @@ def make_list_command(
+ flags.make_flags_from_arguments(list_arguments, excludes=MAKE_FLAGS_EXCLUDES) + flags.make_flags_from_arguments(list_arguments, excludes=MAKE_FLAGS_EXCLUDES)
+ ( + (
flags.make_repository_archive_flags( flags.make_repository_archive_flags(
repository, list_arguments.archive, local_borg_version repository_path, list_arguments.archive, local_borg_version
) )
if list_arguments.archive if list_arguments.archive
else flags.make_repository_flags(repository, local_borg_version) else flags.make_repository_flags(repository_path, local_borg_version)
) )
+ (tuple(list_arguments.paths) if list_arguments.paths else ()) + (tuple(list_arguments.paths) if list_arguments.paths else ())
) )
@ -86,7 +86,7 @@ def make_find_paths(find_paths):
def capture_archive_listing( def capture_archive_listing(
repository, repository_path,
archive, archive,
storage_config, storage_config,
local_borg_version, local_borg_version,
@ -104,16 +104,16 @@ def capture_archive_listing(
return tuple( return tuple(
execute_command_and_capture_output( execute_command_and_capture_output(
make_list_command( make_list_command(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
argparse.Namespace( argparse.Namespace(
repository=repository, repository=repository_path,
archive=archive, archive=archive,
paths=[f'sh:{list_path}'], paths=[f'sh:{list_path}'],
find_paths=None, find_paths=None,
json=None, json=None,
format='{path}{NL}', format='{path}{NL}', # noqa: FS003
), ),
local_path, local_path,
remote_path, remote_path,
@ -126,7 +126,7 @@ def capture_archive_listing(
def list_archive( def list_archive(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
list_arguments, list_arguments,
@ -149,7 +149,7 @@ def list_archive(
) )
rlist_arguments = argparse.Namespace( rlist_arguments = argparse.Namespace(
repository=repository, repository=repository_path,
short=list_arguments.short, short=list_arguments.short,
format=list_arguments.format, format=list_arguments.format,
json=list_arguments.json, json=list_arguments.json,
@ -160,7 +160,12 @@ def list_archive(
last=list_arguments.last, last=list_arguments.last,
) )
return rlist.list_repository( return rlist.list_repository(
repository, storage_config, local_borg_version, rlist_arguments, local_path, remote_path repository_path,
storage_config,
local_borg_version,
rlist_arguments,
local_path,
remote_path,
) )
if list_arguments.archive: if list_arguments.archive:
@ -181,7 +186,7 @@ def list_archive(
# getting a list of archives to search. # getting a list of archives to search.
if list_arguments.find_paths and not list_arguments.archive: if list_arguments.find_paths and not list_arguments.archive:
rlist_arguments = argparse.Namespace( rlist_arguments = argparse.Namespace(
repository=repository, repository=repository_path,
short=True, short=True,
format=None, format=None,
json=None, json=None,
@ -196,7 +201,7 @@ def list_archive(
archive_lines = tuple( archive_lines = tuple(
execute_command_and_capture_output( execute_command_and_capture_output(
rlist.make_rlist_command( rlist.make_rlist_command(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
rlist_arguments, rlist_arguments,
@ -213,7 +218,7 @@ def list_archive(
# For each archive listed by Borg, run list on the contents of that archive. # For each archive listed by Borg, run list on the contents of that archive.
for archive in archive_lines: for archive in archive_lines:
logger.answer(f'{repository}: Listing archive {archive}') logger.answer(f'{repository_path}: Listing archive {archive}')
archive_arguments = copy.copy(list_arguments) archive_arguments = copy.copy(list_arguments)
archive_arguments.archive = archive archive_arguments.archive = archive
@ -224,7 +229,7 @@ def list_archive(
setattr(archive_arguments, name, None) setattr(archive_arguments, name, None)
main_command = make_list_command( main_command = make_list_command(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
archive_arguments, archive_arguments,

View File

@ -7,7 +7,7 @@ logger = logging.getLogger(__name__)
def mount_archive( def mount_archive(
repository, repository_path,
archive, archive,
mount_point, mount_point,
paths, paths,
@ -38,7 +38,7 @@ def mount_archive(
+ (('-o', options) if options else ()) + (('-o', options) if options else ())
+ ( + (
( (
flags.make_repository_flags(repository, local_borg_version) flags.make_repository_flags(repository_path, local_borg_version)
+ ( + (
('--match-archives', archive) ('--match-archives', archive)
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version) if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
@ -47,9 +47,9 @@ def mount_archive(
) )
if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version) if feature.available(feature.Feature.SEPARATE_REPOSITORY_ARCHIVE, local_borg_version)
else ( else (
flags.make_repository_archive_flags(repository, archive, local_borg_version) flags.make_repository_archive_flags(repository_path, archive, local_borg_version)
if archive if archive
else flags.make_repository_flags(repository, local_borg_version) else flags.make_repository_flags(repository_path, local_borg_version)
) )
) )
+ (mount_point,) + (mount_point,)

View File

@ -7,10 +7,10 @@ from borgmatic.execute import execute_command
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
def make_prune_flags(retention_config, local_borg_version): def make_prune_flags(storage_config, retention_config, local_borg_version):
''' '''
Given a retention config dict mapping from option name to value, tranform it into an iterable of Given a retention config dict mapping from option name to value, transform it into a sequence of
command-line name-value flag pairs. command-line flags.
For example, given a retention config of: For example, given a retention config of:
@ -24,22 +24,32 @@ def make_prune_flags(retention_config, local_borg_version):
) )
''' '''
config = retention_config.copy() config = retention_config.copy()
prefix = config.pop('prefix', '{hostname}-') prefix = config.pop('prefix', None)
if prefix: flag_pairs = (
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version):
config['match_archives'] = f'sh:{prefix}*'
else:
config['glob_archives'] = f'{prefix}*'
return (
('--' + option_name.replace('_', '-'), str(value)) for option_name, value in config.items() ('--' + option_name.replace('_', '-'), str(value)) for option_name, value in config.items()
) )
return tuple(element for pair in flag_pairs for element in pair) + (
(
('--match-archives', f'sh:{prefix}*')
if feature.available(feature.Feature.MATCH_ARCHIVES, local_borg_version)
else ('--glob-archives', f'{prefix}*')
)
if prefix
else (
flags.make_match_archives_flags(
storage_config.get('match_archives'),
storage_config.get('archive_name_format'),
local_borg_version,
)
)
)
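A sketch of the combined result, with hypothetical config values and a Borg 2 client:
# make_prune_flags(
#     {'archive_name_format': '{hostname}-{now}'},  # storage_config
#     {'keep_daily': 7},                            # retention_config, no prefix
#     '2.0.0b3',
# )
# -> ('--keep-daily', '7', '--match-archives', 'sh:{hostname}-*')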
def prune_archives( def prune_archives(
dry_run, dry_run,
repository, repository_path,
storage_config, storage_config,
retention_config, retention_config,
local_borg_version, local_borg_version,
@ -60,11 +70,7 @@ def prune_archives(
full_command = ( full_command = (
(local_path, 'prune') (local_path, 'prune')
+ tuple( + make_prune_flags(storage_config, retention_config, local_borg_version)
element
for pair in make_prune_flags(retention_config, local_borg_version)
for element in pair
)
+ (('--remote-path', remote_path) if remote_path else ()) + (('--remote-path', remote_path) if remote_path else ())
+ (('--umask', str(umask)) if umask else ()) + (('--umask', str(umask)) if umask else ())
+ (('--lock-wait', str(lock_wait)) if lock_wait else ()) + (('--lock-wait', str(lock_wait)) if lock_wait else ())
@ -74,7 +80,7 @@ def prune_archives(
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ()) + (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--dry-run',) if dry_run else ()) + (('--dry-run',) if dry_run else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ()) + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ flags.make_repository_flags(repository, local_borg_version) + flags.make_repository_flags(repository_path, local_borg_version)
) )
if stats or list_archives: if stats or list_archives:

View File

@ -13,7 +13,7 @@ RINFO_REPOSITORY_NOT_FOUND_EXIT_CODE = 2
def create_repository( def create_repository(
dry_run, dry_run,
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
encryption_mode, encryption_mode,
@ -33,14 +33,14 @@ def create_repository(
''' '''
try: try:
rinfo.display_repository_info( rinfo.display_repository_info(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
argparse.Namespace(json=True), argparse.Namespace(json=True),
local_path, local_path,
remote_path, remote_path,
) )
logger.info(f'{repository}: Repository already exists. Skipping creation.') logger.info(f'{repository_path}: Repository already exists. Skipping creation.')
return return
except subprocess.CalledProcessError as error: except subprocess.CalledProcessError as error:
if error.returncode != RINFO_REPOSITORY_NOT_FOUND_EXIT_CODE: if error.returncode != RINFO_REPOSITORY_NOT_FOUND_EXIT_CODE:
@ -65,11 +65,11 @@ def create_repository(
+ (('--debug',) if logger.isEnabledFor(logging.DEBUG) else ()) + (('--debug',) if logger.isEnabledFor(logging.DEBUG) else ())
+ (('--remote-path', remote_path) if remote_path else ()) + (('--remote-path', remote_path) if remote_path else ())
+ (tuple(extra_borg_options.split(' ')) if extra_borg_options else ()) + (tuple(extra_borg_options.split(' ')) if extra_borg_options else ())
+ flags.make_repository_flags(repository, local_borg_version) + flags.make_repository_flags(repository_path, local_borg_version)
) )
if dry_run: if dry_run:
logging.info(f'{repository}: Skipping repository creation (dry run)') logging.info(f'{repository_path}: Skipping repository creation (dry run)')
return return
# Do not capture output here, so as to support interactive prompts. # Do not capture output here, so as to support interactive prompts.

View File

@ -8,7 +8,7 @@ logger = logging.getLogger(__name__)
def display_repository_info( def display_repository_info(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
rinfo_arguments, rinfo_arguments,
@ -43,7 +43,7 @@ def display_repository_info(
+ flags.make_flags('remote-path', remote_path) + flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', lock_wait) + flags.make_flags('lock-wait', lock_wait)
+ (('--json',) if rinfo_arguments.json else ()) + (('--json',) if rinfo_arguments.json else ())
+ flags.make_repository_flags(repository, local_borg_version) + flags.make_repository_flags(repository_path, local_borg_version)
) )
extra_environment = environment.make_environment(storage_config) extra_environment = environment.make_environment(storage_config)

View File

@ -8,7 +8,12 @@ logger = logging.getLogger(__name__)
def resolve_archive_name( def resolve_archive_name(
repository, archive, storage_config, local_borg_version, local_path='borg', remote_path=None repository_path,
archive,
storage_config,
local_borg_version,
local_path='borg',
remote_path=None,
): ):
''' '''
Given a local or remote repository path, an archive name, a storage config dict, a local Borg Given a local or remote repository path, an archive name, a storage config dict, a local Borg
@ -17,7 +22,7 @@ def resolve_archive_name(
Raise ValueError if "latest" is given but there are no archives in the repository. Raise ValueError if "latest" is given but there are no archives in the repository.
''' '''
if archive != "latest": if archive != 'latest':
return archive return archive
lock_wait = storage_config.get('lock_wait', None) lock_wait = storage_config.get('lock_wait', None)
@ -31,7 +36,7 @@ def resolve_archive_name(
+ flags.make_flags('lock-wait', lock_wait) + flags.make_flags('lock-wait', lock_wait)
+ flags.make_flags('last', 1) + flags.make_flags('last', 1)
+ ('--short',) + ('--short',)
+ flags.make_repository_flags(repository, local_borg_version) + flags.make_repository_flags(repository_path, local_borg_version)
) )
output = execute_command_and_capture_output( output = execute_command_and_capture_output(
@ -42,16 +47,16 @@ def resolve_archive_name(
except IndexError: except IndexError:
raise ValueError('No archives found in the repository') raise ValueError('No archives found in the repository')
logger.debug('{}: Latest archive is {}'.format(repository, latest_archive)) logger.debug(f'{repository_path}: Latest archive is {latest_archive}')
return latest_archive return latest_archive
MAKE_FLAGS_EXCLUDES = ('repository', 'prefix') MAKE_FLAGS_EXCLUDES = ('repository', 'prefix', 'match_archives')
def make_rlist_command( def make_rlist_command(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
rlist_arguments, rlist_arguments,
@ -89,15 +94,21 @@ def make_rlist_command(
else flags.make_flags('glob-archives', f'{rlist_arguments.prefix}*') else flags.make_flags('glob-archives', f'{rlist_arguments.prefix}*')
) )
if rlist_arguments.prefix if rlist_arguments.prefix
else () else (
flags.make_match_archives_flags(
rlist_arguments.match_archives or storage_config.get('match_archives'),
storage_config.get('archive_name_format'),
local_borg_version,
)
)
) )
+ flags.make_flags_from_arguments(rlist_arguments, excludes=MAKE_FLAGS_EXCLUDES) + flags.make_flags_from_arguments(rlist_arguments, excludes=MAKE_FLAGS_EXCLUDES)
+ flags.make_repository_flags(repository, local_borg_version) + flags.make_repository_flags(repository_path, local_borg_version)
) )
def list_repository( def list_repository(
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
rlist_arguments, rlist_arguments,
@ -113,11 +124,16 @@ def list_repository(
borg_environment = environment.make_environment(storage_config) borg_environment = environment.make_environment(storage_config)
main_command = make_rlist_command( main_command = make_rlist_command(
repository, storage_config, local_borg_version, rlist_arguments, local_path, remote_path repository_path,
storage_config,
local_borg_version,
rlist_arguments,
local_path,
remote_path,
) )
if rlist_arguments.json: if rlist_arguments.json:
return execute_command_and_capture_output(main_command, extra_environment=borg_environment,) return execute_command_and_capture_output(main_command, extra_environment=borg_environment)
else: else:
execute_command( execute_command(
main_command, main_command,

View File

@ -9,7 +9,7 @@ logger = logging.getLogger(__name__)
def transfer_archives( def transfer_archives(
dry_run, dry_run,
repository, repository_path,
storage_config, storage_config,
local_borg_version, local_borg_version,
transfer_arguments, transfer_arguments,
@ -28,17 +28,22 @@ def transfer_archives(
+ (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ()) + (('--debug', '--show-rc') if logger.isEnabledFor(logging.DEBUG) else ())
+ flags.make_flags('remote-path', remote_path) + flags.make_flags('remote-path', remote_path)
+ flags.make_flags('lock-wait', storage_config.get('lock_wait', None)) + flags.make_flags('lock-wait', storage_config.get('lock_wait', None))
+ (('--progress',) if transfer_arguments.progress else ())
+ ( + (
flags.make_flags( flags.make_flags_from_arguments(
'match-archives', transfer_arguments.match_archives or transfer_arguments.archive transfer_arguments,
excludes=('repository', 'source_repository', 'archive', 'match_archives'),
)
or (
flags.make_match_archives_flags(
transfer_arguments.match_archives
or transfer_arguments.archive
or storage_config.get('match_archives'),
storage_config.get('archive_name_format'),
local_borg_version,
)
) )
) )
+ flags.make_flags_from_arguments( + flags.make_repository_flags(repository_path, local_borg_version)
transfer_arguments,
excludes=('repository', 'source_repository', 'archive', 'match_archives'),
)
+ flags.make_repository_flags(repository, local_borg_version)
+ flags.make_flags('other-repo', transfer_arguments.source_repository) + flags.make_flags('other-repo', transfer_arguments.source_repository)
+ flags.make_flags('dry-run', dry_run) + flags.make_flags('dry-run', dry_run)
) )

View File

@ -46,11 +46,12 @@ def parse_subparser_arguments(unparsed_arguments, subparsers):
if 'borg' in unparsed_arguments: if 'borg' in unparsed_arguments:
subparsers = {'borg': subparsers['borg']} subparsers = {'borg': subparsers['borg']}
for subparser_name, subparser in subparsers.items(): for argument in remaining_arguments:
if subparser_name not in remaining_arguments: canonical_name = alias_to_subparser_name.get(argument, argument)
continue subparser = subparsers.get(canonical_name)
canonical_name = alias_to_subparser_name.get(subparser_name, subparser_name) if not subparser:
continue
# If a parsed value happens to be the same as the name of a subparser, remove it from the # If a parsed value happens to be the same as the name of a subparser, remove it from the
# remaining arguments. This prevents, for instance, "check --only extract" from triggering # remaining arguments. This prevents, for instance, "check --only extract" from triggering
@ -67,9 +68,9 @@ def parse_subparser_arguments(unparsed_arguments, subparsers):
arguments[canonical_name] = parsed arguments[canonical_name] = parsed
# If no actions are explicitly requested, assume defaults: prune, compact, create, and check. # If no actions are explicitly requested, assume defaults.
if not arguments and '--help' not in unparsed_arguments and '-h' not in unparsed_arguments: if not arguments and '--help' not in unparsed_arguments and '-h' not in unparsed_arguments:
for subparser_name in ('prune', 'compact', 'create', 'check'): for subparser_name in ('create', 'prune', 'compact', 'check'):
subparser = subparsers[subparser_name] subparser = subparsers[subparser_name]
parsed, unused_remaining = subparser.parse_known_args(unparsed_arguments) parsed, unused_remaining = subparser.parse_known_args(unparsed_arguments)
arguments[subparser_name] = parsed arguments[subparser_name] = parsed
@ -130,9 +131,7 @@ def make_parsers():
nargs='*', nargs='*',
dest='config_paths', dest='config_paths',
default=config_paths, default=config_paths,
help='Configuration filenames or directories, defaults to: {}'.format( help=f"Configuration filenames or directories, defaults to: {' '.join(unexpanded_config_paths)}",
' '.join(unexpanded_config_paths)
),
) )
global_group.add_argument( global_group.add_argument(
'--excludes', '--excludes',
@ -179,10 +178,12 @@ def make_parsers():
help='Log verbose progress to monitoring integrations that support logging (from only errors to very verbose: -1, 0, 1, or 2)', help='Log verbose progress to monitoring integrations that support logging (from only errors to very verbose: -1, 0, 1, or 2)',
) )
global_group.add_argument( global_group.add_argument(
'--log-file', '--log-file', type=str, help='Write log messages to this file instead of syslog',
)
global_group.add_argument(
'--log-file-format',
type=str, type=str,
default=None, help='Log format string used for log messages written to the log file',
help='Write log messages to this file instead of syslog',
) )
global_group.add_argument( global_group.add_argument(
'--override', '--override',
@ -215,7 +216,7 @@ def make_parsers():
top_level_parser = ArgumentParser( top_level_parser = ArgumentParser(
description=''' description='''
Simple, configuration-driven backup software for servers and workstations. If none of Simple, configuration-driven backup software for servers and workstations. If none of
the action options are given, then borgmatic defaults to: prune, compact, create, and the action options are given, then borgmatic defaults to: create, prune, compact, and
check. check.
''', ''',
parents=[global_parser], parents=[global_parser],
@ -224,7 +225,7 @@ def make_parsers():
subparsers = top_level_parser.add_subparsers( subparsers = top_level_parser.add_subparsers(
title='actions', title='actions',
metavar='', metavar='',
help='Specify zero or more actions. Defaults to prune, compact, create, and check. Use --help with action for details:', help='Specify zero or more actions. Defaults to create, prune, compact, and check. Use --help with action for details:',
) )
rcreate_parser = subparsers.add_parser( rcreate_parser = subparsers.add_parser(
'rcreate', 'rcreate',
@ -294,7 +295,7 @@ def make_parsers():
) )
transfer_group.add_argument( transfer_group.add_argument(
'--upgrader', '--upgrader',
help='Upgrader type used to convert the transfered data, e.g. "From12To20" to upgrade data from Borg 1.2 to 2.0 format, defaults to no conversion', help='Upgrader type used to convert the transferred data, e.g. "From12To20" to upgrade data from Borg 1.2 to 2.0 format, defaults to no conversion',
) )
transfer_group.add_argument( transfer_group.add_argument(
'--progress', '--progress',
@ -332,6 +333,10 @@ def make_parsers():
add_help=False, add_help=False,
) )
prune_group = prune_parser.add_argument_group('prune arguments') prune_group = prune_parser.add_argument_group('prune arguments')
prune_group.add_argument(
'--repository',
help='Path of specific existing repository to prune (must be already specified in a borgmatic configuration file)',
)
prune_group.add_argument( prune_group.add_argument(
'--stats', '--stats',
dest='stats', dest='stats',
@ -352,6 +357,10 @@ def make_parsers():
add_help=False, add_help=False,
) )
compact_group = compact_parser.add_argument_group('compact arguments') compact_group = compact_parser.add_argument_group('compact arguments')
compact_group.add_argument(
'--repository',
help='Path of specific existing repository to compact (must be already specified in a borgmatic configuration file)',
)
compact_group.add_argument( compact_group.add_argument(
'--progress', '--progress',
dest='progress', dest='progress',
@ -384,6 +393,10 @@ def make_parsers():
add_help=False, add_help=False,
) )
create_group = create_parser.add_argument_group('create arguments') create_group = create_parser.add_argument_group('create arguments')
create_group.add_argument(
'--repository',
help='Path of specific existing repository to backup to (must be already specified in a borgmatic configuration file)',
)
create_group.add_argument( create_group.add_argument(
'--progress', '--progress',
dest='progress', dest='progress',
@ -414,6 +427,10 @@ def make_parsers():
add_help=False, add_help=False,
) )
check_group = check_parser.add_argument_group('check arguments') check_group = check_parser.add_argument_group('check arguments')
check_group.add_argument(
'--repository',
help='Path of specific existing repository to check (must be already specified in a borgmatic configuration file)',
)
check_group.add_argument( check_group.add_argument(
'--progress', '--progress',
dest='progress', dest='progress',
@ -475,10 +492,9 @@ def make_parsers():
) )
extract_group.add_argument( extract_group.add_argument(
'--strip-components', '--strip-components',
type=int, type=lambda number: number if number == 'all' else int(number),
metavar='NUMBER', metavar='NUMBER',
dest='strip_components', help='Number of leading path components to remove from each extracted path or "all" to strip all leading path components. Skip paths with fewer elements',
help='Number of leading path components to remove from each extracted path. Skip paths with fewer elements',
) )
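The new type callable keeps the literal string 'all' and coerces everything else to an integer; a non-numeric value raises ValueError, which argparse reports as an invalid argument:
# parse = lambda number: number if number == 'all' else int(number)
# parse('all')  -> 'all'
# parse('2')    -> 2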
extract_group.add_argument( extract_group.add_argument(
'--progress', '--progress',
@ -611,7 +627,7 @@ def make_parsers():
metavar='NAME', metavar='NAME',
nargs='+', nargs='+',
dest='databases', dest='databases',
help='Names of databases to restore from archive, defaults to all databases. Note that any databases to restore must be defined in borgmatic\'s configuration', help="Names of databases to restore from archive, defaults to all databases. Note that any databases to restore must be defined in borgmatic's configuration",
) )
restore_group.add_argument( restore_group.add_argument(
'-h', '--help', action='help', help='Show this help message and exit' '-h', '--help', action='help', help='Show this help message and exit'
@ -636,7 +652,7 @@ def make_parsers():
'--json', default=False, action='store_true', help='Output results as JSON' '--json', default=False, action='store_true', help='Output results as JSON'
) )
rlist_group.add_argument( rlist_group.add_argument(
'-P', '--prefix', help='Only list archive names starting with this prefix' '-P', '--prefix', help='Deprecated. Only list archive names starting with this prefix'
) )
rlist_group.add_argument( rlist_group.add_argument(
'-a', '-a',
@ -691,7 +707,7 @@ def make_parsers():
'--json', default=False, action='store_true', help='Output results as JSON' '--json', default=False, action='store_true', help='Output results as JSON'
) )
list_group.add_argument( list_group.add_argument(
'-P', '--prefix', help='Only list archive names starting with this prefix' '-P', '--prefix', help='Deprecated. Only list archive names starting with this prefix'
) )
list_group.add_argument( list_group.add_argument(
'-a', '-a',
@ -763,7 +779,9 @@ def make_parsers():
'--json', dest='json', default=False, action='store_true', help='Output results as JSON' '--json', dest='json', default=False, action='store_true', help='Output results as JSON'
) )
info_group.add_argument( info_group.add_argument(
'-P', '--prefix', help='Only show info for archive names starting with this prefix' '-P',
'--prefix',
help='Deprecated. Only show info for archive names starting with this prefix',
) )
info_group.add_argument( info_group.add_argument(
'-a', '-a',
@ -805,7 +823,7 @@ def make_parsers():
'borg', 'borg',
aliases=SUBPARSER_ALIASES['borg'], aliases=SUBPARSER_ALIASES['borg'],
help='Run an arbitrary Borg command', help='Run an arbitrary Borg command',
description='Run an arbitrary Borg command based on borgmatic\'s configuration', description="Run an arbitrary Borg command based on borgmatic's configuration",
add_help=False, add_help=False,
) )
borg_group = borg_parser.add_argument_group('borg arguments') borg_group = borg_parser.add_argument_group('borg arguments')
@ -861,7 +879,17 @@ def parse_arguments(*unparsed_arguments):
and arguments['transfer'].match_archives and arguments['transfer'].match_archives
): ):
raise ValueError( raise ValueError(
'With the transfer action, only one of --archive and --glob-archives flags can be used.' 'With the transfer action, only one of --archive and --match-archives flags can be used.'
)
if 'list' in arguments and (arguments['list'].prefix and arguments['list'].match_archives):
raise ValueError(
'With the list action, only one of --prefix or --match-archives flags can be used.'
)
if 'rlist' in arguments and (arguments['rlist'].prefix and arguments['rlist'].match_archives):
raise ValueError(
'With the rlist action, only one of --prefix or --match-archives flags can be used.'
) )
if 'info' in arguments and ( if 'info' in arguments and (

View File

@ -44,8 +44,8 @@ LEGACY_CONFIG_PATH = '/etc/borgmatic/config'
def run_configuration(config_filename, config, arguments): def run_configuration(config_filename, config, arguments):
''' '''
Given a config filename, the corresponding parsed config dict, and command-line arguments as a Given a config filename, the corresponding parsed config dict, and command-line arguments as a
dict from subparser name to a namespace of parsed arguments, execute the defined prune, compact, dict from subparser name to a namespace of parsed arguments, execute the defined create, prune,
create, check, and/or other actions. compact, check, and/or other actions.
Yield a combination of: Yield a combination of:
@ -64,15 +64,13 @@ def run_configuration(config_filename, config, arguments):
retry_wait = storage.get('retry_wait', 0) retry_wait = storage.get('retry_wait', 0)
encountered_error = None encountered_error = None
error_repository = '' error_repository = ''
using_primary_action = {'prune', 'compact', 'create', 'check'}.intersection(arguments) using_primary_action = {'create', 'prune', 'compact', 'check'}.intersection(arguments)
monitoring_log_level = verbosity_to_log_level(global_arguments.monitoring_verbosity) monitoring_log_level = verbosity_to_log_level(global_arguments.monitoring_verbosity)
try: try:
local_borg_version = borg_version.local_borg_version(storage, local_path) local_borg_version = borg_version.local_borg_version(storage, local_path)
except (OSError, CalledProcessError, ValueError) as error: except (OSError, CalledProcessError, ValueError) as error:
yield from log_error_records( yield from log_error_records(f'{config_filename}: Error getting local Borg version', error)
'{}: Error getting local Borg version'.format(config_filename), error
)
return return
try: try:
@ -100,7 +98,7 @@ def run_configuration(config_filename, config, arguments):
return return
encountered_error = error encountered_error = error
yield from log_error_records('{}: Error pinging monitor'.format(config_filename), error) yield from log_error_records(f'{config_filename}: Error pinging monitor', error)
if not encountered_error: if not encountered_error:
repo_queue = Queue() repo_queue = Queue()
@ -108,7 +106,8 @@ def run_configuration(config_filename, config, arguments):
repo_queue.put((repo, 0),) repo_queue.put((repo, 0),)
while not repo_queue.empty(): while not repo_queue.empty():
repository_path, retry_num = repo_queue.get() repository, retry_num = repo_queue.get()
logger.debug(f'{repository["path"]}: Running actions for repository')
timeout = retry_num * retry_wait timeout = retry_num * retry_wait
if timeout: if timeout:
logger.warning(f'{config_filename}: Sleeping {timeout}s before next retry') logger.warning(f'{config_filename}: Sleeping {timeout}s before next retry')
@ -125,14 +124,14 @@ def run_configuration(config_filename, config, arguments):
local_path=local_path, local_path=local_path,
remote_path=remote_path, remote_path=remote_path,
local_borg_version=local_borg_version, local_borg_version=local_borg_version,
repository_path=repository_path, repository=repository,
) )
except (OSError, CalledProcessError, ValueError) as error: except (OSError, CalledProcessError, ValueError) as error:
if retry_num < retries: if retry_num < retries:
repo_queue.put((repository_path, retry_num + 1),) repo_queue.put((repository, retry_num + 1),)
tuple( # Consume the generator so as to trigger logging. tuple( # Consume the generator so as to trigger logging.
log_error_records( log_error_records(
'{}: Error running actions for repository'.format(repository_path), f'{repository["path"]}: Error running actions for repository',
error, error,
levelno=logging.WARNING, levelno=logging.WARNING,
log_command_error_output=True, log_command_error_output=True,
@ -147,10 +146,29 @@ def run_configuration(config_filename, config, arguments):
return return
yield from log_error_records( yield from log_error_records(
'{}: Error running actions for repository'.format(repository_path), error f'{repository["path"]}: Error running actions for repository', error
) )
encountered_error = error encountered_error = error
error_repository = repository_path error_repository = repository['path']
try:
if using_primary_action:
# Send logs irrespective of error
dispatch.call_hooks(
'ping_monitor',
hooks,
config_filename,
monitor.MONITOR_HOOK_NAMES,
monitor.State.LOG,
monitoring_log_level,
global_arguments.dry_run,
)
except (OSError, CalledProcessError) as error:
if command.considered_soft_failure(config_filename, error):
return
encountered_error = error
yield from log_error_records(f'{repository["path"]}: Error pinging monitor', error)
if not encountered_error: if not encountered_error:
try: try:
@ -177,7 +195,7 @@ def run_configuration(config_filename, config, arguments):
return return
encountered_error = error encountered_error = error
yield from log_error_records('{}: Error pinging monitor'.format(config_filename), error) yield from log_error_records(f'{config_filename}: Error pinging monitor', error)
if encountered_error and using_primary_action: if encountered_error and using_primary_action:
try: try:
@ -212,9 +230,7 @@ def run_configuration(config_filename, config, arguments):
if command.considered_soft_failure(config_filename, error): if command.considered_soft_failure(config_filename, error):
return return
yield from log_error_records( yield from log_error_records(f'{config_filename}: Error running on-error hook', error)
'{}: Error running on-error hook'.format(config_filename), error
)
def run_actions( def run_actions(
@ -229,7 +245,7 @@ def run_actions(
local_path, local_path,
remote_path, remote_path,
local_borg_version, local_borg_version,
repository_path, repository,
): ):
''' '''
Given parsed command-line arguments as an argparse.ArgumentParser instance, the configuration Given parsed command-line arguments as an argparse.ArgumentParser instance, the configuration
@ -244,13 +260,14 @@ def run_actions(
invalid. invalid.
''' '''
add_custom_log_levels() add_custom_log_levels()
repository = os.path.expanduser(repository_path) repository_path = os.path.expanduser(repository['path'])
global_arguments = arguments['global'] global_arguments = arguments['global']
dry_run_label = ' (dry run; not making any changes)' if global_arguments.dry_run else '' dry_run_label = ' (dry run; not making any changes)' if global_arguments.dry_run else ''
hook_context = { hook_context = {
'repository': repository_path, 'repository': repository_path,
# Deprecated: For backwards compatibility with borgmatic < 1.6.0. # Deprecated: For backwards compatibility with borgmatic < 1.6.0.
'repositories': ','.join(location['repositories']), 'repositories': ','.join([repo['path'] for repo in location['repositories']]),
'log_file': global_arguments.log_file if global_arguments.log_file else '',
} }
command.execute_hook( command.execute_hook(
@ -262,155 +279,162 @@ def run_actions(
**hook_context, **hook_context,
) )
if 'rcreate' in arguments: for (action_name, action_arguments) in arguments.items():
borgmatic.actions.rcreate.run_rcreate( if action_name == 'rcreate':
repository, borgmatic.actions.rcreate.run_rcreate(
storage, repository,
local_borg_version, storage,
arguments['rcreate'], local_borg_version,
global_arguments, action_arguments,
local_path, global_arguments,
remote_path, local_path,
) remote_path,
if 'transfer' in arguments: )
borgmatic.actions.transfer.run_transfer( elif action_name == 'transfer':
repository, borgmatic.actions.transfer.run_transfer(
storage, repository,
local_borg_version, storage,
arguments['transfer'], local_borg_version,
global_arguments, action_arguments,
local_path, global_arguments,
remote_path, local_path,
) remote_path,
if 'prune' in arguments: )
borgmatic.actions.prune.run_prune( elif action_name == 'create':
config_filename, yield from borgmatic.actions.create.run_create(
repository, config_filename,
storage, repository,
retention, location,
hooks, storage,
hook_context, hooks,
-                local_borg_version,
-                arguments['prune'],
-                global_arguments,
-                dry_run_label,
-                local_path,
-                remote_path,
-            )
-        if 'compact' in arguments:
-            borgmatic.actions.compact.run_compact(
-                config_filename,
-                repository,
-                storage,
-                retention,
-                hooks,
-                hook_context,
-                local_borg_version,
-                arguments['compact'],
-                global_arguments,
-                dry_run_label,
-                local_path,
-                remote_path,
-            )
-        if 'create' in arguments:
-            yield from borgmatic.actions.create.run_create(
-                config_filename,
-                repository,
-                location,
-                storage,
-                hooks,
-                hook_context,
-                local_borg_version,
-                arguments['create'],
-                global_arguments,
-                dry_run_label,
-                local_path,
-                remote_path,
-            )
-        if 'check' in arguments and checks.repository_enabled_for_checks(repository, consistency):
-            borgmatic.actions.check.run_check(
-                config_filename,
-                repository,
-                location,
-                storage,
-                consistency,
-                hooks,
-                hook_context,
-                local_borg_version,
-                arguments['check'],
-                global_arguments,
-                local_path,
-                remote_path,
-            )
-        if 'extract' in arguments:
-            borgmatic.actions.extract.run_extract(
-                config_filename,
-                repository,
-                location,
-                storage,
-                hooks,
-                hook_context,
-                local_borg_version,
-                arguments['extract'],
-                global_arguments,
-                local_path,
-                remote_path,
-            )
-        if 'export-tar' in arguments:
-            borgmatic.actions.export_tar.run_export_tar(
-                repository,
-                storage,
-                local_borg_version,
-                arguments['export-tar'],
-                global_arguments,
-                local_path,
-                remote_path,
-            )
-        if 'mount' in arguments:
-            borgmatic.actions.mount.run_mount(
-                repository, storage, local_borg_version, arguments['mount'], local_path, remote_path,
-            )
-        if 'restore' in arguments:
-            borgmatic.actions.restore.run_restore(
-                repository,
-                location,
-                storage,
-                hooks,
-                local_borg_version,
-                arguments['restore'],
-                global_arguments,
-                local_path,
-                remote_path,
-            )
-        if 'rlist' in arguments:
-            yield from borgmatic.actions.rlist.run_rlist(
-                repository, storage, local_borg_version, arguments['rlist'], local_path, remote_path,
-            )
-        if 'list' in arguments:
-            yield from borgmatic.actions.list.run_list(
-                repository, storage, local_borg_version, arguments['list'], local_path, remote_path,
-            )
-        if 'rinfo' in arguments:
-            yield from borgmatic.actions.rinfo.run_rinfo(
-                repository, storage, local_borg_version, arguments['rinfo'], local_path, remote_path,
-            )
-        if 'info' in arguments:
-            yield from borgmatic.actions.info.run_info(
-                repository, storage, local_borg_version, arguments['info'], local_path, remote_path,
-            )
-        if 'break-lock' in arguments:
-            borgmatic.actions.break_lock.run_break_lock(
-                repository,
-                storage,
-                local_borg_version,
-                arguments['break-lock'],
-                local_path,
-                remote_path,
-            )
-        if 'borg' in arguments:
-            borgmatic.actions.borg.run_borg(
-                repository, storage, local_borg_version, arguments['borg'], local_path, remote_path,
-            )
+                hook_context,
+                local_borg_version,
+                action_arguments,
+                global_arguments,
+                dry_run_label,
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'prune':
+            borgmatic.actions.prune.run_prune(
+                config_filename,
+                repository,
+                storage,
+                retention,
+                hooks,
+                hook_context,
+                local_borg_version,
+                action_arguments,
+                global_arguments,
+                dry_run_label,
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'compact':
+            borgmatic.actions.compact.run_compact(
+                config_filename,
+                repository,
+                storage,
+                retention,
+                hooks,
+                hook_context,
+                local_borg_version,
+                action_arguments,
+                global_arguments,
+                dry_run_label,
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'check':
+            if checks.repository_enabled_for_checks(repository, consistency):
+                borgmatic.actions.check.run_check(
+                    config_filename,
+                    repository,
+                    location,
+                    storage,
+                    consistency,
+                    hooks,
+                    hook_context,
+                    local_borg_version,
+                    action_arguments,
+                    global_arguments,
+                    local_path,
+                    remote_path,
+                )
+        elif action_name == 'extract':
+            borgmatic.actions.extract.run_extract(
+                config_filename,
+                repository,
+                location,
+                storage,
+                hooks,
+                hook_context,
+                local_borg_version,
+                action_arguments,
+                global_arguments,
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'export-tar':
+            borgmatic.actions.export_tar.run_export_tar(
+                repository,
+                storage,
+                local_borg_version,
+                action_arguments,
+                global_arguments,
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'mount':
+            borgmatic.actions.mount.run_mount(
+                repository,
+                storage,
+                local_borg_version,
+                arguments['mount'],
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'restore':
+            borgmatic.actions.restore.run_restore(
+                repository,
+                location,
+                storage,
+                hooks,
+                local_borg_version,
+                action_arguments,
+                global_arguments,
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'rlist':
+            yield from borgmatic.actions.rlist.run_rlist(
+                repository, storage, local_borg_version, action_arguments, local_path, remote_path,
+            )
+        elif action_name == 'list':
+            yield from borgmatic.actions.list.run_list(
+                repository, storage, local_borg_version, action_arguments, local_path, remote_path,
+            )
+        elif action_name == 'rinfo':
+            yield from borgmatic.actions.rinfo.run_rinfo(
+                repository, storage, local_borg_version, action_arguments, local_path, remote_path,
+            )
+        elif action_name == 'info':
+            yield from borgmatic.actions.info.run_info(
+                repository, storage, local_borg_version, action_arguments, local_path, remote_path,
+            )
+        elif action_name == 'break-lock':
+            borgmatic.actions.break_lock.run_break_lock(
+                repository,
+                storage,
+                local_borg_version,
+                arguments['break-lock'],
+                local_path,
+                remote_path,
+            )
+        elif action_name == 'borg':
+            borgmatic.actions.borg.run_borg(
+                repository, storage, local_borg_version, action_arguments, local_path, remote_path,
+            )

     command.execute_hook(
         hooks.get('after_actions'),
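The restructure above replaces one "if '<action>' in arguments:" block per action with a single dispatcher keyed on the action name. A minimal sketch of the new shape (the loop header is assumed from the action_name/action_arguments names in the diff; it is not shown in this hunk):

    # Hypothetical sketch: dispatch each parsed action to its run function.
    for action_name, action_arguments in arguments.items():
        if action_name == 'prune':
            borgmatic.actions.prune.run_prune(...)  # same argument list as in the diff
        elif action_name == 'compact':
            borgmatic.actions.compact.run_compact(...)
        # ... and so on for each remaining action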
@@ -446,9 +470,7 @@ def load_configurations(config_filenames, overrides=None, resolve_env=True):
                 dict(
                     levelno=logging.WARNING,
                     levelname='WARNING',
-                    msg='{}: Insufficient permissions to read configuration file'.format(
-                        config_filename
-                    ),
+                    msg=f'{config_filename}: Insufficient permissions to read configuration file',
                 )
             ),
         ]
@@ -460,7 +482,7 @@ def load_configurations(config_filenames, overrides=None, resolve_env=True):
                 dict(
                     levelno=logging.CRITICAL,
                     levelname='CRITICAL',
-                    msg='{}: Error parsing configuration file'.format(config_filename),
+                    msg=f'{config_filename}: Error parsing configuration file',
                 )
             ),
             logging.makeLogRecord(
@@ -561,9 +583,7 @@ def collect_configuration_run_summary_logs(configs, arguments):
     if not configs:
         yield from log_error_records(
-            '{}: No valid configuration files found'.format(
-                ' '.join(arguments['global'].config_paths)
-            )
+            f"{' '.join(arguments['global'].config_paths)}: No valid configuration files found",
         )
         return
@@ -589,23 +609,21 @@ def collect_configuration_run_summary_logs(configs, arguments):
         error_logs = tuple(result for result in results if isinstance(result, logging.LogRecord))

         if error_logs:
-            yield from log_error_records(
-                '{}: Error running configuration file'.format(config_filename)
-            )
+            yield from log_error_records(f'{config_filename}: An error occurred')
             yield from error_logs
         else:
             yield logging.makeLogRecord(
                 dict(
                     levelno=logging.INFO,
                     levelname='INFO',
-                    msg='{}: Successfully ran configuration file'.format(config_filename),
+                    msg=f'{config_filename}: Successfully ran configuration file',
                 )
             )
             if results:
                 json_results.extend(results)

     if 'umount' in arguments:
-        logger.info('Unmounting mount point {}'.format(arguments['umount'].mount_point))
+        logger.info(f"Unmounting mount point {arguments['umount'].mount_point}")
         try:
             borg_umount.unmount_archive(
                 mount_point=arguments['umount'].mount_point, local_path=get_local_path(configs),
@@ -653,7 +671,7 @@ def main():  # pragma: no cover
         if error.code == 0:
             raise error
         configure_logging(logging.CRITICAL)
-        logger.critical('Error parsing arguments: {}'.format(' '.join(sys.argv)))
+        logger.critical(f"Error parsing arguments: {' '.join(sys.argv)}")
         exit_with_help_link()

     global_arguments = arguments['global']
@@ -683,10 +701,11 @@ def main():  # pragma: no cover
             verbosity_to_log_level(global_arguments.log_file_verbosity),
             verbosity_to_log_level(global_arguments.monitoring_verbosity),
             global_arguments.log_file,
+            global_arguments.log_file_format,
         )
     except (FileNotFoundError, PermissionError) as error:
         configure_logging(logging.CRITICAL)
-        logger.critical('Error configuring logging: {}'.format(error))
+        logger.critical(f'Error configuring logging: {error}')
         exit_with_help_link()

     logger.debug('Ensuring legacy configuration is upgraded')
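For reference, a runnable sketch of what a brace-style log file format amounts to in plain Python logging. This is not borgmatic's implementation, and the exact placeholders the new log_file_format argument accepts are an assumption here; the file path is hypothetical:

    import logging

    # Assumption: a "{asctime} {message}"-style format maps onto a brace-style Formatter.
    handler = logging.FileHandler('/tmp/borgmatic.log')  # hypothetical path
    handler.setFormatter(logging.Formatter('{asctime} {levelname} {message}', style='{'))
    logging.getLogger().addHandler(handler)
    logging.getLogger().warning('backup complete')  # written using the custom format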

View File

@@ -34,7 +34,7 @@ def bash_completion():
         '    local this_script="$(cat "$BASH_SOURCE" 2> /dev/null)"',
         '    local installed_script="$(borgmatic --bash-completion 2> /dev/null)"',
         '    if [ "$this_script" != "$installed_script" ] && [ "$installed_script" != "" ];'
-        '    then cat << EOF\n%s\nEOF' % UPGRADE_MESSAGE,
+        f'    then cat << EOF\n{UPGRADE_MESSAGE}\nEOF',
         '    fi',
         '}',
         'complete_borgmatic() {',
@@ -48,7 +48,7 @@ def bash_completion():
             for action, subparser in subparsers.choices.items()
         )
         + (
-            '    COMPREPLY=($(compgen -W "%s %s" -- "${COMP_WORDS[COMP_CWORD]}"))'
+            '    COMPREPLY=($(compgen -W "%s %s" -- "${COMP_WORDS[COMP_CWORD]}"))'  # noqa: FS003
             % (actions, global_flags),
             '    (check_version &)',
             '}',

View File

@@ -28,9 +28,7 @@ def parse_arguments(*arguments):
         '--source-config',
         dest='source_config_filename',
         default=DEFAULT_SOURCE_CONFIG_FILENAME,
-        help='Source INI-style configuration filename. Default: {}'.format(
-            DEFAULT_SOURCE_CONFIG_FILENAME
-        ),
+        help=f'Source INI-style configuration filename. Default: {DEFAULT_SOURCE_CONFIG_FILENAME}',
     )
     parser.add_argument(
         '-e',
@@ -46,9 +44,7 @@ def parse_arguments(*arguments):
         '--destination-config',
         dest='destination_config_filename',
         default=DEFAULT_DESTINATION_CONFIG_FILENAME,
-        help='Destination YAML configuration filename. Default: {}'.format(
-            DEFAULT_DESTINATION_CONFIG_FILENAME
-        ),
+        help=f'Destination YAML configuration filename. Default: {DEFAULT_DESTINATION_CONFIG_FILENAME}',
     )

     return parser.parse_args(arguments)
@@ -59,19 +55,15 @@ TEXT_WRAP_CHARACTERS = 80

 def display_result(args):  # pragma: no cover
     result_lines = textwrap.wrap(
-        'Your borgmatic configuration has been upgraded. Please review the result in {}.'.format(
-            args.destination_config_filename
-        ),
+        f'Your borgmatic configuration has been upgraded. Please review the result in {args.destination_config_filename}.',
         TEXT_WRAP_CHARACTERS,
     )

+    excludes_phrase = (
+        f' and {args.source_excludes_filename}' if args.source_excludes_filename else ''
+    )
     delete_lines = textwrap.wrap(
-        'Once you are satisfied, you can safely delete {}{}.'.format(
-            args.source_config_filename,
-            ' and {}'.format(args.source_excludes_filename)
-            if args.source_excludes_filename
-            else '',
-        ),
+        f'Once you are satisfied, you can safely delete {args.source_config_filename}{excludes_phrase}.',
         TEXT_WRAP_CHARACTERS,
     )

View File

@@ -23,9 +23,7 @@ def parse_arguments(*arguments):
         '--destination',
         dest='destination_filename',
         default=DEFAULT_DESTINATION_CONFIG_FILENAME,
-        help='Destination YAML configuration file, default: {}'.format(
-            DEFAULT_DESTINATION_CONFIG_FILENAME
-        ),
+        help=f'Destination YAML configuration file, default: {DEFAULT_DESTINATION_CONFIG_FILENAME}',
     )
     parser.add_argument(
         '--overwrite',
@@ -48,17 +46,13 @@ def main():  # pragma: no cover
         overwrite=args.overwrite,
     )

-    print('Generated a sample configuration file at {}.'.format(args.destination_filename))
+    print(f'Generated a sample configuration file at {args.destination_filename}.')
     print()
     if args.source_filename:
-        print(
-            'Merged in the contents of configuration file at {}.'.format(args.source_filename)
-        )
+        print(f'Merged in the contents of configuration file at {args.source_filename}.')
         print('To review the changes made, run:')
         print()
-        print(
-            '    diff --unified {} {}'.format(args.source_filename, args.destination_filename)
-        )
+        print(f'    diff --unified {args.source_filename} {args.destination_filename}')
         print()
     print('This includes all available configuration options with example values. The few')
     print('required options are indicated. Please edit the file to suit your needs.')

View File

@@ -2,6 +2,7 @@ import logging
 import sys
 from argparse import ArgumentParser

+import borgmatic.config.generate
 from borgmatic.config import collect, validate

 logger = logging.getLogger(__name__)
@@ -21,20 +22,24 @@ def parse_arguments(*arguments):
         nargs='+',
         dest='config_paths',
         default=config_paths,
-        help='Configuration filenames or directories, defaults to: {}'.format(
-            ' '.join(config_paths)
-        ),
+        help=f'Configuration filenames or directories, defaults to: {config_paths}',
+    )
+    parser.add_argument(
+        '-s',
+        '--show',
+        action='store_true',
+        help='Show the validated configuration after all include merging has occurred',
     )

     return parser.parse_args(arguments)


 def main():  # pragma: no cover
-    args = parse_arguments(*sys.argv[1:])
+    arguments = parse_arguments(*sys.argv[1:])
     logging.basicConfig(level=logging.INFO, format='%(message)s')

-    config_filenames = tuple(collect.collect_config_filenames(args.config_paths))
+    config_filenames = tuple(collect.collect_config_filenames(arguments.config_paths))
     if len(config_filenames) == 0:
         logger.critical('No files to validate found')
         sys.exit(1)
@@ -42,15 +47,22 @@ def main():  # pragma: no cover
     found_issues = False
     for config_filename in config_filenames:
         try:
-            validate.parse_configuration(config_filename, validate.schema_filename())
+            config, parse_logs = validate.parse_configuration(
+                config_filename, validate.schema_filename()
+            )
         except (ValueError, OSError, validate.Validation_error) as error:
-            logging.critical('{}: Error parsing configuration file'.format(config_filename))
+            logging.critical(f'{config_filename}: Error parsing configuration file')
             logging.critical(error)
             found_issues = True
+        else:
+            for log in parse_logs:
+                logger.handle(log)
+
+            if arguments.show:
+                print('---')
+                print(borgmatic.config.generate.render_configuration(config))

     if found_issues:
         sys.exit(1)
-    else:
-        logger.info(
-            'All given configuration files are valid: {}'.format(', '.join(config_filenames))
-        )
+
+    logger.info(f"All given configuration files are valid: {', '.join(config_filenames)}")

View File

@@ -16,8 +16,8 @@ def get_default_config_paths(expand_home=True):
     return [
         '/etc/borgmatic/config.yaml',
         '/etc/borgmatic.d',
-        '%s/borgmatic/config.yaml' % user_config_directory,
-        '%s/borgmatic.d' % user_config_directory,
+        os.path.join(user_config_directory, 'borgmatic/config.yaml'),
+        os.path.join(user_config_directory, 'borgmatic.d'),
     ]

View File

@@ -43,7 +43,7 @@ def convert_legacy_parsed_config(source_config, source_excludes, schema):
         ]
     )

-    # Split space-seperated values into actual lists, make "repository" into a list, and merge in
+    # Split space-separated values into actual lists, make "repository" into a list, and merge in
     # excludes.
     location = destination_config['location']
     location['source_directories'] = source_config.location['source_directories'].split(' ')

View File

@@ -14,11 +14,14 @@ def _resolve_string(matcher):
     if matcher.group('escape') is not None:
         # in case of escaped envvar, unescape it
         return matcher.group('variable')
+
     # resolve the env var
     name, default = matcher.group('name'), matcher.group('default')
     out = os.getenv(name, default=default)
+
     if out is None:
-        raise ValueError('Cannot find variable ${name} in environment'.format(name=name))
+        raise ValueError(f'Cannot find variable {name} in environment')
+
     return out

View File

@@ -48,7 +48,7 @@ def _schema_to_sample_configuration(schema, level=0, parent_is_sequence=False):
             config, schema, indent=indent, skip_first=parent_is_sequence
         )
     else:
-        raise ValueError('Schema at level {} is unsupported: {}'.format(level, schema))
+        raise ValueError(f'Schema at level {level} is unsupported: {schema}')

     return config
@@ -84,7 +84,7 @@ def _comment_out_optional_configuration(rendered_config):
     for line in rendered_config.split('\n'):
         # Upon encountering an optional configuration option, comment out lines until the next blank
         # line.
-        if line.strip().startswith('# {}'.format(COMMENTED_OUT_SENTINEL)):
+        if line.strip().startswith(f'# {COMMENTED_OUT_SENTINEL}'):
             optional = True
             continue
@@ -117,9 +117,7 @@ def write_configuration(config_filename, rendered_config, mode=0o600, overwrite=False):
     '''
     if not overwrite and os.path.exists(config_filename):
         raise FileExistsError(
-            '{} already exists. Aborting. Use --overwrite to replace the file.'.format(
-                config_filename
-            )
+            f'{config_filename} already exists. Aborting. Use --overwrite to replace the file.'
         )

     try:
@@ -218,7 +216,7 @@ def remove_commented_out_sentinel(config, field_name):
     except KeyError:
         return

-    if last_comment_value == '# {}\n'.format(COMMENTED_OUT_SENTINEL):
+    if last_comment_value == f'# {COMMENTED_OUT_SENTINEL}\n':
         config.ca.items[field_name][RUAMEL_YAML_COMMENTS_INDEX].pop()

View File

@@ -70,13 +70,11 @@ def validate_configuration_format(parser, config_format):
         section_format.name for section_format in config_format
     )
     if unknown_section_names:
-        raise ValueError(
-            'Unknown config sections found: {}'.format(', '.join(unknown_section_names))
-        )
+        raise ValueError(f"Unknown config sections found: {', '.join(unknown_section_names)}")

     missing_section_names = set(required_section_names) - section_names
     if missing_section_names:
-        raise ValueError('Missing config sections: {}'.format(', '.join(missing_section_names)))
+        raise ValueError(f"Missing config sections: {', '.join(missing_section_names)}")

     for section_format in config_format:
         if section_format.name not in section_names:
@@ -91,9 +89,7 @@ def validate_configuration_format(parser, config_format):
         if unexpected_option_names:
             raise ValueError(
-                'Unexpected options found in config section {}: {}'.format(
-                    section_format.name, ', '.join(sorted(unexpected_option_names))
-                )
+                f"Unexpected options found in config section {section_format.name}: {', '.join(sorted(unexpected_option_names))}",
             )

         missing_option_names = tuple(
@@ -105,9 +101,7 @@ def validate_configuration_format(parser, config_format):
         if missing_option_names:
             raise ValueError(
-                'Required options missing from config section {}: {}'.format(
-                    section_format.name, ', '.join(missing_option_names)
-                )
+                f"Required options missing from config section {section_format.name}: {', '.join(missing_option_names)}",
             )
@@ -137,7 +131,7 @@ def parse_configuration(config_filename, config_format):
     '''
     parser = RawConfigParser()
     if not parser.read(config_filename):
-        raise ValueError('Configuration file cannot be opened: {}'.format(config_filename))
+        raise ValueError(f'Configuration file cannot be opened: {config_filename}')

     validate_configuration_format(parser, config_format)

View File

@@ -1,4 +1,5 @@
 import functools
+import json
 import logging
 import os
@@ -37,6 +38,24 @@ def include_configuration(loader, filename_node, include_directory):
     return load_configuration(include_filename)


+def retain_node_error(loader, node):
+    '''
+    Given a ruamel.yaml.loader.Loader and a YAML node, raise an error.
+
+    Raise ValueError if a mapping or sequence node is given, as that indicates that "!retain" was
+    used in a configuration file without a merge. In configuration files with a merge, mapping and
+    sequence nodes with "!retain" tags are handled by deep_merge_nodes() below.
+
+    Also raise ValueError if a scalar node is given, as "!retain" is not supported on scalar nodes.
+    '''
+    if isinstance(node, (ruamel.yaml.nodes.MappingNode, ruamel.yaml.nodes.SequenceNode)):
+        raise ValueError(
+            'The !retain tag may only be used within a configuration file containing a merged !include tag.'
+        )
+
+    raise ValueError('The !retain tag may only be used on a YAML mapping or sequence.')
+
+
 class Include_constructor(ruamel.yaml.SafeConstructor):
     '''
     A YAML "constructor" (a ruamel.yaml concept) that supports a custom "!include" tag for including
@@ -49,6 +68,7 @@ class Include_constructor(ruamel.yaml.SafeConstructor):
             '!include',
             functools.partial(include_configuration, include_directory=include_directory),
         )
+        self.add_constructor('!retain', retain_node_error)

     def flatten_mapping(self, node):
         '''
@@ -81,7 +101,8 @@ class Include_constructor(ruamel.yaml.SafeConstructor):
 def load_configuration(filename):
     '''
     Load the given configuration file and return its contents as a data structure of nested dicts
-    and lists.
+    and lists. Also, replace any "{constant}" strings with the value of the "constant" key in the
+    "constants" section of the configuration file.

     Raise ruamel.yaml.error.YAMLError if something goes wrong parsing the YAML, or RecursionError
     if there are too many recursive includes.
@@ -98,7 +119,19 @@ def load_configuration(filename):
     yaml = ruamel.yaml.YAML(typ='safe')
     yaml.Constructor = Include_constructor_with_include_directory

-    return yaml.load(open(filename))
+    with open(filename) as file:
+        file_contents = file.read()
+        config = yaml.load(file_contents)
+
+        if config and 'constants' in config:
+            for key, value in config['constants'].items():
+                value = json.dumps(value)
+                file_contents = file_contents.replace(f'{{{key}}}', value.strip('"'))
+
+            config = yaml.load(file_contents)
+            del config['constants']
+
+        return config


 DELETED_NODE = object()
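The constant replacement above is plain text substitution followed by a second YAML parse. A standalone, runnable sketch of the same logic (the sample YAML is hypothetical):

    import json

    file_contents = 'location:\n    source_directories:\n        - /home/{user}\n'
    constants = {'user': 'alice'}

    # Replace "{key}" occurrences with the JSON-rendered constant value, as above.
    for key, value in constants.items():
        value = json.dumps(value)
        file_contents = file_contents.replace(f'{{{key}}}', value.strip('"'))

    print(file_contents)  # source_directories now contains /home/alice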
@@ -162,6 +195,8 @@ def deep_merge_nodes(nodes):
         ),
     ]

+    If a mapping or sequence node has a YAML "!retain" tag, then that node is not merged.
+
     The purpose of deep merging like this is to support, for instance, merging one borgmatic
     configuration file into another for reuse, such that a configuration section ("retention",
     etc.) does not completely replace the corresponding section in a merged file.
@@ -184,32 +219,42 @@ def deep_merge_nodes(nodes):
                 # If we're dealing with MappingNodes, recurse and merge its values as well.
                 if isinstance(b_value, ruamel.yaml.nodes.MappingNode):
-                    replaced_nodes[(b_key, b_value)] = (
-                        b_key,
-                        ruamel.yaml.nodes.MappingNode(
-                            tag=b_value.tag,
-                            value=deep_merge_nodes(a_value.value + b_value.value),
-                            start_mark=b_value.start_mark,
-                            end_mark=b_value.end_mark,
-                            flow_style=b_value.flow_style,
-                            comment=b_value.comment,
-                            anchor=b_value.anchor,
-                        ),
-                    )
+                    # A "!retain" tag says to skip deep merging for this node. Replace the tag so
+                    # downstream schema validation doesn't break on our application-specific tag.
+                    if b_value.tag == '!retain':
+                        b_value.tag = 'tag:yaml.org,2002:map'
+                    else:
+                        replaced_nodes[(b_key, b_value)] = (
+                            b_key,
+                            ruamel.yaml.nodes.MappingNode(
+                                tag=b_value.tag,
+                                value=deep_merge_nodes(a_value.value + b_value.value),
+                                start_mark=b_value.start_mark,
+                                end_mark=b_value.end_mark,
+                                flow_style=b_value.flow_style,
+                                comment=b_value.comment,
+                                anchor=b_value.anchor,
+                            ),
+                        )

                 # If we're dealing with SequenceNodes, merge by appending one sequence to the other.
                 elif isinstance(b_value, ruamel.yaml.nodes.SequenceNode):
-                    replaced_nodes[(b_key, b_value)] = (
-                        b_key,
-                        ruamel.yaml.nodes.SequenceNode(
-                            tag=b_value.tag,
-                            value=a_value.value + b_value.value,
-                            start_mark=b_value.start_mark,
-                            end_mark=b_value.end_mark,
-                            flow_style=b_value.flow_style,
-                            comment=b_value.comment,
-                            anchor=b_value.anchor,
-                        ),
-                    )
+                    # A "!retain" tag says to skip deep merging for this node. Replace the tag so
+                    # downstream schema validation doesn't break on our application-specific tag.
+                    if b_value.tag == '!retain':
+                        b_value.tag = 'tag:yaml.org,2002:seq'
+                    else:
+                        replaced_nodes[(b_key, b_value)] = (
+                            b_key,
+                            ruamel.yaml.nodes.SequenceNode(
+                                tag=b_value.tag,
+                                value=a_value.value + b_value.value,
+                                start_mark=b_value.start_mark,
+                                end_mark=b_value.end_mark,
+                                flow_style=b_value.flow_style,
+                                comment=b_value.comment,
+                                anchor=b_value.anchor,
+                            ),
+                        )

     return [
         replaced_nodes.get(node, node) for node in nodes if replaced_nodes.get(node) != DELETED_NODE
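A hedged usage sketch for the "!retain" behavior above, with the YAML shown as a Python string (the include path is hypothetical). With a merged include, a "!retain"-tagged list replaces the included list instead of being appended to it:

    config_yaml = '''
    <<: !include /etc/borgmatic/common.yaml
    location:
        repositories: !retain
            - path: /mnt/backup
    '''
    # When deep_merge_nodes() runs, the !retain tag on "repositories" keeps only this
    # local list, and the tag is rewritten to plain tag:yaml.org,2002:seq so schema
    # validation downstream doesn't trip on the application-specific tag.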

View File

@@ -1,4 +1,5 @@
 import logging
+import os


 def normalize(config_filename, config):
@@ -56,9 +57,15 @@ def normalize(config_filename, config):
     # Upgrade remote repositories to ssh:// syntax, required in Borg 2.
     repositories = location.get('repositories')
     if repositories:
+        if isinstance(repositories[0], str):
+            config['location']['repositories'] = [
+                {'path': repository} for repository in repositories
+            ]
+            repositories = config['location']['repositories']
+
         config['location']['repositories'] = []
-        for repository in repositories:
-            if '~' in repository:
+        for repository_dict in repositories:
+            repository_path = repository_dict['path']
+            if '~' in repository_path:
                 logs.append(
                     logging.makeLogRecord(
                         dict(
@@ -68,21 +75,31 @@ def normalize(config_filename, config):
                     )
                 )
-            if ':' in repository and not repository.startswith('ssh://'):
-                rewritten_repository = (
-                    f"ssh://{repository.replace(':~', '/~').replace(':/', '/').replace(':', '/./')}"
-                )
-                logs.append(
-                    logging.makeLogRecord(
-                        dict(
-                            levelno=logging.WARNING,
-                            levelname='WARNING',
-                            msg=f'{config_filename}: Remote repository paths without ssh:// syntax are deprecated. Interpreting "{repository}" as "{rewritten_repository}"',
+            if ':' in repository_path:
+                if repository_path.startswith('file://'):
+                    updated_repository_path = os.path.abspath(
+                        repository_path.partition('file://')[-1]
+                    )
+                    config['location']['repositories'].append(
+                        dict(repository_dict, path=updated_repository_path,)
+                    )
+                elif repository_path.startswith('ssh://'):
+                    config['location']['repositories'].append(repository_dict)
+                else:
+                    rewritten_repository_path = f"ssh://{repository_path.replace(':~', '/~').replace(':/', '/').replace(':', '/./')}"
+                    logs.append(
+                        logging.makeLogRecord(
+                            dict(
+                                levelno=logging.WARNING,
+                                levelname='WARNING',
+                                msg=f'{config_filename}: Remote repository paths without ssh:// syntax are deprecated. Interpreting "{repository_path}" as "{rewritten_repository_path}"',
+                            )
                         )
                     )
-                )
-                config['location']['repositories'].append(rewritten_repository)
+                    config['location']['repositories'].append(
+                        dict(repository_dict, path=rewritten_repository_path,)
+                    )
             else:
-                config['location']['repositories'].append(repository)
+                config['location']['repositories'].append(repository_dict)

     return logs
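The deprecated-path rewrite above is just string surgery; a runnable sketch (the legacy-style path is hypothetical):

    repository_path = 'user@backupserver:repo.borg'
    # Same replace chain as in normalize() above.
    rewritten = f"ssh://{repository_path.replace(':~', '/~').replace(':/', '/').replace(':', '/./')}"
    print(rewritten)  # ssh://user@backupserver/./repo.borg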

View File

@@ -3,6 +3,17 @@ required:
     - location
 additionalProperties: false
 properties:
+    constants:
+        type: object
+        description: |
+            Constants to use in the configuration file. All occurrences of the
+            constant name within curly braces will be replaced with the value.
+            For example, if you have a constant named "hostname" with the value
+            "myhostname", then the string "{hostname}" will be replaced with
+            "myhostname" in the configuration file.
+        example:
+            hostname: myhostname
+            prefix: myprefix
     location:
         type: object
         description: |
@@ -29,19 +40,32 @@ properties:
             repositories:
                 type: array
                 items:
-                    type: string
+                    type: object
+                    required:
+                        - path
+                    properties:
+                        path:
+                            type: string
+                            example: ssh://user@backupserver/./{fqdn}
+                        label:
+                            type: string
+                            example: backupserver
                 description: |
-                    Paths to local or remote repositories (required). Tildes are
-                    expanded. Multiple repositories are backed up to in
-                    sequence. Borg placeholders can be used. See the output of
-                    "borg help placeholders" for details. See ssh_command for
-                    SSH options like identity file or port. If systemd service
-                    is used, then add local repository paths in the systemd
-                    service file to the ReadWritePaths list.
+                    A required list of local or remote repositories with paths
+                    and optional labels (which can be used with the --repository
+                    flag to select a repository). Tildes are expanded. Multiple
+                    repositories are backed up to in sequence. Borg placeholders
+                    can be used. See the output of "borg help placeholders" for
+                    details. See ssh_command for SSH options like identity file
+                    or port. If systemd service is used, then add local
+                    repository paths in the systemd service file to the
+                    ReadWritePaths list. Prior to borgmatic 1.7.10, repositories
+                    was just a list of plain path strings.
                 example:
-                    - ssh://user@backupserver/./sourcehostname.borg
-                    - ssh://user@backupserver/./{fqdn}
-                    - /var/local/backups/local.borg
+                    - path: ssh://user@backupserver/./sourcehostname.borg
+                      label: backupserver
+                    - path: /mnt/backup
+                      label: local
             working_directory:
                 type: string
                 description: |
@@ -202,6 +226,12 @@ properties:
                     path prevents "borgmatic restore" from finding any database
                     dumps created before the change. Defaults to ~/.borgmatic
                 example: /tmp/borgmatic
+            source_directories_must_exist:
+                type: boolean
+                description: |
+                    If true, then source directories must exist, otherwise an
+                    error is raised. Defaults to false.
+                example: true
     storage:
         type: object
         description: |
@@ -315,6 +345,12 @@ properties:
                     Path for Borg cache files. Defaults to
                    $borg_base_directory/.cache/borg
                 example: /path/to/base/cache
+            borg_files_cache_ttl:
+                type: integer
+                description: |
+                    Maximum time to live (ttl) for entries in the Borg files
+                    cache.
+                example: 20
             borg_security_directory:
                 type: string
                 description: |
@@ -342,12 +378,21 @@ properties:
                 description: |
                     Name of the archive. Borg placeholders can be used. See the
                     output of "borg help placeholders" for details. Defaults to
-                    "{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}". If you specify this
-                    option, consider also specifying a prefix in the retention
-                    and consistency sections to avoid accidental
-                    pruning/checking of archives with different archive name
-                    formats.
+                    "{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}". When running
+                    actions like rlist, info, or check, borgmatic automatically
+                    tries to match only archives created with this name format.
                 example: "{hostname}-documents-{now}"
+            match_archives:
+                type: string
+                description: |
+                    A Borg pattern for filtering down the archives used by
+                    borgmatic actions that operate on multiple archives. For
+                    Borg 1.x, use a shell pattern here and see the output of
+                    "borg help placeholders" for details. For Borg 2.x, see the
+                    output of "borg help match-archives". If match_archives is
+                    not specified, borgmatic defaults to deriving the
+                    match_archives value from archive_name_format.
+                example: "sh:{hostname}-*"
             relocated_repo_access_is_ok:
                 type: boolean
                 description: |
@@ -369,6 +414,11 @@ properties:
                 description: |
                     Extra command-line options to pass to "borg init".
                 example: "--extra-option"
+            create:
+                type: string
+                description: |
+                    Extra command-line options to pass to "borg create".
+                example: "--extra-option"
             prune:
                 type: string
                 description: |
@@ -379,11 +429,6 @@ properties:
                 description: |
                     Extra command-line options to pass to "borg compact".
                 example: "--extra-option"
-            create:
-                type: string
-                description: |
-                    Extra command-line options to pass to "borg create".
-                example: "--extra-option"
             check:
                 type: string
                 description: |
@@ -441,10 +486,12 @@ properties:
             prefix:
                 type: string
                 description: |
-                    When pruning, only consider archive names starting with this
-                    prefix. Borg placeholders can be used. See the output of
-                    "borg help placeholders" for details. Defaults to
-                    "{hostname}-". Use an empty value to disable the default.
+                    Deprecated. When pruning, only consider archive names
+                    starting with this prefix. Borg placeholders can be used.
+                    See the output of "borg help placeholders" for details.
+                    If a prefix is not specified, borgmatic defaults to
+                    matching archives based on the archive_name_format (see
+                    above).
                 example: sourcehostname
     consistency:
         type: object
@@ -502,12 +549,12 @@ properties:
                 items:
                     type: string
                 description: |
-                    Paths to a subset of the repositories in the location
-                    section on which to run consistency checks. Handy in case
-                    some of your repositories are very large, and so running
-                    consistency checks on them would take too long. Defaults to
-                    running consistency checks on all repositories configured in
-                    the location section.
+                    Paths or labels for a subset of the repositories in the
+                    location section on which to run consistency checks. Handy
+                    in case some of your repositories are very large, and so
+                    running consistency checks on them would take too long.
+                    Defaults to running consistency checks on all repositories
+                    configured in the location section.
                 example:
                     - user@backupserver:sourcehostname.borg
             check_last:
@@ -520,11 +567,12 @@ properties:
             prefix:
                 type: string
                 description: |
-                    When performing the "archives" check, only consider archive
-                    names starting with this prefix. Borg placeholders can be
-                    used. See the output of "borg help placeholders" for
-                    details. Defaults to "{hostname}-". Use an empty value to
-                    disable the default.
+                    Deprecated. When performing the "archives" check, only
+                    consider archive names starting with this prefix. Borg
+                    placeholders can be used. See the output of "borg help
+                    placeholders" for details. If a prefix is not specified,
+                    borgmatic defaults to matching archives based on the
+                    archive_name_format (see above).
                 example: sourcehostname
     output:
         type: object
@@ -663,11 +711,11 @@ properties:
                     type: string
                 description: |
                     List of one or more shell commands or scripts to execute
-                    when an exception occurs during a "prune", "compact",
-                    "create", or "check" action or an associated before/after
-                    hook.
+                    when an exception occurs during a "create", "prune",
+                    "compact", or "check" action or an associated before/after
+                    hook.
                 example:
-                    - echo "Error during prune/compact/create/check."
+                    - echo "Error during create/prune/compact/check."
             before_everything:
                 type: array
                 items:
@@ -893,14 +941,14 @@ properties:
                             type: string
                             enum: ['sql']
                             description: |
-                                Database dump output format. Currenly only "sql"
-                                is supported. Defaults to "sql" for a single
-                                database. Or, when database name is "all" and
-                                format is blank, dumps all databases to a single
-                                file. But if a format is specified with an "all"
-                                database name, dumps each database to a separate
-                                file of that format, allowing more convenient
-                                restores of individual databases.
+                                Database dump output format. Currently only
+                                "sql" is supported. Defaults to "sql" for a
+                                single database. Or, when database name is "all"
+                                and format is blank, dumps all databases to a
+                                single file. But if a format is specified with
+                                an "all" database name, dumps each database to a
+                                separate file of that format, allowing more
+                                convenient restores of individual databases.
                             example: directory
                         add_drop_database:
                             type: boolean
@@ -941,6 +989,31 @@ properties:
                     mysqldump/mysql commands (from either MySQL or MariaDB). See
                     https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html or
                     https://mariadb.com/kb/en/library/mysqldump/ for details.
+            sqlite_databases:
+                type: array
+                items:
+                    type: object
+                    required: ['path','name']
+                    additionalProperties: false
+                    properties:
+                        name:
+                            type: string
+                            description: |
+                                This is used to tag the database dump file
+                                with a name. It is not the path to the database
+                                file itself. The name "all" has no special
+                                meaning for SQLite databases.
+                            example: users
+                        path:
+                            type: string
+                            description: |
+                                Path to the SQLite database file to dump. If
+                                relative, it is relative to the current working
+                                directory. Note that using this
+                                database hook implicitly enables both
+                                read_special and one_file_system (see above) to
+                                support dump and restore streaming.
+                            example: /var/lib/sqlite/users.db
             mongodb_databases:
                 type: array
                 items:
@@ -1143,7 +1216,7 @@ properties:
                     type: string
                     description: |
                         Healthchecks ping URL or UUID to notify when a
-                        backup begins, ends, or errors.
+                        backup begins, ends, errors or just to send logs.
                     example: https://hc-ping.com/your-uuid-here
                 verify_tls:
                     type: boolean
@@ -1155,7 +1228,8 @@ properties:
                     type: boolean
                     description: |
                         Send borgmatic logs to Healthchecks as part the
-                        "finish" state. Defaults to true.
+                        "finish", "fail", and "log" states. Defaults to
+                        true.
                     example: false
                 ping_body_limit:
                     type: integer
@@ -1174,10 +1248,11 @@ properties:
                             - start
                             - finish
                             - fail
+                            - log
                         uniqueItems: true
                     description: |
                         List of one or more monitoring states to ping for:
-                        "start", "finish", and/or "fail". Defaults to
-                        pinging for all states.
+                        "start", "finish", "fail", and/or "log". Defaults to
+                        pinging for all states.
                     example:
                         - finish
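Putting the schema changes above together, a hedged example configuration (all values are hypothetical), shown here as a Python string:

    config_yaml = '''
    constants:
        hostname: myhostname
    location:
        source_directories:
            - /home
        repositories:
            - path: ssh://user@backupserver/./{hostname}.borg
              label: backupserver
    storage:
        match_archives: "sh:{hostname}-*"
    '''
    # "{hostname}" is replaced by the constants feature before parsing, so both the
    # repository path and match_archives end up referencing "myhostname".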

View File

@@ -20,9 +20,9 @@ def format_json_error_path_element(path_element):
     Given a path element into a JSON data structure, format it for display as a string.
     '''
     if isinstance(path_element, int):
-        return str('[{}]'.format(path_element))
+        return str(f'[{path_element}]')

-    return str('.{}'.format(path_element))
+    return str(f'.{path_element}')


 def format_json_error(error):
@@ -30,10 +30,10 @@ def format_json_error(error):
     Given an instance of jsonschema.exceptions.ValidationError, format it for display as a string.
     '''
     if not error.path:
-        return 'At the top level: {}'.format(error.message)
+        return f'At the top level: {error.message}'

     formatted_path = ''.join(format_json_error_path_element(element) for element in error.path)
-    return "At '{}': {}".format(formatted_path.lstrip('.'), error.message)
+    return f"At '{formatted_path.lstrip('.')}': {error.message}"


 class Validation_error(ValueError):
@@ -54,9 +54,10 @@ class Validation_error(ValueError):
     '''
     Render a validation error as a user-facing string.
     '''
-    return 'An error occurred while parsing a configuration file at {}:\n'.format(
-        self.config_filename
-    ) + '\n'.join(error for error in self.errors)
+    return (
+        f'An error occurred while parsing a configuration file at {self.config_filename}:\n'
+        + '\n'.join(error for error in self.errors)
+    )


 def apply_logical_validation(config_filename, parsed_configuration):
@@ -68,13 +69,14 @@ def apply_logical_validation(config_filename, parsed_configuration):
     location_repositories = parsed_configuration.get('location', {}).get('repositories')
     check_repositories = parsed_configuration.get('consistency', {}).get('check_repositories', [])
     for repository in check_repositories:
-        if repository not in location_repositories:
+        if not any(
+            repositories_match(repository, config_repository)
+            for config_repository in location_repositories
+        ):
             raise Validation_error(
                 config_filename,
                 (
-                    'Unknown repository in the "consistency" section\'s "check_repositories": {}'.format(
-                        repository
-                    ),
+                    f'Unknown repository in the "consistency" section\'s "check_repositories": {repository}',
                 ),
             )
@@ -126,18 +128,29 @@ def normalize_repository_path(repository):
     '''
     Given a repository path, return the absolute path of it (for local repositories).
     '''
-    # A colon in the repository indicates it's a remote repository. Bail.
-    if ':' in repository:
+    # A colon in the repository could mean that it's either a file:// URL or a remote repository.
+    # If it's a remote repository, we don't want to normalize it. If it's a file:// URL, we do.
+    if ':' not in repository:
+        return os.path.abspath(repository)
+    elif repository.startswith('file://'):
+        return os.path.abspath(repository.partition('file://')[-1])
+    else:
         return repository

-    return os.path.abspath(repository)


 def repositories_match(first, second):
     '''
-    Given two repository paths (relative and/or absolute), return whether they match.
+    Given two repository dicts with keys 'path' (relative and/or absolute),
+    and 'label', or two repository paths, return whether they match.
     '''
-    return normalize_repository_path(first) == normalize_repository_path(second)
+    if isinstance(first, str):
+        first = {'path': first, 'label': first}
+    if isinstance(second, str):
+        second = {'path': second, 'label': second}
+    return (first.get('label') == second.get('label')) or (
+        normalize_repository_path(first.get('path'))
+        == normalize_repository_path(second.get('path'))
+    )


 def guard_configuration_contains_repository(repository, configurations):
@@ -157,14 +170,14 @@ def guard_configuration_contains_repository(repository, configurations):
             config_repository
             for config in configurations.values()
             for config_repository in config['location']['repositories']
-            if repositories_match(repository, config_repository)
+            if repositories_match(config_repository, repository)
         )
     )

     if count == 0:
-        raise ValueError('Repository {} not found in configuration files'.format(repository))
+        raise ValueError(f'Repository {repository} not found in configuration files')
     if count > 1:
-        raise ValueError('Repository {} found in multiple configuration files'.format(repository))
+        raise ValueError(f'Repository {repository} found in multiple configuration files')


 def guard_single_repository_selected(repository, configurations):
@@ -186,5 +199,5 @@ def guard_single_repository_selected(repository, configurations):
     if count != 1:
         raise ValueError(
-            'Can\'t determine which repository to use. Use --repository to disambiguate'
+            "Can't determine which repository to use. Use --repository to disambiguate"
        )
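A standalone, runnable sketch of the new matching rules above, mirroring the diff (the example paths and labels are hypothetical):

    import os

    def normalize_repository_path(repository):
        # Local paths are made absolute; file:// URLs are unwrapped; remotes pass through.
        if ':' not in repository:
            return os.path.abspath(repository)
        elif repository.startswith('file://'):
            return os.path.abspath(repository.partition('file://')[-1])
        return repository

    def repositories_match(first, second):
        # Strings are promoted to {'path': ..., 'label': ...} dicts, so a --repository
        # flag value can match either a configured path or a label.
        if isinstance(first, str):
            first = {'path': first, 'label': first}
        if isinstance(second, str):
            second = {'path': second, 'label': second}
        return (first.get('label') == second.get('label')) or (
            normalize_repository_path(first.get('path'))
            == normalize_repository_path(second.get('path'))
        )

    print(repositories_match('file:///mnt/backup', '/mnt/backup'))  # True: path match
    print(repositories_match('local', {'path': '/mnt/backup', 'label': 'local'}))  # True: label match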

View File

@@ -11,7 +11,7 @@ ERROR_OUTPUT_MAX_LINE_COUNT = 25
 BORG_ERROR_EXIT_CODE = 2


-def exit_code_indicates_error(process, exit_code, borg_local_path=None):
+def exit_code_indicates_error(command, exit_code, borg_local_path=None):
     '''
     Return True if the given exit code from running a command corresponds to an error. If a Borg
     local path is given and matches the process' command, then treat exit code 1 as a warning
@@ -20,8 +20,6 @@ def exit_code_indicates_error(command, exit_code, borg_local_path=None):
     if exit_code is None:
         return False

-    command = process.args.split(' ') if isinstance(process.args, str) else process.args
-
     if borg_local_path and command[0] == borg_local_path:
         return bool(exit_code < 0 or exit_code >= BORG_ERROR_EXIT_CODE)
@@ -45,6 +43,23 @@ def output_buffer_for_process(process, exclude_stdouts):
     return process.stderr if process.stdout in exclude_stdouts else process.stdout


+def append_last_lines(last_lines, captured_output, line, output_log_level):
+    '''
+    Given a rolling list of last lines, a list of captured output, a line to append, and an output
+    log level, append the line to the last lines and (if necessary) the captured output. Then log
+    the line at the requested output log level.
+    '''
+    last_lines.append(line)
+
+    if len(last_lines) > ERROR_OUTPUT_MAX_LINE_COUNT:
+        last_lines.pop(0)
+
+    if output_log_level is None:
+        captured_output.append(line)
+    else:
+        logger.log(output_log_level, line)
+
+
 def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
     '''
     Given a sequence of subprocess.Popen() instances for multiple processes, log the output for each
@@ -100,15 +115,12 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
                 # Keep the last few lines of output in case the process errors, and we need the output for
                 # the exception below.
-                last_lines = buffer_last_lines[ready_buffer]
-                last_lines.append(line)
-                if len(last_lines) > ERROR_OUTPUT_MAX_LINE_COUNT:
-                    last_lines.pop(0)
-
-                if output_log_level is None:
-                    captured_outputs[ready_process].append(line)
-                else:
-                    logger.log(output_log_level, line)
+                append_last_lines(
+                    buffer_last_lines[ready_buffer],
+                    captured_outputs[ready_process],
+                    line,
+                    output_log_level,
+                )

         if not still_running:
             break
@@ -121,13 +133,24 @@ def log_outputs(processes, exclude_stdouts, output_log_level, borg_local_path):
         if exit_code is None:
             still_running = True

+        command = process.args.split(' ') if isinstance(process.args, str) else process.args
+
         # If any process errors, then raise accordingly.
-        if exit_code_indicates_error(process, exit_code, borg_local_path):
+        if exit_code_indicates_error(command, exit_code, borg_local_path):
             # If an error occurs, include its output in the raised exception so that we don't
             # inadvertently hide error output.
             output_buffer = output_buffer_for_process(process, exclude_stdouts)
             last_lines = buffer_last_lines[output_buffer] if output_buffer else []

+            # Collect any straggling output lines that came in since we last gathered output.
+            while output_buffer:  # pragma: no cover
+                line = output_buffer.readline().rstrip().decode()
+                if not line:
+                    break
+
+                append_last_lines(
+                    last_lines, captured_outputs[process], line, output_log_level=logging.ERROR
+                )
+
             if len(last_lines) == ERROR_OUTPUT_MAX_LINE_COUNT:
                 last_lines.insert(0, '...')
@@ -155,8 +178,8 @@ def log_command(full_command, input_file=None, output_file=None):
     '''
     logger.debug(
         ' '.join(full_command)
-        + (' < {}'.format(getattr(input_file, 'name', '')) if input_file else '')
-        + (' > {}'.format(getattr(output_file, 'name', '')) if output_file else '')
+        + (f" < {getattr(input_file, 'name', '')}" if input_file else '')
+        + (f" > {getattr(output_file, 'name', '')}" if output_file else '')
     )
@@ -228,13 +251,18 @@ def execute_command_and_capture_output(
     environment = {**os.environ, **extra_environment} if extra_environment else None
     command = ' '.join(full_command) if shell else full_command

-    output = subprocess.check_output(
-        command,
-        stderr=subprocess.STDOUT if capture_stderr else None,
-        shell=shell,
-        env=environment,
-        cwd=working_directory,
-    )
+    try:
+        output = subprocess.check_output(
+            command,
+            stderr=subprocess.STDOUT if capture_stderr else None,
+            shell=shell,
+            env=environment,
+            cwd=working_directory,
+        )
+    except subprocess.CalledProcessError as error:
+        if exit_code_indicates_error(command, error.returncode):
+            raise
+        output = error.output

     return output.decode() if output is not None else None
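A minimal sketch of the Borg exit code convention the refactor above preserves. The function's final fallback line is not visible in these hunks; it is assumed here to treat any other non-zero exit code as an error:

    BORG_ERROR_EXIT_CODE = 2

    def exit_code_indicates_error(command, exit_code, borg_local_path=None):
        if exit_code is None:
            return False
        # Borg exits 1 for warnings; only >= 2 (or a negative signal code) is an error.
        if borg_local_path and command[0] == borg_local_path:
            return bool(exit_code < 0 or exit_code >= BORG_ERROR_EXIT_CODE)
        return bool(exit_code != 0)  # assumed fallback for non-Borg commands

    print(exit_code_indicates_error(['borg', 'create'], 1, 'borg'))  # False: warning
    print(exit_code_indicates_error(['borg', 'create'], 2, 'borg'))  # True: error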

View File

@@ -16,7 +16,7 @@ def interpolate_context(config_filename, hook_description, command, context):
     names/values, interpolate the values by "{name}" into the command and return the result.
     '''
     for name, value in context.items():
-        command = command.replace('{%s}' % name, str(value))
+        command = command.replace(f'{{{name}}}', str(value))

     for unsupported_variable in re.findall(r'{\w+}', command):
         logger.warning(
@@ -38,7 +38,7 @@ def execute_hook(commands, umask, config_filename, description, dry_run, **context):
     Raise subprocesses.CalledProcessError if an error occurs in a hook.
     '''
     if not commands:
-        logger.debug('{}: No commands to run for {} hook'.format(config_filename, description))
+        logger.debug(f'{config_filename}: No commands to run for {description} hook')
         return

     dry_run_label = ' (dry run; not actually running hooks)' if dry_run else ''
@@ -49,19 +49,15 @@ def execute_hook(commands, umask, config_filename, description, dry_run, **context):
     ]

     if len(commands) == 1:
-        logger.info(
-            '{}: Running command for {} hook{}'.format(config_filename, description, dry_run_label)
-        )
+        logger.info(f'{config_filename}: Running command for {description} hook{dry_run_label}')
     else:
         logger.info(
-            '{}: Running {} commands for {} hook{}'.format(
-                config_filename, len(commands), description, dry_run_label
-            )
+            f'{config_filename}: Running {len(commands)} commands for {description} hook{dry_run_label}',
         )

     if umask:
         parsed_umask = int(str(umask), 8)
-        logger.debug('{}: Set hook umask to {}'.format(config_filename, oct(parsed_umask)))
+        logger.debug(f'{config_filename}: Set hook umask to {oct(parsed_umask)}')
         original_umask = os.umask(parsed_umask)
     else:
         original_umask = None
@@ -93,9 +89,7 @@ def considered_soft_failure(config_filename, error):
     if exit_code == SOFT_FAIL_EXIT_CODE:
         logger.info(
-            '{}: Command hook exited with soft failure exit code ({}); skipping remaining actions'.format(
-                config_filename, SOFT_FAIL_EXIT_CODE
-            )
+            f'{config_filename}: Command hook exited with soft failure exit code ({SOFT_FAIL_EXIT_CODE}); skipping remaining actions',
        )
        return True
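The f'{{{name}}}' expression in interpolate_context above can look opaque; a runnable illustration of the brace escaping it relies on (the variable name and command are hypothetical):

    name = 'log_file'
    # Doubled braces are literal braces in an f-string, so f'{{{name}}}' renders '{log_file}'.
    print(f'{{{name}}}')

    command = 'cp {log_file} /var/log/'
    print(command.replace(f'{{{name}}}', '/tmp/borgmatic.log'))  # cp /tmp/borgmatic.log /var/log/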

View File

@@ -27,18 +27,22 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
     Ping the configured Cronhub URL, modified with the monitor.State. Use the given configuration
     filename in any log entries. If this is a dry run, then don't actually ping anything.
     '''
+    if state not in MONITOR_STATE_TO_CRONHUB:
+        logger.debug(
+            f'{config_filename}: Ignoring unsupported monitoring {state.name.lower()} in Cronhub hook'
+        )
+        return
+
     dry_run_label = ' (dry run; not actually pinging)' if dry_run else ''
-    formatted_state = '/{}/'.format(MONITOR_STATE_TO_CRONHUB[state])
+    formatted_state = f'/{MONITOR_STATE_TO_CRONHUB[state]}/'
     ping_url = (
         hook_config['ping_url']
         .replace('/start/', formatted_state)
         .replace('/ping/', formatted_state)
     )

-    logger.info(
-        '{}: Pinging Cronhub {}{}'.format(config_filename, state.name.lower(), dry_run_label)
-    )
-    logger.debug('{}: Using Cronhub ping URL {}'.format(config_filename, ping_url))
+    logger.info(f'{config_filename}: Pinging Cronhub {state.name.lower()}{dry_run_label}')
+    logger.debug(f'{config_filename}: Using Cronhub ping URL {ping_url}')

     if not dry_run:
         logging.getLogger('urllib3').setLevel(logging.ERROR)

View File

@@ -27,13 +27,17 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
     Ping the configured Cronitor URL, modified with the monitor.State. Use the given configuration
     filename in any log entries. If this is a dry run, then don't actually ping anything.
     '''
+    if state not in MONITOR_STATE_TO_CRONITOR:
+        logger.debug(
+            f'{config_filename}: Ignoring unsupported monitoring {state.name.lower()} in Cronitor hook'
+        )
+        return
+
     dry_run_label = ' (dry run; not actually pinging)' if dry_run else ''
-    ping_url = '{}/{}'.format(hook_config['ping_url'], MONITOR_STATE_TO_CRONITOR[state])
+    ping_url = f"{hook_config['ping_url']}/{MONITOR_STATE_TO_CRONITOR[state]}"

-    logger.info(
-        '{}: Pinging Cronitor {}{}'.format(config_filename, state.name.lower(), dry_run_label)
-    )
-    logger.debug('{}: Using Cronitor ping URL {}'.format(config_filename, ping_url))
+    logger.info(f'{config_filename}: Pinging Cronitor {state.name.lower()}{dry_run_label}')
+    logger.debug(f'{config_filename}: Using Cronitor ping URL {ping_url}')

     if not dry_run:
         logging.getLogger('urllib3').setLevel(logging.ERROR)

View File

@@ -9,6 +9,7 @@ from borgmatic.hooks import (
     ntfy,
     pagerduty,
     postgresql,
+    sqlite,
 )

 logger = logging.getLogger(__name__)
@@ -22,6 +23,7 @@ HOOK_NAME_TO_MODULE = {
     'ntfy': ntfy,
     'pagerduty': pagerduty,
     'postgresql_databases': postgresql,
+    'sqlite_databases': sqlite,
 }
@@ -41,9 +43,9 @@ def call_hook(function_name, hooks, log_prefix, hook_name, *args, **kwargs):
     try:
         module = HOOK_NAME_TO_MODULE[hook_name]
     except KeyError:
-        raise ValueError('Unknown hook name: {}'.format(hook_name))
+        raise ValueError(f'Unknown hook name: {hook_name}')

-    logger.debug('{}: Calling {} hook function {}'.format(log_prefix, hook_name, function_name))
+    logger.debug(f'{log_prefix}: Calling {hook_name} hook function {function_name}')

     return getattr(module, function_name)(config, log_prefix, *args, **kwargs)


@@ -6,7 +6,12 @@ from borgmatic.borg.state import DEFAULT_BORGMATIC_SOURCE_DIRECTORY
 logger = logging.getLogger(__name__)

-DATABASE_HOOK_NAMES = ('postgresql_databases', 'mysql_databases', 'mongodb_databases')
+DATABASE_HOOK_NAMES = (
+    'postgresql_databases',
+    'mysql_databases',
+    'mongodb_databases',
+    'sqlite_databases',
+)

 def make_database_dump_path(borgmatic_source_directory, database_hook_name):
@@ -28,7 +33,7 @@ def make_database_dump_filename(dump_path, name, hostname=None):
     Raise ValueError if the database name is invalid.
     '''
     if os.path.sep in name:
-        raise ValueError('Invalid database name {}'.format(name))
+        raise ValueError(f'Invalid database name {name}')

     return os.path.join(os.path.expanduser(dump_path), hostname or 'localhost', name)
@@ -55,9 +60,7 @@ def remove_database_dumps(dump_path, database_type_name, log_prefix, dry_run):
     '''
     dry_run_label = ' (dry run; not actually removing anything)' if dry_run else ''
-    logger.debug(
-        '{}: Removing {} database dumps{}'.format(log_prefix, database_type_name, dry_run_label)
-    )
+    logger.debug(f'{log_prefix}: Removing {database_type_name} database dumps{dry_run_label}')

     expanded_path = os.path.expanduser(dump_path)
@@ -73,4 +76,4 @@ def convert_glob_patterns_to_borg_patterns(patterns):
     Convert a sequence of shell glob patterns like "/etc/*" to the corresponding Borg archive
     patterns like "sh:etc/*".
     '''
-    return ['sh:{}'.format(pattern.lstrip(os.path.sep)) for pattern in patterns]
+    return [f'sh:{pattern.lstrip(os.path.sep)}' for pattern in patterns]


@@ -10,6 +10,7 @@ MONITOR_STATE_TO_HEALTHCHECKS = {
     monitor.State.START: 'start',
     monitor.State.FINISH: None,  # Healthchecks doesn't append to the URL for the finished state.
     monitor.State.FAIL: 'fail',
+    monitor.State.LOG: 'log',
 }

 PAYLOAD_TRUNCATION_INDICATOR = '...\n'
@@ -98,7 +99,7 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
     ping_url = (
         hook_config['ping_url']
         if hook_config['ping_url'].startswith('http')
-        else 'https://hc-ping.com/{}'.format(hook_config['ping_url'])
+        else f"https://hc-ping.com/{hook_config['ping_url']}"
     )
     dry_run_label = ' (dry run; not actually pinging)' if dry_run else ''
@@ -110,14 +111,12 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
     healthchecks_state = MONITOR_STATE_TO_HEALTHCHECKS.get(state)
     if healthchecks_state:
-        ping_url = '{}/{}'.format(ping_url, healthchecks_state)
+        ping_url = f'{ping_url}/{healthchecks_state}'

-    logger.info(
-        '{}: Pinging Healthchecks {}{}'.format(config_filename, state.name.lower(), dry_run_label)
-    )
-    logger.debug('{}: Using Healthchecks ping URL {}'.format(config_filename, ping_url))
+    logger.info(f'{config_filename}: Pinging Healthchecks {state.name.lower()}{dry_run_label}')
+    logger.debug(f'{config_filename}: Using Healthchecks ping URL {ping_url}')

-    if state in (monitor.State.FINISH, monitor.State.FAIL):
+    if state in (monitor.State.FINISH, monitor.State.FAIL, monitor.State.LOG):
         payload = format_buffered_logs_for_payload()
     else:
         payload = ''


@@ -27,7 +27,7 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
     '''
     dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''

-    logger.info('{}: Dumping MongoDB databases{}'.format(log_prefix, dry_run_label))
+    logger.info(f'{log_prefix}: Dumping MongoDB databases{dry_run_label}')

     processes = []
     for database in databases:
@@ -38,9 +38,7 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
         dump_format = database.get('format', 'archive')

         logger.debug(
-            '{}: Dumping MongoDB database {} to {}{}'.format(
-                log_prefix, name, dump_filename, dry_run_label
-            )
+            f'{log_prefix}: Dumping MongoDB database {name} to {dump_filename}{dry_run_label}',
         )
         if dry_run:
             continue
@@ -126,9 +124,7 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run, extract_process):
     )
     restore_command = build_restore_command(extract_process, database, dump_filename)

-    logger.debug(
-        '{}: Restoring MongoDB database {}{}'.format(log_prefix, database['name'], dry_run_label)
-    )
+    logger.debug(f"{log_prefix}: Restoring MongoDB database {database['name']}{dry_run_label}")
     if dry_run:
         return


@@ -7,3 +7,4 @@ class State(Enum):
     START = 1
     FINISH = 2
     FAIL = 3
+    LOG = 4


@@ -88,9 +88,7 @@ def execute_dump_command(
         + (('--user', database['username']) if 'username' in database else ())
         + ('--databases',)
         + database_names
-        # Use shell redirection rather than execute_command(output_file=open(...)) to prevent
-        # the open() call on a named pipe from hanging the main borgmatic process.
-        + ('>', dump_filename)
+        + ('--result-file', dump_filename)
     )

     logger.debug(
@@ -102,7 +100,7 @@ def execute_dump_command(
     dump.create_named_pipe_for_dump(dump_filename)

     return execute_command(
-        dump_command, shell=True, extra_environment=extra_environment, run_to_completion=False,
+        dump_command, extra_environment=extra_environment, run_to_completion=False,
     )
@@ -119,7 +117,7 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
     dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
     processes = []

-    logger.info('{}: Dumping MySQL databases{}'.format(log_prefix, dry_run_label))
+    logger.info(f'{log_prefix}: Dumping MySQL databases{dry_run_label}')

     for database in databases:
         dump_path = make_dump_path(location_config)
@@ -209,9 +207,7 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run, extract_process):
     )
     extra_environment = {'MYSQL_PWD': database['password']} if 'password' in database else None

-    logger.debug(
-        '{}: Restoring MySQL database {}{}'.format(log_prefix, database['name'], dry_run_label)
-    )
+    logger.debug(f"{log_prefix}: Restoring MySQL database {database['name']}{dry_run_label}")
     if dry_run:
         return


@@ -2,16 +2,8 @@ import logging

 import requests

-from borgmatic.hooks import monitor
-
 logger = logging.getLogger(__name__)

-MONITOR_STATE_TO_NTFY = {
-    monitor.State.START: None,
-    monitor.State.FINISH: None,
-    monitor.State.FAIL: None,
-}
-

 def initialize_monitor(
     ping_url, config_filename, monitoring_log_level, dry_run


@@ -29,14 +29,12 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
     '''
     if state != monitor.State.FAIL:
         logger.debug(
-            '{}: Ignoring unsupported monitoring {} in PagerDuty hook'.format(
-                config_filename, state.name.lower()
-            )
+            f'{config_filename}: Ignoring unsupported monitoring {state.name.lower()} in PagerDuty hook',
         )
         return

     dry_run_label = ' (dry run; not actually sending)' if dry_run else ''
-    logger.info('{}: Sending failure event to PagerDuty {}'.format(config_filename, dry_run_label))
+    logger.info(f'{config_filename}: Sending failure event to PagerDuty {dry_run_label}')

     if dry_run:
         return
@@ -50,7 +48,7 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
             'routing_key': hook_config['integration_key'],
             'event_action': 'trigger',
             'payload': {
-                'summary': 'backup failed on {}'.format(hostname),
+                'summary': f'backup failed on {hostname}',
                 'severity': 'error',
                 'source': hostname,
                 'timestamp': local_timestamp,
@@ -65,7 +63,7 @@ def ping_monitor(hook_config, config_filename, state, monitoring_log_level, dry_run):
             },
         }
     )
-    logger.debug('{}: Using PagerDuty payload: {}'.format(config_filename, payload))
+    logger.debug(f'{config_filename}: Using PagerDuty payload: {payload}')

     logging.getLogger('urllib3').setLevel(logging.ERROR)
     try:


@@ -93,7 +93,7 @@ def dump_databases(databases, log_prefix, location_config, dry_run):
     dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
     processes = []

-    logger.info('{}: Dumping PostgreSQL databases{}'.format(log_prefix, dry_run_label))
+    logger.info(f'{log_prefix}: Dumping PostgreSQL databases{dry_run_label}')

     for database in databases:
         extra_environment = make_extra_environment(database)
@@ -228,9 +228,7 @@ def restore_database_dump(database_config, log_prefix, location_config, dry_run, extract_process):
     )
     extra_environment = make_extra_environment(database)

-    logger.debug(
-        '{}: Restoring PostgreSQL database {}{}'.format(log_prefix, database['name'], dry_run_label)
-    )
+    logger.debug(f"{log_prefix}: Restoring PostgreSQL database {database['name']}{dry_run_label}")
     if dry_run:
         return

borgmatic/hooks/sqlite.py (new file)
@@ -0,0 +1,125 @@
import logging
import os

from borgmatic.execute import execute_command, execute_command_with_processes
from borgmatic.hooks import dump

logger = logging.getLogger(__name__)


def make_dump_path(location_config):  # pragma: no cover
    '''
    Make the dump path from the given location configuration and the name of this hook.
    '''
    return dump.make_database_dump_path(
        location_config.get('borgmatic_source_directory'), 'sqlite_databases'
    )


def dump_databases(databases, log_prefix, location_config, dry_run):
    '''
    Dump the given SQLite3 databases to a file. The databases are supplied as a sequence of
    configuration dicts, as per the configuration schema. Use the given log prefix in any log
    entries. Use the given location configuration dict to construct the destination path. If this
    is a dry run, then don't actually dump anything.
    '''
    dry_run_label = ' (dry run; not actually dumping anything)' if dry_run else ''
    processes = []

    logger.info(f'{log_prefix}: Dumping SQLite databases{dry_run_label}')

    for database in databases:
        database_path = database['path']

        if database['name'] == 'all':
            logger.warning('The "all" database name has no meaning for SQLite3 databases')
        if not os.path.exists(database_path):
            logger.warning(
                f'{log_prefix}: No SQLite database at {database_path}; An empty database will be created and dumped'
            )

        dump_path = make_dump_path(location_config)
        dump_filename = dump.make_database_dump_filename(dump_path, database['name'])
        if os.path.exists(dump_filename):
            logger.warning(
                f'{log_prefix}: Skipping duplicate dump of SQLite database at {database_path} to {dump_filename}'
            )
            continue

        command = (
            'sqlite3',
            database_path,
            '.dump',
            '>',
            dump_filename,
        )
        logger.debug(
            f'{log_prefix}: Dumping SQLite database at {database_path} to {dump_filename}{dry_run_label}'
        )
        if dry_run:
            continue

        dump.create_parent_directory_for_dump(dump_filename)
        processes.append(execute_command(command, shell=True, run_to_completion=False))

    return processes


def remove_database_dumps(databases, log_prefix, location_config, dry_run):  # pragma: no cover
    '''
    Remove the given SQLite3 database dumps from the filesystem. The databases are supplied as a
    sequence of configuration dicts, as per the configuration schema. Use the given log prefix in
    any log entries. Use the given location configuration dict to construct the destination path.
    If this is a dry run, then don't actually remove anything.
    '''
    dump.remove_database_dumps(make_dump_path(location_config), 'SQLite', log_prefix, dry_run)


def make_database_dump_pattern(
    databases, log_prefix, location_config, name=None
):  # pragma: no cover
    '''
    Make a pattern that matches the given SQLite3 databases. The databases are supplied as a
    sequence of configuration dicts, as per the configuration schema.
    '''
    return dump.make_database_dump_filename(make_dump_path(location_config), name)


def restore_database_dump(database_config, log_prefix, location_config, dry_run, extract_process):
    '''
    Restore the given SQLite3 database from an extract stream. The database is supplied as a
    one-element sequence containing a dict describing the database, as per the configuration schema.
    Use the given log prefix in any log entries. If this is a dry run, then don't actually restore
    anything. Trigger the given active extract process (an instance of subprocess.Popen) to produce
    output to consume.
    '''
    dry_run_label = ' (dry run; not actually restoring anything)' if dry_run else ''

    if len(database_config) != 1:
        raise ValueError('The database configuration value is invalid')

    database_path = database_config[0]['path']

    logger.debug(f'{log_prefix}: Restoring SQLite database at {database_path}{dry_run_label}')
    if dry_run:
        return

    try:
        os.remove(database_path)
        logger.warning(f'{log_prefix}: Removed existing SQLite database at {database_path}')
    except FileNotFoundError:  # pragma: no cover
        pass

    restore_command = (
        'sqlite3',
        database_path,
    )
    # Don't give Borg local path so as to error on warnings, as "borg extract" only gives a warning
    # if the restore paths don't exist in the archive.
    execute_command_with_processes(
        restore_command,
        [extract_process],
        output_log_level=logging.DEBUG,
        input_file=extract_process.stdout,
    )
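For orientation, here's a minimal configuration sketch that would exercise this new hook; the database name and path are illustrative (the same values appear in the documentation changes further below):

```yaml
hooks:
    sqlite_databases:
        - name: mydb
          path: /var/lib/sqlite3/mydb.sqlite
```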


@@ -68,7 +68,7 @@ class Multi_stream_handler(logging.Handler):
     def emit(self, record):
         '''
-        Dispatch the log record to the approriate stream handler for the record's log level.
+        Dispatch the log record to the appropriate stream handler for the record's log level.
         '''
         self.log_level_to_handler[record.levelno].emit(record)
@@ -108,7 +108,7 @@ def color_text(color, message):
     if not color:
         return message

-    return '{}{}{}'.format(color, message, colorama.Style.RESET_ALL)
+    return f'{color}{message}{colorama.Style.RESET_ALL}'


 def add_logging_level(level_name, level_number):
@@ -156,6 +156,7 @@ def configure_logging(
     log_file_log_level=None,
     monitoring_log_level=None,
     log_file=None,
+    log_file_format=None,
 ):
     '''
     Configure logging to go to both the console and (syslog or log file). Use the given log levels,
@@ -200,12 +201,18 @@ def configure_logging(
     if syslog_path and not interactive_console():
         syslog_handler = logging.handlers.SysLogHandler(address=syslog_path)
-        syslog_handler.setFormatter(logging.Formatter('borgmatic: %(levelname)s %(message)s'))
+        syslog_handler.setFormatter(
+            logging.Formatter('borgmatic: {levelname} {message}', style='{')  # noqa: FS003
+        )
         syslog_handler.setLevel(syslog_log_level)
         handlers = (console_handler, syslog_handler)
     elif log_file:
         file_handler = logging.handlers.WatchedFileHandler(log_file)
-        file_handler.setFormatter(logging.Formatter('[%(asctime)s] %(levelname)s: %(message)s'))
+        file_handler.setFormatter(
+            logging.Formatter(
+                log_file_format or '[{asctime}] {levelname}: {message}', style='{'  # noqa: FS003
+            )
+        )
         file_handler.setLevel(log_file_log_level)
         handlers = (console_handler, file_handler)
     else:


@@ -1,14 +1,14 @@
-FROM alpine:3.17.1 as borgmatic
+FROM docker.io/alpine:3.17.1 as borgmatic

 COPY . /app
 RUN apk add --no-cache py3-pip py3-ruamel.yaml py3-ruamel.yaml.clib
 RUN pip install --no-cache /app && generate-borgmatic-config && chmod +r /etc/borgmatic/config.yaml
 RUN borgmatic --help > /command-line.txt \
-    && for action in rcreate transfer prune compact create check extract export-tar mount umount restore rlist list rinfo info break-lock borg; do \
+    && for action in rcreate transfer create prune compact check extract export-tar mount umount restore rlist list rinfo info break-lock borg; do \
         echo -e "\n--------------------------------------------------------------------------------\n" >> /command-line.txt \
         && borgmatic "$action" --help >> /command-line.txt; done

-FROM node:19.5.0-alpine as html
+FROM docker.io/node:19.5.0-alpine as html

 ARG ENVIRONMENT=production
@@ -18,6 +18,7 @@ RUN npm install @11ty/eleventy \
     @11ty/eleventy-plugin-syntaxhighlight \
     @11ty/eleventy-plugin-inclusive-language \
     @11ty/eleventy-navigation \
+    eleventy-plugin-code-clipboard \
     markdown-it \
     markdown-it-anchor \
     markdown-it-replace-link
@@ -27,7 +28,7 @@ COPY . /source
 RUN NODE_ENV=${ENVIRONMENT} npx eleventy --input=/source/docs --output=/output/docs \
     && mv /output/docs/index.html /output/index.html

-FROM nginx:1.22.1-alpine
+FROM docker.io/nginx:1.22.1-alpine

 COPY --from=html /output /usr/share/nginx/html
 COPY --from=borgmatic /etc/borgmatic/config.yaml /usr/share/nginx/html/docs/reference/config.yaml


@@ -63,11 +63,6 @@
     top: -2px;
     bottom: 2px;
   }
-  @media (prefers-color-scheme: dark) {
-    .inlinelist .inlinelist-item code:before {
-      border-left-color: rgba(0,0,0,.8);
-    }
-  }
 }
 a.buzzword {
   text-decoration: underline;
@@ -91,26 +86,9 @@ a.buzzword {
 .buzzword {
   background-color: #f7f7f7;
 }
-@media (prefers-color-scheme: dark) {
-  .buzzword-list li,
-  .buzzword {
-    background-color: #080808;
-  }
-}
 .inlinelist .inlinelist-item {
   background-color: #e9e9e9;
 }
-@media (prefers-color-scheme: dark) {
-  .inlinelist .inlinelist-item {
-    background-color: #000;
-  }
-  .inlinelist .inlinelist-item a {
-    color: #fff;
-  }
-  .inlinelist .inlinelist-item code {
-    color: inherit;
-  }
-}
 .inlinelist .inlinelist-item:hover,
 .inlinelist .inlinelist-item:focus,
 .buzzword-list li:hover,
@@ -217,12 +195,6 @@ main p a.buzzword {
   height: 1.75em;
   font-weight: 600;
 }
-@media (prefers-color-scheme: dark) {
-  .numberflag {
-    background-color: #00bcd4;
-    color: #222;
-  }
-}
 h1 .numberflag,
 h2 .numberflag,
 h3 .numberflag,
@@ -244,11 +216,6 @@ h2 .numberflag:after {
   background-color: #fff;
   width: calc(100% + 0.4em); /* 16px /40 */
 }
-@media (prefers-color-scheme: dark) {
-  h2 .numberflag:after {
-    background-color: #222;
-  }
-}

 /* Super featured list on home page */
 .list-superfeatured .avatar {


@@ -12,16 +12,6 @@
   line-height: 1.285714285714; /* 18px /14 */
   font-family: system-ui, -apple-system, sans-serif;
 }
-@media (prefers-color-scheme: dark) {
-  .minilink {
-    background-color: #222;
-    /*
-    !important to override .elv-callout a
-    see _includes/components/callout.css
-    */
-    color: #fff !important;
-  }
-}
 table .minilink {
   margin-top: 6px;
 }
@@ -32,12 +22,6 @@ table .minilink {
 .minilink[href]:focus {
   background-color: #bbb;
 }
-@media (prefers-color-scheme: dark) {
-  .minilink[href]:hover,
-  .minilink[href]:focus {
-    background-color: #444;
-  }
-}
 pre + .minilink {
   color: #fff;
   border-radius: 0 0 0.2857142857143em 0.2857142857143em; /* 4px /14 */
@@ -74,11 +58,6 @@ h4 .minilink {
   text-transform: none;
   box-shadow: 0 0 0 1px rgba(0,0,0,0.3);
 }
-@media (prefers-color-scheme: dark) {
-  .minilink-addedin {
-    box-shadow: 0 0 0 1px rgba(255,255,255,0.3);
-  }
-}
 .minilink-addedin:not(:first-child) {
   margin-left: .5em;
 }


@@ -79,22 +79,11 @@
   border-bottom: 1px solid #ddd;
   margin-bottom: 0.25em; /* 4px /16 */
 }
-@media (prefers-color-scheme: dark) {
-  .elv-toc-list > li > a {
-    color: #fff;
-    border-color: #444;
-  }
-}

 /* Active links */
 .elv-toc-list li.elv-toc-active > a {
   background-color: #dff7ff;
 }
-@media (prefers-color-scheme: dark) {
-  .elv-toc-list li.elv-toc-active > a {
-    background-color: #353535;
-  }
-}
 .elv-toc-list ul .elv-toc-active > a:after {
   content: "";
 }
@@ -105,7 +94,7 @@
   display: block;
 }

-/* Footer catgory navigation */
+/* Footer category navigation */
 .elv-cat-list-active {
   font-weight: 600;
 }


@@ -285,11 +285,6 @@ footer.elv-layout {
 .elv-hero {
   background-color: #222;
 }
-@media (prefers-color-scheme: dark) {
-  .elv-hero {
-    background-color: #292929;
-  }
-}
 .elv-hero img,
 .elv-hero svg {
   width: 42.95774646vh;
@@ -538,3 +533,18 @@ main .elv-toc + h1 .direct-link {
 .header-anchor:hover::after {
   content: " 🔗";
 }
+
+.mdi {
+  display: inline-block;
+  width: 1em;
+  height: 1em;
+  background-color: currentColor;
+  -webkit-mask: no-repeat center / 100%;
+  mask: no-repeat center / 100%;
+  -webkit-mask-image: var(--svg);
+  mask-image: var(--svg);
+}
+
+.mdi.mdi-content-copy {
+  --svg: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 24 24' width='24' height='24'%3E%3Cpath fill='black' d='M19 21H8V7h11m0-2H8a2 2 0 0 0-2 2v14a2 2 0 0 0 2 2h11a2 2 0 0 0 2-2V7a2 2 0 0 0-2-2m-3-4H4a2 2 0 0 0-2 2v14h2V3h12V1Z'/%3E%3C/svg%3E");
+}


@@ -3,6 +3,7 @@
 <head>
   <meta charset="utf-8">
   <meta name="viewport" content="width=device-width, initial-scale=1.0">
+  <link rel="icon" href="docs/static/borgmatic.png" type="image/x-icon">
   <title>{{ subtitle + ' - ' if subtitle}}{{ title }}</title>
   {%- set css %}
   {% include 'index.css' %}
@@ -22,6 +23,6 @@
 <body>

   {{ content | safe }}
+  {% initClipboardJS %}
 </body>
 </html>


variables you can use here:

* `configuration_filename`: borgmatic configuration filename in which the
  hook was defined
* `log_file`
  <span class="minilink minilink-addedin">New in version 1.7.12</span>:
  path of the borgmatic log file, only set when the `--log-file` flag is used
* `repository`: path of the current repository as configured in the current
  borgmatic configuration file
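For instance, a sketch of a hook that interpolates the new variable (the command itself is hypothetical; borgmatic substitutes the `{log_file}` placeholder before running the hook):

```yaml
hooks:
    after_backup:
        - echo "backup finished; see {log_file}"
```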


```yaml
location:
    source_directories:
        - /home

    repositories:
        - path: /mnt/removable/backup.borg
```

<span class="minilink minilink-addedin">Prior to version 1.7.10</span> Omit
the `path:` portion of the `repositories` list.

Then, write a `before_backup` hook in that same configuration file that uses
the external `findmnt` utility to see whether the drive is mounted before
proceeding.

```yaml
location:
    source_directories:
        - /home

    repositories:
        - path: ssh://me@buddys-server.org/./backup.borg

hooks:
    before_backup:
        - ping -q -c 1 buddys-server.org > /dev/null || exit 75
```

<span class="minilink minilink-addedin">Prior to version 1.7.10</span> Omit
the `path:` portion of the `repositories` list.

Or to only run backups if the battery level is high enough:

There are some caveats you should be aware of with this feature.

* You'll generally want to put a soft failure command in the `before_backup`
  hook, so as to gate whether the backup action occurs. While a soft failure is
  also supported in the `after_backup` hook, returning a soft failure there
  won't prevent any actions from occurring, because they've already occurred!
  Similarly, you can return a soft failure from an `on_error` hook, but at
  that point it's too late to prevent the error.
* Returning a soft failure does prevent further commands in the same hook from
  executing. So, like a standard error, it is an "early out". Unlike a standard


consistent snapshot that is more suited for backups.

Fortunately, borgmatic includes built-in support for creating database dumps
prior to running backups. For example, here is everything you need to dump and
backup a couple of local PostgreSQL databases and a MySQL/MariaDB database.

```yaml
hooks:
    postgresql_databases:
        - name: users
        - name: orders
    mysql_databases:
        - name: posts
```

<span class="minilink minilink-addedin">New in version 1.5.22</span> You can
also dump MongoDB databases. For example:

```yaml
hooks:
    mongodb_databases:
        - name: messages
```

<span class="minilink minilink-addedin">New in version 1.7.9</span>
Additionally, you can dump SQLite databases. For example:

```yaml
hooks:
    sqlite_databases:
        - name: mydb
          path: /var/lib/sqlite3/mydb.sqlite
```

As part of each backup, borgmatic streams a database dump for each configured
database directly to Borg, so it's included in the backup without consuming
additional disk space. (The exceptions are the PostgreSQL/MongoDB "directory"

```yaml
          password: trustsome1
          authentication_database: mongousers
          options: "--ssl"
    sqlite_databases:
        - name: mydb
          path: /var/lib/sqlite3/mydb.sqlite
```

See your [borgmatic configuration

Note that you may need to use a `username` of the `postgres` superuser for
this to work with PostgreSQL.

The SQLite hook in particular does not consider "all" a special database name.

<span class="minilink minilink-addedin">New in version 1.7.6</span> With
PostgreSQL and MySQL, you can optionally dump "all" databases to separate
files instead of one combined dump file, allowing more convenient restores of

```yaml
          format: sql
```

### Containers

If your database is running within a Docker container and borgmatic is too, no
problem—simply configure borgmatic to connect to the container's name on its
exposed port. For instance:

```yaml
hooks:
    postgresql_databases:
        - name: users
          hostname: your-database-container-name
          port: 5433
          username: postgres
          password: trustsome1
```

But what if borgmatic is running on the host? You can still connect to a
database container if its ports are properly exposed to the host. For
instance, when running the database container with Docker, you can specify
`--publish 127.0.0.1:5433:5432` so that it exposes the container's port 5432
to port 5433 on the host (only reachable on localhost, in this case). Or the
same thing with Docker Compose:

```yaml
services:
    your-database-container-name:
        image: postgres
        ports:
            - 127.0.0.1:5433:5432
```

And then you can connect to the database from borgmatic running on the host:

```yaml
hooks:
    postgresql_databases:
        - name: users
          hostname: 127.0.0.1
          port: 5433
          username: postgres
          password: trustsome1
```

Of course, alter the ports in these examples to suit your particular database
system.

### No source directories

<span class="minilink minilink-addedin">New in version 1.7.1</span> If you

### External passwords

If you don't want to keep your database passwords in your borgmatic

bring back any missing configuration files in order to restore a database.

## Supported databases

As of now, borgmatic supports PostgreSQL, MySQL/MariaDB, MongoDB, and SQLite
databases directly. But see below about general-purpose preparation and
cleanup hooks as a work-around with other database systems. Also, please [file
a ticket](https://torsion.org/borgmatic/#issues) for additional database
systems that you'd like supported.

## Database restoration

If you have a single repository in your borgmatic configuration file(s), no
problem: the `restore` action figures out which repository to use.

But if you have multiple repositories configured, then you'll need to specify
the repository to use via the `--repository` flag. This can be done either
with the repository's path or its label as configured in your borgmatic
configuration file.

```bash
borgmatic restore --repository repo.borg --archive host-2023-...
```

user and you're extracting to `/tmp`, then the dump will be in
`/tmp/root/.borgmatic`.

After extraction, you can manually restore the dump file using native database
commands like `pg_restore`, `mysql`, `mongorestore`, `sqlite`, or similar.

Also see the documentation on [listing database
dumps](https://torsion.org/borgmatic/docs/how-to/inspect-your-backups/#listing-database-dumps).

## Preparation and cleanup hooks

dumps with any database system.

## Troubleshooting

### PostgreSQL/MySQL authentication errors

With PostgreSQL and MySQL/MariaDB, if you're getting authentication errors
when borgmatic tries to connect to your database, a natural reaction is to
increase your borgmatic verbosity with `--verbosity 2` and go looking in the
logs. You'll notice however that your database password does not show up in
the logs. This is likely not the cause of the authentication problem unless
you mistyped your password, however; borgmatic passes your password to the
database via an environment variable that does not appear in the logs.

The cause of an authentication error is often on the database side—in the
configuration of which users are allowed to connect and how they are
authenticated. For instance, with PostgreSQL, check your
[pg_hba.conf](https://www.postgresql.org/docs/current/auth-pg-hba-conf.html)
file for that configuration.
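As a concrete (if hypothetical) illustration of that server-side configuration, a `pg_hba.conf` entry allowing password-authenticated TCP connections from localhost might look like:

```text
# TYPE  DATABASE  USER  ADDRESS       METHOD
host    all       all   127.0.0.1/32  md5
```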
### MySQL table lock errors

If you encounter table lock errors during a database dump with MySQL/MariaDB,


Borg itself is great for efficiently de-duplicating data across successive
backup archives, even when dealing with very large repositories. But you may
find that while borgmatic's default actions of `create`, `prune`, `compact`,
and `check` work well on small repositories, it's not so great on larger
ones. That's because running the default pruning, compact, and consistency
checks takes a long time on large repositories.

<span class="minilink minilink-addedin">Prior to version 1.7.9</span> The
default action ordering was `prune`, `compact`, `create`, and `check`.

### A la carte actions

If you find yourself wanting to customize the actions, you have some options.
First, you can run borgmatic's `prune`, `compact`, `create`, or `check`
actions separately. For instance, the following optional actions are
available (among others):

```bash
borgmatic create
borgmatic prune
borgmatic compact
borgmatic check
```

You can run borgmatic with only one of these actions provided, or you can mix
and match any number of them in a single borgmatic run. This supports
approaches like skipping certain actions while running others. For instance,
this skips `prune` and `compact` and only runs `create` and `check`:

```bash
borgmatic create check
```

<span class="minilink minilink-addedin">New in version 1.7.9</span> borgmatic
now respects your specified command-line action order, running actions in the
order you specify. In previous versions, borgmatic ran your specified actions
in a fixed ordering regardless of the order they appeared on the command-line.

But instead of running actions together, another option is to run backups with
`create` on a frequent schedule (e.g. with `borgmatic create` called from one
cron job), while only running expensive consistency checks with `check` on a
much less frequent basis (e.g. with `borgmatic check` called from a separate
cron job).
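As a sketch, such a split schedule might look like this in a crontab (the times are illustrative):

```text
# Create new backup archives daily, but only run expensive checks weekly.
0 3 * * *   borgmatic create
30 5 * * 0  borgmatic check
```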
### Consistency check configuration

Another option is to customize your consistency checks. By default, if you
omit consistency checks from configuration, borgmatic runs full-repository
checks (`repository`) and per-archive checks (`archives`) within each
repository. (Although see below about check frequency.) This is equivalent to
what `borg check` does if run without options.

But if you find that archive checks are too slow, for example, you can
configure borgmatic to run repository checks only. Configure this in the

```yaml
consistency:
    checks:
        - name: repository
```

<span class="minilink minilink-addedin">Prior to version 1.6.2</span> The
`checks` option was a plain list of strings without the `name:` part, and
borgmatic ran each configured check every time checks were run. For example:

This tells borgmatic to run the `repository` consistency check at most once
every two weeks for a given repository and the `archives` check at most once a
month. The `frequency` value is a number followed by a unit of time, e.g. "3
days", "1 week", "2 months", etc.

The `frequency` defaults to `always` for a check configured without a
`frequency`, which means run this check every time checks run. But if you omit
consistency checks from configuration entirely, borgmatic runs full-repository
checks (`repository`) and per-archive checks (`archives`) within each
repository, at most once a month.
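Putting that together, a checks configuration matching the frequencies described above might look like this (a sketch consistent with the prose, since the original example is cut off in this excerpt):

```yaml
consistency:
    checks:
        - name: repository
          frequency: 2 weeks
        - name: archives
          frequency: 1 month
```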
Unlike a real scheduler like cron, borgmatic only makes a best effort to run
checks on the configured frequency. It compares that frequency with how long


so that you can run borgmatic commands while you're hacking on them to
make sure your changes work.

```bash
cd borgmatic
pip3 install --user --editable .
```

Note that this will typically install the borgmatic commands into

```bash
pip3 install --user tox
```

Finally, to actually run tests, run:

```bash
tox
```

can ask isort to order your imports for you:

```bash
tox -e isort
```

Similarly, if you get errors about spelling mistakes in source code, you can
ask [codespell](https://github.com/codespell-project/codespell) to correct
them:

```bash
tox -e codespell
```

### End-to-end tests

borgmatic additionally includes some end-to-end tests that integration test

If you would like to run the full test suite, first install Docker and [Docker
Compose](https://docs.docker.com/compose/install/). Then run:

```bash
scripts/run-end-to-end-dev-tests
```

Note that this script assumes you have permission to run Docker. If you
don't, then you may need to run with `sudo`.

#### Podman

<span class="minilink minilink-addedin">New in version 1.7.12</span>
borgmatic's end-to-end tests optionally support using
[rootless](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md)
[Podman](https://podman.io/) instead of Docker.

Setting up Podman is outside the scope of this documentation, but here are
some key points to double-check:

* Install Podman along with `podman-docker` and your desired networking
  support.
* Configure `/etc/subuid` and `/etc/subgid` to map users/groups for the
  non-root user who will run tests.
* Create a non-root Podman socket for that user:

```bash
systemctl --user enable --now podman.socket
```

Then you'll be able to run end-to-end tests as per normal, and the test script
will automatically use your non-root Podman socket instead of a Docker socket.

## Code style

Start with [PEP 8](https://www.python.org/dev/peps/pep-0008/). But then, apply
the following deviations from it:

* For strings, prefer single quotes over double quotes.
* Limit all lines to a maximum of 100 characters.
* Use trailing commas within multiline values or argument lists.
* For multiline constructs, put opening and closing delimiters on lines
  separate from their contents.
* Within multiline constructs, use standard four-space indentation. Don't align
  indentation with an opening delimiter.

borgmatic code uses the [Black](https://black.readthedocs.io/en/stable/) code
formatter, the [Flake8](http://flake8.pycqa.org/en/latest/) code checker, and

http://localhost:8080 to view the documentation with your changes.

To close the documentation server, ctrl-C the script. Note that it does not
currently auto-reload, so you'll need to stop it and re-run it for any
additional documentation changes to take effect.

#### Podman

<span class="minilink minilink-addedin">New in version 1.7.12</span>
borgmatic's developer build for documentation optionally supports using
[rootless](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md)
[Podman](https://podman.io/) instead of Docker.

Setting up Podman is outside the scope of this documentation. But once you
install `podman-docker`, then `scripts/dev-docs` should automatically use
Podman instead of Docker.


```bash
borgmatic rlist
```

That should yield output looking something like:

```text
host-2023-01-01T04:05:06.070809 Tue, 2023-01-01 04:05:06 [...]
host-2023-01-02T04:06:07.080910 Wed, 2023-01-02 04:06:07 [...]
```

Assuming that you want to extract the archive with the most up-to-date files
and therefore the latest timestamp, run a command like:

```bash
borgmatic extract --archive host-2023-01-02T04:06:07.080910
```

(No borgmatic `extract` action? Upgrade borgmatic!)

If you have a single repository in your borgmatic configuration file(s), no
problem: the `extract` action figures out which repository to use.

But if you have multiple repositories configured, then you'll need to specify
the repository to use via the `--repository` flag. This can be done either
with the repository's path or its label as configured in your borgmatic
configuration file.

```bash
borgmatic extract --repository repo.borg --archive host-2023-...
```

## Extract particular files

run the `extract` command above, borgmatic will extract `/var/path/1` and
`/var/path/2`.

### Searching for files

If you're not sure which archive contains the files you're looking for, you
can [search across
archives](https://torsion.org/borgmatic/docs/how-to/inspect-your-backups/#searching-for-a-file).

## Extract to a particular destination

By default, borgmatic extracts files into the current directory. To instead


example, to search only the last five archives:

```bash
borgmatic list --find foo.txt --last 5
```

## Listing database dumps

If you have enabled borgmatic's [database
hooks](https://torsion.org/borgmatic/docs/how-to/backup-your-databases/), you
can list backed up database dumps via borgmatic. For example:

```bash
borgmatic list --archive latest --find .borgmatic/*_databases
```

This gives you a listing of all database dump files contained in the latest
archive, complete with file sizes.

## Logging

By default, borgmatic logs to a local syslog-compatible daemon if one is
present and borgmatic is running in a non-interactive console. Where those
logs show up depends on your particular system. If you're using systemd, try
running `journalctl -xe`. Otherwise, try viewing `/var/log/syslog` or
similar.

You can customize the log level used for syslog logging with the
`--syslog-verbosity` flag, and this is independent from the console logging

```bash
borgmatic --log-file /path/to/file.log
```

Note that if you use the `--log-file` flag, you are responsible for rotating
the log file so it doesn't grow too large, for example with
[logrotate](https://wiki.archlinux.org/index.php/Logrotate).

You can use the `--log-file-verbosity` flag to customize the log file's log
level:

```bash
borgmatic --log-file /path/to/file.log --log-file-verbosity 2
```

<span class="minilink minilink-addedin">New in version 1.7.11</span> Use the
`--log-file-format` flag to override the default log message format. This
format string can contain a series of named placeholders wrapped in curly
brackets. For instance, the default log format is: `[{asctime}] {levelname}:
{message}`. This means each log message is recorded as the log time (in square
brackets), a logging level name, a colon, and the actual log message.

So if you just want each log message to get logged *without* a timestamp or a
logging level name:

```bash
borgmatic --log-file /path/to/file.log --log-file-format "{message}"
```

Here is a list of available placeholders:

* `{asctime}`: time the log message was created
* `{levelname}`: level of the log message (`INFO`, `DEBUG`, etc.)
* `{lineno}`: line number in the source file where the log message originated
* `{message}`: actual log message
* `{pathname}`: path of the source file where the log message originated

See the [Python logging
documentation](https://docs.python.org/3/library/logging.html#logrecord-attributes)
for additional placeholders.
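For example, several of these placeholders can be combined to record where each message originated (an illustrative combination, not a special syntax):

```bash
borgmatic --log-file /path/to/file.log \
    --log-file-format "[{asctime}] {levelname} {pathname}:{lineno} {message}"
```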
Note that this `--log-file-format` flag only applies to the specified
`--log-file` and not to syslog or other logging.


```yaml
location:
    # Paths of local or remote repositories to backup to.
    repositories:
        - path: ssh://k8pDxu32@k8pDxu32.repo.borgbase.com/./repo
        - path: /var/lib/backups/local.borg
```

<span class="minilink minilink-addedin">Prior to version 1.7.10</span> Omit
the `path:` portion of the `repositories` list.

When you run borgmatic with this configuration, it invokes Borg once for each
configured repository in sequence. (So, not in parallel.) That means—in each
repository—borgmatic creates a single new backup archive containing all of
your source directories.

Here's a way of visualizing what borgmatic does with the above configuration:

1. Backup `/home` and `/etc` to `k8pDxu32@k8pDxu32.repo.borgbase.com:repo`
2. Backup `/home` and `/etc` to `/var/lib/backups/local.borg`

This gives you redundancy of your data across repositories and even
potentially across providers.


choice](https://torsion.org/borgmatic/docs/how-to/set-up-backups/#autopilot),
each entry using borgmatic's `--config` flag instead of relying on
`/etc/borgmatic.d`.
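For example, such an entry might invoke borgmatic with an explicit per-application configuration file (the path here is illustrative):

```bash
borgmatic --config /home/me/borgmatic/app1.yaml
```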
## Archive naming
If you've got multiple borgmatic configuration files, you might want to create
archives with different naming schemes for each one. This is especially handy
if each configuration file is backing up to the same Borg repository but you
still want to be able to distinguish backup archives for one application from
another.
borgmatic supports this use case with an `archive_name_format` option. The
idea is that you define a string format containing a number of [Borg
placeholders](https://borgbackup.readthedocs.io/en/stable/usage/help.html#borg-placeholders),
and borgmatic uses that format to name any new archive it creates. For
instance:
```yaml
storage:
...
archive_name_format: home-directories-{now}
```
This means that when borgmatic creates an archive, its name will start with
the string `home-directories-` and end with a timestamp for its creation time.
If `archive_name_format` is unspecified, the default is
`{hostname}-{now:%Y-%m-%dT%H:%M:%S.%f}`, meaning your system hostname plus a
timestamp in a particular format.
<span class="minilink minilink-addedin">New in version 1.7.11</span> borgmatic
uses the `archive_name_format` option to automatically limit which archives
get used for actions operating on multiple archives. This prevents, for
instance, duplicate archives from showing up in `rlist` or `info` results—even
if the same repository appears in multiple borgmatic configuration files. To
take advantage of this feature, simply use a different `archive_name_format`
in each configuration file.
Under the hood, borgmatic accomplishes this by substituting globs for certain
ephemeral data placeholders in your `archive_name_format`—and using the result
to filter archives when running supported actions.
For instance, let's say that you have this in your configuration:
```yaml
storage:
...
archive_name_format: {hostname}-user-data-{now}
```
borgmatic considers `{now}` an ephemeral data placeholder that will probably
change per archive, while `{hostname}` won't. So it turns the example value
into `{hostname}-user-data-*` and applies it to filter down the set of
archives used for actions like `rlist`, `info`, `prune`, `check`, etc.
The end result is that when borgmatic runs the actions for a particular
application-specific configuration file, it only operates on the archives
created for that application. Of course, this doesn't apply to actions like
`compact` that operate on an entire repository.
If this behavior isn't quite smart enough for your needs, you can use the
`match_archives` option to override the pattern that borgmatic uses for
filtering archives. For example:
```yaml
storage:
    ...
    archive_name_format: {hostname}-user-data-{now}
    match_archives: sh:myhost-user-data-*
```
For Borg 1.x, use a shell pattern for the `match_archives` value and see the
[Borg patterns
documentation](https://borgbackup.readthedocs.io/en/stable/usage/help.html#borg-help-patterns)
for more information. For Borg 2.x, see the [match archives
documentation](https://borgbackup.readthedocs.io/en/2.0.0b5/usage/help.html#borg-help-match-archives).
Some borgmatic command-line actions also have a `--match-archives` flag that
overrides both the auto-matching behavior and the `match_archives`
configuration option.
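For instance, to override the match for a single invocation (the pattern here
is illustrative):

```bash
borgmatic rlist --match-archives 'sh:myhost-user-data-*'
```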
<span class="minilink minilink-addedin">Prior to 1.7.11</span> The way to
limit the archives used for the `prune` action was a `prefix` option in the
`retention` section for matching against the start of archive names. And the
option for limiting the archives used for the `check` action was a separate
`prefix` in the `consistency` section. Both of these options are deprecated in
favor of the auto-matching behavior (or `match_archives`/`--match-archives`)
in newer versions of borgmatic.
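For reference, the deprecated options looked something like this (a sketch;
substitute your own archive name prefix):

```yaml
retention:
    ...
    prefix: myhost-

consistency:
    ...
    prefix: myhost-
```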
## Configuration includes

Once you have multiple different configuration files, you might want to share
@@ -189,6 +276,81 @@ include, the local file's option takes precedence.
list values are appended together.
### Shallow merge
Even though deep merging is generally pretty handy for included files,
sometimes you want specific sections in the local file to take precedence over
included sections—without any merging occurring for them.
<span class="minilink minilink-addedin">New in version 1.7.12</span> That's
where the `!retain` tag comes in. Whenever you're merging an included file
into your configuration file, you can optionally add the `!retain` tag to
particular local mappings or lists to retain the local values and ignore
included values.
For instance, start with this configuration file containing the `!retain` tag
on the `retention` mapping:
```yaml
<<: !include /etc/borgmatic/common.yaml

location:
    repositories:
        - repo.borg

retention: !retain
    keep_daily: 5
```
And `common.yaml` like this:
```yaml
location:
    repositories:
        - common.borg

retention:
    keep_hourly: 24
    keep_daily: 7
```
Once this include gets merged in, the resulting configuration will have a
`keep_daily` value of `5` and nothing else in the `retention` section. That's
because the `!retain` tag says to retain the local version of `retention` and
ignore any values coming in from the include. But because the `repositories`
list doesn't have a `!retain` tag, it still gets merged together to contain
both `common.borg` and `repo.borg`.
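So the merged result is equivalent to the following (list order may vary):

```yaml
location:
    repositories:
        - common.borg
        - repo.borg

retention:
    keep_daily: 5
```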
The `!retain` tag can only be placed on mappings and lists, and it goes right
after the name of the option (and its colon) on the same line. The effects of
`!retain` are recursive, meaning that if you place a `!retain` tag on a
top-level mapping, even deeply nested values within it will not be merged.
Additionally, the `!retain` tag only works in a configuration file that also
performs a merge include with `<<: !include`. It doesn't make sense within,
for instance, an included configuration file itself (unless it in turn
performs its own merge include). That's because `!retain` only applies to the
file doing the include; it doesn't work in reverse or propagate through
includes.
## Debugging includes
<span class="minilink minilink-addedin">New in version 1.7.12</span> If you'd
like to see what the loaded configuration looks like after includes get merged
in, run `validate-borgmatic-config` on your configuration file:
```bash
sudo validate-borgmatic-config --show
```
You'll need to specify your configuration file with `--config` if it's not in
a default location.
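For example (the path here is illustrative):

```bash
sudo validate-borgmatic-config --config /etc/borgmatic.d/app.yaml --show
```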
This will output the merged configuration as borgmatic sees it, which can be
helpful for understanding how your includes work in practice.
## Configuration overrides

In more complex multi-application setups, you may want to override particular
@@ -255,3 +417,51 @@ Be sure to quote your overrides if they contain spaces or other characters
that your shell may interpret.

An alternative to command-line overrides is passing in your values via [environment variables](https://torsion.org/borgmatic/docs/how-to/provide-your-passwords/).
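For example, here's a quoted override whose value contains spaces (the paths
are illustrative):

```bash
borgmatic create --override 'location.source_directories=["/home/user/path with spaces"]'
```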
## Constant interpolation
<span class="minilink minilink-addedin">New in version 1.7.10</span> Another
tool is borgmatic's support for defining custom constants. This is similar to
the [variable interpolation
feature](https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/#variable-interpolation)
for command hooks, but the constants feature lets you substitute your own
custom values anywhere in the entire configuration file. (Constants don't
work across includes or separate configuration files, though.)
Here's an example usage:
```yaml
constants:
    user: foo
    archive_prefix: bar

location:
    source_directories:
        - /home/{user}/.config
        - /home/{user}/.ssh
    ...

storage:
    archive_name_format: '{archive_prefix}-{now}'
```
In this example, when borgmatic runs, all instances of `{user}` get replaced
with `foo` and all instances of `{archive_prefix}` get replaced with `bar`.
(And in this particular example, `{now}` doesn't get replaced with anything,
but gets passed directly to Borg.) After substitution, the logical result
looks something like this:
```yaml
location:
    source_directories:
        - /home/foo/.config
        - /home/foo/.ssh
    ...

storage:
    archive_name_format: 'bar-{now}'
```
An alternative to constants is passing in your values via [environment
variables](https://torsion.org/borgmatic/docs/how-to/provide-your-passwords/).


@@ -83,7 +83,7 @@ tests](https://torsion.org/borgmatic/docs/how-to/extract-a-backup/).

## Error hooks

When an error occurs during a `create`, `prune`, `compact`, or `check` action,
borgmatic can run configurable shell commands to fire off custom error
notifications or take other actions, so you can get alerted as soon as
something goes wrong. Here's a not-so-useful example:
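A minimal sketch of what such a hook can look like (the command is just an
illustration):

```yaml
hooks:
    on_error:
        - echo "Error while creating a backup."
```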
@@ -116,8 +116,8 @@ the repository. Here's the full set of supported variables you can use here:

* `output`: output of the command that failed (may be blank if an error
  occurred without running a command)

Note that borgmatic runs the `on_error` hooks only for `create`, `prune`,
`compact`, or `check` actions or hooks in which an error occurs, and not other
actions. borgmatic does not run `on_error` hooks if an error occurs within a
`before_everything` or `after_everything` hook. For more about hooks, see the
[borgmatic hooks
@@ -144,7 +144,7 @@ With this hook in place, borgmatic pings your Healthchecks project when a
backup begins, ends, or errors. Specifically, after the <a
href="https://torsion.org/borgmatic/docs/how-to/add-preparation-and-cleanup-steps-to-backups/">`before_backup`
hooks</a> run, borgmatic lets Healthchecks know that it has started if any of
the `create`, `prune`, `compact`, or `check` actions are run.
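A sketch of the corresponding configuration, assuming the string-style hook
format of this borgmatic version (the ping URL is a placeholder):

```yaml
hooks:
    healthchecks: https://hc-ping.com/your-uuid-here
```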
Then, if the actions complete successfully, borgmatic notifies Healthchecks of
the success after the `after_backup` hooks run, and includes borgmatic logs in
@@ -154,8 +154,8 @@ in the Healthchecks UI, although be aware that Healthchecks currently has a

If an error occurs during any action or hook, borgmatic notifies Healthchecks
after the `on_error` hooks run, also tacking on logs including the error
itself. But the logs are only included for errors that occur when a `create`,
`prune`, `compact`, or `check` action is run.

You can customize the verbosity of the logs that are sent to Healthchecks with
borgmatic's `--monitoring-verbosity` flag. The `--list` and `--stats` flags


@@ -53,7 +53,8 @@ This runs Borg's `rlist` command once on each configured borgmatic repository.
(The native `borgmatic rlist` action should be preferred for most use.)

What if you only want to run Borg on a single configured borgmatic repository
when you've got several configured? Not a problem. The `--repository` argument
lets you specify the repository to use, either by its path or its label:
```bash
borgmatic borg --repository repo.borg break-lock
```
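Or, assuming you've labeled a repository `local` in your configuration:

```bash
borgmatic borg --repository local break-lock
```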


@@ -90,10 +90,11 @@ installing borgmatic:

* [Fedora unofficial](https://copr.fedorainfracloud.org/coprs/heffer/borgmatic/)
* [Arch Linux](https://www.archlinux.org/packages/community/any/borgmatic/)
* [Alpine Linux](https://pkgs.alpinelinux.org/packages?name=borgmatic)
* [OpenBSD](https://openports.pl/path/sysutils/borgmatic)
* [openSUSE](https://software.opensuse.org/package/borgmatic)
* [macOS (via Homebrew)](https://formulae.brew.sh/formula/borgmatic)
* [macOS (via MacPorts)](https://ports.macports.org/port/borgmatic/)
* [NixOS](https://search.nixos.org/packages?show=borgmatic&sort=relevance&type=packages&query=borgmatic)
* [Ansible role](https://github.com/borgbase/ansible-role-borgbackup)
* [virtualenv](https://virtualenv.pypa.io/en/stable/)
@@ -156,7 +157,7 @@ variable or set the `BORG_PASSPHRASE` environment variable. See the
section](https://borgbackup.readthedocs.io/en/stable/quickstart.html#repository-encryption)
of the Borg Quick Start for more info.

Alternatively, you can specify the passphrase programmatically by setting
either the borgmatic `encryption_passcommand` configuration variable or the
`BORG_PASSCOMMAND` environment variable. See the [Borg Security
FAQ](http://borgbackup.readthedocs.io/en/stable/faq.html#how-can-i-specify-the-encryption-passphrase-programmatically)
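For instance, a sketch using a passphrase stored in the system keyring via
`secret-tool` (the lookup attributes are illustrative):

```yaml
storage:
    encryption_passcommand: secret-tool lookup borg-repository passphrase
```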
@@ -179,6 +180,9 @@ following command is available for that:

```bash
sudo validate-borgmatic-config
```
You'll need to specify your configuration file with `--config` if it's not in
a default location.
This command's exit status (`$?` in Bash) is zero when configuration is valid
and non-zero otherwise.
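That makes it easy to script, for example (a sketch):

```bash
if sudo validate-borgmatic-config; then
    echo "Configuration is valid."
else
    echo "Configuration is invalid." >&2
fi
```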


@@ -145,15 +145,18 @@ like this:

```yaml
location:
    repositories:
        - path: original.borg
```
<span class="minilink minilink-addedin">Prior to version 1.7.10</span> Omit
the `path:` portion of the `repositories` list.
Change it to a new (not yet created) repository path:

```yaml
location:
    repositories:
        - path: upgraded.borg
```

Then, run the `rcreate` action (formerly `init`) to create that new Borg 2
repository.
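A sketch of that command, assuming the repository paths from the examples
above (check `borgmatic rcreate --help` for the flags in your version):

```bash
borgmatic rcreate --verbosity 1 --encryption repokey-blake2-aes-ocb \
    --source-repository original.borg --repository upgraded.borg
```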


@@ -7,8 +7,10 @@ eleventyNavigation:
---

## borgmatic options

Here are all of the available borgmatic command-line options, including the
separate options for each action sub-command. Note that most of the flags
listed here do not have equivalents in borgmatic's [configuration
file](https://torsion.org/borgmatic/docs/reference/configuration/).

```
{% include borgmatic/command-line.txt %}
```

docs/static/sqlite.png (new binary file, 4.6 KiB)


@@ -0,0 +1,20 @@
#!/bin/sh
# This script is for running end-to-end tests on a developer machine. It sets up database containers
# to run tests against, runs the tests, and then tears down the containers.
#
# Run this script from the root directory of the borgmatic source.
#
# For more information, see:
# https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/
set -e
USER_PODMAN_SOCKET_PATH=/run/user/$UID/podman/podman.sock
if [ -e "$USER_PODMAN_SOCKET_PATH" ]; then
    export DOCKER_HOST="unix://$USER_PODMAN_SOCKET_PATH"
fi

docker-compose --file tests/end-to-end/docker-compose.yaml up --force-recreate \
    --renew-anon-volumes --abort-on-container-exit


@@ -1,14 +0,0 @@
#!/bin/sh
# This script is for running all tests, including end-to-end tests, on a developer machine. It sets
# up database containers to run tests against, runs the tests, and then tears down the containers.
#
# Run this script from the root directory of the borgmatic source.
#
# For more information, see:
# https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/
set -e
docker-compose --file tests/end-to-end/docker-compose.yaml up --force-recreate \
    --renew-anon-volumes --abort-on-container-exit


@@ -3,19 +3,30 @@
# This script installs test dependencies and runs all tests, including end-to-end tests. It
# is designed to run inside a test container, and presumes that other test infrastructure like
# databases are already running. Therefore, on a developer machine, you should not run this script
# directly. Instead, run scripts/run-end-to-end-dev-tests
#
# For more information, see:
# https://torsion.org/borgmatic/docs/how-to/develop-on-borgmatic/

set -e
if [ -z "$TEST_CONTAINER" ] ; then
    echo "This script is designed to work inside a test container and is not intended to"
    echo "be run manually. If you're trying to run borgmatic's end-to-end tests, execute"
    echo "scripts/run-end-to-end-dev-tests instead."
    exit 1
fi
apk add --no-cache python3 py3-pip borgbackup postgresql-client mariadb-client mongodb-tools \
    py3-ruamel.yaml py3-ruamel.yaml.clib bash sqlite

# If certain dependencies of black are available in this version of Alpine, install them.
apk add --no-cache py3-typed-ast py3-regex || true

python3 -m pip install --no-cache --upgrade pip==22.2.2 setuptools==64.0.1
pip3 install --ignore-installed tox==3.25.1

export COVERAGE_FILE=/tmp/.coverage

if [ "$1" != "--end-to-end-only" ] ; then
    tox --workdir /tmp/.tox --sitepackages
fi

tox --workdir /tmp/.tox --sitepackages -e end-to-end


@@ -5,11 +5,13 @@ description_file=README.md
testpaths = tests
addopts = --cov-report term-missing:skip-covered --cov=borgmatic --ignore=tests/end-to-end
filterwarnings =
    ignore:Deprecated call to `pkg_resources.declare_namespace\('ruamel'\)`.*:DeprecationWarning
[flake8]
ignore = E501,W503
exclude = *.*/*
multiline-quotes = '''
docstring-quotes = '''
[tool:isort]
force_single_line = False
@@ -18,3 +20,6 @@ known_first_party = borgmatic
line_length = 100
multi_line_output = 3
skip = .tox
[codespell]
skip = .git,.tox,build


@@ -1,6 +1,6 @@
from setuptools import find_packages, setup

VERSION = '1.7.12.dev0'

setup(


@@ -2,9 +2,13 @@ appdirs==1.4.4; python_version >= '3.8'
attrs==20.3.0; python_version >= '3.8'
black==19.10b0; python_version >= '3.8'
click==7.1.2; python_version >= '3.8'
codespell==2.2.4
colorama==0.4.4
coverage==5.3
flake8==4.0.1
flake8-quotes==3.3.2
flake8-use-fstring==1.4
flake8-variables-names==0.0.5
flexmock==0.10.4
isort==5.9.1
mccabe==0.6.1


@@ -1,30 +1,33 @@
version: '3'

services:
  postgresql:
    image: docker.io/postgres:13.1-alpine
    environment:
      POSTGRES_PASSWORD: test
      POSTGRES_DB: test
  mysql:
    image: docker.io/mariadb:10.5
    environment:
      MYSQL_ROOT_PASSWORD: test
      MYSQL_DATABASE: test
  mongodb:
    image: docker.io/mongo:5.0.5
    environment:
      MONGO_INITDB_ROOT_USERNAME: root
      MONGO_INITDB_ROOT_PASSWORD: test
  tests:
    image: docker.io/alpine:3.13
    environment:
      TEST_CONTAINER: true
    volumes:
      - "../..:/app:ro"
    tmpfs:
      - "/app/borgmatic.egg-info"
    tty: true
    working_dir: /app
    entrypoint: /app/scripts/run-full-tests
    command: --end-to-end-only
    depends_on:
      - postgresql
      - mysql
      - mongodb


@@ -12,17 +12,14 @@ def generate_configuration(config_path, repository_path):
    to work for testing (including injecting the given repository path and tacking on an encryption
    passphrase).
    '''
    subprocess.check_call(f'generate-borgmatic-config --destination {config_path}'.split(' '))

    config = (
        open(config_path)
        .read()
        .replace('ssh://user@backupserver/./sourcehostname.borg', repository_path)
        .replace('- path: /mnt/backup', '')
        .replace('label: local', '')
        .replace('- /home', f'- {config_path}')
        .replace('- /etc', '')
        .replace('- /var/log/syslog*', '')
        + 'storage:\n encryption_passphrase: "test"'
@@ -47,13 +44,13 @@ def test_borgmatic_command():
    generate_configuration(config_path, repository_path)
    subprocess.check_call(
        f'borgmatic -v 2 --config {config_path} init --encryption repokey'.split(' ')
    )

    # Run borgmatic to generate a backup archive, and then list it to make sure it exists.
    subprocess.check_call(f'borgmatic --config {config_path}'.split(' '))
    output = subprocess.check_output(
        f'borgmatic --config {config_path} list --json'.split(' ')
    ).decode(sys.stdout.encoding)
    parsed_output = json.loads(output)
@@ -64,16 +61,14 @@ def test_borgmatic_command():
    # Extract the created archive into the current (temporary) directory, and confirm that the
    # extracted file looks right.
    output = subprocess.check_output(
        f'borgmatic --config {config_path} extract --archive {archive_name}'.split(' '),
    ).decode(sys.stdout.encoding)
    extracted_config_path = os.path.join(extract_path, config_path)
    assert open(extracted_config_path).read() == open(config_path).read()

    # Exercise the info action.
    output = subprocess.check_output(
        f'borgmatic --config {config_path} info --json'.split(' '),
    ).decode(sys.stdout.encoding)
    parsed_output = json.loads(output)


@@ -73,6 +73,9 @@ hooks:
          hostname: mongodb
          username: root
          password: test
    sqlite_databases:
        - name: sqlite_test
          path: /tmp/sqlite_test.db
'''
    with open(config_path, 'w') as config_file:

@@ -186,7 +189,7 @@ def test_database_dump_with_error_causes_borgmatic_to_exit():
                '-v',
                '2',
                '--override',
                "hooks.postgresql_databases=[{'name': 'nope'}]",  # noqa: FS003
            ]
        )
    finally:


@@ -10,17 +10,15 @@ def generate_configuration(config_path, repository_path):
    to work for testing (including injecting the given repository path and tacking on an encryption
    passphrase).
    '''
    subprocess.check_call(f'generate-borgmatic-config --destination {config_path}'.split(' '))

    config = (
        open(config_path)
        .read()
        .replace('ssh://user@backupserver/./sourcehostname.borg', repository_path)
        .replace('- ssh://user@backupserver/./{fqdn}', '')  # noqa: FS003
        .replace('- /var/local/backups/local.borg', '')
        .replace('- /home/user/path with spaces', '')
        .replace('- /home', f'- {config_path}')
        .replace('- /etc', '')
        .replace('- /var/log/syslog*', '')
        + 'storage:\n encryption_passphrase: "test"'

Some files were not shown because too many files have changed in this diff.