soft fail is harder than hard fail or error #921

Closed
opened 2024-10-06 22:35:56 +00:00 by juestr · 5 comments

What I'm trying to do and why

I am backing up to multiple repositories, not all of which are always connected.

In a before_backup script, returning a soft fail stops processing of all repositories in the same config file.

I understand the workaround of using multiple config files, but I find it highly inconvenient and inconsistent: a hard fail does not short-circuit the following repositories, and neither does an error.
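For context, here's a minimal sketch of the kind of setup involved; the repository paths and mount point are hypothetical. borgmatic treats a hook exit code of 75 as a soft failure:

```yaml
# Sketch of a borgmatic config; paths and mount point are hypothetical.
repositories:
    - path: /mnt/external/backup.borg
      label: external
    - path: ssh://user@host/./backup.borg
      label: remote

before_backup:
    # Exit code 75 signals a soft failure to borgmatic; any other
    # non-zero exit code is a hard failure (error).
    - findmnt /mnt/external > /dev/null || exit 75
```

Today that exit 75 skips every repository in the file, not just the one behind the missing mount.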

Steps to reproduce

No response

Actual behavior

borgmatic exits immediately on a soft fail, skipping all remaining repositories in the config file

Expected behavior

A soft fail should end processing of the current repository only, like a hard fail or error.

Other notes / implementation ideas

No response

borgmatic version

1.8.13

borgmatic installation method

brew

Borg version

1.4.0

Python version

3.12.7

Database version (if applicable)

No response

Operating system and version

macOS 14.6.1

Owner

Thanks for taking the time to file this!

I'm fine with changing this behavior for the reasons you state. (I think it's literally a one-line change plus test updates.) But technically it would be a breaking change, since some users may be expecting the currently documented behavior of a soft-failure hook "wrapping" all repositories in a configuration file instead of just one of them. So it might just require updating the docs and making a bigger version bump along with the change.

Author

Thanks for the fast answer. A version bump is sensible and probably the cleanest approach.

Alternatives might be a new config option like "continue_on_soft_fail", or defining an alternate exit code with the new behavior.

Owner

> Alternatives might be a new config option like "continue_on_soft_fail", or defining an alternate exit code with the new behavior.

Command hooks are run per-repository rather than per-config file (ever since borgmatic 1.6.0!), so I think it makes sense for their effects to be per-repository too. Hopefully nobody will miss the current behavior...
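Since hooks already run once per repository, a soft-fail check can reference the repository being processed via the {repository} placeholder. A sketch under that assumption (the check-reachable.sh helper is hypothetical):

```yaml
before_backup:
    # Runs once per repository; {repository} interpolates the current one.
    # With this change, exiting 75 would skip only that repository.
    - check-reachable.sh "{repository}" || exit 75  # hypothetical helper
```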

Owner

Implemented in main! This will be part of the next release. Thanks again!

Owner

Released in borgmatic 1.9.0!

Reference: borgmatic-collective/borgmatic#921