mirror of https://github.com/maziggy/bambuddy.git synced 2026-05-09 08:25:54 +02:00

[GH-ISSUE #870] [Feature]: Allow GitHub Backup to Include Spools, Archives, and Print Files (Complete Backup Option) #591

Closed
opened 2026-05-07 00:11:55 +02:00 by BreizhHardware · 9 comments

Originally created by @SMAW on GitHub (Apr 1, 2026).
Original GitHub issue: https://github.com/maziggy/bambuddy/issues/870

Originally assigned to: @maziggy on GitHub.

Problem or Use Case

When pushing a backup to GitHub, only certain settings, k-profiles, and cloud profiles are included. The feature is currently missing any backup of spool inventory, archives (past print records), and especially gcode/3MF print files or thumbnails kept in the database/storage. Users expecting a "backup" want all important data covered for disaster recovery, not just a subset.

This limitation makes the GitHub backup unsuitable as a true backup or migration path; if the system fails, important historical print data, spool usage/costs, and the actual print files are lost.

Proposed Solution

Add an option to the GitHub Backup feature that allows users to optionally include spool inventory, print archives/history, and attached print files (gcode, 3MF, thumbnails, etc.) in the repository push.

  • Ideally allow enabling/disabling individual categories to manage repo size.
  • Ensure all important user data is covered for true disaster recovery or migration scenarios.
  • If including large binary files (like gcode/3MF) isn't practical due to API/file size limits, at least provide an option to push archive and spool metadata as JSON (such as archives.json, spools.json), so the full history is restorable.
  • Warn users about repository bloat and GitHub's file size limits if they enable file backup.

Alternatives Considered

  • Use the Local Backup feature to download a complete ZIP, but this requires manual steps and isn't automated or off-site like GitHub backup.
  • Set up external scripts or rsync to back up storage folders, but that's not integrated and is advanced for most users.
  • Current GitHub backup is only partial and insufficient for full recovery with print history.

Feature Category

Other

Priority

Critical for my use case

Mockups or Examples

No mockups provided, but see Local Backup feature for a reference on what a full backup should include.

Contribution

  • I would be willing to help implement this feature

Checklist

  • I have searched existing issues to ensure this feature hasn't already been requested

@basziee commented on GitHub (Apr 1, 2026):

+1


@maziggy commented on GitHub (Apr 2, 2026):

The GitHub backup feature was intentionally designed for lightweight configuration data (K-profiles, cloud slicer profiles, app settings) since it uses the GitHub API to create commits — it's well-suited for small JSON files that change over time.

For print files (gcode/3MF), GitHub repos aren't a great fit:

  • GitHub has a 100MB per-file limit and recommends keeping repos under 1GB
  • Binary files don't benefit from git's delta compression, so the repo would grow very quickly
  • The GitHub API blob creation would be very slow for large uploads

For full backups including print files, the Local Backup (ZIP download) is the right tool. If you want to automate off-site storage of those, you could set up a simple cron job on your host to periodically copy the Bambuddy storage directory to your preferred backup destination (NAS, cloud storage, etc.). The Bambuddy data lives in the Docker volume, so a scheduled cp or rsync would cover everything.
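The scheduled `cp`/`rsync` idea above can be sketched as a small host-side script; the source and destination paths here are placeholders for your own volume mount and backup target, not paths Bambuddy defines:

```shell
# Minimal sketch of the host-side copy described above. "src" is wherever
# the Bambuddy data volume is mounted on the host; "dest" is your backup
# target (NAS mount, cloud-synced folder, etc.). Both are placeholders.
backup_bambuddy() {
  src="$1"
  dest="$2"
  mkdir -p "$dest"
  # -a preserves permissions and timestamps; rsync -a works the same way
  # and only transfers changed files on subsequent runs.
  cp -a "$src/." "$dest/"
}
```

Scheduled via cron, e.g. a crontab line such as `0 3 * * * /path/to/backup_bambuddy.sh` (the script path is a placeholder) to run nightly at 03:00.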

Anyway, I added the spool inventory and print archive/history metadata as optional JSON exports to the GitHub backup. This would give you off-site backup of your print history, spool usage, and costs — just not the actual gcode/3MF files themselves.

Fixed in the dev branch; available with the next release or daily build.

If you find Bambuddy useful, please consider giving it a ⭐ on GitHub — it helps others discover the project!


@SMAW commented on GitHub (Apr 2, 2026):

Can't you make a dropdown with daily/weekly/hourly for the full job then? Just like the GitHub one?


@maziggy commented on GitHub (Apr 2, 2026):

> Can't you make a dropdown with daily/weekly/hourly for the full job then? Just like the GitHub one?

Huh?


@SMAW commented on GitHub (Apr 2, 2026):

For the full backup job that places the files in the directory, is it possible to schedule that backup? I don't want to download it; just have it placed in the container mountpoint somewhere, and I will back it up manually.


@maziggy commented on GitHub (Apr 3, 2026):

> For the full backup job that places the files in the directory, is it possible to schedule that backup? I don't want to download it; just have it placed in the container mountpoint somewhere, and I will back it up manually.

You just need a cron job to copy the data to your backup location. Just back up the following files/folders:

  • archive/
  • icons/
  • plate_calibration
  • virtual_printer/
  • bambuddy.db
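The listed paths can be rolled into a single compressed archive in one step. A sketch, assuming the data directory layout above; the mount point and output path are placeholders:

```shell
# Bundle the files/folders listed above into one tarball.
# "$1" is wherever the Bambuddy data volume is mounted on the host;
# "$2" is the output tarball path. Both are placeholders.
backup_bambuddy_tar() {
  tar -C "$1" -czf "$2" \
    archive/ icons/ plate_calibration virtual_printer/ bambuddy.db
}
```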
Author
Owner

@SMAW commented on GitHub (Apr 3, 2026):

I understand, but why not make an automatic backup (a stateful snapshot of the DB) as a .tar.gz package at a time you set in the web UI? It would create a complete compressed package for that timestamp (stateful!) and keep multiple backups in one folder, for example for the last 7 days (changeable through the web UI).
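As a host-side workaround, the rotation being asked for here can be sketched in a few lines of shell; the paths and the default retention of 7 archives are assumptions, not anything Bambuddy ships:

```shell
# Sketch of a scheduled, rotated backup: one timestamped .tar.gz per run,
# keeping only the newest $keep archives. All paths and the default
# retention of 7 are placeholders.
rotate_backup() {
  data_dir="$1"
  backup_dir="$2"
  keep="${3:-7}"
  mkdir -p "$backup_dir"
  stamp=$(date +%Y%m%d-%H%M%S)
  tar -C "$data_dir" -czf "$backup_dir/bambuddy-$stamp.tar.gz" .
  # List newest first; delete everything after the first $keep entries.
  ls -1t "$backup_dir"/bambuddy-*.tar.gz | tail -n +$((keep + 1)) | xargs -r rm --
}
```

Run from cron at whatever interval you like; note the `ls | xargs` rotation assumes archive filenames without spaces, which holds for the timestamp pattern used here.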


@maziggy commented on GitHub (Apr 3, 2026):

For a better tracking, please open a new feature request for it. Thanks.


@SMAW commented on GitHub (Apr 3, 2026):

#884
