Back Up Reverse Proxy Site Configs to BookStack
I like to keep copies of things in various other places, and I like historical records of them. My BookStack (BS) installation here is what I primarily use as a notebook, blog, and general how-to, because I CRS on a regular basis. So I figured: why not use the power of some bash scripting, run on a schedule, to look at things and then use the BS API to add them here as a backup? Sure, my PM is backed up regularly, but for quick views and edits, opening BS is much faster.
Concept
Use the BS API from a bash script to back up the contents of my NGINX reverse proxy container's /etc/nginx/sites-available/ files to a specific book.
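Everything below talks to the BS API with a token ID and secret created for your BS user. A quick sanity check that the token works - this just lists books, with the credentials left as placeholders:

# Confirm the API token is accepted by listing books (swap in your own token values)
curl -s "https://rtfm.skynet2.net/api/books" \
-H "Authorization: Token REPLACE_WITH_YOUR_TOKEN_ID:REPLACE_WITH_YOUR_TOKEN_SECRET"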
Features
- Each container = a chapter in the designated NRP (NGINX reverse proxy) book
- Each backup = a versioned page, named after the site config file
- Each page includes:
  - DTG of backup
  - Container Name
  - Container ID
  - Container IP
  - Version Number
  - SHA256 Hash
- Script runs bi-weekly or on demand (PRN), and backs up a config only if the file has changed
- Automatically creates the chapter if it's missing
The script creates a hash of each site config to help track changes. The initial run uploads everything in the directory; on subsequent runs it generates a new hash and compares it to the stored one - if a change is detected, the new version gets uploaded. Previous hashes (and a copy of each uploaded version) are kept per container for reference. A bare-bones sketch of that check follows.
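This is just the change-detection idea in isolation, using a hypothetical example.conf - the full script below does the same thing inside a loop:

#!/bin/bash
# Minimal change check: compare the config's current hash to the last stored hash
conf_file="/etc/nginx/sites-available/example.conf"      # hypothetical site config
hash_file="$HOME/.nginx-backup-tracker/example.sha256"   # where the last hash was stored
file_hash=$(sha256sum "$conf_file" | awk '{print $1}')
if [[ -f "$hash_file" && "$file_hash" == "$(cat "$hash_file")" ]]; then
    echo "No change for example.conf - nothing to upload"
else
    echo "$file_hash" > "$hash_file"
    echo "Change detected for example.conf - would upload a new version"
fi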
Script
The script is loaded onto, and run from, the NGINX reverse proxy container: /root/nginx_backup_live.sh
Update the script's permissions:
chmod +x /root/nginx_backup_live.sh
CRON schedule:
schedule goes here!
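Plain cron can't express a true every-other-week schedule on its own, so one way to approximate it is to fire on the 1st and 15th of the month. This crontab line is only an assumed example (the 02:00 time and log path are made up) - swap in whatever schedule you actually want:

# Example only: run at 02:00 on the 1st and 15th, logging output to an assumed log file
0 2 1,15 * * /root/nginx_backup_live.sh >> /var/log/nginx_backup.log 2>&1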
The actual live script:
#!/bin/bash
# Config
BOOKSTACK_URL="https://rtfm.skynet2.net/api"
BOOK_ID=REPLACE_WITH_BOOK_ID
TOKEN_ID="REPLACE_WITH_YOUR_TOKEN_ID"
TOKEN_SECRET="REPLACE_WITH_YOUR_TOKEN_SECRET"
CONFIG_DIR="/etc/nginx/sites-available"
HASH_DIR="$HOME/.nginx-backup-tracker"
# Mapping container names to IDs
declare -A container_ids=(
[container1name]=100
[container2name]=101
[container3name]=102
# add all your containers!
)
mkdir -p "$HASH_DIR"
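# Walk every .conf in sites-available; skip any file whose container isn't in the map above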
for conf_file in "$CONFIG_DIR"/*.conf; do
name=$(basename "$conf_file" .conf)
id=${container_ids[$name]:-UNKNOWN}
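# Derive the container's IP from its ID (assumes the containers sit on 192.168.0.0/24 with the CT ID as the last octet)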
ip="192.168.0.${id}"
[[ "$id" == "UNKNOWN" ]] && echo "⚠️ Unknown container ID for $name" && continue
hash_file="$HASH_DIR/$name.sha256"
file_hash=$(sha256sum "$conf_file" | awk '{print $1}')
if [[ -f "$hash_file" && "$file_hash" == "$(cat "$hash_file")" ]]; then
echo "No changes for $name"
continue
fi
echo "$file_hash" > "$hash_file"
dtg=$(date -u "+%d %b %y %H%MZ" | tr 'a-z' 'A-Z')
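# Version number = how many _v*.sha256 snapshots already exist for this container, plus one; a copy of this config version is kept under that name for reference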
version=$(ls "$HASH_DIR/$name"_v*.sha256 2>/dev/null | wc -l)
version=$((version + 1))
version_file="$HASH_DIR/${name}_v${version}.sha256"
cp "$conf_file" "$version_file"
content=$(<"$conf_file")
# HTML-escape the config so it renders literally inside the <pre><code> block
encoded_content=$(echo "$content" | sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g')
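# Page body: a metadata table (DTG, container name/ID/IP, version, hash) followed by the config itself in a code block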
page_html="<table class=\"align-center\" style=\"border-collapse: collapse; border-width: 0px; border-style: none; width: 100%;\" border=\"1\">
<colgroup>
<col style=\"width: 22.0487%;\"><col style=\"width: 22.7638%;\"><col style=\"width: 20.5049%;\"><col style=\"width: 17.1567%;\"><col style=\"width: 17.639%;\"></colgroup>
<tbody>
<tr><td><span style=\"color: rgb(53, 152, 219);\"><strong>DTG OF IMPORT</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>CONTAINER NAME</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>CONTAINER ID</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>IP</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>VERSION OF FILE</strong></span></td></tr>
<tr><td>$dtg</td><td>$name</td><td>$id</td><td>$ip</td><td>$version</td></tr>
<tr><td colspan=\"5\"><span style=\"color: rgb(53, 152, 219);\"><strong>SHA256 HASH VALUE</strong></span></td></tr>
<tr><td colspan=\"5\">$file_hash</td></tr>
</tbody></table>
<pre><code class=\"language-nginx\">$encoded_content</code></pre>"
# Find this container's chapter (BookStack list endpoints take filter[field]=value); create it only if missing
chapter_id=$(curl -s -G "$BOOKSTACK_URL/chapters" \
-H "Authorization: Token $TOKEN_ID:$TOKEN_SECRET" \
--data-urlencode "filter[book_id]=$BOOK_ID" --data-urlencode "filter[name]=$name" \
| grep -o '"id":[0-9]*' | head -1 | cut -d ':' -f2)
if [[ -z "$chapter_id" ]]; then
chapter_resp=$(curl -s -X POST "$BOOKSTACK_URL/chapters" \
-H "Authorization: Token $TOKEN_ID:$TOKEN_SECRET" \
-H "Content-Type: application/json" \
-d "{\"book_id\": $BOOK_ID, \"name\": \"$name\"}")
chapter_id=$(echo "$chapter_resp" | grep -o '"id":[0-9]*' | head -1 | cut -d ':' -f2)
fi
page_title="Site Config Version $version - $dtg"
# Build the JSON payload with jq so quotes and newlines in the HTML are escaped properly (requires jq to be installed)
payload=$(jq -n --argjson book_id "$BOOK_ID" --argjson chapter_id "$chapter_id" \
--arg name "$page_title" --arg html "$page_html" \
'{book_id: $book_id, chapter_id: $chapter_id, name: $name, html: $html}')
curl -s -X POST "$BOOKSTACK_URL/pages" \
-H "Authorization: Token $TOKEN_ID:$TOKEN_SECRET" \
-H "Content-Type: application/json" \
-d "$payload"
echo "✅ Uploaded new config version for $name"
done
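For the on-demand (PRN) case, just run it by hand and watch the output. Roughly what a run looks like - the container names are from the example map above, and the raw JSON responses from the API calls will also print:

/root/nginx_backup_live.sh
# No changes for container1name
# ✅ Uploaded new config version for container2name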
Dry Run
This script does a dry run to check everything before going live! It's saved on the proxy container as /root/nginx_backup_dry_run.sh (don't forget to update its permissions with chmod +x so it can run). The dry run script:
#!/bin/bash
BOOK_ID=30
CONFIG_DIR="/etc/nginx/sites-available"
HASH_DIR="$HOME/.nginx-backup-tracker"
# Mapping container names to IDs
declare -A container_ids=(
[container100name]=100
[container101name]=101
[container102name]=102
# Update your container map!
)
mkdir -p "$HASH_DIR"
echo "🔍 Starting dry run for Nginx site config backup..."
for conf_file in "$CONFIG_DIR"/*; do
name=$(basename "$conf_file" .conf)
id=${container_ids[$name]:-UNKNOWN}
ip="192.168.0.${id}"
[[ "$id" == "UNKNOWN" ]] && echo "⚠️ Unknown container ID for $name (file: $conf_file)" && continue
hash_file="$HASH_DIR/$name.sha256"
file_hash=$(sha256sum "$conf_file" | awk '{print $1}')
dtg=$(date -u "+%d %b %y %H%MZ" | tr 'a-z' 'A-Z')
version=$(ls "$HASH_DIR/${name}"_v*.sha256 2>/dev/null | wc -l)
version=$((version + 1))
if [[ -f "$hash_file" && "$file_hash" == "$(cat "$hash_file")" ]]; then
echo "🟢 [$name] No changes since last backup (hash unchanged)"
continue
fi
echo " 🚨 Change detected for: $name"
echo " ➤ DTG: $dtg"
echo " ➤ Container ID: $id"
echo " ➤ IP Address: $ip"
echo " ➤ Version #: $version"
echo " ➤ SHA256: $file_hash"
echo " ➤ Would create chapter: $name (if missing)"
echo " ➤ Would create page: Site Config Version $version - $dtg"
echo " ➤ First 10 lines of config:"
head -n 10 "$conf_file" | sed 's/^/ | /'
done
echo "✅ Dry run complete. No changes made to BookStack."