Back Up Reverse Proxy Site Configs to BS
I like to keep copies of things in various other places, and I like having historical records of them. My installation here is what I primarily use as a notebook, blog, and general how-to, because I CRS on a regular basis. So I figured: why not use a bit of bash scripting, run on a schedule, to keep an eye on things, and then use the BS API to add the results here as a backup? Sure, my PM is backed up regularly, but for quick views and edits, opening BS is much faster.
Concept
Use the BS API in a bash script to back up the contents of my NGINX reverse proxy container's /etc/nginx/sites-available/ files to a specific book.
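If you haven't touched the BS API before, a quick sanity check like the following will confirm your token works and show you the numeric book IDs (the URL and token values are placeholders; adjust them to your install):
# List all books with their IDs so you know what to use for BOOK_ID in the script
curl -s -H "Authorization: Token YOUR_TOKEN_ID:YOUR_TOKEN_SECRET" \
  -H "Accept: application/json" \
  "https://yoursitehere.com/api/books" | jq -r '.data[] | "\(.id)\t\(.name)"'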
Features
- Each container = Chapter in the designated NRP Book
- Each backup = versioned page, named after the site config file
- Each page includes:
    - DTG of backup
    - Container Name
    - Container ID
    - Container IP
    - Version Number
    - SHA256 Hash
    - Current File Output
    - Diff Output (changes from the previous version)
- The script runs on a schedule from CRON and backs up a config only if the file has changed
- Automatically creates the chapter if missing
The script creates a SHA256 hash of each site config to help track changes. The initial run uploads everything in the directory; each subsequent run generates a new hash, compares it to the stored one, and uploads a new version only if a change is detected. Previous hashes per container are stored locally for reference.
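Stripped of everything else, the change check is just a hash compare against the last stored value. Here is a minimal sketch using a hypothetical config file named example (the real script below adds the versioning, diff, and upload):
file_hash=$(sha256sum /etc/nginx/sites-available/example | awk '{print $1}')
stored="$HOME/.nginx-backup-tracker/example.sha256"
if [[ -f "$stored" && "$file_hash" == "$(cat "$stored")" ]]; then
    echo "No change - nothing to upload"
else
    echo "$file_hash" > "$stored"
    echo "Changed - time to upload a new version"
fi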
Script
The script is saved to and run from the NGINX reverse proxy container: /root/nginx_backup_live.sh
Update the script's permissions:
chmod +x /root/nginx_backup_live.sh
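Before handing it to CRON, it's worth a manual run so any token, jq, or permission problems show up right away (remember the first run uploads every config it finds):
/root/nginx_backup_live.sh
# If something looks off, trace each command as it executes
bash -x /root/nginx_backup_live.sh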
CRON schedule:
# EVERY HOUR
0 * * * * /root/nginx_backup_live.sh
# EVERY FOUR HOURS
0 */4 * * * /root/nginx_backup_live.sh
# DAILY AT 0200
0 2 * * * /root/nginx_backup_live.sh
# PICK ONE AND ADD IT OR ROLL YOUR OWN!
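One way to add it is crontab -e as root; if you want a record of what each run printed, tack a log redirect onto the entry (the log path here is just an example):
# Edit root's crontab and paste in the schedule you picked
crontab -e
# Example: daily at 0200, with output appended to a log file
0 2 * * * /root/nginx_backup_live.sh >> /var/log/nginx_backup_live.log 2>&1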
The actual live script:
#!/bin/bash
# Config
BOOKSTACK_URL="https://yoursitehere.com/api"
BOOK_ID=30
TOKEN_ID="YOUR TOKEN HERE"
TOKEN_SECRET="YOUR SECRET HERE"
CONFIG_DIR="/etc/nginx/sites-available"
HASH_DIR="$HOME/.nginx-backup-tracker"
# Map container names to container IDs. Keys must match the config filenames in CONFIG_DIR;
# the ID is also used as the last octet of the container's IP below.
declare -A container_ids=(
[containername1]=100
[containername2]=101
[containername3]=102
#CONTINUE DOWN THE LIST
)
mkdir -p "$HASH_DIR"
# Get all chapters once (filtered down to this book by name later with jq)
all_chapters=$(curl -s -H "Authorization: Token $TOKEN_ID:$TOKEN_SECRET" \
-H "Accept: application/json" \
"$BOOKSTACK_URL/chapters")
for conf_file in "$CONFIG_DIR"/*; do
name=$(basename "$conf_file")
id=${container_ids[$name]:-UNKNOWN}
ip="192.168.0.${id}"
[[ "$id" == "UNKNOWN" ]] && echo "ā ļø Unknown container ID for $name (file: $conf_file)" && continue
hash_file="$HASH_DIR/$name.sha256"
file_hash=$(sha256sum "$conf_file" | awk '{print $1}')
if [[ -f "$hash_file" && "$file_hash" == "$(cat "$hash_file")" ]]; then
echo "š¢ [$name] No changes since last backup (hash unchanged)"
continue
fi
echo "$file_hash" > "$hash_file"
dtg=$(date "+%d%b%y %H%M" | tr 'a-z' 'A-Z') # Local time used
# Version number = how many snapshots already exist for this config, plus one
version=$(ls "$HASH_DIR/${name}_v"*.sha256 2>/dev/null | wc -l)
version=$((version + 1))
# Keep a full copy of this version of the config (the .sha256 suffix here is just a naming convention)
cp "$conf_file" "$HASH_DIR/${name}_v${version}.sha256"
# Previous snapshot = second-to-last entry after a version-aware sort (so v10 sorts after v9)
last_version_file=$(ls "$HASH_DIR/${name}_v"*.sha256 2>/dev/null | sort -V | tail -n 2 | head -n 1)
diff_html=""
if [[ -f "$last_version_file" ]]; then
diff_raw=$(diff -u "$last_version_file" "$conf_file")
if [[ -n "$diff_raw" ]]; then
diff_encoded=$(echo "$diff_raw" | sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g' | sed ':a;N;$!ba;s/\n/\&#10;/g')
diff_html="<h3>🔄 Changes from Previous Version</h3><pre><code class=\"language-diff\">$diff_encoded</code></pre>"
fi
fi
encoded_content=$(sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g' "$conf_file" | sed ':a;N;$!ba;s/\n/\&#10;/g')
page_html="<table class=\"align-center\" style=\"border-collapse: collapse; border-width: 0px; border-style: none; border=0 width: 100%;\" border=\"0\">
<colgroup><col style=\"width: 22.0487%;\"><col style=\"width: 22.7638%;\"><col style=\"width: 20.5049%;\"><col style=\"width: 17.1567%;\"><col style=\"width: 17.639%;\"></colgroup>
<tbody>
<tr><td><span style=\"color: rgb(53, 152, 219);\"><strong>DTG OF IMPORT</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>CONTAINER NAME</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>CONTAINER ID</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>IP</strong></span></td><td><span style=\"color: rgb(53, 152, 219);\"><strong>VERSION OF FILE</strong></span></td></tr>
<tr><td>$dtg</td><td>$name</td><td>$id</td><td>$ip</td><td>$version</td></tr>
<tr><td colspan=\"5\"><span style=\"color: rgb(53, 152, 219);\"><strong>SHA256 HASH VALUE</strong></span></td></tr>
<tr><td colspan=\"5\">$file_hash</td></tr>
</tbody></table>
$diff_html
<h3>📄 Current Configuration</h3><pre><code class=\"language-nginx\">$encoded_content</code></pre>"
name=$(echo "$name" | xargs) # Strip whitespace
chapter_id=$(echo "$all_chapters" | jq -r --arg name "$name" --argjson book_id "$BOOK_ID" '.data[] | select(.name == $name and .book_id == $book_id) | .id')
if [[ -z "$chapter_id" || "$chapter_id" == "null" ]]; then
echo "š Chapter '$name' not found. Creating it..."
chapter_payload=$(jq -n --arg name "$name" --argjson book_id "$BOOK_ID" '{name: $name, book_id: $book_id}')
chapter_result=$(curl -s -X POST "$BOOKSTACK_URL/chapters" \
-H "Authorization: Token $TOKEN_ID:$TOKEN_SECRET" \
-H "Content-Type: application/json" \
-d "$chapter_payload")
chapter_id=$(echo "$chapter_result" | jq -r '.id')
fi
page_title="$name Ver $version $dtg"
echo -e "\n📤 Uploading page for $name..."
json_payload=$(jq -n \
--arg name "$page_title" \
--arg html "$page_html" \
--argjson book_id "$BOOK_ID" \
--argjson chapter_id "$chapter_id" \
'{name: $name, book_id: $book_id, chapter_id: $chapter_id, html: $html}')
result=$(curl -s -X POST "$BOOKSTACK_URL/pages" \
-H "Authorization: Token $TOKEN_ID:$TOKEN_SECRET" \
-H "Content-Type: application/json" \
-d "$json_payload")
if echo "$result" | grep -q '"error"'; then
echo "ā Upload failed for $name"
echo "API Response: $result"
else
echo "ā
Uploaded new config version for $name"
fi
done
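Everything the script remembers between runs sits in the tracker directory from the config section, so checking on it, or wiping it to force a full re-upload, is simple (paths assume the defaults above):
# See what the script is tracking: the current hash plus one full snapshot per uploaded version
ls -l "$HOME/.nginx-backup-tracker"
# Remove the tracker to force a complete re-upload (version numbering starts over) on the next run
rm -rf "$HOME/.nginx-backup-tracker"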