ℹ️ Info: This post was linked in the WLED discussions, and the script may make it into the WLED project.
Introduction
WLED is a powerful open-source solution for controlling addressable LEDs over Wi-Fi, making it popular for DIY smart lighting projects. However, maintaining backups of your WLED configurations can be tedious, especially when managing multiple devices.
In this post, I’ll demonstrate a simple bash-based backup solution that discovers WLED devices on your network, pulls their configurations, and saves them locally for easy restoration. The scripts are intended for Linux-based systems and can be scheduled with cron for fully automated backups.
Requirements
The scripts presented in this post rely on the following dependencies:
- WLED with mDNS enabled: so the devices can be discovered on the network.
  - If not already enabled, it can be set up under Config -> WiFi Setup by setting a name in the mDNS field.
  - This is critical for discovery of devices.
- avahi-utils (avahi-browse): For discovering WLED devices via mDNS.
  - Installation (Debian-based systems): sudo apt install avahi-utils
- jq: For pretty-printing JSON configuration files.
  - Installation (Debian-based systems): sudo apt install jq
- curl: For making HTTP requests to WLED devices to fetch configuration data.
  - Installation (if not already present): sudo apt install curl
Ensure all dependencies are installed before proceeding. The jq dependency is optional but highly recommended for human-readable backups.
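If you want a quick sanity check first, a small loop like the one below (just a sketch, nothing WLED-specific) will report any missing tools:
#!/bin/bash
# Report any required tool that is missing from PATH
for cmd in avahi-browse jq curl; do
    if ! command -v "$cmd" > /dev/null; then
        echo "Missing dependency: $cmd"
    fi
done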
Backing up a single instance
It’s fairly easy to curl our way to one backup. We simply fetch http://host/presets.json and http://host/cfg.json, and then we’re done.
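For example, a one-off manual backup is just two requests (wled-stairs.local is a placeholder here; substitute your device’s hostname or IP):
# Fetch the configuration and presets from a single device
curl -s http://wled-stairs.local/cfg.json -o wled-stairs.cfg.json
curl -s http://wled-stairs.local/presets.json -o wled-stairs.presets.json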
I’ve cooked up this backup-one.sh script, which will:
- Be called like backup-one.sh HostName
- Output HostName.cfg.json and HostName.presets.json
- Only output those files if successful
#!/bin/bash
# Check if a hostname argument is provided
if [ "$#" -ne 1 ]; then
echo "Usage: $0 <hostname>"
exit 1
fi
if ! command -v jq &> /dev/null; then
echo "jq is not installed. Proceeding without pretty-printing JSON files."
JQ_AVAILABLE=0
else
JQ_AVAILABLE=1
fi
hostname=$1
backup_dir="/mnt/systems/backup/source/wled/"
# Create the backup directory if it doesn't exist
mkdir -p "$backup_dir"
# Function to fetch a file using curl
fetch_file() {
    local url=$1
    local dest=$2
    # Fetch the file; -f makes curl fail on HTTP errors instead of saving an error page
    curl -sf -o "$dest" "$url"
    local status=$?
    if [ "$status" -ne 0 ]; then
        # Remove any partial download and exit with curl's exit code
        rm -f "$dest"
        exit "$status"
    fi
    # If jq is available, pretty-print the JSON file in place
    if [[ $JQ_AVAILABLE -eq 1 ]]; then
        jq . "$dest" > "${dest}.tmp" && mv "${dest}.tmp" "$dest"
    fi
}
# Fetch cfg.json
cfg_url="http://$hostname/cfg.json"
cfg_dest="${backup_dir}${hostname}.cfg.json"
fetch_file "$cfg_url" "$cfg_dest"
# Fetch presets.json
presets_url="http://$hostname/presets.json"
presets_dest="${backup_dir}${hostname}.presets.json"
fetch_file "$presets_url" "$presets_dest"
echo "Backup of $hostname completed successfully."
Discovering WLED instances
WLED is discoverable (at least it can be made to be) over mDNS, which is great. This allows us to find all instances quickly and efficiently, as they’ll respond when queried for the _wled._tcp service type. The _tcp label comes from DNS Service Discovery (DNS-SD), the service-discovery piece layered on top of DNS.
Example query:
# avahi-browse _wled._tcp --terminate -r -p
+;enp0s25;IPv4;wled-christmaslights;_wled._tcp;local
+;enp0s25;IPv4;wled-stairs;_wled._tcp;local
=;enp0s25;IPv4;wled-stairs;_wled._tcp;local;wled-stairs.local;192.168.1.203;80;"mac=94e68695be28"
=;enp0s25;IPv4;wled-christmaslights;_wled._tcp;local;wled-christmaslights.local;192.168.1.199;80;"mac=94e68687fa24"
The above shows two instances responding to our query. With --terminate, avahi-browse exits once it has dumped a more or less complete list of responses rather than browsing indefinitely.
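Since the parsable (-p) output is semicolon-delimited, extracting just the hostnames is a one-liner - the same trick the discovery script below uses:
avahi-browse _wled._tcp --terminate -r -p | awk -F';' '/^=/ {print $7}' | sort -u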
Putting it together
I’ve put together the backup-discover.sh script below, which performs the discovery above and calls into backup-one.sh for each instance it finds. I run it from a weekly cron job (an example crontab entry follows the script), then snapshot the resulting directory of configs with my favourite backup tool, which copies them all off-host.
Now I have weekly snapshots of my WLED devices - and I don’t need to configure anything :)
#!/bin/bash
# Define the service type you are interested in
service_type="_wled._tcp"
# Path to the backup script
backup_script="$(dirname "$0")/backup-one.sh"
# Check if the backup script exists and is executable
if [ ! -x "$backup_script" ]; then
    echo "Backup script $backup_script not found or is not executable."
    exit 1
fi
# Use avahi-browse to find services, parse the output to get hostnames
mapfile -t hostnames < <(avahi-browse "$service_type" --terminate -r -p | awk -F';' '/^=/ {print $7}')
# Check if any hostnames were found
if [ ${#hostnames[@]} -eq 0 ]; then
    echo "No hosts found for service type $service_type."
    exit 0
fi
# Remove duplicate hostnames
mapfile -t unique_hostnames < <(printf "%s\n" "${hostnames[@]}" | sort -u)
# Initialize a variable to track if any backup has failed
backup_failed=0
# Execute backup script for each unique hostname
for hostname in "${unique_hostnames[@]}"; do
    echo "Backing up $hostname..."
    "$backup_script" "$hostname"
    backup_result=$?
    if [ $backup_result -ne 0 ]; then
        echo "Backup for $hostname failed with exit code $backup_result."
        backup_failed=1 # Mark that a backup has failed
    fi
done
# Exit with a non-zero code if any backup failed
if [ $backup_failed -ne 0 ]; then
    echo "One or more backups failed."
    exit 2 # Any non-zero code works here
fi
echo "All backups completed successfully."