
Discord Bot For Scanner Audio Archiving

I run an SDR on a Windows Server 2025 machine that lives on my Proxmox host. For years I've struggled to find a streamlined way to archive the audio. Most of the time I just end up saving the clips to a cloud file server, and for the last few years that has meant OneDrive.

Well, today I figured I would give Discord a try, and so far it's working pretty damn well. I created a bot that interfaces with my Discord server: it scans a folder (my OneDrive folder) and posts the audio clips it finds to a specific channel. What's cool about this is that it attaches the audio clip itself, so even if the source file is deleted from OneDrive, the copy is still on the Discord server.

Even more cool points come from the source, SDRTrunk, which embeds metadata in each file (just like a music MP3), so I can parse that data out into the text of the message. Searching for recorded audio couldn't be easier, and with the mobile app it's great for listening "live" on the go. Playing each audio file one by one is a bit tedious, but it works wonders.

Here is what I did to get it rolling, in case anyone else wants to give this a go.

1. Create the Discord Bot & Add it to the Server

Go to the Discord Developer Portal and create a new application. Be sure to give it a cool name, upload an image, and all the fun stuff, then create a bot:

  1. Create the bot
  2. Give the bot these permissions (add any others you want I suppose):
    1. Manage Server
    2. View Channels
    3. Send Messages
    4. Create Public Threads
    5. Send Messages in Threads
    6. Manage Messages
    7. Manage Threads
    8. Embed Links
    9. Attach Files
  3. The bot defaults to public - I didn't want that, so I made mine private:
    1. Navigate to Installation
    2. Change the Install Link to None
    3. Go back to the Bot page, and deselect Public Bot switch
  4. Reset the Token and copy it down (only shown once)
  5. Go to the OAuth2 page
    1. Under URL Generator, select the bot scope, then all the same permissions as in step 2 above
    2. Copy the generated link at the bottom (this is how you will invite your bot to the server!)
    3. Paste that link into your browser (sign in first if you aren't already), and grant it access to your server
  6. Boom, done with that part!

2. Create A Channel for Your Audio

  1. This one is obvious - create a channel
  2. Copy the channel link and paste it somewhere you can copy the channel ID from later

3. Python Stuff

Wherever your source audio generator lives is where I'd suggest running your middleman script (in this case, I used Python).

3.1 Python Dependencies
  1. Install Python if not already installed
  2. Open a terminal and install dependencies: 
pip install discord.py watchdog mutagen python-dotenv
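
If you want a quick sanity check that everything installed (totally optional - just a one-liner I use, not part of the setup), run:

python -c "import discord, watchdog, mutagen, dotenv; print('imports OK')"

If that prints without errors, the dependencies are in place.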
3.2 Certificate Errors

I ran into certificate errors on my first run, and had to do this to resolve them...

  • Install this stuff:
pip install -U certifi aiohttp discord.py
  • Find the certifi bundle path:
python -c "import certifi; print(certifi.where())"
  • Copy that path, then set a machine-wide env var so all Python runs use it:
setx /M SSL_CERT_FILE "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\Lib\site-packages\certifi\cacert.pem"
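
Note that setx /M needs an elevated (Administrator) prompt, and the variable only applies to terminals opened after you set it. To confirm Python is actually picking it up (a quick optional check), open a new terminal and run:

python -c "import os, certifi; print(os.environ.get('SSL_CERT_FILE')); print(certifi.where())"

Both lines should print the same cacert.pem path.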
3.3 Env File
  1. Create a folder somewhere - in my case, I used this directory: C:\bots\sdr-discord-uploader\
  2. In that folder, create a .env file with this information:
DISCORD_TOKEN=your_bot_token_here
DISCORD_CHANNEL_ID=123456789012345678
WATCH_FOLDER=C:\path\to\where\your\audio\files\live

Your Discord channel ID is the last string of numbers in the URL you copied from your channel, like this:

https://discordapp.com/channels/1234567890/123456789012345678
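
The first long number is the server (guild) ID; the second is the channel ID you want. If you'd rather let Python grab it for you, here's a purely illustrative one-liner:

python -c "print('https://discordapp.com/channels/1234567890/123456789012345678'.rsplit('/', 1)[-1])"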

3.4 The Script

This script watches the WATCH_FOLDER directory, keeps a small database to track what it has already posted to the channel, and works its magic. I have the script set up to create threads that group messages by talkgroup, to keep things a little cleaner in the channel since my audio source records multiple TGs at once.

Save this script (edit as you see fit) as uploader.py - or whatever you want to call it - in the same directory as your .env file from above:

# uploader.py
# ------------------------------------------------------------
# Posts new audio files from a watched folder to Discord.
# Per-talkgroup THREADS: auto-creates/uses a thread "tg-<tgid> <alias>"
#
# .env keys:
#   DISCORD_TOKEN=xxxxx
#   DISCORD_CHANNEL_ID=123456789012345678
#   WATCH_FOLDER=C:\Users\Administrator\OneDrive\Scanner Recordings\SDRTrunk
#
# deps:
#   pip install -U discord.py watchdog mutagen python-dotenv certifi
# ------------------------------------------------------------

import asyncio
import os
import sqlite3
import time
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional, Dict, Any

import discord
from discord import Embed, File
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler
from mutagen import File as MutagenFile
from dotenv import load_dotenv

# -------------------- Config / Constants --------------------

load_dotenv()

DISCORD_TOKEN = os.getenv("DISCORD_TOKEN", "").strip()
CHANNEL_ID = int(os.getenv("DISCORD_CHANNEL_ID", "0"))
WATCH_FOLDER = Path(os.getenv("WATCH_FOLDER", r"C:\Users\Administrator\OneDrive\Scanner Recordings\SDRTrunk"))

# Threads behavior
ENABLE_THREADS = True                       # turn off to post in main channel
THREAD_AUTO_ARCHIVE_MIN = 10080             # 60, 1440, 4320, or 10080 (7 days)
THREAD_NAME_PREFIX = "tg-"                  # prefix before TGID
THREAD_TITLE_MAX = 64                       # max chars from the clip title to append

# File settle detection
STABILITY_INTERVAL_SEC = 1.0
STABILITY_PASSES = 3

# Allowed file types
ALLOWED_EXT = {".mp3", ".wav", ".m4a", ".flac", ".ogg"}

# SQLite db to track uploads & threads
DB_PATH = WATCH_FOLDER / "_uploaded.sqlite"

# Process backlog on start (files that arrived while bot was offline)
PROCESS_BACKLOG_ON_START = True


# -------------------- SQLite helpers --------------------

def ensure_db() -> sqlite3.Connection:
    DB_PATH.parent.mkdir(parents=True, exist_ok=True)
    conn = sqlite3.connect(DB_PATH)
    # original upload table
    conn.execute("""
        CREATE TABLE IF NOT EXISTS uploaded (
            path TEXT PRIMARY KEY,
            mtime REAL NOT NULL,
            size  INTEGER NOT NULL
        )
    """)
    # migrate: add uploaded_at if missing
    cols = [c[1] for c in conn.execute("PRAGMA table_info(uploaded)")]
    if "uploaded_at" not in cols:
        conn.execute("ALTER TABLE uploaded ADD COLUMN uploaded_at REAL")
        conn.execute("UPDATE uploaded SET uploaded_at = strftime('%s','now') WHERE uploaded_at IS NULL")

    # threads table cache
    conn.execute("""
        CREATE TABLE IF NOT EXISTS threads (
            thread_key TEXT PRIMARY KEY,   -- e.g., "tg:111"
            thread_id  INTEGER NOT NULL,
            name       TEXT,
            created_at REAL NOT NULL
        )
    """)
    conn.commit()
    return conn


def already_uploaded(conn: sqlite3.Connection, path: Path) -> bool:
    cur = conn.execute("SELECT 1 FROM uploaded WHERE path=?", (str(path),))
    return cur.fetchone() is not None


def mark_uploaded(conn: sqlite3.Connection, path: Path):
    try:
        st = path.stat()
    except FileNotFoundError:
        return
    conn.execute(
        "INSERT OR REPLACE INTO uploaded (path, mtime, size, uploaded_at) VALUES (?, ?, ?, ?)",
        (str(path), st.st_mtime, st.st_size, time.time())
    )
    conn.commit()


def get_cached_thread_id(conn: sqlite3.Connection, key: str) -> Optional[int]:
    cur = conn.execute("SELECT thread_id FROM threads WHERE thread_key=?", (key,))
    row = cur.fetchone()
    return int(row[0]) if row else None


def cache_thread_id(conn: sqlite3.Connection, key: str, thread_id: int, name: str):
    conn.execute(
        "INSERT OR REPLACE INTO threads (thread_key, thread_id, name, created_at) VALUES (?, ?, ?, ?)",
        (key, thread_id, name, time.time())
    )
    conn.commit()


def delete_thread_cache(conn: sqlite3.Connection, key: str):
    conn.execute("DELETE FROM threads WHERE thread_key=?", (key,))
    conn.commit()


# -------------------- File utilities --------------------

def is_stable(path: Path) -> bool:
    """Return True when file size stops changing across STABILITY_PASSES checks."""
    try:
        prev = None
        stable_count = 0
        time.sleep(0.25)
        for _ in range(60):  # cap ~60s
            size = path.stat().st_size
            if prev is None:
                prev = size
                stable_count = 1
            else:
                if size == prev:
                    stable_count += 1
                else:
                    prev = size
                    stable_count = 1
            if stable_count >= STABILITY_PASSES:
                return True
            time.sleep(STABILITY_INTERVAL_SEC)
        return False
    except FileNotFoundError:
        return False


def parse_filename(fname: str) -> Dict[str, Any]:
    """
    Example:
    20250809_013049HonorHealth_11-FP_Four_Peaks__TO_111_FROM_15011.mp3
    """
    stem = Path(fname).stem
    if "_TO_" not in stem or "_FROM_" not in stem or "_" not in stem:
        return {}

    left, source_radio = stem.rsplit("_FROM_", 1)
    source_radio = source_radio.strip()

    left2, talkgroup = left.rsplit("_TO_", 1)
    talkgroup = talkgroup.strip()

    if "_" not in left2:
        return {}
    date_part, rest = left2.split("_", 1)
    if len(rest) < 6:
        return {}
    time_part = rest[:6]
    remainder = rest[6:].strip("_")

    if "_" in remainder:
        system, site_raw = remainder.split("_", 1)
    else:
        system, site_raw = remainder, ""
    site = site_raw.replace("_", " ").strip()

    try:
        dt = datetime.strptime(f"{date_part}{time_part}", "%Y%m%d%H%M%S")
    except ValueError:
        dt = None

    return {
        "date": date_part,
        "time": time_part,
        "datetime": dt,
        "system": system,
        "site": site,
        "talkgroup": talkgroup,
        "source_radio": source_radio,
    }


def read_tags(path: Path) -> Dict[str, str]:
    """Read a few common tags via mutagen, returning normalized keys."""
    out: Dict[str, str] = {}
    try:
        mf = MutagenFile(path)
        if not mf or not mf.tags:
            return out

        def pick(*keys: str) -> str:
            for k in keys:
                if k in mf.tags:
                    v = mf.tags.get(k)
                    if isinstance(v, list):
                        v = v[0] if v else ""
                    return str(v)
            return ""

        out["title"] = pick("TIT2", "title")
        out["artist"] = pick("TPE1", "artist")      # your "TX ID"
        out["album"] = pick("TALB", "album")        # your "Site"
        out["genre"] = pick("TCON", "genre")
        out["composer"] = pick("TCOM", "composer")  # your "Software"
        out["date"] = pick("TDRC", "date")
        out["grouping"] = pick("TIT1", "grouping")  # your "System"
    except Exception:
        pass
    return out


def build_thread_name(tgid: str, title: str) -> str:
    """tg-<tgid> <title_excerpt>"""
    tgid_clean = "".join(ch for ch in tgid if ch.isdigit())
    title = (title or "").strip().replace("\n", " ")
    if len(title) > THREAD_TITLE_MAX:
        title = title[:THREAD_TITLE_MAX - 1] + "…"
    base = f"{THREAD_NAME_PREFIX}{tgid_clean or tgid}"
    return f"{base} {title}".strip()


# -------------------- Watchdog -> Async bridge --------------------

class NewFileHandler(FileSystemEventHandler):
    def __init__(self, loop: asyncio.AbstractEventLoop, queue: asyncio.Queue):
        self.loop = loop
        self.queue = queue

    def on_created(self, event):
        if event.is_directory:
            return
        path = Path(event.src_path)
        # ignore cloud/temp placeholder files (OneDrive sometimes drops these)
        if path.name.startswith("~") or path.suffix.lower() == ".tmp":
            return
        if path.suffix.lower() in ALLOWED_EXT:
            # enqueue to the event-loop thread; no DB calls here
            self.loop.call_soon_threadsafe(self.queue.put_nowait, path)



# -------------------- Discord helpers --------------------

async def get_main_channel(client: discord.Client):
    try:
        return await client.fetch_channel(CHANNEL_ID)
    except discord.Forbidden:
        print("ERROR: Bot lacks permission to view the main channel.")
    except discord.NotFound:
        print("ERROR: Channel ID does not exist or bot can’t see it.")
    except Exception as e:
        print(f"ERROR: Could not fetch main channel: {e}")
    return None


async def get_or_create_tg_thread(
    client: discord.Client,
    conn: sqlite3.Connection,
    parent_channel: discord.TextChannel,
    tgid: str,
    title_for_name: str
) -> Optional[discord.Thread]:
    """Return a thread for this TG; create if missing. Uses SQLite cache."""
    key = f"tg:{tgid}"
    # 1) Cached?
    cached = get_cached_thread_id(conn, key)
    if cached:
        try:
            ch = await client.fetch_channel(cached)
            if isinstance(ch, discord.Thread) and ch.parent_id == parent_channel.id:
                return ch
            else:
                # stale
                delete_thread_cache(conn, key)
        except discord.NotFound:
            delete_thread_cache(conn, key)
        except Exception as e:
            print(f"Thread fetch error (cached {cached}): {e}")

    # 2) Create a new public thread under the parent text channel
    name = build_thread_name(tgid, title_for_name)
    thread: Optional[discord.Thread] = None

    # Preferred: create thread directly
    try:
        thread = await parent_channel.create_thread(
            name=name,
            auto_archive_duration=THREAD_AUTO_ARCHIVE_MIN,
            type=discord.ChannelType.public_thread
        )
    except (discord.Forbidden, discord.HTTPException, AttributeError):
        # Fallback: seed message -> create thread from it
        try:
            seed = await parent_channel.send(f"Auto-creating thread for TG {tgid}…")
            thread = await seed.create_thread(
                name=name,
                auto_archive_duration=THREAD_AUTO_ARCHIVE_MIN
            )
            # optional: tidy up
            try:
                await seed.delete()
            except Exception:
                pass
        except Exception as e:
            print(f"Failed to create thread for TG {tgid}: {e}")
            return None

    if thread:
        cache_thread_id(conn, key, thread.id, name)
    return thread


# -------------------- Worker & Client --------------------

async def upload_worker(queue: asyncio.Queue, client: discord.Client, conn: sqlite3.Connection):
    await client.wait_until_ready()
    parent = await get_main_channel(client)
    if parent is None:
        return
    print(f"Resolved channel: {parent} (type={type(parent)})")

    while True:
        path: Path = await queue.get()
        try:
            await asyncio.sleep(0.25)
            if not is_stable(path):
                queue.task_done()
                continue
            if already_uploaded(conn, path):
                queue.task_done()
                continue

            parsed = parse_filename(path.name)
            tags = read_tags(path)

            # Build embed
            title = tags.get("title") or f"Radio {parsed.get('source_radio','?')} → TG {parsed.get('talkgroup','?')}"
            embed = Embed(title=title, description="Scanner clip", timestamp=datetime.now(timezone.utc))

            # Parsed fields
            if parsed.get("system"):
                embed.add_field(name="System", value=parsed["system"], inline=True)
            if parsed.get("site"):
                embed.add_field(name="Site", value=parsed["site"], inline=True)
            if parsed.get("talkgroup"):
                embed.add_field(name="Talkgroup", value=parsed["talkgroup"], inline=True)
            if parsed.get("source_radio"):
                embed.add_field(name="Source Radio", value=parsed["source_radio"], inline=True)
            if parsed.get("datetime"):
                embed.add_field(name="Timestamp", value=parsed["datetime"].strftime("%Y-%m-%d %H:%M:%S"), inline=True)

            # Optional metadata (your custom labels)
            md_lines = []
            label_map = [
                ("TX ID", "artist"),
                ("Site", "album"),
                ("Genre", "genre"),
                ("Software", "composer"),
                ("System", "grouping"),
                ("Tag Date", "date"),
            ]
            for label, key in label_map:
                if tags.get(key):
                    md_lines.append(f"**{label}:** {tags[key]}")
            if md_lines:
                embed.add_field(name="Metadata", value="\n".join(md_lines), inline=False)

            embed.set_footer(text=path.name)

            destination = parent
            if ENABLE_THREADS and parsed.get("talkgroup"):
                thread = await get_or_create_tg_thread(
                    client, conn, parent, parsed["talkgroup"], tags.get("title") or ""
                )
                if thread:
                    destination = thread
                else:
                    print("Thread creation failed; posting in main channel instead.")

            # Send (fall back to a plain message if Discord rejects the embed)
            try:
                await destination.send(embed=embed, file=File(str(path)))
            except discord.HTTPException:
                await destination.send(content=title, file=File(str(path)))

            mark_uploaded(conn, path)
            await asyncio.sleep(0.5)

        except Exception as e:
            print(f"Failed to upload {path}: {e}")
        finally:
            queue.task_done()


class SDRUploader(discord.Client):
    def __init__(self, conn: sqlite3.Connection, **kwargs):
        super().__init__(**kwargs)
        self.conn = conn
        self.queue: asyncio.Queue[Path] = asyncio.Queue()
        self.observer: Optional[Observer] = None

    async def setup_hook(self) -> None:
        self.loop.create_task(upload_worker(self.queue, self, self.conn))

    async def on_ready(self):
        print(f"Logged in as {self.user} ({self.user.id})")
        handler = NewFileHandler(self.loop, self.queue)
        self.observer = Observer()
        self.observer.schedule(handler, str(WATCH_FOLDER), recursive=False)
        self.observer.start()
        print(f"Watching: {WATCH_FOLDER}")

        if PROCESS_BACKLOG_ON_START:
            for path in sorted(WATCH_FOLDER.glob("*")):
                if path.suffix.lower() in ALLOWED_EXT and not already_uploaded(self.conn, path):
                    if path.name.startswith("~") or path.suffix.lower() == ".tmp":
                        continue
                    await self.queue.put(path)

    async def close(self):
        if self.observer:
            self.observer.stop()
            self.observer.join()
        await super().close()


# -------------------- Entrypoint --------------------

def main():
    if not DISCORD_TOKEN or not CHANNEL_ID:
        print("Set DISCORD_TOKEN and DISCORD_CHANNEL_ID in .env")
        return

    WATCH_FOLDER.mkdir(parents=True, exist_ok=True)
    conn = ensure_db()

    intents = discord.Intents.default()  # needed for channel/thread ops
    client = SDRUploader(conn=conn, intents=intents)
    client.run(DISCORD_TOKEN)


if __name__ == "__main__":
    main()

4. Run It!

With all that done, launch your script and let the magic happen. You can also run it on a schedule or register it with Windows Task Scheduler so it starts on its own - a sketch of that is below.
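
To run it by hand, open a terminal in the script's folder and start it:

cd C:\bots\sdr-discord-uploader
python uploader.py

You should see the "Logged in as ..." and "Watching: ..." lines from the script. To have it come back up after a reboot, something like this (task name and paths are just examples - adjust for your setup, and run it from an elevated prompt) registers it with Task Scheduler to start at boot:

schtasks /Create /TN "SDR Discord Uploader" /TR "C:\Users\Administrator\AppData\Local\Programs\Python\Python313\python.exe C:\bots\sdr-discord-uploader\uploader.py" /SC ONSTART /RU SYSTEM

Running it under SYSTEM means the machine-wide SSL_CERT_FILE from step 3.2 still applies; if your OneDrive folder only syncs under your own account, schedule the task under that account instead.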