Skip to content

Instantly share code, notes, and snippets.

@Ammly
Last active March 29, 2026 15:19
Show Gist options
  • Select an option

  • Save Ammly/26a950cf495fe36f0cdf5bac34e82900 to your computer and use it in GitHub Desktop.

Select an option

Save Ammly/26a950cf495fe36f0cdf5bac34e82900 to your computer and use it in GitHub Desktop.
decode showcase
#!/bin/bash
# ============================================================
# DEMO ONLY - Decode 4.0 Presentation
# This script illustrates what a prompt injection hidden in a
# "task" or .cursorrules file could trick an AI agent into running.
# DO NOT RUN THIS - it is for slide/presentation purposes only.
# ============================================================
# Step 1: Recon - Where are we?
echo "[*] Current directory:"
pwd
# Step 2: Recon - What's on this disk?
echo "[*] Disk space:"
df -h
# Step 3: Recon - What's our IP? (network fingerprinting)
echo "[*] Network info:"
ifconfig | grep "inet "
# Step 4: Recon - Who is the user?
echo "[*] Current user:"
whoami
# Step 5: Exfiltration setup - Serve local files over HTTP
echo "[*] Checking if Python is available..."
if command -v python3 &> /dev/null; then
echo "[!] Python3 found. Starting HTTP server on port 8888..."
echo "[!] This would expose ALL files in $(pwd) to the network!"
# python3 -m http.server 8888 &
echo "[DEMO] Server command commented out for safety."
else
echo "[-] Python3 not found."
fi
echo ""
echo "============================================="
echo " You have been exploited."
echo " An AI agent ran this because it was hidden"
echo " inside a 'task' or cloned repo."
echo " Never Trust. Always Verify."
echo "============================================="
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment