Working with Websites
Some great tools that I like for web and internet stuff.
Tools
Downloading
- curl (great for testing HTTP requests)
- aria2c (great for faster downloads via multiple threads)
- httrack (mirrors websites but really slow)
- youtube-dl (download or stream media)
- rtmpdump (or just use youtube-dl)
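A couple of invocations I end up reusing (the URLs are just placeholders):
# check the status code and response headers without downloading the body
curl -I https://example.com/
# download a large file with up to 8 connections to the same server
aria2c -x 8 https://example.com/big-file.iso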
Monitoring
- vnstat (checking network usage)
- nmap (port and host scanning; I mostly use it to find IPs on the local subnet)
- ipscan (simple Java IP scanner)
- iperf (link speed testing between two hosts)
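Example usage (the subnet and addresses are placeholders for whatever your LAN actually uses):
# ping-scan the local /24 for live hosts
nmap -sn 192.168.1.0/24
# monthly traffic totals per interface
vnstat -m
# iperf needs a server on one end and a client pointed at it on the other
iperf -s
iperf -c 192.168.1.10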
Handy tools
- iodine (handy VPN over DNS for restricted networks)
- sshuttle (tunnel network traffic over SSH)
- WireGuard (do I need to mention this? fast, secure VPN protocol)
- Postman (for testing REST APIs without memorizing cURL flags)
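For reference, the quick-start lines for the two of these I reach for most (the hostname and config name are placeholders):
# tunnel all traffic and DNS lookups through an SSH server; only needs Python on the remote end
sshuttle --dns -r user@example.com 0.0.0.0/0
# bring up a WireGuard tunnel defined in /etc/wireguard/wg0.conf
sudo wg-quick up wg0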
Scripts
Mirror all files from a website
Mirror all the files from a website, keeping the full folder structure:
wget -m -p -E -k -K -np https://example.com/
(-m mirrors recursively, -p grabs page requisites like images and CSS, -E adds proper file extensions, -k converts links for local browsing, -K keeps backups of the converted files, and -np stops it from ascending to the parent directory.)
Save websites as shortcuts
On Windows, you can save a website as a shortcut file. On Linux, Ctrl+S and drag-and-drop just save the HTML.
Well, turns out you can make .desktop files that just link to websites. Here's the janky script to easily make them.
echo "[Desktop Entry]" >> "$1.desktop" echo "Encoding=UTF-8" >> "$1.desktop" echo "Name=$1" >> "$1.desktop" echo "Type=Link" >> "$1.desktop" echo URL="$2" >> "$1.desktop" echo "Icon=text-html" >> "$1.desktop"
Parameter 1 is the name of the shortcut and file, parameter 2 is the actual URL. Change icon and other params as you see fit.
Save as a bash function or as a script. Usage:
bash link.sh "TonyWiki" "https://wiki.tonytascioglu.com"
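If you'd rather have it as a shell function in your .bashrc instead of a separate script, here's a minimal sketch (the name weblink is just my pick, and this version overwrites instead of appending so re-running it doesn't duplicate lines):
weblink() {
    {
        echo "[Desktop Entry]"
        echo "Encoding=UTF-8"
        echo "Name=$1"
        echo "Type=Link"
        echo "URL=$2"
        echo "Icon=text-html"
    } > "$1.desktop"   # overwrite rather than append
}
Usage is the same idea: weblink "TonyWiki" "https://wiki.tonytascioglu.com"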
Tunneling stuff
Files over SSH
- SFTP
- SCP
- SSHFS (FUSE filesystem over SFTP)
- Rsync
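Typical one-liners for each (hosts and paths below are placeholders):
# copy a file to a remote host
scp ./backup.tar.gz user@example.com:/srv/backups/
# mount a remote directory locally (the mountpoint must already exist)
sshfs user@example.com:/srv/data ~/remote-data
# sync a directory over SSH, only sending changed files
rsync -avz ./photos/ user@example.com:/srv/photos/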
Others
- SSHuttle (VPN over SSH)
- NetCat (just yeet stuff across systems)
- Rclone (for other cloud storage providers)
- S3FS (mount S3-compatible storage like B2 when I need more disk space)
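A couple of examples for the NetCat and Rclone styles (the nc flags here are for traditional/GNU netcat, the BSD variant drops -p, and "b2-remote" is just whatever you named the remote in rclone config):
# receiving end: listen on a port and dump whatever arrives into a file
nc -l -p 9001 > incoming.tar
# sending end: yeet the file across
nc receiver-host 9001 < incoming.tar
# copy a local folder up to a configured rclone remote
rclone copy ./photos b2-remote:my-bucket/photos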