CLI

Here are a bunch of one-off CLI snippets that I use frequently.

Copying

Fast copying lots of data between servers on the same LAN

You can use scp, sftp, or rsync to copy files pretty easily, but there is quite a bit of overhead. If you are in a situation where you have to copy lots and lots of data between servers and security isn’t a big concern, try the following method.

First, on your destination machine, open up a port to accept the bytes.

socat tcp4-listen:<DEST PORT> stdout | tar xvpf -

Second, on your source machine, create an archive of the files you want to transfer and then shoot them over.

tar cvf - ./<directory to archive> | socat stdin tcp4:<DEST IP>:<DEST PORT>
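Before involving the network, you can sanity-check the tar half of the pipeline locally. This sketch (using throwaway src and dest directories) simply swaps socat for a plain pipe:

```shell
# build a throwaway source tree and an empty destination
mkdir -p src/sub dest
echo hello > src/sub/file.txt

# same tar invocations as above, minus socat: archive to stdout, unpack from stdin
tar cf - -C src . | tar xpf - -C dest

cat dest/sub/file.txt   # prints: hello
```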

Fast copying lots of files between servers on the same LAN

Maybe instead of a large number of bytes, you want to transfer lots of small files between servers. Give this a try.

First, on your source machine, open up a port.

tar -cf - -C <SRC DIR> . | pv | nc -l <SRC PORT>

Next, on your destination machine:

nc <SRC IP> <SRC PORT> | pv | tar -xf - -C <DEST DIR>

pv is used to show progress.

Mirror or Download Website

wget --recursive --page-requisites --adjust-extension \
     --span-hosts --convert-links --restrict-file-names=windows \
     -t 1 -T 5 -c -D <DOMAIN> --no-parent <URL>

-t: number of times to try
-T: number of seconds to wait per try
-c: resume partially downloaded files
-D: comma-separated list of domains to follow (pairs with --span-hosts)

VPN

Check VPN IP when using with Gluetun

docker exec gluetun sh -c "wget http://ipecho.net/plain -O - -q ; echo"

That’s assuming your container is named gluetun.

Pruning

Some commands tested on Ubuntu.

Find large files

sudo find . -xdev -type f -size +100M
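If you want the offenders sorted largest-first, the same find can feed du. Note that --apparent-size is a GNU du flag; it counts sparse files at their nominal size, matching how find's -size test works (prepend sudo when scanning system paths):

```shell
# list files over 100M, largest first, with human-readable sizes
find . -xdev -type f -size +100M -exec du -h --apparent-size {} + | sort -rh | head -n 20
```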

Find space taken by docker logs

sudo sh -c "du -ch /var/lib/docker/containers/*/*-json.log | grep total"

Truncate docker logs

sudo sh -c "truncate -s 0 /var/lib/docker/containers/*/*-json.log"
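Truncating (rather than deleting) matters here: dockerd keeps each log file open, so rm would leave the daemon writing to a deleted inode and the space would not be freed until a restart. A throwaway-file demo of the same idea:

```shell
# truncate empties the file in place, keeping the same inode and any open handles valid
echo '{"log":"old entries"}' > demo-json.log
truncate -s 0 demo-json.log
wc -c < demo-json.log   # prints: 0
```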

Find space used by journalctl

journalctl --disk-usage

Shrink journalctl logs to a target size

journalctl --vacuum-size=50M

This is a one-off cleanup of archived journal files; to cap the journal permanently, set SystemMaxUse= in /etc/systemd/journald.conf.

Stop all containers specific to image

docker ps -a -q --filter ancestor=archivebox/archivebox:master | xargs docker stop

Remove all containers specific to an image

Use the stop command above first, then run the following:

docker ps -a -q --filter ancestor=archivebox/archivebox:master | xargs docker rm

Backblaze

Determine bucket size

b2 get-bucket --show-size <BUCKET NAME> \
| jq -r '
    def h: [while(length>0; .[:-3]) | .[-3:]] | reverse | join(",");
    .totalSize | tostring | h'
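You can sanity-check the jq grouping filter offline by feeding it a fabricated payload (the totalSize value here is made up):

```shell
echo '{"totalSize": 1234567}' | jq -r '
    def h: [while(length>0; .[:-3]) | .[-3:]] | reverse | join(",");
    .totalSize | tostring | h'
# prints: 1,234,567
```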

Ubuntu

Update packages and upgrade to next point release

First install the deborphan package, which finds orphaned (unused) packages, then run the following.

apt -y update \
&& apt -y upgrade \
&& apt -y dist-upgrade \
&& apt -y autoremove \
&& apt -y autoclean \
&& apt -y clean \
&& apt -y remove $(deborphan)

Oracle (OCI)

Open a port in Ubuntu

sudo apt-get install firewalld
sudo service firewalld start
sudo firewall-cmd --permanent --zone=public --add-port=<PORT>/udp
sudo firewall-cmd --reload

Remember to allow ingress for the port in the network settings.

Regular Expressions

Convert ISO 8601 to MM/DD/YYYY

For a test run that prints the transformed lines without modifying any files:

sed -nE 's/([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}Z/\2\/\3\/\1/p' *

-n : suppress automatic printing, so only the lines printed by the p flag are shown
-E : extended regex

To actually run the command, edit in place (keeping .bak backups of the originals) and replace the final p flag with g:

sed -i.bak -E 's/([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}Z/\2\/\3\/\1/g' *
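Before touching real files, you can check the pattern against a fabricated sample line (-n suppresses sed's default output so only the substituted line prints):

```shell
echo "created 2023-07-04T12:30:00Z" \
| sed -nE 's/([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}Z/\2\/\3\/\1/p'
# prints: created 07/04/2023
```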

Cloudflare

Purge Website Cache (all pages)

You need an API token that has permission to purge the cache for the specific website. API tokens can be found in the My Profile dropdown.

You could use the global API key, but it’s better to create one specific for this type of usage.

When creating a custom token, choose the following:

Permissions | Zone | Cache Purge | Purge
Zone Resources | Include | Specific Zone | <ZONE NAME>

Use the API token in the request below.

curl --request POST "https://api.cloudflare.com/client/v4/zones/<ZONE ID>/purge_cache" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer <API TOKEN>" \
--data '{"purge_everything":true}'