How to download a webpage to access it offline? Sometimes it is useful to download/mirror a webpage locally so that you can access it without an internet connection (e.g., to read the documentation of a library or tool). This is easily done via wget. The following command shows all the necessary parameters:
# --no-check-certificate: skip TLS certificate validation
# --page-requisites: also fetch the images/CSS/JS needed to render the pages
# -m: mirror recursively, -np: never ascend to the parent directory,
# -k: rewrite links in the downloaded files to point at the local copies
# (--no-clobber is dropped here: -m implies timestamping, which wget
#  refuses to combine with --no-clobber)
wget --no-check-certificate --page-requisites -m -np -k https://www.foo.com
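By default wget places the mirror in a directory named after the host, so (assuming that default layout) you can open the local copy directly:

open ./www.foo.com/index.html   # on OSX; use xdg-open or your browser elsewhere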
A while ago a colleague introduced me to a shortcut in git that improved my productivity tremendously: the power of -. If you use a Unix-like operating system you know how useful - is as a placeholder, e.g., representing stdin/stdout on the command line, or the previous directory you were in within the shell (cd -). That last feature is also available in git, with the difference that - refers to the previously checked-out branch.
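For example (the branch name feature/login is purely illustrative):

cd /tmp && cd -              # shell: jump back to the previous directory
git checkout feature/login
git checkout master
git checkout -               # back on feature/login again

The same shorthand works in other git commands too, e.g. git merge - merges the previously checked-out branch into the current one.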
I recently acquired a Rock Pi X. This is a Raspberry Pi clone, but with an Intel Atom processor (which supports hardware virtualization). When I tried installing FreeBSD on it, the kernel got stuck during the boot process (the exact location might vary for you and your system). It turns out you need to disable the uart driver. You can do this at the loader prompt, before the kernel boots, by setting:
set hint.uart.0.disabled="1"
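Once the system is installed, the same hint can be made permanent in /boot/device.hints so you do not have to type it on every boot (a minimal sketch, assuming the stock hints file):

# /boot/device.hints -- keep the uart driver disabled across reboots
hint.uart.0.disabled="1"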
List open ports on OSX To list the open ports on OSX we need to use the lsof (list open files) tool. At the end of the day pretty much everything is a file handle in a Unix kernel. The following example shows how to invoke the tool and filter out the important bits:
# -i: select network files, -n: skip host name lookups, -P: show numeric ports
# the greps keep only TCP sockets that are in the LISTEN state
lsof -i -n -P | grep TCP | grep LISTEN
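If your lsof supports the -s state filter (the macOS and Linux versions do), you can get the same result without grep, and also narrow it down to a single port (8080 here is just an example):

lsof -n -P -iTCP -sTCP:LISTEN   # only TCP sockets in LISTEN state
lsof -n -P -iTCP:8080           # who is using port 8080?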