I previously had my blog hosted on Ghost which, if you aren’t aware of it, is a Node application similar to a lighter-weight WordPress. The downside is that you need the Node process running to serve it, so you need a server or VPS somewhere. For a low-traffic blog that’s a bit overkill, and since the content is static anyway, why would you want to run an active service for this at all?
Getting a remote SSL certificate from a server with openssl is pretty straightforward; it looks something like this: openssl s_client -showcerts -connect www.dray.be:443 If you run that it will hang until the connection closes, though, because openssl keeps reading stdin and never receives an EOF from your client, so adding a </dev/null at the end to supply an immediate EOF fixes this. But if you’re connecting to a server hosting multiple domains using SNI, it will only return the default certificate.
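For the SNI case, the fix is to name the virtual host in the TLS handshake with -servername. A minimal sketch, piping the result through openssl x509 to print just the subject and issuer (www.dray.be stands in for whichever host you want to inspect):

```shell
# Name the virtual host explicitly so an SNI server returns the right
# certificate, then parse out the subject and issuer of the leaf cert.
openssl s_client -showcerts \
    -connect www.dray.be:443 \
    -servername www.dray.be </dev/null 2>/dev/null |
  openssl x509 -noout -subject -issuer
```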
I started using resin.io recently for a few small projects and love the idea of using docker as a deployment method. It lets you define your application and requirements quite nicely and in a relatively standardized way at that. But currently where it falls a little short is the ability to run multiple applications on a single node, although from what I’ve been seeing it is one of the most requested features and hopefully isn’t too far away.
GNU Parallel is a fantastic utility, and I’ve been using it more and more recently. Often I end up with a one-off task, write a quick 4-5 line bash script to do what I want, and that’s done. But sometimes there is a slow task that can be done in parallel, and that’s where it really shines. I recently wanted to make sure the 200-odd URLs in an HTML file were valid and returning 2xx responses, so I wrote a quick bash script to do so.
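The script itself was a throwaway, but the shape of it was roughly this; a sketch assuming GNU grep (for -oP) and a file called links.html, with the concurrency and timeout numbers being my choices rather than anything from the original:

```shell
#!/bin/bash
# Pull every href URL out of the HTML, then check each one in parallel,
# printing only the URLs that do not come back with a 2xx status.
grep -oP 'href="\Khttps?://[^"]+' links.html |
  parallel -j 20 '
    code=$(curl -s -o /dev/null --max-time 10 -w "%{http_code}" {})
    [[ $code == 2* ]] || echo "{} -> $code"'
```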
By default Gnome lets you set a period of inactivity after which the system should suspend/hibernate/etc. This is fine for a desktop you’re actively using, but I also use Gnome on my media center, where it is less than ideal. The use case I have is playing a 20-180 minute video, throughout which I don’t want any power saving features like screen dimming, sleep, etc.
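One straightforward way to handle this (not necessarily what the post goes on to do) is gnome-session-inhibit, which blocks the listed power-saving actions only while a command is running; mpv and the filename here are just placeholders:

```shell
# Idle actions (screen blanking) and suspend are inhibited only while the
# player runs; once it exits, normal power saving resumes automatically.
gnome-session-inhibit --inhibit idle:suspend mpv movie.mkv
```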
All the time I see people trying to handle large numbers of files in the shell, and any of you who have tried this before will know that it is not pretty. Try doing an ls * in a folder with a few hundred thousand files and you’ll be lucky to have anything happen in a reasonable time frame. There are a few gotchas that apply to these sorts of situations. The first is that using ‘*’ in the command triggers shell globbing, so before the ls even executes, in a folder structure like this for example:
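The usual way around the globbing limit is to never put the file names on the command line at all: stream them from find and let xargs batch the calls. A small self-contained sketch (the toy directory is mine, a real case would hold far more files):

```shell
# Create a stand-in directory full of files to work against.
mkdir -p /tmp/manyfiles && cd /tmp/manyfiles
touch file_{1..1000}

# ls * would expand every name in the shell first; this streams them
# instead, NUL-delimited so odd filenames survive, with xargs batching
# the ls calls under the argument-length limit.
find . -maxdepth 1 -type f -name 'file_*' -print0 | xargs -0 ls -l | wc -l
```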
GPG has always been a bit of a double-edged sword. It’s fantastic in terms of security, reliability and ubiquity, sure, but it’s never been particularly easy to use. Once you get used to the CLI it’s not bad, though it has a bit of a learning curve, and finding the right person and the right key can require a bit of luck.
If you’ve been running Docker for any length of time, particularly on machines that don’t have multi-terabyte hard drives, you will be aware that it is terrible at housekeeping. There are a few quick and (in most cases) safe ways to clean house. Remove exited containers: docker rm -v $(docker ps -a -q -f status=exited) This is usually pretty safe, unless you’re storing data in exited containers without a durable backup somewhere.
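Two more cleanups in the same spirit, with the same caveats (flag names as in the Docker CLI of the era; check docker --help on your version):

```shell
# Remove dangling (untagged) image layers left behind by rebuilds --
# generally safe, as they are orphaned intermediates.
docker rmi $(docker images -q -f dangling=true)

# Remove volumes no longer referenced by any container. Less safe: any
# data in an unreferenced volume is gone for good.
docker volume rm $(docker volume ls -q -f dangling=true)
```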
So I’ve been using docker for a couple of years now and am still finding new ways to use it (as well as new bugs). Recently I’ve found two use cases in particular that are very useful, and hard to replicate with any existing tools. The first is the ability to package a command-line tool that needs a large or specific environment, without having to make sure it will work cleanly on Debian and Ubuntu and Fedora and RHEL and the list goes on.
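For that first case, the trick is a small wrapper so the containerised tool behaves like a local binary. A sketch using the pandoc/core image as the example; the image name, mount point and function name are my choices for illustration, not from the post:

```shell
# Wrap a containerised tool in a shell function. The current directory is
# mounted at /work inside the container so the tool can read and write
# local files, and --rm discards the container when it exits.
pandoc() {
  docker run --rm -v "$PWD:/work" -w /work pandoc/core "$@"
}

# It can then be used like a native command, e.g.:
# pandoc README.md -o README.html
```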
I’ve decided that I should start a blog again. I had one once upon a time, but it is long gone now. I just need somewhere to write down the little things I learn sometimes, or the answers I eventually find after spending way too long reading google/stackoverflow/forums/etc and initially coming up with nothing at all resembling a useful answer. It is definitely a problem I face more and more often; I’m unsure yet if it is because I’m only looking at harder and more niche problems these days, or if my google-fu is becoming weaker over the years.