March 22, 2020
I’ve been using Let’s Encrypt to provide SSL certificates for all my domains and subdomains for a
couple of years now. Let’s Encrypt certificates are only good for 90 days, and with 17 certificates
to manage, renewing them all manually was a pain. So I put the commands into a cron job (actually
several cron jobs) that renewed each certificate once a month. The cron job mailed me when it was
done so I knew which certificates had been renewed.
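Each renewal job was a variant of the same idea. As a rough sketch (the certbot command, schedule, and address here are illustrative assumptions, not my exact setup), a crontab entry could look like this:

```shell
# Run a renewal attempt on the first of each month at 3:00 am.
# cron mails any output to MAILTO, which is how you know a renewal ran.
MAILTO=me@example.com
0 3 1 * * certbot renew && echo "certificate renewal run complete"
```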
Recently my web host, WebFaction, started offering built-in Let’s Encrypt certificates - ones that I
would not have to renew myself. At 40 days to go they automatically generate a new certificate to
replace the old one. This is great, but I’ve lost some visibility into the process.
I wanted a way to list all my SSL certificates, and their current date ranges. Each SSL certificate
has a date and time when it becomes active, and a date and time when it expires. This command will
return those two pieces of information for a domain:
echo | openssl s_client -servername example.com -connect example.com:443 2>/dev/null | openssl x509 -noout -dates
notBefore=Nov 28 00:00:00 2018 GMT
notAfter=Dec 2 12:00:00 2020 GMT
You could create a
bash shell script that was just 17 instances of that command and call it a day.
Inelegant, but functional. A better solution would be to have a file of domains to check, and a
script to do the checking.
Each line of my file, .certs, has a display name, a space, and the domain to query.
And here’s the script.
#!/usr/bin/env bash
set -o pipefail

# certcheck displays the good from and good until dates for SSL certificates.
# It expects a file (.certs) that contains a list of domains to query. Each
# entry in the file has two parts, the name to display, and the domain to
# query. The two entries are separated by a space.
#
# .certs file example:
# example example.com
# www www.example.com

filename=".certs"

echo -e "certcheck\n"

while read -r line; do
    # Parse input into an array, using space as delimiter
    entry=($line)
    # Get the name and the domain
    name=${entry[0]}
    domain=${entry[1]}
    # Get the certificate start and end dates
    result=$(echo | openssl s_client -servername "$domain" -connect "$domain:443" 2>/dev/null | openssl x509 -noout -dates)
    # Muck with internal field separator (IFS) to split $result on new line
    OLDIFS=$IFS
    IFS=$'\n'
    dates=($result)
    IFS=$OLDIFS
    startdate=${dates[0]}
    enddate=${dates[1]}
    # Print the results in columns
    printf "%-15s %-30s %-30s\n" "$name" "$startdate" "$enddate"
done < "$filename"

echo -e "\nfinished"
The script is a simple loop. For each line in the file it does the following steps:
- It parses the line using space as the delimiter, so that $name contains the label to display
and $domain has the domain to query.
- Using the value in $domain, the openssl command is run. The two lines of output are captured
in $result. The key part here is that it is two lines of output.
- In order to put the two date time stamps into separate variables, the “internal field separator”
or IFS has to be set to the newline character, \n. So that IFS can be returned to its original
value, it is first saved in $OLDIFS.
- With the start and end dates now in $startdate and $enddate respectively, a printf
can be used to create the output.
printf is used as it provides better control over formatting than echo does.
That’s it. Loop through the file, use the domain to run the
openssl command and capture the
result. Split the result on the new line character. Print the results, one per line, neatly
formatted into columns.
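The splitting step can be tried on its own, with a canned two-line string standing in for real openssl output (no network needed):

```shell
# Canned stand-in for the two-line output of: openssl x509 -noout -dates
result=$'notBefore=Nov 28 00:00:00 2018 GMT\nnotAfter=Dec 2 12:00:00 2020 GMT'

# Save IFS, split on newline only, then restore IFS
OLDIFS=$IFS
IFS=$'\n'
dates=($result)
IFS=$OLDIFS

startdate=${dates[0]}
enddate=${dates[1]}
printf "%-15s %-30s %-30s\n" "demo" "$startdate" "$enddate"
```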
Here is an example of the script’s output.
zanshin notBefore=Feb 2 19:30:06 2020 GMT notAfter=May 2 19:30:06 2020 GMT
books notBefore=Jan 24 11:13:48 2020 GMT notAfter=Apr 23 11:13:48 2020 GMT
geek notBefore=Jan 23 07:36:14 2020 GMT notAfter=Apr 22 07:36:14 2020 GMT
health notBefore=Jan 31 14:24:25 2020 GMT notAfter=Apr 30 14:24:25 2020 GMT
music notBefore=Mar 18 12:09:18 2020 GMT notAfter=Jun 16 12:09:18 2020 GMT
Any time I’m curious about the state of my SSL certificates I can run this script.
March 21, 2020
WFH. Work From Home. Or, as I think of it at times, What’s fucking happening?
10 days ago, as I write this, was the last day I worked at my office. The university where I am
employed scheduled a test work remote day on Thursday, March 12th. By the end of the day it had been
extended to include Friday. By the end of Friday we were all told to work remotely for the
foreseeable future, at least through the end of the semester.
As an IT professional my job is well suited to remote work. There are entire IT companies that are
100% distributed. Last autumn, due to some HVAC work in my building that uncovered asbestos, we all
worked from home for a week. That was fun, actually. Going from a work-in-the-office-with-other-people
setting to working-at-home by yourself has been an adjustment. Knowing that this is reality
for the next two, three, or more months puts an entirely different spin on it.
I’m an introvert, and I like things to be just so. Working from home appeals to those aspects of my
personality. I also appreciate some amount of what I call “social friction.” The act of interacting
with other people, in person, feeds some part of me. I miss that part of being “at” work. I don’t
miss the noise and interruptions, the smells, and the wonky temperature from the HVAC system.
I am incredibly fortunate that my wife, Sibylle, has a very similar temperament. She is also an
introvert, and is someone content within herself. She is fortunate enough to have found a way to
make her piano studio work remotely. We have found a rhythm that works for us, here in our home.
I get up at my normal work day time. I shower and get dressed as if I were going to the office, and
then have breakfast. Then I come into my home office and start my day. At lunch time I leave work
and go out in the rest of the house for lunch. I have eaten lunch at home most days for over a
decade, so that part of my daily routine hasn’t changed. After lunch I return to work until the end
of the day when I come home. My wife has her morning routine and then goes to her piano studio on
the lower level of our house, and works on lessons and video critiques for her students. During the
day we exchange emails and texts, exactly like we did before COVID-19. Keeping as much normalcy as
possible has made this transition easier for us. It has helped to ground us at a time when
everything seems ungrounded and out of control.
The coming weeks and months will be interesting and challenging. I think Sibylle and I will be able
to navigate those challenges and find ways to care for ourselves. I hope the world at large can do
the same thing. I fear that, for many, the sudden upending of regular life will prove devastating and
difficult to adjust to. Our society, the world’s society, will forever be different following this pandemic.
For now, I’m, we are, working from home, wondering what’s fucking happening.
The dig command is useful but can overwhelm with its output. This utility website simplifies the
process and the results.
February 03, 2020
Google’s shell style guide.
January 27, 2020
Last night, while reviewing the visitor logs for my site, I noticed several hits from
frame.bloglovin.com. I’m always curious to see where visitors to my site are coming from, so I
clicked on the link that had brought them to my site and saw this.
My initial reaction was WTF?
There’s no obvious way to dismiss that subscribe dialog, but when I clicked on the site behind it
the dialog went away. The page behind it had this as its header.
They aren’t scraping my content and claiming it as theirs, but they are presenting it through their
site, with their header. I was not pleased.
A quick search led me to a couple of articles about BlogLovin’. The verdict is that, while perhaps
not 100% sketchy, they are pushing it. It appears they add comments, through their site, to my
content. The Ultimate Guide to BlogLovin’ actually reversed its stance on the service. Over at BlogLovin’
is Now Stealing Your Posts there is evidence that BlogLovin’ is actively claiming content that isn’t theirs.
Toward the end of the second post there was a reference to how to block BlogLovin’ on Nginx using
$http_user_agent, and a link to a (now defunct) article about doing the same with Apache-based servers.
I did another search and learned how to block access to my site by testing the User-Agent string
in the request. Here’s a sample of the code to be placed in the .htaccess file.
SetEnvIfNoCase User-Agent (bloglovin) bad_user_agents
Order Allow,Deny
Allow from all
Deny from env=bad_user_agents
This test is case insensitive, and since the matching string isn’t prefixed with a ^, the string
bloglovin can occur anywhere in the User-Agent header.
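The Nginx approach mentioned in the second post works the same way. A minimal sketch (assuming you can edit the server block yourself) might be:

```nginx
# Return 403 to any client whose User-Agent contains "bloglovin" (case insensitive)
if ($http_user_agent ~* bloglovin) {
    return 403;
}
```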
There have only been five total visits to my domain through BlogLovin’, all within the past week.
Roughly a week ago I resurrected a long dormant subdomain, and the scraped content was all from that
site. I added the .htaccess directive to my main domain, and to the subdomain that was being
scraped. Now I’ll have to wait and see if any 403 errors are produced.
It’s self-entitled liberties like this that make the World Wide Web frustrating at times.
If you want to keep up with my publishing, click the RSS icon at the bottom of the page and add me
to your RSS feed.
My new favorite monospaced font. I particularly like the increased “x-height,” or how tall the
lowercase letters are in relation to the uppercase ones.
January 20, 2020
A serverless email server on AWS using S3 and SES. Because, why not?